
Conversation

@jayceslesar

In my organization, we often add many "duplicate" DBCs to a database to build a larger one. We found an optimization: cache the work done by create_encode_decode_formats, keyed by the signal, so that when "duplicate" signals are added we do not redo work that has already been done.

In many cases this change showed a decrease in runtime of at least 16% when calling self.refresh(), and the savings scale with how many "duplicate" signals you are adding to a given database.

I don't really love the _signal_cache_key function; IMO it would make more sense to add a hash function to the Data and Signal classes respectively, but I am not a maintainer, so I would appreciate guidance in the right direction there.
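
To make the idea concrete, here is a minimal sketch of the caching pattern in Python. It is not the PR's actual code: the attribute tuple in _signal_cache_key is illustrative, and a real key would need to cover every signal attribute that create_encode_decode_formats actually reads.

```python
from typing import Any, Dict, Tuple

from cantools.database.utils import create_encode_decode_formats

# Module-level memo of previously computed formats.
_FORMAT_CACHE: Dict[Tuple[Any, ...], Any] = {}


def _signal_cache_key(signal) -> Tuple[Any, ...]:
    # Hypothetical key: attributes that determine a signal's packed
    # layout. A real implementation must include every field that
    # create_encode_decode_formats reads from the signal.
    return (
        signal.name,
        signal.start,
        signal.length,
        signal.byte_order,
        signal.is_signed,
    )


def cached_create_encode_decode_formats(signals, number_of_bytes):
    # Key on the full signal set plus the message size, so messages
    # built from "duplicate" signals reuse already-computed formats.
    key = (tuple(_signal_cache_key(s) for s in signals), number_of_bytes)
    if key not in _FORMAT_CACHE:
        _FORMAT_CACHE[key] = create_encode_decode_formats(signals, number_of_bytes)
    return _FORMAT_CACHE[key]
```

If Signal and Data grew a __hash__ method, as suggested above, the tuple-building helper would collapse to using the objects themselves as cache keys.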

@coveralls

Pull Request Test Coverage Report for Build 18507065384


  • 21 of 23 (91.3%) changed or added relevant lines in 1 file are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage decreased (-0.008%) to 93.913%

Changes Missing Coverage         Covered Lines   Changed/Added Lines   %
src/cantools/database/utils.py   21              23                    91.3%

Totals Coverage Status
  • Change from base Build 18122136525: -0.008%
  • Covered Lines: 7467
  • Relevant Lines: 7951

💛 - Coveralls
