untangle lru_cache and persistent cache #450

@reece

The current lru_cache code is complicated because it mixes several intentions. Let's untangle them.

The two goals are: 1) in-memory LRU memoization to reduce remote data fetches; 2) persistent caching, primarily so that tests do not require network access.
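
For comparison, here is a minimal sketch of the two mechanisms kept separate. `functools.lru_cache` is the real stdlib decorator (Python 3.2+); `PersistentCache` and its knobs (`cache_path`, `learn`) are hypothetical illustrations, not existing code:

```python
import functools
import os
import pickle

# Goal 1: in-memory LRU memoization -- the stdlib already does this in 3.x.
@functools.lru_cache(maxsize=128)
def fetch_transcript_info(ac):
    ...  # hypothetical remote fetch keyed by accession

# Goal 2: persistent caching so tests run without network access.
class PersistentCache:
    def __init__(self, cache_path, learn=False):
        self.cache_path = cache_path
        self.learn = learn  # record misses when True ("learning mode")
        if os.path.exists(cache_path):
            with open(cache_path, "rb") as f:
                self._data = pickle.load(f)
        else:
            self._data = {}

    def get_or_fetch(self, key, fetch):
        if key not in self._data:
            if not self.learn:
                raise KeyError(key)  # tests should fail loudly, not hit the network
            self._data[key] = fetch()
        return self._data[key]

    def save(self):
        # Write back once at the end, rather than on every miss --
        # writing on every miss is the likely cause of slow learning mode.
        with open(self.cache_path, "wb") as f:
            pickle.dump(self._data, f)
```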

Consequences of mixing these concerns are:

  • We can't use external lru_cache implementations (including functools.lru_cache in Python 3.x).
  • Configuration is confusing: uta connect() requires a cache mode that is different from the lru_cache mode, and neither checks whether the supplied value is valid.
  • Learning mode is slow (probably because it writes back to disk on every miss).
  • As implemented, the hdp interface is also entangled with caching.

I'm going to park this as a placeholder for discussion. I think I'd like to see us revert to a cache-unaware UTA module, plus a new caching hdp that wraps an underlying hdp (e.g., uta) and merely caches its data; a rough sketch follows.
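
As one possible shape for that (names and details hypothetical; the real hdp interface may differ), the caching hdp could be a thin proxy that memoizes calls on whatever provider it wraps, leaving the wrapped provider entirely cache-unaware:

```python
import functools

class CachingDataProvider:
    """Hypothetical wrapper: caches results from any underlying hdp.

    Memoizes each method call by (method name, positional args); the
    cache dict could later be persisted for offline test runs. Assumes
    hashable positional args; kwargs are not handled in this sketch.
    """

    def __init__(self, hdp):
        self._hdp = hdp
        self._cache = {}

    def __getattr__(self, name):
        # Only called for attributes not found on this proxy itself,
        # so delegation to the wrapped provider is automatic.
        attr = getattr(self._hdp, name)
        if not callable(attr):
            return attr

        @functools.wraps(attr)
        def cached(*args):
            key = (name, args)
            if key not in self._cache:
                self._cache[key] = attr(*args)
            return self._cache[key]

        return cached

# Usage might look like:
#   hdp = CachingDataProvider(uta.connect())
#   vm = hgvs.variantmapper.VariantMapper(hdp)
```

This keeps the UTA module and the hdp interface free of caching concerns, and the lru_cache/persistent-cache decision becomes a property of the wrapper rather than of connect().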
