[lit-html] Use a double-keyed LRU cache for server rendered static html #5118
🦋 Changeset detected. Latest commit: 81dd1be. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
📊 Tachometer Benchmark Results: render, update, nop-update, and update-reflect benchmarks comparing this-change, tip-of-tree, and previous-release.
The sizes of lit-html.js and lit-core.min.js are as expected.
kyubisation left a comment:
Generally looks good to me.
One question about the public API: should the LRU cache be exposed, or should it be an internal implementation detail?
let stringsCache: Cache;
if (isServer) {
  stringsCache = new Map<
    TemplateStringsArray,
    LRUCache<string, TemplateStringsArray>
  >();
} else {
  stringsCache = new Map<
    TemplateStringsArray,
    Map<string, TemplateStringsArray>
  >();
}
Is there any difference here? Both paths initialize a Map and the type is given by Cache.
From my understanding this can simply be reduced to the following suggestion. Or am I missing something?
Suggested change:

- let stringsCache: Cache;
- if (isServer) {
-   stringsCache = new Map<
-     TemplateStringsArray,
-     LRUCache<string, TemplateStringsArray>
-   >();
- } else {
-   stringsCache = new Map<
-     TemplateStringsArray,
-     Map<string, TemplateStringsArray>
-   >();
- }
+ const stringsCache: Cache = new Map();
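For context, a sketch of why the annotation alone can carry the distinction. The Cache and LRUCache declarations below are assumptions about the PR's internals, not quotes from it; the point is that if the map's value type admits either container, both branches compile down to the same new Map() call.

// Assumed shape: LRUCache extends Map, as suggested by the super.set/super.delete
// calls quoted further down in this review.
declare class LRUCache<K, V> extends Map<K, V> {
  constructor(maxSize: number);
}

// If Cache is declared along these lines, the runtime value is a plain Map in
// both branches and only the type annotation differs:
type Cache = Map<
  TemplateStringsArray,
  Map<string, TemplateStringsArray> | LRUCache<string, TemplateStringsArray>
>;

const stringsCache: Cache = new Map();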
super.delete(key);
super.set(key, value);
if (this.size > this.maxSize) {
  const keyToDelete = this.keys().next().value;
During the community call there was a question about performance of accessing this.keys().next().value.
From a brief, naive test case this seems acceptable.
Performance test case
const perfTest = (name, setup, test) => {
  console.log(`Starting performance test: ${name}`);
  const times = [];
  for (let i = 0; i < 100; i++) {
    // Re-create the data outside the timed section so only key access is measured.
    const data = setup();
    const start = performance.now();
    test(data);
    times.push(performance.now() - start);
    if (i % 10 === 0) {
      console.log(`Completed ${i} runs`);
    }
  }
  const avgTime = times.reduce((a, b) => a + b, 0) / times.length;
  console.log(`Average time over 100 runs: ${avgTime.toFixed(6)} ms`);
};
perfTest(
  'Small map key access',
  () => new Map().set(1, 1).set(2, 2),
  (data) => data.keys().next().value
);
perfTest(
  'Large map key access',
  () =>
    [...Array(100000).keys()]
      .map(() => crypto.randomUUID())
      .reduce((m, k) => m.set(k, k), new Map()),
  (data) => data.keys().next().value
);

if (isServer) {
  innerCache = new LRUCache<string, TemplateStringsArray>(10);
} else {
  innerCache = new Map<string, TemplateStringsArray>();
}
Code style nitpick; feel free to ignore.
Suggested change:

- if (isServer) {
-   innerCache = new LRUCache<string, TemplateStringsArray>(10);
- } else {
-   innerCache = new Map<string, TemplateStringsArray>();
- }
+ innerCache = isServer
+   ? new LRUCache<string, TemplateStringsArray>(10)
+   : new Map<string, TemplateStringsArray>();
This change switches static html to use a double-keyed cache. The outer key is the pre-flattened template strings array, and the inner key is the final template string after all of the static parts have been merged into it (previously that final string was the only key). It also implements the inner cache as a simple LRU with a relatively small max size. The idea is that ideal usage of static html has a small cardinality, but improper usage may have a high or unbounded cardinality, and we don't want to OOM the server (when using SSR) with an ever-growing cache.
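As a concrete illustration of the double-keyed scheme and the small LRU described above, here is a minimal sketch. The eviction logic mirrors the set fragment quoted in the review and the max size of 10 matches the diff above, but the stringsCache shape and the getFlattenedStrings helper are hypothetical glue, not the PR's actual code.

// Sketch only: a double-keyed cache for static html, where the inner cache is a
// small LRU so that misuse (unbounded distinct static values) cannot grow the
// cache without limit during SSR.
class LRUCache<K, V> extends Map<K, V> {
  constructor(private readonly maxSize: number) {
    super();
  }
  override set(key: K, value: V): this {
    // Re-inserting moves the key to the end, marking it most recently used.
    super.delete(key);
    super.set(key, value);
    if (this.size > this.maxSize) {
      // Map iteration order is insertion order, so the first key is the
      // least recently used entry.
      super.delete(this.keys().next().value as K);
    }
    return this;
  }
}

// Outer key: the original (pre-flattened) strings array.
// Inner key: the final string produced after merging in the static parts.
const stringsCache = new Map<
  TemplateStringsArray,
  LRUCache<string, TemplateStringsArray>
>();

// Hypothetical helper illustrating the lookup path.
function getFlattenedStrings(
  outerKey: TemplateStringsArray,
  flattenedKey: string,
  build: () => TemplateStringsArray
): TemplateStringsArray {
  let inner = stringsCache.get(outerKey);
  if (inner === undefined) {
    inner = new LRUCache<string, TemplateStringsArray>(10);
    stringsCache.set(outerKey, inner);
  }
  let strings = inner.get(flattenedKey);
  if (strings === undefined) {
    strings = build();
    inner.set(flattenedKey, strings);
  }
  return strings;
}

Bounding only the inner cache keeps well-behaved templates (few distinct static combinations per call site) fully cached while capping memory for call sites that interpolate ever-changing static values.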