Tags: speed



Tuesday, August 5th, 2025

It’s time for modern CSS to kill the SPA - Jono Alderson

SPAs were a clever solution to a temporary limitation. But that limitation no longer exists.

Use modern server rendering. Use actual pages. Animate with CSS. Preload with intent. Ship less JavaScript.

Saturday, July 19th, 2025

I’m more proud of these 128 kilobytes than anything I’ve built since | by Mike Hall | Jul, 2025 | Medium

I don’t normally link to articles on Medium—I respect you too much—and I do wish this were written on Mike Hall’s own site, but this is just too good not to share.

And don’t dismiss this as a nostalgic case study from the past:

At no point did the constraints make the product feel compromised. Users on modern devices got a smooth experience and instant feedback, while those on older devices got fast, reliable functionality. Users on feature phones got the same core experience without the bells and whistles.

The constraints forced us to solve problems in ways we wouldn’t have considered otherwise. Without those constraints, we could have just thrown bytes at the problem, but with them every feature had to justify itself. Core functionality had to work everywhere, and without JavaScript crutches proper markup became essential.

This experience changed how I approach design problems. Constraints aren’t a straitjacket, keeping us from doing our best work; they are the foundation that makes innovation possible. When you have to work within severe limitations, you find elegant solutions that scale beyond those limitations.

Tuesday, July 15th, 2025

(optional.is) Latency and the Sea

Brian’s excellent comparison of network latency and the nervous system of animals:

If an earthquake occurs in California USA, halfway around the globe someone can find out faster than a blue whale detects something has touched its tail.

Friday, January 10th, 2025

Website Speed Test

Here’s a handy free tool from Calibre that’ll give your website a performance assessment.

Wednesday, October 16th, 2024

content-visibility in Safari

Earlier this year I wrote about some performance improvements to The Session using the content-visibility property in CSS.

If you say content-visibility: auto you’re telling the browser not to bother calculating the layout and paint for an element until it needs to. But you need to combine it with the contain-intrinsic-block-size property so that the browser knows how much space to leave for the element.
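As a rough sketch (the selector and pixel value here are just placeholders, not what I actually use), the pairing looks like this:

.long-content {
  content-visibility: auto;
  /* reserve an estimated height so the scrollbar doesn't jump */
  contain-intrinsic-block-size: 400px;
}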

I mentioned the browser support:

Right now content-visibility is only supported in Chrome and Edge. But that’s okay. This is a progressive enhancement. Adding this CSS has no detrimental effect on the browsers that don’t understand it (and when they do ship support for it, it’ll just start working).

Well, that’s happened! Safari 18 supports content-visibility. I didn’t have to do a thing and it just started working.

But …I think I’ve discovered a little bug in Safari’s implementation.

(I say I think it’s a bug with the browser because, like Jim, I’ve made the mistake in the past of thinking I had discovered a browser bug when in fact it was something caused by a browser extension. And when I say “in the past”, I mean yesterday.)

So here’s the issue: if you apply content-visibility: auto to an element that contains an SVG, and that SVG contains a text element, then Safari never paints that text to the screen.

To see an example, take a look at the fourth setting of Cooley’s reel on The Session archive. There’s a text element with the word “slide” (actually the text is inside a tspan element inside a text element). On Safari, that text never shows up.

I’m using a link to the archive of The Session I created recently rather than the live site because on the live site I’ve removed the content-visibility declaration for Safari until this bug gets resolved.

I’ve also created a reduced test case on Codepen. The only HTML is the element containing the SVGs. The only CSS—apart from the content-visibility stuff—is just a little declaration to push the content below the viewport so you have to scroll it into view (which is when the bug happens).
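Stripped right down, the test case amounts to something like this (the class name, sizes, and offset are placeholders, not the exact values in the Codepen):

<div class="notation">
  <svg viewBox="0 0 100 20">
    <text x="0" y="15"><tspan>slide</tspan></text>
  </svg>
</div>

.notation {
  content-visibility: auto;
  contain-intrinsic-block-size: 100px;
  /* push the content below the viewport so it has to be scrolled into view */
  margin-block-start: 150vh;
}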

I’ve filed a bug report. I know it’s a fairly niche situation, but there are some other issues with Safari’s implementation of content-visibility so it’s possible that they’re all related.

Wednesday, June 26th, 2024

Pivoting From React to Native DOM APIs: A Real World Example - The New Stack

One dev team made the shift from React’s “overwhelming VDOM” to modern DOM APIs. They immediately saw speed and interaction improvements.

Yay! But:

…finding developers who know vanilla JavaScript and not just the frameworks was an “unexpected difficulty.”

Boo!

Also, if you have a similar story to tell about going cold turkey on React, you should share it with Richard:

If you or your company has also transitioned away from React and into a more web-native, HTML-first approach, please tag me on Mastodon or Threads. We’d love to share further case studies of these modern, dare I say post-React, approaches.

Tuesday, May 21st, 2024

Speculation rules

There’s a new addition to the latest version of Chrome called speculation rules. This already existed before with a different syntax, but the new version makes more sense to me.

Notice that I called this an addition, not a standard. This is not a web standard, though it may become one in the future. Or it may not. It may wither on the vine and disappear (like most things that come from Google).

The gist of it is that you give the browser one or more URLs that the user is likely to navigate to. The browser can then pre-fetch or even pre-render those links, making that navigation really snappy. It’s a replacement for the abandoned link rel="prerender".

Because this is a unilateral feature, I’m not keen on shipping the code to all browsers. The old version of the API required a script element with a type value of “speculationrules”. That doesn’t do any harm to browsers that don’t support it—it’s a progressive enhancement. But unlike other progressive enhancements, this isn’t something that will just start working in those other browsers one day. I mean, it might. But until this API is an actual web standard, there’s no guarantee.
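For reference, the script-element version looks like this (the URL is just an example):

<script type="speculationrules">
{
  "prerender": [
    { "source": "list", "urls": ["/next-page/"] }
  ]
}
</script>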

That’s why I was pleased to see that the new version of the API allows you to use an external JSON file with your list of rules.

I say “rules”, but they’re really more like guidelines. The browser will make its own evaluation based on bandwidth, battery life, and other factors. This feature is more like srcset than source: you give the browser some options, but ultimately you can’t force it to do anything.

I’ve implemented this over on The Session. There’s a JSON file called speculationrules.json with the simplest of suggestions:

{
  "prerender": [{
    "where": {
        "href_matches": "/*"
    },
    "eagerness": "moderate"
  }]
}

The eagerness value of “moderate” says that any link can be pre-rendered if the user hovers over it for 200 milliseconds (the nuclear option would be to use a value of “immediate”).

I still need to point to that JSON file from my HTML. Usually this would be done with something like a link element, but for this particular API, I can send a response header instead:

Speculation-Rules: "/speculationrules.json"

I like that. The response header is being sent to every browser, regardless of whether they support speculation rules or not, but at least it’s just a few bytes. Those other browsers will ignore the header—they won’t download the JSON file.

Here’s the PHP I added to send that header:

header('Speculation-Rules: "/speculationrules.json"');

There’s one extra thing I had to do. The JSON file needs to be served with a mime-type of “application/speculationrules+json”. Here’s how I set that up in the .conf file for The Session on Apache:

<IfModule mod_headers.c>
  <FilesMatch "speculationrules.json">
    Header set Content-type application/speculationrules+json
  </FilesMatch>
</IfModule>

A bit of a faff, that.

You can see it in action on The Session. Open up Chrome or Edge (same same but different), fire up the dev tools and keep the network tab open while you navigate around the site. Notice how hovering over a link will trigger a new network request. Clicking on that link will get you that page lickety-split.

Mind you, in the case of The Session, the navigations were already really fast—performance is a feature—so it’s hard to gauge how much of a practical difference it makes in this case, but it still seems like a no-brainer to me: taking a few minutes to add this to your site is worth doing.

Oh, there’s one more thing to be aware of when you’re implementing speculation rules. You have the option of excluding URLs from being pre-fetched or pre-rendered. You might need to do this if you’ve got links for adding items to shopping carts, or logging the user out. But my advice would instead be: stop using GET requests for those actions!

Most of the examples given for unsafe speculative loading conditions are textbook cases of when not to use links. Links are for navigating. They’re idempotent. For everything else, we’ve got forms.

Wednesday, April 17th, 2024

Faster Connectivity !== Faster Websites - Jim Nielsen’s Blog

The bar to overriding browser defaults should be way higher than it is.

Amen!

Tuesday, April 16th, 2024

Standing still - a performance tinker | Trys Mudford

What Trys describes here mirrors my experience too—it really is worth occasionally taking a little time to catch the low-hanging fruit of your site’s web performance (and accessibility):

I’ve shaved nearly half a megabyte off the page size and improved the accessibility along the way. Not bad for an evening of tinkering.

Tuesday, March 26th, 2024

Fidinpamp

If you’re a fan of gratuitous initialisms, you’ll love Google’s core web vitals. Just get a load of the obfuscation in the important-sounding metrics like CLS, FCP, LCP, and more.

To be fair to Google, this is a problem in the web performance world in general. Practitioners prefer to talk about TTFB rather than “time to first byte” even though both contain exactly the same number of syllables.

The big news in the web performance community this month is the arrival of a new initialism. INP sounds like one of those pseudo-scientific psychological profiles but it’s meant to stand for Interaction to Next Paint (even if they were to swear off pointless initialisms, you’d still have to pry Pointless Capitalisation from Google’s cold dead hands).

This new metric is a welcome one. It’s replacing first input delay. Sorry, First Input Delay, or FID, one of the few web vital initialisms that can be spoken as a word, making it a true acronym (fortunately fid’s successor, inp, also works as an acronym).

First Input Delay has long outstayed its welcome. It was always an outlier in the core web vitals. It didn’t seem to measure anything actually useful. I know it sounds like it’s measuring the delay until the user can interact with a web page, but when you dive into what it actually does, it’s a mess:

FID measures the time from when a user first interacts with a page (that is, when they click a link, tap on a button, or use a custom, JavaScript-powered control) to the time when the browser is actually able to begin processing event handlers in response to that interaction.

See that word “begin” in there? It’s doing a lot of work. First Input Delay doesn’t measure the lag between the user interaction and the browser response; it only measures the lag between the user interaction and the browser beginning to respond. The actual response could take ages, but that lag doesn’t get measured. Unlike the other core web vitals, this metric is very far removed from what actually matters to the user’s experience.

What the fid where they thinking? How the fid did this measurement ever get included in core web vitals in the first place?

Well, feel free to take what I’m about to say as pure gossip, but I have my sources, I trust ’em, and no, I’m not going to reveal ’em…

It’s because of AMP.

Remember Google AMP? An acronym so pointless they eventually just forgot it ever stood for anything?

The AMP project ended up doing incredible damage to Google’s developer relations. By colluding with the search team to privilege the appearance of AMP pages in the top news carousel, Google effectively blackmailed the entire publishing industry into using their format.

In the end, it didn’t work. It was a shit format. All they did was foster resentment and animosity:

AMP seems to have faded away. Most publishers have started dropping support, and even Google doesn’t seem to care much anymore.

It turns out that Google search wasn’t the only team infected by AMP. The core web vitals team also had to play ball.

Originally they had a genuinely useful metric for measuring the lag between input and response. But guess which pages did terribly? That’s right: AMP pages.

Rather than ship an actually-useful measurement, the core web vitals team instead had to include the broken First Input Delay, brainchild of a certain someone on the AMP team.

Now it all makes sense.

So good riddance to FID. Welcome to INP. And here’s hoping it won’t be much longer till we’re finally burying AMP.

Tuesday, February 27th, 2024

JavaScript Bloat in 2024 @ tonsky.me

This really is a disgusting exclusionary state of affairs.

I hate to be judgy, but I honestly wonder how the people behind some of these decisions can call themselves web developers.

Thursday, February 22nd, 2024

PageSpeed Insights bookmarklet

I’m a little obsessed with web performance. I like being able to check a page’s core web vitals quickly and easily.

Four years ago, I made a Lighthouse bookmarklet. Whatever web page you were on, when you clicked on the bookmarklet you’d get the Lighthouse results for that page. Handy!

It doesn’t work anymore. This is probably because Google are in the loop. Four years is pretty good innings for anything involving that company.

I kid (mostly). Lighthouse itself is still going strong, despite being a Google product. But the bookmarklet needs updating.

Rather than just get Lighthouse results, I figured that the full PageSpeed Insights results would be even better. If your website is in the Chrome UX report, you get to see those CrUX details too.

So here’s the updated bookmarklet:

PageSpeed Insights

Drag that up to your desktop browser’s bookmarks toolbar. Press it whenever you want to test the page you’re on.

Monday, February 19th, 2024

Speedier tunes

I wrote a little while back about improving performance on The Session by reducing runtime JavaScript in favour of caching on the server. This is on the pages for tunes, where the SVGs for the sheetmusic are now inlined instead of being generated on the fly.

It worked. But I also wrote:

A page like that with lots of sheetmusic and plenty of comments is going to have a hefty page weight and a large DOM size. I’ve still got a fair bit of main-thread work happening, but now the bulk of it is style and layout, whereas previously I had the JavaScript overhead on top of that.

Take a tune like Out On The Ocean. It has 27 settings. That’s a lot of SVG markup that needs to be parsed, styled and rendered, even if it’s inline.

Then I remembered a very handy CSS property called content-visibility:

It enables the user agent to skip an element’s rendering work (including layout and painting) until it is needed — which makes the initial page load much faster.

Sounds great! But there are two gotchas.

The first gotcha is that if a browser doesn’t paint the element, it doesn’t know how much space the element should take up. So you need to provide dimensions. At the very least you need to provide a height value. Otherwise when the element comes into view and gets rendered, it pushes down on the content below it. You’d see a sudden jump in the scrollbar position.

The solution is to provide a value for contain-intrinsic-size. If your content is dynamic—from, say, a CMS—then you’re out of luck. You don’t know how long the content is.

Luckily, in my case, I could take a stab at it. I know how many lines of sheetmusic there are for each tune setting. Each line takes up roughly the same amount of space. If I multiply that amount of space by the number of lines then I’ve got a pretty good approximation of the height of the sheetmusic. I apply this with the contain-intrinsic-block-size property.
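In code, that calculation might look something like this hypothetical sketch (the function name and the per-line figure are illustrative, not my actual code):

<?php
// Estimate the intrinsic height from the number of lines of sheetmusic.
function sheetmusic_style(int $lines, int $pixels_per_line = 95): string
{
    return 'style="content-visibility: auto; contain-intrinsic-block-size: '
        . ($lines * $pixels_per_line) . 'px"';
}

echo '<div class="notation" ' . sheetmusic_style(4) . '>';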

So each piece of sheetmusic has an inline style attribute with declarations like this:

content-visibility: auto;
contain-intrinsic-block-size: 380px;

It works a treat. I did a before-and-after check with pagespeed insights on the page for Out On The Ocean. The “style and layout” part of the main thread work went down considerably. Total blocking time went from more than 600 milliseconds to less than 400 milliseconds.

Not a bad result for a little bit of CSS!

I said there was a second gotcha. That’s browser support.

Right now content-visibility is only supported in Chrome and Edge. But that’s okay. This is a progressive enhancement. Adding this CSS has no detrimental effect on the browsers that don’t understand it (and when they do ship support for it, it’ll just start working). I’ve said it before and I’ll say it again: the forgiving error-parsing in HTML and CSS is a killer feature of the web. Browsers just ignore what they don’t understand. That’s what makes progressive enhancement like this possible.

And actually, there’s something you can do for all browsers. Even browsers that don’t support content-visibility still understand containment. So they’ll understand contain-intrinsic-size. Pair that with a contain declaration like this to tell the browser that this chunk of content isn’t going to reflow or get repainted:

contain: layout paint;

Here’s what MDN says about contain:

The contain CSS property indicates that an element and its contents are, as much as possible, independent from the rest of the document tree. Containment enables isolating a subsection of the DOM, providing performance benefits by limiting calculations of layout, style, paint, size, or any combination to a DOM subtree rather than the entire page.

So if you’ve got a chunk of static content, you might as well apply contain to it.
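Putting it all together, a chunk of static content could get a set of declarations like this (the selector and size are placeholders):

.static-content {
  contain: layout paint;
  content-visibility: auto;
  contain-intrinsic-block-size: 380px;
}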

Again, not bad for a little bit of CSS!

Wednesday, January 31st, 2024

SpeedCurve | The psychology of site speed and human happiness

Tammy takes a deep dive into our brains to examine the psychology of web performance. It opens with this:

If you don’t consider time a crucial usability factor, you’re missing a fundamental aspect of the user experience.

I wish that more UX designers understood that!

Tuesday, October 24th, 2023

Ship Faster by Building Design Systems Slower | Big Medium

Josh mashes up design systems and pace layers, like Mark did a few years back. With this mindset, if your product interface and your design system are in sync, that’s not good—either your product is moving too slow or your design system is moving too fast.

The job of the design system team is not to innovate, but to curate. The system should provide answers only for settled solutions: the components and patterns that don’t require innovation because they’ve been solved and now standardized. Answers go into the design system when the questions are no longer interesting—proven out in product. The most exciting design systems are boring.

Friday, September 15th, 2023

Speedy tunes

Performance is a high priority for me with The Session. It needs to work for people all over the world using all kinds of devices.

My main strategy for ensuring good performance is to diligently apply progressive enhancement. The core content is available to any device that can render HTML.

To keep things performant, I’ve avoided as many assets (or, more accurately, liabilities) as possible. No unnecessary images. No superfluous JavaScript libraries. Not even any web fonts (gasp!). And definitely no third-party resources.

The pay-off is a speedy site. If you want to see lab data, run a page from The Session through Lighthouse. To see field data, take a look at data from the Chrome UX Report (CrUX).

But the devil is in the details. Even though most pages on The Session are speedy, the outliers have bothered me for a while.

Take a typical tune page on the site. The data is delivered from the server as HTML, which loads nice and quick. That data includes the notes for the tune settings, written in ABC notation, a nice lightweight text format.

Then the enhancement happens. Using Paul Rosen’s brilliant abcjs JavaScript library, those ABCs are converted into SVG sheetmusic.
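The enhancement step amounts to something like this (the selectors are placeholders, and I’m assuming the library is available as the global ABCJS):

// Grab the ABC notation from the page and render it as SVG sheetmusic.
const abc = document.querySelector('.notes').textContent;
ABCJS.renderAbc(document.querySelector('.notation'), abc);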

So on tune pages there’s an additional download for that JavaScript library. That’s not so bad though—I’m using a service worker to cache that file so there’ll only ever be one initial network request.
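The relevant bit of the service worker is a bog-standard cache-first pattern, something like this (the cache name and file name are placeholders):

self.addEventListener('fetch', (event) => {
  if (event.request.url.endsWith('abcjs.js')) {
    event.respondWith(
      caches.open('scripts').then((cache) =>
        cache.match(event.request).then((cached) =>
          // Serve from the cache if we can; otherwise fetch once and cache it.
          cached || fetch(event.request).then((response) => {
            cache.put(event.request, response.clone());
            return response;
          })
        )
      )
    );
  }
});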

If a tune has just a few different versions, the page remains nice and zippy. But if a tune has lots of settings, the computation starts to add up. Converting all those settings from ABC to SVG starts to take a cumulative toll on the main thread.

I pondered ways to avoid that conversion step. Was there some way of pre-generating the SVGs on the server rather than doing it all on the client?

In theory, yes. I could spin up a headless browser, run the JavaScript and take a snapshot. But that’s a bit beyond my backend programming skills, so I’ve taken a slightly different approach.

The first time anyone hits a tune page, the ABCs get converted to SVGs as usual. But now there’s one additional step. I grab the generated markup and send it as an Ajax payload to an endpoint on my server. That endpoint stores the sheetmusic as a file in a cache.
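The client-side part of that is a sketch like this (the endpoint and selector are hypothetical, not the actual URLs on The Session):

// After abcjs has rendered the SVGs, send the generated markup
// to the server so it can be cached for the next visitor.
const sheetmusic = document.querySelector('.notation').innerHTML;
fetch('/cache-sheetmusic', {
  method: 'POST',
  headers: { 'Content-Type': 'text/html' },
  body: sheetmusic
});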

Next time someone hits that page, there’s a server-side check to see if the sheetmusic has been cached. If it has, send that down the wire embedded directly in the HTML.

The idea is that over time, most of the sheetmusic on the site will transition from being generated in the browser to being stored on the server.

So far it’s working out well.

Take a really popular tune like The Lark In The Morning. There are twenty settings, and each one has four parts. Previously that would have resulted in a few seconds of parsing and rendering time on the main thread. Now everything is delivered up-front.

I’m not out of the woods. A page like that with lots of sheetmusic and plenty of comments is going to have a hefty page weight and a large DOM size. I’ve still got a fair bit of main-thread work happening, but now the bulk of it is style and layout, whereas previously I had the JavaScript overhead on top of that.

I’ll keep working on it. But overall, the speed improvement is great. A typical tune page is now very speedy indeed.

It’s like a microcosm of web performance in general: respect your users’ time, network connection and battery life. If that means shifting the work from the browser to the server, do it!

Tuesday, July 4th, 2023

The Cost Of JavaScript - 2023 - YouTube

A great talk from Addy on just how damaging client-side JavaScript can be to the user experience …and what you can do about it.

The Cost Of JavaScript - 2023

Wednesday, May 3rd, 2023

The intersectionality of web performance

Web performance is an unalloyed good. No one has ever complained that a website is too fast.

So the benefit is pretty obvious. Users like fast websites. But there are other benefits to web performance. And they don’t all get equal airtime.

Business

A lot of good web performance practices come down to the first half of Postel’s Law: be conservative in what you send. Images, fonts, JavaScript …remove what you don’t need and optimise the hell out of what’s left.

That can translate to savings. If you’re paying for the bandwidth every time a hefty file is downloaded, your monthly bill could get pretty big.

So apart from the indirect business benefits of happy users converting to happy customers, there can be a real nuts’n’bolts bottom-line saving to be made by having a snappy website.

Sustainability

This is related to the cost-savings benefit. If you’re shipping less stuff down the wire, and you’re optimising what you do send, then there’s less energy required.

Whether less energy directly translates to a smaller carbon footprint depends on how the energy is being generated. If your servers are running on 100% renewable energy sources, then reducing the output of your responses won’t reduce your carbon footprint.

But there’s an energy cost at the other end too. Think of all the devices making requests to your server. If you’re making those devices work hard—by downloading, parsing, executing lots of JavaScript, for example—then you’re draining battery life. And you can’t guarantee that the battery will be replenished from renewable energy sources.

That’s why sites like the website carbon calculator have so much crossover with web performance:

From data centres to transmission networks to the billions of connected devices that we hold in our hands, it is all consuming electricity, and in turn producing carbon emissions equal to or greater than the global aviation industry. Yikes!

Inclusivity

There comes a point when a slow website isn’t just inconvenient, it’s inaccessible.

I’ve always liked the German phrase for accessible: barrierefrei—free of barriers. With every file you add to a website’s dependencies, you’re adding one more barrier. Eventually the barrier is insurmountable for people with older devices or slower internet connections. If they can no longer access your website, your website is quite literally inaccessible.

Making the case

I’ve noticed that when it comes to making the argument in favour of better web performance, people often default to the business benefits.

I get it. We’re always being told to speak the language of business. The psychology seems pretty straightforward; if you think that the people you’re trying to convince are mostly concerned with the bottom line, use the language of commerce to change their minds.

But that’s always felt reductive to me.

Sure, those people almost certainly do care about the business. Who doesn’t? But they’re also humans. I feel like if you really want to convince them, speak to their hearts. Show them the bigger picture.

Eliel Saarinen said:

Always design a thing by considering it in its next larger context; a chair in a room, a room in a house, a house in an environment, an environment in a city plan.

I think the same could apply to making the case for web performance. Don’t stop at the obvious benefits. Go wider. Show the big-picture implications.

Tuesday, January 24th, 2023

Efficiency over performance

I quite like this change of terminology when it comes to making fast websites. After all, performance can sound like a process of addition, whereas efficiency can be a process of subtraction.

The term ‘performance’ brings to mind exotic super-cars suitable only for impractical demonstrations (or ‘performances’). ‘Efficiency’ brings to mind an electric car (or even better, a bicycle), making effective use of limited resources.

Wednesday, September 21st, 2022

Will Serving Real HTML Content Make A Website Faster? Let’s Experiment! - WebPageTest Blog

Spoiler: the answer to the question in the title is a resounding “hell yeah!”

Scott brings receipts.