So, yeah, the subject of my talk is when optimizations backfire. It's about cases when you apply some common optimization and it, in turn, makes the app slower, not faster.
So first, let's talk about CDNs. Who here uses a CDN in production? Hm, quite a lot of people. Nice. So one time, a while ago, I was a developer on a big, complex web app, and the app was slow. Slow apps aren't nice, so we had a project to make the app faster, and as part of that project, we decided to ship a CDN.
Now, a CDN is a service that brings your files closer to the user, right? If your server is in the US and the user is in India, then with a CDN, the request for the bundle doesn't need to go all the way to the US. It can just go to a CDN server close to the user in India. Our server was in the US, and our users were all around the world, so we decided to try this optimization. We configured the buckets to upload the build output into, we configured the CDN in front of them, and we pointed all the URLs in the app to that CDN server. Basically, our index.html, instead of looking like this, started looking like this, with the CDN origin in front of all the URLs.
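Just to make that concrete, here's a rough sketch of what pointing the URLs at the CDN amounts to. The origin cdn.example.com, the toCdnUrl helper, and the asset path are all made up for illustration; they're not the real values from our app.

```ts
// Hypothetical sketch: rewriting a same-origin asset path to a CDN origin.
// The origin and the asset path below are placeholders.
const CDN_ORIGIN = "https://cdn.example.com";

function toCdnUrl(assetPath: string): string {
  // new URL() resolves the path against the CDN origin.
  return new URL(assetPath, CDN_ORIGIN).toString();
}

// Before: <script src="/static/bundle.abc123.js"></script>
// After:  <script src="https://cdn.example.com/static/bundle.abc123.js"></script>
console.log(toCdnUrl("/static/bundle.abc123.js"));
```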
So far so good, right? But when we ran performance tests, we discovered that suddenly the site had become not faster, but slower. Does anyone, by the way, have an idea why? Based on the change we made, does anyone have an idea why this happened? No hands. Good. Well, good for me. So let's try to investigate this, as if I were investigating it today with the knowledge and the methods I have now.
So what's important to know about the Lighthouse score is that it's not just an abstract performance number, right? It's calculated from these five metrics. If one metric gets worse, the whole score gets worse. If one metric gets better, the whole score gets better. There's even a calculator for this. Now, most metrics in these two tests, before adding the CDN and after adding the CDN, stayed more or less the same. Largest Contentful Paint even got better, which is good, but one metric, First Contentful Paint, got significantly worse. First Contentful Paint is a metric that measures how quickly the first content renders: it's the time from the moment the site starts loading to the moment the site renders its first content, and it got worse.
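To give a rough idea of how that weighting works, here's a small sketch. The weights are approximately those of recent Lighthouse versions and do change between versions, so treat the exact numbers as an assumption.

```ts
// Sketch of a Lighthouse-style score: a weighted average of per-metric
// scores (each between 0 and 1). Weights are approximate and version-dependent.
const weights = {
  firstContentfulPaint: 0.10,
  speedIndex: 0.10,
  largestContentfulPaint: 0.25,
  totalBlockingTime: 0.30,
  cumulativeLayoutShift: 0.25,
};

type Metric = keyof typeof weights;

function overallScore(metricScores: Record<Metric, number>): number {
  return (Object.keys(weights) as Metric[]).reduce(
    (total, metric) => total + weights[metric] * metricScores[metric],
    0,
  );
}

// If the First Contentful Paint score drops from 0.9 to 0.5 while everything
// else stays the same, the overall score drops by 0.10 * 0.4 = 0.04,
// i.e. about 4 points out of 100.
```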
So now, whenever I have a site where rendering metrics like First Contentful Paint or Largest Contentful Paint get worse, what I like to do is look at the frame-by-frame rendering of that site and at the network waterfall of that site, and then compare them to try to figure out what the heck has happened. So in this case, if I compare both versions of the site, one without the CDN and one with it, I'll notice two big things. The first one is that the responses now actually take less time, as intended. These tests were run from Canada, for example, using webpagetest.org, and with the CDN, the server was closer to the user, so the round trip was shorter and the file took less time to download than without the CDN.
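If you want to check those download times in the browser itself rather than in WebPageTest, the Resource Timing API exposes per-request timings. This is just a sketch, and the "bundle" filter is a placeholder for whatever your asset is actually called.

```ts
// Sketch: logging how long each matching request took, from request start to
// response end, using the browser's Resource Timing API (run in the page's
// DevTools console).
for (const entry of performance.getEntriesByType("resource")) {
  const timing = entry as PerformanceResourceTiming;
  // "bundle" is a placeholder filter for the real asset name.
  if (timing.name.includes("bundle")) {
    const elapsedMs = timing.responseEnd - timing.requestStart;
    console.log(`${timing.name}: ${elapsedMs.toFixed(0)} ms`);
  }
}
```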