Green Bytes: How Enhancing Web Vitals Contributes to Environmental Sustainability


With this talk we will dive into the intersection of web performance optimisation and environmental conservation, focusing on how improving Web Vitals — key indicators of a website's health and user experience — can lead to a more sustainable digital footprint.

We will explore the core Web Vitals: Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS), and how these metrics influence not only the user experience but also the efficiency of web resources. Finally, we'll discuss the direct and indirect environmental impacts of web operations, from the energy consumption of data centres to transmission networks to the billions of connected devices that we hold in our hands, while taking a look at tools that help us calculate a web application's footprint.

This talk was presented at JSNation US 2024.

FAQ

The internet accounts for 3.7% of global CO2 emissions, which is similar to the entire aviation industry.

Larger page weight requires more resources, requests, and network usage, leading to higher power consumption and increased carbon emissions.

Core Web Vitals are a set of standardized metrics introduced by Google to help site owners understand user experience. They focus on loading performance, interactivity, and visual stability.

Tools like Digital Beacon, Website Carbon, and Ecoping.earth can measure a website's carbon footprint.

Developers can optimize images, defer loading of elements outside the viewport, and eliminate layout shifts to improve Core Web Vitals.

The AI industry is expected to increase internet-related CO2 emissions exponentially due to its high energy consumption.

Improving website sustainability can enhance user experience, SEO rankings, and reduce carbon emissions, aligning with environmental goals.

Developers can use green hosting providers, content delivery networks, and efficient caching policies to reduce carbon footprint.

INP measures interactivity by observing interaction events on a page. A good INP score indicates efficient processing power use and quick user feedback.

Optimizing Core Web Vitals leads to smaller page weights, which reduces power usage and carbon emissions.

Dimitris Kiriakakis
28 min
18 Nov, 2024

Video Summary and Transcription
Today's talk focused on the importance of optimizing Web Vitals and performance for both user experience and the environment. The Internet's carbon footprint is significant, with page weight being a key factor. By reducing page weight and improving Core Web Vitals scores, developers can contribute to reducing CO2 emissions. The talk showed how optimizing Web Vitals improved the loading performance, interactivity, and visual stability of a web application, and how profiling helps improve the Interaction to Next Paint score. It emphasized the connection between performance optimization and reducing the carbon footprint of web applications, and recommended tools and practices to measure and reduce that footprint, including asset optimization, green hosting providers, and content delivery networks. The talk also mentioned the need for AI regulations and the role of corporations in prioritizing sustainability, providing valuable insights into the intersection of performance and sustainability in software development.

1. Introduction to Green Bytes

Short description:

Today I want to talk to you about green bytes and how aiming for good web vitals scores benefits the planet. I've been a developer for 11 years, focusing on web applications and performance. Let's discuss the carbon footprint of the web, core web vitals, and tools for eco-friendly web practices.

Hello, everyone. Today I want to talk to you about green bytes and actually how aiming for good web vitals scores can be a good thing for our planet, too.

A few words about myself. My name is Dimitris, and I've been working as a developer for the past 11 years. Currently I work as a full-stack developer for Zeal. Zeal is an EU-based online lottery provider, probably the biggest one in Germany. And over the past few years my main focus has been on web applications and web performance, so occasionally I'll share articles about these topics on my dev.to and Medium profiles.

So today we're going to talk about the carbon footprint of the web. We're going to refresh our knowledge about the core web vitals, go through some use cases that show the correlation between web vitals improvements and carbon footprint reduction, and finally I'm going to share with you some useful tools and websites when it comes to eco-friendly web practices.

2. The Carbon Footprint and Core Web Vitals

Short description:

The Internet contributes 3.7% of worldwide CO2 emissions, similar to the aviation industry. Page weight is a crucial factor in the carbon footprint of the web, impacting resources, network usage, and power consumption. Reducing page weight can have a significant impact, as seen in the case of a plugin developer who reduced 1 kilobyte and saved emissions equivalent to five flights. Core Web Vitals, introduced by Google, focus on loading performance (LCP), interactivity (INP), and visual stability (CLS) metrics to ensure a good user experience.

In case you didn't know, the Internet consumes a lot of electricity, and a lot of electricity comes with a lot of emissions. So who thinks that the Internet's emissions are 1.5% of the global CO2 emissions? Please raise your hands. Okay, no one. Who thinks it's 2.5%? And who thinks it's more than 3.5%? Yeah, most of you got it right. In fact, 3.7% of worldwide CO2 emissions come from the Internet, a number which is similar to the number we get from the entire aviation industry, and this number is expected to increase exponentially in the upcoming years due to the growing AI industry.

So the carbon footprint of the web is caused by the infrastructure, the transfer of data, and the usage of end-user devices. One of the most important factors is the page weight, which refers to the total size of a web page. And this is because bigger page weight means more resources, more requests, more infrastructure points being involved, more network usage, and eventually more power used on the end-user's devices. We can measure the page weight of a website in the network tab of our browser. Maybe I can briefly show you an example here. So if we have a website and we open our developer tools, we can switch to the network tab, clear everything first, and then right-click on the Refresh button and do an empty cache and hard reload, and hopefully after a couple of seconds we will see down here an indication about the website resources that have just been downloaded. And yes, we can also see the page weight, which in this case is 913 kilobytes. So, the average page weight keeps increasing over time. In 2011, we used to have an average page weight of 500 kilobytes on desktop and 200 kilobytes on mobile. Nowadays, the average page weight is 2.6 megabytes on desktop and 2.3 megabytes on mobile. And it will keep growing unless we start optimizing our websites. But how big of an impact can page weight have on a carbon footprint? A developer called Danny van Kooten, the author of some famous WordPress plugins such as Mailchimp for WordPress, estimated that reducing the size of one of his plugins by 1 kilobyte resulted in a carbon footprint reduction equivalent to cancelling five flights from Amsterdam to New York. And given this, reducing the page weight of a popular website or a popular plugin that is used across the internet seems to have some impact.
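To get an intuition for the numbers involved, tools like Website Carbon estimate emissions from the bytes transferred. The sketch below loosely follows the Sustainable Web Design model; the coefficients (roughly 0.81 kWh per GB transferred, roughly 442 g CO2 per kWh of grid electricity) are approximations I am assuming here, not figures from the talk.

```javascript
// Rough per-pageview CO2 estimate from page weight, loosely based on the
// Sustainable Web Design model. Coefficients are approximations.
const KWH_PER_GB = 0.81;       // assumed energy intensity of data transfer
const GRAMS_CO2_PER_KWH = 442; // assumed average grid carbon intensity

function gramsCo2PerView(pageWeightBytes) {
  const gigabytes = pageWeightBytes / 1e9;
  return gigabytes * KWH_PER_GB * GRAMS_CO2_PER_KWH;
}

// The 913 kB page from the demo above:
console.log(gramsCo2PerView(913 * 1024).toFixed(2)); // ≈ 0.33 g CO2 per load
```

Multiply that per-view figure by monthly traffic and the impact of shaving off a few hundred kilobytes becomes tangible.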

Now speaking of reducing the page weight, let us talk a bit about Core Web Vitals. In the domain of web development, delivering a good user experience is essential, yet for many years there were no clear standards of what constitutes a good user experience. This changed in 2020 when Google introduced the Core Web Vitals, a set of standardized metrics that help site owners and developers understand how visitors actually experience their websites. These metrics focus on loading performance, interactivity, and visual stability. And here it's worth mentioning that earlier this year, the metric that was used for interactivity, which used to be First Input Delay, got replaced by Interaction to Next Paint. So loading performance: the related Core Web Vital metric is called Largest Contentful Paint, or LCP. Essentially, it is the render time of the largest text block, image, or element in the user's viewport, relative to when the user first navigated to our website. To provide a good user experience, our website should have an LCP render time of 2.5 seconds or less. When it comes to interactivity, the related Core Web Vital metric is currently Interaction to Next Paint, or INP, which basically observes all the interaction events throughout our page. So if, for example, you have a button that triggers a heavy task, which takes a few seconds to finish, and we give no immediate feedback to the user, besides using a lot of processing power on the end-user's device, we will also get a bad INP score. To provide a good user experience, our websites should not have any interaction that delays the next paint by more than 200 milliseconds. And when we refer to visual stability, the related Core Web Vital metric is Cumulative Layout Shift, or CLS.
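The thresholds just described can be collected into a tiny helper. The "good" boundaries (2.5 s, 200 ms, 0.1) come from the talk; the "poor" boundaries (4 s, 500 ms, 0.25) are Google's published values, added here for completeness.

```javascript
// Classify a Core Web Vitals measurement against Google's thresholds.
const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  INP: { good: 200,  poor: 500  }, // milliseconds
  CLS: { good: 0.1,  poor: 0.25 }, // unitless layout-shift score
};

function rate(metric, value) {
  const t = THRESHOLDS[metric];
  if (!t) throw new Error(`Unknown metric: ${metric}`);
  if (value <= t.good) return 'good';
  if (value <= t.poor) return 'needs improvement';
  return 'poor';
}

console.log(rate('LCP', 13200)); // the demo app's 13.2 s LCP → "poor"
console.log(rate('CLS', 0.05));  // → "good"
```

In production these values would typically come from a real-user-monitoring source such as the `web-vitals` library, but the classification logic is the same.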

3. Optimizing Web Vitals and Performance

Short description:

During the page load, a stable page structure is crucial for a good Cumulative Layout Shift (CLS) score. An Angular application was intentionally optimized to improve its Web Vitals performance. With image optimization, deferred loading of elements, and elimination of layout shifts, the page weight was reduced by 70%. The performance timeline shows improved loading and stability, while the Web Vitals scores can be viewed in the browser's performance tab.

In a few words, if a user navigates to our website and there are elements moving around during the page load, we will have a bad CLS score. A good CLS score means that the page structure remains stable during the page load, regardless of when the elements are actually being loaded. So now that we've refreshed our Web Vitals knowledge, I'd like to show you a simple application that demonstrates the correlation between Web Vitals optimization and the footprint of a website.
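As a quick aside, keeping the structure stable usually comes down to reserving space for late-arriving elements up front. A minimal sketch (file and class names are illustrative):

```html
<!-- Explicit width/height lets the browser compute the aspect ratio and
     reserve the box before the image arrives, so nothing gets pushed around.
     The same idea applies to banners, ads, videos, and embeds. -->
<img src="banner.webp" width="1200" height="400" alt="Main banner">

<style>
  /* Placeholder box for a video that loads later */
  .hero-video { aspect-ratio: 16 / 9; width: 100%; }
</style>
```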

So as part of a blog post of mine, I developed this Angular application, which intentionally performs really badly. As we can see from Google Lighthouse, we get an LCP of 13.2 seconds, and a CLS of 0.367, while the threshold for a good user experience is 0.1. Also in the timeline, we can see that there are elements outside the viewport being loaded because of the layout shift. So during the page load, we can see that there are six elements on the Pokemon grid here, of which the last two are then being pushed outside the viewport when the main banner kicks in. And we could have avoided this. We could have avoided loading these two elements if we had a more stable layout. My own goal while writing this blog post was to improve the Web Vitals scores of this application as much as possible. If you want to find out more details about the complete optimization process, you can check the full article under this URL.

But briefly for this presentation, I worked on the images. I generated smaller variants for mobile viewports, and I converted them to WebP format, which offers more efficient compression for our images. Then I prioritized the loading of the Largest Contentful Paint image compared to other elements in the viewport, and I deferred the loading of other elements that are outside the initial viewport. Finally, I eliminated the layout shifts to have a better CLS score. And after these improvements, the page weight was decreased by 70%. Here are the results I got in terms of performance. This is the timeline showcasing the page load before and after the optimization under the same network conditions and throttling. The page structure has been stabilized. There are no additional elements entering the viewport other than the ones we really need for the initial viewport. And the LCP event, the Largest Contentful Paint event, happens in a timely manner.
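The image work described above maps to a handful of standard HTML attributes. A sketch, assuming responsive WebP variants have already been generated (file names are illustrative):

```html
<!-- LCP hero image: WebP variants per viewport, fetched with high priority -->
<picture>
  <source media="(max-width: 600px)" srcset="hero-mobile.webp" type="image/webp">
  <source srcset="hero-desktop.webp" type="image/webp">
  <img src="hero-desktop.jpg" width="1200" height="600"
       fetchpriority="high" alt="Hero image">
</picture>

<!-- Below-the-fold images: defer until they approach the viewport -->
<img src="pokemon-042.webp" width="300" height="300" loading="lazy" alt="Card">
```

`fetchpriority="high"` nudges the browser to load the LCP image early, while `loading="lazy"` keeps off-screen images out of the initial page weight entirely.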

If we go to our browser and open the developer tools, most of the browsers nowadays have this performance tab. And actually, in the latest Chrome versions, we also get these nice widgets here that show us the Web Vitals score values that we get on our local system. You might notice that the Interaction to Next Paint is basically empty, and this is because we have no interactions yet on the web application. If we want to get the Interaction to Next Paint score, we have to start interacting with our web application. So, if I click here or if I try to open the menu, you will see that the Interaction to Next Paint score keeps getting updated. And we also get this view here, which basically tracks all the interaction events that happened in our web application.
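Outside of DevTools, a rough way to spot interaction handlers that are likely to hurt INP is simply to time their synchronous work. A minimal sketch; the 50 ms budget is a commonly cited long-task guideline, not a figure from the talk, and the handler names are illustrative:

```javascript
// Wrap a handler and warn when its synchronous work blocks the main thread
// for long enough to delay the next paint.
function timeHandler(name, handler) {
  return (...args) => {
    const start = performance.now();
    const result = handler(...args);
    const duration = performance.now() - start;
    if (duration > 50) {
      console.warn(`${name} blocked the main thread for ${duration.toFixed(1)} ms`);
    }
    return result;
  };
}

// Usage: wrap a click handler before attaching it to a button.
const onBuyClick = timeHandler('buy-click', () => {
  // ...heavy synchronous work would go here...
  return 'done';
});
console.log(onBuyClick());
```

For real measurements the browser's own event timing (as shown in the Chrome widgets) is more accurate, since it includes rendering work after the handler returns.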

4. Interaction to Next Paint and Profiling

Short description:

To improve the Interaction to Next Paint score, interact with the web application and track the events. Profiling provides a timeline of the page load and structure, helping to detect and fix layout shift issues. The Largest Contentful Paint occurs within 800 milliseconds, and the related element can be identified in the DOM.

If we want to do the profiling that I showed you before, we just go here and click record and reload. And hopefully after a couple of seconds, the profiling will finish and we will get this nice timeline that shows how our page load develops over time and how the page structure looks. This can be really helpful when we detect and we want to fix layout shift issues. And we also get the most important events here. So, for example, we get the first paint and the first contentful paint. And the one that is most relevant to us right now, which is the largest contentful paint. In this case, it happens within 800 milliseconds. And from this view, we can also detect the actual largest contentful paint element as the related node. And if we click it, we will also get it in the DOM.

5. Improving LCP and Carbon Footprint

Short description:

To improve the LCP score, prioritize loading the element compared to other viewport elements. The Angular project aimed to improve web vital scores in the latest versions. A significant carbon footprint reduction of over 50% was achieved. The Zeal websites in 2021 had poor scores with an LCP of 13 seconds and a CLS of 1.775. Mixed-up architecture, outdated Angular versions, and lack of loading priority contributed to the mess. Despite some improvements, Zeal's popular website, Lotto24, generated 25 tons of CO2 emissions in 2022. Fast-forward to 2024, and Web Vital Scores are close to passing the assessment. The Lotto24 website's carbon footprint is significantly reduced to 0.39 grams of CO2 per visit, saving 16 tons of emissions annually. The work done over the past two years is truly worthwhile, focusing on both user experience and carbon footprint.

So, now we know that if we want to improve the LCP score of our web application, we have to prioritize the loading of this element compared to the other elements that show in the viewport. The goal of this Angular project was to improve the web vital scores and to figure out how these improvements can happen in the latest Angular versions.

But then since I had the app deployed, both the bad version and the optimized one, I went on and did the carbon footprint check. And as you can see, I got a significant reduction, bigger than 50%. For such an application, which gets a couple of visits a month, most of them from Medium readers, saving a couple of grams of emissions is not a big deal. But can you imagine what would happen on a more popular website? And here comes the second use case.

In 2021, we found out that the Zeal websites were in a really bad condition, and we had to do something about it. We had some pretty bad scores. Our LCP was at 13 seconds. And the CLS was at 1.775. This might be a world record. I don't think anyone else ever had a worse CLS score. If you think otherwise, please approach me and let me know. And this mess was due to many things. We used to have mixed-up architecture. A part of our app was written in AngularJS. Another part was written in Angular version 2. We had a pile of spaghetti services that were injected all over the place. And most importantly, we had no way of establishing a loading priority among the elements of our web application.

One year later, in 2022, and while the optimization project was still in progress, I found out about websitecarbon.com, and even though we had already made some improvements, our footprint was still bad. The carbon footprint for Zeal's most popular website, Lotto24, was 1.13 grams of CO2 per page load. This number might sound small, but if we consider that in 2022 the Lotto24 page was loaded 22 million times, it actually generated 25 tons of CO2 emissions, which is equivalent to burning 3,000 gallons of gasoline. Fast-forward to 2024, and now we finally have Web Vitals scores that we can talk about and not be ashamed of. We're still working on it, but we are close to finally passing the Web Vitals assessment for the 75th percentile.
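The back-of-the-envelope arithmetic behind those numbers is easy to reproduce. The gasoline equivalence below uses the EPA's factor of roughly 8,887 g of CO2 per gallon of gasoline burned, which is my assumption; the talk only states the rounded end results.

```javascript
// Annual CO2 from per-pageview grams, plus a rough gasoline equivalence.
const GRAMS_CO2_PER_GALLON = 8887; // assumed EPA factor for gasoline

function annualTonnes(gramsPerView, viewsPerYear) {
  return (gramsPerView * viewsPerYear) / 1e6; // grams → metric tonnes
}

function gallonsEquivalent(tonnes) {
  return (tonnes * 1e6) / GRAMS_CO2_PER_GALLON;
}

// Lotto24 in 2022: 1.13 g/view × 22 million views ≈ 25 t ≈ 2,800 gallons
const t2022 = annualTonnes(1.13, 22e6);
console.log(t2022.toFixed(1), Math.round(gallonsEquivalent(t2022)));
```

The same two functions reproduce the 2024 comparison: at 0.39 g per view the per-visit footprint drops by roughly two thirds.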

I should have probably mentioned that if we want to pass the Web Vitals assessment for our web application, we should be able to provide the Core Web Vitals scores that we mentioned earlier for at least 75 percent of our users. As I said, we're still working on it, but as you can see, the carbon footprint of the Lotto24 website is already looking way better, and now 0.39 grams of CO2 is produced every time someone visits our home page. And if we consider the amount of visits that we had in 2024, this already translates to an annual reduction of 16 tons of CO2 emissions. That's the equivalent of saving 2,000 gallons of gasoline every year. This really makes us feel like the work we did over the past two years was actually worth it.

Summarizing, to be honest with you, until some time ago, when I optimized the website, I was solely focused on passing the Web Vital Scores to offer better user experience and for better SEO ranking. I didn't really take the Carbon footprint under consideration, but the numbers speak for themselves.

6. Reducing Carbon Footprint and Tools

Short description:

Improving Web Vitals scores reduces the carbon footprint. Asset optimization and deferred loading are necessary. Better INP scores mean less processing power and no unnecessary element loading. Other ways to reduce the footprint include switching to green hosting providers, using content delivery networks, and efficient caching policies. Tools like Digital Beacon, Website Carbon, Ecoping.earth, and The Green Web Foundation can help measure and reduce the carbon footprint. Join the ClimateAction.Tech community to stay updated.

When we improve the Web Vitals scores of a website, we are also decreasing its carbon footprint. Improving the Largest Contentful Paint of a website is not possible without asset optimization and deferred loading. When we have a better INP score, we use less processing power on the end user's devices, and eliminating layout shifts ensures that no elements outside the initial viewport are loaded before we actually need them. The bottom line is that Web Vitals optimization comes with smaller page weight, and smaller page weight comes with a smaller carbon footprint, so it is good for our planet too.

Besides improving the Web Vitals scores, there are also other things that can help reduce the carbon footprint. We have the option of switching to green hosting providers. Google Cloud and Azure claim to be carbon neutral by now, and AWS already has several carbon-neutral locations. All three providers are taking significant steps to use only renewable energy, and I also think they have a goal of being completely carbon neutral by 2030. We can also use a content delivery network to serve our assets and reduce round trips. Say our website is hosted in New York and we have users from Australia. With a content delivery network in place, our visitors would receive a copy of the asset from a nearby data center, rather than waiting for the asset to travel across the Pacific. Another thing to consider is to use efficient caching policies to further reduce the carbon footprint for our returning visitors.
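As a concrete illustration of an efficient caching policy, fingerprinted static assets can be cached for a full year while HTML entry points stay revalidated. A hypothetical nginx fragment (paths and structure are illustrative, not from the talk):

```nginx
# Hypothetical policy: cache fingerprinted assets aggressively so returning
# visitors re-download nothing that has not changed.
location /assets/ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}

# HTML entry points: always revalidate so new deploys show up immediately.
location / {
    add_header Cache-Control "no-cache";
}
```

This works because fingerprinted file names change on every deploy, so a one-year `max-age` can never serve stale content.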

Finally, here are some tools and websites I want to share with you. First, we have the tools that help us measure the carbon footprint of a website. Digital Beacon is my personal favorite and also offers data for returning visitors. Website Carbon is another one that I use rather often. Then there is also Ecoping.earth, which is more like an enterprise solution for measuring the carbon footprint. Besides the regular footprint checks that we get from the other tools, it can also scan multiple paths and it offers scheduled footprint checks on a daily, weekly, or monthly basis. The Green Web Foundation is another one that can help us identify if our hosting provider is using green energy. And last but not least, there is also a community where you can receive updates about climate actions and green software-related events. If you are interested in the topic, I would highly recommend checking it out and getting involved. The community is called ClimateAction.Tech.

7. Performance and Sustainability Connection

Short description:

And this was basically it. I have a deep interest in this space. The tie between performance and sustainability is a straight connection. Striving for performance optimization reduces the carbon footprint of our web applications. Ongoing research explores the use of highly compressed assets, such as WebP.

And this was basically it. I think that we will move soon to questions. If you want to check out the slides, feel free to scan the QR code and connect with me on LinkedIn, or GitHub, Instagram, Medium, or Twitter.

Thank you very much for that talk. I'll be very frank with you. When we were getting organized and we were deciding who is going to introduce what. By the way, I do want to apologize for being late. I was on my way here and someone stopped me. Long story. But I definitely wanted to do this because I have a deep interest in this space. Because I was like, oh wow, when I discovered what was going on, it was very interesting. So, thank you for that conversation. And it is relatively tied to performance, without a doubt.

Yes, exactly. Do you believe, and we will get some questions. Do you believe, how much do you feel like the tie between performance and sustainability is like a straight connection? There is definitely a straight connection. I mean, you saw the use cases, right? Even for simple applications, the really small ones that we have for our own portfolio. Or the enterprise application that we are working for during our main job. There is always correlation. If we strive for performance optimization, we will also get results in terms of the carbon footprint of our web applications.

Absolutely. And once upon a time, we did spend a lot of time around the page weight conversation, just kind of pre-sustainability. And people were like, well, x, y, and z. And you will see why there is always ongoing research. And I do call it research because it is very important. Around the ability to use highly compressed assets. So we might think that we are done at having JPEGs and that is all we need. But this is part of the reason why we tend to have these sort of ongoing conversations around getting more compressed images being used. So that is why WebP came around, as much as people had particular comments about that.

QnA

AVIF and Layout Shift Impact

Short description:

AVIF is part of the reason. The carbon calculation of the site should include developer and test visits. Layout shift impacts carbon footprint by ensuring a stable page structure without unnecessary elements.

And that is why AVIF came around as well. And that is why people had, well, AVIF, whatever. We are fine. This is part of the reason.

I know there was a question here and I will get to it. But let me ask, because I was trying to understand here. So this question by Amy. You are probably in the room somewhere, I can't see you. Does the carbon calculation of your site include developer and test visits, is that correct, Amy? Yes. Oh, it is totally cool. So does the carbon calculation of your site include developer and test visits? Probably it does. Did you want to expand on that? Yeah, I could. But I am not that deep into what the percentage of these visits could be. But I am pretty sure they should be included. Because most of our QAs, they are also doing tests on the live site. So, okay.

Let's move on to the next one. Let me see here. How slash why else does layout shift impact carbon footprint besides the lazy loading gains that you talked about? Let me see. How else does layout shift impact carbon footprint besides the lazy loading gains that you talked about? Yeah, it is not. Maybe I can answer already. It is not only about the lazy loading part. But when we have a stable page structure, then we make sure that no additional elements are entering the viewport. Right? Because, I mean, if we don't have a stable page structure, for example, we have a main banner and in the main banner we have a large video. We will wait for a couple of seconds until the video is there. And then we load all the rest of the elements. And then the video kicks in and puts everything to the bottom. And we have loaded all these elements without actually needing them.

Sustainability and AI Regulations

Short description:

Part of the sustainability conversation is about timely loading of assets. There should have been regulations for AI already, but they have been kept as a back burner conversation. Join the Climate Tech community for more discussions and meetups.

So ultimately, part of the sustainability conversation is about timely loading of assets. Yes? Timely what? The timely loading of assets. Timely loading of assets, yes. Oh, OK, it's happening here, too. I didn't realize.

OK, let me see here. So I did that one. Alex: AI uses a lot of energy and is used more and more every day. Do you think there will be any regulations due to the environmental concerns, or will devs have to rethink AI? Great question, by the way. It's a great question indeed. There should have been some regulations already, I think. We recently had a discussion in the community that I shared in my slides about the regulatory part of it. As far as I know, there is not such a thing yet, but hopefully there should be. Yeah. And I would imagine there is. There will be. There will be. And I've had these conversations a few times, in fact. And for now, unfortunately, it sounds like they've sort of kept it as a bit of a back-burner conversation, because they definitely want this sort of momentum of innovation to take place. And I think to have immediate sustainability concerns might hold them back. But I think they're going to have to face that no matter what, because there are a lot of resources being spent. Yes, absolutely. And if you have more ideas or you want to get involved in these discussions, again, I invite you to join the ClimateAction.Tech community. Really great community, by the way. Are you a member? What is it? Are you a member? I believe I am. I get emails and all that. Nice. There are also some meetups that are being organized in New York, actually, in-person meetups. Absolutely.

Web Almanac and Corporate Sustainability

Short description:

The Web Almanac is a digital book that comes out every year. There is a sustainability chapter in the latest edition. Convincing corporations to prioritize sustainability depends on their business goals. Some companies with sustainability practices are considered preferred partners. It was initially hard to convince my company, but now they see the impact and push for sustainability more. If you have comments, go to Slido.com and leave them in the poll section.

And I think you had some quotes from the Web Almanac in your slide. I thought I'd seen the HTTP Archive's Web Almanac. Yes, right. So it's kind of like this digital book that comes out every year. They skipped, I think, last year. The last edition was 2022, but they launched 2024 last week. And there's a fairly long sustainability chapter there. I'll actually tweet it out and I'll use a hashtag to talk about it.

I think we'll get to the last question here. I just want to see what I was talking about before. Okay. I got that. Oh, I like this question as well. What ways are there to convince corporations to prioritize sustainability? Are there any carbon footprint incentives or tax breaks they could or would be interested in? Have you heard of anything on that side? Not really. And I would say it depends on the business goals of each company. But what I would suggest to developers like ourselves is that we can push this topic as it gets us better user experience and better SEO ranking for our web applications. Because again, if we strive for better performance and better user experience, we also get the carbon footprint reduction as a gift.

And I'm going to add to that as well, simply because I've done a bit of reading about this. Some companies are actually gleefully sharing that they do have some sustainability initiatives and whatnot. And what is then happening, and this may sort of become a bit more du jour, is that some companies that have sustainability practices are being considered preferred partners. So I know gov.uk has been sort of working around this space for a while, and they do intend on working with organizations that have sustainability policies. To add on top, speaking for my company, it was hard to convince them to take such initiatives. But now that they see the impact, they like it and they push it more. Yeah, yeah, absolutely. Awesome.

Thank you very much for your conversations, folks. Again, if you do have comments to make about this talk, you could go to Slido.com. Is that it? Yeah. And punch in 1118 and you can go to the... I think it was like a poll section or something like that, and leave some comments there and we'll go from there. Thank you very much for your presentation, sir. Thank you. Give it up for Dimitris, please.

Check out more articles and videos

We constantly think of articles and videos that might spark Git people interest / skill us up or help building a stellar career

A Guide to React Rendering Behavior
React Advanced 2022React Advanced 2022
25 min
A Guide to React Rendering Behavior
Top Content
This transcription provides a brief guide to React rendering behavior. It explains the process of rendering, comparing new and old elements, and the importance of pure rendering without side effects. It also covers topics such as batching and double rendering, optimizing rendering and using context and Redux in React. Overall, it offers valuable insights for developers looking to understand and optimize React rendering.
Speeding Up Your React App With Less JavaScript
React Summit 2023
32 min
Top Content
Miško Hevery, the creator of Angular and AngularJS, discusses the challenges of website performance and JavaScript hydration. He explains the differences between client-side and server-side rendering and introduces Qwik as a solution for efficient component hydration. Miško demonstrates examples of state management and intercommunication using Qwik. He highlights the performance benefits of using Qwik with React and emphasizes the importance of reducing JavaScript size for better performance. Finally, he mentions the use of Qwik in both MPA and SPA applications for improved startup performance.
React Concurrency, Explained
React Summit 2023
23 min
Top Content
React 18's concurrent rendering, specifically the useTransition hook, optimizes app performance by allowing non-urgent updates to be processed without freezing the UI. However, there are drawbacks, such as longer processing time for non-urgent updates and increased CPU usage. The useTransition hook works similarly to throttling or debouncing, making it useful for addressing performance issues caused by many small components. Libraries like React Query may require alternative APIs to handle urgent and non-urgent updates effectively.
How React Compiler Performs on Real Code
React Advanced 2024
31 min
Top Content
I'm Nadia, a developer experienced in performance, re-renders, and React. The React team released the React Compiler, which eliminates the need for manual memoization. The compiler optimizes code by automatically memoizing components, props, and hook dependencies. It shows promise in managing changing references and improving performance. Real-app testing and synthetic examples have been used to evaluate its effectiveness. The impact on initial load performance is minimal, but further investigation is needed for interaction performance. The React Query library simplifies data fetching and caching. The compiler has limitations and may not catch every re-render, especially with external libraries. Enabling the compiler can improve performance, but manual memoization is still necessary for optimal results. There are risks of overreliance and messy code, but the compiler can be adopted file by file or folder by folder with thorough testing.
Optimizing HTML5 Games: 10 Years of Learnings
JS GameDev Summit 2022
33 min
Top Content
PlayCanvas is an open-source game engine used by game developers worldwide. Optimization is crucial for HTML5 games, focusing on load times and frame rate. Texture and mesh optimization can significantly reduce download sizes. GLTF and GLB formats offer smaller file sizes and faster parsing times. Compressing game resources and using efficient file formats can improve load times. Framerate optimization and resolution scaling are important for better performance. Managing draw calls and using batching techniques can optimize performance. Browser DevTools, such as Chrome and Firefox, are useful for debugging and profiling. Detecting device performance and optimizing based on specific devices can improve game performance. Apple is making progress with WebGPU implementation. HTML5 games can be shipped to the App Store using Cordova.
The Future of Performance Tooling
JSNation 2022
21 min
Top Content
Today's talk discusses the future of performance tooling, focusing on user-centric, actionable, and contextual approaches. The introduction highlights Addy Osmani's expertise in performance tools and his passion for DevTools features. The talk explores the integration of user flows into DevTools and Lighthouse, enabling performance measurement and optimization. It also showcases the import/export feature for user flows and the collaboration potential with Lighthouse. The talk further delves into the use of flows with other tools like WebPageTest and Cypress, offering cross-browser testing capabilities. The actionable aspect emphasizes the importance of metrics like Interaction to Next Paint and Total Blocking Time, as well as improvements in Lighthouse and performance debugging tools. Lastly, the talk emphasizes the iterative nature of performance improvement and the user-centric, actionable, and contextual future of performance tooling.

Workshops on related topic

React Performance Debugging Masterclass
React Summit 2023
170 min
Top Content
Featured Workshop
Ivan Akulov
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
Next.js 13: Data Fetching Strategies
React Day Berlin 2022
53 min
Top Content
Workshop
Alice De Mauro
- Introduction
- Prerequisites for the workshop
- Fetching strategies: fundamentals
- Fetching strategies, hands-on: fetch API, cache (static vs dynamic), revalidate, suspense (parallel data fetching)
- Test your build and serve it on Vercel
- Future: server components vs client components
- Workshop easter egg (unrelated to the topic, calling out accessibility)
- Wrapping up
React Performance Debugging
React Advanced 2023
148 min
Workshop
Ivan Akulov
Building WebApps That Light Up the Internet with QwikCity
JSNation 2023
170 min
Workshop (Free)
Miško Hevery
Building instant-on web applications at scale has been elusive. Real-world sites need tracking, analytics, and complex user interfaces and interactions. We always start with the best intentions but end up with a less-than-ideal site.
QwikCity is a new meta-framework that allows you to build large-scale applications with constant startup performance. We will look at how to build a QwikCity application and what makes it unique. The workshop will show you how to set up a QwikCity project, how routing works with layouts, how the demo application fetches data and presents it to the user in an editable form, and finally how to use authentication: all of the basic parts of any large-scale application.
Along the way, we will also look at what makes Qwik unique, and how resumability enables constant startup performance no matter the application complexity.
High-performance Next.js
React Summit 2022
50 min
Workshop
Michele Riva
Next.js is a compelling framework that makes many tasks effortless by providing many out-of-the-box solutions. But as soon as our app needs to scale, it is essential to maintain high performance without compromising maintenance and server costs. In this workshop, we will see how to analyze Next.js performance and resource usage, how to scale the app, and how to make the right decisions while designing the application architecture.
Maximize App Performance by Optimizing Web Fonts
Vue.js London 2023
49 min
Workshop (Free)
Lazar Nikolov
You've just landed on a web page and you try to click a certain element, but just before you do, an ad loads on top of it and you end up clicking that thing instead.
That…that’s a layout shift. Everyone, developers and users alike, know that layout shifts are bad. And the later they happen, the more disruptive they are to users. In this workshop we're going to look into how web fonts cause layout shifts and explore a few strategies of loading web fonts without causing big layout shifts.
Table of Contents:
- What's CLS and how is it calculated?
- How can fonts cause CLS?
- Font loading strategies for minimizing CLS
- Recap and conclusion
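To illustrate the first point above: each individual layout shift is scored as its impact fraction (the share of the viewport affected) multiplied by its distance fraction (the largest element move relative to the viewport), and CLS reports the largest sum of shift scores within a single session window. A minimal TypeScript sketch of that arithmetic, with hypothetical numbers and already-grouped windows (not a browser API):

```typescript
// A single layout shift, in the terms used by the CLS definition.
interface Shift {
  impactFraction: number;   // portion of the viewport affected (0..1)
  distanceFraction: number; // largest move distance / viewport size (0..1)
}

// Each shift contributes impact fraction x distance fraction.
function shiftScore(s: Shift): number {
  return s.impactFraction * s.distanceFraction;
}

// CLS is the largest sum of shift scores within one session window
// (shifts less than 1s apart, window capped at 5s). Here each inner
// array is assumed to be one already-grouped window.
function cls(windows: Shift[][]): number {
  return windows.reduce(
    (max, w) => Math.max(max, w.reduce((sum, s) => sum + shiftScore(s), 0)),
    0
  );
}

// Example: a late-loading banner pushes content down by 25% of the
// viewport and affects 75% of it.
const score = cls([[{ impactFraction: 0.75, distanceFraction: 0.25 }]]);
console.log(score); // 0.1875 -- above the 0.1 "good" threshold
```

A swapped-in web font with different metrics produces exactly this kind of shift, which is why font loading strategy affects CLS.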