The Performance Impact of Generated JavaScript


When was the last time you peeked inside the dist folder to inspect the JavaScript generated by your framework or bundler?


The reality of modern JavaScript development, with its reliance on bundlers, frameworks, and compilers, is that the JavaScript you write is not the same as the JavaScript that runs in the browser. Tools like TypeScript and Babel let you write modern, maintainable code while still supporting a variety of older browsers, environments, and runtimes, but it can be hard to tell what ends up in the final bundle. Understanding and optimizing the generated JavaScript produced by your build process is crucial to maximizing performance.


Join Abhijeet Prasad, maintainer of Sentry's open-source error and performance monitoring JavaScript SDKs, as he walks through the performance and bundle size implications of generated JavaScript and techniques you can use to optimize it. He'll cover transpilation nuances, tree shaking, minification, and loading strategies so you can deliver better experiences for your users.

This talk was presented at JSNation US 2024; check out the latest edition of this JavaScript conference.

FAQ

What is generated JavaScript, and why does it matter?
Generated JavaScript is the output from a build process that transforms source code into a version that runs in the browser. It's important to consider because it impacts bundle size and performance.

How does minification improve performance?
Minification reduces JavaScript file size by removing unnecessary characters and shortening variable names, which can decrease load times and improve performance.

What is down-compilation, and how does it affect bundle size?
Down-compilation converts source code to a more backwards-compatible version. This can increase bundle size, especially when targeting older browsers, due to necessary polyfills and code transformations.

Why are TypeScript enums considered expensive?
TypeScript enums are considered expensive because they generate additional code for reverse lookups, which increases bundle size and cannot be easily minified.

How does compression help?
Compression, using methods like gzip or Brotli, reduces the size of JavaScript files sent over the network, resulting in faster load times for users.

Which tools help track bundle size?
Tools like Codecov bundle analysis, the size-limit GitHub action, and bundle analyzers for Webpack and Rollup help track and analyze JavaScript bundle size.

What concern do developers commonly raise about Sentry's SDK?
Developers often express concerns about the large bundle size of Sentry's JavaScript SDK, particularly the front-end browser version.

Why is managing JavaScript bundle size important?
Managing JavaScript bundle size is important because it directly affects page load speed and user experience. Smaller bundles lead to faster loading times and a more responsive user interface.

What strategies reduce JavaScript bundle size?
Strategies to reduce JavaScript bundle size include using smaller libraries (e.g., Preact instead of React), lazy loading JavaScript, and switching from client-side to server-side rendering.

Abhijeet Prasad
17 min
21 Nov, 2024

Video Summary and Transcription
Today's Talk discussed the performance impact of generated JavaScript and the importance of bundle size in relation to page load speed and user experience. The use of a build process, minification, and avoiding unnecessary polyfills were highlighted as strategies to reduce bundle size. API design considerations, such as avoiding deeply nested object lookups and using functions and objects instead of classes, were discussed in relation to minification. The concepts of down-compilation and transpilation were explained, with a focus on the challenges and benefits they present. The Talk also emphasized the need to avoid using TypeScript enums and instead use string constants, as well as the importance of compressing code and tracking bundle size changes. Bundle analyzers were recommended for visualizing bundle contents and component connections.

1. Performance Impact of JavaScript

Short description:

Today, we're going to be talking about the performance impact of generated JavaScript. The JavaScript SDK provided by Sentry is often considered too big, leading to concerns about bundle size. Bundle size directly affects page load speed and user experience. To reduce bundle size, developers analyze the code they write and use, considering smaller alternate libraries, lazy loading, or removing JavaScript altogether.

Hey, everybody. Welcome. Today, we're going to be talking about the performance impact of generated JavaScript. My name is Abhijeet. I currently work at Sentry, maintaining our open-source, MIT-licensed JavaScript SDKs. These are some of the most used SDKs in the world, so we have a little bit of experience with how people build apps and use JavaScript.

One of the biggest things people come to us about, all the time, is that the JavaScript SDK we give them, especially our front-end browser one, is just way too big. And what I mean by big is bundle size. It's so large that they're hesitant to load it in their app. Now, this is actually a really valid concern and something we're looking to fix all the time, but it's important to ask: why do people keep coming to us about this? It's because bundle size is pretty important.

Bundle size has a direct correlation with how long it takes to load a page, and therefore it's also directly correlated with your user experience. You want websites that have fewer loading spinners, that load faster, that feel snappier. All of that directly affects how a user perceives and uses your site. So managing your bundle size, loading less JavaScript, is really important. Now, we know this is something people want to reduce. So how do people typically reduce their bundle size? Usually, you take a look at the code you write and use. This means analyzing whatever you're adding to your app and determining whether it's important or not.

2. Generated JavaScript and Minification

Short description:

In modern JavaScript apps, the user runs generated JavaScript. You need to understand the generated JavaScript to improve your bundle size. Using a build process is useful for tree shaking, transpiling, and injecting logic at build time. Minification makes JavaScript assets smaller by removing unnecessary tokens and shortening names.

So, typically, this will be things like using smaller alternate libraries, for example switching from React to Preact; lazy loading your JavaScript so that you're only loading exactly what you need, and whenever you happen to use a component or some other logic, you lazy load it; or removing JavaScript altogether and eliminating its usage from your app. The most common scenario for that is moving from client-side rendering your application to server-side rendering it. But an important theme here is that these are all strategies for the code you write. And the code you write is not actually the code that runs in the browser, is it?
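As a quick aside on the lazy-loading strategy above, it usually boils down to a dynamic import(): the extra code is fetched only when it's actually needed, so it never lands in the initial bundle. A minimal sketch, assuming a hypothetical ./chart module with a renderChart export:

```ts
// Minimal lazy-loading sketch. `./chart` and `renderChart` are hypothetical;
// the point is that the chart code becomes a separate chunk, loaded on demand.
async function showChart(container: HTMLElement) {
  const { renderChart } = await import('./chart'); // fetched only on first use
  renderChart(container);
}

document.getElementById('show-chart')?.addEventListener('click', () => {
  const root = document.getElementById('chart-root');
  if (root) showChart(root);
});
```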

In modern JavaScript apps, this is completely different. In modern JavaScript apps, the user runs generated JavaScript. This means you need to look at and understand the generated JavaScript to improve your bundle size, not just the JavaScript that you wrote. Now, what do I mean by generated JavaScript? I've brought it up a couple of times now as we've talked together. I mean via a build process. For most modern JavaScript stacks, whether you're client-side rendered or server-side rendered with a framework like Next.js, Nuxt, or SvelteKit, you take some input JavaScript and TypeScript, pass it into a bundler like Vite, Webpack, or Rollup, and you get some JavaScript at the end. Using a build process is really useful. You can tree-shake out unused JavaScript, transpile from TypeScript to JavaScript, support older browsers by injecting polyfills or down-compiling newer browser APIs, and you can inject logic at build time, which is super useful for statically built sites. Everybody pretty much uses a build process at this point. You're probably using Vite or Webpack or one of those things under the hood now. You can go no-build, but everybody running huge apps is running a build process. And so there's a lot of improvement we can make in how builds generate JavaScript. Here we have five main areas of improvement: minification, down-compilation, transpilation, compression, and tree shaking. We're going to be skipping tree shaking this time, but I'll touch upon it briefly at the end. So first is minification, which is the process of making your JavaScript assets as small as possible, typically by removing unnecessary tokens and shortening variable or function names. You can see an example here. On the left, we go from a JavaScript function that has an object inside of it, and we minify it down, removing comments and whitespace and shortening names wherever possible. The tricky thing with minification, and where the optimization part comes in, is that not everything can be minified or shortened. You cannot minify reserved keywords like typeof, function, return. You can't minify object keys because they're needed to do lookups on your object.
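To make that concrete, here's a minimal sketch of what a minifier does. The minified line in the comment is illustrative of typical output, not copied from a specific tool:

```ts
// Before minification: comments, whitespace, and long identifier names all cost bytes.
function createUser(firstName: string, lastName: string) {
  // Build the user record
  const user = { firstName: firstName, lastName: lastName };
  return user;
}

// Roughly what a minifier emits (illustrative):
//   function c(n,r){return{firstName:n,lastName:r}}
// The local names shrink, but the object keys and the reserved keywords
// `function` and `return` have to survive.
```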

3. Minification Strategies

Short description:

API design matters for minification. Class methods cannot be minified, even if considered private. Avoid deeply nested object lookups. Use config to mangle based on regex. Use functions and objects instead of classes. Arrays are even better than objects for minification. React hooks illustrate this well.

Otherwise, you can't do lookups. And you can't minify class methods because you have to call the methods by their name. And so if you shorten them, they no longer work. This actually means that API design matters a lot, and how you design your programs so that they can be minified more easily can affect your bundle size a lot.

Here's an example of the class methods. On the left is an example API class, something we actually used to use in the Sentry SDK, but I de-Sentryfied it a lot. And on the right is a minified version. I prettified it, keeping the whitespace, so that it's easier to read. You can see that in the minified version, a lot of the names still stay, because all of these class methods can't get minified. They're part of the public API of this API class. Even things that we consider private, like encodedAuth, still don't get minified, because of how JavaScript works.

So, what are some strategies so we can write more minification-friendly code? Well, first of all, try to avoid deeply nested object lookups and builder patterns. On the left here, you can see that we have this complicated lookup on event.exception.values.type. And we want to make sure that everything exists properly so we do this Boolean operation. But all of these values, exception.values.type, and of course the SentryError here, are completely unminifiable. And so we end up with a lot of redundant information in our bundle. Given that we're already using a try catch, we can just shorten this to a single line. And if we hit a type error, it'll hit the catch block instead.
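A minimal sketch of that refactor, with the event shape assumed from the description (not the actual Sentry SDK code):

```ts
type Event = { exception?: { values?: { type?: string } } };

// Before: every property name in this guard is unminifiable, and each one
// is repeated for the existence checks.
function isSentryError(event: Event): boolean {
  return Boolean(
    event.exception &&
      event.exception.values &&
      event.exception.values.type === 'SentryError'
  );
}

// After: if we're already inside a try/catch, let a missing property throw
// and fall through to the catch block instead.
function isSentryErrorShort(event: Event): boolean {
  try {
    return event.exception!.values!.type === 'SentryError';
  } catch {
    return false; // a TypeError from the failed lookup lands here
  }
}
```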

Another way to write more minification-friendly code is to use config to mangle based on a regex. For example, if we go back to the class methods example: if we say, hey, everything that starts with an underscore should be minified, we can suddenly minify encodedAuth and all of its usages. Most bundlers have config options that let you set up custom mangling based on a regular expression (see the sketch after this paragraph). Lastly, and this is where the API design component comes in, you can use functions and objects instead of classes. Classes, of course, have their whole public API, and every public method can't be minified. But if you use functions, the function names can be minified, because they're simply top-level functions, and all that matters is where they're called and what they're called with. Even better than objects, because object keys can't be minified, is actually using arrays. So if you have some internal data structure that you know is commonly used, consider making it an array instead of an object. React hooks actually do this really well. Things like useState and useReducer return an array, so the destructured values can be renamed freely by a bundler. A great example of comparing minification-friendly libraries is Zod versus Valibot.
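Here's a minimal sketch of that kind of mangling config, assuming a Vite build with Terser as the minifier; Webpack and Rollup expose equivalent Terser options through their own plugins:

```ts
// vite.config.ts — sketch of regex-based property mangling with Terser.
// Anything starting with "_" (e.g. a private _encodedAuth field) gets shortened.
import { defineConfig } from 'vite';

export default defineConfig({
  build: {
    minify: 'terser', // opt into Terser instead of the default esbuild minifier
    terserOptions: {
      mangle: {
        properties: { regex: /^_/ },
      },
    },
  },
});
```

Be careful with property mangling: it renames every matching property across the whole build, so it only works if the underscore convention is applied consistently.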

4. Down-Compilation and Transpilation

Short description:

Zod and Valibot offer different approaches to API design. Down-compilation is converting source code to a more backwards-compatible version. It can result in larger JavaScript bundles. Optional chaining may require down-compilation for older browsers. Consider using Boolean conditions instead. Check your generated JavaScript for patterns that down-compile poorly. Avoid unnecessary polyfills by updating your browserslist and using modern bundlers. Transpilation converts source code between languages.

So Zod is a really popular schema validation library, and Valibot is kind of a competitor to it. You can actually see the two different approaches to API design. For Zod, you have this builder pattern where string, email, and endsWith are chained off the z export, and these chained methods cannot be minified. So you end up with all of them in your bundle, and if you're using Zod a lot and it's spread around everywhere, that adds up over time. Meanwhile, for Valibot, these are just top-level functions that are exported, so string, email, and endsWith in the Valibot example will all be minified. It requires this kind of top-level API design change, but suddenly you have a much more minification- and bundle-size-friendly library.
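A minimal sketch of the two API styles side by side; the Valibot call uses its pipe-based API, and the exact function names may vary between versions:

```ts
import { z } from 'zod';
import * as v from 'valibot';

// Zod: the validators are methods chained off the returned schema object,
// so `string`, `email`, and `endsWith` survive minification at every call site.
const EmailWithZod = z.string().email().endsWith('@example.com');

// Valibot: the validators are plain top-level functions, so a bundler is
// free to shorten their names (and tree-shake the ones you never import).
const EmailWithValibot = v.pipe(v.string(), v.email(), v.endsWith('@example.com'));
```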

Next, let's think about down-compilation. Down-compilation is the process of converting source code to a more backwards-compatible version of that source code. Unfortunately, more backwards-compatible JavaScript can be bigger. Let's say you need to target IE11, so you're emitting ES5-compatible JavaScript. That actually means no classes, because ES5 didn't have classes yet; that's an ES6 feature. So you might have to polyfill them or down-compile them into alternate JavaScript. A great example of this is optional chaining. On the very left here, you can see the optional chaining feature, which allows you to do deeply nested lookups on objects. But if you target something older than ES2020, which is when optional chaining came out, because you want to support more browsers, you'll end up down-compiling your optional chaining to something that looks like this. This is super expensive, and you can imagine that if you use optional chaining a lot, especially in some big, deeply nested statement, you would end up with a ton of conditionals in here. Instead, especially if you use TypeScript, you can probably just replace this with a simple Boolean condition, which is what we do in our Sentry SDK codebase. We actually lint against using optional chaining and force everybody to use Boolean conditions instead. The biggest thing here is to really look at how your JavaScript is getting generated, make sure nothing looks strange or off, and disallow patterns that don't down-compile well when you need to support more browsers. For polyfills, because a lot of people just start injecting them unnecessarily, take a look at what you're bundling and update your browserslist, or use modern bundlers. Please don't use Webpack 4 anymore.
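A minimal sketch of the trade-off, with an assumed event shape; the down-compiled code in the comment is illustrative of typical TypeScript/Babel output for pre-ES2020 targets, not an exact copy:

```ts
type Event = { exception?: { values?: { type?: string } } };

// Modern source: a single optional chain.
function getType(event?: Event) {
  return event?.exception?.values?.type;
}

// Down-compiled for a pre-ES2020 target, every `?.` expands to roughly
// `x === null || x === void 0 ? void 0 : x...`, once per link in the chain:
//   var _a, _b;
//   return (_b = (_a = event === null || event === void 0 ? void 0 : event.exception)
//     === null || _a === void 0 ? void 0 : _a.values) === null || _b === void 0
//     ? void 0 : _b.type;

// The plain Boolean condition stays the same size on every target:
function getTypeCompat(event?: Event) {
  return event && event.exception && event.exception.values
    ? event.exception.values.type
    : undefined;
}
```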

Next, we have transpilation. This is the process of converting source code from one language to another. We used to have a lot of these: CoffeeScript, BuckleScript, ReasonML, things like that.

5. TypeScript Enums and Compression

Short description:

TypeScript enums are expensive and should be avoided. Use string constants instead. Compress your code for savings. Test changes against gzipped files. Track your bundle size with Codecov bundle analysis.

But nowadays, to the average front-end developer, this pretty much means TypeScript to JavaScript. For what it's worth, the TypeScript transpiler is pretty good. But one thing to note is that TypeScript enums are pretty expensive, and I recommend not using them. On the left, we have an enum. Looks pretty simple, but on the right, we can see what it generates. This is because enums allow reverse lookups. It's really expensive, and oftentimes the enum values can't be minified at all either, so there's a double whammy there. It's pretty much easier to just not use them; use string constants and build your enums that way. Strings also have the benefit of compressing really well.
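A minimal sketch of the cost. The generated output in the comment is illustrative of what tsc emits for a numeric enum; string-constant unions avoid it entirely:

```ts
// A numeric TypeScript enum...
enum Status {
  Ok,
  Error,
}

// ...compiles to an IIFE that also builds the reverse lookup
// (Status[0] === "Ok"), and those string keys can't be minified:
//   var Status;
//   (function (Status) {
//     Status[Status["Ok"] = 0] = "Ok";
//     Status[Status["Error"] = 1] = "Error";
//   })(Status || (Status = {}));

// String constants emit almost nothing at runtime, and the repeated
// string literals compress well under gzip/Brotli:
const STATUS_OK = 'ok';
const STATUS_ERROR = 'error';
type StatusName = typeof STATUS_OK | typeof STATUS_ERROR;
```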

Compression is the last thing we can talk about here. The browser automatically decompresses JavaScript assets, as well as CSS and fonts, that are compressed with gzip, Brotli, or deflate. So, alongside minifying your code, you should always be compressing it, for big savings when users load your resources over the network. What ends up happening is that repeated strings or code compress really well, because the increased similarity helps the gzip and Brotli algorithms. For example, if we switch this enum to repeated string constants, that actually improves its gzipability. gzip and the other compression algorithms, though, can be a little bit of a black box; it's confusing. So the biggest recommendation here is, as you're examining your bundle and trying to fix your polyfills, your down-compilation, and your minification strategies, test your changes against the gzipped output. And if you're not already doing it, please compress your JavaScript before sending it to your users.
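A minimal sketch of testing against compressed output with Node's built-in zlib; 'dist/bundle.js' is a placeholder for your own build artifact:

```ts
// Compare raw, gzip, and Brotli sizes of a built bundle, since the
// compressed size is what actually travels over the network.
import { readFileSync } from 'node:fs';
import { gzipSync, brotliCompressSync } from 'node:zlib';

const raw = readFileSync('dist/bundle.js');
console.log('raw   :', raw.length, 'bytes');
console.log('gzip  :', gzipSync(raw).length, 'bytes');
console.log('brotli:', brotliCompressSync(raw).length, 'bytes');
```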

So we covered a lot of ground, but there's a lot more to look at. Really, the best way to start exploring and investigating is to read more generated JavaScript. Get comfortable with reading minified code and see what is actually emitted from your build processes. Sometimes, if you take a look, you can find some really great quick wins. In general, though, please start tracking your bundle size. I've been to many different conferences by now, and I always ask this question: how many of you are tracking your bundle size? And the answer is disappointingly low. I recommend using Codecov's bundle size tracking. You can track it over time, and it'll also give you a PR comment. But I'm a little biased, because I helped build it.

6. Bundle Size Tracking and Visualization

Short description:

Use the size limit action to track bundle size changes. Bundle analyzers like Webpack and Rollup can help visualize bundle contents and component connections.

A great alternative is the size-limit action, which is what we use in the Sentry SDK. It leaves a PR comment on GitHub that basically says: this was the previous bundle size, this is the new bundle size, and here's what changed. If you want to dive in and don't want to only read the generated JavaScript, then I recommend using the bundle analyzers for your bundler. There's a Webpack one and a Rollup one, which also works for Vite. These are really great ways to visually see what's in your bundle and the connections between different components.
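Two minimal sketches for that workflow. First, a size-limit config (shown here as a .size-limit.js file; the same entries can also live under a "size-limit" key in package.json), with a placeholder path and budget:

```js
// .size-limit.js — fail the check when the bundle grows past the budget.
module.exports = [
  {
    path: 'dist/index.js', // placeholder: your built entry point
    limit: '10 KB',        // budget the size-limit check compares against
  },
];
```

And a bundle analyzer wired into a Vite/Rollup build, assuming the rollup-plugin-visualizer package (webpack-bundle-analyzer is the Webpack equivalent):

```ts
// vite.config.ts — emit an interactive treemap of the bundle after each build.
import { defineConfig } from 'vite';
import { visualizer } from 'rollup-plugin-visualizer';

export default defineConfig({
  plugins: [
    visualizer({ filename: 'dist/stats.html', gzipSize: true }),
  ],
});
```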
