Ladle: The Story About Modules and Performance

The bigger they are, the slower it gets. I am talking about your applications and the bundling process. Fortunately, there might be a better future: one without bundlers. This talk is about JavaScript modules, Vite and how we built Ladle, a speedy tool for your React stories.

Video Summary and Transcription
Ladle is an open-source tool that enhances the development and testing of React components, serving as a powerful alternative to Storybook. It leverages Vite, a development environment that optimizes JavaScript modules for faster server startup and incremental rebuilds. JavaScript modules, also known as ESM, improve web development by supporting asynchronous loading and dynamic imports, which enhance page interactivity. Vite excels in performance by eliminating the need for bundling, unlike traditional tools like Webpack. Ladle offers features like accessibility testing, responsive design checks, and dynamic component control. It supports customization without initial setup, making it versatile for various projects. The future of frontend tooling is expected to be dominated by JavaScript modules, with emerging trends like using Rust and Go for JavaScript compilation to boost speed. The video also highlights the benefits of using unbundled JavaScript modules and the role of tools like Ladle in modern web development.

FAQ

What is Vite and why is it faster than Webpack?
Vite is a new development environment that leverages JavaScript modules, which do not require bundling before being served to browsers. This results in instant dev server startups, unlike Webpack, which has to parse and resolve all modules to create a bundle, leading to slower startup times as the application grows.

What is Ladle?
Ladle is an open-source tool for creating and testing React components through stories, built on top of Vite. It serves as an alternative to Storybook, offering faster startup times and improved performance thanks to Vite's efficient handling of JavaScript modules.

What are the benefits of JavaScript modules?
JavaScript modules, supported by all modern browsers, load asynchronously, so pages are not blocked and stay interactive. They support dynamic imports and tree shaking, so only the necessary code is loaded, and they are standardized across all browsers and environments, improving compatibility and reducing the need for additional tooling.

What performance benefits does Vite provide?
Vite provides major performance benefits by eliminating the need to parse and bundle the full code base before server startup, so the server launches immediately regardless of codebase size. It also supports incremental rebuilds that stay fast, improving developer productivity and experience.

What features does Ladle offer?
Ladle offers features like accessibility checks, responsive design testing through viewport adjustments, dynamic control of component properties, and event logging. It also supports themes, source code previews, and structured metadata exports for automation and testing.

Can Ladle be customized?
Yes. Ladle requires no initial configuration but still allows customization: it supports adding fonts and stylesheets and gives full access to Vite's configuration. It also adapts to different monorepo setups and has a robust programmatic API, making it versatile for various project requirements.

What does the speaker predict about the future of frontend tooling?
Vojtech Miksu predicts that JavaScript modules will continue to dominate, reducing the need for heavy processing before code reaches browsers. He also anticipates increased use of languages like Rust and Go for compiling JavaScript, aiming for further speed improvements in frontend development.

How does Ladle organize stories?
Ladle automatically uses file names and export names to organize stories in its UI. Developers can customize story organization through parameters like storyName and the default export's title, allowing for flexible and structured component testing.

1. Introduction to Open Tool for React Components

Short description:

Hello everyone, my name is Vojtech Miksu. I work at Uber as a web infrastructure engineer. Today, I will tell you about a new open-source tool that supercharges developing and testing your React components. This talk has 4 sections. The first is a short history lesson about JavaScript features we've been missing. Then we talk about JavaScript modules, Vite, and Ladle. Finally, we will wrap this up with some future predictions. JavaScript was missing a concept of modularization. Node.js adopted CommonJS, but it has weaknesses for browsers.

Hello everyone, my name is Vojtech Miksu. I work at Uber as a web infrastructure engineer, and today I will tell you about a new open-source tool that supercharges developing and testing your React components.

This talk has 4 sections. The first is a short history lesson about JavaScript features we've been sorely missing for a long time. Then we talk about JavaScript modules, also known as ES6 modules or ESM. The third section introduces Vite, a new type of bundler, followed by an introduction to Ladle, a tool we built on top of Vite. And finally, we will wrap this up with some future predictions.

Do you remember what 1995 browsers looked like? I don't, but they were very different for sure. It's not surprising that JavaScript was missing some features when it was introduced. The one feature that's relevant to this talk is the way JavaScript is loaded into browsers: you have to use the script tag, and the code itself can be inlined or point to a file. This was fine 10 or 15 years ago, when JavaScript was used to add a bit of interactivity, but when we build modern frontend applications, it causes some serious issues. The files are loaded and executed synchronously, and their order matters. Top-level variables end up in the global scope, so it's easy for two unrelated libraries to cause naming collisions. Every time you create a new file, you have to load it through an additional script tag and connect it through the global scope. There's no easy way to eliminate unused code either.
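
For illustration, here is a minimal sketch of that problem (the file names and variables are made up for this example): two unrelated scripts loaded with classic script tags share one global scope and silently overwrite each other.

```js
// analytics.js, loaded via <script src="analytics.js"></script>
var config = { endpoint: "/track" }; // top-level var becomes window.config

// widget.js, a second unrelated script loaded the same way, after analytics.js
var config = { theme: "dark" }; // silently overwrites window.config from analytics.js

// index.html has to list every file in the right order, and each script
// blocks parsing while it downloads and executes:
//   <script src="analytics.js"></script>
//   <script src="widget.js"></script>
```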

JavaScript was missing a concept of modularization. This was an especially big problem for server-side JavaScript, known as Node.js. Some better system for splitting and encapsulating code was badly needed. So Node.js adopted CommonJS. The syntax should be very familiar to anyone who has touched JavaScript in recent years, and it addresses a lot of issues, but not all of them. It introduces a concept of code providers and consumers. It's also a whole philosophy, and it was used to create the biggest package registry for code sharing, called npm. It enables dynamic code loading. However, there are some major weaknesses that make it unusable for browsers. For example, there are file system references like the usage of __dirname. But the biggest issue is that module resolution and loading need to be done synchronously.
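
As a refresher, here is a minimal CommonJS sketch (the file names are made up): one file provides code via module.exports, another consumes it via require, and resolution happens synchronously, which works on a server's file system but would block a browser.

```js
// math.js, the provider: anything not exported stays private to this file
const add = (a, b) => a + b;
module.exports = { add };

// app.js, the consumer: require() resolves and loads the module synchronously,
// which is fine on a server's file system but would block rendering in a browser
const { add } = require("./math");
console.log(add(1, 2)); // 3
```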

2. Introduction to JavaScript Modules and Vite

Short description:

CommonJS was a better system than JavaScript ever had, allowing developers to split code into modules without worrying about browser support. JavaScript modules were later introduced as an official standard, supported by all modern browsers and runtimes. They work asynchronously, load additional modules only when needed, and can be easily tree shaken. Despite some compatibility issues, it's clear that JavaScript modules are the future. Vite is a new development environment that takes advantage of unbundled JavaScript modules, resulting in instant dev server startup. Vite uses esbuild and Rollup plugins to convert existing libraries from CommonJS and to compile away syntax like JSX and TypeScript.

That would block rendering in browsers and lead to really bad experiences. CommonJS was developed independently of the ECMAScript standards body, so it never became a core part of JavaScript. However, it was still a better system than JavaScript ever had, and developers wanted to use server code in their frontend applications. So bundlers like Webpack were introduced. Webpack can analyze and resolve all CommonJS modules before creating a single bundled file that is served to browsers. This was a big win: developers can use CommonJS to split code into modules and don't have to worry about browser support for it.

But we were still missing an official, standardized way that would work in browsers and other runtimes without additional tooling. So JavaScript modules were introduced. Today, they are supported by all modern browsers and runtimes. You can load them by setting the type attribute of the script tag to module. And the best part is that browsers understand the import-export syntax and load additional modules when needed. Modules work asynchronously and don't prevent pages from becoming interactive. There's a promise-based API to import modules dynamically. The code can be easily tree shaken: only the bits you use need to be loaded. It works everywhere, across all browsers and environments. And it is an official standard now.
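
A minimal sketch of what that looks like (the file names, including chart.mjs, are made up): the entry module is loaded with <script type="module" src="app.mjs"></script>, and the browser follows the imports from there.

```js
// math.mjs: exports are static, so tree shaking can drop anything that's never imported
export const add = (a, b) => a + b;
export const unused = () => {}; // never imported anywhere, so it can be eliminated

// app.mjs: the entry module; the browser resolves and fetches its imports asynchronously
import { add } from "./math.mjs";
console.log(add(1, 2));

// dynamic import returns a promise, so heavy code can be loaded only when it's needed
document.querySelector("#chart")?.addEventListener("click", async () => {
  const { renderChart } = await import("./chart.mjs"); // chart.mjs is a hypothetical heavy module
  renderChart();
});
```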

Don't get me wrong. There are still some issues when it comes to compatibility with CommonJS and the existing ecosystem, so adoption can sometimes be cumbersome. However, it's clear that this is the future of JavaScript.

So what is Vite? It's a new development environment built on the fact that JavaScript modules don't need to be bundled before being served to browsers. Bundling was a major and very slow part of Webpack, so with Vite the dev server startup can be instant. Some bundling and compilation is still needed: a lot of existing libraries need to be converted from CommonJS, and we also need to compile away things like JSX and TypeScript. Vite uses esbuild and Rollup plugins for that. Let's compare the Webpack and Vite startup processes side by side. This is the old Webpack approach.
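
For reference, a minimal Vite configuration for a React project might look like the sketch below; @vitejs/plugin-react handles the JSX transform, while Vite pre-bundles CommonJS dependencies with esbuild behind the scenes.

```ts
// vite.config.ts, a minimal sketch for a React project
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()], // JSX and Fast Refresh; TypeScript types are stripped per file on the fly
});
```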

3. Introduction to Ladle for React Component Testing

Short description:

The code base needs to be parsed and modules resolved before creating a bundle. Vite starts the server immediately and serves the entry module to the browser. Ladle is an open-source tool built on top of Vite for developing and testing React components through stories. It provides a UI for navigating and testing different stories, and also includes accessibility checks and responsive design testing.

The code base needs to be fully parsed and all modules resolved before creating a bundle. Only then can the server be started and the bundle served. The bigger your application gets, the longer the startup time is.

On the other hand, Vite starts the server immediately and serves the entry module to the browser. Since the browser knows how to follow imports, it loads additional modules only when needed. Vite only needs to do light preprocessing, like stripping TypeScript types, before sending these files back to the browser. So it doesn't matter how big your code base is: the server always starts instantly, and incremental rebuilds also stay very, very fast.

We love these new performance benefits, so it made sense to put Vite to the test and use it as a foundation for an alternative to Storybook. We built an open-source tool called Ladle: a tool for developing and testing React components through stories, built from the ground up on top of Vite.

Let's do a demo. We are starting with an empty folder, so let's initialize our project. We also need to add some dependencies: react, react-dom and Ladle. By default, Ladle looks into the src folder, and specifically for files ending with .stories.tsx. So let's create a file called welcome.stories.tsx.

Stories are just React components, but they need to be exported. So let's export a story called First that renders a heading. Of course, this could be any React component, for example a component from your design system. And now we can start Ladle with the command ladle serve. This immediately starts a Vite dev server, and this is Ladle's UI. On the left side you can find your component, and on the right side there's the navigation listing all discovered stories. By default, the file names and the export names are used to create this navigation, but this can also be changed in code through the storyName parameter and the default export's title.
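
With react, react-dom and the @ladle/react package installed (assumed to be Ladle's npm name), the story file from the demo might look roughly like this; the heading text is made up, and the storyName and default-export title are the optional overrides mentioned above.

```tsx
// src/welcome.stories.tsx, a minimal sketch of a Ladle story file
import * as React from "react";

export const First = () => <h1>Welcome to Ladle</h1>;

// optional overrides for how the story shows up in the navigation
First.storyName = "First Story";
export default { title: "Welcome" };
```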

Let's add some additional stories to demonstrate other features. This story has a button that's not very accessible; as you can see, the color contrast is not great. You can run an accessibility check with axe by clicking on this icon. Axe tells us there is one violation and gives us details, so we can debug it and fix it. Another feature is useful for testing responsive design.

4. Introduction to Ladle Features

Short description:

Ladle allows you to set different viewports and test different sizes. It provides controls for displaying and testing variations of a single component. It also offers features for logging event handlers, switching themes, previewing source code, and exporting metadata. Ladle is a single-dependency package that combines a Vite plugin, a React UI, and a CLI.

It allows you to set different viewports and test different media queries. Through this add-on, you can change the viewport and test different sizes. You can also set a default viewport size through the story's meta parameter width.
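
A sketch of what that might look like in code, assuming the meta width parameter works as described in the talk; the component and the preset value are illustrative, so check Ladle's docs for the exact options.

```tsx
// a story pinned to a narrow default viewport
import * as React from "react";

export const MobileCard = () => <p>Looks good on small screens</p>;
MobileCard.meta = { width: "xsmall" }; // assumed width preset (or pixel value); verify against Ladle's docs
```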

Another feature is useful when you want to display and test different variations of a single component. In this case, we are changing the value of size. This feature is called controls, and right now we can change this one variable called size and pick from three different values. As you can see, even without touching the code, we can see different variations. And this is how you set it up in code, through the API called argTypes: you define which variables you want to make dynamic, you define the options, and then the value gets passed as a prop into the story itself, where it can be used inside your component. There are other control types you can use, like booleans, strings and radios. So this is very flexible for things like design systems, when you want to display different variations of a single component but don't want to create multiple stories.
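
A sketch of that controls setup, following the Component Story Format argTypes convention that Ladle supports; the component, styling and option values are made up, and the Story type import assumes Ladle ships TypeScript types under that name.

```tsx
// src/button.stories.tsx: the selected size is injected into the story as a prop
import * as React from "react";
import type { Story } from "@ladle/react";

type Props = { size: "small" | "medium" | "large" };

export const Sizes: Story<Props> = ({ size }) => (
  <button style={{ fontSize: size === "large" ? 24 : size === "medium" ? 16 : 12 }}>
    Click me
  </button>
);

Sizes.argTypes = {
  size: {
    options: ["small", "medium", "large"], // the values offered by the control
    control: { type: "select" },
    defaultValue: "medium",
  },
};
```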

Another feature is great for logging event handlers. For example, this button is using an action handler, and once we start clicking on it, you get this notification showing what event was fired and its details. The action is a function exported from Ladle and can simply be passed to your handlers. And there are some other features: you can switch between light and dark theme, switch to full screen, and switch between left-to-right and right-to-left. You can also preview the source code of the story; it highlights the specific story and lines of code, and it shows the location of the story. And finally, Ladle exposes an endpoint called meta.json. This gives you structured information about what stories are in your instance, which can be extremely useful for further automation and testing. For example, we use this file to run a Playwright script that opens each URL and takes a snapshot, so we can use it for visual snapshot testing. As you can see, this metadata also exports information like the width: when you set the default viewport, not only does your story get displayed in that viewport, but the value also gets exported in this file. This can be very useful for further testing automation.
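
A rough sketch of that kind of automation is below. The shape of meta.json, the ?story= URL parameter and the 61000 port are assumptions, so check them against your own Ladle instance; the Playwright calls themselves (test, page.goto, toHaveScreenshot) are standard.

```ts
// stories.spec.ts, a visual snapshot sketch driven by Ladle's meta.json
import { readFileSync } from "node:fs";
import { test, expect } from "@playwright/test";

// meta.json downloaded beforehand from the running Ladle instance,
// e.g. http://localhost:61000/meta.json (61000 is assumed to be the default port);
// assumed shape: { stories: { [storyId: string]: { ... } } }
const meta = JSON.parse(readFileSync("./meta.json", "utf8"));

for (const storyId of Object.keys(meta.stories)) {
  test(`story ${storyId} looks the same`, async ({ page }) => {
    await page.goto(`http://localhost:61000/?story=${storyId}`); // assumed URL scheme
    await expect(page).toHaveScreenshot(`${storyId}.png`);
  });
}
```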

What is Ladle? Ladle is a Vite plugin, a React UI and a CLI in one single package. It's a single dependency, so you will never run into versioning issues between multiple packages.

5. Introduction to Ladle Benefits and Future Trends

Short description:

It requires no initial configuration and fully supports component story format. It can be customized and works as a drop-in replacement for Storybook. Ladle has received great feedback from the community and has shown significant improvements in dev server startup and hot module replacement. The future of frontend tooling is heading towards modules ruling everything, allowing us to skip heavy processing and leverage browser capabilities for module resolution and caching. Using other languages like Rust and Go for JavaScript compilation is also a trend that brings speed improvements.

It requires no initial configuration, but you can still customize things like fonts, add additional stylesheets or fully access Vite's configuration. It fully supports the Component Story Format and works as a drop-in replacement for Storybook.
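
As a hedged sketch of what customization can look like: Ladle is commonly configured through a .ladle/config.mjs file, and the option names below follow Ladle's documentation as I recall it, so verify them against the current docs before relying on this.

```js
// .ladle/config.mjs, a sketch: every option is optional and Ladle works with zero config
export default {
  stories: "src/**/*.stories.{ts,tsx,mdx}", // where to look for story files
  port: 61000,                              // dev server port (assumed default)
};
```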

It also supports different monorepo setups and has a great programmatic API, so it can be re-bundled as part of your library. It was released earlier this year, and so far it has received great feedback from the community. We've been using it for many internal and public projects and have sometimes seen even 10x improvements when it comes to starting dev servers and hot module replacement.

So where is frontend tooling heading now? It's pretty clear that modules are not going anywhere and over time they will rule everything. This is great for our tooling: we can skip some heavy processing before sending our code to browsers. Browsers can do the module resolution for us, and they're also much better when it comes to caching. There's also a trend to use other languages like Rust and Go when compiling JavaScript. That's another great source of speed improvements.

Vojtech Miksu
16 min
24 Oct, 2022

