Creating Videos Programmatically in React


Introduction to Remotion, a new library with which we can turn React code into MP4 videos, create our motion graphics programmatically, and server-side render them. I give an overview of the principles, the philosophy, and of course we are going to code a video!

This talk has been presented at React Summit Remote Edition 2021; check out the latest edition of this React conference.

FAQ

Remotion is a library that allows developers to write real mp4 videos declaratively using React. It involves defining a composition with metadata such as width, height, duration, and frame rate. Developers use the useCurrentFrame hook to render frames based on the current time, expressed as an integer frame number. The process involves bundling with webpack, capturing frames in a headless Chromium instance, and stitching them together using ffmpeg.

Key features of Remotion include the ability to parametrize video elements like React components, better version control by coding videos instead of traditional video editing, and efficient rendering processes that support parallelization. It also supports audio and sequences for more complex video production.

No, CSS transitions do not work in Remotion because the library requires each frame to be rendered based on the current frame number, creating static images that together form an animation. Instead of CSS transitions, Remotion uses JavaScript to calculate and render property values for each frame directly.

Existing videos can be integrated into a Remotion project by using the HTML5 video tag. Remotion provides a video component that synchronizes the video playback with the current frame time, ensuring smooth integration and playback within the composed video.

Remotion 2.0 introduced significant enhancements including audio support, allowing users to add audio tracks, adjust volume per frame, and apply cutting and trimming directly within the library. These features enable more complex and dynamic video productions directly from React projects.

Remotion improves version control by enabling developers to handle video projects like software development. Changes are tracked programmatically, allowing better management of revisions and history compared to traditional video editing software, which often only saves snapshots or requires manual versioning.

To create a video using Remotion, start by setting up a React project and defining a composition. Use the useCurrentFrame hook to manage frame-specific rendering, and apply animations using the interpolate or spring functions. After development, use webpack to bundle the project, capture frames via a headless browser, and stitch them together with ffmpeg to create the final video.

Jonny Burger
34 min
14 May, 2021

Video Summary and Transcription
The Talk discusses the use of Remotion, a library that allows for declarative video creation in React. It covers the process of creating videos, animating elements, and rendering multiple compositions. The Talk also mentions the features of Remotion, such as audio support and server-side rendering. Remotion 2.0 introduces audio support and the possibility of live streaming. The Talk concludes by highlighting the frustration with existing video editing tools and the integration of existing videos into Remotion projects.

1. Introduction and Remote Conference

Short description:

Hey, everyone! I'm Jonny Burger, joining you from Zurich, Switzerland. Let's make the most of this remote conference. Pre-recorded videos, like the Apple keynote, are more dynamic and enjoyable to watch.

Hey, everyone, and welcome. My name is Jonny Burger, and I am coming to you from beautiful Zurich, Switzerland. Unfortunately, we can only do this conference remotely, which means that I am at home, and you are at home, and you don't see me personally, but you just see a video. But it's not so bad. I can make a more interesting video, I can choose different angles, and I can edit out all the mistakes that I make. Have you seen the Apple keynote recently? Now that they pre-record everything, their videos are much more dynamic and much more enjoyable to view.

2. Writing Videos Declaratively in React

Short description:

I want to reuse elements in my video and parametrize them like I do with React components. I made a library called Remotion which allows you to write real mp4 videos declaratively in React. Making a video in React sounds like something very magical, but it is actually not, and it is not a black box; in order to use Remotion correctly, we actually need to know how it works. We start off in a React project and define a composition. We have to pass in the width, the height, the duration and the frame rate of the video, and which component will be used to render it. Once we have written our video, we enter a three-stage rendering process. We open multiple tabs and take multiple screenshots at the same time.

Anyway, the way we normally edit these videos is with graphical programs like Adobe Premiere, iMovie, After Effects or DaVinci Resolve. But for me as a developer, something has always been missing. I want to reuse elements in my video and parametrize them like I do with React components. I want something that is more dynamic than just copy-pasting layers and undoing and redoing my actions. I want something that is more declarative.

Also, version control is horrible for videos. If I save my project, close the program and reopen it, then basically all my history is gone and I only have a snapshot of the current state, unless I make copies of my file with v1, v2 suffixes. And sometimes the program will just crash and all my progress is gone. I hope you see where I am going with this. I want to write videos programmatically.

I made a library called Remotion which allows you to write real mp4 videos declaratively in React. I am such a big believer in it that this video itself was edited in React using Remotion, at least all the videos that I am submitting to React Summit. This is also an open source video, so you can go to the GitHub link you see on the right and visit the source code of the video; all the footage is there, and all the edits and slides are written in React.

Making a video in React sounds like something very magical, but it is actually not, and it is not a black box; in order to use Remotion correctly, we actually need to know how it works. So let's take a bird's-eye view at how Remotion turns code into a video. We start off in a React project and we define a composition. A composition is basically a video, but it has metadata on it that we have to define. We have to pass in the width, the height, the duration and the frame rate of the video, and which component will be used to render it. Inside that component, we need to call the useCurrentFrame hook to get back the current time, and based on that time, which is expressed as the current frame, an integer, we need to render anything that we want using our favorite web technologies. It is important that we use this frame variable to drive our animation rather than something else like a CSS transition. More on that later.
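A minimal sketch of what such a setup might look like (the component and names like MyVideo are placeholders; the Composition and useCurrentFrame APIs are Remotion's, but verify the exact props against the docs for your version):

```tsx
// Hypothetical entry-file sketch; all names are made up for illustration.
import React from "react";
import { Composition, useCurrentFrame } from "remotion";

// The component asks Remotion which frame is being rendered right now.
const MyVideo: React.FC = () => {
  const frame = useCurrentFrame(); // integer: 0, 1, ..., durationInFrames - 1
  return <div style={{ fontSize: 100 }}>Frame {frame}</div>;
};

// The composition attaches the required metadata to the component.
export const Root: React.FC = () => (
  <Composition
    id="MyVideo"
    component={MyVideo}
    width={1920}
    height={1080}
    fps={30}
    durationInFrames={150} // 5 seconds at 30 fps
  />
);
```

Because the component is a pure function of the frame number, Remotion can render any frame at any time, which is what the rendering pipeline below relies on.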

So, once we have written our video, we enter a three-stage rendering process. The first step is to bundle the video using webpack, much like any other web application. In the second step, we open a headless Chromium instance, open the web page in it, and then take a screenshot for each frame of the video. To make this process efficient and fast, we need to parallelize it, so we actually open multiple tabs. The number depends on the number of cores of your CPU, or you can also customize it, but you want some kind of parallelism if you can afford it. We open multiple tabs and take multiple screenshots at the same time. This is the reason why it's very important that we use the useCurrentFrame hook to drive all our animations. If you use something like a CSS transition or requestAnimationFrame, or you try to include a GIF, it will not work, because during rendering we open multiple instances of your animation. The contract is: if I give you a frame, you must return a static image that does not have any side effects.

3. Creating Videos with ReMotion

Short description:

To create a video using Remotion, we need to ensure that side effects are synchronized with the current time. After collecting all the frames, we stitch them together using ffmpeg. In the demo, we open the terminal, install the necessary dependencies, and open the development server to preview the video. We define a composition in Remotion's entry file and can have multiple compositions. An idea for a video project involves creating graphics for each speaker at React Summit by importing the list of speakers and animating them.

If you have any side effects where things animate independently of the current time that Remotion says it is, then the video will have artifacts or lag. Assuming you avoid that pitfall, we can then take all the frames, collect them in the third step and stitch them together using ffmpeg, and tada, we have a real video that we can then post on social media, for example.

Of course there is much more to it like how to add audio, but I think this is the most important concept in ReMotion that you should know. It's time for a quick demo. Let's create our first video together. To get started we simply open up the terminal and type in yarn create video. And the installation has finished. So let's cd into our video and open it up in VS Code. Also, once we have done that, let's run npm start to open up the development server. And you will see that we will get a visual preview of the video. Let's wait a moment until it's done loading. And there we go.

As you can see, there is a Hello World project included in Remotion. And there's also a timeline with which you can essentially control the current frame. So I have made a quick cut to adjust my project a little bit. I removed all the Hello World stuff and replaced it with a black canvas. Also, I have upgraded the Remotion version to a version that will be out by the time that you watch this video, but was not out at the time that I was recording. Anyway, this is how we define a composition in the entry file of Remotion: we define the ID, the width, the height, the duration and the frame rate of a video, as well as the component, as I have previously mentioned. We can also have multiple compositions; they need to have different IDs, and then you can just switch between them in the left sidebar.

So, I have replaced this with a black canvas. And now I want to tell you my idea of what kind of video we can make together. I saw on Twitter that React Summit is making a graphic for each speaker and posting it; it has the profile picture, the talk name and everything about each speaker on it. And I also noticed that there's a JSON file in the source code of the React Summit website that contains all the speakers. So I thought it would be fun to create all these graphics at once and also animate them, so they are a video, not an image. So this is what I've done: I've imported the list of all React Summit speakers into this project. To get started, we don't have much time, so I'm gonna code something really quick. See you in a second. What I have quickly created is, pretty much, normal React markup and CSS. As you can see, I did not use any library.

4. Animating Elements with ReMotion

Short description:

You can use any library that you want. Instead of a lowercase img tag, I used the Img component from Remotion. It works the same as the native image element, but with a loader that waits for the image to load before rendering. I added circles using SVG. Now, let's animate it. We use the useCurrentFrame hook to drive the animations. We can animate the profile picture by scaling it up using the interpolate helper function. Apply the scale using CSS. For smoother animation, use the spring function and pass in the frame and frames per second. Let's give it a try!

You can use any library that you want. But here I just did this with normal markup and CSS. Just one thing to note: instead of a lowercase img tag, I used the Img component from Remotion. It works pretty much exactly the same as the native image element, except that it is wrapped in a loader that will wait until the image is loaded before the frame is rendered, so that your video will not contain an image that is not fully loaded. Also, I have added some circles here in the background using SVG, which is also pretty cool.

Anyway, now that we have this basic setup, which I did not go through in full detail because you probably already know at least a little bit of React, let's animate it. We need to drive all our animations using the useCurrentFrame hook. So I'm gonna say const frame = useCurrentFrame(), and now we need to animate things over time.

Let's say we want to animate the profile picture so that it scales up. For that we can use the interpolate helper function in Remotion. I'm gonna say scale, and I'm going to interpolate the scale over time. What I mean by that is: I pass in what drives the animation, the frame, and I give an input range. I say from 0 to 30, which basically means that the duration of the animation is 30 frames. And I'm just gonna say it will scale from 0 to 1. Now I apply this style using normal CSS to the image, and as you can see, my image is scaling in over the duration of 1 second. It doesn't look that smooth, which is why I prefer to use a spring animation. So let's get rid of that and say const scale = spring(). For that we need to pass in the frame and the current frames per second; this is the information that the spring function needs in order to calculate the value. Let's give this a try, and there we go! It's animating in! Let's also pass this scale to our circles in the background. Alright, looking pretty good, right? Let's slide in the title as well. So we can say const titleTranslation, for example. And I love to use spring animations for everything. Let's pass in fps and frame, but this time let's do it a bit more advanced.
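Conceptually, interpolate maps the frame through a linear function. Here is a minimal standalone sketch of that mapping, clamped for simplicity; it is not Remotion's actual implementation, which also supports easing curves and configurable extrapolation:

```typescript
// Minimal sketch of the linear mapping behind an interpolate-style helper.
// Maps `frame` from [inStart, inEnd] onto [outStart, outEnd], clamped here.
function interpolateSketch(
  frame: number,
  [inStart, inEnd]: [number, number],
  [outStart, outEnd]: [number, number]
): number {
  const progress = Math.min(1, Math.max(0, (frame - inStart) / (inEnd - inStart)));
  return outStart + progress * (outEnd - outStart);
}

// Scale from 0 to 1 over the first 30 frames (one second at 30 fps):
const scaleAtStart = interpolateSketch(0, [0, 30], [0, 1]);  // 0
const scaleMidway = interpolateSketch(15, [0, 30], [0, 1]);  // 0.5
const scaleAtEnd = interpolateSketch(30, [0, 30], [0, 1]);   // 1
```

The resulting number is then applied as a normal CSS transform, e.g. `transform: scale(${scale})`.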

5. Animating Elements and Creating Multiple Videos

Short description:

We can adjust the physical parameters of the spring animation, such as increasing the damping to prevent overshooting. By lying to the spring function about the current frame, we can delay the transition. We also need to interpolate the range from 0 to 1 to something like 0 to 1000. After fixing a small mistake, we animate from 1000 to 0 and apply the translation to all elements. Now, let's create animations for each speaker at React Summit using a JSON file and the .map function.

We can play around with the physical parameters of our spring animation. I like to increase the damping so that it will not overshoot. As you can see here it overshoots and then becomes smaller again. I want to delay this transition so that the animation doesn't immediately start.

For that I simply lie to the spring function about which frame it currently is, and I say: actually, it is 20 frames earlier than you think. Now the value goes from 0 to 1, as any spring animation does. We actually also need to interpolate it, so that the range is not between 0 and 1, but something like 0 and 1000. So here, the input range is 0 to 1 and the output range is 0 to 1000, and I'm gonna pass that to a transform.
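The two tricks above can be sketched as plain functions (the helper names are made up for illustration): shifting the frame to delay a spring, and mapping the spring's 0-to-1 progress onto a pixel range:

```typescript
// Delay trick: report a frame that is `delay` frames earlier, clamped at 0,
// so the spring holds its initial value until the delay has passed.
function delayedFrame(frame: number, delay: number): number {
  return Math.max(0, frame - delay);
}

// Range trick: map a spring's 0..1 progress onto 1000px..0px, so an element
// slides in from offscreen and settles at its final position.
function toTranslateY(progress: number): number {
  return (1 - progress) * 1000;
}
```

With Remotion's real spring, the first helper corresponds to passing `frame - 20` as the frame, and the second to interpolating with input range [0, 1] and output range [1000, 0].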

Okay, so I made a small mistake there. I accidentally animated in my name instead of the talk title, and also it's going down instead of up. But of course, with fast refresh we can easily fix these things. We actually want to animate in from 1000 to 0. Let's take a look at how that looks. All right. Let's also apply this translation really quickly to all the other elements; I think it will look good. I'm just going to copy-paste it over here. Maybe you will find a more engineered way to do so, but I think this gets the job done quickly. This is how our video is now looking. Nice!

Now for what I think is the most fun part. Let's turn one video into dozens of videos. Let's create one of these animations for each of the speakers that appear at React Summit. For this, I have already, as shown before, imported a JSON file into the project, where all the metadata is. And I'm just gonna use .map to iterate over each speaker and return a composition for them. So it might look something like that. Let's give each one a unique key. And it is also very handy that there is a slug property on each speaker.
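The mapping step might look roughly like this; the speaker field names (name, slug, company) are assumptions about the shape of the React Summit data, used here for illustration only:

```typescript
// Sketch: derive one composition entry per speaker from a JSON list.
type Speaker = { name: string; slug: string; company: string };

// Stand-in for the imported speakers.json file.
const speakers: Speaker[] = [
  { name: "Jonny Burger", slug: "jonny-burger", company: "Remotion" },
  { name: "Kent C. Dodds", slug: "kent-c-dodds", company: "KCD" },
];

// One composition per speaker; the slug doubles as the unique composition ID.
const compositions = speakers.map((speaker) => ({
  id: speaker.slug,
  defaultProps: { speakerName: speaker.name, company: speaker.company },
}));
```

In the actual entry file, each of these entries would become a `<Composition>` element with these values spread onto it, which is why every speaker then shows up in the preview's left sidebar.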

6. Dynamic Composition and Video Rendering

Short description:

Let's make the composition dynamic by allowing the component to take props like speaker name, company, avatar URL, and talk title. We can define default props for each composition, and later overwrite them on the command line. Now, let's create an mp4 video using a specific composition ID and run the build command to render the video.

Yeah, there we go. And now, as you can see, all the speakers are in the left sidebar. So now, let's make the composition dynamic by allowing the component to take some props. Let's say speakerName should be a string, company should be a string, avatarUrl can also be a string, and what else is missing? The talk title. I'm just going to accept these and fill them in, in my video. Here I have a small naming conflict with a style, so I'm just going to rename this. Let's put this as the talk title. This is the speaker. Sorry, another naming conflict. I hope this will now solve it. Okay, it was just an indentation problem. Never mind, never mind. Here, let's put the avatar URL. Then we can define some default props for each composition, and I say default because these props can later be overwritten on the command line. But for the preview here in the browser, we're going to put some default props. The avatar URL is going to be speaker.avatar, the company speaker.company, the speakerName speaker.name, and for the talk title, let's do speaker.activities.talks[0].title. I think that's going to be it. And let's cast this to a string. There we go!

So here we have one for Brandon Bayer. Here we have Brent Vatne's talk. Lee Robinson's. So now let's make an mp4 video out of it. Let's take the video for Kent C. Dodds. I'm just going to copy the composition ID of his talk, and I'm going to put it into my build command, which you can execute using npm run build as a shorthand, or you can write all of it out. So let's write this to kent.mp4 and run npm run build. And yeah, this will take a few seconds; you can get a taste of how long it takes to render a two-second video.
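Written out, the render invocation takes the entry file, the composition ID, and an output path; the command shape below reflects the Remotion CLI as I understand it around version 2, and the composition ID and props are illustrative, so check the docs for your version:

```shell
# Render the composition whose ID matches one registered in the entry file.
npx remotion render src/index.tsx kent-c-dodds kent.mp4

# Default props can be overwritten on the command line at render time.
npx remotion render src/index.tsx kent-c-dodds kent.mp4 \
  --props='{"speakerName": "Kent C. Dodds"}'
```

This is what makes the dozens-of-videos idea practical: one script can loop over every composition ID and render each speaker's graphic unattended.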

QnA

Features and Open Source

Short description:

Here we can see the 8x concurrency, and the video has been created. Remotion offers features like sequences, audio support, and server-side rendering. Check out the documentation at remotion.dev for more information. The talk is open source, and you can find the link to the edited video on screen. Thank you for listening and enjoy the Q&A session. Remotion is amazing, and despite some missing features, it's open source for anyone to contribute.

Here we can see that for my laptop it shows 8x concurrency, so it makes eight screenshots at the same time. And the video has been created. Let's have a look.

And here we have an announcement of Kent C. Dodds' talk, a fireside chat with Kent C. Dodds. I've only been able to barely scratch the surface of what Remotion can do. A few things that you could explore next. The first is sequences, a construct that allows you to shift time, which is really handy if you have multiple scenes that you want to work on individually and then put together so that they play after each other. It really helps you organize your code, and organization, reusability and encapsulation are the key to scaling up your video production. Another thing is audio support. By the time that you are watching this, we will have launched audio support. You will be able to have as many tracks as you want, change the volume on a per-frame basis, and cut and trim audio. Another big topic is server-side rendering: how to render a video based on an API call. For all these things I would recommend you check out the documentation at remotion.dev, where we explain all the topics and also have some video tutorials to help you understand the ins and outs of Remotion. Also remember that this talk is open source, and by that I don't just mean the slides: the whole video, everything that you saw, was edited using Remotion, and you can check it out with the link that is on screen now.
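A rough sketch of how sequences and per-frame audio volume fit together (SceneOne, SceneTwo and the music file are placeholders; the Sequence and Audio components are Remotion's, but verify the props against the docs for your version):

```tsx
import React from "react";
import { Audio, Sequence, interpolate } from "remotion";
import music from "./music.mp3"; // placeholder asset, bundled by webpack

const SceneOne: React.FC = () => <div>Scene one</div>; // placeholder scenes
const SceneTwo: React.FC = () => <div>Scene two</div>;

export const Scenes: React.FC = () => (
  <>
    {/* Inside a <Sequence from={n}>, useCurrentFrame() starts at 0 again,
        so each scene can be authored independently and then lined up. */}
    <Sequence from={0} durationInFrames={60}>
      <SceneOne />
    </Sequence>
    <Sequence from={60} durationInFrames={90}>
      <SceneTwo />
    </Sequence>
    {/* Volume can be a function of the frame: fade out over the last second. */}
    <Audio
      src={music}
      volume={(f) =>
        interpolate(f, [120, 150], [1, 0], {
          extrapolateLeft: "clamp",
          extrapolateRight: "clamp",
        })
      }
    />
  </>
);
```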

Yeah, that's it. Thanks a lot for hearing me out. I will be live for a Q&A right after this talk and I hope you will afterwards also enjoy the other talks of this evening.

Hey, good. Thanks for having me. Yeah, it's my pleasure to have you here. I've also been following you on Twitter for a while now, and you're posting so many amazing things, but this Remotion is on a different level, I have to tell you directly. It's amazing what you did with only React.

Cool. Yeah, thank you. The audience doesn't seem to agree; they don't want to use it, apparently. For the moment, I think there are still maybe missing features, right? But you're working on it. It's also open source, so anyone can contribute to it.

Remotion 2.0 and Live Streaming

Short description:

So I don't see why, in the near future, maybe next year, everyone wouldn't be creating videos using Remotion. The biggest feature in Remotion 2.0 is audio support. Remotion will create an as-complex-as-necessary ffmpeg filter to convert your React markup into a real audio track of an mp4 file. It's amazing. Live streaming is not the intended use case, but the library is generic in a way that I could foresee it happening. You can just play the video in a browser, and soon I'm gonna make it so you can embed this in your own web page and then change the props of the composition live.

So I don't see why, in the near future, maybe next year, everyone wouldn't be creating videos using Remotion. It seems that the GUI has 53%. It's almost tied, right? We can call it an even poll. Yeah, that's pretty good. I mean, if half of the people want to write videos in React, that's still a big chunk.

So, actually, not so bad. Let's see if we have any questions. I'll go now to the chat. Let's see. All right. So it seems that we don't have so many questions. Just, if you have any, just shoot them in the Basecamp Q&A. I have some questions on my own.

So, Johnny, I saw that you recently, two hours ago actually, announced Remotion 2.0, which is a major version bump. What were the improvements that you made? What new features have you added? If you can tell us just a little bit there.

Sure. So there's a brand new version of Remotion out, just two hours ago. And the reason why it's just now is because I mentioned it in my talk, so I was really motivated to actually ship it because I announced it here. I just did it in time. The biggest feature in Remotion 2.0 is audio support. I think it's really cool that you can just declaratively put in these audio tags, cut and trim, put them at any position, have multiple audio tracks, and even change the volume per frame, create fading effects, fade out at certain times. Remotion will create an as-complex-as-necessary ffmpeg filter to convert your React markup into a real audio track of an mp4 file. That's really powerful, because I saw on YouTube, for example, lots of channels using this kind of equalizer, like a soundwave, for all their videos. So, adding audio without going through a third-party application: directly in the browser, maybe, or just popping open the terminal, typing something, passing in the mp3 and getting an mp4 as output. It's amazing.

Now, I am wondering, is it possible to make it live? Like, to just feed in an mp3 or, let's say, live streaming audio, and just have it out there. Yeah, interesting. I would say live streaming is not the intended use case, but it's generic in a way that I could foresee it happening. So, I mean, right now, as you saw, you can just play the video in a browser, and soon I'm gonna make it so that you can embed this in your own web page and then change the props of the composition live.

Remotion Inspiration and Integration

Short description:

If you script it cleverly, you can stream it. The original inspiration for creating Remotion was frustration with existing video editing tools. Remotion does not make videos from CSS transitions. Instead, it requires creating static images for each frame. Integrating existing videos into a project is done using an HTML5 video tag and a video component in Remotion that synchronizes with the current time.

So, yeah, I mean, if you script it in a clever way and then stream that, why not? We have a question from Afro Dev Girl: what was the original inspiration for creating Remotion? The original inspiration... well, I would say it was more like a frustration with the existing video editing tools, since I was missing the features that I am used to as a developer: having version history, being able to pull in data from APIs, doing things programmatically with an API call. With video editing programs, I would just have to open them, and there is no good tool for abstraction except copy-paste. All these things led me to eventually create my own video editing program.

Great. I can see that frustration; I also have it. Another question, from Vadik: does Remotion make videos from CSS transitions only? No, not at all. I think I mentioned this in the video: CSS transitions actually don't work at all. The reason is that you're supposed to create a bunch of static images in React. If I give you a frame number, you create a static image, and these static images together make an animation. Whereas with a CSS transition, the animation is not derived from the frame number; you just put it in your CSS file and it moves by itself without you doing anything else. So instead of using a CSS transition, you would calculate the property value for each frame and then render that. So, yeah, that is a constraint, but also a really necessary one. And once you get the hang of it, it is quite nice, because then you can just take the timeline cursor, drag it back and forth, and pause your animation, which you cannot do with a CSS transition.
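For instance, a one-second CSS fade-in becomes a pure function of the frame number, which is exactly what makes every frame independently renderable (a sketch, assuming 30 fps):

```typescript
// Instead of `transition: opacity 1s`, compute the opacity from the frame.
// Because the value depends only on `frame`, any frame can be screenshotted
// in isolation, in any order, across parallel browser tabs.
function opacityAt(frame: number, fps: number = 30): number {
  return Math.min(1, frame / fps);
}
```

The number is then applied as an inline style, e.g. `style={{ opacity: opacityAt(frame) }}`.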

Great. Another question, from jrock94: how do you integrate existing video into a project? I'm not sure if he's thinking about MP4s or Remotion projects. That works pretty well. You just use an HTML5 video tag, load your video using an import statement like you would in webpack, and pass that to the source of a video tag. Like I said, normally this would not be driven by the frame that Remotion thinks it is, but we have made a Video component that will synchronize the video with the current time. So that works pretty well. Actually, all the screencasts that you have seen on the right side of the screen while I was coding were just a video imported into Remotion and played back.
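A sketch of that integration (the file name and component name are placeholders; Remotion's Video component keeps the underlying HTML5 video in step with the composition's current frame, unlike a bare video tag playing on its own clock):

```tsx
import React from "react";
import { Video } from "remotion";
import screencast from "./screencast.mp4"; // bundled by webpack via import

export const Screencast: React.FC = () => (
  // <Video> seeks the underlying HTML5 video to match the current frame,
  // both in the preview player and while rendering screenshots.
  <Video src={screencast} style={{ width: "100%" }} />
);
```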

Conclusion and Q&A

Short description:

It was a smooth and enjoyable project. Unfortunately, we ran out of time. Thank you, Johnny, for sharing another way to create videos using code, especially React. More questions are coming in, and Johnny will be available on Spatial chat to answer them shortly.

It was so smooth that I think it was not possible to see that I did not just submit the video as it was. Yeah, it's pretty much an inception, right? Recording yourself inside the recording, editing yourself inside the recording. Really nice. It's a fun project.

Unfortunately, we ran out of time. And thank you so much, Johnny, for taking your time and showing us another way of creating videos using code. Especially React.

Are you available for any other questions? Because I see that they are coming now. Are you on Spatial chat, and people can find you there? Yeah, absolutely. Yeah, I have it marked in my calendar. I will now move over to Spatial chat. And great to see that more questions are coming in. I'm very happy to answer them in a few minutes over there. Thank you so much, Johnny. Once again, it was a pleasure to have you here. Enjoy the rest of the day. Thank you. Bye. Thank you. Bye. Enjoy the conference.

Check out more articles and videos

We constantly think of articles and videos that might spark Git people interest / skill us up or help building a stellar career

A Guide to React Rendering Behavior
React Advanced 2022React Advanced 2022
25 min
A Guide to React Rendering Behavior
Top Content
This transcription provides a brief guide to React rendering behavior. It explains the process of rendering, comparing new and old elements, and the importance of pure rendering without side effects. It also covers topics such as batching and double rendering, optimizing rendering and using context and Redux in React. Overall, it offers valuable insights for developers looking to understand and optimize React rendering.
Building Better Websites with Remix
React Summit Remote Edition 2021React Summit Remote Edition 2021
33 min
Building Better Websites with Remix
Top Content
Remix is a web framework built on React Router that focuses on web fundamentals, accessibility, performance, and flexibility. It delivers real HTML and SEO benefits, and allows for automatic updating of meta tags and styles. It provides features like login functionality, session management, and error handling. Remix is a server-rendered framework that can enhance sites with JavaScript but doesn't require it for basic functionality. It aims to create quality HTML-driven documents and is flexible for use with different web technologies and stacks.
React Compiler - Understanding Idiomatic React (React Forget)
React Advanced 2023React Advanced 2023
33 min
React Compiler - Understanding Idiomatic React (React Forget)
Top Content
Watch video: React Compiler - Understanding Idiomatic React (React Forget)
Joe Savona
Mofei Zhang
2 authors
The Talk discusses React Forget, a compiler built at Meta that aims to optimize client-side React development. It explores the use of memoization to improve performance and the vision of Forget to automatically determine dependencies at build time. Forget is named with an F-word pun and has the potential to optimize server builds and enable dead code elimination. The team plans to make Forget open-source and is focused on ensuring its quality before release.
Using useEffect Effectively
React Advanced 2022React Advanced 2022
30 min
Using useEffect Effectively
Top Content
Today's Talk explores the use of the useEffect hook in React development, covering topics such as fetching data, handling race conditions and cleanup, and optimizing performance. It also discusses the correct use of useEffect in React 18, the distinction between Activity Effects and Action Effects, and the potential misuse of useEffect. The Talk highlights the benefits of using useQuery or SWR for data fetching, the problems with using useEffect for initializing global singletons, and the use of state machines for handling effects. The speaker also recommends exploring the beta React docs and using tools like the stately.ai editor for visualizing state machines.
Routing in React 18 and Beyond
React Summit 2022
20 min
Routing in React 18 brings a native app-like user experience and allows applications to transition between different environments. React Router and Next.js have different approaches to routing, with React Router using component-based routing and Next.js using file system-based routing. React server components provide the primitives to address the disadvantages of multipage applications while maintaining the same user experience. Improving navigation and routing in React involves including loading UI, pre-rendering parts of the screen, and using server components for more performant experiences. Next.js and Remix are moving towards a converging solution by combining component-based routing with file system routing.
React Concurrency, Explained
React Summit 2023
23 min
React 18's concurrent rendering, specifically the useTransition hook, optimizes app performance by allowing non-urgent updates to be processed without freezing the UI. However, there are drawbacks such as longer processing time for non-urgent updates and increased CPU usage. The useTransition hook works similarly to throttling or debouncing, making it useful for addressing performance issues caused by multiple small components. Libraries like React Query may require the use of alternative APIs to handle urgent and non-urgent updates effectively.
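The core idea — urgent updates interrupting non-urgent rendering — can be modeled as a toy, assuming nothing about React's actual scheduler: non-urgent work is a generator that yields after each unit, and urgent tasks are drained between units. All names here are illustrative.

```javascript
// Conceptual sketch only (not React's scheduler): concurrent rendering
// lets urgent updates run between units of non-urgent work. Non-urgent
// "rendering" is modeled as a generator that yields after each unit.
function* renderItems(items) {
  const rendered = [];
  for (const item of items) {
    rendered.push(item.toUpperCase()); // one unit of rendering work
    yield; // yield control so urgent updates can jump in
  }
  return rendered;
}

function runInterruptible(work, urgentQueue) {
  const handled = [];
  let step = work.next();
  while (!step.done) {
    // Drain urgent updates first, the way useTransition lets urgent
    // state updates interrupt an in-progress transition.
    while (urgentQueue.length > 0) handled.push(urgentQueue.shift());
    step = work.next();
  }
  return { rendered: step.value, handled };
}

const urgent = ['keystroke'];
const { rendered, handled } = runInterruptible(renderItems(['a', 'b']), urgent);
// rendered is ['A', 'B']; handled is ['keystroke'] — the urgent update ran
// before the non-urgent work finished
```

The blocking (non-concurrent) equivalent would finish the whole loop before looking at the urgent queue — that delay is exactly the UI freeze the talk describes.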

Workshops on related topic

React Performance Debugging Masterclass
React Summit 2023
170 min
Featured Workshop
Ivan Akulov
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
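Since the note above assumes familiarity with `useMemo()` and `memo()`, here is a hedged plain-JS sketch of the idea they share — cache the last result and recompute only when the inputs change. This is analogous in spirit only; it is not React's implementation, and `memoizeLast` is an invented name.

```javascript
// Last-value memoization sketch: recompute only when arguments change
// (compared with Object.is, like React's dependency comparison).
function memoizeLast(fn) {
  let lastArgs = null;
  let lastResult;
  let computeCount = 0;
  const memoized = (...args) => {
    const sameArgs = lastArgs !== null &&
      args.length === lastArgs.length &&
      args.every((arg, i) => Object.is(arg, lastArgs[i]));
    if (!sameArgs) {
      lastResult = fn(...args); // inputs changed: recompute
      lastArgs = args;
      computeCount++;
    }
    return lastResult; // inputs unchanged: return cached value
  };
  memoized.computeCount = () => computeCount;
  return memoized;
}

const add = memoizeLast((a, b) => a + b);
add(1, 2); // computes
add(1, 2); // cached, no recompute
add(2, 2); // inputs changed, recomputes
// add.computeCount() is 2
```

The workshop's point stands: the fix is usually this simple — the hard part is the debugging that tells you *where* to apply it.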
Next.js for React.js Developers
React Day Berlin 2023
157 min
Featured Workshop · Free
Adrian Hajdin
In this advanced Next.js workshop, we will delve into key concepts and techniques that empower React.js developers to harness the full potential of Next.js. We will explore advanced topics and hands-on practices, equipping you with the skills needed to build high-performance web applications and make informed architectural decisions.
By the end of this workshop, you will be able to:
1. Understand the benefits of React Server Components and their role in building interactive, server-rendered React applications.
2. Differentiate between Edge and Node.js runtime in Next.js and know when to use each based on your project's requirements.
3. Explore advanced Server-Side Rendering (SSR) techniques, including streaming, parallel vs. sequential fetching, and data synchronization.
4. Implement caching strategies for enhanced performance and reduced server load in Next.js applications.
5. Utilize React Actions to handle complex server mutations.
6. Optimize your Next.js applications for SEO, social sharing, and overall performance to improve discoverability and user engagement.
Concurrent Rendering Adventures in React 18
React Advanced 2021
132 min
Featured Workshop
Maurice de Beijer
With the release of React 18 we finally get the long awaited concurrent rendering. But how is that going to affect your application? What are the benefits of concurrent rendering in React? What do you need to do to switch to concurrent rendering when you upgrade to React 18? And what if you don’t want or can’t use concurrent rendering yet?

There are some behavior changes you need to be aware of! In this workshop we will cover all of those subjects and more.

Join me with your laptop in this interactive workshop. You will see how easy it is to switch to concurrent rendering in your React application. You will learn all about concurrent rendering, SuspenseList, the startTransition API and more.
React Hooks Tips Only the Pros Know
React Summit Remote Edition 2021
177 min
Featured Workshop
Maurice de Beijer
The addition of the hooks API to React was quite a major change. Before hooks, most components had to be class based. Now, with hooks, these are often much simpler functional components. Hooks can be really simple to use. Almost deceptively simple. Because there are still plenty of ways you can mess up with hooks. And it often turns out you can improve your components with a better understanding of how each React hook can be used.

You will learn all about the pros and cons of the various hooks. You will learn when to use useState() versus useReducer(). We will look at using useContext() efficiently. You will see when to use useLayoutEffect() and when useEffect() is better.
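One reason to reach for useReducer() over useState() is that the reducer is a pure function you can test without React. A minimal sketch, with invented action names, of the kind of reducer you would pass to useReducer():

```javascript
// Pure reducer centralizing several related state transitions — the
// function you'd hand to useReducer(). Action names are illustrative.
function formReducer(state, action) {
  switch (action.type) {
    case 'field_changed':
      return { ...state, values: { ...state.values, [action.name]: action.value } };
    case 'submitted':
      return { ...state, submitted: true };
    case 'reset':
      return { values: {}, submitted: false };
    default:
      throw new Error(`Unknown action: ${action.type}`);
  }
}

// Testable as plain data in, data out — no component needed.
let state = { values: {}, submitted: false };
state = formReducer(state, { type: 'field_changed', name: 'email', value: 'a@b.c' });
state = formReducer(state, { type: 'submitted' });
// state.values.email is 'a@b.c' and state.submitted is true
```

With useState() the same logic tends to end up scattered across event handlers; the reducer keeps the transitions in one place.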
Introducing FlashList: Let's build a performant React Native list all together
React Advanced 2022
81 min
Featured Workshop
David Cortés Fulla
Marek Fořt
Talha Naqvi
In this workshop you’ll learn why we created FlashList at Shopify and how you can use it in your code today. We will show you how to take a list that is not performant in FlatList and make it performant using FlashList with minimum effort. We will use tools like Flipper and our own benchmarking code, and teach you how the FlashList API can cover more complex use cases and still keep top-notch performance.

You will know:
- Quick presentation about what FlashList is and why we built it
- Migrating from FlatList to FlashList
- How to write a performant list
- Utilizing the tools provided by the FlashList library (mainly the useBenchmark hook)
- Using the Flipper plugins (flame graph, our lists profiler, UI & JS FPS profiler, etc.)
- Optimizing performance of FlashList by using more advanced props like `getType`
- 5-6 sample tasks where we’ll uncover and fix issues together
- Q&A with the Shopify team
React, TypeScript, and TDD
React Advanced 2021
174 min
Featured Workshop
Paul Everitt
ReactJS is wildly popular and thus wildly supported. TypeScript is increasingly popular, and thus increasingly supported.

The two together? Not as much. Given that they both change quickly, it's hard to find accurate learning materials.

React+TypeScript, with JetBrains IDEs? That three-part combination is the topic of this series. We'll show a little about a lot. Meaning, the key steps to getting productive, in the IDE, for React projects using TypeScript. Along the way we'll show test-driven development and emphasize tips-and-tricks in the IDE.