JavaScript Isn’t Slow – It’s Just Scheduled Wrong


In this session, we’ll explore why JavaScript’s single-threaded model causes UI lag, why setTimeout() and requestIdleCallback() fail at task prioritization, and how scheduler.postTask() finally gives developers fine-grained control over execution.

Through real-world examples, performance analysis, and a live demo, we’ll show how prioritized scheduling can eliminate UI freezes, improve responsiveness, and make JavaScript execution truly user-centric. If you care about JavaScript performance, this is the talk you don’t want to miss! 🚀

This talk was presented at JSNation 2025.

FAQ

The main topic of Sulagna's talk is that JavaScript isn't inherently slow; rather, its scheduling model is inflexible.

JavaScript appears to be slow because of its inflexible scheduling model that doesn't prioritize tasks effectively, leading to potential UI freezes and lag.

JavaScript's scheduling and task prioritization are problematic because the single-threaded, cooperative scheduling model can block important tasks like user interactions if a heavy computation is running.

The scheduler API mentioned by Sulagna is the postTask API (scheduler.postTask()), which allows developers to queue tasks with specific priorities to improve performance and responsiveness.

The postTask API allows tasks to be inserted into one of the browser's global task queues, each with a specific priority level (user-blocking, user-visible, or background), which influences how and when they're executed.

The 'postTask' API has three priority levels: user-blocking (highest priority), user-visible (medium priority), and background (lowest priority).

setTimeout() cannot prioritize or interrupt tasks, and requestIdleCallback() only runs tasks when the thread is idle, which may never happen if the user continuously interacts with the browser.

The benefit of using the 'postTask' API is that it provides explicit control over task priority, ensuring important tasks run promptly and preventing UI freezes and lags.

Sulagna is a JavaScript engineer who specializes in improving JavaScript performance and task scheduling.

The 'postTask' API is currently available in Chromium-based browsers, with polyfills available for others. Safari and Firefox are also tracking its implementation.

Sulagna Ghosh
14 min
16 Jun, 2025

Video Summary and Transcription
Sulagna discusses JavaScript performance, highlighting scheduling challenges and the limitations of existing models. The introduction of the postTask scheduler API addresses these issues, offering promise-based scheduling with explicit priorities. The internal workings and implementation of the API are explained, emphasizing task prioritization to ensure smooth UI performance. The API enables developers to control task priorities effectively, preventing UI freezes and lag.

1. Analyzing JavaScript Performance

Short description:

Sulagna, a JavaScript engineer, explains why JavaScript isn't slow, just scheduled wrong, presents a demo to showcase responsiveness issues, and explores performance troubleshooting techniques.

Hey, everyone. I'm Sulagna. I'm a JavaScript engineer. And today I'm going to talk about a very interesting topic: JavaScript isn't slow, it's just scheduled wrong. I know the title is a bit dramatic, but soon we will find out why I've said that.

So I have prepared a small demo to introduce myself. Let me go to the demo part. I'm very excited for that. So here's my demo. On clicking the 'Know More' button, you will see a short introduction of myself. And this Test Responsiveness button and the counter are for the later part of the demo.

So let me click on the button. Wait, what happened? The counter froze for a moment, right? OK, let me do it again. 'Know More'. The Test Responsiveness button is not clickable for a moment. What is happening? I was not prepared for that. OK, let me calm myself down and try to fix this. I can do this. I will start by checking the Console tab for any errors. So let me do that first. Console, and I'm refreshing. 'Know More'. Test Responsiveness. No, there is no error in the console. And I don't have any network calls, so I'm skipping the Network tab too. This is something related to lag, so I think the Performance tab can help us. So let me go to the Performance tab and start recording again.

2. JavaScript Scheduling Challenges

Short description:

Exploring issues with JavaScript scheduling, heavy computation blocking the main thread, user click priority, and task prioritization in performance visualization.

Test Responsiveness. OK, stop. OK, we can see a very long yellow mountain. I know what this is for. Actually, I have a synchronous heavy computation, but I used chunking for it so that it doesn't block the main thread. Somehow, though, that doesn't actually work. These are the computeChunk calls. And I can see this is, I think, the button click for Test Responsiveness. Let me check. So, yes, a function call. Yes, this is the Test Responsiveness button click. And I was actually expecting this click to be registered somewhere in between two compute chunks. But that is not what happened.

The entire block of heavy JavaScript was executed to completion. Only then was the click registered, and after that the setInterval for the counter gets registered. And I think I know why this is happening. OK, so before I explain this to you, let me quickly summarize what is happening here. First, the heavy computation blocked the entire main thread, even though I used chunking to prevent that from happening. Second, a user click, which should have the highest priority, is not getting registered properly; it is registered only after the entire heavy execution is done. And from this performance visualization we can clearly see one thing: something is wrong with JavaScript scheduling and task prioritization.
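The talk doesn't show the original chunking code, but here is a hypothetical sketch of one way chunking can still block like this: if each chunk schedules the next one as a microtask (a promise callback), the browser drains the whole microtask queue before it ever dispatches the pending click task, so the computation still runs as one uninterrupted block.

```js
// Hypothetical sketch (not the speaker's actual demo code) of chunking that
// still blocks: each chunk queues the next one as a microtask, and the browser
// drains all microtasks before it runs the pending click task.
function computeChunk(i) {
  let sum = 0;
  for (let j = 0; j < 5_000_000; j++) sum += Math.sqrt(j + i); // heavy work
  return sum;
}

function runAllChunks(chunk = 0, total = 100) {
  if (chunk >= total) return;
  computeChunk(chunk);
  // Promise.resolve().then(...) queues a microtask, not a task, so the click
  // handler never gets a chance to run between chunks.
  Promise.resolve().then(() => runAllChunks(chunk + 1, total));
}
```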

Before we go further with this debugging, we should remember two important things about JavaScript. JavaScript is single-threaded, meaning it uses one thread to perform tasks, logic, and UI updates. And JavaScript uses cooperative scheduling, meaning that if JavaScript is currently executing something and I ask it to execute my task, JavaScript will say: let me finish my current task first, then I will schedule or execute yours. That is the basic concept of cooperative scheduling. And while we are talking about cooperative scheduling, let us quickly check the priority model JavaScript uses behind the scenes. At the very top is "do not yield". This is reserved for synchronous code, which runs fully to completion and has the highest priority.

3. JavaScript Scheduling Model Limitations

Short description:

Exploring web priorities, challenges with JavaScript scheduling, and the limitations of setTimeout and requestIdleCallback for task prioritization and interruption.

The next highest priority is microtasks. These are promise callbacks and mutation observers; they run next. After that, the third highest priority is requestAnimationFrame: callbacks that run just before the next repaint. After that come inter-frame tasks, normal tasks like timer functions and fetch callbacks; these run at ordinary priority. And at the very bottom is requestIdleCallback. This has the lowest priority, meaning such tasks get executed only when the thread is idle. So this is the hierarchical structure of web priorities.
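To make that hierarchy concrete, here is a minimal sketch (not from the talk) you can paste into a browser console. It simply labels where each kind of callback sits; the relative order of the requestAnimationFrame and setTimeout callbacks depends on frame timing, and requestIdleCallback isn't available in every browser, hence the guard.

```js
// Rough illustration of the priority hierarchy described above.
console.log('synchronous code (do not yield)');

Promise.resolve().then(() => console.log('microtask (promise callback)'));

requestAnimationFrame(() => console.log('requestAnimationFrame (just before the next repaint)'));

setTimeout(() => console.log('ordinary task (setTimeout timer)'), 0);

if ('requestIdleCallback' in window) {
  requestIdleCallback(() => console.log('requestIdleCallback (only when the thread is idle)'));
}
```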

Even though I've used chunking to prevent this heavy task from blocking the main thread, because JavaScript is single-threaded and uses cooperative scheduling, it insists on completing the entire execution of the heavy task first, and only then checks for the next highest-priority task. We developers clearly have no way to tell JavaScript: after you finish executing a particular chunk, please handle this user click, and then continue with the next chunk. This inflexibility is the problem with JavaScript. It is not about how fast JavaScript is, but about how inflexible its scheduling model has been.

There are options like setTimeout and requestIdleCallback, but they have limitations. setTimeout is just a timer function, without task prioritization or interruption capabilities. requestIdleCallback runs only when the thread is idle, which may never happen during constant user interaction, leading to unpredictability. Depending on these options is unreliable because of those constraints. These insights highlight the challenges of task prioritization and interruption within JavaScript's scheduling model.
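As an illustration of those limitations, here is a rough sketch, with a hypothetical doChunk() helper, of the two usual workarounds: setTimeout-based chunking yields between chunks but gives every chunk the same ordinary priority, while requestIdleCallback-based chunking can be starved indefinitely under constant interaction.

```js
// Sketch of the two common workarounds and their limits (doChunk() is a placeholder).

// 1. setTimeout(0): yields between chunks, but every chunk competes at the same
//    ordinary task priority -- there is no way to say "my click matters more".
function runChunksWithTimeout(chunks, i = 0) {
  if (i >= chunks.length) return;
  doChunk(chunks[i]);
  setTimeout(() => runChunksWithTimeout(chunks, i + 1), 0);
}

// 2. requestIdleCallback: only runs when the thread is idle; under constant
//    user interaction the idle deadline may never come, so the work can be
//    postponed indefinitely.
function runChunksWhenIdle(chunks, i = 0) {
  if (i >= chunks.length) return;
  requestIdleCallback((deadline) => {
    while (i < chunks.length && deadline.timeRemaining() > 0) {
      doChunk(chunks[i++]);
    }
    runChunksWhenIdle(chunks, i);
  });
}
```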

4. Introducing Post Task Scheduler API

Short description:

Discussing the limitations of setTimeout and requestIdleCallback, and introducing the new postTask scheduler API with promise-based scheduling for specific priorities, currently available in Chromium-based browsers.

So no, setTimeout can't do it. Then what about requestIdleCallback? We have seen the priority hierarchy: it runs only when the thread is idle. Think about the situation: if the user is constantly interacting with the browser, there is no idle time for the thread, so the task will never run. Complete unpredictability. So we cannot depend on these two options. This is a real scheduling problem, and we need a real solution for it.

Not workarounds like reaching for web workers when they are not even needed, or throwing async/await at everything. No, a real solution. Well, if you had asked me this a few years ago, I would have said no. But today, yes, we have the answer. Browsers have shipped a brand new family of scheduler APIs, and today we are going to talk about the most important, most prioritized one: postTask.

So what is postTask? It is a promise-based scheduling API that lets you queue your work with a specific priority, and this priority influences how the browser's event loop schedules it. This is the syntax; we will look at it in a few moments. Another important thing: currently this is available in Chromium-based browsers. A polyfill exists, though, and Safari and Firefox, the non-Chromium browsers, are also tracking it. That is the breadth of scheduler API support today.
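A minimal usage sketch, assuming a hypothetical generateReport() function: feature-detect the API, post the work at an explicit priority, and fall back to a plain timer where the Scheduler API (or a polyfill) isn't available.

```js
// Minimal sketch of scheduler.postTask() usage (Chromium-based browsers; a
// polyfill can back-fill others). generateReport() is a placeholder.
async function runReport() {
  if ('scheduler' in window && 'postTask' in window.scheduler) {
    // postTask returns a promise that resolves with the callback's return value.
    const result = await scheduler.postTask(() => generateReport(), {
      priority: 'background', // 'user-blocking' | 'user-visible' (default) | 'background'
    });
    console.log('report ready', result);
  } else {
    // Fallback when the Scheduler API is unavailable.
    setTimeout(() => generateReport(), 0);
  }
}
```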

5. Understanding Post Task Internal Workings

Short description:

Explaining the internal workings of the postTask scheduler API, highlighting task queue usage and priority levels for different types of tasks.

So the next important thing is how it works internally. It uses global task queues. First question: what is a task queue? A task queue is a queue where tasks, meaning JavaScript functions you want the browser to run, wait for their turn. And when you use scheduler.postTask, you are inserting a task into one of these queues.

And each task queue has a specific priority level. From the syntax, we already have a glimpse of the priorities. There are three kinds: user-blocking, user-visible, and background. User-blocking has the highest priority and is reserved for user clicks and other important tasks. Next is user-visible, with medium priority, which is the default for postTask. Background has the lowest priority, for non-visual and heavy computation tasks.

Another important thing: when you post a task, it inherits the priority of the queue you placed it in. A task in the background queue is treated as low priority and yields to more important work. That gives us a clear understanding of how the postTask API works. The syntax of the postTask API is quick to pick up, and implementing it practically is the next step.
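A small sketch of what that priority ordering means in practice (not from the talk): three tasks posted back to back, where the user-blocking one is expected to run first and the background one last.

```js
// Three tasks posted in a row with different priorities. The scheduler services
// higher-priority queues first, so the typical order of the logs is:
// user-blocking work, user-visible work, background work.
scheduler.postTask(() => console.log('background work'), { priority: 'background' });
scheduler.postTask(() => console.log('user-visible work')); // default priority
scheduler.postTask(() => console.log('user-blocking work'), { priority: 'user-blocking' });
```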

6. Implementing Post Task API in Code

Short description:

Demonstrating the implementation of the postTask API in code for heavy computation tasks, ensuring smooth UI performance without lag or screen freezing.

And a task, as I said, in the background queue is treated as low priority and will yield to more important work. So we clearly have an understanding of how this postTask API works, and this is the syntax of the postTask API, just a quick look. Now let me implement it in my code; enough of the theory.

So here is what I will do. This is my computeChunk function; here I have this long synchronous block of heavy computation. I will wrap this computeChunk call with the scheduler API, and since it is a promise-based function, I use async/await and enable the appropriate button for this.
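Here is a minimal sketch of that approach (computeChunk and chunks are placeholders, not the speaker's actual demo code): each chunk is posted as its own background-priority task and awaited, so the browser is free to run user-blocking work, like a click, between chunks.

```js
// Each chunk becomes its own background-priority task. Awaiting postTask
// yields control back to the scheduler between chunks, so user-blocking work
// (such as a click handler) can run in between.
async function runHeavyComputation(chunks) {
  for (const chunk of chunks) {
    await scheduler.postTask(() => computeChunk(chunk), { priority: 'background' });
  }
  console.log('heavy computation finished without freezing the UI');
}
```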

Yes, this is what I was expecting, and it is working just as I expected. No UI lag, no frozen screen. Let me show you how this is getting placed on the main thread. First, let me refresh it, click Test Responsiveness again, and stop the recording.

7. Understanding Post Task Prioritization

Short description:

Explaining the prioritization of compute chunks using the postTask API to prevent UI freezes and lag, showcasing the power of explicit task priority control for developers.

OK, so now we are actually seeing multiple chunks, and I'll explain what is happening here. Every compute chunk is wrapped with this postTask priority, and we can see the interaction, the button click, somewhere in between. Let me check whether the click function was called. Yes, this is the event. So what is happening: as we have wrapped the compute chunks with postTask at background priority, they are treated as the lowest-priority tasks. So the compute chunking starts first, and then, when the user click comes in, the click gets a slot of its own.

And after processing the user click, it goes back to computing the individual chunks. And yes, we have prevented the UI freeze and lag. So we are at the very end of our session, and we have made JavaScript fast again. See, JavaScript isn't inherently slow; its default scheduling is just rigid.
