A Different Kind of Serverless: A Case Study for SQLite and Whisper.cpp


You can build applications with a local-first focus. In this case study, we dig into how you might use Wasm to run whisper.cpp for speech-to-text, and also use Wasm to host a local SQLite database. The only reason we need a network connection at all for this application is to get the initial JS payload and to download the Whisper models from a remote store, since they can get quite large.

This talk was presented at JSNation US 2024.

FAQ

Wasm, or WebAssembly, is an open standard designed to support any language on any operating system. It enables running compiled code from languages like Rust, C++, or Go in the browser, offering major portability.

Popular applications using WebAssembly include Figma, CapCut, 1Password, Google Earth, uBlock Origin, and Lichess. These applications leverage Wasm for improved performance and capabilities in the browser.

WebAssembly allows browser-based applications to run more performant code by compiling languages such as C++ or Rust to Wasm, providing real data structures and memory access, which can be faster and more efficient than JavaScript.

Limitations of using WebAssembly in the browser include a lack of direct DOM access, higher potential RAM usage, and the need to run in a Web Worker. Outside the browser, system access may be limited without a WASI runtime.

WASI, or the WebAssembly System Interface, is a standard designed to provide a POSIX-like feature set to WebAssembly, enabling functionalities like file I/O and networking within Wasm code.

Common languages used with WebAssembly include C, C++, Rust, and Go. Other languages with varying support include C#, Zig, Lua, and more.

WebAssembly can be run outside of the browser using tools like Extism, Wasmtime, the Spin SDK, and Wasmer, which provide server-side runtimes and environments for executing Wasm code.

Security concerns with WebAssembly include its binary format, which can be obfuscated and harder to analyze than JavaScript, and shared array buffers, which require secure contexts to mitigate vulnerabilities like Spectre and Meltdown.

Orderly is an application built for voice dictation of books and blog posts, using WebAssembly combined with technologies like SQLite in the browser and whisper.cpp for transcription, allowing it to run without a server.

Hosting considerations for WebAssembly applications include the need for proper header configuration to support shared array buffers and OPFS. Services like Render, Netlify, or a VPS can be used for hosting models and applications.

Chris Griffing
25 min
21 Nov, 2024

Video Summary and Transcription
Today's Talk introduces WebAssembly (Wasm) and its versatility in supporting any language on any operating system. Wasm's history dates back to 2011 with Emscripten and NaCl, and it has gained importance with the introduction of WASI, the WebAssembly System Interface. Wasm is supported by modern browsers and can be used with languages like C, C++, Rust, and Go. Popular applications like Figma, CapCut, and 1Password utilize Wasm for improved performance. Web Workers and shared array buffers eliminate the need for object serialization. The Talk also discusses the use of Kysely and Drizzle for interacting with SQL. Building and optimizing the application involves considerations of migrations, app readiness, suspense, optimistic UI, and debouncing inputs. Whisper, an open-source OpenAI project, offers models for audio-to-text conversion and is implemented using whisper.cpp. Deployment options include Render, Netlify, Vercel, and Cloudflare, but using a cheap VPS provider with a file server can be a cost-effective alternative.

1. Introduction to Wasm

Short description:

Today I'm going to talk about a different kind of serverless: a case study for SQLite and whisper.cpp running in the browser. I'll explain what Wasm is and how it supports any language on any operating system. Let's dig into it!

Hey everyone, today I'm going to be talking about a different kind of serverless. This is a case study for SQLite and whisper.cpp running purely in the browser. So basically, I built an application. I want to kind of walk you through how I built that application, how it works under the hood a bit, and a little bit of the technology I used.

But also, I'm going to have to explain what Wasm actually is, because that's how we're able to do all of this. So, let's get started. But first, a little bit about me. I'm a developer advocate at GitKraken. It's my first time in this kind of role, and I'm really loving it. I've been an engineer for about 10 years, and it's quite fun. I'm a Twitch streamer, a lazy YouTuber, and a beanie enthusiast. I have a lot of beanies. If you want to find me on other socials, my name is cmgriffing on basically every platform. You can use this QR code to also find me, and, you know, let's connect. Let's chat. But now, let's dig into it.

What is Wasm? So, Wasm is WebAssembly. It's an open standard, and it aims to support any language on any operating system. The idea is, if you can compile from a language like Rust or C++ or Go or whatever, you can run it in your browser as Wasm, and you get this major portability. So, yeah. Wasm. What? Here's a textual representation of what Wasm kind of looks like. You don't need to memorize this, but let's just imagine this program built of these S-expressions. When it's compiled to Wasm, it's going to be turned into this binary format. So, this is that same exact program. Now, you're not going to need to memorize this either, but we can go through and look at what some of these things are. So, that first line identifies this as a WebAssembly binary, as well as the version of Wasm that we're supporting. There's going to be the module definition. There are the types of the parameters, the number of parameters that we're passing, the function that we're exporting, the number of results that we'll return, and the type of that return result.

2. Wasm: Function Signature, History, and Wasi

Short description:

There's the function signature, the export section, and the binary format. Wasm's history dates back to 2011 with Emscripten and NaCl. In 2015, Wasm was announced, and NaCl was deprecated in favor of Wasm. WASI, the WebAssembly System Interface, provides a POSIX-like feature set. Wasm and WASI's importance is highlighted by the creator of Docker. Running Wasm on the server is the future of computing.

There's the function signature, right? So, that's the arguments that the function takes, the size of it, the number of functions that we're exporting. There's the export section: the keyword export, the size of that section, the number of exports, and the name of each export. Then there's the code section, which is the actual logic inside the function and how we're manipulating the variables that are passed into it, and then finally the end byte, which denotes the end of that binary block. And really, again, you're not going to need to have this memorized, but it's still something interesting, and if you want to read more about it, you can find the binary specification on GitHub.
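To make that walkthrough concrete, here's a sketch in JavaScript that instantiates a tiny module like the one described above, straight from its bytes. This isn't the exact module from the slides, just a representative hand-assembled binary that exports an `add` function, with each section annotated:

```javascript
// A hand-assembled Wasm module: magic number, version, then the
// type, function, export, and code sections described above.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, // "\0asm": identifies this as a Wasm binary
  0x01, 0x00, 0x00, 0x00, // Wasm version 1
  // Type section: one signature, (i32, i32) -> i32
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,
  // Function section: one function, using type index 0
  0x03, 0x02, 0x01, 0x00,
  // Export section: export function 0 under the name "add"
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,
  // Code section: local.get 0, local.get 1, i32.add, end
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,
]);

const module = new WebAssembly.Module(bytes);
const instance = new WebAssembly.Instance(module);
console.log(instance.exports.add(2, 3)); // 5
```

In a real app you'd fetch the `.wasm` file and use `WebAssembly.instantiateStreaming` instead of inlining bytes, but the section layout is exactly what the binary spec describes.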

So now, let's actually consider the history of Wasm. It started back in 2011, which is a long time ago, before I was even really a dev. You could use Emscripten to compile C and C++ all the way back in 2011 to something that would run in the browser. Google had their own version of something Wasm-like called Native Client, or NaCl, and then in 2013, asm.js was introduced. asm.js was JavaScript code that would be passed to an asm.js-aware compiler in the browser to optimize and run, and it was very interesting, because you would write JavaScript code, but you would annotate the type of a certain variable to let it know that it would be a float or an integer or things like that. And then in 2015, Wasm itself was first announced, that actual binary format. NaCl was then deprecated a couple of years later in favor of Wasm, because Google decided that, yeah, why build something and maintain it themselves when they could actually support an open standard. Broad browser support came around that same time. After that, we got a thing called WASI, the WebAssembly System Interface, which I'll go into a little bit more here in a bit. In 2019, Wasm threads were enabled by default in Chrome, and in 2022, a draft for Wasm 2.0 was created. There's a little bit of contention around that draft and how that works, but we're not going to dig into that too much right now. It may just be worth reading about on your own.

So WASI is the WebAssembly System Interface. It's designed by Mozilla, and it provides a POSIX-like feature set, so you can get file I/O or networking or the things that your operating system kind of handles for you. When you compile a binary for a certain operating system, those are the bindings that are being wired up for you to that binary. WASI helps you do the same thing with Wasm code, and when you combine those things together, you get something very powerful. Solomon Hykes, the creator of Docker, once said in 2019 that if Wasm and WASI had existed in 2008, we wouldn't have needed to create Docker. That's how important it is. WebAssembly on the server is the future of computing. Just think about that, really powerful. So the portability that we get from running Wasm on the Web would also mean we could get that portability at the server level, and we wouldn't need Docker at that point, as long as there is a WASI interface to expose these things for us, and as long as our dependencies can compile to Wasm as well. My favorite talk and example of what this future could look like is The Birth and Death of JavaScript by Gary Bernhardt. Now, the title is misleading. He's not actually anti-JavaScript. He has some very good points about how it's not a perfect language, of course, because there's no such thing. But he goes through and points out that because JavaScript is the way it is, it led to the future that we're seeing, where people are pushing for Wasm to get more performant code out of their browser.

3. Wasm: Languages, Browser Support, and Tools

Short description:

The future of Wasm was predicted in a conference talk back in 2014. C, C++, Rust, and Go are the main languages for writing Wasm. Other stable languages include C#, Zig, and Lua. Modern browsers support Wasm, except for Internet Explorer. Tools like Extism, Wasmtime, Spin, and Wasmer enable running Wasm outside the browser. Check out AppCypher's repos to learn more. Popular apps using Wasm can be found on madewithwebassembly.com.

And it's very entertaining. He has some satire in it. It's my favorite conference talk of all time. I highly recommend watching it. And it was back in 2014. He kind of saw the writing on the wall and the future that we're experiencing right now ten years ago, which is awesome.

So if you want to write in a language and compile to Wasm and hopefully run it in your browser or in some server runtime, what languages could you use? The main ones are C and C++, Rust, and Go. But there are other stable languages such as C#, Zig, Lua, and many more. If you want to see the full list of languages and their varying support, go ahead and check out that repo from AppCypher. It has this awesome list, and it's kept pretty well up to date.

Now if you want to run Wasm, yeah, I mean, all modern browsers support it, as we saw in the timeline. So that means Chrome, Firefox, Opera, Safari, Edge, but not Internet Explorer. So if you have to support Internet Explorer still, you're not going to be using Wasm. Hopefully none of us have to do that. And if you do, well, I'm sorry. It does get better. Now if you want to run Wasm outside of the browser, how can you do that? Well, there are tools like Extism, Wasmtime, Spin, which is the Spin SDK, and Wasmer. Extism is really cool because it's a plugin system. So imagine you have a Go binary, like Hugo, and you want to write some Rust code to extend it. You can use Extism to create that plugin ecosystem for it. Wasmer is very interesting because it aims to be a serverless-style runtime where you just ship it some Wasm, and it'll spin up and run that Wasm as if you were shipping it a Docker container kind of thing. But it's lighter weight. And there are more runtimes, too. They're being added all the time. This repo owner, AppCypher, has multiple Wasm-focused repos that help us discover and learn more about it. So awesome. I love open source. It's just great that way. But think about it this way. What are some popular apps that are using Wasm right now? You can go find them on madewithwebassembly.com, but let's list a few.

4. Wasm: Applications and Limitations

Short description:

Figma, CapCut, 1Password, Google Earth, uBlock Origin, and Lichess are some popular applications that use WebAssembly. Wasm provides real integers, data structures, and memory access, improving performance. However, there are limitations and concerns, such as no direct DOM access in the browser and higher RAM usage. Running Wasm outside the browser may have limited system access. Security concerns include the obfuscation of binaries and vulnerabilities related to shared array buffers like Spectre and Meltdown.

So the big one, the first one, really, is Figma. Back in 2017 they realized that WebAssembly was the future, and they started building towards it with their infinite canvas. So the rendering of that infinite canvas is Wasm. CapCut, which is a video editor, their web version of the video editor, uses Wasm. 1Password and Google Earth and uBlock Origin and Lichess also use Wasm. 1Password and uBlock Origin are using it for lookups in their very large databases of either where your passwords are stored or what ads might need to be blocked, etc. So they're speeding that up with Wasm because they get real integers instead of everything being a floating-point number. They get real data structures and real memory access rather than just relying upon the garbage collector. So it's pretty cool stuff.

So now, right, it's not all sunshine and rainbows. There are some limitations and concerns. Some of the Wasm limitations in the browser mean that you don't get direct DOM access. At least not yet. And that means that you kind of need a JavaScript shim to communicate between the Wasm and, you know, document.querySelector in the browser as we're used to it. RAM usage could be higher depending on how things are working, the size of your Wasm payload, and simply having to serialize data between the main thread and a worker thread. And for the most part, in the browser, you must run it in a Web Worker. And that's kind of where that serialization overhead comes in.

Now outside the browser, there is limited system access. Hopefully, you have a WASI runtime supported, and that solves that problem for you. But not everything has that out of the box yet. So there's still work to be done. But there are also security concerns. So number one, it's a binary instead of JavaScript. We have a very privileged life as web developers in that when we consume some JavaScript code, as long as it's well-written, we can read it and understand it and know what it's doing. So there are no surprises about what it's doing. If we see JavaScript code that is heavily obfuscated, we can usually just assume that it is doing something nefarious. But with binaries, we can't make that assumption, because even really safe Wasm code could look unsafe because it's obfuscated into a binary. And then shared array buffers are another major concern for security. Spectre and Meltdown were a set of CPU vulnerabilities, disclosed in 2018, that were related to shared array buffers and led browsers to restrict them.

5. Web Workers and the Orderly Application

Short description:

Shared array buffers provide a better way of dealing with web workers, eliminating the need for object serialization. The application built using WebAssembly and cool libraries is called Orderly. It allows for voice dictation, writing books and blog posts, and exporting to PDF. The architecture includes React, Mantine, SQLocal, whisper.cpp, Jotai for state management, PDF.js for exporting to PDF, and React Arborist for the file-system-style tree.

But nowadays, as long as we maintain a secure context, so HTTPS, and we have cross-origin isolation enabled, meaning the Cross-Origin-Opener-Policy and Cross-Origin-Embedder-Policy headers (COOP and COEP, with CORP, Cross-Origin-Resource-Policy, on embedded resources), we can get access to shared array buffers. And that gives us a better way of dealing with web workers, so that we don't have to serialize objects back and forth between the main thread and the web worker. And when I say serialize, I mean literally using JSON.parse and JSON.stringify to send these messages back and forth. That is a heavy amount of overhead, and shared array buffers can help us avoid that.
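As a minimal sketch of why that matters: with a SharedArrayBuffer, the main thread and a worker can read and write the same memory through typed-array views instead of stringifying messages. Worker setup is omitted here; in a real app you'd post the buffer to the worker, and both sides would create a view over the same bytes:

```javascript
// One buffer, shared by reference rather than copied or serialized.
const shared = new SharedArrayBuffer(4 * Int32Array.BYTES_PER_ELEMENT);

// In a real app, `shared` would be sent to a worker via postMessage;
// that hands over the same memory, not a copy.
const mainView = new Int32Array(shared);   // main thread's view
const workerView = new Int32Array(shared); // stand-in for the worker's view

// Atomics makes cross-thread reads and writes well-defined.
Atomics.store(mainView, 0, 42);
console.log(Atomics.load(workerView, 0)); // 42, with no JSON round trip
```

Compare that with postMessage on a plain object, where the data is structured-cloned (or, as in the talk's framing, stringified and re-parsed) on every message.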

So now I'm going to talk about a real-world example. I started this out saying that this was a case study. We've learned about Wasm; now let's talk about the application that I built using Wasm and some really cool libraries. I called it Orderly. And the basic gist of it is I want to write books, I want to write blog posts, but I don't necessarily want to exacerbate my impending carpal tunnel by having to type out a 100,000-word book or something like that. So instead, I wanted to be able to dictate my prose, my book, into snippets and chunks and chapters that I could then export to a PDF or something like that in the future. So just to save myself all the typing effort. And also, I'm not a very fast typist. So instead, I can work around that with technology, and Orderly is the tool I built for that.

The main goals, right? Voice dictation. Writing books, blog posts, etc. I also wanted it to be cheap because, you know, I'm not made of money. I want to be able to just run this in the browser without having to pay for a server to run it all the time. So I wanted to have no server, and I also didn't want to consume some kind of software-as-a-service like PicoVoice or some of the other dictation JavaScript tools out there. So yes, I had these constraints, and I had heard about whisper.cpp before, which we'll dig into here in a second. I kind of figured, let's try and make it happen. So here is the architecture. I'm using React. So it's React 18. This was made maybe a year ago, so I didn't want to jump onto the React 19 train yet. I'm using Mantine for my UI components. The awesome interesting parts are SQLocal, which is SQLite in the browser, and whisper.cpp, and we'll dig into both of those more deeply here in a bit. For state management, I'm using Jotai, because I like the atomic-style state management. For exporting to a PDF, I'm using PDF.js. For dates, Day.js. There's a tree, like a file-system-style tree, that collapses and expands on the left of my application. I'm using React Arborist for that.

6. SQLocal, OPFS, Kysely, and Drizzle

Short description:

I'm using Case for text case conversion. For interacting with SQLocal, I'm using Kysely and Drizzle. SQLocal is a browser-based SQLite library that is threaded and persisted via OPFS. OPFS is part of the File System Access API and is properly sandboxed. Kysely, a query builder rather than a full-on ORM, is used for migrations and setting up tables, while Drizzle handles querying with ORM-like relational capabilities.

I'm using Case for text case conversion because I don't want to have to write those kebab-case-to-camelCase conversions all the time and things like that. And then finally, to interact with SQLocal, I'm using Kysely and Drizzle. And I'll dig into the why of that here in a bit.

So first, SQLocal. It is proper SQL, or SQLite. So you can run any SQLite query in the browser using SQLocal. It is threaded, which means you're not going to block your main thread, and it is persisted via OPFS, which is the Origin Private File System, a thing browsers are shipping with that is faster than IndexedDB. So very nice, very easy to get started with. I recommend checking out SQLocal.

But OPFS, I think, is the more interesting part of implementing SQLocal. It's part of the File System Access API. It has basic support in all major browsers since late 2021. It's not an actual file system, so it's not meant to be visible to users. It does actually get represented on your local file system, with the browsers maintaining it, but it's not meant for you to crawl it and find files in it. So don't expect to go look through and copy and paste things from your local file system. It's for the browser to interact with. There are no permission prompts and security checks for it because it is properly sandboxed. Rather than giving your browser full access to your entire file system, the browser makes sure that you're not escaping the sandbox to go mess with system files or anything that could be a security problem. It does have the same browser storage quota as IndexedDB and local storage, and you can query that to see how much storage you have left. And it is usable from the main thread or a web worker.

I did mention that I'm using Kysely and Drizzle at the same time. Why would I do that? Well, I'm actually using Kysely for the migrations and setting up the tables, indexes, and initial DB seeding, the scaffolding or fixtures or whatever you want to call it. So I'm doing all that with Kysely because it's just a little easier that way. With Drizzle, they use a CLI tool to generate the migrations. And because I wanted to run this all in the browser just for the user, I didn't want to create a CLI tool for that. I didn't want to have some build step to generate these migrations. I just wanted to run the SQL statements. So Kysely is more of a query builder. It's not a full-on ORM.

7. Kysely and Drizzle for SQL

Short description:

Kysely and Drizzle provide different functionalities for interacting with SQL. Kysely is used for migrations and setting up tables, while Drizzle allows for easier querying with ORM-like capabilities.

It's not a full-on ORM. It's made by the creator of Objection.js. So you know that there's actually someone behind it that has experience making these kinds of things. But there's no magic. It's just SQL, which means that you don't really get relations with it. You can set up foreign keys and all that, but when you do a query for something in Kysely, you're not going to get the nested, joined values in the same way. You can still do those joins and get that data, but you're going to have to write that SQL by hand yourself. Whereas Drizzle allows us to do some of that in an easier way, like an ORM, in the way that we would expect. And it's type safe and relation capable. So Drizzle is for the querying, Kysely is for the migrations.

8. Building and Optimizing the Application

Short description:

When building my application, I had to handle migrations and app readiness, as well as consider suspense, optimistic UI, and debouncing inputs for a good user experience.

When building my application, there were a couple of key considerations to make. I needed to create the idea of app ready. And app ready is basically after my migrations and scaffolding have run, because I don't want Jotai to spit out state to the user that is going to be immediately stale once things have run, and I also don't want Jotai's logic for fetching things from the SQLite store to have to deal with those tables not being created and doing some try-catch magic to handle that. So instead, I just simply created this state of app ready, and once the migrations run, I can set app ready to true, and my Jotai-derived state will actually update.
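The talk uses Jotai for this, but the gating idea itself is framework-independent. Here's a minimal, library-free sketch of it; the names (`appReady`, `readSnippets`) are made up for illustration. Derived reads return a safe default until the migrations flip the ready flag:

```javascript
// A tiny atom-like store: hold a value, notify subscribers on change.
function createAtom(initial) {
  let value = initial;
  const listeners = new Set();
  return {
    get: () => value,
    set(next) { value = next; listeners.forEach((fn) => fn(next)); },
    subscribe(fn) { listeners.add(fn); return () => listeners.delete(fn); },
  };
}

const appReady = createAtom(false);

// Derived read: don't touch the database until migrations have run.
function readSnippets(queryFn) {
  if (!appReady.get()) return []; // safe default, no try/catch gymnastics
  return queryFn();
}

// Before migrations: queries are skipped entirely.
const before = readSnippets(() => ["chapter 1"]);

// After migrations finish, flip the flag and derived reads go live.
appReady.set(true);
const after = readSnippets(() => ["chapter 1"]);
```

With Jotai the same shape falls out naturally: an `appReady` atom plus derived atoms that check it before querying SQLite.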

There's also going to be a need for suspense, right? Like, this is all happening in your browser, do you really need to worry about suspense? Well yes, for a good user experience, you do. Because even saving to the Origin Private File System takes time. And if you're typing into something like a controlled input field, even a little bit of lag on that is going to be really noticeable to the user. So suspense is still necessary; there's async reading and writing that can cause too many loader moments, so I did a lot of startTransition magic with React. And you're also going to want to do stuff with optimistic UI and debouncing inputs to make sure you're not writing to the file system too much, right? You could batch things a little better. And optimistic UI is also just nice because it's instant in the main thread versus having to do a round trip to the file system and back. So a lot of our server-based considerations for building a browser-side application still actually matter.
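Debouncing those writes can be as small as this sketch. The timer functions are injectable here only so the behavior is easy to verify; a real app would just use `setTimeout` directly, and the OPFS write is a stand-in counter:

```javascript
// Collapse a burst of calls into one trailing call after `wait` ms.
function debounce(fn, wait, timers = { set: setTimeout, clear: clearTimeout }) {
  let handle = null;
  return (...args) => {
    if (handle !== null) timers.clear(handle); // cancel the pending call
    handle = timers.set(() => {
      handle = null;
      fn(...args);
    }, wait);
  };
}

// Example: count how often a (pretend) OPFS write actually happens.
let writes = 0;
const saveToOpfs = debounce(() => { writes += 1; }, 300);

saveToOpfs(); // user types...
saveToOpfs(); // ...types more...
saveToOpfs(); // ...and pauses; only the last call will fire, once.
```

Pairing this with optimistic UI means the on-screen state updates instantly while the actual file-system write happens at most once per pause in typing.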

9. Whisper: Models, Implementation, and Hosting

Short description:

Whisper is an open-source OpenAI project that allows me to record audio and extract text from it. It provides multiple models with different sizes, and the choice of model depends on the available resources. whisper.cpp is a plain C/C++ implementation without dependencies, and it supports both CPU and GPU. Message passing and regex matching are used to handle the communication with the main application. Hosting was a challenge due to limitations with GitHub Pages, but an alternative service was used.

And then we can move on to whisper.cpp. So first off, what is Whisper? Whisper is an open-source OpenAI project written in Python, and it has a CLI and a Python SDK that you can use. It's the thing that actually allows me to record audio and take that audio and extract the text from it for the dictation in my application.

With Whisper, we have all these models, like the tiny, base, small, and large ones. The large one ends up being almost a gigabyte, I believe. I'd have to double check on the exact file size, but these things can get pretty large. There are some smaller versions, but for the most part, you have to be aware of what model someone's going to load. And the larger models are more accurate, but they require much more VRAM or RAM from the user. So allowing them to pick which one was essential for my case. I'm using a Mac mini from 2014 still, so I had to use the tiny or base ones just because, you know, my machine couldn't handle it.
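A hypothetical sketch of that model-picking logic: suggest a default from available memory, then let the user override it. The function name and the thresholds here are illustrative guesses, not values from the app:

```javascript
// Hypothetical helper: suggest a default Whisper model from available
// memory in GB. Thresholds are illustrative, not tuned values.
function suggestModel(memoryGB) {
  if (memoryGB >= 16) return "small";
  if (memoryGB >= 8) return "base";
  return "tiny"; // safest default for low-memory machines
}

// In the browser you might seed this from navigator.deviceMemory
// (where supported), and still surface the full model list in the UI.
console.log(suggestModel(4)); // "tiny"
```

The key point from the talk survives any choice of thresholds: the default should err small, because downloading and loading a near-gigabyte model on a 2014 Mac mini is a bad first experience.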

Now whisper.cpp is actually a plain C/C++ implementation without dependencies. So Apple Silicon is first class, you get AVX intrinsics, so SIMD stuff for x86. There are zero memory allocations at runtime. And there's support even for CPU only, so you don't even need a GPU to run it. But there is also efficient GPU support via WebGPU in browsers that support that. And you get a nice C-style API. So one of the tricks to wiring that up is it runs inside of this Wasm thread, and we have our main application that needs to communicate with it.

So I end up creating this script that handles the message passing between Whisper and my main application, and then I can just spit that out to where I need it. Now the tricky part is I have to create this recent result array, and I have to use regular expressions to understand what messages are being passed from Whisper. Is it just the timings, letting me know how long something took? Was it a blank audio that I actually just want to ignore? Or is it actually the text that I'm looking for? So I have this print function that is what Whisper is going to call from the back end. I test it against some regular expressions and push the messages to that array, because it's not going to give it all to me at once; it's going to stream chunks of the text as it processes the audio. And then finally, when it's done, I can spit that recent result out to my application via a custom event that I created.
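A hedged sketch of that filtering step. The exact strings whisper.cpp prints can vary between versions, so these regexes are illustrative patterns, not the app's actual ones:

```javascript
// Accumulate transcript chunks from Whisper's print callback, skipping
// timing lines and blank-audio markers. Patterns are illustrative.
const recentResult = [];

const TIMING_LINE = /^whisper_print_timings/; // perf summary lines
const BLANK_AUDIO = /\[BLANK_AUDIO\]/;        // silence marker to ignore
const TIMESTAMPED =
  /^\[\d{2}:\d{2}:\d{2}\.\d{3} --> \d{2}:\d{2}:\d{2}\.\d{3}\]\s*(.*)$/;

function onWhisperPrint(line) {
  if (TIMING_LINE.test(line) || BLANK_AUDIO.test(line)) return;
  const match = line.match(TIMESTAMPED);
  if (match && match[1]) recentResult.push(match[1]); // keep just the text
}

// Whisper streams chunks as it processes the audio:
onWhisperPrint("[00:00:00.000 --> 00:00:02.500] Hello world.");
onWhisperPrint("[00:00:02.500 --> 00:00:04.000] [BLANK_AUDIO]");
onWhisperPrint("whisper_print_timings: total time = 1234 ms");

console.log(recentResult.join(" ")); // "Hello world."
```

When transcription finishes, the joined `recentResult` is what gets dispatched to the app, in the talk's case via a custom event.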

The last thing that was kind of a hurdle to implement was hosting. So we talked about OPFS and shared array buffers and how they work a little bit. The trickiness is in how we implement them. With GitHub Pages, I tried using it, but we don't have access to setting headers from GitHub Pages, so we couldn't actually get shared array buffer support or OPFS out of the box. Now, there is a possible workaround using coi-serviceworker. What it does is use a service worker to intercept the requests for your JavaScript, HTML, CSS, and the Wasm as well, to allow these headers to be injected at the service worker level rather than from the server. But there's a bit of inconsistency due to loading and order of operations with that, so I decided I just had to do something else. So instead I used another service.

10. Deployment and Model Hosting Options

Short description:

Render, Netlify, Vercel, and Cloudflare are options for deployment. However, hosting the large models can be expensive with these providers. Instead, a cheap VPS provider found via LowEndBox was used, offering predictable bandwidth and costing only $2 a month. By spinning up a simple file server with Caddy, SSL/TLS and HTTPS can be achieved easily. Consider building your own application using Wasm and exploring new possibilities. Thank you for listening.

I used Render. Render allows us to do header configuration. It's nice and easy. It deploys from a git repo. So that's nice and easy too. But I could have used Netlify or Vercel or Cloudflare. All that's totally fine.

This was great for the application itself, but I mentioned the one-gigabyte large model. Hosting those models is something you don't want to do inside of a Vercel or Netlify, because that can get really expensive. So instead I just found some cheap VPS provider via LowEndBox. You know, the models can get large. I get predictable bandwidth: a terabyte of transfer from this cheap $2-a-month VPS. I can spin up a simple file server via Caddy, and I get SSL/TLS, right? HTTPS, with just one config line in Caddy. So for $2 a month I have a predictable budget versus how much it might cost me with one of the larger, more common providers.
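As a sketch, a Caddyfile for that model file server might look like the following. The domain and path are placeholders. Note the split of responsibilities: the app's own host (Render, in the talk) is where the COOP/COEP header pair goes, while the model host mainly needs to allow cross-origin fetching so its files still load once the app is cross-origin isolated:

```
models.example.com {
    root * /srv/whisper-models
    file_server

    header {
        # Allow the (cross-origin-isolated) app origin to fetch these files
        Access-Control-Allow-Origin "*"
        Cross-Origin-Resource-Policy "cross-origin"
    }
}
```

The "one config line" claim in the talk is the site address itself: giving Caddy a domain is enough for it to provision and renew TLS certificates automatically.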

So I know that was a whole bunch. I jumped around a bit. But I think you may have gotten at least enough of an inspiration to start building on your own some type of application that is maybe just local first, using Wasm, doing things that you didn't even think possible just a few years ago. Thanks for letting me speak and have a great rest of your day.

Check out more articles and videos

We constantly think of articles and videos that might spark Git people interest / skill us up or help building a stellar career

Utilising Rust from Vue with WebAssembly
Vue.js London Live 2021Vue.js London Live 2021
8 min
Utilising Rust from Vue with WebAssembly
Top Content
In this Talk, the speaker demonstrates how to use Rust with WebAssembly in a Vue.js project. They explain that WebAssembly is a binary format that allows for high-performance code and less memory usage in the browser. The speaker shows how to build a Rust example using the WasmPack tool and integrate it into a Vue template. They also demonstrate how to call Rust code from a Vue component and deploy the resulting package to npm for easy sharing and consumption.
Debugging JS
React Summit 2023
24 min
Top Content
Debugging JavaScript is a crucial skill that is often overlooked in the industry. It is important to understand the problem, reproduce the issue, and identify the root cause. Having a variety of debugging tools and techniques, such as console methods and graphical debuggers, is beneficial. Replay is a time-traveling debugger for JavaScript that allows users to record and inspect bugs. It works with Redux, plain React, and even minified code with the help of source maps.
A Framework for Managing Technical Debt
TechLead Conference 2023
35 min
Top Content · Premium
Today's Talk discusses the importance of managing technical debt through refactoring practices, prioritization, and planning. Successful refactoring requires establishing guidelines, maintaining an inventory, and implementing a process. Celebrating success and ensuring resilience are key to building a strong refactoring culture. Visibility, support, and transparent communication are crucial for addressing technical debt effectively. The team's responsibilities, operating style, and availability should be transparent to product managers.
Making JavaScript on WebAssembly Fast
JSNation Live 2021
29 min
Top Content
WebAssembly enables optimizing JavaScript performance for different environments by deploying the JavaScript engine as a portable WebAssembly module. By making JavaScript on WebAssembly fast, instances can be created for each request, reducing latency and security risks. Initialization and runtime phases can be improved with tools like Wizer and snapshotting, resulting in faster startup times. Optimizing JavaScript performance in WebAssembly can be achieved through techniques like ahead-of-time compilation and inline caching. WebAssembly usage is growing outside the web, offering benefits like isolation and portability. Build sizes and snapshotting in WebAssembly depend on the application, and more information can be found on the Mozilla Hacks website and the Bytecode Alliance site.
Building a Voice-Enabled AI Assistant With Javascript
JSNation 2023
21 min
Top Content
This Talk discusses building a voice-activated AI assistant using web APIs and JavaScript. It covers using the Web Speech API for speech recognition and the speech synthesis API for text to speech. The speaker demonstrates how to communicate with the Open AI API and handle the response. The Talk also explores enabling speech recognition and addressing the user. The speaker concludes by mentioning the possibility of creating a product out of the project and using Tauri for native desktop-like experiences.
A Practical Guide for Migrating to Server Components
React Advanced 2023
28 min
Top Content
React query version five is live and we'll be discussing the migration process to server components using Next.js and React Query. The process involves planning, preparing, and setting up server components, migrating pages, adding layouts, and moving components to the server. We'll also explore the benefits of server components such as reducing JavaScript shipping, enabling powerful caching, and leveraging the features of the app router. Additionally, we'll cover topics like handling authentication, rendering in server components, and the impact on server load and costs.

Workshops on related topic

Relational Database Modeling for GraphQL
GraphQL Galaxy 2020
106 min
Top Content
Workshop
Adron Hall
In this workshop we'll dig deeper into data modeling. We'll start with a discussion about various database types and how they map to GraphQL. Once that groundwork is laid out, the focus will shift to specific types of databases and how to build data models that work best for GraphQL within various scenarios.
Table of contents:
Part 1 - Hour 1
  a. Relational Database Data Modeling
  b. Comparing Relational and NoSQL Databases
  c. GraphQL with the Database in mind
Part 2 - Hour 2
  a. Designing Relational Data Models
  b. Relationships, Building Multijoins/Tables
  c. GraphQL & Relational Data Modeling Query Complexities
Prerequisites:
  a. Data modeling tool. The trainer will be using dbdiagram
  b. Postgres, albeit no need to install this locally, as I'll be using a Postgres Docker image from Docker Hub for all examples
  c. Hasura
Building a Shopify App with React & Node
React Summit Remote Edition 2021
87 min
Top Content
Workshop
Jennifer Gray
Hanna Chen
2 authors
Shopify merchants have a diverse set of needs, and developers have a unique opportunity to meet those needs by building apps. Building an app can be tough work, but Shopify has created a set of tools and resources to help you build out a seamless app experience as quickly as possible. Get hands-on experience building an embedded Shopify app using the Shopify App CLI, Polaris, and Shopify App Bridge. We'll show you how to create an app that accesses information from a development store and can run in your local environment.
Build a chat room with Appwrite and React
JSNation 2022
41 min
Workshop
Wess Cope
APIs and backends are difficult, and we need websockets. You will be using VS Code as your editor, Parcel.js, Chakra-ui, React, React Icons, and Appwrite. By the end of this workshop, you will have the knowledge to build a real-time app using Appwrite and zero API development. Follow along and you'll have an awesome chat app to show off!
Hard GraphQL Problems at Shopify
GraphQL Galaxy 2021
164 min
Workshop
Rebecca Friedman
Jonathan Baker
Alex Ackerman
Théo Ben Hassen
Greg MacWilliam
5 authors
At Shopify scale, we solve some pretty hard problems. In this workshop, five different speakers will outline some of the challenges we’ve faced, and how we’ve overcome them.

Table of contents:
1 - The infamous "N+1" problem: Jonathan Baker - Let's talk about what it is, why it is a problem, and how Shopify handles it at scale across several GraphQL APIs.
2 - Contextualizing GraphQL APIs: Alex Ackerman - How and why we decided to use directives. I’ll share what directives are, which directives are available out of the box, and how to create custom directives.
3 - Faster GraphQL queries for mobile clients: Theo Ben Hassen - As your mobile app grows, so will your GraphQL queries. In this talk, I will go over diverse strategies to make your queries faster and more effective.
4 - Building tomorrow’s product today: Greg MacWilliam - How Shopify adopts future features in today’s code.
5 - Managing large APIs effectively: Rebecca Friedman - We have thousands of developers at Shopify. Let’s take a look at how we’re ensuring the quality and consistency of our GraphQL APIs with so many contributors.
Build Modern Applications Using GraphQL and Javascript
Node Congress 2024
152 min
Workshop
Emanuel Scirlet
Miguel Henriques
2 authors
Come and learn how you can supercharge your modern and secure applications using GraphQL and Javascript. In this workshop we will build a GraphQL API and we will demonstrate the benefits of the query language for APIs and what use cases that are fit for it. Basic Javascript knowledge required.
Scaling up Your Database With ReadySet
Node Congress 2023
33 min
Workshop · Free
Aspen Smith
Nick Marino
2 authors
The database can be one of the hardest parts of a web app to scale. Many projects end up using ad-hoc caching systems that are complex, error-prone, and expensive to build. What if you could drop in a ready-built caching system to enable better throughput and latency with no code changes to your application?
Join developers Aspen Smith and Nick Marino to see how you can change one line of config in your app and use ReadySet to scale up your query performance by orders of magnitude today.