Embracing WebGPU and WebXR With Three.js

In the rapidly evolving landscape of web technologies, the introduction of WebGPU and WebXR represents a significant leap forward, promising to redefine the boundaries of what's possible in 3D web experiences. This talk dives into the heart of these new technologies, guided by the Three.js library.


We begin by exploring WebGPU, a next-generation graphics API offering enhanced performance and efficiency for rendering 3D graphics directly in the browser. We'll demonstrate how Three.js is adapting to harness its full potential, unlocking unprecedented opportunities for developers to create visually stunning interactive experiences.


Transitioning to the immersive realm, we delve into WebXR, a technology that opens the door for virtual reality and augmented reality experiences right from the web. We'll showcase how Three.js enables creators to build immersive experiences.

Video Summary and Transcription
The Three.js project has evolved over 14 years into a community-led initiative with around 2,000 contributors. Initially starting with Flash and SVG rendering before transitioning to WebGL, the project now focuses on utilizing WebGPU for optimized 3D performance. WebGPU, a modern API built on technologies like Metal and Vulkan, offers better control and performance compared to WebGL, which became available in browsers in 2011. The transition from WebGL to WebGPU is expected to modernize web-based 3D rendering. Developers can leverage frameworks like React Three Fiber and tools such as the Three.js editor for easier development and scene creation. The new shader language TSL simplifies shader usage and custom material creation. WebXR remains unchanged, with platforms like the Meta Quest and the Vision Pro supporting it by default. The project showcases innovative examples like the Johnny project and the Japanese animal website, which demonstrate excellent performance across devices. Embracing WebGPU and WebXR with Three.js allows developers to push the boundaries of 3D web experiences, making it important to understand concepts like the Z-buffer when rendering transparent 3D objects. The project's open-source nature encourages creativity and experimentation similar to the Flash era.

This talk was presented at JSNation 2024; check out the latest edition of this JavaScript conference.

FAQ

When did WebGL become available in browsers, and why is it important?
WebGL became available in browsers in 2011. It is important because it allows the use of the computer's GPU for 3D rendering, making it much more efficient.

Will WebGPU replace WebGL as the main API for 3D on the web?
Yes, the hope is that WebGPU will eventually overtake WebGL as the main API for 3D on the web, offering better performance and modern features.

Does the Three.js project have a roadmap?
No, the Three.js project does not have a roadmap; it is community-led, with contributors working on what they think should be done.

How many contributors has the project had?
The Three.js project has had around 2,000 contributors so far.

How long has Ricardo been leading the project?
Ricardo has been leading the Three.js project for around 14 years.

What is React Three Fiber?
React Three Fiber is a framework that brings the React component approach to Three.js, making it easier to connect and compose components.

What did the project use before WebGL?
Initially, the Three.js project's 3D engine work was done in Flash, and it then moved to using SVG for rendering.

What is the Three.js editor?
The Three.js editor allows users to create and modify 3D scenes directly in the browser, similar to a lightweight version of Blender.

How do you optimize performance in Three.js applications?
To optimize performance in Three.js applications, focus on simplifying 3D models, reducing the number of triangles, and optimizing assets before writing code.

What is WebGPU?
WebGPU is a new API for doing 3D on the web, built on modern technologies like Metal, Vulkan, and DirectX 12. It offers more control and better performance compared to WebGL, which is based on the older OpenGL.

1. Introduction to the Three.js Project

Short description:

I've been leading the Three.js project for around 14 years, and it is all community-led, with no roadmap. Ren Yuan did a visualization of the Git repo, showing how it evolved over the years. The project has had around 2,000 contributors, and I appreciate the opportunity it gives to anyone who wants to fix things without the need for contracts.

I'm Ricardo. I've been leading the Three.js project for around 14 years at this point. The project has had around 2,000 contributors so far, and one thing I find interesting is that we have never really had any roadmap for the project. It is all community-led: I guide a little bit, but in general everyone just tries to do what they think should be done. If you're curious to see what such a project, such an organisation, looks like, Ren Yuan did a visualisation of the Git repo for the last 14 years. In the centre you will see a little ball moving around, which is basically me, and the files are on the edges. For the first few years it was pretty much just me, with some people joining to help with some parts, but all those little dots are people helping to fix this or that thing. I really enjoy the fact that we give this opportunity to anyone: if you want to fix something, it's pretty easy to jump in and fix it, compared to some of the bigger projects where you're required to sign contracts and things like that.

2. Evolution of the Three.js Project

Short description:

After 14 years, the Three.js project has evolved into a community-driven effort with numerous contributors. It started with 3D engine work in Flash, then transitioned to using SVG for rendering in HTML5. Eventually, WebGL became the preferred option for using the GPU for 3D. The Three.js website showcases various projects, including an interactive section that evokes the creativity and experimentation of Flash. There are also frameworks like React Three Fiber and TresJS for easier development, as well as a no-code tool powered by Three.js. The project has also seen advances in path tracing, allowing for more realistic scene rendering. The browser can now handle both realtime and path-traced scenarios, as demonstrated by examples like Octopus T. Overall, the capabilities of the Three.js project continue to expand.

So, after 14 years, it kind of looks like this: a bunch of people who are main contributors, but also a lot of people who help a lot. For those of you who don't know, I'm going to give a little bit of background. This actually goes back longer than 14 years: I started doing 3D engine work in Flash around 2006, 2007, and kept working on that for a few years. When we started to see that HTML5 was coming and that we were going to move to it, I started to port the code I had to HTML5, and I was actually using SVG for rendering, because SVG was similar to what Flash was using for rendering 3D. Believe it or not, I was basically creating a new SVG every frame, deleting all the nodes and adding all the nodes for all the triangles. I think at the time you could only do about 2,000 triangles; a scene with only 2,000 polygons was already too difficult for the CPU.

Then we had canvas, which is a much better option, but still not the most optimal. Then in 2011 we got WebGL in browsers, which is what you want to use: your computer has a GPU, and you want to use that GPU for doing 3D. You can go to the website to see some of the projects people have been doing. I'm supposed to update it, and I'm two years late on adding the latest projects, but one of my favourites is this Japanese company that really went overboard with all the things you can do. This is just a normal website. This is the introduction: you can move around, play with things, and go to the next section; now the animal becomes refractive; the next section is more informative, in a very trippy way, I guess; and in this section you can move the pen around and draw on the floor, and eventually the animal just flies away. To me this really brings back the kind of creativity, the kind of website, that we used to see with Flash. There is always going to be the battle of people saying this is not useful, or this is boring, but I still go for the prettier, more experimental things. That's mostly plain JavaScript.

If you're used to using frameworks, there are options too. For React, there's React Three Fiber, which brings the component approach from React and makes things much easier: you don't have to learn that much about how things work or how to connect things, you can just put things together. For Vue.js there is TresJS, which takes a similar approach. It would be nice if there were an easier way for all of them to reuse components, but it's a similar idea: basically a dialect of what we have in JavaScript, in every different flavour. If you use Svelte, there's Threlte, which is the same thing. They all use some of the code we have done at the base, but some of them build their own components and do their own work to make things much, much easier. And if you're more into a no-code kind of thing, there is a popular tool that also uses Three.js underneath, and it's pretty good for creating prototypes and for designers to play with.

More recently, Gareth has been working on a path tracer. For those of you who don't know, this is not focused on realtime; it's more like Blender or any 3D software like Maya, where you can render a scene in a way that looks much more realistic. The idea is that it uses the same API as much as possible, so it works with the scene your project already defines. A good example: if you're doing a furniture shop where you want the person to customise the furniture and see in a more realistic way how it is going to look, you can use this for a slower, non-realtime render. So if you have any scene you have made with Three.js, you can add this path tracer and render with it instead of the normal renderer. The first scene you see is without it, and then progressively it gets more realistic. A better example is this one: any time you move the camera it's realtime, basically a WebGL renderer running, but when you stop the camera, the path tracer starts working and makes the scene more realistic. It also gives the background depth of field and more shadowing, in general more realism than what you can do in realtime at this point. Another example, where we can really play with all the parameters it offers: how much you want the light to bounce in the object, the reflections, all the different parameters. Yet another example is Octopus T, I think. So this is realtime, and that becomes the path-traced one. This is the kind of thing we can do now in the browser.

If you want an easier way to try things, there is the Three.js editor. Let's see, we can do maybe a box and a sphere. If I grab the box, I can turn it into a plane, something like this, so we can see the sphere sitting on top of the box.
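To give a feel for the component approach React Three Fiber brings, here is a minimal sketch, assuming the @react-three/fiber package is installed; the lowercase tags map straight onto the corresponding Three.js classes.

```jsx
import { Canvas } from '@react-three/fiber';

// A tiny scene expressed as React components: tags like <mesh> and
// <boxGeometry> are created as the matching Three.js objects under the hood.
export default function App() {
  return (
    <Canvas camera={{ position: [0, 1, 3] }}>
      <ambientLight intensity={0.5} />
      <directionalLight position={[2, 2, 2]} />
      <mesh position={[0, 0.5, 0]}>
        <boxGeometry args={[1, 1, 1]} />
        <meshStandardMaterial color="orange" />
      </mesh>
    </Canvas>
  );
}
```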

3. Rendering with WebGPU and Shader Language

Short description:

To achieve realistic rendering, we can add emission to the sphere and save the image. WebGPU, a new API for 3D on the web, is becoming available in Chrome and offers optimized performance. The Three.js project has been working on a new renderer that uses WebGPU, with a WebGL 2 fallback for compatibility. A new shader language called TSL is being developed to simplify the use and combination of shaders, enabling the creation of custom materials.

So we turn this to the realistic render. Right now it doesn't show anything, because there is no light, but if we select the sphere and change the material's emissive colour, we add emission to it, which means the sphere itself is going to emit colour onto whatever is around it. We can also add another object, maybe a torus knot. Zoomed in too much... where am I? There you go. Then from here you can actually save that as an image, and basically you have a little Blender in the browser at this point. This is code that anyone can reuse in their project, for whatever you want to use it for.
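For reference, roughly the same scene can be written in plain Three.js; this is a hand-written sketch of the plane, emissive sphere and torus knot from the demo, not the editor's actual export.

```js
import * as THREE from 'three';

const scene = new THREE.Scene();

// A plane acting as the floor.
const floor = new THREE.Mesh(
  new THREE.PlaneGeometry(10, 10),
  new THREE.MeshStandardMaterial({ color: 0x888888 })
);
floor.rotation.x = -Math.PI / 2;
scene.add(floor);

// The sphere's emissive material makes it glow; it only lights its surroundings
// once a global-illumination renderer (like the path tracer) is used.
const sphere = new THREE.Mesh(
  new THREE.SphereGeometry(1, 32, 16),
  new THREE.MeshStandardMaterial({ emissive: 0xffffff, emissiveIntensity: 2 })
);
sphere.position.y = 1;
scene.add(sphere);

// The torus knot added at the end of the demo.
const knot = new THREE.Mesh(
  new THREE.TorusKnotGeometry(0.5, 0.15),
  new THREE.MeshStandardMaterial({ color: 0x3366ff, roughness: 0.2 })
);
knot.position.set(2.5, 0.7, 0);
scene.add(knot);
```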

So that's a bit of what we have had so far: the kind of work people are doing and the kind of frameworks people have been building on top, to the point that we even have path tracers available for websites. Since about last year, WebGPU has been available in Chrome. It's still not available in Safari or other browsers like Firefox, but we are confident it's coming, maybe not this year for Safari, maybe next year. WebGPU is basically a different API underneath for doing 3D on the web. It sits on top of Metal, Vulkan, and DirectX 12, native APIs that are better optimised for the hardware we have these days.
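Since support is still uneven across browsers, a site can check for the raw WebGPU API before deciding which rendering path to take; this check uses the browser's own navigator.gpu entry point and is independent of Three.js.

```js
// WebGPU feature detection: navigator.gpu only exists in browsers that ship
// the API, and requestAdapter() can still return null on unsupported hardware.
async function hasWebGPU() {
  if (!('gpu' in navigator)) return false;
  const adapter = await navigator.gpu.requestAdapter();
  return adapter !== null;
}

hasWebGPU().then((supported) => {
  console.log(supported ? 'WebGPU available' : 'No WebGPU, fall back to WebGL');
});
```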

For the last few years, Mugen, Sunak, and Reno have been working on a new renderer for the library. The idea is that the only thing you have to do is create a WebGPU renderer instead of a WebGL renderer. We have a bunch of examples already. We've been working for quite a few years trying to reach parity, to be compatible with the things that the WebGL renderer supports, which is a lot of them. We're not going to be able to support everything, but you can see a lot of what we have so far. The idea is that you only replace the renderer and it works straight away. At the same time, if you build a website you don't want to maintain two different renderers, so we're adding a WebGL 2 fallback to the WebGPU renderer: you can use the same code and not worry about whether Safari supports WebGPU, because we'll try to make it work using the WebGL 2 fallback. Again, not everything is going to work, but for most things, if you're not doing something very specific, it should.

One problem, though, is that WebGL has GLSL as its shading language and WebGPU has WGSL, which are basically different languages. You could ship a compiler on your website that compiles GLSL to WGSL, but at a minimum, the last time I checked, that compiler was about one megabyte, which is too much for a website. So something Sunak has been working on is yet another shading language, TSL, which is basically a node-based JavaScript shader abstraction. The same way Three.js was an abstraction over OpenGL to make it easier to use, TSL is trying to make shaders themselves easier to use and easier to combine with other things. Right now it's very difficult to take one shader and another shader and combine them easily. This language basically generates WGSL or GLSL, depending on the backend. It is also going to make it much easier to create custom materials.
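A minimal sketch of the renderer swap described above might look like the following. The exact import path is an assumption, since it has moved between releases (recent builds expose the WebGPU build as 'three/webgpu'); the point is that the scene and camera code stays the same, and the renderer falls back to WebGL 2 where WebGPU is missing.

```js
// Recent builds expose the WebGPU-enabled classes under 'three/webgpu';
// older releases shipped WebGPURenderer as an addon module instead.
import * as THREE from 'three/webgpu';

const renderer = new THREE.WebGPURenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.z = 3;

scene.add(new THREE.Mesh(new THREE.BoxGeometry(), new THREE.MeshNormalMaterial()));

// Same render loop as with WebGLRenderer; if WebGPU is unavailable,
// the renderer uses its WebGL 2 backend instead.
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```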

4. Custom Materials and WebXR

Short description:

It is now easier to create custom materials and combine textures using the new shader language TSL. The WebGPU work is progressing and aims to be ready for production by the end of the year. The WebXR API remains unchanged, and new devices are being developed. Two platforms, the Meta Quest and the Vision Pro with visionOS 2, will have WebXR enabled by default. Hand tracking and mesh detection are among the features being worked on.

It is also going to make it much easier to create custom materials. A good example is this one, where we have basically two textures: a colour map and a detail map. The idea is that you have a material with a colour texture, but you want to multiply another texture on top of the colour to make it more interesting. Think of a game: you usually have some shades of colour, and then you add another texture on top that adds some roughness to everything, so it doesn't look so plain. This is the code that will do that: you use a node material, you take the colour map texture, and you multiply the detail map onto it, and it's all JavaScript-based. We don't have to write a string any more, like with GLSL or WGSL. For comparison, if you wanted to do the same thing with what we had until now, modifying a built-in material, this is the kind of code you had to write. It's going to be interesting to see how people combine these things and the kinds of materials people will be able to make with this.

It also lets us do tree-shaking. Until now, all the built-in materials have shaders that we were not able to tree-shake when building a website, because anyone can create any material at any point and the shaders need to be there. Now we know exactly which nodes you're using and which code we should be generating; it's all JavaScript-based. It also allows procedural textures in a much easier way. A good example I saw today: someone was making what he calls a TSL texture. One of the interesting things with this is that it has effectively infinite resolution: if you wanted to do something like this as an image, it would be an 8K texture you have to load, but here you can also modify it in real time and do whatever you want with it. Another example, let's see, maybe this one. Yes, interestingly, this texture is being generated every frame, and it seems to be pretty performant. And again, the resolution is infinite.

So that's pretty much where the WebGPU work is so far. I don't think it's ready for production at this point, but we're getting pretty close. By the end of the year, we're aiming to be able to replace it and recommend that people start using it.
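A rough sketch of that colour map times detail map example in TSL could look like this. The import paths and texture files are assumptions (recent builds expose the node materials via 'three/webgpu' and the TSL helpers via 'three/tsl'); the point is that the shader logic is a JavaScript node graph assigned to colorNode rather than a GLSL or WGSL string.

```js
import * as THREE from 'three/webgpu';
import { texture, uv } from 'three/tsl';

// Hypothetical texture files standing in for the demo's colour map and detail map.
const loader = new THREE.TextureLoader();
const colorMap = loader.load('color.jpg');
const detailMap = loader.load('detail.jpg');

const material = new THREE.MeshStandardNodeMaterial();

// colorNode replaces the material's built-in colour computation with a node graph:
// the base colour multiplied by a tiled detail texture. The renderer generates
// WGSL (or GLSL for the WebGL 2 fallback) from these nodes.
material.colorNode = texture(colorMap).mul(texture(detailMap, uv().mul(8)));
```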

On the WebXR side, the API hasn't really changed; right now we're just waiting for devices to catch up. We did most of this work back in the Daydream era, then things stopped, and now people are starting to make devices again. On the renderer you enable XR, and then all the code to render for each eye is handled internally. At this point the two main platforms are the Meta Quest and, as announced yesterday, the Vision Pro: with visionOS 2 in fall 2024 it is going to have WebXR enabled by default. So now we have two platforms. It's only going to be VR, but it's good, it's exciting. Among the projects people have been doing so far, hand tracking is something that was kind of always there, and this is a good example of that. One of the new extensions we've been working on is mesh detection, so now you can get a geometry of whatever you have around you, whatever you see, which the browser gives you.
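As a reference point, enabling WebXR on the renderer really is just a couple of lines; a minimal sketch using the stock VRButton helper looks like this.

```js
import * as THREE from 'three';
import { VRButton } from 'three/addons/webxr/VRButton.js';

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Turn on XR support and add the "Enter VR" button that requests the session.
renderer.xr.enabled = true;
document.body.appendChild(VRButton.createButton(renderer));

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 100);
scene.add(new THREE.Mesh(new THREE.BoxGeometry(), new THREE.MeshNormalMaterial()));

// setAnimationLoop lets the XR session drive the frame rate; Three.js then
// renders the scene once per eye internally.
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```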

5. 3D Geometry and Scene Inclusion

Short description:

You can include 3D objects in the scene alongside whatever you have around you. This is all the work that we have done so far.

It gives you a 3D geometry of whatever you have around you; it's not something the website scans itself. There is also depth sensing, where you get a depth map of the scene you're looking at. In this case, you can see the bug went behind the screen, so you're able to include 3D objects in the scene together with whatever you have around you. And, yeah, that's pretty much it. This is all the work that we have done so far.

6. Understanding WebGL and WebGPU

Short description:

What concepts of WebGL do you believe are useful to understand before you start working on Three.js? The Z-buffer (depth buffer) is important to understand, as it affects the appearance of 3D transparent objects. Game development on the web faces challenges with download size and monetization. Performance in Three.js apps can be improved by optimizing 3D models and assets. The hope is that WebGPU will eventually replace WebGL, offering better control and performance.

What concepts of WebGL do you believe are useful to understand before you start working with Three.js? I guess the Z-buffer, and transparency. I can't explain very quickly what the Z-buffer is, but it's something that is good to understand first: basically, you're going to have 3D transparent objects that don't look correct, and it's because of that. And that's all you need to get started with Three.js? I think that's one of the main things to understand, for this specific question.
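He doesn't expand on it here, so as a rough illustration only: transparent meshes still interact with the depth (Z) buffer, and the usual knobs in Three.js are the material's depthWrite flag and the object's renderOrder. A minimal sketch:

```js
import * as THREE from 'three';

// Two nested transparent spheres: a classic case where depth testing and
// per-object sorting make transparency look wrong.
const glass = new THREE.MeshPhysicalMaterial({
  color: 0x88ccff,
  transparent: true,
  opacity: 0.4,
  depthWrite: false, // common mitigation: transparent surfaces don't write depth
});

const outer = new THREE.Mesh(new THREE.SphereGeometry(1, 32, 16), glass);
const inner = new THREE.Mesh(new THREE.SphereGeometry(0.5, 32, 16), glass.clone());

// When the automatic back-to-front sort isn't enough, force an explicit draw order.
inner.renderOrder = 1;
outer.renderOrder = 2;
```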

What do you think about game development on the web? Will we ever see a AAA game on the web, or would that be a pipe dream? Technically we can, but an AAA game nowadays is maybe 100 gigabytes that you have to download. You're not going to download 100 gigabytes from a website to play a game, and then maybe next time it's gone because the browser had to give the resources to another website. People are doing games, especially more Flash-style games, but the AAA side is going to be difficult, mostly because of download size. And of course, how do you monetize it if it's in the browser? I think you can still monetize it in a similar way to Flash: there's Poki, and people tend to do small games with banners around them. That's coming back; it's doable and it's happening. For AAA it's going to be different.

What are your top tips for making performant Three.js apps? The main thing is to be careful about how complicated the 3D model is. If it has a lot of triangles and a lot of detail, and you really need that detail, then learning Blender to optimise things is the easiest way to simplify them. If you have CAD data from a car company with a million objects, that's going to be slow; if you combine all of them into one, it's going to be much faster. In general it's more about asset creation than the code side: at the end of the day, it tends to be the assets that need to be optimised. So it's a question of thinking about what level of detail you actually want before you even start writing the code. It's similar to images: you don't load a PSD file, and you don't want a 10K image for a logo, so you need to be mindful of optimising the assets for what you need. For sure, although I've seen many websites with 20MB images for no reason at all.
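One concrete form of "combine all of them into one" is merging meshes that share a material into a single mesh to cut draw calls. A sketch, assuming the meshes share one material; older Three.js releases name the helper mergeBufferGeometries instead of mergeGeometries.

```js
import * as THREE from 'three';
import { mergeGeometries } from 'three/addons/utils/BufferGeometryUtils.js';

// Bake each mesh's world transform into its geometry, then merge everything
// into one mesh so it renders in a single draw call.
function combineMeshes(meshes, material) {
  const geometries = meshes.map((mesh) => {
    mesh.updateWorldMatrix(true, false);
    const geometry = mesh.geometry.clone();
    geometry.applyMatrix4(mesh.matrixWorld);
    return geometry;
  });
  return new THREE.Mesh(mergeGeometries(geometries), material);
}
```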

Do you think WebGPU will eventually overtake WebGL? That's the hope, that eventually it will go that way. WebGL is still going to be there, but browser vendors are interested in having WebGPU as the main API, because you have to deal with fewer driver issues and you get more control and more performance in general, so it's a modernising step. Eventually, the browser is going to run WebGL on top of WebGPU, kind of: not WebGPU itself, but whatever is running under WebGPU; they're going to have WebGL running on that all the same.

7. Optimizing Assets and the Future of WebGL

Short description:

Optimizing assets is important, similar to images. WebGPU may overtake WebGL, offering more control and performance. Eventually, WebGL will run on top of WebGPU, modernizing the technology. It will become a legacy layer for supporting old code.

In the same way that on macOS they have OpenGL but it's not really OpenGL, it's OpenGL on top of Metal, eventually everything is moving to the newer APIs. WebGL will end up actually running on WebGPU under the hood and just being a legacy layer for supporting old code.

8. Favorite Three.js Project and Future Possibilities

Short description:

Ricardo's favorite project built with Three.js is the Johnny one. He continues to be amazed by what people create with the platform and believes that the web can regain the same level of creativity as Flash with an equivalent editor. Ricardo emphasizes the importance of open software and the current generation's role in building innovative tools. The Japanese animal website performs excellently and works on any phone. The URL is nextjuni.jp.

Cool. What's your favorite project that was built with Three.js? It doesn't have to be one of your own. I guess the one that I showed, the Johnny one, for now; that's the one that brings me the most joy: sure, let's put everything in there, it's playful, it's good. So even now you still see stuff that surprises you and makes you go, oh, wow. Yeah, it was a warm feeling when I saw that project. I've been surprised for a long time by what everyone is doing with it. I understand very little of what's happening at this point; I can follow some of the layers, but when I see what people are doing, I'm pretty impressed.

Gotcha. It's overwhelming. It's kind of overwhelming. Here's one that I bet you get asked many times: Mr.doob, Flash was peak creativity and an easy tool to build animation. Do you believe Three.js could achieve the same thing if there were an equivalent editor? I think we can get there. Something I think about from time to time is that Flash at the time was proprietary software, and we didn't want the web to depend on proprietary software, so now we're making it more open. At this point, if we want that kind of creativity back, it's up to us, the current generation, to build it. If we wait for other people to build it, we're probably not going to have it. I think eventually we'll get there. We keep making sure that everything is open and building the different components, and I think we're getting there. It's just taking some time.

Somebody asked, what's the performance like of that wonderful Japanese car? The animal? Oh, the animal one? It runs very well. The person who made it knows very well what they're doing; it's pretty well optimised and it works on any phone. Would you mind reminding us of the URL? Sure, I think it's nextjuni.jp. Pretty sure. Excellent. Thank you very much, Ricardo. That was really interesting. Give it up for Ricardo, Mr.doob.

Mr.doob
27 min
13 Jun, 2024

Comments

  • GitNation resident
    Nice talk! But too bad in practice Threejs doesn't leverage GPU for simple things like Point Clouds. The current PointMaterial isn't based on WebGL. Looking forward to better compatibility with ShaderMaterials and Points! Amazing work.
