Embracing WebGPU and WebXR With Three.js

For around 14 years, the Three.js project has evolved into a community-driven effort with no roadmap, led by Mr.doob. Initially, the project used Flash for 3D engine work and then transitioned to SVGs before adopting WebGL in 2011 for efficient GPU utilization. The project's website showcases various projects, including an interactive section reminiscent of Flash. React Three Fiber and TresJS are frameworks that simplify development, and a no-code tool powered by Three.js is also available. WebGPU, a new API for 3D on the web, is being integrated into Three.js with a WebGL2 fallback for compatibility. A new shader language called TSL is being developed to simplify creating custom materials. The WebXR API remains unchanged, and new devices like the Meta Quest and Apple Vision Pro will have WebXR enabled by default. Optimizing 3D models and assets is crucial for improving performance in Three.js applications. WebGPU may eventually overtake WebGL, with WebGL running on top of WebGPU as a legacy layer. Ricardo's favorite project built with Three.js is the Junni one, and he continues to be amazed by the creativity of the community. The performance of the Japanese Junni project (next.junni.co.jp) is excellent, and it works on any phone.
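To make the WebGPU point concrete, here is a minimal sketch of rendering a scene through the WebGPU backend. It assumes a recent three.js build that exposes the 'three/webgpu' entry point (entry points and class names have moved between releases), with the renderer expected to fall back to WebGL2 where WebGPU is unavailable.

```js
// Minimal sketch: a spinning cube rendered through the WebGPU backend.
// Assumes a recent three.js build exposing the 'three/webgpu' entry point;
// on browsers without WebGPU the renderer is expected to fall back to WebGL2.
import * as THREE from 'three/webgpu';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;

const mesh = new THREE.Mesh(
  new THREE.BoxGeometry(),
  new THREE.MeshStandardMaterial({ color: 0x3388ff })
);
scene.add(mesh);
scene.add(new THREE.DirectionalLight(0xffffff, 2));

const renderer = new THREE.WebGPURenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

renderer.setAnimationLoop(() => {
  mesh.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```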

From Author:

In the rapidly evolving landscape of web technologies, the introduction of WebGPU and WebXR represents a significant leap forward, promising to redefine the boundaries of what's possible in 3D web experiences. This talk dives into the heart of these new technologies, guided by the Three.js library.


We begin by exploring WebGPU, a next-generation graphics API offering enhanced performance and efficiency for rendering 3D graphics directly in the browser. We'll demonstrate how Three.js is adapting to harness its full potential, unlocking unprecedented opportunities for developers to create visually stunning interactive experiences.


Transitioning to the immersive realm, we delve into WebXR, a technology that opens the door for virtual reality and augmented reality experiences right from the web. We'll showcase how Three.js enables creators to build immersive experiences.
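As a rough illustration of how little extra code the three.js side of WebXR involves, here is a sketch using the stock VRButton helper shipped with the three.js addons:

```js
// Sketch of opting an existing three.js app into WebXR.
// VRButton is the stock helper shipped with the three.js addons.
import * as THREE from 'three';
import { VRButton } from 'three/addons/webxr/VRButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.1, 50);

const room = new THREE.Mesh(
  new THREE.BoxGeometry(4, 3, 4),
  new THREE.MeshBasicMaterial({ color: 0x223344, side: THREE.BackSide })
);
scene.add(room);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true;                                  // allow XR sessions
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer));  // "Enter VR" button

// Use setAnimationLoop rather than requestAnimationFrame so the
// headset can drive the frame rate once a session starts.
renderer.setAnimationLoop(() => {
  renderer.render(scene, camera);
});
```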

This talk was presented at JSNation 2024. Check out the latest edition of this JavaScript conference.

FAQ

Ricardo has been leading the Three.js project for around 14 years.

The Three.js project has had around 2,000 contributors so far.

No, the Three.js project does not have a roadmap; it is community-led, with contributors working on what they think should be done.

Initially, the Three.js project used Flash for 3D engine work and then moved to using SVGs for rendering.

WebGL became available in browsers in 2011. It is important because it allows the use of the computer's GPU for 3D rendering, making it more efficient.

React Three Fiber is a framework that brings React's component approach to Three.js, making it easier to connect and compose components.
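A small sketch of what that component approach looks like in practice with React Three Fiber (package and element names as commonly documented; treat the details as illustrative):

```jsx
// Sketch: a spinning box expressed declaratively with React Three Fiber,
// where three.js objects become JSX elements.
import { Canvas, useFrame } from '@react-three/fiber';
import { useRef } from 'react';

function SpinningBox() {
  const ref = useRef();
  // useFrame runs every rendered frame, so we can mutate the three.js object directly.
  useFrame((_, delta) => { ref.current.rotation.y += delta; });
  return (
    <mesh ref={ref}>
      <boxGeometry />
      <meshStandardMaterial color="hotpink" />
    </mesh>
  );
}

export default function App() {
  return (
    <Canvas camera={{ position: [0, 0, 3] }}>
      <ambientLight intensity={0.5} />
      <directionalLight position={[2, 2, 2]} />
      <SpinningBox />
    </Canvas>
  );
}
```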

The Three.js editor allows users to create and modify 3D scenes directly in the browser, similar to a lightweight version of Blender.

WebGPU is a new API for doing 3D on the web, built on modern technologies like Metal, Vulkan, and DirectX 12. It offers more control and performance compared to WebGL, which is based on the older OpenGL.
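The summary above also mentions TSL, the node-based shader language meant to replace hand-written GLSL/WGSL for custom materials. A minimal sketch, assuming a recent three.js build that exposes the 'three/webgpu' and 'three/tsl' entry points (helper names have shifted between releases, so treat this as an assumption rather than a frozen API):

```js
// Minimal TSL sketch: describe a fragment colour as a node graph instead of
// writing raw WGSL/GLSL. Entry points and helper names are assumptions based
// on recent three.js builds.
import * as THREE from 'three/webgpu';
import { uv, vec3, mix } from 'three/tsl';

// A horizontal gradient from blue to orange, driven by the mesh's UV x coordinate.
const material = new THREE.MeshBasicNodeMaterial();
material.colorNode = mix(vec3(0.1, 0.3, 1.0), vec3(1.0, 0.5, 0.1), uv().x);

const quad = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), material);
```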

To optimize performance in Three.js applications, focus on simplifying 3D models, reducing the number of triangles, and optimizing assets before writing code.
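One concrete example of asset-side optimisation is shipping Draco-compressed glTF models and decoding them at load time; a hedged sketch, where the decoder path and model filename are placeholders:

```js
// Loading a Draco-compressed glTF. Both the decoder path and the model
// filename below are placeholders — point them at your own assets.
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/addons/loaders/DRACOLoader.js';

const scene = new THREE.Scene();

const dracoLoader = new DRACOLoader();
dracoLoader.setDecoderPath('/draco/'); // hypothetical path to the Draco decoder files

const loader = new GLTFLoader();
loader.setDRACOLoader(dracoLoader);

loader.load('models/chair-draco.glb', (gltf) => { // hypothetical, pre-compressed asset
  scene.add(gltf.scene);
});
```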

Yes, the hope is that WebGPU will eventually overtake WebGL as the main API for 3D on the web, offering better performance and modern features.

Mr.doob
27 min
13 Jun, 2024

Comments

  • GitNation resident
    Nice talk! But too bad in practice Three.js doesn't leverage the GPU for simple things like point clouds. The current PointsMaterial isn't based on WebGL. Looking forward to better compatibility with ShaderMaterials and Points! Amazing work.

Video Transcription

1. Introduction to the Three.js Project

Short description:

I've been leading the Three.js project for around 14 years, and it is all community-led with no roadmap. Ren Yuan did a visualization of the Git repo, showing how it evolved over the years. The project has had around 2,000 contributors, and I appreciate that it gives anyone the opportunity to fix things without the need for contracts.

I'm Ricardo, and I've been leading the Three.js project for around 14 years at this point. The project has had around 2,000 contributors so far, and one thing I find interesting is that we never really had any roadmap for the project. It is all community-led; like I said, I guide a little bit, but in general everyone is trying to do what they think should be done. If you're curious to see what such a project, such an organisation, looks like, Ren Yuan did a visualisation of the Git repo for the last 14 years. You will see in the centre, that's basically me, a little ball moving around, and the files are on the edges. For the first few years it was pretty much just me; some people would join and try to help with some parts, but all those little dots are people who are helping to fix this or that thing. I really enjoy the fact that we give this opportunity to anyone: if you want to fix something, it's pretty easy to jump in and fix that thing, compared to some of the bigger projects where you're required to sign contracts and things like that.

2. Evolution of the Three.js Project

Short description:

After 14 years, the Three.js project has evolved into a community-driven effort with numerous contributors. It started with 3D engine work in Flash, then transitioned to using SVGs for rendering in HTML5. Eventually, WebGL became the preferred option for utilizing the GPU for 3D. The Three.js website showcases various projects, including an interactive section that evokes the creativity and experimentation of Flash. Additionally, there are frameworks like React Three Fiber and TresJS for easier development, as well as a no-code tool powered by Three.js. The project has also seen advancements in path tracing, allowing for more realistic scene rendering. The browser can now handle both real-time and path-traced scenarios, as demonstrated by examples like Octopus T. Overall, the capabilities of the Three.js project continue to expand.

So, after 14 years, it kind of looks like this: there's a bunch of people who are the main contributors, but there are also a lot of people helping a lot. For those of you who don't know, I'm going to give a little bit of introduction, a little bit of background too. Originally, and this is actually longer than 14 years ago, I started doing 3D engine work, exploring this in Flash, already in 2006, 2007, and kept working on that for a few years. Then, when we started to see HTML5 coming and that we were going to move into that, I started to port the code that I had to HTML5, and I was actually using SVGs for rendering, because SVG was similar to what Flash was using for rendering 3D. Believe it or not, I was basically creating a new SVG every frame: deleting all the nodes and adding all the nodes for all the triangles. I think at the time you could only do 2,000 triangles or something like that; a scene with only 2,000 polygons would already be too difficult for the CPU.

Then we had the canvas renderer, and, you know, it is a much better option, but still not the most optimal. And then, in 2011, we got WebGL in browsers, which is what you want to use: your computer has a GPU, and you want to use your GPU for doing 3D. You can go to the website to see some of the projects that people have been doing. I'm supposed to update this, I'm two years late for updating it with the latest projects, but one of my favourites is this Japanese company that really went overboard on all the things that you can do with it. So, you know, this is just a normal website. This is the introduction. You can move around, play with the things, and go to the next section; now the animal becomes refractive. Go to the next section, and it is more informative and a very trippy way of presenting it, I guess, and in this section you can move the pen around and draw on the floor, and eventually the animal just flies away. This, to me, really brings back a lot of the kind of creativity, the kind of website, that we used to see with Flash, which is always going to be the battle of people saying this is not useful and this is boring, so, you know, I still go with the prettier, more experimental things. That's mostly plain JavaScript.

There are also frameworks, if you're used to using frameworks. For React, there's React Three Fiber, which brings the component approach from React and makes things much easier for people: you don't have to learn that much about how things work or how to connect things, you can just put things together much more easily. For Vue.js, there is TresJS, which takes a similar approach. It would be nice if there was an easier way for all of them to reuse components, but it's a similar idea; basically it's a dialect of the library in every different flavour. If you use Svelte, there is also Threlte, which is the same thing. They all use some of the code that we have done on the base, but some of them do their own components and their own work to make things much, much easier. And if you're more into the no-code kind of thing, there is a popular tool which also uses Three.js underneath, and it's a pretty good tool for creating prototypes and for designers to play with.

More recently, Garrett has been working on a path tracer. For those of you who don't know, this is not focused on realtime; it's more similar to Blender or any 3D software like Maya, where you are able to render a scene in a way that looks much more realistic. The idea is that it tries to use the same API as much as possible, so your project keeps the scene you're defining. A good example is a furniture shop where you want the person to customise the furniture and see in a more realistic way how it is going to look; you can use this for doing a slower, non-realtime render. So if you have any kind of scene that you have done with Three.js, you can now add this path tracer, and instead of rendering with the normal renderer, you render with this. The first view you see is without it, and then progressively it gets more realistic. A better example is this one: any time you move the camera, it's realtime, basically the WebGL renderer running, but when you stop the camera, the path tracer starts working and tries to make the scene more realistic. It also gives the background depth of field, more shadowing, in general more realism than what you can do in realtime at this point. Another example offers a lot of parameters to play with, like how much you want the light to bounce in the object, the reflections, all the different parameters. Yet another example is Octopus T, I think. So this is realtime, and that becomes the path-traced one. This is the kind of thing we can do now in the browser.

If you want an easier way to try things, there is the Three.js editor. So, let's see, we can do maybe a box and a sphere. If I grab the box, I can make it turn into a plane, something like this. We can see that right now we have this sphere sitting on top of the box.
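For reference, a rough sketch of the "rasterise while the camera moves, path-trace when it stops" pattern described above, using the three-gpu-pathtracer package; the class and method names here reflect recent versions of that library and are an assumption, not a guaranteed API:

```js
// Rough sketch: rasterise while the camera moves, accumulate path-traced
// samples when it stops. WebGLPathTracer, setScene and renderSample are
// assumptions based on recent three-gpu-pathtracer releases.
import * as THREE from 'three';
import { WebGLPathTracer } from 'three-gpu-pathtracer';

const scene = new THREE.Scene();       // your existing scene
const camera = new THREE.PerspectiveCamera(50, innerWidth / innerHeight, 0.1, 100);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const pathTracer = new WebGLPathTracer(renderer);
pathTracer.setScene(scene, camera);

let cameraMoving = false;              // flip this from your controls' change/end events

renderer.setAnimationLoop(() => {
  if (cameraMoving) {
    renderer.render(scene, camera);    // fast rasterised preview while interacting
  } else {
    pathTracer.renderSample();         // refine one more path-traced sample per frame
  }
});
```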

Check out more articles and videos

We constantly curate articles and videos that might spark your interest, skill you up, or help you build a stellar career.

Unreal Engine in WebAssembly/WebGPU
JS GameDev Summit 2022
33 min
Top Content
Alex Saint-Louis, co-founder of Wunder Interactive, shares the mission of bringing Unreal Engine to the browser, enabling cross-platform 3D applications and games. They are working on a WebGPU back end for Unreal Engine to push the limits of 3D on the web. Wunder Interactive has improved compression, built their own asset file system, and offers powerful tools for game developers. They are utilizing modern web technologies like WebAssembly, WebGL, and WebGPU, and plan to support other engines like Unity and Godot. The team aims to transform the industry by bringing console-quality games to the browser and providing an alternative distribution path. They are excited to bring Unreal Engine 5 to the web with WebGPU support and are working on WebXR support for cross-platform 3D experiences, including VR.
Build a 3D Solar System with Hand Recognition and Three.js
JSNation 2022
36 min
Top Content
This Talk explores the use of TypeScript, Three.js, hand recognition, and TensorFlow.js to create 3D experiences on the web. It covers topics such as rendering 3D objects, adding lights and objects, hand tracking, and creating interactive gestures. The speaker demonstrates how to build a cube and a bouncy box, move objects with flick gestures, and create a solar system with stars and planets. The Talk also discusses the possibilities of using hand gestures for web navigation and controlling websites, as well as the performance limits of these technologies.
Makepad - Leveraging Rust + Wasm + WebGL to Build Amazing Cross-platform Applications
JSNation 2022
22 min
Top Content
Welcome to MakePad, a new way to build UI for web and native using WebAssembly and Rust. JavaScript is not suitable for complex applications like IDEs and design tools. Rust, a new programming language, was used to reimagine MakePad, resulting in a fast and efficient platform. MakePad offers live editing, high CPU performance, and the ability to load native instrument components. The future of MakePad includes an open-source release, a design tool, and support for importing 3D models.
Making “Bite-Sized” Web Games with GameSnacks
JS GameDev Summit 2022
33 min
Top Content
Welcome to making bite-sized games with GameSnacks, a platform that focuses on optimizing game sizes for easy accessibility on the web. Techniques such as lazy loading, script placement, and code and art optimization can greatly improve game performance. Choosing the right file formats, reducing game size, and using game engines or custom tools are important considerations. Prioritizing file size, testing internet connections, and using testing tools for accurate simulation can help attract more users and improve game retention and reach.
Extending Unity WebGL With Javascript
JS GameDev Summit 2022
32 min
Top Content
Unity targets over 25 platforms and technologies, including desktop, mobile, and virtual reality. They use Emscripten to compile the engine and game logic into WebAssembly for web development. Unity can be extended with plugins to access browser features like WebXR's augmented reality mode. The speaker demonstrates intercepting Unity's calls to the browser to modify its behavior. Unity is actively working on mobile support for web export and improving documentation for extending Unity with web plugins.
React + WebGPU + AI – What Could Go Wrong? 😳
JSNation 2023
31 min
With AI and WebGPU, it's an exciting time to be a developer. The speaker's journey involves combining programming and design, leading to the creation of Pure Blue, a powerful programming environment. Adding AI to the mix, the speaker discusses the potential of AI in the creative process and its impact on app development. The talk explores the role of React components and WebGPU in enabling fine-grained editing and running AI models locally. The future of app development is discussed, emphasizing the need to stay ahead of the curve and leverage the power of JavaScript.