WebXR? Virtual Reality and Augmented Reality Natively on Browsers


Dive into the exciting world of WebXR in my talk about bringing Virtual Reality (VR) and Augmented Reality (AR) directly to web browsers. As a developer, I've seen how technology can spark creativity. 


In this session, I'll introduce the A-Frame framework, showing how easy it is to create amazing and interactive experiences on the web. 


This isn't just a tutorial; it's an invitation for you to break away from the ordinary and explore the endless possibilities of making engaging web applications.

This talk was presented at JSNation 2024. Check out the latest edition of this JavaScript conference.

FAQ

What is the main focus of Erick's talk at JS Nation?
The main focus of Erick's talk at JS Nation is to motivate developers to explore the power of web technologies, particularly for creating virtual and augmented reality experiences with tools like WebXR and A-Frame.

Which tools does Erick recommend for building VR and AR on the web?
Erick recommends WebXR and A-Frame for developing VR and AR experiences on the web. He also mentions Three.js as the foundational library that A-Frame is built on.

What is WebXR?
WebXR is a web standard that combines augmented reality (AR) and virtual reality (VR) into a single API. It allows developers to create immersive experiences that run on a variety of devices, including browsers on mobile devices and desktops as well as VR headsets.

How can I start experimenting with WebXR and A-Frame?
To start experimenting with WebXR and A-Frame, you can follow the resources Erick provides, including research, references, and links. He encourages developers to try creating their own projects and to use emulators or editors, like the one from Niantic, to speed up development.

What are common applications of augmented reality mentioned in the talk?
Common applications of augmented reality (AR) mentioned in the talk include location-based experiences like Pokemon Go, QR code-based interactions for marketing, and marker-based AR for interactive cards and advertisements.

What challenges come with VR and AR technologies?
Challenges include potential dizziness or discomfort when using devices like Google Cardboard, and the need for powerful hardware to render complex 3D models and animations smoothly.

What is A-Frame?
A-Frame is an abstraction layer built on top of Three.js that simplifies VR and AR development. It lets developers create immersive experiences with a single codebase that runs on multiple devices, including VR headsets, mobile phones, and desktops.

Why is the MetaQuest 2 significant in the talk?
The MetaQuest 2 is significant because it represents an affordable entry point for experimenting with VR. Erick discusses how he used it to explore independent VR content and open-source projects available on platforms like SideQuest.

Should we focus on the hardware or the APIs?
Erick suggests focusing on middleware APIs like WebXR that run across different devices, rather than on the hardware itself. He also emphasizes the need for more options and for powerful yet affordable AR/VR glasses.

What does Erick see in the future of WebXR?
Erick sees significant potential in the future of WebXR with Apple's involvement, especially with the new Vision OS. He believes this will lead to more widespread adoption and faster development of immersive web technologies.

Erick Wendel
28 min
13 Jun, 2024

Video Summary and Transcription
The talk explores the power of the web, including virtual and augmented reality experiences. WebXR and A-Frame enable creating native-like experiences using web technologies, and tooling and resources are available to develop and experiment with WebXR. The possibilities for web development and creative projects are endless. The Q&A covers improving AR devices, enhancing PDF viewers, AI assistants, and JavaScript abstraction layers, and closes with positive audience feedback.

1. Introduction to JS Nation and the Power of the Web

Short description:

Hello, JS Nation! I flew from Brazil to show you experiments and motivate you to see the power of the web. Take pictures, mention us on social media. Are you new to JS Nation? Let's talk about VR glasses, coding for VR, and Pokemon Go. Let's go beyond the 9 to 6pm routine and use technology to improve creativity. Let's explore virtual and augmented reality experiences. Have you tried Google Glasses or Cardboard? The Metaverse is more advanced. Look at this game running on glasses. Augmented reality is even more exciting.

Hello, JS Nation! Whoo! It's so good to be here. A lot of faces right there. It's amazing. I flew all the way back from Brazil. And today I'm gonna show you a bunch of experiments and I'm trying to motivate you to see what we can do on the web and how powerful we are right now. Please, take as many pictures as you can. This helps me a lot. This helps the event as well, so don't forget to mention us on your social media. And let's get going.

So, first of all, who is here for the first time at JS Nation? Wow! A lot of people. So, this was me back in 2022. I was showing how I was using JavaScript to integrate with my house, how my wife was getting mad at me for doing a lot of engineering there, but it was pretty amazing. So, yes, JS Nation is one of my favourite conferences ever. Who here has been playing with VR glasses? Ooh, nice. Who has actually coded something for VR glasses? Ooh, just a few people. It's amazing. What about Pokemon Go? Who still plays Pokemon Go? Don't be ashamed, it's fine. Okay. So, I wonder if you are playing in... Well, don't do like this guy, right? Technology is evolving so hard and so fast, but sometimes we think, well, I don't think this is normal at all, but I will definitely do this, right? Okay, every time I do some conference or some talk, I know I have just a short time to introduce some ideas, so I brought some goals for you to keep in mind throughout this presentation. First, I prepared all the research, all the references, all the links, and I put them in a single video so you can follow along later. My goal here is not just for you to listen to this talk, but to experiment with it at your own house, create something, and let me know, because I love how the technology is evolving. And I'm always telling everyone, man, we should do something beyond the 9-to-6, right? We should use technology to improve our imagination, our creativity, and actually to make money on YouTube, right? And, as I tell people, we should see technology or programming as something fun, too, so you can have your own hobby alongside your job: you're creating, you're using ChatGPT, you're using web APIs, there are so many things we are doing beyond the CRUD we do every day, which is sometimes just boring. So, I started figuring out how I would create virtual reality or augmented reality experiences, so I started researching and I found out there are so many possibilities that I've never heard anyone talking about, and it's actually much easier than I thought.

Before we go on, who here has had these Google Glasses or a Google Cardboard? Well, I had one I got from an event, too, but I was feeling a bit dizzy, I couldn't find, I don't know, the sensation, the emotions there, I wasn't feeling that good, so I was like, well, this is just hype. Then I saw Uncle Mark playing with the Metaverse and it was like, these are the graphics we saw on the Nintendo Wii ten years ago, come on! We have PS5, PS4, we have powerful computers, we should do something way nicer. Then I saw this game. Look how beautiful this is. And this is running on the glasses, so I was like, wow, there are probably a lot of things that I'm not looking at. And something that got me even more excited is augmented reality, where you can have your characters, your enemies and monsters interacting with your own environment. So this got my attention, I was like, wow, this is way beyond what Zuckerberg was showing in his presentation, and we can do it.

2. Exploring the MetaQuest 2 and WebXR

Short description:

I bought the MetaQuest 2 and discovered the independent Metastore with open source projects. Instead of Beat Saber, I found Moon Rider, a web-based game using WebXR. It's crazy how web technologies can create native-like experiences. But be careful, it can also be dangerous. I accidentally pushed my wife and dog during a demo.

So, I bought the MetaQuest 2, which is now outdated, but it was the cheapest one I could find. And someone told me about a store, an independent store outside the Facebook store, the Meta store, and there you're going to see a lot of open source things, a lot of independent developers who are building projects, building games, and you can just start using them right away. Most of them use C# and Unity, but I was like, okay, nothing special right here, I could install a lot of things, let's just move on.

When I started asking my friends, well, what should I install, they said, well, Beat Saber. Beat Saber is amazing. But when I saw the price, I was like, well, no, I won't buy this. But then I started researching on SideQuest and I saw a reinterpretation of it, which was open source. It's called Moon Rider. And something pretty interesting here: WebXR. I was like, wait, if it contains the name web in it, this means that it could run in the browser. So, how does this work? I have my browser there, and it actually feels like a native application. See, it's pretty beautiful. I'm going to fast forward here just so we can see it playing games, too. But it's the same thing. So, I didn't have to go to C#, I didn't have to build anything native. I can use the same environment, the same ideas we are using for the web for this kind of application as well. This is crazy.

For the next section, I should warn you. This technology is amazing, but it's pretty dangerous. I'm going to show you why. So, I was recording the demos for here and suddenly... Let's see again. Well, I pushed so hard. Once I pushed my wife, once I pushed my dog... So, yeah.

3. Exploring A-Frame and WebXR

Short description:

I pushed my wife and dog during a demo. WebXR is AR plus VR, usable on Chrome and Android. A-Frame is an abstraction for system calls. It can run on various devices using a single code. A-Frame is built on Three.js. I experimented with a game and used Google Cardboard for VR in the browser.

I think the safety system there is not so good. So, we should be aware that this can happen.

Talking about Moon Rider, I saw two names there: A-Frame and WebXR. So, I got fancy words to search for. WebXR and A-Frame, what are they? The first thing I found when searching about them was Google showing how Chrome works. So, basically, WebXR is actually AR plus VR. Google had this amazing video showing how those things work. You can use it on Chrome, you can use it on Android. The experience is amazing. You use native APIs, and they are not only Google's: it's a web standard, so you can use it on whatever browser implements it. And A-Frame is just an abstraction, like Angular or React, over those calls to the operating system.
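
To make this concrete, here is a minimal sketch of what the raw WebXR API looks like from JavaScript: feature detection plus requesting a session. Frameworks like A-Frame wrap exactly these calls. The button id is made up for the example, and the WebGL rendering setup a real app needs is omitted.

```html
<!-- Minimal sketch: detecting WebXR support and requesting an immersive session. -->
<button id="enter-vr" disabled>Enter VR</button>
<script>
  const button = document.getElementById('enter-vr');

  // navigator.xr is the entry point defined by the WebXR Device API.
  if (navigator.xr) {
    navigator.xr.isSessionSupported('immersive-vr').then((supported) => {
      button.disabled = !supported;
    });
  }

  button.addEventListener('click', async () => {
    // Must be called from a user gesture. 'immersive-ar' works the same way
    // on browsers and devices that support AR sessions.
    const session = await navigator.xr.requestSession('immersive-vr');

    // From here a real app sets up an XRWebGLLayer and a render loop with
    // session.requestAnimationFrame(...). Omitted for brevity.
    session.addEventListener('end', () => console.log('Session ended'));
  });
</script>
```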

So, one thing that is amazing about A-Frame is that with a single codebase, I can have my code running on the Meta Quest, on mobile devices, on the desktop, and even on Apple Vision. My God! Let's see this working in practice. A-Frame is built on top of Three.js. Has anyone here played with Three.js? Wow! So, you guys and girls are ready to start working with this right away. I'm a back end developer, by the way, so this for me is magic, right? Well, when I started doing some experiments, I saw this game. I'm using it here on my phone, with the same experience I've always had on my phone. And I have the button right there, VR. When I click it, I can use the Google Cardboard and see it working just fine. And see, it's in the browser. So, I can have the same game or the same experience running on any of those devices.
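
As a reference point, this is roughly what that single codebase looks like: one HTML file using A-Frame primitives. On desktop you get mouse and keyboard controls, on a phone you get the magic-window view plus the VR button for Cardboard, and a headset browser enters immersive mode from the same page. The version number in the script URL is just an example.

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- A-Frame from the official CDN; version here is illustrative. -->
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- One scene, declared like regular HTML. A-Frame adds the VR/AR button automatically. -->
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```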

4. Getting Started with Tooling

Short description:

You can have the same game or experience on any device. You need an emulator and can use an API emulator or a fancy editor like the one from Niantic. They have a library of projects, but there is a cost. To use WebXR on Apple devices, you need to install alternative browsers.

So, now you're a hackerman, right? Now you're ready to start implementing, and we should take a look at the tooling. What do you need to start playing with it?

First, you're going to need some emulator, right? So, when I started, I was debugging things, I was deploying things on my phone, trying to figure out the environment, moving and everything. But here, you can use just an API emulator that works just fine. If you want to go fancier, there is a very nice editor. The company is Niantic, the same company that built Pokemon Go.

And there, they have an emulator. You can load your different videos, and you can train and check the depth in your videos, your experiences. This, for me, was way easier. They have a huge library of projects, too. But still, you have to pay. So, if you don't want to pay, you can actually set things up by yourself. Well, I figured out Apple doesn't allow you to use WebXR natively in the browser, not even in Safari, so I had to install alternative browsers to make it work. And then we are all set, okay? So, see how amazing this is.

5. Exploring 3D Models and Augmented Reality

Short description:

You can download 3D models and use them. There is an amazing library where you can download models for free or buy others. A-Frame provides a lot of environment components for virtual reality. For augmented reality, you can use LIDAR or computer vision to analyze the environment. There are different types of augmented reality, including marker-based and location-based.

I have A-Frame, I have some components, I have an entity, and see, I'm going to draw a sun, a solar system, with just a few lines of code. I can start working right away, and I can animate it; it's pretty powerful. However, when I got there, I was like, okay, this is not enough, because I want something powerful, something I can use. So, you can download 3D models that are already done and start using them.
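
A sketch of that solar-system idea using A-Frame's built-in animation component, assuming the A-Frame script include from the earlier snippet: a sun, plus a pivot entity that rotates so the child planet orbits it. Sizes, colors, and durations are made up for illustration.

```html
<a-scene>
  <!-- The sun: a simple sphere in the middle of the scene. -->
  <a-sphere position="0 1.6 -4" radius="0.8" color="#FDB813"></a-sphere>

  <!-- A pivot entity at the sun's position; rotating the pivot makes the child planet orbit. -->
  <a-entity position="0 1.6 -4"
            animation="property: rotation; to: 0 360 0; dur: 8000; loop: true; easing: linear">
    <a-sphere position="2 0 0" radius="0.25" color="#2E86DE"
              animation="property: rotation; to: 0 360 0; dur: 3000; loop: true; easing: linear">
    </a-sphere>
  </a-entity>

  <a-sky color="#000010"></a-sky>
</a-scene>
```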

So, here there is this amazing library where you can download models for free, or buy other ones: you pay the license and just start using them. How does this work? Let's see. So, I got this pretty complex gallery. This one is a free one. I'm going to download the object, and, just like an image in HTML, I can simply refer to the object, and everything works. I can test it on the MetaQuest, I can test it on my cell phone, just with that. So, yeah, a lot of ideas.
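
Here is roughly what "refer to the object like an image" looks like in A-Frame, assuming you downloaded a glTF/GLB file: preload it under a-assets and point an entity at it with gltf-model. The file name and transform values are placeholders.

```html
<a-scene>
  <!-- Preload the downloaded model so the scene waits for it, much like preloading an image. -->
  <a-assets>
    <a-asset-item id="gallery" src="models/gallery.glb"></a-asset-item>
  </a-assets>

  <!-- Reference the asset by id, then position and scale it like any other entity. -->
  <a-entity gltf-model="#gallery"
            position="0 0 -5"
            scale="0.5 0.5 0.5"></a-entity>

  <a-sky color="#ECECEC"></a-sky>
</a-scene>
```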

So, when I saw this, I was like, yeah, man, it's back to the Bootstrap times, right? We have the templates, we can start using them. I don't know what is going on behind the scenes; however, if I want to extend it, I can do that as well. What about animations? I think video games are all about animations, okay? So, how do you do it? Same idea. You have the 3D model there, you can download it, and you can upload it to Mixamo, which I think is free, I don't know, they have some paid plans. But, see, the animations are right there. It's going to apply them to your model, you can download it, and just use it in an HTML file. It's crazy how easy this is.
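
For playing back animation clips baked into a glTF (the kind of clips you get from a tool like Mixamo), A-Frame core does not do it by itself; a commonly used option is the community aframe-extras package, which provides an animation-mixer component. The CDN URL, version, file name, and clip selector below are assumptions for illustration, and A-Frame itself is assumed to be loaded first.

```html
<!-- aframe-extras is a community add-on; URL and version here are illustrative. -->
<script src="https://cdn.jsdelivr.net/npm/aframe-extras/dist/aframe-extras.min.js"></script>

<a-scene>
  <a-assets>
    <a-asset-item id="dancer" src="models/character-dancing.glb"></a-asset-item>
  </a-assets>

  <!-- animation-mixer plays the animation clips embedded in the glTF file. -->
  <a-entity gltf-model="#dancer"
            animation-mixer="clip: *; loop: repeat"
            position="0 0 -3"></a-entity>
</a-scene>
```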

Well, and how are we going to use it? A-Frame brings a lot of environment components, so you can use the sky, you can add animation, you can add a capoeira move. One amazing thing about this is that it's like a hello world, right? Every time we start working with some technology, it's like, oh my God, this is working just fine with just a few lines of code, and we abstract all the things, like, oh my God, where is the database? No, we're just doing something for fun, but it could actually end up as something good. Okay, virtual reality is done; what about augmented reality? I want to integrate with my own environment, I want to hold some element and start working. I think the showcase here is Pokemon Go: when you go to a lake, you're going to see different fish there, you're going to see the terrain differently.

But to make this work, modern cell phones usually have a LIDAR scanner, which emits light and measures the time it takes to come back, so the device knows there is an obstacle and you can place your own objects there. Okay, but I was building all these demos with an old iPhone, and I was so happy. I was thinking, this sensor is amazing, look how well it works, and my friend just told me, man, you don't have the sensor, right? This is a microphone. I was like, wow, but how is this working? So, when you're working with A-Frame, they have fallbacks, and you can use computer vision to analyse your environment and figure out where everything is, the obstacles, and much more, so it's pretty helpful. We have a bunch of types of augmented reality. The first one, which I love, is marker-based, so you can have a card like this and put some image on it. Has anyone here played Yu-Gi-Oh! Forbidden Memories? Wow, a lot of people. I was thinking, imagine the monsters coming out of the card, my God, this brings a lot of ideas, and we just have to play with it, right? Well, one of the most commercial ones is location-based, so you can use GPS coordinates to show your friends where you live, or guide people to museums, stores, and much more. But I want to tell you, this doesn't work that well in very tight environments; it works better outdoors.
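
The talk doesn't name a specific library for the marker-based case, but a popular community option that plugs into A-Frame is AR.js. A rough sketch of the "monster on a card" idea with the stock Hiro marker could look like this; the script URLs and versions are illustrative.

```html
<!-- A-Frame plus the AR.js A-Frame build; URLs and versions are illustrative. -->
<script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
<script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>

<!-- arjs turns the camera feed into the scene background and tracks markers. -->
<a-scene embedded arjs="sourceType: webcam;">
  <!-- While the printed Hiro marker is visible, everything inside appears on top of it. -->
  <a-marker preset="hiro">
    <a-box position="0 0.5 0" color="#EF2D5E"
           animation="property: rotation; to: 0 360 0; dur: 4000; loop: true; easing: linear"></a-box>
  </a-marker>

  <a-entity camera></a-entity>
</a-scene>
```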

6. Creating 3D Animations and Apple's Vision OS

Short description:

You can create 3D animations using QR codes. Another interesting feature is the hit test, where you can place objects in the environment and interact with them. Apple's Vision OS with WebXR allows for similar experiences and includes eye tracking.

Okay, and the last one is the most popular one. I would say Coca-Cola and others have been doing a lot of this for Christmas: you have a QR code, and with this QR code you can place some 3D objects there and make them animate. So, let's see. Look how amazing this is. Someone just put an image on it; it doesn't look like a 3D model, right? It looks exactly the same, but you can make it animate. So, I would say for your business card, you could build an experience like this in just a weekend, and you can extend it, create products, or do it just for fun.

One of my favorites is the hit test, where you can place objects in the environment. With this, you can create your Iron Man experiences. Imagine: I'm going to put YouTube here, I'm going to put my work here, VS Code, terminals, and you can see everything around you. You can start walking around it, you can interact with it. So, the only thing needed here is creativity, which is sometimes hard, but when we are playing with something new, it's amazing.

Three days ago, someone had a huge event, and for me, it was interesting. You saw that I've been mentioning the MetaQuest and mobile phones a lot, because, as you saw, to run on the iPhone I had to install a different browser that mocks the behavior so I can work there. However, at WWDC24, Apple just said, well, now you can use WebXR on Vision OS. This is nice, isn't it? It's a new market, so you can start creating projects, and you can make money on this, because there are not a lot of people doing it yet. So, on their OS, you can have a very similar experience. However, Apple has eye tracking, right? So, when you are looking at an object, you can just pinch and move it, and for us, on the web, they're going to emit the same events we've been seeing in the browser. A click, a drag, and it just works fine.
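
Under the hood, the hit test is its own WebXR feature: you request it when starting an AR session, then each frame you ask where a ray from the viewer hits real-world surfaces and anchor your object at that pose. A stripped-down sketch of the raw API, with all rendering setup omitted:

```html
<script>
  async function startArWithHitTest() {
    // Request an AR session with the hit-test feature enabled.
    const session = await navigator.xr.requestSession('immersive-ar', {
      requiredFeatures: ['hit-test'],
    });

    const localSpace = await session.requestReferenceSpace('local');
    const viewerSpace = await session.requestReferenceSpace('viewer');

    // Continuously cast a ray from the viewer into the real environment.
    const hitTestSource = await session.requestHitTestSource({ space: viewerSpace });

    session.requestAnimationFrame(function onXRFrame(time, frame) {
      const hits = frame.getHitTestResults(hitTestSource);
      if (hits.length > 0) {
        const pose = hits[0].getPose(localSpace);
        // pose.transform.position is a point on a detected surface (floor, table, wall):
        // this is where you would place your 3D object before drawing the frame.
        console.log('Surface at', pose.transform.position);
      }
      session.requestAnimationFrame(onXRFrame);
    });

    // A real app also calls session.updateRenderState({ baseLayer: ... }) and draws with WebGL.
  }
</script>
```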

7. Exploring Web Development Possibilities

Short description:

Web development offers amazing possibilities for experimentation and project creation. Get ready to integrate your own knowledge and start creating unique experiences. Here's the link for all the experiments and slides. Let's capture our video selfie together!

Wow, that's a lot of information, right? As I told you, I gathered all the information for you to follow along later, to experiment with it, and create projects.

Now, I want to ask a question. Who is going to experiment with it? Oh, very nice, very nice. So, web development is amazing. We have web APIs, we have streams, web workers, we have 3D, WebGL, WebGPU, my god, we have everything we need to just make it work. And now you can integrate your own knowledge to start creating those experiences.

All right, so, as promised, here's the link for you. I'm going to share everything, all these experiments, all these slides as well, so just take a moment to take a picture right there. And as a tradition, let's make our video selfie, right? What do you think? Woohoo! Okay, I'm going to count to three, and just raise your hands, okay? So, one, two, three, oh! Thank you so much, JS Nation!

8. Exploring Creative Possibilities with WebXR

Short description:

Node.js is green, just like my favorite car. Let's explore creative approaches for building a WebXR experience where users can try on different watches using TensorFlow.js. Imagine interactive 3D objects and even horror movie-inspired experiences. The possibilities are endless!

First one for you to try and answer: when can I come and have a drive in the Mercedes? This is a pretty good joke, because I bought a car in my favorite color. It's green, it has followed me my whole life, and it turned out Node.js is green as well. It's like that neon Node.js green. Pretty bright, yeah, it's lime green. Yeah, lime, yeah. So, everyone was like, oh my god, it's a Node Mobile. I was like, yeah! It really is. Excellent. And again, just to show people are already thinking about what they can create, hold on, I'm going to change this. What would be the best approach for AR if I wanted to create a WebXR experience where users can try on different watches? So, people are already trying to be creative. There are a bunch of things that you can do. You can use a QR code, or you can do something that I was trying out at home. We have TensorFlow.js, right? I think Jason is going to talk about something related to this later, too. But we can understand the whole environment, detect images, and place the 3D objects right there. So, with your watches, I don't think we're going to have a hologram just by looking at it, but you can use your phone to try it out and show something like this, right? There's so much stuff coming in here. Ooh, nice. It brings out a lot of ideas. One cool thing, you know, I forget the name, it's a horror movie where the girl comes out of the screen and jumps at the face of whoever is watching. This would be nice. Imagine giving the cell phone to my mom: mom, point it at the TV, and then something jumps at her face. Like, wow, this would be nice. That's a way to give her an early... That's a funeral coming up after that. Wow. I'm definitely going to try that. My poor mom is definitely going to get that.
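
For the watch try-on idea, one way to do what is described here (detect the hand with TensorFlow.js, then place a 3D model on it) is the hand-pose-detection model. This is only a rough sketch: the script URLs, the trigger, and the mapping from 2D keypoints into your A-Frame/Three.js scene are all assumptions, not something shown in the talk.

```html
<!-- TensorFlow.js and the hand-pose-detection model; URLs are illustrative. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/hand-pose-detection"></script>

<video id="cam" autoplay playsinline></video>
<script>
  async function trackWrist() {
    const video = document.getElementById('cam');
    video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
    await video.play();

    const detector = await handPoseDetection.createDetector(
      handPoseDetection.SupportedModels.MediaPipeHands,
      { runtime: 'tfjs' }
    );

    async function loop() {
      const hands = await detector.estimateHands(video);
      if (hands.length > 0) {
        // Keypoints are in video pixel coordinates; 'wrist' is where a watch would sit.
        const wrist = hands[0].keypoints.find((k) => k.name === 'wrist');
        if (wrist) {
          // Project (wrist.x, wrist.y) into your 3D overlay and position the watch model there.
          console.log('Wrist at', wrist.x, wrist.y);
        }
      }
      requestAnimationFrame(loop);
    }
    loop();
  }
  trackWrist();
</script>
```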

9. WebXR Development and Resource Availability

Short description:

WebXR is actively being developed, with Apple now joining the game. The community is growing, and there are numerous resources available. The platform is still experimental but already quite stable, and it offers a wide range of possibilities. The complexity of the experience determines storage needs and client traffic. A-Frame loads and renders the scene on demand, making it a responsible choice.

Yeah. A question on WebXR, and I'm assuming with the Apple announcement it is, but is WebXR being actively developed further? Yeah. I was at Google I/O, so I met some guys who were working on the team there, and they were like, we are not talking a lot about this, but the spec is going very well. And now with Apple joining the game, I think everything's going to be way easier for us, because with the spec work, the eye tracking, they said they're going to have something for the phones, too. So probably now things are going to move faster. But honestly, I'm just curious. I've seen the community, and it's pretty huge. If you go through the resources I left there, there are a lot of links to things people are doing, so you can use joysticks, you can use your eyes, you can use sensors, cameras, and a lot more. It's pretty stable already, I would say. And there are so many different things you can try out. It's experimental, but it's pretty stable. Un-experimental, that's the whole point. Yeah, exactly.

Do you need a big database to store all these different models and animations? How much do you need, and how big is the traffic for these clients? The first thing I thought was, well, I got that gallery there, which was a 3D model, and it was huge, 200 megabytes. But I was thinking, well, then I should create everything in 3D editors, like Blender or others. What I realized is that each individual piece of your environment, like the character here, or the table, each one of them is going to be a separate file, so you render the scene on demand. So depending on the platform, they can be really heavy. Like, if you're going to play on a PS4, you're going to see the loading. So depending on how complex your experience is, this is going to be very heavy.

So it depends on the hardware you're actually running? It depends on how complex the application is, because of the elements, right? We are rendering 60 frames per second there. And is that like lazy loading on the web, where only what's in view is being loaded? And it's all built in, in A-Frame. Ah, excellent. So A-Frame is the way to do that responsibly. It's amazing. Amazing.
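
In the spirit of what is described here, one way to keep the initial payload small is to split the environment into separate glTF files and only attach a heavy model when it's actually needed; gltf-model can be set at runtime. The file names and the trigger below are illustrative assumptions.

```html
<a-scene>
  <!-- Only the lightweight base environment is declared up front. -->
  <a-entity id="room" gltf-model="url(models/room-base.glb)"></a-entity>
  <a-entity id="heavy-prop"></a-entity>
</a-scene>

<script>
  // Load the heavy asset on demand, e.g. when the user gets close or clicks something.
  function loadHeavyProp() {
    const prop = document.getElementById('heavy-prop');
    prop.setAttribute('gltf-model', 'url(models/detailed-statue.glb)');
    prop.setAttribute('position', '0 0 -4');
  }

  // Illustrative trigger: load once the scene has finished initializing.
  document.querySelector('a-scene').addEventListener('loaded', loadHeavyProp);
</script>
```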

10. AR Helmets and PDF Viewer Enhancements

Short description:

AR helmets are currently expensive and impractical for outdoor use. Speculating on making AR more approachable, the focus should be on the API rather than the devices. More affordable and powerful glasses with eye-tracking capabilities are desired. Improving PDF viewers in XR involves rendering 3D elements and creating interactive experiences.

All right. AR now requires a huge helmet. It costs a fortune. No one would use them outside. I don't know, did you see that video? I think people would use them outside. You just look ridiculous outside. How do we overcome that issue? Would Google Glass or something like that come back? This is just speculating, but what would you like to see happen to make it more approachable?

Honestly, for me, I would think more about the layer behind it. I don't care much about the devices, but about the API that is in the middle. So if Google Glasses come back, I would love to see how they're going to work. But for now, the only thing I'm focusing on is having a single codebase that runs everywhere. So, yeah, but I would like to see more glasses, right? More options. More powerful, more options. Like with a camera inside for eye tracking and everything, but not too expensive. Stop you punching things, like your wife, when you're playing games. I think the problem now is that there are some glasses, pretty small ones, for screens and multiple monitors, but nothing like the Apple Vision in just a pair of glasses. I don't know how they're going to make it, because there's a lot of processing power involved. Where are... I've deleted one that I wanted to ask there.

How should we make PDF viewers more attractive in XR? PDF? PDFs. He's like, let's make PDFs cool. This is something nice, but I wouldn't say it's a PDF anymore. Probably a readme file or a website that is going to be rendered there. Because, imagine, 3D things popping up. It might be like a pop-up book or something, I don't know. Imagine, it's like the Easter egg in Microsoft Word: when you are clicking, you see Clippy just moving. Yeah, Mr. Clippy, yeah. There's a library, Mr.

11. Clippy JS, AI Assistants, and A-Frame vs Three.js

Short description:

Clippy JS can be used in websites with the Konami code. Ideas include interactive card games with AI assistants and the potential risks of playing with human-like avatars. A-Frame is an abstraction layer that uses Three.js for rendering objects, providing flexibility in choosing between the two.

Clippy JS. So you can use the Konami code? You can actually put him into your websites now. Go and just install it from npm. It's Clippy JS. And then you can have a terrible little assistant; it'll be great with all these new AI tools to have Mr. Clippy on steroids for our documents. Bring him back.

Some of the ideas I had was, imagine I'm playing cards, like playing Uno. And I have a ghost or someone talking to me, which could be an assistant like ChatGPT, Google or whatever. But they are playing cards, like moving cards. It's a game that's going on there. But I feel like it's a real person. So, how dangerous do you think this is? Playing with someone that looks like a human.

Yeah, it'll be fun. We're getting there. Sure. With the new Vision, do we have avatars that look like people? We could have you do a talk next year and you won't even be here. Exactly. It'll just be recorded from your last talks. Exactly. Or your twin, right? Yeah, exactly. Your evil twin. A-Frame is limited in functionality compared to Three.js. Could you explain where it is easier to use A-Frame and where Three.js, or even native WebGPU or WebGL?

Nice. So, basically, A-Frame is the abstraction layer, right? It's using WebXR, and they have a lot of fallbacks if your computer is too weak or your device has limited resources. But behind this, they use Three.js to render all the objects. So, if you're like, oh, I didn't like A-Frame much, but I need this, this, and that, well, you can go from there and use Three.js directly. It's like using TypeScript.
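
To make that last point concrete: every A-Frame entity exposes its underlying Three.js object, so you can register a small component and manipulate it directly when the declarative layer isn't enough. A minimal sketch, where the component name and spin speed are made up; it assumes A-Frame is already loaded and the script runs before the scene is declared.

```html
<script>
  // A custom A-Frame component that drops down to the Three.js object each frame.
  AFRAME.registerComponent('spin', {
    schema: { speed: { type: 'number', default: 0.001 } },
    tick: function (time, deltaMs) {
      // this.el.object3D is the entity's THREE.Object3D; any Three.js API works on it.
      this.el.object3D.rotation.y += this.data.speed * deltaMs;
    },
  });
</script>

<a-scene>
  <a-box spin="speed: 0.002" position="0 1.5 -3" color="#4CC3D9"></a-box>
</a-scene>
```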

12. JavaScript Abstraction and Feedback

Short description:

If you don't like the types, you can use JavaScript directly. The talk received positive feedback and excitement from the audience, with kind messages and compliments. Erick will be available for Q&A outside and online via Discord.

If you don't like the types, you can use JavaScript directly. It's just an abstraction layer, but it's powerful. Very powerful.

Yeah. I'm really excited. I nearly forgot to come up and ask questions because I was so into this talk. I was quickly trying to find a link to this slide on my site. Amazing.

So, well done. And the sentiment was shared because we even have a lot of kind messages coming in saying thank you for such a lively talk. There's a lot of comments. They're meant to be for questions, not compliments. I'm going to find them outside. And there's some Ronaldo questions. Amazing. Pro Brazil stuff. There's a lot of big fans here as well.

That's all the time we have for questions in here. But if anyone wants to find Erick afterwards, he's going to be hanging out at the Q&A booth outside. Or if you're watching online, join the Discord and ask some questions there. Again, one more time, a round of applause for Erick. Thank you. Thank you. Thank you.
