Building the Interconnected, Traversable Metaverse

The video discusses building an interconnected metaverse using web technologies. It highlights the use of procedural generation and AI systems to create immersive, livable worlds. The project aims to make the metaverse accessible on various devices by leveraging the universality of web technologies like JavaScript and APIs. The speaker introduces Exokit, a browser designed to explore web capabilities for virtual worlds, and Totem, a system for integrating 3D assets via URLs. The video also covers the integration of NFTs into the metaverse, allowing users to load and interact with their NFT assets seamlessly. The project emphasizes the importance of interoperability, enabling users to carry avatars, assets, and identities across different metaverses, and aims to create a positive perception of the metaverse by showcasing its potential benefits.

From Author:

Largely based on Free Association in the Metaverse (Avaer @Exokit, M3), Avaer will demo some of the ways that open standards enable open and free traversal of users and assets throughout the interconnected metaverse.

This workshop was presented at JS GameDev Summit 2022; check out the latest edition of this JavaScript conference.

FAQ

Why build these virtual worlds on web technologies?

The project leverages the universality of web technologies, ensuring that the virtual worlds it creates are accessible on any device with a web browser, promoting inclusivity and wide accessibility.

What is the main goal of the project?

The main goal is to demonstrate what's possible for open virtual worlds on the web, showcasing the integration and functionality achievable using browser-based technologies.

How does the project respond to skepticism about the Metaverse?

The project aims to address skepticism and controversy around the Metaverse by demonstrating the positive potential and benefits of open virtual worlds, trying to convince users that the Metaverse can be good.

What technologies does the project use?

The project utilizes browser-based technologies, open source tooling, JavaScript for asset loading, and various APIs for enhanced functionality in creating virtual worlds and experiences.

What is Exokit?

Exokit is a browser written by the speaker, designed to explore the capabilities of web technologies in running advanced applications like virtual worlds, directly influencing the development of this Metaverse project.

Can assets from different sources be integrated into the virtual world?

Yes, assets from different sources, including IPFS, GitHub, and even NFTs, can be integrated into the virtual world using standard web browser APIs and JavaScript, showcasing the flexibility and openness of the platform.

What is Totem?

Totem is a system developed within the project that uses templating and intelligent code generation to facilitate the loading of 3D assets from URLs, allowing easy asset integration and management in the virtual environment.

What does future development look like?

Future development aims to include more advanced features like AI integration, multiplayer support, and continued improvement of asset interoperability, striving to create an immersive and dynamic virtual world experience.

Avaer Kazmer
103 min
12 Apr, 2022


Video Transcription

1. Introduction to the Metaverse

Short description:

Let's take a look! The main goal is to open your eyes to what's possible for open virtual worlds on the web. Another goal is to convince you that the Metaverse can be good. My personal goal is to learn something today. I've been making Metaverse-adjacent technologies for eight years, constantly surprised by what's possible on the web. I wrote a browser called Exokit. We have the ingredients to import the metaverse using a web browser API: all you need is a translation engine and JavaScript code to load assets. We're making the world's first distributed video game using Totem. Everything in the game is an asset with a URL. In the future, we'll provide services like those of VRChat, with AR and VR support and integration with KaleidoKit.

Let's take a look! Yeah, so another title for this room might have been Import: the Metaverse. I have a few goals that I wanted to hopefully accomplish here. The main one is just to open your eyes, maybe, to what's possible for open virtual worlds on the web. A challenge goal is to kind of convince you that the Metaverse can be good; I know there's a lot of skepticism and controversy about that these days. And my personal goal is, I don't know, I hope I learn something today.

I wrote some script notes that I kind of wanted to cover, but I hope that anybody who has any questions about the crazy things that I'm going to be talking about, just kind of chime in. I'd love to kind of deep dive into whatever parts of this people find most interesting.

So, some background on me: I've been making basically Metaverse-adjacent technologies for about eight years now. And all of that has been on the web. I basically started on the web and I never left. And I've been constantly surprised by what I was able to do using basically what's available in the browser, the open source tooling that's already out there, just to make experiences, worlds, and even my own browsers.

I wrote a browser called Exokit. I was really surprised that all of this is possible on an open technology like the web, one that basically anybody can host and anybody can experience on any device. And it seemed like I was the only person doing that, every single time I would show off one of the cool new hacks that I made, whether that was the browser running on Magic Leap, or some immersive N64 emulators that I wrote, which basically let you play Legend of Zelda in your HTC Vive, in the browser no less, using WebVR, as it was called at the time.

So I basically started hanging around all the people who thought, like, wow, this is really cool, I didn't know this was possible. And yeah, we started building up this technology toolkit of all these different things that we've been releasing, basically open source from day one, and working together on. And eventually what it started looking like was that we basically had the ingredients to quote-unquote import the metaverse, meaning you can use a standard web browser API to load in assets from all over the web, from all sorts of different places, including for example IPFS, GitHub, basically any website hosting, or even things like NFTs.

And it turns out that this actually isn't that hard to do. All you really have to do is add a translation engine that can take a URL and then write some JavaScript code that will be able to load that asset. So what you're looking at here, basically everything here, is just an import statement in JavaScript. So this is actually a GLB file. This one, I'm not sure if it's a GLB file or if it's procedurally generated. This is a JSON file that's actually describing that pistol and referencing all its assets, and the character itself is a VRM file that we just imported into the world. And the way that this works is we actually just have a system on the back end that takes those URLs and basically uses templating and some intelligent code generation to give you a loader for the front end, which will give you the 3D version of that asset. And that gets hooked in through some APIs to the rest of a quote-unquote game engine, which provides the services that you'd need for something like this: you can walk around, you can interact with the environment, you can have things follow you, and you can have an inventory that you can kind of equip on yourself.
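To make the idea concrete, here is a minimal sketch of what such a URL-to-loader translation step might look like. The extension mapping, function names, and generated module source are all hypothetical illustrations, not Totem's actual implementation:

```javascript
// Hypothetical sketch of a URL -> loader "translation engine".
// The mapping and generated code are illustrative, not Totem's real output.

// Decide what kind of loader an asset URL needs, based on its extension.
function resolveLoader(url) {
  const ext = new URL(url, "https://example.com/").pathname
    .split(".").pop().toLowerCase();
  switch (ext) {
    case "glb":
    case "gltf": return "gltf";     // 3D model (e.g. a three.js GLTFLoader)
    case "vrm":  return "vrm";      // avatar
    case "json": return "manifest"; // JSON describing an asset bundle
    default:     return "unknown";
  }
}

// Generate a tiny ES module that, when imported, loads the asset.
// A real system would template in engine hooks (physics, inventory, etc.).
function generateLoaderModule(url) {
  const kind = resolveLoader(url);
  return [
    `// auto-generated loader for ${url}`,
    `export const kind = ${JSON.stringify(kind)};`,
    `export async function load(engine) {`,
    `  return engine.loaders[kind].loadAsync(${JSON.stringify(url)});`,
    `}`,
  ].join("\n");
}
```

A back end built this way can serve the generated module for any asset URL, so the front end only ever writes `import` statements.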

So we basically started plugging all of this stuff together and started realizing that, hey, we're actually making possibly the world's first distributed video game, where all of these assets are just kind of plugged together using a client that we're all agreeing on using. That technology that we're using to make all these translations happen and plug into the game engine, we ended up calling Totem, because we're basically trying to develop a culture of creating art from the raw ingredients of the web. We're sticking things together. Things are being imported. They're interacting with each other. We're basically stacking art together; we're creating these virtual totems, in a sense. Everything in the game is an asset with a URL, and that can even be a local file if you're hosting locally, which I am right now. It could be something on a GitHub repo, IPFS, or, like I mentioned, even NFTs, if you just want to point it at an Ethereum address. Yeah. And in the future, there's a lot of really crazy things that we'll be able to do once we get this baseline game working. One of those things is just providing simple services to recreate the software that a lot of people love, like VRChat, where you basically have your avatar, which can animate and has facial expressions. You have mirrors, and we'll soon have AR as well as VR support, as well as integration with KaleidoKit.

2. Building Worlds and Creating Spatial Relationships

Short description:

This allows us to build worlds and traverse across them, creating spatial relationships. Scenes can be contained within each other, providing level of detail without background noise. The street is a directional representation of virtual worlds, with different content sections. Metadata defines game elements like quests. Our goal is to build the first distributed video game, using open-source technology. We use three.js, CryEngine, and PhysX for rendering. React is used for the UI. We support drag and drop and smooth animations. Our toolkit aims to make these elements work together seamlessly.

So that's basically a way where you can use your webcam to v-tube your avatar here on the side. And another thing this allows is for us to build worlds that we can traverse across and share, in the sense that I have some way to identify where I am in this quote-unquote metaverse, and I can go over to your place, and we can have a spatial relationship between those two places.

So this is still actually kind of buggy, but I'm going to make it a little bit more manageable. Because we're loading everything through these structured import statements, it allows us to basically have one scene be contained within another scene, and for scenes to be a first-class concept that can be, for example, captured, so you can draw previews of a scene and basically get level of detail for a distant asset without having to deal with any background noise from the scene.

So there you have it: a preview that's been captured for a scene, which basically gives you level of detail for a distant asset without actually having to go there. So right now, what's going to happen is... this is a place that we call the street. It's basically supposed to be a directional representation of virtual worlds, and each section of the street has different content. As I approach, what this should actually do is sideload that scene, take a 3D virtual screenshot of it, and then basically give me a low-res version that's LOD'd before I even go there. So yes, I actually have to turn off the fog there. But what that actually did is: all of that is JavaScript code that ran on the side and basically gave you a little level of detail. If you turned off the fog, you'd even see that it's fully textured.
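The selection logic behind that behavior can be sketched as a simple distance check per street section: far sections show a sideloaded, cached preview, and nearby ones load the full scene. The names, data shape, and threshold below are illustrative assumptions, not the project's actual code:

```javascript
// Sketch of distance-based scene LOD selection for the "street".
// Names, data shape, and threshold are hypothetical.
const FULL_LOAD_DISTANCE = 50; // meters; tune per world

function selectSceneRepresentation(playerPos, scene) {
  const dx = scene.position.x - playerPos.x;
  const dz = scene.position.z - playerPos.z;
  const distance = Math.hypot(dx, dz);
  if (distance <= FULL_LOAD_DISTANCE) {
    // Close enough: import and run the real scene.
    return { mode: "full", url: scene.url };
  }
  // Far away: reuse the captured preview (screenshot or baked mesh),
  // falling back to the scene URL if no preview has been sideloaded yet.
  return { mode: "preview", url: scene.previewUrl ?? scene.url };
}
```

The marching-cubes improvement mentioned below would simply swap the preview payload from a screenshot to a textured low-poly mesh; the selection logic stays the same.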

And what we're also going to do to improve that is use marching cubes to give basically a full 3D mesh, with texture, of the virtual world that will be there when you go there. And of course, you can actually just go there, and now you're in one of these virtual worlds. Another thing that this allows us to do is use metadata to define the kinds of things that you'd want in a video game, like, for example, quests. There's actually a quest in this world. It says, destroy all Reavers in the area, and there's a reward of some virtual asset. So if I can find it... I'm not sure where it is. This might actually be broken, but we also have a pathfinding system to kind of geolocate you in the virtual world so you can locate your objective. And then if there are some Reavers, you can go and slay them and get your virtual silk, whatever that means.
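Quest metadata of this kind can be a plain object attached to a scene, with a small bit of engine logic to track progress. The field names and helper below are hypothetical illustrations, not the project's actual schema:

```javascript
// Illustrative quest metadata of the kind described in the talk.
// Field names are an assumption, not the project's real schema.
const quest = {
  name: "Reaver Hunt",
  description: "Destroy all Reavers in the area",
  targets: { reaver: 3 },                       // kills remaining per enemy type
  reward: { item: "virtual-silk", amount: 1 },  // some virtual asset ID
};

// Record a kill; return the reward when the quest completes, else null.
function recordKill(q, enemyType) {
  if (q.targets[enemyType] > 0) q.targets[enemyType] -= 1;
  const done = Object.values(q.targets).every((n) => n === 0);
  return done ? q.reward : null;
}
```

Because the quest is just metadata on an asset with a URL, any world author can ship quests the same way they ship models.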

All of this is actually open source. And our goal is to essentially build the first distributed kind of video game. Anybody can fork this stuff. And one of my priorities right now is actually to plug in all the different virtual communities that have already started to build on top of this.

Technology-wise, in terms of the render engine, a lot of the stuff is both stuff that I built and a lot of it is just plain open source code. For example, everything is three.js. There's a scene graph. There's really nothing crazy about what we're doing, other than we do use Unreal Engine's Bloom shaders and SSAO... sorry, not Unreal, this one's CryEngine. But that's what gives worlds, if you define the render settings properly, this really nice misty effect. We use PhysX, which is basically the same PhysX engine that everybody's using, including Unity; it's compiled to WebAssembly and we're loading it that way. For the UI, we're just using React, although there's a lot of WebGL, for example, to render these kinds of panels and these different shaders. It's all actually just React code under the hood.

The avatar IK stuff is actually an open source Unity project that we ported over to JavaScript. That's how you get all the different smooth animations and, basically, the ability to animate any character regardless of what asset it is. In fact, let me show you some re-targeting of animations. We're going to change characters. This is using the same exact animations, but the character still works just the same.

We also support drag and drop in this game. If you have your own virtual assets, you can drag them in. That actually gets uploaded to a server and then downloaded back again. If you want to drop them into the world, you can. And if it's an avatar, you can wear it. Essentially, we're trying to build the world's best toolkit that we know of to make these kinds of things work together in the context of a video game.
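The drag-and-drop flow described above can be sketched as: classify the dropped file by extension to decide whether it is wearable (an avatar) or a world object, then upload it and load it back via its URL. The categories and function names are assumptions for illustration, not the project's actual API:

```javascript
// Sketch of the described drag-and-drop flow. Categories are hypothetical.
function classifyDroppedFile(filename) {
  const ext = filename.split(".").pop().toLowerCase();
  if (ext === "vrm") return "avatar";                         // can be worn
  if (["glb", "gltf", "json"].includes(ext)) return "object"; // goes into the world
  return "unsupported";
}

// Browser wiring, commented out so the sketch stays runnable headless:
// window.addEventListener("dragover", (e) => e.preventDefault());
// window.addEventListener("drop", (e) => {
//   e.preventDefault();
//   for (const file of e.dataTransfer.files) {
//     const kind = classifyDroppedFile(file.name);
//     // upload to the server, then import the returned URL as usual...
//   }
// });
```

The round trip through the server is what turns a local file into an asset with a URL, so dropped content behaves exactly like every other import in the game.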

