Or I would say also mixed reality, really XR in general, XR for the metaverse. So I do hope we are at the beginning of a new cycle, but it's already being deployed in production, to be honest, for AR at least. I assume a lot of you are going to be developing some of those VR and AR experiences, so good luck to you if you do.
So I've got a question from Melody Connor, who seems to be an iOS developer, who is asking: when do you think WebXR will make it to Safari on iOS, anytime soon? So it's maybe more a question for Apple, and I would love Apple to embrace WebXR. There have been some rumors. If you follow some people who are influential in the web space or PWA, some have noticed signs that WebXR was maybe about to be implemented by Apple. I don't have any official information about that. We also have rumors about Apple maybe going to ship, you know, glasses. I would love Apple to embrace this technology, because it would mean for sure that we would be able to have a single code base targeting all devices, which is also the promise of web technologies. Before that, you can already experiment a little bit on the iPhone using more or less web technology. There is a web component that can translate your model to Apple's USDZ format and enable these kinds of AR experiences. I've been doing experiments myself, but it's not using WebXR; today you are forced to go through a native implementation. And a way to influence that: the more people use WebXR, the more pressure on Apple to consider a WebXR implementation, for sure, because they don't want to miss those great experiences on their own devices. So this is a way for you also to influence that. So I guess we need to get Apple on the next one and we can ask them that question.
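The web component mentioned here sounds like Google's `<model-viewer>` element, which can hand a USDZ file to Apple's AR Quick Look on iOS while serving glTF everywhere else. A minimal sketch, assuming hypothetical `scene.glb` and `scene.usdz` files hosted next to the page (the CDN URL and version are illustrative, check the current release):

```html
<!-- Load the <model-viewer> web component (a Google project, not an Apple API). -->
<script type="module"
        src="https://ajax.googleapis.com/ajax/libs/model-viewer/3.5.0/model-viewer.min.js"></script>

<!-- src is the glTF/GLB model rendered with WebGL in most browsers;
     ios-src points AR Quick Look at a USDZ version of the same model on iOS. -->
<model-viewer src="scene.glb"
              ios-src="scene.usdz"
              ar
              camera-controls
              alt="A 3D model viewable in AR">
</model-viewer>
```

This is exactly the kind of workaround described above: AR on iOS without WebXR, by handing the model over to the native Quick Look viewer.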
So we've got another WebXR question from KB, who's asking: how does WebXR compare with its native or desktop-based counterparts in terms of performance? We always get this question, to be honest, even outside of WebXR, with WebGL and now WebGPU. Of course, the performance of what you can do in a browser today is less impressive than what you can do with a native stack like DirectX, OpenGL, or Metal on iOS. But that's why you need to take that into account when you design your experience. You won't be able to build AAA games in the browser, not yet, even though we now have at least the equivalent of Xbox 360-level performance in the browser, and even more than that, which means it already enables some interesting scenarios. The main limitation we have is not the GPU, to be honest, but the CPU side, because JavaScript is still single-threaded most of the time, so it's going to limit a lot of what you can do regarding physics, for instance, because that's all on the CPU side. So compared to native, for sure we have less powerful features. But that's why I think by using simple models for AR, or even VR, like a lot of people doing VR with low-poly models, you can already do a lot of stuff, and compared to native you can target many more devices. Awesome, thanks for that.
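To make the CPU-side point concrete, here is a toy illustration (not Babylon.js code, just plain JavaScript): a naive per-frame physics integration step. Everything in it runs on the single main JavaScript thread, competing with rendering and input handling, which is why heavy physics bounds browser experiences long before the GPU does.

```javascript
// Naive semi-implicit Euler step for a set of falling bodies.
// In a browser, a loop like this shares one thread with rendering,
// so its cost scales directly against the frame budget (~16 ms at 60 fps).
function stepPhysics(bodies, dt) {
  const g = -9.81; // gravity, m/s^2
  return bodies.map(({ y, vy }) => {
    const newVy = vy + g * dt;      // update velocity first
    return { vy: newVy, y: y + newVy * dt }; // then position
  });
}

// One 16 ms frame for a single body starting at rest, 2 m above the ground.
const next = stepPhysics([{ y: 2, vy: 0 }], 0.016);
console.log(next[0]); // slightly below 2 m, now moving downward
```

With thousands of bodies, `stepPhysics` alone can eat the frame budget; that is the kind of workload native engines spread across threads while browser code mostly cannot.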
So another one on Babylon.js and the WebXR toolkit, and I know Babylon.js just got a major update only last week, so it might have been included: can the toolkit detect vertical surfaces? Vertical surfaces, so this is a good question. I would need to check with my team, because I'm not sure. I know that we have access to spatial mapping to detect flat surfaces like horizontal ones; I would need to check how to do vertical ones. What we shipped with the toolkit was a set of UI building blocks coming from the HoloLens, because we've been working closely with the people building the HoloLens to get inspired by their UX paradigms and bring them into Babylon. But I would need to check the vertical one, so feel free to ask the question on our forums, on the Babylon.js forum. You can find it via Babylonjs.com.
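For readers chasing this down: Babylon.js exposes WebXR plane detection as an optional feature, and the underlying WebXR plane object carries an orientation field, which is how vertical surfaces would be distinguished. A hedged, browser-only sketch (API names recalled from the Babylon.js WebXR documentation, not verified against the latest release, and it requires an AR device and browser that support plane detection):

```javascript
// Sketch: assumes Babylon.js is loaded and `scene` is an existing BABYLON.Scene.
const xr = await scene.createDefaultXRExperienceAsync({
  uiOptions: { sessionMode: "immersive-ar" },
});

// Enable plane detection if the device/browser supports it.
const planeDetector = xr.baseExperience.featuresManager.enableFeature(
  BABYLON.WebXRFeatureName.PLANE_DETECTION,
  "latest"
);

// Each detected plane wraps a native XRPlane; per the WebXR Plane Detection
// module, its orientation is either "horizontal" or "vertical".
planeDetector.onPlaneAddedObservable.add((plane) => {
  if (plane.xrPlane.orientation === "vertical") {
    console.log("Detected a vertical surface, e.g. a wall");
  }
});
```

So the capability lives at the WebXR feature level rather than in the toolkit's UI building blocks; the Babylon.js forum is indeed the right place to confirm current support.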