The WebXR Device API

The WebXR Device API provides access to input (pose information from headsets and controllers) and output (hardware display) capabilities commonly associated with Virtual Reality (VR) and Augmented Reality (AR) devices. It allows you to develop and host VR and AR experiences on the web.

You can read more about the goals of this standardisation effort in the WebXR Explainer.

What does this mean...

For phones:

Enable VR by providing pose information and rendering the WebGL scene side by side, so the phone can be placed in a headset like Google Cardboard.

Enable AR by using the platform's AR capabilities, such as ARCore, to render the WebGL scene onto the user's environment like a magic window.

For Desktops:

Desktop computers can make use of tethered VR hardware, such as the Oculus Rift or HTC Vive, to display the VR scene.

For standalone AR Headsets:

Enable AR by using the platform's AR capabilities to render the WebGL scene immersively onto the user's environment.

For standalone VR Headsets:

Enable VR by rendering the scene using the platform's VR capabilities.

Try out some Demos

Demos of VR and AR with the WebXR Device API, embedded with iframes. Note that embedding WebXR content in an iframe requires the allow="xr-spatial-tracking" attribute.
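A minimal embed looks like this (the src URL is a placeholder, not a real demo address):

```html
<!-- Without allow="xr-spatial-tracking", WebXR inside the iframe is blocked -->
<iframe src="https://example.com/webxr-demo/"
        allow="xr-spatial-tracking"></iframe>
```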

These are samples from the WebXR Samples by the Immersive Web Working Group. They use a very minimal library to show how one can make use of the WebXR Device API directly.

This sample demonstrates use of an 'inline' XRSession to present content on the page prior to entering XR presentation with an immersive session.
This sample demonstrates basic tracking and rendering of XRInputSources. It does not respond to button presses or other controller interactions.
This sample demonstrates teleporting the viewer by updating the XRSession reference space.
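The technique the teleportation sample describes — moving the viewer by updating the reference space — can be sketched with plain WebXR Device API calls. getOffsetReferenceSpace and XRRigidTransform are standard API; the helper name teleport, and passing the transform constructor as a parameter, are our own choices to keep the sketch self-contained:

```javascript
// Teleport the viewer by replacing the reference space with an offset
// version of itself. In a browser, XRRigidTransformCtor is the global
// XRRigidTransform and refSpace comes from session.requestReferenceSpace().
function teleport(refSpace, XRRigidTransformCtor, dx, dy, dz) {
  // The offset is the inverse of the desired motion: shifting the space
  // by (-dx, -dy, -dz) moves the viewer by (dx, dy, dz).
  const offset = new XRRigidTransformCtor({ x: -dx, y: -dy, z: -dz, w: 1 });
  return refSpace.getOffsetReferenceSpace(offset);
}

// In a browser, after `refSpace = await session.requestReferenceSpace('local-floor')`:
//   refSpace = teleport(refSpace, XRRigidTransform, 0, 0, -2); // 2 m forward
```

Note that getOffsetReferenceSpace returns a new reference space rather than mutating the old one, so the result must be used for subsequent getViewerPose calls.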

Benefits of doing XR on the Web

WebXR in the Real World

Spatial Fusion

Meta, Phoria, Lusion - AR

Learn how to use a civilian-grade hand mounted stellarator in this polished AR demo.

Vanveer Original Custom

Vanveer

A great example of WebXR using Model Viewer.

Above Par-adowski WebXR Mini-Golf

Paradowski Creative

A WebXR mini-golf game made for virtual reality. This game began as a 40-hour prototype made for Paradowski Creative's putt putt tourney, but quickly took on a life of its own. Our aim is native-quality VR gameplay & design on the 3D open web.

Project Flowerbed

Meta

An immersive VR garden-building experience developed by Meta using WebXR. The goal of this project is to demonstrate best practices for developers building high-quality WebXR experiences.

XR Dinosaurs

Brandon Jones

Welcome to the web's virtual Dinosaur Park!

We've used the magic of your browser to bring back a friendly pack of prehistoric pals.

Our dinosaurs can be viewed with a variety of Virtual Reality headsets, Augmented Reality headsets and phones, or directly in your browser.

Hello WebXR

Mozilla Mixed Reality

The demo is designed as a playground where you can try different experiences and interactions in VR, and introduce newcomers to the VR world and its special language in a smooth, easy and nice way.

Castle Builder

Needle Tools

Build your own castle! Drag 3D models from the various palettes onto the stage, and create your very own world.
Works on Desktop, Mobile, VR and AR, all right in your browser. Interactions are currently optimized for VR.

Invite your friends! Click Create Room to be put into a live, multi-user space.

Dead Secret Circle Web

Wonderland Engine

Wonderland has achieved a significant breakthrough in WebXR technology by porting the native mystery horror VR game 'Dead Secret Circle' by Robot Invader to the web.
The project demonstrates that WebXR can match native performance and quality on VR headsets.

The game is also playable on PC and mobile devices, showing that WebXR content is cross-platform, extending even to traditional, non-XR devices.

Getting started building a WebXR Website

These are brief guides to building a site which uses AR and VR.

The WebXR Device API relies on graphics APIs such as WebGL and WebGL2 to work. The following libraries and frameworks come with WebXR support built in.
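Whichever library you choose, the underlying availability check is the same plain WebXR call. A minimal detection sketch (the helper name detectXRModes is our own):

```javascript
// Returns the list of WebXR session modes the browser supports.
// `xr` is navigator.xr in a browser; it may be undefined in browsers
// without WebXR support.
async function detectXRModes(xr) {
  const modes = ['inline', 'immersive-vr', 'immersive-ar'];
  if (!xr) return [];
  const results = await Promise.all(
    // isSessionSupported resolves to a boolean; treat rejections as "no".
    modes.map((mode) => xr.isSessionSupported(mode).catch(() => false))
  );
  return modes.filter((_, i) => results[i]);
}

// In a browser:
//   detectXRModes(navigator.xr).then((modes) => console.log(modes));
```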

A-Frame is a web framework for building 3D/AR/VR experiences using a combination of HTML and JavaScript.

A-Frame is based on three.js and has a large community, as well as lots of community-made custom elements and components.

<html>
  <head>
    <script src="https://aframe.io/releases/1.2.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>

Babylon.js is an easy-to-use real-time 3D game engine built using TypeScript. It has full WebXR support out of the box, including gaze and teleportation support, experimental AR features, and more. To simplify WebXR development, Babylon.js offers the WebXR Experience Helper, a one-stop shop for all XR-related functionality.

To get started use the Babylon.js playground, or try these demos:

  • https://playground.babylonjs.com/#PPM311
  • https://playground.babylonjs.com/#JA1ND3#164

To start on your own use this simple template:

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8" />
        <title>Babylon - Getting Started</title>
        <!-- Link to the latest version of BabylonJS -->
        <script src="https://preview.babylonjs.com/babylon.js"></script>
        <style>
            html,
            body {
                overflow: hidden;
                width: 100%;
                height: 100%;
                margin: 0;
                padding: 0;
            }

            #renderCanvas {
                width: 100%;
                height: 100%;
                touch-action: none;
            }
        </style>
    </head>

    <body>
        <canvas id="renderCanvas"></canvas>
        <script>
            window.addEventListener('DOMContentLoaded', async function () {
                // get the canvas DOM element
                var canvas = document.getElementById('renderCanvas');
                // load the 3D engine
                var engine = new BABYLON.Engine(canvas, true);
                // createScene function that creates and return the scene
                var createScene = async function () {
                    // create a basic BJS Scene object
                    var scene = new BABYLON.Scene(engine);
                    // create a FreeCamera, and set its position to (x:0, y:5, z:-10)
                    var camera = new BABYLON.FreeCamera('camera1', new BABYLON.Vector3(0, 5, -10), scene);
                    // target the camera to scene origin
                    camera.setTarget(BABYLON.Vector3.Zero());
                    // attach the camera to the canvas
                    camera.attachControl(canvas, false);
                    // create a basic light, aiming 0,1,0 - meaning, to the sky
                    var light = new BABYLON.HemisphericLight('light1', new BABYLON.Vector3(0, 1, 0), scene);
                    // create a built-in "sphere" shape; its constructor takes 6 params: name, segment, diameter, scene, updatable, sideOrientation 
                    var sphere = BABYLON.Mesh.CreateSphere('sphere1', 16, 2, scene);
                    // move the sphere upward 1/2 of its height
                    sphere.position.y = 1;
                    // create a built-in "ground" shape;
                    var ground = BABYLON.Mesh.CreateGround('ground1', 6, 6, 2, scene);

                    // Add XR support
                    var xr = await scene.createDefaultXRExperienceAsync({/* configuration options, as needed */})
                    // return the created scene
                    return scene;
                }

                // call the createScene function
                var scene = await createScene();

                // run the render loop
                engine.runRenderLoop(function () {
                    scene.render();
                });

                // the canvas/window resize event handler
                window.addEventListener('resize', function () {
                    engine.resize();
                });
            });
        </script>
    </body>
</html>

For advanced examples and documentation, see the Babylon.js WebXR documentation page.

Model Viewer is a custom HTML element for displaying 3D models and viewing them in AR.

AR

<!-- Import the component -->
<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.js"></script>
<script nomodule src="https://unpkg.com/@google/model-viewer/dist/model-viewer-legacy.js"></script>

<!-- Use it like any other HTML element -->
<model-viewer src="examples/assets/Astronaut.glb" ar alt="A 3D model of an astronaut" auto-rotate camera-controls background-color="#455A64"></model-viewer>

Needle Engine Banner

Needle Engine is a web engine for complex and simple 3D applications alike. It is flexible, extensible and has built-in support for collaboration and XR! It is built around the glTF standard for 3D assets.

Powerful integrations for Unity and Blender allow artists and developers to collaborate and manage web applications inside battle-tested 3d editors. Needle Engine integrations allow you to use editor features for exporting models, author materials, animate and sequence animations, bake lightmaps and more with ease.

Our powerful and easy-to-use compression and optimization pipeline for the web makes sure your files are ready, small, and fast to load!

Follow the Getting Started Guide to download and install Needle Engine. You can also find a list of sample projects that you can try live in the browser and download to give your project a head start.
For writing custom components, read the Scripting Guide.

p5.xr is an add-on for p5.js, a JavaScript library that makes coding accessible for artists, designers, educators, and beginners. p5.xr adds the ability to run p5 sketches in Augmented Reality or Virtual Reality.

p5.xr also works in the p5.js online editor; simply add a script tag pointing to the latest p5.xr release in the index.html file.

<!DOCTYPE html>
<html>
<head>
    <script src="https://cdn.jsdelivr.net/npm/p5/lib/p5.js"></script>
    <script src="https://github.com/stalgiag/p5.xr/releases/download/0.3.2-rc.3/p5xr.min.js"></script>
</head>
<body>
    <script>
        function preload() {
            createVRCanvas();
        }

        function setup() {
            setVRBackgroundColor(0, 0, 255);
            angleMode(DEGREES);
        }

        function draw() {
            rotateX(-90);
            fill(0, 255, 0);
            noStroke();
            plane(10, 10);
        }
    </script>
</body>
</html>

PlayCanvas is an open-source game engine. It uses HTML5 and WebGL to run games and other interactive 3D content in any mobile or desktop browser.

Full documentation is available on the PlayCanvas Developer site, including the API reference. Also check out the XR tutorials (with sources) using the online Editor, as well as engine-only examples and their source code.

Below is a basic example of setting up a PlayCanvas application: a simple scene with a light and some cubes arranged in a grid, which starts an immersive VR session on click/touch if WebXR is supported:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>PlayCanvas Basic VR</title>
    <meta charset="utf-8">
    <script src="https://unpkg.com/playcanvas"></script>
    <style type="text/css">
        body {
            margin: 0;
            overflow: hidden;
        }
        canvas {
            width: 100%;
            height: 100%;
        }
    </style>
</head>
<body>
    <canvas id="canvas"></canvas>
    <script>
        let canvas = document.getElementById('canvas');

        // create application
        let app = new pc.Application(canvas, {
            mouse: new pc.Mouse(canvas),
            touch: new pc.TouchDevice(canvas)
        });

        // set resizing rules
        app.setCanvasFillMode(pc.FILLMODE_FILL_WINDOW);
        app.setCanvasResolution(pc.RESOLUTION_AUTO);
        // handle window resize
        window.addEventListener("resize", function () {
            app.resizeCanvas(canvas.width, canvas.height);
        });

        // use device pixel ratio
        app.graphicsDevice.maxPixelRatio = window.devicePixelRatio;

        // start an application
        app.start();


        // create camera
        let cameraEntity = new pc.Entity();
        cameraEntity.addComponent("camera", {
            clearColor: new pc.Color(0.3, 0.3, 0.3)
        });
        app.root.addChild(cameraEntity);

        // create light
        let light = new pc.Entity();
        light.addComponent("light", {
            type: "spot",
            range: 30
        });
        light.translate(0,10,0);
        app.root.addChild(light);

        let SIZE = 8;

        // create floor plane
        let plane = new pc.Entity();
        plane.addComponent("model", {
            type: "plane"
        });
        plane.setLocalScale(SIZE * 2, 1, SIZE * 2);
        app.root.addChild(plane);

        // create a grid of cubes
        for (let x = 0; x < SIZE; x++) {
            for (let z = 0; z < SIZE; z++) {
                let cube = new pc.Entity();
                cube.addComponent("model", {
                    type: "box"
                });
                cube.setPosition(2 * x - SIZE + 1, 0.5, 2 * z - SIZE + 1);
                app.root.addChild(cube);
            }
        }


        // if XR is supported
        if (app.xr.supported) {
            // handle mouse / touch events
            let onTap = function (evt) {
                // if immersive VR supported
                if (app.xr.isAvailable(pc.XRTYPE_VR)) {
                    // start immersive VR session
                    cameraEntity.camera.startXr(pc.XRTYPE_VR, pc.XRSPACE_LOCALFLOOR);
                }
                evt.event.preventDefault();
                evt.event.stopPropagation();
            };
            // attach mouse / touch events
            app.mouse.on("mousedown", onTap);
            app.touch.on("touchend", onTap);
        }
    </script>
</body>
</html>

react-xr is a collection of hooks to help you build XR experiences in react-three-fiber applications. To make a VR React application we’ll use the following stack:

The Stack

Three.js is a library for 3D graphics, react-three-fiber is a React renderer for Three.js, drei is a collection of reusable components for r3f, and react-xr is a collection of hooks to help you build XR experiences in react-three-fiber applications.

react-xr

As soon as you have a 3D scene using react-three-fiber, you can make it available in VR or AR with react-xr.

For that, the only thing you need to do is replace the <Canvas> component with <VRCanvas> or <ARCanvas> from the react-xr package. It’s still the same canvas component, but with all the additional wiring necessary for VR to function.

Take a look at these simple examples:

VR

https://codesandbox.io/s/react-xr-simple-demo-8i9ro

AR

https://codesandbox.io/s/react-xr-simple-ar-demo-8w8hm

You’ll notice that you now have an “Enter VR/AR” button at the bottom of the screen that starts the experience.

Adding controllers

To add controllers, you can use the <DefaultXRControllers /> component from the react-xr package. It will load the appropriate controller models and put them in the scene.

<VRCanvas> {/* or <ARCanvas> */}
    <DefaultXRControllers />
</VRCanvas>

Interactivity

To interact with objects using controllers, you can use the <Interactive> component or the useInteraction hook. They allow adding handlers to your objects. All interactions are rays shot from the controllers.

Here is a short example:

const [isHovered, setIsHovered] = useState(false)

return (
  <Interactive onSelect={() => console.log('clicked!')} onHover={() => setIsHovered(true)} onBlur={() => setIsHovered(false)}>
    <Box />
  </Interactive>
)

You can also see this method in the VR and AR examples above.

Learn more

We have barely scratched the surface of what is possible with libraries like react-three-fiber and react-xr. I encourage you to check out more examples in the GitHub repositories here and here. Remember, every r3f scene can easily be adjusted to be available in WebXR.

Three.js is a cross-browser JavaScript library used to create and display animated 3D computer graphics in a web browser. It has a large community, good docs, and many examples.

Using VR is largely the same as in regular Three.js applications: set up the scene, camera, and renderer. The major difference is setting the xr.enabled flag to true on the renderer. There is an optional VRButton class that creates a button to enter and exit VR for you.

For more info, see this guide to VR in Three.js and the WebXR examples.

Here is a full example that sets up a scene with a rotating red cube.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Title</title>
    <style type="text/css">
        body {
            margin: 0;
            background-color: #000;
        }
        canvas {
            display: block;
        }
    </style>
</head>
<body>
    <script type="module">
        // Import three
        import * as THREE from 'https://unpkg.com/three/build/three.module.js';
        // Import the default VRButton
        import { VRButton } from 'https://unpkg.com/three/examples/jsm/webxr/VRButton.js';

        // Make a new scene
        let scene = new THREE.Scene();
        // Set background color of the scene to gray
        scene.background = new THREE.Color(0x505050);

        // Make a camera. Note that far is set to 100, which is better for real-world sized environments
        let camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 100);
        camera.position.set(0, 1.6, 3);
        scene.add(camera);

        // Add some lights
        var light = new THREE.DirectionalLight(0xffffff,0.5);
        light.position.set(1, 1, 1).normalize();
        scene.add(light);
        scene.add(new THREE.AmbientLight(0xffffff,0.5))

        // Make a red cube
        let cube = new THREE.Mesh(
            new THREE.BoxGeometry(1,1,1),
            new THREE.MeshLambertMaterial({color:'red'})
        );
        cube.position.set(0, 1.5, -10);
        scene.add(cube);

        // Make a renderer that fills the screen
        let renderer = new THREE.WebGLRenderer({antialias: true});
        renderer.setPixelRatio(window.devicePixelRatio);
        renderer.setSize(window.innerWidth, window.innerHeight);
        // Turn on VR support
        renderer.xr.enabled = true;
        // Set animation loop
        renderer.setAnimationLoop(render);
        // Add canvas to the page
        document.body.appendChild(renderer.domElement);

        // Add a button to enter/exit vr to the page
        document.body.appendChild(VRButton.createButton(renderer));

        // For AR instead, import ARButton at the top
        //    import { ARButton } from 'https://unpkg.com/three/examples/jsm/webxr/ARButton.js';
        // then create the button
        //  document.body.appendChild(ARButton.createButton(renderer));

        // Handle browser resize
        window.addEventListener('resize', onWindowResize, false);

        function onWindowResize() {
            camera.aspect = window.innerWidth / window.innerHeight;
            camera.updateProjectionMatrix();
            renderer.setSize(window.innerWidth, window.innerHeight);
        }

        function render(time) {
            // Rotate the cube
            cube.rotation.y = time / 1000;
            // Draw everything
            renderer.render(scene, camera);
        }
    </script>
</body>
</html>

Here is a full example of an immersive-ar demo made using three.js

<!DOCTYPE html>
<html lang="en">
	<head>
		<title>three.js ar - cones</title>
		<meta charset="utf-8">
		<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=no">
		<link type="text/css" rel="stylesheet" href="main.css">
	</head>
	<body>

		<div id="info">
			<a href="https://threejs.org" target="_blank" rel="noopener">three.js</a> ar - cones<br/>
		</div>

		<script type="module">

			import * as THREE from 'https://unpkg.com/three/build/three.module.js';
			import { ARButton } from 'https://unpkg.com/three/examples/jsm/webxr/ARButton.js';

			var container;
			var camera, scene, renderer;
			var controller;

			init();
			animate();

			function init() {

				container = document.createElement( 'div' );
				document.body.appendChild( container );

				scene = new THREE.Scene();

				camera = new THREE.PerspectiveCamera( 70, window.innerWidth / window.innerHeight, 0.01, 20 );

				var light = new THREE.HemisphereLight( 0xffffff, 0xbbbbff, 1 );
				light.position.set( 0.5, 1, 0.25 );
				scene.add( light );

				//

				renderer = new THREE.WebGLRenderer( { antialias: true, alpha: true } );
				renderer.setPixelRatio( window.devicePixelRatio );
				renderer.setSize( window.innerWidth, window.innerHeight );
				renderer.xr.enabled = true;
				container.appendChild( renderer.domElement );

				//

				document.body.appendChild( ARButton.createButton( renderer ) );

				//

				var geometry = new THREE.CylinderBufferGeometry( 0, 0.05, 0.2, 32 ).rotateX( Math.PI / 2 );

				function onSelect() {

					var material = new THREE.MeshPhongMaterial( { color: 0xffffff * Math.random() } );
					var mesh = new THREE.Mesh( geometry, material );
					mesh.position.set( 0, 0, - 0.3 ).applyMatrix4( controller.matrixWorld );
					mesh.quaternion.setFromRotationMatrix( controller.matrixWorld );
					scene.add( mesh );

				}

				controller = renderer.xr.getController( 0 );
				controller.addEventListener( 'select', onSelect );
				scene.add( controller );

				//

				window.addEventListener( 'resize', onWindowResize, false );

			}

			function onWindowResize() {

				camera.aspect = window.innerWidth / window.innerHeight;
				camera.updateProjectionMatrix();

				renderer.setSize( window.innerWidth, window.innerHeight );

			}

			//

			function animate() {

				renderer.setAnimationLoop( render );

			}

			function render() {

				renderer.render( scene, camera );

			}

		</script>
	</body>
</html>

Unity is a GUI-based game engine. It has a number of unofficial WebXR extensions.

WebXR Export

Create a new Unity Project (2019.4.7f1 or later in the 2019.4.x cycle). Switch the platform to WebGL.

Import WebXR Export and WebXR Interactions packages from OpenUPM.

Once the packages are imported, go to Window > WebXR > Copy WebGLTemplates.

Copy WebGLTemplates

After the WebGLTemplates are in the Assets folder, open the XR Plug-in Management tab in the Project Settings window and select the WebXR Export plug-in provider.

XR Plug-in Management

Now you can import the Sample Scene from Window > Package Manager > WebXR Interactions > Import into Project.

Import Sample Scene

In Project Settings > Player > Resolution and Presentation, select WebXR as the WebGL Template.

Resolution and Presentation

Now you can build the project.

Build

Make sure to build it from Build Settings > Build. Unity’s Build And Run server uses HTTP, but WebXR requires a secure context, so run the build on your own HTTPS server.

Result

That’s it.

Verge3D is an artist-friendly toolkit that allows Blender, 3ds Max, or Maya artists to create immersive web-based experiences. Verge3D can be used to build interactive animations, product configurators, engaging presentations of any kind, online stores, explainers, e-learning content, portfolios, and browser games.

Setting up Virtual Reality

We recommend enabling the Legacy VR option in the app creation settings in the App Manager in order to support a wider range of browsers (such as Mozilla Firefox) and devices.

Cardboard devices should work out of the box in any mobile browser, both on Android and iOS.

Google Daydream works in the stable Chrome browser on Android phones, while HTC and Oculus devices should work in both the Chrome and Firefox browsers.

Please note that WebXR requires a secure context. Verge3D apps must be served over HTTPS/SSL, or from a localhost URL.

The VR mode can be set up for any Verge3D app using enter VR mode puzzle.

Interaction with 3D objects is performed using the gaze-based reticle pointer, which is automatically provided for VR devices without controllers (such as cardboards).

For VR devices with controllers, interaction is performed via a virtual ray cast from the controllers.

You can use the standard when hovered or when clicked puzzles to capture user events, as well as the VR-specific on session event.

Setting up Augmented Reality

You can run your Verge3D-based augmented reality applications on mobile devices with Android or iOS/iPadOS operating systems.

Android

To enable augmented reality, you need an Android device that supports the ARCore technology and the latest Google Chrome browser. You also need to install Google Play Services for AR. The installation of this package is prompted automatically upon entering AR mode for the first time, if it is not pre-installed.

iOS/iPadOS

Mozilla’s WebXR Viewer is a Firefox-based browser application which supports the AR technology on Apple devices (starting from iPhone 6s). Simply install it from the App Store.

Creating AR Apps

The AR mode can be set up for any Verge3D app using the enter AR mode puzzle.

Upon entering AR mode you will be able to position your 3D content in the “real” coordinate system, which is aligned with your mobile device. In addition, you can detect horizontal surfaces (tables, shelves, the floor, etc.) by using the detect horizontal surface AR puzzle.

Also, to see the real environment through your 3D canvas, you should enable the transparent background option in the configure application puzzle.

What’s Next

Check out the User Manual for more info on creating AR/VR applications with Verge3D or see the tutorials for beginners on YouTube.

Got Questions?

Feel free to ask on the forums!

Wonderland Engine is a highly performant WebXR focused development platform.

The Wonderland Editor (Windows, macOS, Linux) makes WebXR development accessible and provides a very efficient workflow, e.g. by reloading the browser for you whenever your files change.

WebAssembly and optimizations like automatically batching your scene allow you to draw many objects without having to worry about performance.

Start with the Quick Start Guide and find a list of examples to help you get started. To start writing custom code, check out the JavaScript Getting Started Guide and refer to the JavaScript API Documentation.

Wonderland Engine Screenshot


Tooling

Screenshot of the Immersive Web Emulator

Immersive Web Emulator

Meta

The Immersive Web Emulator browser extension lets you test your WebXR-enabled pages on emulated Meta Quest devices. It's very useful for approximating how your experience will work on a real device and for quickly testing from within your browser.

WebXR Input Profiles

W3C Immersive Web Working Group

This repo provides a JavaScript library for managing known motion controller profiles, loading the most ideal known profile for a supplied input source, and creating a MotionController object that binds them together. Developers can use this library to interact with the conceptual components of an input source, rather than with each individual button or axis.
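As a sketch of how the pieces fit together: the helper below (a hypothetical name of our own) reads the per-component state a MotionController exposes, and the commented lines show the library's documented fetchProfile / MotionController flow. The exact shape of `components` (id → { values: { state } }) is our reading of the library's docs; treat the details as assumptions:

```javascript
// Return the ids of all components (trigger, squeeze, thumbstick, ...)
// currently in the 'pressed' state on a MotionController-like object.
function pressedComponents(controller) {
  return Object.entries(controller.components)
    .filter(([, component]) => component.values && component.values.state === 'pressed')
    .map(([id]) => id);
}

// With the real library (per its README, in a browser):
//   import { fetchProfile, MotionController } from '@webxr-input-profiles/motion-controllers';
//   const { profile, assetPath } = await fetchProfile(xrInputSource, basePath);
//   const controller = new MotionController(xrInputSource, profile, assetPath);
//   controller.updateFromGamepad();   // call once per frame
//   pressedComponents(controller);    // → ids of currently pressed components
```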

Support Table for the WebXR Device API

If you notice missing data, you can make changes on GitHub.

In the original table each feature links to its Explainer, Spec and MDN documentation, with support listed for Chrome, Safari on visionOS, Mozilla's WebXR Viewer, Magic Leap Helio, Samsung Internet, Meta Quest Browser, Microsoft Edge, Wolvic, the PICO Browser, Servo, the Immersive Web Emulator and the WebXR polyfill:

WebXR Core: Chrome 79; Safari on visionOS (behind a feature flag); WebXR Viewer on iOS; Magic Leap Helio 0.98; Samsung Internet 12.0; Meta Quest Browser 7.0 (December 2019); Edge 87 on Windows desktop and Edge 91 on HoloLens 2; Wolvic 0.9.3 (February 2022); supported in the PICO Browser, Servo and the Immersive Web Emulator. The polyfill is supported and will make use of WebVR if available and WebXR is not.

WebXR AR Module: Chrome for Android 81; WebXR Viewer on iOS; Magic Leap Helio 0.98; Samsung Internet 12.1; Meta Quest Browser 24.0 (October 2022); Edge 91 (HoloLens 2 only); PICO Browser 3.3.32.

WebXR Gamepads Module: Chrome 79; partially supported on Magic Leap Helio 0.98; Samsung Internet 12.0; Meta Quest Browser 7.1 (December 2019); Edge 87 on Windows desktop and Edge 91 on HoloLens 2; Wolvic 0.9.3 (February 2022); supported in the PICO Browser, Servo, the Immersive Web Emulator and the polyfill.

Hit Test: Chrome for Android 81; WebXR Viewer on iOS; Samsung Internet 12.1; Meta Quest Browser 25.3 (January 2023); Edge 93 (HoloLens 2 only).

DOM Overlays: Chrome for Android 83; WebXR Viewer on iOS; Samsung Internet 14.2.

Layers: Meta Quest Browser 16.1; PICO Browser 3.3.32; a polyfill is available.

Hand Input: Chrome 131; Safari on visionOS (behind a feature flag); Meta Quest Browser 15.1 (April 2021); Edge 93 (HoloLens 2 only); supported in the Immersive Web Emulator.

Anchors: Chrome for Android 85; Samsung Internet 15.2; Meta Quest Browser 24.0 (October 2022); Edge 93 (HoloLens 2 only).

Depth Sensing: Chrome for Android 90.

Light Estimation: Chrome for Android 90; Samsung Internet 15.2.

Additional details (linked from the original table): Hardware Support Details, the Magic Leap Helio 2.0 announcement, the Meta Quest Browser Dev Docs, The new Microsoft Edge for Windows Mixed Reality, Introducing Wolvic, and GitHub.