Mesh Detection

Mesh Detection is designed to enhance augmented/mixed reality experiences by enabling the detection and manipulation of 3D meshes in a physical environment. This API allows web-based AR applications to understand the structure of the real world by detecting surfaces, objects, and environments in 3D space, enabling more realistic and interactive mixed reality experiences.


In this example, when you perform a pinch gesture (or a click using the controller), the detected meshes become visible. As new meshes are detected, a grid material is applied to each one; its default colors render the meshes with a black fill outlined by light blue grid lines.

🥽
To experience the following example, an XR headset is required. Additionally, you may need to configure your environment (for instance, for Oculus Quest 3, refer to this guide for instructions on setting up your space).
import { useEffect } from "react";
import { WebXRFeatureName, WebXRMeshDetector } from "@babylonjs/core";
import { GridMaterial } from "@babylonjs/materials";

// useScene / useXrExperience are hooks assumed to be provided by your
// Babylon.js React integration (e.g. react-babylonjs).
const scene = useScene();
const xrExperience = useXrExperience();

useEffect(() => {
    // Enable mesh detection and let Babylon.js generate a renderable
    // mesh for each detected real-world surface.
    const meshDetector = xrExperience.baseExperience.featuresManager.enableFeature(WebXRFeatureName.MESH_DETECTION, 'latest', {
        generateMeshes: true,
    }) as WebXRMeshDetector;

    meshDetector.onMeshAddedObservable.add(meshData => {
        if (meshData && meshData.mesh) {
            // GridMaterial's defaults draw a black fill with light blue lines.
            const material = new GridMaterial("grid");
            material.gridRatio = 0.1;
            material.majorUnitFrequency = 2;
            // Keep detected meshes hidden until the user pinches or clicks.
            meshData.mesh.isVisible = false;
            meshData.mesh.material = material;
        }
    });

    // A pinch gesture (or controller click) reveals every detected mesh.
    scene.onPointerDown = () => {
        scene.meshes.forEach(mesh => mesh.isVisible = true);
    };
}, []);
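Beyond `onMeshAddedObservable`, the detector also fires observables as the scan evolves, and it is common to track meshes in a registry so removed geometry can be cleaned up. The following is a minimal, runnable sketch of that add/update/remove lifecycle; `MiniObservable` and `FakeDetector` are stand-ins invented here so the pattern runs outside a headset — in a real session you would attach the same handlers to the `meshDetector` from the snippet above.

```typescript
// Stand-ins for Babylon.js Observable / WebXRMeshDetector, used only
// so the wiring pattern can run without an XR session.
type MeshData = { id: number };

class MiniObservable<T> {
    private handlers: Array<(data: T) => void> = [];
    add(handler: (data: T) => void) { this.handlers.push(handler); }
    notify(data: T) { this.handlers.forEach(h => h(data)); }
}

class FakeDetector {
    onMeshAddedObservable = new MiniObservable<MeshData>();
    onMeshUpdatedObservable = new MiniObservable<MeshData>();
    onMeshRemovedObservable = new MiniObservable<MeshData>();
}

const detector = new FakeDetector();
const tracked = new Map<number, MeshData>();

// The same wiring you would do on a real WebXRMeshDetector:
detector.onMeshAddedObservable.add(m => tracked.set(m.id, m));
detector.onMeshUpdatedObservable.add(m => tracked.set(m.id, m));
detector.onMeshRemovedObservable.add(m => tracked.delete(m.id));

// Simulate a scan: two meshes appear, then one is removed.
detector.onMeshAddedObservable.notify({ id: 1 });
detector.onMeshAddedObservable.notify({ id: 2 });
detector.onMeshRemovedObservable.notify({ id: 1 });

console.log(tracked.size); // 1
```

Keeping such a registry makes it straightforward to dispose Babylon meshes when the runtime reports their real-world counterparts gone.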

Video captured with Oculus Quest 3.


Learn More

For a deeper understanding, refer to the WebXR Real-World Meshing specification: https://immersive-web.github.io/real-world-meshing/.
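At the spec level, each detected mesh is exposed as flat vertex and index buffers. As a rough illustration of what that data looks like, here is a hedged sketch: the `MeshBuffers` shape is an assumption modeled on the spec's `XRMesh` attributes (it is not a Babylon.js API), and `describeMesh` simply reads basic geometry info out of such buffers.

```typescript
// Assumed shape modeled on the spec's XRMesh (vertices / indices buffers).
type MeshBuffers = { vertices: Float32Array; indices: Uint32Array };

function describeMesh(m: MeshBuffers) {
    // Every three indices form one triangle.
    const triangleCount = m.indices.length / 3;
    // Axis-aligned bounding box over the packed x/y/z triples.
    const min = [Infinity, Infinity, Infinity];
    const max = [-Infinity, -Infinity, -Infinity];
    for (let i = 0; i < m.vertices.length; i += 3) {
        for (let axis = 0; axis < 3; axis++) {
            min[axis] = Math.min(min[axis], m.vertices[i + axis]);
            max[axis] = Math.max(max[axis], m.vertices[i + axis]);
        }
    }
    return { triangleCount, min, max };
}

// A unit quad made of two triangles:
const quad: MeshBuffers = {
    vertices: new Float32Array([0,0,0, 1,0,0, 1,1,0, 0,1,0]),
    indices: new Uint32Array([0,1,2, 0,2,3]),
};
console.log(describeMesh(quad).triangleCount); // 2
```

When `generateMeshes` is enabled, Babylon.js consumes these buffers for you; working with them directly is only needed for custom geometry processing.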