Plane Detection

Plane detection refers to the process of identifying and tracking flat, 2D surfaces in the real world using an AR/MR-capable device (e.g., a smartphone, tablet, or AR headset). These surfaces can be horizontal (e.g., floors or tables) or vertical (e.g., walls). Once detected, virtual objects can be anchored to these surfaces, appearing to interact with the physical world as if they were placed on them.


In the following example, when a new surface is detected, the engine creates a visual representation of it, giving each detected plane a unique color and making it semi-transparent. If the surface changes, the engine updates the visual representation accordingly. When a plane is no longer detected or removed, the corresponding visual representation is also removed. Additionally, when starting a new augmented reality session, all previously detected surfaces are cleared away to ensure a fresh start.

🥽📱
To experience the following example, an XR headset or a mobile device with AR capabilities is required. Additionally, if you are using an XR headset, you may need to configure your environment (for instance, on an Oculus Quest 3, refer to this guide for instructions on setting up your space).
useEffect(() => {
    // Assumes the Babylon.js imports at the top of the file, e.g.:
    // import { Color3, Mesh, PolygonMeshBuilder, Quaternion, StandardMaterial, Vector2, Vector3, WebXRPlaneDetector } from '@babylonjs/core';
    // `Plane` is the detected-plane type, e.g. type Plane = IWebXRPlane & { mesh: Mesh };
    // `xrExperience` (a WebXRDefaultExperience) and `scene` come from the surrounding component.
    // Note: PolygonMeshBuilder requires earcut to be available (globally or injected).
    const featuresManager = xrExperience.baseExperience.featuresManager;
    const xrPlanes = featuresManager.enableFeature(WebXRPlaneDetector, 'latest') as WebXRPlaneDetector;
    const planes: Array<Mesh> = []; // visual representations, indexed by plane id
    
    function createPlane(plane: Plane) {
        // close the boundary loop by repeating the first point
        plane.polygonDefinition.push(plane.polygonDefinition[0]);
        // project the boundary onto the XZ plane and triangulate it
        const polygonTriangulation = new PolygonMeshBuilder(
            `polygon-triangulation-${plane.id}`,
            plane.polygonDefinition.map((p: Vector3) => new Vector2(p.x, p.z)),
            scene,
        );
        const polygon = polygonTriangulation.build(false, 0.01);
        plane.mesh = polygon;
        planes[plane.id] = plane.mesh;
        polygon.createNormals(true);
        // polygon.receiveShadows = true;
        plane.mesh.rotationQuaternion = new Quaternion();
        plane.transformationMatrix.decompose(plane.mesh.scaling, plane.mesh.rotationQuaternion, plane.mesh.position);
        return plane;
    }
    
    xrPlanes.onPlaneAddedObservable.add(_plane => {
        const plane = createPlane(_plane as Plane);
        const mat = new StandardMaterial(`material-${plane.id}`, scene);
        mat.alpha = 0.9;
        mat.diffuseColor = Color3.Random();
        plane.mesh.material = mat;
    });
    xrPlanes.onPlaneUpdatedObservable.add(_plane => {
        const plane = _plane as Plane;
        // skip updates whose boundary contains invalid points,
        // before disposing anything
        if (plane.polygonDefinition.some(p => !p)) {
            return;
        }
        let mat;
        if (plane.mesh) {
            // keep the material, dispose the old polygon
            mat = plane.mesh.material;
            plane.mesh.dispose(false, false);
        }
        createPlane(plane);
        plane.mesh.material = mat!;
    });
    // dispose mesh when plane is removed
    xrPlanes.onPlaneRemovedObservable.add(plane => {
        if (plane && planes[plane.id]) {
            planes[plane.id].dispose();
            delete planes[plane.id];
        }
    });
    // remove old planes when you enter in a new xr session
    xrExperience.baseExperience.sessionManager.onXRSessionInit.add(() => {
        planes.forEach(plane => plane.dispose());
        planes.length = 0;
    });
}, []);
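The geometry step inside createPlane can be viewed in isolation: WebXR delivers each plane boundary as an open list of 3D points in the plane's local space, so the contour is closed by repeating the first point and then projected onto the XZ plane for 2D triangulation. A minimal sketch of that step, using plain point objects instead of Babylon.js vectors (the Point3/Point2 names are illustrative, not part of any API):

```typescript
// Plain stand-ins for Babylon.js Vector3/Vector2 (illustrative only).
interface Point3 { x: number; y: number; z: number; }
interface Point2 { x: number; y: number; }

// Close an open boundary loop and project it onto the XZ plane,
// mirroring what createPlane does before handing the contour to
// PolygonMeshBuilder (which triangulates in 2D).
function toClosedContour(boundary: Point3[]): Point2[] {
    if (boundary.length === 0) return [];
    const closed = [...boundary, boundary[0]]; // repeat first point to close the loop
    return closed.map(p => ({ x: p.x, y: p.z })); // drop Y: the boundary is flat in local space
}

const contour = toClosedContour([
    { x: 0, y: 0, z: 0 },
    { x: 1, y: 0, z: 0 },
    { x: 1, y: 0, z: 1 },
]);
console.log(contour.length); // 4: the first point is repeated at the end
```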
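The add/update/remove bookkeeping in the observables above reduces to a map from plane id to its visual representation. A sketch of those invariants with plain objects (no Babylon.js types; FakeMesh is a hypothetical stand-in): an update replaces the stored mesh but keeps its material, a removal disposes the entry, and a new session clears everything.

```typescript
// Stand-in for a Babylon.js mesh: tracks disposal and a material tag.
class FakeMesh {
    disposed = false;
    constructor(public material: string) {}
    dispose() { this.disposed = true; }
}

// Map from plane id to its visual representation, as in the example above.
const registry = new Map<number, FakeMesh>();

function onPlaneAdded(id: number) {
    registry.set(id, new FakeMesh(`material-${id}`));
}

function onPlaneUpdated(id: number) {
    const old = registry.get(id);
    if (!old) return;
    old.dispose();                                 // rebuild the polygon...
    registry.set(id, new FakeMesh(old.material));  // ...but keep its material
}

function onPlaneRemoved(id: number) {
    registry.get(id)?.dispose();
    registry.delete(id);
}

function onSessionInit() {
    registry.forEach(mesh => mesh.dispose()); // fresh start: drop all stale planes
    registry.clear();
}
```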
 

Video captured with Oculus Quest 3.


Learn More

For advanced use cases and customizations, please refer to Babylon.js documentation: https://doc.babylonjs.com/features/featuresDeepDive/webXR/webXRARFeatures/#plane-detection.