Depth Mesh

Depth mesh is the creation of triangle-based meshes from surfaces detected by Nreal Light. The mesh is used for real-time occlusion rendering and for collision detection with digital content. Unlike Plane Detection, which only detects planar surfaces, Meshing can detect a wide variety of surfaces.
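Because the reconstructed surfaces arrive as ordinary Unity meshes, occlusion is typically set up by rendering them with a depth-only material. The sketch below is a minimal illustration of that idea, not the exact NRSDK API: OnMeshBlockUpdated is a hypothetical callback from your meshing setup, and occlusionMaterial is a project-supplied material that writes depth only (for example, ZWrite On with ColorMask 0).

```csharp
using UnityEngine;

public class MeshOcclusionApplier : MonoBehaviour
{
    // A depth-only material created in the project (writes depth, no color).
    [SerializeField] private Material occlusionMaterial;

    // Hypothetical hook: call this from your meshing callback with the
    // GameObject that holds a newly created or updated mesh block.
    public void OnMeshBlockUpdated(GameObject meshBlock)
    {
        var meshRenderer = meshBlock.GetComponent<MeshRenderer>();
        if (meshRenderer != null)
        {
            // Rendering only depth lets real-world geometry hide
            // virtual objects positioned behind it.
            meshRenderer.sharedMaterial = occlusionMaterial;
        }
    }
}
```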

Capabilities

Depth mesh enables a range of AR capabilities:

  • Visualization: Provide visual feedback as users scan their environment.

  • Occlusion: Hide virtual objects behind real ones, especially in larger outdoor environments where Depth-based Occlusion may not be practical.

  • Physics: With the help of MeshColliders, virtual objects and characters can interact with the environment and real objects (see the sketch after this list).

  • Procedural Object Placement: Populate the AR world with more flexibility than Plane Detection (a placement sketch also follows below).
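For the physics case, the key step is attaching a MeshCollider to each generated mesh block so rigidbodies can collide with and rest on reconstructed surfaces. This is a minimal sketch that assumes the same hypothetical OnMeshBlockUpdated entry point as the occlusion example above; the actual callback name depends on your meshing setup.

```csharp
using UnityEngine;

public class MeshPhysicsApplier : MonoBehaviour
{
    // Hypothetical hook: call this whenever a mesh block is added or updated.
    public void OnMeshBlockUpdated(GameObject meshBlock)
    {
        var meshFilter = meshBlock.GetComponent<MeshFilter>();
        if (meshFilter == null)
        {
            return;
        }

        var meshCollider = meshBlock.GetComponent<MeshCollider>();
        if (meshCollider == null)
        {
            meshCollider = meshBlock.AddComponent<MeshCollider>();
        }

        // Keep the collider in sync with the latest reconstructed mesh.
        meshCollider.sharedMesh = meshFilter.sharedMesh;
    }
}
```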
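For procedural placement, one simple approach is a Physics.Raycast against the MeshColliders attached to the generated mesh blocks: cast toward the reconstructed surface and spawn content at the hit point. The sketch below assumes such colliders exist; spawnPrefab is a placeholder asset, not part of NRSDK.

```csharp
using UnityEngine;

public class MeshPlacementExample : MonoBehaviour
{
    // Placeholder prefab to spawn on the reconstructed surface.
    [SerializeField] private GameObject spawnPrefab;

    public bool TryPlace(Vector3 pointAboveSurface)
    {
        // Cast downward from a point above the surface onto the mesh.
        if (Physics.Raycast(pointAboveSurface, Vector3.down, out RaycastHit hit, 5f))
        {
            // Align the spawned object with the surface normal.
            Instantiate(spawnPrefab, hit.point,
                Quaternion.FromToRotation(Vector3.up, hit.normal));
            return true;
        }
        return false;
    }
}
```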

Best Practice

Mesh quality depends on the image texture, which is affected by scene materials, lighting, motion speed, and other factors.

Good quality:

  • Uniform illumination

  • Smooth, slow device movement

  • Highly textured surfaces: carpets, wallpapers, wood floors

  • An appropriate distance range: 0.5 m to 3 m

Poor quality:

  • Moving objects

  • Dark environments

  • Transparent, semi-transparent, or reflective surfaces

  • Thin objects: chair legs, fences

  • Textureless surfaces: white walls, painted desktops

Device compatibility

Depth Mesh has been fully tested on the following Android phones:

  • OnePlus: 7T / 9

  • SONY: Xperia 5 III

  • OPPO: Find X2 / Find X3 Pro

  • HUAWEI: P30 / Mate 40 Pro

  • LG: VELVET / V50s / G900N / style

  • SAMSUNG: Galaxy Note20 Ultra / Galaxy Note20 / Galaxy S21 5G / Galaxy Z Fold3 5G / Galaxy S20 Ultra

Depth Mesh can also run on devices other than those listed above, but stability is not guaranteed. For the complete compatibility list, please refer to Device Compatibility.