To enable input support for XREAL devices (controller and hand tracking), simply drag the NRInput prefab into your scene hierarchy. It is used to query virtual or raw controller state, such as buttons, triggers, and capacitive touch data.
Raycast Mode: Choose between Laser / Gaze interaction. Laser is the default raycasting mode that most apps will use. In this mode, the ray starts from the center of the controller.
Input Source Type: Choose between Controller / Hands. Hands enables the hand tracking capability.
You may leave other fields unchanged.
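Beyond the Inspector, these settings can also be changed from code at runtime. The following is a minimal sketch, assuming the `NRInput.RaycastMode` property and the `NRInput.SetInputSource` method are available in your NRSDK version:

```csharp
// Switch between laser and gaze interaction at runtime
NRInput.RaycastMode = RaycastModeEnum.Gaze;

// Switch the input source to hand tracking
NRInput.SetInputSource(InputSourceEnum.Hands);
```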
Raycasters in NRInput
The raycaster class inherits from Unity's BaseRaycaster class. A raycaster's farthest raycasting distance can be modified directly in the Inspector window. You can also define which objects are interactable by changing its Mask parameter.
Handle Controller State Change
The primary usage of NRInput is to access controller button state through GetButton(), GetButtonDown(), and GetButtonUp().
GetButton() queries the current state of a button.
GetButtonDown() queries whether a button was pressed down this frame.
GetButtonUp() queries whether a button was released this frame.
Sample Usages:
```csharp
// Returns true if the Trigger button is currently being pressed
NRInput.GetButton(ControllerButton.TRIGGER);
// Returns true if the Trigger button was pressed down this frame
NRInput.GetButtonDown(ControllerButton.TRIGGER);
// Returns true if the Trigger button was released this frame
NRInput.GetButtonUp(ControllerButton.TRIGGER);
```
You can also add listeners to controller buttons:
Sample Usages:
```csharp
// Add a button-down listener for the right controller's trigger button
NRInput.AddDownListener(ControllerHandEnum.Right, ControllerButton.TRIGGER, () => { Debug.Log("do sth"); });
// Add a pressing listener for the right controller's trigger button
NRInput.AddPressingListener(ControllerHandEnum.Right, ControllerButton.TRIGGER, () => { Debug.Log("do sth"); });
// Add a button-up listener for the right controller's trigger button
NRInput.AddUpListener(ControllerHandEnum.Right, ControllerButton.TRIGGER, () => { Debug.Log("do sth"); });
```
Other Controller Events:
```csharp
/// Event invoked whenever a controller device is connected.
public static Action OnControllerConnected;
/// Event invoked whenever a controller device is disconnected.
public static Action OnControllerDisconnected;
/// Event invoked before controller devices are going to recenter.
public static Action OnBeforeControllerRecenter;
/// Event invoked whenever controller devices are recentering.
internal static Action OnControllerRecentering;
/// Event invoked whenever controller devices are recentered.
public static Action OnControllerRecentered;
/// Event invoked whenever controller devices' states are updated.
public static Action OnControllerStatesUpdated;
```
```csharp
// Get available features of the current controller, such as
// magnetometer, remaining battery, haptics, etc.
NRInput.GetControllerAvailableFeature();
// Get the controller's position. For the current XREAL device,
// this will always return Vector3.zero
NRInput.GetPosition();
// Get the controller's rotation
NRInput.GetRotation();
// Get the controller type: phone controller or computing unit controller
ControllerType controllerType = NRInput.GetControllerType();
// Whether the controller's touchpad is being touched
NRInput.IsTouching();
// Get a Vector2 touch value. Vector2.zero if not touched
NRInput.GetTouch();
// Get the controller's gyroscope data
NRInput.GetGyro();
// Get the controller's accelerometer data
NRInput.GetAccel();
// Get the controller's magnetometer data
NRInput.GetMag();
// Get the controller's remaining battery
NRInput.GetControllerBattery();
// Get the dominant hand
ControllerHandEnum domainHand = NRInput.DomainHand;
// You can pass in a dominant hand for most of the functions above
NRInput.GetRotation(NRInput.DomainHand);
// Whether a feature is supported by the controller
NRInput.GetControllerAvailableFeature(ControllerAvailableFeature.CONTROLLER_AVAILABLE_FEATURE_POSITION);
NRInput.GetControllerAvailableFeature(ControllerAvailableFeature.CONTROLLER_AVAILABLE_FEATURE_GYRO);
```
Get Frequently Used Anchors
NRInput provides an easy way to quickly get the frequently used root nodes for gaze and laser.
Sample Usage:
```csharp
// Obtain the root of the gaze pose tracker
var gazeAnchor = NRInput.AnchorsHelper.GetAnchor(ControllerAnchorEnum.GazePoseTrackerAnchor);
// Obtain the root of the right laser
var rightLaserAnchor = NRInput.AnchorsHelper.GetAnchor(ControllerAnchorEnum.RightLaserAnchor);
// Obtain the root of the right controller model
var rightModelAnchor = NRInput.AnchorsHelper.GetAnchor(ControllerAnchorEnum.RightModelAnchor);
```
Interact with GameObject
Please inspect CubeInteractiveTest.cs, which handles Unity events when interacting with the gameObject the script is attached to. Be aware that the gameObject must have a Collider component in order to receive the events. For a full list of supported Unity events, please refer to: https://docs.unity3d.com/Packages/com.unity.ugui@1.0/manual/SupportedEvents.html
```csharp
public class CubeInteractiveTest : MonoBehaviour, IPointerClickHandler, IPointerEnterHandler, IPointerExitHandler, IDragHandler
{
    private MeshRenderer m_MeshRender;

    void Awake()
    {
        m_MeshRender = transform.GetComponent<MeshRenderer>();
    }

    // When the pointer clicks, set the cube to a random color
    public void OnPointerClick(PointerEventData eventData)
    {
        m_MeshRender.material.color = new Color(Random.Range(0f, 1f), Random.Range(0f, 1f), Random.Range(0f, 1f));
    }

    // When the pointer hovers, set the cube color to green
    public void OnPointerEnter(PointerEventData eventData)
    {
        m_MeshRender.material.color = Color.green;
    }

    // When the pointer exits hover, set the cube color to white
    public void OnPointerExit(PointerEventData eventData)
    {
        m_MeshRender.material.color = Color.white;
    }

    // Required by IDragHandler; no drag behavior is needed for this sample
    public void OnDrag(PointerEventData eventData) { }
}
```
Interact with Unity UI
Integration with Unity's EventSystem supports user interaction with the UI system. Be aware that in order for Unity UI to respond to raycasts and receive Unity events, you must remove the default Graphic Raycaster component from the Canvas and attach a Canvas Raycast Target component instead.
This way, you can add event callbacks to Unity UI elements such as Button, Image, Toggle, and Slider. For example, On Click() on a Button:
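Besides wiring the callback in the Inspector's On Click() list, the same callback can be registered from code using Unity's standard UnityEngine.UI API. A minimal sketch, where the `m_Button` field is a hypothetical reference you would assign in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ButtonClickExample : MonoBehaviour
{
    // Hypothetical Button reference, assigned in the Inspector
    [SerializeField] private Button m_Button;

    void Start()
    {
        // onClick is Unity's standard Button event; it fires when the
        // NRInput raycaster clicks the button, just like a mouse click
        m_Button.onClick.AddListener(() => Debug.Log("Button clicked"));
    }
}
```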