Studies have shown that much of what we communicate to others is non-verbal [1, 2], and a significant part of that non-verbal communication comes from hand gestures. While VR HMDs can track head movement, they’re unable to capture these all-important gestures. In order to capture these gestures in AltspaceVR, we decided to add support for the Intel® RealSense™ camera, which provides hand tracking via the Intel® RealSense™ SDK. This led to our virtual beach volleyball demo at the 2015 Consumer Electronics Show in Las Vegas. The hand integration was a big hit at CES; even more than using their hands to control a volleyball, attendees were wowed by the “reality” of having their own hands and gestures appear with them in virtual reality — waving hi, giving the peace sign, or even just opening and closing their hands.

In this post, we’ll be sharing what we learned after adding hand tracking to AltspaceVR. Note that the provided C# code is intended for Unity, and that we use skeletal (i.e. 3D joint-based) tracking. Also, we’ve left out error checking in most places for brevity. The complete RealSense™ SDK documentation can be found here.

Initialization

To start off, we need to create an instance of PXCMSenseManager, which is the main interface for interacting with the RealSense™ camera. Once we have one, we call EnableHand() to activate the hand tracking module, and call Init() to initialize the pipeline.

senseManager = PXCMSenseManager.CreateInstance();
if (senseManager != null)
{
    senseManager.EnableHand();
    senseManager.Init();
    ...



Next, we obtain a reference to the PXCMHandModule for hand tracking and gesture detection, and a reference to PXCMHandData for managing the captured hand data.

handModule = senseManager.QueryHand();
handData = handModule.CreateOutput();



Finally, we need to configure the hand tracker for our needs. For AltspaceVR, we enable gesture detection and alerts related to the hands going out-of-bounds. We also enable joint normalization to keep the dimensions of the tracked hands constant regardless of the user’s hand size, since we do not want an avatar’s hands to reflect the physical size of a user’s hands.

PXCMHandConfiguration handConfig = handModule.CreateActiveConfiguration();
handConfig.EnableAllGestures();
handConfig.EnableAllAlerts();
handConfig.EnableNormalizedJoints(true);
handConfig.ApplyChanges();
handConfig.Dispose();

Basic Joint Detection

We can extract joint information for rendering once the hands are being tracked. Typically, sensor libraries update sensor information in a thread separate from the main rendering thread. We’ve found the best technique for using sensor data is to briefly lock the sensor update thread during each frame in the render thread, quickly copy the sensor-dependent data into a custom data structure suitable for rendering, and then release the lock. The lock should be held only briefly, since the sensor is prevented from processing new data for as long as it is held.

For our RealSense™ implementation we do this by storing the hand tracker joint data in custom data structures during each frame, and use that data to render the avatar hands. A nice benefit of this approach is that other hand trackers can be added by simply writing additional adapters which copy their sensor data into the same data structures.
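A minimal sketch of such a tracker-agnostic structure might look like the following. The names here are illustrative, not from the AltspaceVR codebase; only the joint count constant comes from the RealSense™ SDK.

```csharp
using UnityEngine;

// Sketch of a tracker-agnostic hand data structure. Each sensor adapter
// copies its joint estimates into an instance of this under a lock; the
// render thread then reads from it to pose the avatar's hands.
public class TrackedHand
{
    // Matches PXCMHandData.NUMBER_OF_JOINTS for the RealSense adapter;
    // other trackers would map their own joints into the same slots.
    public const int NumJoints = 22;

    public bool IsRight;
    public bool IsTracked;
    public readonly Vector3[] JointPositions = new Vector3[NumJoints];
    public readonly Quaternion[] JointRotations = new Quaternion[NumJoints];
}
```

Because the render code only ever sees this structure, swapping in a new tracker is a matter of writing one new adapter.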

We perform this copying step in FixedUpdate(), which usually executes less frequently than Update() and thus requires fewer cycles overall. First, we acquire a frame of sensor data from the sense manager (which locks the sensor thread from processing further frames). Then, we copy the joints from the hand tracker into the hand data structures we use for rendering. Finally, we release the frame and unlock the sensor thread; the finally block ensures this happens even if an error occurs during copying.

if (senseManager.AcquireFrame(false, 0) >= pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    try
    {
        copyJointsFromHandTracker();
    }
    finally
    {
        senseManager.ReleaseFrame();
    }
}



In copyJointsFromHandTracker, we first call handData.Update() to refresh the latest available estimated hand data from the sensor. We then iterate over each observed hand and store its data in the fetchedHandData variable.

handData.Update();
int numHands = handData.QueryNumberOfHands();
for (int hand = 0; hand < numHands; hand++)
{
    // Fetch into fetchedHandData.
    pxcmStatus status = handData.QueryHandData(
        PXCMHandData.AccessOrderType.ACCESS_ORDER_BY_TIME, hand, out fetchedHandData);

    // Verify no error occurred, and that body side was determined.
    if (status == pxcmStatus.PXCM_STATUS_NO_ERROR &&
        fetchedHandData.QueryBodySide() != PXCMHandData.BodySideType.BODY_SIDE_UNKNOWN)
    {
        copyJointsForHand(fetchedHandData);
    }
}



Once we have the data for a specific hand, we identify it as the left or right hand and copy it into our data structures in copyJointsForHand(...):

bool isRightHand = fetchedHandData.QueryBodySide() == PXCMHandData.BodySideType.BODY_SIDE_RIGHT;
for (int joint = 0; joint < PXCMHandData.NUMBER_OF_JOINTS; joint++)
{
    PXCMHandData.JointType currJoint = (PXCMHandData.JointType)joint;
    pxcmStatus jointStatus = fetchedHandData.QueryTrackedJoint(currJoint, out jointData);

    // Only add the joint if no error occurred and the joint was tracked
    // with high confidence, to reduce error.
    if (jointStatus == pxcmStatus.PXCM_STATUS_NO_ERROR && jointData.confidence == 100)
    {
        storeJoint(isRightHand, joint, jointData);
    }
}



Accessing the joint data itself is fairly easy, since the position (in world coordinates) and both local and global rotation quaternions are readily available in the jointData object. The joint positions are given from the camera’s perspective, so the x and z positions may need to be negated (for mirroring).
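A conversion along these lines might look like the sketch below. The helper name is ours, and which axes actually need negating depends on your camera orientation and scene setup, so treat the signs as a starting point rather than a rule.

```csharp
using UnityEngine;

// Illustrative conversion from an SDK joint position (PXCMPoint3DF32, with
// float x/y/z fields) to a Unity Vector3. Here x is negated so the avatar's
// hands mirror the user's, and z is negated because the camera faces the
// user while Unity's forward axis points away from it.
static Vector3 ToUnityPosition(PXCMPoint3DF32 p)
{
    return new Vector3(-p.x, p.y, -p.z);
}
```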

Smoothing

A common problem with raw sensor data is the presence of noise. Hand tracking data inferred from a camera is no different, so we must smooth the data in order to prevent an avatar’s hands from appearing jumpy. The RealSense™ SDK provides a smoothing API which relieves us from having to implement it ourselves, but it also means we may need to implement smoothing of our own to support additional trackers. Intel provides notes for the different smoothing algorithms available in the “Smoothing in Unity” section found here. For AltspaceVR, we use a weighted smoothing algorithm, which smooths over several frames by computing a weighted average of previous joint positions.
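For trackers without a built-in smoothing API, the weighted-average idea can be sketched roughly as follows. This is our own illustrative version, not the SDK’s implementation; the class name, window size, and linear weighting scheme are all assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of weighted-average smoothing over the last N samples, where more
// recent samples receive higher weights.
public class WeightedSmoother
{
    private readonly Queue<Vector3> samples = new Queue<Vector3>();
    private readonly int windowSize;

    public WeightedSmoother(int windowSize)
    {
        this.windowSize = windowSize;
    }

    // Adds a raw sample and returns the current smoothed position.
    public Vector3 AddSample(Vector3 sample)
    {
        samples.Enqueue(sample);
        if (samples.Count > windowSize)
            samples.Dequeue();

        // Weight the i-th oldest sample by (i + 1), so newer samples dominate.
        Vector3 weightedSum = Vector3.zero;
        float totalWeight = 0f;
        int i = 0;
        foreach (Vector3 s in samples)
        {
            float w = ++i;
            weightedSum += s * w;
            totalWeight += w;
        }
        return weightedSum / totalWeight;
    }
}
```

A per-joint instance of something like this would slot into the same place the SDK smoothers occupy below.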

Implementation-wise, we create a smoother for each joint on each hand:

var smoother3D = new PXCMDataSmoothing.Smoother3D[2][];
for (int hand = 0; hand < 2; hand++)
{
    smoother3D[hand] = new PXCMDataSmoothing.Smoother3D[PXCMHandData.NUMBER_OF_JOINTS];
}



The PXCMSession instance is used to create a PXCMDataSmoothing instance which is then used to create the weighted average smoothers, one for each joint across both hands:

session.CreateImpl(out dataSmoothing);
for (int joint = 0; joint < PXCMHandData.NUMBER_OF_JOINTS; joint++)
{
    smoother3D[0][joint] = dataSmoothing.Create3DWeighted(4);
    smoother3D[1][joint] = dataSmoothing.Create3DWeighted(4);
}



Adding a joint sample to the smoother is quite easy:

long timeStamp = fetchedHandData.QueryTimeStamp();
smoother3D[hand][joint].AddSample(jointData.positionWorld, timeStamp);



Once sensor samples are added, the smoother can be used to compute smoothed samples at a given timestamp:

PXCMPoint3DF32 positionWorld = smoother3D[hand][joint].GetSample(timeStamp);





One of our state-of-the-art hand tracking test rigs

Cleaning Up

It’s always a good idea to release any instances that we have created in OnDestroy() or OnDisable(). To ensure we do not leak memory, we must call Dispose() (and Close(), when applicable) on the various instances that we created during initialization.
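A typical teardown might look like the sketch below, assuming the fields created during initialization above; the exact set of objects to release depends on what your integration actually created.

```csharp
// Sketch of teardown for the instances created during initialization.
// Release in reverse order of creation, and Close() the sense manager
// to stop the pipeline before disposing it.
void OnDisable()
{
    if (handData != null)
    {
        handData.Dispose();
        handData = null;
    }
    if (senseManager != null)
    {
        senseManager.Close();
        senseManager.Dispose();
        senseManager = null;
    }
}
```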

Intel RealSense™ is the first tracking input we integrated with AltspaceVR, and we found that its API is straightforward to work with. When integrating with sensors, it’s important to pull sensor data quickly to avoid locking the sensor I/O thread, and to then smooth that data to avoid jitter. And as always, thinking about architecture design upfront can be useful. The decision to create a common data structure that can be leveraged with other inputs has already proven useful in subsequent input integrations.

If getting a chance to work with the latest input controllers to bring in real body motion to virtual reality sounds fun to you, we’re hiring!

References