VR Interaction Design And Development
Making Oculus Quest hand tracking more natural and intuitive for users
Oculus' new hand tracking system has the alluring benefit of letting you put down the controllers and navigate VR experiences with your bare hands. But this very natural human-computer interface does come with some challenges, and this project is a first step toward developing a more comfortable and natural interaction system for Quest hand tracking.
Objectives
Grasp Interaction
Diegetic Controls And UIs
Extras
Takeaways

Objectives: Hand Grasping And Natural UIs

Hand Grasping
The trackable hand assets in the Unity SDK include an option for adding physics colliders, so you can have some fun nudging objects around a scene in VR. But those components alone can't give users the ability to pick up and hold items the way we do in the real world, and adding a grasp interaction would clearly expand the interactive possibilities of a VR experience.
Natural UI Interactions
Hand tracking has the benefit of not requiring controller hardware, but it also loses the hardware button inputs used to launch menus, pause games, control player movement, and so on. Gestures have commonly been the way to add those inputs back, but gestures in Quest hand tracking can be imprecise and difficult for users to remember. So a second objective is to quickly explore ways to restore those inputs without relying on gestures.

Adding A Grasp Interaction

The trackable hand assets in the Unity SDK have an option for adding physics colliders, which means you can have some fun nudging objects around a scene in VR. But after flexing my virtual fingers and knocking objects around, I had an urge to pick things up and toss them. So I decided to add grab/grasp functionality to the SDK's hands.

For this small test, I put together a simple system of fingertip and palm sensors that makes it possible to implement the same grab-interaction design patterns used with hardware controllers.
The out-of-the-box hands with physics but no grabbing (can't pick things up)
Hands with a very basic grab feature (now I can pick things up)
Hands equipped with sensors that detect fingertip positions and intersections
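To give a sense of how the sensor setup works, here's a rough sketch of the approach in Unity C#. The class names, the "Grabbable" tag, and the two-finger threshold are my own illustrative choices, and the real implementation has more going on, but the core idea is small trigger colliders on the fingertips and palm plus a per-hand check for when enough of them overlap the same rigidbody:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Attach to a small trigger collider placed on each fingertip bone (and one on the palm).
// Tracks which "grabbable" rigidbodies this sensor is currently touching.
public class HandSensor : MonoBehaviour
{
    public readonly HashSet<Rigidbody> Touching = new HashSet<Rigidbody>();

    void OnTriggerEnter(Collider other)
    {
        var body = other.attachedRigidbody;
        if (body != null && body.CompareTag("Grabbable"))  // "Grabbable" tag is an assumption
            Touching.Add(body);
    }

    void OnTriggerExit(Collider other)
    {
        var body = other.attachedRigidbody;
        if (body != null)
            Touching.Remove(body);
    }
}

// Attach to the hand root. When the palm sensor and at least `minFingers` fingertip
// sensors overlap the same rigidbody, treat it as grasped and parent it to the hand;
// release it (and restore physics) when that condition breaks.
public class GraspDetector : MonoBehaviour
{
    public HandSensor palmSensor;
    public HandSensor[] fingertipSensors;
    public int minFingers = 2;

    Rigidbody _held;

    void Update()
    {
        if (_held == null)
            TryGrab();
        else if (!IsStillGrasping(_held))
            Release();
    }

    void TryGrab()
    {
        foreach (var candidate in palmSensor.Touching)
        {
            if (CountFingersTouching(candidate) >= minFingers)
            {
                _held = candidate;
                _held.isKinematic = true;                    // the hand drives the object directly
                _held.transform.SetParent(transform, true);  // keep world pose while attaching
                return;
            }
        }
    }

    bool IsStillGrasping(Rigidbody body) =>
        palmSensor.Touching.Contains(body) && CountFingersTouching(body) >= minFingers;

    int CountFingersTouching(Rigidbody body)
    {
        int count = 0;
        foreach (var sensor in fingertipSensors)
            if (sensor.Touching.Contains(body)) count++;
        return count;
    }

    void Release()
    {
        _held.transform.SetParent(null, true);
        _held.isKinematic = false;  // hand it back to the physics engine so it can be tossed
        _held = null;
    }
}
```

Parenting the object as a kinematic rigidbody is the simplest way to hold it; a fixed joint or velocity tracking would preserve more of the physics while the object is held.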

Using Diegetic Elements To Enable Player Controls

For a VR app, controller hardware is usually how you move your virtual body around large scenes, with locomotion driven by joysticks, touchpads, or buttons. So I put together a "wristband" method that digitally recreates a hardware button for locomotion (which is via jetpack in this project).
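A minimal sketch of the wristband button, assuming a trigger collider parented to the wrist bone, fingertip sensor colliders tagged "FingertipSensor", and a rigidbody on the player rig (all of these names are illustrative):

```csharp
using UnityEngine;

// Attach to a small trigger collider parented to the wrist/forearm bone of one hand.
// While a fingertip from the other hand is inside the trigger, apply jetpack thrust
// to the player rig.
public class WristbandJetpackButton : MonoBehaviour
{
    public Rigidbody playerRig;          // rigidbody on the camera rig / player body
    public float thrust = 12f;           // upward acceleration while pressed
    int _pressCount;                     // how many fingertips are currently pressing

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("FingertipSensor")) _pressCount++;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("FingertipSensor")) _pressCount--;
    }

    void FixedUpdate()
    {
        if (_pressCount > 0)
            playerRig.AddForce(Vector3.up * thrust, ForceMode.Acceleration);
    }
}
```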
The UI experiment I created for this demo is a swipeable image carousel, inspired by the idea of bringing the swipe gesture that's so common on mobile devices to UI windows in VR.
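And a rough sketch of the swipe detection, assuming the carousel is a world-space canvas with a box trigger collider in front of it and the same fingertip sensor tag as above (again, the specifics here are illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Attach to a world-space UI panel with a box trigger collider in front of it.
// A fingertip entering the volume, sliding sideways, and leaving counts as a swipe
// and advances the carousel.
public class SwipeCarousel : MonoBehaviour
{
    public Image display;                 // world-space canvas Image showing the current slide
    public Sprite[] slides;
    public float swipeThreshold = 0.08f;  // metres of sideways travel needed to register a swipe

    int _index;
    bool _tracking;
    float _entryX;
    Transform _finger;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("FingertipSensor")) return;
        _finger = other.transform;
        _entryX = transform.InverseTransformPoint(_finger.position).x;
        _tracking = true;
    }

    void OnTriggerExit(Collider other)
    {
        if (!_tracking || other.transform != _finger) return;
        float delta = transform.InverseTransformPoint(_finger.position).x - _entryX;
        if (Mathf.Abs(delta) > swipeThreshold)
            Show(_index + (delta < 0 ? 1 : -1));   // swipe left -> next, swipe right -> previous
        _tracking = false;
    }

    void Show(int index)
    {
        _index = (index + slides.Length) % slides.Length;  // wrap around at both ends
        display.sprite = slides[_index];
    }
}
```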
"Wristband" Diegetic Controls (Jet Pack Locomotion In This Sample)
Notched, swipable hologram Slider
Pushing buttons
A pseudo biometric scanner to lift the Teleport Pod

Some Extras: VFX Graph For URP

I absolutely love particle systems, and I've been wanting to work with VFX Graph for over a year. So I used this project as an excuse to watch a bunch of tutorial videos, learn my way around the system, and implement a few VFX Graph assets.

As usual, I took the more difficult route: VFX Graph isn't fully functional for a Universal Render Pipeline project that doesn't use the Vulkan graphics API. So I spent a few hours wondering why my VFX assets weren't rendering correctly in the headset, but I eventually found a few fixes.
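The fixes will vary by project, but given the Vulkan requirement above, one thing worth confirming is that the Android (Quest) build target is actually set to Vulkan rather than the default OpenGLES3. Something like this editor-only helper (illustrative, not the only way to set it) makes the choice explicit:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine.Rendering;

// Editor-only helper: force the Android build target (Quest) to use the Vulkan
// graphics API instead of the default list, since VFX Graph on URP relies on Vulkan.
public static class QuestGraphicsSettings
{
    [MenuItem("Tools/Use Vulkan For Android")]
    public static void UseVulkan()
    {
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
            new[] { GraphicsDeviceType.Vulkan });
    }
}
#endif
```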
I didn't go too crazy with the VFX, since things need to stay reasonable for mobile VR, but I was happy with the results. And I do think VFX Graph is much more performant than the built-in particle system when you need hundreds of particles on mobile VR.
My favorite: the VFX sequence after pressing the teleport button
Light rays emitted through the hologram UI
A very simple propulsion VFX emitted from the wristband

The Takeaways

Hand-Object Grasping Has Potential
This was an easy win, and it seems possible to replicate all of the hand-object grasping features used with hardware controllers.

The "Wristband" Could Be A Good Replacement For Gestures
This method could be a good option for users: it's easy to access yet stays hidden away during gameplay, doesn't require learning and remembering gestures, and can display more information and options than gestures can.
Natural Interactions With UI Windows Could Be Difficult
I was able to get a swipeable UI working as a prototype, but it was very specific to displaying image content and not general enough to become a reusable utility component. I also attempted to put together a system for interacting with UIs at a distance (raycasting to detect UI objects), but Quest's hand tracking is shaky enough to make a pointer system frustrating to use.
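For context, a distance pointer in this style is roughly the sketch below, assuming the UI windows have colliders on the built-in UI layer and a pointer pose transform on the hand; the direction smoothing is an idea for damping the jitter rather than a cure for it:

```csharp
using UnityEngine;

// Minimal distance-pointer sketch: raycast from a pose on the tracked hand toward
// UI colliders, smoothing the ray direction to damp hand-tracking jitter.
public class HandUIPointer : MonoBehaviour
{
    public Transform pointerPose;                    // e.g. a transform aligned with the index finger
    public float maxDistance = 5f;
    [Range(0f, 1f)] public float smoothing = 0.2f;   // lower = smoother but laggier

    Vector3 _direction;

    void Update()
    {
        // Exponentially smooth the pointing direction frame to frame.
        _direction = Vector3.Slerp(
            _direction == Vector3.zero ? pointerPose.forward : _direction,
            pointerPose.forward,
            smoothing).normalized;

        int uiMask = LayerMask.GetMask("UI");
        if (Physics.Raycast(pointerPose.position, _direction, out RaycastHit hit, maxDistance, uiMask))
            Debug.Log($"Pointing at {hit.collider.name}");  // hover/select logic would go here
    }
}
```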