Unity VR: Hands for the Player
Intro
Adding hand models to your VR controllers on the XR Origin rig, along with some basic animations that play when certain input actions occur, can significantly boost immersion and add polish to your VR build. In this example, we are going to have an open and a closed hand, and we will transition between them when the Select input action is triggered.
Adding Hand Models
If you are using the XR Interaction Toolkit 2.x, you can easily add hand models using the XR Controller component. You can just assign a hand prefab from the Project folder to the Model Prefab slot, and the component will instantiate it automatically at runtime. But if you are using the XR Interaction Toolkit 3.x, there is no XR Controller component anymore; its input handling moved directly onto the interactors. So…what do you do?
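Before moving on: for 2.x projects, that Model Prefab assignment can also be done from code instead of the Inspector. Here is a minimal sketch; the script and field names are placeholders of my own, and exactly when the controller instantiates the model can vary between XRI versions, so treat it as illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// XRI 2.x only: the XR Controller component does not exist in 3.x.
// Hypothetical sketch: assigns the hand prefab in code rather than via
// the Model Prefab slot in the Inspector.
public class AssignHandModel : MonoBehaviour
{
    [SerializeField] Transform handPrefab; // your hand model prefab

    void Awake()
    {
        // The controller instantiates modelPrefab under its Model Parent
        // when it initializes, so this must run before that happens
        // (e.g. via Script Execution Order).
        GetComponent<XRBaseController>().modelPrefab = handPrefab;
    }
}
```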
Controller Models with Action-based Setup
In 3.x, visual models are no longer driven by an XR Controller component; you place them in the hierarchy manually instead.
So we create an empty child object under each of the two controllers in the XR Origin. These children are where the hand models will live.
Here is the pain point: the LeftHand Controller and RightHand Controller objects typically don’t have meaningful positions or rotations in the editor, because they only receive tracking data at runtime. So you will have to position and rotate the hand models in Play mode and copy the resulting transform values back afterwards.
You can use the Z-forward axis of the XR Origin as a reference, positioning and rotating the hands until they align with that axis. Below are the values I used.
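Transform edits made in Play mode are lost when you exit, so either copy the Transform values in Play mode (right-click the Transform component, Copy Component, then Paste Component Values after exiting), or use a small helper like this hypothetical one to print the pose so you can type it back in:

```csharp
using UnityEngine;

// Hypothetical helper: attach to the hand model, enter Play mode, pose the
// hand, then use the component's context menu to print its local pose.
public class LogLocalPose : MonoBehaviour
{
    [ContextMenu("Log Local Pose")]
    void LogPose()
    {
        Debug.Log($"{name} localPosition={transform.localPosition:F4} " +
                  $"localEulerAngles={transform.localEulerAngles:F2}");
    }
}
```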
Adding Hand Animations
To add hand animations, we will need to set up an Animator and an Animator Controller for each hand. The Animator component and the Animator Controller work together to bring objects to life through animation in Unity. The Animator component is attached to a GameObject (like a character, hand model, or prop) and plays and blends animations on that GameObject. Think of it as the “animation engine” on your object.
The Animator Controller is an asset you create in your Project folder. It’s a state machine for your animations. It controls how animations play, transition, and blend.
Inside an Animator Controller:
- States: Represent different animations (like idle, grab, point, wave)
- Transitions: Connect animations and define when/how they change
- Parameters: Booleans, floats, triggers, etc., that control transitions
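These parameters are what code (or, later in this article, the XR interaction events) set at runtime. A quick illustrative sketch, with placeholder parameter names of my own:

```csharp
using UnityEngine;

// Illustration only: the common Animator parameter types driven from code.
// "IsOpen", "Grip" and "Select" are placeholder parameter names.
public class AnimatorParameterDemo : MonoBehaviour
{
    [SerializeField] Animator animator;

    public void Demo()
    {
        animator.SetBool("IsOpen", true); // Boolean: stays set until changed
        animator.SetFloat("Grip", 0.5f);  // Float: useful for blend trees
        animator.SetTrigger("Select");    // Trigger: fires once, then auto-resets
    }
}
```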
Attach the Animator component to the hand model, then create the Animator Controller in the Project folder and assign it to the Animator’s Controller field.
This hand model also comes with an Avatar, an asset that maps your 3D model’s skeleton (its bones); it goes in the Animator’s Avatar field.
Animator
In the Animator window, create two states: one for the open hand and the other for the closed hand. In these states, we assign the open and closed animation clips.
In order to move between the Open and Closed animations, we need two transitions: one going from Open to Closed and one going back.
For the Animator Controller to know when to make those transitions, it needs parameters; in this case, two Trigger parameters, since we want one for when the Select input action is triggered and one for when it stops or exits. These triggers will be called “Select” and “Deselect”.
Select the transition from Open to Closed. Uncheck ‘Has Exit Time’ so the transition fires as soon as its condition is met rather than waiting for the clip to finish, and set the Condition to the trigger parameter we created: “Select”.
For the transition from Closed back to Open, set the Condition to the “Deselect” trigger.
Play Animations
If you are using XR Interaction Toolkit 2.x with the XR Controller component, you can enable Animate Model on it and enter the names of the trigger parameters (Select, Deselect) in the Model Select Transition and Model Deselect Transition fields, which are tied to those input actions.
If you are using the XR Interaction Toolkit 3.x, which doesn’t have the XR Controller component, you can instead use the Select Entered and Select Exited events on the interactors you want the animation to play on. Make sure that interactor has a Select input action assigned.
Then, on Select Entered, assign the hand model that has the Animator component attached, choose Animator.SetTrigger from the function dropdown, and pass in the name of the trigger that goes from Open to Closed (“Select”). Do the same for Select Exited, but pass in the trigger that goes from Closed to Open (“Deselect”).
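If you prefer wiring this up in code rather than in the Inspector, a minimal sketch could look like the following. It assumes XRI 3.x namespaces (in 2.x, XRBaseInteractor lives directly in UnityEngine.XR.Interaction.Toolkit), and the class and field names are my own:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;             // event args types
using UnityEngine.XR.Interaction.Toolkit.Interactors; // XRBaseInteractor (XRI 3.x)

// Hypothetical script: fires the hand Animator's triggers whenever the
// given interactor selects or releases an interactable.
public class HandAnimationOnSelect : MonoBehaviour
{
    [SerializeField] XRBaseInteractor interactor; // e.g. the Direct Interactor
    [SerializeField] Animator handAnimator;       // Animator on the hand model

    void OnEnable()
    {
        interactor.selectEntered.AddListener(OnSelectEntered);
        interactor.selectExited.AddListener(OnSelectExited);
    }

    void OnDisable()
    {
        interactor.selectEntered.RemoveListener(OnSelectEntered);
        interactor.selectExited.RemoveListener(OnSelectExited);
    }

    void OnSelectEntered(SelectEnterEventArgs args) => handAnimator.SetTrigger("Select");
    void OnSelectExited(SelectExitEventArgs args) => handAnimator.SetTrigger("Deselect");
}
```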
Conclusion
Adding animated hand models to your XR rig enhances immersion by giving players visual feedback for their hand interactions. It comes down to placing hand models under the controller objects and using the Animator system for the open/closed hand animations. Whether you are on XR Interaction Toolkit 2.x or the newer 3.x, where input handling moved onto the interactors, these basic steps offer an effective foundation for visualizing hand interactions, even though more advanced techniques exist.