Unity VR: Multiple Interactors & One Controller
Intro
We have run into a problem: different Interactors are being triggered when they shouldn’t be.
This issue can come up with any Interactor and Interactable, but here we will focus on Teleportation. Now, both of the Teleportation Interactables (the Teleportation Area and the Teleportation Anchor) have a property called Teleport Trigger. This property controls when the teleport actually happens, based on the state of either the Select or the Activate action from the XRI Default Input Actions.
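The same setting can also be flipped from a script. The snippet below is only a minimal sketch assuming XRI 2.x’s `teleportTrigger` property and its `TeleportTrigger` enum; in practice you would normally just pick the value in the Inspector.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch (assumes XRI 2.x API names): make a Teleportation Area fire
// its teleport when the Select input is released rather than when it is pressed.
[RequireComponent(typeof(TeleportationArea))]
public class TeleportOnSelectRelease : MonoBehaviour
{
    void Awake()
    {
        var area = GetComponent<TeleportationArea>();
        area.teleportTrigger = BaseTeleportationInteractable.TeleportTrigger.OnSelectExited;
    }
}
```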
Looking at the Action Map, we see that the Select Action is assigned to the GripButton, which is typically used to, well, grip things or pick them up. That can lead to some confusion: teleporting when you want to grab, or grabbing when you want to teleport.
This leads us to the Action Based Controller Manager and/or the XR Interaction Group to sort out which Interactor should be enabled and/or which should take priority over the others.
Action Based Controller Manager
Our Controller object has several different Interactors as child objects. Each one handles a different interaction, but having all of these Interactors active at the same time can be problematic.
The Action Based Controller Manager is a component that automatically enables and disables different Interactors (like Ray Interactors, Direct Interactors, Teleport Interactors, etc.) based on what the user is trying to do, like teleporting or grabbing.
The Action Based Controller Manager needs references to the different child Interactors and to the controller’s Input Actions (a simplified sketch of the idea follows the list below):
- References to your Interactors → so it can turn them on/off depending on what the player is doing.
- References to your Input Actions → so it knows when to switch interaction modes.
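To make that concrete, here is a stripped-down sketch of the same idea, not the actual sample script: one input action (for example, pushing the thumbstick forward) toggles between the Ray Interactor and the Teleport Interactor, so only one of them is ever active. All of the field names here are illustrative.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit;

// Simplified sketch of the Action Based Controller Manager's core idea
// (illustrative only, not the actual sample script): hold the teleport-mode
// action to swap the Ray Interactor out for the Teleport Interactor.
public class SimpleInteractorSwitcher : MonoBehaviour
{
    [SerializeField] XRRayInteractor rayInteractor;              // laser pointer / far grab
    [SerializeField] XRRayInteractor teleportInteractor;         // teleport ray
    [SerializeField] InputActionReference teleportModeActivate;  // e.g. thumbstick forward

    void OnEnable()
    {
        teleportModeActivate.action.Enable(); // harmless if it is already enabled elsewhere
        teleportModeActivate.action.performed += OnTeleportStarted;
        teleportModeActivate.action.canceled  += OnTeleportEnded;
        SetTeleportMode(false);
    }

    void OnDisable()
    {
        teleportModeActivate.action.performed -= OnTeleportStarted;
        teleportModeActivate.action.canceled  -= OnTeleportEnded;
    }

    void OnTeleportStarted(InputAction.CallbackContext _) => SetTeleportMode(true);
    void OnTeleportEnded(InputAction.CallbackContext _)   => SetTeleportMode(false);

    void SetTeleportMode(bool teleporting)
    {
        // Only one ray-style Interactor is active at a time, so the grip button
        // can never teleport and the thumbstick can never grab.
        rayInteractor.gameObject.SetActive(!teleporting);
        teleportInteractor.gameObject.SetActive(teleporting);
    }
}
```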
XR Interaction Group
The XR Interaction Group is another component that lets you combine multiple Interactors (like a Direct Interactor and a Ray Interactor) on the same controller. It manages which one is active based on priority or conditions, among whichever Interactors have not been disabled by the Action Based Controller Manager.
For example:
- If your hand is near a grabbable object → Use Direct Interactor (hand grabbing).
- If you’re not near anything → Use Ray Interactor (laser pointer for UI, teleporting, far grabbing).
- Both the Direct Interactor and the Ray Interactor remain active on the same controller; the group decides which one handles the interaction.
You will want to assign the XR Interaction Manager in the scene, then assign each of the Interactors you want in the Interaction Group. After assigning them, you will see the Override configuration, which controls which Interactor “wins” if more than one is active and being triggered. I have it set so the Teleporter overrides both the Ray and the Direct, the Ray overrides the Direct, and the Direct cannot override anything.
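If you want to double-check which member of the group is winning at runtime, a small debug component like this can help. It assumes the `activeInteractor` property that `XRInteractionGroup` exposes in XRI 2.3+; the members and overrides themselves stay configured in the Inspector as described above.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Debug sketch (assumes XRI 2.3+'s XRInteractionGroup.activeInteractor):
// logs whenever a different group member "wins" the interaction.
[RequireComponent(typeof(XRInteractionGroup))]
public class ActiveInteractorLogger : MonoBehaviour
{
    XRInteractionGroup group;
    IXRInteractor lastActive;

    void Awake() => group = GetComponent<XRInteractionGroup>();

    void Update()
    {
        var active = group.activeInteractor; // null while no member is interacting
        if (active == lastActive)
            return;

        lastActive = active;
        Debug.Log($"Active interactor: {(active != null ? active.transform.name : "none")}");
    }
}
```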
Action Based Controller Manager vs. XR Interaction Group
Action Based Controller Manager
- Controls when Interactors are active
- Handles mode switching (grab vs ray vs teleport)
- Usually used for input-based switching
XR Interaction Group
- Controls how active Interactors behave together
- Manages interaction priority within a hand/controller
- Usually used to stack Interactors on a controller cleanly
Why Both?
You don’t have to use both of these components. The XR Interaction Group on its own works fine if you don’t mind having multiple Interactors always active/available on the same controller and letting the priority rules decide which one “wins” (e.g., the Direct Interactor takes priority over the Ray Interactor when close enough to something grabbable).
The Action Based Controller Manager gives you more explicit control over the player’s interaction mode because it enables one of the Interactors while disabling the rest. Using both together is the recommended approach for polished VR experiences where you want clear control over player interaction modes and want to prevent any accidental interactions.
The Interactors
Teleport Interactor
The Teleport Interactor is just a Ray Interactor with some modifications, so create an XR > Ray Interactor and rename it to “Teleport Interactor”.
The modification I want to point out here is the Input Configuration for the Select Input. We want to assign this to the Locomotion Teleport Mode action, so when this Teleport Interactor is enabled (which is handled by the Action Based Controller Manager), its Select input is the up direction of the joystick. This is different from the Select action in the Interaction map, which is the grip button. This is nice because we won’t accidentally trigger the wrong Select input even if we were on the incorrect Interactor (i.e., the Ray Interactor), since the two use different inputs.
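As a rough script-level view of those modifications (assuming XRI 2.x property names), the teleport ray is typically switched to a projectile curve and limited to the Teleport interaction layer, while the Select input binding to the Teleport Mode action is assigned in the Inspector as described above.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch (assumes XRI 2.x property names): the typical teleport-ray tweaks.
// The Select input binding to the Teleport Mode action is set in the Inspector.
[RequireComponent(typeof(XRRayInteractor))]
public class ConfigureTeleportRay : MonoBehaviour
{
    void Awake()
    {
        var ray = GetComponent<XRRayInteractor>();
        ray.lineType = XRRayInteractor.LineType.ProjectileCurve;          // arc instead of a straight line
        ray.interactionLayers = InteractionLayerMask.GetMask("Teleport"); // see Interaction Layer Masks below
    }
}
```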
The Ray Interactor
Here is a look at the Ray Interactor and the Input Action assigned to its Select Input.
The Direct Interactor
This Interactor isn’t really set up to do anything at the moment; it is here just to show how the Action Based Controller Manager and the XR Interaction Group can be used.
Interaction Layer Masks
Interaction Layer Masks are a filter system. They control what an Interactor (like a Direct, Ray, or Teleport Interactor) is allowed to interact with. Without Interaction Layer Masks, every Interactor could interact with everything, which leads to unwanted behavior. With Interaction Layer Masks, you can control exactly which types of Interactables respond to which Interactors.
For the Ray Interactor, we don’t want it to detect Teleport surfaces, so we can have it detect only the Default layer; the Teleport Interactor should only interact with the Teleport layer. You can create your own layers to detect by selecting ‘Add layer…’.
Lastly, you want to go to each of your Interactables and set its Interaction Layer Mask to match the Interactor you want it to be detected by. For example, I set the Teleport Area and Teleport Anchor to the Teleport layer, so only the Teleport Interactor will interact with them.
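For completeness, here is what that same setup looks like from code. This assumes XRI 2.x’s `interactionLayers` property and `InteractionLayerMask.GetMask`, plus a custom interaction layer named “Teleport” added via ‘Add layer…’; normally you would just set these in the Inspector.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch (assumes XRI 2.x): the same Interaction Layer Mask assignments made
// in the Inspector, written out in code. Requires a custom "Teleport" layer.
public class InteractionLayerSetup : MonoBehaviour
{
    [SerializeField] XRRayInteractor rayInteractor;       // should ignore teleport surfaces
    [SerializeField] XRRayInteractor teleportInteractor;  // should only see teleport surfaces
    [SerializeField] TeleportationArea teleportArea;      // should only answer to the Teleport Interactor

    void Start()
    {
        rayInteractor.interactionLayers      = InteractionLayerMask.GetMask("Default");
        teleportInteractor.interactionLayers = InteractionLayerMask.GetMask("Teleport");
        teleportArea.interactionLayers       = InteractionLayerMask.GetMask("Teleport");
    }
}
```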
Conclusion
Managing multiple Interactors on a VR controller requires careful consideration to avoid unwanted interactions, especially when dealing with teleportation and grabbing actions. The Action Based Controller Manager provides clear control over which Interactors are active based on player input, while the XR Interaction Group manages how multiple active Interactors behave together using priority rules. While either component can be used independently, combining both offers the most robust solution for polished VR experiences. This approach ensures precise control over interaction modes and prevents accidental interactions, creating a smoother and more intuitive experience for players.