
Unity VR: Interaction Events

May 1, 2025

Intro

Interaction Events are Unity Events on XR Interactables that fire when specific interaction-related actions occur between an Interactor (such as a controller or hand) and an Interactable (such as a grabbable object, button, or lever). These events let you hook into the interaction system and run custom logic at different stages of the interaction lifecycle, such as when an object is hovered over, selected (grabbed), or released.

Example Use Cases:

  • Play a sound when an object is picked up (OnSelectEntered)
  • Trigger an animation when a lever is pulled (OnActivate)
  • Enable a visual highlight when an object is hovered over (OnHoverEntered)
  • Reset a puzzle piece when it’s dropped (OnSelectExited)
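
Besides wiring these up in the Inspector, the same events can be subscribed to from code. Below is a minimal sketch, assuming the XR Interaction Toolkit 2.x API (hoverEntered, selectEntered, and friends); the highlight material and renderer fields are hypothetical placeholders:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative example: swaps in a highlight material on hover and logs grabs.
[RequireComponent(typeof(XRGrabInteractable))]
public class InteractionEventExample : MonoBehaviour
{
    [SerializeField] Renderer _renderer;          // renderer to highlight (assumed)
    [SerializeField] Material _highlightMaterial; // shown while hovered (assumed)
    Material _defaultMaterial;

    void Awake()
    {
        _defaultMaterial = _renderer.material;
        var interactable = GetComponent<XRGrabInteractable>();

        // The same events shown in the Inspector, subscribed from code instead
        interactable.hoverEntered.AddListener(_ => _renderer.material = _highlightMaterial);
        interactable.hoverExited.AddListener(_ => _renderer.material = _defaultMaterial);
        interactable.selectEntered.AddListener(_ => Debug.Log("Picked up"));
        interactable.selectExited.AddListener(_ => Debug.Log("Dropped"));
    }
}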

Interaction Events

These events are available in the Inspector on the Interactable component and can be wired up to call public methods on other components, all without leaving the Unity Inspector.

These events are triggered according to the Input Action Map for the different Interactors. For example, the default Activate input on the Quest controllers is the Trigger button, so when the Trigger is pressed the Activate event fires and calls any functions assigned to it.
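
As a quick illustration, the Activate event can also be subscribed to from code rather than the Inspector. A minimal sketch, again assuming the XRI 2.x activated event:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: reacts to the Activate input (Trigger on Quest controllers)
// while the object is held.
public class ActivateExample : MonoBehaviour
{
    void Awake()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        // activated fires when the Activate input action (e.g. Trigger) is pressed
        interactable.activated.AddListener(_ => Debug.Log("Trigger pulled"));
    }
}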

Connect to Script

The Pistol game object has a Pistol script attached.

This script contains a public method, which is required for it to be visible to the Interactable’s Unity Events in the Inspector. The method does two things:

  1. Instantiates a muzzle flash at a point in front of the barrel.
  2. Casts a ray forward from that point and checks for a collision with an object carrying a specific tag; if one is detected, it instantiates hit particles at the point of collision.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PistolController : MonoBehaviour
{
    [SerializeField]
    GameObject _muzzleFlashPrefab, _hitPrefab;
    [SerializeField]
    Transform _rayOrigin;
    [SerializeField]
    Vector3 _muzzleFlashOffset;

    // Called by the Activate Interaction Event assigned in the Inspector
    public void TriggerPull()
    {
        // Spawn the muzzle flash just in front of the barrel
        Instantiate(_muzzleFlashPrefab, _rayOrigin.position + _muzzleFlashOffset, Quaternion.identity);

        // Cast a ray forward from the barrel; spawn hit particles if it hits a tagged target
        if (Physics.Raycast(_rayOrigin.position, _rayOrigin.forward, out RaycastHit hit, Mathf.Infinity) && hit.transform.CompareTag("Target"))
        {
            Instantiate(_hitPrefab, hit.point, Quaternion.identity);
        }
    }
}

We want to call this method from the Pistol script when the Activate Interaction Event is triggered, so we assign the Pistol game object to the event in the Inspector and select the method from the dropdown.

We can call multiple methods from the same event, such as playing a sound effect. If you want sound effects to be positioned in world space, set the AudioSource’s Spatial Blend toward 1 for 3D sound, instead of 0, which is non-positional 2D sound.
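
For example, a hypothetical companion script like the one below could be wired to the same Activate event to play a positional gunshot. The PistolAudio name, fields, and PlayGunshot method are illustrative, while AudioSource.spatialBlend and PlayOneShot are standard Unity APIs:

using UnityEngine;

// Illustrative companion script: a second method wired to the same
// Activate event to play a positional gunshot sound.
public class PistolAudio : MonoBehaviour
{
    [SerializeField] AudioSource _audioSource; // assumed to exist on the pistol
    [SerializeField] AudioClip _gunshotClip;   // assumed gunshot clip

    void Awake()
    {
        // 0 = 2D (non-positional), 1 = fully 3D (positioned in world space)
        _audioSource.spatialBlend = 1f;
    }

    public void PlayGunshot()
    {
        _audioSource.PlayOneShot(_gunshotClip);
    }
}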

Conclusion

Interaction Events provide a powerful and flexible way to link gameplay logic to physical interactions in XR without writing extra code. By using Unity Events exposed in the Inspector, developers can easily connect actions like grabbing, hovering, or activating objects to custom behavior — such as animations, sounds, or particle effects.
