Our VR designer software uses SteamVR to connect to all kinds of VR headsets and controllers. SteamVR implements an open API called OpenVR, which makes it easy for hardware manufacturers to add support for their HMDs and controllers. So it's super easy to switch, for example, from the HTC Vive to the Oculus Rift, and our program will still work like a charm. You could even mix hardware from different vendors, which might become quite interesting in the near future.
In this article I’ll show you how to connect to the Oculus Rift and Oculus Touch using Unity 5.5 with the utilities provided by Oculus instead of using SteamVR. Please note that SteamVR currently supports Oculus hardware so well that we don’t recommend using the Oculus SDK: doing so will limit your software to Oculus hardware, and I personally think the SteamVR API is much better designed.
However, if you’re still interested in the Oculus way of programming your VR application, keep reading this short how-to.
Assuming you have your Oculus Rift installed and configured (see the setup page for details), you should first enable the Unknown Sources flag in the settings under the General tab:
This will prevent Oculus from complaining that your own app hasn’t been “reviewed by Oculus for security, comfort, content, or health and safety”. You can read more about that on howtogeek.com.
Creating the Unity project
Let’s create a basic Unity 3D project:
To enable VR, go to Edit -> Project Settings -> Player and check the Virtual Reality Supported box under Other Settings. Make sure that Oculus is selected below it:
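If you want to double-check at runtime that VR mode is actually active, a minimal sketch like the following can log the loaded device. This assumes Unity 5.5’s UnityEngine.VR namespace; the script name VRCheck is my own:

```csharp
using UnityEngine;
using UnityEngine.VR;

public class VRCheck : MonoBehaviour
{
    private void Start()
    {
        // Logs whether VR is enabled and which device SDK was loaded (e.g. "Oculus")
        Debug.Log("VR enabled: " + VRSettings.enabled +
                  ", device: " + VRSettings.loadedDeviceName);
    }
}
```

Drop it on any GameObject in the scene; it only prints once on startup.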
Install Oculus Utilities
Now you can download the Oculus Utilities for Unity 5 here: https://developer.oculus.com/downloads/package/oculus-utilities-for-unity-5/
Unfortunately there seems to be no asset in the store that does this for you. However, you only need to unzip the downloaded file and drag the extracted OculusUtilities.unitypackage file into the Assets window of Unity. If you import everything, you’ll get an OVR directory in your Assets directory.
Making the HMD run
In my case that wasn’t enough to see the scene through the HMD yet; it turned out that restarting Unity solved this. So you don’t have to drop any script into the scene and the HMD already runs like a charm. However, the default settings of the main camera in your scene are not quite optimal.
To change this, simply select the Main Camera in your Hierarchy and set the Position to (0, 0, 0) and the Near clipping plane to 0.1:
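If you prefer to apply these two camera settings from code instead of the Inspector, a small sketch could look like this (the script name CameraSetup is an assumption, and it uses the camera tagged MainCamera):

```csharp
using UnityEngine;

public class CameraSetup : MonoBehaviour
{
    private void Start()
    {
        // Reset the camera to the tracking origin and tighten the near clipping plane
        Camera cam = Camera.main;
        cam.transform.localPosition = Vector3.zero;
        cam.nearClipPlane = 0.1f;
    }
}
```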
Reading from the Touch controllers
Now you need to drop an object into your scene that runs the OVRManager.cs script. Without it you can already read the orientation of your controllers, but the positions won’t be updated:
Please rename that empty GameObject to OVRManager by pressing F2. Now you’re ready to create your own script that reads data from the controllers. Let’s create a cube and a sphere as dummy controller models:
You can create a new script by selecting e.g. the Cube and hitting Add Component in the Inspector window again. This time select New Script and call it Controller. You should now have a new Controller.cs file in your Assets directory. Open it with the editor of your choice and make it look like the following:
using UnityEngine;

public class Controller : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.LTouch;

    private void Update()
    {
        // Follow the tracked position and orientation of the controller
        transform.position = OVRInput.GetLocalControllerPosition(controller);
        transform.rotation = OVRInput.GetLocalControllerRotation(controller);

        // Scale the object with the index trigger (value between 0 and 1)
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller);
        transform.localScale = Vector3.one * (0.05f + 0.1f * trigger);

        // Blend the color from blue to red with the hand trigger (grip)
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
        GetComponent<Renderer>().material.color = Color.red * grip + (1 - grip) * Color.blue;
    }
}
If you go back to Unity and run your program by hitting Ctrl+P, you should see (through your HMD) a blue cube at the position of your left controller. By pulling the trigger (index finger) you can change its size, and by pulling the grip (middle finger) you can change its color.
Looking at the code, you can see that OVRInput.Get is used to read the current value of the trigger buttons, and OVRInput.GetLocalControllerPosition/GetLocalControllerRotation are used to read the position and rotation of the controller.
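As a side note, OVRInput can also tell you whether a controller is actually connected before you read from it. A minimal sketch of such a guard, which you could place at the top of the Update method above:

```csharp
// Skip reading input while this controller is not connected
if (!OVRInput.IsControllerConnected(controller))
    return;
```

This avoids moving your dummy model to the origin while the corresponding Touch controller is off or out of tracking.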
How to get the button events
If you want to listen for events like “Button A pressed”, you can add this code to your Update method (please also add using System; at the top of the file):
foreach (var button in Enum.GetValues(typeof(OVRInput.Button)))
{
    if (OVRInput.GetDown((OVRInput.Button)button, controller))
        Debug.Log(controller.ToString() + " button down: " + button.ToString());
    if (OVRInput.GetUp((OVRInput.Button)button, controller))
        Debug.Log(controller.ToString() + " button up: " + button.ToString());
}
This will print a message whenever you press or release a button. So once you’ve found the name of the button you want to use, you can write code like this:
if (OVRInput.GetUp(OVRInput.Button.One, controller))
{
    // Do something
}
What about the right controller?
Well, that’s easy. All we need to do is attach Controller.cs to another object (the sphere) and change the public controller attribute in the Inspector:
Now you should have two ‘hands’ changing their sizes and colors. Writing a proper application should be straightforward from here on.
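If you’d rather wire up both hands from code instead of flipping the dropdown in the Inspector, a sketch like this would work (the object names “Cube” and “Sphere” and the script name HandSetup are assumptions based on the scene built above):

```csharp
using UnityEngine;

public class HandSetup : MonoBehaviour
{
    private void Start()
    {
        // Assign the left Touch controller to the cube and the right one to the sphere
        GameObject.Find("Cube").GetComponent<Controller>().controller = OVRInput.Controller.LTouch;
        GameObject.Find("Sphere").GetComponent<Controller>().controller = OVRInput.Controller.RTouch;
    }
}
```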
The whole code with some extras can be found on github: https://github.com/JonasKunze/unyrift/blob/master/Assets/Scripts/Controller.cs