Oculus has added documentation for the ‘buffered haptics’ feature of the Oculus SDK, a method for programming more advanced haptic feedback on the company’s Touch controllers. The Touch controllers use linear actuators to provide feedback. This technology has been gaining popularity in recent times and is expected to replace the simple ‘rumble’ feedback common in console gamepads. Linear actuators have the advantage that they can move very quickly compared to the rotating-mass motors of yore, which allows a broader variety of haptic effects, better control, and faster response times.
The new documentation gives developers fine-grained control over the controllers’ haptic feedback. The SDK supports two approaches to controller haptics, non-buffered and buffered, with the only limitation being that the two cannot be used together. Non-buffered haptics is the simpler of the two to control: it offers a straightforward way of switching vibrations on and off at a given amplitude and frequency.
Non-buffered haptics is designed for far simpler effects that do not have strict latency requirements, because the controller takes 33ms to respond to the API call that modifies the haptics settings.
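Conceptually, the non-buffered path boils down to a single call that switches vibration on or off at a chosen frequency and amplitude. The following is a minimal Python sketch of that control model; the `Controller` class and its method names are hypothetical illustrations, not the SDK’s actual (C-based) API:

```python
class Controller:
    """Toy model of a non-buffered haptics interface: one on/off
    switch with a frequency and an amplitude, nothing else."""

    def __init__(self):
        self.frequency = 0.0   # normalized 0.0-1.0
        self.amplitude = 0.0   # normalized 0.0-1.0

    def set_vibration(self, frequency, amplitude):
        # Setting amplitude to zero switches the vibration off;
        # any non-zero amplitude switches it on at that strength.
        self.frequency = frequency
        self.amplitude = amplitude

    @property
    def vibrating(self):
        return self.amplitude > 0.0


controller = Controller()
controller.set_vibration(frequency=1.0, amplitude=0.5)   # on, half strength
controller.set_vibration(frequency=0.0, amplitude=0.0)   # off
```

Because the controller takes roughly 33ms to honor each such call, this path is only practical for coarse, latency-tolerant effects.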
Buffered haptics, on the other hand, is faster to respond and allows a wider and more complex set of haptic effects, such as panning vibrations across controllers, patterning vibration amplitudes around sine-wave or tangent functions, and generating a variety of low-frequency carrier waves. The feature lets developers queue up a string of bytes representing the desired amplitudes. These bytes are then played back in sequence, which allows developers to finely adjust the amplitude once every 3.125ms.
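Since one amplitude byte plays every 3.125ms, the buffer’s effective sample rate works out to 1000 / 3.125 = 320 samples per second. Here is a plain-Python sketch of filling such a buffer with amplitudes that trace a sine wave; the 0–255 byte range for amplitude is an assumption for illustration, not taken from the SDK:

```python
import math

SAMPLE_RATE = 320  # one amplitude byte every 3.125 ms -> 320 samples/s


def sine_amplitude_buffer(effect_hz, duration_s, peak=255):
    """Build a byte buffer whose amplitudes follow a rectified sine wave.

    effect_hz  -- how fast the vibration strength pulses
    duration_s -- total length of the effect in seconds
    peak       -- maximum amplitude byte (assumed 0-255 range)
    """
    n = int(duration_s * SAMPLE_RATE)
    return bytes(
        int(peak * abs(math.sin(2 * math.pi * effect_hz * i / SAMPLE_RATE)))
        for i in range(n)
    )


buf = sine_amplitude_buffer(effect_hz=4, duration_s=0.5)
# 0.5 s of samples at 320 samples/s -> a 160-byte buffer
```

The same approach extends to panning: generate one buffer per controller and ramp the peak down on one side while ramping it up on the other.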
Oculus also provides a sample app with the Oculus SDK that demonstrates some of the haptic effects achievable with buffered haptics. The documentation describes in additional detail how the feature works, which involves filling the buffer with the desired haptic instructions before sending it to the controllers. Developers can also vary the vibration effects based on input streams such as controller movement, and can pre-mix multiple input streams before passing them to the buffer, enabling some interesting dynamic haptics that depend on what the player is doing in the virtual world.
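Pre-mixing several input streams can be as simple as taking the per-sample maximum, or a clipped sum, of the individual amplitude buffers before queuing the result. A sketch in plain Python, again assuming a 0–255 amplitude range:

```python
def mix_buffers(*buffers, mode="max"):
    """Combine several equal-length amplitude buffers into one.

    'max' keeps the strongest effect at each sample; 'sum' adds the
    effects and clips to the 0-255 byte range so that overlapping
    effects reinforce each other.
    """
    if mode == "max":
        return bytes(max(samples) for samples in zip(*buffers))
    return bytes(min(sum(samples), 255) for samples in zip(*buffers))


impact = bytes([200, 120, 60, 20, 0, 0, 0, 0])  # e.g. a sharp collision
engine = bytes([40] * 8)                        # steady background hum
mixed = mix_buffers(impact, engine)             # strongest effect wins
```

With `mode="max"` the collision dominates while it lasts and the engine hum takes over as it fades, which is roughly the kind of dynamic layering the documentation describes.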
The new documentation should help developers build apps that deliver a more immersive experience to users.
Link to the documentation: Developer.oculus.com