Quest Pro VR tongue tracking hailed as “accessibility at its finest”
A new feature added to the Meta Quest Pro VR headset allows the position of the tongue to be tracked accurately, opening up new frontiers for accessibility in VR and gaming.
Virtual reality (VR) technology has advanced quickly over the past few years. When the Meta Quest Pro was released last year, it included facial tracking that let users view their virtual avatar in a VR mirror. As noted by UploadVR, however, the illusion can be broken as soon as you stick out your tongue.
In a new update to the SDK for Unity and native code, Meta has added the ability to track how far your tongue is sticking out to its facial tracking OpenXR extension. Meta's own avatars have not yet been updated to use this data, but third-party avatars can already take advantage of it once developers update to SDK version 60. One developer, korejan, posted a video to Twitter/X demonstrating how it works.
Accessibility at its finest
The response from the community has been somewhat mixed, with users on Reddit wondering what exactly the feature could be used for. One user named harg0w pointed out that it was, in fact, "Accessibility at its finest," as it could unlock new input methods for people with disabilities.
Technology such as eye tracking has already been adapted for video games, allowing people with disabilities to play titles such as Minecraft. Charities like SpecialEffect and organizations like National Star College offer alternative input methods that let disabled players enjoy even fast-paced esports. The ability to track the tongue could add yet another option, broadening the scope of accessibility in VR and in gaming more generally.
At the moment, however, this functionality requires a Meta Quest Pro, an expensive premium device. As the technology develops, it could see wider application.