A couple of new pieces of technology have appeared on my radar recently. One is Valve’s Lighthouse tracking system.
It has the potential to replace the $10,000–30,000 tracking systems currently on the market, which remain one of the most expensive components of a VR display setup.
Another one that has caught my attention is Google’s Project Soli. It’s essentially a chip-scale radar sensor that recognizes fine hand gestures by analyzing how radio waves reflect off the user’s fingers.
Embedding some of these sensors in a VR display system (such as an HMD) could enable a new method of natural hand interaction, arguably better than camera-based approaches à la the Leap Motion. One could also place them around the walls of a CAVE system for easier gesture detection.