computer vision

PAWdio: low-cost hand input using acoustic sensing


Description

Hand input offers a natural, efficient, and immersive form of input for virtual reality (VR), but it has been difficult to implement on mobile VR platforms. Accurate hand tracking typically requires a depth sensor, such as a Leap Motion, and performing the necessary computer vision on a smartphone is computationally intensive, which can degrade the frame rate of a VR simulation and drain the battery. PAWdio instead appropriates a pair of ordinary headphones to track the position of the hand. The user holds a single earbud that emits an inaudible tone; Doppler shifts in the tone received by the phone are used to estimate the earbud's velocity toward or away from the phone, which in turn drives the Z-position of a virtual hand attached to the user's gaze pointer. PAWdio costs less to implement than a Google Cardboard adapter, and its low computational overhead preserves a high frame rate, which helps maintain immersion. Although PAWdio offers only 1D hand input, it supports a variety of game actions, such as grabbing, punching, pushing, thrusting, and throwing.
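
To make the acoustic pipeline concrete, here is a minimal sketch of Doppler-based 1D tracking. The pilot-tone frequency, sample rate, and frame size below are illustrative assumptions, not the parameters PAWdio actually uses: each microphone frame is transformed with an FFT, the received tone's peak is located near the emitted frequency, the Doppler relation v = c * Δf / f0 converts the shift into a radial velocity, and successive velocities are integrated into the virtual hand's Z-position.

```python
import numpy as np

# Illustrative constants (assumed, not taken from the PAWdio paper).
SPEED_OF_SOUND = 343.0   # m/s at room temperature
TONE_HZ = 20_000.0       # inaudible pilot tone played by the earbud
SAMPLE_RATE = 48_000     # microphone sample rate
FRAME_SIZE = 2048        # samples per analysis frame

def doppler_velocity(frame: np.ndarray) -> float:
    """Estimate the earbud's radial velocity (m/s) from one microphone frame.

    Positive values mean the earbud is moving toward the phone.
    """
    # Window the frame and take its magnitude spectrum.
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)

    # Look for the received tone in a narrow band around the emitted frequency.
    band = (freqs > TONE_HZ - 200) & (freqs < TONE_HZ + 200)
    peak_hz = freqs[band][np.argmax(spectrum[band])]

    # Doppler relation: delta_f / f0 = v / c  =>  v = c * delta_f / f0
    return SPEED_OF_SOUND * (peak_hz - TONE_HZ) / TONE_HZ

def track_z(frames, z0: float = 0.0) -> list[float]:
    """Integrate per-frame velocities into a 1D Z-position for the virtual hand."""
    dt = FRAME_SIZE / SAMPLE_RATE  # duration of one frame in seconds
    z, path = z0, []
    for frame in frames:
        z += doppler_velocity(frame) * dt
        path.append(z)
    return path
```

Because only a narrow frequency band around the tone is inspected, this kind of per-frame analysis is cheap compared with running full computer-vision hand tracking, which is the source of PAWdio's low computational overhead.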