Tuesday, January 22, 2013

Hyundai concept car tracks your eyes and hand gestures (not those gestures)



Look over to the climate controls, then take your hand off the wheel and move it slightly upward. The car responds by notching up the heat a few degrees. Sounds like in-car Kinect? That’s the magic behind the Hyundai HCD-14: eye tracking, gesture recognition, and smart software.
These controls, which are possibly the next big thing in dashboard interaction, made the HCD-14 Genesis concept car the hit of the North American International Auto Show — even if most people never got past ogling the impossibly sleek exterior. Some details are sketchy because the HCD-14 was finished just two days before the show and the electrically powered suicide doors overheated, limiting how many people could actually hop in and see the cockpit.
Here’s how the cockpit control system works:
The HCD-14 has four front-seat displays: a free-form center stack display of about 10 inches, a head-up display, a digital instrument panel, and a driver information display to the left of the instrument panel. A pair of cameras in the steering wheel tracks the driver’s eyes. Once the car sees your eyes glance at one of the center stack’s areas (climate control, infotainment, phone, or navigation), it determines which specific control you want. A second set of sensors tracks your hand movements. A hand gesture, such as pointing, raising or lowering the hand, pinching or spreading, swiping left or right, or rotating clockwise or counterclockwise, could then adjust the volume, zoom in on a display, flick to a new page, change the fan speed, or scroll through a phone contact list.
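In other words, the gaze estimate picks a zone on the center stack, and the next recognized hand gesture is routed to that zone. Hyundai hasn’t published any implementation details, so the sketch below is purely illustrative; the zone names, gesture labels, and actions are all assumptions, not anything from the HCD-14’s actual software.

```python
# Illustrative sketch only: Hyundai has not published the HCD-14's software.
# Zone names, gesture labels, and actions below are assumptions.

# Center-stack zones, keyed by where the driver's gaze falls on the display.
ZONES = ["climate", "infotainment", "phone", "navigation"]

# What each gesture means inside a given zone (hypothetical mapping).
GESTURE_ACTIONS = {
    "climate":      {"raise_hand": "fan_speed +1", "lower_hand": "fan_speed -1"},
    "infotainment": {"rotate_cw": "volume +1",     "rotate_ccw": "volume -1"},
    "phone":        {"swipe_left": "next_contact", "swipe_right": "prev_contact"},
    "navigation":   {"spread": "zoom_in",          "pinch": "zoom_out"},
}

def zone_from_gaze(x, y):
    """Map a normalized gaze point on the 10-inch center stack to a control zone.
    Assumes the four zones are laid out left to right in equal columns."""
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        return None                      # gaze is off the display
    return ZONES[min(int(x * len(ZONES)), len(ZONES) - 1)]

def handle_gesture(gaze_x, gaze_y, gesture):
    """Route a recognized hand gesture to whichever zone the driver is looking at."""
    zone = zone_from_gaze(gaze_x, gaze_y)
    if zone is None:
        return "ignored"
    return GESTURE_ACTIONS.get(zone, {}).get(gesture, "ignored")

# Example: driver glances at the infotainment area, then rotates a hand clockwise.
print(handle_gesture(0.40, 0.5, "rotate_cw"))   # -> "volume +1"
```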
The eye tracking system was developed with Tobii of Sweden. A week before the show, the company released Tobii REX for Windows 8 PCs, which it says “enables users to control the computer by combining their eye gaze with other controls, such as touch, mouse and keyboard.” REX is based on Tobii Gaze software announced a year earlier.
Hyundai says the Soft Connect gesture recognition system picks up hand gestures from sensors that look down from the headliner. It works much like Microsoft Kinect gesture recognition.
“You pick a function with eye tracking and then attenuate with gesture recognition,” explains Hyundai designer Mike Barbush. The driver could also choose to refine the selection with steering wheel buttons or voice controls.
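Barbush’s description amounts to a two-step interaction: a gaze event arms a target function, and any of several input modalities (gesture, steering wheel button, or voice) then adjusts it. The sketch below is only an illustration of that pattern under those assumptions; the class, event names, and values are not from Hyundai.

```python
# Illustration of the "select with gaze, adjust with anything" pattern
# described above. The class, event names, and values are hypothetical.

class CockpitController:
    def __init__(self):
        self.selected = None             # function the driver last looked at

    def on_gaze(self, function):
        """Eye tracking picks the function (e.g. 'fan_speed' or 'volume')."""
        self.selected = function

    def on_adjust(self, source, delta):
        """Gesture, steering wheel button, or voice refines the selection."""
        if self.selected is None:
            return None                  # nothing armed yet; ignore the input
        return f"{self.selected} {delta:+d} (via {source})"

ctrl = CockpitController()
ctrl.on_gaze("fan_speed")                    # driver glances at the climate area
print(ctrl.on_adjust("gesture", +1))         # raised hand -> "fan_speed +1 (via gesture)"
print(ctrl.on_adjust("wheel_button", -1))    # button press -> "fan_speed -1 (via wheel_button)"
```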
-Courtesy of extremecartech.com 

