The original 1977 Star Wars movie might have been the highest-profile stage for hologram technology to that date, yet in the decades since, holograms have become nothing special. So what’s the next step? The ability to manipulate holograms like physical objects, evidently. Japanese automotive supplier Denso brought this very technology to CES, along with a neat head-motion-controlled head-up display to show what might be possible in the car interior of the future. We had to put our mitts—and, uh, our heads—on Denso’s creations to try them for ourselves.
Holographic Haptic Controller Interface
Denso’s Holographic Haptic Controller is a functioning yet still conceptual system that the company says could someday be integrated into, say, a self-driving car. The projected hologram incorporates real haptic feedback, meaning that as you manipulate the projected “buttons,” you feel a physical response. The setup we tried consisted of three main components: a gesture-recognition camera, a projection device for the holographic “control panel,” and an array of ultrasonic transducers.
The gesture camera, located between the projector and the ultrasonic array, works much like the gesture-recognition cameras found in, say, Volkswagen’s upcoming infotainment system or the current BMW 7-series: it discerns when a user’s finger is pressing one of the holographic controls. The projector, in this case an angled unit, beams a vertical “panel” that floats in midair much as a head-up display would, except there is no glass onto which it’s projected.
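Conceptually, the detection step boils down to hit-testing: the camera tracks a fingertip in 3-D space and checks whether it sits inside the touch zone of any virtual control. Here’s a minimal Python sketch of that idea; the coordinates, button layout, and touch radius are our own illustrative assumptions, not Denso’s.

```python
# Hypothetical sketch: deciding whether a tracked fingertip is "pressing"
# a holographic button. Coordinates, button layout, and the touch radius
# are illustrative assumptions, not Denso's actual values.
from dataclasses import dataclass

@dataclass
class HoloButton:
    name: str
    center: tuple[float, float, float]  # x, y, z in meters, projector frame
    radius: float                       # touch zone around the button

def find_pressed_button(fingertip, buttons):
    """Return the button whose touch zone contains the fingertip, if any."""
    fx, fy, fz = fingertip
    for b in buttons:
        bx, by, bz = b.center
        dist_sq = (fx - bx) ** 2 + (fy - by) ** 2 + (fz - bz) ** 2
        if dist_sq <= b.radius ** 2:
            return b
    return None

buttons = [
    HoloButton("volume_up", (0.00, 0.10, 0.25), 0.02),
    HoloButton("volume_down", (0.00, 0.06, 0.25), 0.02),
]
pressed = find_pressed_button((0.005, 0.095, 0.252), buttons)
if pressed:
    print(f"Pressed: {pressed.name}")  # -> Pressed: volume_up
```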
Both hologram and gesture-recognition capabilities exist today. What makes Denso’s experiment special is its ability to impart haptic feedback to the user’s finger when poking at the holographic controls. That feel is generated by the ultrasonic transducers, which beam concentrated sound waves to your finger when it “touches” a control. The waves feel like zaps of static electricity, more of a tingle than a tap, but they indeed complete a feedback loop between a user’s eyes and his or her finger.
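The trick behind that sensation is phased-array focusing: each transducer is driven with a phase offset proportional to its distance from a focal point, so all the waves arrive in step there and their combined pressure becomes strong enough to feel. Here’s a rough Python sketch of that calculation; a 40-kHz carrier is typical of mid-air haptics, but the grid layout and focal point are our own assumptions.

```python
# Hypothetical sketch: phase offsets for a flat ultrasonic array so that
# each transducer's wave arrives in phase at one focal point, where the
# combined pressure is strong enough to be felt. 40 kHz is a common
# carrier for mid-air haptics; the grid layout is our own assumption.
import math

SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
FREQ = 40_000.0          # Hz, typical ultrasonic-haptics carrier
WAVELENGTH = SPEED_OF_SOUND / FREQ  # ~8.6 mm

def phase_delays(transducer_positions, focal_point):
    """Per-transducer phase offsets (radians) that align arrivals at the focus."""
    phases = []
    for pos in transducer_positions:
        dist = math.dist(pos, focal_point)
        # Advance the drive signal by the travel time, expressed as phase,
        # so every wavefront reaches the focal point in step.
        phases.append((2 * math.pi * dist / WAVELENGTH) % (2 * math.pi))
    return phases

# 4 x 4 grid of transducers, 1 cm pitch, focusing 20 cm above the center
grid = [(0.01 * i, 0.01 * j, 0.0) for i in range(4) for j in range(4)]
print(phase_delays(grid, (0.015, 0.015, 0.20))[:4])
```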
Whoa, that’s a floating ghost display!
To us, manipulating a series of floating controls seems cool in concept, yet the physical actions required of the user aren’t terribly different from, say, operating good old-fashioned hard buttons on a dashboard. Furthermore, when we held a hand over the ultrasonic transducers, the whole hand took on the tingly feeling of a static charge. (The intended haptic feedback could nonetheless be felt distinctly on our fingertips when applicable.)
Another potential hiccup, which we noticed only after reviewing our iPhone video of the system at work: the ultrasonic transducers, while not audible to the human ear, assaulted our phone’s microphone with a loud static-type noise (which is why we turned the sound off for the first video on this page). Even though the phone was not situated particularly close to the transducers, it picked up the interference, a potential problem if the system ever makes it into a production car with a microphone-based voice-control system.
Touchless Human Machine Interface
Last year at CES, Denso experimented with eye-tracking technology melded with in-car controls. The premise was simple: let a driver’s eyes do the work of shuffling among menus and selecting various secondary controls. This year, Denso has tried a different variation on the same theme with a head-tracking setup that operates in much the same way. The company’s demonstration comprised a dashboard mockup with a steering wheel, a gauge cluster, a central display, and a head-up display. Using a head- and eye-tracking camera mounted above the steering wheel, the so-called Touchless Human Machine Interface first detects where the driver’s eyes are pointed: straight ahead through the windshield, or toward the central display? It then triggers a glowing highlight ring around either the central display or the head-up display to indicate which control interface is active.
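That first step amounts to classifying a gaze direction into zones and lighting up the matching display. The short Python sketch below shows the idea; the angles and zone boundaries are invented for illustration and are not Denso’s values.

```python
# Hypothetical sketch: classify where the driver is looking and pick
# which display gets the glowing highlight ring. The gaze angles and
# zone boundaries are invented for illustration.
def active_display(gaze_yaw_deg, gaze_pitch_deg):
    """Map a gaze direction to the interface that should light up."""
    if abs(gaze_yaw_deg) < 10 and gaze_pitch_deg > -5:
        return "head_up_display"    # eyes roughly on the road ahead
    if gaze_yaw_deg > 15 and gaze_pitch_deg < -10:
        return "central_display"    # glance down and toward the center stack
    return None                     # no ring; driver is looking elsewhere

print(active_display(2, 0))     # -> head_up_display
print(active_display(25, -20))  # -> central_display
```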
In our demo, only the head-up display was set up to receive head-based commands. Denso had designed a simple head-up-display menu structure consisting of rows of key functions, such as audio or phone, represented as tiles. Tilting or turning one’s head to the left or the right moves a glowing ring from one tile to the next in either direction. Once the intended tile is highlighted, the user nods his or her head forward to select it; this draws the user “down” (the tiles animate in such a way as to suggest the user has “dropped” into a submenu) into a secondary menu. There, more left or right tilts navigate, a nod selects something, and so on. Want to back out of the submenu? A lift of the chin bounces you “up” a level, back toward the main menu.
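For the curious, here is a toy Python sketch of that navigation logic as we understood it; the angle thresholds and menu contents are invented for illustration and aren’t Denso’s.

```python
# Hypothetical sketch of the head-gesture menu logic: tilt left/right
# moves the highlight, a nod drops into a submenu, a chin lift backs out.
# Thresholds and menu contents are invented for illustration.
MENU = {
    "main": ["audio", "phone", "navigation"],
    "audio": ["volume", "source", "tune"],
    "phone": ["contacts", "recents", "dial"],
    "navigation": ["destination", "traffic", "map"],
}

class TouchlessHMI:
    TILT_DEG = 15   # head tilt needed to move the highlight
    NOD_DEG = 20    # pitch down to select, pitch up to go back

    def __init__(self):
        self.menu = "main"
        self.index = 0

    def update(self, tilt_deg, pitch_deg):
        tiles = MENU[self.menu]
        if tilt_deg <= -self.TILT_DEG:            # head tilted left
            self.index = max(self.index - 1, 0)
        elif tilt_deg >= self.TILT_DEG:           # head tilted right
            self.index = min(self.index + 1, len(tiles) - 1)
        elif pitch_deg <= -self.NOD_DEG:          # nod: drop into submenu
            selected = tiles[self.index]
            if selected in MENU:
                self.menu, self.index = selected, 0
        elif pitch_deg >= self.NOD_DEG:           # chin lift: back to main
            self.menu, self.index = "main", 0
        return MENU[self.menu][self.index]

hmi = TouchlessHMI()
hmi.update(20, 0)          # tilt right -> highlight moves to "phone"
print(hmi.update(0, -25))  # nod -> enters phone submenu, prints "contacts"
```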
The Touchless HMI system worked surprisingly well, despite our self-conscious selves feeling like Barbara Eden granting wishes in I Dream of Jeannie. Critically, it better fulfills the goal of last year’s eye-tracking system—keeping the driver’s hands on the wheel and eyes on the road—by requiring no hand inputs or concentrated eye aiming. On the other hand, so does voice control. In fact, voice control sums up our skepticism for both control setups: Why not bark a command? But some might want a more interactive experience that doesn’t require chatting with a robot.
Denso understands that it will be up to consumer demand and carmakers to decide whether these control methods are desirable. For now, the company is simply putting the tech out there to gauge interest, noting that all of the underlying technology already exists, making these futuristic controls feasible today. It isn’t alone, either: BMW displayed a similar haptic hologram setup at CES, too.
from Car and Driver Blog: http://ift.tt/2jkOelQ