Wednesday, 17 January 2018

Nissan Says Its Brain-to-Vehicle Tech Will Be Here Sooner Than You Think


Nissan Brain-to-Vehicle technology redefines future of driving

Nissan, currently the sixth best-selling automotive brand in America, has chosen to focus its energies on usable and accessible alternative-fuel vehicles—such as the world’s top-selling electric car, the Leaf—while also creating evanescent fantasies related to the eventuality of fully autonomous driving. It is into this latter category that its newest experiment fits. Unveiled this month at the CES technology show in Las Vegas, it is called brain-to-vehicle (B2V) technology. Quite simply, it is meant to harvest a driver’s thoughts.

Why would someone want their brain waves collected by their car? To provide the vehicle’s driving-assistance sensors with advance information. All you have to do is wear what looks like a kid’s wrestling helmet studded with electrodes, as if you were going into a futuristic clinic to have those voices in your head diagnosed. Simple, right?

“The concept is to decode, in real time, brain activity. Not in terms of reading commands and so on; just brain activity while you are driving the vehicle,” Lucian Gheorghe, who directs the project at Nissan’s advanced research center in Japan, told C/D. “To use this connected with autonomous driving functions in two ways: to enhance performance when you’re driving and to enhance the riding experience when riding [using autonomous-driving technology].”


You probably shouldn’t think too many things too quickly—at least not yet. This technology is in the early phases of an iterative process. The system is not going to leap from initial experimentation to full-on Scanners, the 1981 David Cronenberg sci-fi classic in which people with telekinetic powers are weaponized by a defense contractor.

There are three aspects to the technology. First, there's the physical aspect: a recording device. Second, there's the software: the algorithms that decode the signals. Third, there is the control side, namely what's to be done with the data: how to interpret and act on it. Nissan has faced challenges in all of these areas.

It had to create the helmet, which needs medical-grade accuracy without the bulk and cables that would impede head movement and the ability to drive. It had to develop software that can manage the complex information coming in, building machine-learning techniques to decode these very small signals with millisecond precision. And it had to decide how the car should act on this information: whether to do the same thing the driver intends but do it earlier, to do it more smoothly than the driver could, or to do it more sportily.
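To make that three-part split concrete, here is a minimal, purely illustrative Python sketch of how such a pipeline could be wired together: a stand-in for the recording device, a toy decoder, and an assist layer that acts slightly earlier than the driver would. The sampling rate, channel count, features, and thresholds are all assumptions chosen for illustration; Nissan has not published its implementation.

# Hypothetical sketch of the three-part B2V pipeline described above:
# (1) capture a short EEG window, (2) decode it with a lightweight model,
# (3) let the assist layer act slightly earlier than the driver.
# All names, shapes, and thresholds are illustrative assumptions.

import numpy as np

SAMPLE_RATE_HZ = 250      # assumed EEG sampling rate
WINDOW_MS = 200           # decode on short windows, millisecond scale
N_CHANNELS = 8            # assumed electrode count on the headset

def capture_window(rng: np.random.Generator) -> np.ndarray:
    """Stand-in for the recording device: one EEG window (channels x samples)."""
    n_samples = int(SAMPLE_RATE_HZ * WINDOW_MS / 1000)
    return rng.normal(0.0, 1.0, size=(N_CHANNELS, n_samples))

def decode_intent(window: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Toy decoder: per-channel power feature, then a linear score squashed to [0, 1]."""
    features = np.log(np.mean(window ** 2, axis=1) + 1e-9)  # crude bandpower proxy
    score = float(weights @ features + bias)
    return 1.0 / (1.0 + np.exp(-score))  # probability of an imminent driver action

def assist(intent_prob: float, threshold: float = 0.8) -> str:
    """Control side: if intent is decoded early enough, begin the maneuver sooner."""
    return "begin steering assist early" if intent_prob > threshold else "no action"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    weights = rng.normal(size=N_CHANNELS)  # would come from offline training in practice
    window = capture_window(rng)
    p = decode_intent(window, weights, bias=0.0)
    print(f"decoded intent probability: {p:.2f} -> {assist(p)}")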


The ideal scenario for this technology is not one in which drivers sit back and just think their way down the road, as compelling or terrifying as that notion might be. Rather, the idea is that the additional input from a human, behind the wheel, can help driver-assistance technologies in three ways.

First, it can provide advance notification to help these systems “see,” “feel,” or “sense” things that their sensors are either not sensitive or not advanced enough to decipher. Second, it can help the artificial-intelligence (AI) algorithms integrated into the vehicle to learn, giving them access to actual human thought processes and how they match, or don’t, with the computers’. And, perhaps most important, it can provide another means of keeping the driver engaged with these robot assistance systems, something that is of key importance as the technology improves. Since Level 5—full autonomy—is likely to remain quite far off, for the time being all self-driving cars will need the ability to hand control back to a human driver in an instant.

Gheorghe referred to this as disaccord detection. “This is when we look at when the driver disagrees with whatever happens in the system,” said Gheorghe. “For example, if the car brakes, and you don’t agree with that because you wanted to keep going. That information gets fixed in the system every time it happens. And then you can make a model of better dynamics for the autonomous driving [system] for yourself.

“This could also work in a lower level of automation, when you’re even more engaged, when you’re surveying the vehicle continuously so that your will is even clearer and stronger,” he said.
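Gheorghe doesn’t detail how that feedback loop is built, but as a rough illustration, here is a hypothetical Python sketch of the idea: log every case where the decoded signal says the driver disagreed with what the system just did, and nudge a per-driver preference accordingly. The event format, the update rule, and the parameter names are assumptions made for this sketch, not Nissan’s design.

# Minimal sketch of "disaccord detection": when the decoded brain signal says the
# driver disagreed with the system's action (e.g., it braked, the driver wanted to
# keep going), log the event and adapt a personalized driving model.
# All fields, values, and the update rule are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class DriverModel:
    # Positive values = this driver prefers later/softer automatic braking.
    brake_timing_offset_s: float = 0.0
    learning_rate: float = 0.05
    history: list = field(default_factory=list)

    def record_disaccord(self, system_action: str, driver_agreed: bool) -> None:
        """Log every disagreement and nudge the personalized driving dynamics."""
        self.history.append((system_action, driver_agreed))
        if system_action == "brake" and not driver_agreed:
            # Driver wanted to keep going: delay future automatic braking slightly.
            self.brake_timing_offset_s += self.learning_rate

model = DriverModel()
model.record_disaccord("brake", driver_agreed=False)
model.record_disaccord("brake", driver_agreed=True)
print(model.brake_timing_offset_s)  # small positive offset after one disagreement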


Of course, when we’re driving, we think all sorts of thoughts we don’t act on—often things we would not want to act on. For example, how does this system respond if you think about changing the climate settings or skipping a song on the entertainment system? Is the same lobe of the brain activated as if you want to change lanes or pass the car in front of you? How does it respond if you have those occasional thoughts that we all do—like, what if I drove this car over the guardrail and into that ravine? How does it process our road-rage fantasies—or the alluring pull of the scent of Cinnabon at a roadside rest stop?

“With movement anticipation, we’re looking at something that’s called motion-related potential. This appears most clearly before intentional movements. And it’s even stronger when you’re synchronized with some external stimuli,” said Gheorghe. “These triggers do not appear that clearly when we are considering movements in the navigation system. And, also, they do not appear when there is no movement afterward. Just thinking ‘I might go’ does not trigger any preparation in the motor cortex. So there are very few situations when you can have strong false positives.” And if they do occur, and you correct them, the autonomous-driving system learns from that.
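Purely as an illustrative sketch, here is one way such false positives could be screened out in software: a decoded “motor preparation” signal counts only if actual steering or pedal input follows within a short window, and anything that times out is tallied as a false positive to be corrected. The timing window and event format are assumptions for illustration, not a description of Nissan’s system.

# Sketch of false-positive handling for movement anticipation: a decoded
# preparation event is confirmed only if real movement follows soon after;
# otherwise it is counted as a false positive. Timings are assumed values.

from collections import deque

PREP_TO_MOVE_WINDOW_S = 0.5  # assumed max gap between preparation signal and movement

def confirm_predictions(events: list[tuple[float, str]]) -> tuple[int, int]:
    """events: (timestamp, kind) pairs, kind in {"prep", "move"}, sorted by time.
    Returns (confirmed_preparations, false_positives)."""
    pending = deque()
    confirmed = false_pos = 0
    for t, kind in events:
        # Expire preparations now too old to be matched to any movement.
        while pending and t - pending[0] > PREP_TO_MOVE_WINDOW_S:
            pending.popleft()
            false_pos += 1
        if kind == "prep":
            pending.append(t)
        elif kind == "move" and pending:
            pending.popleft()  # the oldest still-fresh preparation is confirmed
            confirmed += 1
    false_pos += len(pending)  # anything left unconfirmed at the end
    return confirmed, false_pos

print(confirm_predictions([(0.0, "prep"), (0.3, "move"), (1.0, "prep"), (2.0, "move")]))
# -> (1, 1): the first preparation preceded movement; the second timed out.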

This seems like a very ambitious project, one that feels at least a decade or two away from practical application in production vehicles. Nissan is more optimistic. “We’re looking at five to 10 years,” said Gheorghe. “If it’s 10 to 20,” he suggested, “it should be done in a university.”

2018 CES Show Full Coverage




from Car and Driver Blog http://feedproxy.google.com/~r/caranddriver/blog/~3/5Y19f9npuUA/
via IFTTT
