Tuesday, 3 October 2017

“A Tesla Crash, but Not Just a Tesla Crash”: NTSB Issues Final Report and Comments on Fatal Tesla Autopilot Crash

According to a key member of the National Transportation Safety Board (NTSB), automakers may want to slow the rollout of automated features until they better understand how human drivers interact with these new technologies, since drivers still bear responsibility for monitoring the road even when those features are in use.

That suggestion comes alongside the release of the NTSB’s final report on a fatal Tesla Model S crash involving the company’s Autopilot feature. (The agency had previously released an abstract and summary of the report on September 12; this is the comprehensive analysis.) Conclusions in the 53-page report are consistent with findings issued last month, which indicated that a truck driver’s failure to yield and a Tesla driver’s overreliance on Autopilot combined to cause the fatal crash along a Florida highway on May 7, 2016, that killed the Model S driver, Joshua Brown.

The final report is augmented with written comments from board member Christopher A. Hart, who compared the present struggle to meld human and machine in passenger cars to the one the aviation industry experienced when it introduced automation a generation ago. Although decades have passed, he commented, the auto industry hasn’t learned from aviation’s mistakes.

“They learned from experience that automation ‘because we can’ does not necessarily make the human-automation system work better,” Hart wrote, referring to the aviation industry. “That resulted in an evolution toward human-centric automation, in which the objective was improving the overall performance of the human-automation system. This crash is an example of what can happen when automation is introduced ‘because we can’ without adequate consideration of the human element.”

Hart’s comments struck at the core of a quandary that, from its outset, has essentially split the nascent autonomous-technology industry in two. Some companies, such as Ford and Waymo, believe the intertwining of human drivers and self-driving systems is so fraught with problems that they have forgone the development of advanced driver-assist features in favor of designs that rid cars of traditional controls like steering wheels and brake pedals.

“This crash is an example of what can happen when automation is introduced ‘because we can’ without adequate consideration of the human element.”

—Christopher A. Hart, NTSB

For the likes of General Motors, Audi, and Tesla, which have made plans that keep humans in the driving loop, the complexity of how best to bring together humans and automated features is well known. Each has designed ways to ensure that humans remain engaged in driving while automated features are enabled, but the NTSB report makes it clear that arduous work remains in readying these systems for the road. As these features get better over time, there’s a disquieting prospect that humans behind the wheel could actually do a worse job of driving.

“People are very bad at monitoring automation,” Deborah Bruce, investigator in charge of the NTSB’s Tesla Autopilot investigation, told Car and Driver. “When it’s doing what it’s designed to do, then it’s not calling attention to itself. There’s decades-long research and history going back to [monitoring] nuclear power plants [that show] we’re poor at automation. It’s an attention task, and we are not good at that.”

A photo taken during the NTSB investigation shows damage to the underside of the trailer involved in the fatal Tesla Model S crash.

Given those limits in human capability, the NTSB recommended that manufacturers develop better ways of monitoring driver attention than steering-wheel sensors that drivers merely need to touch; Hart termed those “inadequate.” Because driving is an inherently visual task, preferred methods might include inward-facing cameras that track driver eye movement, a feature already in use in systems such as General Motors’ new Super Cruise.

Among the NTSB’s further recommendations, the federal agency said automakers should add system safeguards that ensure automated systems are used in the conditions for which they are designed. Autopilot is intended for use on divided highways, but Brown used the feature on U.S. 27A in Williston, Florida, which is designed to permit the sort of crossing traffic that ultimately allowed a tractor-trailer hauling blueberries to pull into Brown’s path. Brown had driven for 6.7 miles with the Autopilot feature engaged prior to impact.

“You can certainly look and say this was a Tesla crash, but through our investigation we’re finding it’s not just a Tesla crash.”

—Deborah Bruce, NTSB

Like the investigation carried out by the National Highway Traffic Safety Administration before it, the NTSB’s probe into the Tesla crash was a landmark case: it involved the first known death of a driver who had engaged a Level 2 automation system, one that can control steering, acceleration, and braking with the expectation that the human driver monitors the broader driving environment and remains ultimately responsible for the vehicle’s operation.

The final report comes at a time when Congress is considering legislation that would ease the regulatory path for automakers deploying self-driving vehicles. It also arrives on the heels of a revised federal automated-vehicle policy that removes a prior request that manufacturers submit a voluntary safety assessment of their technology.

NTSB investigators noted that the automaker cooperated with its investigation and helped distill data from the car’s systems into insights the board may not have otherwise had access to. Further, they noted that although this incident involved a Tesla, the issues at hand are systemic throughout the industry.

“You can certainly look and say this was a Tesla crash, but through our investigation we’re finding it’s not just a Tesla crash,” Bruce said. “It might have been they were the first one, but it could have been anybody, based on the fact that humans are operating these vehicles.”


Still, Tesla may have sown confusion about the capabilities and limitations of the system with the name Autopilot. There’s growing concern within the NTSB and more broadly among safety advocates that automakers aren’t properly educating drivers on the nuances of these systems and that the wide assortment of brand-specific terminology is bewildering.

“Adding to the problem is the moniker ‘Autopilot,’ ” Hart wrote in his supplemental comments. “In aviation, airline pilots know that even when the autopilot is controlling their airplane, the pilots still play a crucial role. Joe and Suzy Public, on the other hand, may conclude from the name ‘Autopilot’ that they need not pay any attention to the driving task because the autopilot is doing everything.”

Hart’s comments were supported by board chairman Robert Sumwalt. The NTSB is a federal agency independent of the Department of Transportation that conducts crash investigations and makes recommendations on safety improvements; it does not hold regulatory power.

Tesla Motors could not be reached for comment on the NTSB’s final report. When the board issued its findings and recommendations last month, a spokesperson said: “We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”
