Thursday, 27 April 2017

On the Path to Autonomous Vehicles, Police Officers Get Left Behind


Uber crash scene in Tempe, Arizona. (Mandatory credit: FRESCO NEWS / Mark Beach)

While responding to reports of what looked like a routine car crash early on the evening of March 24, Officer Dustin Patterson of the Tempe, Arizona, police department became an unwitting pioneer of the autonomous age.

On arriving at the scene, Patterson learned that one of Uber’s self-driving cars was among the four vehicles involved. The company had started testing in this Phoenix suburb only three months earlier; this marked the first time one of the city’s police officers would investigate a crash that involved a car traveling in autonomous mode.

Within minutes, Patterson determined the Uber car was not at fault. A human driver had made a normal human mistake, turning left at an intersection into the path of the Uber car, which was a 2017 Volvo XC90 SUV. It was a simple case of failure to yield. Determining how to handle a crash involving an autonomous vehicle, however, was more complex.

For starters, the official Arizona crash-report document used to collect information on collisions contains spaces for hundreds of variables—ranging from weather at the time of the crash to whether wild animals or livestock were contributing factors. But none of the spaces allow officers to denote that a car was operating under control of a self-driving system.

“The disturbing thing for me is that they’ve been, by and large, left out of the discussion so far.” – Jim Hedlund, Governors Highway Safety Association

This was a minor crash, and all vehicle occupants walked away. Had it been more serious, other complications might have come into play. Patterson had no way of knowing, for instance, whether data in the car could be relevant to an investigation. And if Uber’s human safety driver hadn’t been present, the officer would have had no way to tell whether the autonomous systems in the Uber remained active or whether they posed an ongoing safety hazard.

Officer Patterson had no set of procedures to follow, because the Tempe Police Department, like nearly every other law-enforcement agency in the country outside Silicon Valley, currently has no defined method for, or expertise in, investigating accidents involving self-driving vehicles.

A dozen states already allow autonomous vehicles to test on public roads, and 28 more states have legislation pending that addresses the arrival of this potentially game-changing form of transportation. A growing number of transportation experts and law-enforcement officials caution that too many questions remain unanswered regarding the interaction between police officers and self-driving cars.


Law Enforcement: “Left Out of the Discussion”

Some of those questions involve everyday operations; others involve crashes in which first responders may be at heightened risk. Following the Uber crash, a spokesperson for the Tempe Police Department said, “As far as the dangers post crash, we are always cognizant of the danger that can occur while investigating any crime and not necessarily specific to autonomous vehicles.”

That’s not good enough, according to Jim Hedlund of the Governors Highway Safety Association. His recent report, Autonomous Vehicles Meet Human Drivers, highlights concerns that police officers haven’t been trained to identify the potential hazards. For their own safety, Hedlund said, officers should have the means to determine whether a crashed vehicle has any autonomous features that may affect them during an on-site investigation.

“The disturbing thing for me is they’ve been, by and large, left out of the discussion so far,” he said. “People aren’t even talking to them, even within state task forces.”

“Both highway patrol and local areas need to know how to interact with the vehicles, and we’ve put out what we think are minimum standards.” – Brian Soublet, California DMV

On a more mundane level, the GHSA report says that traffic records should have spaces for officers investigating crashes to record a vehicle’s autonomous capabilities. Traffic officials have proposed two potential modifications to the standard Model Minimum Uniform Crash Criteria used by accident investigators, which could spur changes to the documents. One would allow officers to check whether a vehicle traveled under zero, partial, or full control of an automated system. Another would allow them to denote the vehicle’s level of automation using SAE’s guidelines. These proposals are subject to further debate this August at the national Traffic Records Forum conference.
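To make the two proposed fields concrete, here is a minimal sketch of how a crash-report record might encode them. The field names and validation are hypothetical illustrations, not the actual MMUCC draft language; the 0–5 scale follows SAE’s published levels of driving automation.

```python
from dataclasses import dataclass
from enum import Enum


class AutomatedControl(Enum):
    """First proposed field: degree of automated control at the time of the crash."""
    NONE = "zero"
    PARTIAL = "partial"
    FULL = "full"


@dataclass
class CrashRecord:
    """Hypothetical crash-report record carrying both proposed fields."""
    automated_control: AutomatedControl
    sae_level: int  # Second proposed field: SAE level, 0 (no automation) to 5 (full automation)

    def __post_init__(self) -> None:
        if not 0 <= self.sae_level <= 5:
            raise ValueError("SAE automation level must be between 0 and 5")


# Example: a crash in which the vehicle was operating in autonomous mode
record = CrashRecord(automated_control=AutomatedControl.FULL, sae_level=4)
```

Either field alone leaves a gap — an officer could note that automation was engaged without knowing how capable the system was, or vice versa — which is presumably why both modifications are on the table.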

But such proposals raise a more basic question: How are officers supposed to understand the nuances between different levels of automation?

“There should be a place in a car where that sort of information is kept,” Hedlund said. “Somewhere where an officer can look for information on the autonomous technology and how to determine whether it’s still active. But these are questions for once a car is stopped. Do we know how an officer can make a traffic stop? What happens when you want to wave a car over? Something as simple as that.”

Accident on the Trans-Canada Highway near Nairn Centre, Ontario, Canada, on June 29, 2008.

California: At Forefront of Mandatory Rules

For now, human safety drivers remain behind the wheel in almost all autonomous test cases. But that will change, and soon. California, home to 30 companies currently permitted to test autonomous vehicles, held hearings this week on proposed regulations that pave the way to put fully driverless vehicles on the roads by the end of 2017.

Provisions in those new regulations would mandate that car companies provide “law-enforcement interaction plans” to authorities in the jurisdictions where they intend to test. These plans would explain how to communicate with vehicles in both normal traffic and emergency situations, where insurance and registration information can be found, and how to detect whether the autonomous mode has been deactivated.

The proposed regulations don’t specify how companies should address those topics, but the provisions exist to ensure that they do.

“Both highway patrol and local areas need to know how to interact with the vehicles, and we’ve put out what we think are minimum standards,” said Brian Soublet, deputy director and chief counsel at the California DMV. “They need to understand how to know if the autonomous technology is engaged, how to pull it off the road, and some important things like where to find in the vehicle who owns it and who is insuring it.”

Government officials are aware such guidance needs to stretch beyond California. In its landmark autonomous-vehicle guidance issued in September 2016, the federal government acknowledged there is “a growing need for the training and education of law enforcement” on how their interactions will change with the advent of autonomous vehicles. But, just as in California, the federal guidance contains few specifics on how that growing need should be met.

One potential solution: Ask police officers.


An Otto self-driving truck makes a delivery run along Interstate 25 in Colorado.

Colorado: A Lesson in Collaboration

As Hedlund indicated, many departments feel as if they’ve either gotten a late invitation to the party or been left out entirely. In January, the U.S. Department of Transportation established a new advisory committee to focus on automation across all transportation modes. Not one of the 25 people appointed to that committee has a law-enforcement background, a circumstance that did not go unnoticed among police officers who have followed the development of autonomous technology.

“Not having a law-enforcement person at that table is a huge miss, in my opinion,” said Mark Savage, deputy chief of the Colorado State Patrol. “Law enforcement must have a seat at the table, because we are the ones that will be enforcing the laws.”

Savage speaks from firsthand experience. Last summer, when executives from Otto, a self-driving truck subsidiary of Uber, told Colorado Department of Transportation officials that they wanted to test on state roads, CDOT turned to Savage to formulate parameters for the testing.

“So when we sat down, we were all kind of like, ‘What do we do?’ We didn’t know.” – Mark Savage, Colorado State Patrol

Savage missed the first meeting, in August 2016. In retrospect, he said that was “a huge mistake.” He’d assumed there would be a prolonged period to deliberate over procedures, but as it turned out, Otto wanted to commence testing within a month. That surprise meant he had a compressed timeframe to provide guidance for a state with no laws or regulations to govern autonomous-vehicle testing. What the state did have was his experience in commercial-vehicle safety and a willingness to work together.

“We have no law that prohibits this, no regulatory infrastructure, and to be frank, we don’t have anything formal now,” Savage said. “That’s part of the reason they chose Colorado. But they said, ‘Hey, we want this to work,’ and they reached out in a positive manner, and said, ‘We’re not trying to slide anything under the table,’ and ‘We want to do this right, collaboratively.’ So when we sat down, we were all kind of like, ‘What do we do?’ We didn’t know.”

Savage asked experts to give his department a primer on exactly how the self-driving systems worked. He soon realized that police didn’t need a deep dive into systems operations so much as the ability to ensure the systems operate safely on the road. In the lead-up to the Otto test, the company proposed basic performance standards, such as requiring the trucks to remain within their intended highway lanes.

For the purposes of Otto’s test run down Interstate 25, the state patrol didn’t need to explore questions about autonomous systems because the company’s engineers rode alongside the truck and could handle contingencies—and were readily identifiable as the responsible parties. But law enforcement won’t always have that luxury.


An Uber autonomous vehicle drives in downtown San Francisco days after the company’s self-driving program was temporarily halted following a crash in Tempe, Arizona.

A Need for Fast Answers

Colorado’s collaborative efforts, along with the proposed California regulations, might provide valuable lessons for states such as Arizona, where Gov. Doug Ducey has aggressively promoted autonomous testing. In 2015, he signed an executive order that applies statewide, but that order provides no specific guidance to local communities where testing occurs.

No meetings took place between Uber representatives and the Tempe Police Department before testing began there, according to the department spokesperson, but leaders from both organizations met “shortly” after Uber began operating in Tempe.

As part of those discussions, Tempe Police said, Uber agreed to share data from any resulting car crashes should it be needed for an investigation. So far, police have said they have not requested any data regarding the March 24 crash. An Uber spokesperson said the company has fully cooperated with the Tempe Police Department.

But questions persist. Does law enforcement have the legal right to demand such data? If a police department needs data to investigate a crash involving an autonomous vehicle, would it have the on-staff expertise to understand it? What are the safety implications for first responders at scenes of accidents? What new training do officers need? With automakers and tech companies promising real-world autonomous deployments within two years, these matters remain unresolved for the people who enforce the rules of the road.




from Car and Driver Blog http://ift.tt/2oQ84Vh
via IFTTT
