DETROIT — Automakers and suppliers have already spent tens of billions developing driver assistance and autonomous driving capabilities, but the legal landscape surrounding those technologies is at best problematic — and at worst, “a hot mess,” according to panelists Tuesday at the Automotive News Congress.
“I think the liability situation is a hot mess right now, but I think it’s probably relatively straightforward to fix it if you look at it the right way,” said Phil Koopman, a faculty member at Carnegie Mellon University who has been working on the technology for more than 25 years. “But if we don’t fix it, it’s going to be a mess.”
Problems range from technologies that are inaccurately or nebulously described to consumers who overestimate what their vehicles are capable of to regulatory challenges that have made true autonomous driving difficult to achieve in the real world, panelists said.
Add to those issues the fact that most of the litigation in the area so far has been settled out of court, said Jennifer Dukarski, a lawyer and “recovering engineer” in Butzel’s Ann Arbor, Mich., office, practicing in the areas of intellectual property, media and technology.
“When you look at most of the cases that have arisen, under these driverless vehicles or somewhat autonomous vehicles … people sued under negligence theory,” Dukarski said. “And let’s be honest — all of them have settled, so we really have no idea where [legal precedent is] ultimately headed.”
Martin Fischer, a member of the board of management for supplier ZF Group, touched on the difficult regulatory environment facing autonomous driving technologies and the significant engineering hurdles that remain. He illustrated the difficulty with the example of a common traffic light.
“You would think that recognizing a red traffic light is pretty simple — use a camera, see if it’s red or is it green,” Fischer said rhetorically. But with a camera-only system, the rate of failure to properly recognize the color of a traffic signal is substantially higher than it is for a human driver.
“That tells you why we struggle — because of the technology,” he said. “We want to put something on the road that is better” at recognizing traffic signals than a human driver, not worse.
Koopman asked the audience to look at the traffic light in Fischer’s example from a different point of view.
“If a human driver has driven a million miles and never done anything wrong, and then they run a red light and kill someone, they don’t get a free kill because they have a good driving record,” the autonomous driving researcher said. “They’re still negligent. Saying that self-driving cars are statistically safer, [does that mean] we’re going to give them a bunch of free kills?
“You can’t do that,” he said. “Not only do you have to be statistically safer,” but the machine needs to be held to the human standard of safety and responsibility for its actions.
Koopman said companies developing autonomous driving systems have a responsibility to maintain the same “duty of care” not to harm others with their creations that a human driver assumes when operating a motor vehicle. If they fail to maintain that duty, they are liable for their systems’ actions, just as a human driver would be.