Autonomous Vehicles and the Future of Insurance

CSG convened the Autonomous and Connected Vehicle Policy Academy June 12-14, 2017, in Detroit, drawing state policymakers from around the country. The academy included a special briefing June 13 by Robert Peterson, a law professor at Santa Clara University in California, who explained how insurance and liability will change as autonomous vehicles come online.

Peterson told policy academy attendees that autonomous vehicles will bring about a fundamental shift in the way we think about insurance and liability.

“How does your standard automobile (insurance) policy fit when it comes to a vehicle that drives itself?” he said. “Well, I think the answer is not very well because your standard policy insures you against your legal responsibility. Your uninsured/underinsured motorist coverage insures you against the legal responsibility of the uninsured or underinsured motorist who hit you. When you have an automobile that’s driving itself, there may be no responsibility on the part of whatever you want to call the person in the car—the operator, the driver or the passenger.”

Peterson believes traditional insurance models may still be needed—at least for a while, particularly if more vehicles are introduced that still require a driver to be able to intervene if something goes wrong.

“If we assume that perfection cannot be achieved and that’s probably correct, there are still going to be some accidents and when you have level 3 vehicles, there are going to be some accidents where the driver or the owner is responsible so there’s going to be some room for your standard auto policy to still have some play,” he said.

Tesla’s “Autopilot” system on its Model S cars, a driver-assistance technology that still requires an attentive human behind the wheel, has come under scrutiny in two recent crashes. Last year a man was killed in Florida when his vehicle hit a tractor-trailer. The National Highway Traffic Safety Administration found that driver error was to blame and noted that Tesla’s manuals and instructions clearly state that drivers are required to stay engaged and prepared to take the wheel, which the driver had apparently not done. In a second incident this summer, a driver initially blamed Autopilot for a crash in which he and his passengers sustained minor injuries. The driver later said he had misremembered what happened and had disengaged Autopilot by stepping on the accelerator before the crash.

Peterson said developing ways to determine whether the human driver or the technology had control of a vehicle at the time of a crash is likely to become increasingly important, which may present issues for state policymakers. More than 15 states have enacted statutes that address the use of event data recorders, also known as “black boxes,” which can capture how a vehicle was performing at the time of a crash. California and Nevada require the preservation of the last 30 seconds of data from these recorders before a crash. States have already had to wade into the complicated issues of who has access to this data and related privacy concerns, Peterson said.

A proposed regulation in California could help define the liability question in sharper detail. It says:

  • In the event that the autonomous vehicle requires the driver to take control of the vehicle or when the vehicle is operating outside of its approved operational design domain, the driver shall be responsible for the safe operation of the vehicle, including compliance with all traffic laws.
  • The manufacturer of the vehicles … shall be responsible for the safe operation of the vehicle, including compliance with all traffic laws applicable to the performance of the dynamic driving task, when the autonomous vehicle is operating in autonomous mode within its approved operational design domain.

“If you have a vehicle that travels say 10,000 or 12,000 miles per year, it might travel only a few hundred miles manually,” said Peterson. “Insurers will want to be able to rate those policies but to do it properly they’re going to need to know how many miles these vehicles are driven in manual mode because the autonomous mode is a whole different profile of risk. Beyond those situations, if the vehicle is properly maintained and it does something—goes through a red light or a stop sign and causes an accident—it’s likely to be because of some defect in the design or the algorithm or whatever is running the car and that is going to be the responsibility of the manufacturer.”
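Peterson’s point about rating each driving mode separately can be illustrated with a small, purely hypothetical calculation. Every rate and mileage figure below is invented for illustration; real per-mile rates would come from actuarial data for each mode:

```python
# Hypothetical sketch of per-mode usage-based rating, per Peterson's point
# that autonomous-mode miles carry "a whole different profile of risk."
# All rates and mileage splits below are invented for illustration only.

def annual_premium(manual_miles, autonomous_miles,
                   manual_rate_per_mile=0.08,
                   autonomous_rate_per_mile=0.01):
    """Price each mile under the risk profile of the mode it was driven in."""
    return (manual_miles * manual_rate_per_mile
            + autonomous_miles * autonomous_rate_per_mile)

# A 12,000-mile year driven almost entirely in autonomous mode...
mostly_autonomous = annual_premium(manual_miles=500, autonomous_miles=11_500)
# ...versus the same total mileage driven entirely by hand.
all_manual = annual_premium(manual_miles=12_000, autonomous_miles=0)

print(f"mostly autonomous: ${mostly_autonomous:.2f}")  # $155.00
print(f"all manual:        ${all_manual:.2f}")         # $960.00
```

The split in the example mirrors Peterson’s “few hundred miles manually” scenario: without knowing the per-mode mileage, an insurer could not tell these two very different risk profiles apart.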

While original equipment manufacturers are responsible in just 2 percent of crashes today, that responsibility is expected to increase to 80 to 100 percent in the future.

“If it’s going to be the manufacturer, then we have a big change in relative responsibility,” Peterson said. “Manufacturers are going to have a much larger slice, hopefully of a much smaller pie because there are going to be fewer accidents and the accidents are going to be less severe but the manufacturers are going to be the ones that bear the primary responsibility for them.”

But will auto manufacturers and technology companies be willing to accept potential future shifts in liability to them? Peterson said it looks like they will.

“Volvo, Google and Daimler Mercedes have at least said they will accept liability when their cars in autonomous mode cause an accident,” Peterson said. “I don’t know that any of them have signed on the line anywhere and it may be no more than saying they’re going to be liable when they’re liable anyway. But at least I think it helps promote or accept or get some public acceptance of these vehicles when they try to clarify some of the liability issues.”

Peterson noted, however, that the big automakers and tech companies won’t be the only ones in the autonomous vehicle marketplace. There may also be after-market suppliers offering conversion kits that allow consumers to turn their traditional vehicles into autonomous-technology-enhanced vehicles. States such as Florida, Michigan and Nevada, along with the District of Columbia, have included provisions in legislation that seek to protect original equipment manufacturers from liability in crashes involving vehicles with such after-market conversions.

An End to Mandatory Auto Insurance?

One of the biggest changes most states could face as the autonomous future evolves in the decades ahead could be an end to mandatory auto insurance, Peterson told policy academy attendees.

“Every state except New Hampshire has some form of mandatory auto insurance,” he said. “Those rules were adopted when deaths per vehicle miles traveled were really quite high. … In 1958, we killed 53 people per billion miles traveled. Roll that up today and it’s 11 to 12 … per billion miles traveled. Is it sound public policy to continue with mandatory auto insurance when the severity and frequency goes down and the responsibility of drivers shrinks and shrinks and shrinks as they are driving fewer and fewer of those vehicle miles? … Mandatory auto insurance sounds like a good idea but it’s a pretty big burden for people who are impecunious and need cars to go to work. It’s part of why you have so many uninsured cars out there. … We might need to take another look at mandatory auto insurance.”

Setting Insurance Rates

Autonomous vehicles appear likely to complicate how insurance rates are determined, Peterson noted. Today, insurance rating agencies and insurance companies set rates based on data drawn from billions of miles traveled on the nation’s roads.

“When we go now to self-driving vehicles, I think there are only 30 companies in California that have a license to test these vehicles,” Peterson said. “But these test vehicles that are being driven around California have to be driven by trained test drivers so we have virtually no data on how they would behave in the hands of you or me. Google—or now Alphabet—their vehicle has traveled about 3 million miles so far and other companies far, far less than that. So the data that we have to base a rate on for self-driving cars is very, very thin and is also not the equivalent of you and I driving these cars because these are being driven by test drivers who are trained to a certain extent.”

As these vehicles are deployed, more data will be produced. But the question will then become whether insurers can rely on that data the way they rely on the data gathered today, Peterson said.

It's also not entirely clear how much of an impact autonomous vehicles will have on reducing the number of crashes. Peterson pointed to a study done by the Casualty Actuarial Society.

“In spite of the fact that 94 percent of accidents can be attributed to some decision on the part of a human being, they did not think that self-driving cars would eliminate all of that,” he said. “In fact, they thought that in 49 percent of the cases … there was some limiting factor that would make the autonomous nature of the vehicle difficult to benefit from—most notably snow. … A self-driving car in San Diego is going to have far more utility than one in Fargo. … If you think ‘let’s just take drunks home in self-driving cars.’ There’s a big savings. We could get rid of all of those accidents that are caused by intoxication. Well, if you can’t take them home for half the year because there is snow on the ground, then you’re not going to get the benefit of that. So there are limitations and we don’t want to get too excited about getting rid of 94 percent of those accidents.”

Since autonomous vehicles will be reliant on software and programming, they will also have the potential to be constantly improving, which presents additional challenges when it comes to determining their risk profile for insurance purposes.

“If I drive a car and I get in a wreck, my insurance (company) says ‘it looks like you were at fault for that wreck. I’m going to raise your rates because you’re a bad driver,’” Peterson said. “But with a self-driving car, you get in an accident, all of the cars in the fleet are better because they will study that accident, figure out what happened and then download a fix to the entire fleet so that that accident won’t happen again. So it’s an entirely different experience. And also, every download and every patch and every update that goes out to these cars and these fleets changes the entire risk profile of that fleet. That’s something insurance companies are not used to dealing with. Insurance changes at a very stately pace when it’s you and I driving. We don’t get that better that fast but these cars will.”

Peterson said as the transition to fully autonomous vehicles is taking place and the insurance marketplace is evolving, there will be some important things for policymakers and state insurance regulators to keep an eye on.

“One of the things I think regulators are going to want to look at—particularly ones in states that really try to keep the premiums down—is are insurance companies passing on to their insureds the savings that you get from automatic braking and autonomous vehicle features?” he said. “There is one (auto insurance) company called Root in Ohio. They’re now offering a discount to Tesla drivers but the discount is based on how many miles the Tesla is in Autopilot. So you’re going to get rated depending on what kind of driver you are when it’s not in Autopilot and it’s rated in a different way when it’s in Autopilot because they have at least concluded to their satisfaction that it’s a lot safer when it’s in Autopilot than when it is in your hands.”

Federal Preemption

Peterson also returned to an issue he alluded to during an earlier policy academy session: whether insurance and liability are likely to remain the domain of state governments. He said that while the federal government might have the authority to preempt states on autonomous vehicle insurance policy, he doesn’t see it happening anytime soon because there is currently no burning issue that would prompt it to act.

“I know insurance companies are working on this and trying to come up with ways to cover these vehicles and I’m going to assume that they are going to work it out,” he said. “If they didn’t then I could see some pressure at the federal level.”

Peterson said his home state of California presents an unusual case because of Proposition 103, a 1988 voter-approved measure addressing insurance rates and regulation. The measure requires insurers to treat a person’s driving record as the most significant factor in setting rates; the vehicle itself can rank no higher than fourth among rating factors.

“That I think may very well distort the rates at least for private auto insurance and if it distorted them in a way that was making it difficult to deploy these vehicles, then I could see possibly someone stepping in,” Peterson said. “But I think it’s not going to be a burning issue because I don’t see fully autonomous vehicles being deployed in the hands of you and me. I don’t think we’re going to buy them. … I think they’re going to be deployed on a fleet basis by service providers and you’re going to subscribe. And those weird rules … do not apply to (rideshare companies) and groups like that. In fact, they don’t apply to anyone that owns more than four vehicles.”  

Portions of the policy academy focusing on insurance and liability issues were presented in collaboration with The Institutes Griffith Insurance Education Foundation.