Do self-driving cars spell doom for vehicle insurance?
The self-driving car seems like a terrible driver’s dream: no steering wheel to control, no trouble parallel parking, no concerns about having a couple of beers on the way home from work. But the removal of fallible human drivers might create havoc for automobile insurance.
Human error accounts for 90 percent of road accidents, according to the International Organization for Road Accident Prevention, and our collective failure rate helps automobile insurance companies rake in a cool $157 billion in premiums each year. So if self-driving cars eliminate all those driver mistakes, the business of auto insurance will necessarily change as well.
A recent study by RAND Corp. predicted increased liability for auto manufacturers while personal liability plummets. “If a vehicle and a human share driving responsibility, the insurance issues could become more complicated,” the study noted. The RAND authors had a sober message for insurance companies: if automated vehicles succeed in reducing the risk of crashes, the industry could see a “significant reduction in insurance premiums.” The average cost of auto insurance in the US is $1,020, according to AAA, giving the industry a lot to lose.
In writing collision insurance policies for driverless cars, underwriters will likely focus on the make and model of a car instead of a driver’s accident history or how often he drives. There may also be the introduction of “black boxes,” data recorders akin to those found in airplanes that can track car data and decipher what really happened in the seconds before a crash.
Another possibility: an uptick in no-fault insurance, where an insurer covers the damage regardless of who’s to blame. This type of insurance has fallen out of favor in recent years as the cost of medical claims has increased, despite the expectation of lower litigation costs.
While it’s too early to tell which way the insurance industry will go, there’s another cost for car owners that may go up. With fewer accidents to keep the nation’s mechanics busy, repairing that fender bender may cost you more.
But until then, we still have to shop around for automobile insurance quotes, and it will be a long time before those quotes are of no interest at all.
The problem with self-driving cars: they don’t cry.
Sure, we can make a self-driving car, but can we make a self-driving car with feelings?
Noah Goodall, a University of Virginia scientist, asks that question in a new study of autonomous driving. Goodall (no doubt a big fan of the Terminator movies) isn’t so much worried about driving as he is about crashing: can robot cars be taught to make empathetic, moral decisions when an accident is imminent and unavoidable?
It’s a heady but valid question. Consider a bus swerving into oncoming traffic. A human driver may react differently than a sentient car would if, for example, she noticed the vehicle was full of school kids. Another person may swerve differently than a robot driver to prioritize the safety of a spouse in the passenger seat.
This stuff is far more complicated than calibrating safe following distances or even braking for a loose soccer ball. Goodall writes: “There is no obvious way to effectively encode complex human morals in software.”
According to Goodall, the best options for car builders are “deontology,” an ethical approach in which the car is programmed to adhere to a fixed set of rules, or “consequentialism,” where it is set to maximize some benefit, say, driver safety over vehicle damage. But those approaches are problematic, too. A car operating in those frameworks may choose a collision path based on how much the vehicles around it are worth or how high their safety ratings are, which hardly seems fair. And should cars be programmed to save their own passengers at the expense of greater damage to those in other vehicles?
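To see how the two frameworks can disagree, here is a purely illustrative sketch. Every name, option, and harm score below is invented for the example; no real vehicle reduces ethics to a few lines like this. It only shows the structural difference Goodall describes: a rule-bound chooser versus a harm-minimizing chooser picking differently from the same menu of bad outcomes.

```python
# Toy contrast between a deontological (rule-first) and a
# consequentialist (harm-minimizing) crash decision.
# All options and harm numbers are hypothetical.

from dataclasses import dataclass


@dataclass
class CrashOption:
    name: str
    breaks_rule: bool      # e.g. leaving the lane or striking a pedestrian
    expected_harm: float   # invented harm estimate; lower is better


def deontological_choice(options):
    """Pick the least harmful option among those that break no fixed rule."""
    allowed = [o for o in options if not o.breaks_rule]
    pool = allowed or options  # if every option breaks a rule, consider all
    return min(pool, key=lambda o: o.expected_harm)


def consequentialist_choice(options):
    """Pick whatever minimizes expected harm, rules aside."""
    return min(options, key=lambda o: o.expected_harm)


options = [
    CrashOption("swerve onto shoulder", breaks_rule=True, expected_harm=1.0),
    CrashOption("brake hard in lane", breaks_rule=False, expected_harm=3.0),
]

print(deontological_choice(options).name)    # "brake hard in lane"
print(consequentialist_choice(options).name)  # "swerve onto shoulder"
```

The rule-first chooser refuses the lower-harm swerve because it violates a rule, while the harm-minimizer takes it; that divergence is exactly where the fairness questions above begin.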
In a crash situation, human drivers process a staggering amount of information in fractions of a second. The computer does the same thing, but much faster, and its decisions are effectively already made, set months or years earlier when the vehicle was programmed. It just has to process; it doesn’t have to think.