It’s all the rage lately, the “morality” issue with self-driving cars or autonomous vehicles: What should my self-driving car do if it sees that an accident is inevitable, and somebody has to die?
The Scenario: The Car Crash That Can’t Be Avoided
Suppose I am the sole occupant in my smart car, cruising along happily on the highway. It’s driving, I’m reading.
Perhaps I’m a nice enough guy, but sadly dying of an incurable cancer, with two months to live. I’m on my way to hospice.
In my lane directly ahead of me is a school bus full of angelic, talented young people, just starting out on the journey of life. They have so much to offer, and so many good years to offer it.
Just behind me is a car speeding out of control. It’s going to rear-end me in a few seconds. The laws of physics dictate that a horrendous collision is inevitable.
My beautiful autonomous vehicle does a quick calculation, and it determines that there is just enough time for my car to switch lanes. I’m safe!
Of course, this means that the school bus will get demolished instead.
One final detail: the car software knows that if I just stay in my lane and get hit, the school bus will be spared.
The Red Herring: Me or Them?
Maybe I should give it up for the kids? After all, I don’t have much longer to live, and they have their entire young lives ahead of them. Should I do the “right” thing, and stay in my lane?
Most people would say yes, but here’s the rub – it’s not my decision. There’s no time. It’s up to the car’s software.
Why I Don’t Give a Flying Fig
A lot of people are spinning their wheels on this one. Why is this an irrelevant or even dangerous question?
Well, here’s an interesting fact to chew on: in the United States alone, an average of 90 people die in car crashes every day. According to the World Health Organization, there are on average more than 3400 deaths every day on the roads world-wide.
Autonomous/self-driving vehicles would prevent 93% of these deaths, according to Lloyd’s. That translates to saving 83 lives per day in the US, and 3160 per day world-wide.
Which means that for every minute we delay the implementation of self-driving technology by debating this silly red herring, more than two additional people die – roughly one every 27 seconds. And the injury rate is one serious, life-changing injury every 2½ to 5 seconds.
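If you want to check my math, the figures above follow directly from the WHO daily totals and the Lloyd’s 93% estimate quoted earlier – here is the arithmetic as a quick sketch:

```python
# Figures quoted in the text above
us_deaths_per_day = 90        # average daily US road deaths
world_deaths_per_day = 3400   # WHO average daily road deaths world-wide
preventable_fraction = 0.93   # Lloyd's estimate of deaths autonomous vehicles would prevent

us_saved = us_deaths_per_day * preventable_fraction        # 83.7, rounded to "83 lives per day"
world_saved = world_deaths_per_day * preventable_fraction  # 3162, rounded to "3160 per day"

# How often a preventable death occurs world-wide
seconds_per_day = 24 * 60 * 60
interval_seconds = seconds_per_day / world_saved  # about 27 seconds between preventable deaths

print(round(us_saved), round(world_saved), round(interval_seconds))
```

That last number is the whole point: one preventable death roughly every half-minute, every day, while the debate drags on.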
So that’s why I don’t give a flying fig, and you shouldn’t either. While these theoreticians debate how many angels can dance on the head of a pin, real people are suffering and dying every second of every day.
And that’s why I take this hard line: Anyone in government, industry, academia, or anywhere else who needlessly delays the implementation of self-driving/autonomous technology by even one minute is tantamount to being an accessory to murder. Strong stuff, but tell me why I’m wrong.