Ego Morality
Today's topic is how 'morality' will be programmed into autonomous vehicles. There has been some recurring press on this issue over the last month and I've had some time to formulate my thoughts on it. It is tough as, like most of first-world humanity, I think I'm moral, but I've never faced the really tough life-and-death moral decisions that put that assertion to the test. Most of my morality training has come from Star Trek, and that's not bad, but it is fairly basic and presumes that I'm sitting on the bridge of a starship in orbit, debating the gray areas of the Prime Directive. I hope I never have to face even that. Should it happen, though, I'm pretty sure that if I have the time to think through the decision, I'll do the right thing.
But, that's the catch: 'if I have the time to think it through'. The place where this is most likely to happen on my current life path is when I drive. It is easily the most dangerous thing that I do on a daily/weekly/monthly basis (I ski and take some bigger risks there, but am only able to get in about twelve days a year). Driving is a task that, when it goes south, goes south too quickly for my frontal lobe to override my learned motor skills and lizard-brain reflexes, reflexes that have been fine-tuned over hundreds of thousands of years to keep the individual alive.
To the point
The thought experiment that all of these articles walk us through is about how an autonomous car will react if it is faced with an us-or-them, life-and-death driving decision. For instance, it comes around a corner at speed. There is a cliff off the shoulder on its side of the road and a bunch of people crossing the road in front of it. (I imagine some part of California Route 1, around Big Sur, and these are tourists crossing from the northbound lane to the western side of the road so that they can take pictures.) There is no time to slow down and stop before plowing into the pedestrians. The car must decide whether to stay on the road and hit the people or to drive off the cliff and sacrifice itself.
Not pedestrians, but you get the point.
(From this video loaded onto YouTube by 'MK Biswas73')
My sensationalist take is that this is a GOOD THING. The car will have the processing power and the code (unencumbered with millennia of survival instinct) to make the decision in time. I'm pretty sure that I, along with the vast majority of my fellow suburban, coffee-swilling, texting sedan pilots, will not. I might get out an "Oh, Fu...!" before leaving a seven-ten split through the crosswalk. If I have time to swerve, it will be toward the cliff, but I know that I'll counter-steer as soon as I see what is about to happen.
The point is, I'm much more likely to save myself at the expense of the pedestrians than is the autonomous car.
Relativistic Morality
Which side you come down on this issue is going to have to do with your frame of reference: are you more likely to be a driver or a road crosser? Drivers want control and are going to have a problem with driving off a cliff. Pedestrians, outside of Big Sur tourists, are more likely to be at the mercy of the world around them. Either they don't own a car and their two feet are how they get around, or their car is broken, or they are on their way to public transport. They are already feeling put upon, and getting hit by a privileged vehicle owner is more than likely to 'trigger' their sense of social injustice right in the rib cage. At least, with an autonomous vehicle, the pedestrians have a higher chance of survival.
Again, the point. The point is that autonomous cars may be a moral leveling factor for the vehicle-challenged. Despite recent accidents (which I'm going to excuse as 'working the bugs out'), autonomous vehicles should be, will be, safer for our roads and will save lives compared with the yearly massacre that is human-piloted vehicles. The only people who will really suffer will be the insurance industry, though they still have a few cards to play.