The Ethics of Self-Driving Cars
Earlier today, I came across a brief article about the progress being made in driverless-vehicle technology. I’m most definitely not a Luddite, and I’m generally excited by the thought of this tech becoming available, but the article got me thinking about the moral and ethical questions involved.
For example, consider a reasonable thought experiment: a self-driving car is moving along and detects a child who has run into the road directly in front of it. With no time to stop, a decision has to be made. Swerve right into a tree, parked car, or other obstacle and risk the owner’s safety? Swerve left into oncoming traffic and risk both the owner and the oncoming drivers? Or continue straight and strike the child?
As a product made by a for-profit company, would it be acceptable for that company to prioritise the paying customer’s safety above everything else, even if their risk of injury is less than that of pedestrians?
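To make that question concrete, here is a deliberately oversimplified sketch of what such a decision rule might look like. Everything here is hypothetical, the option names, the risk figures, and the weighting scheme; no real autonomous-vehicle system is this simple. The point is only that the ethical choice ends up encoded somewhere, in this case as a pair of weights:

```python
# Toy model of the swerve dilemma. All names and risk figures
# are invented for illustration, not drawn from any real system.

OPTIONS = {
    # option: (estimated risk to occupant, estimated risk to others)
    "swerve_right": (0.6, 0.0),   # hit a tree or parked car
    "swerve_left":  (0.5, 0.5),   # into oncoming traffic
    "continue":     (0.0, 0.9),   # strike the child
}

def choose(options, occupant_weight=1.0, others_weight=1.0):
    """Pick the option with the lowest weighted total risk.

    The weights ARE the ethical question: a manufacturer that sets
    occupant_weight higher than others_weight has decided, in code,
    to prioritise its paying customer.
    """
    def cost(name):
        occupant_risk, others_risk = options[name]
        return occupant_weight * occupant_risk + others_weight * others_risk
    return min(options, key=cost)

# With equal weights, swerving right has the lowest total risk (0.6):
print(choose(OPTIONS))                          # swerve_right
# Weighting the occupant 3x flips the choice to striking the child:
print(choose(OPTIONS, occupant_weight=3.0))     # continue
```

Notice that nothing in the function itself looks like an ethical decision; the moral weight is hidden in two innocuous-looking default parameters, which is exactly what makes the review question below worth asking.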
Programmers will be writing the code that makes these decisions. Do they bear some moral responsibility for the decisions their code makes? What about legal responsibility? Will there be ethical guidelines to help when writing the code? Given the recent VW emissions scandal, should someone be reviewing the code to make sure such guidelines are actually followed? Do you feel comfortable letting these decisions be made by people who will be far removed from any personal involvement in the consequences?
If you are a programmer, how would you feel about participating in and writing code for a project like this?
As I said, I’m genuinely excited by the thought of this technology coming into use, but I think these are good questions that I don’t have easy answers for. I’m very curious what people think, so please feel free to post and discuss.