Will Google Kill ‘The Fat Man’?
“The psychology seems to be quite clear,” Edmonds says. “It’s easier to press a button on a joystick in the deserts of Nevada and blow up a compound in the mountains of Northern Pakistan than it is to put the bayonet between somebody’s eyes. The more divorced we are from the outcomes of our actions, the easier it is to take lives.”
One way these moral questions will ultimately manifest is through a less violent, and more insidious, technology: the driverless car. “[Driverless cars] are going to encounter things and make choices without humans being around,” Edmonds says. “At that moment, they’re going to need some kind of ethical programming.”
Situation A: A driverless car’s brakes fail, and it’s about to run over five people. The car can swerve, but if it does, it will hit one innocent person.

Situation B: A driverless car’s brakes fail, and it’s about to plow into 10 bicyclists. The car can swerve to avoid them, but if it does, it will careen off a mountain ridge and kill the person inside.

Situation C: A driverless car’s brakes fail, and it’s about to run off a bridge and into a giant crowd of people. But, in this case, the car can swerve slightly and run into our beloved Fat Man, who will prevent the fall.