AI Trolley Problem
I had to cut this from my business article because it didn't fit very well with the voice of the article, but I really liked the ideas I brought up.
The trolley problem is one of the most well-known examples of an ethical dilemma and has thousands of variations. If you’ve ever seen the TV show The Good Place, you’ll know that it is easy to say what you think you would do, but it is a lot harder to actually do it. A major lesson of the trolley problem is that there is no right answer; there is no consensus on which decision is correct.

Consider the following variation on the trolley problem. You are driving your car in the afternoon down a two-way street in the big city. As you enter an intersection, a car traveling the opposite direction tries to make a left turn in front of you! You have to make a quick decision: do you swerve to avoid a head-on collision, or do you keep going and crash? What if there is an old lady carrying her groceries across the street right where you were going to swerve? This is exactly the kind of decision that self-driving cars must be prepared to make.
Eventually there will be a situation in which the AI is forced to choose between crashing into one car or another. Right now, the people who make this decision are the engineers writing the code that governs self-driving cars. Are you comfortable letting a Google employee decide who lives and who dies? Who should get to make that decision? Self-driving cars are one example of autonomous machines facing ethical dilemmas, but similar problems can arise any time AI is used to automate a process or machine. The White House recently published the Blueprint for an AI Bill of Rights, a document that outlines responsible practices surrounding the creation and use of AI systems. Having a governing body is an important step toward ensuring the ethical use of AI. In the Dune universe created by Frank Herbert, a robot rebellion and the ensuing war led to the outlawing of so-called thinking machines. This is a future that we need to actively prevent from happening.