I answer surveys online to make money, and recently I completed one about this: some car manufacturers (their identities weren't disclosed) wanted to gather public opinion on the matter. It was the most intense series of dilemmas I've ever faced, like the "do you divert the trolley" stuff, except, you know... they're going to actually adapt the cars' reactions to "our"/"the public's" choices.
One of the biggest fears all truck drivers have is the kid who comes running out from behind a parked car chasing a ball. Most of them have already decided that if it ever happens, they will flip the truck or put it in the trees or the ditch, even at the risk of their own lives, to avoid hitting the kid. This car is programmed to do the opposite, and I can't say that's a good thing. Is my life worth more than some junkie crossing the road? Sure. But if some kid makes a mistake chasing a ball? No, I'd rather take my chances hitting a tree than hit the kid. Self-driving cars are great in theory, but there are going to be problems with ethical dilemmas like this. What if it's "end up in the corn field" or "hit a dog"? I'd choose the field, but it's up to the car/programmer. Pretty scary thoughts.
You're right, this is a very complex issue that unfortunately doesn't have a "right" answer. In the end I believe it comes down to marketing: "Would you get in a car that values your life less than that of others?"
An interesting moral debate could be whether to include an option that values the lives of passengers or pedestrians more, but even ignoring the R&D required to add two separate AIs to all vehicles, the argument would likely be just as long.