I’ve been seeing notes from people who are worried about the decisions self-driving cars will make, since the car will be controlled by software, which can contain bugs. If there’s a choice between colliding with another car or hitting a bicyclist, what does the car do?
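Part of what worries people is that a choice like this ends up encoded somewhere as explicit logic. Here’s a minimal, entirely hypothetical sketch of what a cost-based choice might look like; the maneuvers, probabilities, and severity numbers are all invented for illustration, and real planners are vastly more complex.

```python
# Hypothetical sketch: one way a planner might rank evasive maneuvers.
# The maneuvers, probabilities, and severity weights below are made up
# for illustration; this is not how any real vehicle works.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    collision_probability: float  # estimated chance this maneuver ends in a collision
    severity: float               # estimated harm if a collision occurs (0 = none, 1 = worst)

def expected_cost(m: Maneuver) -> float:
    """Expected harm: chance of a collision times how bad it would be."""
    return m.collision_probability * m.severity

options = [
    Maneuver("brake hard, stay in lane", collision_probability=0.9, severity=0.3),
    Maneuver("swerve left into oncoming car", collision_probability=0.5, severity=0.9),
    Maneuver("swerve right toward bicyclist", collision_probability=0.4, severity=0.8),
]

best = min(options, key=expected_cost)
print(f"chosen maneuver: {best.name} (expected cost {expected_cost(best):.2f})")
```

The uncomfortable part is that someone has to pick those numbers, which is exactly why this debate matters.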
I found this piece commenting on the issues, and for the most part I agree. It’s not as if humans don’t already have to make these decisions, and they make bad choices all the time. Cars kill a lot of people: around 30,000 per year, according to some stats I’ve seen. How many of those deaths might be preventable if the car drove itself?
Drunk driving certainly changes, as does texting while driving, or being distracted by kids/makeup/food/etc.
I do think the code that powers these cars should be open and reviewed. The government might need to set standards for how situations are handled, and certainly someone should take responsibility for the decisions a self-driving car makes. But as I get older and find myself busier, I’d love a self-driving car, and I look forward to the day I can buy one.