I’m not really a ‘car’ kind of person, so I don’t usually follow news articles about the car market.
But I couldn’t help stopping in my tracks just a couple of weeks ago when I saw an article forecasting that self-driving cars will be on the market within five years. Why?
1. The thought of driverless cars zipping around the roads was once a little overwhelming, while at the same time reminiscent of that immortal TV show – The Jetsons. Stop for a tic to either wander down memory lane or discover this futuristic family for yourself!
2. The second reason I stopped in my tracks was that I had the pleasure of going for a drive in one about a year ago – the Tesla Model S. Even though I was a little nervous sitting in the front passenger seat while watching the driver’s hands be anywhere other than on the steering wheel, it really was quite an awesome experience!
While the article I read seemed to be a promo for Ford’s predicted entry into the self-driving market, others including Google, Uber and BMW are starting to compete with Tesla, which is so far advanced in the self-driving market that it is now working on a more affordable version.
Self-driving cars are, it is said, a development that will place the incredible advances that have been made in artificial intelligence squarely into the lives of the masses. But with this development, a whole range of ethical issues arises. And like many of you, I hadn’t considered these issues until I read a recent article in NovaNext: Can Autonomous Cars Learn to be Moral? (July 27, 2016)
As artificial intelligence develops increasingly subtle and complex decision-making processes, it will become harder to determine who is accountable for a machine’s actions: the engineer who designed it, the consumer who purchased it, or the machine itself.
The kinds of decisions that need to be incorporated into the ‘thinking’ of self-driving cars are really quite scary. If, for example, a self-driving car is heading into a collision with another vehicle or an oncoming train, should it veer sideways to avoid the crash, knowing that the car and its driver will roll down the bank at the side of the road and that the driver may be injured or killed?
This kind of ethical dilemma, referred to as the Trolley Problem, has long been debated by philosophers.
Food for thought – no?