In sad news, an Uber autonomous self-driving car was involved in a collision with a pedestrian in Tempe, Arizona. The woman was taken to hospital and later died of her injuries. This is believed to be the first time a self-driving car has been involved in an accident in which a pedestrian was killed, although a Tesla driver was killed in 2016, allegedly after failing to keep his hands on the wheel as he had been advised to do.
Uber have pulled all their autonomous self-driving cars off the road, and in a statement on Twitter (@Uber_Comms, March 19, 2018) said: “Our hearts go out to the victim’s family. We’re fully cooperating with @TempePolice and local authorities as they investigate this incident.”
It’s hard to say how this will affect the driverless car industry, which has many players. Manufacturers often claim their cars will cut road deaths, as the technology is more reliable than humans and cannot become tired or distracted. However, this incident will doubtless give ammunition to activists who disbelieve the safety claims or simply don’t want to see drivers’ jobs taken over by robot cars.
In this incident there was a human driver in the car whose role was to take over in an emergency. Quite what the legal position is has never been tested: is the driver responsible, or will the companies that manufactured the car or supplied the software be deemed at fault? Either way it’s terribly sad, and our thoughts go out to the victim’s family.
2 Responses
Food for thought: a spokesman for the industry said it’s right to be concerned about driverless cars, but be terrified of cars driven by humans. Over 100 people a day die in America in traffic accidents.
“Quite what the legal position is has never been tested: is the driver responsible, or will the companies that manufactured the car or supplied the software be deemed at fault?”
The (human) driver is at fault.
The laws haven’t been changed to make the car liable, so you need a driver in there purely to take responsibility. His job is as much to take the blame when something goes wrong as it is to prevent things from going wrong.
– As the law stands, if you stab someone, you’re responsible, not the knife manufacturer. The human is in control of the tool, regardless of how smart the tool is.
I’ve brought this up a few times before: at some point (I don’t think it applies to this particular case) a car is going to DELIBERATELY take a human life.
Presented with the trolley problem (do you kill one person or five people?), the car must make a choice, and that choice is pre-programmed.
Somebody has already written the code that may purposefully choose to end your life.
So when it does kill you, and some guy five years ago was the one who made that choice, does he get charged with murder? It was premeditated and deliberate, not an accident: a pre-programmed choice that was by definition foreseen. They just didn’t know who the victim would be, but they knew there would be a victim.
– Humans don’t operate like that; we make mistakes, to err is human. The car will deliberately kill you, then print out a logic report of why it decided you were the least fit to live (something like the sketch below).
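To make the commenter’s point concrete, here is a minimal, purely hypothetical sketch of what a pre-programmed “fewest casualties” rule might look like. Every name and number here is invented for illustration; it does not reflect Uber’s software or any real vendor’s code.

```python
# Hypothetical sketch only: the commenter's "pre-programmed choice",
# written years before any crash. Nothing here is real vendor code.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    expected_casualties: int

def choose_maneuver(options: list[Outcome]) -> Outcome:
    """Pick the outcome with the fewest expected casualties.

    Whoever wrote this rule decided, in advance, whom the car will hit
    when every available option harms someone.
    """
    return min(options, key=lambda o: o.expected_casualties)

if __name__ == "__main__":
    options = [
        Outcome("swerve left into one pedestrian", 1),
        Outcome("brake straight into five pedestrians", 5),
    ]
    decision = choose_maneuver(options)
    # The "logic report" the commenter imagines the car printing out:
    print(f"Chosen maneuver: {decision.description} "
          f"(expected casualties: {decision.expected_casualties})")
```

The point of the sketch is that the choice is made at coding time, not at crash time: the `min` over expected casualties is fixed long before anyone is on the road.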