Uber Halts Autonomous-Car Testing After Fatal Arizona Crash

A self-driving car from Uber Technologies Inc. hit and killed a woman in Tempe, Arizona, on Sunday evening, in what is likely the first pedestrian fatality involving a driverless vehicle.

Speed-Racer (217d ago)

Unfortunate... I think Uber rushed into the autonomous game without the kind of thorough testing that Tesla and Google are doing.

Speed-Racer (216d ago)

After reading up more about it, it seems the lady was at fault for suddenly walking out into flowing traffic. I don't think anyone could have easily avoided hitting her. Darwin award on her part?

Cobra951 (216d ago)

Yes, she brought this on herself. But a human driver may have instantly recognized hitting her as the thing to avoid at all costs, while an AI may have followed a totally different logic path, if it saw her as a human at all. We have all had idiots walk out in front of us on the street, haven't we? I haven't hit one yet, in a few decades of driving.

EazyC (216d ago)

I wonder how long it will take for this to be accepted. I think we are still a ways off from viewing an autonomous vehicle crash the same way we would one caused by a human driver. I wouldn't get in a self-driven car, personally -- but I recognise that that's a sign of changing times... I imagine people had the same attitudes about aeroplanes in the early 20th century.

Cobra951 (216d ago)

Those aeroplanes had humans controlling them, and those humans were no less capable than we are now. The question is: how capable are the AIs piloting autonomous vehicles? Does an AI get an overwhelming sense of dread when a careless pedestrian steps out into its path? Does it instantly decide that crashing into a parked car would be preferable to hitting the person? Or does it estimate the force of impact both ways, weigh the overall damage to the vehicle and occupants, and do entirely heartless plus-and-minus math instead?

Cobra951 (216d ago)

It had to happen. A bug in a videogame may be excusable. Bugs in a 2-ton guided missile are not, yet they will still happen, because programmers are not infallible and can't account for all possible conditions. Once the world fills up with autonomous machines, it will be interesting to see how laws evolve to deal with them. Is this accident criminal negligence, or does it fall into a hole in criminal law? Someone lost her life either way.