Uber’s self-driving car crash might have been prevented by older tech, says Intel

March 27, 2018

The death of Elaine Herzberg, who was struck by an Uber test vehicle (a Volvo XC90) operating in self-driving mode, has launched frenzied theorizing throughout the tech and automotive industries. Intel took its turn this week through Mobileye, the car-sensor company it acquired in 2017. In a blog post, Mobileye CEO and CTO Amnon Shashua says that the self-driving industry relies too much on emerging technologies:

“Recent developments in artificial intelligence, like deep neural networks, have led many to believe that it is now easy to develop a highly accurate object detection system and that the decade-plus experience of incumbent computer vision experts should be discounted.”

Computer vision, an older approach to image recognition, requires painstaking annotation of video streams. It has powered advanced driver assistance systems (ADAS) such as automatic emergency braking and lane-keeping support. Shashua argues that computer vision should serve as a backstop for the newer deep-learning systems, in which the AI learns to identify objects by training on many, many examples.
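To make the "backstop" idea concrete, here is a minimal sketch in Python, with made-up function names and thresholds rather than anything from Mobileye's actual software, of how a classical detector and a deep-learning detector could be fused so that either one alone can trigger a response:

```python
# Hypothetical sketch of the "backstop" idea: a classical computer-vision
# detector running alongside a deep-learning detector, with the safety
# system acting on whichever one flags an obstacle. Names and thresholds
# are illustrative, not Mobileye's actual API.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "bicycle"
    confidence: float   # 0.0 - 1.0
    distance_m: float   # estimated range to the object

def fuse_detections(cv_detections, dnn_detections, min_confidence=0.3):
    """Union of both detectors' outputs above a confidence floor.

    Treating the classical pipeline as a backstop means an object only
    needs to be seen by one of the two systems to count.
    """
    candidates = list(cv_detections) + list(dnn_detections)
    return [d for d in candidates if d.confidence >= min_confidence]

def should_emergency_brake(detections, braking_range_m=30.0):
    """Brake if any fused detection sits inside the braking envelope."""
    return any(d.distance_m <= braking_range_m for d in detections)

# Example: the deep-learning module misses a pedestrian that the classical
# module sees 20 meters ahead -- the backstop still triggers braking.
cv_hits = [Detection("pedestrian", 0.6, 20.0)]
dnn_hits = []
print(should_emergency_brake(fuse_detections(cv_hits, dnn_hits)))  # True
```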

Mobileye ADAS system response. Green and white bounding boxes are outputs from the bicycle and pedestrian detection modules. [Photo: courtesy of Mobileye]

As a test, Shashua ran Mobileye’s ADAS software on the dash cam video from the Uber crash. It was able to identify both Herzberg and the bicycle she was pushing, says Shashua, who released images showing the detection. However, the ADAS registered only “low confidence” that it was correct, and it kicked in during the last second before the crash, too late to stop in time. (While that would seem to undercut the argument that old-school tech could have prevented the crash, Shashua points out that the secondhand dash cam video was of much lower quality than what a dedicated safety system would capture.)
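A rough back-of-the-envelope calculation shows why an alert that arrives one second before impact can't save the situation. The figures below are assumptions for illustration, roughly 40 mph and hard braking at 7 m/s², not numbers from Mobileye or the investigation:

```python
# Back-of-the-envelope check (assumed figures, not Mobileye's or the
# investigation's): why a detection one second before impact is too late.

speed_mps = 40 * 0.44704                    # assume ~40 mph ≈ 17.9 m/s
time_to_impact_s = 1.0                      # when the detection fired
deceleration = 7.0                          # m/s^2, hard braking on dry road

distance_to_pedestrian = speed_mps * time_to_impact_s       # ≈ 17.9 m
stopping_distance = speed_mps ** 2 / (2 * deceleration)     # ≈ 22.8 m

print(f"gap at detection: {distance_to_pedestrian:.1f} m")
print(f"distance needed to stop: {stopping_distance:.1f} m")
# Even with instant, maximum braking the car needs about 23 m to stop but
# has only about 18 m of road left -- which is why the alert came too late.
```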

Moreover, it’s far from clear whether the crash happened because the deep-learning AI wasn’t up to the task or simply because engineers screwed up in how they implemented it. The car’s lidar (laser scanner) and radar should have easily spotted Herzberg as some kind of obstacle long before she was visible on video.
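As a rough illustration of that point, and not anything resembling Uber's or Mobileye's actual code, a lidar-based check doesn't need to classify an object at all; it only needs to notice that something solid sits in the lane ahead:

```python
# Illustrative sketch: lidar doesn't need to know what an object is to treat
# it as an obstacle. A minimal gate flags any return inside the vehicle's
# path corridor, whether a classifier calls it pedestrian, bike, or unknown.

def obstacle_in_path(lidar_points, lane_half_width_m=1.5, max_range_m=60.0):
    """Return True if any lidar return lies within the lane corridor ahead.

    `lidar_points` is an iterable of (x, y) positions in vehicle coordinates:
    x = meters ahead of the car, y = meters left/right of its centerline.
    """
    return any(
        0.0 < x <= max_range_m and abs(y) <= lane_half_width_m
        for x, y in lidar_points
    )

# Example: a return 25 m ahead and 0.4 m off-center should trigger a response
# long before a camera-based classifier has decided what it is looking at.
print(obstacle_in_path([(25.0, 0.4), (80.0, 3.0)]))  # True
```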

Avoiding this accident wasn’t a “hard problem”–the term engineers use to describe big technical challenges. The incident could have been a celebrated example of self-driving technology exceeding human capability to avoid an accident. Instead, it’s a dumbfounding example of failure.

Fast Company
