Technology vs. Humans: Engineers Seek Answers in Uber’s Fatal Self Driving Car Accident

The Volvo XC90, owned and operated by Uber, shows the hardware system the company developed. On top is a rotating LiDAR scanner, which can detect objects up to 100 meters away in complete darkness. Uber has about a hundred of these vehicles at its Arizona facility. (Picture courtesy of Uber)

UPDATE: A preliminary report from the NTSB indicates that Uber’s self-driving car had 6 seconds to respond before the fatal accident, but the system got confused and was unable to correctly identify the object as a pedestrian with a bicycle. At a little over one second before the collision, the Uber system did determine that the vehicle was on a collision course with the still-unidentified object, but it did not alert the operator or apply the brakes. See the full update in the June 5, 2018 article.



Elaine Herzberg, 49, was struck and killed by a vehicle while walking her bike across North Mill Avenue in Tempe, Arizona. She became the latest of roughly 6,000 pedestrians killed every year in the US. But the vehicle that hit her was unlike any that had killed before. The Volvo XC90 SUV was owned by Uber and labeled as one of its self driving cars. In self driving mode, it was designed to be safer than a normal, human-operated car. It had LiDAR and radar to see through the dark. A bank of near-supercomputers in the back was to analyze images streaming from a camera array in real time and apply the brakes. There was an “operator,” who was to stay alert and be alerted should a collision become imminent.

None of that worked.

Although many factors may have contributed to this accident, including notable failures by the operator and the victim, the biggest engineering mystery may be the Uber car’s inability to detect and avoid the pedestrian. Uber had tested its vehicles on that very road last fall, successfully detecting jaywalkers as much as 100 yards away, according to azcentral.com.

The NTSB is on the scene and conducting a full investigation, but what follows is what we know so far.

Cameras, Radar, LiDAR and Super Computers

An array of forward-facing cameras is mounted on top of the vehicle. Some cameras may have been of the kind effective in low-light situations. Infrared cameras can detect humans and warm-blooded animals. It is not known whether the Uber vehicle was equipped with low-light or infrared cameras.

While low visible light may be a problem for both cameras and the human operator, LiDAR can make no such excuse. Operating at wavelengths outside the visible spectrum, LiDAR should have given the Uber vehicle a full “360 degree 3 dimensional scan of the environment,” according to Uber’s own documentation.

Video of LiDAR in action. Click to watch a LiDAR-equipped car navigate traffic and avoid pedestrians in DARPA's urban challenge. (Picture courtesy of YouTube)
The data from the LiDAR and the images from the cameras are fed to GPU-based computers in the back of the vehicle. Using elaborate pattern-matching algorithms, the data are processed to determine whether the brakes are to be applied or evasive action taken. The GPUs are from NVIDIA, but the company has denied its self-driving software or systems were used in the Uber vehicle.

“Something unexpectedly entering the vehicle’s path is pretty much the first emergency event that autonomous car engineers look at,” says Tech Crunch. “The situation could be many things — a stopped car, a deer, a pedestrian — and the systems are one and all designed to detect them as early as possible, identify them and take appropriate action. That could be slowing, stopping, swerving, anything.”
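The detect-identify-act loop described above can be sketched in a few lines. This is purely illustrative, not Uber’s software, whose internals are unpublished; the `Track` fields, the 2-second braking threshold and the example numbers are my own assumptions:

```python
# Illustrative sketch of a perception-to-action loop: an object tracker hands
# the planner a track, and the planner decides whether to brake. NOT Uber's
# code -- a minimal example of the detect/identify/act idea in the quote above.
from dataclasses import dataclass

@dataclass
class Track:
    label: str          # classifier's best guess: "vehicle", "bicycle", "unknown", ...
    range_m: float      # distance to the object, meters
    closing_mps: float  # closing speed; positive means we are approaching

def time_to_collision_s(track: Track) -> float:
    """Seconds until impact if nothing changes; infinite if not closing."""
    return float("inf") if track.closing_mps <= 0 else track.range_m / track.closing_mps

def decide(track: Track, brake_threshold_s: float = 2.0) -> str:
    """Brake if the object is on a collision course within braking time,
    regardless of whether the classifier has settled on a label."""
    return "BRAKE" if time_to_collision_s(track) < brake_threshold_s else "CONTINUE"

# An object 20 m ahead closing at 17 m/s (~38 mph) leaves about 1.2 s:
print(decide(Track("unknown", 20.0, 17.0)))  # BRAKE
```

The key design point, echoed later in the article, is that the decision to brake should not wait for a confident classification: an unidentified object on a collision course still warrants action.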

LiDAR units operate by emitting laser pulses. From the beam’s direction and the time elapsed between a pulse’s emission, its reflection off an object and its detection, a point on that object is located. The laser assembly rotates while a mirror sweeps the beam up and down, and the unit builds a point cloud of the environment around the vehicle.
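The time-of-flight arithmetic is simple enough to sketch. A minimal illustration, not Velodyne’s or Uber’s code; the 800 ns round-trip time and the angle-to-Cartesian conversion are my own example values:

```python
# Sketch of LiDAR time-of-flight ranging: a pulse goes out, reflects, and
# returns; half the round-trip distance at light speed is the range.
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_s: float) -> float:
    """One-way distance to the reflecting surface (half the round trip)."""
    return C * round_trip_s / 2.0

def point_from_pulse(round_trip_s: float, azimuth_deg: float, elevation_deg: float):
    """Combine a timed return with the beam's pointing angles into an
    (x, y, z) point in the sensor's frame -- one entry in the point cloud."""
    r = range_from_tof(round_trip_s)
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (r * math.cos(el) * math.cos(az),
            r * math.cos(el) * math.sin(az),
            r * math.sin(el))

# A return arriving 800 ns after emission puts the object ~120 m away,
# about the maximum range quoted for the Velodyne unit discussed below.
print(round(range_from_tof(800e-9), 1))  # 119.9
```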


LiDAR by Velodyne

The LiDAR unit, about the size of a human head, may have been this one, a Velodyne HDL-64E. Said to cost between $75,000 and $100,000, the unit uses 64 lasers and can rotate at up to 900 RPM, producing 2.2 million points per second. The lasers emit light at a 905 nm wavelength (eye-safe, infrared), lighting up objects as far away as 120 m. Velodyne makes the most sought-after LiDAR units for self driving vehicles; the company cannot make enough, and there is a “multi-month backlog” for the devices. LiDAR is used by all self driving cars except Tesla, which favors radar.

Uber may have had the previous generation model with lower resolution, says Greg Locock on EngTips, who calculates that at 120 m away, 6 seconds from impact, a person with a bike would come into range as a 2 pixel by 2 pixel blob in a picture 900 pixels wide. Braking time from 40 mph is about 2 seconds. The blob would be persistent and therefore easy to track, he says.
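Locock’s estimate is easy to sanity-check. Assuming (my assumptions, not his stated ones) a frame 900 samples wide covering a full 360 degrees, the HDL-64E’s 64 lasers spread over a roughly 26.9-degree vertical field of view, and a person pushing a bike presenting a target about 1.0 m wide by 1.8 m tall:

```python
# Back-of-envelope check of the "2x2 pixel blob" estimate: how many LiDAR
# samples would a pedestrian with a bike span at 120 m?
import math

RANGE_M = 120.0
AZIMUTH_STEP_DEG = 360.0 / 900   # assumed: 900-sample-wide frame over 360 deg
VERTICAL_STEP_DEG = 26.9 / 64    # HDL-64E: 64 lasers over ~26.9 deg vertical FOV

def angular_size_deg(extent_m: float, range_m: float) -> float:
    """Angle subtended by an object of the given extent at the given range."""
    return math.degrees(math.atan2(extent_m, range_m))

width_px = angular_size_deg(1.0, RANGE_M) / AZIMUTH_STEP_DEG   # ~1.0 m wide target
height_px = angular_size_deg(1.8, RANGE_M) / VERTICAL_STEP_DEG  # ~1.8 m tall target

print(f"{width_px:.1f} x {height_px:.1f}")  # 1.2 x 2.0 -- consistent with a ~2x2 blob
```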

“It should have been trivial for the object processor to determine that there was a moving object about to get hit by the car. Consider that in the 1.2 seconds from that point, there should have been at least 6 complete frames, and more than 1230 LiDAR returns from the pedestrian (actually way more, since the range was decreasing), it should have been impossible for the object processor to ignore that pedestrian,” says IRstuff on EngTips.
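The order of magnitude in that quote can be reproduced with a rough model. Every number here is my assumption for illustration: a 5 Hz spin rate (6 frames in 1.2 s), the HDL-64E’s quoted 2.2 million points per second from 64 lasers, and a 1.0 m by 1.8 m target at roughly 20 m, about where a car closing at 38 mph would be 1.2 seconds before impact:

```python
# Rough estimate of how many LiDAR returns a pedestrian could produce in the
# final 1.2 seconds. All parameters are illustrative assumptions, not
# measurements from the investigation.
import math

SPIN_HZ = 5.0          # assumed: 6 frames / 1.2 s
POINTS_PER_S = 2.2e6   # HDL-64E spec figure quoted above
N_LASERS = 64
RANGE_M = 20.0         # ~1.2 s from impact at ~17 m/s (38 mph)

az_step_deg = 360.0 / (POINTS_PER_S / N_LASERS / SPIN_HZ)  # azimuth sample spacing
el_step_deg = 26.9 / N_LASERS                              # vertical laser spacing

az_hits = math.degrees(math.atan2(1.0, RANGE_M)) / az_step_deg  # samples across target width
el_hits = math.degrees(math.atan2(1.8, RANGE_M)) / el_step_deg  # lasers across target height

returns_1_2s = az_hits * el_hits * 6  # six frames in 1.2 s
print(int(returns_1_2s))  # on the order of a few thousand returns
```

Under these assumptions the count comfortably exceeds the quote’s 1,230 figure, supporting the point that the pedestrian should have been hard for the object processor to miss.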

Marta Thoma Hall, president of Velodyne, questions why the Uber vehicle involved in the accident would not have stopped.

"Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation," wrote Thoma Hall in an email to the LA Times. "However, our LiDAR doesn't make the decision to put on the brakes or get out of her way."

NVIDIA video shown at CES and GTC demonstrates the company's ability to detect a cyclist. NVIDIA software was not used in the Uber vehicle. (Picture courtesy of NVIDIA)

Radar Maker Backs Away from the Accident

Aptiv, the company that makes the radar and camera for the Volvo XC90’s standard collision-detection and lane-keeping system, wants everyone to know its system had been disconnected. This is “standard practice as it tests its own system,” according to an Aptiv spokesperson in Automotive News Europe.


Police Blame Pedestrian

Tempe police were quick to blame the victim, a classic case of wrong place at the wrong time, and to clear Uber and its operator. “It’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” said Sylvia Moir, police chief of Tempe, Ariz. “The driver said it was like a flash, the person walked out in front of them.”

Not so fast, says Lionel Hultz on EngTips: “…she was walking and pushing the bike, and if that is true then she wasn't travelling particularly fast.” He adds, “If she came from the median on the left then she crossed almost 4 traffic lanes before the SUV hit her which certainly goes against any claims that she suddenly stepped in front of the SUV.”

Police point out that the victim was jaywalking.

The incident happened within perhaps 100 yards of a crosswalk, Moir said. “It is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available,” she said according to the San Francisco Chronicle.


The X Factor: Pathway Gives Mixed Messages

Daytime view of Mill Avenue, approaching the spot where Elaine Herzberg was struck. (View taken from Google Maps, July 2017)
Satellite view of the accident area. Click for interactive map. (Picture courtesy of Google Maps)
Mixed messages: the X-shaped pathway in the wide median on Mill Avenue near the fatal accident invites crossing, but signs dissuade it. (Picture courtesy of Google Maps)
But a theory of why Elaine was crossing the road at that very spot has emerged.

Elaine was homeless. The area may be near a homeless encampment. Videos recorded at night show people on the sides of the road in that area.

While there is a crosswalk a hundred yards away at the traffic light, that point of Mill Avenue may once have been intended as a place for pedestrians to cross. Two diagonal brick-paved paths form an X shape in the wide median strip near where Elaine was struck. The paths appear to be designed for pedestrians. The city currently tries to discourage pedestrians by putting signs at the curb in the middle of each entrance to the paths telling people to use the crosswalk at the intersection, but without fencing or barriers, anyone can easily go around the signs. At night, the signs may not even have been visible.


What Do We Know About the Driver?

For those wanting to blame the “driver,” the operator at the wheel of the Uber, 44-year-old Rafaela Vasquez, also presents an easy target. Vasquez, a convicted felon with a list of traffic violations, appears obviously distracted in the dashcam video and does not seem to see the impending collision until it is too late.

Uber defends the hiring of Vasquez. Under a different name (Vasquez had since begun identifying as a woman), she served almost four years in prison for conspiring in an armed robbery of a Blockbuster Video co-worker who was making a deposit. She was released in 2005. Uber’s background checks for drivers go back only seven years, and drivers only have to reveal traffic violations from the last three, so Vasquez was a legitimate Uber driver by Uber’s rules in Arizona. The company points to its public hiring policy, which states that everyone deserves a fair chance.

Uber came under fire in Colorado, which forbids anyone with a felony conviction from driving for a ride sharing company. The company was fined $8.9 million after a 2017 investigation showed Uber had hired 60 drivers with felony convictions, says azcentral.com.

As automation increases in a self driving vehicle, say at level 4 automation and above, there is little for a human to do. And while requiring a human to pay constant attention is easy enough to write into the operator rule book, it may be a big ask for humans in real life, and may, in effect, make them scapegoats in a system that for the most part relies on tech to detect and avoid danger.

How effective is it to demand that an operator pay 100 percent attention, eyes on the road and hands on the wheel, damned if you don’t? Consider that an operator in a test vehicle would be tempted to do the very things self driving cars promise to make possible for us all: texting and watching videos. Imagine mile after boring mile on Arizona roads, some of the longest, straightest roads in the country.

Google’s Waymo found its operators so prone to distraction and unable to reliably take over when needed that it chose not to rely on human intervention. Videos taken in 2013 by Waymo showed one operator literally asleep at the wheel. Other infractions included putting on makeup and handling cell phones, all while traveling at highway speeds, according to Reuters.

“What we found was pretty scary,” said John Krafcik, head of Waymo, during a media tour of a Waymo testing facility.

Tempe police spokesman Sgt. Ronald Elcock said impairment did not initially appear to be a factor for either Vasquez or Herzberg.


What Does the Video Tell Us?

Dashcam video from the Uber vehicle shows Elaine Herzberg walking her bike across Mill Avenue and seeing the oncoming vehicle just before impact.
The video released is high contrast and may not exactly duplicate the images taken by the cameras or what the safety driver was seeing.
The driver was looking away for 5 seconds. In that time, the car, going 38 mph, traveled nearly 280 ft. The driver appeared to be watching something below and to the right. A slight smile appears on her face that could be interpreted as amusement or enjoyment.
The operator’s hands do not appear to be hovering over the steering wheel, as Uber requires of its drivers, according to the New York Times, despite this being difficult for anyone to do for any length of time. However, level 3 automation, likely the minimum level of the Uber vehicle in question, does not require hands hovering over the wheel. Uber has not confirmed the level of automation of the vehicle involved in the accident.

The pedestrian may have crossed 2 turn lanes and 1 through lane and was struck in the right lane of the roadway, traveling a total of 42 feet.

The pedestrian was walking her bike. At a normal walking speed of around 3.5 miles per hour, covering 42 feet would have meant roughly 8 seconds of exposure on the roadway.

The pedestrian appeared to have a pink bike with several white bags on its handlebars.
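The arithmetic behind the figures above is straightforward. My own calculation from the numbers in the article, not from the investigation:

```python
# Distance covered by the car during the 5-second glance away, and the
# pedestrian's time on the roadway at a normal walking pace.
FT_PER_MILE = 5280
S_PER_HOUR = 3600

def mph_to_fps(mph: float) -> float:
    """Convert miles per hour to feet per second."""
    return mph * FT_PER_MILE / S_PER_HOUR

car_travel_ft = mph_to_fps(38) * 5   # distance at 38 mph over 5 seconds
exposure_s = 42 / mph_to_fps(3.5)    # time to walk 42 ft at 3.5 mph

print(f"{car_travel_ft:.0f} ft, {exposure_s:.1f} s")  # 279 ft, 8.2 s
```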


What Role Does A Driver Play in a Self Driving Car?

The Society of Automotive Engineers (SAE) shows the human fading away with increasing levels of automation. (Picture courtesy of NHTSA) source: https://www.nhtsa.gov/technology-innovation/automated-vehicles-safety
The Society of Automotive Engineers (SAE) and the National Highway Traffic Safety Administration have defined varying levels of automation in self driving cars. Volvo and Uber, in a partnership, had announced a plan to have level 4 self driving cars available by 2021 and the Uber vehicle in question may have been testing level 4. Google’s Waymo had been testing Level 4 vehicles in Phoenix since October, according to a local news source.

Most self driving cars on the road are level 3 prototypes where the driver may not need to monitor the vehicle but must be ready to take control at any time.

  • Level 0 – no automation: the driver is in complete control of the vehicle at all times.
  • Level 1 – driver assistance: the vehicle can assist the driver or take control of either the vehicle’s speed, through cruise control, or its lane position, through lane guidance. The driver must monitor the vehicle and road at all times and must be ready to take control at any moment, with hands on the steering wheel and feet on or near the pedals.
  • Level 2 – occasional self-driving: the vehicle can take control of both the vehicle’s speed and lane position in some situations, for example on limited-access freeways. The driver may disengage, with hands off the steering wheel and feet away from the pedals, but must monitor the vehicle and road at all times and be ready to take control at any moment.
  • Level 3 – limited self-driving: the vehicle is in full control in some situations, monitors the road and traffic, and will inform the driver when he or she must take control. When the vehicle is in control the driver need not monitor the vehicle, road, or traffic but must be ready to take control when required.
  • Level 4 – full self-driving under certain conditions: the vehicle is in full control for the entire trip in these conditions, such as urban ride-sharing. The vehicle can operate without a driver in these conditions; the driver’s only role is to provide the destination.
  • Level 5 – full self-driving under all conditions: the vehicle can operate without a human driver or occupants.
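The taxonomy above can be captured as a small lookup table, which makes the boundary the article keeps circling easy to see. This is my own encoding for illustration, not an official SAE artifact:

```python
# Sketch of the SAE automation levels as a lookup table. Each level records
# whether the human must monitor the road and whether a ready-to-intervene
# human is required at all.
from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    name: str
    human_must_monitor: bool   # must watch the road/vehicle at all times
    human_needed_at_all: bool  # is a ready-to-intervene human required

SAE_LEVELS = {
    0: SaeLevel("no automation", True, True),
    1: SaeLevel("driver assistance", True, True),
    2: SaeLevel("occasional self-driving", True, True),
    3: SaeLevel("limited self-driving", False, True),  # must take over when asked
    4: SaeLevel("full self-driving, certain conditions", False, False),
    5: SaeLevel("full self-driving, all conditions", False, False),
}

# Level 3 is the awkward middle ground: the human need not monitor,
# yet must somehow stay ready to take control.
print(SAE_LEVELS[3].human_must_monitor, SAE_LEVELS[3].human_needed_at_all)  # False True
```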


What Happens to Uber’s Self Driving Program?

The death of a human being by an automated anything is a bombshell event. A week after Uber’s fatal accident, Governor Doug Ducey shut down the company’s self driving operation in Arizona, calling the incident a “deplorable failure” in a letter to Uber’s CEO, Dara Khosrowshahi. Ducey had been Uber’s champion, welcoming the company’s fleet of self driving cars to Arizona in 2015 after Uber failed to get permits in San Francisco. Uber had also worn out its welcome in Pittsburgh, after poaching researchers from Carnegie Mellon, charging for rides it promised for free, and not delivering on jobs for the locals, say reports in the Wall Street Journal and the New York Times. Will the fatal accident also kill Uber’s self driving car program?


What Happens to Elaine’s Family?

Uber said it settled with Elaine’s daughter, with terms not disclosed. But an attorney representing additional family members has stated no settlement has been reached with them, and additional relatives are still coming forward, reports azcentral.com. Elaine, 49, had been released from prison and was homeless at the time of her death.