Waymo Rolls Out Driverless Taxis in Phoenix

Waymo One autonomous vehicle. (Image credit: Waymo.)

Five years ago, Waymo, the self-driving subsidiary of Google’s parent company, Alphabet, gave the first fully autonomous vehicle (AV) ride on public roads. After that demonstration, the company spent time refining its technology. For the past year, select customers have been beta testing Waymo’s AV taxi service, with roughly 10 percent of the rides completely autonomous (i.e., no human driver on board to take control in an emergency). The company recently announced that its driverless taxi service is now available to the general public in the Phoenix area. Let’s take a look at the evolution of the vehicle’s design and the hardware and software that could soon be cruising around your neighborhood.

Vehicle Safety

Worldwide, there are more than a million vehicle-related fatalities every year; nearly all are caused by human errors such as excessive speed, distraction, intoxication and drowsiness. When it comes to vehicle safety, AV companies are out to show that artificial intelligence (AI) is better than authentic ineptitude. 

Waymo’s AI “Driver” has logged more than 20 million miles on public roads and over 10 billion miles of simulations in 25 cities. (By comparison, in my 40 years behind the wheel, I’ve driven about a half-million miles.) The Waymo Driver has encountered more than 40,000 scenarios, each with multiple variations, including “once-in-a-million-miles” situations. 

Waymo’s Safety Report addresses the U.S. Department of Transportation’s AV safety framework by answering four fundamental questions:

  • Where am I? Using its database of 3D maps, the Driver knows where it is on the road relative to curbs, lane markers, signs and other stationary objects. It uses its sensors to fine-tune and cross-reference the information from prestored maps without relying on GPS.
  • What’s near me? Sensors and software can identify objects such as other vehicles, pedestrians and barricades, or basically anything that may be different from what’s on the stored maps. Its sensors can detect objects up to 300 meters away in any direction.
  • What happens next? Based on other objects’ speeds and trajectories, the Driver anticipates their next moves. Like a chess master, it predicts a plethora of paths and uses an algorithm to determine the most likely behaviors of the objects.
  • What should I do? Weighing all the options, the Driver then decides what to do and how to do it. It chooses the lane, speed, trajectory and steering maneuvers while constantly monitoring its surroundings in case an object does something the Driver hadn’t anticipated. (A rough code sketch of this loop follows the list.)

Data-driven decision-making. (Image credit: Waymo.)
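
Here is a minimal, hypothetical sketch of that four-question loop in Python. None of the names (hd_map.localize, sensors.detect_objects, planner.predict_paths, planner.plan) come from Waymo’s software; they are placeholders that simply mirror the localize, perceive, predict and plan steps described above.

    # Hypothetical localize/perceive/predict/plan loop. All object and
    # method names are placeholders, not Waymo's actual software.
    def drive_step(hd_map, sensors, planner):
        # Where am I? Match live sensor data against the prestored 3D map
        # (curbs, lane markers, signs) rather than relying on GPS.
        pose = hd_map.localize(sensors.latest_scan())

        # What's near me? Detect anything that differs from the stored map,
        # out to roughly 300 meters in any direction.
        objects = sensors.detect_objects(max_range_m=300)

        # What happens next? Predict likely paths for each object from its
        # speed and trajectory.
        predictions = [planner.predict_paths(obj) for obj in objects]

        # What should I do? Choose lane, speed and steering, while continuing
        # to monitor for anything the predictions missed.
        return planner.plan(pose, objects, predictions)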

Here's how the car “sees” a school crossing:

Video credit: Waymo.

Designs

There are two schools of thought when it comes to making an autonomous vehicle: retrofit an existing vehicle or build one from scratch. When Google started the project that would become Waymo, it began with a Prius retrofitted with off-the-shelf technology, followed by a retrofitted Lexus. Then, Waymo designed and built a new vehicle, the Firefly, from the ground up. The company learned many lessons from that endeavor, one of which is that the fastest way to build a fleet is not to build every vehicle from scratch but to apply AV technology to multiple platforms, which allows easier scale-up. This isn’t the same as a retrofit: these vehicles are custom-built models designed to accommodate Waymo’s AV technology. Chrysler and Jaguar agreed to produce custom versions of their Pacifica minivan and I-PACE SUV, respectively, designed to incorporate Waymo’s AV hardware and software.

Waymo Evolution. (Image credit: Waymo.)

Technology

The bulk of the specialized technology resides in the sensors, which give the Driver a view of the vehicle’s surroundings. AVs employ three types of sensors (cameras, LiDAR and radar), each with its own benefits and drawbacks. Cameras capture color and detail but can’t directly measure distance or see through fog and rain. LiDAR detects all objects and offers precise distance measurements but struggles in adverse weather conditions. Radar penetrates rain and fog but only reflects off hard surfaces. The combination provides a thorough, and often overlapping, perspective of the world.
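
As a rough illustration of why that overlap matters, the toy fusion step below merges one object’s readings from the three sensor types, letting each contribute what it measures best. The dictionary fields and example values are assumptions made for this sketch, not Waymo’s pipeline.

    # Toy sensor fusion: each sensor contributes the attribute it measures
    # best, and another fills in when one drops out. Field names and values
    # are invented for this sketch.
    def fuse_detection(camera, lidar, radar):
        fused = {}
        # Cameras supply appearance and color (e.g., a traffic light's state).
        fused["appearance"] = camera.get("appearance")
        # LiDAR gives precise range; radar backs it up in rain or fog.
        fused["range_m"] = lidar.get("range_m", radar.get("range_m"))
        # Radar measures relative velocity directly, in any weather.
        fused["velocity_mps"] = radar.get("velocity_mps")
        return fused

    # Example: the LiDAR return is lost in dense fog, but radar still reports
    # range and speed while the camera identifies the object.
    print(fuse_detection(
        camera={"appearance": "pedestrian"},
        lidar={},
        radar={"range_m": 42.0, "velocity_mps": 1.3},
    ))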

Sensors provide a long-range, 360-degree view. (Image credit: Waymo.)

Waymo’s roof-mounted dome LiDAR system provides a long-range, 360-degree view. Perimeter LiDAR units located at four points on the car have a wide field of view, helping to detect nearby objects. Twenty-nine cameras with overlapping fields of view also give a 360-degree perspective, including some long-range cameras that can see objects up to 500 meters away. To handle weather issues, the cameras are outfitted with washers, wipers and heaters.

Top view of the sensor array. (Image credit: Waymo.)

Six radar units, which see through rain, snow and fog, detect both static and moving objects and measure their velocities. This helps the Driver figure out what other objects are doing and anticipate their next moves.
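
For reference, the counts and coverage described above can be collected into a single configuration summary, as in the sketch below. The numbers come from the article; the data structure and field names are my own.

    # Summary of the sensor suite described above, expressed as data.
    # Counts and ranges are taken from the text; the structure is invented.
    from dataclasses import dataclass

    @dataclass
    class SensorGroup:
        kind: str
        count: int
        coverage: str
        role: str

    SENSOR_SUITE = [
        SensorGroup("LiDAR (dome)", 1, "360 degrees, long range", "roof-mounted overview"),
        SensorGroup("LiDAR (perimeter)", 4, "wide field of view", "nearby-object detection"),
        SensorGroup("camera", 29, "360 degrees, up to 500 m", "color and detail"),
        SensorGroup("radar", 6, "all weather", "velocity of static and moving objects"),
    ]

    for group in SENSOR_SUITE:
        print(f"{group.count:>2} x {group.kind}: {group.coverage}")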

In addition to the sensors, the rooftop dome includes an LED display that can be used to tell a passenger which vehicle is waiting for them, like a limo driver holding a sign at the airport.

The Dome. (Image credit: Waymo.)

Machine Learning and Artificial Intelligence

For an autonomous vehicle, the phrase “driver education” takes on a whole new meaning. To train the Driver, Waymo set up a custom city on a 113-acre California parcel that was once home to an Air Force base. Data scientists pored over databases of accident reports and engineers developed algorithms to handle the scenarios. 

Machine learning (ML) is an iterative process of trial and error followed by feedback and adjustments. The adjustment algorithm is typically created by engineers using their experience and intuition. Unfortunately, this process is very labor-intensive and time-consuming, so Waymo decided to take a lesson from evolutionary biology and use a competitive process called “Population-Based Training” (PBT).

Waymo’s machine learning. (Image credit: Waymo.)

Engineers set up multiple neural networks, each encountering the same scenarios, and pitted them against one another. Whichever one offered the best solution was propagated with minor tweaks (mutations, as it were) and retried, while the others were discarded. Since each “progeny” inherits its “parent’s” knowledge, the result is more accurate and uses fewer computational resources. This cycle of trial, error and refinement helps the Driver learn from its own experiences as well as the collective history of all Waymo vehicles. That’s quite a knowledge base!
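
A toy version of that competitive loop is sketched below: a population of candidates is scored on the same task, the winner is copied with small mutations, and the rest are discarded. The fitness function here is a meaningless stand-in, and this is a generic population-based training loop rather than Waymo’s implementation.

    # Toy population-based training: score every candidate on the same task,
    # keep the winner, and refill the population with mutated copies of it.
    # The "fitness" below is a stand-in, not a real driving metric.
    import random

    def fitness(candidate):
        # Dummy objective: candidates score best when "threshold" is near 0.5.
        return -abs(candidate["threshold"] - 0.5)

    def mutate(parent):
        # A child inherits its parent's parameters, plus a small random tweak.
        child = dict(parent)
        child["threshold"] += random.uniform(-0.05, 0.05)
        return child

    population = [{"threshold": random.random()} for _ in range(8)]

    for generation in range(20):
        best = max(population, key=fitness)
        population = [best] + [mutate(best) for _ in range(len(population) - 1)]

    best = max(population, key=fitness)
    print("Best threshold after 20 generations:", round(best["threshold"], 3))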

The AV Market

By the year 2026, the AV market is expected to exceed $500 billion. In addition to Waymo, several established automotive companies and numerous upstarts are vying for a piece of the pie, and many will be using the product-as-a-service model in order to turn a fleet of AVs into a steady revenue stream. As the competition continues, we’ll see improvements in sensors and artificial intelligence. Think about how much of that technology will spin off into other products, spawning innovation in industries outside the automotive arena. Just don’t let the thought distract you while you’re driving, okay?