Tesla AI Day: Musk Unveils Ambitious Plan to Be an AI World Leader

Tesla recently held its AI Day event, in which the company made a bold case to be seen as a global leader in artificial intelligence (AI) development.

The carmaker showcased three technologies to back up those claims: Full Self-Driving, the Dojo supercomputer and the Tesla Bot. Let’s take a deeper look at these technologies.

Full Self-Driving

Tesla’s “Vector Space” imaging, from the AI Day video.

Tesla is working under a cloud of regulatory investigations into the failure of its Autopilot feature to operate properly when faced with stationary emergency vehicles—and this isn’t the first time the company has been subject to regulatory scrutiny.

But that isn’t stopping the carmaker from pursuing autonomous driving—and the technology unveiled at AI Day suggests that the company’s Full Self-Driving (FSD) feature could overcome the failures that prompted those investigations. While FSD isn’t actually capable of autonomous driving right now, Tesla’s AI team is making significant strides toward that goal.

“We are effectively building a synthetic animal from the ground up,” said Andrej Karpathy, Tesla’s head of AI. “The car can be thought of as an animal—it moves around and senses the environment, and acts autonomously and intelligently. We are building all the components from scratch in-house … all the mechanical components of the body, the nervous system which is all the electrical components, and for our purposes the brain, or the autopilot.”

FSD in action. (Click for video.)

Tesla is designing a neural network that processes raw information from the eight cameras around the car in real time and recreates it as a virtual 3D representation called the “Vector Space.” That representation identifies edges, curbs, traffic signs and lights, and other vehicles—their position, direction, distance, velocity, and so on.
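
To make the idea concrete, here is a minimal sketch of what one entry in such a vector-space representation might look like. The class and field names are illustrative assumptions, not Tesla’s actual schema.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Illustrative sketch only: the names and fields below are hypothetical,
# not Tesla's actual data model.
@dataclass
class VectorSpaceObject:
    kind: str                             # e.g. "vehicle", "pedestrian", "traffic_light", "curb_edge"
    position: Tuple[float, float, float]  # x, y, z in meters, relative to the ego vehicle
    heading: float                        # direction of travel, in radians
    velocity: Tuple[float, float]         # planar velocity, in meters per second
    distance: float                       # straight-line distance from the ego vehicle, in meters

# One frame of the "Vector Space" could then be a list of such objects,
# fused from all eight camera feeds:
frame: List[VectorSpaceObject] = [
    VectorSpaceObject("vehicle", (12.0, -1.5, 0.0), 0.05, (8.0, 0.0), 12.1),
    VectorSpaceObject("pedestrian", (4.0, 2.0, 0.0), 1.60, (1.2, 0.3), 4.5),
]
```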

That data is processed by the carmaker’s Neural Net Planner, a series of AI algorithms that interprets the data and creates a route for the vehicle to follow. The Planner runs thousands of simulations a minute, allowing the vehicle not only to sense and interpret its surroundings but also to simulate and predict the behavior of the objects around it—other cars, pedestrians and obstacles. It will even remember objects that become hidden—such as a pedestrian who walks behind a cargo van—and continue to predict their behavior.
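
The occlusion idea can be illustrated with a toy example. The constant-velocity extrapolation below stands in for the learned prediction models Tesla describes; it is a sketch of the concept, not Tesla’s implementation.

```python
# Toy illustration of "object permanence": when a tracked pedestrian is occluded
# (e.g. walks behind a cargo van), keep predicting their position instead of
# forgetting them. Constant-velocity extrapolation stands in for learned models.

def predict_occluded(last_position, last_velocity, seconds_occluded):
    """Extrapolate a hidden object's 2D position from its last observed state."""
    x, y = last_position
    vx, vy = last_velocity
    return (x + vx * seconds_occluded, y + vy * seconds_occluded)

# Pedestrian last seen 4 m ahead and 2 m to the left, crossing toward the right at 1.2 m/s.
print(predict_occluded((4.0, 2.0), (0.0, -1.2), seconds_occluded=1.5))
# -> (4.0, 0.2): the planner still "expects" them to emerge on the far side of the van.
```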

This is a significant leap forward from the current technologies available in a Tesla car—and is a promising solution to problems such as how to deal with a parked emergency vehicle that has its lights flashing (which is the subject of the current investigation).

“I'm confident that our hardware 3 Full Self-Driving computer 1 will be able to achieve full self-driving at a safety level much greater than a human,” said Tesla CEO Elon Musk. “At least 200 percent or 300 percent better.... Then, obviously, there will be a hardware 4 FSD computer 2, which we’ll probably introduce with Cybertruck, so maybe in about a year or so. That will be about four times more capable.”

While most of the image processing happens in the vehicle itself, training and simulations are run in Tesla’s data centers where the neural net is trained and sensor data is analyzed to organize and label millions of objects. The resulting behavior models are then shared back with the vehicle to help fine-tune its on-the-road processing.

Not surprisingly, FSD is going to require a heck of a lot of computing power. That’s where the next big reveal of AI Day comes in: the Dojo supercomputer.

The Dojo Supercomputer

Tesla’s current supercomputer is, according to Karpathy, roughly the fifth-largest supercomputer in the world. Dojo will take that to the next level: Tesla boasts that its next supercomputer will be the fastest AI training computer on the planet, with four times the performance, 1.3 times better performance per watt, and a five times smaller footprint than its competitors—all at the same cost.

While Dojo is still in development, Tesla unveiled its proprietary D1 microchip that will be used to run the supercomputer.

While it’s currently easy to scale up compute in supercomputers, it is difficult to scale up bandwidth and extremely difficult to reduce latency. Dojo addresses this challenge with a distributed compute architecture: a large, two-dimensional compute plane populated with robust compute elements connected by extremely high-bandwidth, low-latency links. Big networks are mapped and partitioned onto the plane to extract parallelism, and a neural compiler exploits spatial and temporal locality to keep most communication within local zones, reducing the global communication load. This allows bandwidth use to scale as needed.
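
The locality argument can be sketched in a few lines. The placement function below is a simplified illustration of mapping consecutive model layers onto neighboring nodes of a 2D mesh so that activations travel only one hop; it is an assumption-laden toy, not Tesla’s neural compiler.

```python
# Toy sketch of the locality idea behind a 2-D compute plane (not Tesla's compiler):
# place consecutive model layers on neighboring mesh nodes, snaking row by row,
# so each layer's output only has to travel one hop to reach the next layer.

def snake_placement(num_layers, mesh_width):
    """Assign layer i to a (row, col) mesh coordinate such that layers i and i+1
    always land on adjacent nodes."""
    placement = {}
    for i in range(num_layers):
        row, col = divmod(i, mesh_width)
        if row % 2 == 1:                 # reverse every other row to keep neighbors adjacent
            col = mesh_width - 1 - col
        placement[i] = (row, col)
    return placement

print(snake_placement(num_layers=8, mesh_width=4))
# Layers 0-3 fill row 0 left to right; layers 4-7 fill row 1 right to left,
# so every consecutive pair of layers sits on adjacent mesh nodes.
```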

The smallest unit of the architecture, the training node, is capable of 1,024 gigaflops (about 1 teraflop) of compute. The training node architecture allows for simultaneous compute and data transfer, and Tesla’s custom instruction set architecture is fully optimized for machine learning.

Tesla’s D1 chip compute array consists of 354 interconnected training nodes that allow for data transfer rates of 4 terabytes per second. The 645-square-millimeter chip has 50 billion transistors and over 11 miles of wiring, and is manufactured using 7-nanometer technology.
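
A quick back-of-the-envelope check ties the per-node and per-chip figures together; the inputs come from the paragraphs above and the result is approximate.

```python
# Rough arithmetic check of the figures quoted above (approximate).
GFLOPS_PER_NODE = 1024      # each training node: ~1,024 GFLOPS
NODES_PER_D1_CHIP = 354     # training nodes in one D1 compute array

chip_tflops = GFLOPS_PER_NODE * NODES_PER_D1_CHIP / 1000
print(f"~{chip_tflops:.0f} TFLOPS per D1 chip")   # roughly 362 TFLOPS
```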

“This chip is like a GPU-level compute, with a CPU-level flexibility and twice the network chip level I/O bandwidth,” said Ganesh Venkataramanan, Tesla’s senior director of Autopilot Hardware and Project Dojo lead.

Twenty-five D1 chips are connected together to create a training tile, which will be Dojo’s unit of scale. To keep the compute plane high-bandwidth, power delivery and cooling are oriented orthogonally to the plane. Each training tile is capable of 9 petaflops of compute.

A cabinet with two trays, each one with six tiles, would contain over 100 petaflops of compute. And 10 cabinets create what Tesla calls an ExaPOD: a room-sized processing unit that has over 1 exaflop of compute power with uniform high bandwidth and low-latency fabric.
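
Continuing the arithmetic up the hierarchy shows how these figures fit together. The per-tile, per-cabinet and per-ExaPOD counts come from the text above, the per-chip figure comes from the earlier sketch, and all results are approximate.

```python
# Rough arithmetic up the Dojo hierarchy, using the figures quoted above (approximate).
TFLOPS_PER_D1_CHIP = 362        # derived in the previous sketch
CHIPS_PER_TILE = 25
TILES_PER_TRAY = 6
TRAYS_PER_CABINET = 2
CABINETS_PER_EXAPOD = 10

tile_pflops = TFLOPS_PER_D1_CHIP * CHIPS_PER_TILE / 1000            # ~9 PFLOPS per training tile
cabinet_pflops = tile_pflops * TILES_PER_TRAY * TRAYS_PER_CABINET   # ~108 PFLOPS per cabinet
exapod_eflops = cabinet_pflops * CABINETS_PER_EXAPOD / 1000         # ~1.1 EFLOPS per ExaPOD

print(f"tile ~{tile_pflops:.1f} PFLOPS, cabinet ~{cabinet_pflops:.0f} PFLOPS, ExaPOD ~{exapod_eflops:.2f} EFLOPS")
```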

The compute plane can be partitioned into Dojo Processing Units (DPUs), virtual devices that can be sized to the needs of the application and scaled up or down as network needs change. All the user has to do is make small changes to their scripts, and the Dojo compiler engine maps the workload onto the DPU.
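
Tesla has not published Dojo’s user-facing API, so the following is a purely hypothetical sketch of what “small changes to their scripts” might look like in a PyTorch-style training loop; the `dojo.request_dpu` call is invented for illustration and does not exist.

```python
# Purely hypothetical illustration -- Dojo's user-facing API has not been published.
# The idea described above: the training script stays essentially unchanged and only
# requests a partition size; the compiler engine decides how to map the work.

import torch

model = torch.nn.Linear(512, 512)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Hypothetical one-line change (invented names, for illustration only):
# device = dojo.request_dpu(tiles=4)   # ask for a DPU sized to four training tiles
# model = model.to(device)

# The ordinary training loop below would be untouched; partitioning across the
# compute plane would be the compiler's job, not the user's.
for step in range(10):
    x = torch.randn(32, 512)
    loss = model(x).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```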

To demonstrate Tesla’s progress in developing the system, Venkataramanan unveiled a physical training tile to the audience. He said that Tesla has already started running networks on it and that the company will start building cabinets soon. Tesla even has a next-generation plan already in place.

Beyond the Vehicle Fleet: The Tesla Bot

Dojo won’t be limited to just making FSD work. Tesla aims for Dojo to be made available for broader AI development such as robotics—which brings us to the Tesla Bot.

The Tesla Bot.

You’ve likely seen or read about the guy dancing in a Tesla Bot suit during AI Day. But Tesla is serious about making AI breakthroughs beyond its vehicles.

“There’s no reason [Dojo] needs to be used specifically for the Autopilot computer,” said Lex Fridman, an AI expert at MIT. “You can basically use this for every machine learning problem, especially one that requires scale.”

In fact, the company is confident that it can use the same hardware and software technologies that make up Dojo to create a humanoid robot. Tesla vehicles are already “semi-sentient robots on wheels,” said Musk, who reasoned that it makes sense to put the vehicle technologies in a humanoid robot. The Tesla Bot would use FSD and Autopilot system cameras in its head to navigate. The robot would also have Dojo training, neural net planning, auto labeling and simulation capabilities.

The bot would take on the boring, repetitive and dangerous work that people currently perform. The 5-foot-8-inch robot will be able to carry 45 pounds, deadlift 150 pounds, and run at five miles per hour. It will have 40 electromechanical actuators, force feedback sensors and two-axis feet for balancing, and be made of lightweight materials. Its “face” will contain a display screen for useful information. Musk anticipated that a working prototype would be available next year.

Tesla’s AI Day.

With Full Self-Driving, the Dojo supercomputer and a humanoid robot, Tesla certainly has ambitions to shape the future of AI both on and off the road. “Tesla is much more than an electric car company,” said Musk to kick off Tesla’s AI Day. “We are arguably the leaders in real-world AI.”

Read more about Tesla’s challenges as it pushes ahead with AI at Tesla Under Federal Investigation for Autopilot Failures.