Programming for Uncertainty Skyrockets Drones’ Operational Efficiency

Over the last decade, drones have grown steadily in prominence and acceptance. However, automated drone flight has been limited mostly to higher altitudes, while lower-altitude operations remain largely remote-controlled. The main obstacle is the computational complexity of avoiding obstacles while travelling at high speed. With the all-new NanoMap, that limitation is set to become history.

A flying NanoMap-enabled Drone (Credit: MIT CSAIL)

Existing approaches to drone navigation rely mostly on intricate maps that tell a drone where it is relative to obstacles on its path. The problem with this approach is that it isn't practical in a real world full of unpredictable objects: if the estimated location of an object is off even by a small margin, the drone can easily crash.

To solve this problem, NanoMap sticks to a simple principle: it recognizes that, over time, the drone's estimate of its own position becomes uncertain, and it accounts for that uncertainty. It uses a depth-sensing system to gather information about the drone's immediate surroundings. This allows it to plan motions through its current field of view, and also to anticipate how to move through fields of view it has already seen but can no longer see.
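The idea that position estimates degrade over time can be illustrated with a toy model. The function below is a hypothetical sketch, not NanoMap's actual uncertainty model: it simply assumes that odometry drift grows the 1-sigma position uncertainty linearly with the time elapsed since a measurement was taken, so older depth images are trusted less.

```python
def pose_uncertainty(elapsed_s, drift_rate=0.05, base_std=0.02):
    """Toy drift model: the 1-sigma position uncertainty (in meters)
    of a past measurement grows linearly with its age in seconds.
    drift_rate and base_std are illustrative values, not NanoMap's."""
    return base_std + drift_rate * elapsed_s

# A frame captured just now is trusted almost fully...
print(pose_uncertainty(0.0))   # 0.02 m
# ...while a frame from 10 seconds ago carries much more uncertainty.
print(pose_uncertainty(10.0))  # 0.52 m
```

A planner using such a model would demand a larger safety margin around obstacles seen in older frames, which is the behavior the uncertainty-aware approach relies on.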

The major difference between NanoMap and previous work is that NanoMap stores a group of images together with their position uncertainty, rather than rigid positions and orientations. Keeping track of that uncertainty lets the drone use its previous images when planning motion. It's as if the drone looks back in time over all the places it has been and uses that knowledge to navigate its way out of tight spots.
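The "group of images plus uncertainty" idea can be sketched in a few lines of code. This is a simplified 2-D illustration under assumptions of my own (point obstacles, a pure position offset per frame, and uncertainty folded into a distance margin), not NanoMap's actual implementation: each stored frame keeps obstacle points in its own coordinates along with an uncertain pose, and a candidate position counts as safe only if it clears every frame's obstacles by a margin inflated by that frame's uncertainty.

```python
import math
from collections import deque

class Frame:
    """One past depth measurement: obstacle points expressed in the
    frame's own coordinates, plus the frame's uncertain position."""
    def __init__(self, obstacles, position, position_std):
        self.obstacles = obstacles        # [(x, y), ...] in frame coords
        self.position = position          # estimated frame origin (x, y)
        self.position_std = position_std  # 1-sigma position uncertainty

class FrameHistory:
    """A bounded history of frames, instead of one rigid global map."""
    def __init__(self, max_frames=50):
        self.frames = deque(maxlen=max_frames)

    def add(self, frame):
        self.frames.append(frame)

    def is_safe(self, point, clearance):
        """A candidate point is safe only if every stored frame agrees,
        with the required clearance inflated by that frame's uncertainty."""
        for f in self.frames:
            # Express the query point in the frame's coordinates.
            qx = point[0] - f.position[0]
            qy = point[1] - f.position[1]
            margin = clearance + f.position_std  # hedge against drift
            for ox, oy in f.obstacles:
                if math.hypot(qx - ox, qy - oy) < margin:
                    return False
        return True

history = FrameHistory()
history.add(Frame(obstacles=[(2.0, 0.0)],
                  position=(0.0, 0.0), position_std=0.1))
print(history.is_safe((0.0, 0.0), clearance=1.0))  # True: 2.0 m away
print(history.is_safe((1.5, 0.0), clearance=1.0))  # False: inside margin
```

The design point is that uncertainty makes the margin conservative: the point 1.5 m from the obstacle is rejected because 0.5 m of clearance is less than the 1.0 m requirement plus the 0.1 m uncertainty hedge.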

In initial tests of NanoMap, drones moved as fast as 20 mph through cluttered areas such as forests and dense warehouses. Just by accounting for uncertainty, the crash rate dropped from over 25% to 2%. The system could be applied in many fields, including defense, search and rescue, package delivery, self-driving cars, and entertainment, where it would allow for more efficient filming of action sequences.

NanoMap was developed by Pete Florence (a graduate student) and Professor Russ Tedrake of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), alongside software engineers Jake Ware and John Carter. Their paper has been accepted by the IEEE International Conference on Robotics and Automation (ICRA), which is set to take place in May in Brisbane, Australia.

Interested in drone technology? Check out Drone Age: Rise of the Flying Robots.