Could a Shrimp Improve the Vision of Self-Driving Cars?

Researchers are turning to an unusual source of inspiration to improve the guidance systems of driverless cars: the mantis shrimp.

This creature has one of the most complex visual systems in the natural world, and scientists at the University of Illinois at Urbana-Champaign have developed a new type of camera modeled on the crustacean's eyes. The device could allow autonomous vehicles to identify potential hazards in difficult imaging conditions at up to three times the distance possible with the color cameras currently in use.

The camera achieves this by detecting a property of light known as polarization: the orientation of the plane in which a light wave's electric field oscillates. Polarization lets the camera more easily differentiate between objects of similar color and brightness; light reflecting off a white truck, for example, is polarized differently from light coming from a cloudy sky, even when the two look nearly identical in color.
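The article does not spell out how the polarization measurement is processed, but a standard approach in polarization imaging is to sample a scene through polarizing filters at several orientations and combine the readings into Stokes parameters. The Python sketch below illustrates that general technique, not the Illinois team's actual pipeline; the filter angles, array names, and toy pixel values are assumptions made for illustration.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from intensities measured behind
    polarizers oriented at 0, 45, 90 and 135 degrees."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity (brightness)
    s1 = i0 - i90                        # horizontal vs. vertical component
    s2 = i45 - i135                      # +45 vs. -45 degree component
    return s0, s1, s2

def degree_of_linear_polarization(s0, s1, s2):
    """Degree of linear polarization, between 0 (unpolarized) and 1."""
    return np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)

# Toy example: a "white truck" pixel and a "cloudy sky" pixel with the
# same overall brightness but different polarization signatures.
truck = linear_stokes(0.9, 0.5, 0.1, 0.5)   # partially polarized reflection
sky = linear_stokes(0.5, 0.5, 0.5, 0.5)     # nearly unpolarized skylight

print(degree_of_linear_polarization(*truck))  # ~0.8, strongly polarized
print(degree_of_linear_polarization(*sky))    # ~0.0, unpolarized
```

The two pixels have the same brightness (S0), yet very different degrees of polarization, which is exactly the cue that lets such a camera separate a white truck from a bright sky.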

To build the sensor, the researchers deposited nanomaterials that contain photodiodes, which convert light into an electrical current, onto the surface of the imaging chip. They operated the photodiodes in forward bias mode (applying voltage in the direction of current flow) rather than the reverse bias mode used by conventional cameras. This changes the chip's response: instead of producing a current that is linearly proportional to the incoming light, it responds logarithmically, much as the mantis shrimp's eyes do.
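To make the distinction concrete, here is a minimal numerical sketch of the difference between a linear and a logarithmic photodiode response. The specific constants are made up for illustration; the point is that a logarithmic response compresses a huge range of light levels into a modest output range, which is what gives the chip its high dynamic range.

```python
import numpy as np

# Light intensities spanning six orders of magnitude (arbitrary units),
# roughly the span between a dark tunnel and direct sunlight.
intensity = np.logspace(0, 6, num=7)

# Conventional (reverse-biased) photodiode: output grows linearly with
# light, so it clips at a fixed full-scale value and loses bright detail.
full_scale = 1e3
linear_out = np.minimum(intensity, full_scale)

# Bio-inspired (forward-biased) photodiode: output grows with the
# logarithm of the light, so dark and bright regions both stay on scale.
log_out = np.log10(intensity + 1.0)

for i, lin, lg in zip(intensity, linear_out, log_out):
    print(f"light {i:>9.0f}   linear {lin:>7.1f}   logarithmic {lg:5.2f}")
```

In the printout, the linear sensor saturates three orders of magnitude below the brightest input, while the logarithmic sensor still distinguishes every level, the kind of behavior that helps separate a bright truck from a bright sky.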

“In a recent crash involving a self-driving car, the car failed to detect a semi-truck because its color and light intensity blended with that of the sky in the background,” said team leader Viktor Gruev. “Our camera can solve this problem because its high dynamic range makes it easier to detect objects that are similar to the background.”

The researchers tested the camera in the lab under a range of light intensities, colors and polarization conditions. They then used it in real-world scenarios, such as driving through tunnels and in fog, and the camera handled those situations without problems.

The team is now working with an airbag manufacturer to see whether the camera could be used to trigger airbag deployment a few milliseconds earlier than is currently possible.

And the camera's potential is not limited to cars: it could also be used to detect cancer cells, which exhibit a different light polarization from healthy cells, and it could potentially help improve ocean exploration.

“We are beginning to reach the limit of what traditional imaging sensors can accomplish,” said Missael Garcia, the paper’s lead author. “Our new bioinspired camera shows that nature has a lot of interesting solutions that we can take advantage of for designing next-generation sensors.”

Want to find out more about work on improving autonomous car sensors? Check out New LIDAR Driving System Lets Vehicles “Gaze” Like Humans.