3D Glasses for Bugs Could Help with Machine Vision

For decades, we’ve been using colored lenses to see 2D films and images in 3D. Little did we know this concept could help us with machine vision.

A praying mantis sporting a pair of green and blue 3D glasses. (Image courtesy of Newcastle University.)

Using colored glasses to restrict imagery to individual eyes, researchers at Newcastle University have managed to make 3D vision happen for an unlikely creature: the praying mantis. The “eyeglasses” used in the research resemble our version of 3D glasses, with a few differences: they use 7 mm (0.28 in) green and blue lenses instead of red and blue, and were affixed to the mantis's head with beeswax.

Green and blue lenses were used instead of the more common red and blue combination because mantises don’t perceive red very well.

The stimuli were presented on a Dell U2413 monitor with a 60 Hz refresh rate. This particular monitor has very narrowband spectral output in its blue and green channels, which was necessary for it to work with the blue and green lenses.

The team devised stimuli for the mantis using MATLAB and its Psychophysics Toolbox. With this setup, the researchers were able to deliver images with very low spectral crosstalk between the mantis's eyes. This enabled the presentation of complex, spatially rich images, even movies!
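The basic idea behind an anaglyph stimulus like this can be sketched in a few lines. The snippet below is a minimal illustration, not the researchers' actual MATLAB/Psychtoolbox code: it draws a left-eye target into the green channel and a right-eye target into the blue channel, offset horizontally so that each lens passes only its own eye's view. All sizes and the disparity value are made-up illustration parameters.

```python
import numpy as np

def anaglyph_stimulus(width=320, height=240, disparity=8, radius=20):
    """Build a green/blue anaglyph frame: the left-eye target goes in the
    green channel and the right-eye target in the blue channel, shifted
    apart horizontally by `disparity` pixels to create a binocular depth cue.
    (Illustrative parameters; not taken from the study.)"""
    img = np.zeros((height, width, 3), dtype=np.uint8)  # RGB frame
    yy, xx = np.mgrid[0:height, 0:width]
    cx, cy = width // 2, height // 2
    # Left-eye disk, shifted left by half the disparity
    left = (xx - (cx - disparity // 2)) ** 2 + (yy - cy) ** 2 <= radius ** 2
    # Right-eye disk, shifted right by half the disparity
    right = (xx - (cx + disparity // 2)) ** 2 + (yy - cy) ** 2 <= radius ** 2
    img[..., 1][left] = 255   # green channel: seen through the green lens
    img[..., 2][right] = 255  # blue channel: seen through the blue lens
    return img

frame = anaglyph_stimulus()
```

Whether the target appears in front of or behind the screen depends on the sign of the disparity, which is how the researchers could make a virtual target seem to hover within striking range.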

By presenting moving 3D targets in a format that mimics the motion of the mantis's prey and provides the necessary depth cues, the researchers evoked a response from the mantis in the form of an attack strike.

Here’s a video of the mantis in action:



(Video courtesy of Newcastle University.)

This interesting advance in “bug theater” is causing excitement in several areas, including the study of mantis perception and behavior. Mantises are very efficient visual hunters, capable of detecting, stalking and capturing their prey. With this new system, researchers can show a mantis virtually any object moving in any desired way in order to investigate its behavioral responses.

There is still much debate about how visual information feeds back into the motor control of animals, including humans, and this system could provide new insights.

Another interesting implication is the potential for this research to inform machine vision. Insect brains are vastly simpler than primate brains, yet this insect is clearly able to recognize minuscule angular displacements and turn them into the fine depth perception needed to target prey accurately.

If researchers can tease more detail and understanding out of insect depth perception and perhaps the underlying neural functionality, it may be possible to formulate a faster or simpler algorithm for robotics and other rapid depth/distance applications.
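For context, conventional machine stereo reduces depth recovery to triangulation: depth is proportional to the camera baseline and focal length, and inversely proportional to the measured disparity. The sketch below shows that textbook relation; the focal length, baseline, and disparity values are assumed numbers for illustration, not measurements from the study.

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Classic stereo triangulation: Z = f * B / d, where f is the focal
    length in pixels, B the baseline between the two viewpoints in meters,
    and d the horizontal disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Assumed example numbers: a 700-pixel focal length and a 7 mm baseline,
# roughly on the scale of a large insect's eye separation.
z = depth_from_disparity(disparity_px=50, baseline_m=0.007, focal_px=700)
```

The appeal of an insect-inspired approach is that the mantis apparently achieves strike-accurate depth estimates without the dense disparity matching that makes conventional stereo pipelines computationally heavy.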

This in turn could lead to reduced computational needs, which translates into reduced power consumption and smaller processors for depth perception tasks.

For more information on the mantis study, check out the published report.