Driverless Cars See an Improvement in Electric Eyesight

Researchers are determined to make autonomous cars operate better than human-driven vehicles. The latest technology developed by scientists at the Massachusetts Institute of Technology will help autonomous cars navigate fog and dust. Driverless cars, which have struggled to maneuver through blinding conditions because of their reliance on light-based image sensors, will get an on-chip system that detects signals at sub-terahertz wavelengths.

The key advantage of sub-terahertz waves is that they can be detected through fog with ease, unlike the waves used by the infrared-based LiDAR imaging systems that are standard in autonomous vehicles. The sub-terahertz imaging system detects objects by sending a signal through a transmitter. Once a receiver measures the reflection and absorption of the rebounding sub-terahertz waves, the signal reaches a processor, which recreates an image of the object, according to MIT News.
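The transmit-and-receive step described above can be illustrated with a simple time-of-flight calculation: the delay between sending a pulse and receiving its reflection fixes the distance to the object. This is a minimal sketch of that principle; the function name and the example timing are illustrative assumptions, not details of the MIT system.

```python
# Sketch of distance recovery from a reflected pulse's round-trip time,
# the basic principle behind a transmit/receive imaging sensor.
# All values are illustrative, not taken from the MIT chip.

C = 3.0e8  # speed of light in m/s

def distance_from_round_trip(t_round_trip_s: float) -> float:
    """Distance to a reflecting object, given the pulse's round-trip time."""
    # The pulse travels out and back, so the one-way distance is half.
    return C * t_round_trip_s / 2.0

# A reflection arriving 200 nanoseconds after transmission implies:
d = distance_from_round_trip(200e-9)
print(f"{d:.1f} m")  # 30.0 m
```

A real system measures this delay for every pixel and assembles the per-pixel distances into an image of the scene.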

The technology is an advance for the driverless car industry, but implementing the sensors is difficult. For the sensor to function optimally, it needs a strong baseband signal traveling from the receiver to the processor. Traditional systems that can produce such a signal are large and expensive, while smaller systems produce signals too weak for processing. To achieve the necessary sensitivity, the scientists implemented heterodyne detectors, a scheme of independent signal-mixing pixels. To ensure that the pixels fit onto the chip, they shrank the heterodyne detectors, according to Road Traffic Technology. The technique helped the researchers produce a multipurpose component that delivers strong output baseband signals.

The prototype the scientists created contains a 32-pixel array integrated on a 1.2-square-millimeter device. The pixels are 4,300 times more sensitive than those in today's best sub-terahertz array sensors. Ruonan Han, co-author and professor of electrical engineering and computer science in MIT's Microsystems Technology Laboratories, said, “A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones. Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough.”

The scientists used a decentralized design to achieve their objective. In it, each pixel generates its own frequency beat along with a local oscillation, an electrical signal that shifts the frequency of an incoming signal. This process, which the scientists refer to as down-mixing, produces a signal in the megahertz range. The output signal is then used to calculate the distance to objects.
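The down-mixing step can be sketched numerically: multiplying a received signal by a local oscillator produces a beat at the difference of the two frequencies, shifting a high-frequency input down to the megahertz range. The frequencies below are scaled-down stand-ins chosen for a quick simulation, not the chip's actual sub-terahertz values.

```python
import numpy as np

# Illustrative heterodyne down-mixing: mixing (multiplying) the received
# signal with a local oscillator yields components at the difference and
# sum frequencies; the low "beat" component is what gets processed.
fs = 1e9                         # sample rate (Hz), illustrative
t = np.arange(0, 1e-4, 1 / fs)   # 100 microseconds of samples

f_rx = 240e6   # stand-in for the received signal frequency
f_lo = 239e6   # local oscillator frequency
rx = np.cos(2 * np.pi * f_rx * t)
lo = np.cos(2 * np.pi * f_lo * t)

mixed = rx * lo  # components at f_rx - f_lo (1 MHz) and f_rx + f_lo

# Keep only the low band, mimicking a low-pass filter after the mixer,
# and locate the dominant down-converted frequency.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
low = freqs < 100e6
beat = freqs[low][np.argmax(spectrum[low])]
print(f"beat frequency = {beat / 1e6:.1f} MHz")  # 1.0 MHz
```

The 1 MHz beat carries the same timing information as the original high-frequency signal but is far easier for on-chip electronics to digitize and process.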

To find out more about how autonomous vehicles are changing our lives, check out the articles on engineering.com.