The Value of Thermal Vision in ADAS

Autonomous vehicles, also known as self-driving cars, have inspired a lot of promises lately about the future of transportation—some more far-fetched than others. In ten years, will designated drivers lose their jobs to automation? When we order pizza, will a human still deliver it? Will we even need to own cars at all?

Far-fetched or not, a lot of technology and development stands between our current reality and a driverless future. To learn more about one technology which may play a critical role in that development, I spoke with Ezra Merrill, Director of Marketing for OEM and Emerging Markets at FLIR Systems, Inc. about the performance and cost of thermal sensors, FLIR’s open-source thermal dataset, and the technology’s applications in autonomous vehicle sensor systems.

Automotive Development Kit (ADK) for Advanced Driver Assist Systems (ADAS)

FLIR is an advocate of autonomous vehicle tech and is working to support and accelerate its development. For example, the company released an automotive development kit that includes a Boson long wave infrared (LWIR) camera core in a weatherproof housing. The FLIR ADK is intended for automakers, tier-one automotive suppliers, and tech disruptors who are looking to quickly and easily integrate thermal vision into a vehicle’s sensor suite.

Why Use Thermal for Autonomous Driving Tech?

Autonomous driving is classified into six levels (0 through 5), which help us understand how much automation is involved in a given system. The graphic below explains these levels:

Summary table of the SAE's levels of vehicle automation. (Image courtesy of SAE International/J3016.)

“Thermal really comes into play around level three and beyond,” Merrill said. “So, there's an immediate need. You don’t have to look far in the news to see cars on the road at level two, that are colliding into things where a thermal sensor would have detected and helped avoid an accident. The key message we want developers to understand is that thermal sensors cover many of the corner cases and cost only a few hundred dollars today and will be significantly less in automotive quantities. As the industry transitions to typical automotive volumes the costs will continue to improve.”

Thermal Vision Could Improve Safety and Reliability of Self-Driving Vehicles

When you’re driving, your eyes scan the car’s surroundings. But what are you really looking for? The visual information gathered by ADAS includes information about the road’s surface and boundaries, lane markings, other objects in the roadway and even objects outside the roadway which may enter the path of travel. This represents a massive range of different objects to detect and classify, especially for a 2D image-based camera.

Left: Thermal image of roadway in bright sunlight. Right: Visible light camera view of same.
Comparison of image clarity during night driving. Left: visible light spectrum (FLIR BFS-U3-51S5C-C USB3). Right: FLIR ADK. Bottom: Velodyne LIDAR VLP-16.

A visible light camera system works with the same information as the human eye, or less, which means the processing algorithms are playing catch-up with the human brain. Thermal sensors provide a definite advantage even for lower-level driver assistance. Thermal adds the parameter of heat, which can make it much easier for machine learning systems to detect and classify certain objects such as vehicles, people and animals. This advantage is unaffected by most typical fog conditions, darkness, direct sunlight or reflected glare.

Because they are totally passive, thermal sensors are becoming an integral part of some ADAS sensor suites, thanks to their high point cloud density and ability to visualize living targets with high contrast under all lighting conditions. To date, FLIR has delivered over 500,000 thermal sensors for automotive driver warning systems. While an image-processing algorithm may mistake a white 18-wheeler truck for the sky, or a person for a shadow, a thermal camera sees objects in a different part of the spectrum and can greatly reduce those types of accidents.

Increased Range – More Reliable Classification of Pedestrians

An annotated image from FLIR ADK System showing detection of a person.

“Detection and classification are different. One thing we talk about with thermal sensors is called ‘pixels on target’. Based on testing that we’ve done, the device needs as few as 16 pixels vertically on a target to be able to reliably classify,” Merrill said.

A FLIR thermal camera can be configured to classify a pedestrian at over 200 m, up to 4x farther than headlights can typically see.

Because of the longer wavelength of infrared light (LWIR spans roughly 8–14 µm, compared to 0.39–0.7 µm for visible light), the pixels in thermal cameras are larger, so camera resolution tends to be lower than that of visible sensors. For cost effectiveness and a small size that eases packaging, FLIR uses a VGA sensor (640 x 512 pixels) in the Boson camera and the ADK.
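The pixels-on-target idea can be sketched with simple geometry: each pixel subtends a fixed angle, so a target of known height spans 16 vertical pixels out to a range you can compute. A minimal sketch, assuming a hypothetical 19-degree vertical field of view and a 1.8 m pedestrian (these values are illustrative assumptions, not FLIR specifications):

```python
import math

def classification_range(target_height_m, pixels_required,
                         vertical_fov_deg, vertical_pixels):
    """Estimate the range at which a target spans the required pixel count.

    Uses the small-angle approximation: each pixel subtends
    (vertical FOV / vertical resolution) radians.
    """
    ifov_rad = math.radians(vertical_fov_deg) / vertical_pixels  # angle per pixel
    return target_height_m / (pixels_required * ifov_rad)

# Assumed values: 1.8 m pedestrian, 16 vertical pixels to classify,
# a 19-degree vertical FOV lens on the 512-pixel-tall VGA sensor.
r = classification_range(1.8, 16, 19.0, 512)
print(f"Estimated classification range: {r:.0f} m")  # on the order of 170 m
```

A narrower (longer focal length) lens shrinks the per-pixel angle and extends the classification range, which is how a camera can be “configured” for pedestrian classification beyond 200 m.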

Infrared Cameras Can be Inexpensive, Potentially Lowering System Cost

According to FLIR, it’s a common misconception that thermal cameras are expensive. In fact, FLIR’s thermal sensors, deployed in over 500,000 cars today, cost in the hundreds of dollars, while the LIDAR systems used by some ADAS systems can add thousands to the price of a vehicle. Today, the typical ADAS sensor suite consists of visible, radar, LiDAR and ultrasonic sensors.

The addition of a thermal camera to this suite of sensors may lower overall costs, as it can enable a lower-resolution solid-state LiDAR system. With thermal sensing added, the analytics can use the dense point cloud information from the visible and thermal cameras for classification in all weather and lighting conditions, and rely on the radar and LiDAR for ranging and closing-speed information.
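One way to picture this division of labor is a minimal fusion step: the cameras supply classified detections with a bearing, and radar or LiDAR supplies a range along that bearing. The sketch below uses simplified, hypothetical data structures of my own invention, not any FLIR or automotive API:

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str          # class from thermal/visible analytics, e.g. "pedestrian"
    azimuth_deg: float  # bearing of the detection

@dataclass
class RangeReturn:
    azimuth_deg: float  # bearing of the radar/LiDAR return
    range_m: float      # measured distance

def fuse(detections, returns, tolerance_deg=2.0):
    """Attach the nearest-bearing range return to each camera detection."""
    fused = []
    for det in detections:
        candidates = [r for r in returns
                      if abs(r.azimuth_deg - det.azimuth_deg) <= tolerance_deg]
        if candidates:
            best = min(candidates,
                       key=lambda r: abs(r.azimuth_deg - det.azimuth_deg))
            fused.append((det.label, best.range_m))
    return fused

dets = [CameraDetection("pedestrian", 5.1)]
rets = [RangeReturn(5.0, 42.0), RangeReturn(30.0, 12.0)]
print(fuse(dets, rets))  # [('pedestrian', 42.0)]
```

Real systems fuse in 3D with calibrated extrinsics and temporal tracking, but the principle is the same: classification comes from the imaging sensors, range and closing speed from the active ones.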

Who is Developing Thermal Vision in Automotive?

Compared with a visible light camera, the thermal camera classifies well in visually cluttered environments.

Machine Learning

Hardware is only one part of the system. The algorithms and software that interpret sensor inputs and control the vehicle outputs are, of course, a critical part of the technology. A vehicle needs machine learning algorithms to identify and react to objects autonomously, and training those algorithms requires large sets of annotated images.

As part of its initiative to help accelerate development of thermal vision in ADAS, FLIR has built an open-source starter dataset of 10,000 annotated images taken from FLIR cameras on test vehicles. The dataset includes five classes of objects (people, cars, other vehicles, bicycles and dogs), captured during daytime and nighttime summer driving in Santa Barbara, California. FLIR is making the dataset available for free in July 2018, and you can sign up to receive it here.
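To give a sense of how such a dataset is consumed, here is a minimal sketch of tallying annotated boxes per class. The article doesn't specify FLIR's annotation format, so this assumes a generic COCO-style JSON layout; the field names below are that convention's, not necessarily FLIR's:

```python
import json
from collections import Counter

def class_counts(coco_json_text):
    """Count annotated boxes per object class in a COCO-style annotation file."""
    data = json.loads(coco_json_text)
    # Map numeric category IDs to human-readable class names.
    id_to_name = {c["id"]: c["name"] for c in data["categories"]}
    return Counter(id_to_name[a["category_id"]] for a in data["annotations"])

# Tiny stand-in for an annotation file (a real dataset would be read from disk).
sample = json.dumps({
    "categories": [{"id": 1, "name": "person"}, {"id": 2, "name": "car"}],
    "annotations": [{"category_id": 1}, {"category_id": 2}, {"category_id": 1}],
})
print(class_counts(sample))  # Counter({'person': 2, 'car': 1})
```

Checking class balance like this is usually the first step before training a detector, since an underrepresented class (say, dogs at night) will be the weak spot of the resulting model.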

Long Road Ahead for Self Driving

In the race to level five autonomous driving, it can be difficult to pin down exactly where technology stands today. Many production vehicles from the 2018 model year have level 1 and 2 features like adaptive cruise control, blind spot monitoring systems, active parking assistance or even Tesla’s Autopilot. On the whole, these systems are well-developed and safety approved. However, this doesn’t mean that these consumer-ready ADAS systems aren’t under heavy development at automakers and suppliers in America and abroad. The news is full of stories and images of fully autonomous driverless cars being tested by disruptive companies like Alphabet, Nvidia, Uber and Tesla.

Many of the self-driving test cars on the roads today are a far cry from what we imagine as full, robust autonomous driving. For one thing, the cars are not driving in unknown territory. Test cities like San Francisco are meticulously and precisely mapped by humans in advance. In some cases, the vehicle’s sensors are not actively processing the road environment, but instead use LIDAR and GPS to accurately place the car in the known geometry of the area. The car’s sensors detect and classify moving objects in the environment like vehicles, people, and debris with the help of the (hopefully) vigilant human driver.

Processing unknown territory in real time, reliably, is not yet a reality. According to Merrill, FLIR has been demonstrating the advantages of thermal vision for night driving to ADAS developers. In some cases, the feedback indicated that some developers aren’t yet focused on night driving, concentrating instead on driving in optimal daytime conditions.

Thermal Vision Valuable in ADAS: An Ongoing Engineering Challenge

Whether you believe the hype or not, advanced driver assistance systems are becoming increasingly common in consumer cars with each passing model year. As the technology develops, there are exciting opportunities for companies with the engineering skills to gain an edge in the race to level five. Thermal camera technology just might be that edge.

For more information about the free starter dataset of annotated images, or about the FLIR ADK, click the hyperlinks.



This article was sponsored by FLIR. Opinions are my own. –Isaac Maw