Thermal imaging approach turns night into day

Sept. 12, 2023
On their own, thermal imaging and infrared (IR) imaging each tend to generate vague imagery, which poses risks for robotics, AI, and autonomous technologies. Together, however, they may be a force to be reckoned with.

Researchers at Purdue University (West Lafayette, IN) have developed heat-assisted detection and ranging (HADAR), a system that combines thermal and infrared (IR) imaging to recover geometric textures from a weakly scattered signal. The approach works via a combination of thermal physics, machine learning and artificial intelligence (AI), and spectral resolution in thermal images (see video).

Specifically, HADAR uses hyperspectral thermal imaging to take thermal images of a scene in hundreds of different colors in the thermal IR range. These frequencies and colors are invisible to the human eye.

“For visible frequencies, the three preferred channels are red, green, and blue. However, these are arbitrary choices not driven by the fundamental science of waves,” says Zubin Jacob, a professor of electrical and computer engineering at Purdue (see Fig. 1).

HADAR uses three thermal physics-driven attributes to represent a scene (see Fig. 2): temperature (T), material information and emissivity (e), and physics-driven texture (X). The technique also uses machine learning to disentangle X from T and e.

“Representing T/e/X together in the hue, saturation, and value (HSV) color space, we get the T/e/X vision, which has rich textures and is as vivid as daylight optical imaging,” Jacob says. “HADAR object detection and ranging are then based on T/e/X vision images.”
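The mapping from T/e/X attributes to an HSV color can be sketched in a few lines. This is a minimal illustration only: it assumes each quantity has been normalized to [0, 1], and the assignment of e to hue, T to saturation, and X to value is a hypothetical choice for demonstration, not the published HADAR mapping.

```python
import colorsys

def tex_to_rgb(T, e, X):
    """Map normalized T/e/X values (each in [0, 1]) to an RGB color via
    the HSV color space. The e->hue, T->saturation, X->value assignment
    here is illustrative, not necessarily the HADAR convention."""
    return colorsys.hsv_to_rgb(e, T, X)

# A warm, strongly textured pixel whose material hue falls in the green band:
r, g, b = tex_to_rgb(T=0.8, e=0.33, X=0.9)
```

Because value (brightness) is driven by the texture term X, fine geometric detail remains visible in the rendered image even when temperature and material are uniform across a surface.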

HADAR disentangles T, e, and X, quantities that are mixed together in the cluttered heat signal used by traditional thermal imaging. This allows the approach to extract more information for detection and ranging, Jacob explains. It recovers the weak geometric textures in a heat signal that are nearly invisible in traditional thermal imaging. These textures are crucial for accurate ranging at night, which gives HADAR accuracy comparable to daytime detection and ranging, even in pitch darkness.

The color in the T/e/X vision represents semantic categories. The Purdue team’s work emphasizes that this has physics-driven meanings and is in stark contrast to traditional thermal imaging with pseudo colors. HADAR uses a semantic library—a collection of materials such as water, trees, and sand—for T/e/X decomposition, and to generate the T/e/X vision.

“The semantic library can be calibrated on the site of the scene,” Jacob says. “Or it can be derived from the hyperspectral imaging data cube.”

And the semantic library comes with a color corresponding to each semantic label: blue for water, green for trees, and yellow for sand. Colors are determined purely by each material's everyday visual appearance, to mimic daylight optical imaging, and are assigned beforehand while creating the library. In subsequent experiments, when HADAR detects a tree, the algorithm will automatically render it in green.
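The label-to-color convention described above amounts to a fixed lookup table. The sketch below is hypothetical (the dictionary name, labels, and fallback behavior are invented for illustration, not taken from the Purdue library), using the three example materials from the article:

```python
# Hypothetical semantic library: each material label is assigned a fixed
# display color chosen beforehand to mimic its daylight appearance.
SEMANTIC_LIBRARY = {
    "water": (0, 0, 255),    # blue
    "tree":  (0, 128, 0),    # green
    "sand":  (255, 255, 0),  # yellow
}

def render_color(label):
    """Return the display color for a detected material label;
    unknown materials fall back to neutral gray."""
    return SEMANTIC_LIBRARY.get(label, (128, 128, 128))
```

Because the colors carry semantic meaning, every tree in every scene renders the same shade of green, unlike the arbitrary pseudo-color palettes of traditional thermal imaging.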

Imaging: night and day

Unlike thermal vision, in which a thermal camera captures thermal images, thermal ranging is the process of extracting depth information from thermal images. This is “a significantly more difficult task,” Jacob says, “but for machines operating at night, it is a crucial task to make a 3D map of their surroundings.”

Thermal images are usually textureless and low contrast (typically blurry), while optical (visible light) imaging offers much more detail. Both types of imaging do, however, obey the same physics laws of signal scattering and propagation.

Jacob notes that thermal imaging collects both direct emission of the targets and the scattered thermal radiation of other environmental objects, which produces blurred images. Direct emission is usually 10X stronger than the scattered signal, but lacks texture; the scattered signal carries the textures. In comparison, the signal for optical imaging in daylight is mainly the scattered solar illumination without direct emission from targets.

“Essentially, textures in a scattered signal being immersed in textureless direct emission leads to blurry thermal images,” Jacob says. “An intuitive example to understand this ‘ghosting’ effect is a shining bulb. You can see geometric textures on the surface of a bulb when it is off. These geometric textures immediately become invisible once the bulb is turned on and shining.”
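The contrast dilution behind this ghosting effect can be illustrated numerically. The toy model below assumes a flat direct emission 10X the mean scattered level (the factor cited above); the texture values themselves are invented for demonstration:

```python
# Toy model: the scattered signal carries texture, while direct emission
# is flat and ~10X stronger, so it dilutes the measured contrast.
scattered = [0.9, 1.0, 1.1]  # textured scattered signal (arbitrary units)
direct = 10.0                # flat direct emission, 10X the mean scattered level

def contrast(signal):
    """Michelson contrast: (max - min) / (max + min)."""
    return (max(signal) - min(signal)) / (max(signal) + min(signal))

c_scattered = contrast(scattered)                    # 0.1
c_total = contrast([s + direct for s in scattered])  # ~0.0091
```

Adding the flat direct emission cuts the texture contrast by roughly a factor of 11, which is why the bulb's surface detail vanishes the moment it is switched on.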

HADAR vs. the conventional

Techniques such as sonar, radar, and LiDAR send out signals and detect the reflection to infer the presence or absence of an object and its distance. This provides extra information about the scene beyond what camera vision alone offers, especially when ambient illumination is poor.

But these techniques face challenges, including difficulty scaling up to the multi-agent scenarios that may soon become common.

“If 100 self-driving cars are using LiDAR in the same street block, their emitted laser signals would interfere with each other,” Jacob says. “The laser signal received by car A is probably not emitted by itself, but from car B. This will lead to incorrect detection and ranging.”

He adds that laser signals are harmful to human eyes, so to meet industrial eye-safety standards, each of these 100 self-driving cars must emit laser power 100X weaker than the power it could use when operating alone. The decrease in laser power significantly reduces the distance LiDAR can see, making existing self-driving cars short-sighted.
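The range penalty from splitting one eye-safety power budget across many emitters can be sketched with a toy calculation. It assumes the received return power falls off as emitted power over distance squared, a common simplification; real LiDAR link budgets depend on target reflectivity, beam geometry, and detector sensitivity.

```python
import math

def max_range_scale(power_fraction):
    """If received return power ~ P_emit / R**2, the maximum usable
    range scales with the square root of the emitted power."""
    return math.sqrt(power_fraction)

# 100 cars sharing one eye-safety budget -> each emits 1/100 the power,
# so each car's usable LiDAR range shrinks to ~1/10 of the solo range.
scale = max_range_scale(1 / 100)  # 0.1
```

Under this simplification, a 100X power cut translates to a 10X shorter usable range, which is the "short-sightedness" Jacob describes. A fully passive approach such as HADAR sidesteps the problem entirely, since it emits nothing.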

Among HADAR's T/e/X physics quantities, it is X that improves ranging at night. The new approach also offers accurate temperature measurement as well as accurate semantic information that may improve detection in daylight.

Future insight

HADAR could be useful for applications like “autonomous navigation, robotics, and smart healthcare monitoring, especially at night,” Jacob says. “Since it’s fully passive and does not send out signals, HADAR could also be useful for the defense industry. And many wild animals are only active at night, when regular cameras don’t work, so wildlife monitoring could be another application.”

The prototype HADAR system the Purdue team created does pose challenges, however, as it’s based on hyperspectral thermal imagers that currently are large, heavy, expensive, and operate at slow imaging speeds.

“These practical challenges need to be overcome for HADAR to have widespread applications,” Jacob says.

The team is now working to significantly improve the hyperspectral thermal imager component, and anticipates these advancements happening within the next three to five years.

“The fact that HADAR can make AI see through the darkness the same way it does in broad daylight has a deep and far-reaching importance,” Jacob says. “Animals and human beings are subject to the dichotomy between day and night, due to evolution. Now, we have proven that AI equipped with HADAR can overcome this long-standing dichotomy.”

Editor’s Note: Read additional coverage of this work at Vision Systems Design, a brand in Endeavor Business Media’s Design & Engineering Group alongside Laser Focus World; see www.vision-systems.com/14298269.

About the Author

Justine Murphy | Senior Editor

Justine Murphy is a multiple award-winning writer and editor with more than 20 years of experience in newspaper publishing as well as public relations, marketing, and communications. For nearly 10 years, she has covered all facets of the optics and photonics industry as an editor, writer, web news anchor, and podcast host for an internationally reaching magazine publishing company. Her work has earned accolades from the New England Press Association as well as the SIIA/Jesse H. Neal Awards. She received a B.A. from the Massachusetts College of Liberal Arts.
