Nighttime poses a unique challenge for cameras, but a new method called HADAR, short for heat-assisted detection and ranging, is changing the game. The technology lets machines capture detailed information in complete darkness, with clarity comparable to daylight images from stereo cameras.
Researchers from Purdue University and Michigan State University developed HADAR for future scenarios where numerous autonomous vehicles and drones share the same space. Zubin Jacob, a professor at Purdue, leads this research, focusing on how light and thermal radiation can improve machine perception.
Currently, many robots use a mix of sensors like cameras, sonar, and radar to understand their surroundings. LiDAR, a popular laser-based method, is key for navigation but faces limitations, especially in crowded environments. When multiple machines use active sensors, their signals can interfere, creating safety issues. On the flip side, passive thermal cameras capture heat energy from objects, allowing them to see in darkness. However, they often struggle with clarity.
One significant problem with traditional thermal imaging is the ghosting effect, which washes out crucial detail. Recent studies showed that this blurring stems both from the optics and from the inherent properties of thermal radiation itself. The HADAR system tackles it with algorithms that process measurements across many thermal wavelengths, producing sharper, more detailed images.
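To see why sampling many thermal wavelengths helps, consider that a single broadband heat image cannot tell whether a pixel is bright because the surface is hot or because it emits efficiently. The sketch below is not HADAR's published algorithm, only a minimal, hypothetical illustration: it fits a greybody model (Planck's law scaled by an emissivity factor) to a per-pixel spectrum sampled at several long-wave infrared bands, recovering temperature and emissivity separately.

```python
"""Illustrative sketch (not the authors' method): separating per-pixel
temperature from emissivity by fitting a greybody model to radiance
measured at several thermal wavelengths."""
import numpy as np
from scipy.optimize import least_squares

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temperature_k):
    """Blackbody spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temperature_k)
    return a / np.expm1(b)

def fit_greybody(wavelengths_m, measured_radiance):
    """Fit (emissivity, temperature) for one pixel from its spectrum."""
    def residuals(params):
        emissivity, temperature = params
        model = emissivity * planck_radiance(wavelengths_m, temperature)
        return model - measured_radiance
    result = least_squares(residuals, x0=[0.9, 300.0],
                           bounds=([0.0, 200.0], [1.0, 400.0]))
    return result.x

# Synthetic example: a pixel with emissivity 0.75 at 305 K, sampled at
# ten bands across the 8-14 micrometre long-wave infrared window.
bands = np.linspace(8e-6, 14e-6, 10)
spectrum = 0.75 * planck_radiance(bands, 305.0)
spectrum *= 1.0 + 0.01 * np.random.default_rng(0).standard_normal(bands.size)

emissivity, temperature = fit_greybody(bands, spectrum)
print(f"recovered emissivity ~ {emissivity:.2f}, temperature ~ {temperature:.1f} K")
```

A single-band measurement would leave emissivity and temperature entangled; sampling the spectral shape is what disentangles them, which is the basic intuition behind extracting texture and material information from heat alone.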
During initial outdoor tests, HADAR demonstrated its capabilities by revealing intricate textures like bark patterns and water ripples in a nighttime setting. “Pitch darkness carries the same amount of information as broad daylight,” said Jacob, emphasizing that future machines can learn to interpret night similarly to day.
The system creates 3D maps for navigation solely from thermal data, avoiding interference from other machines' sensors. In one test involving a human driver and a cardboard cutout of Albert Einstein, HADAR accurately distinguished between the two, while conventional systems struggled.
Robust night vision is crucial for self-driving vehicles, as even small errors can lead to serious accidents. Current sensors work well in many conditions but can falter under certain lighting or weather challenges. Because HADAR is a passive system, it reduces the risk of interference, a growing concern as more automated vehicles hit the roads.
HADAR’s ability to perceive temperature and texture could revolutionize various fields. Farmers could monitor crops at night, while hospitals might detect unusual temperature patterns that indicate health risks.
However, HADAR is still a prototype. It currently takes about a second to process one image, while self-driving cars require 30 to 60 frames per second. Researchers aim to improve processing speed and reduce the size of the technology, making it practical for widespread use.
In summary, HADAR highlights how thermal signals can offer rich information, turning darkness into an asset rather than a liability. As the remaining engineering challenges are addressed, we may see robots and vehicles that navigate night and day with equal proficiency. For more details, see the full study published in Nature.