MIT's smart imaging tech can help detect objects hidden around blind corners
The tech could even be used in self-driving cars to spot pedestrians.
Researchers at MIT's Computer Science and Artificial Intelligence Laboratory have developed an imaging system that uses standard cameras to read shadows and detect objects or people hidden around blind corners. The system, dubbed CornerCameras, relies on subtle changes in light that usually go unnoticed by the naked eye: when hidden objects reflect light around a wall's edge, they cast a faint, fuzzy shadow on the ground, and the system captures that shaded region with an off-the-shelf consumer camera and processes it in real time to detect what is around the corner.
As it processes the footage, CornerCameras tracks tiny colour shifts in that shadow and stitches them into one-dimensional images of the hidden scene, revealing the objects along with related information such as their speed and trajectory (a rough sketch of the idea follows below). If developed to its full potential, the technology could have far-reaching applications: it could be built into standard or self-driving vehicles to spot pedestrians hidden by a blind corner, or used by firefighters to find people in burning buildings.
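To make the geometry behind that description concrete, here is a minimal Python sketch of the edge-camera idea, not the researchers' released code: ground pixels at different angles around the corner see different angular slices of the hidden scene, so averaging the penumbra in angular wedges and differencing neighbouring wedges approximates a 1D image of what is hidden. The frame data, corner location, wedge count and radius are illustrative assumptions.

```python
# Hypothetical sketch of an edge/corner camera, assuming grayscale video of
# the ground near the corner. Not MIT's implementation.
import numpy as np

def one_d_corner_image(frames, corner_xy, n_wedges=50, radius=80):
    """frames: (T, H, W) grayscale video of the ground near the corner.
    Returns a (T, n_wedges - 1) array; each row is a rough 1-D angular
    profile of the hidden scene at that time step."""
    T, H, W = frames.shape
    ys, xs = np.mgrid[0:H, 0:W]
    dx, dy = xs - corner_xy[0], ys - corner_xy[1]
    r = np.hypot(dx, dy)
    theta = np.arctan2(dy, dx)                      # angle around the corner
    # Keep only ground pixels inside a 90-degree wedge near the wall's edge.
    mask = (r > 5) & (r < radius) & (theta >= 0) & (theta <= np.pi / 2)

    bins = np.linspace(0, np.pi / 2, n_wedges + 1)
    wedge_idx = np.digitize(theta[mask], bins) - 1

    # Mean penumbra intensity per angular wedge, per frame.
    profiles = np.zeros((T, n_wedges))
    for t in range(T):
        vals = frames[t][mask]
        for w in range(n_wedges):
            sel = wedge_idx == w
            if sel.any():
                profiles[t, w] = vals[sel].mean()

    # Remove the static background (temporal mean), then difference
    # adjacent wedges: each wedge sees one more angular slice of the hidden
    # scene than its neighbour, so the difference isolates that slice.
    profiles -= profiles.mean(axis=0, keepdims=True)
    return np.diff(profiles, axis=1)
```

Tracking how the peaks in these angular profiles drift over time is what would reveal a hidden person's direction and speed in this simplified picture.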
"Even though those objects aren't actually visible to the camera, we can look at how their movements affect the penumbra (a fuzzy shadow) to determine where they are and where they're going," said Katherine Bouman, lead author of the paper detailing the system.
"In this way, we show that walls and other obstructions with edges can be exploited as naturally-occurring 'cameras' that reveal the hidden scenes beyond them."
Significantly, the approach works with any smartphone camera and can detect objects indoors as well as outdoors. It even worked in the rain.
"Given that the rain was literally changing the colour of the ground, I figured that there was no way we'd be able to see subtle differences in light on the order of a tenth of a percent," Bouman added.
"But because the system integrates so much information across dozens of images, the effect of the raindrops averages out, and so you can see the movement of the objects even in the middle of all that activity."
While detection in the rain is a big plus, the team still needs to improve the system's performance in dark or low-light environments, and it also struggles when lighting conditions are continuously changing.
Next, the researchers plan to address some of these limitations and to test how well the system detects hidden objects while the camera itself is moving, starting with wheelchair-mounted tests before moving on to cars and other vehicles.