“Bat vision” isn’t just for comic book heroes anymore
Developing machine learning-enabled acoustic imaging for first responders will represent a major advancement in mine rescue, which is surprisingly low-tech in some ways.
By Jenn Fields, Special to Mines Newsroom
The first responders who head underground in a mine emergency are often going in blind, into a blackout fog of dust and smoke.
“You can’t even see your hand in front of your face,” said Andrew Petruska, assistant professor of mechanical engineering who specializes in robotics. “Not only is it pitch black, it’s occluded.”
But while this poses a challenge for humans trying to find their way in an emergency, there is a species that has no trouble navigating tight, dark spaces while avoiding obstacles as thin as wires and even catching dinner along the way.
“We looked into research about how well bats can echolocate,” Petruska said.
The flying mammal’s amazing echolocation abilities provided inspiration for a team of Mines researchers who are developing a wearable augmented-reality display that will let emergency responders “see” as they enter a dangerous situation at a mine. The three-year project, which is being funded by a $620,000 grant from the Alpha Foundation and led by Petruska, will use machine learning-enabled acoustic imaging to create a map of the surrounding space that will appear on a display inside a helmet so first responders can safely navigate zero-visibility corridors.
Developing “bat vision” for first responders will represent a major advancement in mine rescue, which is in some ways surprisingly low-tech.
“Quite often, mine rescuers advance by touching the wall,” Geophysics Professor Paul Sava said. “Think about it: They just walk along the wall, and they feel it to maintain a sense of orientation. It’s scary.”
To develop the wearable AR display, researchers will use a combination of acoustic imaging and millimeter-wave radar from sensors mounted on the helmet to gather data on the surrounding environment.
“Think about it like a motorcycle helmet that is full of instruments on the forehead, sides, back, front—everywhere—and those instruments can be radar sensors as well as acoustic sensors,” said Sava, who specializes in imaging (though usually on a much larger scale and in the subsurface, he noted).
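To give a concrete feel for what that helmet-mounted instrument data looks like before it becomes an image, here is a minimal Python sketch of one way frames from radar and acoustic sensors could be time-aligned for downstream mapping. The sensor names, data fields and timing tolerance are hypothetical illustrations; the article does not describe the project's actual software pipeline.

```python
# A minimal sketch, with hypothetical names, of pairing frames from
# helmet-mounted acoustic and radar sensors by timestamp. This is not
# the Mines team's actual data pipeline.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SensorFrame:
    timestamp: float      # seconds since the start of the traverse
    sensor_id: str        # e.g. "radar_front", "acoustic_left" (assumed labels)
    samples: List[float]  # raw return intensities from that sensor

def pair_nearest(radar: List[SensorFrame], acoustic: List[SensorFrame],
                 max_skew: float = 0.05) -> List[Tuple[SensorFrame, SensorFrame]]:
    """Pair each radar frame with the closest-in-time acoustic frame,
    dropping pairs whose timestamps differ by more than max_skew seconds."""
    pairs = []
    for r in radar:
        closest = min(acoustic, key=lambda a: abs(a.timestamp - r.timestamp))
        if abs(closest.timestamp - r.timestamp) <= max_skew:
            pairs.append((r, closest))
    return pairs

# Example: two radar frames matched against three acoustic frames.
radar = [SensorFrame(0.00, "radar_front", [0.1, 0.9]),
         SensorFrame(0.10, "radar_front", [0.2, 0.8])]
acoustic = [SensorFrame(0.01, "acoustic_left", [0.5]),
            SensorFrame(0.06, "acoustic_left", [0.4]),
            SensorFrame(0.11, "acoustic_left", [0.3])]
print(len(pair_nearest(radar, acoustic)))  # 2
```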
Translating it all into a visual will present a challenge, but Mines has an asset on campus that will help with the initial creation of the augmented-reality simulation: the researchers can test the AR display against pre-existing data from the school’s Edgar Experimental Mine.
To translate data from the sensors into a visual for a wearable display, such as a Microsoft HoloLens, the team will design a convolutional neural network, a type of deep-learning algorithm often used for imagery.
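As a rough illustration of that idea, the sketch below shows a small encoder-decoder convolutional network in PyTorch that turns a stack of acoustic and radar “images” into a single depth map a headset could render as nearby geometry. The channel counts, resolution and depth-map output are assumptions made for illustration, not the team’s actual architecture.

```python
# A minimal sketch of the kind of convolutional network described in the
# article; it is NOT the project's actual model. Input channels, resolution
# and the depth-map output are assumptions for illustration only.
import torch
import torch.nn as nn

class SensorToDepthNet(nn.Module):
    """Maps stacked acoustic + radar 'images' to a dense depth map."""
    def __init__(self, in_channels: int = 4):  # e.g. 2 acoustic + 2 radar channels (assumed)
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),  # 1 channel: depth
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Example: one 4-channel, 128x128 sensor frame -> one 128x128 depth map.
net = SensorToDepthNet()
frame = torch.rand(1, 4, 128, 128)
depth = net(frame)
print(depth.shape)  # torch.Size([1, 1, 128, 128])
```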
The technologies that make the AR display possible have only recently become capable enough for researchers to consider using them in a project like this.
Competition in the self-driving car industry and even video games pushed the miniaturization of the sensor and computing technology that the helmets can now use, Sava said. “Radar systems are becoming more compact. The amount of computing you can now carry on your body has increased exponentially, driven by consumer technologies like smartphones.”
Radar wasn’t just too large to be wearable before; it was also too expensive. “The car manufacturers of the world didn’t create the miniature radar technology, really, but they created a market for it,” Petruska said. “The fact that the car industry is looking at it – they buy things in such large numbers – opens it up,” he said. “Mine safety alone is just never going to buy enough radars to make the economics work.”
The final phases of the project will take faculty and students to the Edgar Experimental Mine for field testing. (Two PhD students and two undergraduate research assistants will join faculty members Petruska, Sava, Hao Zhang, Sebnem Duzgun and Jurgen Brune on the project.) There, faculty and students alike will get a sense of how the helmet would behave in a real-life disaster scenario, where the ability to see through smoke and debris could mean the difference between life and death.
“Personally, that is what attracted me to this research,” Sava said. “A person’s life could get saved, and that is enough motivation.”