Purdue Expert Uses Drones to ‘See’ Perfectly With Sound
Unmanned Aerial Vehicles provide visual and infrared information as they hover above, but researchers at Purdue University say UAVs, or drones, can also help self-driving cars and robots to hear as well as see.
Mathematics experts at Purdue are collaborating with mathematicians from Technical University of Munich in Germany to show how a drone equipped with four microphones and a loudspeaker can precisely reconstruct the layout of a room by listening to echoes.
They say the process is similar to how bats use echolocation to orient themselves.
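The idea can be illustrated with the classic image-source model: a flat wall reflects the loudspeaker's pulse as if it came from a mirror image of the speaker on the far side of the wall, so the first-echo path lengths measured at four microphones let us trilaterate that image source and recover the wall's distance. The following sketch is an illustration of that principle, not the researchers' published algorithm; the microphone layout and wall position are invented for the example.

```python
import math

def solve3x3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def locate_image_source(mics, dists):
    """Trilaterate the mirror-image source (loudspeaker at the origin).

    Subtracting pairs of sphere equations |s - m_i| = d_i eliminates
    the quadratic term, leaving a linear system in the source position s.
    """
    m1, d1 = mics[0], dists[0]
    A, b = [], []
    for mi, di in zip(mics[1:], dists[1:]):
        A.append([2 * (a - c) for a, c in zip(mi, m1)])
        b.append(sum(a * a for a in mi) - sum(a * a for a in m1)
                 - di**2 + d1**2)
    return solve3x3(A, b)

# Four microphones placed around the speaker (illustrative layout).
mics = [(0.25, 0, 0), (0, 0.25, 0), (0, 0, 0.25), (-0.25, 0, 0)]
# Simulated first-echo path lengths off a wall at x = 3 m:
# the image source then sits at (6, 0, 0).
true_img = (6.0, 0.0, 0.0)
dists = [math.dist(true_img, m) for m in mics]

img = locate_image_source(mics, dists)
wall_dist = math.hypot(*img) / 2  # the wall lies halfway to the image source
print(img, wall_dist)  # recovers (6, 0, 0) and a wall 3 m away
```

In practice the hard part, which the researchers' mathematics addresses, is deciding which echo belongs to which wall and whether the microphone geometry makes the reconstruction unambiguous.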
“Cars are already equipped with cameras. A new approach might be to include an acoustic sensor as well, to enhance the visual information already available and give a better picture of reality,” said Gregor Kemper, a professor of algorithmic algebra at Technical University.
The scientists say that when a microphone hears an echo, the time difference between the moment the sound was produced and the moment it was heard reveals the distance the sound traveled after bouncing off a wall.
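That time-to-distance conversion is simple round-trip arithmetic. A minimal sketch, assuming sound travels at roughly 343 m/s in room-temperature air (the constant and function name below are illustrative, not from the paper):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C

def wall_distance(echo_delay_s: float) -> float:
    """One-way distance to a reflecting wall from the echo delay.

    The pulse travels to the wall and back, so the measured delay
    covers twice the distance; halve the total path length.
    """
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# A 10 ms echo delay corresponds to a wall about 1.7 m away.
print(wall_distance(0.010))  # → 1.715
```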
“Our algorithm shows that sound adds a level of reliability to existing approaches and therefore engineers should consider pursuing their work to build navigational systems that listen,” said Mimi Boutin, associate professor of mathematics at Purdue University.
The approach could be applied in autonomous cars, drones, or underwater vehicles.
The researchers say it could even help firefighters working in the dark or people with visual impairments.
Boutin, whose work was published in a SIAM journal, says the work is significant because it demonstrates the feasibility of using sound for navigation in unmanned systems.