What would you do if you could see in the dark?
In a sudden power outage, you’d be more likely to have your smartphone at hand than a flashlight or a set of matches. That’s all well and good if you’re at home and know the layout: you’d probably just feel your way to wherever you keep the flashlight.
But what if you were somewhere less familiar?
“It could help people see in the dark”
Perhaps you’re somewhere without a backup generator, or maybe it just happened to fail at a large shopping mall or office building. What if, instead of using the glow of your cell phone’s screen to stumble towards an exit, you could simply snap your fingers and have your iPhone draw a map of your surroundings and your position within them? You could confidently navigate towards the door, knowing you’re not about to bump into or trip over a rogue clothes rack.
“It could help people see in the dark,” says Ivan Dokmanic, who is leading research at the École Polytechnique Fédérale de Lausanne that he hopes will produce an iPhone app able to do just that. “If there’s a power outage and you’re in a building it would definitely help, kind of like having advanced sonar” at your fingertips.
The source of sound could be “something as simple as finger snap”
He also says it could help people with impaired sight, and that the same technique might be used to estimate a room’s acoustic channel to improve the quality of a business teleconference or the music you’re listening to.
Dokmanic and his team at the Swiss university have created and tested a computer algorithm that analyzes the echoes of sound waves bouncing back from the walls to “determine the shape of the room,” says Dokmanic. It also picks up other objects, like furniture and people.
The source of sound could be “something as simple as finger snap,” says Dokmanic, adding that you need only click your fingers once to render an accurate map of the room. His algorithm works much the way bats do when they screech and listen for the echoes to ‘see’. Dokmanic hopes to simplify the technique and miniaturize the technology into a single app on your phone that you’d turn to when the lights go out.
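The core idea, stripped to its simplest case, is time-of-flight ranging: an echo arrives later than the direct sound, and that delay tells you how far away the wall is. Here is a minimal sketch of that one-wall calculation in Python; the names, sample values, and peak-picking shortcut are illustrative assumptions, not Dokmanic’s actual algorithm, which recovers full room geometry from echoes at several microphones.

```python
# Illustrative sketch of echo-based ranging: estimate the distance to a
# single wall from the delay between a snap and its first echo.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
SAMPLE_RATE = 44100     # Hz, a typical phone-microphone sampling rate

def wall_distance(direct_idx, echo_idx):
    """Distance to the wall, given the sample indices of the direct
    sound and its echo. The sound travels to the wall and back,
    hence the division by two."""
    delay_s = (echo_idx - direct_idx) / SAMPLE_RATE
    return SPEED_OF_SOUND * delay_s / 2

# Synthetic recording: a finger snap at sample 1000, echo at sample 2543
signal = np.zeros(8000)
signal[1000] = 1.0   # direct sound of the snap
signal[2543] = 0.4   # echo off the wall, attenuated

# Crude peak picking: the two loudest samples are the snap and its echo
peaks = np.argsort(np.abs(signal))[-2:]
direct, echo = sorted(peaks)
print(round(wall_distance(direct, echo), 2))  # → 6.0 (meters)
```

A real recording would need proper onset detection and would face overlapping echoes from every wall at once; disentangling those, and turning them into a full room shape, is exactly the hard problem the team’s algorithm solves.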
“you could actually navigate through the building even with just a single microphone”
Right now, his team uses four microphones dispersed around the room to gather data from sound reverberations, and that’s the main barrier preventing you from finding their algorithm in the App Store today, since our smartphones have only the one microphone. They’re currently working on refining the technology so that it works accurately with current smartphone hardware.
But Dokmanic has a short-term solution to bridge the gap until his research team overcomes that hurdle. He proposes shipping the app with a number of pre-loaded maps, which would compensate for the data lost by relying on just the inbuilt microphone of your iPhone. He says he could pre-map buildings like train stations, your office space, and even your home, so that when you snap your fingers the app would have a baseline idea of what the room looks like, on which it could then build a more accurate, current map. “Say you know roughly what the building looks like and you have the blueprints downloaded on your phone, you could actually navigate through the building even with just a single microphone,” says Dokmanic.
In the near future, Dokmanic says, you’ll be able to forget about the awkward scramble in the dark to find the flashlight. All you’ll have to do is tap your iPhone’s screen once. “I’d say within a year or two you could expect an app.”