Human Echolocation Now Possible Thanks to Acoustic Touch

By Jason Collins

Australian researchers have developed a new type of smart glasses that incorporate “acoustic touch,” which allows people with blindness and low vision to navigate using sound.

Acoustic Touch And Human Echolocation

While we’re still far away from making actual bat-people, this is a first step towards wearable human echolocation that can help those with blindness and low vision “see” better. According to IEEE Spectrum, the technology allows visually impaired users to reach for nearby objects without significant mental exertion.

Visually impaired people often rely on sound and touch to interpret the world around them. The new human echolocation technology combines the two, allowing wearers to interpret what’s in front of them using acoustic touch.

How Does It Work?

So, how does it work? Well, echolocation is a rather interesting concept: imagine a virtual cone extending out in front of the wearer’s head, acting as their field of vision.

If you are sighted, your field of vision depends on reflected light, which is what allows you to see. Echolocation works the same way, except it relies on sound instead of reflected light.

Sound Representations

Any object present within that virtual cone is recognized by the device’s computer and translated into a unique sound representation.

For example, the sound of rustling leaves might signify a plant, while a buzzing sound might signify that a smartphone is present in the device’s “field of view.”
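To make the idea concrete, here is a minimal sketch of how such a mapping from recognized object classes to audio icons could look in code. The class names, sound files, and the simpleaudio library are illustrative assumptions for the example, not details of the researchers’ actual system.

```python
# Illustrative sketch only: object classes, file paths, and the audio library
# are assumptions, not the researchers' implementation.
import simpleaudio as sa

# Each object class the vision model can recognize is paired with a short,
# distinctive audio icon.
SOUND_MAP = {
    "plant": "sounds/rustling_leaves.wav",
    "smartphone": "sounds/buzz.wav",
    "cup": "sounds/clink.wav",
}

def play_sound_for(object_class: str) -> None:
    """Play the audio icon for a recognized object class, if one is defined."""
    path = SOUND_MAP.get(object_class)
    if path is None:
        return  # classes without an audio icon stay silent
    sa.WaveObject.from_wave_file(path).play()

# Example: the camera detects a plant, so the wearer hears rustling leaves.
play_sound_for("plant")
```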

The wearable device may detect more objects than that, but it relays to the user only those that fall within the acoustic touch cone, much like the distinction between peripheral and focal vision in sighted people.

Focal Vision


This means the human echolocation technology relies on head position to dictate which sounds the wearable plays as the visually impaired wearer explores the surrounding environment.

In other words, the device only plays sounds for objects that sit within the wearer’s “focal vision,” for lack of a better term.
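As a rough illustration of that head-driven filtering, the sketch below checks whether an object lies inside a virtual cone aligned with the wearer’s head direction before its sound is played. The cone angle, coordinate conventions, and function names are assumptions made for the example, not parameters reported by the researchers.

```python
# Illustrative sketch: the cone half-angle and geometry are assumed values,
# not parameters from the actual acoustic touch device.
import numpy as np

CONE_HALF_ANGLE_DEG = 20.0  # assumed width of the "focal vision" cone

def in_focal_cone(head_direction, object_position, head_position):
    """Return True if the object lies inside the cone extending from the head."""
    to_object = np.asarray(object_position, float) - np.asarray(head_position, float)
    to_object /= np.linalg.norm(to_object)
    forward = np.asarray(head_direction, float)
    forward /= np.linalg.norm(forward)
    angle = np.degrees(np.arccos(np.clip(np.dot(forward, to_object), -1.0, 1.0)))
    return angle <= CONE_HALF_ANGLE_DEG

# Example: the wearer faces along +z and an object sits slightly off-axis.
print(in_focal_cone([0.0, 0.0, 1.0],   # head direction
                    [0.2, 0.0, 1.0],   # object position, roughly 11 degrees off-axis
                    [0.0, 0.0, 0.0]))  # head position -> True, so the sound plays
```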

The researchers tested the technology on seven blind or low-vision individuals and seven blindfolded sighted people, all of whom wore acoustic touch-enabled smart glasses.

Identifying Objects


Participants in the human echolocation test had to identify objects placed in front of them, and the blind and low-vision participants performed well, recognizing and reaching for those objects without added mental exertion.

The auditory feedback provided by the technology empowers users to identify and reach for objects with remarkable accuracy, which indicates that acoustic touch might become an effective method of sensory augmentation for the visually impaired.

Still Years Away


Of course, it’s safe to assume that we’re still years away from viable implementations, considering that the acoustic touch tech is in its infancy.

The human echolocation experiment was also conducted in a strictly controlled environment, conditions that the complexity of real-world settings rarely allows.

But it’s definitely a step in the right direction, and the technology is likely to keep improving as computer vision and object recognition mature.

There are already people on this planet legally recognized as cyborgs, and their number will likely grow once acoustic touch technology matures and becomes more widespread. Perhaps future iterations of human echolocation will even be powered by tears, like the smart contact lenses that were recently developed.