Humans Could Develop a Sixth Sense, Scientists Say

As animals go, humans have relatively limited senses. We can’t smell as well as dogs, see as many colors as mantis shrimp, or find our way home using the Earth’s magnetic poles as sea turtles do. But there’s one animal sense we can learn: bat-like echolocation.

Researchers in Japan demonstrated this feat in a paper published in the journal PLoS One, showing that humans can use echolocation, or the ability to locate objects through reflected sound, to identify the shape and rotation of various objects without light.

As bats swoop around objects, they send out high-pitched sound waves that then bounce back to them at different time intervals. This helps the tiny mammals learn more about the geometry, texture, or movement of an object.
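
To get a feel for the arithmetic behind that trick, here is a minimal sketch (not anything from the study) of how a round-trip echo delay translates into a distance estimate; the pulse, recording, and sample rate are stand-ins.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def estimate_range(emitted: np.ndarray, recorded: np.ndarray, sample_rate: float) -> float:
    """Estimate the distance to a reflector from the round-trip delay of its echo.

    Cross-correlate the recording with the emitted pulse to find the lag (in
    samples) at which the echo returns, then convert that lag to a one-way
    distance: distance = speed_of_sound * delay / 2.
    """
    correlation = np.correlate(recorded, emitted, mode="full")
    lag_samples = np.argmax(correlation) - (len(emitted) - 1)
    round_trip_seconds = lag_samples / sample_rate
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# Example: an echo arriving 5.8 ms after emission puts the object about 1 m away
# (343 m/s * 0.0058 s / 2 ≈ 0.99 m).
```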

If humans can similarly recognize these three-dimensional acoustic patterns, it could expand how we perceive the world, says study author Miwa Sumiya, Ph.D., a researcher at the Center for Information and Neural Networks in Osaka, Japan.

“Examining how humans acquire new sensing abilities to recognize environments using sounds, or echolocation, may lead to the understanding of the flexibility of human brains,” says Sumiya. “We may also be able to gain insights into sensing strategies of other species by comparing with knowledge gained in studies on human echolocation.”

Dolphins also use echolocation to identify and hunt down fish. (Dorling Kindersley//Getty Images)

This study is not the first to demonstrate echolocation in humans—previous work has shown that people who are blind can use mouth clicking sounds to “see” two-dimensional shapes. But Sumiya says that this study is the first to explore a particular kind of echolocation called time-varying echolocation. Beyond simply locating an object, time-varying echolocation would enable human users to better perceive its shape and movement as well.

To test participants’ ability to echolocate, Sumiya’s team gave them headphones and two tablets: one to generate the synthetic echolocation signal, and the other to listen to the recorded echoes. In a second room, out of the participants’ view, two oddly shaped cylinders either rotated or stood still. In cross-section, the cylinders resembled a bike wheel with either four or eight spokes.

When prompted, the 15 participants triggered their echolocation signals from the tablet. The sound waves were released in pulses, traveling into the second room and bouncing off the cylinders.

It took a bit of creativity to transform the sound waves back into something the human participants could recognize. “The synthetic echolocation signal used in this study included high-frequency signals up to 41 kHz that humans cannot listen to,” Sumiya explains. For comparison, bat echolocation signals in the wild range from 9 kHz all the way to 200 kHz, extending far beyond our hearing range of roughly 20 Hz to 20 kHz.
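
The article doesn’t spell out the exact waveform, but a short downward frequency-modulated sweep, sampled fast enough to capture content up to 41 kHz, gives a feel for what such an ultrasonic pulse might look like. In this sketch the 3 ms duration, sweep range, and 192 kHz sample rate are illustrative assumptions, not the study’s parameters.

```python
import numpy as np
from scipy.signal import chirp

SAMPLE_RATE = 192_000  # Hz; high enough to represent content up to 41 kHz
DURATION = 0.003       # 3 ms pulse (illustrative, not the study's value)

t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE

# A downward FM sweep from 41 kHz to 20 kHz, loosely mimicking the short
# frequency-modulated pulses many bats emit. Everything here sits at or above
# the top of human hearing (~20 kHz), which is why the echoes have to be
# shifted down before people can listen to them.
pulse = chirp(t, f0=41_000, t1=DURATION, f1=20_000, method="linear")
pulse *= np.hanning(len(pulse))  # taper the edges to avoid clicks
```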

When participants tap on the Android tablets, a synthetic echolocation signal is emitted from a loudspeaker (red lines). The recorded binaural sounds, whose pitch is converted to 1/8 of the original by lowering the sampling frequency, are presented to the participants through headphones (green lines). (Image courtesy of Miwa Sumiya)

The researchers employed a one-seventh scale dummy head with a microphone in each ear to record the sounds in the second room before transmitting them back to the human participants.

The microphones rendered the echoes binaural, like the surround sound you might experience at a movie theater or while watching an autonomous sensory meridian response (ASMR) video recorded using a binaural mic. The signals recorded by the miniature head were also shifted down to an eighth of their original frequency so the human participants could hear them “with the sensation of listening to real spatial sounds in a 3D space,” says Sumiya.
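
The down-conversion is described as lowering the sampling frequency, which is the classic “slow it down to pitch it down” trick. Here is a minimal sketch of that idea, assuming the python-soundfile library and a hypothetical file name: writing the recording back with the rate divided by 8 stretches it to eight times its length and drops every frequency by the same factor.

```python
import soundfile as sf  # assumed available; any WAV read/write library works

PITCH_DIVISOR = 8  # the echoes are reported as lowered to 1/8 of their original pitch

# Read the high-sample-rate binaural recording (hypothetical file name), then
# write it back with the sample rate divided by 8. On playback, a 40 kHz
# ultrasonic component becomes an audible 5 kHz tone.
echo, original_rate = sf.read("binaural_echo.wav")
sf.write("binaural_echo_audible.wav", echo, original_rate // PITCH_DIVISOR)
```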

Finally, the researchers asked participants to determine whether the echoes they heard came from a rotating or a stationary object. In the end, participants could reliably distinguish the two cylinders when they were rotating, by listening to the pitch of the time-varying echoes bouncing off them.
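
As a rough illustration of the cue being described, a script can check whether an echo’s dominant pitch drifts over time rather than holding steady. The sketch below, built on a basic spectrogram, is a simplification of that idea and not the analysis from the paper; the window size and tolerance are arbitrary.

```python
import numpy as np
from scipy.signal import spectrogram

def pitch_varies_over_time(echo: np.ndarray, sample_rate: float,
                           tolerance_hz: float = 200.0) -> bool:
    """Return True if the echo's dominant frequency drifts over time,
    the kind of time-varying cue a rotating object would produce."""
    freqs, times, power = spectrogram(echo, fs=sample_rate, nperseg=1024)
    dominant = freqs[np.argmax(power, axis=0)]  # peak frequency in each time slice
    return float(dominant.max() - dominant.min()) > tolerance_hz
```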

They were less adept at identifying the shapes from the stationary cylinders. Nevertheless, the researchers say that this is evidence that humans are capable of interpreting time-varying echolocation.

Sumiya hopes the technique could one day help humans perceive their spatial surroundings in a different way, for example by helping visually impaired users better sense the shape and features of objects around them.
