Our brains can detect and process delays in sound entering our ears that are too short to notice consciously, and use them to help our eyes fine-tune our estimates of the distance to nearby events, researchers say.

We often see distant events before we hear them — like distant lightning flashes seen seconds before they're heard — because sound moves a lot slower than light.

Consciously measuring that delay by counting seconds until the sound arrives gives us a good idea of how far away the lightning was.
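That count-the-seconds estimate is just the speed of sound multiplied by the delay. As a rough sketch (assuming ~343 m/s for sound in dry air at 20 °C; the exact figure varies with temperature and humidity):

```python
SPEED_OF_SOUND_M_S = 343.0  # approx. speed of sound in dry air at 20 degrees C

def distance_from_delay(delay_s: float) -> float:
    """Estimate distance (metres) to an event from the gap
    between seeing it and hearing it."""
    return SPEED_OF_SOUND_M_S * delay_s

# A 3-second gap between a lightning flash and its thunder puts
# the strike roughly a kilometre away:
print(distance_from_delay(3.0))  # 1029.0 metres
```

This is the familiar "about three seconds per kilometre" rule of thumb for thunderstorms.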

"The way distance has been studied is all about using visual information to infer how far away something is," says Philip Jaekl, a cognitive neuroscientist and co-author of a study appearing in PLOS ONE.

However, Jaekl and colleagues at the University of Rochester in New York wanted to find out if our estimates might also be influenced by delays of sound so short we're not conscious of them.

In one experiment, volunteers were asked to guess whether two images flashed on a screen in front of them, one after the other, were the same distance from them.

One image was accompanied by a simultaneous click noise, the other — which was in fact farther away — by a click that sounded identical but was delayed by an almost imperceptible amount of time.

With the clicks, volunteers became very adept at judging the relative distances, the researchers found.

In fact, they say, their study shows humans can unconsciously notice and make use of sound delays as short as 40 milliseconds.
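For scale, a quick back-of-the-envelope conversion (again assuming ~343 m/s for sound in air) shows how little extra distance a 40-millisecond delay corresponds to:

```python
# Extra source distance implied by a 40 ms sound delay,
# at roughly 343 m/s for sound in air.
extra_distance_m = 0.040 * 343.0
print(f"{extra_distance_m:.1f} m")  # 13.7 m
```

In other words, the volunteers were unconsciously picking up on delays equivalent to only about a dozen metres of extra distance.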

That may be down to activity in a part of the brain known as the superior colliculus, important for spatial processing, Jaekl says.

"Some of the audiovisual neurons in the superior colliculus respond greatest when the sound is delayed relative to the onset of a visual stimulus," he explains.

Visual cues will always be the main input for estimating distance, but the study suggests combined audiovisual processing can also come into play, the researchers say.

Whether we use the ability to unconsciously detect minuscule sound delays very much in our day-to-day lives is unclear, Jaekl says, but it may come into play when our vision is reduced, such as during nighttime.

"Our brains are very good at recognizing patterns that can help us," Jaekl says. "Now we also know that humans can unconsciously recognize the link between sound delays and visual distance, and then combine that information in a useful way."

© 2021 TECHTIMES.com All rights reserved. Do not reproduce without permission.