Mice Hear What They Touch: Study Reveals Whisker Sounds Are Processed in the Brain’s Auditory Cortex

In the dim confines of underground burrows, mice rely on their sensitive whiskers to explore their surroundings and detect nearby objects. While this “whisking” behavior has long been viewed as a purely tactile form of navigation, new research is reshaping that understanding. A study conducted at the Weizmann Institute of Science reveals that whisker movement also produces subtle sounds that mice can hear and process in their auditory cortex.

This revelation, recently published in Current Biology, was led by Professor Ilan Lampl of the Brain Sciences Department and offers a striking insight into the multisensory nature of perception. “Whiskers are so delicate that no one had thought of checking whether they produce sounds that mice are able to hear,” Lampl explained.

The research team, including Dr. Ben Efron (then a Ph.D. student), Dr. Athanasios Ntelezos, and Dr. Yonatan Katz, began by recording the faint sounds produced when whiskers brushed across various surfaces such as dried bougainvillea leaves and aluminum foil. Using high-sensitivity microphones capable of capturing ultrasonic frequencies—beyond what human ears can detect—they placed the devices just two centimeters from the source, simulating the distance between a mouse’s ear and its whiskers.
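The article does not describe the team’s analysis code, but the basic idea of checking such a recording for ultrasonic content can be sketched in a few lines of Python. The filename and the 250 kHz sampling rate below are illustrative assumptions, not details from the study.

```python
# Minimal sketch: checking a whisker-contact recording for ultrasonic
# content. "whisker_foil.wav" is hypothetical; the microphone must
# sample fast enough (e.g., 250 kHz) to capture frequencies well above
# the ~20 kHz upper limit of human hearing.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

fs, audio = wavfile.read("whisker_foil.wav")   # fs = samples per second
audio = audio.astype(np.float64)

# Short-time Fourier transform: power at each frequency over time.
freqs, times, power = spectrogram(audio, fs=fs, nperseg=1024)

# Fraction of total energy above 20 kHz, a crude measure of how much
# of the signal lies beyond the range of human hearing.
ultrasonic_fraction = power[freqs > 20_000].sum() / power.sum()
print(f"Energy above 20 kHz: {ultrasonic_fraction:.1%}")
```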

The next phase of the study involved recording neural activity in the auditory cortex of mice engaged in whisking behaviors. These recordings confirmed that the brain responded to whisker-generated sounds, regardless of how faint they were. Even when the neural pathways responsible for touch were disrupted, the auditory cortex still registered the sounds, indicating that these auditory signals were processed independently of tactile input.

To assess whether these sounds carried meaningful information for the mice, the team turned to artificial intelligence. They trained a machine-learning model to identify objects based on patterns of neural activity in the mice’s auditory cortex. The model correctly identified the objects, suggesting that the mice’s brains may interpret whisker-generated sounds as informative sensory cues.
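The article does not name the model the team used, so the sketch below stands in with a linear classifier from scikit-learn trained on hypothetical per-trial spike counts. The synthetic data are placeholders meant to show the shape of the analysis, not to reproduce the study’s results.

```python
# Minimal sketch of neural decoding: predict which object was touched
# from auditory-cortex activity. The model choice (linear SVM) and the
# spike-count features are assumptions; the data here are synthetic
# placeholders, so accuracy will sit near chance.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_neurons, n_objects = 200, 50, 4

# One row per trial: spike counts from each recorded neuron, plus a
# label for the object the whiskers contacted on that trial.
X = rng.poisson(lam=5.0, size=(n_trials, n_neurons)).astype(float)
y = rng.integers(0, n_objects, size=n_trials)

# Cross-validated accuracy well above chance (1/n_objects) would mean
# object identity is decodable from the neural responses.
scores = cross_val_score(LinearSVC(max_iter=10_000), X, y, cv=5)
print(f"Decoding accuracy: {scores.mean():.2f} (chance = {1/n_objects:.2f})")
```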

In a further step, the researchers developed another machine-learning model—this one trained directly on the recorded whisker sounds rather than brain activity. This model was equally successful in object identification, reinforcing the idea that the sounds themselves were the primary drivers of the brain’s auditory response, rather than signals from other senses like smell or touch.
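Again, no implementation details are given; a sketch of the sound-only model might extract a simple spectrogram feature from each labeled clip and fit an off-the-shelf classifier, as below. The filenames, labels, and the assumption of equal-length clips are all illustrative.

```python
# Minimal sketch of the sound-only model: classify objects from the
# whisker sounds themselves. Filenames and labels are hypothetical,
# and all clips are assumed to be the same length so that flattened
# spectrograms form equal-length feature vectors.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def sound_features(path, nperseg=1024):
    """Flatten a clip's log-power spectrogram into a feature vector."""
    fs, audio = wavfile.read(path)
    _, _, power = spectrogram(audio.astype(float), fs=fs, nperseg=nperseg)
    return np.log(power + 1e-12).ravel()

# Hypothetical labeled clips; several per object are needed for the
# five-fold cross-validation below to run.
clips = [("leaf_01.wav", "leaf"), ("foil_01.wav", "foil")]  # ...etc.
X = np.stack([sound_features(path) for path, _ in clips])
y = np.array([label for _, label in clips])

scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"Sound-only accuracy: {scores.mean():.2f}")
```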

The ultimate test came in the form of a behavioral experiment. Mice with disabled touch sensation were trained to identify aluminum foil using only the sound their whiskers made upon contact. The animals reliably responded to the correct sound cues, demonstrating that they could recognize objects through whisker-generated sound alone.

“These results show that the vibrissa system, or the whisking network in the brain, functions in an integrated, multisensory way,” said Lampl. This sensory integration, he added, may have evolved to help mice avoid predators or seek food more effectively—for instance, by allowing them to judge whether to cross noisy terrain or to detect whether a plant stem is fresh and edible.

Beyond animal behavior, the study opens new doors for technological innovation. Understanding how the brain integrates sensory information could influence the development of advanced prosthetics, aid rehabilitation after brain injuries, or enhance perception in individuals with visual impairments. Already, the concept mirrors existing tools such as the white cane, which creates distinguishable sounds upon contact with different surfaces.

There are implications for robotics as well. “Integrating different types of sensory input is a major challenge in the design of robotic systems,” Efron noted. Insights from the mouse’s whisking system could inspire early-warning sensors that function effectively even when visual data is limited, such as in smoky or dark environments.

This research marks a significant advance in our understanding of how the brain merges sensory inputs and may pave the way for future interdisciplinary innovations.

Source: https://phys.org/news/2025-05-whisker-generated-encoded-auditory-cortex.html
