Why the echolocating brown bat is an ideal model for deciphering the complexities of the way humans perceive sound.
The world is a noisy place. How do our brains decide which sounds to pay attention to and which ones to ignore? With a $1 million grant from the National Science Foundation, a team of researchers from the Whiting School and the Krieger School of Arts and Sciences is using the echolocating brown bat as a model to help decipher the intricacies of human auditory perception.
Bats use echolocation to paint a 3-D image of the world around them, emitting ultrasonic waves from their larynx that bounce off objects in their environment and back to their ears. But little is known about the nature of the sound waves themselves, notes Rajat Mittal, professor of mechanical engineering. “We want to understand the entire process of how a bat emits the ultrasound waves, their spectrum and characteristics, and in what way the animal is processing this returning echo to help it distinguish different types of targets in its environment,” he says.
Cindy Moss, professor in the Department of Psychological and Brain Sciences at the Krieger School, uses tiny sensory implants to examine the neural pathways of a bat’s brain as it performs different tasks in her “bat lab.” The soundproof room is equipped with highly sensitive microphones that record bat calls and high-speed cameras that track their movements in flight. The data that Moss records will be used by the engineering team to create detailed computer models of the ultrasound waves, including their frequency and wavelength.
How does a bat in flight differentiate a tasty insect from an unpalatable one? As part of its research, the team will record and analyze a bat’s calls in relation to objects of different sizes, shapes, and textures, creating a catalog of unique echoes based on the nature of the target. “We’ll be able to play a lot of ‘what-if’ games based on the data,” says Mittal. “We can take a recorded echo, put it in our computational model, and change the target object to whatever we want. We could put in a butterfly and model precisely how the echo would be different from one type of butterfly to another.”
Like humans, bats are subject to a multitude of sounds in their environment, but how does a creature that relies on sound to “see” deal with all the chatter? Mounya Elhilali, the Charles Renn Faculty Scholar, associate professor of electrical and computer engineering, and director of the Laboratory for Computational Audio Perception, studies the so-called cocktail party problem—how people make sense of overlapping sounds when they walk into a crowded room. Her part of the research involves looking at how bats distinguish their own calls from those of other bats and how their neural responses change accordingly. Elhilali says that her work creating mathematical models based on human responses to the cocktail party problem should inform her work with bats and vice versa, creating new insights into how mammals navigate complex social situations.
Bats rely on an involved series of head tilts and ear movements to perceive sound. Using images recorded by Moss in her bat lab, Mittal and Jung-Hee Seo, associate research professor in the Department of Mechanical Engineering, will model how a bat’s ears and head shift relative to a target when echolocating. “What we’re trying to understand is how the sound wave is modified by the shape and orientation of the bat’s ear,” says Seo.
A key component of the team’s research involves wirelessly monitoring the neural responses of actively flying bats as they navigate objects, rather than monitoring passive bats that have been anesthetized or are sitting on a platform. “We’re trying to engage the bat in what we call active listening—having the bat pay attention to A versus B,” says Elhilali, who notes that such behavior has been shown to rewire the brain in terms of how it processes sound. “The way the brain does computation is not static. It adapts its behavior, and this can have a lot of interesting engineering implications as it relates to understanding adaptive technologies.”
The Big Picture
Building a detailed picture of how bats use echolocation to move about will, the researchers hope, give them a better idea of how other animals use sound in comparable ways. “What the bats are accomplishing is a very complex version of what social animals do all the time,” says Mittal. “By looking at the entire process of sound emission, reception, and signal processing in the brain, we’ll be able to relate that to what other social animals—including humans—are doing in similar situations.”