[Keeping track while tracking]
As a bat navigates and hunts, it receives brief “snapshots” of prey, obstacles, and landmarks around it in the form of echoes. By the time the bat has produced another call and received a new “snapshot”, its angle and distance from those objects have changed, dramatically altering the acoustic scene. How do bats reconstruct these disparate bits of information into a continuous representation of objects in their environment? This project investigates how temporal and acoustic features are integrated to allow for identification and discrimination of rotated objects.
[Hearing the forest for the trees]
Echolocating bats navigate and forage using only echoes to guide them. Aerial-hawking bats are able to track and intercept prey despite the presence of off-target echoes created by objects in the environment. Foliage and other objects produce scattered, chaotic echoes that overlap and interact with target echoes. The neural mechanisms that enable accurate target discrimination under these difficult conditions are currently unknown and represent a fundamental problem for all organisms that engage in auditory scene analysis.