Multifeature visual processing in larval zebrafish

Doctoral defense by Katja Slangewal, supervised by Armin Bahl

  • Date: Apr 10, 2026
  • Time: 04:00 PM - 07:00 PM (local time, Germany)
  • Speaker: Katja Slangewal
  • Location: University of Konstanz
  • Room: ZT 1202
The world is full of sensory information. Animals can detect and use this diverse information to make decisions. In most situations, multiple sensory cues are available simultaneously. By integrating sensory cues, animals can improve the speed and accuracy of their decisions and resolve ambiguous or conflicting situations. How exactly do they do this? Which algorithms do animals use to combine different sources of information? And how are these algorithms implemented in the brain?

In my thesis, I investigated how larval zebrafish integrate multiple visual features to guide sensorimotor decisions. Because of their small size, optical transparency, and robust visually guided behaviors, larval zebrafish are a powerful model system for addressing these questions. Specifically, I examined the combination of motion and luminance cues, both of which elicit innate behaviors in zebrafish larvae. Motion cues drive the optomotor response, in which fish follow optic flow, while luminance cues evoke positive phototaxis toward brighter regions.

I employed high-throughput behavioral assays combined with computational modeling to identify the minimal algorithm underlying multifeature visual integration. My results support an additive integration model in which three visual features (temporally integrated motion, lateral luminance level, and changes in luminance) are processed in parallel and then linearly combined to drive behavior. Building on this behavioral model, I used two-photon calcium imaging to measure brain-wide neural activity during sensory integration. These experiments revealed distinct neural representations corresponding to each computational model component. Together, they form a distributed network that converges in the anterior hindbrain, a region known to bias swim direction. Single-cell neurotransmitter identity and morphological analyses showed that these functional nodes comprise both excitatory and inhibitory neurons. Their projection patterns suggest three parallel processing streams that may converge onto spatially segregated dendritic compartments of multifeature-integrating neurons in the anterior hindbrain.
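The core of an additive integration model of this kind can be sketched in a few lines. The snippet below is purely illustrative, not the thesis's actual model: the function names, weights, and the leaky temporal integrator for the motion channel are assumptions chosen to show the general structure (parallel feature channels, linear combination into a single behavioral drive).

```python
import numpy as np

def leaky_integrate(signal, tau=2.0, dt=0.1):
    """Illustrative leaky temporal integration of motion evidence:
    the accumulated value decays toward zero with time constant tau
    while new input is added at each step."""
    out = np.zeros(len(signal), dtype=float)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + dt * (-out[t - 1] / tau + signal[t])
    return out

def integrate_features(motion, luminance_level, luminance_change,
                       weights=(1.0, 1.0, 1.0)):
    """Hypothetical additive model: three independently processed
    feature channels are linearly combined into one scalar drive
    that could, e.g., bias turn direction. Weights are placeholders;
    in practice they would be fit to behavioral data."""
    w_m, w_l, w_c = weights
    return w_m * motion + w_l * luminance_level + w_c * luminance_change

# Example: constant motion evidence, fixed luminance cues
motion_trace = leaky_integrate(np.ones(100))
drive = integrate_features(motion_trace[-1], 0.5, -0.2)
```

Because the combination is linear, each channel's contribution can be measured or ablated independently, which is what makes such a model experimentally tractable at the behavioral level.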

Overall, my findings support a modular and flexible strategy for integrating multiple visual features at both behavioral and neural levels. While my work provides a single example of sensory integration in a vertebrate brain, comparison with other studies of multisensory integration offers broader insights. Similarities and differences across systems suggest that factors such as stimulus reliability or the experimental paradigm may account for observed differences in multisensory integration strategies. By combining behavioral analysis, computational modeling, and whole-brain imaging, this thesis provides a framework that can be extended to test the modularity and flexibility of additive integration across development, sensory domains, and species.