The mushroom bodies in the insect brain play a central role in olfactory learning and memory, performing functions somewhat analogous to those of the vertebrate hippocampus. Together with a nearby brain centre, the lateral horn, the mushroom bodies receive olfactory information from the antennal lobe via projection neurons. Most of our current knowledge of how the mushroom bodies work (how they process and store olfactory information and recall olfactory memories) comes from studies of the fruit fly Drosophila melanogaster, which enables neurogenetic exploration of these functions. However, the mushroom bodies also receive other sensory inputs, such as visual information, from other brain centres, and very little is known about how they participate in visual information processing and multisensory learning.
James will use our two-photon microscope system to study this question in head-fixed Drosophila behaving in a bespoke virtual-reality/track-ball system. He will investigate how visual information is routed and represented neurally in the mushroom body structures using transgenic Drosophila in which selected neurons within the different mushroom body lobes (and elsewhere in the central and visual brain) express calcium- or voltage-sensitive activity reporters (GCaMP6f and ASAP4, respectively). James will quantify how visual stimuli, delivered by the virtual-reality stimulation system and paired with a sucrose reward or heat punishment, change the mushroom body neural activity representing those stimuli, seen as local and global calcium or voltage changes reported by GCaMP6f or ASAP4 fluorescence in two-photon imaging.
The aim is to understand mechanistically how the mushroom bodies and nearby neural circuits channel, represent and store visual information, and how they participate in representing and recalling visual memories.