Supervisor: Dr Trevor Wardill
The neural basis of the colour opponent process in animal visual systems is currently poorly understood. In honey bees, it is known that signals leaving the second optic neuropil, the medulla, already carry colour-opponent information, indicating that colour receptor signals are compared earlier in the pathway. The fruit fly remains a premier model in vision research because of its small brain and compact neural circuits, which can be manipulated with sophisticated genetic tools. The aim of this project is to functionally characterise the neuronal implementation that underlies colour vision. By combining genetic, imaging, electrophysiological and behavioural methodologies (e.g. Wardill et al., 2012), the project will elucidate the biological mechanisms that allow a uniform visual perception of colour despite a stochastic distribution of colour receptors.
To determine precisely which neurons perform these computations, and how, a dedicated two-photon imaging system will be used to quantify fluorescently reported neural activity (e.g. with GCaMP6 or ASAP1) in conjunction with a unique panoramic colour stimulation display. The experimenter will have control over the colour, pattern and velocity of the delivered stimuli, so that spectral, spatial and temporal neuronal receptive field maps can be measured. Fluorescent neural activity indicators will be targeted using the latest labelling methods to characterise either diverse or specific neuron populations.
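As a rough illustration of how such receptive field maps can be extracted from imaging data, the sketch below estimates a spatial receptive field by response-weighted averaging (reverse correlation) of stimulus frames. The inputs (`stimulus`, `dff`) and the fixed lag are hypothetical simplifications; a real GCaMP analysis would also account for the indicator's slow kinetics, for example by deconvolution.

```python
import numpy as np

def receptive_field(stimulus, dff, lag=2):
    """Average stimulus frames weighted by the fluorescence response
    `lag` frames later (a crude correction for indicator delay).
    stimulus: array of shape (time, y, x); dff: ΔF/F trace, shape (time,)."""
    stimulus = np.asarray(stimulus, dtype=float)
    dff = np.asarray(dff, dtype=float)
    weights = dff[lag:] - dff[lag:].mean()   # mean-subtracted responses
    frames = stimulus[:len(weights)]         # frames preceding each response
    # rf[y, x] = average over time of weights[t] * stimulus[t, y, x]
    return np.tensordot(weights, frames, axes=(0, 0)) / len(weights)

# Toy demo: a simulated neuron that responds to one bright pixel.
rng = np.random.default_rng(0)
stim = rng.random((500, 4, 4))
resp = np.zeros(500)
resp[2:] = stim[:-2, 1, 3]                   # responds to pixel (1, 3), lag 2
rf = receptive_field(stim, resp, lag=2)
print(np.unravel_index(rf.argmax(), rf.shape))
```

The same weighting logic extends to spectral and temporal maps by indexing the stimulus along wavelength or by sweeping the lag.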
To better understand the function of colour vision, head-fixed animals placed in a virtual reality (VR) arena will also undertake behavioural navigation tasks. By monitoring their movements (e.g. Moore et al., 2014; Weir & Dickinson, 2015) and externally modulating neural activity in visual circuits with a red-shifted channelrhodopsin (whose activating light does not distract the animal), the behavioural contexts in which colour vision is computed can be revealed. The ultimate goal of the project is to determine how colour vision is computed and used during natural behaviours such as finding food and mates, and avoiding predation.
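The closed-loop principle behind such a VR arena can be sketched as follows: each frame, the fly's fictive rotation is read from the trackball (as FicTrac-style software reports it) and the panoramic display is counter-rotated so the virtual world appears stationary as the fly turns. The function name and gain parameter here are illustrative, not part of any real arena's API.

```python
import math

def update_display_heading(display_heading, ball_yaw_delta, gain=-1.0):
    """Advance the displayed panorama heading by the fly's yaw change.
    gain = -1 yields a world-stable scene; other gains probe the loop."""
    heading = display_heading + gain * ball_yaw_delta
    return heading % (2 * math.pi)   # wrap heading to [0, 2*pi)

# Toy demo: 35 frames of 10-degree rightward turns; the panorama
# counter-rotates, ending 10 degrees short of a full revolution.
heading = 0.0
for _ in range(35):
    heading = update_display_heading(heading, math.radians(10))
print(round(math.degrees(heading)))
```

Setting the gain away from -1 (or injecting open-loop colour stimuli at chosen headings) is what lets the experimenter dissociate the fly's own movements from the visual input it receives.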
Moore RJ, Taylor GJ, Paulk AC et al. 2014. FicTrac: a visual method for tracking spherical motion and generating fictive animal paths. Journal of Neuroscience Methods 225: 106-19.
Wardill TJ, List O, Li X et al. 2012. Multiple spectral inputs improve motion discrimination in the Drosophila visual system. Science 336: 925-31.
Weir PT & Dickinson MH. 2015. Functional divisions for visual processing in the central brain of flying Drosophila. PNAS 112: E5523-32.