August 5, 2021
8:30 a.m., Zoom Meeting
CVS Thesis Defense: Adam Danz, BCS Graduate Student, Advisor: Greg DeAngelis
Predictive Steering Control and Neuronal Representation of Heading in Macaques
Vision plays a critical role in both sensing and guiding self-motion through the environment. Optic flow is the pattern of velocities that forms on the retina as the observer moves through the environment. When the eyes are stationary in the world, optic flow provides an estimate of the observer's velocity; but when the eyes rotate, rotational velocity obscures the flow field, and the visual system must extract the translational component to recover heading information. Various computational models propose biologically plausible visual and non-visual mechanisms that perform this task, with mixed behavioral and neural evidence supporting them. These models disagree about whether depth information and non-visual estimates of heading, such as vestibular responses, are required to extract heading from optic flow.
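As background (not part of the thesis itself), the translation/rotation structure of retinal flow can be sketched with the standard pinhole-camera motion-field equations, in which only the translational term depends on scene depth; the function name and sign conventions here are illustrative assumptions:

```python
def retinal_flow(x, y, Z, T, omega, f=1.0):
    """Instantaneous flow (u, v) at image point (x, y) for an observer
    translating with T = (Tx, Ty, Tz) and rotating with omega = (wx, wy, wz),
    under a pinhole model with focal length f. Depth Z scales only the
    translational component; the rotational component is depth-independent,
    which is why rotation obscures heading in the combined field."""
    Tx, Ty, Tz = T
    wx, wy, wz = omega
    # Translational component: depends on depth Z (carries heading information)
    u_t = (-f * Tx + x * Tz) / Z
    v_t = (-f * Ty + y * Tz) / Z
    # Rotational component: independent of depth (must be discounted)
    u_r = (x * y * wx / f) - (f + x**2 / f) * wy + y * wz
    v_r = (f + y**2 / f) * wx - (x * y * wy / f) - x * wz
    return u_t + u_r, v_t + v_r
```

For pure forward translation the flow vanishes at the focus of expansion, while the rotational term is identical at every depth; separating the two is the computational problem the models above address.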
Determining which sources of sensory information contribute to heading perception during eye rotation is critical, since vision supplies essential feedback for trajectory planning in active navigation, during which the eyes are often moving.
Additional, non-sensory information is available when movement is controlled by the navigator. Strong evidence from studies of motor control supports the use of internal models of the environment and of the biomechanical kinematic system, but only recently have such control models been applied to the control of navigation. Very little is known about how vision and model-based estimates of self-motion are integrated.
To determine which of the disputed sources of sensory information the visual system uses to extract heading from optic flow, we examined the rotation tolerance of heading tuning in macaque area MSTd while varying depth information and vestibular heading signals. The results indicate that the rotation tolerance of heading tuning in MSTd neurons depends neither on depth variation nor on the presence of vestibular signals. Instead, they suggest that the visual system solves this problem independently, using dynamic perspective cues.
To determine how visual estimates of self-motion guide active navigation, macaque monkeys were trained to steer through a virtual environment in a novel paradigm that removed the path from the scene, requiring the animals to steer based on an internal model of the environment. By perturbing their trajectories and varying the reliability of optic flow, we measured how model-based and vision-based estimates of heading are combined according to their reliability. A model of steering control, using a Kalman filter for state estimation and internal models of the environment for trajectory planning, was designed to test predictions of the control system. We found evidence, consistent with optimal control theory, that navigation is guided by internal and sensory estimates of self-motion weighted by their relative reliability.
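To illustrate the reliability weighting at the heart of such a model (this is a generic textbook sketch, not the thesis model itself; the function name and scalar-state simplification are assumptions), a single Kalman-filter measurement update fuses an internal-model prediction with a visual heading observation in proportion to their reliabilities:

```python
def fuse_heading(pred, pred_var, obs, obs_var):
    """One Kalman measurement update for a scalar heading estimate.
    pred/pred_var: internal-model prediction and its variance.
    obs/obs_var: visual (optic-flow) observation and its variance.
    The gain K weights the observation by its reliability relative to
    the prediction; the result is the minimum-variance linear combination."""
    K = pred_var / (pred_var + obs_var)   # Kalman gain
    est = pred + K * (obs - pred)         # reliability-weighted estimate
    est_var = (1.0 - K) * pred_var        # fused uncertainty (always reduced)
    return est, est_var
```

When optic flow is degraded (large obs_var), K shrinks toward zero and the estimate leans on the internal model; with reliable flow, the visual observation dominates, which is the qualitative behavior predicted by optimal control accounts.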
Together, these studies elucidate the information required for the visual system to estimate heading and how visual estimates of heading contribute to active control of self-motion.