Manipulating synthetic optogenetic odors reveals the coding logic of olfactory perception

Authors: Edmund Chong, Monica Moroni, Christopher Wilson, Shy Shoham, Stefano Panzeri, Dmitry Rinberg


Introduction: Advances in monitoring brain activity at large and fine scales have revealed tremendous complexity in how the brain responds to, and represents, the external world. Although many features in brain activity patterns (which brain cells fire and when) are found to correlate with changes in the external sensory world, it is not yet known which activity features are consequential for perception and how they are combined to generate percepts. Some studies have shown that many of these correlated changes in activity may be redundant or even epiphenomenal.

Rationale: To address how brain activity generates perception, we directly and systematically manipulated neural activity in the mouse olfactory system while measuring perceptual responses. Mouse olfaction is an attractive model system because the relevant brain circuitry has already been carefully mapped and is accessible for direct manipulation. We used genetically engineered mice in which brain cells can be activated simply by shining light on them—a technique known as optogenetics. Optogenetics allowed us to generate and manipulate brain activity directly, in a precise and parametric manner. We first trained mice to recognize light-driven activity patterns in the olfactory system, or “synthetic odors.” We then measured how recognition changed as we systematically manipulated the learned activity patterns. Some manipulations led to larger changes in recognition than others, and the degree of change reflected the importance of each manipulated feature to perception. By additionally manipulating multiple features simultaneously, we could precisely quantify how individual features combined to produce perception.

Results: The perceptual responses of mice depended not only on which groups of cells were activated but also on their activation latencies, i.e., temporal sequences akin to timed notes in a melody. Critically, the most perceptually relevant activation latencies were defined relative to other cells in a sequence, not to brain or body rhythms (e.g., animal sniffing) as previously hypothesized from observational studies. Moreover, earlier-activated cells in the sequence had a larger effect on behavioral responses; modifying later cells in the sequence had smaller effects. To account for all results, we formulated a simple computational model based on template matching, in which new activity sequences are compared with learned sequences, or templates. The model weighs relative timing within each sequence and also accounts for the greater importance of earlier-activated cells. Based on our model, the degree of mismatch between the new sequence and the learned template predicts the extent to which recognition should degrade as neural activity changes across many different manipulations.
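To make the template-matching idea concrete, the following is a minimal toy sketch, not the authors' actual model: it scores the similarity between an observed activation sequence and a learned template, comparing latencies relative to each sequence's own first event (rather than an external clock) and weighting earlier-activated cells more heavily (primacy). The function name, parameters (`tau`, `decay`), and Gaussian penalty are all illustrative assumptions.

```python
import numpy as np

def match_score(latencies, template, tau=10.0, decay=0.5):
    """Toy similarity between an observed activation sequence and a
    learned template. Both arguments are activation latencies (ms) for
    the same set of cells. Timing is measured relative to each
    sequence's own earliest event; earlier template cells get larger
    weights. Returns a score in (0, 1], with 1.0 for a perfect match.
    """
    latencies = np.asarray(latencies, dtype=float)
    template = np.asarray(template, dtype=float)
    # Relative timing: align each sequence to its own first activation,
    # not to an external rhythm such as sniffing.
    rel_obs = latencies - latencies.min()
    rel_tmp = template - template.min()
    # Primacy weights: rank cells by template latency, then decay
    # geometrically so earlier-activated cells count more.
    rank = np.argsort(np.argsort(template))
    weights = decay ** rank
    weights /= weights.sum()
    # Gaussian penalty on per-cell timing mismatch.
    penalty = np.exp(-((rel_obs - rel_tmp) ** 2) / (2 * tau**2))
    return float(np.sum(weights * penalty))

# With a learned template of four cells firing at 0, 10, 20, 30 ms,
# delaying the first cell degrades the match more than delaying the
# last cell by the same amount, reflecting primacy.
template = [0.0, 10.0, 20.0, 30.0]
score_early = match_score([15.0, 10.0, 20.0, 30.0], template)
score_late = match_score([0.0, 10.0, 20.0, 45.0], template)
```

Here `score_early < score_late`, illustrating how a mismatch-based score can reproduce both the relative-timing and primacy findings in the abstract.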

Conclusion: We developed an experimental and theoretical framework to map a broad space of precisely and systematically manipulated brain activity patterns to behavioral responses. Using this framework, we uncovered key computations made by the olfactory system on neural activity to generate percepts and derived a systematic model of olfactory processing directly relevant for perception. Our framework forms a powerful, general approach for causally testing the links between brain activity and perception or behavior. This framework is especially pertinent given the continued development of advanced tools for manipulating brain activity at fine scales across various brain regions.

Source: Science, 2020