Marlene Cohen, PhD

Paying attention makes intuitive sense: we believe that if we attend to what we see, we will respond better and perceive the world around us more clearly. But just how does attention affect perception, and vision more generally? We may have some new ways of understanding the nature of attention, according to Marlene Cohen, PhD, an associate professor of neuroscience at the University of Pittsburgh. By understanding the brain's internal state, we stand a better chance of understanding how that state shapes our ability to interact with the world around us, she notes, adding that such knowledge is highly relevant to how we could one day better assess and treat attention-disrupting developmental disorders like schizophrenia or autism.

Over the last two decades, Cohen has been developing methods to simultaneously measure and analyze the activity of dozens of neurons within vision-processing areas of the brain. Why is this important? Because brains are noisy environments, and traditionally, neuroscientists have recorded from one cell at a time. Before her work, scientists tried to discern the role of attention in visual perception by recording from single neurons and averaging their responses over many trials in which the same visual image was presented again and again. But in the real world, humans and animals make quick decisions based on a single viewing of a stimulus. By recording from many neurons simultaneously, Cohen could capture a snapshot of the sensory information available at a given moment. She combined the responses of 80 neurons to estimate where an animal was paying attention at each moment. The result? Cohen found that shifts in attention had huge effects on how well the animal could detect a change in a visual image. When the animal's attention wandered to the wrong location, it could almost never detect even a fairly obvious change in the image. (This makes clear that the shifts in attention all of us experience, for example while texting while driving, have serious consequences for our ability to perform even very basic perceptual tasks.) Cohen could predict, with 80 percent accuracy, an animal's state of attention, all from a tiny fraction of the brain's roughly 100 billion neurons (2011, Journal of Neuroscience).
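To make the decoding idea concrete, here is a minimal, hypothetical sketch of how simultaneously recorded spike counts might be combined to guess where attention is directed on a single trial. The data are synthetic and the decoder (a simple logistic regression) is an illustration of the general approach, not Cohen's published analysis.

```python
# Illustrative sketch, not Cohen's actual analysis code: decoding an
# animal's attentional state from simultaneously recorded spike counts.
# Assumes a trials-by-neurons matrix of spike counts and a label per
# trial indicating which location the animal was cued to attend.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_neurons = 400, 80             # ~80 neurons, as in the 2011 study
attended = rng.integers(0, 2, n_trials)   # hypothetical labels: 0 = left, 1 = right

# Synthetic spike counts: each neuron's firing rate shifts slightly with
# the attended location, mimicking attention-related response modulation.
gain = rng.normal(0, 0.5, n_neurons)
rates = 10 + gain * (attended[:, None] * 2 - 1)
spike_counts = rng.poisson(np.clip(rates, 0.1, None))

# Linear decoder: combine all neurons' responses on a single trial to
# estimate where attention was directed on that trial.
decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, spike_counts, attended, cv=5).mean()
print(f"cross-validated decoding accuracy: {accuracy:.2f}")
```

With real recordings, the spike-count matrix would come from the electrode array and the labels from the task's attention cues; the point is that a single trial's population response carries enough information to read out the attentional state.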

In a related study, Cohen also found that interactions between neurons, such as how strongly their trial-to-trial variability (noise) is correlated, have a huge effect on how well they encode information about the visual world. That study (2009, Nature Neuroscience) had a major impact on the field, with nearly 500 citations, and it is now commonplace to measure interactions between neurons rather than just the properties of single neurons.
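The key quantity in that study, often called a noise correlation, can be illustrated with a short sketch: for repeated presentations of the same stimulus, subtract each neuron's mean response and correlate the residual, trial-to-trial fluctuations across pairs of neurons. The data layout and numbers below are assumptions made for illustration, not the paper's actual code or data.

```python
# Illustrative sketch of pairwise "noise correlations": correlations in
# trial-to-trial fluctuations of spike counts around each neuron's mean
# response to a repeated, identical stimulus.
import numpy as np

def noise_correlations(spike_counts):
    """spike_counts: array of shape (n_trials, n_neurons) for ONE stimulus.

    Subtracting each neuron's mean response leaves only the trial-to-trial
    variability ("noise"); its pairwise correlation limits how much
    information the population can encode. With multiple stimuli, each
    stimulus's mean would be subtracted separately before correlating.
    """
    residuals = spike_counts - spike_counts.mean(axis=0)  # remove stimulus-driven mean
    return np.corrcoef(residuals, rowvar=False)           # n_neurons x n_neurons matrix

# Synthetic example: a shared gain fluctuation on each trial induces
# positive correlations across the whole population.
rng = np.random.default_rng(1)
shared = rng.normal(0, 1, size=(500, 1))                  # common fluctuation per trial
drive = np.clip(10 + 2 * shared + rng.normal(0, 1, size=(500, 40)), 0.1, None)
counts = rng.poisson(drive)
r = noise_correlations(counts)
off_diag = r[~np.eye(len(r), dtype=bool)]
print(f"mean pairwise noise correlation: {off_diag.mean():.3f}")
```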

Recording from groups of cells at the same time is essential, according to Cohen, because almost every neuropsychiatric disorder reflects a problem in a network of neurons, and we need to learn how such networks function and how their activity relates to behavior. Cohen is especially excited because if she learns how spatial attention evolves in a network of neurons, she may uncover general mechanisms that apply to other systems, such as touch, smell, and hearing, and even to cognitive or motor processes. She is continuing her landmark work by recording in two areas of the brain at once: areas responsible for processing visual cues and premotor areas responsible for planning how the body responds to those cues. She wants to know what premotor networks receive of the visually encoded information, and how that information depends on an animal's cognitive state, that is, what it is planning to do with the information.

In other work, Cohen is collaborating with computational neuroscientists such as Brent Doiron, a professor of mathematics at Pitt, with whom she shares a Simons Foundation award. In his computational models, Doiron has shown that attention should have different effects on excitatory and inhibitory neurons. Cohen is developing techniques for intracellular recordings in animals performing attention tasks to test this prediction; Doiron, in turn, uses Cohen's experimental observations to constrain his models. The experimental discoveries Cohen makes will help narrow the range of theoretical models that can work.
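To illustrate what "different effects on excitatory and inhibitory neurons" could mean in a model, here is a deliberately generic Wilson-Cowan-style rate model in which a single top-down "attention" input drives the excitatory and inhibitory populations with different strengths. All weights and inputs are invented for illustration; this is a toy sketch, not Doiron's actual model.

```python
# Toy excitatory/inhibitory rate model (generic Wilson-Cowan style, NOT
# Doiron's model) showing how one top-down input can modulate excitatory
# and inhibitory populations differently. All parameters are invented.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def steady_state(attn_E, attn_I, steps=5000, dt=0.001):
    E, I = 1.0, 1.0                              # population firing rates
    w_EE, w_EI, w_IE, w_II = 1.2, 1.0, 1.5, 0.8  # invented coupling weights
    tau_E, tau_I = 0.02, 0.01                    # time constants (s)
    base = 2.0                                   # baseline sensory drive
    for _ in range(steps):                       # Euler integration to steady state
        dE = (-E + relu(w_EE * E - w_EI * I + base + attn_E)) / tau_E
        dI = (-I + relu(w_IE * E - w_II * I + base + attn_I)) / tau_I
        E += dt * dE
        I += dt * dI
    return E, I

print("no attention: E=%.2f, I=%.2f" % steady_state(0.0, 0.0))
print("attention:    E=%.2f, I=%.2f" % steady_state(0.5, 1.0))  # I gets a stronger boost
```

In this toy network, boosting the inhibitory population more strongly than the excitatory one raises the inhibitory rate while slightly lowering the excitatory rate, one way a single top-down signal can affect the two cell classes differently.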

At the beginning of her career, Cohen received bachelor of science degrees in mathematics and in brain and cognitive sciences from MIT, where she also received the Hans-Lukas Teuber Award for excellence in undergraduate research. She completed a doctorate in neuroscience at Stanford University in 2007, receiving an excellence-in-teaching award, and went on to appointments at Harvard Medical School and the Howard Hughes Medical Institute (HHMI). In addition to her appointment in neuroscience, Cohen is an associate director of the Center for the Neural Basis of Cognition, a joint program between the University of Pittsburgh and Carnegie Mellon University. She has been honored with a McKnight Scholar Award, a Sloan Research Fellowship, and a Klingenstein Fund award, and she receives ongoing support from the National Institutes of Health. In 2012, she won the grand prize in the Eppendorf & Science Prize for Neurobiology.

This image shows an electrode recording array (top) used to record the activity of 80 or more neurons within the brain. The attentional state of an animal, as well as how much experience the animal has with a given task, affects how correlated the activity of those neurons is. It turns out that the more closely an animal is paying attention, the less correlated the neural spikes are (left-hand side). Correlations may also be a reliable signature of learning that can be factored back into models of cognition.

Conversely, when neuronal spiking is strongly correlated (right-hand side), the animal is paying less attention, and its vision is likely to be worse, according to Cohen.