How attention helps the brain perceive an object
It’s easy to miss something you’re not looking for. In a famous example, viewers were asked to closely observe two groups of people, one group clad in black and the other in white, passing a ball among themselves, and to count the number of times the ball passed from black to white. Remarkably, most observers did not notice a man in a gorilla suit walking among the players. This ability of the brain to ignore extraneous visual information is critical to how we work and function, but the processes governing perception and attention are not fully understood. Scientists have long theorized that attention to a particular object can alter perception by amplifying certain neuronal activity and suppressing the activity of other neurons (brain “noise”).
Now, Salk scientists have confirmed this theory by showing how too much background noise from neurons can interrupt focused attention and cause the brain to struggle to perceive objects. The findings, which appeared in eLife on February 22, 2019, could help improve designs for visual prosthetics.
“This study informs us about how information is encoded in the electrical circuits in the brain,” says Salk Professor John Reynolds, senior author of the paper. “When a stimulus appears before us, this activates a population of neurons that are selective for that stimulus. Layered on top of that stimulus-evoked response are large, low-frequency fluctuations in neural activity.”
Previous work from Reynolds’ laboratory found that when attention is directed to the stimulus, these low-frequency fluctuations are suppressed. Theoretical models of neural information processing suggested that such fluctuations should impair perception and that attention improves perception by filtering these fluctuations out.
To test this idea directly, the researchers turned to optogenetics, a cutting-edge technique that can alter the activity of neurons by shining laser light onto light-activated proteins. The team used a low-frequency laser stimulation protocol directed at a visual brain region in animals to create low-frequency response fluctuations, the very neural fluctuations that attention suppresses. They measured the impact of this added noise on the animals’ ability to detect a small change in the orientation of a visual stimulus presented on a computer screen. As predicted by the theory, the added noise impaired perception. They then repeated the experiment using a different laser protocol to induce fluctuations over a high-frequency range that attention does not suppress. Consistent with the theory, this had no impact on perception.
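The logic behind this prediction can be illustrated with a toy simulation (this is a minimal sketch, not the authors’ model; all firing rates, window lengths, and noise frequencies below are hypothetical). A downstream readout that counts spikes over a short window can average away fast fluctuations, but slow fluctuations persist across the whole window, inflating trial-to-trial variability and lowering detectability (d′):

```python
import numpy as np

rng = np.random.default_rng(0)

def detection_dprime(noise_freq_hz, n_trials=2000, window_s=0.2, dt=0.001,
                     base_rate=20.0, delta_rate=5.0, noise_amp=18.0):
    """d' for detecting a small firing-rate change from a spike count in a fixed
    window, with an additive sinusoidal rate fluctuation at the given frequency
    and a random phase on each trial (all parameter values are illustrative)."""
    t = np.arange(0.0, window_s, dt)
    counts = {False: [], True: []}
    for change in (False, True):
        for _ in range(n_trials):
            phase = rng.uniform(0.0, 2.0 * np.pi)
            rate = base_rate + (delta_rate if change else 0.0)
            rate_t = rate + noise_amp * np.sin(2.0 * np.pi * noise_freq_hz * t + phase)
            rate_t = np.clip(rate_t, 0.0, None)  # firing rates cannot be negative
            counts[change].append(rng.poisson(rate_t * dt).sum())
    a, b = np.array(counts[True]), np.array(counts[False])
    pooled_sd = np.sqrt((a.var() + b.var()) / 2.0)
    return (a.mean() - b.mean()) / pooled_sd

# Slow fluctuations (period much longer than the counting window) do not average
# out within the window and lower d'; fast fluctuations largely cancel and leave
# d' nearly unchanged.
print("d' with  1 Hz fluctuations:", round(detection_dprime(1.0), 2))
print("d' with 60 Hz fluctuations:", round(detection_dprime(60.0), 2))
```

Running this sketch, the low-frequency condition yields a noticeably lower d′ than the high-frequency condition, mirroring the pattern the experiment found.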
“This is the first time this theoretical idea that increased background noise can hurt perception has been tested,” says first and corresponding author Anirvan Nandy, assistant professor at the Yale University School of Medicine and former Salk researcher. “We’ve confirmed that attention does operate largely by suppressing this coordinated neuron firing activity.”
“This work opens a window into the neural code, and will become part of our understanding of the neural mechanisms underlying perception. A deeper understanding of the neural language of perception will be critical in building visual prosthetics,” adds Reynolds, holder of the Fiona and Sanjay Jha Chair in Neuroscience.