Vision Lets Brain Make Predictions

When a pitcher throws a fastball at Toronto Blue Jays shortstop Bo Bichette, the Rogers Centre crowd is watching something extraordinary. Spectators can’t predict where the pitch will end up, but Bichette might. A new Western study shows how a player like Bichette could make that prediction in real time by processing the information his vision (specifically his retinae) gathers from a ball moving at 150 kilometers per hour (more than 90 miles per hour), giving the all-star slugger a fighting chance.

By combining mathematics with artificial intelligence, Lyle Muller, a mathematics professor at Western University, and his colleagues at the Salk Institute for Biological Studies in La Jolla, California, devised a neural network model that can be trained rapidly and efficiently to forecast upcoming moments.

This model sheds fresh light on how a certain pattern of neural activity known as “traveling waves” may play a role in the highly structured embedding of visual information onto brain circuits.

The findings were published in Nature Communications.

“Each cortical region in the visual system contains a map of visual space. In this new paper, we reasoned that waves traveling over these maps may enable short-term predictions into the future,” said Muller, a Western Institute for Neuroscience faculty member. “When we developed this network with traveling waves, we found it can help the system to forecast what comes next in upcoming movie frames.”

Engineers developing the latest AI technologies, for everything from chatbots to smart cars, may now have a blueprint for teaching machines: learn how the brain extrapolates information from individual actions to build a reserve of “mental movies” for forecasting the future.

New information about visual perception

In a baseball game, the ball travels the 18 meters (60 feet) from the pitcher’s hand to Bichette at home plate in about 400 milliseconds. Bichette’s brain needs time to perform the neural computations that allow him to perceive the ball and estimate its trajectory. This includes the time it takes for sensory information to travel from the retinae to the relevant parts of the brain, as well as the time it takes to compute the ball’s trajectory in space from that information.

The full computation is expected to take about 150 milliseconds. During this time, the ball will have traveled more than six meters (20 feet), or one-third of the distance to home plate, so Bichette’s brain must be using other visual signals to estimate where the ball is going.
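As a quick sanity check on these figures, the sketch below redoes the arithmetic under the simplifying assumption of a constant 150 km/h pitch speed over the full 18 meters (real pitches slow slightly in flight):

# Back-of-the-envelope check of the flight-time figures above.
# Assumption (simplification): a constant 150 km/h pitch over the full 18 m.

PITCH_SPEED_KMH = 150.0     # ~93 mph fastball
PLATE_DISTANCE_M = 18.0     # pitcher's hand to home plate (~60 ft)
NEURAL_DELAY_S = 0.150      # estimated time for the visual computation

speed_m_per_s = PITCH_SPEED_KMH * 1000.0 / 3600.0           # ~41.7 m/s
flight_time_ms = PLATE_DISTANCE_M / speed_m_per_s * 1000.0  # ~430 ms
lag_distance_m = speed_m_per_s * NEURAL_DELAY_S             # ~6.3 m

print(f"Total flight time: {flight_time_ms:.0f} ms")
print(f"Distance covered during the 150 ms computation: {lag_distance_m:.1f} m "
      f"({lag_distance_m / PLATE_DISTANCE_M:.0%} of the way to the plate)")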

The brain may actively predict a few hundred milliseconds into the future, using these dynamic patterns of neural activity to forecast how the “movie” of a visual experience will unfold. In this case, that means estimating the ball’s likely current location from information that was available to Bichette’s visual system 150 milliseconds earlier.
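As a minimal, purely illustrative sketch of that latency-compensation idea (the function name and the 9-meter example below are hypothetical, not taken from the study), the estimate amounts to extrapolating the last fully processed position forward by speed multiplied by delay:

# Illustrative only: extrapolate the ball's likely current position from the
# position the visual system has just finished processing, which reflects the
# world as it was ~150 ms earlier. Values are assumptions, not from the paper.

def estimate_current_position(perceived_position_m: float,
                              ball_speed_m_per_s: float,
                              processing_delay_s: float = 0.150) -> float:
    """Compensate for neural latency by extrapolating forward along the trajectory."""
    return perceived_position_m + ball_speed_m_per_s * processing_delay_s

# A 150 km/h pitch moves at ~41.7 m/s, so a ball perceived 9 m from the pitcher's
# hand is likely already about 15.3 m along by the time that percept is available.
print(estimate_current_position(9.0, 41.7))   # -> 15.255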

These findings shed new light on how the visual system interprets information from the eyes. Building on the foundational work of David Hubel and Torsten Wiesel, recipients of the 1981 Nobel Prize in Physiology or Medicine, models of the visual system have focused primarily on the transformations of visual information across different parts of the visual system. These transformations, carried out by “feed-forward” connections that convey information from the eyes to the brain via the optic nerve, can explain how the brain turns static images into outlines of edges and how it recognizes objects. However, neuroscientists are beginning to realize that this is not the only way vision works.

“The status quo is a very static view of vision and does not consider things like latency or the movement we experience in our normal visual experience,” said Gabriel Benigno, a Ph.D. student in mathematics and first author on the study. “We know the brain processes visual input using connections that stretch far across the map of visual space. These ‘recurrent’ connections can connect processing far across different portions of an image, and using these connections, we basically found a way the brain may ‘animate’ predictions in the visual system, going from a static image to a movie.”
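The published network is more detailed than anything shown here, but a loose, illustrative toy can convey the idea Benigno describes. All parameters below are assumptions: a one-dimensional map of visual space, a wave speed of one position per timestep, exponentially decaying lateral weights, and waves modeled simply as delayed copies of past input spreading outward rather than full recurrent dynamics. Activity evoked at each location travels outward across the map with a distance-dependent delay; when a moving bump of input is switched off, the network’s activity peak keeps moving along the bump’s extrapolated trajectory, effectively “animating” the upcoming frames:

import numpy as np

# Toy sketch (not the published model): a 1D "map of visual space" in which activity
# evoked at one location travels outward like a wave, implemented here as lateral
# input arriving with a delay proportional to distance. A moving bump drives the map
# for T_STIM steps and is then switched off; afterwards, the activity peak continues
# along the bump's extrapolated path, a short-term "prediction" of the next frames.

N = 200          # positions on the 1D visual map
WAVE_SPEED = 1   # positions per timestep that lateral activity travels
LAMBDA = 40.0    # spatial decay constant of lateral connection strength
T_STIM = 40      # the stimulus is on for the first 40 timesteps
T_TOTAL = 60     # then the network runs for 20 more steps with no input
START = 30       # starting position of the bump, which moves +1 position per step

pos = np.arange(N)
x = np.zeros((T_TOTAL, N))
for t in range(T_STIM):
    x[t] = np.exp(-0.5 * ((pos - (START + t)) / 2.0) ** 2)   # Gaussian bump, width 2

# Activity at position j, time t: sum of past inputs whose outward-traveling wave
# reaches j exactly now, weighted by how far the wave had to travel.
a = np.zeros((T_TOTAL, N))
for t in range(T_TOTAL):
    for j in range(N):
        dist = np.abs(pos - j)
        emitted_at = t - dist // WAVE_SPEED      # when the input reaching j now was emitted
        valid = emitted_at >= 0
        a[t, j] = np.sum(np.exp(-dist[valid] / LAMBDA) * x[emitted_at[valid], pos[valid]])

# After the stimulus disappears, the activity peak still tracks where the bump would be.
for t in (T_STIM + 5, T_STIM + 10, T_STIM + 15):
    print(f"t={t}: activity peak at position {a[t].argmax()}, "
          f"extrapolated stimulus position {START + t}")

Because the bump moves at roughly the wave speed in this sketch, the delayed lateral inputs from all of its past positions arrive together just ahead of it, which is why the activity peak keeps advancing along the trajectory after the input stops.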

New artificial brain model

Muller and his colleagues previously discovered that neural activity can flow across discrete regions of the brain as a traveling wave, similar to how waves move across the ocean. In a paper published in Nature Communications in 2014, Muller showed that visual stimuli could generate traveling waves of brain activity. Muller and colleagues followed up on this discovery with a key 2020 Nature publication demonstrating that spontaneous brain activity is similarly organized into traveling waves that can alter perception. However, it remained unknown why these waves exist in the visual system and what computation they might be performing.

This new study, developed mostly at the Western Academy for Advanced Study, begins to address that question. The Academy team’s work connects with that of Roberto Budzinski, a Western Institute for Neuroscience clinical postdoctoral fellow and co-author on the study, who is applying these mathematical techniques in collaboration with neurologists and neurosurgeons at London Health Sciences Centre (LHSC). Muller and his colleagues hope to develop new mathematical tools that will help explain information processing in neural networks, both artificial and biological.

