Summary: New research suggests that astrocytes, previously thought to be only support cells, may contribute substantially to memory storage. Unlike neurons, astrocytes do not fire electrical signals, but they can influence neural activity through calcium signaling and gliotransmitters.
A mathematical model based on dense associative memory suggests that astrocytes could link many neurons simultaneously, significantly increasing storage capacity. By treating astrocytic processes as individual computational units, this design yields a more efficient storage system than neuron-only networks.
Key Facts:
- Astrocyte role: Astrocytes form tripartite synapses and use calcium signaling to help regulate neural activity.
- Memory model: A new model shows that astrocyte-neuron networks can store significantly more memories than conventional neuron-only models.
- AI potential: Insights from astrocytic processing could inform next-generation AI architectures, reconnecting neuroscience with machine learning.
Origin: MIT
There are 86 billion neurons in the human brain. These cells fire electrical signals that help the brain store memories and send information and commands throughout the brain and the nervous system.
The brain also contains billions of astrocytes, star-shaped cells with many long extensions that allow them to interact with millions of other cells.
Although they have long been thought to be primarily support cells, recent research has suggested that astrocytes may play a role in memory storage and other cognitive functions.
Researchers at MIT have now developed a new hypothesis for how astrocytes may contribute to memory storage. The architecture suggested by their model could help explain the brain's massive storage capacity, which is much greater than would be expected from neurons alone.
"Astrocytes were originally thought to only clean up around neurons," says Jean-Jacques Slotine, an MIT professor of mechanical engineering and of brain and cognitive sciences. "There is no particular reason why evolution did not realize that, because each astrocyte can contact hundreds of thousands of synapses, they could also be used for computation."
The senior author of the open-access paper, which was published on May 23 in the Proceedings of the National Academy of Sciences, is Dmitry Krotov, a research staff member at the MIT-IBM Watson AI Lab and IBM Research. Leo Kozachkov PhD '22 is the paper's lead author.
Memory capacity
Astrocytes perform a variety of support tasks in the brain, including cleaning up debris, providing nutrients to neurons, and ensuring an adequate blood supply.
Astrocytes also send out many thin tentacles, known as processes, which can each wrap around a single synapse — the junction where two neurons interact with each other — to create a tripartite (three-part) synapse.
Neuroscientists have discovered that memory storage and retrieval are impaired if the connections between astrocytes and neurons in the hippocampus are disrupted.
Unlike neurons, astrocytes cannot fire action potentials, the electrical signals that carry information throughout the brain. However, they can use calcium signaling to communicate with other astrocytes.
Over the past few decades, as the resolution of calcium imaging has improved, researchers have discovered that calcium signaling also allows astrocytes to coordinate their activity with neurons at the synapses they associate with.
These studies suggest that astrocytes can detect neural activity, which leads them to alter their own calcium levels. Those changes may trigger astrocytes to release gliotransmitters — signaling molecules similar to neurotransmitters — into the synapse.
"There's a closed loop between neuron-to-astrocyte signaling and astrocyte-to-neuron signaling," Kozachkov says.
What remains unknown is precisely what kinds of computations astrocytes might perform with the information they sense from neurons.
The MIT team set out to model what those connections might be doing and how they might contribute to memory storage. Their model is based on Hopfield networks, a type of neural network that can store and recall patterns.
Hopfield networks, first developed by John Hopfield and Shun-Ichi Amari in the 1970s and 1980s, are frequently used to model the brain, but it has been shown that these networks cannot store enough information to account for the enormous memory capacity of the human brain.
A newer, modified version of a Hopfield network, known as dense associative memory, can store much more information through higher-order couplings among more than two neurons.
Because conventional synapses connect only two neurons, a presynaptic cell and a postsynaptic cell, it is not obvious how the brain could implement these many-neuron couplings. The MIT team hypothesized that astrocytes could play this role.
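As a toy illustration (a minimal sketch, not the authors' model), a dense associative memory over binary neurons can be written in a few lines of NumPy: raising the pattern overlaps to a power n greater than 2 implements the higher-order couplings and lets the network recall more patterns than the classical pairwise rule supports. All parameter values below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64   # binary neurons
P = 20   # stored patterns; classical pairwise capacity is only ~0.14 * N ≈ 9
n = 3    # interaction order; n = 2 recovers the classical Hopfield rule

patterns = rng.choice([-1, 1], size=(P, N))

def F(x):
    # rectified-polynomial energy function used in dense associative memories
    return np.maximum(x, 0) ** n

def update(state):
    # one asynchronous sweep: set each neuron to the value that raises sum(F)
    new = state.copy()
    for i in range(N):
        plus, minus = new.copy(), new.copy()
        plus[i], minus[i] = 1, -1
        new[i] = 1 if F(patterns @ plus).sum() >= F(patterns @ minus).sum() else -1
    return new

# corrupt a stored pattern, then let the dynamics clean it up
probe = patterns[0].copy()
probe[rng.choice(N, size=8, replace=False)] *= -1

state = probe
for _ in range(5):
    state = update(state)

print(int((state == patterns[0]).sum()), "of", N, "bits recovered")
```

With 20 stored patterns, well above the classical pairwise limit for 64 neurons, the third-order dynamics still recover the corrupted pattern exactly.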
"If you have a network of neurons, which couple in pairs, there's only a very small amount of information that you can encode in those networks," Krotov says. "To build dense associative memories, you need to couple more than two neurons. Because a single astrocyte can connect to many neurons, and many synapses, it is tempting to hypothesize that there might be an information transfer between synapses mediated by this cell. That was the biggest inspiration for us to look into astrocytes, and it led us to start thinking about how to build dense associative memories in biology."
The neuron-astrocyte associative memory model described in the researchers' new paper can store significantly more information than a traditional Hopfield network, enough to account for the brain's memory capacity.
Intricate connections
The extensive biological connections between neurons and astrocytes support the idea that this type of model might explain how the brain's memory storage systems work, the researchers say. They propose that memories are encoded in gradually changing patterns of calcium flow within astrocytes.
Gliotransmitters released at the synapses that astrocyte processes wrap around relay this information back to the neurons.
"By careful coordination of these two things — the spatiotemporal pattern of calcium in the cell and then the signaling back to the neurons — you can get exactly the dynamics you need for this massively increased memory capacity," Kozachkov says.
One of the key features of the new model is that it treats astrocytes not as a single entity but as a collection of processes, each of which can be represented as its own computational unit.
Because of the high storage capacity of dense associative memories, the ratio of stored information to the number of computational units is very high and grows with the size of the network. The result is a system with both high capacity and high energy efficiency.
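The scaling contrast can be sketched with back-of-the-envelope numbers (illustrative constants, not the paper's derivation): pairwise Hopfield capacity grows only linearly with the number of neurons N, while order-n couplings give capacity on the order of N to the power n-1, so the stored-patterns-per-neuron ratio itself grows with network size.

```python
# Illustrative capacity scaling; constants are rough, for intuition only.
def pairwise_capacity(N):
    # classical pairwise Hopfield networks store roughly 0.14 * N patterns
    return 0.14 * N

def dense_capacity(N, n=3):
    # an order-n dense associative memory stores on the order of N**(n-1)
    # patterns (ignoring the order-dependent constant factor)
    return N ** (n - 1)

for N in (100, 1000, 10000):
    print(N, pairwise_capacity(N), dense_capacity(N))
```

At N = 1000, the pairwise estimate is about 140 patterns, while the third-order estimate is on the order of a million, which is why the ratio of stored memories to units keeps growing as the network gets larger.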
The authors argue that tripartite synaptic domains, which are where astrocytes interact dynamically with pre- and post-synaptic neurons, can store as many memory patterns as there are neurons in the network.
” This leads to the striking implication that, in principle, a neuron-astrocyte network could store an arbitrarily large number of patterns, limited only by its size”, says Maurizio De Pitta, an assistant professor of physiology at the Krembil Research Institute at the University of Toronto, who was not involved in the study.
To determine whether this model accurately represents how memory is stored, researchers could look for ways to precisely manipulate the connections between astrocytes' processes, and then examine how those changes affect memory function.
"We hope that experimentalists will take this idea seriously and conduct some experiments testing this hypothesis," Krotov says.
In addition to offering insight into how the brain may store memory, this model could also provide guidance for researchers working on artificial intelligence.
By changing the connectivity of the process-to-process network, researchers could create a wide range of models to explore for different purposes, for instance a continuum between dense associative memories and the attention mechanisms used in large language models.
While neuroscience inspired some of the key concepts in AI, neuroscience research over the last 50 years has had little impact on the field, and many contemporary AI algorithms have drifted away from neural analogies, Slotine says.
"In this sense, this work may be one of the first contributions to AI informed by recent neuroscience research," he says.
About this neuroscience and memory research news
Author: Sarah McDonnell
Source: MIT
Contact: Sarah McDonnell – MIT
Image: The image is credited to Neuroscience News
Original Research: Open access.
"Neuron–astrocyte associative memory" by Jean-Jacques Slotine et al. PNAS
Abstract
Neuron–astrocyte associative memory
Astrocytes, the most abundant type of glial cell, are essential to memory.
Despite the fact that most hippocampal synapses are contacted by an astrocyte, no current theory explains how neurons, synapses, and astrocytes might together contribute to memory function.
We demonstrate that fundamental aspects of astrocyte morphology and physiology naturally lead to a dynamic, high-capacity associative memory system.
Our framework shows that neuron–astrocyte networks closely resemble the popular machine learning architecture known as Dense Associative Memory. By adjusting the connectivity pattern, the model yields a family of associative memory networks that includes a Dense Associative Memory and a Transformer as two limiting cases.
In previously known biological implementations of Dense Associative Memory, the ratio of stored memories to the number of neurons remains constant as the network grows.
Our work establishes that neuron–astrocyte networks achieve superior memory scaling compared with these conventional biological implementations.
Our model points to the exciting and previously unexplored possibility that memories could be stored, at least in part, within the network of astrocyte processes rather than just in the synaptic weights between neurons.