Nanoconnectomic upper bound on the variability of synaptic plasticity
Thomas M Bartol Jr (corresponding author), Cailey Bromer, Justin Kinney, Michael A Chirillo, Jennifer N Bourne, Kristen M Harris (corresponding author), Terrence J Sejnowski (corresponding author)
Howard Hughes Medical Institute, Salk Institute for Biological Studies, United States; Massachusetts Institute of Technology, United States; The University of Texas at Austin, United States; University of California, San Diego, United States
Published November 30, 2015
Cite as eLife 2015;4:e10778
In a computational reconstruction of hippocampal brain tissue, Salk and UT Austin scientists found the unusual occurrence of two synapses from the axon of one neuron (translucent black strip) forming onto two spines (arrows) on the same dendrite of a second neuron (yellow). The spine head volumes, synaptic contact areas (red), neck diameters (gray) and numbers of presynaptic vesicles (white spheres) of these two synapses are almost identical.
Source: Salk Institute
Abstract
Information in a computer is quantified by the number of bits that can be stored and recovered. An important question about the brain is how much information can be stored at a synapse through synaptic plasticity, which depends on the history of probabilistic synaptic activity. The strong correlation between size and efficacy of a synapse allowed us to estimate the variability of synaptic plasticity. In an EM reconstruction of hippocampal neuropil, we found single axons making two or more synaptic contacts onto the same dendrites, with shared histories of presynaptic and postsynaptic activity. The spine head volumes and neck diameters, but not the neck lengths, of these pairs were nearly identical in size. We found that there is a minimum of 26 distinguishable synaptic strengths, corresponding to storing 4.7 bits of information at each synapse. Because of the stochastic variability of synaptic activation, the observed precision requires averaging activity over several minutes.
eLife digest
What is the memory capacity of a human brain? The storage capacity in a computer memory is measured in bits, each of which can have a value of 0 or 1. In the brain, information is stored in the form of synaptic strength, a measure of how strongly activity in one neuron influences another neuron to which it is connected. The number of different strengths can be measured in bits. The total storage capacity of the brain therefore depends on both the number of synapses and the number of distinguishable synaptic strengths.
Structurally, a neuron consists of a cell body that influences other neurons through a cable-like axon. The cell body also bears numerous short branches called dendrites, which are covered in tiny protrusions, or “spines”. Most excitatory synapses are formed between the axon of one neuron and a dendritic spine on another. When the two neurons on either side of a synapse are active simultaneously, that synapse becomes stronger; this strengthening is a form of memory. The dendritic spine also becomes larger to accommodate the extra molecular machinery needed to support the stronger synapse.
Some axons form two or more synapses with the same dendrite, but on different dendritic spines. These synapses should be of the same strength because they have experienced the same history of neural activity. Bartol et al. used a technique called serial section electron microscopy to create a 3D reconstruction of a small volume of brain tissue, which allowed the sizes of the dendritic spines hosting such synapse pairs to be compared. This revealed that the synaptic contact areas and spine head volumes of these pairs were nearly identical. Because the size of a dendritic spine and its synapse can be used as a proxy for synaptic strength, this remarkable similarity can be used to estimate the number of bits of information that a single synapse can store.
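To make that logic concrete, here is a minimal sketch of how a precision estimate from same-axon, same-dendrite spine pairs can bound the number of distinguishable synaptic strengths. The coefficient of variation, the spine-volume range, and the discriminability criterion below are illustrative placeholders, not the values measured in the paper, which used a signal-detection analysis of the reconstructed tissue:

```python
import math

# Minimal sketch (illustrative parameters, not the paper's measured values):
# same-axon, same-dendrite spine pairs estimate how precisely a synapse's
# size is set; that precision limits how many strength levels can be told
# apart across the full range of spine sizes.

cv_pairs = 0.08      # assumed coefficient of variation of paired spine-head volumes
volume_range = 60.0  # assumed ratio of largest to smallest spine-head volume
k = 2.0              # assumed separation, in standard deviations, between adjacent levels

# For a small CV, the standard deviation of log(volume) is approximately the CV,
# so the number of levels that fit across the log-volume range is roughly:
n_levels = 1 + math.log(volume_range) / (k * cv_pairs)
bits_per_synapse = math.log2(n_levels)

print(f"distinguishable strength levels ~ {n_levels:.0f}")
print(f"bits per synapse ~ {bits_per_synapse:.1f}")
```

With these placeholder values the sketch yields on the order of 26 levels and roughly 4.7 bits, matching the scale of the reported result, but it stands in only for the idea, not for the paper's actual analysis.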
Measurements in a small cube of brain tissue revealed that dendritic spines fall into at least 26 distinguishable sizes, each associated with a distinct synaptic strength. This number translates into a storage capacity of roughly 4.7 bits of information per synapse. This estimate is markedly higher than previous suggestions. It implies that the total memory capacity of the brain – with its many trillions of synapses – may have been underestimated by an order of magnitude. Additional measurements in the same and other brain regions are needed to confirm this possibility.
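As a rough, back-of-the-envelope illustration of how the per-synapse figure scales up, the sketch below converts 26 distinguishable strengths into bits and multiplies by an assumed round-number synapse count; neither the synapse count nor the resulting total comes from the paper:

```python
import math

# Back-of-the-envelope scaling (the synapse count is an assumed round number,
# not a measurement from the paper):
bits_per_synapse = math.log2(26)   # 26 distinguishable strengths -> ~4.7 bits
n_synapses = 1e14                  # assumed order-of-magnitude synapse count for a human brain

total_bits = bits_per_synapse * n_synapses
print(f"bits per synapse ~ {bits_per_synapse:.2f}")
print(f"total capacity   ~ {total_bits:.1e} bits (~{total_bits / 8 / 1e12:.0f} terabytes)")
```

The point of the exercise is only that the whole-brain total scales linearly with both the synapse count and the bits resolvable per synapse, so raising the latter from previous estimates raises the total by the same factor.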