The memory capacity of our brains may be 10 times what was previously believed, say scientists who've gained important insights into the size and extent of our neural connections.

The work is providing answers to long-standing questions about how the brain manages to be so energy efficient, delivering enormous computational power on a modest energy budget, say researchers at the Salk Institute in La Jolla, Calif.

"This is a real bombshell in the field of neuroscience," says Terry Sejnowski at the Salk Institute for Biological Studies. "Our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web."

The brain stores thoughts and memories as patterns of chemical and electrical activity, and a key part of that process happens where branches of neurons meet at junctions known as synapses.

Chemicals known as neurotransmitters carry signals across those junctions to other neurons, the researchers explain, noting that every neuron can have thousands of synapses connecting it to thousands of other neurons.

The size of a synapse matters, the researchers say: the larger it is, the more likely it is to successfully pass a signal to a neighboring neuron.

The researchers made a key discovery: synapses can vary in size in increments as small as 8 percent, suggesting there may be as many as 26 categories of synapse size, rather than just the few previously believed.

That unexpected complexity in synapse sizes means the brain's potential memory capacity is much greater than had been thought, they say.

"The implications of what we found are far-reaching," says Sejnowski. "Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us."

The researchers made that discovery while analyzing neurons from the hippocampus — the memory center — of rat brains to estimate the storage capacity of synapses.

Each synapse, on average, can store around 4.7 bits of information, they found.
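
That figure lines up with the 26 size categories reported above: a synapse that can occupy any one of 26 distinguishable states can encode log2(26) ≈ 4.7 bits. The short sketch below works through that arithmetic; treating the size categories as equally likely, cleanly distinguishable storage states is a simplification for illustration, not the study's full analysis.

```python
import math

# If a synapse can occupy any of 26 distinguishable size categories,
# specifying which one requires log2(26) bits of information.
# (Treating the categories as equally likely is a simplifying assumption.)
categories = 26
bits_per_synapse = math.log2(categories)

print(f"log2({categories}) = {bits_per_synapse:.2f} bits per synapse")
# Output: log2(26) = 4.70 bits per synapse
```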

Scaling up from the rat brain to a human brain suggested a storage capacity of around a petabyte. That's 1,000,000,000,000,000 bytes, and it's around 10 times what had previously been believed, the researchers report in the journal eLife.

The researchers suggest their findings could lead to better ways to design computers, creating energy-efficient and ultra-precise machines based on deep learning and neural network principles.
