A New Brain-Inspired Learning Method for AI Saves Memory and Energy

Despite the frequent analogies, today's AI operates on very different principles from the human brain. Now researchers have proposed a new learning method more closely tied to biology, which they think could help us approach the brain's unrivaled efficiency. Modern deep learning is at least biologically inspired, encoding information in the strength of connections between large networks of individual computing units known as neurons. Probably the biggest difference, though, is the way these neurons communicate with each other.

Artificial neural networks are organized into layers, with each neuron typically connected to every neuron in the next layer. Information passes between layers in a highly synchronized fashion as continuous numbers, with the strength of the connection between each pair of neurons encoded as a numerical weight. Biological neurons, on the other hand, communicate by firing off electrical impulses known as spikes, and each neuron does so on its own schedule. Connections are not neatly divided into layers, and many feedback loops mean the output of a neuron often ends up influencing its input somewhere down the line.
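
To make the contrast concrete, here is a minimal sketch in Python with NumPy. The function names and the leaky integrate-and-fire neuron model are common illustrative choices, not details taken from the research:

```python
import numpy as np

# Artificial layer: every neuron computes a weighted sum over the whole
# previous layer in one synchronized step, producing continuous values.
def dense_layer(inputs, weights):
    return np.tanh(weights @ inputs)

# Spiking (leaky integrate-and-fire) neuron: integrates input current
# over time on its own schedule and emits an all-or-nothing spike only
# when its membrane potential crosses a threshold, then resets.
def lif_step(v, input_current, decay=0.9, threshold=1.0):
    v = decay * v + input_current            # leaky integration
    spikes = (v >= threshold).astype(float)  # binary spike events
    v = v * (1.0 - spikes)                   # reset neurons that fired
    return v, spikes
```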

This spike-based approach is vastly more energy efficient, which is why training the most powerful AI requires kilowatts of electricity while the brain uses just 20 watts. That’s led to growing interest in the development of artificial spiking neural networks as well as so-called neuromorphic hardware—computer chips that mimic the physical organization and principles of the brain—that could run them more efficiently.

Deep learning's standard training algorithm, backpropagation, can be adapted to spiking neural networks, but it requires huge amounts of memory: the network's entire history of activity has to be stored so error signals can later be propagated back through it. It's also clearly not how the brain solves the learning problem, because it requires error signals to be sent backwards in both time and space across the synapses between neurons, something biology simply cannot do.
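
A rough sketch of why that memory cost arises; this is illustrative Python under assumed sizes and constants, not the researchers' code:

```python
import numpy as np

n_inputs, n_neurons, n_timesteps = 20, 100, 1000
w_in = np.random.randn(n_inputs, n_neurons) * 0.1
inputs = np.random.randn(n_timesteps, n_inputs)

# Forward pass: the network state at every timestep must be saved,
# because the backward pass will need it. Memory therefore grows
# linearly with how long the network runs.
v = np.zeros(n_neurons)
history = []
for t in range(n_timesteps):
    v = 0.9 * v + inputs[t] @ w_in       # leaky integration
    spikes = (v >= 1.0).astype(float)    # threshold crossing -> spike
    v *= 1.0 - spikes                    # reset neurons that fired
    history.append((v.copy(), spikes))   # kept until the backward pass

# The backward pass would then walk `history` in reverse, propagating
# error signals backwards through time and across synapses -- the step
# that has no biological counterpart.
```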

That prompted the researchers, who are part of the Human Brain Project, to draw on two features that have emerged from experimental neuroscience data: each neuron retains a memory of its previous activity in the form of molecular markers that slowly fade with time, and the brain provides top-down learning signals, via chemicals such as the neurotransmitter dopamine, that modulate the behavior of whole groups of neurons.
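
Together, those two ingredients suggest a learning rule that needs only information available at each synapse in the moment. The sketch below shows the general shape of such a rule; the decay constants, the random input, and the random placeholder learning signal are all assumptions for illustration, not values or mechanisms from the study:

```python
import numpy as np

n_inputs, n_neurons, n_steps = 20, 100, 1000
w = np.random.randn(n_inputs, n_neurons) * 0.1
trace = np.zeros_like(w)      # per-synapse memory of recent activity
lr, trace_decay = 1e-3, 0.95

v = np.zeros(n_neurons)
for t in range(n_steps):
    x = (np.random.rand(n_inputs) < 0.1).astype(float)  # placeholder input spikes
    v = 0.9 * v + x @ w
    spikes = (v >= 1.0).astype(float)
    v *= 1.0 - spikes

    # Fading trace: a slowly decaying record of which synapses were
    # recently active, standing in for the molecular markers.
    trace = trace_decay * trace + np.outer(x, spikes)

    # Top-down learning signal broadcast to groups of neurons, playing
    # the role dopamine plays in the brain; here a random placeholder.
    learning_signal = np.random.randn(n_neurons)

    # The update uses only quantities available locally and now, with
    # no error signals traveling backwards in time or space.
    w += lr * learning_signal * trace
```

Because the trace is updated as the network runs, nothing ever has to be replayed backwards, which is what keeps the memory cost low.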

There’s still a long way to go before the technique can match the power of today’s leading AI. But if it helps us start to approach the efficiencies we see in biological brains, it might not be long before AI is everywhere.