A digital neuron fires inside Intel's Loihi chip, sending a spike of information to its neighbors. The event consumes a whisper of energy—roughly one-thousandth the power required if that same neuron had to shout its message to a chip sitting next door. This difference, multiplied across billions of artificial neurons, explains why researchers are racing to rebuild computers in the brain's image.
The 20-Watt Benchmark
Your brain runs on the power of two LED bulbs. Twenty watts keeps roughly 86 billion neurons firing, storing memories, recognizing faces, and solving problems that still baffle the fastest supercomputers. Meanwhile, training GPT-3 consumed enough electricity to power 120 houses for a year. GPT-4 required an estimated 50 times more energy than that, and these figures don't even include the power needed every time someone types a prompt into ChatGPT.
The disparity isn't just embarrassing—it's unsustainable. As AI models grow larger and more capable, their energy appetite threatens to outpace the benefits they provide. Neuromorphic chips offer a different path forward by copying the brain's architectural tricks rather than simply throwing more silicon at the problem.
Spikes Instead of Streams
Traditional computer chips process information on a fixed clock, updating their calculations billions of times per second whether anything interesting is happening or not. It's like leaving every light in your house on all day just in case you might walk into a room. Neuromorphic chips instead use spiking neural networks, which communicate through discrete events: brief electrical pulses that fire only when there's actually information to transmit.
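The core idea fits in a few lines of code. Here is a minimal sketch of a leaky integrate-and-fire neuron, the standard textbook model behind spiking networks; the threshold, leak, and input values are illustrative, not the parameters of any particular chip.

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Minimal leaky integrate-and-fire neuron: it emits a spike (1)
    only when its membrane potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current   # integrate with decay
        if potential >= threshold:
            spikes.append(1)     # discrete event: something worth sending
            potential = 0.0      # reset after firing
        else:
            spikes.append(0)     # silence: nothing transmitted
    return spikes

# A mostly quiet input stream produces mostly silence; work scales with
# the number of spikes, not with the number of time steps.
inputs = [0.0, 0.0, 0.6, 0.7, 0.0, 0.0, 0.0, 0.9, 0.8, 0.0]
print(lif_neuron(inputs))  # [0, 0, 0, 1, 0, 0, 0, 0, 1, 0]
```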
IBM's TrueNorth chip demonstrates this approach at scale. Built with 5.4 billion transistors, it packs one million digital neurons and 256 million synapses onto a single piece of silicon that sips just 65 milliwatts of power. The architecture uses 4,096 neurosynaptic cores connected by event-driven routing, meaning signals only travel when neurons actually spike. On the pattern-recognition workloads that traditional processors struggle with, TrueNorth consumes orders of magnitude less energy than a conventional chip.
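Event-driven routing can be illustrated with a toy address-event scheme: only the addresses of neurons that actually fired travel between cores. The core names, neuron IDs, and connectivity below are hypothetical, a sketch of the general idea rather than TrueNorth's actual message format.

```python
from collections import defaultdict

# Hypothetical connectivity table: a spiking neuron's address maps to the
# (core, neuron) targets of its synapses.
synapses = {
    ("core0", 3): [("core1", 7), ("core2", 1)],
    ("core0", 5): [("core1", 2)],
}

def route_spikes(spiking_neurons):
    """Deliver events only for neurons that fired; silent neurons
    generate no traffic and therefore no routing cost."""
    inbox = defaultdict(list)
    for source in spiking_neurons:
        for core, neuron in synapses.get(source, []):
            inbox[core].append(neuron)
    return dict(inbox)

# One spike this tick means one event moves through the network,
# no matter how many neurons stayed silent.
print(route_spikes([("core0", 3)]))  # {'core1': [7], 'core2': [1]}
```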
Intel's Loihi chips take a similar approach but add programmability that makes them more practical for research. When scientists at TU Graz and Intel tested systems built from 32 Loihi chips on temporal processing tasks in 2022, they found the hardware was two to three times more energy-efficient than conventional hardware running other AI models. The real surprise came when they measured communication within a single chip versus between chips: internal neuronal messages used 1,000 times less energy than signals crossing chip boundaries.
Memory in Silence
The brain's cleverest trick isn't what it does when neurons fire; it's what happens when they don't. Philipp Plank, a doctoral student at TU Graz who worked with Intel on neuromorphic efficiency measurements, helped demonstrate that these chips store information in the "internal variables" of neurons, essentially creating a hardware version of short-term memory. Previous information gets encoded in the non-activity of neurons, and silence costs almost nothing.
This mechanism relies on what researchers call neuronal fatigue. When a neuron fires repeatedly, it becomes temporarily less responsive—a natural filter that helps the system distinguish between fleeting inputs and information worth remembering. Conventional computers maintain memory by constantly refreshing their contents, burning energy to preserve data. Neuromorphic systems let memory rest in the quiet states between spikes.
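Neuronal fatigue maps naturally onto an adaptive threshold. In the sketch below, an assumption-laden toy rather than the model Plank's team ran on Loihi, each spike raises the neuron's firing threshold, and that elevated threshold quietly records the fact that the neuron fired recently, with no further spikes and no refresh cycles.

```python
def adaptive_lif(input_current, base_threshold=1.0, leak=0.9,
                 fatigue_jump=0.5, fatigue_decay=0.95):
    """LIF neuron with an adaptive threshold: each spike raises the
    threshold, making the neuron temporarily less responsive. The
    elevated threshold is a silent internal variable that remembers
    recent activity without any ongoing energy cost."""
    potential, fatigue = 0.0, 0.0
    spikes = []
    for current in input_current:
        fatigue *= fatigue_decay              # fatigue relaxes over time
        potential = leak * potential + current
        if potential >= base_threshold + fatigue:
            spikes.append(1)
            fatigue += fatigue_jump           # firing raises the bar
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

# The same pulse twice: the first fires, the second is filtered out because
# the neuron still "remembers" firing, encoded only in its raised threshold.
pulse = [1.2, 0.0, 1.2, 0.0]
print(adaptive_lif(pulse))  # [1, 0, 0, 0]
```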
Wolfgang Maass, professor emeritus at TU Graz who supervised much of this research, points out that neuromorphic chips effectively combine recurrent neural networks (which handle short-term memory) with feed-forward networks (which filter relevant information). The architecture does double duty without the energy overhead that would cripple traditional processors attempting the same feat.
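The division of labor Maass describes can be sketched as a recurrent spiking layer feeding a feed-forward readout. Everything below is illustrative: the weights are random, the sizes arbitrary, and the network untrained. It only shows how recurrence carries short-term memory while the readout filters the spikes, not how the TU Graz and Intel systems were actually built.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary sizes and random weights, purely for illustration.
n_in, n_rec, n_out = 4, 16, 2
w_in = rng.normal(0.0, 0.5, (n_rec, n_in))    # input -> recurrent layer
w_rec = rng.normal(0.0, 0.2, (n_rec, n_rec))  # recurrent connections (memory)
w_out = rng.normal(0.0, 0.5, (n_out, n_rec))  # feed-forward readout (filter)

def run(spike_trains, threshold=1.0, leak=0.8):
    """A recurrent spiking layer keeps a trace of past inputs in its
    membrane potentials and spikes; a feed-forward readout filters
    that trace into an output at every step."""
    v = np.zeros(n_rec)   # membrane potentials: the hidden internal state
    s = np.zeros(n_rec)   # spikes from the previous time step
    outputs = []
    for x in spike_trains:
        v = leak * v + w_in @ x + w_rec @ s   # recurrence carries the past
        s = (v >= threshold).astype(float)
        v[s == 1.0] = 0.0                     # reset neurons that fired
        outputs.append(w_out @ s)             # readout sees only the spikes
    return outputs

outputs = run([rng.integers(0, 2, n_in).astype(float) for _ in range(5)])
print(outputs[-1])
```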
Beyond the Laboratory
Steve Furber from the University of Manchester built SpiNNaker, a neuromorphic computing system now available through the Human Brain Project's research infrastructure in Europe. In September 2021, German company SpiNNcloud partnered with Sandia National Laboratories to explore neuromorphic applications for national defense systems—a sign that the technology is moving from academic curiosity to practical deployment.
Intel released Loihi 2 alongside Lava, an open-source software framework designed to accelerate community development. The move acknowledges that neuromorphic computing needs more than better hardware; it needs software tools, programming paradigms, and a generation of developers who think in spikes rather than clock cycles.
The Efficiency Ceiling
Neuromorphic chips won't replace every processor. They excel at specific tasks—visual object recognition, sensory perception, cognitive modeling—where timing and patterns matter more than raw arithmetic throughput. A neuromorphic chip won't speed up your spreadsheet calculations or render video games faster. But for AI systems that need to process continuous streams of sensor data or make rapid decisions based on incomplete information, the brain's architecture offers advantages that conventional designs can't match.
The energy savings become more compelling as AI pervades everyday devices. A security camera that can recognize threats using milliwatts instead of watts can run for months on batteries. A prosthetic limb that processes touch and movement through neuromorphic chips can operate all day without heavy battery packs. These aren't hypothetical futures—researchers are building prototypes now.
The question isn't whether neuromorphic chips will find applications, but whether they'll arrive fast enough to prevent AI's energy demands from spiraling out of control. Twenty watts bought evolution a brain capable of art, science, and philosophy. The least we can do is borrow some of its efficiency tricks before we burn through the planet's electricity trying to simulate it badly.