A biocomputing lab has a kind of quiet that a traditional data center does not. Data centers hum: raised flooring, industrial fans, the incessant drone of machinery performing massive tasks at exorbitant cost. The labs where human brain cells are grown and connected to silicon chips are different. They're warmer. Smaller. Cell culture medium, the nourishing broth that keeps neurons alive, often carries a subtle chemical odor. The apparatus looks more like an upscale refrigerator than a server rack. But the stakes feel just as high.
Energy is the issue that motivates all of this effort. The amount of electricity consumed by contemporary AI systems has begun to worry even their developers. Training GPT-3 took enough energy to power about 120 homes for a full year. GPT-4 required roughly fifty times more.
The infrastructure needed to operate these systems at scale, from data centers to cooling systems to power contracts, has become one of the most significant material constraints in the technology sector as each generation of large language models grows more demanding. The human brain, by contrast, runs on about twenty watts, comparable to a dim lightbulb. Researchers in labs across the United States, Switzerland, and Australia are dedicating their careers to closing that gap.
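Those figures can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes a commonly cited estimate of roughly 1,300 MWh for GPT-3's training run and about 10.7 MWh per year for an average US household; both numbers are assumptions for illustration, not figures from this article.

```python
# Back-of-the-envelope comparison of AI training energy vs. the human brain.
# The first two constants are widely cited estimates, assumed for illustration.
GPT3_TRAINING_MWH = 1_300       # rough estimate of GPT-3's training energy
US_HOME_MWH_PER_YEAR = 10.7     # rough average US household consumption
BRAIN_WATTS = 20                # the ~20 W benchmark from the article

homes_powered_for_a_year = GPT3_TRAINING_MWH / US_HOME_MWH_PER_YEAR
print(f"GPT-3 training ~= {homes_powered_for_a_year:.0f} homes for a year")

# How long would the same energy budget run a 20-watt brain?
brain_hours = GPT3_TRAINING_MWH * 1_000_000 / BRAIN_WATTS  # MWh -> Wh, then / W
brain_years = brain_hours / (24 * 365)
print(f"The same energy would run a 20 W brain for ~{brain_years:,.0f} years")
```

Under these assumptions the household figure lands near the 120 quoted above, and the same energy would keep a twenty-watt brain running for thousands of years, which is the gap the field is chasing.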
Important Information
| Field | Details |
|---|---|
| Field Name | Neuromorphic Computing and Biocomputing |
| Key Companies | FinalSpark (Vevey, Switzerland), Cortical Labs (Melbourne, Australia), Intel (Santa Clara, CA) |
| FinalSpark Founded | 2014 — by Dr. Fred Jordan and Dr. Martin Kutter |
| Cortical Labs Product | CL1 — launched March 2025 at Mobile World Congress; ships at $35,000 per unit |
| CL1 Specs | 800,000 living human neurons on a chip; six-month life-support system |
| FinalSpark Model | Remote access to 16 brain organoids via Neuroplatform — from $1,000/month |
| Intel’s Hala Point | 1.15 billion artificial neurons, 138.2 billion synapses — deployed at Sandia National Laboratories, April 2024 |
| Human Brain Power | Approximately 20 watts — the benchmark every team in this field is chasing |
| Key Advantage | Memory and processing co-located, unlike traditional von Neumann architecture |
| 2026 Milestone | Sandia National Laboratories demonstrated neuromorphic chips solving climate modeling equations using a fraction of conventional energy |
| German BRIGHT Initiative | Launching April 2026 — neuromorphic system using micro-LEDs for ultra-low power AI |
| Further Reading | Intel Neuromorphic Research |
FinalSpark has been working on this issue since 2014 in Vevey, a small city on the northern shore of Lake Geneva better known for its association with Nestlé than for computing research. The company, founded by Dr. Fred Jordan and Dr. Martin Kutter, began with traditional AI techniques before pivoting in 2018 to building computer systems from real living neurons. The fundamental technology is the brain organoid, a tiny three-dimensional cluster of human brain cells grown from skin or blood stem cells.
FinalSpark keeps sixteen of these organoids in a fridge-like incubator, links them to electronic systems via multi-electrode arrays, and offers remote access to the whole setup through a platform called Neuroplatform. More than ten universities are currently running experiments on it, accessing living neural tissue through a browser or a Python API, a sentence that would have sounded like science fiction ten years ago.
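The remote-access model is worth pausing on, because it treats living tissue like any other networked resource. The sketch below is purely hypothetical: the host, endpoint paths, parameters, and response fields are invented to illustrate the idea of reading electrode data over HTTP, and the real Neuroplatform API will differ.

```python
# Hypothetical sketch of remote organoid access in the spirit of FinalSpark's
# Neuroplatform. Host, endpoints, and field names are invented for illustration;
# this is not the real API.
import json
import urllib.request

BASE_URL = "https://neuroplatform.example.com/api"  # placeholder host
API_KEY = "your-token-here"                         # placeholder credential

def record_url(organoid_id: int, duration_ms: int) -> str:
    """Build the URL for a short activity recording (hypothetical endpoint)."""
    return f"{BASE_URL}/organoids/{organoid_id}/record?duration_ms={duration_ms}"

def read_electrodes(organoid_id: int, duration_ms: int = 500) -> list:
    """Fetch microvolt samples from one organoid's electrode array."""
    req = urllib.request.Request(
        record_url(organoid_id, duration_ms),
        headers={"Authorization": f"Bearer {API_KEY}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["voltages_uv"]  # invented response field
```

The point of the sketch is the shape of the workflow: a researcher anywhere in the world issues a request, and the response is the electrical activity of living neurons sitting in an incubator in Vevey.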
In Melbourne, Cortical Labs has been building a commercial product. Their CL1 system, launched at Mobile World Congress in March 2025 and shipping that summer for $35,000 a unit, integrates 800,000 living human neurons onto a silicon chip inside what the company calls a body in a box: a self-contained life-support system that keeps the cells alive for around six months.
The demonstration that drew the most attention was the system learning to play Doom, the 1993 first-person shooter. It was not designed to play the game; it learned through a feedback loop. It's possible to read too much into a demonstration like that: a single video game, a controlled environment, carefully designed stimuli. But it is also difficult to ignore what it suggests about where this technology could go.
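The kind of loop behind such a demonstration can be caricatured in a few lines. The sketch below is a toy simulation, not Cortical Labs' actual method: two abstract "motor" channels, with structured feedback reinforcing the helpful one and noisy feedback weakening the other. All dynamics and numbers are invented for illustration.

```python
import random

def closed_loop_training(trials: int = 2000, seed: int = 0) -> list:
    """Toy stimulate/read/feedback loop. Channel 0 is the 'helpful' action.

    Invented dynamics: reward feedback strengthens the channel that fired,
    noisy feedback weakens it. Not a model of any real biocomputing system.
    """
    rng = random.Random(seed)
    weights = [1.0, 1.0]  # firing tendency per motor channel
    for _ in range(trials):
        total = weights[0] + weights[1]
        # "Read" activity: one channel fires, with probability set by weights.
        action = 0 if rng.random() < weights[0] / total else 1
        if action == 0:
            weights[action] += 0.05                        # reward feedback
        else:
            weights[action] = max(0.1, weights[action] - 0.02)  # noisy feedback
    return weights

w = closed_loop_training()
print(w)  # channel 0 should dominate after training
```

Even this cartoon captures the essential structure: no explicit program, just a loop in which feedback gradually biases the system toward the behavior the experimenter wants.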
The silicon-based strategy, advancing in parallel, has made important progress of its own. Intel's Hala Point system, deployed at Sandia National Laboratories in April 2024, contains 1.15 billion artificial neurons and 138.2 billion synapses etched onto conventional chips designed to mimic the brain's architecture rather than replicate it literally. In February 2026, researchers at Sandia showed that neuromorphic processors could solve complex physics equations, similar to those used in climate modeling, using a fraction of the energy a conventional supercomputer would require.

Germany’s BRIGHT initiative, launching in April 2026, pushes the concept further, using micro-LEDs instead of electrical signals to build an even lower-power neuromorphic system. The common thread across all these approaches: move away from the von Neumann architecture, where memory and processing sit in separate places and data must constantly travel between them, burning energy at every step, toward something closer to how a brain actually works, where computation and memory are woven together.
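The cost of that shuttling can be made concrete with a toy energy model. The per-operation numbers below are widely cited approximations (on the order of one picojoule for a 32-bit add versus hundreds of picojoules for an off-chip DRAM fetch); they are assumptions for illustration, not measurements of any specific chip.

```python
# Toy model of why co-locating memory and compute matters.
# Per-operation energies are rough, widely cited estimates, assumed here
# for illustration only.
PJ_FP_ADD_32 = 0.9       # ~energy of one 32-bit floating-point add, picojoules
PJ_DRAM_READ_32 = 640.0  # ~energy to fetch one 32-bit word from off-chip DRAM

def energy_pj(num_ops: int, fetches_per_op: float) -> float:
    """Total energy: arithmetic plus the data movement each op requires."""
    return num_ops * (PJ_FP_ADD_32 + fetches_per_op * PJ_DRAM_READ_32)

von_neumann = energy_pj(1_000_000, fetches_per_op=2.0)  # operands from DRAM
co_located = energy_pj(1_000_000, fetches_per_op=0.0)   # data next to compute
print(f"data movement overhead: {von_neumann / co_located:.0f}x")
```

Under these assumptions, moving data dominates the energy budget by three orders of magnitude, which is exactly the overhead a brain-like, memory-in-compute design tries to eliminate.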
What makes following this field genuinely interesting is the uncertainty that still surrounds it. The researchers are not shy about the challenges. Neurons perish. The CL1’s six-month lifespan is a hard biological limit. Organoids need constant nutrient feeding, temperature control, and careful handling that does not scale easily to industrial settings.
The neuromorphic silicon systems are impressive at specific tasks, such as sensory processing, edge inference, and workloads with clear temporal structure, but they are not yet general replacements for the GPUs that train the large models everyone uses. Talking to people in this space, there is a sense that they are building something whose full shape is not yet clear even to them. The tools exist. The early results are encouraging. What the technology will eventually become remains genuinely unclear.
Watching it unfold from a distance, what strikes an observer most is how the two tracks, living cells and silicon mimics, are not really competing. They are asking different questions. One asks what happens when biology is brought directly into the machine. The other asks how closely biology can be approximated with the materials already at hand. Both are valid and necessary, because no one yet knows which path, or what combination of the two, might eventually change how computation works at scale. The lab is quiet. But the questions being asked there are serious ones.