When Two Swiss Scientists Decided Silicon Wasn't Good Enough

Dr. Martin Kutter and Dr. Fred Jordan already had their success story. AlpVision, their anti-counterfeiting technology company founded in 2001, held over 80 patents between them. They could have coasted on that achievement. Enjoyed comfortable consulting careers. Maybe advised startups from the sidelines.

Instead, around 2018, they looked at the fundamental limitations of silicon computing and decided to do something that sounds completely barking mad: they would build computers out of living human brain tissue.

Not simulate neurons in software. Not create artificial neural networks inspired by biology. Actually grow real neurons in laboratory dishes and use them to process information.

Their company, FinalSpark, launched the world's first commercially accessible biocomputing research platform on May 15, 2024. It's operational right now in Vevey, Switzerland. The Neuroplatform has 16 brain organoids containing approximately 160,000 living human neurons, all interfaced with electrodes, all processing information, all using roughly one million times less energy than the silicon chips consuming your cloud budget.

This is the complete technical deep-dive into how it actually works, what they've achieved, and why it matters for anyone paying attention to where computing is heading.

The Verified Specifications: What's Actually Running in Switzerland

Before we dive into how this works, let's establish the facts. FinalSpark isn't vaporware. This isn't a press release with no substance. The research is peer-reviewed, published in Frontiers in Artificial Intelligence (DOI: 10.3389/frai.2024.1376042), and achieved top 1% most-read status by October 2024.

Company verification: FinalSpark Sàrl, Swiss company registration CHE-256.971.603, founded January 27, 2014, operating in Vevey, Switzerland.

Founders: Dr. Fred Jordan (PhD from EPFL Signal Processing Institute, 1999) and Dr. Martin Kutter (PhD from EPFL, Best Thesis Award 2000). Both continue to serve AlpVision (Jordan as CEO, Kutter as President) while leading FinalSpark's biocomputing venture.

Platform architecture: The Neuroplatform comprises 16 brain organoids arranged in 4 Multi-Electrode Arrays (MEAs), with 4 organoids per MEA. Each organoid contains approximately 10,000 neurons, yielding roughly 160,000 neurons system-wide. These spherical "forebrain organoids" measure 0.5 mm in diameter.

Interface technology: Each organoid interfaces with 8 electrodes (32 total per MEA) using Intan RHS 32 controllers with 30 kHz sampling frequency and 16-bit resolution (0.15 μV accuracy). These electrodes bidirectionally stimulate and record neural activity.
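To get a feel for what those numbers mean in practice, here's a rough back-of-the-envelope calculation (assuming continuous full-rate recording, which the platform almost certainly doesn't do) of the raw data one MEA can generate:

```python
# Rough data-rate estimate from the interface specs above:
# 32 electrodes per MEA, 30 kHz sampling, 16-bit (2-byte) samples.
channels_per_mea = 32
sampling_hz = 30_000
bytes_per_sample = 2

bytes_per_second = channels_per_mea * sampling_hz * bytes_per_sample
print(f"One MEA: {bytes_per_second / 1e6:.2f} MB/s")                         # ~1.92 MB/s
print(f"Four MEAs, one day: {4 * bytes_per_second * 86_400 / 1e12:.2f} TB")  # ~0.66 TB
```

Even sampled intermittently, it's easy to see how the 18+ terabytes mentioned below accumulates.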

Support systems: The platform includes microfluidic systems for organoid maintenance, digital cameras for monitoring, and UV light systems for dopamine reward molecule uncaging (a technique for reinforcement learning with biological tissue).

Operational status: The system has operated continuously for four years, testing over 1,000 organoids and collecting 18+ terabytes of data. Organoid lifespan has improved from initial hours to an average of 100 days, though this remains a constraint requiring ongoing culture replacement.

Accessibility: Researchers access the system 24/7 over the internet using a Python API and Jupyter Notebooks. Nine research institutions have been granted free access, selected from an initial pool of 36 universities that expressed interest. Commercial access is available at $500 to $1,000 per month plus setup fees.
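The platform's actual Python client isn't documented here, so the snippet below is purely illustrative: the class and method names (`NeuroplatformClient`, `stimulate`, `record`) are invented to show the general shape of a remote stimulate-and-record session from a Jupyter Notebook, nothing more.

```python
# Hypothetical sketch of a remote experiment session. The class and method
# names below are invented for illustration; they are not FinalSpark's API.
class NeuroplatformClient:
    def __init__(self, api_key: str):
        self.api_key = api_key  # in practice: authenticate against the remote platform

    def stimulate(self, electrode: int, amplitude_ua: float, duration_ms: float) -> None:
        print(f"stimulate electrode {electrode}: {amplitude_ua} uA for {duration_ms} ms")

    def record(self, electrode: int, window_ms: float) -> list[float]:
        return []  # would return sampled voltages (30 kHz, 16-bit) from the real system

client = NeuroplatformClient(api_key="...")
client.stimulate(electrode=3, amplitude_ua=1.0, duration_ms=50)
spikes = client.record(electrode=3, window_ms=200)
```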

Active collaborations confirmed: ETH Zürich, University of Michigan, Free University of Berlin, Lancaster University Leipzig, University of Exeter, University of Bristol, University of Bath, and University Côte d'Azur are all conducting active experiments.

This isn't science fiction. This is functioning technology, albeit early-stage, with verifiable specifications and multiple independent research groups validating findings.

How You Actually Grow a Computer from Skin Cells

The process of creating brain organoids that can compute starts with something remarkably ordinary: adult human cells. Typically skin cells, though other cell types work as well.

Step 1: Cellular reprogramming. Scientists take adult cells and reprogram them into induced pluripotent stem cells (iPSCs). These are cells that have been reverted to an embryonic-like state, capable of becoming any cell type in the body. The technology for creating iPSCs won the Nobel Prize in Physiology or Medicine in 2012. It's well-established, reproducible science.

FinalSpark currently sources commercially produced Neural Stem Cells at $1,000 to $2,000 per package rather than producing iPSCs in-house. This makes the process more efficient but introduces supply chain dependencies.

Step 2: Neural differentiation. The iPSCs are coaxed toward becoming Neural Stem Cells (NSCs). This involves carefully controlled chemical signalling, specific growth factors, and precise environmental conditions. The cells begin expressing genes characteristic of early brain development.

Step 3: Three-dimensional structure formation. This is where it gets interesting. Rather than growing cells in flat sheets (two-dimensional cultures), the NSCs are encouraged to form three-dimensional spherical structures. These aren't random clumps of cells. They self-organize into layered structures that resemble developing brain tissue.

The protocol follows methodology from Govindan et al. (2021): expansion of NSCs, induction of 3D structure, differentiation using growth factors GDNF (Glial cell line-derived neurotrophic factor) and BDNF (Brain-derived neurotrophic factor), then maturation into functional organoids.

Step 4: Maturation and synapse formation. Over weeks to months, the organoids mature. Neurons extend axons and dendrites. Synapses form. Electrical activity begins. The tissue starts behaving like immature brain tissue, generating spontaneous electrical signals and responding to stimulation.

Step 5: Interface integration. The mature organoids are placed onto Multi-Electrode Arrays. These are sophisticated devices with electrode needles that can both stimulate neurons (send electrical signals in) and record activity (detect electrical signals out). Each organoid sits atop 8 electrodes that penetrate the tissue, creating bidirectional communication channels.

Step 6: Training and optimization. The real magic happens here. Researchers use various stimulation patterns to encourage specific computational behaviours. Dopamine reward signalling reinforces desired responses. Over time, the organoids learn to perform simple computational tasks.
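Conceptually, that's reinforcement learning with living tissue as the agent. The sketch below is a loose illustration of the loop, not FinalSpark's published method: every function is a hypothetical stand-in for electrical stimulation via the MEA and UV-triggered dopamine uncaging.

```python
import random

# Illustrative reinforcement loop for organoid training. All functions are
# hypothetical stand-ins: the real system applies electrical stimulation via
# the MEA and uncages dopamine with UV light when behaviour should be rewarded.
def apply_stimulus(pattern):            # stand-in for electrode stimulation
    return [random.random() for _ in pattern]   # pretend neural response

def score(response, target):            # how close is the response to what we want?
    return -sum((r - t) ** 2 for r, t in zip(response, target))

def uncage_dopamine():                  # stand-in for the UV reward signal
    print("reward delivered")

target = [0.2, 0.8, 0.5]
best = float("-inf")

for trial in range(100):
    response = apply_stimulus(pattern=[1, 0, 1])
    s = score(response, target)
    if s > best:                        # behaviour improved: reinforce it
        uncage_dopamine()
        best = s
```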

The entire process requires weeks to months before organoids reach computational viability. It's not fast. It's not cheap. But it produces living tissue that processes information using fundamentally different mechanisms than silicon.

The Energy Efficiency Mathematics: Why One Million Times Matters

The claim that biological processors use one million times less energy than silicon sounds like marketing hyperbole. It's not. It's physics.

Let's establish the baseline. The human brain runs approximately 86 billion neurons on roughly 20 watts of power. That's about what a dim lightbulb consumes. Your brain, processing vision, hearing, balance, memory, emotion, language, and conscious thought, all happening simultaneously, all running on 20 watts.

Now consider silicon. Stanford's Kwabena Boahen calculated that simulating equivalent processing in silicon would require a dedicated power station's worth of output: roughly 10 megawatts. That's 500,000 times more energy than the biological brain uses.

Training GPT-3 alone consumed an estimated 10 gigawatt-hours (GWh) of electricity. That's roughly 6,000 times the annual energy use of an average European citizen. For one training run. Of one model. That model has since been superseded by GPT-4, which is estimated to have consumed 40 to 48 times more energy to train.
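Those ratios are worth sanity-checking against the figures above:

```python
# Sanity check on the efficiency figures quoted above.
brain_watts = 20
silicon_equivalent_watts = 10_000_000            # Boahen's ~10 MW estimate
print(silicon_equivalent_watts // brain_watts)   # 500,000

gpt3_training_kwh = 10_000_000                   # 10 GWh in kWh
print(gpt3_training_kwh / 6_000)                 # ~1,667 kWh: one European's year
```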

The 2019 research from University of Massachusetts (Strubell et al., DOI: 10.18653/v1/P19-1355) documented that training a large NLP model with neural architecture search emitted over 626,000 pounds (284 metric tons) of CO2 equivalent. That's nearly five times the lifetime emissions of the average American car including manufacture.

The efficiency gap isn't marginal. It's not 2x or 10x or even 100x. It's orders of magnitude. Potentially a million times or more depending on the comparison methodology.

Why such dramatic differences?

Silicon's fundamental inefficiency: Traditional von Neumann architecture separates memory from processing. Every operation requires shuttling data between CPU and RAM. Each data transfer burns energy. At modern clock speeds of gigahertz (billions of cycles per second), you're moving astronomical amounts of data just to perform basic operations.

Biological computation's advantages: Neurons store information in synaptic weights (the strength of connections between neurons). Memory and processing are not separated. Information is stored where it's processed. There's no energy-expensive data shuffling. Furthermore, biological systems are massively parallel and event-driven. Neurons only fire when there's information to process, unlike silicon CPUs that burn energy even when idle.
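The event-driven point is the crux, and a toy leaky integrate-and-fire neuron (a standard textbook model, nothing to do with FinalSpark's implementation) makes it concrete: memory and processing live in the same place, and work only happens when input actually arrives.

```python
# Toy leaky integrate-and-fire neuron: memory (membrane potential) and
# processing (threshold comparison) live in the same place, and work only
# happens when an input event arrives -- unlike a clocked CPU polling memory.
class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak

    def receive(self, weight: float) -> bool:
        """Process one incoming spike; return True if this neuron fires."""
        self.potential = self.potential * self.leak + weight
        if self.potential >= self.threshold:
            self.potential = 0.0     # reset after firing
            return True
        return False

n = LIFNeuron()
for w in [0.3, 0.4, 0.5, 0.0, 0.2]:
    if n.receive(w):
        print("spike!")
```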

The cooling factor: High-performance computing requires enormous cooling infrastructure. Data centres spend nearly as much energy cooling systems as powering the actual computation. Biological systems operate at body temperature with minimal cooling requirements.

FinalSpark's organoids currently store approximately 1 bit of information each. They perform simple stimulus-response tasks. Direct efficiency comparisons remain limited until biological processors handle complex computational workloads. But the fundamental physics advantage is real and measurable.

What These Neurons Can Actually Do Right Now

Let's be brutally honest about current capabilities. FinalSpark's organoids are not replacing your AWS instances. They're not running enterprise applications. They're not processing your customer database.

Current demonstrated capabilities:

  • Simple pattern recognition

  • Basic stimulus-response learning

  • Temporal sequence processing

  • Reward-based optimization (via dopamine signalling)

  • Persistent storage of approximately 1 bit per organoid

That's it. After years of development, millions in investment, and pioneering research, these organoids perform tasks that a £50 microcontroller does trivially.

So why does this matter?

Because four years ago, these organoids lived for hours. Now they average 100 days. Because four years ago, they couldn't learn anything reliably. Now they demonstrably store information and modify behaviour based on rewards. Because multiple universities are actively experimenting with applications in robotics, reservoir computing, spatiotemporal pattern analysis, and active inference frameworks.

The trajectory matters more than the current position.

Consider where computing hardware was in 1950. ENIAC, built from vacuum tubes rather than silicon, filled an entire room, weighed 30 tons, consumed 150 kilowatts, and performed about 5,000 operations per second. A modern smartphone is a billion times more powerful, fits in your pocket, and runs on a battery.

Nobody in 1950 could have predicted that trajectory. But the physics advantages were always there, waiting for engineering to catch up.

FinalSpark's organoids represent 1950s-era capability in a fundamentally new computing paradigm. The physics advantages are proven. The engineering challenges are being addressed systematically. The question isn't whether biological computing works. It clearly does. The question is timeline to practical commercial viability.

The Technical Challenges Nobody's Solved Yet

FinalSpark isn't hiding their limitations. The peer-reviewed research transparently acknowledges every constraint. That intellectual honesty strengthens credibility while acknowledging biocomputing remains in early experimental stages.

Challenge 1: Lifespan. One hundred days average lifespan sounds impressive until you compare it to silicon processors that run for years without degradation. Data centres expect hardware lifespans of 3 to 5 years minimum. Biological processors requiring replacement every 100 days create operational nightmares.

The team is working on this. Lifespan has improved dramatically from initial hours to current months. But there's no clear path yet to achieving year-plus operational periods without culture replacement.

Challenge 2: Information density. Current organoids store approximately 1 bit each. The human brain stores the equivalent of 2.5 petabytes. That's a gap of roughly 20 quadrillion bits. Even accounting for the fact that FinalSpark's organoids have only 10,000 neurons versus the brain's 86 billion, the information density per neuron is far below biological potential.
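Spelled out per neuron, using only the figures above, the gap looks like this:

```python
# Per-neuron information density, using the figures quoted above.
brain_bits = 2.5e15 * 8           # 2.5 petabytes ~= 2e16 bits ("20 quadrillion")
brain_neurons = 86e9
organoid_bits = 1
organoid_neurons = 10_000

print(f"Brain:    ~{brain_bits / brain_neurons:,.0f} bits per neuron")
print(f"Organoid: {organoid_bits / organoid_neurons} bits per neuron")
```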

Improving information density requires better understanding of how synaptic weights encode information, more sophisticated training protocols, and potentially genetic modifications to enhance plasticity.

Challenge 3: Scalability. The Neuroplatform has 160,000 neurons total. That's impressive for a first-generation platform. But it's 0.0002% of the neuron count in a human brain. And human brains aren't sufficient for most industrial computing workloads, which is why we built silicon computers in the first place.

Scaling to millions of neurons is theoretically possible. The facility houses 2,000 to 3,000 organoids already, suggesting production capacity. But interfacing millions of neurons, maintaining them reliably, and programming them to perform useful work represents extraordinary engineering challenges.

Challenge 4: Programmability. Silicon processors run software. You write code, compile it, load it into memory, execute instructions. The programming model is well-understood with 70+ years of refinement.

How do you "program" neurons? FinalSpark uses reinforcement learning via dopamine signalling. Electrical stimulation patterns encourage specific responses. But there's no equivalent to a programming language. No debugging tools. No software development lifecycle. You're training biological tissue through trial and error, hoping it learns the behaviours you want.

Challenge 5: Reproducibility. Every organoid is unique. They're grown from biological material with inherent variability. Neurons form connections somewhat randomly. Two "identical" organoids won't behave identically.

Silicon's strength is perfect reproducibility. A million identical processors all execute the same instructions identically. Biological computing fundamentally can't match that. You'll need algorithms and architectures that are robust to variability, potentially using ensemble approaches where multiple organoids vote on answers.
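The ensemble idea mentioned above is the standard trick for getting reliable answers from unreliable parts. A minimal sketch with invented numbers (nothing FinalSpark has demonstrated): several units that are each right only 70% of the time still give a dependable answer when they vote.

```python
import random
from collections import Counter

# Majority voting across several noisy, non-identical "organoids".
# Each one answers a binary question correctly only 70% of the time;
# an ensemble of five is right far more often than any single unit.
def organoid_answer(truth: int, accuracy: float = 0.7) -> int:
    return truth if random.random() < accuracy else 1 - truth

def ensemble_answer(truth: int, n: int = 5) -> int:
    votes = Counter(organoid_answer(truth) for _ in range(n))
    return votes.most_common(1)[0][0]

trials = 10_000
single = sum(organoid_answer(1) == 1 for _ in range(trials)) / trials
voted = sum(ensemble_answer(1) == 1 for _ in range(trials)) / trials
print(f"single organoid: {single:.0%}, five-organoid vote: {voted:.0%}")
```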

Challenge 6: Environmental requirements. Neurons require carefully controlled temperature, humidity, pH, nutrient supply, waste removal, and sterile conditions. Silicon processors work in industrial environments, survive temperature extremes, tolerate dust and moisture (within limits), and require only electricity.

Deploying biological processors into field conditions, industrial settings, or consumer devices presents extraordinary challenges that may never be fully solved.

The Three-Domain Security Nightmare

If you thought biocomputing's technical challenges were daunting, wait until you consider security.

Traditional cybersecurity deals with software vulnerabilities, network attacks, and data breaches. We have frameworks (ISO 27001, NIST CSF), certifications (SOC 2, Cyber Essentials), and established practices (penetration testing, vulnerability scanning, incident response).

None of that applies to biological computing.

Physical vulnerabilities: Biocomputing systems require physical access to laboratory facilities. Traditional data centre physical security doesn't account for biological threats. How do you prevent unauthorized introduction of competing organisms? How do you detect temperature or pH manipulation designed to compromise computation? How do you secure microfluidic systems against tampering?

There are no established protocols. No security cameras detect biological contamination. No intrusion detection systems identify enzymatic attacks.

Biological attack surfaces: This is entirely new territory. Research documented in 2017 successfully encoded malware in DNA strands, demonstrating that sequencing software can be compromised when processing artificially synthesized DNA fragments. Unauthorized genetic modifications could alter computational behaviour. Cross-contamination between cultures could corrupt stored information. Biological degradation attacks could accelerate aging beyond natural rates.

Traditional antivirus software is useless here. You need biological monitoring, chemical analysis, genetic sequencing, and metabolic markers to detect attacks. Few cybersecurity professionals have training in microbiology. Few biologists understand cybersecurity.

Cyber vulnerabilities with biological complexity: The API and network interface to FinalSpark's platform present traditional attack surfaces: SQL injection on genomic databases, man-in-the-middle attacks on DNA sequencing pipelines, buffer overflow in bioinformatics applications, API vulnerabilities in cloud-based genomic services, eavesdropping on genomic data transmissions.

But compromising biological systems has consequences traditional malware doesn't. You're not just corrupting data files. You're potentially creating biological hazards. The ransomware that encrypts your files is annoying. Ransomware that genetically modifies neurons is terrifying.

The regulatory and liability vacuum: There are no ISO standards for biocomputing security. The NCSC has not issued specific guidance on biological computing. IEEE working groups are still forming. Traditional penetration testing doesn't apply. There's no equivalent of the CVE database for biological vulnerabilities. Incident response procedures remain undefined.

The FDA and EMA have no framework for biocomputing devices. Data protection laws (GDPR) are unclear on biological data processing. Dual-use export controls are not adapted for biological computing. Security certifications for biocomputing providers are absent. Compliance frameworks (SOC 2, ISO 27001) don't address biological systems. Liability and insurance models remain undefined.

We're heading into a future where computers are biological entities, and we have no security frameworks, no trained professionals, no regulatory structure, and no idea what "good security" even looks like.

Why UK Businesses Should Care Despite the 10-Year Timeline

FinalSpark's biocomputers won't be processing your payroll next year. Probably not the year after. Possibly not within the next decade. Commercial viability for mainstream applications sits somewhere in the late 2020s to mid-2030s depending on who you ask and how optimistic their funding pitch is.

So why should UK business owners care about Swiss laboratory experiments?

Reason 1: The energy problem is immediate. Your cloud costs are rising 10 to 14% annually. SaaS inflation is running five times higher than consumer inflation. Data centre electricity consumption is projected to double by 2030. Tech giants are committing billions to restart nuclear power plants that won't deliver power until 2028 to 2035. The energy crisis driving biocomputing research is hitting your bottom line right now.
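Compound those increases and the planning problem becomes obvious. A simple illustration, not a forecast:

```python
# What 10-14% annual cloud cost inflation does to a budget over five years.
for rate in (0.10, 0.14):
    cost = 100.0                   # index today's spend at 100
    for _ in range(5):
        cost *= 1 + rate
    print(f"{rate:.0%} a year -> index {cost:.0f} after 5 years")
# 10% -> ~161; 14% -> ~193 (nearly double)
```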

Understanding where technology is heading helps you make better decisions about where to invest your limited resources. If energy-efficient computing is 5 to 10 years away, maybe that's relevant to your infrastructure planning.

Reason 2: Alternative platforms create competitive dynamics. When neuromorphic chips like IBM's NorthPole demonstrate 25x better energy efficiency than GPUs, cloud providers will eventually need to offer neuromorphic computing services to remain cost-competitive. Early movers who identify and adopt more efficient platforms will have cost advantages.

Biocomputing might follow the same pattern. If FinalSpark achieves commercial viability, cloud providers will offer "BaaS" (Biocomputing-as-a-Service) alongside traditional compute. Businesses that understand the technology and its appropriate use cases will gain advantages.

Reason 3: Technology trends accelerate unpredictably. Mobile internet went from niche technology to essential infrastructure in roughly five years. Cloud computing went from "never trust the cloud" to "everything's in the cloud" in maybe a decade. AI went from research curiosity to business essential in about three years.

Biocomputing might follow a similar adoption curve. Or it might not. But dismissing it entirely because current capabilities are limited is how businesses get blindsided by paradigm shifts.

Reason 4: The skills gap is forming now. There are maybe a few hundred people worldwide who understand both cybersecurity and biological computing well enough to secure hybrid bio-silicon systems. Universities are just starting to offer relevant courses. Career paths are unclear.

If you're a technology leader thinking about where your team needs expertise five years from now, "people who understand biological computing security" might actually be on that list. Getting ahead of that skills curve creates opportunities.

Reason 5: Investment and acquisition activity signals intent. FinalSpark seeks approximately CHF 1 million in additional funding. Competitors in Australia and the United States are operating similar programs. The biocomputing market is projected to reach $100.26 billion by 2037 (up from $9.42 billion in 2024, CAGR 19.9%).

When venture capital flows toward a technology sector, when universities establish research programs, when governments fund initiatives, that signals institutional belief that something significant is developing. Maybe they're wrong. But ignoring those signals entirely is risky.

The IBM, Intel, and Quantum Alternatives

Biocomputing isn't the only approach to solving silicon's energy crisis. Three alternatives deserve attention:

Neuromorphic computing (IBM, Intel): These are silicon chips designed to mimic neural processing. Intel's Loihi 2 delivers up to 10x performance improvement over Loihi 1 with flexible memory architecture. The Hala Point system at Sandia National Laboratories has 1.15 billion neurons for national security applications.

IBM's TrueNorth (2015) demonstrated 176,000x more energy efficiency than general-purpose CPUs for neural network workloads. IBM's NorthPole (2023) is 4,000x faster than TrueNorth, 25x more energy efficient than comparable GPUs, and 22x faster than Nvidia V100 GPU on ResNet-50.

Commercial availability: Research systems operational now. Limited commercial availability through Intel INRC with 250+ researchers. Government and defense applications deployed. Near-term (2025-2028) sees specialized commercial applications in robotics, sensing, and edge AI. Long-term (2028-2035) brings broader commercial adoption as software ecosystems mature.

Quantum computing: Quantum advantage demonstrated for specific problems. Commercial systems available (IBM Quantum, AWS Braket, Azure Quantum). Limited practical applications currently. Timeline: 2025-2028 for error-corrected logical qubits, 2028-2030 for first commercial quantum advantage applications, 2030-2035 for integration into hybrid classical-quantum workflows.

Energy metrics for quantum remain disputed. Current systems like the 256-qubit QuEra Aquila consume less than 7 kW, versus 21.1 MW for classical supercomputers like Frontier at Oak Ridge, making them potentially 100 to 1,000x more efficient per operation. However, quantum computers require extremely low temperatures (near absolute zero), with cooling systems consuming considerable energy. Error correction can consume up to 90% of total energy.

The NCSC's post-quantum cryptography mandate: UK businesses must complete migration to quantum-resistant cryptography by 2035 (NCSC CTO Ollie Whitehouse). Store-now-decrypt-later attacks are already being conducted. Banking, secure communications, and critical infrastructure face existential threats if quantum-resistant cryptography isn't implemented before cryptographically relevant quantum computers emerge.

Each alternative has different strengths, weaknesses, and timelines. Biocomputing offers the most dramatic energy efficiency but faces the longest path to commercial viability. Neuromorphic computing offers near-term deployments with substantial efficiency gains. Quantum computing solves specific problem classes classical computers struggle with but remains energy-intensive and error-prone.

The smart money isn't betting on one winner. It's monitoring all three approaches and preparing to adopt whichever proves practical first.

What FinalSpark's Roadmap Actually Says

FinalSpark presented its 10-year roadmap in London in June 2025. The vision: bio-servers accessible via cloud to provide computational power for generative AI.

That's ambitious. That's also carefully hedged with realistic intermediate milestones:

Phase 1 (2024-2026): Platform optimization. Improve organoid lifespan from current 100 days to 200+ days. Increase information density from 1 bit per organoid to 10+ bits. Expand to 100+ active research collaborations. Demonstrate proof-of-concept for commercial applications.

Phase 2 (2026-2028): Commercial pilot programs. Partner with pharmaceutical companies for drug discovery applications (where biocomputing's biological relevance provides advantages). Deploy systems for materials science research. Establish first commercial contracts with pricing that reflects actual value delivered, not research novelty.

Phase 3 (2028-2032): Scale and integration. Achieve neuron counts in millions rather than thousands. Develop standardized interfaces for hybrid bio-silicon systems. Establish security frameworks and regulatory compliance. Create training programs for bio-cyber professionals.

Phase 4 (2032+): Mainstream adoption. Bio-servers available through major cloud providers. Generative AI models running partially on biological substrates. Energy costs for AI computation reduced by orders of magnitude. New application classes enabled by bio-silicon hybrid architectures.

Notice what's not in that roadmap: replacing traditional computing. Biocomputing is positioned as complementary to silicon, not a replacement. Hybrid architectures that leverage each substrate's strengths. Specific applications where biological processing provides advantages.

That's intellectually honest. That's also more plausible than revolutionary claims that biological processors will obsolete silicon within five years.

The Bottom Line for Business Owners

FinalSpark's Neuroplatform is real, operational, and producing peer-reviewed research. The energy efficiency advantages are proven through physics, not marketing. The technical challenges are substantial but being addressed systematically. The timeline to commercial viability is measured in years to decades, not months.

For UK small business owners facing rising cloud costs, exploding SaaS subscriptions, and mounting energy charges, biocomputing offers hope but not immediate solutions. The technology that could slash computing energy costs by orders of magnitude won't arrive in time to help your Q4 2025 budget.

But understanding the research, tracking the progress, and recognizing the trajectory helps you make better strategic decisions. About infrastructure investments. About vendor relationships. About where technology costs are heading and what alternatives might emerge.

Someone will solve computing's energy crisis. Maybe FinalSpark. Maybe neuromorphic chips. Maybe quantum systems. Maybe some approach we haven't imagined yet. But someone will solve it, because the current trajectory is economically and environmentally unsustainable.

Keep watching the Swiss scientists growing computers in dishes. Because mad ideas sometimes win. Especially the really mad ones.

And tomorrow, we're examining the exact numbers behind AI's energy consumption and why tech giants are scrambling to restart nuclear power plants. Because if FinalSpark represents the solution 10 years from now, we need to understand the crisis hitting us today.

Sources

  • Frontiers in Artificial Intelligence – Open and remotely accessible Neuroplatform for research in wetware computing
  • Business Wire – FinalSpark Launches Biocomputing Platform
  • Scientific American – These Living Computers Are Made from Human Neurons
  • Tom's Hardware – World's First Bioprocessor Technical Analysis
  • Nature PMC – The Technology, Opportunities, and Challenges of Synthetic Biological Intelligence
  • Stanford University (Kwabena Boahen) – Brain-Inspired Computing Energy Efficiency Research
  • Association for Computational Linguistics – Energy and Policy Considerations for Deep Learning in NLP
  • Intel Research – Loihi 2 Neuromorphic Processor Technical Documentation
  • IBM Research – TrueNorth and NorthPole Neuromorphic Computing
  • NCSC (National Cyber Security Centre) – Timelines for Migration to Post-Quantum Cryptography

Noel Bradford – Head of Technology at Equate Group, Professional Bullshit Detector, and Full-Time IT Cynic

As Head of Technology at Equate Group, my job description is technically “keeping the lights on,” but in reality, it’s more like “stopping people from setting their own house on fire.” With over 40 years in tech, I’ve seen every IT horror story imaginable—most of them self-inflicted by people who think cybersecurity is just installing antivirus and praying to Saint Norton.

I specialise in cybersecurity for UK businesses, which usually means explaining the difference between ‘MFA’ and ‘WTF’ to directors who still write their passwords on Post-it notes. On Tuesdays, I also help further education colleges navigate Cyber Essentials certification, a process so unnecessarily painful it makes root canal surgery look fun.

My natural habitat? Server rooms held together with zip ties and misplaced optimism, where every cable run is a “temporary fix” from 2012. My mortal enemies? Unmanaged switches, backups that only exist in someone’s imagination, and users who think clicking “Enable Macros” is just fine because it makes the spreadsheet work.

I’m blunt, sarcastic, and genuinely allergic to bullshit. If you want gentle hand-holding and reassuring corporate waffle, you’re in the wrong place. If you want someone who’ll fix your IT, tell you exactly why it broke, and throw in some unsolicited life advice, I’m your man.

Technology isn’t hard. People make it hard. And they make me drink.

https://noelbradford.com