Self-reproduction is the hallmark of biology, but I would argue its most astonishing capability is the integration of vast, multi-layered, spectrum-based information into discrete, functional outcomes.
That sentence is a lot to digest, so let’s start with an example.
Are you hungry right now? Answering that question requires input from signals across your body: ghrelin, insulin, GLP-1, leptin, and many others[1]. These molecules circulate in the bloodstream, activating multiple signaling pathways that converge in the hypothalamus at the melanocortin-4 receptor. The presence or absence of signaling at this receptor ultimately determines whether you seek food. In this way, biology seamlessly translates a spectrum of biochemical signals into a single, decisive outcome: eat or don’t eat.
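To make that integration concrete, here is a deliberately toy sketch in Python. The hormone names are real, but the weights and the threshold are invented purely for illustration; they bear no relation to actual receptor pharmacology.

```python
# Toy model: collapsing a spectrum of hormonal signals into one binary decision.
# Weights and threshold are illustrative assumptions, not physiological values.

def hunger_decision(signals: dict[str, float], threshold: float = 0.5) -> str:
    # Orexigenic signals (ghrelin) push toward eating; anorexigenic signals
    # (insulin, GLP-1, leptin) push against it.
    weights = {"ghrelin": 1.0, "insulin": -0.6, "glp1": -0.8, "leptin": -0.7}
    drive = sum(weights[name] * level for name, level in signals.items())
    # A continuous spectrum of inputs becomes a single discrete output.
    return "eat" if drive > threshold else "don't eat"

print(hunger_decision({"ghrelin": 1.2, "insulin": 0.3, "glp1": 0.2, "leptin": 0.4}))
# -> eat
```

The point is not the numbers but the shape of the computation: many graded inputs, one discrete outcome.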
Biology is exceptional at integrating highly complex inputs and distilling them into discrete outputs.
Silicon-based technologies are the exact opposite, relying on strict binary logic. In computing, electrical signals are either off or on, represented as 0s and 1s. There is great precision in this simplicity, but layered architectures are needed to achieve greater complexity and more nuanced decisions. Human-computer interfaces, self-driving labs, and bio-manufacturing all represent convergences of the two paradigms.
It reminds me of a bottom-up versus a top-down market analysis: the two approaches don’t meet in exactly the same place, but they interact closely enough to be useful.
Opposites attract innovation
Some of the most transformative breakthroughs occur at the intersection of unrelated disciplines. A breadth of knowledge across domains increases the likelihood of novel connections. This is the opposite of hyper-specialization, which, despite being career-advancing for many, narrows the scope of paradigm-shifting discoveries.
In the 1840s, electricity and neuroscience were both in their infancy and entirely unrelated disciplines. That is, until Robert Bentley Todd applied Michael Faraday’s electrical concepts to nerve conduction and brain activity. The results were spectacular and his insights revolutionary. He correctly proposed that nerve impulses propagate as electrical currents, recognized that myelin sheaths act as insulators, and suggested that seizures result from “disruptive discharges” of electrical activity in the brain[2].
Todd’s work alone would have been enough to make the case that contrast between disciplines is useful fuel for innovation. But things really started to get interesting in 1943, the year Warren Sturgis McCulloch and Walter Pitts published “A Logical Calculus of the Ideas Immanent in Nervous Activity”[3]. They introduced a mathematical model of the nervous system, demonstrating that neurons could be modeled as simple logical units that make binary decisions and form networks capable of processing complex information.
In this first neural network model, neurons function as relatively simple logical devices making binary decisions, assembled into interconnected nodes that process and transmit information. Despite the simplicity of its units, the model could simulate complex logical circuits. While the field of neural network architecture has expanded significantly since then, the binary threshold neuron remains a fundamental building block in many architectures. Not only have neural networks been instrumental to the development of AI; they introduced a new way of thinking about complex systems.
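The original idea is simple enough to sketch in a few lines. Below is a rough Python illustration of a McCulloch-Pitts-style threshold unit; the weights and thresholds are chosen by hand to implement specific logic gates, not learned, and the code is a teaching toy rather than a faithful reproduction of the 1943 formalism.

```python
# A binary threshold neuron: fires (1) if the weighted sum of its inputs
# reaches the threshold, otherwise stays silent (0).

def neuron(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Hand-chosen weights and thresholds turn single neurons into logic gates...
def AND(a, b): return neuron([a, b], [1, 1], threshold=2)
def OR(a, b):  return neuron([a, b], [1, 1], threshold=1)
def NOT(a):    return neuron([a], [-1], threshold=0)

# ...and composing neurons yields circuits no single unit can compute, e.g. XOR:
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

assert [XOR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]
```

Each unit is trivially simple; the expressive power comes entirely from how the units are wired together.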
For neuroscientists, this framework offered a new way to study cognition. If neurons perform computations, then thought itself might emerge from neural logic. The ability to replicate complex reasoning with a system of simple interconnected nodes provided compelling evidence for computational theories of cognition.
In 2025, we are seeing living neurons used to train AI models, with a claimed 1,000x improvement in efficiency[4].
The value of distinction
Underlying each of these breakthroughs was the sharp contrast between disciplines. These fields were understood to be fundamentally distinct, so proposing a direct comparison meant casting aside preconceived notions and asking “what if…”
The separation between electrical engineering and neuroscience, or biology and computing, created space for radical “what if?” questions.
But things seem to be quite different today. Computer science students are routinely taught about neural networks as a foundational baseline, not an innovative cross-pollination of ideas. Similarly, biology students learn the fundamental nature of neurons through the lens of electrical circuits.
We have become so comfortable with these analogies that I cannot help but wonder if they have begun to obscure more than they illuminate; that rather than spurring innovation, they have become limiting assumptions. More than once, I’ve seen computer scientists express surprise when I comment that artificial neural networks are far simpler than biological ones. Similarly, I’ve watched biologists struggle with the complexity of neurotransmitter interactions. Perhaps in both cases it’s because their foundational understanding was shaped by a metaphor that oversimplified reality.
The power and danger of metaphors
Metaphors are invaluable teaching tools, particularly in complex subjects. A great metaphor provides an intuitive entry point for understanding, but its very strength is also its greatest danger. It’s difficult to know the precise boundary where the metaphor fails.
Take, for example, the comparison of cells to machines. This analogy is popular because it captures the cell’s ability to transform inputs into outputs. The problem is that it breaks down at its most fundamental level. Machines are designed with discrete, specialized parts that serve specific functions - like a car's engine. But cells are more like self-organizing, adaptive communities where components spontaneously assemble, disassemble, and reassemble based on context. They maintain themselves through constant flux rather than fixed structures. A single protein may perform multiple roles depending on cellular conditions, while feedback loops constantly reshape molecular interactions.
This dynamic, context-dependent functionality is what makes biology so resilient. If you walked into a factory and removed a specific part from 25% of the machines, you would immediately notice a significant reduction in output. Yet biological systems are so resilient that some deficiencies don’t produce symptoms until more than 90% of the function has been lost.
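A back-of-the-envelope simulation shows the contrast. The numbers below are illustrative assumptions, not measurements from any real factory or tissue; the point is the difference between linear degradation and degradation buffered by functional reserve.

```python
# Factory: every machine needs its part, so output falls linearly with damage.
def factory_output(fraction_broken: float) -> float:
    return 1.0 - fraction_broken

# Toy "tissue" with functional reserve: remaining capacity covers demand until
# the reserve is nearly exhausted. The 10x reserve is an assumed figure.
def tissue_output(fraction_lost: float, reserve: float = 10.0) -> float:
    capacity = (1.0 - fraction_lost) * reserve
    return min(1.0, capacity)  # demand normalized to 1.0

for loss in (0.25, 0.50, 0.90, 0.95):
    print(f"{loss:.0%} lost: factory at {factory_output(loss):.0%}, "
          f"tissue at {tissue_output(loss):.0%}")
# 25% lost: factory at 75%, tissue at 100%
# 50% lost: factory at 50%, tissue at 100%
# 90% lost: factory at 10%, tissue at 100%
# 95% lost: factory at 5%, tissue at 50%
```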
Similarly, describing DNA as biological software oversimplifies the complex reality of genetic expression. Unlike software code that executes in a linear, predictable fashion, DNA interpretation depends heavily on cellular context, environmental conditions, and complex feedback loops. The same genetic sequence can produce dramatically different outcomes based on factors like chemical modifications, protein interactions, and cellular state. While software follows precise instructions, genetic expression is more like a dynamic conversation between DNA, RNA, proteins, and the cellular environment - with the "meaning" of any given sequence emerging from this intricate dance rather than being hard-coded. To quote Michael Levin, “In our group, we think of the DNA as producing cellular hardware that is actually implementing physiological software, which is rewritable.”
Preserving the power of difference
Cells are not machines. DNA is not software. Artificial neural networks are only a skeleton of true biological connectivity.
These analogies have been useful, but their time has passed. If we want to drive the next wave of breakthroughs, it is time to go beyond conflation and embrace distinction. Instead of asking how biology is like a machine, we should ask what makes it fundamentally different, and then ask “what if…”
If you know anyone on the Biological Black Box team, I would love an intro.