Not Everything Is a Word Problem
We optimize for the screen, often to the detriment of the physical world. Meanwhile, China is building an entire lattice of interconnected systems.
I was sitting at the top of a mountain in my beloved Colorado Rockies when, for the first time in weeks, my mind felt genuinely quiet.
No queue to clear. No signal to monitor. Just pine trees and grey-white sky and a valley so vast it makes everything I’ve ever stared at on a screen feel appropriately small.
We come here as a family as often as we can. Not to escape the beautiful chaos of daily life—I actually love it—but because I know something about systems that too many leaders in my industry keep forgetting: every system, no matter how powerful, needs a reset cycle built in.
The brain is no exception.
One of my favorite professors at Northwestern opened our fluid dynamics course with something that felt almost like a sermon. He explained, with genuine gusto, a single beautiful truth: the same mathematical equation that describes the motion of galaxies also describes the motion inside a human cell.
I was twenty years old and I felt the top of my head come off. Great professors have a way of doing that to young minds.
That was my first real education in systems thinking.
Not the business case version—the real version. The one that teaches you that the world is not a collection of isolated problems to be solved. It is a web of interdependent systems, each governed by rules as elegant as they are unforgiving. Biological systems. Mechanical systems. Chemical, electrical, fluid, organizational. Each one speaking a different dialect of the same underlying language.
But that equation doesn’t just describe beauty. It describes fragility. The same principles that produce coherence at every scale also break when something pulls them out of rhythm. A turbulence event in a fluid system. A demand shock in a supply chain. A calendar so dense it fragments every thinking brain into thirty-minute chunks with no transition and no recovery. The math doesn’t care whether the disruption is mechanical or organizational. When the rhythm breaks, the system degrades.
I’ve been thinking about that a lot lately. Because we are living through a period in which one type of system—the neural network, the language model, the human brain and its digital approximation—has consumed nearly the entire public imagination around artificial intelligence.
And I think that’s a problem.
AI discourse is flooded with critiques of model reasoning. We test logic. We debate hallucination rates and context windows and whether the latest benchmark actually measures anything real. We ask our AI to summarize, plan, argue, and analyze.
We are, almost entirely, treating AI as a mind simulator.
Don’t get me wrong, the brain is fascinating. I have spent the better part of a decade building systems that augment it, stress-test it, and occasionally reveal how embarrassingly fallible it is at scale. I am not dismissing the neural lens.
I am remembering that the brain is one system. One. And we amazing human beings have built an entire civilization with all the others.
China acted on this first. While the West was busy building software that talked to humans—chatbots, copilots, workflow assistants, productivity wrappers—China was deploying intelligence into the physical world. Into factories. Into materials science. Into supply chain infrastructure. Into the kind of industrial systems that, if you get them right, don’t just move faster—they reorganize entire economies.
The West optimized AI for the screen. China optimized it for making and doing things.
One strategy touches the actual substrate of how physical value gets created in the world. The other makes your next quarterly targets feel slightly more achievable.
Consider what this looks like in practice. In southern and eastern China, vast manufacturing clusters—regions like the Yangtze River Delta and the Greater Bay Area around Shenzhen and Guangzhou—have organized themselves into something most American executives have no equivalent for. These aren’t individual factories competing with each other. They are dense, co-located ecosystems where suppliers, integrators, and end users sit within miles of each other, and where new components get tested on real production lines in days, not months. The result is a compounding flywheel: deployment yields data, data improves the AI systems governing production, improved systems scale further and faster. Gains in one supply chain—electric vehicle power electronics, for instance—spill over into adjacent ones like industrial robotics, lowering cost curves across sectors simultaneously.
Inside those clusters, the AI is not a copilot offering suggestions. It is the production infrastructure itself. BYD, the Chinese electric vehicle manufacturer that has grown from producing 500,000 vehicles in 2017 to over four million by 2024, runs its Xi’an battery plant at roughly 97 percent autonomy. Neural networks analyze real-time sensor data along the production line, catching microscopic deviations in material composition and electrode alignment that no human inspector could consistently detect. The company reports a 40 percent reduction in battery defects. Digital twins of the entire production environment let engineers simulate and optimize processes before anything physical is touched. Seventy-five percent of the vehicle’s components—batteries, motors, power electronics—are manufactured in-house, and BYD’s semiconductor subsidiary is now building custom AI chips. The intelligence doesn’t sit on top of the manufacturing. It is woven through it—from cluster-level coordination down to the molecular tolerances of a single battery cell.
The same principles, operating at unimaginably different scales. The Northwestern equation, playing out on factory floors.
Meanwhile, back in the land of knowledge work, we are doing something quietly catastrophic to the most sophisticated system we actually do own.
First we came for the calendars—meeting culture so dense it fragmented every thinking brain into thirty-minute chunks with no transition and no recovery. Then we came for the messages—surveillance and monitoring that turned every private thought into a potential compliance event. Now, in the name of AI readiness, we are asking people to document every step of what they know so a model can learn it, leaving only the most cognitively demanding, emotionally exhausting work for the humans who remain.
There are no more tasks to down-regulate. The easy wins are gone. The mental load has nowhere to go. And yet too many in leadership are still wondering why the transformation isn’t sticking.
You cannot build an intelligent organization on top of exhausted humans. You cannot run a serious intelligence strategy on systems you haven’t cleaned up. And the most important system in your portfolio isn’t your tech stack or LLM configuration. It is the biological one sitting across from you in the all-hands, running hot, waiting for someone to notice.
The brain has its own garbage collection process. Neuroscientists call it synaptic pruning. It runs during stillness, during rest, during the walk you take without your phone. It is how the brain cuts what doesn’t matter and consolidates what does—not passively, but actively. The same principle as clearing a context window. What you remove is as load-bearing as what you keep.
It only runs when you stop.
Nature is enormous and quiet and completely indifferent to my inbox.
As my mind stills, I find myself coming back to that same equation. Galaxies and cells. The same math, operating at unimaginably different scales, producing unimaginably different outcomes. A factory floor in Xi’an and a resting human brain on a Colorado mountain and a language model cycling through tokens—all, at some level, governed by the same underlying demand: that the system be allowed its rhythm.
The leaders who will build durable advantage in this era are not the ones who read every model release note. They are the ones who understand enough about enough systems—biological, industrial, organizational, mechanical—to see where intelligence actually compounds, and where it merely accelerates the dysfunction already in the room.
The best thinking I have ever done has happened when the queue went quiet. Not empty—quiet. The system still running, but finally at a frequency where something other than reaction becomes possible.
And now for my closing ritual: Thesis and Texture
Each week, I end with two book recommendations. One that sharpens how we think about systems, strategy, and intelligence. Another that holds what systems can’t: the texture of being human.
Because building good systems while living a beautifully full human life requires both.
The thesis: Barbarian Days by William Finnegan – A Pulitzer-winning memoir that only looks like it’s about surfing. It’s really about what happens when you spend a lifetime learning to read a physical system—waves, currents, reefs, tides—with your body as the primary instrument. Finnegan is doing fluid dynamics with no equations and no dashboard. Just pattern recognition, consequence, and the understanding that the ocean doesn’t care how smart you think you are. The best systems book I know that never once uses the word “system.”
The texture: Piranesi by Susanna Clarke – A novel about a man living inside an infinite house of halls and tides who has forgotten there is an outside world. A strange, beautiful meditation on what it means to be so deep inside a system that you’ve stopped questioning the architecture.
If this sparked something, please pass it on. Ideas grow stronger, and systems grow smarter, when we share what we’re learning.