
AI Is a Magnifier, Not a Magic Wand
Wilco van Duinkerken
CTO
There is a story leaders want to believe about AI coding tools. Buy the licenses, hand them to your engineers, and watch the throughput climb. The reality is closer to handing out megaphones in a building with bad acoustics. Whatever was happening before is now happening louder.
AI Coding Is a Magnifier
Strong codebases become stronger faster. Messy ones deteriorate faster. Good feedback cultures move further; tense ones tear faster at the seams. The tool doesn't decide which way you go. The conditions you already had do.
The implication is uncomfortable: an AI rollout is not primarily a tooling decision. It's a stress test of everything you've built so far. The teams that thrive aren't the ones with the best models. They're the ones whose foundations were already healthy.
When You Go Faster, Look Further Ahead
Most organisations treat AI as a speed multiplier for current workflows. Same process, same review cadence, same architecture, just faster. That works for a few quarters and then breaks.
The harder question is whether the workflow itself still makes sense at the new speed. Two-week sprints made sense when changes took two weeks to land. They make less sense when an engineer can ship a feature in an afternoon.
Pace Layers
Stewart Brand's pace layers concept maps neatly onto software systems. Different layers evolve at different speeds. Fashion changes rapidly. Infrastructure moves slowly. Culture barely shifts at all.
In a codebase the analogue holds. UI changes weekly. APIs change quarterly. Data models change yearly. Architectural choices change rarely without significant cost. Healthy organisations move fast in the fast layers and slow in the slow ones. AI tempts you to move everything at the speed of the fastest layer. Resist that temptation. The slow layers are slow for good reasons.
The Simplicity Cycle
Dan Ward's simplicity cycle describes how complexity initially improves outcomes, until at some point the next feature, abstraction, or option begins to hurt more than it helps. Teams that mature learn to move from complex-and-good to simple-and-good. They subtract.
AI makes addition easy. Generating another option, another abstraction, another configuration flag costs almost nothing now. Subtraction still requires humans to understand what's there and decide what isn't earning its keep. Teams that only add accumulate complexity at unprecedented rates.
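Subtraction can be made routine rather than heroic. As a minimal sketch, assuming a hypothetical flag registry and a Python source tree, a script can surface feature flags that nothing references any more, turning "decide what isn't earning its keep" into a standing checklist:

```python
import re
from pathlib import Path

# Hypothetical flag registry -- in practice, pull this from wherever
# your team defines feature flags (config file, flag service, etc.).
DEFINED_FLAGS = {"new_checkout_flow", "legacy_export", "beta_dashboard"}

def unused_flags(defined: set[str], src_root: Path) -> set[str]:
    """Return flags with zero references in the source tree:
    candidates for deletion, not proof of deadness -- a human
    still decides whether each one is earning its keep."""
    referenced = set()
    for path in src_root.rglob("*.py"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        for flag in defined:
            # Whole-word match so "legacy_export_v2" doesn't
            # count as a reference to "legacy_export".
            if re.search(rf"\b{re.escape(flag)}\b", text):
                referenced.add(flag)
    return defined - referenced
```

Run it on a schedule and feed the output into the complexity-reduction line item; the point is the cadence, not the tooling.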
Cognitive Debt
Tech debt is well understood. Cognitive debt is its quieter cousin. It's the gap between how complex the system has become and how much of it any individual can hold in their head.
Cognitive debt shows up in onboarding times, in incident response, in the number of "who knows how this works" conversations. AI accelerates the rate at which systems grow beyond comprehension. The fix isn't more documentation after the fact. It's deliberate simplification, deletion, and consolidation as a continuous discipline.
Blaise Pascal's line still applies: "I have made this longer than usual because I have not had time to make it shorter." Simplicity costs effort. AI doesn't change that.
What This Means in Practice
- Audit the conditions before you scale the tool. Code health, culture, feedback hygiene. Whatever is broken will get worse, faster.
- Compress fast layers. Lengthen slow ones. Don't let speed pressure architectural decisions that should take weeks of deliberation.
- Make subtraction a first-class activity. A feature deletion sprint or a complexity-reduction roadmap line item is not slower work. It's the only way to keep moving.
- Treat cognitive debt as a measurable problem. Track onboarding time, time-to-first-meaningful-PR, and the share of incidents requiring tribal knowledge. These tell you whether the system has outgrown its inhabitants.
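The cognitive-debt metrics above are simple enough to compute from data most teams already have. A minimal sketch, assuming hypothetical records pulled from your HR system, Git host, and incident tracker:

```python
from datetime import date
from statistics import median

# Hypothetical records -- field names are illustrative, not a real API.
hires = [
    {"start": date(2024, 1, 8), "first_meaningful_pr": date(2024, 2, 19)},
    {"start": date(2024, 3, 4), "first_meaningful_pr": date(2024, 4, 1)},
    {"start": date(2024, 6, 10), "first_meaningful_pr": date(2024, 8, 5)},
]
incidents = [
    {"id": "INC-101", "needed_tribal_knowledge": True},
    {"id": "INC-102", "needed_tribal_knowledge": False},
    {"id": "INC-103", "needed_tribal_knowledge": True},
]

def time_to_first_meaningful_pr(hires: list[dict]) -> float:
    """Median days from start date to first merged, non-trivial PR."""
    return median((h["first_meaningful_pr"] - h["start"]).days for h in hires)

def tribal_knowledge_share(incidents: list[dict]) -> float:
    """Fraction of incidents that couldn't be resolved from docs and
    runbooks alone -- i.e. that required someone's head-knowledge."""
    flagged = sum(1 for i in incidents if i["needed_tribal_knowledge"])
    return flagged / len(incidents)

print(time_to_first_meaningful_pr(hires))          # median onboarding days
print(round(tribal_knowledge_share(incidents), 2)) # share needing tribal knowledge
```

Trend these quarterly. If both numbers climb while feature velocity climbs, the system is outgrowing its inhabitants.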
AI isn't going to solve the problems your engineering culture has been quietly carrying. It will make them visible at higher resolution, faster than you're ready for. The leaders who treat it as a magnifier rather than a magic wand are the ones still in control twelve months in.