We’re wired to think in straight lines: identify the problem, find the cause, apply the fix.
It works for simple issues, but in complex systems, this kind of thinking becomes a roadblock.
It pushes us to act before we understand, often leading to shallow fixes, unintended consequences, or missed opportunities for real change.
I was recently invited to speak with a group of postgraduate innovation students at CUN and discuss how to overcome this issue. Our conversation centered on systems thinking.
What Is Systems Thinking?
Systems thinking is a way of seeing the world.
It focuses less on isolated parts or linear cause-and-effect, and more on connections, patterns, and the underlying structure of something.
It’s not about predicting the future or controlling outcomes.
It’s about gaining clarity in complexity, and finding the places where change can generate the desired outcome.
Gardening: The Systems Thinker’s Playground
During the pandemic, I started caring for my garden.
One day, I set up an automated irrigation system to stop my plants from dying of thirst.
It seemed like the obvious solution: more regular water = healthier plants.
But the result? Worse.
The soil got drier and harder.
Digging into it, I learned that soil can become hydrophobic: once it dries out, it starts to repel water.
So all that extra irrigation wasn’t soaking in; it was running off and eroding the surface, making the whole thing worse.
This was my first tangible encounter with the concept of feedback loops.
My assumption was that the lack of water was the issue, so I added more. But the system was pushing back. The more water I added, the more hydrophobic and compacted the soil became.
That, right there, was a balancing loop: a stabilizing force that resists change.
My intervention was reinforcing the very condition I was trying to fix.
And worse, automating the watering gave me the illusion of progress, hiding the actual deterioration underneath.
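The balancing loop can be sketched as a toy simulation. Every number here is invented for illustration (this is not real soil physics); the point is only the structure: runoff makes the soil more hydrophobic, and hydrophobic soil produces more runoff.

```python
# Toy balancing loop, not real soil physics: every number here is invented.
# Runoff makes the soil more hydrophobic, and hydrophobic soil sheds more water.

def water_daily(dose, days, hydrophobicity=0.5):
    """Water once a day at `dose`; return (total absorbed, final hydrophobicity)."""
    total_absorbed = 0.0
    for _ in range(days):
        absorbed = dose * (1 - hydrophobicity)  # repellent soil takes in less
        runoff = dose - absorbed                # the rest erodes the surface
        hydrophobicity = min(0.95, hydrophobicity + 0.2 * runoff)
        total_absorbed += absorbed
    return total_absorbed, hydrophobicity

# More water is not proportionally more hydration: the harder you push,
# the harder the loop pushes back.
print(water_daily(dose=0.1, days=30))
print(water_daily(dose=0.2, days=30))
```

In this sketch, doubling the daily dose doesn’t come close to doubling what actually soaks in, and the soil ends up more water-repellent than before: the system absorbs the push.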
The real solution? Gradual rehydration.
Small, spaced doses throughout the day until the soil could absorb water again.
That reset the system, and enabled me to move to a very efficient approach.
Deep saturation once a week.
After understanding the interrelations of the system, and how things responded to each other, it was like unlocking god mode.
Roots grew deeper, plants became more resilient, and I could water less over time.
That’s a reinforcing loop: a pattern where one good change strengthens another.
It was the first time I saw, not just felt, how systems respond through feedback.
How what looks like cause and effect is often a loop. And how working with the system—timing, pacing, thresholds—can be more powerful than trying to overpower it.
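The reinforcing loop can be sketched the same way, again with invented numbers: water that actually soaks in re-wets the soil, so the next dose absorbs even better, while a single flood on repellent soil mostly runs off.

```python
# Toy reinforcing loop, invented numbers: water that soaks in re-wets the soil,
# so the next dose soaks in better; water that runs off compacts it further.

def irrigate(doses, hydrophobicity=0.9):
    """Apply a sequence of doses; return (total absorbed, final hydrophobicity)."""
    total_absorbed = 0.0
    for dose in doses:
        absorbed = min(dose, 0.1) * (1 - hydrophobicity)  # infiltration is capped
        runoff = dose - absorbed
        hydrophobicity += 0.2 * runoff - 2.0 * absorbed   # runoff hurts, absorption heals
        hydrophobicity = min(0.95, max(0.0, hydrophobicity))
        total_absorbed += absorbed
    return total_absorbed, hydrophobicity

# Same total water, opposite outcomes.
flood = irrigate([3.0])          # one big soak on repellent soil
gradual = irrigate([0.1] * 30)   # small, spaced doses
print(flood, gradual)
```

Same total volume of water, opposite results: timing and pacing, not force, decide whether the loop works for you or against you.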
Systems Don’t Change Because You Act
Applying systems thinking meant doing something that sounds simple but isn’t: ignoring what I thought I knew.
The hard part?
Your biases won’t let you.
Confirmation bias, sunk cost, ego: they all scream at you to trust your instincts, your training, your experience. That’s the trap.
So I made it a practice to “plug my ears”: to deliberately set aside best practices and assumptions at the start of my interaction with any system, both the ones I didn’t know yet and the ones I thought I understood well.
Then I watched. I waited. I listened.
Not because it’s noble. Because it's necessary. Rushing to fix things reinforces the same dynamics that created the problem in the first place.
Systems don’t change just because you act. They change when you understand what your actions have been reinforcing all along.
The "Candle" Moment
After all of that, how do we know we’ve found the insight we’re after?
It’s not a lightbulb moment, as you might expect; it’s a bit dimmer, like a candle.
Kinda like a "wait a second..."
You stop. Something doesn’t add up the way you expected. The fix doesn’t feel like a fix anymore.
You’re not sure what the answer is yet, but you know the question just changed.
That’s usually the indicator.
Why We Keep Solving the Wrong Problem
Cognitive biases. We all have them.
They’re mental shortcuts meant to save brainpower—but they quietly sabotage innovation.
If you're serious about solving real problems, you need to know how to spot them. Otherwise, you’ll keep mistaking movement for progress.
Bias isn’t the only thing blocking change, but it’s a great place to start looking. Especially in leadership settings, where everyone thinks they’re thinking clearly.
Confirmation bias makes us believe we've “tried everything,” so nothing will work. It convinces us not to try again—when in reality, trying again with fresh eyes is often exactly what’s needed.
Authority bias tells us to trust the expert, the manager, the one with the title. But experience doesn’t guarantee insight. Sometimes, the people closest to the problem are too deep in it to see clearly.
If you’re in a leadership role, I highly encourage you to let your team do something that doesn’t make sense to you. Not once, but twice. Then write to me and tell me what happened.
Action bias pushes us to do something, anything, just to feel in control. Waiting, listening, observing, all feel like inaction. But often, that’s where real understanding begins.
Swimmer’s body illusion tricks us into copying surface-level strategies from others, hoping for the same outcome. But it ignores the context, the feedback loops, and the invisible structures that made their solution work.
Before You Fix, Observe
Systems thinking starts with observation, not action. Noticing how things connect. How changes ripple.
If you want to begin, don’t try to fix anything. Just watch. Pay attention to the patterns, delays, and unintended effects.
The goal isn’t control. It’s clarity.
And when you reach clarity, you might be in the position to make change happen.