The Rock and the Creek:
A lesson in systems thinking from an unlikely classroom

On systems thinking, working with nature rather than against it, and why the problem you can see is rarely the problem you have.

Sometime in the 1980s, a group of friends and I drove out to Diamond Head - one of those imposing, slightly eerie places on the New South Wales coast that feels like it has opinions about you - to watch the Leonid meteor shower.

We wandered south along Kylie's Beach and came across a creek intersecting the shoreline. Except "intersecting" is too polite a word: the creek was devouring the beach. The bank was actively collapsing, sections sliding away in real time, the whole dune slowly converting itself into the sea.

It just didn't seem right to us.

So we did what young men do when confronted with a problem they haven't thought about yet: we threw effort at it. We shovelled sand with our hands and feet to shore up the bank. It washed away as fast as we could pile it. We doubled down. We brought down half the remaining bank in the process and piled that on top. That washed away too. Somewhere around the third redoubling of effort, common sense and the noon-day sun conspired to make us sit down and actually look at what was happening.

Here's what I saw: the creek was curved and its outer curve - the side attacking our bank - was deeper and faster than the inner curve opposite. I'd seen that somewhere before - in high school Geography, as it turned out. Rivers are dynamic systems. Their outer curves cut and deepen; their inner curves receive and deposit. The creek wasn't doing anything wrong. It was doing exactly what a creek does. It had no problem at all.

We had the problem. And it was a problem of expectations, not engineering.

Reframing the problem: from symptoms to causes

Once I stopped trying to fight the creek and started trying to understand it, the question changed. We weren't asking "how do we hold this bank in place?" anymore. The new question was: "what would eventually move the creek away from the bank, and how could we accelerate that?"

A creek of that nature will migrate over time. The same dynamics that were cutting our bank would, left alone, eventually rotate the creek's course away from the dune. We didn't need to stop the water. We needed to make it do sooner what it was going to do anyway.

That's when I noticed the rock. Less than half a metre long, with an airfoil cross-section. The idea was simple: place it so the water flowing over its surfaces would generate downforce, digging into the riverbed and drawing the flow toward the opposite side, giving the creek a reason to rotate toward it. We placed it carefully, and went off to watch meteors.

I don't remember much about the Leonid shower. There were lights. We slept in the car.

In the morning, we walked back to the creek. It had rotated nearly 90 degrees around the point where we'd placed the rock. The bank was untouched. The dune was safe - at least for now.

One rock, placed correctly. Tonnes of water moved themselves.

Three principles for working with complex systems

I've thought about that morning many times over the course of a career spent working on complex systems. There's something in it that comes back to me reliably.

The first lesson is about where the problem actually is. The bank was being eroded - that was visible, immediate, and distressing. But the bank wasn't the problem. The bank was where the consequences were showing up. The problem was the creek's dynamics, and those dynamics were operating upstream of everything we could observe. We'd spent hours working on the symptom because the symptom was right in front of us.

This pattern appears constantly in engineering. A system misbehaves in a way that's visible and urgent. The natural response is to address what you can see. But what you can see is a consequence, and its cause is frequently elsewhere. Sometimes that cause is geographically upstream, sometimes it's buried in the history of decisions that got you here, sometimes it's sitting at the boundary between two systems that each assume the other has handled something neither has.

The second lesson is about the nature of complex systems. A creek, if you trace it far enough, likely involves billions of variables: rainfall, topography, sedimentation, vegetation, the shape of every rock and bend for kilometres in any direction. No human being can model it completely. But you don't need to model it completely to work with it. You need to understand its dynamic - the forces that are actually driving its behaviour in the particular context you're dealing with. Understanding the dynamic gives you leverage. Trying to model the whole system gives you paralysis.

The third lesson is the one the Stoics captured better than I can: the obstacle is the way. The creek's power was not the enemy of our goal. It was the means of achieving it. The solution was to use what the system was already doing - to find a small intervention that would redirect the system's own energy toward the outcome we wanted. Not work harder against it.

One rock, placed correctly, does more than a thousand hands shovelling.

When the architecture is the problem

Decades after my experience with the creek, I was brought in to work on a large software project that had been struggling for some time. The development team had identified problems and gone about solving them - methodically and with genuine commitment - but the solutions weren't working. The team was doubling down on an approach that couldn't succeed, because it had been derived from an architectural model designed for a very different environment. The mismatch was built in from the start.

My first task wasn't to write code: it was to understand history. I reconstructed the original design decisions, the assumptions behind them, and the constraints of the actual deployment environment. Once those were visible, the mismatch was obvious - not because anyone had done anything foolish, but because the original assumptions, reasonable in their original context, had silently stopped being true when the project moved into a constrained environment. Nobody had revisited them because nobody had been looking for them.

The system was shovelling sand. Very competent sand-shovelling, but sand-shovelling nonetheless.

Re-architecting from first principles - with the actual constraints in view - produced a system that was successfully deployed simultaneously across offices throughout Australia, on schedule, without disruption. The solution wasn't harder work. It was working on the right problem.


The creek is still there, incidentally. One of my co-terraformers visited it years later and sent a photograph. The course has shifted again over the decades - that's what creeks do - but the dune survived. We didn't solve the problem permanently. We nudged the system and bought some time.

In my experience, that's usually the most you can honestly claim to do.


Colin McCormack has spent decades working on complex systems: embedded, real-time, distributed, and the kind that cross the boundary between hardware and software. He consults in Australia and internationally through Percolation Consulting.