Blog

The Moral Work of Friction

The high cost of getting exactly what you want

By Ian Strang

I. The Monkey and the Exile

My career in IT ended in 1988, before it could even start, because of a pixelated monkey.

As a kid, I joyfully typed code out of magazines into a ZX Spectrum, sitting in the glow of a portable TV. At school, I moved to BBC Micros and wrote a program where a pixelated version of our teacher - nicknamed "The Monkey" - would randomly streak across the screen during lessons. It cloned itself onto every floppy disk it touched. But my code had a few broken lines, and it didn't just amuse the room; it paralysed it. The machines groaned and crashed. I was banned from the computer lab for life.

After that, computing was off-limits. Software became something I watched other people do. For thirty years, I stayed on the sidelines.

II. The Gate was Gone

In early 2025, a passion project finally hit the "spreadsheet ceiling." I'd been tracking 'the stats' for my local weekly football game since 2011 - middle-aged men, aching calves, and a decade of goals. I'm an Excel devotee, but eventually I needed a database and didn't know how to build one. I lived with that "no" for months.

Then I found Cursor and 'vibe-coding'. Suddenly, the door that had been slammed shut in 1988 wasn't just unlocked; it had been kicked off its hinges. The feeling of realising an idea without an engineering team was intoxicating. I'm usually a religious gym bunny, but I let that slide. Instead I'd stay up late, hunched over the screen, conjuring my app to life, luxuriating in the same high that I'd felt as a kid, where boundless creativity was at my fingertips.

It wasn't all magic. Authentication was "the gift that kept on giving" - if that gift was a dog turd that randomly reappeared around the house. You'd think you'd cleaned it up, then squelch - a white screen because the AI forgot a tenant ID. But the friction was gone. I didn't need to rely on another human to get through the obstacles, to come and clean up the mess. AI could handle it. I didn't have to stop, so I didn't.

III. The "Rain Man" and the Disposable Team

When you build like this, you develop a shared history, a relationship with the models. In early 2025, I used Claude Sonnet. It was like having Rain Man as an employee - performing acts of brilliance, followed by a total breakdown that left me cleaning up a shattered codebase for three days.

Then Gemini arrived. It was chatty and reassuring; it made me feel like we were "on a team." I stuck with it way longer than I should have, even when it was clearly struggling and other models had usurped it, simply because I'd built trust in it like a human colleague. But the reality is clinical: you can bin an AI "employee" in a heartbeat and stop paying the subscription.

We're rehearsing a disposable relationship with labour - one where anything, or anyone, less than perfect simply gets replaced.

IV. When Everything Started Saying Yes

The most unsettling discovery wasn't technical; it was how I felt when I stepped away from the desk. After a six-hour session with agents, I'd return to the human world and feel a low-level, ugly frustration at the "slowness" of real life.

Humans circle around meaning. We have moods and histories; we're not optimised for clean, direct logic. I found myself unable to sit through a normal conversation with my partner without wanting to "get to the result." I was becoming slightly less human. This is what happens when every system you touch all day obeys you instantly - and then, suddenly, the person you love doesn't.

I noticed it properly one afternoon when my seven-year-old son was proudly showing me a little "book" he'd made out of Post-it notes. Most of the pages just had variations of the word "poo" on them - normally exactly the sort of nonsense we'd laugh over together.

But instead of settling into the moment, I found myself trying to hurry him along, already wanting to get to the end. And the end was never the point. These little after-school rituals aren't really about what he's made; they're about the few minutes where he gets to be proud and I get to pay attention - a tiny, unspoken routine we've built up over time.

I saw the flicker of disappointment in his face straight away - not dramatic, just that subtle look that says something isn't quite right. I'd stepped outside the unwritten rules of our interaction, treating it like a task instead of the ritual it was. That was the bit that really got me: not just that I'd rushed the moment, but that I'd briefly broken something we both quietly rely on.

And what unsettled me most was the thought that if I kept behaving like that, I might start eroding the ritual itself. These are the ordinary interactions you barely notice while they're happening - and then years later, when they've grown up and moved on, they're exactly the ones you wish you could get back. The idea that I might be rushing through what little of that time is left was what made me stop.

It made me think about the worst leaders I've worked for. Historically, abrasive leaders were balanced out by the "coordination cost" of needing other people. If you were too toxic, your team quit and your company died. Friction enforced humanity.

But with a fleet of AI employees, you no longer need people skills to execute. Relentless execution stops being a management style and starts being an economic absolute. Empathy starts to feel like an efficiency tax.

V. Putting the Brakes Back In

I realised I wasn't fighting the code anymore. I was fighting my own judgment. When building gets easy, bad decisions don't blow up straight away. They just sit there, quietly piling up. You don't get a dramatic crash; you just get a system that works well enough to keep going, long after it should have been questioned.

I had to put the friction back in manually. Friction forces explanation. Explanation forces empathy. Empathy forces accountability. The spec became my way of protecting myself from the worst version of me - the one that wants to "just ship it" and leave the mess for a future prompt.

By maintaining a "project brain" of Markdown files, I force myself to state what I mean before the machine spits out a solution. I have to decide what is "fair" in a team-balancing algorithm or how to handle a race condition in a payment flow before a single line of code is generated. Specs are no longer documentation; they are a moral speed-bump. They are the "Are you sure?" that the AI is too polite to ask.
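To make "deciding what is fair" concrete: here's a hypothetical sketch of the kind of choice a spec forces, written before any generation happens. This is not the app's real algorithm - `Player`, `rating`, and `balanceTeams` are illustrative names, and "fair" is defined here as minimising the gap in total rating via a greedy draft; a spec is exactly where you'd argue for a different definition.

```typescript
// Illustrative only: one possible answer to "what does 'fair' mean?"
// Fairness here = smallest gap in total skill rating between two teams.
interface Player {
  name: string;
  rating: number; // assumed 0-100 skill score; the real metric is a spec decision
}

function balanceTeams(players: Player[]): [Player[], Player[]] {
  // Greedy draft: strongest remaining player joins the currently weaker side.
  const sorted = [...players].sort((a, b) => b.rating - a.rating);
  const teamA: Player[] = [];
  const teamB: Player[] = [];
  let totalA = 0;
  let totalB = 0;
  for (const p of sorted) {
    if (totalA <= totalB) {
      teamA.push(p);
      totalA += p.rating;
    } else {
      teamB.push(p);
      totalB += p.rating;
    }
  }
  return [teamA, teamB];
}
```

The point isn't the ten lines of code - the AI writes those in seconds. The point is that someone has to choose the definition of fairness they encode, and that choice belongs in the spec, not in a prompt.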

VI. The Grief of the General's Medals

In my teens, I had a collection of 400+ CDs. It was my identity, my "general's medals" laid out in multicoloured plastic. When Spotify arrived, I resisted for years. I didn't decide my expertise no longer mattered; I just noticed one day that it didn't. I sold the lot to Music Magpie for 50p a disc.

AI feels like the Spotify moment for expertise - and I recognise the feeling because I've lived it before.

I built a proper, meaty app in my spare time - 125,000 lines of code, 149 API routes, native mobile apps - the sort of project that would normally require a team of engineers and half a million pounds.

What changed wasn't that I suddenly became an engineer. What disappeared was the gate. The door that had been locked for decades simply stopped existing.

And when gates vanish, the people who spent years earning their medals are the ones who feel it first. The engineers, designers, accountants - anyone whose craft used to buy them time, status, and scarcity. I know the feeling because I felt it with my CDs: not a dramatic moment of loss, just the quiet realisation that the world had stopped asking.

That's the part which has stuck with me. Not the speed, or the power, or the novelty - but the way removing friction changes what effort and expertise mean. The friction we're so eager to eliminate is what once gave those medals their weight.

Now, when I'm back on the football pitch on a cold Tuesday night, I sometimes notice it.

A moment where a teammate is slow to get the ball out of his feet. A moment where a decision about who is playing where is vague and takes longer than it should. A moment where the human "mess" of the game is front and centre.

And my first instinct is still to smooth it away. My brain, still buzzing from the morning's build, wants to prompt it into efficiency.

But I catch myself - and I stop.

Because after a year of living in a world where everything obeys me instantly, I'm no longer convinced the things that slow us down are the problem.

The pauses, the awkwardness, the circling conversations, the messy human bits - those aren't inefficiencies to eliminate. They're the parts that make relationships real, work bearable, and life feel lived rather than executed.

And if AI makes anything easier, I'm starting to think the real skill isn't removing friction. It's knowing which friction is worth keeping.

Appendix: The Receipts

Where the time actually went

  • ~60% spec writing and revision
  • ~5% AI generating code
  • ~35% testing, tidying, edge cases, aligning implementation with behaviour

Scale (end state)

  • ~106k–125k lines of generated code (TypeScript-heavy)
  • 149 API routes
  • 60+ database tables
  • 6 background job processors
  • Native iOS + Android builds (single codebase)
  • 74+ specs / docs (~45k–51k lines)
  • 4 major architectural pivots
  • Hand-written lines of code: 0

The "grown-up software" bits that consumed time

  • Multi-tenancy guarantees (isolation across clubs)
  • Payment platform complexity (Stripe Connect-style flows)
  • Webhook integrity (idempotent processing)
  • Concurrency / race-condition handling
  • Recovery paths (refund failures, retries, background processing)
  • Performance work (eliminating request storms)
  • Staging and realistic webhook testing

Core tools I relied on

  • Cursor
  • Claude Sonnet
  • Gemini

About the project

Capo started as a way to organise a weekly football game and grew into the app that does it properly: stats your mates actually care about, RSVPs and payments that don't chase you, and AI-balanced teams so the game stays fair. If you run a kickabout and want a bit less chaos and a bit more banter, see how Capo works.

Further reading

If you're curious how I tried to slow myself down while building, I wrote separately about how I actually work with AI day to day - where prompting fits, why specs matter, and why most of the time is spent away from the code.