TL;DR

OutSystems published research this month — nearly 1,900 IT leaders surveyed — finding that 94% of organisations now worry that AI sprawl is making their stack slower, messier, and riskier. The enterprise version of the problem is well known. What nobody is writing about is the SME version, which is quieter but worse in proportion. I sat down this week and audited my own AI subscriptions, then ran the same lens over three client stacks I had been asked to help with. The results on my own setup were embarrassing. Five subscriptions where two would do. Overlapping capabilities I had stopped noticing. A monthly bill I could cut by 40% in a single sitting. Here is the audit I ran, what I cut, and the three rules I am now using with clients.

The Research, In Plain English

OutSystems commissioned a survey of 1,886 IT leaders across the world, published this month. The headline is that 94% of organisations are worried about AI sprawl — the quiet accumulation of overlapping tools, agents, and subscriptions that gets harder to reason about every quarter. 49% describe their agentic AI capabilities as advanced or expert. 38% are running a mix of custom-built and off-the-shelf agents that they cannot easily standardise. Only 12% have a centralised platform to manage the lot.

This is an enterprise study, but the pattern rhymes all the way down. The numbers look smaller at the SME end of the market but the proportional damage is larger, because small businesses do not have procurement teams, security teams, or IT leadership asking the question “are we paying for the same thing twice?” They just pay for things, because at £20 or £60 a month nothing feels like enough money to justify a review.

Last Thursday I sat down and ran the review on my own setup. Then I did it again for three client businesses who had asked me something adjacent. The results were consistent enough that I want to share the method, because I suspect most people reading this are sitting inside a version of the same problem.

What My Own Stack Looked Like

I will put my own numbers on the table first, because I am asking you to do the same exercise and it is only fair.

Before the audit, my monthly AI-and-adjacent bill looked like this: a Claude Max subscription, a ChatGPT Pro subscription I had been on since before I had a Claude one, a Perplexity Pro, a Gemini Advanced I was paying for because I wanted to test it for a client, a Notion AI add-on on the team plan I share with two freelancers, a Zapier subscription with their AI tier enabled, a transcription tool, and two separate image generation subscriptions because I had forgotten to cancel one when I moved to the other. The total was roughly £310 a month.

None of that is scandalous on its own. Claude is the centre of my working day. Perplexity earns its keep for research. Zapier actually automates things. But I had not looked at the whole bill in about nine months, and when I did, three things jumped out.

Two general chat subscriptions. I was using Claude for 95% of my work and ChatGPT for the remaining 5%, which was almost entirely “check this second opinion” style queries that I could do on the free tier. I was paying for a tool I was using twice a week out of habit.

Overlapping image generation. The old tool had added the feature that made me want the new one. I was paying for both.

Notion AI that nobody used. I had added it during a quarter when I thought we would move our knowledge base into Notion. We didn't. The add-on kept charging. Nobody flagged it because it was a line item inside another line item.

After thirty minutes of actual decision-making, the stack dropped from nine subscriptions to six, and the bill dropped to about £185 a month. £125 a month is £1,500 a year. Not a fortune for a consultancy, but embarrassing for someone who writes about operational efficiency for a living.

What I've Spotted Walking Into Client Stacks

I run a version of this audit when I start with a new client, because the picture it gives me of how a business has accumulated AI tooling tells me more than any kickoff call. Three I have helped untangle recently — anonymised, with the specifics rounded — show how pre-existing sprawl tends to look on the way in.

A hair salon group had been on the same booking platform for years and at some point had picked up an “AI front-desk” product on the side. The booking platform itself had quietly rolled the same capability into the base subscription a few months earlier — they had not been told, because nobody on the team was watching for that kind of update. The conversation became “we don't need the standalone, the thing we already pay for now does it.” About £180 a month removed without changing how anyone worked.

A property tech startup I came in to help with sales reporting had ended up with three different AI-assisted CRMs, because the team had grown by hiring people who each brought their preferred tool with them. Nobody had set out to do that. It just happened over a year. The interesting moment in that conversation was not the cost — it was showing them that one modern AI-native CRM could now do, in a single seat, what two of the older ones used to need to do between them. That made consolidating feel like a step forward into the new world rather than a downgrade.

A small jewellery e-commerce business had migrated between two product description tools and the old one had silently kept billing on a separate card. The catalogue lead had started noticing inconsistencies she could not place. We traced them back, switched the old tool off cleanly, and the inconsistencies stopped. The cost saving was small. The clarity of one tool, one source, was the larger win.

The thread running through all three is the same one I hit in my own stack. Tools get added because the AI world is moving quickly and adding always feels like the safer call. The job of an audit is to be the moment somebody finally looks at the whole picture — and, more often than not, to point out that the modern version of one tool will now do what two or three older ones were doing in parallel a year ago.

The line I am going to keep using: AI sprawl at small scale is not a budget problem. It is an attention problem. You pay the bills. You do not pay attention to what you are paying for. The two feel like the same thing and they are not.

The Audit, In Four Steps

This is what I did, exactly. It takes about an hour if you prepare before you sit down.

  1. Pull the last three months of card statements. Every subscription with “AI”, “assistant”, “agent”, “copilot”, “intelligent”, “GPT”, “smart” in the name or in your receipts inbox. Write them all down in one place. Include the add-ons that sit inside other tools — Notion AI, Zapier AI, HubSpot AI. The add-ons are where most of the hiding happens.
  2. Write one sentence next to each about what it does for you. Not what it could do. What it actually does. If you cannot write that sentence within thirty seconds, you are not using the tool enough to justify the subscription.
  3. Group by capability. General chat. Research. Image generation. Automation. Transcription. Writing. Coding. If you have two or more tools in one group, you have a consolidation decision to make. The right answer is almost always one tool per group — not two — and the one you keep is the one your team actually opens, not the one with the best marketing page.
  4. Cancel in the same sitting. This is the step where most audits fail. You identify the overlap, you promise yourself you will cancel next week, you don't. Every tool I kill, I cancel inside the same thirty-minute window I identified it in. Pull the plug while the conviction is fresh.
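If you want to run steps 2 and 3 mechanically rather than on paper, the grouping logic is small enough to sketch in a few lines of Python. The subscription names, prices, and capability labels below are made up for illustration, not a real bill; the point is the shape of the audit, not the data.

```python
from collections import defaultdict

# Hypothetical subscription list pulled from three months of card
# statements (step 1). An empty "what_it_does" means nobody could write
# the one-sentence justification within thirty seconds (step 2).
subscriptions = [
    {"name": "Chat tool A",  "capability": "general chat",     "gbp_per_month": 90, "what_it_does": "centre of the working day"},
    {"name": "Chat tool B",  "capability": "general chat",     "gbp_per_month": 20, "what_it_does": ""},
    {"name": "Research tool","capability": "research",         "gbp_per_month": 16, "what_it_does": "daily research and sourcing"},
    {"name": "Image tool A", "capability": "image generation", "gbp_per_month": 24, "what_it_does": ""},
    {"name": "Image tool B", "capability": "image generation", "gbp_per_month": 30, "what_it_does": "product imagery"},
]

def audit(subs):
    """Group by capability (step 3) and flag consolidation decisions."""
    groups = defaultdict(list)
    for s in subs:
        groups[s["capability"]].append(s)
    report = []
    for capability, tools in groups.items():
        report.append({
            "capability": capability,
            "tools": [t["name"] for t in tools],
            "monthly_cost": sum(t["gbp_per_month"] for t in tools),
            # Two or more tools in one group means a consolidation decision.
            "consolidation_decision_needed": len(tools) > 1,
            # No justification sentence means a cut candidate.
            "cut_candidates": [t["name"] for t in tools if not t["what_it_does"]],
        })
    return report

for row in audit(subscriptions):
    print(row)
```

Step 4 stays manual on purpose: the script can surface the overlap, but the cancelling has to happen in the same sitting, by a person.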

Three Rules I Am Using Now

After running the exercise on four different setups in a week, three simple rules came out of it. I am using them with every client who asks me to help with their AI tooling, and I am holding myself to the same rules.

One tool per capability, reviewed every quarter. Not because the market isn't moving — it is moving weekly — but because you cannot make a good decision about moving tools while you are simultaneously paying for three of them. Pick the best one today. Use it properly. Revisit the decision in ninety days. If something better has emerged, migrate cleanly. Do not run both in parallel for “just in case” reasons.

No AI add-on sits on a tool you are not actively using. The £10 a month Notion AI add-on on the Notion workspace nobody has opened since November is a pure leak. Same for the analytics tool's AI tier, the CRM's agent module, the project management tool's assistant. If you are not using the base product heavily, you do not need the AI on top.

Every new subscription gets a kill date. When I sign up for something new to try, I put a calendar reminder 28 days out to either commit to it properly or cancel it. The default is cancel. The thing that causes sprawl is not that we add tools — it is that adding is a decision and cancelling isn't. Make cancelling the default and the problem takes care of itself.
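The kill-date rule is easy to mechanise if a calendar reminder is not enough. Here is a minimal Python sketch of a trial tracker; the trial log is invented for illustration, and the only details taken from the rule above are the 28-day window and cancel-by-default behaviour.

```python
from datetime import date, timedelta

KILL_WINDOW_DAYS = 28  # commit or cancel within 28 days; cancel is the default

# Hypothetical trial log: tool, signup date, and whether anyone has
# explicitly committed to keeping it. Uncommitted is the default state.
trials = [
    {"name": "new transcription tool", "signed_up": date(2025, 1, 3),  "committed": False},
    {"name": "new research tool",      "signed_up": date(2025, 1, 20), "committed": True},
]

def due_for_cancellation(trials, today):
    """Return tools past their kill date with no explicit commitment.

    Keeping a tool requires a decision; doing nothing cancels it."""
    cutoff = today - timedelta(days=KILL_WINDOW_DAYS)
    return [t["name"] for t in trials
            if not t["committed"] and t["signed_up"] <= cutoff]

print(due_for_cancellation(trials, today=date(2025, 2, 15)))
# prints ['new transcription tool']
```

The design choice that matters is in the filter: a tool survives only if someone set `committed` to `True`, which is the code version of making cancelling the default.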

Why This Matters More Than It Sounds

An SME spending £300 a month on bloated AI tooling is not a crisis. £3,600 a year is less than an average client invoice for most of my clients. The reason I think the audit matters is not the money. It is the signal.

A business that cannot name every AI tool it pays for cannot reason about its AI risk, either. It will not know which tool has access to which data. It will not know which agent is making decisions on customer emails versus drafting them. It will not know which integration is about to break when a vendor changes its API. The 94% figure in the OutSystems research is not an enterprise problem that SMEs have dodged — it is an enterprise problem that SMEs have reproduced in miniature and then stopped looking at.

The good news is that the audit is quick, the savings are immediate, and the clarity it produces is worth more than the money. I ran mine on a Thursday afternoon. It took less than an hour. The bill is smaller and my stack is easier to think about. If you are reading this and cannot list, from memory, every AI subscription your business pays for, that is your next hour well spent.