The UK government loves to talk about AI. Every few months, another glossy report drops — the “AI Playbook,” the “AI Opportunities Action Plan” — each promising to make Britain a “global leader” in responsible AI. But there’s a glaring gap in all this ambition: nobody’s actually asked the people who have to make this stuff work.

Local authorities. Town halls. Councils. The boring, under-resourced machinery of local government that actually delivers services to real people.

The Local Problem

Here’s the thing about AI infrastructure that Westminster keeps missing: most of the data, the talent, and the practical implementation sit with local government, not in Whitehall. The UK public sector employs about 18% of the workforce and touches every citizen’s life — but it’s fragmented across hundreds of councils, mayoral combined authorities, and NHS trusts. Each one collects data differently, if at all.

The government’s answer is a “National Data Library” — noble in theory, but you can’t build a national library when every shelf uses a different filing system. As a recent paper from Browne Jacobson argues, the government needs to prioritise which datasets actually matter, figure out who owns them, and standardise the damn formats before anything useful can happen.

This isn’t glamorous work. There are no press releases about data standardisation. But it’s the prerequisite for anything else.

The Devolution Angle

Where it gets interesting is devolution. The UK has been pushing powers out to local authorities — skills budgets, freeports, local growth funds. That means mayoral combined authorities and councils can now support data centre planning, upskill workers in AI roles, and shape local innovation hubs.

But local planning teams are already overwhelmed. They deal with housing, bins, social care — and now they’re supposed to become AI strategists? The government wants “AI Growth Zones” for data centres, but these require planning approvals, clean energy connections, and local buy-in. That’s a hard sell when residents see data centres as neighbourhood threats, not opportunities.

The paper argues locally-led developments, integrated with placemaking and local economic plans, have a better chance. Not revolutionary, but realistic.

The Real Test

Remember the West Midlands Police AI fiasco? The force used AI-generated “intelligence” that turned out to be wrong — leading to a bad decision and eroding public trust. That’s what happens when you dump tech into organisations without the expertise to oversee it.

That’s the tightrope: the EU takes a safety-first approach (the AI Act), the US takes a dogmatically pro-innovation line, and the UK claims it wants “responsible innovation.” But that requires sophisticated risk management across every public body — not just a policy paper in Westminster.

The most interesting local experiment right now is “Waves” — a collaboration between Google, Demos, New Local, Camden Council, and South Staffordshire. It’s testing AI to improve citizen engagement on contentious local issues — finding consensus, identifying trade-offs, helping people actually participate in decisions that affect them.

That’s the kind of use case that could rebuild trust in institutions. But it requires local capability that doesn’t exist yet.

The Takeaway

The UK can talk about being a global AI leader all it wants. But until the basics are sorted — data, skills, local capacity — it’s just that. Talk.

The good news: local government is where the rubber meets the road. Get this right, and you have something real. Get it wrong, and it’s another dusty report nobody reads.