AI strategy: your first 90 days

Brilliant Noise 6 May 2026

We refresh this page regularly to keep pace with fast-moving AI platforms and policies.

Most “AI strategies” we see are wish lists. A list of use cases. A vendor preference. A vague aspiration about productivity gains. Usually a slide that says “transformation”.

What’s missing is the actual strategy – the work of translating principles into how the organisation operates. This piece is about that work: what an AI strategy needs to answer, what to do in the first 90 days, and the failure modes worth avoiding.

If you haven’t read our 8 AI principles for business leaders, that’s the foundation. This is the next layer up.

The strategy gap

Most leadership teams have accepted the principles. AI is a tool. AI literacy matters. Critical thinking is essential. None of this is controversial any more.

But agreeing with principles is not the same as having a strategy. Principles are the truths you hold. Strategy is what you do about them. Most organisations have skipped the middle step: translating “AI literacy matters” into “here’s the literacy programme, here’s the budget, here’s who runs it, here’s how we know it’s working”.

The first 90 days are for closing that gap.

What an AI strategy must actually answer

A real AI strategy has answers to four questions. Most strategies in the wild answer one or two and leave the rest implicit – which is to say, unaddressed.

Value. Where does AI create real value for us, and how will we know? Not “where could AI be used” – everyone has that list. The harder question is which use cases actually matter for your business, what good looks like, and what you’ll measure. Without this, you end up with a portfolio of pilots and no way to tell which are worth scaling.

Capability. Who in the organisation needs to be fluent in what? AI literacy isn’t a flat requirement across the workforce. Senior leaders need different fluency from team managers, who need different fluency from individual contributors. Most “AI training” programmes ignore this and end up shallow at every level.

Governance. What are our rules for use, risk and accountability? What data goes in, what doesn’t, who’s responsible, what happens when something goes wrong. The organisations that have done this well have written policy in plain English that anyone can apply on a Tuesday morning.

Pace. How fast do we move, and on what? Strategy is as much about what you don’t do – and what you do later – as what you do now. A clear pace prevents the two failure modes at either end: paralysis (we’ll wait until everything is settled) and panic (we’ll do everything everywhere).

The 90-day plan

Three phases, thirty days each. The shape matters more than the specific dates – the point is to give yourself a deadline against each phase rather than letting any one of them sprawl.

Days 1–30: Assess

Start by understanding the actual state. Most leadership teams underestimate how much AI is already in use across their organisation, often invisibly.

The assessment covers four things:

  • Current usage. What tools are people using, with what permissions, on what data? Some of this will be invisible – personal accounts, browser extensions, embedded features in software you’ve already bought.
  • Existing capability. Who in the organisation has genuine AI fluency? Where are the super-users, what are they doing, what are they hitting walls on?
  • Governance reality. What policies do you have on paper, and what’s actually happening in practice? The two are usually different.
  • Strategic context. What does your business actually need from AI – linked to the strategy you already have, not a parallel one?

The output of the assessment is a clear-eyed picture. No recommendations yet, no ambitions. Just the truth on the ground.

Days 31–60: Align

With the assessment in hand, decide. This is where the four questions get answered.

  • Value: Pick the use cases worth investing in. Two to four is usually right. Be ruthless about saying “not yet” to the rest.
  • Capability: Define what fluency means at each level of the organisation, and what it would take to get there.
  • Governance: Draft the policy, in plain English, that says what’s in and out, who owns what, and what happens when things go sideways.
  • Pace: Set the cadence. What’s happening in the next 90 days, the next six months, the next year. Hold the line on what isn’t.

This phase usually surfaces the hardest decisions – which use cases to drop, which capability gaps need real investment, which parts of the business need to be told no for now. Push through them.

Days 61–90: Activate

Start doing it. The temptation here is to declare victory once the decisions are made. Don’t.

Activation means:

  • Kicking off the priority use cases. With clear owners, success criteria and review points.
  • Launching the literacy programme. With a sponsor, a budget and a meaningful sample of the organisation actually starting.
  • Publishing the governance policy. And making sure people have read it, understood it and know who to ask when in doubt.
  • Setting the measurement framework. What numbers you’ll track, how often you’ll review them, and what would make you change course.

By day 90, you don’t have a finished strategy. You have a working one – one that’s already producing data, surfacing problems and improving as it runs.

What “done” looks like

The end of 90 days is not a 60-slide deck. The deliverables are smaller and more practical:

  • A one-page AI strategy that names your priority use cases, capability bets, governance commitments and pace.
  • A published governance policy short enough that someone could read it before their morning coffee.
  • A literacy programme with a budget, a sponsor, and a clear definition of what fluency looks like at each level.
  • Two to four priority use cases in flight, with named owners and success criteria.
  • A measurement framework with the numbers you’ll track and the cadence for reviewing them.

These are the artefacts of a strategy that’s actually operating.

Common ways this goes wrong

Four recognisable failure modes show up repeatedly.

The “all use cases” trap. The organisation generates a list of forty possible AI applications and tries to pursue most of them. Energy gets diffused, no use case gets enough investment to succeed, and the strategy reads as ambitious but produces nothing in the P&L. The fix: pick two to four. Be ruthless.

The “all governance” trap. Governance becomes the strategy. Policies multiply, approval gates proliferate, every potential use case gets stuck in committee for months. The organisation appears to have its house in order while real progress stalls. The fix: lead with use cases, and let governance develop alongside them rather than ahead of them.

The “all literacy” trap. Training programmes become the strategy. Every employee gets put through a course on prompting. Six months later, fluency hasn’t actually grown because the programme wasn’t tied to real work. The fix: literacy programmes work when they’re built around use cases people actually have. Theory without application doesn’t transfer.

The “no governance” trap. The opposite failure. Use cases proliferate, super-users appear, productivity goes up, and one day someone pastes confidential client data into a public model. The fix: governance doesn’t have to be heavy, but it does have to exist before it’s needed.

Most strategies that fail don’t sit in just one of these traps. They oscillate between them.

What comes next

Ninety days gets you a working strategy in motion. The longer work of building genuine AI capability across the organisation runs beyond it. At Brilliant Noise, we describe that journey through our protocol: Assess, Learn, Apply, Embed.

The 90 days feed directly into the protocol. Your initial Assess phase sets up a deeper Capability Assessment. The Align decisions surface the literacy needs that Learn will close. The Activate kick-off starts the Apply work, with priority use cases moving from idea to running pilot. Embed begins as soon as those first use cases need governance to operate at scale.

The 90 days are the start. The protocol is the path through.

Where this leaves you

What you’re really building over these 90 days is the organisational capacity to execute on AI strategy – and to keep developing it as the technology and the market evolve. The artefacts (the one-pager, the governance policy, the literacy programme) matter, but they’re the visible part of something bigger.

That underlying capacity – the discipline to make calls about priority and pace, the willingness to invest in unglamorous foundations, the structures that let learning compound – is the most valuable infrastructure you can build right now.

If you’d like to talk about what this might look like for your team, book an AI Power Hour and we can walk through it together.

If you spot a change in the platforms or the deployment landscape that affects this guidance, tell us. We keep this page updated so it stays practical and current.

Last updated: May 2026