Federal AI Advisory

Don't let the tool
define your institution.

Federal institutions are past "should we use AI?" and deep into "we already are, everywhere, and none of it is coherent." Imago Forge helps you get ahead of it — before the tool's defaults become yours, or after, when it's time to build something that actually fits.

The problem

The tool is making decisions your institution never made.

Every AI system ships with assumptions baked in — about how decisions get made, what gets prioritized, what gets ignored. Without deliberate customization, those assumptions quietly become yours.

Firsthand — Jeremy Wilcox, Imago Forge
"I watched a major research firm roll out vanilla Salesforce in 2013 and never customize it. A decade later they were running their business around the tool's logic. The vendor's team joked about it internally."
That's not a CRM story. It's a warning about what happens when you let a vendor's defaults become your operating model. AI is doing the same thing right now — only faster, deeper, and in workflows that touch real decisions about real people. Six months in, the tool's assumptions have become the institution's assumptions. And nobody noticed it happening.
Lidl — $580M invested, 7 years lost, reverted to original system
"SAP's inventory logic ran on retail prices. Lidl's entire operation ran on purchase prices. Rather than adapt the software, they tried to adapt a $30B retailer to match the tool."
Seven years. $580 million. Lidl shut it down and went back to their original system. The tool didn't reshape their inventory — it tried to reshape their entire business model. Nobody caught it until it was too late.
Target Canada — shuttered entirely within 2 years of opening
"Vanilla ERP, no customization for Canadian suppliers or data formats. Employees made thousands of manual errors. Shelves went empty on opening week."
Target Canada closed entirely in 2015 — two years after opening. The supply chain never recovered. A generic tool, deployed without customization to how the operation actually worked, collapsed an entire market expansion.
US Navy — $1B spent, four pilots built, none interoperable
"Four separate ERP pilots. No central governance. Each program silo picked their own system. A billion dollars later, nothing worked together."
The GAO called the billion largely wasted. Three of the four systems were scrapped. The problem wasn't the technology — each program defined its own model and nobody built the coherence layer first. Sound familiar?
The pattern — now happening with AI, everywhere, all at once
"The question isn't whether your institution is using AI. It's whether you're defining how — or whether the tool already has."
Generic AI tools are being deployed org-wide with no customization to mission, no governance, no defined accountability. The window to define the operating model on your terms is open — but it closes fast once the defaults take hold.
How we help

Two ways in. One honest conversation first.

Most clients start with one and grow into the other. Where you begin depends on where you are.

Entry point

Before you sign anything.

You're here when —

You're evaluating vendors, framing a requirement, about to issue an RFP, or trying to figure out whether to build or buy. You want a second set of eyes from someone who's been on both sides of the table.

Pattern recognition work. Jeremy has watched enough federal AI procurement go wrong — from inside agencies, from the vendor side, and as an analyst — that he catches the expensive mistakes before they get made.

  • Vendor evaluation and the questions you don't know to ask
  • Requirements framing that protects you in the contract
  • Build vs. buy analysis grounded in federal reality
  • OMB M-25-22 acquisition readiness
  • Leadership alignment before the purchase, not after
Talk before you decide →
Deeper engagement

Build a model that actually fits.

You're here when —

You're already using AI in multiple ways and need to make it coherent — a real operating model built around your mission, values, and accountability structure, not a framework off the shelf.

Operating model design. Not a governance deck. Not a compliance exercise. A working model that defines how AI operates inside your specific institution — who owns it, how decisions get made, and how it stays yours.

  • Use-case mapping to mission and institutional values
  • Accountability and decision authority structure
  • Human oversight design that actually holds
  • Cross-program coherence and interoperability
  • Pilot design with measurable, defensible outcomes
Build something that fits →
How most clients start

A conversation before the RFP

One call to sense-check the approach, the vendors, the framing. Often catches the expensive mistake before it's made.

Where it usually goes

Shaping the acquisition

Requirements development, vendor evaluation, contract language. The work that determines whether the tool serves the institution or the other way around.

For institutions ready to go deeper

Building the operating model

A defined, governed, mission-aligned AI operating model. The work that makes AI coherent across the institution instead of scattered across programs.

Why Jeremy Wilcox

He's been on every side of this transaction.

The value isn't that he's seen a lot of federal AI. It's that he's seen it as the analyst telling agencies what to buy, as the vendor watching how they actually use it, and as the practitioner inside the building when it goes wrong.

Jeremy Wilcox

Founder, Imago Forge

Jeremy spent two decades moving between the three roles that almost nobody has held in one career: federal technology analyst, AI vendor, and government practitioner. That triangle is what Imago Forge is built on.

At Forrester, he advised federal agencies on what to buy — and watched institutions make expensive, irreversible decisions without the right framework. On the vendor side at C3.ai and Accenture, he saw how those same institutions actually used what they bought, and how rarely the tool matched the institution's real needs.

Inside government at DHS and detailed through USDS to DOJ, he saw the deployment side: what happens when AI meets a real institution, real workflows, and real accountability pressures.

Inside government — DHS & USDS
Senior Advisor for Customer Experience at DHS. Digital Services Specialist detailed to DOJ. Watched AI meet real institutions and real accountability pressures from inside the building.

On the vendor side — C3.ai & Accenture
Senior Director of Strategic Solutions at C3.ai federal. Led ClearEdge Partners' entry into federal acquisition. Knows exactly how vendors frame AI to agencies — and where the gaps live.

As the analyst — Forrester Research
Account Director advising federal agencies on technology decisions. Multiple President's Club. Learned what the right questions look like before the contract gets signed.

Published voice — FCW contributor
Written on federal technology modernization and acquisition in Federal Computer Week. A practitioner who can articulate what he sees.
vs. the field

What makes this different.

The large firms have AI governance practices. They also have billions invested in their own platforms and a strong incentive to sell you their stack.

The large consulting firms

Scale, frameworks, and their platform

  • NIST RMF decks applied generically to every client
  • Governance as compliance theater, not operating design
  • Partners who've never been inside a federal agency
  • Incentivized to sell you their platform and their bench
  • Senior attention in the pitch, junior team on the work
Imago Forge

Built around your institution, not ours

  • Operating model designed to your mission and values, not a template
  • Governance that reflects how your institution actually decides things
  • Advisor who's been inside DHS, USDS, Forrester, and C3.ai federal
  • No platform to sell — fully aligned incentives from day one
  • Jeremy on every engagement, not handed to someone else

One conversation before the decision costs you.

Whether you're evaluating vendors, framing a requirement, or trying to make sense of AI that's already sprawling across your institution — the right starting point is a direct conversation with someone who's been on every side of this.

Email Jeremy directly
Connect on LinkedIn

No intake form. No SDR. Just Jeremy.