Why 78% of Companies Adopted AI and Only 1% Got Real Results

Most AI projects fail because the order is wrong. Map the process first. Build the database second. Add AI last. The Hildebrandt framework, explained.

By Dimitris Kontaxis

Every AI consultancy sells you generic agents. Same template. Same pitch. They skip the hard part.

The hard part is not the model. It is your Monday morning.

The statistic that should make everyone uncomfortable

McKinsey's State of AI 2025 report, released in November, has one number that matters more than the rest. 78% of organizations now use AI in at least one business function. Only 1% of C-suite respondents describe their generative AI rollouts as mature, where "mature" means AI is fundamentally changing how work gets done.

Read that again. Seventy-eight to one.

The report lists the reasons in its own language. The biggest barrier to scaling is not the workforce, it is leadership not steering fast enough. The single attribute with the largest effect on whether an organization captures EBIT from generative AI is not the choice of model. It is the redesign of workflows. Most companies have not touched the workflow layer. They bought the tool and waited for the tool to figure out their business for them. It never does.

So when a founder asks us why their team signed up for three AI products last quarter and still sends the same emails by hand, the answer is not "you picked the wrong product." The answer is that nobody mapped the process first.

The order has been the same since the industrial revolution

The order is not new. Every operator who has ever walked a factory floor already knows it. What is new is the number of companies trying to reverse it.

Step 1. Process. Map what the business actually does, not what the leadership deck says it does.

Step 2. Database. Build a single source of truth where every business object (clients, projects, deliverables, team members) relates to every other one naturally.

Step 3. Automation. Only now layer the AI on top.

Ryan Hildebrandt, co-founder of OpsKings, has been writing about this order publicly for the last two years. His framing is the cleanest version we have seen for the SMB context. Break the business into an assembly line. Decide, station by station, which ones should get a human and which should get a machine. If you try to install the machine before you have the assembly line, the machine sits in the corner and hums.

The companies making up the 1% are the ones that treated Step 1 as the product. The companies making up the 78% bought Step 3 and called it strategy.

Invisible work is where the leverage hides

The diagnostic question that almost never gets asked in a sales call is this one. Can you take a month off, fully off the grid, and would your business still function?

If the answer is no, the reason is not a missing tool. The reason is invisible work.

Invisible work is the weekly Slack ping to a supplier about next month's order. It is the quarterly review of who is renewing and who is churning, tracked in the founder's head. It is the client question that arrives on WhatsApp every Tuesday and gets answered inside fifteen minutes by whoever happens to be near their phone. It is the Friday afternoon catch-up where the founder tells the team what the clients have been saying and the team adjusts next week's plan.

None of that is in the org chart. None of it is in the CRM. None of it is in any document a freelance consultant could ever find by asking "show me your process." Freelancers build what you tell them to build. The work you cannot describe is the work you cannot delegate, and the work you cannot delegate is the reason you cannot take that month off.

The diagnostic session is where that work gets surfaced. It is not a sales call. It is a mapping exercise. You walk us through your Monday morning and we write down the things that happen without a prompt, because those are the things the Full-Time Agent will eventually own.

What clients ask for versus what the process actually needs

Here is the pattern we see almost every time. The client arrives with a clean ask. The real problem, once we have mapped the process, looks different.

| What the client asked for | What the process actually needed |
|---|---|
| "Automate my task management." | A rewritten intake protocol that eliminates the step generating the tasks in the first place. |
| "Save my team time on daily QA." | A tagging system that replaces the Loom recordings nobody has time to watch. |
| "I need a better dashboard." | A dashboard the sales team uses inside deal calls, not one the finance team exports monthly. |
| "Automate my onboarding." | A redesigned offer-delivery flow so onboarding can be repeatable at all. |

We are not making this up. Hildebrandt describes the same gap from OpsKings' side of the table, and we have lost count of the discovery calls where the pain point we uncover is three questions away from the one the founder opened with. The reframe is always the same. Just tell us what your business problems are. The framework will do the translation.

This is the reason "process-first" is not a buzzword. It is the entire delivery discipline. Skip it and you end up automating the ask. Do it properly and you end up removing the reason for the ask.

How we deliver the framework in practice

The framework is abstract. The delivery is not. Here is how the four phases map to actual edge247 work, in the order we do them.

Phase 1. Discovery. Two hours on site. One engineer, the founder, whoever else on the team carries invisible work. We walk through a normal week, station by station. Output: a process map, a ranked list of pain points, and a shortlist of the two or three places where a Full-Time Agent creates the most leverage.

Phase 2. Architecture. One hour, remote. Two engineers. We design the agent's harness around the process map. Which stations get a human, which get an agent, what the agent is allowed to touch, what it has to ask about, how the graduation plan works (Month 1 asks before acting, Month 3 handles routine, Month 6 anticipates). Output: an agent blueprint specific to the business.

Phase 3. Implementation. Two to three hours with both engineers, one on site, one remote. We deploy the platform on the business's own Mac Mini. We configure the personality, the tool policies, the memory architecture, the channel routing. We harden the file permissions and the network boundary. We test. Output: a running Full-Time Agent on the founder's WhatsApp.

Phase 4. Always-On Watch. Monthly, remote, for as long as the business wants it. Every platform release gets tested on our hardware before it gets anywhere near the client. The personality gets refined as the business changes. The memory gets tuned. The graduation plan advances. Output: an agent that compounds in value every month instead of decaying like a static install.

Total for phases 1 to 3, around six hours of engineering time. The output is not just a working agent. It is a founder who understands why the agent works, which is the difference between a tool you paid for and an operator you hired.

Why this lands harder in Europe

Eurostat's most recent numbers for enterprise AI use say that 13.5% of EU enterprises with 10 or more employees adopted AI in 2024. That figure grew to 20% in 2025, which is real movement, but it hides a bigger gap: large enterprises are at 41%, small enterprises at just 11%. The European SMB layer, the workshops and family-run trading companies and regional retailers, is where the shortfall is largest.

The usual explanation is that Europe is "behind." The more useful explanation is that the dominant delivery model (cloud-hosted, US-priced, English-only, compliance as an afterthought) was never built for European SMBs in the first place. GDPR is a real constraint. Local data handling is not a preference, it is the only version of the conversation that ends in a signature. Pricing that works for a Silicon Valley series B looks ridiculous to a Greek trading company with 12 employees.

There is also a second-order problem. European SMBs are the ones most exposed to invisible work, because so many of them run on founder knowledge and relationship capital that never gets written down. A mid-size business where the founder still answers the most important emails personally, and where two decades of supplier and client history live in two or three people's heads, is not a rare shape. It is the dominant shape.

The standard cloud AI pitch ("upload your documents to our platform and we will index them") asks those founders to hand over the exact thing that keeps the business defensible. They say no, and they are right to say no. The only honest answer is an implementation that stays on their hardware, inside their office, behind their firewall, and gets better every month because someone is watching it for them. That is what we do.

INSEAD's recent "McKinsey in a Box" coverage points at the other side of this. The analytical capability that used to cost a business fifty thousand euros from a top consultancy is collapsing into tools that a mid-sized company can run themselves. The implication for SMBs is not "hire a consultant now while you still can." The implication is that process-first AI implementation, the thing the 1% of mature companies are doing, is for the first time accessible to a business that could never have afforded a McKinsey engagement. Somebody has to actually deliver it. That is the opening.

We are building edge247 for that opening. Discovery that maps your Monday morning. Architecture that respects the invisible work. Deployment on your own hardware in your own office. Ongoing care that does not stop when the invoice clears. You do not need to understand harness engineering or memory architecture or security hardening. You just need someone who does, and who will keep doing it every month while you run your business.

If this article describes your last year of AI investment

Book a discovery call. Thirty minutes. No slide deck. We walk through your Monday morning and we tell you honestly whether a Full-Time Agent is the right move or whether you need to fix something else first. If the answer is "fix something else first," we will say so. That is the entire job.

Because 78% is not a ceiling. It is a starting line. The 1% is whoever does the boring part in the right order.
