By Sandra Naranjo Bautista

The use of generative AI has doubled in the past year. ChatGPT alone sees over 4.5 billion monthly visits, with 73% of messages related to non-work issues. AI is no longer just a workplace tool — it’s becoming embedded in everyday life.

The economic promise is huge. PwC estimates that AI could generate $15.7 trillion in productivity gains by 2030. The hype is real — but so is the gap.

AI adoption is not happening equally. And when you look at the data, the divide between the private and public sectors becomes hard to ignore. While businesses are racing ahead, many governments are still stuck in the pilot phase. I wanted to understand: how big is the gap — and why does it exist?

The differences between the two sectors

Adoption

Private sector adoption has risen sharply. According to the latest McKinsey survey, 78% of firms are using AI (up from 20% in 2017), and 71% report using generative AI in 2024 — with higher adoption in large firms.

In contrast, in a survey of 14 high-income countries, 71% of agencies are in the planning or early implementation stage, only 26% have integrated AI across the organization, and just 12% are deploying GenAI tools.

Usage rates vary sharply across income levels. As of April 2025, 24% of internet users in high-income countries used ChatGPT, compared to 5.8% in upper-middle, 4.7% in lower-middle, and just 0.7% in low-income countries.

Investment

The investment gap is just as wide, according to Stanford's AI Index Report. In 2024, the U.S. private sector invested $109 billion in AI, compared to $3 billion by the federal government — 36 times more. Globally, private investment reached $252 billion. Meanwhile, public investment remains fragmented and often ad hoc.

Use Cases

The way each sector uses AI reflects its purpose.

Private companies have focused on revenue growth, customer experience, and operational redesign. AI is becoming a core part of how they work, not just what they produce.

The public sector has approached AI from two main angles: first, as a regulator, by creating AI strategies and policies — often lagging behind the pace of change. Second, as a user, with most applications focused on internal improvements such as fraud detection, audit analytics, and forecasting.

Only 4% of government AI use cases are citizen-facing. While AI is helping agencies improve existing processes, it’s rarely driving transformation.

In short, AI is helping businesses reimagine how they operate. In government, it’s helping agencies tweak what they already do.

Why the Gap Exists — and Why It’s Not Just About Technology

Different incentives, different speeds

Private firms operate in competitive environments. They have stronger incentives to experiment, shorter feedback loops, and a higher tolerance for risk.

Government, by contrast, is held to a different standard. Accountability, public trust, and resource constraints make it harder to take risks. A failed AI project in the private sector may mean a financial loss. In government, it could mean public harm.

In fact, 62% of public sector respondents cite data privacy and security concerns as a major barrier to adoption.

Structural barriers in government

Governments operate within rigid structures that slow down innovation. Budgets are usually annual, approvals are layered, and mid-year flexibility is limited. In many countries, even digital infrastructure is unevenly distributed.

Legacy systems are another major hurdle: 45% of governments say these systems significantly constrain AI implementation. A 2025 EY survey found strong investment in data infrastructure (64%) and analytics (41%), but far less in AI (26%) or GenAI (12%).

And it’s not just systems — it’s data. In Korea, for example, a study of AI failures found that 70–80% of hallucinations in government AI pilots were caused by poor or low-quality data. Without better inputs, even the best models won’t deliver.

Many agencies also operate in silos. Cross-agency collaboration is rare and often requires top-down coordination and deliberate policy changes.

Workforce and skills

Workforce readiness is a barrier everywhere — but the public sector feels it more acutely.

While private companies are partnering with universities, hiring fast, and investing in internal training, governments face salary caps, rigid recruitment systems, and higher turnover.

A 2024 Salesforce survey found that 60% of public sector IT professionals identified skills shortages as the top challenge to AI adoption. These shortages are particularly severe at the local level, where resources are even more constrained.

Trust, governance, and culture

Governments are subject to more scrutiny. They must comply with strict regulations, ensure fairness and explainability, and manage public expectations around transparency and ethics. This adds time, complexity, and risk aversion to any implementation process.

Meanwhile, the private sector’s governance pressures are mostly reputational — important, but not legally or politically binding.

Culture also plays a role. A 2025 study found that 35% of public sector leaders cite a lack of innovation and risk-taking culture as a constraint to AI adoption. As BCG reminds us, only 10% of AI transformation is about algorithms. The rest? People and processes. Change management matters — and it’s harder in bureaucratic systems designed for stability, not experimentation.

What This Means for Governments: Action, Not Luck

The adoption gap won’t close on its own. Governments need more than high-level strategies and ethical frameworks — they need deliberate, sustained action.

That means:

  • Strategic governance — a whole-of-government approach with clear mandates, coordination mechanisms, and incentives from the center of government.
  • Foundational readiness — investing in clean data, digital infrastructure, and integration capacity before scaling AI.
  • Enabling resources — particularly skilled people and modern systems. This might require new work arrangements, hiring flexibilities, and targeted training.
  • Room to experiment — with clear rules, transparency, and accountability. Pilots are a good start, but they must be designed with a path to scale.

AI won’t transform government by accident. It takes coordinated action, trusted institutions, and the courage to move beyond pilots.

You do not rise to the level of your goals, you fall to the level of your systems.

James Clear