

Your Platform Is Either Agent-Friendly or It's About to Lose. Here's the Data.

The shift is happening faster than most companies realise. AI agents are becoming the primary interface between users and the web, and the data shows traffic patterns are already breaking in ways that punish platforms not built for them.

Some numbers worth knowing:

60% of all searches are now zero-click. The user gets their answer directly in the AI summary and never visits a website. AI Overviews appear on 48% of Google queries as of March 2026, up from 34.5% just three months earlier. For queries where AI Overviews appear, organic click-through rate dropped 61%.

Business Insider lost 55% of organic search traffic between April 2022 and April 2025. HuffPost lost half its search referrals over the same period. Travel blog The Planet D lost 50% of its traffic after Google launched AI Overviews, laid off staff to survive, watched traffic drop another 90%, and shut down earlier this year.

Gartner predicts traditional search traffic will fall 25% by 2026. We're already there.

The traffic isn't disappearing. It's shifting.

AI-referred traffic to Shopify grew 7x since January 2025. AI-attributed orders grew 11x. Brands cited in AI Overviews earn 35% more organic clicks and 91% more paid clicks than non-cited competitors. AI search visitors convert at 23x the rate of traditional organic search visitors — meaning 1,000 AI-driven visits produce roughly the same conversions as 23,000 traditional organic visits.

The platforms showing up in AI answers are gaining traffic, conversions, and discovery. The platforms invisible to AI are watching organic traffic decline and don't understand why.

Why this happens

When a human visits your website, they tolerate a lot. Slow loads, popups, scroll hijacking, cookie banners, marketing fluff before the actual information. They'll click through three menus to find pricing. They'll squint at a hero image to understand what you do.

Agents don't.

An agent visiting your platform wants structured data, clear endpoints, and predictable responses. It needs to know what your product is, what it costs, whether it's available, how to interact with it, and how to complete a transaction — all in machine-readable form. If that information is buried in a JavaScript-rendered hero section behind a cookie wall, the agent gives up and goes somewhere easier to parse.
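The most common machine-readable form of this is schema.org structured data embedded as JSON-LD. A minimal sketch for a hypothetical product (names and values are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "One sentence saying what the product actually does.",
  "offers": {
    "@type": "Offer",
    "price": "29.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

An agent can extract the product, its price, and its availability from this block without rendering JavaScript or fighting a cookie wall.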

Worse, 34% of SaaS companies are blocking at least one major AI crawler — GPTBot, PerplexityBot, or ClaudeBot — in their robots.txt. They're literally locking themselves out of AI discovery while watching traffic decline.
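Undoing that is a one-stanza-per-crawler change. A minimal robots.txt sketch that explicitly admits the three crawlers named above, using their documented user-agent tokens (the /admin/ rule is a hypothetical example of a path you might still restrict):

```text
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: *
Allow: /
Disallow: /admin/
```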

The buyer is already there

85% of B2B buyers have a "Day One List" of preferred vendors before they ever speak to a sales representative, according to Bain & Company. That list used to be formed through Google searches, peer recommendations, and analyst reports. It's now increasingly being formed in conversations with ChatGPT, Claude, and Perplexity.

If your platform isn't being mentioned in those conversations, you're not on the Day One List. You're not even in the consideration set, which means you're not in the pipeline at all.

What "agent-friendly" actually means

The platforms winning right now share a common pattern. They expose structured product data through APIs or MCP endpoints. They publish schemas that agents can read. They make pricing programmatic, not buried in PDFs or "contact sales" CTAs. They handle authentication and transactions through standardised flows. They treat the agent as a first-class user, not an inconvenience.
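"Programmatic pricing" can be as simple as a stable JSON endpoint. A hypothetical response from something like GET /api/v1/pricing (the path and field names are illustrative, not a standard):

```json
{
  "currency": "USD",
  "plans": [
    { "id": "starter", "price_monthly": 29, "seats_included": 5 },
    { "id": "growth", "price_monthly": 99, "seats_included": 25 },
    { "id": "enterprise", "price_monthly": null, "contact": "sales@example.com" }
  ]
}
```

Even a "talk to sales" tier stays machine-readable this way: the agent learns that custom pricing exists and how to pursue it, instead of hitting a dead end in a PDF.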

Shopify's AI Toolkit, released earlier this month, is the cleanest example. Every Hydrogen storefront now exposes an MCP endpoint by default. Every store is queryable, browsable, and transactable by AI agents without a human ever opening a browser. That's why Shopify's AI-attributed orders grew 11x in a year. It's not a coincidence — it's architecture.
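Under the hood, MCP is JSON-RPC. An agent that has discovered a storefront's endpoint might issue a call like the following — the `tools/call` method comes from the MCP specification, but the tool name and arguments here are hypothetical, not Shopify's actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_products",
    "arguments": { "query": "trail running shoes", "max_price": 150 }
  }
}
```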

The same pattern is showing up everywhere. Anthropic released Managed Agents to make it easier to deploy agents at scale. OpenAI is building toward a "super app" combining chat, code, and browsing. Google is integrating AI Mode into search. Every layer of the web is being rebuilt with agents as the primary consumer.

What this means for product teams

The question to ask isn't "is our website nice?" It's: can an AI agent figure out what we do, what we charge, and how to use us — without a human in the loop?

Minimum viable agent-friendliness includes a public API or MCP endpoint that exposes core product capabilities. Structured data on every public-facing page. Pricing that's programmatically accessible. Content that's parseable, attributable, and citation-friendly. Authentication and transaction flows that agents can complete on behalf of users. And critically — robots.txt that allows AI crawlers, not blocks them.
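The robots.txt item is the easiest to audit automatically. A minimal Python sketch, using the standard library's robots.txt parser, that reports which major AI crawlers a given policy would turn away (the sample policy below is illustrative):

```python
from urllib.robotparser import RobotFileParser

AI_CRAWLERS = ["GPTBot", "PerplexityBot", "ClaudeBot"]


def blocked_ai_crawlers(robots_txt: str, url: str = "https://example.com/") -> list[str]:
    """Return the AI crawlers this robots.txt policy would turn away from `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_CRAWLERS if not parser.can_fetch(bot, url)]


# Example: a policy that singles out GPTBot while allowing everyone else.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

print(blocked_ai_crawlers(sample))  # → ['GPTBot']
```

In practice you would fetch your live robots.txt and run the same check; any non-empty result means you are opting out of AI discovery.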

This is not exotic. It's table stakes for the next decade.

The bigger picture

For two decades, the goal was to rank on Google. SEO was about pleasing a crawler that scanned text and links. Now the crawler is an agent that wants to actually use your product on behalf of a human, and the standard is dramatically higher.

The platforms that adapt will see compounding advantages. As more users delegate to agents, the agent-friendly platforms get more traffic, more transactions, and more data. That data improves the agent's recommendations, which drives more usage. The flywheel accelerates.

The ones that don't adapt will see traffic decline, conversions drop, and discovery dry up. They'll blame algorithm changes, market conditions, or competitive pressure. The actual cause will be that agents stopped sending users their way.

Build for the agent. Or watch the agent build a path around you.

Let's build something.

I'm always up for a conversation with founders and teams who want to ship faster.