Your website was built for humans. But its most important visitors may not be human for much longer.
A recent article on Search Engine Journal covers guidance published by Google on web.dev, advising developers to treat AI agents as a distinct visitor type. This is not a vague recommendation. It is a clear signal about where the web is heading.
Google says it directly: more and more users are moving away from manual navigation and delegating tasks to AI agents. If your site is not ready for that, you are losing an entire category of traffic without knowing it. At difrnt., we see this daily in client analytics: new user-agent strings, non-human navigation patterns, sessions that move through pages in ways no real user would.
Three ways an AI agent reads a website
When a human visits a site, they see the visual interface: colors, buttons, text. An AI agent can perceive the site in three different ways, each with its own strengths and limitations.
The first method: screenshots processed with vision models. The agent takes a screenshot and uses a visual model to interpret what is on screen. It works, but it is slow and imprecise. If a button looks like a button but is actually a styled div, the agent might miss it entirely.
The second method: raw HTML structure. The agent reads the source code directly. Fast, but noisy. HTML full of nested divs and auto-generated CSS classes is hard to interpret even for an advanced language model.
The third method, and the one Google recommends: the accessibility tree. This provides a precise map of all interactive elements on the page, with roles, labels, and relationships. It is the most reliable way for an agent to navigate a website. If your interactive elements do not appear correctly in the accessibility tree, for an AI agent they simply do not exist.
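To make this concrete, here is a minimal sketch of how a small form might surface in the accessibility tree. The markup is hypothetical, and the tree rendering shown in the comment is approximate (roughly what Chrome DevTools displays under Elements > Accessibility):

```html
<!-- Hypothetical signup fragment with semantic markup. -->
<form aria-label="Newsletter signup">
  <label for="email">Email address</label>
  <input id="email" type="email" autocomplete="email">
  <button type="submit">Subscribe</button>
</form>

<!-- The accessibility tree exposes this roughly as:
     form "Newsletter signup"
       textbox "Email address"
       button "Subscribe"
     Every node carries a role and an accessible name an agent
     can locate and act on. A styled div in place of the button
     would appear only as a generic, nameless container. -->
```

The key point: the role and name in the tree come for free from the semantic elements; nothing here required ARIA beyond the form label.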
What makes a site broken for agents
Here is the practical problem: many websites that look perfect to human users are effectively unusable for AI agents. Google identifies several common anti-patterns.
Complex hover states. If essential information or menus only appear on hover, an agent cannot access them. Agents do not move a mouse cursor over elements. Information must be accessible without contextual interaction.
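One low-effort fix is to use a native disclosure widget instead of a CSS hover menu. The sketch below assumes a simple two-item product menu (the links are illustrative); a `details`/`summary` pair works on click, keyboard, and in the accessibility tree, with no hover required:

```html
<!-- A hover-only dropdown (hidden until :hover) is invisible to
     agents. This native disclosure opens on click or Enter/Space
     and is fully exposed in the accessibility tree. -->
<details>
  <summary>Products</summary>
  <ul>
    <li><a href="/shoes">Shoes</a></li>
    <li><a href="/bags">Bags</a></li>
  </ul>
</details>
```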
Unstable layouts. Elements that shift around the page during loading or on scroll confuse agents just as they confuse users. The difference: a user adapts, an agent gets stuck.
Styled divs instead of semantic HTML. A <div class="btn-primary"> looks like a button, but it is not recognized as one by the accessibility tree. A <button> is. The difference seems minor, but for an agent it is the difference between functional and invisible.
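Both options are sketched below. The class name and the handleBuy handler are hypothetical; the first form is what to aim for, the second is a fallback when replacing the markup is not feasible:

```html
<!-- Preferred: a real button. The accessibility tree reports
     role "button", and Enter/Space activation comes for free. -->
<button class="btn-primary" type="button">Buy now</button>

<!-- Fallback: if the div cannot be replaced, role and tabindex
     restore part of the semantics, but keyboard activation must
     be wired up by hand. handleBuy is a hypothetical handler. -->
<div class="btn-primary" role="button" tabindex="0"
     onclick="handleBuy()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') handleBuy()">
  Buy now
</div>
```

Note how much extra code the fallback needs just to approximate what the native element does by default.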
At difrnt., when we run technical audits for the AI agent era, we already check for these issues. But now the stakes are higher: it is no longer just about accessibility for people with disabilities (though that matters enormously), but about accessibility for the AI agents that will increasingly mediate the relationship between brands and consumers.
From accessibility to agent-readiness
The most interesting aspect of Google’s guidance is the near-total overlap between web accessibility practices and AI agent compatibility requirements. In practice, if your site meets WCAG standards, it is already largely prepared.
Semantic HTML (<button>, <a>, <label> instead of styled divs), stable layouts, correct label-to-input associations, cursor: pointer on clickable elements. These are things any competent developer should be implementing anyway.
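The list above condenses into a few lines of markup. This is a minimal sketch (field names are illustrative) showing an explicit label-to-input association and a scoped cursor rule:

```html
<!-- The for/id pair gives the input an accessible name from the
     label, so both screen readers and agents know what it is. -->
<label for="qty">Quantity</label>
<input id="qty" type="number" min="1" value="1">

<style>
  /* cursor: pointer signals clickability to humans and to
     vision-based agents; scope it to genuinely interactive
     elements, not to everything. */
  button, a, [role="button"] { cursor: pointer; }
</style>
```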
But the reality is that many websites, including those in the Romanian market where we work, do not meet these standards. A quick Lighthouse audit on Romanian e-commerce sites frequently shows accessibility scores below 70. Forms without associated labels, buttons that are actually links, navigation menus that only work with JavaScript. Each of these problems is now also a barrier for AI agents.
We tested internally how several AI agents (Claude, GPT-4, Gemini) interpret our clients’ websites. The results confirmed what Google says: sites with clean semantic HTML were navigated almost perfectly. Sites with many custom JavaScript components generated errors and misinterpretations in over 40% of cases.
WebMCP and what comes next
Google is working on a standard called WebMCP, a protocol for agent-to-website interaction. It is in early preview, with sign-ups available for Chrome developers.
What does WebMCP promise? A standardized way for a site to tell an agent what actions are available, what data it can provide, and how to interact with it. Think of it as an implicit API for every web page. No separate documentation needed, no dedicated endpoints to build.
For marketers and business owners, the implication is clear: sites that adopt these standards early will be the first ones AI agents can use efficiently. And that translates into visibility, conversions, and relevance in a web increasingly mediated by AI. We wrote previously about what the data shows about content and AI Search, and this direction confirms the trend: your digital presence needs to be readable not just by people, but by machines.
Four checks you can run today
You do not have to wait for WebMCP to start. Here is what you can do on your website right now:
1. Run an accessibility audit. Use Lighthouse or axe DevTools and note the score. Every accessibility issue is now also an agent-readiness issue. Focus on interactive elements: buttons, forms, navigation. Aim for a score of at least 90.
2. Check your semantic HTML. Look for divs functioning as buttons or links. Replace them with the correct HTML elements (<button>, <a>, <label>). Add aria attributes where native structure is not sufficient.
3. Remove hover dependencies. Any content or functionality accessible only via hover must have an alternative. If a menu appears on hover, it must also work on click or focus. Test by navigating your site using only the keyboard.
4. Test layout stability. Core Web Vitals already measures CLS (Cumulative Layout Shift). A high CLS does not just annoy users, it makes the site unpredictable for agents. Keep it below 0.1.
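For point 4, the most common CLS culprit is media loading without reserved space. A minimal sketch (the image path and dimensions are placeholders): giving an image explicit width and height attributes lets the browser compute its box before the file arrives, so nothing shifts.

```html
<!-- width/height reserve the layout box up front; the CSS keeps
     the image responsive while preserving its aspect ratio. -->
<img class="hero" src="hero.jpg" width="1200" height="630"
     alt="Product hero shot">

<style>
  .hero { max-width: 100%; height: auto; }
</style>
```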
What is fascinating about this entire shift is that Google says it explicitly: everything you do to make your site ready for AI agents also makes it better for humans. It is not a trade-off. It is a double-return investment. And in a context where agentic commerce is gaining ground, this is no longer a nice-to-have.