The Semantic Web
“I have a dream for the Web in which computers become capable of analyzing all the data on the Web — the content, links, and transactions between people and computers. A ‘Semantic Web,’ which makes this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The ‘intelligent agents’ people have touted for ages will finally materialize.”
— Tim Berners-Lee, Weaving the Web, 1999.
Tim Berners-Lee invented the World Wide Web at CERN in 1989. He wrote the first browser, the first server, and the specifications for HTML, HTTP, and the URL, and gave it all away for free. He is the creator of the web as we know it today.
Ten years later, he wrote the paragraph above. The web he had built was a system of documents linked to other documents, designed for humans to read. His vision went further. Computers would be able to read the data on every page, understand what it meant, and act on it. Agents would do the work; humans would set the goals. He called this the Semantic Web.
The original web was never built for this. Berners-Lee initially included typed links, relationships that carried meaning, where one document could "describe" another or "depend on" another. To ship something working with the tools available in 1990, the system had to be stripped down to what HTML is today, a display layer. Berners-Lee spent the next 25 years trying to bring the Semantic Web to life.
He tried to fix this with a new technology stack: RDF for expressing facts, OWL for defining the relationships between them, and SPARQL for querying it all. This layer of standards would turn web pages from documents into structured data, letting machines read individual facts, understand how they relate, and query the whole web like a single database. In practice, every publisher on the web would have to encode the meaning of their data by hand for the system to work. But the standards were complex, and the work fell on publishers who got nothing immediate in return. By 2006, Berners-Lee admitted the project remained largely unrealized, and by the mid-2010s, most of the developer community had quietly moved on.
Then large language models arrived. They could read prose, parse intent, follow context, and reason across messy data. The labor Berners-Lee had asked of publishers turned out to be unnecessary. Machines had become smart enough to work with data made for humans.
In November 2024, Anthropic released the Model Context Protocol, letting AI models connect to tools and data without bespoke integrations for every combination. Within a year, every major AI lab had adopted it. The integration problem collapsed from N × M to N + M.
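The N × M to N + M collapse is simple arithmetic, and worth making concrete. A sketch, with illustrative numbers:

```python
# Integration count before and after a shared protocol.
# With N clients (AI apps) and M servers (tools and data sources),
# bespoke integration requires one connector per pair: N * M.
# A shared protocol like MCP needs one adapter per participant: N + M.

def bespoke_integrations(n_clients: int, m_servers: int) -> int:
    """One custom connector for every client-server pair."""
    return n_clients * m_servers

def protocol_integrations(n_clients: int, m_servers: int) -> int:
    """Each side implements the shared protocol once."""
    return n_clients + m_servers

# Illustrative scale: 10 AI apps connecting to 50 tools.
print(bespoke_integrations(10, 50))   # 500 custom connectors
print(protocol_integrations(10, 50))  # 60 protocol adapters
```

At web scale the gap is what makes the ecosystem viable: a tool built once becomes reachable from every client that speaks the protocol.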
In April 2025, Google launched Agent2Agent. Where MCP lets an agent talk to a tool, A2A lets an agent talk to another agent. It defines how two autonomous agents discover each other, authenticate, exchange tasks, and coordinate work. Each agent publishes an Agent Card, a JSON file describing its capabilities, endpoints, and authentication requirements, that other agents can read and act on. Where MCP gives agents tools, A2A gives them collaborators.
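To make the Agent Card idea concrete, here is a minimal sketch in Python. The field names follow the general shape the A2A spec describes (name, URL, capabilities, authentication, skills), but the specific agent, endpoint, and values are hypothetical, illustration rather than a copy of the official schema:

```python
import json

# Illustrative Agent Card for a hypothetical travel-booking agent.
# The structure loosely follows the A2A spec's description; the agent
# name, URL, and skill IDs are invented for this example.
agent_card = {
    "name": "travel-booker",
    "description": "Books flights and hotels on a user's behalf.",
    "url": "https://agents.example.com/travel-booker",  # hypothetical endpoint
    "version": "1.0.0",
    "capabilities": {"streaming": True, "pushNotifications": False},
    "authentication": {"schemes": ["bearer"]},
    "skills": [
        {
            "id": "book-hotel",
            "name": "Book a hotel",
            "description": "Finds and reserves a room matching the request.",
        }
    ],
}

# Another agent would fetch this JSON from a well-known URL, read the
# capabilities and auth requirements, and decide whether to delegate.
published = json.dumps(agent_card, indent=2)
remote = json.loads(published)
print(remote["name"], [skill["id"] for skill in remote["skills"]])
```

The point of the card is discovery without prior arrangement: any agent that can fetch and parse this file knows what the other agent does and how to talk to it.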
Once agents could read the web and talk to each other, ecommerce was an obvious use case. The economic incentive was clear: every successful agent transaction is a real transaction, with real money attached. OpenAI and Stripe launched the Agentic Commerce Protocol (ACP), while Google and Shopify released the Universal Commerce Protocol (UCP), two competing standards for agentic commerce.
The result is what Berners-Lee described in 1999, almost word for word. Someone asks ChatGPT for "best running shoes under a hundred dollars." The model reads structured product feeds from millions of merchants, surfaces a shortlist, and completes the transaction through ACP. The merchant ships a package. Machines talked to machines. The human never opened a website.
The era of designing every digital surface for human attention is ending. Retailers will need to expose their catalogs to agents with semantic, contextual data. Hotels will need to expose their endpoints to agents, not a clickable GUI. Payments are already being rewired through protocols like AP2, where tokenized intent mandates work alongside existing card networks and prove an agent acted on real authorization. The companies whose systems can be read, queried, and trusted by an agent will be the ones agents transact with.
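The core idea behind an intent mandate can be sketched in a few lines. AP2's real mandates are verifiable credentials with proper cryptographic signatures; the toy below stands in an HMAC for the signature scheme, and every field name, key, and value is hypothetical. What it shows is the mechanism the paragraph above describes: the user signs a bounded authorization, and anyone holding the verification key can prove the agent acted within it.

```python
import hashlib
import hmac
import json

# Toy intent mandate: the user authorizes an agent to spend up to a
# limit on a stated intent, signs the mandate, and the merchant side
# verifies the signature before accepting the transaction. An HMAC
# with a shared secret stands in for a real signature scheme here;
# all names and fields are illustrative, not AP2's actual format.

USER_KEY = b"user-secret-key"  # placeholder for the user's signing key

def sign_mandate(mandate: dict, key: bytes) -> str:
    """Canonicalize the mandate and sign it."""
    payload = json.dumps(mandate, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_mandate(mandate: dict, signature: str, key: bytes) -> bool:
    """Check that the mandate is exactly what the user signed."""
    return hmac.compare_digest(sign_mandate(mandate, key), signature)

mandate = {
    "agent": "shopping-agent",
    "intent": "running shoes",
    "max_amount_usd": 100,
}
signature = sign_mandate(mandate, USER_KEY)
print(verify_mandate(mandate, signature, USER_KEY))  # True

# Any tampering -- say, the agent inflating its spending limit --
# invalidates the signature.
tampered = dict(mandate, max_amount_usd=10_000)
print(verify_mandate(tampered, signature, USER_KEY))  # False
```

The property that matters for commerce is the second check: the merchant doesn't have to trust the agent, only the math.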
Search is becoming a conversation. ChatGPT has over 900 million weekly users, and Google's AI Overviews reach over two billion a month. Both treat search as a dialogue rather than a list of links, and nearly half of AI search users describe AI as their primary source of information. "Simply pulling text from a website isn't enough anymore," said Brandon Ervin, Google's Director of Product Management for Ads. When queries stop being keywords, keyword-based advertising stops working. Google is killing Dynamic Search Ads, the keyword-based product that has powered search advertising for two decades, and forcing every advertiser onto AI Max, which infers user intent rather than matching strings.
In the end, Berners-Lee was right. The original Semantic Web as a technology project never won, but the vision underneath it did. The age of the Semantic Web is finally arriving.