NLWeb, MCP, and llms.txt: The Three Protocols That Will Define the Agentic Web
The agentic web runs on three protocol layers. llms.txt tells AI what to read. NLWeb lets AI ask questions. MCP lets AI take action. Here's how they fit together and which one your site needs first.
The Web Just Got a New Stack
For thirty years, the web had one consumer: the human browser. HTML was designed for human eyes. CSS for human aesthetics. JavaScript for human interaction. That era is ending.
AI agents are the web's second consumer. And they need their own stack. Not a replacement for HTML — an addition. A parallel layer that lets machines read, understand, and act on your content without scraping, guessing, or hallucinating.
AgentReady™ has tracked three protocols emerging as the foundation of this new stack: llms.txt, NLWeb, and MCP. Each serves a distinct purpose. Together, they form the complete communication layer between your website and the AI agents that will increasingly drive traffic, transactions, and trust.
Think of it like the original web stack. HTML gave structure. CSS gave presentation. JavaScript gave behavior. The agentic web has the same three-layer logic — just for a fundamentally different audience.
Layer 1: llms.txt — The Instructions Layer (Read)
The simplest protocol is often the most powerful. llms.txt is a plain text file placed at your domain root that tells AI models what your site is, what it offers, and how to navigate its content. Think of it as robots.txt's smarter sibling.
Where robots.txt says "don't go here," llms.txt says "here's what matters." It provides structured context — your business description, key pages, content hierarchy, and preferred citation format. It takes 15 minutes to create and immediately improves how AI models understand your site.
Adoption is accelerating. As of early 2026, roughly 7.2% of the top 10,000 sites have implemented llms.txt, up from under 1% in mid-2025. The protocol was proposed by Jeremy Howard (co-founder of fast.ai) and has gained traction because of its radical simplicity — no API, no server changes, just a text file.
- Plain text file at /llms.txt or /.well-known/llms.txt
- Describes your site's purpose, structure, and key content
- No server infrastructure required — just upload a file
- Immediately parseable by ChatGPT, Perplexity, Claude, and others
- The single highest-ROI AI readiness action you can take today
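To make the format concrete, here is a minimal illustrative llms.txt following the structure proposed at llmstxt.org: an H1 site name, a blockquote summary, and H2 sections that list key links with one-line descriptions. The business, URLs, and descriptions are placeholders, not a real deployment.

```markdown
# Example Outfitters

> Direct-to-consumer running gear retailer. We sell shoes, apparel, and
> accessories, and publish detailed sizing and care guides.

## Products
- [Running shoes](https://example.com/shoes): Full catalog with sizing charts
- [Apparel](https://example.com/apparel): Seasonal collections and materials guide

## Policies
- [Shipping & returns](https://example.com/policies/shipping): 30-day return window
- [Size guide](https://example.com/size-guide): US/EU/UK conversion tables

## Optional
- [Blog archive](https://example.com/blog): Training tips and product deep dives
```

The Optional section is part of the proposed format: it marks content an AI model can skip when its context window is tight.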
Layer 2: NLWeb — The Conversation Layer (Understand)
If llms.txt is a welcome sign, NLWeb is a concierge desk. Developed by Microsoft and released as an open protocol in mid-2025, NLWeb lets AI agents ask your website questions in natural language and receive structured, authoritative answers.
The protocol works through a /.well-known/nlweb endpoint. An AI agent sends a query — "What sizes does the blue running shoe come in?" — and your NLWeb endpoint returns a structured response with the answer, confidence level, and source citation. No scraping. No inference. Direct, authoritative answers from the source.
This is a fundamental shift. Traditional search required AI to crawl your HTML, parse it, and hope it extracted the right information. NLWeb makes your site conversational. It can answer questions about products, services, policies, pricing — anything you configure.
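To show the shape of that exchange, here is a hedged sketch of how an agent-side client might query such an endpoint. The path mirrors the one described above, and the JSON field names ("query", "answer", "confidence", "source") are illustrative assumptions rather than the protocol's published schema — consult the NLWeb specification for the normative details.

```python
import requests

# Illustrative only: endpoint path and field names are assumptions for this
# sketch, not the normative NLWeb schema.
resp = requests.post(
    "https://example.com/.well-known/nlweb",
    json={"query": "What sizes does the blue running shoe come in?"},
    timeout=10,
)
resp.raise_for_status()
data = resp.json()

print(data.get("answer"))      # direct answer sourced from the site itself
print(data.get("confidence"))  # e.g. 0.95
print(data.get("source"))      # canonical URL the agent can cite
```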
Adoption is still early. Roughly 3% of commercial websites have NLWeb endpoints, concentrated in tech and SaaS. But Microsoft's backing and the protocol's alignment with how AI agents actually work suggest rapid growth through 2026-2027.
Layer 3: MCP — The Action Layer (Act)
The Model Context Protocol, built by Anthropic, completes the stack. Where llms.txt lets AI read and NLWeb lets AI understand, MCP lets AI agents take action on your behalf — booking appointments, checking inventory, initiating purchases, managing accounts.
MCP provides a standardized interface for AI agents to interact with external systems. For website owners, this means exposing specific capabilities — your booking system, your inventory API, your customer service tools — through a protocol that any AI agent can use safely and predictably.
The key word is "safely." MCP includes built-in permission models, capability declarations, and transaction boundaries. An AI agent can't accidentally delete your database. It can only perform the actions you explicitly expose, with the constraints you define.
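To make "explicitly exposed actions" concrete, below is a minimal sketch of an MCP server using the FastMCP helper from Anthropic's official Python SDK (the mcp package). The server name, tool, and in-memory data are hypothetical stand-ins for a real booking backend.

```python
from mcp.server.fastmcp import FastMCP

# Hypothetical server exposing one narrowly scoped, read-only capability.
# An agent can only call the tools declared here — nothing else.
mcp = FastMCP("appointments")

@mcp.tool()
def check_availability(service: str, date: str) -> list[str]:
    """Return open appointment slots for a service on a given date (YYYY-MM-DD)."""
    # Placeholder: a real implementation would query your booking backend.
    open_slots = {"consultation": ["09:00", "11:30", "15:00"]}
    return open_slots.get(service, [])

if __name__ == "__main__":
    mcp.run()  # serves the declared tools over stdio by default
```

Because the agent can only invoke tools the server declares, write operations such as creating or cancelling bookings would be separate, explicitly exposed tools with their own constraints.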
MCP adoption is concentrated in developer tools and platforms today, with major integrations across IDE tools, cloud platforms, and SaaS products. For most website owners, MCP is the third priority — you need llms.txt and NLWeb first. But for transactional sites (e-commerce, booking, SaaS), MCP readiness will quickly become a competitive advantage.
Three Layers of AI Communication
These three protocols aren't competing. They're complementary layers in a single stack. Each serves a distinct function, and implementing them in order creates a natural progression from basic AI visibility to full agentic integration.
The implementation order matters. Start with llms.txt (hours, not days). Add NLWeb when you're ready for conversational AI interaction. Implement MCP when you need AI agents to take transactional actions on your site. Most sites in 2026 should focus on layers one and two.
Current Adoption and What It Means
Let's be honest about where things stand. These protocols are early. But "early" is exactly where the opportunity lives.
llms.txt has the broadest adoption at roughly 7% of major sites, driven by its zero-cost implementation. NLWeb sits at approximately 3%, growing among tech-forward companies. MCP adoption among public websites is harder to measure since it's typically implemented as API integrations rather than public endpoints, but major platforms including Slack, GitHub, and cloud providers have launched MCP servers.
The pattern mirrors early HTTPS adoption. In 2014, fewer than 30% of sites used HTTPS. Google's ranking signal announcement triggered a wave, and within four years it was essentially universal. AI protocols are following the same curve, just faster. The trigger events — AI search reaching mainstream adoption, Google potentially incorporating protocol signals — could compress the timeline dramatically.
- llms.txt: ~7.2% adoption, fastest growing, lowest barrier to entry
- NLWeb: ~3% adoption, concentrated in tech/SaaS, requires backend work
- MCP: Growing through platform integrations, most relevant for transactional sites
- Combined: fewer than 10% of sites have implemented any AI protocol
- Historical parallel: HTTPS went from 30% to 95% in four years after Google's signal
Which Protocol Should You Implement First?
The answer is almost always llms.txt first. It's the fastest to implement, requires no infrastructure changes, and provides immediate value. You can create and deploy an llms.txt file in 15 minutes with our guide.
NLWeb should be your second priority if you have content that AI agents would want to query — product catalogs, service descriptions, documentation, FAQs. The implementation requires a backend endpoint, but the payoff is substantial: instead of AI guessing about your content, it asks you directly.
MCP is your third priority and is most relevant if you operate a transactional platform. If customers book, buy, or subscribe through your site, MCP readiness will matter. But get the foundation right first.
The sites that implement all three layers will be the ones AI agents trust, recommend, and transact through. That's not a prediction — it's the architecture these protocols are explicitly designed to enable. Check your current protocol readiness and start building your agentic web presence today.
Frequently Asked Questions
Do I need to implement all three protocols (llms.txt, NLWeb, MCP)?
Not immediately. Start with llms.txt — it takes minutes and provides immediate value. Add NLWeb when you need conversational AI interaction with your content. MCP is primarily for transactional sites that need AI agents to perform actions like booking or purchasing.
Which AI platforms support these protocols?
llms.txt is recognized by ChatGPT, Perplexity, Claude, and most major AI platforms. NLWeb has growing support, backed by Microsoft. MCP was created by Anthropic and has been adopted by major platforms including development tools, cloud providers, and SaaS products.
How long does it take to implement the full protocol stack?
llms.txt can be deployed in under an hour. NLWeb typically requires a few days of backend development depending on your content complexity. MCP implementation varies widely based on what actions you want to expose, from days for simple read-only access to weeks for full transactional integration.