How to Create the Perfect llms.txt File (With Templates)
The llms.txt file tells AI models what your site is about and where to find key content. Here is exactly how to create one, with copy-paste templates for every site type.
Founder & CEO at AgentReady
What Is llms.txt and Why Does It Exist?
The web was built for humans and search engine crawlers. Neither of those audiences needed a plain-language overview of what a website is about — humans could see it, and crawlers could index it. AI language models need something different. They need a concise, structured summary they can process before deciding how to use your site’s content.
That is exactly what llms.txt provides. It is a plain text file (Markdown-formatted) served at your domain root that answers three questions for AI models: What is this site? What does it offer? Where are the key pages?
The concept was proposed by Jeremy Howard in late 2024 and has rapidly gained traction. Our protocol adoption research shows that llms.txt is the fastest-adopted AI protocol, with over 15% of top-10K websites now serving one. The full protocol landscape covers how llms.txt fits alongside NLWeb and MCP.
AgentReady™ checks for llms.txt as part of the AI Protocols factor. Sites with a well-structured llms.txt file consistently score higher on AI comprehension metrics.
How AI Models Actually Use Your llms.txt
Understanding the consumption flow helps you write a better file. When an AI model encounters your domain — either through a user query or during a research task — it follows a predictable sequence.
First, it checks robots.txt to confirm it has crawl permission. Then, if it is looking for site-level context (rather than a specific page), it checks for llms.txt at the domain root. If the file exists, the model parses it to understand your site’s purpose, structure, and key pages before deciding which pages to crawl in depth.
This is important: llms.txt does not replace deep crawling. It supplements it. Think of it as a table of contents that helps the model navigate your site intelligently rather than randomly crawling pages.
The llms.txt specification also supports an extended version at /llms-full.txt. The base llms.txt should be concise (under 500 words). The extended version can include full content from key pages, which is useful for AI models that prefer to consume comprehensive context in a single request.
Diagram: How AI Models Use llms.txt
The llms.txt File Structure (Specification)
The llms.txt format follows a simple Markdown structure. The spec is intentionally minimal to maximize adoption. Here are the key elements:
The file starts with an H1 heading (#) containing your site or company name. This is followed by a blockquote (>) with a one-line description of what your site does.
After the description, you include sections organized by H2 headings (##). The most common sections are:
## Docs — links to your main documentation or content pages.
## Optional — supplementary links that provide useful but non-essential context.
## Blog or ## Resources — links to frequently referenced content.
Each link is a standard Markdown link followed by a colon-separated description: - [Page Title](https://yoursite.com/page): Brief description of what this page covers.
Keep the core llms.txt under 500 words. Be specific in your descriptions — do not just list page titles. Tell the AI model what it will find on each page.
# Your Company Name
> One-sentence description of what your company does and who it serves.
## Docs
- [Product Overview](https://yoursite.com/product): Complete overview of the product, features, and pricing.
- [Getting Started Guide](https://yoursite.com/docs/getting-started): Step-by-step setup instructions for new users.
- [API Reference](https://yoursite.com/docs/api): Full API documentation with endpoints, parameters, and examples.
## Optional
- [About Us](https://yoursite.com/about): Company history, team, and mission.
- [Case Studies](https://yoursite.com/case-studies): Customer success stories with measurable results.
- [Blog](https://yoursite.com/blog): Industry analysis and product updates.
Basic llms.txt structure
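Because the format is this strict, it maps cleanly onto a small data structure. The sketch below is a hypothetical parser (all type and function names are our own, not part of the spec) that reads an llms.txt file into its title, description, and sections:

```typescript
interface LlmsLink { title: string; url: string; description: string; }
interface LlmsSection { heading: string; links: LlmsLink[]; }
interface LlmsDoc { title: string; description: string; sections: LlmsSection[]; }

function parseLlmsTxt(text: string): LlmsDoc {
  const doc: LlmsDoc = { title: "", description: "", sections: [] };
  let current: LlmsSection | null = null;
  for (const raw of text.split("\n")) {
    const line = raw.trim();
    if (line.startsWith("# ") && !doc.title) {
      // H1: site or company name
      doc.title = line.slice(2).trim();
    } else if (line.startsWith("> ") && !doc.description) {
      // Blockquote: one-line site description
      doc.description = line.slice(2).trim();
    } else if (line.startsWith("## ")) {
      // H2: start a new section (Docs, Optional, ...)
      current = { heading: line.slice(3).trim(), links: [] };
      doc.sections.push(current);
    } else if (current) {
      // Link line: - [Title](url): description
      const m = line.match(/^- \[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?$/);
      if (m) {
        current.links.push({ title: m[1], url: m[2], description: m[3] ?? "" });
      }
    }
  }
  return doc;
}
```

If a file parses into an empty title, description, or section list, that is usually a sign the structure above was not followed.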
Templates by Site Type
Different site types need different llms.txt approaches. Below are ready-to-use templates for the five most common site types. Copy the template that matches your site, fill in your specifics, and deploy. Each template is designed to provide maximum signal to AI models with minimum verbosity.
- SaaS: Lead with product overview, docs, pricing, integrations, and security
- E-commerce: Lead with product catalog structure, shipping info, returns policy, and top categories
- Blog/Publisher: Lead with topic coverage areas, editorial standards, and author credentials
- Agency/Consultancy: Lead with services, case studies, team expertise, and industries served
- Local Business: Lead with services, location, hours, reviews/testimonials, and service area
# [Your SaaS Product]
> [Product name] is a [category] platform that helps [target audience] [primary benefit]. Founded in [year], serving [number]+ customers.
## Docs
- [Product Overview](https://yoursite.com/product): Core features, use cases, and how the platform works.
- [Pricing](https://yoursite.com/pricing): Plans, pricing tiers, and feature comparison.
- [Documentation](https://yoursite.com/docs): Technical documentation, setup guides, and API reference.
- [Integrations](https://yoursite.com/integrations): Supported integrations and partner platforms.
- [Security](https://yoursite.com/security): Security practices, certifications, and compliance.
## Optional
- [About](https://yoursite.com/about): Company background, team, and mission.
- [Case Studies](https://yoursite.com/customers): Customer stories with measurable outcomes.
- [Blog](https://yoursite.com/blog): Product updates, industry analysis, and best practices.
- [Changelog](https://yoursite.com/changelog): Recent product updates and new features.
- [Status](https://status.yoursite.com): Current platform status and uptime history.
SaaS Product template
E-Commerce & Local Business Templates
E-commerce and local business sites have distinct needs. An e-commerce llms.txt should help AI models understand your product catalog structure, while a local business llms.txt needs to communicate location, services, and service area clearly.
For e-commerce, prioritize your category pages over individual products. AI models querying about your store typically want to understand what you sell and how your catalog is organized, not details about a specific SKU. Include your returns policy and shipping information since those are among the most common AI-generated queries about e-commerce sites.
For local businesses, geographic context is critical. AI models frequently answer location-based queries (“best plumber near me”) and your llms.txt should explicitly state your service area, locations, and hours.
# [Your Store Name]
> [Store name] is an online retailer specializing in [category]. Offering [number]+ products with [key differentiator, e.g., free shipping, handmade, sustainable sourcing].
## Docs
- [All Products](https://yourstore.com/collections/all): Complete product catalog.
- [Best Sellers](https://yourstore.com/collections/best-sellers): Most popular products by sales volume.
- [New Arrivals](https://yourstore.com/collections/new): Latest product additions.
- [Shipping Policy](https://yourstore.com/policies/shipping): Shipping methods, costs, and delivery times.
- [Returns & Exchanges](https://yourstore.com/policies/returns): Return policy, process, and refund timeline.
## Optional
- [About Us](https://yourstore.com/about): Brand story, values, and sourcing practices.
- [Size Guide](https://yourstore.com/size-guide): Sizing information and measurement instructions.
- [Reviews](https://yourstore.com/reviews): Customer reviews and ratings.
- [FAQ](https://yourstore.com/faq): Common questions about orders, products, and policies.
E-Commerce template
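Following the same pattern, a local-business version might look like this (every name, URL, and bracketed value below is a placeholder to fill in):

```
# [Your Business Name]
> [Business name] is a [service type] serving [city/region]. Located at [address], open [hours].

## Docs
- [Services](https://yourbusiness.com/services): Full list of services offered, with pricing.
- [Locations & Hours](https://yourbusiness.com/contact): Address, hours, phone, and directions.
- [Service Area](https://yourbusiness.com/service-area): Cities and neighborhoods served.

## Optional
- [Reviews](https://yourbusiness.com/reviews): Customer testimonials and ratings.
- [About](https://yourbusiness.com/about): Team, credentials, and history.
- [FAQ](https://yourbusiness.com/faq): Common questions about booking and pricing.
```

Local Business template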
Deployment: Where and How to Host Your llms.txt
Your llms.txt file must be accessible at https://yourdomain.com/llms.txt. It should return a 200 status code with a text/plain or text/markdown content type. Do not put it in a subdirectory or behind authentication.
Here are platform-specific deployment instructions for the most common setups:
WordPress: Upload the file to your site root or use a plugin like Yoast SEO (which now supports llms.txt) to manage it. Alternatively, add a rewrite rule in your .htaccess file to serve a file from wp-content/uploads/llms.txt.
Shopify: Add the file through Settings → Files, then create a URL redirect from /llms.txt to the file URL. Alternatively, use a Shopify app that supports llms.txt deployment.
Vercel / Next.js: Place the file in your public/ directory. It will be automatically served at the root path.
Cloudflare Pages: Place the file in your build output’s root directory, or create a Cloudflare Worker that serves the content at /llms.txt.
Static Sites (Netlify, GitHub Pages): Place llms.txt in the root of your build output directory alongside your index.html.
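The Cloudflare Worker option mentioned above can be sketched as follows. This is a minimal illustration, not a production Worker: the content string is a placeholder, and a real deployment would export the handler object as the module's default export.

```typescript
// Placeholder llms.txt content; in practice, paste or generate your real file here
const LLMS_TXT = `# Example Co
> Example Co is a placeholder company used to illustrate a Worker response.

## Docs
- [Docs](https://example.com/docs): Placeholder documentation link.
`;

const worker = {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/llms.txt") {
      // Serve the file with the content type AI parsers expect
      return new Response(LLMS_TXT, {
        headers: {
          "content-type": "text/plain; charset=utf-8",
          "cache-control": "public, max-age=86400",
        },
      });
    }
    return new Response("Not found", { status: 404 });
  },
};

// In a real Worker module, add: export default worker;
```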
After deployment, verify by navigating to yourdomain.com/llms.txt in a browser. Confirm the content type header with curl -I yourdomain.com/llms.txt.
# Check that the file is accessible and returns correct headers
curl -I https://yourdomain.com/llms.txt
# Expected output should include:
# HTTP/2 200
# content-type: text/plain or text/markdown
# View the actual content
curl https://yourdomain.com/llms.txt
Verify your llms.txt deployment
Advanced: llms-full.txt and Dynamic Generation
The llms.txt specification includes support for an extended file at /llms-full.txt. While your base llms.txt is a concise summary (under 500 words), the full version can include complete content from your key pages. Some AI models, particularly those operating in research mode, prefer to consume a single comprehensive document rather than crawling multiple pages.
The llms-full.txt file follows the same Markdown format but expands each section with actual content. For a documentation site, this might mean including the full text of your getting started guide, API reference, and key concept pages.
For dynamic sites with frequently changing content (e-commerce catalogs, news sites, SaaS changelogs), consider auto-generating your llms.txt. Build a script or serverless function that pulls your current key pages, product categories, or latest articles and outputs a fresh llms.txt on each request or at a scheduled interval.
Keep the dynamic version cached with a reasonable TTL (1–24 hours). AI crawlers do not need real-time updates, and regenerating on every request wastes server resources.
// app/llms.txt/route.ts (Next.js App Router)
import { NextResponse } from "next/server";

export async function GET() {
  const content = `# Your Company Name
> Brief description of your company and what it does.
## Docs
- [Product Overview](https://yoursite.com/product): Core features and capabilities.
- [Documentation](https://yoursite.com/docs): Technical docs and guides.
- [Pricing](https://yoursite.com/pricing): Plans and pricing information.
## Blog
${await getRecentPosts()}
## Optional
- [About](https://yoursite.com/about): Team and company background.
- [Contact](https://yoursite.com/contact): How to reach us.
`;
  return new NextResponse(content, {
    headers: {
      "Content-Type": "text/plain; charset=utf-8",
      "Cache-Control": "public, max-age=86400",
    },
  });
}

async function getRecentPosts(): Promise<string> {
  // Fetch your 5 most recent blog posts.
  // fetchRecentBlogPosts is your own CMS helper; implement it for your data source.
  const posts = await fetchRecentBlogPosts(5);
  return posts
    .map((p) => `- [${p.title}](${p.url}): ${p.description}`)
    .join("\n");
}
Next.js API route for dynamic llms.txt generation
Common Mistakes to Avoid
After reviewing thousands of llms.txt files through AgentReady scans, these are the patterns that consistently reduce effectiveness:
Mistake 1: Writing for humans instead of models. Your llms.txt is not a marketing brochure. Skip the superlatives and buzzwords. AI models need factual descriptions, not persuasive copy. Instead of “The world’s most innovative platform,” write “A project management platform for remote teams with time tracking, task boards, and reporting.”
Mistake 2: Linking to every page. More links does not mean more coverage. Curate. Link to the 10–15 most important pages that represent your site’s core content. AI models will discover deeper pages through crawling.
Mistake 3: Stale content. If your llms.txt references pages that return 404 errors or describes products you no longer offer, it damages your credibility signal. Set a reminder to review the file quarterly.
Mistake 4: Missing the description block. The blockquote description after your H1 heading is the single most important line in the file. It is what AI models use for site-level classification. Write it carefully.
Mistake 5: Wrong content type. If your server returns text/html instead of text/plain, some AI parsers will fail. Verify your content type headers after deployment.
- Do not use marketing language — be factual and specific
- Do not link to more than 15–20 pages — curate for quality
- Do not forget to update when your site changes
- Do not skip the blockquote description line
- Do not serve with incorrect content-type headers
- Do validate all links resolve to 200 status codes
- Do include brief descriptions for every linked page
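The link and content-type checks in the list above are easy to script. Here is a minimal sketch (assuming Node 18+ for the global fetch API; function names are our own) that fetches a site's llms.txt, verifies the response headers, and confirms every linked page returns a 200:

```typescript
// Extract all Markdown link URLs from an llms.txt body
function extractUrls(text: string): string[] {
  return [...text.matchAll(/^- \[[^\]]+\]\(([^)]+)\)/gm)].map((m) => m[1]);
}

async function validateLlmsTxt(domain: string): Promise<string[]> {
  const problems: string[] = [];
  const res = await fetch(`https://${domain}/llms.txt`);
  if (res.status !== 200) problems.push(`llms.txt returned ${res.status}`);

  // Some AI parsers fail on text/html; expect text/plain or text/markdown
  const type = res.headers.get("content-type") ?? "";
  if (!type.startsWith("text/plain") && !type.startsWith("text/markdown")) {
    problems.push(`unexpected content type: ${type}`);
  }

  // Every linked page should resolve to a 200
  const body = await res.text();
  for (const url of extractUrls(body)) {
    const page = await fetch(url, { method: "HEAD" });
    if (page.status !== 200) problems.push(`${url} returned ${page.status}`);
  }
  return problems;
}

// Run only when a domain is passed on the command line
if (process.argv[2]) {
  validateLlmsTxt(process.argv[2]).then((p) =>
    console.log(p.length ? p.join("\n") : "All checks passed"),
  );
}
```

Running this quarterly, or in CI, catches stale links before AI crawlers do.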
Frequently Asked Questions
What is an llms.txt file?
An llms.txt file is a plain text or Markdown file served at your domain root (e.g., yoursite.com/llms.txt) that provides AI language models with a structured summary of your website — who you are, what you do, and where to find your most important content.
Is llms.txt an official web standard?
llms.txt is a community-driven proposal, not an official W3C or IETF standard. However, it has gained significant adoption. Major AI providers acknowledge and consume llms.txt files when they are present, making it a de facto standard.
Does llms.txt replace robots.txt?
No. robots.txt controls crawl access (which bots can visit which pages). llms.txt provides comprehension context (what your site is about and how to navigate it). You need both. Think of robots.txt as the door and llms.txt as the welcome guide.
How often should I update my llms.txt file?
Update it whenever you add or remove major sections, products, or services. For most sites, a quarterly review is sufficient. If you publish content frequently, consider auto-generating it from your CMS sitemap.
SEO veteran with 15+ years leading digital performance at 888 Holdings, Catena Media, Betsson Group, and Evolution. Now building the AI readiness standard for the web.
Related Articles
NLWeb, MCP, and llms.txt: The Three Protocols That Will Define the Agentic Web
The agentic web runs on three protocol layers. llms.txt tells AI what to read. NLWeb lets AI ask questions. MCP lets AI take action. Here's how they fit together and which one your site needs first.
AI Protocol Adoption: Where the Web Stands in March 2026
We measured adoption rates for llms.txt, NLWeb, and MCP across 5,000 websites. The numbers are tiny but growing fast, with llms.txt doubling since December 2025.
The Complete Guide to Making Your Website AI-Ready in 2026
Everything you need to know about making your website visible to AI systems in 2026 — the 8 factors that determine whether AI agents cite your content or skip it entirely.