Add llms.txt to Your Site
TL;DR
An llms.txt file is a simple text file at your site's root that tells large language models which of your pages matter most. It acts as a curated guide for AI systems, pointing them to your best content and explaining how to use it.
Last updated: 2026-03-09
What llms.txt Does
While robots.txt tells crawlers what they can access, llms.txt tells language models what they should focus on. It is a protocol designed specifically for the AI era.
The file sits at https://yourdomain.com/llms.txt and provides a structured summary of your site's most important content. It includes your site name, a brief description, and a curated list of URLs with short descriptions of what each page covers.
Why does this matter? AI systems face a discovery problem. Even if they can crawl your entire site, they do not know which pages are your best, most authoritative content. The llms.txt protocol solves this by giving AI a table of contents written specifically for machines. Sites that adopt it give AI a fast path to their most citable content.
Creating Your llms.txt File
The format is straightforward. Start with a markdown heading for your site name, followed by a brief description. Then list your key URLs, each as a markdown link with a short explanatory note.
The file uses standard markdown syntax. Section headings organize your URLs by topic. Each URL entry is a list item with a linked title and a colon-separated description. Keep descriptions concise — one sentence explaining what the page covers and why it is useful.
The example below shows a complete llms.txt file for a fictional SaaS company. Notice how it prioritizes the most important pages rather than listing every page on the site. Quality matters more than quantity here.
Complete llms.txt example

```markdown
# Acme Project Tools

> Acme provides project management software for mid-market teams. We help companies plan, track, and deliver projects on time.

## Core Pages

- [About Acme](https://www.acmetools.com/about): Company overview, founding story, and mission statement
- [Product Overview](https://www.acmetools.com/product): Full feature list and capabilities of the Acme platform
- [Pricing](https://www.acmetools.com/pricing): Current plans, pricing tiers, and feature comparison

## Documentation

- [Getting Started Guide](https://www.acmetools.com/docs/getting-started): Step-by-step onboarding for new users
- [API Reference](https://www.acmetools.com/docs/api): Complete REST API documentation with examples
- [Integrations](https://www.acmetools.com/docs/integrations): Supported third-party integrations and setup guides

## Resources

- [Blog](https://www.acmetools.com/blog): Articles on project management best practices
- [Case Studies](https://www.acmetools.com/case-studies): Customer success stories with measurable outcomes
- [Changelog](https://www.acmetools.com/changelog): Latest product updates and feature releases
```
Choosing Which URLs to Include
Your llms.txt file should not be a copy of your sitemap. It is a curated list of your strongest, most citable content. Think of it as a highlight reel, not a complete archive.
Start with your core pages: homepage, about, product or service pages, and pricing. These are the pages AI is most likely to need when answering factual questions about your company.
Next, add your best educational content: cornerstone blog posts, definitive guides, documentation pages, and FAQ sections. Prioritize content that answers questions clearly and authoritatively. AI systems look for sources they can cite, so choose pages with clear, factual, well-structured answers.
Finally, add trust-building pages: case studies, team pages, and press mentions. These support the authority signals that make AI systems more likely to trust and cite your content.
A good llms.txt file has 15 to 50 URLs. Fewer than 10 and you are not giving AI enough to work with. More than 100 and you are diluting the signal.
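These size guidelines are easy to check mechanically. The sketch below counts the link entries in a draft llms.txt and applies the bounds above; it assumes entries follow the standard `- [title](url): description` list-item format shown in the example.

```python
import re


def count_entries(llms_txt: str) -> int:
    """Count markdown link list items of the form '- [title](url)'."""
    return len(re.findall(r"^\s*-\s*\[[^\]]+\]\([^)]+\)", llms_txt, re.MULTILINE))


def size_verdict(llms_txt: str) -> str:
    """Apply the 15-to-50 guideline with the hard bounds mentioned above."""
    n = count_entries(llms_txt)
    if n < 10:
        return f"{n} URLs: too sparse; add more citable pages"
    if n > 100:
        return f"{n} URLs: signal is diluted; trim to your strongest content"
    return f"{n} URLs: within a reasonable range"
```

Because it only inspects the text, this check can run in CI before each deploy.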
Deploying the File
Save your file as llms.txt and upload it to the root of your web server so it is accessible at https://yourdomain.com/llms.txt. It must live at the root; AI systems will not discover it in a subdirectory.
If you use a static site generator like Next.js, place the file in your public/ directory. For WordPress, you can either upload it via FTP to your site root or use a plugin that manages static files. For other CMS platforms, check your documentation for how to serve static files from the root.
Make sure the file is served with a text/plain or text/markdown content type. Most web servers handle this correctly by default for .txt files. If you are behind a CDN, clear your cache after uploading so the new file is served immediately.
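One way to verify the content type is a quick HEAD request. This is a minimal sketch using Python's standard library; the domain is a placeholder, and the accepted-type list simply mirrors the guidance above.

```python
from urllib.request import Request, urlopen

# Content types the guidance above considers acceptable.
ACCEPTED_TYPES = ("text/plain", "text/markdown")


def content_type_ok(header_value: str) -> bool:
    """Accept text/plain or text/markdown, ignoring charset parameters."""
    media_type = header_value.split(";")[0].strip().lower()
    return media_type in ACCEPTED_TYPES


def llms_txt_served_ok(url: str) -> bool:
    """HEAD the file and confirm a 200 with an acceptable content type."""
    request = Request(url, method="HEAD")
    with urlopen(request) as response:
        return response.status == 200 and content_type_ok(
            response.headers.get("Content-Type", "")
        )


# llms_txt_served_ok("https://yourdomain.com/llms.txt")  # placeholder domain
```

Run it once after deploying and again after any CDN cache purge, since a stale edge node can keep serving the old headers.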
Some sites also create an llms-full.txt file with more detailed content for AI systems that want deeper context. This is optional but recommended if you have extensive documentation or product pages.
Testing Your llms.txt
After deploying, open https://yourdomain.com/llms.txt in your browser and verify it loads correctly. Check that all URLs in the file are valid and return 200 status codes. Broken links in your llms.txt undermine credibility.
Run an AgentReady™ scan to see the impact on your AI protocols score. The scan detects the presence of llms.txt and evaluates its quality, including the number of URLs, description completeness, and formatting correctness.
You can also test manually by asking an AI system about your company or product. While there is no guarantee that any specific AI will use your llms.txt immediately, over time, sites with this file see improved AI visibility because crawlers have a clear path to important content.
Review and update your llms.txt quarterly. As you publish new content, add your best pages and remove any that have become outdated or redirected. A stale llms.txt is better than none, but a current one is best.
Frequently Asked Questions
Is llms.txt an official standard?
It is an emerging protocol, not a formal web standard like robots.txt. It was proposed in 2024 and has gained adoption among AI-forward companies. While not all AI systems support it yet, its adoption is growing and early implementation gives you a head start.
How is llms.txt different from a sitemap?
A sitemap lists every page on your site for search engine crawlers. llms.txt is a curated list of your best pages specifically for AI language models. It includes descriptions and context that help AI understand what each page offers, not just its URL.
Can llms.txt hurt my site in any way?
No. It is purely informational. It does not grant any special access or override your robots.txt. If an AI crawler is blocked by robots.txt, llms.txt will not bypass that restriction. It simply helps AI systems prioritize your content once they have access.