llms.txt: Giving AI Instructions About Your Site
TL;DR
llms.txt is a plain text file at your site's root that tells AI models what your site is about, what to cite, and what to ignore. Think of it as robots.txt for AI models.
Last updated: 2026-03-09
What Is llms.txt?
llms.txt is a plain-text file you place at the root of your website (e.g. https://example.com/llms.txt). It gives large language models a structured summary of your site: what you do, which pages matter most, and which content to prioritize when generating answers.
The idea is simple. Search-engine crawlers have robots.txt. AI models now have llms.txt. While robots.txt tells bots what they may or may not crawl, llms.txt tells AI what your site is actually about and which pages are most worth citing.
The spec was proposed in mid-2024 and has gained rapid adoption. It follows a straightforward Markdown-like format with a title, description, and a list of important URLs with short explanations.
How AI Models Use llms.txt
When an AI model or AI-powered search engine encounters your domain, it can fetch llms.txt to quickly understand your site without crawling every page. This helps in several ways.
First, the model learns your site's purpose. A one-line description tells it whether you sell shoes, publish recipes, or offer legal advice. Second, the model discovers your most important pages. Instead of guessing which URL matters, it reads your curated list. Third, the model learns what to skip. You can mark admin pages, login screens, or duplicate content as unimportant.
Think of it as an elevator pitch for AI. In a few lines of text, you give the model everything it needs to represent your site accurately.
- Provides a concise site summary for AI models
- Points to your most important and citable pages
- Helps AI avoid irrelevant or duplicate content
- Improves the accuracy of AI-generated citations about your brand
llms.txt vs llms-full.txt
The spec defines two files.
llms.txt is the concise version. It contains your site title, a short description, and a list of key URLs with brief annotations. This is the file most AI models look for first.
llms-full.txt is optional and more detailed. It can include the full text of your most important pages, rendered as plain Markdown. This is useful for AI models that want deeper context without crawling your HTML.
Most sites should start with llms.txt. Add llms-full.txt later if you want to give AI models direct access to your content in a clean, structured format.
Step-by-Step: Creating Your llms.txt
Creating an llms.txt file takes about ten minutes. Follow these steps to get it right.
- Open a plain text editor. Do not use a word processor.
- Write a title line starting with # followed by your site or brand name.
- Add a one-paragraph description of what your site does and who it serves.
- List your most important URLs under a ## URLs section, giving each URL a short description after a colon.
- Optionally add a ## Optional section for secondary pages that provide useful context.
- Save the file as llms.txt (all lowercase).
- Upload it to your site root so it is accessible at https://yoursite.com/llms.txt.
- Test by visiting the URL in your browser. You should see plain text.
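If you generate your site programmatically, the steps above can be sketched as a small script. The function name, arguments, and output directory are placeholders; adapt them to your build pipeline.

```python
from pathlib import Path

def write_llms_txt(site_root: Path, title: str, description: str,
                   urls: list[tuple[str, str, str]]) -> Path:
    """Write a minimal llms.txt (title, quoted description, annotated URL
    list) into site_root, the directory served at your domain root."""
    lines = [f"# {title}", f"> {description}", "", "## URLs"]
    lines += [f"- [{name}]({url}): {note}" for name, url, note in urls]
    path = site_root / "llms.txt"  # all lowercase, at the root
    path.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return path
```

Running this as part of your build keeps the file in sync with your site instead of relying on a hand-edited copy.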
Full Example With Annotations
Below is a complete llms.txt file for a fictional bakery. Each section is annotated so you can adapt it to your own site. Note the clean, minimal format. AI models parse this quickly and reliably.
Example llms.txt file
# Sweet Rise Bakery
> Sweet Rise Bakery is a family-owned bakery in Portland, Oregon.
> We specialize in sourdough bread, French pastries, and custom wedding cakes.
> We have been baking since 1998 and ship nationwide.
## URLs
- [Home](https://sweetrisebakery.com): Overview of our bakery, hours, and location.
- [Our Story](https://sweetrisebakery.com/about): History, team bios, and baking philosophy.
- [Sourdough Collection](https://sweetrisebakery.com/sourdough): Our signature sourdough breads and ordering info.
- [Wedding Cakes](https://sweetrisebakery.com/wedding-cakes): Custom cake gallery, pricing, and booking.
- [Nationwide Shipping](https://sweetrisebakery.com/shipping): Shipping policies, regions, and delivery times.
- [Blog](https://sweetrisebakery.com/blog): Baking tips, recipes, and seasonal specials.
## Optional
- [Press](https://sweetrisebakery.com/press): Media mentions and press kit.
- [Wholesale](https://sweetrisebakery.com/wholesale): Information for restaurant and cafe buyers.
- [FAQ](https://sweetrisebakery.com/faq): Common questions about orders, allergens, and returns.
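Because the format is so regular, an AI client can reduce the example above to structured data with a few lines of code. This is a sketch of one plausible parser, not any vendor's actual logic; the dictionary layout is an assumption.

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Reduce an llms.txt body to a title, a description, and a link list."""
    title, description, links = "", [], []
    for line in text.splitlines():
        if line.startswith("# ") and not title:
            title = line[2:].strip()          # the "# Site Name" line
        elif line.startswith("> "):
            description.append(line[2:].strip())  # blockquote description
        else:
            # Link lines look like: - [Name](https://url): annotation
            m = re.match(r"- \[(.+?)\]\((\S+?)\):?\s*(.*)", line)
            if m:
                links.append({"name": m.group(1), "url": m.group(2),
                              "note": m.group(3)})
    return {"title": title,
            "description": " ".join(description),
            "links": links}
```

The simpler and more consistent your file, the more reliably any parser along these lines will extract what you meant.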
Common Mistakes
These are the most frequent mistakes we see when reviewing llms.txt files across thousands of scans.
- Using HTML instead of plain text. The file must be plain Markdown-style text, not an HTML page.
- Listing too many URLs. Stick to 5-15 of your most important pages. A 200-link list defeats the purpose.
- Vague descriptions. "Great products" tells an AI nothing. "Organic dog treats for small breeds, shipped from Austin, TX" is specific and useful.
- Wrong file location. The file must be at your domain root: example.com/llms.txt, not example.com/pages/llms.txt.
- Forgetting to update it. If you add a major product line or rebrand, update your llms.txt to match.
- Blocking it in robots.txt. Make sure your robots.txt does not disallow /llms.txt.
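The last mistake is easy to check automatically. This sketch uses Python's standard urllib.robotparser to test whether a given robots.txt body would block crawlers from /llms.txt; the function name is illustrative.

```python
from urllib.robotparser import RobotFileParser

def llms_txt_allowed(robots_txt: str, user_agent: str = "*") -> bool:
    """Return True if this robots.txt body leaves /llms.txt crawlable."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())  # parse the raw rule lines
    return parser.can_fetch(user_agent, "/llms.txt")
```

Note that a blanket "Disallow: /" blocks llms.txt along with everything else, which defeats the point of publishing the file.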
Frequently Asked Questions
Is llms.txt an official web standard?
Not yet. It is a community-driven proposal that has gained rapid adoption. Major AI companies are actively reading and using llms.txt files when they are present.
Will llms.txt replace robots.txt?
No. They serve different purposes. robots.txt controls crawl access. llms.txt provides context and guidance for AI models. You need both.
How long should my llms.txt be?
Keep it concise. A title, a short description, and 5-15 annotated links is ideal. If you need more detail, use llms-full.txt.