# AI Protocols

**TL;DR:** AI Protocols (10% weight) evaluates whether your site has adopted emerging standards designed specifically for AI interaction. These include llms.txt, AI-specific meta tags, and machine-readable permission signals.

*Last updated: 2026-03-09*
## What It Measures

The AI Protocols factor checks whether your site implements standards designed specifically for AI system interaction. The most significant is llms.txt, a file at the root of your domain that provides structured information for large language models, including a site description, content overview, and guidance on how AI should interpret your pages. The factor also evaluates AI-specific meta tags that communicate permissions and preferences to AI crawlers, such as whether AI systems may use your content for training, citation, or summarization. Additionally, it assesses whether your site provides machine-readable signals about content licensing, update frequency, and preferred citation format. These protocols are relatively new, and adoption is still low across the web. This means early adopters gain a meaningful competitive advantage in AI visibility.

## Why It Matters for AI
Traditional web standards like robots.txt and sitemap.xml were designed for search engine crawlers. AI systems have different needs: they do not just index pages for keyword matching; they read, interpret, and synthesize content. AI protocols give these systems explicit guidance on how to interact with your site. A well-crafted llms.txt file tells AI systems what your site is about, which pages are most important, and how to properly cite your content. AI-specific meta tags communicate your preferences for AI usage, reducing uncertainty for both you and the AI system. Sites that adopt these protocols are easier for AI to work with, which translates to better representation in AI-generated answers. At 10% of your total score, this factor rewards forward-looking sites. See Bot Access for the complementary access layer and How Scoring Works for the complete model.

## How to Check Yours
Visit yoursite.com/llms.txt in your browser. If you get a 404, you do not have the most important AI protocol in place. If the file exists, review its contents. A basic llms.txt file should include a site title, description, and links to key content. More advanced versions include content categorization, update frequency, and citation preferences. Next, view the source of your key pages and search for AI-specific meta tags. These may include tags with names like ai-content-usage, llm-instructions, or publisher-specific AI directives. Your AgentReady™ scan checks all known AI protocols and reports which ones are implemented, partially implemented, or missing entirely.

### Example: Basic llms.txt

```markdown
# Example Inc
> Example Inc helps businesses optimize for AI discovery.

## About
Example Inc provides tools and consulting for AI-powered search readiness.

## Key Pages
- [AI Readiness Guide](https://example.com/guides/ai-readiness): Comprehensive guide to AI optimization
- [Product Overview](https://example.com/product): Main product features and pricing
- [Blog](https://example.com/blog): Latest articles on AI and search

## Contact
- Website: https://example.com
- Email: hello@example.com
```
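The manual review described above can also be scripted. Here is a minimal sketch in Python using only the standard library; the patterns it looks for mirror the example file's conventions (an H1 title, a `>` description line, and `- [title](url)` link entries) rather than any formal specification:

```python
# Sketch: audit the body of an llms.txt file for the basic elements
# described above. The patterns follow the Markdown-based convention
# shown in the example (H1 title, "> " description, linked list items);
# they are illustrative, not a formal spec.
def audit_llms_txt(text: str) -> dict:
    lines = [line.rstrip() for line in text.splitlines()]
    return {
        "title": any(line.startswith("# ") for line in lines),
        "description": any(line.startswith("> ") for line in lines),
        "key_links": any(line.lstrip().startswith("- [") for line in lines),
    }
```

Fetch yoursite.com/llms.txt first (a 404 means the file is missing entirely) and pass the response body to this function; any `False` value points to a section worth adding.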
## How to Improve

Create a llms.txt file and place it at the root of your domain. Start with the basics: site name, a one-sentence description, and links to your most important pages. Use the standard Markdown-based format so AI systems can parse it consistently.

Add AI-specific meta tags to your page templates. At minimum, signal whether you permit AI citation and summarization. If you want AI systems to cite your content, say so explicitly. Many sites miss this step and leave AI systems uncertain about usage permissions.

Keep your llms.txt updated as your site structure changes. It should always reflect your current most important pages and content areas. For sites with complex information architectures, consider creating a llms-full.txt with more detailed content summaries. This factor pairs well with Bot Access for the Protocol Trifecta bonus.

## Staying Ahead of Emerging Standards
AI protocols are evolving quickly. New standards and conventions are being proposed and adopted by major AI providers on a regular basis. The AgentReady™ scoring system tracks these developments and incorporates new protocols as they gain meaningful adoption. Staying current with AI protocol developments gives your site an edge. Follow industry discussions, monitor AI provider documentation for new crawler guidelines, and review the Algorithm Changelog for updates to how this factor is evaluated. Early adoption of new protocols, even before they are widely used, signals to AI systems that your site is actively maintained and AI-aware.
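As one illustration of the meta-tag signals discussed under How to Improve, a page template might carry tags like the following. The names ai-content-usage and llm-instructions are emerging conventions mentioned earlier on this page, and the content values shown here are hypothetical; check each AI provider's current crawler documentation for the directives it actually honors:

```html
<!-- Illustrative AI-specific meta tags. Names and values are examples
     of emerging conventions, not a formal standard. -->
<meta name="ai-content-usage" content="citation: allowed; summarization: allowed; training: disallowed">
<meta name="llm-instructions" content="Cite as 'Example Inc' and link to the source page.">
```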
## Frequently Asked Questions

**Is llms.txt an official standard?**

llms.txt is an emerging convention that has gained significant traction among AI developers and website operators. While it is not governed by a formal standards body, it is recognized by multiple major AI systems and its adoption is growing rapidly.

**What happens if I do not implement any AI protocols?**

Your AI Protocols sub-score will be low, which reduces your overall score by up to 10 points. You will not trigger a floor for this alone, but you will miss the Protocol Trifecta bonus if you are strong in other protocol-related areas.

**Do AI protocols replace robots.txt?**

No. They complement it. robots.txt controls crawl access (covered under Bot Access). AI protocols like llms.txt provide guidance and context for AI systems that already have access. Both are important, and they work best together.
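The two layers can be verified together: confirm the crawler is allowed by robots.txt, then confirm llms.txt exists. A minimal robots.txt check using Python's standard library (GPTBot is just an example user agent; substitute the crawlers you care about):

```python
# Sketch: verify that a given AI crawler is allowed by robots.txt.
# Pair this with a fetch of /llms.txt to confirm both layers are in place.
from urllib import robotparser

def crawler_allowed(robots_txt: str, user_agent: str, path: str = "/") -> bool:
    """Return True if robots.txt permits user_agent to fetch path."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)
```

For example, `crawler_allowed(robots_body, "GPTBot")` returns False if the file contains a `User-agent: GPTBot` group with `Disallow: /`.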