# llms.txt
A proposed standard file (similar to robots.txt) that gives AI language models a structured overview of a website's content.
Just as robots.txt guides web crawlers, llms.txt is an emerging standard that offers AI systems a curated, machine-readable index of a site, helping them understand and navigate its content.
## How llms.txt works
An llms.txt file is placed at the root of your domain (e.g., yoursite.com/llms.txt) and contains:
- A brief description of your organization or website
- Links to key pages and content sections
- Structured navigation of your documentation, guides, and resources
- Context about what your site offers
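Under the proposed format, the file is plain Markdown: an H1 with the site name, a blockquote summary, and H2 sections containing link lists. A minimal sketch (the company name, URLs, and descriptions below are placeholders):

```markdown
# Example Inc.

> Developer tools for building and monitoring data pipelines.

## Docs

- [Quickstart](https://example.com/docs/quickstart): Install and run your first pipeline
- [API Reference](https://example.com/docs/api): Endpoints, parameters, and authentication

## Guides

- [GEO Basics](https://example.com/guides/geo): Making content discoverable to AI engines
```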
## Why llms.txt matters for GEO
- Discoverability: Helps AI engines find and understand your key content
- Structure: Provides a curated index of your most important pages
- Context: Gives AI models context about your brand and offerings
- Freshness: Can be dynamically generated to always reflect current content
## llms.txt vs robots.txt
| Aspect | robots.txt | llms.txt |
|---|---|---|
| Purpose | Controls crawler access | Guides content understanding |
| Audience | Search engine bots | AI language models |
| Content | Allow/disallow rules | Content index and descriptions |
| Format | Key-value directives | Markdown with links |
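The format difference in the last row is easiest to see side by side: robots.txt is a flat list of key-value directives (the paths below are placeholders), while llms.txt, as shown above, is ordinary Markdown with links.

```
User-agent: *
Disallow: /admin/
Allow: /
Sitemap: https://example.com/sitemap.xml
```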
## Implementing llms.txt
Many sites generate llms.txt dynamically to include the latest content. The file should be concise, well-organized, and focused on your most valuable content for AI consumption.
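Dynamic generation can be as simple as rendering the file from whatever page index your site already maintains. A minimal sketch in Python, assuming the proposed H1/blockquote/H2 layout (the site name, section layout, and URLs are illustrative, not part of any spec):

```python
# Sketch: build an llms.txt body from a page index.
# All titles, URLs, and notes here are placeholder assumptions.

def generate_llms_txt(site_name, description, sections):
    """Render llms.txt: an H1 title, a blockquote summary,
    then one H2 per section with a bulleted list of links."""
    lines = [f"# {site_name}", "", f"> {description}", ""]
    for heading, pages in sections.items():
        lines.append(f"## {heading}")
        for title, url, note in pages:
            lines.append(f"- [{title}]({url}): {note}")
        lines.append("")
    return "\n".join(lines)

sections = {
    "Docs": [
        ("Quickstart", "https://example.com/docs/quickstart",
         "Install and first steps"),
        ("API Reference", "https://example.com/docs/api",
         "Endpoints and parameters"),
    ],
    "Guides": [
        ("GEO Basics", "https://example.com/guides/geo",
         "Optimizing content for AI engines"),
    ],
}

content = generate_llms_txt(
    "Example Inc.",
    "Developer tools for building and monitoring data pipelines.",
    sections,
)
print(content)
```

In practice you would serve the result at /llms.txt from your web framework or write it out at build time, so the file always reflects your current content.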
