llms.txt
A proposed web standard (introduced by Jeremy Howard in late 2024) that provides AI systems with a curated, Markdown-formatted summary of a website's content. The file lives at /llms.txt in a site's root directory, like robots.txt, but friendlier. It contains a structured overview: site title, description, and categorized links to key pages, each with a brief note on what it contains.
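The proposal specifies a simple Markdown shape: an H1 title, an optional blockquote summary, then H2 sections containing annotated link lists. A minimal sketch of such a file (the site name, URLs, and descriptions below are invented for illustration):

```markdown
# Example Docs

> Hypothetical developer documentation for the Example API.

## Docs

- [Quickstart](https://example.com/quickstart.md): install and make a first request
- [API reference](https://example.com/api.md): endpoints, parameters, error codes

## Optional

- [Changelog](https://example.com/changelog.md): release history
```

The `Optional` section name comes from the proposal itself: it marks links an AI tool may skip when its context window is tight.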
The idea is elegant: instead of making AI systems parse your full HTML, hand them a curated summary on a platter. As of early 2026, adoption is real but concentrated among developer-documentation sites — community directories list roughly 784 implementations, with 105 found in the Majestic Million. Adoption and usage are different things, though. Just because you built the buffet doesn't mean anyone's eating.
Why it matters for writers: If you produce content for the web, llms.txt is one way to control what AI systems see when they visit your site. Instead of letting a crawler extract whatever it finds from your HTML, you provide a curated summary. Whether this currently makes a practical difference is one of the open questions in the llms.txt Access Paradox research.
Related terms: llms-full.txt · robots.txt · Web Application Firewall · Generative Engine Optimization