Why llms.txt exists
The llms.txt proposal from Jeremy Howard (Answer.AI) addresses a gap: AI engines crawl the web but have no signal for which of your pages you'd most like them to cite. A sitemap lists everything; llms.txt lists what matters most, with context. It's analogous to a curated reading list versus an index.
What to put in it
Start with your highest-value pages: flagship guides, original research, key tools, definition pages. Group them by category (Tools, Guides, Research) and give each link a one-line description so the AI understands the page's topic without crawling it. The proposed format is plain Markdown: an H1 with the site name, a blockquote summary, then H2 sections containing link lists. Keep the whole file under 100KB; this is a directory, not a content dump.
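A minimal sketch of what such a file might look like, following the structure in the llms.txt proposal (the site name, URLs, and descriptions below are invented placeholders):

```markdown
# Example Analytics Co

> We build open-source web analytics tools and publish original research
> on search and AI-engine referral traffic.

## Tools

- [Traffic Explorer](https://example.com/tools/traffic-explorer): free dashboard for analyzing AI-engine referrals
- [llms.txt Validator](https://example.com/tools/llmstxt-validator): checks your llms.txt for size and link formatting

## Guides

- [The GEO Guide](https://example.com/guides/geo): end-to-end setup for generative engine optimization
- [Citation Tracking 101](https://example.com/guides/citations): how to measure where AI engines cite you

## Research

- [2026 AI Referral Report](https://example.com/research/ai-referrals-2026): original dataset on AI-driven traffic share
```

Each entry pairs a link with just enough context for an engine to decide whether the page answers a query, without fetching it first.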
Does it actually work?
Adoption is early (only ~5% of sites have one as of early 2026), but engines that respect the file, including Anthropic's Claude, Perplexity, and OpenAI's research crawlers, use it for discovery. The cost is one file; the upside is asymmetric. The GEO guide covers the full setup.