Updated April 18, 2026

What is llms.txt?

llms.txt is a proposed standard (similar to robots.txt or sitemap.xml) that publishes a curated, AI-readable directory of your site's most important content for large language models. It lives at /llms.txt at your site's root and uses Markdown to list pages with brief descriptions. Adopting it makes it easier for ChatGPT, Perplexity, Claude, and similar engines to discover your best content for citation.


Why llms.txt exists

The llms.txt proposal from Jeremy Howard (Answer.AI) addresses a gap: AI engines crawl the web but have no signal for which of your pages you'd most like them to cite. A sitemap lists everything; llms.txt lists what matters most, with context. It's analogous to a curated reading list versus an index.

What to put in it

Start with your highest-value pages — flagship guides, original research, key tools, definition pages. Group them by category (Tools, Guides, Research). Add a one-line description for each so the AI understands the page's topic without crawling it. Keep it under 100KB total; this is a directory, not a content dump.
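A minimal sketch of what such a file might look like, following the proposal's conventions (an H1 site name, a blockquote summary, then H2 category sections of links). The site name, URLs, and descriptions below are invented placeholders:

```markdown
# Example Co

> Example Co publishes guides, tools, and original research on technical SEO.

## Guides
- [GEO Setup Guide](https://example.com/geo-guide): End-to-end walkthrough of generative engine optimization
- [llms.txt Explained](https://example.com/llms-txt): What the file is and how AI engines use it

## Tools
- [Page Analyzer](https://example.com/analyzer): Free analysis of how a page scores for AI citation

## Research
- [2026 Citation Study](https://example.com/citation-study): Original data on which page types AI engines cite
```

Each link's one-line description is what lets an engine judge the page's topic without fetching it, so keep descriptions specific rather than promotional.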

Does it actually work?

Adoption is early — only ~5% of sites have one as of early 2026 — but the engines that respect it (Anthropic's Claude, Perplexity, OpenAI's research crawlers) explicitly use it for discovery. The cost is one file; the upside is asymmetric. The GEO guide covers the full setup.
