llms.txt: a new standard for efficient AI crawling

Reports of heavy website traffic caused by AI systems and crawlers are increasing, driving up costs for website owners. At the same time, there's an emerging standard called llms.txt that lets these systems efficiently access important content and deep links through a single static text file.
At Tanner Lab, we generate llms.txt automatically at build time, directly from our translation files: tannerlab.ch/llms.txt
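As a rough sketch of what such a build step could look like: the snippet below assembles an llms.txt document from page metadata of the kind an i18n/translation file might hold. The data shape, page names, and example.com URLs are all illustrative assumptions, not Tanner Lab's actual pipeline.

```python
# Sketch: render an llms.txt-style document from page metadata at build time.
# PAGES and BASE_URL are hypothetical placeholders; a real build would load
# them from the site's translation/i18n files.

PAGES = {
    "about": {"title": "About us", "summary": "Who we are and what we do"},
    "services": {"title": "Services", "summary": "What we offer"},
}

BASE_URL = "https://example.com"  # placeholder domain


def build_llms_txt(site_name: str, pages: dict) -> str:
    """Render one markdown-style llms.txt body from page metadata."""
    lines = [f"# {site_name}", "", "## Pages", ""]
    for slug, meta in pages.items():
        lines.append(f"- [{meta['title']}]({BASE_URL}/{slug}): {meta['summary']}")
    return "\n".join(lines) + "\n"


if __name__ == "__main__":
    print(build_llms_txt("Example Site", PAGES))
```

Because the file is generated from the same source as the site's visible text, it stays in sync with content changes without manual editing.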
What is llms.txt?
llms.txt is a static text file that gives AI systems relevant information from a website in a structured, consolidated form - including links to important subpages. It reduces crawling overhead, server load, and uncontrolled access while improving data quality for AI applications.
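For illustration, the llms.txt proposal suggests a simple markdown-like layout: an H1 title, an optional one-line summary in a blockquote, and sections of annotated links. The site, sections, and URLs below are invented, not Tanner Lab's actual file:

```
# Example Site

> One-line summary of what the site offers.

## Docs

- [Getting started](https://example.com/docs/start): setup guide
- [API reference](https://example.com/docs/api): endpoint details

## Optional

- [Blog](https://example.com/blog): longer background articles
```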
Why it matters
- LLMs are here to stay; llms.txt provides content to AI in a resource-efficient, structured, and controlled manner.
- It helps AI engineers design smarter, leaner crawlers without hammering your site.
- A single, predictable source of truth simplifies future integrations.
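From the crawler side, a leaner client can fetch llms.txt first and visit only the pages it lists instead of spidering the whole site. A minimal parsing sketch, assuming markdown-style `[title](url)` links as in the examples above (a real crawler would also respect robots.txt and handle a missing file):

```python
import re

# Sketch: extract deep links from an llms.txt body so a crawler can fetch
# only the listed pages. The regex matches markdown-style "[title](url)" links.
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")


def extract_links(llms_txt: str) -> list[tuple[str, str]]:
    """Return (title, url) pairs found in an llms.txt document."""
    return LINK_RE.findall(llms_txt)


# Hypothetical document for demonstration.
sample = """# Example Site

## Pages

- [About us](https://example.com/about): who we are
- [Services](https://example.com/services): what we offer
"""

if __name__ == "__main__":
    for title, url in extract_links(sample):
        print(title, url)
```

A handful of targeted requests against a stable, predictable file replaces thousands of exploratory page fetches, which is exactly the resource saving the bullets above describe.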