What is llms.txt? How to Make Your Website AI-Readable

A complete guide to the llms.txt file standard -- what it is, why it was created, and how to use it to help AI models like ChatGPT, Claude, and Perplexity understand your website.

Last updated: February 25, 2026 · By Vida Together

llms.txt is a plain text file you place at the root of your website (at yoursite.com/llms.txt) that provides AI language models with a structured, Markdown-formatted summary of your site's content, purpose, and organization. Think of it as a welcome guide written specifically for AI -- it tells ChatGPT, Claude, Perplexity, and other AI models what your website is about, what pages matter most, and how to navigate your content. Unlike robots.txt, which tells crawlers what they cannot access, llms.txt tells AI models what they should understand. If you are working on AI Engine Optimization (AEO), adding an llms.txt file is one of the simplest and most impactful steps you can take to improve your website's AI visibility.

Key Takeaways

  1. llms.txt is a plain text file placed at your site's root that gives AI models a structured overview of your website's content and purpose.
  2. Created by Jeremy Howard (fast.ai, Answer.AI), llms.txt uses Markdown formatting to be readable by both humans and machines.
  3. llms.txt complements robots.txt and sitemap.xml -- it does not replace them. Each file serves a distinct purpose in your site's discoverability stack.
  4. AI-native search tools like ChatGPT, Claude, and Perplexity are already looking for llms.txt files to better understand websites they reference.
  5. Creating an llms.txt file takes less than 30 minutes and can significantly improve how AI models represent and cite your content.

What is llms.txt?

llms.txt is a proposed standard for a plain text file, formatted in Markdown, that website owners place at the root of their domain to help large language models (LLMs) understand their site. The concept was created by Jeremy Howard, the co-founder of fast.ai and founder of Answer.AI. Howard recognized a fundamental problem: when AI models encounter a website, they have no standardized way to quickly understand what the site is about, what its most important pages are, and how the content is organized. They can crawl HTML, but that is slow, noisy, and often cluttered with navigation elements, ads, and boilerplate.

llms.txt solves this by providing a single, clean, Markdown-formatted file that serves as a concise briefing document for AI. The file typically includes a site description, a list of key content areas, links to the most important pages, and optional guidance on how the AI should interpret or use the content. The name follows the longstanding convention of placing machine-readable files at a site's root -- just like robots.txt, sitemap.xml, humans.txt, and security.txt.

The key innovation of llms.txt is its format. Unlike sitemap.xml (which is XML) or robots.txt (which uses a specialized directive syntax), llms.txt uses Markdown -- a format that is simultaneously human-readable and trivially parseable by AI models. This dual readability means the file serves as documentation for both your team and the AI models that encounter your site.

How llms.txt Works

The mechanics of llms.txt are deliberately simple. You create a plain text file named llms.txt and place it at the root of your website so it is accessible at https://yoursite.com/llms.txt. The file is written in Markdown and follows a recommended structure:

  • Title (H1): The name of your website or organization, formatted as a top-level Markdown heading.
  • Site description: A brief, one-to-three paragraph overview of what your site is, what it does, and who it serves. This is the most important section -- it gives AI models immediate context.
  • Content sections (H2): Organized groups of links to your most important pages, each with a brief description. Common sections include documentation, blog posts, product pages, about pages, and guides.
  • Optional guidance: Instructions or notes for AI models, such as how to attribute content, which pages are most authoritative, or what topics the site covers in depth.

When an AI model encounters your site -- either through direct retrieval or as part of its web browsing capabilities -- it can fetch your llms.txt file and instantly gain a high-level understanding of your entire site. This is dramatically more efficient than crawling dozens or hundreds of individual pages. The AI can then make better decisions about which of your pages to cite, how to describe your business, and where to direct users for more information.

There is also an extended variant called llms-full.txt for sites that want to provide more detailed content. The standard llms.txt serves as a summary, while llms-full.txt can include the full text of key pages, more extensive descriptions, and deeper context. Think of llms.txt as the executive summary and llms-full.txt as the full report.

Example llms.txt File

Here is a complete example of what an llms.txt file looks like for a fictional SaaS company. You can adapt this structure for any type of website:

# Acme Project Management

> Acme is a project management platform designed for remote teams.
> We help teams of 5-500 organize tasks, track progress, and
> collaborate asynchronously. Founded in 2022, based in Austin, TX.

## Docs

- [Getting Started Guide](https://acme.com/docs/getting-started): Step-by-step onboarding for new users
- [API Reference](https://acme.com/docs/api): Complete REST API documentation
- [Integrations](https://acme.com/docs/integrations): Connect Acme with Slack, GitHub, Jira, and 40+ tools
- [Pricing](https://acme.com/pricing): Free tier, Pro ($12/user/mo), and Enterprise plans

## Blog

- [Remote Team Productivity in 2026](https://acme.com/blog/remote-productivity-2026): Research-backed strategies for async teams
- [How We Built Our AI Task Prioritizer](https://acme.com/blog/ai-task-prioritizer): Technical deep-dive on our ML pipeline
- [Acme vs Asana vs Monday](https://acme.com/blog/acme-vs-asana-vs-monday): Honest feature comparison

## Company

- [About Us](https://acme.com/about): Our story, mission, and team
- [Careers](https://acme.com/careers): Open positions (engineering, design, sales)
- [Contact](https://acme.com/contact): Support and sales inquiries
- [Security](https://acme.com/security): SOC 2 Type II certified, GDPR compliant

## Optional

- [Changelog](https://acme.com/changelog): Weekly product updates
- [Status Page](https://status.acme.com): Real-time uptime monitoring

Notice how the file is clean, scannable, and immediately informative. An AI model reading this file can understand within seconds that Acme is a project management tool for remote teams, what their pricing looks like, what their most important content is, and how to find specific information. This is exactly the kind of structured overview that AI models need to generate accurate, well-cited responses about your business.
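
Because the format is plain Markdown, consuming a file like this takes only a few lines of code. The sketch below shows one way a consumer might split an llms.txt document into its title, description, and sections -- the parsing rules are an assumption based on the recommended structure described above, not part of the proposal itself:

```python
import re

def parse_llms_txt(text):
    """Split an llms.txt document into title, description, and sections.

    A rough sketch assuming the recommended structure: an H1 title,
    a blockquote description, then H2 sections containing link lists.
    """
    title = None
    description = []
    sections = {}
    current = None
    for line in text.splitlines():
        if line.startswith("# ") and title is None:
            title = line[2:].strip()
        elif line.startswith("> "):
            description.append(line[2:].strip())
        elif line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif line.startswith("- ") and current is not None:
            # Extract "[text](url): annotation" link lines
            m = re.match(r"- \[(.+?)\]\((.+?)\)(?::\s*(.*))?", line)
            if m:
                sections[current].append({
                    "title": m.group(1),
                    "url": m.group(2),
                    "note": m.group(3) or "",
                })
    return {"title": title,
            "description": " ".join(description),
            "sections": sections}
```

Running this over the Acme example yields the title, the blockquote description as one string, and a dictionary keyed by section name (Docs, Blog, Company, Optional) -- exactly the high-level map an AI model builds for itself.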

Why llms.txt Matters for AEO

If you are investing in AI Engine Optimization, llms.txt addresses one of the most fundamental challenges: helping AI models understand your site quickly and accurately. Here is why it matters:

It Reduces AI Misrepresentation

One of the biggest risks in AI-powered search is that an AI model misunderstands or misrepresents your business. Without clear guidance, AI models piece together information from whatever pages they can find -- which might be outdated blog posts, partial product descriptions, or third-party reviews. An llms.txt file gives you a direct channel to tell AI models: "This is who we are, this is what we do, and these are our most important pages." It is a form of controlled first-party information delivery.

It Improves Citation Accuracy

When AI models cite your content, they need to link to the right page. Without llms.txt, an AI might link to a generic homepage when a specific product page or guide would be more relevant. By listing your key pages with descriptive annotations, you increase the likelihood that AI citations point to exactly the right content. This is especially important for AEO ranking factors related to citation potential and content discoverability.

It Complements Schema Markup

llms.txt and schema markup (JSON-LD) work together beautifully. Schema markup provides granular, page-level structured data -- telling AI models that a specific page is an Article with a specific author, or a Product with a specific price. llms.txt provides the site-level overview -- telling AI models what the site is about overall and where to find the most important content. Together, they create a comprehensive machine-readable layer that dramatically improves AI comprehension.
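
To make the division of labor concrete: a page-level JSON-LD snippet like the one below lives in an individual page's HTML, while llms.txt lives once at the site root. The values here are illustrative, reusing the fictional Acme company from the example above:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Acme Project Management",
  "url": "https://acme.com",
  "description": "Project management platform for remote teams."
}
</script>
```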

It Signals AI-Readiness

Having an llms.txt file signals to AI systems that your site is actively optimized for AI consumption. As AI-powered search becomes more prevalent, this kind of proactive optimization will increasingly differentiate sites that get cited from those that get overlooked. Early adopters of llms.txt are establishing a baseline advantage that will compound over time.

How to Create llms.txt for Your Website

Creating an llms.txt file is straightforward. Follow these steps:

Step 1: Create a New Text File

Create a plain text file named llms.txt in your website's root directory (the same directory where your robots.txt and sitemap.xml live). If you are using a CMS like WordPress, you may need to configure your server or use a plugin to serve a static file from the root URL.
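
If you control your own web server, serving the file is usually trivial. As one illustration (the paths and setup here are assumptions -- adjust for your stack), an nginx server block could serve it with an explicit text content type like this:

```nginx
# Serve llms.txt from the web root as plain text.
location = /llms.txt {
    default_type text/plain;
    try_files /llms.txt =404;
}
```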

Step 2: Write Your Site Description

Start with an H1 heading (your site or organization name) followed by a blockquote containing a concise description of what your site is about. Be specific -- mention your industry, your target audience, your location (if relevant), and your core value proposition. This is the first thing an AI model will read, so make it count.

Step 3: Organize Your Key Pages into Sections

Create H2 sections for each major content area of your site. Common sections include Documentation, Blog, Products, About, and Resources. Under each section, list your most important pages as Markdown links with a brief description after the colon. Focus on pages that are most likely to be useful when an AI is answering questions about your business or industry.

Step 4: Add Descriptions to Every Link

Do not just list URLs. Every link should include a short, descriptive annotation that tells the AI model what the page covers. Compare these two approaches:

# Bad -- no context for AI
- [Blog Post](https://example.com/blog/post-1)
- [Blog Post](https://example.com/blog/post-2)

# Good -- descriptive annotations
- [How to Choose a CRM in 2026](https://example.com/blog/choose-crm-2026): Comparison of 12 CRM platforms with pricing, features, and use-case recommendations
- [Our API Rate Limits Explained](https://example.com/blog/api-rate-limits): Technical guide covering throttling, quotas, and best practices for high-volume integrations

Step 5: Deploy and Verify

Upload the file to your server and verify it is accessible by visiting https://yoursite.com/llms.txt in your browser. The file should render as plain text. Make sure it is served with a text/plain or text/markdown content type. If you see your Markdown content displayed correctly, your llms.txt file is live.
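
The checks above can also be scripted. This is a minimal sketch using only the Python standard library; `check_llms_txt` and `headers_ok` are illustrative helper names, and the URL is a placeholder for your own domain:

```python
from urllib.request import urlopen

ACCEPTED_TYPES = ("text/plain", "text/markdown")

def headers_ok(status, content_type):
    """Return True if the response looks like a properly served llms.txt."""
    base_type = content_type.split(";")[0].strip().lower()
    return status == 200 and base_type in ACCEPTED_TYPES

def check_llms_txt(url):
    """Fetch the file and verify its status code and content type."""
    with urlopen(url) as resp:
        return headers_ok(resp.status, resp.headers.get("Content-Type", ""))

# Example (replace with your own domain):
# check_llms_txt("https://yoursite.com/llms.txt")
```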

Step 6: Keep It Updated

llms.txt should be a living document. When you publish major new content, launch new products, or restructure your site, update llms.txt to reflect those changes. A stale llms.txt file with broken links or outdated descriptions is worse than not having one at all -- it actively misleads AI models. Set a calendar reminder to review your llms.txt file monthly.
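
Part of that monthly review is catching broken links. A small script can extract every linked URL and flag the ones that no longer resolve -- the sketch below is one possible approach, assuming links follow the standard Markdown `[text](url)` form:

```python
import re
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

# Matches the URL inside a Markdown link: [text](https://...)
LINK_PATTERN = re.compile(r"\]\((https?://[^)\s]+)\)")

def find_broken_links(llms_txt):
    """Yield (url, reason) for every linked page that fails to load."""
    for url in LINK_PATTERN.findall(llms_txt):
        try:
            with urlopen(url) as resp:
                if resp.status != 200:
                    yield url, f"status {resp.status}"
        except (HTTPError, URLError) as err:
            yield url, str(err)
```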

llms.txt vs robots.txt vs sitemap.xml

These three files serve different but complementary roles in your site's discoverability stack. Here is how they compare:

| Feature | llms.txt | robots.txt | sitemap.xml |
| --- | --- | --- | --- |
| Purpose | Help AI models understand your site | Control crawler access to pages | List all indexable pages with metadata |
| Format | Markdown (plain text) | Custom directive syntax | XML |
| Target audience | AI language models (LLMs) | Search engine crawlers | Search engine crawlers |
| Content | Curated links with descriptions | Allow/disallow rules per crawler | All URLs with lastmod, priority |
| Human-readable | Yes (Markdown) | Somewhat | Not easily |
| Scope | Key pages only (curated) | All pages (rules-based) | All pages (comprehensive) |
| Adoption status | Emerging (growing fast) | Universal standard | Universal standard |

The best practice is to use all three files together. robots.txt controls access, sitemap.xml ensures comprehensive crawling, and llms.txt provides AI-specific context and curation. For a complete AEO strategy, you should also add schema markup (JSON-LD) to your individual pages.

Which AI Engines Support llms.txt?

As of early 2026, several major AI platforms are actively honoring or working toward supporting llms.txt:

  • ChatGPT (OpenAI): ChatGPT's web browsing mode can read and process llms.txt files when retrieving information about a website. OpenAI has shown interest in the standard as a way to improve the accuracy of web-sourced responses.
  • Claude (Anthropic): Claude's web retrieval capabilities can leverage llms.txt for site understanding. The format aligns well with Claude's approach to grounded, citation-heavy responses.
  • Perplexity: Perplexity's search engine is designed to deeply understand websites it cites. llms.txt provides a shortcut for Perplexity to grasp site structure and identify the most authoritative pages for any given query.
  • Google AI Overviews: While Google has not publicly endorsed llms.txt as a formal signal, Google's AI systems process accessible content at standard well-known paths. A well-structured llms.txt file is additional clean content that Google's AI can use for context.
  • Microsoft Copilot: Copilot's web grounding features can benefit from llms.txt when pulling information about businesses and products during conversational search.
  • Other AI tools: The growing ecosystem of AI-powered search tools, coding assistants, research agents, and enterprise AI systems can all benefit from llms.txt. As the standard gains adoption, support will continue to expand.

Even for AI platforms that do not explicitly "support" llms.txt today, the file still helps. Any AI model that can browse the web can fetch and read a plain text Markdown file. The content is inherently useful regardless of whether the AI has been specifically programmed to look for it.

Common Mistakes When Creating llms.txt

Most llms.txt errors come from treating it like a sitemap or ignoring it after initial creation. Avoid these common pitfalls:

1. Listing Every Page on Your Site

llms.txt is not a sitemap. You do not need to list every URL. Focus on the 15 to 40 most important pages -- the ones that best represent your business, answer common questions, and contain your most authoritative content. A bloated llms.txt file with hundreds of links defeats the purpose of providing a concise overview.

2. Skipping Page Descriptions

Every linked page should have a short description explaining what it covers. Without descriptions, the AI model only sees URLs and link text -- not enough context to understand which pages are most relevant for any given query. Descriptions are what make llms.txt genuinely useful rather than just a list of links.

3. Using HTML Instead of Markdown

llms.txt should be a plain Markdown file, not HTML. Do not wrap your content in HTML tags, use anchor tags for links, or include CSS. Markdown is the expected format because it is clean, lightweight, and universally parseable. AI models are exceptionally good at processing Markdown -- it is the native format for most LLM interactions.

4. Letting It Go Stale

An outdated llms.txt file with broken links, discontinued products, or old descriptions actively harms your AI visibility. If an AI model reads your llms.txt and finds that linked pages return 404 errors, it reduces the model's confidence in your site as a reliable source. Treat llms.txt as a living document and review it at least monthly.

5. Forgetting the Site Description

The opening description (the blockquote section after the H1) is arguably the most important part of your llms.txt file. It gives AI models immediate, high-level context about your entire organization. Jumping straight into page links without a site description forces the AI to infer what your business does from page titles alone -- a much less reliable approach.

6. Not Making It Publicly Accessible

Your llms.txt file must be publicly accessible at the root of your domain. If it is blocked by authentication, served behind a CDN that requires JavaScript, or placed in a subdirectory, AI models will not find it. Verify that https://yoursite.com/llms.txt returns the file with a 200 status code and no redirects.

Frequently Asked Questions About llms.txt

Is llms.txt an official web standard?

Not yet. llms.txt is a community-driven proposal created by Jeremy Howard of fast.ai and Answer.AI. It follows the convention of other well-known files like robots.txt and humans.txt, but it has not been formally adopted by a standards body like the W3C or IETF. That said, it is gaining rapid traction because it solves a real problem -- AI models need a clean, structured summary of a website, and llms.txt provides exactly that. Many AI companies are already honoring it.

Does llms.txt replace robots.txt or sitemap.xml?

No. llms.txt serves a completely different purpose. robots.txt controls which pages crawlers can access. sitemap.xml tells search engines which pages exist and when they were updated. llms.txt provides a human-and-AI-readable summary of what your site is about, how it is organized, and where to find the most important content. All three files complement each other and should be used together for maximum discoverability.

How long should my llms.txt file be?

There is no strict length requirement, but the best llms.txt files are concise yet comprehensive. Aim for enough detail that an AI model can understand your site's purpose, structure, and key content without having to crawl every page. For most small to medium websites, this means roughly 50 to 200 lines. Large sites with many content areas may need more. The key is clarity over length -- every line should add meaningful context.

Will llms.txt help my site rank higher in Google?

llms.txt is not a Google ranking factor in the traditional SEO sense. It is designed for AI models, not for Google's search crawler. However, as Google increasingly integrates AI Overviews into search results, having a well-structured llms.txt file can indirectly help by making it easier for Google's AI systems to understand and cite your content. The primary benefit is with AI-native search tools like ChatGPT, Claude, and Perplexity.

Can I use llms.txt to prevent AI from using my content?

llms.txt is designed to help AI understand your content, not to restrict access. If you want to prevent AI crawlers from accessing your site, you should use robots.txt with the appropriate user-agent directives (such as blocking GPTBot, ClaudeBot, or PerplexityBot). Some site owners use a combination -- robots.txt to control access and llms.txt to guide the AI models that are allowed to read their content.
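
For example, a robots.txt that blocks those crawlers entirely would look like this (the user-agent tokens are the ones these companies publish; verify the current list in each vendor's crawler documentation before relying on it):

```text
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /
```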

Scan your site with Vida AEO

See how your website scores across 34 AI visibility factors -- including llms.txt, schema markup, content structure, and more. Free, instant results.
