
llms.txt Generator

Help AI understand your website

Crawls your sitemap, fetches your pages, and generates spec-compliant llms.txt and llms-full.txt files. Download them and upload to your site root.
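The sitemap-discovery step can be sketched with the Python standard library alone. This is an illustrative sketch, not the generator's actual code; the `sitemap_urls` name is ours, and error handling is omitted.

```python
# Minimal sketch of sitemap discovery (stdlib only).
# The namespace below is defined by the standard sitemap protocol.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml: str) -> list[str]:
    """Extract page URLs from a sitemap.xml document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]
```

A real crawler would also follow `<sitemap>` index entries and fetch each discovered page before generating the output files.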

Sitemap discovery · Spec-compliant output · llms-full.txt included
What is llms.txt?
A standard file that tells AI models what your site is about.

Like robots.txt, but for AI language models.

Lists your pages with titles, URLs, and descriptions.

llms-full.txt adds the full markdown content inline.

Helps AI systems cite and reference your content accurately.

Generate llms.txt
Paste a public URL. Leave off https:// if you want — we'll add it.
No account needed. Rate-limited. Public URLs only.

After generating

How to deploy your llms.txt

1. Download the files

Download or copy the generated llms.txt (and optionally llms-full.txt) from the results above.

2. Upload to your site root

Place the files at the root of your website, alongside robots.txt and sitemap.xml.

3. Verify access

Visit yourdomain.com/llms.txt to confirm it is publicly accessible.
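The verification step can also be scripted. A minimal sketch using only the Python standard library; `llms_txt_url` and `is_publicly_accessible` are illustrative names, and `example.com` is a placeholder domain.

```python
# Minimal sketch: confirm the deployed llms.txt is publicly reachable.
from urllib.request import urlopen

def llms_txt_url(domain: str) -> str:
    """Build the canonical llms.txt URL for a domain."""
    return f"https://{domain.strip('/')}/llms.txt"

def is_publicly_accessible(domain: str) -> bool:
    """True if the llms.txt URL answers with HTTP 200."""
    try:
        with urlopen(llms_txt_url(domain), timeout=10) as resp:
            return resp.status == 200
    except OSError:
        return False
```

A plain `curl yourdomain.com/llms.txt` in the terminal, or visiting the URL in a browser, accomplishes the same check.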

Definition

What is llms.txt?

llms.txt is a Markdown file at your domain root (e.g. example.com/llms.txt). It lists your pages with titles, URLs, and one-line descriptions so AI models can understand your site without crawling it. The optional companion file, llms-full.txt, embeds the full page content inline.

Comparison

llms.txt vs robots.txt vs sitemap.xml

| Feature | llms.txt | robots.txt | sitemap.xml |
| --- | --- | --- | --- |
| Primary audience | AI language models | Search engine crawlers | Search engine indexers |
| Purpose | Describe site content for AI | Control crawl access | List all indexable URLs |
| File location | /llms.txt | /robots.txt | /sitemap.xml |
| Format | Markdown | Plain text directives | XML |
| Content descriptions | Yes (title + summary per page) | No | No |
| Full content option | Yes (llms-full.txt) | No | No |
| Adopted since | 2024 | 1994 | 2005 |

Background

Why llms.txt matters

AI assistants increasingly answer questions that used to go to Google. When someone asks ChatGPT or Claude for a tool recommendation, the AI pulls from what it knows about your site. llms.txt gives it a clean summary instead of forcing it to guess from scattered crawl data.

Without llms.txt, AI models rely on whatever fragments they picked up during training or from live crawling. With it, you decide which pages matter and how they get described.

It sits at your domain root, always available, always current. Not a silver bullet, but one of the few things you can actually do to influence how AI talks about you. That matters for AI visibility and getting properly cited.

Reference

llms.txt Format & Specification

The file starts with your site title and a one-line description, followed by a list of pages — each with a title, URL, and short description. llms-full.txt adds the full markdown content inline for each page, so models can understand your site without crawling it.

```markdown
# Site Title

> One-line site description.

## Pages

- [Page Title](https://example.com/page): Short description.
- [Another Page](https://example.com/other): Another description.
```

The spec follows Markdown conventions. Each page entry is a list item with a link and an optional colon-separated description. See the full specification at llmstxt.org.
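Generating a file in this format is straightforward. A minimal sketch, assuming page data as `(title, url, description)` tuples; `build_llms_txt` is an illustrative name, and a real generator would pull the entries from your sitemap and pages.

```python
# Minimal sketch: assemble an llms.txt body from page entries,
# following the title / blockquote / "## Pages" list layout of the spec.

def build_llms_txt(site_title, site_description, pages):
    """pages: iterable of (title, url, description) tuples."""
    lines = [
        f"# {site_title}",
        "",
        f"> {site_description}",
        "",
        "## Pages",
        "",
    ]
    for title, url, description in pages:
        lines.append(f"- [{title}]({url}): {description}")
    return "\n".join(lines) + "\n"
```

An llms-full.txt generator would additionally append each page's markdown content after its entry.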

You've told AI about your site
Now see if AI is actually recommending you. Prompt Metrics tracks your AI visibility over time.

See what AI models actually say when your category comes up.

Track which competitors get cited — and which sources AI pulls from.

Actionable recommendations, not dashboards for dashboards' sake.

Reports that refresh automatically.

Related tools
More free tools to improve your AI visibility.

AI Crawlability Checker: Check whether AI crawlers can reach your pages and identify technical blockers.

What is AI visibility? Why AI mentions matter and what you can do about them.

FAQ

llms.txt generator FAQ

What is llms.txt?

llms.txt is a file you host at your domain root that describes your site to AI models. Think robots.txt, but instead of telling crawlers what to access, it tells language models what your site is about and which pages matter.

Where do I put the file?

Upload it to the root of your website so it is accessible at yourdomain.com/llms.txt. Same location as robots.txt and sitemap.xml.

What is the difference between llms.txt and llms-full.txt?

llms.txt contains a structured list of your pages with titles, URLs, and short descriptions. llms-full.txt includes the same structure plus the full markdown content of each page, giving AI models more context without needing to crawl your site.

Do AI systems actually use llms.txt?

Claude actively checks for llms.txt when browsing sites. Perplexity and ChatGPT consume it when present during web retrieval. On the publishing side, platforms like Mintlify, ReadMe, and GitBook auto-generate llms.txt for hosted docs, and over 30% of top developer documentation sites now serve one.

What happens if I don't have one?

Without llms.txt, AI models piece together your site from training data and whatever they happen to crawl. With it, you hand them a clean summary: your pages, your descriptions, your priority order. That structured context is what gets you cited instead of paraphrased or ignored.

Does llms.txt replace robots.txt?

No. robots.txt controls access for search crawlers. llms.txt describes your site for AI models. Different audiences, different jobs. You need both.

How often should I regenerate it?

Regenerate it whenever you add, remove, or significantly change important pages. If you publish content regularly, updating monthly is a good cadence. The file should always reflect your current site structure.

Does llms.txt improve SEO?

Not directly. Google doesn't read llms.txt. But the work that goes into a good llms.txt, such as clear page descriptions and logical site structure, tends to help your SEO too.