robots.txt Generator

Create and customize your robots.txt file in seconds

Quick Presets

One-click configurations, or customize your own rules below.
Crawler Rules

No rules yet. Add Allow or Disallow rules below.

Sitemaps
Crawl Delay (Optional)

Delay Between Requests

Seconds to wait between crawler requests (0-60)
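A crawl delay is emitted as a Crawl-delay directive inside the relevant user-agent group. Note that Crawl-delay is a de facto extension: Bing and Yandex honor it, while Google ignores it. A 10-second delay looks like this:

User-agent: *
Crawl-delay: 10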

SEO Compliant

Follows the official robots.txt specification (RFC 9309)

Live Preview

See your robots.txt update in real time

100% Private - Nothing stored on servers

How to Use

  1. Choose a preset or start from scratch
  2. Add rules for specific crawlers (Allow/Disallow paths)
  3. Add your sitemap URL (optional)
  4. Preview the generated robots.txt (see the sample after these steps)
  5. Copy to clipboard or download the file
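For example, choosing a preset that blocks an admin area, adding a rule against GPTBot, and supplying a sitemap URL would produce a file along these lines (the paths and URL here are placeholders):

User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml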

What You Get

Generate a valid robots.txt file with custom rules for search engine crawlers and AI bots. Includes presets for WordPress, e-commerce, and AI bot blocking.

Input: Block /admin/ and /private/

Output:
User-agent: *
Disallow: /admin/
Disallow: /private/

Input: Block all AI bots

Output:
User-agent: GPTBot
Disallow: /

Input: Add sitemap

Output:
Sitemap: https://example.com/sitemap.xml

What is robots.txt?

robots.txt is a text file that tells search engine crawlers which pages they can or cannot access on your website. It lives at yoursite.com/robots.txt.
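A minimal example that lets every crawler in but keeps them out of one directory:

User-agent: *
Disallow: /private/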

Should I block AI bots like GPTBot and Claude?

If you don't want AI companies to train on your content, you can block GPTBot (OpenAI), Claude-Web (Anthropic), and CCBot (Common Crawl). Our generator makes this easy with the "Block AI Bots" preset.
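A sketch of what the "Block AI Bots" preset might emit (the exact list of bots depends on the preset):

User-agent: GPTBot
Disallow: /

User-agent: Claude-Web
Disallow: /

User-agent: CCBot
Disallow: /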

Does robots.txt guarantee crawlers won't access my pages?

No. robots.txt is a guideline, not a security measure. Well-behaved bots respect it, but malicious bots may ignore it. For sensitive content, use authentication or password protection.

Should I add my sitemap to robots.txt?

Yes! Adding your sitemap URL helps search engines discover all your pages. Format: Sitemap: https://yoursite.com/sitemap.xml
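Sitemap lines stand outside user-agent groups and can appear anywhere in the file, and you can list more than one (the URLs below are placeholders):

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/news-sitemap.xml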

What paths should I typically block?

Common paths to block: /admin/, /wp-admin/, /cart/, /checkout/, /my-account/, /api/, /tmp/, /*.pdf$ (if you don't want PDFs indexed).
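Written out as a rule group (note that wildcard patterns such as * and $ are extensions honored by major crawlers like Googlebot and Bingbot, not part of the original standard):

User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /tmp/
Disallow: /*.pdf$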

How do I block a specific bot?

Add a User-agent line with the bot name, then Disallow rules. Example:

User-agent: GPTBot
Disallow: /

100% client-side processing. Your configuration never leaves your device.