Robots.txt Generator
Presets
Host (optional)
Specifies the preferred domain (a non-standard directive, historically a Yandex extension; rarely honored)
User-Agent Rules
Sitemaps
No sitemaps added
robots.txt Output
# robots.txt generated by Autonomous Factory Tools
# https://autonomousfactory.tools/robots-txt-generator

User-agent: *
Allow: /
How It Works
User-agent: Specifies which bot the rules apply to. Use * for all bots.
Allow/Disallow: Controls which paths the bot can or cannot crawl. Paths support wildcards (*).
Crawl-delay: Requests bots to wait X seconds between requests (not all bots honor this).
Sitemap: Tells search engines where to find your XML sitemap.
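The wildcard matching mentioned above can be sketched in a few lines. This is an illustrative helper (the function names are not from any standard library) following the common Google-style interpretation, where `*` matches any run of characters and a trailing `$` anchors the end of the path:

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    # A trailing '$' anchors the pattern to the end of the path.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    # '*' matches any run of characters; everything else is literal.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

def path_matches(pattern: str, path: str) -> bool:
    # Robots rules match from the start of the path (prefix match
    # unless the pattern is end-anchored with '$').
    return pattern_to_regex(pattern).match(path) is not None

print(path_matches("/*.pdf$", "/docs/report.pdf"))      # True
print(path_matches("/*.pdf$", "/docs/report.pdf?x=1"))  # False
print(path_matches("/admin/", "/admin/settings"))       # True
```

Note that unanchored patterns behave as prefixes, which is why `Disallow: /admin/` also blocks everything beneath that directory.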
Common Patterns
Disallow: / blocks the entire site.
Disallow: /admin/ blocks the /admin/ directory.
Disallow: /*.pdf$ blocks all PDF files.
Allow: /public/ allows the /public/ directory.
What This Tool Does
Robots.txt Generator is built for deterministic developer and agent workflows.
Generate and validate robots.txt files with user-agent rules, sitemap URLs, crawl delays, and AI scraper blocking. Download or copy instantly.
Use How to Use for execution steps and FAQ for constraints, policies, and edge cases.
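As an illustration of the AI scraper blocking mentioned above, a generated file might look like the following. GPTBot and CCBot are commonly published AI-crawler user-agent tokens, but the exact token list changes over time, so verify current tokens before relying on them:

```text
# Block common AI crawlers, allow everything else
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```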
This tool is provided as-is for convenience. Output should be verified before use in any production or critical context.
Agent Invocation
Best Path For Builders
Browser workflow
Runs instantly in the browser with private local processing and copy/export-ready output.
For automation planning, fetch the canonical contract at /api/tool/robots-txt-generator.json.
How to Use Robots.txt Generator
1. Specify crawlable pages
Enter the paths you want search engines to crawl (/blog, /products), or list the areas to disallow (/admin, /private).
2. Add crawler rules
Set rules per bot (Googlebot, Bingbot, etc.) with different allow/disallow paths. Add crawl-delay or request-rate limits.
3. Define sitemaps
Add the URLs of your XML sitemaps. The generator emits the corresponding Sitemap directives, which search engines use to index your site efficiently.
4. Test against URLs
The built-in tester checks whether specific URLs are allowed or disallowed by your rules and shows which rule matched.
5. Validate and deploy
Verify the robots.txt is syntactically correct, then download it and place it at yourdomain.com/robots.txt. No site redeploy is needed, but crawlers cache robots.txt, so changes apply the next time each bot fetches the file.
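The generate-then-test workflow above can be sketched with Python's standard library. The rule data and example.com URLs are placeholders; note that the stdlib parser matches literal path prefixes only and does not implement the `*` and `$` wildcards:

```python
from urllib.robotparser import RobotFileParser

# Assemble a robots.txt from per-agent rules (structure is illustrative).
rules = {
    "*": {"allow": ["/blog"], "disallow": ["/admin/", "/private/"]},
}
sitemaps = ["https://example.com/sitemap.xml"]

lines = []
for agent, paths in rules.items():
    lines.append(f"User-agent: {agent}")
    lines += [f"Allow: {p}" for p in paths.get("allow", [])]
    lines += [f"Disallow: {p}" for p in paths.get("disallow", [])]
    lines.append("")  # blank line separates records
lines += [f"Sitemap: {url}" for url in sitemaps]
robots_txt = "\n".join(lines)

# Test specific URLs against the generated rules.
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
print(rp.can_fetch("*", "https://example.com/admin/users"))  # False
```

The same `can_fetch` check is a reasonable way to spot-verify a downloaded file before deploying it.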