robots.txt Generator

Build a robots.txt file with multiple user‑agent blocks, Allow/Disallow rules, and optional Crawl-delay, Host, and Sitemap lines. The live preview updates as you type.
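For example, a generated file with a global block and a stricter block for one crawler might look like this (the domain, paths, and bot name are placeholders):

```
# Global rules
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Crawl-delay: 10

# Block one specific crawler entirely
User-agent: ExampleBot
Disallow: /

Host: example.com
Sitemap: https://example.com/sitemap.xml
```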

User‑agent blocks

Global

Notes: Host is honored only by some crawlers (e.g., Yandex). Crawl-delay is ignored by Google but may be honored by other crawlers. The wildcards * and $ began as nonstandard extensions (they are not in the original 1994 protocol) but are now defined in RFC 9309 and widely supported, though not by every crawler.
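To make the wildcard semantics concrete, here is a small sketch of Google-style pattern matching, where * matches any run of characters and $ anchors the end of the URL path. The helper name `pattern_to_regex` is ours, not part of any library:

```python
import re

def pattern_to_regex(path_pattern: str) -> "re.Pattern[str]":
    """Convert a robots.txt path pattern with * and $ into an
    anchored regular expression (a sketch of Google-style matching)."""
    regex = ""
    for ch in path_pattern:
        if ch == "*":
            regex += ".*"   # * matches any sequence of characters
        elif ch == "$":
            regex += "$"    # $ anchors the match at the end of the path
        else:
            regex += re.escape(ch)
    return re.compile(regex)

# "/*.pdf$" matches any path ending in .pdf:
print(bool(pattern_to_regex("/*.pdf$").match("/docs/file.pdf")))      # True
print(bool(pattern_to_regex("/*.pdf$").match("/docs/file.pdf?v=1")))  # False
```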

Preview (robots.txt)

URL tester
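A URL tester like this one can be sketched offline with Python's standard `urllib.robotparser`. The rules and URLs below are hypothetical; note two caveats of this particular parser: it applies the first matching rule (unlike Google's longest-match semantics, so Allow lines should precede overlapping Disallow lines), and it does not support the * and $ wildcards:

```python
from urllib import robotparser

# Hypothetical rules similar to what the generator emits.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())  # parse rules directly instead of fetching a live file

print(rp.can_fetch("*", "https://example.com/admin/secret"))     # False (blocked)
print(rp.can_fetch("*", "https://example.com/admin/public/a"))   # True (allowed)
```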

Reminder: robots.txt is publicly accessible and is purely advisory for well‑behaved crawlers; it is not an access-control mechanism.
