Robots.txt Generator

Generate a robots.txt file to control how search engines crawl your website. The file tells crawlers which pages or sections of your site they should not request. Note that robots.txt controls crawling, not indexing: a page blocked here can still appear in search results if other sites link to it.

User-agent Rules

Each rule targets one user agent. Use * to match all bots, or name a specific crawler such as Googlebot. Within a rule, Allow lines list paths the bot may crawl and Disallow lines list paths it must not crawl; a rule with no Allow or Disallow paths places no restrictions on that bot.
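For example, a single rule that blocks every bot from an admin area while permitting one page inside it might look like this (the paths are illustrative):

```
User-agent: *
Allow: /admin/public-page.html
Disallow: /admin/
```

Under RFC 9309, the longest matching path wins, so the more specific Allow line takes precedence for that one URL even though /admin/ is disallowed.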

Additional Settings

Crawl-delay: the number of seconds a bot should wait between requests. This directive is non-standard; some crawlers, such as Bingbot, honor it, while Googlebot ignores it.

Host: the preferred domain version (for example, with or without www). This is a non-standard directive historically recognized by Yandex; most other crawlers ignore it.
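Both settings are plain directives placed inside a user-agent rule; a minimal sketch, with an illustrative domain:

```
User-agent: *
Crawl-delay: 10
Host: example.com
```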

Sitemaps

Sitemap lines list the absolute URLs of your XML sitemaps. They are independent of any User-agent rule and can appear anywhere in the file.
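One way to sanity-check a generated file is Python's standard-library parser, urllib.robotparser. The rules, bot name, and URLs below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Illustrative generated rules: block /admin/, request a 10-second
# delay between fetches, and advertise one sitemap.
rules = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("MyBot", "https://example.com/admin/"))  # False: matches Disallow
print(rp.can_fetch("MyBot", "https://example.com/blog/"))   # True: no rule blocks it
print(rp.crawl_delay("MyBot"))                              # 10
print(rp.site_maps())                                       # ['https://example.com/sitemap.xml']
```

Because the wildcard rule applies to every user agent, the same answers come back for any bot name you pass to can_fetch.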