The robots.txt file is a plain-text file placed in the root directory of your website that tells search engine crawlers which pages or sections they may or may not crawl. A well-configured robots.txt file helps control how search engines crawl your site and prevents them from wasting crawl budget on unimportant pages. Note that it is a crawling directive, not access control: a disallowed URL can still appear in search results if other pages link to it, so truly private content should be protected with authentication or noindex directives instead.
This robots.txt generator lets you define multiple rule groups, each targeting a specific user agent or all crawlers. You can specify allowed and disallowed paths, set a crawl delay, and include a sitemap URL. Keep in mind that Crawl-delay is a non-standard directive: some crawlers honor it, but Google ignores it. The generated file follows the Robots Exclusion Standard (RFC 9309) and is ready to be uploaded to your server's root directory.
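For illustration, a generated file with one rule group for all crawlers and a stricter group for a specific bot might look like the sketch below (the paths and sitemap URL are hypothetical placeholders):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /admin/public/

# Stricter rules for a specific crawler
User-agent: ExampleBot
Disallow: /
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
```

Each group starts with one or more User-agent lines followed by its Allow/Disallow rules; a crawler uses the most specific group that matches its name, falling back to the `*` group. The Sitemap line is independent of any group and may appear anywhere in the file.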
All processing runs locally in your browser. Your configuration is never sent to any external server, making this tool safe to use for client projects and private websites.