DevTulz Online

Robots.txt Generator


What is robots.txt?

robots.txt is a plain-text file placed at the root of your website (e.g. https://example.com/robots.txt) that tells web crawlers which pages or sections they may or may not access. It is part of the Robots Exclusion Protocol. Well-behaved search engine bots generally respect it, but malicious bots may ignore it, and it is not an access-control mechanism: a page blocked in robots.txt can still be indexed if other sites link to it. Use it to keep crawlers away from duplicate pages, admin areas, and other sections you don't want crawled.
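For example, a minimal robots.txt that keeps every crawler out of an /admin/ area while leaving the rest of the site open might look like this (the paths and sitemap URL are illustrative):

```text
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

Each User-agent line starts a rule group, and the Disallow paths beneath it apply to that group.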

How to Use the Robots.txt Generator

  1. Pick a preset or start from scratch by clicking '+ Add rule'.

  2. Set the User-agent field to target a specific bot (use * for all bots).

  3. Add Disallow paths for pages you don't want crawled, and Allow paths for exceptions.

  4. Optionally set a Crawl-delay (in seconds) to slow down aggressive crawlers. Support varies: Bing and some other crawlers honor Crawl-delay, but Googlebot ignores it.

  5. Add your sitemap URL at the bottom.

  6. Copy the output and save it as robots.txt in your website's root directory.
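Before deploying, you can sanity-check the generated rules with Python's standard-library urllib.robotparser. The file contents and URLs below are illustrative, not output from this tool. One caveat: Python's parser applies rule lines in file order (first match wins), so Allow exceptions should be placed above the broader Disallow, whereas Google uses longest-path matching.

```python
from urllib.robotparser import RobotFileParser

# Illustrative generated robots.txt. Allow precedes Disallow because
# Python's parser returns the first rule line that matches the path.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Crawl-delay: 5

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/admin/secret.html"))      # False
print(parser.can_fetch("*", "https://example.com/admin/public/faq.html"))  # True
print(parser.can_fetch("*", "https://example.com/index.html"))             # True
print(parser.crawl_delay("*"))                                             # 5
```

If an exception URL comes back blocked, move its Allow line above the matching Disallow line in the generated file.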
