Robots.txt Generator

Generate Your robots.txt File

Configure the rules below to generate a `robots.txt` file. This file tells search engine bots which pages or files they can or cannot request from your site.
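
For reference, a minimal `robots.txt` looks like the sketch below; the `/private/` path is purely illustrative.

```
# Apply to all crawlers; block only the /private/ directory
User-agent: *
Disallow: /private/
```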

Default Rules (for all bots: `User-agent: *`)
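
The two default options correspond to the directives below. Note that an empty `Disallow:` value blocks nothing, while `Disallow: /` blocks the entire site.

```
# Allow All: every crawler may request every page
User-agent: *
Disallow:

# Disallow All: every crawler is asked to stay out entirely
User-agent: *
Disallow: /
```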

Specific Bot Rules (Optional)
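
Each specific bot rule becomes its own `User-agent` block. As a sketch, the rule below asks Googlebot to skip a hypothetical `/drafts/` directory while leaving the rest of the site open:

```
# Rules that apply only to Googlebot
User-agent: Googlebot
Disallow: /drafts/
Allow: /
```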

Sitemap Location (Optional)
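
The sitemap entry is a single line containing the absolute URL of your XML sitemap; the domain here is a placeholder.

```
Sitemap: https://example.com/sitemap.xml
```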

How to Use

  1. Choose the default access rule for all crawlers (Allow All or Disallow All).
  2. Optionally, click "Add Specific Bot Rule" to define rules for specific user-agents (e.g., `Googlebot`, `Bingbot`).
  3. For each specific bot rule, enter the User-agent name and add `Allow:` or `Disallow:` paths (one per line).
  4. Optionally, enter the full URL to your XML sitemap.
  5. The content for your `robots.txt` file is generated automatically in the output area (a complete sample is shown after these steps).
  6. Click "Copy Content" or "Download robots.txt", then place the downloaded file as `robots.txt` in your website's root directory; crawlers only look for the file there.
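
Putting the steps together, generated output might look like the following; all paths and the domain are placeholders, not recommendations.

```
# Default rules for all bots
User-agent: *
Disallow: /admin/

# Specific rules for Googlebot
User-agent: Googlebot
Disallow: /drafts/

Sitemap: https://example.com/sitemap.xml
```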

Important: `robots.txt` is advisory; not every crawler enforces it, and malicious bots will likely ignore it. Use it to guide well-behaved crawlers such as Googlebot, not to protect sensitive content.

Why Use a robots.txt File?