SiteForge

Robots.txt Generator

Create a robots.txt file to control search engine crawling of your site.

Configure Robots.txt
Set up rules for search engine crawlers.

Rule 1

Each rule specifies a user agent, the paths to allow or disallow, and an optional crawl delay: the number of seconds a crawler should wait between requests. Crawl-delay is a non-standard directive honored by some crawlers and ignored by others (Googlebot, for example, does not support it).

Generated Robots.txt
Copy and save this as robots.txt in your root directory.
# robots.txt generated by SiteForge
# https://siteforge.diy

User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /blog/
Allow: /products/

Sitemap: https://example.com/sitemap.xml
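If you set a crawl delay, the generator adds a Crawl-delay directive to the matching User-agent group. A sketch of what that looks like (the value 10 is only an example, not a recommendation; remember that not all crawlers honor this directive):

User-agent: *
Crawl-delay: 10
Disallow: /admin/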
How to Use
Follow these steps to implement your robots.txt file.
  1. Copy the generated robots.txt content.
  2. Create a new plain-text file named exactly "robots.txt" (all lowercase).
  3. Paste the content into the file.
  4. Upload the file to your website's root directory.
  5. Verify it's accessible at https://yourdomain.com/robots.txt.
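Before deploying, you can sanity-check your rules locally with Python's standard-library urllib.robotparser. This sketch parses the example rules from above and tests a few placeholder URLs (the example.com paths are assumptions, not part of the generator's output):

```python
from urllib import robotparser

# The example rules generated above, as a string.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /blog/
Allow: /products/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Check whether a crawler matching "User-agent: *" may fetch each URL.
print(rp.can_fetch("*", "https://example.com/blog/post-1"))   # True
print(rp.can_fetch("*", "https://example.com/admin/login"))   # False
print(rp.can_fetch("*", "https://example.com/private/data"))  # False
```

After uploading, you can point the parser at the live file instead with `rp.set_url("https://yourdomain.com/robots.txt")` followed by `rp.read()` to confirm the deployed copy behaves the same way.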