Robots.txt Generator
Create a robots.txt file to control search engine crawling of your site.
Configure Robots.txt
Set up rules for search engine crawlers.
Rule 1
Crawl-delay: the number of seconds a crawler should wait between requests (a non-standard directive honored by some crawlers, such as Bingbot and Yandex, and ignored by Googlebot).
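As a sketch of how the directive behaves, Python's standard-library robots.txt parser can read a Crawl-delay value; the rules below are a hypothetical example, not output of the generator:

```python
from urllib import robotparser

# Hypothetical rules demonstrating Crawl-delay: ask crawlers to wait
# 10 seconds between requests. Non-standard: some crawlers honor it,
# Googlebot does not.
rules = """\
User-agent: *
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.crawl_delay("*"))  # 10
```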
Generated Robots.txt
Copy and save this as robots.txt in your root directory.
# robots.txt generated by SiteForge
# https://siteforge.diy
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /blog/
Allow: /products/
Sitemap: https://example.com/sitemap.xml
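You can sanity-check the generated rules locally with Python's standard-library parser before publishing; the example.com URLs below are placeholders matching the sample output:

```python
from urllib import robotparser

# Parse the generated rules and confirm they allow and block the
# paths you intended.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /blog/
Allow: /products/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/blog/post"))    # allowed
print(rp.can_fetch("*", "https://example.com/admin/panel"))  # blocked
```

Note that urllib.robotparser applies rules in file order rather than longest-match order, so keep this in mind if your Allow and Disallow prefixes overlap.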
How to Use
Follow these steps to implement your robots.txt file.
- Copy the generated robots.txt content.
- Create a new file named "robots.txt".
- Paste the content into the file.
- Upload the file to your website's root directory.
- Verify it's accessible at https://yourdomain.com/robots.txt.