robots.txt
How it works
Build a robots.txt file to control crawler access (see the example after this list):
- User-agent — which crawler the rule applies to (* = all)
- Allow/Disallow — permit or block specific paths
- Sitemap — tell crawlers where your sitemap is
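Here is a minimal sketch of a robots.txt that combines all three directives; the blocked path and sitemap URL are placeholders, not values from this guide:

```
# Apply these rules to all crawlers
User-agent: *
# Block the admin area (hypothetical path)
Disallow: /admin/
# Permit one page inside the blocked area
Allow: /admin/help.html

# Tell crawlers where the sitemap lives (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```

Note that when Allow and Disallow rules overlap, most major crawlers (including Googlebot) apply the most specific (longest) matching path, so /admin/help.html stays crawlable here.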
Want to learn more? Read the complete guide for examples and tips.