I want to stop crawlers from crawling the subdomain tools.subdomain.com.
I found a snippet on the Internet which shows the following rewrite rule:
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|Baiduspider) [NC]
RewriteRule .* - [R=403,L]
How can I manage to block those crawlers on this subdomain, or only allow current, up-to-date browsers to visit it? I want to handle this through .htaccess, because not every crawler respects robots.txt. For robots.txt I already have the following rewrite condition:
RewriteCond %{HTTP_HOST} =testing.subdomain.com
RewriteRule ^robots\.txt$ /robots_testing.txt [L]
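A possible sketch, combining the host check from the robots.txt rule with the user-agent condition from the snippet above (assuming the subdomain in question is tools.subdomain.com; adjust the host name to match your setup):

```apache
RewriteEngine On
# Only apply on the subdomain, so the main site is unaffected
RewriteCond %{HTTP_HOST} ^tools\.subdomain\.com$ [NC]
# Match the crawler user agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|Baiduspider) [NC]
# [F] returns 403 Forbidden and implies [L]
RewriteRule ^ - [F]
```

Note that whitelisting only "up to date browsers" by user agent is fragile, since any client can send an arbitrary User-Agent header; a blocklist of known crawlers like the one above is usually the more practical approach.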
Cheers
Sven