06-20-2017, 10:41 PM
You generally do this in a robots.txt file; if that fails to stop them, you can use the following in the .htaccess for the directory you don't want crawled.
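For the robots.txt approach, something like this in the site root should work (the `/private/` path and bot names here are just examples; well-behaved crawlers honor these rules voluntarily):
<pre>
Code:
# robots.txt — must live in the site root
# Block specific crawlers from an example directory
User-agent: Googlebot
Disallow: /private/

User-agent: Baiduspider
Disallow: /private/
</pre>
Keep in mind robots.txt is only a request; badly behaved bots ignore it, which is when the .htaccess method below comes in.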
<pre>
Code:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|Baiduspider) [NC]
RewriteRule .* - [R=403,L]
</pre>
Change the bot names to the ones you want blocked (they aren't case sensitive thanks to the [NC] flag), and you can change the error code to whatever you wish as well; currently it returns 403 Forbidden.
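For example, to return 410 Gone instead, so some bots drop the pages from their index faster, you'd only change the R= value (the bot names below are just illustrative):
<pre>
Code:
RewriteEngine On
# Match other user agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot) [NC]
# 410 tells the crawler the content is permanently gone
RewriteRule .* - [R=410,L]
</pre>
You can also use the [F] flag instead of [R=403], which is mod_rewrite's shorthand for returning 403 Forbidden.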