Hi,
So at least we know it works now :) One small tweak I'd suggest is maybe:

Code:
RewriteCond %{HTTP_USER_AGENT} ^Baiduspider.* [NC]
RewriteRule .* http://google.com [F]

(With the [F] flag Apache returns a 403 Forbidden, so the rewrite target URL is never actually used.)
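If you want to sanity-check what that condition would match before deploying it, here's a small Python sketch of the same pattern. The regex mirrors the RewriteCond (the [NC] flag corresponds to case-insensitive matching); the user-agent strings are just illustrative examples — Apache does the real matching:

```python
import re

# Same pattern as the RewriteCond above; [NC] = case-insensitive
BLOCK_PATTERN = re.compile(r"^Baiduspider.*", re.IGNORECASE)

def is_blocked(user_agent: str) -> bool:
    """Return True if this user agent would trip the rewrite rule."""
    return bool(BLOCK_PATTERN.match(user_agent))

print(is_blocked("Baiduspider/2.0"))  # True
print(is_blocked("BAIDUSPIDER"))      # True (case-insensitive)
print(is_blocked("Googlebot/2.1"))    # False
```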
I'm a bit concerned that Google is complaining about auth errors (it shouldn't be).
Also, in robots.txt you could set a crawl-delay:

Code:
User-agent: *
Crawl-delay: 10
This should limit each search engine to 1 request every 10 seconds (assuming the robot obeys the rules!).
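If you want to double-check how a well-behaved crawler would read that directive, Python's standard library robots.txt parser exposes the crawl-delay. A quick sketch, feeding it the two lines from the example above:

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules from the example above
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 10",
])

# A polite crawler would wait this many seconds between requests
print(rp.crawl_delay("Baiduspider"))  # 10
```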
Cheers
Andy (mod)
andy@ultranerds.co.uk
Want to give me something back for my help? Please see my Amazon Wish List
GLinks ULTRA Package | GLinks ULTRA Package PRO
Links SQL Plugins | Website Design and SEO | UltraNerds | ULTRAGLobals Plugin | Pre-Made Template Sets | FREE GLinks Plugins!