Gossamer Forum

Ban search bots

Ban search bots
Is it possible to ban search bots (like Google, Yahoo, etc.) from using search.cgi?

Many of these bots are following the query links on my home page (most recent queries), which I don't like.

I would also like to do this for recommend_it.cgi, contact.cgi, review.cgi and some others.

Basically, I want search.cgi, recommend_it.cgi, contact.cgi, review.cgi etc. to be accessible only from my own domain.

Regards.

UnReal Network

Last edited by: deadroot: Feb 26, 2008, 10:15 AM
Re: [deadroot] Ban search bots
Have you tried adding some rules for this to robots.txt?

Code:
User-agent: *
Disallow: /cgi-bin/add.cgi
Disallow: /cgi-bin/modify.cgi
Disallow: /cgi-bin/subscribe.cgi
Disallow: /cgi-bin/bookmark.cgi
Disallow: /cgi-bin/search.cgi

(Disallow matches by path prefix, so the bare script name also covers any query string; the User-agent line is needed for the rules to apply.)

The above works fine for me.
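
Keep in mind robots.txt only keeps out well-behaved crawlers. If you also want to enforce "only accessible from my domain", a Referer check in Apache is one option. A minimal sketch, assuming Apache with mod_rewrite enabled, a .htaccess in /cgi-bin/, and example.com as a placeholder for your domain (the script names are just the ones from your post):

Code:
RewriteEngine On
# Deny requests whose Referer is empty or from another site.
# Note: clients can omit or forge the Referer header, so this
# stops most bots, not a determined client.
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule ^(search|recommend_it|contact|review)\.cgi$ - [F]

This also blocks visitors whose browsers send no Referer at all, so test it before relying on it.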

Cheers,
Boris

Facebook, Twitter and Google+ Auth for GLinks and GCommunity | reCAPTCHA for GLinks | Free GLinks Plugins
Re: [eupos] Ban search bots
Will try. Thanks.

Regards.

UnReal Network
Re: [eupos] Ban search bots
And how could I block all spiders from all .cgi files?

UnReal Network
Re: [deadroot] Ban search bots
Hi deadroot,
Code:
User-agent: *
Disallow: /cgi-bin/

should do it.
Matthias

Matthias
gpaed.de
Re: [Matthias70] Ban search bots
And what if I want to allow just one CGI script in cgi-bin but disallow all the others?

UnReal Network
Re: [deadroot] Ban search bots
Well, then I think you have to do it the way eupos wrote above.
If you have many CGI files, it's perhaps better to use "Allow".
You can find more info here:
http://www.google.com/support/webmasters/bin/answer.py?answer=40367&ctx=sibling
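
For example, to let crawlers fetch just one script while blocking the rest of /cgi-bin/ (page.cgi below is only a placeholder for whichever script should stay crawlable; Allow is not part of the original robots.txt standard, but Google and most major crawlers honor it, with the more specific rule winning):

Code:
User-agent: *
# page.cgi is a placeholder for the one script to keep crawlable
Allow: /cgi-bin/page.cgi
Disallow: /cgi-bin/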

Matthias

Matthias
gpaed.de

Last edited by: Matthias70: Mar 1, 2008, 11:42 AM