This is a problem with the script that I've tried to figure out how to get around. The spider "clicks" on links, so it "clicks" on the "report bad link" link, and the link gets flagged as bad.
I've tried various things to make it stop, but short of requiring a user to log on to access the bad-link report, nothing keeps the spiders out, other than editing your robots.txt and telling them to ignore badlink.cgi.
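For anyone wanting to try the robots.txt route, a minimal entry would look something like this (the /cgi-bin/ path is an assumption; adjust it to wherever badlink.cgi actually lives on your server):

```
# Keep well-behaved spiders away from the bad-link report script
User-agent: *
Disallow: /cgi-bin/badlink.cgi
```

Keep in mind this only stops spiders that honor robots.txt; badly behaved ones will ignore it.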
It might be possible to modify it to do what some whois engines do: display a code number on the page that you have to type into the form before it will report the link as bad. Maybe make the user type in the link ID to confirm it?
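The confirmation idea could be sketched roughly like this (a hypothetical Python illustration, not the actual badlink.cgi code; the field names `link_id` and `confirm` are made up). A spider that just follows the GET link never types the confirmation, so its report is thrown out:

```python
def handle_bad_link_report(form):
    """Accept a bad-link report only when the typed confirmation
    matches the link ID shown on the page.

    `form` is a dict of submitted form fields, e.g.
    {"link_id": "123", "confirm": "123"}.
    """
    link_id = form.get("link_id", "")
    confirm = form.get("confirm", "").strip()
    # A spider "clicking" the link submits no confirmation, so this fails.
    if not link_id or confirm != link_id:
        return "Report rejected: confirmation did not match."
    return "Link %s flagged as bad." % link_id
```

A random code number instead of the link ID would work the same way; the point is just that the report requires input a spider won't provide.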
I'm almost ready to keep the spiders out of my sites entirely, except for the "home" pages. They do a lot of damage, and the competition is going to get worse, not better, as the economy stalls.
PUGDOG® Enterprises, Inc.
The best way to contact me is to NOT use Email.
Please leave a PM here.