This was implemented on earlier versions of Gossamer Forum but was later removed. I'm not sure if this was a Gossamer Forum function or an Apache directive.
Anyhow, the thread in question is most probably in the Chit Chat forum. The reason it was removed was that some users who opened multiple messages in multiple windows were blocked from the forum for a number of seconds, which caused some frustration. There is basically no way to differentiate between a robot, an off-line downloader, and a very fast forum user.
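To illustrate why the old throttle frustrated people, here's a hypothetical sketch of per-client request throttling (a sliding-window counter); the actual removed feature's logic isn't documented, so the class, names, and limits below are all my own invention. The point is that a human opening six tabs at once trips exactly the same threshold a robot does:

```python
import time
from collections import defaultdict, deque

class RequestThrottle:
    """Block a client that makes too many requests in a short window.

    Hypothetical sketch only -- not Gossamer Forum's actual code.
    """

    def __init__(self, max_requests=5, window_seconds=1.0):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.history = defaultdict(deque)  # ip -> timestamps of recent hits

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.history[ip]
        # Drop timestamps that have fallen out of the sliding window.
        while q and now - q[0] > self.window_seconds:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # blocked -- robot and fast human look identical here
        q.append(now)
        return True
```

A user opening six message windows in the same second gets the sixth request refused, just as an aggressive spider would, which is presumably why the feature was pulled.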
These days, I believe this setup here has a list of allowed and disallowed robots, probably implemented with Apache directives, and when a disrespectful robot hits the site it simply gets blocked. Robots are meant to follow the standards and are supposed to spider only so many pages per second or minute, but many robots ignore these rules.
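For what it's worth, a blocklist like that is typically done with something along these lines in Apache. This is only a guess at the approach -- the actual directives and bot names on this server aren't published, so the user-agents below are just common offline-downloader examples:

```apache
# Tag requests from known-bad user-agents (example names only)
SetEnvIfNoCase User-Agent "WebCopier" bad_bot
SetEnvIfNoCase User-Agent "WebZIP"    bad_bot
SetEnvIfNoCase User-Agent "Teleport"  bad_bot

<Directory "/path/to/forum">
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>
```

Well-behaved robots would never see this, since they'd respect robots.txt and a sane crawl rate in the first place.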
Fortunately, we've seen that Gossamer Forum on *this* server setup has withstood many robot "attacks", as Alex mentioned in a thread in the Chit Chat forum. And as long as the server can handle the extra load, Alex seemed more than happy for robots to spider the site's contents, for obvious reasons.
Update: Here are some of the threads that discussed this when Gossamer Forum was first released:
http://www.gossamer-threads.com/...orum.cgi?post=180588
http://www.gossamer-threads.com/...orum.cgi?post=176310
Update 2: And here's my initial complaint :-)
http://www.gossamer-threads.com/...orum.cgi?post=164161
- wil