Recently there was a problem where the gforum.cgi script overloaded our shared server to the point where the sysadmin had to shut it down and restrict access to testing only.
There were four instances of gforum.cgi running at the time of the overload. My forum isn't very big, only about 8,500 posts in total, but it is visited by many bots, which account for most of the traffic. I spent some time testing the forum and it seemed to be working fine. I did find a thread on this forum about server overload caused by searching posts, with advice to change the gforum_Posts table from NONINDEX to INTERNAL, but I don't think that's the problem here: my forum is pretty small, it wasn't being searched when the overload occurred (as far as I could tell from the logs), and searches were very fast during testing.
While investigating the problem, I noticed a huge table in the database called gforum_Expanded. It has over 3 million entries and uses 291 MB. The next largest table is gforum_Posts, with 8,500 entries and about 7.9 MB.
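In case anyone wants to compare against their own forum, this is roughly how I looked at the table sizes. It assumes a MySQL backend and the default "gforum_" table prefix; adjust the prefix if yours differs:

```sql
-- List row counts and data/index sizes for all forum tables.
-- The Rows, Data_length, and Index_length columns show where the
-- space is going (gforum_Expanded in my case).
SHOW TABLE STATUS LIKE 'gforum\_%';
```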
I've added a robots.txt file to my root directory to discourage bots from accessing the gforum script. After doing their own testing, my hosting service agreed to restore access to the script, on the theory that this might have been a one-time problem, possibly caused by bots.
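In case it helps anyone else, the robots.txt was along these lines. The /cgi-bin/ path here is just an example; use whatever path your copy of gforum.cgi is actually served from:

```
# Ask well-behaved crawlers to stay away from the forum script.
# Note this only deters bots that honor robots.txt.
User-agent: *
Disallow: /cgi-bin/gforum.cgi
```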
I have two questions:
1. Does anyone have any suggestions about what might have caused the overload?
2. What is the gforum_Expanded table and why is it so large?