Gossamer Forum
Home : Products : Gossamer Links : Version 1.x :

search.cgi - lots of hung processes

search.cgi - lots of hung processes
Hi,

My server went a bit crazy today and when I ran 'top' I found it was down to 3450k of available memory. I did a ps -ef and found a lot of [search.cgi] processes that were dated from over a week ago.

After killing these 'hung' processes it freed up resources and the server came back to life.

Anyone know what could be causing this?
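For reference, one way to spot these stale processes is to filter `ps` output for anything whose elapsed time contains a days field - a sketch, assuming a procps-style `ps`; the sample data below is illustrative, not from the server in question:

```shell
# Keep only processes whose elapsed time (etime) has a days field,
# e.g. "8-04:12:33" -- i.e. anything over a day old.
# In live use you would pipe `ps -eo pid,etime,comm` into the awk filter;
# here it runs over a captured sample so the output is reproducible.
printf 'PID ELAPSED COMMAND\n314 8-04:12:33 search.cgi\n9021 05:12 httpd\n' \
  | awk 'NR > 1 && $2 ~ /-/ { print $1, $3 }'
# -> 314 search.cgi
```

On a live box, `ps -eo pid,etime,comm | awk 'NR > 1 && $2 ~ /-/'` gives a list of candidates to inspect before reaching for `kill`.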


All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
Probably the category search code hack you wrote and installed...hehe!

Regards,

Eliot Lee
Re: search.cgi - lots of hung processes In reply to
lol Eliot Smile

Re: search.cgi - lots of hung processes In reply to
Seriously though...I've noticed the same problem...it actually brought the server I was on down a few times after I installed the category search. I'm still trying to reduce the processing time/cycles, but it's still hovering around 40% CPU, and a lot of memory is eaten up when that script is executed.

Didn't occur until I installed the category search codes...

hehe!

Regards,

Eliot Lee
Re: search.cgi - lots of hung processes In reply to
Eliot,

I see what you mean ... I took some time to monitor memory usage and when search.cgi is run it takes up quite a bit, although I think my main problem is the virtual hosts I'm running.

I didn't realise that each of them behaves like its own independent server and eats a chunk of memory for itself, so with several running I'm using up a good portion of available memory already before doing anything else.

I may have to re-think my server layout, or just put my hand in my pocket and buy some more memory.

Gosh, this is all terribly good fun, isn't it? Smile

All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
Hi,

I wouldn't have thought it is the virtual hosts. Some servers have between 100 and 200 vhosts on them. The co-located server I use has over 20 vhosts on it and runs fine (256MB RAM).




Installations:http://www.wiredon.net/gt/
Favicon:http://www.wiredon.net/favicon/

Re: search.cgi - lots of hung processes In reply to
Thanks for your replies,

hmmm ... I guess I've got a lot to learn ... maybe I'm barking up the wrong tree?

My server has 128MB and when running top it reports the following:
Mem: 127068k - Used: 113232k - Free: 13836k

When sorted into memory order, the httpds (I assume those are the virtual host daemons?) are as follows:
3508, 3460, 3456, 3456, 3408, 3404, 3380, 3372, 3364, 3360, 3332, 2960, total: 40460

(there are also 4 mysqld's as well that use 8000k each and 2 miniserv.pl's at 4000k each as well.)

I think my best approach would be to find a good resource to learn about Linux and just take some time to get a bit more familiar with it Smile


All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
I think the httpd instances are spare servers rather than vhost daemons - in httpd.conf you should have MinSpareServers and MaxSpareServers directives, and I think that's what they are (although I'm not 100% sure on that).
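For reference, those directives live in httpd.conf; a snippet like the following (values purely illustrative, not a recommendation) controls how many idle children Apache 1.3 keeps around:

```apache
# httpd.conf -- prefork process management (example values only)
MinSpareServers  3     # keep at least this many idle children waiting
MaxSpareServers  6     # kill off idle children above this count
StartServers     5     # children to spawn at startup
MaxClients       60    # hard cap on simultaneous children
```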

If you have the cash you should probably upgrade to 256MB RAM at least. I don't think it is too expensive to buy 128MB of RAM. I bought 256MB for my home PC and it was about $140. I thought that was fairly cheap.

Are you running intensive scripts on the vhosts?

Installations:http://www.wiredon.net/gt/
Favicon:http://www.wiredon.net/favicon/

Re: search.cgi - lots of hung processes In reply to
Hi Paul,

Thanks for the advice regarding the xSpareServers directives, I've checked and they are tuneable so I may lower the numbers a little and see if that helps. I'm also considering disabling php since I don't use it (can I do that?).

The server is a 'sealed box' type (I could only afford the cheapest one to start with Smile) and so I can't add any more memory for now and have to manage as best I can with what I've got. I think tuning the server will help anyway as I'll gain some experience and also hopefully get the best out of it.


All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
Do you have mod_perl?

Installations:http://www.wiredon.net/gt/
Favicon:http://www.wiredon.net/favicon/

Re: search.cgi - lots of hung processes In reply to
I haven't checked - probably; they've put all sorts of stuff on it for me Smile - would that make a difference?

All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
Yes, definitely - if you are running Links SQL, or really any Perl script, mod_perl will increase performance.

Shove this code into a cgi script and run it from your browser:

#!/usr/bin/perl

# Use text/plain so the newlines display properly in the browser.
print "Content-type: text/plain\n\n";

foreach $key (sort keys %ENV) {
    print "$key : $ENV{$key}\n";
}

If GATEWAY_INTERFACE is CGI/1.1 then mod_perl isn't installed.

Installations:http://www.wiredon.net/gt/
Favicon:http://www.wiredon.net/favicon/

Re: search.cgi - lots of hung processes In reply to
Something to be said for having a sys admin? =) Let's see if I can clear some things up:

In Reply To:
I didn't realise that each of them behaves like it's own independent server and eats a chunk of memory for itself
No, not true. You have only one Apache, and it handles all the domains on the machine. You could have 5,000 and be ok as long as none of them got much traffic. =) It's all about the number of hits the combined total of your domains gets, not the number of domains you have.

In Reply To:
Mem: 127068k - Used: 113232k - Free: 13836k
I'm assuming Linux here, but having ~13MB free is ok. Your operating system caches frequently requested information in memory, so it will appear that there is very little memory free, but this is fine. What you want to watch out for is swap usage. For instance, on our server we have:

Code:
total used free
Mem: 517052 490440 26612
Swap: 393584 3660 389924
So while we only have 26MB out of 512MB free, this is ok as the memory is all being utilized caching frequently requested information. You don't want to see anything significant being used in swap though (as that means data is being pulled from disk). If you are swapping, then it's time to go out and buy memory (actually it was time a couple of days ago - you should never get to swapping on a web server).
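A quick way to automate that same check - a sketch that parses the `free` snapshot quoted above (the ~10MB warning threshold is an arbitrary choice, not a standard):

```shell
# Check the Swap line of `free` output: significant swap use means real
# memory pressure. Parsed here from a captured snapshot rather than a
# live `free` call, so the result is reproducible.
free_output='             total       used       free
Mem:        517052     490440      26612
Swap:       393584       3660     389924'

echo "$free_output" | awk '/^Swap:/ {
    if ($3 > 10240)             # more than ~10MB swapped out
        print "WARNING: " $3 "k of swap in use - buy RAM"
    else
        print "swap OK: only " $3 "k in use"
}'
# -> swap OK: only 3660k in use
```

On a live box you would run `free | awk '/^Swap:/ ...'` instead of feeding it the saved snapshot.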

In Reply To:
When sorted into Memory order, the httpd's (I assume that's the virtual host deamons?) are as follows:
3508, 3460, 3456, 3456, 3408, 3404, 3380, 3372, 3364, 3360, 3332, 2960, total: 40460
The way Apache works is it has one process that runs as root, and then spawns several other processes that handle all the incoming requests. So if you see 10 or 15 httpds, a lot of them are just sitting around waiting for requests. You can tweak this setting by editing httpd.conf.

Also, a lot of that memory is shared, so the total memory used by Apache is not the sum of those numbers, but something quite a bit smaller.
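To illustrate the point, naively summing the per-process httpd sizes Shaun listed from `top` reproduces his 40460k figure, but that number is only an upper bound, since shared code pages get counted once per child:

```shell
# Sum the per-process httpd sizes from `top` (figures from the thread).
# This is an *upper bound* on Apache's real footprint: the children share
# code pages that this sum counts once per process.
echo '3508 3460 3456 3456 3408 3404 3380 3372 3364 3360 3332 2960' \
  | awk '{ for (i = 1; i <= NF; i++) s += $i; print s "k (upper bound)" }'
# -> 40460k (upper bound)
```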

As for mod_perl, you don't want to run it as the main webserver, but rather as a proxied web server. It takes a bit of playing around with it, but is worth it once setup properly. =)

Cheers,

Alex

--
Gossamer Threads Inc.
Re: search.cgi - lots of hung processes In reply to
Alex,

Thank you for all the info, it's much appreciated (yes I'm running Linux) - this junior sys admin is learning Smile

I've had another look at the memory usage and on average there's around 16000-20000k free, with very little swap space used, so it would appear that I'm okay. Although I'm still not sure why sometimes there are a few [search.cgi] processes showing that are hours or days old - do these get automatically killed off eventually?

Thanks.
Shaun

Re: search.cgi - lots of hung processes In reply to
Hi,

Just an update: I found out what was causing the search.cgi processes to hang.

I had setup an affiliated search results page for the times when there were no results returned. It calls the merchant's search tool and parses the returned results into a modified Links search results template.

I disabled it yesterday and haven't had any problems since, so I think I need to look at the code I wrote as there's obviously something wrong with it somewhere.

BTW: I'm now slowly getting the hang of managing the server, and I'm a little calmer and a touch more confident with it as well ... Smile

All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
Okay, here's the latest on the memory problems .... and, yes, I'm still having some trouble.

Although the updated search.cgi doesn't hang any more, I'm still losing a large chunk of memory, and I believe I've isolated it to the nph-build.cgi operation, or something related.

For the past week available free memory has been hovering around 15000k, and yesterday, after a busy half an hour of activity, the server fell over and ran out of free memory. I did a remote shutdown/reboot of the server and after starting Apache I found I had gained roughly 45000k above what I've had available recently - much to my surprise, as I'd been pretty sure from the start that I should have had more free memory!

Everything ran OK and there was loads of free memory until I did a rebuild this afternoon, after which I'm back down to around 10000-15000k again.

I'm running nph-build.cgi through a Putty ssh session from the command line with the ' -all ' flag. Any idea why nph-build.cgi would cause the memory to disappear, or could there be something else that is triggered by nph-build that doesn't quit afterwards?
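One thing worth checking: on Linux, memory that "disappears" after a big rebuild is often just the kernel's page cache holding the freshly written category pages - it shows up under Cached in /proc/meminfo rather than Free, and is reclaimed on demand. A sketch over a sample snapshot (the Cached figure is illustrative, not from this server):

```shell
# Add the page cache back to MemFree to see memory that is actually
# reclaimable. Parsed from a sample /proc/meminfo snapshot so the
# output is reproducible; on a live box read /proc/meminfo directly.
meminfo='MemTotal:   127068 kB
MemFree:     13836 kB
Cached:      48210 kB'

echo "$meminfo" | awk '
    /^MemFree:/ { free = $2 }
    /^Cached:/  { cached = $2 }
    END { print "effectively free: " free + cached " kB" }'
# -> effectively free: 62046 kB
```

If the post-rebuild "missing" memory reappears under Cached like this, nothing is actually leaking.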

As usual, any and all help appreciated, and if you think I should move this to the PERL, SQL and more forum please let me know and I'll start a new thread there Smile


All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
There is a --changed option, isn't there? Have you tried that to see what happens?

Installs:http://wiredon.net/gt
FAQ:http://www.perlmad.com

Re: search.cgi - lots of hung processes In reply to
Paul,

Thanks, but that was for 2.x onwards Smile

Anyway, I did a staggered build from the browser (a long process, but worth doing to check), and whilst running 'top' from a Putty login I noticed that for each 'batch' of category pages it built, a chunk of memory appeared to go missing.

I'd expect a fair deal of it to be used during a rebuild, but then I'd also expect it to be 'given back' after everything's completed.

It's got me baffled this one ....

All the best
Shaun

Re: search.cgi - lots of hung processes In reply to
The memory problems have finally been resolved.

After reconfiguring Apache and Perl the server is now using the cache memory properly and isn't falling over anymore Smile

All the best
Shaun