
Mailing List Archive: ModPerl

Tool to create multiple requests



perl at wagener

Feb 6, 2012, 11:05 PM

Post #1 of 8
Tool to create multiple requests

Hello,

I'm currently developing a huge application with mod_perl, unixODBC and MaxDB/SAPDB.
On my development system everything is fine, but on the production system
with > 50 users I get database connection errors, aborted requests, and
so on.

Now I want to ask if someone knows a tool or Perl module with which I can
simulate 50 users. I have a list of common requests, including the query
parameters, in order of appearance. But I don't know how to send them to my
development system to create the same load as on the production system.

Can someone help me with this issue?

Thanks and best regards,

Tobias


aw at ice-sa

Feb 7, 2012, 12:58 AM

Post #2 of 8
Re: Tool to create multiple requests

Tobias Wagener wrote:
> Hello,
>
> I'm currently developing a huge application with mod_perl, unixODBC and MaxDB/SAPDB.
> On my development system everything is fine, but on the production system
> with > 50 users I get database connection errors, aborted requests, and
> so on.
>
> Now I want to ask if someone knows a tool or Perl module with which I can
> simulate 50 users. I have a list of common requests, including the query
> parameters, in order of appearance. But I don't know how to send them to my
> development system to create the same load as on the production system.
>
> Can someone help me with this issue?
>
>
As a simple tool, have a look at the "ab" program that comes with Apache.


bac2bac at bac2bac

Feb 7, 2012, 1:37 AM

Post #3 of 8
Re: Tool to create multiple requests

It's rudimentary but you can try Apache ab, the Apache benchmarking
tool. You probably have it installed already. Try 'man ab' at the prompt.

If you want to emulate 50 concurrent requests, sent twice, you'd do
something like:

ab -c 50 -n 100 http://example.com/etc?etc

If you don't have it installed already, then see
http://httpd.apache.org/docs/2.2/programs/ab.html


If that doesn't provide any clues, check

http://modperlbook.org/html/Chapter-9-Essential-Tools-for-Performance-Tuning.html

Good luck.

g.



On 2/6/2012 11:05 PM, Tobias Wagener wrote:
> Hello,
>
> I'm currently developing a huge application with mod_perl, unixODBC and MaxDB/SAPDB.
> On my development system everything is fine, but on the production system
> with > 50 users I get database connection errors, aborted requests, and
> so on.
>
> Now I want to ask if someone knows a tool or Perl module with which I can
> simulate 50 users. I have a list of common requests, including the query
> parameters, in order of appearance. But I don't know how to send them to my
> development system to create the same load as on the production system.
>
> Can someone help me with this issue?
>
> Thanks and best regards,
>
> Tobias
>
>
>


js5 at sanger

Feb 7, 2012, 2:18 AM

Post #4 of 8
Re: Tool to create multiple requests

On 07/02/2012 08:58, André Warnier wrote:
> Tobias Wagener wrote:
>> Hello,
>>
>> I'm currently developing a huge application with mod_perl, unixODBC
>> and MaxDB/SAPDB.
>> On my development system everything is fine, but on the production
>> system with > 50 users I get database connection errors, aborted
>> requests, and so on.
>>
>> Now I want to ask if someone knows a tool or Perl module with which I
>> can simulate 50 users. I have a list of common requests, including the
>> query parameters, in order of appearance. But I don't know how to send
>> them to my development system to create the same load as on the
>> production system.
>>
>> Can someone help me with this issue?
>>
> As a simple tool, have a look at the "ab" program that comes with Apache.
>
ab usually isn't much use here, because it doesn't strain the database in the
same way - especially since it only takes one URL - and any "production"
scaling, e.g. caching, makes it even less representative. I have a simple
tool I wrote based on curl which can take a list of URLs; that is generally
a lot better.
Even so, without being careful it is difficult to act like real users on
large sites (you will need to extend the code to handle cookie jars!)...
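For anyone rolling their own, a minimal sketch of such a URL-list replayer in Perl (this is not the curl-based tool attached to this mail; it assumes LWP::UserAgent and HTTP::Cookies are installed, and the `urls.txt` file name and child count are placeholders):

```perl
#!/usr/bin/perl
# Fork N children; each replays the URL list in order with its own
# cookie jar, so each child behaves roughly like one logged-in user.
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

my $children = 50;                 # number of simulated "users"
my @urls = do {                    # one URL per line, in order of appearance
  open my $fh, '<', 'urls.txt' or die "urls.txt: $!";
  grep { /\S/ } map { chomp; $_ } <$fh>;
};

for ( 1 .. $children ) {
  next if fork;                    # parent keeps forking; child falls through
  my $ua = LWP::UserAgent->new(
    cookie_jar => HTTP::Cookies->new,  # per-user session cookies
    timeout    => 30,
  );
  for my $url (@urls) {
    my $res = $ua->get($url);
    printf "%d %s %s\n", $$, $res->code, $url;   # pid, status, url
  }
  exit 0;
}
1 while wait != -1;                # reap all children before exiting
```

Logging the status code per request makes it easy to grep for the connection errors and aborts you are seeing in production.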

Things we have come across in large production systems:

* Apache::DBI sometimes causes issues with too many database
connections - we tend to turn it off and use DBIx::Connector
(plus carefully selected caching) to cope with persistent
connections;
* You may be serving too many requests per page to the users
(other than the main page request)... look at
caching/minimising/merging page design elements - background images,
JavaScript, CSS;
* OR serve those assets from a second Apache (use it either as a proxy or
put a proxy in front of the two Apaches);
* Don't over-AJAX pages - lots of "parallel" AJAX requests can
effectively DoS your service.
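For reference, the DBIx::Connector pattern mentioned in the first bullet looks roughly like this (a sketch; the DSN, credentials and table are placeholders):

```perl
#!/usr/bin/perl
# DBIx::Connector instead of Apache::DBI: the handle is verified (and
# re-established if necessary) per block of work, rather than being
# cached for the whole life of the Apache child.
use strict;
use warnings;
use DBIx::Connector;

my $conn = DBIx::Connector->new(
  'dbi:mysql:database=app;host=dbhost',   # placeholder DSN
  'appuser', 'secret',
  { RaiseError => 1, AutoCommit => 1 },
);

# 'fixup' mode only pings/reconnects when a query actually fails,
# which keeps the per-request overhead low.
my $rows = $conn->run( fixup => sub {
  $_->selectall_arrayref('SELECT id, name FROM widgets LIMIT 10');
} );

printf "got %d rows\n", scalar @{$rows};
```

Because nothing holds the connection open between blocks, idle children are less likely to pin a connection and push you over the database's connection limit.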

Another trick is to add extra diagnostics to your apache logs with a
couple of mod_perl handlers:


Apache configuration file:

PerlLoadModule Pagesmith::Apache::Timer
PerlChildInitHandler Pagesmith::Apache::Timer::child_init_handler
PerlChildExitHandler Pagesmith::Apache::Timer::child_exit_handler
PerlPostReadRequestHandler Pagesmith::Apache::Timer::post_read_request_handler
PerlLogHandler Pagesmith::Apache::Timer::log_handler

LogFormat "%V [%P/%{CHILD_COUNT}e %{SCRIPT_TIME}e %{outstream}n/%{instream}n=%{ratio}n] %h/%{X-Forwarded-For}i %l/%{SESSION_ID}e %u/%{user_name}e %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" \"%{Cookie}i\" \"%{X-Requested-With}i\" %{SCRIPT_START}e/%{SCRIPT_END}e" diagnostic

The following module sets up the four environment variables
CHILD_COUNT, SCRIPT_START, SCRIPT_END and SCRIPT_TIME, so you can see
which requests are slow and whether any requests "cluster"... Setting
the "Readonly my $LEVEL" to either normal or noisy will give you more
diagnostics in the error log as well....


Module:

package Pagesmith::Apache::Timer;

## Component
## Author         : js5
## Maintainer     : js5
## Created        : 2009-08-12
## Last commit by : $Author: js5 $
## Last modified  : $Date: 2011-10-26 12:44:20 +0100 (Wed, 26 Oct 2011) $
## Revision       : $Revision: 1489 $
## Repository URL : $HeadURL: svn+ssh://web-svn.internal.sanger.ac.uk/repos/svn/shared-content/trunk/lib/Pagesmith/Apache/Timer.pm $

use strict;
use warnings;
use utf8;

use version qw(qv); our $VERSION = qv('0.1.0');

use Readonly qw(Readonly);
Readonly my $VERY_LARGE_TIME => 1_000_000;
Readonly my $CENT            => 100;
Readonly my $LEVEL           => 'normal';  # (quiet,normal,noisy)

use Apache2::Const qw(OK DECLINED);
use English qw(-no_match_vars $PID);
use Time::HiRes qw(time);

my $child_started;
my $request_started;
my $requests;
my $total_time;
my $min_time;
my $max_time;
my $total_time_squared;

sub post_config_handler {
  return DECLINED if $LEVEL eq 'quiet';

  printf {*STDERR} "TI: Start apache  %9d\n", $PID;
  return DECLINED;
}

sub child_init_handler {
  return DECLINED if $LEVEL eq 'quiet';

  ## Reset the per-child counters.
  $child_started      = time;
  $requests           = 0;
  $total_time         = 0;
  $min_time           = $VERY_LARGE_TIME;
  $max_time           = 0;
  $total_time_squared = 0;

  printf {*STDERR} "TI: Start child   %9d\n", $PID;
  return DECLINED;
}

sub post_read_request_handler {
  my $r = shift;

  return DECLINED if $LEVEL eq 'quiet';

  $request_started = time;
  $requests++;

  return DECLINED unless $LEVEL eq 'noisy';

  printf {*STDERR} "TI: Start request %9d - %4d %s\n",
    $PID, $requests, $r->uri;
  return DECLINED;
}

sub log_handler {
  my $r = shift;

  return DECLINED if $LEVEL eq 'quiet';

  my $request_ended = time;
  my $t = $request_ended - $request_started;

  $total_time         += $t;
  $min_time            = $t if $t < $min_time;
  $max_time            = $t if $t > $max_time;
  $total_time_squared += $t * $t;

  ## Expose the timings to the LogFormat above via the subprocess environment.
  $r->subprocess_env->{'CHILD_COUNT'}  = $requests;
  $r->subprocess_env->{'SCRIPT_START'} = sprintf '%0.6f', $request_started;
  $r->subprocess_env->{'SCRIPT_END'}   = sprintf '%0.6f', $request_ended;
  $r->subprocess_env->{'SCRIPT_TIME'}  = sprintf '%0.6f', $t;

  return DECLINED unless $LEVEL eq 'noisy';

  printf {*STDERR} "TI: End request   %9d - %4d %10.6f %s\n",
    $PID, $requests, $t, $r->uri;
  return DECLINED;
}

sub child_exit_handler {
  return DECLINED if $LEVEL eq 'quiet';

  my $time_alive = time - $child_started;
  printf {*STDERR} "TI: End child     %9d - %4d %10.6f %10.6f %7.3f%% %10.6f [%10.6f,%10.6f]\n",
    $PID,
    $requests,
    $total_time,
    $time_alive,
    $time_alive ? $CENT * $total_time / $time_alive : 0,
    $requests   ? $total_time / $requests            : 0,
    $min_time,
    $max_time;
  return DECLINED;
}

1;




--
The Wellcome Trust Sanger Institute is operated by Genome Research
Limited, a charity registered in England with number 1021457 and a
company registered in England with number 2742969, whose registered
office is 215 Euston Road, London, NW1 2BE.
Attachments: curl-get-pages.tgz (2.82 KB)


milu71 at gmx

Mar 2, 2012, 11:22 AM

Post #5 of 8
Re: Tool to create multiple requests

Tobias Wagener wrote on 07.02.2012 at 08:05 (+0100):
>
> I'm currently developing a huge application with mod_perl, unixODBC
> and MaxDB/SAPDB. On my development system everything is fine, but on
> the production system with > 50 users I get database connection
> errors, aborted requests, and so on.
>
> Now I want to ask if someone knows a tool or Perl module with which I
> can simulate 50 users. I have a list of common requests, including
> the query parameters, in order of appearance. But I don't know how to
> send them to my development system to create the same load as on the
> production system.

You could spawn many processes using WWW::Mechanize.

But I'd rather learn to configure/program a stress tool.
Take a look at the following Java tools:

* JMeter
* Grinder

http://www.google.com/search?q=jmeter+grinder

This will bring up a couple of comparisons that will help you decide.

Michael


perrin at elem

Mar 2, 2012, 11:42 AM

Post #6 of 8
Re: Tool to create multiple requests

On Tue, Feb 7, 2012 at 2:05 AM, Tobias Wagener <perl [at] wagener> wrote:
> Now I want to ask if someone knows a tool or perl modules, where I can simulate
> 50 users.

http://www.hpl.hp.com/research/linux/httperf/

It can take a file of URLs to hit in order, and it can simulate many
more than 50 users even on cheap hardware.

- Perrin


perrin at elem

Mar 2, 2012, 11:49 AM

Post #7 of 8
Re: Tool to create multiple requests

On Tue, Feb 7, 2012 at 5:18 AM, James Smith <js5 [at] sanger> wrote:
> Apache::DBI sometimes causes issues with too many database connections - we
> tend to turn it off and use DBIx::Connector (plus carefully selected
> caching) to cope with persistent connections

Can you say more about this? There are sensible arguments that
DBIx::Connector is less magical and therefore better from a coder's
perspective, but I can't see any obvious reason that persistent
connections would be a problem with one and not the other.

- Perrin


perrin at elem

Mar 4, 2012, 4:03 PM

Post #8 of 8
Re: Tool to create multiple requests

Thanks for the explanation.

> A large system with upwards of 250 databases (on a relatively small
> number of database machines); these are used by up to half a dozen
> web machines, each of which can be forced up to a limit of 50
> children... We regularly broke the connection limit on our MySQL
> instances!
>
> Alternative scenario: a single website, this time with a small number
> of databases, but up to 50-60 developers each having their own sandbox
> which needs access to these core sets of databases!

OK, so both are situations where you have many db logins. It makes sense
that you wouldn't want to keep all the connections persistent then.
There's actually a warning in the Apache::DBI docs about that
scenario. And with MySQL connections you can skip persistence without
taking much of a hit.

Sorry to put you on the spot. There's been some FUD about Apache::DBI
lately, and it's just better for the community to know whether
something is actually broken or not. In this case, Apache::DBI worked
as intended but it was wrong for the situation.

- Perrin
