Gossamer Forum

Reducing Load.

I am writing a Perl script to retrieve records and split them into pages.

my $limit = $q->param('limit') || '0';
my $results_per_page = 50;

# $q, $dbh and $table are set up earlier in the script.
my $query = $dbh->prepare("SELECT * FROM $table WHERE reply=0 ORDER BY msg DESC LIMIT $limit, $results_per_page");
$query->execute();
while (my @row = $query->fetchrow_array()) {
    print "$row[1]<br>$row[2]<br>$row[3] blah blah blah.....";
}
$query->finish();
----------------------------------------------

The table holds about 50,000 rows, and the script will be accessed around 100,000 times a day by web users. After reading the manual for LIMIT, I learned that LIMIT is applied after the whole result set has been produced, so it is not going to reduce any load on the server; I am using it only to split the records across pages.
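
For reference, here is a minimal sketch of the same query driven by a page number rather than a raw offset; the connection details, table name (messages) and column names (msg, subject, body) are stand-ins for whatever the real script uses. If there is an index covering (reply, msg), MySQL can usually read the index in order and stop after offset + 50 rows instead of materialising all 50,000 first.

#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use DBI;

# Hypothetical connection details -- replace with your own.
my $dbh = DBI->connect('DBI:mysql:database=forum;host=localhost',
                       'user', 'password', { RaiseError => 1 });

my $q                = CGI->new;
my $results_per_page = 50;

# Treat the CGI parameter as a page number and compute the offset here,
# instead of accepting a raw offset straight from the browser.
my $page = $q->param('page');
$page = 1 unless defined $page and $page =~ /^\d+$/ and $page > 0;
my $offset = ($page - 1) * $results_per_page;

# $offset and $results_per_page are known to be plain integers, so
# interpolating them into the LIMIT clause is safe.
my $sth = $dbh->prepare(
    "SELECT msg, subject, body
       FROM messages
      WHERE reply = 0
      ORDER BY msg DESC
      LIMIT $offset, $results_per_page"
);
$sth->execute();

print $q->header('text/html');
while (my @row = $sth->fetchrow_array()) {
    print "$row[0]<br>$row[1]<br>$row[2]<br>\n";
}
$sth->finish();
$dbh->disconnect();

The index in question would be something like CREATE INDEX reply_msg ON messages (reply, msg), again assuming those table and column names.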

----------------------------------------------
My Questions:

1 - Is it safe to load such a huge amount of data 100,000 times a day? Or is there a way to load only the records that will actually be printed on each page?

2 - How much data, in terms of size (MB), can safely be held by my @row = $query->fetchrow_array()?

3 - What else can I do to reduce the load on MySQL and the server? Should I use CGI::Fast?

4 - Any other possible query?

Thanks,
Sara.
Re: [Sara_Samsara] Reducing Load.
Does the data change often? If not, then use MySQL's query caching feature to store the results.
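
For anyone trying this, a rough sketch of what that looks like in practice; it assumes a MySQL version with the query cache available and uses the same hypothetical messages table as above (the cache size is an example, not a tuned recommendation):

# In my.cnf, under [mysqld]:
#   query_cache_type = 1      # cache all cacheable SELECTs
#   query_cache_size = 32M    # memory set aside for cached results
#
# Identical SELECT statements are then answered from the cache until a
# write touches the table. With query_cache_type = 2 (DEMAND), only
# queries carrying the SQL_CACHE hint are cached:
my $sth = $dbh->prepare(
    "SELECT SQL_CACHE msg, subject, body
       FROM messages
      WHERE reply = 0
      ORDER BY msg DESC
      LIMIT 0, 50"
);
$sth->execute();

Since the cache keys on the exact query text, each distinct page (each LIMIT offset) is cached as its own entry, so it helps most on the first page or two that everyone hits.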

Cheers,

Alex
--
Gossamer Threads Inc.