Gossamer Forum

Questions about bottlenecks in Links 2 scripts

Hello everyone!

I am a bit concerned about performance bottlenecks in Links 2. I have noticed a lot of places where the *.DB files could be closed much earlier than the scripts currently close them, and I'm wondering whether those areas should be improved.



For example, take this most frequently used sub in db_utils.pl:
Code:
sub get_record {
# --------------------------------------------------------
# Given an ID as input, get_record returns a hash of the
# requested record or undefined if not found.
    my ($key, $found, @data, $field);
    $key = shift;
    $found = 0;
    # The bottleneck starts here
    open (DB, "<$db_file_name") or &cgierr("error in get_records. unable to open db file: $db_file_name.\nReason: $!");
    if ($db_use_flock) { flock(DB, 1); }
    LINE: while (<DB>) {
        (/^#/) and next LINE;
        (/^\s*$/) and next LINE;
        chomp;
        @data = &split_decode($_);
        if ($data[$db_key_pos] eq $key) {
            $found = 1;
            %rec = &array_to_hash (0, @data);
            last LINE;
        }
    }
    close DB;
    # Bottleneck ends here
    $found ? (return %rec) : (return undef);
}
Wouldn't it be much better if we closed the DB right after reading it into an array @db, and then did all the &split_decode and &array_to_hash work in a loop over @db? There are a lot of places like this. I just want to know whether this would help performance, and if so, by how much. Thanks, y'all!
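Just to make the idea concrete, here is a rough sketch of the variant I mean (the name get_record_slurp is made up; it leans on the same globals and helpers as the original: $db_file_name, $db_use_flock, $db_key_pos, &split_decode, &array_to_hash):

```perl
sub get_record_slurp {
# --------------------------------------------------------
# Sketch only: read every line, close the DB immediately,
# then do the parsing work on the in-memory copy.
    my $key = shift;
    my (@db, @data, %rec);
    open (DB, "<$db_file_name") or &cgierr("error in get_record_slurp. unable to open db file: $db_file_name.\nReason: $!");
    if ($db_use_flock) { flock(DB, 1); }
    @db = <DB>;     # slurp the whole file into memory...
    close DB;       # ...so the handle can be closed right away
    foreach (@db) {
        (/^#/) and next;
        (/^\s*$/) and next;
        chomp;
        @data = &split_decode($_);
        if ($data[$db_key_pos] eq $key) {
            %rec = &array_to_hash (0, @data);
            return %rec;
        }
    }
    return undef;
}
```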


============
Enjoy Your Day!
============
SponsorPlus.NET
============
Re: [spnet] Questions about bottlenecks in Links 2 scripts
Nope, that would make performance worse: reading all the data into an array holds the whole file in memory, while a while loop only reads one line at a time.

Having open file handles isn't a problem as long as you close them when you are done.
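To make the difference concrete, here is a toy illustration (not Links code; sample.db is a made-up file):

```perl
# Create a tiny sample DB file for the illustration.
open (OUT, ">sample.db") or die "can't write: $!";
print OUT "1|First\n2|Second\n";
close OUT;

# Line-by-line: only the current line lives in memory ($_),
# no matter how big the file gets.
my $count = 0;
open (DB, "<sample.db") or die "can't open: $!";
while (<DB>) {
    $count++;        # ... process one line at a time ...
}
close DB;

# Slurp: every line of the file sits in @db at once, so a
# 2 MB file costs at least 2 MB of process memory per request.
open (DB, "<sample.db") or die "can't open: $!";
my @db = <DB>;
close DB;
```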

Last edited by: Paul: Jun 24, 2002, 2:36 AM
Re: [Paul] Questions about bottlenecks in Links 2 scripts
Right... yeah. I was thinking about small file sizes and thought that keeping the file open too long would decrease speed because it would create a queue. But when the DBs get larger, this method should help, is that correct?


Hmm... another thought: if I have a 2 MB links.db, use a dynamic detailed.cgi, and get about 10 hits per second (I hope!), the machine would have to read at least 20 MB per second, eh?

Ouch...
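A quick back-of-the-envelope check of that math (pure arithmetic, assuming every hit slurps the whole file):

```perl
# Back-of-the-envelope estimate: file size times request rate.
my $db_size_mb   = 2;     # size of links.db in MB
my $hits_per_sec = 10;    # requests per second
my $mb_per_sec   = $db_size_mb * $hits_per_sec;
print "Read rate if every hit slurps the file: $mb_per_sec MB/s\n";
```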



Last edited by: spnet: Jun 24, 2002, 5:27 AM