Gossamer Forum
Home : Products : Gossamer Links : Version 1.x :

pre-compiled or cached searches

Quote Reply
pre-compiled or cached searches
My brain may not have turned on yet... but is there an easy way to do pre-compiled searches?

I'd like to set up a page with the suggested searches in links (taking the top 50 or so plus things people haven't thought of) and pre-compile them so that they are like the category pages, and don't burn cpu time.

A cache system would be the most elegant, but even a sub-system that would regenerate them every 24 hours would be great (my site doesn't change much more often than that... I add links once a day, basically).

I was looking at Alex's leftover Time::HiRes timings, and one search alone accounts for about 5 hours of CPU time a day... that could be cut significantly with pre-compiled or cached searches.
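The 24-hour cache idea can be sketched in a few lines of Perl: keep each search result in a file, and only rerun the real search when the file is older than a day. This is a minimal illustration, not Links code; the cache directory and the search callback are made-up names you would replace with your own.

```perl
#!/usr/bin/perl
use strict;

# Hypothetical paths -- adjust to your own setup.
my $cache_dir = "/path/to/cache";
my $max_age   = 1;   # in days; -M reports file age in days

sub cached_search {
    my ($term, $run_search) = @_;
    my $file = "$cache_dir/$term.html";

    # Serve the cached page if it exists and is less than a day old.
    if (-e $file and -M $file < $max_age) {
        open my $fh, "<", $file or die "Can't read $file: $!";
        local $/;                       # slurp the whole file
        return <$fh>;
    }

    # Otherwise run the real search and refresh the cache.
    my $output = $run_search->($term);
    open my $fh, ">", $file or die "Can't write $file: $!";
    print $fh $output;
    close $fh;
    return $output;
}
```

You would call it with a code reference that performs the real (expensive) search, e.g. `cached_search('perl', \&run_search)`; repeat visitors within the same day then cost only a file read.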

Quote Reply
Re: pre-compiled or cached searches In reply to
There is a good script that searches through HTML pages and saves the words it finds to an index. It should be possible to do the same with the pages in Links.
The problem, from my experience with this script, is that it takes a long time to run through all the pages.
You will find it at

http://bignosebird.com/carchive/search2.shtml
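The indexing approach described above (walk the pages once, record which words appear on which page, then answer searches from the index) can be sketched like this. This is an illustrative sketch, not the bignosebird script itself; the file names are examples.

```perl
#!/usr/bin/perl
use strict;

# Build a word => list-of-pages index so a search becomes a hash
# lookup instead of rereading every page on each query.
sub build_index {
    my @pages = @_;
    my %index;
    foreach my $page (@pages) {
        open my $fh, "<", $page or die "Can't open $page: $!";
        local $/;                        # slurp the whole page
        my $text = <$fh>;
        $text =~ s/<[^>]*>//g;           # crude HTML tag stripping
        my %seen;
        foreach my $word (split /\W+/, lc $text) {
            next if $word eq "" or $seen{$word}++;
            push @{ $index{$word} }, $page;   # each page once per word
        }
    }
    return \%index;
}
```

Usage would look like `my $idx = build_index(glob "pages/*.html");` after which `@{ $idx->{perl} }` lists every page containing the word "perl". The slow part (reading every page) happens once at index time, which matches the complaint above about the indexing run taking a while.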

Rob
Quote Reply
Re: pre-compiled or cached searches In reply to
Well, you could write a little script that would do something like:

Code:
use strict;
use LWP::Simple;
use CGI ();
my $in = new CGI;
my @terms = qw/a list of search words/;
foreach my $term (@terms) {
    my $term_q = $in->escape ($term);
    my $output = get ("http://www.yoursite.com/cgi-bin/search.cgi?query=$term_q");
    open (FILE, "> /path/to/save/$term.html") or die "Can't write $term.html: $!";
    print FILE $output;
    close FILE;
}

That will run search.cgi for each of the words you want and save the output to a file. Then you just create links manually to wherever the output was saved.

Hope that helps,

Alex
Quote Reply
Re: pre-compiled or cached searches In reply to
Still the same problem as mentioned with the LWP thing. I have downloaded all these modules, but I still don't know where to put them on my server or how to load them.
As far as I know, I have to write (from the parent directory of this file):

use nextDir::nextDir::LWP::UserAgent;

so that the script uses the UserAgent.pm in nextDir/nextDir/LWP.

But this UserAgent in turn wants:
use HTTP::another.pm
and that one wants yet another module.

So is the solution to throw all the .pm files into one directory and change every use x::x::x
to point at that directory???

How can I say 'go one directory up', so I can leave all the .pm files in libs/theme_x
(libs/HTTP, libs/LWP and so on)?

e.g. use ../HTTP::

I can't believe the programmers of these libraries haven't thought of this problem, so there must be a way to tell the .pm files about the directory structure, isn't there? Or was the idea that you collect all the .pm files you need and rewrite the paths yourself?

Please help me with this; I think using as many libraries as possible is very important for my learning Perl.

And just as important:
When I got the libraries, various texts said I must 'make' my perl anew.

If I understand correctly: I could build these libraries into my perl with a new make, and!!! then use them by calling them from a file. Is that right? If not, nobody could write a script using these libraries for general use, because every user would have to run 'make' on their perl.

But if I do run 'make', can I leave the .pm files out afterwards?

It's horrible, please send me some help.
Robert





Quote Reply
Re: pre-compiled or cached searches In reply to
The easiest way to install modules is to use CPAN. Assuming you have root access to the machine, all you do is type:

perl -MCPAN -e shell

then you go through some initial setup questions about your system; you can accept the default for almost everything.

Then just type:

install Bundle::LWP

and it will install LWP plus all the modules it requires into the appropriate place.
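If you don't have root access, the modules can also live under your own account: install them into a private directory and add that directory to Perl's search path with `use lib` before loading them. This answers the "go one directory up" question above without rewriting any `use x::x::x` lines; the path below is an example, not a required location.

```perl
#!/usr/bin/perl
use strict;

# Add a private module directory to Perl's @INC search path.
# The path is an example only -- point it at wherever you put the .pm files.
use lib "/home/youruser/perl-lib";

# Perl will now also look for /home/youruser/perl-lib/LWP/UserAgent.pm,
# /home/youruser/perl-lib/HTTP/*.pm and so on, so the normal module
# names keep working unchanged:
# use LWP::UserAgent;   # uncomment once the modules are installed there
```

Each module's own `use HTTP::...` lines then resolve through the same @INC path, so the chain of dependencies Robert describes is handled automatically.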

Cheers,

Alex