Gossamer Forum

150,000 links on SQL?

Re: 150,000 links on SQL?
a "fetch" is simply serving a static page. All the server does is find the file, and push it out the port. Something Unix is really good at doing. It's originally what all servers were designed to do.

When you start asking the server to use SSI, or to process the file in some way -- .shtml, .php3, whatever -- you are asking the server to parse the file, looking for the tags, then act on those tags.

So, rather than just taking the page and pushing it out the port, the server has to take the file, parse it, look for any tags, work on those tags, THEN push it out the port in altered form.
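To picture the extra work, here's a stripped-down sketch of the kind of pass the server has to make over every file -- nothing like Apache's real code, just the shape of the job, with only the include tag handled:

Code:
#!/usr/bin/perl
# Toy SSI pass: instead of streaming the file straight out the
# port, scan the whole page for tags and act on each one found.
use strict;

my $file = shift or die "usage: $0 page.shtml\n";
open PAGE, $file or die "can't open $file: $!";
my $page = join '', <PAGE>;
close PAGE;

# Replace each include tag with the contents of the named file.
$page =~ s{<!--#include\s+virtual="([^"]+)"\s*-->}{slurp($1)}ge;

print $page;

sub slurp {
    my $path = shift;
    local $/;                       # slurp the whole file at once
    open INC, $path or return "[could not include $path]";
    my $text = <INC>;
    close INC;
    return $text;
}

Every page served this way pays for that scan, whether it contains any tags or not.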

This is partly why they started using separate .html and .shtml extensions. The server would leave the .html files alone and only parse the .shtml files. But since most people look for .html files, many people have enabled SSI on .html as well as .shtml, so every file the server sends out gets parsed. This is a really heavy drain.

Any file with a .php3 extension is going to be even more heavily parsed.

This is why Links generates static pages. It's much less load on the CPU to serve a static page than to serve a dynamic page. Think about it this way:

You have a big can of tennis balls, and you have to toss them to the people in front of you who are practicing against the wall. If all you have to do is reach in and toss any ball to the next person waiting, you can do it without looking at the can of balls, or really at anything but who wants a ball. Just reach in and toss to whoever asked. If you had to toss only blue balls to the boys and yellow balls to the girls, you'd have to first think -- is it a boy or a girl -- then look for the right color ball -- then toss it. Much more work.

This is why people often run two different server configurations -- plain Apache and Apache with mod_perl, for instance. Any simple static request goes to the plain Apache server, which just tosses the ball out the port. Dynamic requests, to /perl-bin for example, are passed to the mod_perl Apache, which can handle the CGI call more efficiently than plain Apache but consumes more resources. This gives you a lean, mean Apache for regular requests and an Apache on steroids for the dynamic ones.

Re: 150,000 links on SQL?
It's not so much the parsing that is server-intensive; it's actually executing the command. For instance, if you have an SSI call for a banner ad program like:

Code:
<!--#exec cgi="/path/to/banner.cgi"-->

The server has to fork a new process, load perl, parse and compile the program, return the results, and clean up. If you do this for every page, it can be quite intensive under high loads. PHP3 is similar, unless you embed it inside Apache, where you don't have the overhead of forking a new process. This is basically what mod_perl does (as well as some other optimizations).
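For a feel of what that buys you, here's a minimal mod_perl 1.x content handler -- just a sketch of the idea, not anything from Links; the module name and the banner markup are made up:

Code:
# Compiled once when Apache starts; each request then skips the
# fork, the perl startup, and the recompile.
package My::Banner;
use strict;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;                          # Apache request object
    $r->send_http_header('text/html');
    $r->print(qq{<img src="/banners/ad1.gif">\n});
    return OK;
}
1;

You'd point a <Location> at it with SetHandler perl-script and PerlHandler My::Banner in httpd.conf.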

Jimz, sorry, I missed your question:

Quote:
Alex, how much space does the current linkssql demo take up?

The current demo uses 36 MB of disk space for the actual mysql data, and another chunk for the HTML. If all you want to do is replicate DMoz, and not handle the additions or manage the content of the database, then I wouldn't recommend Links SQL. If you do want to manage it, then it would be a good tool; however, I suspect you would need a dedicated server.

Cheers,

Alex
Re: 150,000 links on SQL?
Alex,
What I'd like to do is the following:
Create a nice-looking portal with email, weather, news, homepages, etc.
Manage a links database. I'll use a lot of the DMoz directory structure, up to a point. Links will be added by my visitors, but I want to build the system so that as more visitors come, it doesn't fail.
Allow partners to join my network -- say Company A wants to have a portal for their users. I would like them to be able to customize the page layout to their liking, and then a Perl script, probably using FastCGI, would create the page with the appropriate layout (see the sketch below). I figured that if I use FastCGI there will be very little load time, as the script is always loaded and waiting to be run.
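Something like this loop is what I'm picturing -- just a rough sketch; the partner parameter and the layout lookup are made up for illustration:

Code:
#!/usr/bin/perl
# FastCGI keeps the script loaded: it starts once, then sits in
# a loop answering requests, so there's no per-hit startup cost.
use strict;
use CGI::Fast;

while (my $q = new CGI::Fast) {
    my $partner = $q->param('partner') || 'default';
    print $q->header;
    print "<html><body>Page built with the '$partner' layout</body></html>\n";
}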

Sounds kinda complicated, eh?
Re: 150,000 links on SQL?
Is the SSI call
Code:
<!--#include virtual="/cgi-bin/whatever.cgi?etcetc" -->
any less server-intensive? I'm using it on a dedicated Solaris SPARC 10 that's currently doing 20k+ calls in 14 hours. It seems fast compared to my old banner scripts, which couldn't keep up -- 'course it's written in C++, and I expect that's most of the difference.

pugdog,
Quote:
This is partly why they started using separate .html and .shtml extensions. The server would leave the .html files alone and only parse the .shtml files. But since most people look for .html files, many people have enabled SSI on .html as well as .shtml, so every file the server sends out gets parsed. This is a really heavy drain.

There is a workaround for this. In a nutshell, I'm using DHTML to embed the .shtml in <IFRAMES> or <LAYERS> (it shifts on the fly using browser detection) inside the .htm or .html file. That way SSI is invoked only where there is an embedded .shtml, not on every .html file. It also has the side benefit of breaking the cache on static pages. I've done a write-up on this, but the technique has evolved quite a bit since I first posted it. I'll email it to anyone who might have an interest (or update the web page).

Re: 150,000 links on SQL?
Jimz:
Quote:
Create a nice-looking portal with email, weather, news, homepages, etc.
Manage a links database. I'll use a lot of the DMoz directory structure, up to a point. Links will be added by my visitors, but I want to build the system so that as more visitors come, it doesn't fail.
Allow partners to join my network -- say Company A wants to have a portal for their users. I would like them to be able to customize the page layout to their liking, and then a Perl script, probably using FastCGI, would create the page with the appropriate layout. I figured that if I use FastCGI there will be very little load time, as the script is always loaded and waiting to be run.

The next version of Links SQL will have a page.cgi that will let you load pages dynamically. This will be useful where you want to give other users a URL on your site but serve them with different template sets.

I'm hoping to have the next version done by the end of next week. It will be a pretty major upgrade. If you are planning to do this seriously, you might want to consider a dedicated mod_perl server for the best performance.

bjordan:

No, that SSI call is just as intensive. Since your program is compiled, though, you save a couple of steps, so it's pretty much as fast as you will get with CGI. Faster alternatives are technologies that embed the program in the web server.

Cheers,

Alex
Re: 150,000 links on SQL?
Alex,
Someone in another forum here stated that you will be working on syncing Links2 and Links SQL so that the only difference is the DB functions, and modifications will work on both. Is this true? If so, what is the approximate release date? That would come in handy for me: once I hit the point where the flat databases respond slowly or incorrectly, I could switch to Links SQL without having to install a totally new version of Links. In the new release, is the price still standing at $450?
Re: 150,000 links on SQL?
I said that, since Alex had indicated it here as a 'wish' of his own. He didn't have a date on it, and I didn't mean to imply he did. It was in response to someone else's 'wish' for a Links 3.0 ... I took my guess as to what Alex had in mind.


