Gossamer Forum
Home : Products : Gossamer Links : Version 1.x :

150,000 links on SQL?

(Page 1 of 2)
Quote Reply
150,000 links on SQL?
We are in the process of indexing a pretty sizeable database (150,000+ links). Is Links SQL the answer, or is there something else you would recommend for a database of this size?

Thanks for any help with this.
Quote Reply
Re: 150,000 links on SQL? In reply to
Hello!

How many categories do you have?

My problem is this:

I plan to have at least 2,000 categories in my database, since it makes sense to classify the links that way (about 10 links per page).

Links SQL has been giving me a big headache for a few months now. Alex designed a modification called the Multi_Category mod, so that one can insert a link into more than one category.

But in the submit, add, modify, and admin sections, the full category list appears every time. This gives me nightmares. For example:

If I want to validate 50 links, then

all 50 links will be loaded, and with every link the complete list of 2,000 categories will be loaded as well. To view that one page I have to pay the lousy German telecom about $0.50! Alex has given a very simple answer, which you will find in this forum: I could turn the lists off and use category numbers instead of names. Who can accept that? To work with numbers I would always need a chart in front of me, and of course the list of categories keeps growing. So I would have to shift my workspace to the piano table (because using Links SQL online is too expensive in telephone costs), keep the category listing in front of me, and look up numbers while I work on links. Later I may realize that while turning the pages I was still online, and the telephone bill turns out to be more than just keeping the category names in there! Smile

The real problem is that Alex designed Links SQL mainly with existing Links v2.0 users in mind. All further development keeps Links v2.0 in mind, so keeping a single MySQL table for categories and importing it was the obvious solution. The database design is similar to Links v2.0's, and in my view that is the source of Links SQL's built-in limitations. Questioning its fundamental design principles and programming logic, I do not agree with what was done in the category area. If the design had been done without Links v2.0 in mind, many things would be different.

Currently I have an inflow of 150 link submissions per day, about 3,000 per month (I got picked up by LinkExchange and get submissions from them). If I get onto other sites' listings, I may see a volume of 3,000 per day. This is normal for large directories.

So the claim that "Links SQL is for large directories" may only be partially true if you consider the categories problem. One has to live with these inherent problems until Alex feels that what I am asking for makes sense and releases an upgrade that helps all users. So far he has asked me for money, saying in an earlier thread that it is a custom need.

So that is what I wanted to tell you. If you have 200,000 links, Links SQL itself will have no problem, since it uses the MySQL database. The programming is great, very nice indeed. But you will have to customize it a lot, for all the understandable reasons.

The only problems you may have are:

1 - Less support from other users sharing their experience.
2 - A lot of work in handling MySQL databases and related problems.
3 - If many categories pop up in the admin/add/modify sections every time, you need to weigh a year of telephone costs against the value of the product.
4 - Modification of the scripts may be difficult.
5 - Less support from Alex, at least in terms of time; it may not be immediate.
6 - A horror if you have made a mistake somewhere, or if there are server compatibility problems or anything of the sort (if you are a beginner).

Above all, this program is great and I like it very much. I ordered a license of Links SQL before anyone else, even before the demo was put up. To this day I have not been able to solve the Multiple_Category modification problem, and I think I will not be able to. I had my first copy on the 23rd of June and have not been able to start my website, nor is there any chance of starting it in the near future.

After paying a handsome amount for the product, people want to use it, not become programmers to solve its inherent problems.

What would be very interesting is what would happen if Alex inserted 5,000 categories and 500,000 links into the demo hanging on the web. That is a challenge from me to Alex! Then let's see if this script is really meant for large directories! The phrase "large directories" is a salesman's word that can give a false impression of the program. From my point of view, Links v2.0 is for very small to small directories, and Links SQL is for small to medium-size directories.

So if you have little knowledge of Perl or MySQL, be careful: this product will not be easy and is not meant for novices. If you are an expert, this script can do a lot, and you can use the many features it offers.

------------------
rajani

Quote Reply
Re: 150,000 links on SQL? In reply to
I would disagree with you on a number of points:

Quote:
But in the submit, add, modify, and admin sections, the full category list appears every time.

There is an option to turn category listings off; you enter the category ID instead. This will save you from downloading really large select lists. You will need a separate window to look up the IDs as needed, but it does save on download time.

Quote:
A lot of work in handling MySQL databases and related problems.

MySQL is a really good database, and it's considerably easier to use than any other comparable product. If you, or your ISP, are not comfortable with how to use MySQL, have a look at our hosting plan, or look for an ISP who knows what they are doing. Wink Also, with the latest version you can now import and export straight from the web, minimizing the amount of MySQL (if any) you'll need to learn.

Quote:
If many categories pop up in the admin/add/modify sections every time, you need to weigh a year of telephone costs against the value of the product

Again, see the db_gen_category_list feature.

Quote:
Modification of the scripts may be difficult.

All the HTML is templatized, so changing the design should be easy. If you want to start changing the code, you should be a programmer. =) Have a look at other products like Hyperseek: if you change the code, you void any sort of support from the author.

Quote:
Less support from Alex, at least in terms of time; it may not be immediate.

I spend considerably more time on support for Links SQL than for any other product. You won't get free modifications to the program, but you will get free advice, and I'd be happy to guide you on what to do.

Quote:
A horror if you have made a mistake somewhere, or if there are server compatibility problems or anything of the sort (if you are a beginner).

Installation is included. This is because I can quickly spot any missing DBI/DBD Perl modules, MySQL not being set up properly, or other initial configuration problems. Once the program is up and running properly, you can start changing the design, etc.

Quote:
What would be very interesting is what would happen if Alex inserted 5,000 categories and 500,000 links into the demo hanging on the web.

I'm not sure how useful that is. I'd rather the product perform really quickly with ~200,000 links than sacrifice features for faster performance at 5 million. Very few people have the resources required to run a directory of that size.

If you have any questions, please don't hesitate to ask.

Cheers,

Alex

Quote Reply
Re: 150,000 links on SQL? In reply to
I just wanted to know if this will work for a directory of about 150,000+

And what other programs do I need to run this efficiently?

Thanks, Mark
Quote Reply
Re: 150,000 links on SQL? In reply to
 
Quote:
I just wanted to know if this will work for a directory of about 150,000+
And what other programs do I need to run this efficiently?

=) Yes, it will work on a directory of 150,000 links. You will need a system that meets the system requirements. I'd recommend a Linux system with MySQL and at least 256 MB of memory.

Cheers,

Alex
Quote Reply
Re: 150,000 links on SQL? In reply to
I just set up a new Unix machine that should fit the bill. Now I need to learn MySQL.

Thanks, Mark
Quote Reply
Re: 150,000 links on SQL? In reply to
What's the maximum number of entries Links SQL can take?
This is very important, so please only answer if you are sure, or you've tried it.

Thanx
Quote Reply
Re: 150,000 links on SQL? In reply to
MySQL has stats posted on their site. One site has been running 1,000,000+ rows with 40 fields without a problem.

NT is a different story. If you ever look at the CPU usage of a Windows machine vs. Unix, it's orders of magnitude higher. Unix was designed to be CPU-friendly. NT was designed to be flashy and take the market away from IBM -- and it did, but using dirty tricks.

I was running NT 4.something, loaded Word, and saw 90% CPU usage with the machine just sitting there doing nothing.

On the other hand, I ran an 8-line BBS under OS/2 Warp in 1995 on an i486 with no performance problems and plenty of spare power for admin use.

NT is _NOT_ a heavy-duty server platform, no matter what M$ says. Even when they rig the tests, they can't compete. How many Unix (non-anonymous-FTP) sites have you gone to and gotten the message "Server License Exceeded..." or "No connections available"?
Only under NT do you see that white ODBC screen telling you you can't get in.

In fact, most places that need NT for some reason (a Windows-compatible program) run it as a separate server dedicated to that process. The web interface is still Unix. You get more *bang* for your hardware with Unix than with almost any other OS.

If you are worried about what SQL can do, don't use NT. You will need 3x the hardware (minimum) under NT to do what Unix can do -- and you'll have all the extra headaches. If you have to use NT, use it only where needed.

Even M$ is trying to buy into Linux. There's good reason. It works better than NT -- and people _LIKE_ it.... Smile

Our servers run Solaris 7 only because that's what my ISP uses, and I figured going with their Unix version would cause fewer conflicts and be easier to set up. If I were doing it myself, I'd go Linux (even over BSD, which is what I started with).

But... when you pick a Unix flavor, it's best to stick with it.

On this issue I feel like I do when I see kids smoking: "Why?" With all the data out there, why start? With all the info out there on the benefits of Unix/Apache as a web server, and the problems of NT/anything as a web server, why do it?? Smile

If you already are stuck with NT, moving the front end to Unix can save headaches and prevent spiraling hardware costs.

Considering the costs of maintaining an NT server, it's even cheaper for a pretty busy site to put in a Unix box for the web serving and leave NT for legacy programs. Once you get the Unix machine working, you leave it alone. I reboot once every 2-3 weeks for S&G's, just to make sure I don't forget how to handle any 'situation' that may arise on restart. Unix builds complacency. Smile How many NT machines do you know that can brag about 2 years of uptime?

Heck, my Windows machine's uptime is usually measured in hours or minutes, not days or months. Wink

Pardon the ramblings... but if you are _REALLY_ concerned about performance and capacity, switch to Unix. On NT, what the machine can do almost depends on the phase of the moon and how it is feeling that day.



Quote Reply
Re: 150,000 links on SQL? In reply to
OK, I'm planning to host with 9Net Avenue, on an NT machine with MS SQL. I don't know the details of the server yet.

I have my own server, but it's still under testing and tweaking, because it's new. I don't have MS SQL installed, and I don't think I will for the first 6 months.

This is the situation. I was asking because I had an argument with the people I'm developing this site for about the number of entries SQL can handle. I just wanted to give them answers based on actual experience.

Quote Reply
Re: 150,000 links on SQL? In reply to
I don't think the maximum is known. The more powerful your hardware, the more "real-time" performance you can get out of it for any given load.

MySQL can handle 1,000,000+ entries in a table. If you have that many entries, you really should consider a larger database product with added features. No matter what the absolute numbers are, MySQL and Links SQL are aimed at the middle ground between the needs of a 'flat-file, sequential access' system and a high-end product like Oracle.

The advantage of an SQL product, and the performance of MySQL in particular, is that with better programming, more hardware resources, and careful planning, you might be able to forestall the investment in a high-end package.

Today, everyone is looking at 'bigger' and 'more'. That's only part of the story. Just hoarding links is not going to serve any purpose; what you are doing with them is what matters.

If you have 1,000,000+ links, and the users to support it, you probably have multiple servers and multiple connections to the Internet. You can load balance between them and boost performance by putting the database and cgi on a server compiled with mod_perl, and use a lean version of the server to handle the rest of the 'front end' work.

2 years ago, this sort of product wasn't even dreamed about for the ordinary user running on a PC or virtual account. If you are thinking of growing into 1,000,000+ links over the next couple of years, well, LinkSQL will change, MySQL will change, and so will the platforms they run on.

After looking around -- and looking around every 2-3 days for anything new -- I have not found anything better than Links 2.0 for flat-file link management and LinkSQL + MySQL for the move to SQL.

I have left flat-files behind for the most part, and will be migrating everything important to database-backed service simply because it's 'safer' for the data than flat-files and 'flock' and it significantly reduces server load and programming effort to add/maintain/develop programs that need to store and access data -- links, messages, photo-galleries, user-profiles -- anything.
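The "flat-files and 'flock'" pattern being left behind here can be sketched roughly as follows. This is an illustrative Python sketch (Links itself is Perl), and the file name and record layout are invented for the example:

```python
import fcntl

# Hypothetical data file in the Links 2.0 pipe-delimited style.
PATH = "links.db"

def append_record(fields):
    """Append one record under an exclusive lock so that concurrent
    CGI processes cannot interleave partial writes."""
    line = "|".join(fields) + "\n"
    with open(PATH, "a") as fh:
        fcntl.flock(fh, fcntl.LOCK_EX)   # block until we own the file
        try:
            fh.write(line)
            fh.flush()
        finally:
            fcntl.flock(fh, fcntl.LOCK_UN)

append_record(["1", "Example Site", "http://example.com"])
```

flock prevents two processes from interleaving writes, but a crash mid-write can still corrupt the file; that extra atomicity is part of the 'safety' a database-backed store buys you.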

Back to the original point -- Before asking what LinkSQL will handle, provide the platform -- Unix or NT, RAM, Processor, diskspace, server load, and any other parameters that might affect things - such as if mod_perl is installed.

Given that, Alex could probably come up with some numbers based on his test of importing the ODP on his P3 (?).

I'm running 4 websites -- lots of CGI, MySQL, and more -- on a Sun SPARC 10. I've never seen a CPU load above 1.8 running any of these services, and I'm still running Links 2.0 for the main site; the SQL version is still being hacked for prime time.

http://www.postcards.com is _not_ a low volume site, and it's heavily CGI and graphics based.

The point being, on an NT P3 the same situation (without the MySQL load) ran at almost 70% CPU. Not very performance-friendly.

I guess rather than asking "What's the maximum" and forcing people to come up with a guess, why not post your situation and say "Will LinkSQL and MySQL handle this?"

Quote Reply
Re: 150,000 links on SQL? In reply to
Alex, how much space does the current Links SQL demo take up? I am starting up a fully integrated portal. My three options are as follows:
1. Somehow find the $450 for Links SQL.
2. Use a script to mirror the dmoz site.
3. Use Senga's catalog.
Options #1 and #3 both use SQL. The difference is the price. Senga has a full copy of dmoz working on their site. A lot of it now uses PHP, but that's not a problem; it is actually better for me, as the pages are generated dynamically and I do not have to store all the pages. PHP would access the MySQL database. Is there a way for Links SQL to do this?
Jim
Quote Reply
Re: 150,000 links on SQL? In reply to
jimz: What is Senga's catalog?
Quote Reply
Re: 150,000 links on SQL? In reply to
Senga's Catalog can be found at senga.org.
Quote Reply
Re: 150,000 links on SQL? In reply to
PHP is a server-side scripting language, which means the server parses and processes each and every page you serve. This puts a heck of a load on a server, and an NT machine is already compromised in that area.

PHP3 bloated my Apache binary to over 1 MB per copy, more than doubling the size of the server process.

The other problem is that PHP is a _SERVER_ scripting language, so you are limited to running the processes through the server.

You have to call the program through the server, so I don't know how easy it is to run cron jobs...

The plus of PHP is that it can dynamically serve pages -- which is what it was designed to do.

The downside is that it's not designed to _RUN_ a website the way Links and Links SQL are.

You can mix PHP with LinkSQL to increase the features of the pages, and that is something I'm looking at. More dynamic reporting of statistics from the database.

FWIW -- PHP 4.0 is in beta release.

PHP is a way to access the database and provide dynamic content to the user. You still need some 'back end' to manage all the data.

As far as the space required by Links SQL: the program itself is tiny, insignificant. The MySQL database will be about the same size for the same amount of data. The difference is that Links SQL allows MORE data to be stored, and it generates indexes, which take up space. Indexes take up space no matter what type of implementation you use, and while they are "part" of the database, they are really 'outside' it conceptually.

Links SQL has also been designed to be flexible. While many people have started using the dmoz database as the 'test' database, dmoz-style systems are optimized for that one sort of directory (to make themselves look good) but are more difficult to alter. Links was made to be adaptable to YOUR situation, and one of the situations it was shown to handle is the dmoz database.

Quote Reply
Re: 150,000 links on SQL? In reply to
Hello!

A bit off topic, but I thought it might help someone. Look at the following URLs and you will find good scripts there.

1 - www.phpwizard.net for excellent scripts. I have tested them.

2 - www.pair.com
They are the best!!! Smile

The main problem with Links SQL is its administration once the database goes beyond 10,000 - 30,000 links. The problem is handling the growing category tree and shifting links from one category to another. Also, the import/export scripts are just basic and do not provide many features for handling a database of 100,000+ links.

Somehow this validation routine seems designed for those lucky guys like pugdog, who have DSL and are constantly connected to the internet (I am very jealous Wink). I am not. In Germany, a line like DSL costs $5,000+ for installation and $700 - $1,000 monthly! Every validation over a dial-up connection, done in a batch, costs money. Links SQL has given me a lot of nightmares, and it will give them to anyone in this situation.

The best way is to simply export the validation data, compress it, download it to the local system, and work on it there. Afterwards, upload it and build.
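That export-compress-edit-upload workflow might look something like this. A Python sketch with a hypothetical record layout, not Links SQL's actual export format:

```python
import gzip

# Hypothetical pending-validation records, pipe-delimited as in Links 2.0.
pending = [
    ("101", "Some Site", "http://example.com", "No"),
    ("102", "Other Site", "http://example.org", "No"),
]

# Export and compress before pulling the batch down over an expensive dial-up line.
with gzip.open("validate.txt.gz", "wt") as out:
    for rec in pending:
        out.write("|".join(rec) + "\n")

# ... download, validate offline, re-upload ...

# Re-read the (edited) batch for import.
with gzip.open("validate.txt.gz", "rt") as fh:
    rows = [line.rstrip("\n").split("|") for line in fh]
```

The point is only that the expensive online time is reduced to one download and one upload; the actual editing happens offline.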

Honestly speaking, if one uses PHP3, then one does not really need any script, or to pay for one. If you are able to work with PHP3 + MySQL, then you do not need Links SQL! Just feed the database with the help of PHP3 and get the data out again with searches on the MySQL tables, again via PHP3! Smile



------------------
rajani

Quote Reply
Re: 150,000 links on SQL? In reply to
 
Quote:
Somehow this validation routine seems designed for those lucky guys like pugdog, who have DSL and are constantly connected to the internet (I am very jealous). I am not. In Germany, a line like DSL costs $5,000+ for installation and $700 - $1,000 monthly! Every validation over a dial-up connection, done in a batch, costs money. Links SQL has given me a lot of nightmares, and it will give them to anyone in this situation.

Argh! Have you seen the option to turn off generating the drop-down lists?!

Quote:
Honestly speaking, if one uses PHP3, then one does not really need any script, or to pay for one.

PHP is a programming language! You are still going to have to create and build something to manage the links. You will also need to understand SQL, how to write queries, etc. You are really misleading people and trivializing the amount of work involved.

Alex

Quote Reply
Re: 150,000 links on SQL? In reply to
OK, here's my reply:

Quote:
PHP is a server-side scripting language, which means the server parses and processes each and every page you serve. This puts a heck of a load on a server, and an NT machine is already compromised in that area.
Yes, I know PHP is a server-side scripting language. It has been designed with databases like mSQL in mind. I'm the biggest anti-Micro$oft person possible. The server I'm using is a pretty fast one: 500 MHz, 256 MB RAM, etc.

Quote:
PHP3 bloated my Apache binary to over 1 MB per copy, more than doubling the size of the server process.
Wow, 1MB!

Quote:
The other problem is that PHP is a _SERVER_ scripting language, so you are limited to running the processes through the server.
I don't care; of course it is. Perl is a **_SERVER_** scripting language too. PHP is fine.

Quote:
You have to call the program through the server, so I don't know how easy it is to run cron jobs...
Did you not read what I wrote? It's a freakin' Perl script using PHP to display the category pages to the user.

Quote:
The plus of PHP is that it can dynamically serve pages -- which is what it was designed to do.
Correct, that's why it is being used.

Quote:
The downside is that it's not designed to _RUN_ a website the way Links and Links SQL are.
It's meant to provide database integration in standard HTML pages.

Quote:
You can mix PHP with LinkSQL to increase the features of the pages, and that is something I'm looking at. More dynamic reporting of statistics from the database.
I'm looking to get fast, easy pages, dynamic content, and the least amount of space used on the server.

Quote:
PHP is a way to access the database and provide dynamic content to the user. You still need some 'back end' to manage all the data.
Now I'm starting to assume you did not look at the script, because the whole backend management is in Perl; the only PHP scripts are used to display the directory.

Quote:
As far as the space required by Links SQL: the program itself is tiny, insignificant. The MySQL database will be about the same size for the same amount of data. The difference is that Links SQL allows MORE data to be stored, and it generates indexes, which take up space. Indexes take up space no matter what type of implementation you use, and while they are "part" of the database, they are really 'outside' it conceptually.
I don't care if the script takes 5 or 10 MB. The real problem was when I was using a copy of Links 2.0 with 3,000 categories and 7,000 links: the directory itself was about 30 MB. I don't have the money right now to throw toward server space. Once I start out and begin earning money from banners, I will be able to pay for better hosting services. Then I will toss the PHP (unless there is no difference in speed).

I doubt I will be using either script. It will probably turn out to be a mostly rewritten Links 2.0. Now that I think about it, all the pages will have to be automatically generated. I'm hoping for partnerships with some other places to provide a portal to their users; it would be almost impossible to provide 5 to 10 copies of the standard HTML tree. The whole thing will most likely be made out of PHP, Perl, and C/C++: PHP to do the front-end stuff, and Perl and C/C++ to do the backend. It's just going to be a whole lump of code. This is going to be interesting to do.
Quote Reply
Re: 150,000 links on SQL? In reply to
If Links 2.0 consumed this much space:

Quote:
Links 2.0, with 3,000 categories and 7,000 links; the directory itself was about 30 MB

MySQL and any other DBMS will consume _MORE_ because they have inherent overhead. Links is a flat-file database: every character (except the single '|' between fields) was data. _SOME_ sort of field delimiter is needed, so this is as space-efficient as it gets. Links 2.0 also did not keep any indexes around; it searched the database each time you went looking.

MySQL will keep all that data, _PLUS_ extra overhead in each field. Your database will undoubtedly be _much_ larger.

Also, your Index files can take up as much room -- or more -- than the database itself, depending on what you have indexed.

Links is about as efficient on space as anything you will find. Only data compression could make it more space efficient.
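As a concrete illustration of how little overhead the flat-file format carries (a Python sketch; the field names here are hypothetical, not Links 2.0's actual schema):

```python
# One Links 2.0-style record: every byte except the '|' delimiters is data.
FIELDS = ["ID", "Title", "URL", "Category"]   # hypothetical field order
record = "42|Example Site|http://example.com|Computers/Software"

link = dict(zip(FIELDS, record.split("|")))

# Overhead is exactly one delimiter byte per field boundary:
overhead = record.count("|")   # 3 bytes for this 4-field record
```

A DBMS stores the same values plus per-field and per-row bookkeeping, which is where the size difference comes from.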

It's _NOT_ efficient on CPU usage or database lookup. You _could_ make it more efficient by developing your own index routine for searching, but your index would have to be rebuilt after every change to the database.

The RDBMS does all that, sacrificing space for speed and look up efficiency.

Think about an M$ Word document. Stored in Word format with all the formatting information, it's about 800 KB for 55 pages (a real-world example). Those same 55 pages saved as plain text came to about 120 KB.

This is the type of thing you will see moving from flat-file to RDBMS, _BUT_ disk is cheaper than CPU and time, so that is where the tradeoff is made.

And, BTW, Perl is _NOT_ a server scripting language. It's a programming language that can be called by a server to do things. It does not depend on the server at all, except when you _INCLUDE_ server-dependent code for CGI, where it needs the environment variables. Perl is a stand-alone item. PHP3/PHP4 is not: it requires the server to parse the files and do the work of presenting the information to the user. Perl doesn't need the server; Perl works just fine over Telnet and direct logins to the server.

I haven't tried it, but I would imagine that Links SQL could run pretty well without the web server for most of its features. It only uses the server to gather input and output the results; all the work is done in Perl and the OS.
Quote Reply
Re: 150,000 links on SQL? In reply to
ok,

Quote:
MySQL and any other DBMS will consume _MORE_ because they have inherent overhead. Links is a flat-file database: every character (except the single '|' between fields) was data. _SOME_ sort of field delimiter is needed, so this is as space-efficient as it gets. Links 2.0 also did not keep any indexes around; it searched the database each time you went looking.

MySQL will keep all that data, _PLUS_ extra overhead in each field. Your database will undoubtedly be _much_ larger.

OK, forget about that; I was aiming more toward the HTML files. The database itself was 2 MB; the HTML pages were around 30 MB.

Quote:
Links is about as efficient on space as anything you will find. Only data compression could make it more space efficient.
Yes


Quote:
It's _NOT_ efficient on CPU usage or database lookup. You _could_ make it more efficient by developing your own index routine for searching, but your index would have to be rebuilt after every change to the database.
I will rewrite a lot of the stuff in db_utilities in the regular Links. Basically the whole script will be rewritten.


Quote:
Think about an M$ Word document. Stored in Word format with all the formatting information, it's about 800 KB for 55 pages (a real-world example). Those same 55 pages saved as plain text came to about 120 KB.

M$ Word... Eww, I don't want to think about that bloated piece of what they call "software". More like bloatware.

Quote:
And, BTW, PERL is _NOT_ a server scripting language. It's a programming language that can be called by a server to do things. It does not depend on the server at all -- except when you _INCLUDE_ server-dependent code for CGI where it needs the environment variables. Perl is a stand-alone item. PHP3/PHP4 is not. It requires the server to parse the files and do the work of presenting the information to the user. Perl doesn't need the server. Perl works just fine over Telnet and direct log on to the server.
Perl is a scripting language, not a programming language. Programming languages are languages that are compiled; Perl is not. Therefore it's a scripting language.

Quote:
I haven't tried it, but I would imagine that Links SQL could run pretty well without the web server for most of its features. It only uses the server to gather input and output the results; all the work is done in Perl and the OS.
Correct.

Also, I thought about it more, and I have decided to drop PHP from my project. I will most likely use Perl, unless another scripting/programming language comes along that works well with Perl scripts (including output from a Perl script) and with MySQL natively.
Quote Reply
Re: 150,000 links on SQL? In reply to
Again, we go round the mulberry bush...

In the trade-off of _size_ vs. _server_load_, the static pages that Links generates are far more efficient than any server-parsed alternative, no matter what the language. All the server does is grab the file and toss it out the port. The reason for statically generating pages is the same reason sites cache frequent requests: performance.

Since the data written to the static pages doesn't really change much between builds, it's far more efficient to generate it once. If you want a dynamic 'rate' page, that's possible with Links SQL -- even between builds. If you change the way 'ratings' are handled a bit, the program can write the new rating to the link's record in real time without going through an intermediate table (Links 2.0 used the update-on-build process for practical reasons with flat files). Then your ratings page could access the database and recalculate the ratings on each access -- but this would consume ENORMOUS amounts of CPU even on a slow site.

In order to make this even worthwhile, you'd have to change the whole logic so that every time a rating was added, the new 'rank' was computed, compared to the overall rank list, and the overall rank list updated with the new value.

It's really easy to get caught up in 'dynamic everything' and lose sight of what your site is _really_ trying to do. You want it to remain fresh? Rebuild it 4x a day; whatever load that puts on the server, it's less than dynamic access. Searches are dynamic, since they are a 'user request', so minimize their impact on the server with as efficient an index and search routine as possible. Spend an extra minute or two of CPU time during the build to generate the index. It will pay off.
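The build-once argument can be made concrete with a small sketch: render every category page up front, so each hit afterwards is a plain file fetch. Python for illustration; the records and template are invented, not Links' actual format:

```python
# Records to publish; fields and template are made up for illustration.
links = {
    "Software": [("Example Site", "http://example.com")],
    "Hardware": [("Other Site", "http://example.org")],
}

def build_page(category, entries):
    """Render one category page. Done once per build, not once per hit."""
    items = "\n".join(
        f'<li><a href="{url}">{title}</a></li>' for title, url in entries
    )
    return f"<html><body><h1>{category}</h1><ul>\n{items}\n</ul></body></html>"

# The "build": every page is rendered up front; serving is then a plain fetch.
pages = {cat: build_page(cat, rows) for cat, rows in links.items()}
```

The rendering cost is paid once per build regardless of how many visitors later request the page, which is exactly the caching argument above.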

And as for perl being 'compiled' or 'scripted' check out this by Tom Christiansen:

http://www.perl.com/.../comp-vs-interp.html

for what Perl is and isn't. Perl lives in the ether between realms, but it's not 'just' a 'scripting' language, that's for sure -- although it _can_ be used to write scripts. With mod_perl and OOP, Perl is showing just what it is and can do.

Perl "is". Smile
Quote Reply
Re: 150,000 links on SQL? In reply to
It doesn't matter about ratings; I'm not even going to include that feature. The thing I don't have is 400 MB to throw toward the site. If my host then wants to include my site as a portal for their users, I have to make the next 400 MB of pages, and for any other people who want to partner with me, I don't have 200 MB to give to each of them.
Quote Reply
Re: 150,000 links on SQL? In reply to
A small question here:

As I said, I'm hosting with 9Net Avenue, on an NT server, but my SQL database will be on a separate server, which I think is Unix (I will make sure).
Now, I will pay for 2 years in advance for a bunch of features, one of them being 20 MB of space specified for the SQL database.

I'm worried about the space; I need it to last at least 6 months.

My database will consist of large entries: I will be accepting medical postings and articles from doctors and anybody who wants to post their research, papers, or anything. So how much space do you think I will need, given that SQL takes up more space than flat text files?
Quote Reply
Re: 150,000 links on SQL? In reply to
Laith:

I don't know how much space it will take up, since that depends on the data, how it's indexed, and how many tables you use to manage it.

20 MB is a _very_ small amount of space by today's standards. I don't expect you to get very far with full-text articles and 20 MB.

Just my opinion, but I wouldn't start a project like that without at least 10 times that much space now, and 10x more in 6 months. Also, depending on your anticipated traffic, you need reasonable bandwidth. I've always run image sites, so I don't know what a pure-text site would consume, but most ISPs seem to target about 6 MB a month for an 'average' site, though any sort of 'portal' is going to require more.
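A rough way to sanity-check that 20 MB quota; the per-article size and index overhead here are assumptions for illustration, not measurements:

```python
# Back-of-the-envelope capacity estimate; the sizes are assumptions.
QUOTA_MB = 20          # the database quota mentioned above
AVG_ARTICLE_KB = 10    # assumed size of one full-text article
INDEX_OVERHEAD = 0.5   # assume indexes add ~50% on top of the raw data

usable_kb = QUOTA_MB * 1024 / (1 + INDEX_OVERHEAD)
max_articles = int(usable_kb / AVG_ARTICLE_KB)

print(max_articles)
```

On these assumptions the quota holds on the order of a thousand full-text articles before indexes and growth eat the rest, which is why 20 MB looks tight for a 6-month horizon.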

Jimz:

I don't know if you realize the performance difference between static pages and dynamic pages. Dynamic pages are great for small sites, low traffic sites, and sites with really high-end power.

You should run a test of what 1,000 hits a day to a dynamic page costs in CPU, vs. static pages (that's assuming 1,000 hits to the portal). If the portal gets 5,000 hits a day, that's 5,000 CGI calls before anyone does anything.

Any activity inside your site is then another CGI call.

This is important under Unix, but even more so under NT. In 5+ years of doing this, and running my own servers (so I am in control of my CPU usage), I have avoided SSI and as much CGI as possible. My current situation requires CGI to keep the site fresh, but I've minimized it as much as possible. The fewer processes running in the background, the faster the site goes. A simple 'fetch' requires much less CPU than starting a CGI process.
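A back-of-the-envelope version of that comparison, with illustrative (assumed, not measured) per-request costs:

```python
# Illustrative per-request CPU costs; assumed numbers, not benchmarks.
STATIC_FETCH_MS = 2   # read a file and write it to the socket
CGI_SPAWN_MS = 50     # fork/exec the interpreter, compile, run, tear down

hits_per_day = 5_000  # the portal figure used above

static_cpu_s = hits_per_day * STATIC_FETCH_MS / 1000
cgi_cpu_s = hits_per_day * CGI_SPAWN_MS / 1000

print(f"static: {static_cpu_s:.0f} s/day, CGI: {cgi_cpu_s:.0f} s/day")
```

With these assumed costs, CGI burns 25x the CPU of static serving for the same traffic; the exact ratio depends entirely on the hardware and the script, but the direction of the trade-off is the point.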

You can really see this when you run your own server. But you have dealt with the results on overloaded servers as you wait for your request to process - or die.

Dynamic sites are great, but they require far more horsepower per-connection than static sites. The more they do with each connection, the fewer connections they can handle per unit time.

Quote Reply
Re: 150,000 links on SQL? In reply to
pugdog,
I will work on your advice and increase the space as much as possible.

Thank you
Quote Reply
Re: 150,000 links on SQL? In reply to
Hello Mr. Pataki!

Quote:
A simple 'fetch' requires much less CPU than starting a CGI process.

Very nice feedback. Where does this apply? Would you be so kind as to give an example?

------------------
rajani
