Links SQL "worked" with the full Open Directory (600,000 links), but I wasn't happy with the performance. The build process took too long, and searching/indexing took too long to be usable.
I would say that at present the recommended limit for Links SQL is 100,000 links (although with more powerful hardware you could do better; the best machine I've run it on so far is a PII 350).
We are actively working on improving that. Things that need to change:
- Better indexing/searching: Links SQL currently indexes all the information, but it needs to do so differently in order to still return search results in a reasonable amount of time.
- Incremental builds: there's no reason to rebuild a couple hundred megabytes of pages just because one new link was added; only the affected pages should be updated. This feature will make Links usable on very large directories (> 200,000 links).
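To give a feel for the incremental-build idea, here's a minimal sketch in Python. The function name and the category/page layout are illustrative assumptions, not Links SQL internals: the point is that a new link only touches its own category page and the index pages above it, not the whole directory.

```python
# Sketch of an incremental build (illustrative names, not Links SQL
# internals). When a link is added, only the pages that display it --
# its category page and each ancestor index page -- need rebuilding.

def affected_pages(category):
    """Pages to regenerate when a link is added to `category`.

    A link in Computers/Programming/Perl affects that category page
    plus each ancestor index page up to the site's home page.
    """
    pages = []
    parts = category.split("/")
    while parts:
        pages.append("/".join(parts) + "/index.html")
        parts.pop()
    pages.append("index.html")  # the top-level home page
    return pages

print(affected_pages("Computers/Programming/Perl"))
# ['Computers/Programming/Perl/index.html',
#  'Computers/Programming/index.html',
#  'Computers/index.html',
#  'index.html']
```

Even on a directory with hundreds of thousands of links, adding one link then means regenerating a handful of pages instead of the entire tree.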
Quote:
The database system, NOT the interactive front end (i.e. Links SQL), is what will determine capacity, speed and performance.
The design of the program has a lot to do with it! If you try to search across a table with 1 million rows without some indexing scheme, it's going to take a long time. Links SQL maintains its own search index to help speed up searches.
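To illustrate why a search index matters (this is a toy sketch, not Links SQL's actual scheme), here's a minimal inverted index in Python. Instead of scanning every row for a substring match, each word maps to the set of link IDs that contain it, so a query only touches the matching entries:

```python
# Toy inverted index (illustrative only -- not Links SQL's actual
# schema). A LIKE '%word%' scan reads every row; the index below
# answers a query by intersecting small per-word ID sets instead.

links = {
    1: "Perl scripts and CGI resources",
    2: "Free Perl tutorials for beginners",
    3: "CGI programming examples",
}

# Build the index once, at add/modify time.
index = {}
for link_id, title in links.items():
    for word in title.lower().split():
        index.setdefault(word, set()).add(link_id)

def search(query):
    """Return the IDs of links matching every word in the query."""
    results = None
    for word in query.lower().split():
        ids = index.get(word, set())
        results = ids if results is None else results & ids
    return sorted(results or [])

print(search("perl"))      # [1, 2]
print(search("cgi perl"))  # [1]
```

The trade-off is the one described above: you pay at add/build time to keep the index current, so that searches stay fast even as the table grows.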
Quote:
Also, what's next from SQL? I mean, look at AltaVista. How on earth does that search engine search through millions of web sites in seconds? Can Links SQL do this?
No, and I don't think it ever will. AltaVista uses some pretty serious hardware (I couldn't find their about page that listed it anymore), but it's at least a couple million dollars' worth. They also use proprietary programs to do the indexing and searching; they don't use an SQL database.
Cheers,
Alex