I want to use Links SQL as an engine to store large numbers of links (500 000+) so I can a) build solid html pages from the database and b) serve up XML to other sites.
Pre-processing involves consolidating files from various sources and formatting the master file as per my LSQL database. I output a text file to paste into the Multi-Categories plugin to create/update the categories, and a text file on the server formatted for the LinksUpload plugin.
The process works well for tests of 2 000-3 000 links, but doesn't look practical via the web interface for files containing 300 000 links. It's difficult to tell whether the web interface times out on the links upload or just lags drastically behind what's really happening, but the mysql daemon appears to keep running and the links do get added. So far so good - but the subsequent rebuild tables/search step seems to hang (incomplete) at the "updating new flags" stage.
OK, that's the background ...
I know I could insert links directly into the relevant table - and indeed delete selected records from there too. I think that would be acceptably fast and could be scripted/cron'd in due course. What I can't see is how to update the categories table at the same time. I have the category names in a field in the links table, but the entries in the categories table are keyed numerically, and it's not clear to me how to access/use those categories while adding links.
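To make the category question concrete, here is a minimal sketch of the name-to-ID lookup I have in mind: load the categories table into a map once, then insert each link and its category mapping row in the same pass. This uses an in-memory SQLite database as a stand-in for MySQL, and the table/column names (Category, Links, CatLinks, Full_Name) are my assumptions, not necessarily the real Links SQL schema - they'd need checking against the actual tables before adapting anything.

```python
# Sketch: bulk-insert links while resolving category names to numeric IDs.
# In-memory SQLite stands in for MySQL; table and column names are assumed.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE Category (ID INTEGER PRIMARY KEY, Full_Name TEXT UNIQUE);
    CREATE TABLE Links    (ID INTEGER PRIMARY KEY, Title TEXT, URL TEXT);
    CREATE TABLE CatLinks (LinkID INTEGER, CategoryID INTEGER);
""")

# Pretend these categories were created earlier by the Multi-Categories step.
cur.executemany("INSERT INTO Category (Full_Name) VALUES (?)",
                [("Business",), ("Science",)])

# Load the name -> numeric-ID map once, then insert links plus mapping rows.
cat_id = {name: cid for cid, name in
          cur.execute("SELECT ID, Full_Name FROM Category")}

incoming = [("Acme Corp", "http://example.com/acme", "Business"),
            ("Physics FAQ", "http://example.com/faq", "Science")]

for title, url, category in incoming:
    cur.execute("INSERT INTO Links (Title, URL) VALUES (?, ?)", (title, url))
    cur.execute("INSERT INTO CatLinks (LinkID, CategoryID) VALUES (?, ?)",
                (cur.lastrowid, cat_id[category]))
con.commit()

print(cur.execute("SELECT COUNT(*) FROM CatLinks").fetchone()[0])  # 2
```

The point is that the lookup table only has to be read once per batch, so resolving 300 000 category references costs one SELECT plus a hash lookup per link.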
... and then, if I succeed in bulk adding links and updating categories, what's the best/fastest way to rebuild what needs rebuilding?
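For the per-category link counts at least, I assume the fastest rebuild would be a single aggregate UPDATE over the mapping table rather than per-link increments. A sketch of that idea, again with an in-memory SQLite stand-in and an assumed Number_of_Links column - and with the caveat that the real rebuild scripts presumably maintain other derived data (search index, "new" flags) that this doesn't touch:

```python
# Sketch: after a bulk insert, refresh every category's link count with one
# aggregate UPDATE. SQLite stand-in; column names are assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE Category (ID INTEGER PRIMARY KEY, Full_Name TEXT,
                           Number_of_Links INTEGER DEFAULT 0);
    CREATE TABLE CatLinks (LinkID INTEGER, CategoryID INTEGER);
    INSERT INTO Category (ID, Full_Name) VALUES (1, 'Business'), (2, 'Science');
    INSERT INTO CatLinks VALUES (10, 1), (11, 1), (12, 2);
""")

# One statement recomputes all counters from the mapping table.
cur.execute("""
    UPDATE Category
    SET Number_of_Links = (SELECT COUNT(*) FROM CatLinks
                           WHERE CatLinks.CategoryID = Category.ID)
""")
con.commit()

print(cur.execute(
    "SELECT Full_Name, Number_of_Links FROM Category ORDER BY ID").fetchall())
# [('Business', 2), ('Science', 1)]
```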
I have shell access, and for these operations would prefer to work that way rather than through the web interface if possible, so I can script them later. Additionally, with these large files there doesn't seem to be a very firm connection between what the web interface is reporting and what the daemon is actually up to in the background.
Thanks in advance for any suggestions, experiences, solutions