What's happening is that the script is trying to connect to and verify every link in your database. Each link can take as little as 0.5 seconds or as long as 60 to check, depending on the response from the (possibly nonexistent) server. There really is no way to make it faster, since the script is at the mercy of internet latency and other things it has no control over.
Batch processing of the link items isn't possible unless you modify the script. Links SQL has a verification system with crash recovery, but Links 2.0 doesn't have that luxury.
What you may be able to do, if you have telnet access, is log in and run the nph-verify.cgi script from the command line. You'll have to cd into your admin directory and run it with something like:
Code:
perl nph-verify.cgi > output.txt

Make sure you include the "> output.txt" redirection; if you don't, all the information will just scroll on by. Then, when the script is done, download the output.txt file using something like FTP and you can see what the status of each of your links is.
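Once you have output.txt, you don't have to read it line by line. Here's a rough sketch of filtering it with grep; note the sample lines below are a made-up format for illustration, so the real output from nph-verify.cgi may look different:

```shell
# Hypothetical sample of what the captured verification output might
# look like -- the actual format from nph-verify.cgi may differ.
cat > output.txt <<'EOF'
http://example.com/ ... OK (200)
http://old-site.example/ ... Failed (404 Not Found)
http://slow.example/ ... Failed (timed out)
EOF

# Pull out only the failed links so the good ones don't clutter the view.
grep -i 'failed' output.txt
```

The same idea works for whatever wording the script actually uses; just grep for the text it prints next to a bad link.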