I need some help with this. I'm writing an FTP crawler script backed by MySQL. The crawler currently works well, but each time it backs out of a directory, it re-indexes directories it has already recorded, though it doesn't descend into them again. For example: the script starts at a root directory containing 3 folders. It crawls the 1st folder, then its subfolders. When it backs out, it indexes the 1st folder again, then goes on to crawl folders 2 and 3. Is there some way I can query the database for previously recorded directories, or will that make the crawler slower? I'd be very grateful for any ideas on how to get around this.
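One way to avoid both the duplicate indexing and the per-directory database round-trip is to keep the set of already-indexed paths in memory for the duration of the crawl, and only write to MySQL. Here is a minimal sketch of that idea in Python (the post's script is Perl; `crawl`, `tree`, and the dict-based directory layout are all hypothetical stand-ins, not the poster's actual code):

```python
# Sketch: duplicate-free depth-first crawl using an in-memory "visited" set.
# A real crawler would list directories over FTP and INSERT rows into MySQL;
# here `tree` is a dict mapping a directory path to its subdirectory paths.

def crawl(tree, path="/", visited=None, indexed=None):
    """Walk `tree` depth-first, indexing each directory exactly once.

    `visited` stops a directory from being re-indexed when the walk backs
    out of a sibling; `indexed` records the order directories were seen.
    """
    if visited is None:
        visited = set()
    if indexed is None:
        indexed = []
    if path in visited:       # already indexed: skip entirely
        return indexed
    visited.add(path)
    indexed.append(path)      # the real script would INSERT into MySQL here
    for sub in tree.get(path, []):
        crawl(tree, sub, visited, indexed)
    return indexed

# Hypothetical layout mirroring the post: a root with three folders,
# the first of which contains a subfolder.
tree = {
    "/": ["/a", "/b", "/c"],
    "/a": ["/a/sub"],
}
print(crawl(tree))  # each directory appears exactly once
```

Checking an in-memory set is far cheaper than a `SELECT` against MySQL on every step, so this should not slow the crawler down; querying the database for previously recorded directories would also work, but would cost one round-trip per directory.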
| Subject | Author | Views | Date |
|---|---|---|---|
| MySQL and PERL Help! | XanthisHP | 2773 | May 26, 2000, 9:21 PM |
| Re: MySQL and PERL Help! | XanthisHP | 2659 | May 26, 2000, 9:50 PM |