I need to get text from a website, but have never really thought about it before...
Basically I need to compare data from a government website in an automated way. I need a way to "get" a webpage and then "save" its contents. (From there I can use pattern matching to compare the numbers and find what I need.)
Someone suggested a "Screen Scraper"... I've seen a couple of references in the forums here - does anybody have any comments or suggestions?
I think for my purposes, saving the data as a text-based file is the best solution. I've done similar things with Net::FTP, but never thought about trying it via HTTP.
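For what it's worth, here's a rough sketch of what I'm imagining, assuming LWP::Simple is the right module for the HTTP part (untested; the URL and filename are just placeholders):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# LWP::Simple gives a one-call interface for fetching a page over HTTP.
use LWP::Simple;

# Placeholder URL -- the real government page would go here.
my $url  = 'http://www.example.gov/data/report.html';
my $file = 'report.html';

# get() returns the page body as a string, or undef on failure.
my $content = get($url)
    or die "Couldn't fetch $url\n";

# Save the raw HTML to a local text file for later pattern matching.
open my $fh, '>', $file or die "Can't write $file: $!\n";
print {$fh} $content;
close $fh;

# From here, regexes against $content (or the saved file)
# could pull out the numbers to compare.
```

Does that look like a sane approach, or is there a better tool for this?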
Any comments are welcome. Is this legal, ethical, etc.?