I just wrote a bit of code to grab all the links off of web sites. When I finished writing it I found that one of the modules I used (HTML::Parse) is now deprecated!
So I went to have a look at HTML::Parser, which seems to be the replacement for it, but it looks sooooo different (to an amateur like me at least).
Has anyone used HTML::Parser and could show me what the replacement for this would be? (I have tried a few things, and none worked.)
#----------------------------------
use LWP::Simple;    # provides get()
use HTML::Parse;    # deprecated, but this is the working version
use HTML::Element;
use URI::URL;

my $html        = get($url);
my $parsed_html = HTML::Parse::parse_html($html);

for (@{ $parsed_html->extract_links() }) {
    my $link     = $_->[0];
    my $urlb     = URI::URL->new($link);
    my $full_url = $urlb->abs($url);    # resolve relative links against the page URL
    print qq~$full_url<br>~;
}
#----------------------------------
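For what it's worth, here is one way this is usually done on top of HTML::Parser: the HTML::LinkExtor module (it ships in the same HTML-Parser distribution) is an HTML::Parser subclass that collects link attributes for you, and if you hand it a base URL it returns absolute links directly. This is only a sketch, not a drop-in for your exact script; the `$url` and `$html` values below are made up for illustration.

```perl
use strict;
use warnings;
use HTML::LinkExtor;    # part of the HTML-Parser distribution

# Hypothetical base URL and page content, just for the example.
my $url  = 'http://www.example.com/';
my $html = '<a href="/page.html">a page</a> <img src="pic.gif">';

# Passing a base URL as the second argument to new() makes links()
# return absolute URLs, so no separate URI::URL->abs() step is needed.
my $extor = HTML::LinkExtor->new( undef, $url );
$extor->parse($html);
$extor->eof;

# links() returns one array ref per link: [ $tag, $attr => $url, ... ]
for my $link ( $extor->links ) {
    my ( $tag, %attr ) = @$link;
    print "$_<br>\n" for values %attr;
}
```

In your real script you would replace the literal `$html` with the page fetched via `get($url)` from LWP::Simple, same as before.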
http://www.iuni.com/...tware/web/index.html
Links Plugins