Subject: Re: Need Assistance -- Network Programming
From: lt (at) *nospam* gnu.rocks (Lester Thorpe)
Newsgroups: comp.os.linux.advocacy
Date: 19 Jun 2024, 20:27:25
Organization: UsenetExpress - www.usenetexpress.com
Message-ID: <17da7b2e072f80b8$336$3510362$802601b3@news.usenetexpress.com>
References: 1
On Wed, 19 Jun 2024 13:47:44 +0000, Lester Thorpe wrote:
> I need to read through an HTML file, find all external HTTP(S) links,
> and then determine if those external links are still viable.
As expected, none of the duncified denizens on this NG are up
to the task.
Perl can do it:
https://www.perlmonks.org/?node_id=801302
However, I don't want to install dozens of Perl modules just to get
this simple job done (that's bullshit web programming for ya').
Therefore, I break it into pieces.
First, I extract the links using "hrefgrep -p -s > links.txt".
Hrefgrep is Perl code (http://home.linuxfocus.org/~guido/#webgrep).
Thank the gods for FOSS!
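If hrefgrep isn't handy, a rough dependency-free pass with grep and sed does the same extraction. This is a sketch, not hrefgrep's actual logic: "page.html" stands in for whatever HTML file you are scanning, and a regex only approximates real HTML parsing, but it covers the plain href="..." case.

```shell
#!/bin/sh
# Sketch: pull absolute http(s) links out of an HTML file without Perl modules.

# sample input -- stands in for your real HTML file
cat > page.html <<'EOF'
<p><a href="https://example.com/a">a</a></p>
<p><a href="http://example.org/b">b</a></p>
<p><a href="/local/path">internal link, skipped</a></p>
EOF

# keep only absolute http(s) targets, strip the href="..." wrapper, dedupe
grep -oE 'href="https?://[^"]+"' page.html \
    | sed -e 's/^href="//' -e 's/"$//' \
    | sort -u > links.txt
```

Relative links like /local/path are dropped by construction, which matches the "external links only" requirement.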
Then I process each link to see if it is viable.
(Unfinished)
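For the unfinished part, here's one minimal sketch using curl (assumes curl is installed; "links.txt" is the file from the extraction step, and check_link/classify are names I made up for this example, not anything standard):

```shell
#!/bin/sh
# Sketch: check each link in links.txt and report OK or DEAD.

# classify: turn an HTTP status code into an OK/DEAD report line
classify() {
    case "$1" in
        2*|3*) echo "OK $1 $2" ;;    # 2xx/3xx: link is viable
        *)     echo "DEAD $1 $2" ;;  # 4xx/5xx/000: dead or unreachable
    esac
}

# check_link: HEAD-request a URL, follow redirects, 10 s timeout;
# curl's %{http_code} is 000 when the connection itself fails
check_link() {
    code=$(curl -o /dev/null -s -L -I --max-time 10 \
        -w '%{http_code}' "$1" 2>/dev/null)
    classify "${code:-000}" "$1"
}

# run over links.txt if it exists
if [ -f links.txt ]; then
    while IFS= read -r url; do
        [ -n "$url" ] && check_link "$url"
    done < links.txt
fi
```

One caveat: some servers reject HEAD requests, so a stricter pass could retry DEAD entries with a plain GET before declaring them gone.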
Thanks for nothing, assholes.