Ordinarily, I don't give a flying fuck about network programming
but necessity does dictate.
I need to read through an HTML file, find all external HTTP(S) links,
and then determine if those external links are still viable, i.e.
if the pages to which they link still exist.
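For the link-finding step, a quick regex pass is one way to sketch it in Perl (a real parser such as HTML::LinkExtor from the HTML-Parser distribution is more robust against oddball markup; this is only the quick-and-dirty version, and the sample HTML below is made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Pull external http(s) hrefs out of an HTML string with a regex.
# Only grabs quoted href attributes; a proper parser handles the rest.
sub extract_links {
    my ($html) = @_;
    my @links = $html =~ /href\s*=\s*["'](https?:\/\/[^"']+)["']/gi;
    return @links;
}

my $html = q{<a href="https://example.com/a">a</a>
             <a href='http://example.org/b'>b</a>};
print "$_\n" for extract_links($html);   # prints both URLs, one per line
```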
Perl is the language of choice.
Finding the links is not a problem, but how do I determine viability?
Do I look for the "404" error or is there another way?
I don't want no fucking Python code.
Pseudocode or just the relevant commands would be preferable.
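On the viability question: don't grep the page for "404" text; ask the server and look at the HTTP status code itself. A sketch using LWP::UserAgent (assumed installed; it's the standard libwww-perl client), using a cheap HEAD request with a GET fallback for servers that refuse HEAD:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# A link counts as viable on a 2xx or 3xx status; 4xx/5xx means dead.
sub status_ok {
    my ($code) = @_;
    return $code >= 200 && $code < 400;
}

sub check_link {
    my ($url) = @_;
    my $ua = LWP::UserAgent->new(timeout => 10);
    # HEAD is cheaper than GET, but some servers reject it, so fall back.
    my $res = $ua->head($url);
    $res = $ua->get($url) unless $res->is_success;
    return status_ok($res->code);
}
```

LWP follows redirects by default, so a 301/302 to a live page still comes back as success; a genuinely dead link surfaces as 404, 410, or a connection error (which LWP reports as a 5xx-style status).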
The messages displayed come from Usenet.