
Using urlcheck - how many broken links should we accept?



Folks,

Looking at the output of urlcheck:

There are lots of websites that return error codes. There are still many
where substituting https for http would fix an apparently missing website.
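For checking those candidates semi-automatically, something along these lines might help. This is only a sketch, not part of urlcheck itself, and the function names are mine:

```python
import urllib.request

def https_variant(url):
    """Return the https:// form of an http:// URL, or the URL unchanged."""
    if url.startswith("http://"):
        return "https://" + url[len("http://"):]
    return url

def is_reachable(url, timeout=10):
    """HEAD-request the URL; True if the server answers with a 2xx/3xx status."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

# Usage idea: for each broken http:// link reported by urlcheck,
# test is_reachable(https_variant(url)) before touching the wml source.
```

That would at least separate the "just flip the scheme" cases from links that are genuinely dead.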

There are lots of apparent errors where, in fact, it's just a directory-level
move - https://www-master.debian.org/build-logs/urlcheck/MailingLists

How much breakage should we accept for old sites / sites that have been
renamed or simply no longer exist?

For example, for DPL nominations on debian-vote that list resume information
/ university "stuff" - is it worth going back to fix old links?

I'm very wary of a global search and replace through all of webwml. How
much effort should we devote to checking whether websites still exist or
still resolve?

All opinions gratefully received.

All the very best, as ever,

Andy C.

