
Bug#362421: marked as done (wiki.debian.org: Returns 200 OK for non-existing entries, should be 404)



Your message dated Sun, 1 Jun 2008 15:08:12 +0200
with message-id <20080601130812.GA23509@dedibox.ebzao.info>
and subject line Re: Bug#362421: wiki: Returns 200 OK for non-existing entries, should be 404
has caused the Debian Bug report #362421,
regarding wiki.debian.org: Returns 200 OK for non-existing entries, should be 404
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case, it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact owner@bugs.debian.org
immediately.)


-- 
362421: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=362421
Debian Bug Tracking System
Contact owner@bugs.debian.org with problems
--- Begin Message ---
Package: www.debian.org
Severity: minor

The wiki returns a "200 OK" response for every page, even ones that
do not exist yet. It would be better if it returned a 404 so these
pages do not get crawled or indexed.
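
This is easy to verify from the command line with any page that has not
been created yet (the page name below is just an example); at the moment
the server answers along these lines:

$ wget -S -O /dev/null http://wiki.debian.org/SomePageThatDoesNotExist
[..]
  HTTP/1.1 200 OK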

If that is hard to accomplish, then the page should at least include a
<meta name="robots" content="noindex,nofollow"> tag so that complying
robots will skip it.
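
For example, the template used for not-yet-existing pages could carry the
tag in its <head>; a hypothetical sketch, not the wiki's actual markup:

<head>
  <title>This page does not exist yet</title>
  <meta name="robots" content="noindex,nofollow">
</head>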

One specific URL affected by this is wiki.debian.org/robots.txt: it
currently returns a lot of content that is not at all in line with the
robots.txt specification. It should either return a 404 or serve valid
robots.txt content.
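
For reference, even a minimal file along these lines would conform to the
specification (a hypothetical policy; what to actually allow or block is
of course up to the site admins):

# Hypothetical robots.txt: keep all complying crawlers away entirely
User-agent: *
Disallow: /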

By the way, thanks for the very useful service!


Thijs


--- End Message ---
--- Begin Message ---
Hello,

The problem is now solved, closing the bug.

On Thu, Apr 13, 2006 at 01:38:26PM +0200, Thijs Kinkhorst wrote:
> The wiki returns a "200 OK" response for every page, even ones that
> do not exist yet. It would be better if it returned a 404 so these
> pages do not get crawled or indexed.

$ wget -S -O /dev/null http://wiki.debian.org/non-existing-page
[..]
  HTTP/1.1 404 Not Found

> One specific URL affected by this is wiki.debian.org/robots.txt: this
> now returns a lot of content not at all in line with the robots.txt
> specification. It should return 404 or robots.txt content.

$ wget -S -O /dev/null http://wiki.debian.org/robots.txt
[..]
  HTTP/1.1 404 Not Found

-- 
Simon Paillard


--- End Message ---
