
Re: ikiwiki / CVE-2019-9187



Chris Lamb <lamby@debian.org> writes:

> Sorry to be a pain but can you remake this with --exclude="*/.pc/*"
> or similar...? :)

Hmmm. I would have hoped that directory would be excluded automatically
from the source package.

Oh wait, this is a Debian native package. That means I will probably
have to patch the files directly rather than rely on debian/patches. It
was only working before because I was testing with the patches applied.
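(For anyone following along: the `--exclude` request above is because quilt
records its applied-patch state under a `.pc/` directory inside the unpacked
tree, and a plain recursive diff picks that up. A minimal sketch, with
made-up file names, of why the exclude matters:)

```shell
# Build two tiny trees that mimic an unpacked package before and after
# patching; the patched tree has quilt's .pc/ metadata alongside the change.
mkdir -p old/.pc new/.pc
echo upstream > old/file
echo patched  > new/file
echo quilt-state > new/.pc/applied-patches

# Without --exclude, the .pc/ noise would appear in the debdiff; with it,
# only the real change shows. (diff exits 1 when files differ.)
diff -Nru --exclude='.pc' old new || true
```

The same idea applied to the real trees would be something like
`diff -Nru --exclude='.pc' ikiwiki-3.20141016.4 ikiwiki-3.20141016.4+deb8u1`.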

Curiously, I am getting a test failure when testing without my patches.
-- 
Brian May <bam@debian.org>
diff -Nru ikiwiki-3.20141016.4/CHANGELOG ikiwiki-3.20141016.4+deb8u1/CHANGELOG
--- ikiwiki-3.20141016.4/CHANGELOG	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/CHANGELOG	2019-03-07 17:35:55.000000000 +1100
@@ -1,3 +1,10 @@
+ikiwiki (3.20141016.4+deb8u1) jessie-security; urgency=high
+
+  * Non-maintainer upload by the LTS Team.
+  * CVE-2019-9187: Fix server-side request forgery via aggregate plugin.
+
+ -- Brian May <bam@debian.org>  Thu, 07 Mar 2019 17:35:55 +1100
+
 ikiwiki (3.20141016.4) jessie-security; urgency=high
 
   * Reference CVE-2016-4561 in 3.20141016.3 changelog
diff -Nru ikiwiki-3.20141016.4/CVE-2019-9187-1.patch ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-1.patch
--- ikiwiki-3.20141016.4/CVE-2019-9187-1.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-1.patch	2019-03-07 17:25:37.000000000 +1100
@@ -0,0 +1,28 @@
+From e7b0d4a0fff8ed45a90c2efe8ef294bdf7c9bdac Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 16:29:19 +0000
+Subject: [PATCH] useragent: Raise an exception if the LWP module can't be
+ loaded
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ IkiWiki.pm | 3 +++
+ 1 file changed, 3 insertions(+)
+
+diff --git a/IkiWiki.pm b/IkiWiki.pm
+index 90cb96e58..dc047b08a 100644
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2470,6 +2470,9 @@ sub add_autofile ($$$) {
+ }
+ 
+ sub useragent () {
++	eval q{use LWP};
++	error($@) if $@;
++
+ 	return LWP::UserAgent->new(
+ 		cookie_jar => $config{cookiejar},
+ 		env_proxy => 1,		# respect proxy env vars
+-- 
+2.11.0
+
diff -Nru ikiwiki-3.20141016.4/CVE-2019-9187-2.patch ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-2.patch
--- ikiwiki-3.20141016.4/CVE-2019-9187-2.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-2.patch	2019-03-07 17:26:25.000000000 +1100
@@ -0,0 +1,238 @@
+From 67543ce1d62161fdef9dca198289d7dd7dceacc0 Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 16:30:07 +0000
+Subject: [PATCH] useragent: Don't allow non-HTTP protocols to be used
+
+This prevents the aggregate plugin from being used to read the contents
+of local files via file:/// URLs.
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ IkiWiki.pm                         |   1 +
+ t/aggregate-file.t                 | 173 +++++++++++++++++++++++++++++++++++++
+ t/noparanoia/LWPx/ParanoidAgent.pm |   2 +
+ t/secret.rss                       |  11 +++
+ 4 files changed, 187 insertions(+)
+ create mode 100755 t/aggregate-file.t
+ create mode 100644 t/noparanoia/LWPx/ParanoidAgent.pm
+ create mode 100644 t/secret.rss
+
+diff --git a/IkiWiki.pm b/IkiWiki.pm
+index dc047b08a..d5d1af56c 100644
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2477,6 +2477,7 @@ sub useragent () {
+ 		cookie_jar => $config{cookiejar},
+ 		env_proxy => 1,		# respect proxy env vars
+ 		agent => $config{useragent},
++		protocols_allowed => [qw(http https)],
+ 	);
+ }
+ 
+diff --git a/t/aggregate-file.t b/t/aggregate-file.t
+new file mode 100755
+index 000000000..f00743dac
+--- /dev/null
++++ b/t/aggregate-file.t
+@@ -0,0 +1,173 @@
++#!/usr/bin/perl
++use utf8;
++use warnings;
++use strict;
++
++use Encode;
++use Test::More;
++
++BEGIN {
++	plan(skip_all => "CGI not available")
++		unless eval q{
++			use CGI qw();
++			1;
++		};
++
++	plan(skip_all => "IPC::Run not available")
++		unless eval q{
++			use IPC::Run qw(run);
++			1;
++		};
++
++	use_ok('IkiWiki');
++	use_ok('YAML::XS');
++}
++
++# We check for English error messages
++$ENV{LC_ALL} = 'C';
++
++use Cwd qw(getcwd);
++use Errno qw(ENOENT);
++
++my $installed = $ENV{INSTALLED_TESTS};
++
++my @command;
++if ($installed) {
++	@command = qw(ikiwiki --plugin inline);
++}
++else {
++	ok(! system("make -s ikiwiki.out"));
++	@command = ("perl", "-I".getcwd."/blib/lib", './ikiwiki.out',
++		'--underlaydir='.getcwd.'/underlays/basewiki',
++		'--set', 'underlaydirbase='.getcwd.'/underlays',
++		'--templatedir='.getcwd.'/templates');
++}
++
++sub write_old_file {
++	my $name = shift;
++	my $dir = shift;
++	my $content = shift;
++	writefile($name, $dir, $content);
++	ok(utime(333333333, 333333333, "$dir/$name"));
++}
++
++sub write_setup_file {
++	my %params = @_;
++	my %setup = (
++		wikiname => 'this is the name of my wiki',
++		srcdir => getcwd.'/t/tmp/in',
++		destdir => getcwd.'/t/tmp/out',
++		url => 'http://example.com',
++		cgiurl => 'http://example.com/cgi-bin/ikiwiki.cgi',
++		cgi_wrapper => getcwd.'/t/tmp/ikiwiki.cgi',
++		cgi_wrappermode => '0751',
++		add_plugins => [qw(aggregate)],
++		disable_plugins => [qw(emailauth openid passwordauth)],
++		aggregate_webtrigger => 1,
++	);
++	if ($params{without_paranoia}) {
++		$setup{libdirs} = [getcwd.'/t/noparanoia'];
++	}
++	unless ($installed) {
++		$setup{ENV} = { 'PERL5LIB' => getcwd.'/blib/lib' };
++	}
++	writefile("test.setup", "t/tmp",
++		"# IkiWiki::Setup::Yaml - YAML formatted setup file\n" .
++		Dump(\%setup));
++}
++
++sub thoroughly_rebuild {
++	ok(unlink("t/tmp/ikiwiki.cgi") || $!{ENOENT});
++	ok(! system(@command, qw(--setup t/tmp/test.setup --rebuild --wrappers)));
++}
++
++sub run_cgi {
++	my (%args) = @_;
++	my ($in, $out);
++	my $method = $args{method} || 'GET';
++	my $environ = $args{environ} || {};
++	my $params = $args{params} || { do => 'prefs' };
++
++	my %defaults = (
++		SCRIPT_NAME	=> '/cgi-bin/ikiwiki.cgi',
++		HTTP_HOST	=> 'example.com',
++	);
++
++	my $cgi = CGI->new($args{params});
++	my $query_string = $cgi->query_string();
++	diag $query_string;
++
++	if ($method eq 'POST') {
++		$defaults{REQUEST_METHOD} = 'POST';
++		$in = $query_string;
++		$defaults{CONTENT_LENGTH} = length $in;
++	} else {
++		$defaults{REQUEST_METHOD} = 'GET';
++		$defaults{QUERY_STRING} = $query_string;
++	}
++
++	my %envvars = (
++		%defaults,
++		%$environ,
++	);
++	run(["./t/tmp/ikiwiki.cgi"], \$in, \$out, init => sub {
++		map {
++			$ENV{$_} = $envvars{$_}
++		} keys(%envvars);
++	});
++
++	return decode_utf8($out);
++}
++
++sub test {
++	my $content;
++
++	ok(! system(qw(rm -rf t/tmp)));
++	ok(! system(qw(mkdir t/tmp)));
++
++	write_old_file('aggregator.mdwn', 't/tmp/in',
++		'[[!aggregate name="ssrf" url="file://'.getcwd.'/t/secret.rss"]]'
++		.'[[!inline pages="internal(aggregator/*)"]]');
++
++	write_setup_file();
++	thoroughly_rebuild();
++
++	$content = run_cgi(
++		method => 'GET',
++		params => {
++			do => 'aggregate_webtrigger',
++		},
++	);
++	unlike($content, qr{creating new page});
++	unlike($content, qr{Secrets});
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
++
++	thoroughly_rebuild();
++	$content = readfile('t/tmp/out/aggregator/index.html');
++	unlike($content, qr{Secrets});
++
++	diag('Trying test again with LWPx::ParanoidAgent disabled');
++
++	write_setup_file(without_paranoia => 1);
++	thoroughly_rebuild();
++
++	$content = run_cgi(
++		method => 'GET',
++		params => {
++			do => 'aggregate_webtrigger',
++		},
++	);
++	unlike($content, qr{creating new page});
++	unlike($content, qr{Secrets});
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
++
++	thoroughly_rebuild();
++	$content = readfile('t/tmp/out/aggregator/index.html');
++	unlike($content, qr{Secrets});
++}
++
++test();
++
++done_testing();
+diff --git a/t/noparanoia/LWPx/ParanoidAgent.pm b/t/noparanoia/LWPx/ParanoidAgent.pm
+new file mode 100644
+index 000000000..751e80ce6
+--- /dev/null
++++ b/t/noparanoia/LWPx/ParanoidAgent.pm
+@@ -0,0 +1,2 @@
++# make import fail
++0;
+diff --git a/t/secret.rss b/t/secret.rss
+new file mode 100644
+index 000000000..11202e9ed
+--- /dev/null
++++ b/t/secret.rss
+@@ -0,0 +1,11 @@
++<?xml version="1.0"?>
++<rss version="2.0">
++<channel>
++<title>Secrets go here</title>
++<description>Secrets go here</description>
++<item>
++  <title>Secrets go here</title>
++  <description>Secrets go here</description>
++</item>
++</channel>
++</rss>
+-- 
+2.11.0
+
diff -Nru ikiwiki-3.20141016.4/CVE-2019-9187-3.patch ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-3.patch
--- ikiwiki-3.20141016.4/CVE-2019-9187-3.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-3.patch	2019-03-07 17:26:41.000000000 +1100
@@ -0,0 +1,590 @@
+From d283e4ca1aeb6ca8cc0951c8495f778071076013 Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 17:22:06 +0000
+Subject: [PATCH] useragent: Automatically choose whether to use
+ LWPx::ParanoidAgent
+
+The simple implementation of this, which I'd prefer to use, would be:
+if we can import LWPx::ParanoidAgent, use it; otherwise, use
+LWP::UserAgent.
+
+However, aggregate has historically worked with proxies, and
+LWPx::ParanoidAgent quite reasonably refuses to work with proxies
+(because it can't know whether those proxies are going to do the same
+filtering that LWPx::ParanoidAgent would).
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ IkiWiki.pm                  | 123 ++++++++++++++++-
+ IkiWiki/Plugin/aggregate.pm |   5 +-
+ IkiWiki/Plugin/blogspam.pm  |  16 +--
+ IkiWiki/Plugin/openid.pm    |  12 +-
+ IkiWiki/Plugin/pinger.pm    |  21 ++-
+ t/useragent.t               | 317 ++++++++++++++++++++++++++++++++++++++++++++
+ 6 files changed, 458 insertions(+), 36 deletions(-)
+ create mode 100755 t/useragent.t
+
+diff --git a/IkiWiki.pm b/IkiWiki.pm
+index d5d1af56c..efb48293a 100644
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2469,16 +2469,131 @@ sub add_autofile ($$$) {
+ 	$autofiles{$file}{generator}=$generator;
+ }
+ 
+-sub useragent () {
++sub useragent (@) {
++	my %params = @_;
++	my $for_url = delete $params{for_url};
++	# Fail safe, in case a plugin calling this function is relying on
++	# a future parameter to make the UA more strict
++	foreach my $key (keys %params) {
++		error "Internal error: useragent(\"$key\" => ...) not understood";
++	}
++
+ 	eval q{use LWP};
+ 	error($@) if $@;
+ 
+-	return LWP::UserAgent->new(
+-		cookie_jar => $config{cookiejar},
+-		env_proxy => 1,		# respect proxy env vars
++	my %args = (
+ 		agent => $config{useragent},
++		cookie_jar => $config{cookiejar},
++		env_proxy => 0,
+ 		protocols_allowed => [qw(http https)],
+ 	);
++	my %proxies;
++
++	if (defined $for_url) {
++		# We know which URL we're going to fetch, so we can choose
++		# whether it's going to go through a proxy or not.
++		#
++		# We reimplement http_proxy, https_proxy and no_proxy here, so
++		# that we are not relying on LWP implementing them exactly the
++		# same way we do.
++
++		eval q{use URI};
++		error($@) if $@;
++
++		my $proxy;
++		my $uri = URI->new($for_url);
++
++		if ($uri->scheme eq 'http') {
++			$proxy = $ENV{http_proxy};
++			# HTTP_PROXY is deliberately not implemented
++			# because the HTTP_* namespace is also used by CGI
++		}
++		elsif ($uri->scheme eq 'https') {
++			$proxy = $ENV{https_proxy};
++			$proxy = $ENV{HTTPS_PROXY} unless defined $proxy;
++		}
++		else {
++			$proxy = undef;
++		}
++
++		foreach my $var (qw(no_proxy NO_PROXY)) {
++			my $no_proxy = $ENV{$var};
++			if (defined $no_proxy) {
++				foreach my $domain (split /\s*,\s*/, $no_proxy) {
++					if ($domain =~ s/^\*?\.//) {
++						# no_proxy="*.example.com" or
++						# ".example.com": match suffix
++						# against .example.com
++						if ($uri->host =~ m/(^|\.)\Q$domain\E$/i) {
++							$proxy = undef;
++						}
++					}
++					else {
++						# no_proxy="example.com":
++						# match exactly example.com
++						if (lc $uri->host eq lc $domain) {
++							$proxy = undef;
++						}
++					}
++				}
++			}
++		}
++
++		if (defined $proxy) {
++			$proxies{$uri->scheme} = $proxy;
++			# Paranoia: make sure we can't bypass the proxy
++			$args{protocols_allowed} = [$uri->scheme];
++		}
++	}
++	else {
++		# The plugin doesn't know yet which URL(s) it's going to
++		# fetch, so we have to make some conservative assumptions.
++		my $http_proxy = $ENV{http_proxy};
++		my $https_proxy = $ENV{https_proxy};
++		$https_proxy = $ENV{HTTPS_PROXY} unless defined $https_proxy;
++
++		# We don't respect no_proxy here: if we are not using the
++		# paranoid user-agent, then we need to give the proxy the
++		# opportunity to reject undesirable requests.
++
++		# If we have one, we need the other: otherwise, neither
++		# LWPx::ParanoidAgent nor the proxy would have the
++		# opportunity to filter requests for the other protocol.
++		if (defined $https_proxy && defined $http_proxy) {
++			%proxies = (http => $http_proxy, https => $https_proxy);
++		}
++		elsif (defined $https_proxy) {
++			%proxies = (http => $https_proxy, https => $https_proxy);
++		}
++		elsif (defined $http_proxy) {
++			%proxies = (http => $http_proxy, https => $http_proxy);
++		}
++
++	}
++
++	if (scalar keys %proxies) {
++		# The configured proxy is responsible for deciding which
++		# URLs are acceptable to fetch and which URLs are not.
++		my $ua = LWP::UserAgent->new(%args);
++		foreach my $scheme (@{$ua->protocols_allowed}) {
++			unless ($proxies{$scheme}) {
++				error "internal error: $scheme is allowed but has no proxy";
++			}
++		}
++		# We can't pass the proxies in %args because that only
++		# works since LWP 6.24.
++		foreach my $scheme (keys %proxies) {
++			$ua->proxy($scheme, $proxies{$scheme});
++		}
++		return $ua;
++	}
++
++	eval q{use LWPx::ParanoidAgent};
++	if ($@) {
++		print STDERR "warning: installing LWPx::ParanoidAgent is recommended\n";
++		return LWP::UserAgent->new(%args);
++	}
++	return LWPx::ParanoidAgent->new(%args);
+ }
+ 
+ sub sortspec_translate ($$) {
+diff --git a/IkiWiki/Plugin/aggregate.pm b/IkiWiki/Plugin/aggregate.pm
+index 05e22a290..8f0870e2e 100644
+--- a/IkiWiki/Plugin/aggregate.pm
++++ b/IkiWiki/Plugin/aggregate.pm
+@@ -513,7 +513,10 @@ sub aggregate (@) {
+ 			}
+ 			$feed->{feedurl}=pop @urls;
+ 		}
+-		my $ua=useragent();
++		# Using the for_url parameter makes sure we crash if used
++		# with an older IkiWiki.pm that didn't automatically try
++		# to use LWPx::ParanoidAgent.
++		my $ua=useragent(for_url => $feed->{feedurl});
+ 		my $res=URI::Fetch->fetch($feed->{feedurl}, UserAgent=>$ua);
+ 		if (! $res) {
+ 			$feed->{message}=URI::Fetch->errstr;
+diff --git a/IkiWiki/Plugin/blogspam.pm b/IkiWiki/Plugin/blogspam.pm
+index 3eb4cf8b3..3835f52ca 100644
+--- a/IkiWiki/Plugin/blogspam.pm
++++ b/IkiWiki/Plugin/blogspam.pm
+@@ -57,18 +57,10 @@ sub checkconfig () {
+ 	};
+ 	error $@ if $@;
+ 
+-	eval q{use LWPx::ParanoidAgent};
+-	if (!$@) {
+-		$client=LWPx::ParanoidAgent->new(agent => $config{useragent});
+-	}
+-	else {
+-		eval q{use LWP};
+-		if ($@) {
+-			error $@;
+-			return;
+-		}
+-		$client=useragent();
+-	}
++	# Using the for_url parameter makes sure we crash if used
++	# with an older IkiWiki.pm that didn't automatically try
++	# to use LWPx::ParanoidAgent.
++	$client=useragent(for_url => $config{blogspam_server});
+ }
+ 
+ sub checkcontent (@) {
+diff --git a/IkiWiki/Plugin/openid.pm b/IkiWiki/Plugin/openid.pm
+index 35ef52a58..eb21955e9 100644
+--- a/IkiWiki/Plugin/openid.pm
++++ b/IkiWiki/Plugin/openid.pm
+@@ -219,14 +219,10 @@ sub getobj ($$) {
+ 	eval q{use Net::OpenID::Consumer};
+ 	error($@) if $@;
+ 
+-	my $ua;
+-	eval q{use LWPx::ParanoidAgent};
+-	if (! $@) {
+-		$ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
+-	}
+-	else {
+-		$ua=useragent();
+-	}
++	# We pass the for_url parameter, even though it's undef, because
++	# that will make sure we crash if used with an older IkiWiki.pm
++	# that didn't automatically try to use LWPx::ParanoidAgent.
++	my $ua=useragent(for_url => undef);
+ 
+ 	# Store the secret in the session.
+ 	my $secret=$session->param("openid_secret");
+diff --git a/IkiWiki/Plugin/pinger.pm b/IkiWiki/Plugin/pinger.pm
+index b2d54af8a..ec764caee 100644
+--- a/IkiWiki/Plugin/pinger.pm
++++ b/IkiWiki/Plugin/pinger.pm
+@@ -70,17 +70,16 @@ sub ping {
+ 		eval q{use Net::INET6Glue::INET_is_INET6}; # may not be available
+ 		
+ 		my $ua;
+-		eval q{use LWPx::ParanoidAgent};
+-		if (!$@) {
+-			$ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
+-		}
+-		else {
+-			eval q{use LWP};
+-			if ($@) {
+-				debug(gettext("LWP not found, not pinging"));
+-				return;
+-			}
+-			$ua=useragent();
++		eval {
++			# We pass the for_url parameter, even though it's
++			# undef, because that will make sure we crash if used
++			# with an older IkiWiki.pm that didn't automatically
++			# try to use LWPx::ParanoidAgent.
++			$ua=useragent(for_url => undef);
++		};
++		if ($@) {
++			debug(gettext("LWP not found, not pinging").": $@");
++			return;
+ 		}
+ 		$ua->timeout($config{pinger_timeout} || 15);
+ 		
+diff --git a/t/useragent.t b/t/useragent.t
+new file mode 100755
+index 000000000..195a86521
+--- /dev/null
++++ b/t/useragent.t
+@@ -0,0 +1,317 @@
++#!/usr/bin/perl
++use warnings;
++use strict;
++use Test::More;
++
++my $have_paranoid_agent;
++BEGIN {
++	plan(skip_all => 'LWP not available')
++		unless eval q{
++			use LWP qw(); 1;
++		};
++	use_ok("IkiWiki");
++	$have_paranoid_agent = eval q{
++		use LWPx::ParanoidAgent qw(); 1;
++	};
++}
++
++eval { useragent(future_feature => 1); };
++ok($@, 'future features should cause useragent to fail');
++
++diag "==== No proxy ====";
++delete $ENV{http_proxy};
++delete $ENV{https_proxy};
++delete $ENV{no_proxy};
++delete $ENV{HTTPS_PROXY};
++delete $ENV{NO_PROXY};
++
++diag "---- Unspecified URL ----";
++my $ua = useragent(for_url => undef);
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef, 'No http proxy');
++is($ua->proxy('https'), undef, 'No https proxy');
++
++diag "---- Specified URL ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef, 'No http proxy');
++is($ua->proxy('https'), undef, 'No https proxy');
++
++diag "==== Proxy for everything ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++$ENV{https_proxy} = 'http://sproxy:8080';
++delete $ENV{no_proxy};
++delete $ENV{HTTPS_PROXY};
++delete $ENV{NO_PROXY};
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++$ua = useragent(for_url => 'http://example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++# We don't care what $ua->proxy('https') is, because it won't be used
++$ua = useragent(for_url => 'https://example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++# We don't care what $ua->proxy('http') is, because it won't be used
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "==== Selective proxy ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++$ENV{https_proxy} = 'http://sproxy:8080';
++$ENV{no_proxy} = '*.example.net,example.com,.example.org';
++delete $ENV{HTTPS_PROXY};
++delete $ENV{NO_PROXY};
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "---- Exact match for no_proxy ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- Subdomain of exact domain in no_proxy ----";
++$ua = useragent(for_url => 'http://sub.example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++
++diag "---- example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://sub.example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.net does not match *.example.net ----";
++$ua = useragent(for_url => 'https://badexample.net');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "---- example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://sub.example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.org does not match .example.org ----";
++$ua = useragent(for_url => 'https://badexample.org');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "==== Selective proxy (alternate variables) ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++delete $ENV{https_proxy};
++$ENV{HTTPS_PROXY} = 'http://sproxy:8080';
++delete $ENV{no_proxy};
++$ENV{NO_PROXY} = '*.example.net,example.com,.example.org';
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "---- Exact match for no_proxy ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- Subdomain of exact domain in no_proxy ----";
++$ua = useragent(for_url => 'http://sub.example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++
++diag "---- example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://sub.example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.net does not match *.example.net ----";
++$ua = useragent(for_url => 'https://badexample.net');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "---- example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://sub.example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.org does not match .example.org ----";
++$ua = useragent(for_url => 'https://badexample.org');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "==== Selective proxy (many variables) ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++$ENV{https_proxy} = 'http://sproxy:8080';
++# This one should be ignored in favour of https_proxy
++$ENV{HTTPS_PROXY} = 'http://not.preferred.proxy:3128';
++# These two should be merged
++$ENV{no_proxy} = '*.example.net,example.com';
++$ENV{NO_PROXY} = '.example.org';
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "---- Exact match for no_proxy ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- Subdomain of exact domain in no_proxy ----";
++$ua = useragent(for_url => 'http://sub.example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++
++diag "---- example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://sub.example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.net does not match *.example.net ----";
++$ua = useragent(for_url => 'https://badexample.net');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "==== One but not the other ====\n";
++$ENV{http_proxy} = 'http://proxy:8080';
++delete $ENV{https_proxy};
++delete $ENV{HTTPS_PROXY};
++delete $ENV{no_proxy};
++delete $ENV{NO_PROXY};
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://proxy:8080', 'should use proxy');
++
++delete $ENV{http_proxy};
++$ENV{https_proxy} = 'http://sproxy:8080';
++delete $ENV{HTTPS_PROXY};
++delete $ENV{no_proxy};
++delete $ENV{NO_PROXY};
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://sproxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++done_testing;
+-- 
+2.11.0
+
diff -Nru ikiwiki-3.20141016.4/CVE-2019-9187-4.patch ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-4.patch
--- ikiwiki-3.20141016.4/CVE-2019-9187-4.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/CVE-2019-9187-4.patch	2019-03-07 17:26:55.000000000 +1100
@@ -0,0 +1,175 @@
+From 9a275b2f1846d7268c71a740975447e269383849 Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 16:56:41 +0000
+Subject: [PATCH] doc: Document security issues involving LWP::UserAgent
+
+Recommend the LWPx::ParanoidAgent module where appropriate.
+It is particularly important for openid, since unauthenticated users
+can control which URLs that plugin will contact. Conversely, it is
+non-critical for blogspam, since the URL to be contacted is under
+the wiki administrator's control.
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ doc/plugins/aggregate.mdwn  |  4 ++++
+ doc/plugins/blogspam.mdwn   |  2 ++
+ doc/plugins/openid.mdwn     |  7 +++++--
+ doc/plugins/pinger.mdwn     |  8 +++++---
+ doc/security.mdwn           | 49 +++++++++++++++++++++++++++++++++++++++++++++
+ doc/tips/using_a_proxy.mdwn | 22 ++++++++++++++++++++
+ 6 files changed, 87 insertions(+), 5 deletions(-)
+ create mode 100644 doc/tips/using_a_proxy.mdwn
+
+diff --git a/doc/plugins/aggregate.mdwn b/doc/plugins/aggregate.mdwn
+index 75123d923..b1db828d1 100644
+--- a/doc/plugins/aggregate.mdwn
++++ b/doc/plugins/aggregate.mdwn
+@@ -11,6 +11,10 @@ The [[meta]] and [[tag]] plugins are also recommended to be used with this
+ one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
+ feeds can easily contain html problems, some of which these plugins can fix.
+ 
++Installing the [[!cpan LWPx::ParanoidAgent]] Perl module is strongly
++recommended. The [[!cpan LWP]] module can also be used, but is susceptible
++to server-side request forgery.
++
+ ## triggering aggregation
+ 
+ You will need to run ikiwiki periodically from a cron job, passing it the
+diff --git a/doc/plugins/blogspam.mdwn b/doc/plugins/blogspam.mdwn
+index 745fc48e2..0ebae7d84 100644
+--- a/doc/plugins/blogspam.mdwn
++++ b/doc/plugins/blogspam.mdwn
+@@ -11,6 +11,8 @@ To check for and moderate comments, log in to the wiki as an admin,
+ go to your Preferences page, and click the "Comment Moderation" button.
+ 
+ The plugin requires the [[!cpan JSON]] perl module.
++The [[!cpan LWPx::ParanoidAgent]] Perl module is recommended,
++although this plugin can also fall back to [[!cpan LWP]].
+ 
+ You can control how content is tested via the `blogspam_options` setting.
+ The list of options is [here](http://blogspam.net/api/2.0/testComment.html#options).
+diff --git a/doc/plugins/openid.mdwn b/doc/plugins/openid.mdwn
+index 4c8e0d381..a061cb43f 100644
+--- a/doc/plugins/openid.mdwn
++++ b/doc/plugins/openid.mdwn
+@@ -7,8 +7,11 @@ into the wiki.
+ The plugin needs the [[!cpan Net::OpenID::Consumer]] perl module.
+ Version 1.x is needed in order for OpenID v2 to work.
+ 
+-The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for
+-added security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed
++The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
++The [[!cpan LWP]] module can also be used, but is susceptible to
++server-side request forgery.
++
++The [[!cpan Crypt::SSLeay]] Perl module is needed
+ to support users entering "https" OpenID urls.
+ 
+ This plugin is enabled by default, but can be turned off if you want to
+diff --git a/doc/plugins/pinger.mdwn b/doc/plugins/pinger.mdwn
+index 00d83e1bb..f37979ac6 100644
+--- a/doc/plugins/pinger.mdwn
++++ b/doc/plugins/pinger.mdwn
+@@ -10,9 +10,11 @@ can be kept up-to-date.
+ To configure what URLs to ping, use the [[ikiwiki/directive/ping]]
+ [[ikiwiki/directive]].
+ 
+-The [[!cpan LWP]] perl module is used for pinging. Or the [[!cpan
+-LWPx::ParanoidAgent]] perl module is used if available, for added security.
+-Finally, the [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
++The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
++The [[!cpan LWP]] module can also be used, but is susceptible
++to server-side request forgery.
++
++The [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
+ "https" urls.
+ 
+ By default the pinger will try to ping a site for 15 seconds before timing
+diff --git a/doc/security.mdwn b/doc/security.mdwn
+index e7770dd27..378a2e4bc 100644
+--- a/doc/security.mdwn
++++ b/doc/security.mdwn
+@@ -611,3 +611,52 @@ This was fixed in ikiwiki 3.20170111, with fixes backported to Debian 8
+ in version 3.20141016.4.
+ 
+ ([[!debcve CVE-2017-0356]]/OVE-20170111-0001)
++
++## Server-side request forgery via aggregate plugin
++
++The ikiwiki maintainers discovered that the [[plugins/aggregate]] plugin
++did not use [[!cpan LWPx::ParanoidAgent]]. On sites where the
++aggregate plugin is enabled, authorized wiki editors could tell ikiwiki
++to fetch potentially undesired URIs even if LWPx::ParanoidAgent was
++installed:
++
++* local files via `file:` URIs
++* other URI schemes that might be misused by attackers, such as `gopher:`
++* hosts that resolve to loopback IP addresses (127.x.x.x)
++* hosts that resolve to RFC 1918 IP addresses (192.168.x.x etc.)
++
++This could be used by an attacker to publish information that should not have
++been accessible, cause denial of service by requesting "tarpit" URIs that are
++slow to respond, or cause undesired side-effects if local web servers implement
++["unsafe"](https://tools.ietf.org/html/rfc7231#section-4.2.1) GET requests.
++([[!debcve CVE-2019-9187]])
++
++Additionally, if the LWPx::ParanoidAgent module was not installed, the
++[[plugins/blogspam]], [[plugins/openid]] and [[plugins/pinger]] plugins
++would fall back to [[!cpan LWP]], which is susceptible to similar attacks.
++This is unlikely to be a practical problem for the blogspam plugin because
++the URL it requests is under the control of the wiki administrator, but
++the openid plugin can request URLs controlled by unauthenticated remote
++users, and the pinger plugin can request URLs controlled by authorized
++wiki editors.
++
++This is addressed in ikiwiki 3.20190228 as follows, with the same fixes
++backported to Debian 9 in version 3.20170111.1:
++
++* URI schemes other than `http:` and `https:` are not accepted, preventing
++  access to `file:`, `gopher:`, etc.
++
++* If a proxy is [[configured in the ikiwiki setup file|tips/using_a_proxy]],
++  it is used for all outgoing `http:` and `https:` requests. In this case
++  the proxy is responsible for blocking any requests that are undesired,
++  including loopback or RFC 1918 addresses.
++
++* If a proxy is not configured, and LWPx::ParanoidAgent is installed,
++  it will be used. This prevents loopback and RFC 1918 IP addresses, and
++  sets a timeout to avoid denial of service via "tarpit" URIs.
++
++* Otherwise, the ordinary LWP user-agent will be used. This allows requests
++  to loopback and RFC 1918 IP addresses, and has less robust timeout
++  behaviour. We are not treating this as a vulnerability: if this
++  behaviour is not acceptable for your site, please make sure to install
++  LWPx::ParanoidAgent or disable the affected plugins.
+diff --git a/doc/tips/using_a_proxy.mdwn b/doc/tips/using_a_proxy.mdwn
+new file mode 100644
+index 000000000..39df3c42a
+--- /dev/null
++++ b/doc/tips/using_a_proxy.mdwn
+@@ -0,0 +1,22 @@
++Some ikiwiki plugins make outgoing HTTP requests from the web server:
++
++* [[plugins/aggregate]] (to download Atom and RSS feeds)
++* [[plugins/blogspam]] (to check whether a comment or edit is spam)
++* [[plugins/openid]] (to authenticate users)
++* [[plugins/pinger]] (to ping other ikiwiki installations)
++
++If your ikiwiki installation cannot contact the Internet without going
++through a proxy, you can configure this in the [[setup file|setup]] by
++setting environment variables:
++
++    ENV:
++        http_proxy: "http://proxy.example.com:8080";
++        https_proxy: "http://proxy.example.com:8080";
++        # optional
++        no_proxy: ".example.com,www.example.org"
++
++Note that some plugins will use the configured proxy for all destinations,
++even if they are listed in `no_proxy`.
++
++To avoid server-side request forgery attacks, ensure that your proxy does
++not allow requests to addresses that are considered to be internal.
+-- 
+2.11.0
+
diff -Nru ikiwiki-3.20141016.4/debian/changelog ikiwiki-3.20141016.4+deb8u1/debian/changelog
--- ikiwiki-3.20141016.4/debian/changelog	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/debian/changelog	2019-03-07 17:35:55.000000000 +1100
@@ -1,3 +1,10 @@
+ikiwiki (3.20141016.4+deb8u1) jessie-security; urgency=high
+
+  * Non-maintainer upload by the LTS Team.
+  * CVE-2019-9187: Fix server-side request forgery via aggregate plugin.
+
+ -- Brian May <bam@debian.org>  Thu, 07 Mar 2019 17:35:55 +1100
+
 ikiwiki (3.20141016.4) jessie-security; urgency=high
 
   * Reference CVE-2016-4561 in 3.20141016.3 changelog
diff -Nru ikiwiki-3.20141016.4/debian/control ikiwiki-3.20141016.4+deb8u1/debian/control
--- ikiwiki-3.20141016.4/debian/control	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/debian/control	2019-03-07 17:35:55.000000000 +1100
@@ -17,7 +17,8 @@
   libnet-openid-consumer-perl,
   libxml-feed-perl,
   libxml-parser-perl,
-  libxml-twig-perl
+  libxml-twig-perl,
+  liblwpx-paranoidagent-perl,
 Maintainer: Simon McVittie <smcv@debian.org>
 Uploaders: Josh Triplett <josh@freedesktop.org>
 Standards-Version: 3.9.5
diff -Nru ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-1.patch ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-1.patch
--- ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-1.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-1.patch	2019-03-07 17:32:31.000000000 +1100
@@ -0,0 +1,23 @@
+From e7b0d4a0fff8ed45a90c2efe8ef294bdf7c9bdac Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 16:29:19 +0000
+Subject: [PATCH] useragent: Raise an exception if the LWP module can't be
+ loaded
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ IkiWiki.pm | 3 +++
+ 1 file changed, 3 insertions(+)
+
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2368,6 +2368,9 @@
+ }
+ 
+ sub useragent () {
++	eval q{use LWP};
++	error($@) if $@;
++
+ 	return LWP::UserAgent->new(
+ 		cookie_jar => $config{cookiejar},
+ 		env_proxy => 1,		# respect proxy env vars
diff -Nru ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-2.patch ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-2.patch
--- ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-2.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-2.patch	2019-03-07 17:32:43.000000000 +1100
@@ -0,0 +1,224 @@
+From 67543ce1d62161fdef9dca198289d7dd7dceacc0 Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 16:30:07 +0000
+Subject: [PATCH] useragent: Don't allow non-HTTP protocols to be used
+
+This prevents the aggregate plugin from being used to read the contents
+of local files via file:/// URLs.
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ IkiWiki.pm                         |   1 +
+ t/aggregate-file.t                 | 173 +++++++++++++++++++++++++++++++++++++
+ t/noparanoia/LWPx/ParanoidAgent.pm |   2 +
+ t/secret.rss                       |  11 +++
+ 4 files changed, 187 insertions(+)
+ create mode 100755 t/aggregate-file.t
+ create mode 100644 t/noparanoia/LWPx/ParanoidAgent.pm
+ create mode 100644 t/secret.rss
+
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2375,6 +2375,7 @@
+ 		cookie_jar => $config{cookiejar},
+ 		env_proxy => 1,		# respect proxy env vars
+ 		agent => $config{useragent},
++		protocols_allowed => [qw(http https)],
+ 	);
+ }
+ 
+--- /dev/null
++++ b/t/aggregate-file.t
+@@ -0,0 +1,173 @@
++#!/usr/bin/perl
++use utf8;
++use warnings;
++use strict;
++
++use Encode;
++use Test::More;
++
++BEGIN {
++	plan(skip_all => "CGI not available")
++		unless eval q{
++			use CGI qw();
++			1;
++		};
++
++	plan(skip_all => "IPC::Run not available")
++		unless eval q{
++			use IPC::Run qw(run);
++			1;
++		};
++
++	use_ok('IkiWiki');
++	use_ok('YAML::XS');
++}
++
++# We check for English error messages
++$ENV{LC_ALL} = 'C';
++
++use Cwd qw(getcwd);
++use Errno qw(ENOENT);
++
++my $installed = $ENV{INSTALLED_TESTS};
++
++my @command;
++if ($installed) {
++	@command = qw(ikiwiki --plugin inline);
++}
++else {
++	ok(! system("make -s ikiwiki.out"));
++	@command = ("perl", "-I".getcwd."/blib/lib", './ikiwiki.out',
++		'--underlaydir='.getcwd.'/underlays/basewiki',
++		'--set', 'underlaydirbase='.getcwd.'/underlays',
++		'--templatedir='.getcwd.'/templates');
++}
++
++sub write_old_file {
++	my $name = shift;
++	my $dir = shift;
++	my $content = shift;
++	writefile($name, $dir, $content);
++	ok(utime(333333333, 333333333, "$dir/$name"));
++}
++
++sub write_setup_file {
++	my %params = @_;
++	my %setup = (
++		wikiname => 'this is the name of my wiki',
++		srcdir => getcwd.'/t/tmp/in',
++		destdir => getcwd.'/t/tmp/out',
++		url => 'http://example.com',
++		cgiurl => 'http://example.com/cgi-bin/ikiwiki.cgi',
++		cgi_wrapper => getcwd.'/t/tmp/ikiwiki.cgi',
++		cgi_wrappermode => '0751',
++		add_plugins => [qw(aggregate)],
++		disable_plugins => [qw(emailauth openid passwordauth)],
++		aggregate_webtrigger => 1,
++	);
++	if ($params{without_paranoia}) {
++		$setup{libdirs} = [getcwd.'/t/noparanoia'];
++	}
++	unless ($installed) {
++		$setup{ENV} = { 'PERL5LIB' => getcwd.'/blib/lib' };
++	}
++	writefile("test.setup", "t/tmp",
++		"# IkiWiki::Setup::Yaml - YAML formatted setup file\n" .
++		Dump(\%setup));
++}
++
++sub thoroughly_rebuild {
++	ok(unlink("t/tmp/ikiwiki.cgi") || $!{ENOENT});
++	ok(! system(@command, qw(--setup t/tmp/test.setup --rebuild --wrappers)));
++}
++
++sub run_cgi {
++	my (%args) = @_;
++	my ($in, $out);
++	my $method = $args{method} || 'GET';
++	my $environ = $args{environ} || {};
++	my $params = $args{params} || { do => 'prefs' };
++
++	my %defaults = (
++		SCRIPT_NAME	=> '/cgi-bin/ikiwiki.cgi',
++		HTTP_HOST	=> 'example.com',
++	);
++
++	my $cgi = CGI->new($args{params});
++	my $query_string = $cgi->query_string();
++	diag $query_string;
++
++	if ($method eq 'POST') {
++		$defaults{REQUEST_METHOD} = 'POST';
++		$in = $query_string;
++		$defaults{CONTENT_LENGTH} = length $in;
++	} else {
++		$defaults{REQUEST_METHOD} = 'GET';
++		$defaults{QUERY_STRING} = $query_string;
++	}
++
++	my %envvars = (
++		%defaults,
++		%$environ,
++	);
++	run(["./t/tmp/ikiwiki.cgi"], \$in, \$out, init => sub {
++		map {
++			$ENV{$_} = $envvars{$_}
++		} keys(%envvars);
++	});
++
++	return decode_utf8($out);
++}
++
++sub test {
++	my $content;
++
++	ok(! system(qw(rm -rf t/tmp)));
++	ok(! system(qw(mkdir t/tmp)));
++
++	write_old_file('aggregator.mdwn', 't/tmp/in',
++		'[[!aggregate name="ssrf" url="file://'.getcwd.'/t/secret.rss"]]'
++		.'[[!inline pages="internal(aggregator/*)"]]');
++
++	write_setup_file();
++	thoroughly_rebuild();
++
++	$content = run_cgi(
++		method => 'GET',
++		params => {
++			do => 'aggregate_webtrigger',
++		},
++	);
++	unlike($content, qr{creating new page});
++	unlike($content, qr{Secrets});
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
++
++	thoroughly_rebuild();
++	$content = readfile('t/tmp/out/aggregator/index.html');
++	unlike($content, qr{Secrets});
++
++	diag('Trying test again with LWPx::ParanoidAgent disabled');
++
++	write_setup_file(without_paranoia => 1);
++	thoroughly_rebuild();
++
++	$content = run_cgi(
++		method => 'GET',
++		params => {
++			do => 'aggregate_webtrigger',
++		},
++	);
++	unlike($content, qr{creating new page});
++	unlike($content, qr{Secrets});
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
++	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
++
++	thoroughly_rebuild();
++	$content = readfile('t/tmp/out/aggregator/index.html');
++	unlike($content, qr{Secrets});
++}
++
++test();
++
++done_testing();
+--- /dev/null
++++ b/t/noparanoia/LWPx/ParanoidAgent.pm
+@@ -0,0 +1,2 @@
++# make import fail
++0;
+--- /dev/null
++++ b/t/secret.rss
+@@ -0,0 +1,11 @@
++<?xml version="1.0"?>
++<rss version="2.0">
++<channel>
++<title>Secrets go here</title>
++<description>Secrets go here</description>
++<item>
++  <title>Secrets go here</title>
++  <description>Secrets go here</description>
++</item>
++</channel>
++</rss>
diff -Nru ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-3.patch ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-3.patch
--- ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-3.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-3.patch	2019-03-07 17:32:58.000000000 +1100
@@ -0,0 +1,574 @@
+From d283e4ca1aeb6ca8cc0951c8495f778071076013 Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 17:22:06 +0000
+Subject: [PATCH] useragent: Automatically choose whether to use
+ LWPx::ParanoidAgent
+
+The simple implementation of this, which I'd prefer to use, would be:
+if we can import LWPx::ParanoidAgent, use it; otherwise, use
+LWP::UserAgent.
+
+However, aggregate has historically worked with proxies, and
+LWPx::ParanoidAgent quite reasonably refuses to work with proxies
+(because it can't know whether those proxies are going to do the same
+filtering that LWPx::ParanoidAgent would).
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ IkiWiki.pm                  | 123 ++++++++++++++++-
+ IkiWiki/Plugin/aggregate.pm |   5 +-
+ IkiWiki/Plugin/blogspam.pm  |  16 +--
+ IkiWiki/Plugin/openid.pm    |  12 +-
+ IkiWiki/Plugin/pinger.pm    |  21 ++-
+ t/useragent.t               | 317 ++++++++++++++++++++++++++++++++++++++++++++
+ 6 files changed, 458 insertions(+), 36 deletions(-)
+ create mode 100755 t/useragent.t
+
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2367,16 +2367,131 @@
+ 	$autofiles{$file}{generator}=$generator;
+ }
+ 
+-sub useragent () {
++sub useragent (@) {
++	my %params = @_;
++	my $for_url = delete $params{for_url};
++	# Fail safe, in case a plugin calling this function is relying on
++	# a future parameter to make the UA more strict
++	foreach my $key (keys %params) {
++		error "Internal error: useragent(\"$key\" => ...) not understood";
++	}
++
+ 	eval q{use LWP};
+ 	error($@) if $@;
+ 
+-	return LWP::UserAgent->new(
+-		cookie_jar => $config{cookiejar},
+-		env_proxy => 1,		# respect proxy env vars
++	my %args = (
+ 		agent => $config{useragent},
++		cookie_jar => $config{cookiejar},
++		env_proxy => 0,
+ 		protocols_allowed => [qw(http https)],
+ 	);
++	my %proxies;
++
++	if (defined $for_url) {
++		# We know which URL we're going to fetch, so we can choose
++		# whether it's going to go through a proxy or not.
++		#
++		# We reimplement http_proxy, https_proxy and no_proxy here, so
++		# that we are not relying on LWP implementing them exactly the
++		# same way we do.
++
++		eval q{use URI};
++		error($@) if $@;
++
++		my $proxy;
++		my $uri = URI->new($for_url);
++
++		if ($uri->scheme eq 'http') {
++			$proxy = $ENV{http_proxy};
++			# HTTP_PROXY is deliberately not implemented
++			# because the HTTP_* namespace is also used by CGI
++		}
++		elsif ($uri->scheme eq 'https') {
++			$proxy = $ENV{https_proxy};
++			$proxy = $ENV{HTTPS_PROXY} unless defined $proxy;
++		}
++		else {
++			$proxy = undef;
++		}
++
++		foreach my $var (qw(no_proxy NO_PROXY)) {
++			my $no_proxy = $ENV{$var};
++			if (defined $no_proxy) {
++				foreach my $domain (split /\s*,\s*/, $no_proxy) {
++					if ($domain =~ s/^\*?\.//) {
++						# no_proxy="*.example.com" or
++						# ".example.com": match suffix
++						# against .example.com
++						if ($uri->host =~ m/(^|\.)\Q$domain\E$/i) {
++							$proxy = undef;
++						}
++					}
++					else {
++						# no_proxy="example.com":
++						# match exactly example.com
++						if (lc $uri->host eq lc $domain) {
++							$proxy = undef;
++						}
++					}
++				}
++			}
++		}
++
++		if (defined $proxy) {
++			$proxies{$uri->scheme} = $proxy;
++			# Paranoia: make sure we can't bypass the proxy
++			$args{protocols_allowed} = [$uri->scheme];
++		}
++	}
++	else {
++		# The plugin doesn't know yet which URL(s) it's going to
++		# fetch, so we have to make some conservative assumptions.
++		my $http_proxy = $ENV{http_proxy};
++		my $https_proxy = $ENV{https_proxy};
++		$https_proxy = $ENV{HTTPS_PROXY} unless defined $https_proxy;
++
++		# We don't respect no_proxy here: if we are not using the
++		# paranoid user-agent, then we need to give the proxy the
++		# opportunity to reject undesirable requests.
++
++		# If we have one, we need the other: otherwise, neither
++		# LWPx::ParanoidAgent nor the proxy would have the
++		# opportunity to filter requests for the other protocol.
++		if (defined $https_proxy && defined $http_proxy) {
++			%proxies = (http => $http_proxy, https => $https_proxy);
++		}
++		elsif (defined $https_proxy) {
++			%proxies = (http => $https_proxy, https => $https_proxy);
++		}
++		elsif (defined $http_proxy) {
++			%proxies = (http => $http_proxy, https => $http_proxy);
++		}
++
++	}
++
++	if (scalar keys %proxies) {
++		# The configured proxy is responsible for deciding which
++		# URLs are acceptable to fetch and which URLs are not.
++		my $ua = LWP::UserAgent->new(%args);
++		foreach my $scheme (@{$ua->protocols_allowed}) {
++			unless ($proxies{$scheme}) {
++				error "internal error: $scheme is allowed but has no proxy";
++			}
++		}
++		# We can't pass the proxies in %args because that only
++		# works since LWP 6.24.
++		foreach my $scheme (keys %proxies) {
++			$ua->proxy($scheme, $proxies{$scheme});
++		}
++		return $ua;
++	}
++
++	eval q{use LWPx::ParanoidAgent};
++	if ($@) {
++		print STDERR "warning: installing LWPx::ParanoidAgent is recommended\n";
++		return LWP::UserAgent->new(%args);
++	}
++	return LWPx::ParanoidAgent->new(%args);
+ }
+ 
+ sub sortspec_translate ($$) {
+--- a/IkiWiki/Plugin/aggregate.pm
++++ b/IkiWiki/Plugin/aggregate.pm
+@@ -513,7 +513,10 @@
+ 			}
+ 			$feed->{feedurl}=pop @urls;
+ 		}
+-		my $ua=useragent();
++		# Using the for_url parameter makes sure we crash if used
++		# with an older IkiWiki.pm that didn't automatically try
++		# to use LWPx::ParanoidAgent.
++		my $ua=useragent(for_url => $feed->{feedurl});
+ 		my $res=URI::Fetch->fetch($feed->{feedurl}, UserAgent=>$ua);
+ 		if (! $res) {
+ 			$feed->{message}=URI::Fetch->errstr;
+--- a/IkiWiki/Plugin/blogspam.pm
++++ b/IkiWiki/Plugin/blogspam.pm
+@@ -57,18 +57,10 @@
+ 	};
+ 	error $@ if $@;
+ 
+-	eval q{use LWPx::ParanoidAgent};
+-	if (!$@) {
+-		$client=LWPx::ParanoidAgent->new(agent => $config{useragent});
+-	}
+-	else {
+-		eval q{use LWP};
+-		if ($@) {
+-			error $@;
+-			return;
+-		}
+-		$client=useragent();
+-	}
++	# Using the for_url parameter makes sure we crash if used
++	# with an older IkiWiki.pm that didn't automatically try
++	# to use LWPx::ParanoidAgent.
++	$client=useragent(for_url => $config{blogspam_server});
+ }
+ 
+ sub checkcontent (@) {
+--- a/IkiWiki/Plugin/openid.pm
++++ b/IkiWiki/Plugin/openid.pm
+@@ -237,14 +237,10 @@
+ 	eval q{use Net::OpenID::Consumer};
+ 	error($@) if $@;
+ 
+-	my $ua;
+-	eval q{use LWPx::ParanoidAgent};
+-	if (! $@) {
+-		$ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
+-	}
+-	else {
+-		$ua=useragent();
+-	}
++	# We pass the for_url parameter, even though it's undef, because
++	# that will make sure we crash if used with an older IkiWiki.pm
++	# that didn't automatically try to use LWPx::ParanoidAgent.
++	my $ua=useragent(for_url => undef);
+ 
+ 	# Store the secret in the session.
+ 	my $secret=$session->param("openid_secret");
+--- a/IkiWiki/Plugin/pinger.pm
++++ b/IkiWiki/Plugin/pinger.pm
+@@ -70,17 +70,16 @@
+ 		eval q{use Net::INET6Glue::INET_is_INET6}; # may not be available
+ 		
+ 		my $ua;
+-		eval q{use LWPx::ParanoidAgent};
+-		if (!$@) {
+-			$ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
+-		}
+-		else {
+-			eval q{use LWP};
+-			if ($@) {
+-				debug(gettext("LWP not found, not pinging"));
+-				return;
+-			}
+-			$ua=useragent();
++		eval {
++			# We pass the for_url parameter, even though it's
++			# undef, because that will make sure we crash if used
++			# with an older IkiWiki.pm that didn't automatically
++			# try to use LWPx::ParanoidAgent.
++			$ua=useragent(for_url => undef);
++		};
++		if ($@) {
++			debug(gettext("LWP not found, not pinging").": $@");
++			return;
+ 		}
+ 		$ua->timeout($config{pinger_timeout} || 15);
+ 		
+--- /dev/null
++++ b/t/useragent.t
+@@ -0,0 +1,317 @@
++#!/usr/bin/perl
++use warnings;
++use strict;
++use Test::More;
++
++my $have_paranoid_agent;
++BEGIN {
++	plan(skip_all => 'LWP not available')
++		unless eval q{
++			use LWP qw(); 1;
++		};
++	use_ok("IkiWiki");
++	$have_paranoid_agent = eval q{
++		use LWPx::ParanoidAgent qw(); 1;
++	};
++}
++
++eval { useragent(future_feature => 1); };
++ok($@, 'future features should cause useragent to fail');
++
++diag "==== No proxy ====";
++delete $ENV{http_proxy};
++delete $ENV{https_proxy};
++delete $ENV{no_proxy};
++delete $ENV{HTTPS_PROXY};
++delete $ENV{NO_PROXY};
++
++diag "---- Unspecified URL ----";
++my $ua = useragent(for_url => undef);
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef, 'No http proxy');
++is($ua->proxy('https'), undef, 'No https proxy');
++
++diag "---- Specified URL ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef, 'No http proxy');
++is($ua->proxy('https'), undef, 'No https proxy');
++
++diag "==== Proxy for everything ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++$ENV{https_proxy} = 'http://sproxy:8080';
++delete $ENV{no_proxy};
++delete $ENV{HTTPS_PROXY};
++delete $ENV{NO_PROXY};
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++$ua = useragent(for_url => 'http://example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++# We don't care what $ua->proxy('https') is, because it won't be used
++$ua = useragent(for_url => 'https://example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++# We don't care what $ua->proxy('http') is, because it won't be used
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "==== Selective proxy ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++$ENV{https_proxy} = 'http://sproxy:8080';
++$ENV{no_proxy} = '*.example.net,example.com,.example.org';
++delete $ENV{HTTPS_PROXY};
++delete $ENV{NO_PROXY};
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "---- Exact match for no_proxy ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- Subdomain of exact domain in no_proxy ----";
++$ua = useragent(for_url => 'http://sub.example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++
++diag "---- example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://sub.example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.net does not match *.example.net ----";
++$ua = useragent(for_url => 'https://badexample.net');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "---- example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://sub.example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.org does not match .example.org ----";
++$ua = useragent(for_url => 'https://badexample.org');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "==== Selective proxy (alternate variables) ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++delete $ENV{https_proxy};
++$ENV{HTTPS_PROXY} = 'http://sproxy:8080';
++delete $ENV{no_proxy};
++$ENV{NO_PROXY} = '*.example.net,example.com,.example.org';
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "---- Exact match for no_proxy ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- Subdomain of exact domain in no_proxy ----";
++$ua = useragent(for_url => 'http://sub.example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++
++diag "---- example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://sub.example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.net does not match *.example.net ----";
++$ua = useragent(for_url => 'https://badexample.net');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "---- example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.org matches .example.org ----";
++$ua = useragent(for_url => 'https://sub.example.org');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.org does not match .example.org ----";
++$ua = useragent(for_url => 'https://badexample.org');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "==== Selective proxy (many variables) ====";
++$ENV{http_proxy} = 'http://proxy:8080';
++$ENV{https_proxy} = 'http://sproxy:8080';
++# This one should be ignored in favour of https_proxy
++$ENV{HTTPS_PROXY} = 'http://not.preferred.proxy:3128';
++# These two should be merged
++$ENV{no_proxy} = '*.example.net,example.com';
++$ENV{NO_PROXY} = '.example.org';
++
++diag "---- Unspecified URL ----";
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
++
++diag "---- Exact match for no_proxy ----";
++$ua = useragent(for_url => 'http://example.com');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- Subdomain of exact domain in no_proxy ----";
++$ua = useragent(for_url => 'http://sub.example.com');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++
++diag "---- example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- sub.example.net matches *.example.net ----";
++$ua = useragent(for_url => 'https://sub.example.net');
++SKIP: {
++	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
++	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
++}
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), undef);
++is($ua->proxy('https'), undef);
++
++diag "---- badexample.net does not match *.example.net ----";
++$ua = useragent(for_url => 'https://badexample.net');
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++diag "==== One but not the other ====\n";
++$ENV{http_proxy} = 'http://proxy:8080';
++delete $ENV{https_proxy};
++delete $ENV{HTTPS_PROXY};
++delete $ENV{no_proxy};
++delete $ENV{NO_PROXY};
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://proxy:8080', 'should use proxy');
++
++delete $ENV{http_proxy};
++$ENV{https_proxy} = 'http://sproxy:8080';
++delete $ENV{HTTPS_PROXY};
++delete $ENV{no_proxy};
++delete $ENV{NO_PROXY};
++$ua = useragent(for_url => undef);
++ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
++is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
++is($ua->proxy('http'), 'http://sproxy:8080', 'should use proxy');
++is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
++
++done_testing;
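(Aside for reviewers: the `no_proxy` matching behaviour that the tests above pin down can be summarised in a few rules. Here is a rough Python sketch of just those rules — `matches_no_proxy` is a hypothetical helper for illustration, not ikiwiki code.)

```python
def matches_no_proxy(host, no_proxy):
    """Sketch of the no_proxy matching rules the tests exercise:
    - a bare domain ("example.com") matches only that exact host,
      not its subdomains;
    - "*.example.net" and ".example.org" both match the domain
      itself and any subdomain, but not lookalike hosts such as
      badexample.net.
    Hypothetical helper, not ikiwiki's actual Perl implementation.
    """
    for entry in no_proxy.split(','):
        entry = entry.strip()
        if not entry:
            continue
        if entry.startswith('*.'):
            entry = entry[1:]  # "*.example.net" -> ".example.net"
        if entry.startswith('.'):
            # Leading dot: match the domain itself or any subdomain.
            if host == entry[1:] or host.endswith(entry):
                return True
        elif host == entry:
            # Bare domain: exact match only.
            return True
    return False
```

So `sub.example.com` still goes through the proxy even though `example.com` is listed, which is exactly what the "Subdomain of exact domain in no_proxy" cases assert.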
diff -Nru ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-4.patch ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-4.patch
--- ikiwiki-3.20141016.4/debian/patches/CVE-2019-9187-4.patch	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/debian/patches/CVE-2019-9187-4.patch	2019-03-07 17:35:45.000000000 +1100
@@ -0,0 +1,159 @@
+From 9a275b2f1846d7268c71a740975447e269383849 Mon Sep 17 00:00:00 2001
+From: Simon McVittie <smcv@debian.org>
+Date: Sun, 10 Feb 2019 16:56:41 +0000
+Subject: [PATCH] doc: Document security issues involving LWP::UserAgent
+
+Recommend the LWPx::ParanoidAgent module where appropriate.
+It is particularly important for openid, since unauthenticated users
+can control which URLs that plugin will contact. Conversely, it is
+non-critical for blogspam, since the URL to be contacted is under
+the wiki administrator's control.
+
+Signed-off-by: Simon McVittie <smcv@debian.org>
+---
+ doc/plugins/aggregate.mdwn  |  4 ++++
+ doc/plugins/blogspam.mdwn   |  2 ++
+ doc/plugins/openid.mdwn     |  7 +++++--
+ doc/plugins/pinger.mdwn     |  8 +++++---
+ doc/security.mdwn           | 49 +++++++++++++++++++++++++++++++++++++++++++++
+ doc/tips/using_a_proxy.mdwn | 22 ++++++++++++++++++++
+ 6 files changed, 87 insertions(+), 5 deletions(-)
+ create mode 100644 doc/tips/using_a_proxy.mdwn
+
+--- a/doc/plugins/aggregate.mdwn
++++ b/doc/plugins/aggregate.mdwn
+@@ -11,6 +11,10 @@
+ one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
+ feeds can easily contain html problems, some of which these plugins can fix.
+ 
++Installing the [[!cpan LWPx::ParanoidAgent]] Perl module is strongly
++recommended. The [[!cpan LWP]] module can also be used, but is susceptible
++to server-side request forgery.
++
+ ## triggering aggregation
+ 
+ You will need to run ikiwiki periodically from a cron job, passing it the
+--- a/doc/plugins/blogspam.mdwn
++++ b/doc/plugins/blogspam.mdwn
+@@ -11,6 +11,8 @@
+ go to your Preferences page, and click the "Comment Moderation" button.
+ 
+ The plugin requires the [[!cpan JSON]] perl module.
++The [[!cpan LWPx::ParanoidAgent]] Perl module is recommended,
++although this plugin can also fall back to [[!cpan LWP]].
+ 
+ You can control how content is tested via the `blogspam_options` setting.
+ The list of options is [here](http://blogspam.net/api/testComment.html#options).
+--- a/doc/plugins/openid.mdwn
++++ b/doc/plugins/openid.mdwn
+@@ -7,8 +7,11 @@
+ The plugin needs the [[!cpan Net::OpenID::Consumer]] perl module.
+ Version 1.x is needed in order for OpenID v2 to work.
+ 
+-The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for
+-added security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed
++The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
++The [[!cpan LWP]] module can also be used, but is susceptible to
++server-side request forgery.
++
++The [[!cpan Crypt::SSLeay]] Perl module is needed
+ to support users entering "https" OpenID urls.
+ 
+ This plugin is enabled by default, but can be turned off if you want to
+--- a/doc/plugins/pinger.mdwn
++++ b/doc/plugins/pinger.mdwn
+@@ -10,9 +10,11 @@
+ To configure what URLs to ping, use the [[ikiwiki/directive/ping]]
+ [[ikiwiki/directive]].
+ 
+-The [[!cpan LWP]] perl module is used for pinging. Or the [[!cpan
+-LWPx::ParanoidAgent]] perl module is used if available, for added security.
+-Finally, the [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
++The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
++The [[!cpan LWP]] module can also be used, but is susceptible
++to server-side request forgery.
++
++The [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
+ "https" urls.
+ 
+ By default the pinger will try to ping a site for 15 seconds before timing
+--- a/doc/security.mdwn
++++ b/doc/security.mdwn
+@@ -526,3 +526,52 @@
+ able to attach images. Upgrading ImageMagick to a version where
+ CVE-2016-3714 has been fixed is also recommended, but at the time of
+ writing no such version is available.
++
++## Server-side request forgery via aggregate plugin
++
++The ikiwiki maintainers discovered that the [[plugins/aggregate]] plugin
++did not use [[!cpan LWPx::ParanoidAgent]]. On sites where the
++aggregate plugin is enabled, authorized wiki editors could tell ikiwiki
++to fetch potentially undesired URIs even if LWPx::ParanoidAgent was
++installed:
++
++* local files via `file:` URIs
++* other URI schemes that might be misused by attackers, such as `gopher:`
++* hosts that resolve to loopback IP addresses (127.x.x.x)
++* hosts that resolve to RFC 1918 IP addresses (192.168.x.x etc.)
++
++This could be used by an attacker to publish information that should not have
++been accessible, cause denial of service by requesting "tarpit" URIs that are
++slow to respond, or cause undesired side-effects if local web servers implement
++["unsafe"](https://tools.ietf.org/html/rfc7231#section-4.2.1) GET requests.
++([[!debcve CVE-2019-9187]])
++
++Additionally, if the LWPx::ParanoidAgent module was not installed, the
++[[plugins/blogspam]], [[plugins/openid]] and [[plugins/pinger]] plugins
++would fall back to [[!cpan LWP]], which is susceptible to similar attacks.
++This is unlikely to be a practical problem for the blogspam plugin because
++the URL it requests is under the control of the wiki administrator, but
++the openid plugin can request URLs controlled by unauthenticated remote
++users, and the pinger plugin can request URLs controlled by authorized
++wiki editors.
++
++This is addressed in ikiwiki 3.20190228 as follows, with the same fixes
++backported to Debian 9 in version 3.20170111.1:
++
++* URI schemes other than `http:` and `https:` are not accepted, preventing
++  access to `file:`, `gopher:`, etc.
++
++* If a proxy is [[configured in the ikiwiki setup file|tips/using_a_proxy]],
++  it is used for all outgoing `http:` and `https:` requests. In this case
++  the proxy is responsible for blocking any requests that are undesired,
++  including loopback or RFC 1918 addresses.
++
++* If a proxy is not configured, and LWPx::ParanoidAgent is installed,
++  it will be used. This prevents loopback and RFC 1918 IP addresses, and
++  sets a timeout to avoid denial of service via "tarpit" URIs.
++
++* Otherwise, the ordinary LWP user-agent will be used. This allows requests
++  to loopback and RFC 1918 IP addresses, and has less robust timeout
++  behaviour. We are not treating this as a vulnerability: if this
++  behaviour is not acceptable for your site, please make sure to install
++  LWPx::ParanoidAgent or disable the affected plugins.
+--- /dev/null
++++ b/doc/tips/using_a_proxy.mdwn
+@@ -0,0 +1,22 @@
++Some ikiwiki plugins make outgoing HTTP requests from the web server:
++
++* [[plugins/aggregate]] (to download Atom and RSS feeds)
++* [[plugins/blogspam]] (to check whether a comment or edit is spam)
++* [[plugins/openid]] (to authenticate users)
++* [[plugins/pinger]] (to ping other ikiwiki installations)
++
++If your ikiwiki installation cannot contact the Internet without going
++through a proxy, you can configure this in the [[setup file|setup]] by
++setting environment variables:
++
++    ENV:
++        http_proxy: "http://proxy.example.com:8080";
++        https_proxy: "http://proxy.example.com:8080";
++        # optional
++        no_proxy: ".example.com,www.example.org"
++
++Note that some plugins will use the configured proxy for all destinations,
++even if they are listed in `no_proxy`.
++
++To avoid server-side request forgery attacks, ensure that your proxy does
++not allow requests to addresses that are considered to be internal.
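(For reviewers: the fix described in the security.mdwn hunk above boils down to a short decision sequence — reject non-HTTP(S) schemes, then prefer proxy, then ParanoidAgent, then plain LWP. A hedged Python sketch of that sequence follows; `agent_for` is a hypothetical name, not ikiwiki's API.)

```python
from urllib.parse import urlparse

def agent_for(url, proxy=None, have_paranoid=False):
    """Sketch of the user-agent selection order described in the
    advisory text above (hypothetical helper, not ikiwiki's Perl).
    """
    scheme = urlparse(url).scheme
    # URI schemes other than http:/https: are rejected outright,
    # preventing access to file:, gopher:, etc.
    if scheme not in ('http', 'https'):
        raise ValueError('only http: and https: URIs are accepted')
    if proxy:
        # With a configured proxy, all requests go through it; the
        # proxy is then responsible for blocking loopback/RFC 1918.
        return 'LWP via proxy'
    if have_paranoid:
        # Blocks loopback and RFC 1918 targets and sets a timeout.
        return 'LWPx::ParanoidAgent'
    # Last resort: plain LWP, with none of those protections.
    return 'plain LWP'
```

The last branch is why the advisory tells admins who care to install LWPx::ParanoidAgent or disable the affected plugins.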
diff -Nru ikiwiki-3.20141016.4/debian/patches/series ikiwiki-3.20141016.4+deb8u1/debian/patches/series
--- ikiwiki-3.20141016.4/debian/patches/series	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/debian/patches/series	2019-03-07 17:35:55.000000000 +1100
@@ -0,0 +1,4 @@
+CVE-2019-9187-1.patch
+CVE-2019-9187-2.patch
+CVE-2019-9187-3.patch
+CVE-2019-9187-4.patch
