Bug#1112331: trixie-pu: package swift/2.35.0-4
Package: release.debian.org
Severity: normal
Tags: trixie
X-Debbugs-Cc: swift@packages.debian.org
Control: affects -1 + src:swift
User: release.debian.org@packages.debian.org
Usertags: pu
Hi,
[ Reason ]
Since its latest upstream upgrade, the boto library mandates support
for the "Transfer-Encoding: aws-chunked" protocol. In many cases,
our users do not control which version of the boto library is used
as a client to Swift's S3 protocol support.
Upstream Swift added support for this 10-year-old protocol
in the point release 2.35.1 (as a backport from the master branch).
This was done upstream partly because I asked for it, explaining that
such a point release could be accepted in Debian if the changes aren't
too big, whereas a new upstream release such as 2.36.x would not. They
kindly helped and did the backport work so that it is acceptable to the
stable release team. Big up to Tim Burke for his work.
So, I would like to upgrade Swift from upstream version 2.35.0 to this
new 2.35.1, so that Swift gains support for this S3 transfer protocol,
which is used almost everywhere these days. That is the only change in
this point release, as per the upstream release notes.
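For context, the aws-chunked framing at issue is simple: the body is split
into size-prefixed chunks, terminated by a zero-length chunk and optional
trailers. Here is a minimal sketch (my own illustration, not code from the
Swift package) of the unsigned variant with a CRC32 trailer, as recent boto
clients send it:

```python
import base64
import zlib


def encode_aws_chunked(payload: bytes, chunk_size: int = 8192) -> bytes:
    """Encode a payload as unsigned aws-chunked with a CRC32 trailer.

    Each chunk is framed as '<hex-size>\r\n<data>\r\n'; a zero-length
    chunk ends the payload, followed by trailer header lines.
    """
    out = []
    for i in range(0, len(payload), chunk_size):
        chunk = payload[i:i + chunk_size]
        out.append(b'%x\r\n%s\r\n' % (len(chunk), chunk))
    out.append(b'0\r\n')  # final zero-length chunk
    # Trailer carries the whole-payload checksum, base64-encoded
    crc = zlib.crc32(payload).to_bytes(4, 'big')
    out.append(b'x-amz-checksum-crc32:' + base64.b64encode(crc) + b'\r\n\r\n')
    return b''.join(out)
```

The signed variants additionally carry a chunk-signature parameter on each
size line, which is what the new SigCheckerV4 code in the debdiff verifies.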
[ Impact ]
Without this update, Debian would ship with no support for the S3
protocol "Transfer-Encoding: aws-chunked" for the next 2 years, which
is really harmful, IMO.
[ Tests ]
Upstream runs extensive functional tests, and the Debian swift package
runs upstream unit tests at build time and in autopkgtest.
[ Risks ]
Not much risk, as the only part touched is the S3 protocol support.
[ Checklist ]
[x] *all* changes are documented in the d/changelog
[x] I reviewed all changes and I approve them
[x] attach debdiff against the package in (old)stable
[x] the issue is verified as fixed in unstable
Please allow me to upload swift/2.35.1-1~deb13u1 to Trixie p-u.
Cheers,
Thomas Goirand (zigo)
diff -Nru swift-2.35.0/CHANGELOG swift-2.35.1/CHANGELOG
--- swift-2.35.0/CHANGELOG 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/CHANGELOG 2025-08-22 17:56:44.000000000 +0200
@@ -1,3 +1,19 @@
+swift (2.35.1, epoxy stable backports)
+
+ * S3 API
+
+ * Added support for aws-chunked transfers. Recent AWS clients recently
+ began defaulting to this mode. See also:
+ https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-streaming.html
+
+ * Added support for verifying additional checksums during upload. All
+ algorithms currently supported by AWS are supported: CRC64NVME,
+ CRC32, CRC32C, SHA1, and SHA256. See also:
+ https://docs.aws.amazon.com/AmazonS3/latest/userguide/checking-object-integrity.html
+ Note that some algorithms require the availability of additional
+ libraries: ISA-L or anycrc.
+
+
swift (2.35.0, OpenStack Epoxy)
* Removed the use of `eval` in the xprofile middleware. Note that this
diff -Nru swift-2.35.0/debian/changelog swift-2.35.1/debian/changelog
--- swift-2.35.0/debian/changelog 2025-07-10 10:38:51.000000000 +0200
+++ swift-2.35.1/debian/changelog 2025-08-28 16:07:31.000000000 +0200
@@ -1,3 +1,9 @@
+swift (2.35.1-1~deb13u1) trixie; urgency=medium
+
+ * New upstream point release.
+
+ -- Thomas Goirand <zigo@debian.org> Thu, 28 Aug 2025 16:07:31 +0200
+
swift (2.35.0-4) unstable; urgency=medium
* Add test_PUT_account_update (Closes: #1108751).
diff -Nru swift-2.35.0/doc/s3api/rnc/complete_multipart_upload.rnc swift-2.35.1/doc/s3api/rnc/complete_multipart_upload.rnc
--- swift-2.35.0/doc/s3api/rnc/complete_multipart_upload.rnc 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/doc/s3api/rnc/complete_multipart_upload.rnc 2025-08-22 17:56:44.000000000 +0200
@@ -2,6 +2,11 @@
element CompleteMultipartUpload {
element Part {
element PartNumber { xsd:int } &
- element ETag { xsd:string }
+ element ETag { xsd:string } &
+ element ChecksumCRC32 { xsd:string }? &
+ element ChecksumCRC32C { xsd:string }? &
+ element ChecksumCRC64NVME { xsd:string }? &
+ element ChecksumSHA1 { xsd:string }? &
+ element ChecksumSHA256 { xsd:string }?
}+
}
diff -Nru swift-2.35.0/.gitreview swift-2.35.1/.gitreview
--- swift-2.35.0/.gitreview 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/.gitreview 2025-08-22 17:56:44.000000000 +0200
@@ -2,3 +2,4 @@
host=review.opendev.org
port=29418
project=openstack/swift.git
+defaultbranch=stable/2025.1
diff -Nru swift-2.35.0/releasenotes/notes/release-2.35.1-58d9d52dfe46ee4d.yaml swift-2.35.1/releasenotes/notes/release-2.35.1-58d9d52dfe46ee4d.yaml
--- swift-2.35.0/releasenotes/notes/release-2.35.1-58d9d52dfe46ee4d.yaml 1970-01-01 01:00:00.000000000 +0100
+++ swift-2.35.1/releasenotes/notes/release-2.35.1-58d9d52dfe46ee4d.yaml 2025-08-22 17:56:44.000000000 +0200
@@ -0,0 +1,16 @@
+---
+features:
+ - |
+ S3 API
+
+ * Added support for aws-chunked transfers. Recent AWS clients recently
+ began defaulting to this mode. See `Amazon's documentation
+ <https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-streaming.html>`__.
+
+ * Added support for verifying additional checksums during upload. All
+ algorithms currently supported by AWS are supported: ``CRC64NVME``,
+ ``CRC32``, ``CRC32C``, ``SHA1``, and ``SHA256``. See `Amazon's documentation
+ <https://docs.aws.amazon.com/AmazonS3/latest/userguide/checking-object-integrity.html>`__.
+ Note that some algorithms require the availability of additional
+ libraries: `ISA-L <https://github.com/intel/isa-l>`__ or
+ `anycrc <https://pypi.org/project/anycrc>`__.
diff -Nru swift-2.35.0/swift/common/middleware/s3api/exception.py swift-2.35.1/swift/common/middleware/s3api/exception.py
--- swift-2.35.0/swift/common/middleware/s3api/exception.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/swift/common/middleware/s3api/exception.py 2025-08-22 17:56:44.000000000 +0200
@@ -30,3 +30,81 @@
def __init__(self, resource, cause):
self.resource = resource
self.cause = cause
+
+
+class S3InputError(BaseException):
+ """
+ There was an error with the client input detected on read().
+
+ Inherit from BaseException (rather than Exception) so it cuts from the
+ proxy-server app (which will presumably be the one reading the input)
+ through all the layers of the pipeline back to s3api. It should never
+ escape the s3api middleware.
+ """
+
+
+class S3InputIncomplete(S3InputError):
+ pass
+
+
+class S3InputSizeError(S3InputError):
+ def __init__(self, expected, provided):
+ self.expected = expected
+ self.provided = provided
+
+
+class S3InputChunkTooSmall(S3InputError):
+ def __init__(self, bad_chunk_size, chunk_number):
+ self.bad_chunk_size = bad_chunk_size
+ self.chunk_number = chunk_number
+
+
+class S3InputMalformedTrailer(S3InputError):
+ pass
+
+
+class S3InputChunkSignatureMismatch(S3InputError):
+ """
+ Client provided a chunk-signature, but it doesn't match the data.
+
+ This should result in a 403 going back to the client.
+ """
+
+
+class S3InputMissingSecret(S3InputError):
+ """
+ Client provided per-chunk signatures, but we have no secret with which to
+ verify them.
+
+ This happens if the auth middleware responsible for the user never called
+ the provided ``check_signature`` callback.
+ """
+
+
+class S3InputSHA256Mismatch(S3InputError):
+ """
+ Client provided a X-Amz-Content-SHA256, but it doesn't match the data.
+
+ This should result in a BadDigest going back to the client.
+ """
+ def __init__(self, expected, computed):
+ self.expected = expected
+ self.computed = computed
+
+
+class S3InputChecksumMismatch(S3InputError):
+ """
+ Client provided a X-Amz-Checksum-* header, but it doesn't match the data.
+
+ This should result in a InvalidRequest going back to the client.
+ """
+
+
+class S3InputChecksumTrailerInvalid(S3InputError):
+ """
+ Client provided a X-Amz-Checksum-* trailer, but it is not a valid format.
+
+ This should result in a InvalidRequest going back to the client.
+ """
+ def __init__(self, trailer_name):
+ self.trailer = trailer_name
diff -Nru swift-2.35.0/swift/common/middleware/s3api/s3api.py swift-2.35.1/swift/common/middleware/s3api/s3api.py
--- swift-2.35.0/swift/common/middleware/s3api/s3api.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/swift/common/middleware/s3api/s3api.py 2025-08-22 17:56:44.000000000 +0200
@@ -159,7 +159,7 @@
InternalError, MethodNotAllowed, S3ResponseBase, S3NotImplemented
from swift.common.utils import get_logger, config_true_value, \
config_positive_int_value, split_path, closing_if_possible, \
- list_from_csv, parse_header
+ list_from_csv, parse_header, checksum
from swift.common.middleware.s3api.utils import Config
from swift.common.middleware.s3api.acl_handlers import get_acl_handler
from swift.common.registry import register_swift_info, \
@@ -300,6 +300,7 @@
self.logger = get_logger(
wsgi_conf, log_route='s3api', statsd_tail_prefix='s3api')
self.check_pipeline(wsgi_conf)
+ checksum.log_selected_implementation(self.logger)
def is_s3_cors_preflight(self, env):
if env['REQUEST_METHOD'] != 'OPTIONS' or not env.get('HTTP_ORIGIN'):
diff -Nru swift-2.35.0/swift/common/middleware/s3api/s3request.py swift-2.35.1/swift/common/middleware/s3api/s3request.py
--- swift-2.35.0/swift/common/middleware/s3api/s3request.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/swift/common/middleware/s3api/s3request.py 2025-08-22 17:56:44.000000000 +0200
@@ -16,6 +16,7 @@
import base64
import binascii
from collections import defaultdict, OrderedDict
+import contextlib
from email.header import Header
from hashlib import sha1, sha256
import hmac
@@ -25,7 +26,8 @@
import string
from swift.common.utils import split_path, json, md5, streq_const_time, \
- get_policy_index, InputProxy
+ close_if_possible, InputProxy, get_policy_index, list_from_csv, \
+ strict_b64decode, base64_str, checksum
from swift.common.registry import get_swift_info
from swift.common import swob
from swift.common.http import HTTP_OK, HTTP_CREATED, HTTP_ACCEPTED, \
@@ -56,8 +58,14 @@
MalformedXML, InvalidRequest, RequestTimeout, InvalidBucketName, \
BadDigest, AuthorizationHeaderMalformed, SlowDown, \
AuthorizationQueryParametersError, ServiceUnavailable, BrokenMPU, \
- InvalidPartNumber, InvalidPartArgument, XAmzContentSHA256Mismatch
-from swift.common.middleware.s3api.exception import NotS3Request
+ XAmzContentSHA256Mismatch, IncompleteBody, InvalidChunkSizeError, \
+ InvalidPartNumber, InvalidPartArgument, MalformedTrailerError
+from swift.common.middleware.s3api.exception import NotS3Request, \
+ S3InputError, S3InputSizeError, S3InputIncomplete, \
+ S3InputChunkSignatureMismatch, S3InputChunkTooSmall, \
+ S3InputMalformedTrailer, S3InputMissingSecret, \
+ S3InputSHA256Mismatch, S3InputChecksumMismatch, \
+ S3InputChecksumTrailerInvalid
from swift.common.middleware.s3api.utils import utf8encode, \
S3Timestamp, mktime, MULTIUPLOAD_SUFFIX
from swift.common.middleware.s3api.subresource import decode_acl, encode_acl
@@ -82,9 +90,53 @@
MAX_32BIT_INT = 2147483647
SIGV2_TIMESTAMP_FORMAT = '%Y-%m-%dT%H:%M:%S'
SIGV4_X_AMZ_DATE_FORMAT = '%Y%m%dT%H%M%SZ'
+SIGV4_CHUNK_MIN_SIZE = 8192
SERVICE = 's3' # useful for mocking out in tests
+CHECKSUMS_BY_HEADER = {
+ 'x-amz-checksum-crc32': checksum.crc32,
+ 'x-amz-checksum-crc32c': checksum.crc32c,
+ 'x-amz-checksum-crc64nvme': checksum.crc64nvme,
+ 'x-amz-checksum-sha1': sha1,
+ 'x-amz-checksum-sha256': sha256,
+}
+
+
+def _get_checksum_hasher(header):
+ try:
+ return CHECKSUMS_BY_HEADER[header]()
+ except (KeyError, NotImplementedError):
+ raise S3NotImplemented('The %s algorithm is not supported.' % header)
+
+
+def _validate_checksum_value(checksum_hasher, b64digest):
+ return strict_b64decode(
+ b64digest,
+ exact_size=checksum_hasher.digest_size,
+ )
+
+
+def _validate_checksum_header_cardinality(num_checksum_headers,
+ headers_and_trailer=False):
+ if num_checksum_headers > 1:
+ # inconsistent messaging for AWS compatibility...
+ msg = 'Expecting a single x-amz-checksum- header'
+ if not headers_and_trailer:
+ msg += '. Multiple checksum Types are not allowed.'
+ raise InvalidRequest(msg)
+
+
+def _is_streaming(aws_sha256):
+ return aws_sha256 in (
+ 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER',
+ 'STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD',
+ 'STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD-TRAILER',
+ )
+
+
def _header_strip(value):
# S3 seems to strip *all* control characters
if value is None:
@@ -119,20 +171,6 @@
doc='Get and set the %s acl property' % resource)
-class S3InputSHA256Mismatch(BaseException):
- """
- Client provided a X-Amz-Content-SHA256, but it doesn't match the data.
-
- Inherit from BaseException (rather than Exception) so it cuts from the
- proxy-server app (which will presumably be the one reading the input)
- through all the layers of the pipeline back to us. It should never escape
- the s3api middleware.
- """
- def __init__(self, expected, computed):
- self.expected = expected
- self.computed = computed
-
-
class HashingInput(InputProxy):
"""
wsgi.input wrapper to verify the SHA256 of the input as it's read.
@@ -152,6 +190,8 @@
)
def chunk_update(self, chunk, eof, *args, **kwargs):
+ # Note that "chunk" is just whatever was read from the input; this
+ # says nothing about whether the underlying stream uses aws-chunked
self._hasher.update(chunk)
if self.bytes_received < self._expected_length:
@@ -171,27 +211,517 @@
return chunk
-class SigV4Mixin(object):
+class ChecksummingInput(InputProxy):
"""
- A request class mixin to provide S3 signature v4 functionality
+ wsgi.input wrapper to calculate the X-Amz-Checksum-* of the input as it's
+ read. The calculated value is checked against an expected value that is
+ sent in either the request headers or trailers. To allow for the latter,
+ the expected value is lazy fetched once the input has been read.
+
+ :param wsgi_input: file-like object to be wrapped.
+ :param content_length: the expected number of bytes to be read.
+ :param checksum_hasher: a hasher to calculate the checksum of read bytes.
+ :param checksum_key: the name of the header or trailer that will have
+ the expected checksum value to be checked.
+ :param checksum_source: a dict that will have the ``checksum_key``.
"""
+ def __init__(self, wsgi_input, content_length, checksum_hasher,
+ checksum_key, checksum_source):
+ super().__init__(wsgi_input)
+ self._expected_length = content_length
+ self._checksum_hasher = checksum_hasher
+ self._checksum_key = checksum_key
+ self._checksum_source = checksum_source
+
+ def chunk_update(self, chunk, eof, *args, **kwargs):
+ # Note that "chunk" is just whatever was read from the input; this
+ # says nothing about whether the underlying stream uses aws-chunked
+ self._checksum_hasher.update(chunk)
+ if self.bytes_received < self._expected_length:
+ error = eof
+ elif self.bytes_received == self._expected_length:
+ # Lazy fetch checksum value because it may have come in trailers
+ b64digest = self._checksum_source.get(self._checksum_key)
+ try:
+ expected_raw_checksum = _validate_checksum_value(
+ self._checksum_hasher, b64digest)
+ except ValueError:
+ # If the checksum value came in a header then it would have
+ # been validated before the body was read, so if the validation
+ # fails here then we can infer that the checksum value came in
+ # a trailer. The S3InputChecksumTrailerInvalid raised here will
+ # propagate all the way back up the middleware stack to s3api
+ # where it is caught and translated to an InvalidRequest.
+ raise S3InputChecksumTrailerInvalid(self._checksum_key)
+ error = self._checksum_hasher.digest() != expected_raw_checksum
+ else:
+ error = True
+
+ if error:
+ self.close()
+ # Since we don't return the last chunk, the PUT never completes
+ raise S3InputChecksumMismatch(self._checksum_hasher.name.upper())
+ return chunk
+
+
+class ChunkReader(InputProxy):
+ """
+ wsgi.input wrapper to read a single chunk from an aws-chunked input and
+ validate its signature.
+
+ :param wsgi_input: a wsgi input.
+ :param chunk_size: number of bytes to read.
+ :param validator: function to call to validate the chunk's content.
+ :param chunk_params: string of params from the chunk's header.
+ """
+ def __init__(self, wsgi_input, chunk_size, validator, chunk_params):
+ super().__init__(wsgi_input)
+ self.chunk_size = chunk_size
+ self._validator = validator
+ if self._validator is None:
+ self._signature = None
+ else:
+ self._signature = self._parse_chunk_signature(chunk_params)
+ self._sha256 = sha256()
+
+ def _parse_chunk_signature(self, chunk_params):
+ if not chunk_params:
+ raise S3InputIncomplete
+ start, _, chunk_sig = chunk_params.partition('=')
+ if start.strip() != 'chunk-signature':
+ # Call the validator to update the string to sign
+ self._validator('', '')
+ raise S3InputChunkSignatureMismatch
+ if ';' in chunk_sig:
+ raise S3InputIncomplete
+ chunk_sig = chunk_sig.strip()
+ if not chunk_sig:
+ raise S3InputIncomplete
+ return chunk_sig
+
+ @property
+ def to_read(self):
+ return self.chunk_size - self.bytes_received
+
+ def read(self, size=None, *args, **kwargs):
+ if size is None or size < 0 or size > self.to_read:
+ size = self.to_read
+ return super().read(size)
+
+ def readline(self, size=None, *args, **kwargs):
+ if size is None or size < 0 or size > self.to_read:
+ size = self.to_read
+ return super().readline(size)
+
+ def chunk_update(self, chunk, eof, *args, **kwargs):
+ # Note that "chunk" is just whatever was read from the input
+ self._sha256.update(chunk)
+ if self.bytes_received == self.chunk_size:
+ if self._validator and not self._validator(
+ self._sha256.hexdigest(), self._signature):
+ self.close()
+ raise S3InputChunkSignatureMismatch
+ return chunk
+
+
+class StreamingInput:
+ """
+ wsgi.input wrapper to read a chunked input, verifying each chunk as it's
+ read. Once all chunks have been read, any trailers are read.
+
+ :param input: a wsgi input.
+ :param decoded_content_length: the number of payload bytes expected to be
+ extracted from chunks.
+ :param expected_trailers: the set of trailer names expected.
+ :param sig_checker: an instance of SigCheckerV4 that will be called to
+ verify each chunk's signature.
+ """
+ def __init__(self, input, decoded_content_length,
+ expected_trailers, sig_checker):
+ self._input = input
+ self._decoded_content_length = decoded_content_length
+ self._expected_trailers = expected_trailers
+ self._sig_checker = sig_checker
+ # Length of the payload remaining; i.e., number of bytes a caller
+ # still expects to be able to read. Once exhausted, we should be
+ # exactly at the trailers (if present)
+ self._to_read = decoded_content_length
+ # Reader for the current chunk that's in progress
+ self._chunk_reader = None
+ # Track the chunk number, for error messages
+ self._chunk_number = 0
+ # Track the size of the most recently read chunk. AWS enforces an 8k
+ # min chunk size (except the final chunk)
+ self._last_chunk_size = None
+ # When True, we've read the payload, but not necessarily the trailers
+ self._completed_payload = False
+ # When True, we've read the trailers
+ self._completed_trailers = False
+ # Any trailers present after the payload (not available until after
+ # caller has read full payload; i.e., until after _to_read is 0)
+ self.trailers = {}
+
+ def _read_chunk_header(self):
+ """
+ Read a chunk header, reading at most one line from the raw input.
+
+ Parse out the next chunk size and any other params.
+
+ :returns: a tuple of (chunk_size, chunk_params). chunk_size is an int,
+ chunk_params is string.
+ """
+ self._chunk_number += 1
+ chunk_header = swob.bytes_to_wsgi(self._input.readline())
+ if chunk_header[-2:] != '\r\n':
+ raise S3InputIncomplete('invalid chunk header: %s' % chunk_header)
+ chunk_size, _, chunk_params = chunk_header[:-2].partition(';')
+
+ try:
+ chunk_size = int(chunk_size, 16)
+ if chunk_size < 0:
+ raise ValueError
+ except ValueError:
+ raise S3InputIncomplete('invalid chunk header: %s' % chunk_header)
+
+ if self._last_chunk_size is not None and \
+ self._last_chunk_size < SIGV4_CHUNK_MIN_SIZE and \
+ chunk_size != 0:
+ raise S3InputChunkTooSmall(self._last_chunk_size,
+ self._chunk_number)
+ self._last_chunk_size = chunk_size
+
+ if chunk_size > self._to_read:
+ raise S3InputSizeError(
+ self._decoded_content_length,
+ self._decoded_content_length - self._to_read + chunk_size)
+ return chunk_size, chunk_params
+
+ def _read_payload(self, size, readline=False):
+ bufs = []
+ bytes_read = 0
+ while not self._completed_payload and (
+ bytes_read < size
+ # Make sure we read the trailing zero-byte chunk at the end
+ or self._to_read == 0):
+ if self._chunk_reader is None:
+ # OK, we're at the start of a new chunk
+ chunk_size, chunk_params = self._read_chunk_header()
+ self._chunk_reader = ChunkReader(
+ self._input,
+ chunk_size,
+ self._sig_checker and
+ self._sig_checker.check_chunk_signature,
+ chunk_params)
+ if readline:
+ buf = self._chunk_reader.readline(size - bytes_read)
+ else:
+ buf = self._chunk_reader.read(size - bytes_read)
+ bufs.append(buf)
+ if self._chunk_reader.to_read == 0:
+ # If it's the final chunk, we're in (possibly empty) trailers
+ # Otherwise, there's a CRLF chunk-separator
+ if self._chunk_reader.chunk_size == 0:
+ self._completed_payload = True
+ elif self._input.read(2) != b'\r\n':
+ raise S3InputIncomplete
+ self._chunk_reader = None
+ bytes_read += len(buf)
+ self._to_read -= len(buf)
+ if readline and buf[-1:] == b'\n':
+ break
+ return b''.join(bufs)
+
+ def _read_trailers(self):
+ if self._expected_trailers:
+ for line in iter(self._input.readline, b''):
+ if not line.endswith(b'\r\n'):
+ raise S3InputIncomplete
+ if line == b'\r\n':
+ break
+ key, _, value = swob.bytes_to_wsgi(line).partition(':')
+ if key.lower() not in self._expected_trailers:
+ raise S3InputMalformedTrailer
+ self.trailers[key.strip()] = value.strip()
+ if 'x-amz-trailer-signature' in self._expected_trailers \
+ and 'x-amz-trailer-signature' not in self.trailers:
+ raise S3InputIncomplete
+ if set(self.trailers.keys()) != self._expected_trailers:
+ raise S3InputMalformedTrailer
+ if 'x-amz-trailer-signature' in self._expected_trailers \
+ and self._sig_checker is not None:
+ if not self._sig_checker.check_trailer_signature(
+ self.trailers):
+ raise S3InputChunkSignatureMismatch
+ if len(self.trailers) == 1:
+ raise S3InputIncomplete
+ # Now that we've read them, we expect no more
+ self._expected_trailers = set()
+ elif self._input.read(2) not in (b'', b'\r\n'):
+ raise S3InputIncomplete
+
+ self._completed_trailers = True
+
+ def _read(self, size, readline=False):
+ data = self._read_payload(size, readline)
+ if self._completed_payload:
+ if not self._completed_trailers:
+ # read trailers, if present
+ self._read_trailers()
+ # At this point, we should have read everything; if we haven't,
+ # that's an error
+ if self._to_read:
+ raise S3InputSizeError(
+ self._decoded_content_length,
+ self._decoded_content_length - self._to_read)
+ return data
+
+ def read(self, size=None):
+ if size is None or size < 0 or size > self._to_read:
+ size = self._to_read
+ try:
+ return self._read(size)
+ except S3InputError:
+ self.close()
+ raise
+
+ def readline(self, size=None):
+ if size is None or size < 0 or size > self._to_read:
+ size = self._to_read
+ try:
+ return self._read(size, True)
+ except S3InputError:
+ self.close()
+ raise
+
+ def close(self):
+ close_if_possible(self._input)
+
+
+class BaseSigChecker:
+ def __init__(self, req):
+ self.req = req
+ self.signature = req.signature
+ self.string_to_sign = self._string_to_sign()
+ self._secret = None
+
+ def _string_to_sign(self):
+ raise NotImplementedError
+
+ def _derive_secret(self, secret):
+ return utf8encode(secret)
+
+ def _check_signature(self):
+ raise NotImplementedError
+
def check_signature(self, secret):
- secret = utf8encode(secret)
- user_signature = self.signature
- derived_secret = b'AWS4' + secret
- for scope_piece in self.scope.values():
+ self._secret = self._derive_secret(secret)
+ return self._check_signature()
+
+
+class SigCheckerV2(BaseSigChecker):
+ def _string_to_sign(self):
+ """
+ Create 'StringToSign' value in Amazon terminology for v2.
+ """
+ buf = [swob.wsgi_to_bytes(wsgi_str) for wsgi_str in [
+ self.req.method,
+ _header_strip(self.req.headers.get('Content-MD5')) or '',
+ _header_strip(self.req.headers.get('Content-Type')) or '']]
+
+ if 'headers_raw' in self.req.environ: # eventlet >= 0.19.0
+ # See https://github.com/eventlet/eventlet/commit/67ec999
+ amz_headers = defaultdict(list)
+ for key, value in self.req.environ['headers_raw']:
+ key = key.lower()
+ if not key.startswith('x-amz-'):
+ continue
+ amz_headers[key.strip()].append(value.strip())
+ amz_headers = dict((key, ','.join(value))
+ for key, value in amz_headers.items())
+ else: # mostly-functional fallback
+ amz_headers = dict((key.lower(), value)
+ for key, value in self.req.headers.items()
+ if key.lower().startswith('x-amz-'))
+
+ if self.req._is_header_auth:
+ if 'x-amz-date' in amz_headers:
+ buf.append(b'')
+ elif 'Date' in self.req.headers:
+ buf.append(swob.wsgi_to_bytes(self.req.headers['Date']))
+ elif self.req._is_query_auth:
+ buf.append(swob.wsgi_to_bytes(self.req.params['Expires']))
+ else:
+ # Should have already raised NotS3Request in _parse_auth_info,
+ # but as a sanity check...
+ raise AccessDenied(reason='not_s3')
+
+ for key, value in sorted(amz_headers.items()):
+ buf.append(swob.wsgi_to_bytes("%s:%s" % (key, value)))
+
+ path = self.req._canonical_uri()
+ if self.req.query_string:
+ path += '?' + self.req.query_string
+ params = []
+ if '?' in path:
+ path, args = path.split('?', 1)
+ for key, value in sorted(self.req.params.items()):
+ if key in ALLOWED_SUB_RESOURCES:
+ params.append('%s=%s' % (key, value) if value else key)
+ if params:
+ buf.append(swob.wsgi_to_bytes('%s?%s' % (path, '&'.join(params))))
+ else:
+ buf.append(swob.wsgi_to_bytes(path))
+ return b'\n'.join(buf)
+
+ def _check_signature(self):
+ valid_signature = base64_str(
+ hmac.new(self._secret, self.string_to_sign, sha1).digest())
+ return streq_const_time(self.signature, valid_signature)
+
+
+class SigCheckerV4(BaseSigChecker):
+ def __init__(self, req):
+ super().__init__(req)
+ self._all_chunk_signatures_valid = True
+
+ def _string_to_sign(self):
+ return b'\n'.join([
+ b'AWS4-HMAC-SHA256',
+ self.req.timestamp.amz_date_format.encode('ascii'),
+ '/'.join(self.req.scope.values()).encode('utf8'),
+ sha256(self.req._canonical_request()).hexdigest().encode('ascii')])
+
+ def _derive_secret(self, secret):
+ derived_secret = b'AWS4' + super()._derive_secret(secret)
+ for scope_piece in self.req.scope.values():
derived_secret = hmac.new(
derived_secret, scope_piece.encode('utf8'), sha256).digest()
+ return derived_secret
+
+ def _check_signature(self):
+ if self._secret is None:
+ raise S3InputMissingSecret
valid_signature = hmac.new(
- derived_secret, self.string_to_sign, sha256).hexdigest()
- return streq_const_time(user_signature, valid_signature)
+ self._secret, self.string_to_sign, sha256).hexdigest()
+ return streq_const_time(self.signature, valid_signature)
+
+ def _chunk_string_to_sign(self, data_sha256):
+ """
+ Create 'ChunkStringToSign' value in Amazon terminology for v4.
+ """
+ return b'\n'.join([
+ b'AWS4-HMAC-SHA256-PAYLOAD',
+ self.req.timestamp.amz_date_format.encode('ascii'),
+ '/'.join(self.req.scope.values()).encode('utf8'),
+ self.signature.encode('utf8'),
+ sha256(b'').hexdigest().encode('utf8'),
+ data_sha256.encode('utf8')
+ ])
+
+ def check_chunk_signature(self, chunk_sha256, signature):
+ """
+ Check the validity of a chunk's signature.
+
+ This method verifies the signature of a given chunk using its SHA-256
+ hash. It updates the string to sign and the current signature, then
+ checks if the signature is valid. If any chunk signature is invalid,
+ it returns False.
+
+ :param chunk_sha256: (str) The SHA-256 hash of the chunk.
+ :param signature: (str) The signature to be verified.
+ :returns: True if all chunk signatures are valid, False otherwise.
+ """
+ if not self._all_chunk_signatures_valid:
+ return False
+ # NB: string_to_sign is calculated using the previous signature
+ self.string_to_sign = self._chunk_string_to_sign(chunk_sha256)
+ # So we have to update the signature to compare against *after*
+ # the string-to-sign
+ self.signature = signature
+ self._all_chunk_signatures_valid &= self._check_signature()
+ return self._all_chunk_signatures_valid
+
+ def _trailer_string_to_sign(self, trailers):
+ """
+ Create 'TrailerChunkStringToSign' value in Amazon terminology for v4.
+ """
+ canonical_trailers = swob.wsgi_to_bytes(''.join(
+ f'{key}:{value}\n'
+ for key, value in sorted(
+ trailers.items(),
+ key=lambda kvp: swob.wsgi_to_bytes(kvp[0]).lower(),
+ )
+ if key != 'x-amz-trailer-signature'
+ ))
+ if not canonical_trailers:
+ canonical_trailers = b'\n'
+ return b'\n'.join([
+ b'AWS4-HMAC-SHA256-TRAILER',
+ self.req.timestamp.amz_date_format.encode('ascii'),
+ '/'.join(self.req.scope.values()).encode('utf8'),
+ self.signature.encode('utf8'),
+ sha256(canonical_trailers).hexdigest().encode('utf8'),
+ ])
+
+ def check_trailer_signature(self, trailers):
+ """
+ Check the validity of a chunk's signature.
+
+ This method verifies the trailers received after the main payload.
+
+ :param trailers: (dict[str, str]) The trailers received.
+ :returns: True if x-amz-trailer-signature is valid, False otherwise.
+ """
+ if not self._all_chunk_signatures_valid:
+ # if there was a breakdown earlier, this can't be right
+ return False
+ # NB: string_to_sign is calculated using the previous signature
+ self.string_to_sign = self._trailer_string_to_sign(trailers)
+ # So we have to update the signature to compare against *after*
+ # the string-to-sign
+ self.signature = trailers['x-amz-trailer-signature']
+ self._all_chunk_signatures_valid &= self._check_signature()
+ return self._all_chunk_signatures_valid
+
+
+def _parse_credential(credential_string):
+ """
+ Parse an AWS credential string into its components.
+
+ This method splits the given credential string into its constituent parts:
+ access key ID, date, AWS region, AWS service, and terminal identifier.
+ The credential string must follow the format:
+ <access-key-id>/<date>/<AWS-region>/<AWS-service>/aws4_request.
+
+ :param credential_string: (str) The AWS credential string to be parsed.
+ :raises AccessDenied: If the credential string is invalid or does not
+ follow the required format.
+ :returns: A dict containing the parsed components of the credential string.
+ """
+ parts = credential_string.split("/")
+ # credential must be in following format:
+ # <access-key-id>/<date>/<AWS-region>/<AWS-service>/aws4_request
+ if not parts[0] or len(parts) != 5:
+ raise AccessDenied(reason='invalid_credential')
+ return dict(zip(['access', 'date', 'region', 'service', 'terminal'],
+ parts))
+
+
+class SigV4Mixin(object):
+ """
+ A request class mixin to provide S3 signature v4 functionality
+ """
@property
def _is_query_auth(self):
return 'X-Amz-Credential' in self.params
@property
+ def _is_x_amz_content_sha256_required(self):
+ return not self._is_query_auth
+
+ @property
def timestamp(self):
"""
Return timestamp string according to the auth type
@@ -259,37 +789,6 @@
if int(self.timestamp) + expires < S3Timestamp.now():
raise AccessDenied('Request has expired', reason='expired')
- def _validate_sha256(self):
- aws_sha256 = self.headers.get('x-amz-content-sha256')
- looks_like_sha256 = (
- aws_sha256 and len(aws_sha256) == 64 and
- all(c in '0123456789abcdef' for c in aws_sha256.lower()))
- if not aws_sha256:
- if 'X-Amz-Credential' in self.params:
- pass # pre-signed URL; not required
- else:
- msg = 'Missing required header for this request: ' \
- 'x-amz-content-sha256'
- raise InvalidRequest(msg)
- elif aws_sha256 == 'UNSIGNED-PAYLOAD':
- pass
- elif not looks_like_sha256 and 'X-Amz-Credential' not in self.params:
- raise InvalidArgument(
- 'x-amz-content-sha256',
- aws_sha256,
- 'x-amz-content-sha256 must be UNSIGNED-PAYLOAD, or '
- 'a valid sha256 value.')
- return aws_sha256
-
- def _parse_credential(self, credential_string):
- parts = credential_string.split("/")
- # credential must be in following format:
- # <access-key-id>/<date>/<AWS-region>/<AWS-service>/aws4_request
- if not parts[0] or len(parts) != 5:
- raise AccessDenied(reason='invalid_credential')
- return dict(zip(['access', 'date', 'region', 'service', 'terminal'],
- parts))
-
def _parse_query_authentication(self):
"""
Parse v4 query authentication
@@ -302,7 +801,7 @@
raise InvalidArgument('X-Amz-Algorithm',
self.params.get('X-Amz-Algorithm'))
try:
- cred_param = self._parse_credential(
+ cred_param = _parse_credential(
swob.wsgi_to_str(self.params['X-Amz-Credential']))
sig = swob.wsgi_to_str(self.params['X-Amz-Signature'])
if not sig:
@@ -356,7 +855,7 @@
"""
auth_str = swob.wsgi_to_str(self.headers['Authorization'])
- cred_param = self._parse_credential(auth_str.partition(
+ cred_param = _parse_credential(auth_str.partition(
"Credential=")[2].split(',')[0])
sig = auth_str.partition("Signature=")[2].split(',')[0]
if not sig:
@@ -506,16 +1005,6 @@
('terminal', 'aws4_request'),
])
- def _string_to_sign(self):
- """
- Create 'StringToSign' value in Amazon terminology for v4.
- """
- return b'\n'.join([
- b'AWS4-HMAC-SHA256',
- self.timestamp.amz_date_format.encode('ascii'),
- '/'.join(self.scope.values()).encode('utf8'),
- sha256(self._canonical_request()).hexdigest().encode('ascii')])
-
def signature_does_not_match_kwargs(self):
kwargs = super(SigV4Mixin, self).signature_does_not_match_kwargs()
cr = self._canonical_request()
@@ -565,13 +1054,46 @@
self.bucket_in_host = self._parse_host()
self.container_name, self.object_name = self._parse_uri()
self._validate_headers()
+ if isinstance(self, SigV4Mixin):
+ # this is a deliberate but only partial shift away from the
+ # 'inherit and override from mixin' pattern towards a 'compose
+ # adapters' pattern.
+ self.sig_checker = SigCheckerV4(self)
+ else:
+ self.sig_checker = SigCheckerV2(self)
+ aws_sha256 = self.headers.get('x-amz-content-sha256')
+ if self.method in ('PUT', 'POST'):
+ checksum_hasher, checksum_header, checksum_trailer = \
+ self._validate_checksum_headers()
+ if _is_streaming(aws_sha256):
+ if checksum_trailer:
+ streaming_input = self._install_streaming_input_wrapper(
+ aws_sha256, checksum_trailer=checksum_trailer)
+ checksum_key = checksum_trailer
+ checksum_source = streaming_input.trailers
+ else:
+ self._install_streaming_input_wrapper(aws_sha256)
+ checksum_key = checksum_header
+ checksum_source = self.headers
+ elif checksum_trailer:
+ raise MalformedTrailerError
+ else:
+ self._install_non_streaming_input_wrapper(aws_sha256)
+ checksum_key = checksum_header
+ checksum_source = self.headers
+
+ # S3 doesn't check the checksum against the request body for at
+ # least some POSTs (e.g. MPU complete) so restrict this to PUTs
+ if checksum_key and self.method == 'PUT':
+ self._install_checksumming_input_wrapper(
+ checksum_hasher, checksum_key, checksum_source)
+
# Lock in string-to-sign now, before we start messing with query params
- self.string_to_sign = self._string_to_sign()
self.environ['s3api.auth_details'] = {
'access_key': self.access_key,
'signature': self.signature,
- 'string_to_sign': self.string_to_sign,
- 'check_signature': self.check_signature,
+ 'string_to_sign': self.sig_checker.string_to_sign,
+ 'check_signature': self.sig_checker.check_signature,
}
self.account = None
self.user_id = None
@@ -632,14 +1154,6 @@
return part_number
- def check_signature(self, secret):
- secret = utf8encode(secret)
- user_signature = self.signature
- valid_signature = base64.b64encode(hmac.new(
- secret, self.string_to_sign, sha1
- ).digest()).strip().decode('ascii')
- return streq_const_time(user_signature, valid_signature)
-
@property
def timestamp(self):
"""
@@ -684,6 +1198,10 @@
def _is_query_auth(self):
return 'AWSAccessKeyId' in self.params
+ @property
+ def _is_x_amz_content_sha256_required(self):
+ return False
+
def _parse_host(self):
if not self.conf.storage_domains:
return None
@@ -821,7 +1339,236 @@
raise RequestTimeTooSkewed()
def _validate_sha256(self):
- return self.headers.get('x-amz-content-sha256')
+ aws_sha256 = self.headers.get('x-amz-content-sha256')
+ if not aws_sha256:
+ if self._is_x_amz_content_sha256_required:
+ msg = 'Missing required header for this request: ' \
+ 'x-amz-content-sha256'
+ raise InvalidRequest(msg)
+ else:
+ return
+
+ looks_like_sha256 = (
+ aws_sha256 and len(aws_sha256) == 64 and
+ all(c in '0123456789abcdef' for c in aws_sha256.lower()))
+ if aws_sha256 == 'UNSIGNED-PAYLOAD':
+ pass
+ elif _is_streaming(aws_sha256):
+ decoded_content_length = self.headers.get(
+ 'x-amz-decoded-content-length')
+ try:
+ decoded_content_length = int(decoded_content_length)
+ except (ValueError, TypeError):
+ raise MissingContentLength
+ if decoded_content_length < 0:
+ raise InvalidArgument('x-amz-decoded-content-length',
+ decoded_content_length)
+
+ if not isinstance(self, SigV4Mixin) or self._is_query_auth:
+ if decoded_content_length < (self.content_length or 0):
+ raise IncompleteBody(
+ number_bytes_expected=decoded_content_length,
+ number_bytes_provided=self.content_length,
+ )
+ body = self.body_file.read()
+ raise XAmzContentSHA256Mismatch(
+ client_computed_content_s_h_a256=aws_sha256,
+ s3_computed_content_s_h_a256=sha256(body).hexdigest(),
+ )
+ elif aws_sha256 in (
+ 'STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD',
+ 'STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD-TRAILER',
+ ):
+ raise S3NotImplemented(
+ "Don't know how to validate %s streams"
+ % aws_sha256)
+
+ elif not looks_like_sha256 and self._is_x_amz_content_sha256_required:
+ raise InvalidArgument(
+ 'x-amz-content-sha256',
+ aws_sha256,
+ 'x-amz-content-sha256 must be UNSIGNED-PAYLOAD, '
+ 'STREAMING-UNSIGNED-PAYLOAD-TRAILER, '
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD, '
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER or '
+ 'a valid sha256 value.')
+
+ return aws_sha256
+
+ def _cleanup_content_encoding(self):
+ if 'aws-chunked' in self.headers.get('Content-Encoding', ''):
+ new_enc = ', '.join(
+ enc for enc in list_from_csv(
+ self.headers.pop('Content-Encoding'))
+ # TODO: test what's stored w/ 'aws-chunked, aws-chunked'
+ if enc != 'aws-chunked')
+ if new_enc:
+ # used to be, AWS would store '', but not any more
+ self.headers['Content-Encoding'] = new_enc
+
+ def _install_streaming_input_wrapper(self, aws_sha256,
+ checksum_trailer=None):
+ """
+ Wrap the wsgi input with a reader that parses an aws-chunked body.
+
+ :param aws_sha256: the value of the 'x-amz-content-sha256' header.
+ :param checksum_trailer: the name of an 'x-amz-checksum-*' trailer
+ (if any) that is to be expected at the end of the body.
+ :return: an instance of StreamingInput.
+ """
+ self._cleanup_content_encoding()
+ self.content_length = int(self.headers.get(
+ 'x-amz-decoded-content-length'))
+ expected_trailers = set()
+ if aws_sha256 == 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER':
+ expected_trailers.add('x-amz-trailer-signature')
+ if checksum_trailer:
+ expected_trailers.add(checksum_trailer)
+ streaming_input = StreamingInput(
+ self.environ['wsgi.input'],
+ self.content_length,
+ expected_trailers,
+ None if aws_sha256 == 'STREAMING-UNSIGNED-PAYLOAD-TRAILER'
+ else self.sig_checker)
+ self.environ['wsgi.input'] = streaming_input
+ return streaming_input
+
+ def _install_non_streaming_input_wrapper(self, aws_sha256):
+ if (aws_sha256 not in (None, 'UNSIGNED-PAYLOAD') and
+ self.content_length is not None):
+ self.environ['wsgi.input'] = HashingInput(
+ self.environ['wsgi.input'],
+ self.content_length,
+ aws_sha256)
+ # If no content-length, either client's trying to do a HTTP chunked
+ # transfer, or a HTTP/1.0-style transfer (in which case swift will
+ # reject with length-required and we'll translate back to
+ # MissingContentLength)
+
+ def _validate_x_amz_checksum_headers(self):
+ """
+ Validate and return a header that specifies a checksum value. A valid
+ header must be named x-amz-checksum-<algorithm> where <algorithm> is
+ one of the supported checksum algorithms.
+
+ :raises: InvalidRequest if more than one checksum header is found or if
+ an invalid algorithm is specified.
+ :return: a dict containing at most a single checksum header name:value
+ pair.
+ """
+ checksum_headers = {
+ h.lower(): v
+ for h, v in self.headers.items()
+ if (h.lower().startswith('x-amz-checksum-')
+ and h.lower() not in ('x-amz-checksum-algorithm',
+ 'x-amz-checksum-type'))
+ }
+ if any(h not in CHECKSUMS_BY_HEADER
+ for h in checksum_headers):
+ raise InvalidRequest('The algorithm type you specified in '
+ 'x-amz-checksum- header is invalid.')
+ _validate_checksum_header_cardinality(len(checksum_headers))
+ return checksum_headers
+
+ def _validate_x_amz_trailer_header(self):
+ """
+ Validate and return the name of a checksum trailer that is declared by
+ an ``x-amz-trailer`` header. A valid trailer must be named
+ x-amz-checksum-<algorithm> where <algorithm> is one of the supported
+ checksum algorithms.
+
+ :raises: InvalidRequest if more than one checksum trailer is declared
+ by the ``x-amz-trailer`` header, or if an invalid algorithm is
+ specified.
+ :return: a list containing at most a single checksum header name.
+ """
+ header = self.headers.get('x-amz-trailer', '').strip()
+ checksum_headers = [
+ v.strip() for v in header.rstrip(',').split(',')
+ ] if header else []
+ if any(h not in CHECKSUMS_BY_HEADER
+ for h in checksum_headers):
+ raise InvalidRequest('The value specified in the x-amz-trailer '
+ 'header is not supported')
+ _validate_checksum_header_cardinality(len(checksum_headers))
+ return checksum_headers
+
+ def _validate_checksum_headers(self):
+ """
+ A checksum for the request is specified by a checksum header of the
+ form:
+
+ x-amz-checksum-<algorithm>: <checksum>
+
+ where <algorithm> is one of the supported checksum algorithms and
+ <checksum> is the value to be checked. A checksum header may be sent in
+ either the headers or the trailers. An ``x-amz-trailer`` header is used
+ to declare that a checksum header is to be expected in the trailers.
+
+ At most one checksum header is allowed in the headers or trailers. If
+ this condition is met, this method returns the name of the checksum
+ header or trailer and a hasher for the checksum algorithm that it
+ declares.
+
+ :raises InvalidRequest: if any of the following conditions occur: more
+ than one checksum header is declared; the checksum header specifies
+ an invalid algorithm; the algorithm does not match the value of any
+ ``x-amz-sdk-checksum-algorithm`` header that is also present; the
+ checksum value is invalid.
+ :raises S3NotImplemented: if the declared algorithm is valid but not
+ supported.
+ :return: a tuple of
+ (hasher, checksum header name, checksum trailer name) where at
+ least one of (checksum header name, checksum trailer name) will be
+ None.
+ """
+ checksum_headers = self._validate_x_amz_checksum_headers()
+ checksum_trailer_headers = self._validate_x_amz_trailer_header()
+ _validate_checksum_header_cardinality(
+ len(checksum_headers) + len(checksum_trailer_headers),
+ headers_and_trailer=True
+ )
+
+ if checksum_headers:
+ checksum_trailer = None
+ checksum_header, b64digest = list(checksum_headers.items())[0]
+ checksum_hasher = _get_checksum_hasher(checksum_header)
+ try:
+ # early check on the value...
+ _validate_checksum_value(checksum_hasher, b64digest)
+ except ValueError:
+ raise InvalidRequest(
+ 'Value for %s header is invalid.' % checksum_header)
+ elif checksum_trailer_headers:
+ checksum_header = None
+ checksum_trailer = checksum_trailer_headers[0]
+ checksum_hasher = _get_checksum_hasher(checksum_trailer)
+ # checksum should appear at end of request in trailers
+ else:
+ checksum_hasher = checksum_header = checksum_trailer = None
+
+ checksum_algo = self.headers.get('x-amz-sdk-checksum-algorithm')
+ if checksum_algo:
+ if not checksum_hasher:
+ raise InvalidRequest(
+ 'x-amz-sdk-checksum-algorithm specified, but no '
+ 'corresponding x-amz-checksum-* or x-amz-trailer '
+ 'headers were found.')
+ if checksum_algo.lower() != checksum_hasher.name:
+ raise InvalidRequest('Value for x-amz-sdk-checksum-algorithm '
+ 'header is invalid.')
+
+ return checksum_hasher, checksum_header, checksum_trailer
+
+ def _install_checksumming_input_wrapper(
+ self, checksum_hasher, checksum_key, checksum_source):
+ self.environ['wsgi.input'] = ChecksummingInput(
+ self.environ['wsgi.input'],
+ self.content_length,
+ checksum_hasher,
+ checksum_key,
+ checksum_source
+ )
def _validate_headers(self):
if 'CONTENT_LENGTH' in self.environ:
@@ -878,21 +1625,7 @@
if 'x-amz-website-redirect-location' in self.headers:
raise S3NotImplemented('Website redirection is not supported.')
- aws_sha256 = self._validate_sha256()
- if (aws_sha256
- and aws_sha256 != 'UNSIGNED-PAYLOAD'
- and self.content_length is not None):
- # Even if client-provided SHA doesn't look like a SHA, wrap the
- # input anyway so we'll send the SHA of what the client sent in
- # the eventual error
- self.environ['wsgi.input'] = HashingInput(
- self.environ['wsgi.input'],
- self.content_length,
- aws_sha256)
- # If no content-length, either client's trying to do a HTTP chunked
- # transfer, or a HTTP/1.0-style transfer (in which case swift will
- # reject with length-required and we'll translate back to
- # MissingContentLength)
+ self._validate_sha256()
value = _header_strip(self.headers.get('Content-MD5'))
if value is not None:
@@ -909,15 +1642,6 @@
if len(self.headers['ETag']) != 32:
raise InvalidDigest(content_md5=value)
- # https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-streaming.html
- # describes some of what would be required to support this
- if any(['aws-chunked' in self.headers.get('content-encoding', ''),
- 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD' == self.headers.get(
- 'x-amz-content-sha256', ''),
- 'x-amz-decoded-content-length' in self.headers]):
- raise S3NotImplemented('Transfering payloads in multiple chunks '
- 'using aws-chunked is not supported.')
-
if 'x-amz-tagging' in self.headers:
raise S3NotImplemented('Object tagging is not supported.')
@@ -948,13 +1672,8 @@
if te or ml:
# Limit the read similar to how SLO handles manifests
- try:
+ with self.translate_read_errors():
body = self.body_file.read(max_length)
- except S3InputSHA256Mismatch as err:
- raise XAmzContentSHA256Mismatch(
- client_computed_content_s_h_a256=err.expected,
- s3_computed_content_s_h_a256=err.computed,
- )
else:
# No (or zero) Content-Length provided, and not chunked transfer;
# no body. Assume zero-length, and enforce a required body below.
@@ -967,8 +1686,7 @@
raise InvalidRequest('Missing required header for this request: '
'Content-MD5')
- digest = base64.b64encode(md5(
- body, usedforsecurity=False).digest()).strip().decode('ascii')
+ digest = base64_str(md5(body, usedforsecurity=False).digest())
if self.environ['HTTP_CONTENT_MD5'] != digest:
raise BadDigest(content_md5=self.environ['HTTP_CONTENT_MD5'])
@@ -1045,70 +1763,14 @@
raw_path_info = '/' + self.bucket_in_host + raw_path_info
return raw_path_info
- def _string_to_sign(self):
- """
- Create 'StringToSign' value in Amazon terminology for v2.
- """
- amz_headers = {}
-
- buf = [swob.wsgi_to_bytes(wsgi_str) for wsgi_str in [
- self.method,
- _header_strip(self.headers.get('Content-MD5')) or '',
- _header_strip(self.headers.get('Content-Type')) or '']]
-
- if 'headers_raw' in self.environ: # eventlet >= 0.19.0
- # See https://github.com/eventlet/eventlet/commit/67ec999
- amz_headers = defaultdict(list)
- for key, value in self.environ['headers_raw']:
- key = key.lower()
- if not key.startswith('x-amz-'):
- continue
- amz_headers[key.strip()].append(value.strip())
- amz_headers = dict((key, ','.join(value))
- for key, value in amz_headers.items())
- else: # mostly-functional fallback
- amz_headers = dict((key.lower(), value)
- for key, value in self.headers.items()
- if key.lower().startswith('x-amz-'))
-
- if self._is_header_auth:
- if 'x-amz-date' in amz_headers:
- buf.append(b'')
- elif 'Date' in self.headers:
- buf.append(swob.wsgi_to_bytes(self.headers['Date']))
- elif self._is_query_auth:
- buf.append(swob.wsgi_to_bytes(self.params['Expires']))
- else:
- # Should have already raised NotS3Request in _parse_auth_info,
- # but as a sanity check...
- raise AccessDenied(reason='not_s3')
-
- for key, value in sorted(amz_headers.items()):
- buf.append(swob.wsgi_to_bytes("%s:%s" % (key, value)))
-
- path = self._canonical_uri()
- if self.query_string:
- path += '?' + self.query_string
- params = []
- if '?' in path:
- path, args = path.split('?', 1)
- for key, value in sorted(self.params.items()):
- if key in ALLOWED_SUB_RESOURCES:
- params.append('%s=%s' % (key, value) if value else key)
- if params:
- buf.append(swob.wsgi_to_bytes('%s?%s' % (path, '&'.join(params))))
- else:
- buf.append(swob.wsgi_to_bytes(path))
- return b'\n'.join(buf)
-
def signature_does_not_match_kwargs(self):
return {
'a_w_s_access_key_id': self.access_key,
- 'string_to_sign': self.string_to_sign,
+ 'string_to_sign': self.sig_checker.string_to_sign,
'signature_provided': self.signature,
'string_to_sign_bytes': ' '.join(
format(ord(c), '02x')
- for c in self.string_to_sign.decode('latin1')),
+ for c in self.sig_checker.string_to_sign.decode('latin1')),
}
@property
@@ -1439,6 +2101,52 @@
return code_map[method]
+ @contextlib.contextmanager
+ def translate_read_errors(self):
+ try:
+ yield
+ except S3InputIncomplete:
+ raise IncompleteBody('The request body terminated unexpectedly')
+ except S3InputSHA256Mismatch as err:
+ # hopefully by now any modifications to the path (e.g. tenant to
+ # account translation) will have been made by auth middleware
+ raise XAmzContentSHA256Mismatch(
+ client_computed_content_s_h_a256=err.expected,
+ s3_computed_content_s_h_a256=err.computed,
+ )
+ except S3InputChecksumMismatch as e:
+ raise BadDigest(
+ 'The %s you specified did not '
+ 'match the calculated checksum.' % e.args[0])
+ except S3InputChecksumTrailerInvalid as e:
+ raise InvalidRequest(
+ 'Value for %s trailing header is invalid.' % e.trailer)
+ except S3InputChunkSignatureMismatch:
+ raise SignatureDoesNotMatch(
+ **self.signature_does_not_match_kwargs())
+ except S3InputSizeError as e:
+ raise IncompleteBody(
+ number_bytes_expected=e.expected,
+ number_bytes_provided=e.provided,
+ )
+ except S3InputChunkTooSmall as e:
+ raise InvalidChunkSizeError(
+ chunk=e.chunk_number,
+ bad_chunk_size=e.bad_chunk_size,
+ )
+ except S3InputMalformedTrailer:
+ raise MalformedTrailerError
+ except S3InputMissingSecret:
+ # XXX: We should really log something here. The poor user can't do
+ # anything about this; we need to notify the operator to notify the
+ # auth middleware developer
+ raise S3NotImplemented('Transferring payloads in multiple chunks '
+ 'using aws-chunked is not supported.')
+ except S3InputError:
+ # All cases should be covered above, but belt & braces
+ # NB: general exception handler in s3api.py will log traceback
+ raise InternalError
+
def _get_response(self, app, method, container, obj,
headers=None, body=None, query=None):
"""
@@ -1457,21 +2165,13 @@
body=body, query=query)
try:
- sw_resp = sw_req.get_response(app)
- except S3InputSHA256Mismatch as err:
- # hopefully by now any modifications to the path (e.g. tenant to
- # account translation) will have been made by auth middleware
- self.environ['s3api.backend_path'] = sw_req.environ['PATH_INFO']
- raise XAmzContentSHA256Mismatch(
- client_computed_content_s_h_a256=err.expected,
- s3_computed_content_s_h_a256=err.computed,
- )
- else:
+ with self.translate_read_errors():
+ sw_resp = sw_req.get_response(app)
+ finally:
# reuse account
- _, self.account, _ = split_path(sw_resp.environ['PATH_INFO'],
+ _, self.account, _ = split_path(sw_req.environ['PATH_INFO'],
2, 3, True)
- # Update s3.backend_path from the response environ
- self.environ['s3api.backend_path'] = sw_resp.environ['PATH_INFO']
+ self.environ['s3api.backend_path'] = sw_req.environ['PATH_INFO']
# keep a record of the backend policy index so that the s3api can add
# it to the headers of whatever response it returns, which may not
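
For reviewers unfamiliar with the protocol being added: an aws-chunked body frames the payload as `<hex-size>;chunk-signature=<sig>\r\n<data>\r\n` chunks, terminated by a zero-size chunk (trailers, if any, follow it). A minimal sketch of that framing, ignoring signatures and trailers, illustrates what the `StreamingInput` wrapper installed above has to parse:

```python
import io

def frame_aws_chunked(payload, chunk_size=8192):
    # Frame a payload in aws-chunked style; chunk signatures are left
    # empty here, which real signed streams would not allow.
    out = []
    view = memoryview(payload)
    for off in range(0, len(payload), chunk_size):
        chunk = bytes(view[off:off + chunk_size])
        out.append(b'%x;chunk-signature=\r\n%s\r\n' % (len(chunk), chunk))
    # zero-size chunk marks the end of the body
    out.append(b'0;chunk-signature=\r\n\r\n')
    return b''.join(out)

def read_aws_chunked(stream):
    # Decode an aws-chunked stream back to the raw payload.
    body = []
    while True:
        header = stream.readline()
        size = int(header.split(b';', 1)[0], 16)
        data = stream.read(size)
        stream.readline()  # CRLF terminating the chunk data
        if size == 0:
            break
        body.append(data)
    return b''.join(body)
```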
diff -Nru swift-2.35.0/swift/common/middleware/s3api/s3response.py swift-2.35.1/swift/common/middleware/s3api/s3response.py
--- swift-2.35.0/swift/common/middleware/s3api/s3response.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/swift/common/middleware/s3api/s3response.py 2025-08-22 17:56:44.000000000 +0200
@@ -395,7 +395,7 @@
class IncompleteBody(ErrorResponse):
_status = '400 Bad Request'
_msg = 'You did not provide the number of bytes specified by the ' \
- 'Content-Length HTTP header.'
+ 'Content-Length HTTP header'
class IncorrectNumberOfFilesInPostRequest(ErrorResponse):
@@ -444,6 +444,11 @@
_msg = 'The request is not valid with the current state of the bucket.'
+class InvalidChunkSizeError(ErrorResponse):
+ _status = '403 Forbidden'
+ _msg = 'Only the last chunk is allowed to have a size less than 8192 bytes'
+
+
class InvalidDigest(ErrorResponse):
_status = '400 Bad Request'
_msg = 'The Content-MD5 you specified was invalid.'
@@ -565,6 +570,12 @@
'multipart/form-data.'
+class MalformedTrailerError(ErrorResponse):
+ _status = '400 Bad Request'
+ _msg = 'The request contained trailing data that was not well-formed ' \
+ 'or did not conform to our published schema.'
+
+
class MalformedXML(ErrorResponse):
_status = '400 Bad Request'
_msg = 'The XML you provided was not well-formed or did not validate ' \
diff -Nru swift-2.35.0/swift/common/middleware/s3api/schema/complete_multipart_upload.rng swift-2.35.1/swift/common/middleware/s3api/schema/complete_multipart_upload.rng
--- swift-2.35.0/swift/common/middleware/s3api/schema/complete_multipart_upload.rng 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/swift/common/middleware/s3api/schema/complete_multipart_upload.rng 2025-08-22 17:56:44.000000000 +0200
@@ -11,6 +11,31 @@
<element name="ETag">
<data type="string"/>
</element>
+ <optional>
+ <element name="ChecksumCRC32">
+ <data type="string"/>
+ </element>
+ </optional>
+ <optional>
+ <element name="ChecksumCRC32C">
+ <data type="string"/>
+ </element>
+ </optional>
+ <optional>
+ <element name="ChecksumCRC64NVME">
+ <data type="string"/>
+ </element>
+ </optional>
+ <optional>
+ <element name="ChecksumSHA1">
+ <data type="string"/>
+ </element>
+ </optional>
+ <optional>
+ <element name="ChecksumSHA256">
+ <data type="string"/>
+ </element>
+ </optional>
</interleave>
</element>
</oneOrMore>
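
The schema addition above lets a CompleteMultipartUpload body carry optional Checksum* elements per Part. A made-up example body (the ETag and checksum values are hypothetical) of the shape the updated schema is intended to accept:

```python
import xml.etree.ElementTree as ET

# Hypothetical request body; values are for illustration only.
SAMPLE = b"""<CompleteMultipartUpload>
  <Part>
    <PartNumber>1</PartNumber>
    <ETag>"0123456789abcdef0123456789abcdef"</ETag>
    <ChecksumCRC32>AAAAAA==</ChecksumCRC32>
  </Part>
</CompleteMultipartUpload>"""

root = ET.fromstring(SAMPLE)
part = root.find('Part')
```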
diff -Nru swift-2.35.0/swift/common/utils/checksum.py swift-2.35.1/swift/common/utils/checksum.py
--- swift-2.35.0/swift/common/utils/checksum.py 1970-01-01 01:00:00.000000000 +0100
+++ swift-2.35.1/swift/common/utils/checksum.py 2025-08-22 17:56:44.000000000 +0200
@@ -0,0 +1,266 @@
+# Copyright (c) 2024 NVIDIA
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+# implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+try:
+ import anycrc
+except ImportError:
+ anycrc = None
+import binascii
+import ctypes
+import ctypes.util
+import errno
+import socket
+import struct
+import zlib
+
+
+# See if anycrc is available...
+if anycrc:
+ crc32c_anycrc = anycrc.Model('CRC32C').calc
+ crc64nvme_anycrc = anycrc.Model('CRC64-NVME').calc
+else:
+ crc32c_anycrc = None
+ crc64nvme_anycrc = None
+
+
+def find_isal():
+ # If isal is available system-wide, great!
+ isal_lib = ctypes.util.find_library('isal')
+ if isal_lib is None:
+ # py38+: Hopefully pyeclib was installed from a manylinux wheel
+ # with isal baked in?
+ try:
+ import pyeclib # noqa
+ from importlib.metadata import \
+ files as pkg_files, PackageNotFoundError # py38+
+ except ImportError:
+ pass
+ else:
+ # Assume busted installs won't have it
+ try:
+ pyeclib_files = pkg_files('pyeclib')
+ if pyeclib_files is None:
+ # Have a dist-info, but no RECORD file??
+ pyeclib_files = []
+ except PackageNotFoundError:
+ # Could import pyeclib, but no dist-info directory??
+ pyeclib_files = []
+ isal_libs = [f for f in pyeclib_files
+ if f.name.startswith("libisal")]
+ if len(isal_libs) == 1:
+ isal_lib = isal_libs[0].locate()
+ return ctypes.CDLL(isal_lib) if isal_lib else None
+
+
+isal = find_isal()
+
+if hasattr(isal, 'crc32_iscsi'): # isa-l >= 2.16
+ isal.crc32_iscsi.argtypes = [ctypes.c_char_p, ctypes.c_int, ctypes.c_uint]
+ isal.crc32_iscsi.restype = ctypes.c_uint
+
+ def crc32c_isal(data, value=0):
+ result = isal.crc32_iscsi(
+ data,
+ len(data),
+ value ^ 0xffff_ffff,
+ )
+ # for some reason, despite us specifying that restype is uint,
+ # it can come back signed??
+ return (result & 0xffff_ffff) ^ 0xffff_ffff
+else:
+ crc32c_isal = None
+
+if hasattr(isal, 'crc64_rocksoft_refl'): # isa-l >= 2.31.0
+ isal.crc64_rocksoft_refl.argtypes = [
+ ctypes.c_uint64, ctypes.c_char_p, ctypes.c_uint64]
+ isal.crc64_rocksoft_refl.restype = ctypes.c_uint64
+
+ def crc64nvme_isal(data, value=0):
+ return isal.crc64_rocksoft_refl(
+ value,
+ data,
+ len(data),
+ )
+else:
+ crc64nvme_isal = None
+
+
+# The kernel may also provide crc32c
+AF_ALG = getattr(socket, 'AF_ALG', 38)
+try:
+ _sock = socket.socket(AF_ALG, socket.SOCK_SEQPACKET)
+ _sock.bind(("hash", "crc32c"))
+except OSError as e:
+ if e.errno == errno.ENOENT:
+ # could create socket, but crc32c is unknown
+ _sock.close()
+ elif e.errno != errno.EAFNOSUPPORT:
+ raise
+ crc32c_kern = None
+else:
+ def crc32c_kern(data, value=0):
+ crc32c_sock = socket.socket(AF_ALG, socket.SOCK_SEQPACKET)
+ try:
+ crc32c_sock.bind(("hash", "crc32c"))
+ crc32c_sock.setsockopt(
+ socket.SOL_ALG,
+ socket.ALG_SET_KEY,
+ struct.pack("I", value ^ 0xffff_ffff))
+ sock, _ = crc32c_sock.accept()
+ try:
+ sock.sendall(data)
+ return struct.unpack("I", sock.recv(4))[0]
+ finally:
+ sock.close()
+ finally:
+ crc32c_sock.close()
+
+
+def _select_crc32c_impl():
+ # Use the best implementation available.
+ # On various hardware we've seen
+ #
+ # CPU | ISA-L | Kernel
+ # ---------------+-----------+----------
+ # Intel N100 | ~9GB/s | ~3.5GB/s
+ # ARM Cortex-A55 | ~2.5GB/s | ~0.4GB/s
+ # Intel 11850H | ~7GB/s | ~2.6GB/s
+ # AMD 3900XT | ~20GB/s | ~5GB/s
+ #
+ # i.e., ISA-L is consistently 3-5x faster than kernel sockets
+ selected = crc32c_isal or crc32c_kern or crc32c_anycrc or None
+ if not selected:
+ raise NotImplementedError(
+ 'no crc32c implementation, install isal or anycrc')
+ return selected
+
+
+def _select_crc64nvme_impl():
+ selected = crc64nvme_isal or crc64nvme_anycrc or None
+ if not selected:
+ raise NotImplementedError(
+ 'no crc64nvme implementation, install isal or anycrc')
+ return selected
+
+
+class CRCHasher(object):
+ """
+ Helper that works like a hashlib hasher, but with a CRC.
+ """
+ def __init__(self, name, crc_func, data=None, initial_value=0, width=32):
+ """
+ Initialize the CRCHasher.
+
+ :param name: Name of the hasher
+ :param crc_func: Function to compute the CRC.
+ :param data: Data to update the hasher.
+ :param initial_value: Initial CRC value.
+ :param width: Width (in bits) of CRC values.
+ """
+ self.name = name
+ self.crc_func = crc_func
+ self.crc = initial_value
+ if width not in (32, 64):
+ raise ValueError("CRCHasher only supports 32- or 64-bit CRCs")
+ self.width = width
+ if data is not None:
+ self.update(data)
+
+ @property
+ def digest_size(self):
+ return self.width / 8
+
+ @property
+ def digest_fmt(self):
+ return "!I" if self.width == 32 else "!Q"
+
+ def update(self, data):
+ """
+ Update the CRC with new data.
+
+ :param data: Data to update the CRC with.
+ """
+ self.crc = self.crc_func(data, self.crc)
+
+ def digest(self):
+ """
+ Return the current CRC value as a big-endian integer of length
+ ``width / 8`` bytes.
+
+ :returns: Packed CRC value. (bytes)
+ """
+ return struct.pack(self.digest_fmt, self.crc)
+
+ def hexdigest(self):
+ """
+ Return the hexadecimal representation of the current CRC value.
+
+ :returns: Hexadecimal CRC value. (str)
+ """
+ hex = binascii.hexlify(self.digest()).decode("ascii")
+ return hex
+
+ def copy(self):
+ """
+ Copy the current state of this CRCHasher to a new one.
+
+ :returns: a new CRCHasher with the same CRC state.
+ """
+ return CRCHasher(self.name,
+ self.crc_func,
+ initial_value=self.crc,
+ width=self.width)
+
+
+def crc32(data=None, initial_value=0):
+ return CRCHasher('crc32',
+ zlib.crc32,
+ data=data,
+ initial_value=initial_value)
+
+
+def crc32c(data=None, initial_value=0):
+ return CRCHasher('crc32c',
+ _select_crc32c_impl(),
+ data=data,
+ initial_value=initial_value)
+
+
+def crc64nvme(data=None, initial_value=0):
+ return CRCHasher('crc64nvme',
+ _select_crc64nvme_impl(),
+ data=data,
+ initial_value=initial_value,
+ width=64)
+
+
+def log_selected_implementation(logger):
+ try:
+ impl = _select_crc32c_impl()
+ except NotImplementedError:
+ logger.warning(
+ 'No implementation found for CRC32C; '
+ 'install ISA-L or anycrc for support.')
+ else:
+ logger.info('Using %s implementation for CRC32C.' % impl.__name__)
+
+ try:
+ impl = _select_crc64nvme_impl()
+ except NotImplementedError:
+ logger.warning(
+ 'No implementation found for CRC64NVME; '
+ 'install ISA-L or anycrc for support.')
+ else:
+ logger.info('Using %s implementation for CRC64NVME.' % impl.__name__)
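
The new `crc32()` helper above packs zlib's CRC as a big-endian 4-byte digest. A pure-stdlib sketch of that packing, checked against the standard CRC-32 check value:

```python
import base64
import struct
import zlib

def crc32_digest(payload, initial_value=0):
    # Big-endian 4-byte digest, as CRCHasher.digest() packs it
    # for a 32-bit CRC.
    return struct.pack('!I', zlib.crc32(payload, initial_value))

def crc32_hexdigest(payload):
    return crc32_digest(payload).hex()
```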
diff -Nru swift-2.35.0/swift/common/utils/__init__.py swift-2.35.1/swift/common/utils/__init__.py
--- swift-2.35.0/swift/common/utils/__init__.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/swift/common/utils/__init__.py 2025-08-22 17:56:44.000000000 +0200
@@ -4627,7 +4627,7 @@
return None
-def strict_b64decode(value, allow_line_breaks=False):
+def strict_b64decode(value, allow_line_breaks=False, exact_size=None):
'''
Validate and decode Base64-encoded data.
@@ -4636,6 +4636,8 @@
:param value: some base64-encoded data
:param allow_line_breaks: if True, ignore carriage returns and newlines
+ :param exact_size: if provided, the exact size of the decoded bytes
+ expected; also enforces round-trip checks
:returns: the decoded data
:raises ValueError: if ``value`` is not a string, contains invalid
characters, or has insufficient padding
@@ -4656,7 +4658,17 @@
strip_chars += '\r\n'
if any(c not in valid_chars for c in value.strip(strip_chars)):
raise ValueError
- return base64.b64decode(value)
+ ret_val = base64.b64decode(value)
+ if exact_size is not None:
+ if len(ret_val) != exact_size:
+ raise ValueError
+ if base64_str(ret_val) != value:
+ raise ValueError
+ return ret_val
+
+
+def base64_str(value):
+ return base64.b64encode(value).decode('ascii')
def cap_length(value, max_length):
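
The `exact_size` branch added above can be illustrated in isolation; `strict_b64decode_exact` is a hedged sketch of just that new check (the character validation earlier in the real function is omitted here):

```python
import base64

def strict_b64decode_exact(value, exact_size):
    # The decoded bytes must have the declared length, and re-encoding
    # must reproduce the input exactly, rejecting non-canonical base64.
    ret_val = base64.b64decode(value)
    if len(ret_val) != exact_size:
        raise ValueError(value)
    if base64.b64encode(ret_val).decode('ascii') != value:
        raise ValueError(value)
    return ret_val
```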
diff -Nru swift-2.35.0/test/functional/s3api/test_object.py swift-2.35.1/test/functional/s3api/test_object.py
--- swift-2.35.0/test/functional/s3api/test_object.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/functional/s3api/test_object.py 2025-08-22 17:56:44.000000000 +0200
@@ -13,7 +13,6 @@
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-
import unittest
import calendar
@@ -29,7 +28,7 @@
from swift.common.utils import md5, quote
from test.functional.s3api import S3ApiBase, SigV4Mixin, \
- skip_boto2_sort_header_bug
+ skip_boto2_sort_header_bug, S3ApiBaseBoto3, get_boto3_conn
from test.functional.s3api.s3_test_client import Connection
from test.functional.s3api.utils import get_error_code, calculate_md5, \
get_error_msg
@@ -45,6 +44,43 @@
tf.teardown_package()
+class TestS3ApiObjectBoto3(S3ApiBaseBoto3):
+ def setUp(self):
+ super().setUp()
+ self.conn = get_boto3_conn(tf.config['s3_access_key'],
+ tf.config['s3_secret_key'])
+ self.bucket = 'test-bucket'
+ resp = self.conn.create_bucket(Bucket=self.bucket)
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+
+ def test_put(self):
+ body = b'abcd' * 8192
+ resp = self.conn.put_object(Bucket=self.bucket, Key='obj', Body=body)
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+ resp = self.conn.get_object(Bucket=self.bucket, Key='obj')
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+ self.assertEqual(body, resp['Body'].read())
+
+ def test_put_chunked(self):
+ body = b'abcd' * 8192
+ resp = self.conn.put_object(Bucket=self.bucket, Key='obj', Body=body,
+ ContentEncoding='aws-chunked')
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+ resp = self.conn.get_object(Bucket=self.bucket, Key='obj')
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+ self.assertEqual(body, resp['Body'].read())
+
+ def test_put_chunked_sha256(self):
+ body = b'abcd' * 8192
+ resp = self.conn.put_object(Bucket=self.bucket, Key='obj', Body=body,
+ ContentEncoding='aws-chunked',
+ ChecksumAlgorithm='SHA256')
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+ resp = self.conn.get_object(Bucket=self.bucket, Key='obj')
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+ self.assertEqual(body, resp['Body'].read())
+
+
class TestS3ApiObject(S3ApiBase):
def setUp(self):
super(TestS3ApiObject, self).setUp()
diff -Nru swift-2.35.0/test/s3api/test_input_errors.py swift-2.35.1/test/s3api/test_input_errors.py
--- swift-2.35.0/test/s3api/test_input_errors.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/s3api/test_input_errors.py 2025-08-22 17:56:44.000000000 +0200
@@ -16,11 +16,14 @@
import binascii
import base64
import datetime
+import gzip
import hashlib
import hmac
import os
import requests
import requests.models
+import struct
+import zlib
from urllib.parse import urlsplit, urlunsplit, quote
from swift.common import bufferedhttp
@@ -50,6 +53,12 @@
).decode('ascii')
+def _crc32(payload=b''):
+ return base64.b64encode(
+ struct.pack('!I', zlib.crc32(payload))
+ ).decode('ascii')
+
+
EMPTY_SHA256 = _sha256()
EPOCH = datetime.datetime.fromtimestamp(0, UTC)
@@ -272,7 +281,7 @@
for k, v in sorted(request['query'].items())),
]
canonical_request_lines.extend(
- '%s:%s' % (h, request['headers'][h])
+ '%s:%s' % (h, request['headers'][h].strip())
for h in request['signed_headers'])
canonical_request_lines.extend([
'',
@@ -304,6 +313,58 @@
'signature': signature,
}
+ def sign_chunk(self, request, previous_signature, current_chunk_sha):
+ scope = [
+ request['now'].strftime('%Y%m%d'),
+ self.region,
+ 's3',
+ 'aws4_request',
+ ]
+ string_to_sign_lines = [
+ 'AWS4-HMAC-SHA256-PAYLOAD',
+ self.date_to_sign(request),
+ '/'.join(scope),
+ previous_signature,
+ _sha256(), # ??
+ current_chunk_sha,
+ ]
+ key = 'AWS4' + self.secret_key
+ for piece in scope:
+ key = _hmac(key, piece, hashlib.sha256)
+ return binascii.hexlify(_hmac(
+ key,
+ '\n'.join(string_to_sign_lines),
+ hashlib.sha256
+ )).decode('ascii')
+
+ def sign_trailer(self, request, previous_signature, trailer):
+ # rough canonicalization
+ trailer = trailer.replace(b'\r', b'').replace(b' ', b'')
+ # AWS always wants at least the newline
+ if not trailer:
+ trailer = b'\n'
+ scope = [
+ request['now'].strftime('%Y%m%d'),
+ self.region,
+ 's3',
+ 'aws4_request',
+ ]
+ string_to_sign_lines = [
+ 'AWS4-HMAC-SHA256-TRAILER',
+ self.date_to_sign(request),
+ '/'.join(scope),
+ previous_signature,
+ _sha256(trailer),
+ ]
+ key = 'AWS4' + self.secret_key
+ for piece in scope:
+ key = _hmac(key, piece, hashlib.sha256)
+ return binascii.hexlify(_hmac(
+ key,
+ '\n'.join(string_to_sign_lines),
+ hashlib.sha256
+ )).decode('ascii')
+
class S3SessionV4Headers(S3SessionV4):
def build_request(
@@ -749,6 +810,297 @@
self.assertSHA256Mismatch(
resp, EMPTY_SHA256.upper(), _sha256(TEST_BODY))
+ def test_good_md5_good_sha_good_crc(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'good-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'content-md5': _md5(TEST_BODY),
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ })
+ self.assertOK(resp)
+
+ def test_good_md5_good_sha_good_crc_declared(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'good-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'content-md5': _md5(TEST_BODY),
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ # can flag that you're going to send it
+ 'x-amz-sdk-checksum-algorithm': 'CRC32',
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ })
+ self.assertOK(resp)
+
+ def test_good_md5_good_sha_no_crc_but_declared(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'missing-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'content-md5': _md5(TEST_BODY),
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ # but if you flag it, you gotta send it
+ 'x-amz-sdk-checksum-algorithm': 'CRC32',
+ })
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>x-amz-sdk-checksum-algorithm specified, but '
+ b'no corresponding x-amz-checksum-* or x-amz-trailer '
+ b'headers were found.</Message>', resp.content)
+
+ def test_good_md5_good_sha_good_crc_algo_mismatch(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'good-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'content-md5': _md5(TEST_BODY),
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-sdk-checksum-algorithm': 'CRC32C',
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ })
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ # Note that if there's a mismatch between what you flag and what you
+ # send, the message isn't super clear
+ self.assertIn(b'<Message>Value for x-amz-sdk-checksum-algorithm '
+ b'header is invalid.</Message>', resp.content)
+
+ def test_good_md5_good_sha_invalid_crc_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'invalid-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'content-md5': _md5(TEST_BODY),
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': 'bad'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Value for x-amz-checksum-crc32 header is '
+ b'invalid.</Message>', resp.content)
+
+ def test_good_md5_good_sha_bad_crc_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'content-md5': _md5(TEST_BODY),
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(b'not the body')})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>BadDigest</Code>', resp.content)
+ self.assertIn(b'<Message>The CRC32 you specified did not match the '
+ b'calculated checksum.</Message>', resp.content)
+
+ def test_good_md5_bad_sha_bad_crc_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'content-md5': _md5(TEST_BODY),
+ 'x-amz-content-sha256': _sha256(b'not the body'),
+ 'x-amz-checksum-crc32': _crc32(b'not the body')})
+ # SHA256 trumps checksum
+ self.assertSHA256Mismatch(
+ resp, _sha256(b'not the body'), _sha256(TEST_BODY))
+
+ def test_no_md5_good_sha_good_crc_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY)})
+ self.assertOK(resp)
+
+ def test_no_md5_good_sha_unsupported_crc_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-bad': _crc32(TEST_BODY)})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>The algorithm type you specified in '
+ b'x-amz-checksum- header is invalid.</Message>',
+ resp.content)
+
+ def test_no_md5_good_sha_multiple_crc_in_headers(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32c': _crc32(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY)})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
+ def test_no_md5_good_sha_multiple_crc_in_headers_algo_mismatch(self):
+ # repeats trump the algo mismatch
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'sha256',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32c': _crc32(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY)})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
+ def test_no_md5_good_sha_crc_in_trailer_but_not_streaming(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>MalformedTrailerError</Code>', resp.content)
+ self.assertIn(b'<Message>The request contained trailing data that was '
+ b'not well-formed or did not conform to our published '
+ b'schema.</Message>', resp.content)
+
+ def test_no_md5_good_sha_duplicated_crc_in_trailer_algo_mismatch(self):
+ # repeats trump the algo mismatch
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'sha256',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-trailer': 'x-amz-checksum-crc32, x-amz-checksum-crc32'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
+ def test_no_md5_good_sha_multiple_crc_in_trailer_algo_mismatch(self):
+ # repeats trump the algo mismatch
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'sha256',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-trailer': 'x-amz-checksum-crc32, x-amz-checksum-crc32c'}
+ )
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
+ def test_no_md5_good_sha_different_crc_in_trailer_and_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-trailer': 'x-amz-checksum-crc32c'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header'
+ b'</Message>', resp.content)
+
+ def test_no_md5_good_sha_same_crc_in_trailer_and_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header'
+ b'</Message>', resp.content)
+
+ def test_no_md5_good_sha_multiple_crc_in_trailer_and_header(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-trailer': 'x-amz-checksum-crc32, x-amz-checksum-crc32c'}
+ )
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
+ def test_no_md5_good_sha_multiple_crc_in_header_and_trailer(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-checksum-sha256': _sha256(TEST_BODY),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'}
+ )
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
def test_no_md5_bad_sha_empty_body(self):
resp = self.conn.make_request(
self.bucket_name,
@@ -766,6 +1118,23 @@
headers={'x-amz-content-sha256': _sha256(TEST_BODY)})
self.assertOK(resp)
+ def test_no_md5_good_sha_chunk_encoding_declared_ok(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=TEST_BODY,
+ headers={'x-amz-content-sha256': _sha256(TEST_BODY),
+ 'content-encoding': 'aws-chunked'}) # but not really
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'})
+ self.assertOK(resp, TEST_BODY)
+ self.assertEqual(resp.headers.get('Content-Encoding'), 'aws-chunked')
+
def test_no_md5_good_sha_ucase(self):
resp = self.conn.make_request(
self.bucket_name,
@@ -824,6 +1193,52 @@
headers={'x-amz-content-sha256': 'unsigned-payload'})
self.assertSHA256Mismatch(resp, 'unsigned-payload', _sha256(TEST_BODY))
+ def test_no_md5_streaming_unsigned_no_encoding_no_length(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER'})
+ respbody = resp.content
+ if not isinstance(respbody, str):
+ respbody = respbody.decode('utf8')
+ self.assertEqual(
+ (resp.status_code, resp.reason),
+ (411, 'Length Required'),
+ respbody)
+ self.assertIn('<Code>MissingContentLength</Code>', respbody)
+ # NB: we *do* provide Content-Length (or rather, urllib does)
+ # they really mean X-Amz-Decoded-Content-Length
+ self.assertIn("<Message>You must provide the Content-Length HTTP "
+ "header.</Message>",
+ respbody)
+
+ def test_no_md5_streaming_unsigned_bad_decoded_content_length(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': 'not an int'})
+ respbody = resp.content
+ if not isinstance(respbody, str):
+ respbody = respbody.decode('utf8')
+ self.assertEqual(
+ (resp.status_code, resp.reason),
+ (411, 'Length Required'),
+ respbody)
+ self.assertIn('<Code>MissingContentLength</Code>', respbody)
+ # NB: we *do* provide Content-Length (or rather, urllib does)
+ # they really mean X-Amz-Decoded-Content-Length
+ self.assertIn("<Message>You must provide the Content-Length HTTP "
+ "header.</Message>",
+ respbody)
+
def test_invalid_md5_no_sha(self):
resp = self.conn.make_request(
self.bucket_name,
@@ -1140,6 +1555,45 @@
self.assertIn('<ArgumentValue>%s</ArgumentValue>' % sha_in_headers,
respbody)
+ def assertSignatureMismatch(self, resp, sts_first_line='AWS4-HMAC-SHA256'):
+ respbody = resp.content
+ if not isinstance(respbody, str):
+ respbody = respbody.decode('utf8')
+ self.assertEqual(
+ (resp.status_code, resp.reason),
+ (403, 'Forbidden'),
+ respbody)
+ self.assertIn('<Code>SignatureDoesNotMatch</Code>', respbody)
+ self.assertIn('<Message>The request signature we calculated does not '
+ 'match the signature you provided. Check your key and '
+ 'signing method.</Message>', respbody)
+ self.assertIn('<AWSAccessKeyId>', respbody)
+ self.assertIn(f'<StringToSign>{sts_first_line}\n', respbody)
+ self.assertIn('<SignatureProvided>', respbody)
+ self.assertIn('<StringToSignBytes>', respbody)
+ self.assertIn('<CanonicalRequest>', respbody)
+ self.assertIn('<CanonicalRequestBytes>', respbody)
+
+ def assertMalformedTrailer(self, resp):
+ respbody = resp.content
+ if not isinstance(respbody, str):
+ respbody = respbody.decode('utf8')
+ self.assertEqual(
+ (resp.status_code, resp.reason),
+ (400, 'Bad Request'),
+ respbody)
+ self.assertIn('<Code>MalformedTrailerError</Code>', respbody)
+ self.assertIn('<Message>The request contained trailing data that was '
+ 'not well-formed or did not conform to our published '
+ 'schema.</Message>', respbody)
+
+ def assertUnsupportedTrailerHeader(self, resp):
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>The value specified in the x-amz-trailer '
+ b'header is not supported</Message>',
+ resp.content)
+
def test_get_service_no_sha(self):
resp = self.conn.make_request()
self.assertMissingSHA256(resp)
@@ -1242,6 +1696,1540 @@
headers={'x-amz-content-sha256': 'unsigned-payload'})
self.assertInvalidSHA256(resp, 'unsigned-payload')
+ def test_no_md5_no_sha_good_crc(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'bad-checksum',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY)})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Missing required header for this request: '
+ b'x-amz-content-sha256</Message>', resp.content)
+
+ def test_strm_unsgnd_pyld_trl_not_encoded(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_encoding_declared_not_encoded(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_no_trailer_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'})
+ self.assertOK(resp, TEST_BODY)
+ self.assertNotIn('Content-Encoding', resp.headers)
+
+ def test_strm_unsgnd_pyld_trl_te_chunked_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ # Use iter(list-of-bytes) to force requests to send
+ # Transfer-Encoding: chunked
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=iter([chunked_body]),
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_te_chunked_no_decoded_content_length(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ # Use iter(list-of-bytes) to force requests to send
+ # Transfer-Encoding: chunked
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=iter([chunked_body]),
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked'})
+ self.assertEqual(resp.status_code, 411, resp.content)
+ self.assertIn(b'<Code>MissingContentLength</Code>', resp.content)
+ self.assertIn(b'<Message>You must provide the Content-Length HTTP '
+ b'header.</Message>', resp.content)
+
+ def test_strm_unsgnd_pyld_trl_crc_header_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'})
+ self.assertOK(resp, TEST_BODY)
+ self.assertNotIn('Content-Encoding', resp.headers)
+
+ def test_strm_unsgnd_pyld_trl_crc_header_x_amz_checksum_type_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ # unexpected with a PUT but tolerated...
+ 'x-amz-checksum-type': 'COMPOSITE',
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_crc_header_x_amz_checksum_algorithm_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ # unexpected with a PUT but tolerated...
+ 'x-amz-checksum-algorithm': 'crc32',
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_crc_header_algo_mismatch(self):
+ chunked_body = b'nonsense ignored'
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'sha256',
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Value for x-amz-sdk-checksum-algorithm '
+ b'header is invalid.</Message>', resp.content)
+
+ def test_strm_unsgnd_pyld_trl_multiple_crc_header(self):
+ chunked_body = b'nonsense ignored'
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-checksum-crc32c': _crc32(TEST_BODY),
+ 'x-amz-checksum-crc32': _crc32(TEST_BODY),
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
+ def test_strm_unsgnd_pyld_trl_crc_header_mismatch(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-checksum-crc32': _crc32(b'not the test body'),
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>BadDigest</Code>', resp.content)
+ self.assertIn(b'<Message>The CRC32 you specified did not match the '
+ b'calculated checksum.</Message>', resp.content)
+
+ def test_strm_unsgnd_pyld_trl_declared_algo_declared_no_trailer_sent(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-trailer': 'x-amz-checksum-crc32',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertMalformedTrailer(resp)
+
+ def test_strm_unsgnd_pyld_trl_declared_no_trailer_sent(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-trailer': 'x-amz-checksum-crc32',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertMalformedTrailer(resp)
+
+ def test_strm_sgnd_pyld_trl_no_trailer(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256':
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s%s' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk,
+ b'\r\n' if chunk else b''))
+ prev_sig = chunk_sig
+ trailers = b''
+ body_parts.append(trailers)
+ trailer_sig = self.conn.sign_trailer(req, prev_sig, trailers)
+ body_parts.append(
+ b'x-amz-trailer-signature:%s\r\n' % trailer_sig.encode('ascii'))
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_no_trailer_tr_chunked_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=iter([chunked_body]),
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_comma_in_trailer_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32,'})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_commas_in_trailer_1(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': ', x-amz-checksum-crc32, ,'})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_commas_in_trailer_2(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': ', x-amz-checksum-crc32'})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_commas_in_trailer_3(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': ',x-amz-checksum-crc32'})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_commas_in_trailer_4(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32,,'})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_commas_in_trailer_5(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32, ,'})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_commas_in_trailer_6(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32, '})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_checksum_mismatch(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(b"not the body")}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>BadDigest</Code>', resp.content)
+ self.assertIn(b'<Message>The CRC32 you specified did not match the '
+ b'calculated checksum.</Message>', resp.content)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_checksum_invalid(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {"not=base-64"}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Value for x-amz-checksum-crc32 trailing '
+ b'header is invalid.</Message>', resp.content)
+
+ def test_strm_unsgnd_pyld_trl_content_sha256_in_trailer(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-content-sha256: {_sha256(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-content-sha256'})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_no_cr(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_no_lf(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_no_crlf(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_extra_line_before(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ '\r\n',
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertMalformedTrailer(resp)
+
+ def test_strm_unsgnd_pyld_trl_extra_line_after_trailer_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ '\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_with_trailer_extra_line_junk_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ '\r\n',
+ '\xff\xde\xad\xbe\xef\xff',
+ ]).encode('latin1')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertOK(resp) # really??
+
+ def test_strm_unsgnd_pyld_trl_extra_lines_after_trailer_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ '\r\n',
+ '\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_mismatch_trailer(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32c'})
+ self.assertMalformedTrailer(resp)
+
+ def test_strm_unsgnd_pyld_trl_unsupported_trailer_sent(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-bad: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32c'})
+ self.assertMalformedTrailer(resp)
+
+ def test_strm_unsgnd_pyld_trl_non_checksum_trailer(self):
+ def do_test(trailer, value):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'{trailer}: {value}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256':
+ 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': trailer})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ do_test('foo', 'bar')
+ do_test('content-md5', _md5(TEST_BODY))
+ do_test('x-amz-content-sha256', _sha256(TEST_BODY))
+
+ def test_strm_unsgnd_pyld_trl_unsupported_trailer_declared(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-bad'})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ def test_strm_unsgnd_pyld_trl_multiple_checksum_trailers(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ f'x-amz-checksum-sha256: {_sha256(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer':
+ 'x-amz-checksum-crc32, x-amz-checksum-sha256'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>InvalidRequest</Code>', resp.content)
+ self.assertIn(b'<Message>Expecting a single x-amz-checksum- header. '
+ b'Multiple checksum Types are not allowed.</Message>',
+ resp.content)
+
+ def test_strm_unsgnd_pyld_trl_multiple_trailers_unsupported(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ 'x-amz-foo: bar\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer':
+ 'x-amz-checksum-crc32, x-amz-foo'})
+ self.assertUnsupportedTrailerHeader(resp)
+
+ def test_strm_unsgnd_pyld_trl_extra_trailer(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ 'bonus: trailer\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertMalformedTrailer(resp)
+
+ def test_strm_unsgnd_pyld_trl_bad_then_good_trailer_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY[:-1])}\r\n',
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertOK(resp)
+
+ def test_strm_unsgnd_pyld_trl_good_then_bad_trailer(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY[:-1])}\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertEqual(resp.status_code, 400, resp.content)
+ self.assertIn(b'<Code>BadDigest</Code>', resp.content)
+ self.assertIn(b'<Message>The CRC32 you specified did not match the '
+ b'calculated checksum.</Message>', resp.content)
+
+ def test_strm_unsgnd_pyld_trl_extra_line_then_trailer_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])[:-2]
+ chunked_body += ''.join([
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n',
+ '\r\n',
+ 'bonus: trailer\r\n',
+ ]).encode('ascii')
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY)),
+ 'x-amz-trailer': 'x-amz-checksum-crc32'})
+ self.assertOK(resp) # ???
+
+ def test_strm_unsgnd_pyld_trl_no_cr(self):
+ chunked_body = b''.join(
+ b'%x\n%s\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_no_lf(self):
+ chunked_body = b''.join(
+ b'%x\r%s\r' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_no_trailing_lf(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ chunked_body = chunked_body[:-1]
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_no_trailing_crlf_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ chunked_body = chunked_body[:-2]
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ # dafuk?
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'})
+ self.assertOK(resp, TEST_BODY)
+ self.assertNotIn('Content-Encoding', resp.headers)
+
+ def test_strm_unsgnd_pyld_trl_cl_matches_decoded_cl(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(chunked_body))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_sgnd_pyld_cl_matches_decoded_cl(self):
+ # Used to calculate our bad decoded-content-length
+ dummy_body = b''.join(
+ b'%x;chunk-signature=%064x\r\n%s\r\n' % (len(chunk), 0, chunk)
+ for chunk in [TEST_BODY, b''])
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256':
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(dummy_body))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_no_zero_chunk(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_unsgnd_pyld_trl_zero_chunk_mid_stream(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY[:4], b'', TEST_BODY[4:], b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp, 4, len(TEST_BODY))
+
+ def test_strm_unsgnd_pyld_trl_too_many_bytes(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY * 2, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp, 2 * len(TEST_BODY), len(TEST_BODY))
+
+ def test_strm_unsgnd_pyld_trl_no_encoding_ok(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'})
+ self.assertOK(resp, TEST_BODY)
+ self.assertNotIn('Content-Encoding', resp.headers)
+
+ def test_strm_unsgnd_pyld_trl_custom_encoding_ok(self):
+ # As best we can tell, AWS doesn't care at all about how
+ # > If one or more encodings have been applied to a representation,
+ # > the sender that applied the encodings MUST generate a
+ # > Content-Encoding header field that lists the content codings in
+ # > the order in which they were applied.
+ # See https://www.rfc-editor.org/rfc/rfc9110.html#section-8.4
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'foo, aws-chunked, bar',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'})
+ self.assertOK(resp, TEST_BODY)
+ self.assertIn('Content-Encoding', resp.headers)
+ self.assertEqual(resp.headers['Content-Encoding'], 'foo, bar')
+
+ def test_strm_unsgnd_pyld_trl_gzipped_undeclared_ok(self):
+ alt_body = gzip.compress(TEST_BODY)
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [alt_body, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'gzip',
+ 'x-amz-decoded-content-length': str(len(alt_body))})
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'},
+ stream=True) # needed so requests won't try to be "helpful"
+ read_body = resp.raw.read()
+ self.assertEqual(read_body, alt_body)
+ self.assertEqual(resp.headers['Content-Length'], str(len(alt_body)))
+ self.assertOK(resp) # already read body
+ self.assertIn('Content-Encoding', resp.headers)
+ self.assertEqual(resp.headers['Content-Encoding'], 'gzip')
+
+ def test_strm_unsgnd_pyld_trl_gzipped_declared_swapped_ok(self):
+ alt_body = gzip.compress(TEST_BODY)
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [alt_body, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked, gzip',
+ 'x-amz-decoded-content-length': str(len(alt_body))})
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'},
+ stream=True)
+ read_body = resp.raw.read()
+ self.assertEqual(read_body, alt_body)
+ self.assertEqual(resp.headers['Content-Length'], str(len(alt_body)))
+ self.assertOK(resp) # already read body
+ self.assertIn('Content-Encoding', resp.headers)
+ self.assertEqual(resp.headers['Content-Encoding'], 'gzip')
+
+ def test_strm_unsgnd_pyld_trl_gzipped_declared_ok(self):
+ alt_body = gzip.compress(TEST_BODY)
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [alt_body, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'gzip, aws-chunked',
+ 'x-amz-decoded-content-length': str(len(alt_body))})
+ self.assertOK(resp)
+
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='GET',
+ headers={'x-amz-content-sha256': 'UNSIGNED-PAYLOAD'},
+ stream=True)
+ read_body = resp.raw.read()
+ self.assertEqual(read_body, alt_body)
+ self.assertEqual(resp.headers['Content-Length'], str(len(alt_body)))
+ self.assertOK(resp) # already read body
+ self.assertIn('Content-Encoding', resp.headers)
+ self.assertEqual(resp.headers['Content-Encoding'], 'gzip')
+
+ def test_strm_sgnd_pyld_no_signatures(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_sgnd_pyld_blank_signatures(self):
+ chunked_body = b''.join(
+ b'%x;chunk-signature=\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp)
+
+ def test_strm_sgnd_pyld_invalid_signatures(self):
+ chunked_body = b''.join(
+ b'%x;chunk-signature=invalid\r\n%s\r\n' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertSignatureMismatch(resp, 'AWS4-HMAC-SHA256-PAYLOAD')
+
+ def test_strm_sgnd_pyld_bad_signatures(self):
+ chunked_body = b''.join(
+ b'%x;chunk-signature=%064x\r\n%s\r\n' % (len(chunk), 0, chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertSignatureMismatch(resp, 'AWS4-HMAC-SHA256-PAYLOAD')
+
+ def test_strm_sgnd_pyld_good_signatures_ok(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertOK(resp)
+
+ def test_strm_sgnd_pyld_ragged_chunk_lengths_ok(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str((15 + 8 + 16) * 1024)})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [
+ b'x' * 15 * 1024,
+ b'y' * 8 * 1024,
+ b'z' * 16 * 1024,
+ b'',
+ ]:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertOK(resp)
+
+ def test_strm_sgnd_pyld_no_zero_chunk(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY]:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertIncompleteBody(resp)
+
+ def test_strm_sgnd_pyld_negative_chunk_length(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'-%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ # AWS reliably 500s at time of writing
+ self.assertNotEqual(resp.status_code, 200)
+
+ def test_strm_sgnd_pyld_too_small_chunks(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(9 * 1024)})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [b'x' * 1024, b'y' * 4 * 1024, b'z' * 3 * 1024, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertEqual(
+ (resp.status_code, resp.reason),
+ (403, 'Forbidden')) # ???
+ respbody = resp.content.decode('utf8')
+ self.assertIn('<Code>InvalidChunkSizeError</Code>', respbody)
+ self.assertIn("<Message>Only the last chunk is allowed to have a "
+ "size less than 8192 bytes</Message>",
+ respbody)
+ # Yeah, it points at the wrong chunk number
+ self.assertIn("<Chunk>2</Chunk>", respbody)
+ # But at least it complains about the right size!
+ self.assertIn("<BadChunkSize>%d</BadChunkSize>" % 1024,
+ respbody)
+
+ def test_strm_sgnd_pyld_spaced_out_chunk_param_ok(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x ; chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertOK(resp)
+
+ def test_strm_sgnd_pyld_spaced_out_chunk_param_value_ok(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature = %s \r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertOK(resp)
+
+ def test_strm_sgnd_pyld_bad_final_signature(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(
+ req, prev_sig, _sha256(chunk or b'x'))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertSignatureMismatch(resp, 'AWS4-HMAC-SHA256-PAYLOAD')
+
+ def test_strm_sgnd_pyld_extra_param_before(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(
+ b'%x;extra=param;chunk-signature=%s\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertSignatureMismatch(resp, 'AWS4-HMAC-SHA256-PAYLOAD')
+
+ def test_strm_sgnd_pyld_extra_param_after(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(
+ b'%x;chunk-signature=%s;extra=param\r\n%s\r\n' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk))
+ prev_sig = chunk_sig
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertIncompleteBody(resp)
+
+ def test_strm_sgnd_pyld_missing_final_chunk(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(TEST_BODY))
+ body = b'%x;chunk-signature=%s\r\n%s\r\n' % (
+ len(TEST_BODY), chunk_sig.encode('ascii'), TEST_BODY)
+ resp = self.conn.send_request(req, body)
+ self.assertIncompleteBody(resp)
+
+ def test_strm_sgnd_pyld_trl_ok(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256':
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-trailer': 'x-amz-checksum-crc32',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s%s' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk,
+ b'\r\n' if chunk else b''))
+ prev_sig = chunk_sig
+ trailers = (
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n'
+ ).encode('ascii')
+ body_parts.append(trailers)
+ trailer_sig = self.conn.sign_trailer(req, prev_sig, trailers)
+ body_parts.append(
+ b'x-amz-trailer-signature:%s\r\n' % trailer_sig.encode('ascii'))
+ body_parts.append(b'\r\n')
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertOK(resp)
+
+ def test_strm_sgnd_pyld_trl_missing_trl_sig(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256':
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-trailer': 'x-amz-checksum-crc32',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s%s' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk,
+ b'\r\n' if chunk else b''))
+ prev_sig = chunk_sig
+ trailers = (
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n'
+ ).encode('ascii')
+ body_parts.append(trailers)
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertIncompleteBody(resp)
+
+ def test_strm_sgnd_pyld_trl_bad_trl_sig(self):
+ req = self.conn.build_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ headers={
+ 'x-amz-content-sha256':
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-trailer': 'x-amz-checksum-crc32',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ prev_sig = self.conn.sign_v4(req)['signature']
+ self.conn.sign_request(req)
+ body_parts = []
+ for chunk in [TEST_BODY, b'']:
+ chunk_sig = self.conn.sign_chunk(req, prev_sig, _sha256(chunk))
+ body_parts.append(b'%x;chunk-signature=%s\r\n%s%s' % (
+ len(chunk), chunk_sig.encode('ascii'), chunk,
+ b'\r\n' if chunk else b''))
+ prev_sig = chunk_sig
+ trailers = (
+ f'x-amz-checksum-crc32: {_crc32(TEST_BODY)}\r\n'
+ ).encode('ascii')
+ body_parts.append(trailers)
+ trailer_sig = self.conn.sign_trailer(req, prev_sig, trailers[:-1])
+ body_parts.append(
+ b'x-amz-trailer-signature:%s\r\n' % trailer_sig.encode('ascii'))
+ resp = self.conn.send_request(req, b''.join(body_parts))
+ self.assertSignatureMismatch(resp, 'AWS4-HMAC-SHA256-TRAILER')
+
def test_invalid_md5_no_sha(self):
resp = self.conn.make_request(
self.bucket_name,
@@ -1372,13 +3360,108 @@
(400, 'Bad Request'))
-class TestV4AuthQuery(InputErrorsMixin, BaseS3TestCaseWithBucket):
+class NotV4AuthHeadersMixin:
+ def test_strm_unsgnd_pyld_trl_not_encoded(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertSHA256Mismatch(resp, 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ _sha256(TEST_BODY))
+
+ def test_strm_unsgnd_pyld_trl_encoding_declared_not_encoded(self):
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=TEST_BODY,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertSHA256Mismatch(resp, 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ _sha256(TEST_BODY))
+
+ def test_strm_unsgnd_pyld_trl_no_trailer(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp, len(chunked_body), len(TEST_BODY))
+
+ def test_strm_unsgnd_pyld_trl_cl_matches_decoded_cl(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(chunked_body))})
+ self.assertSHA256Mismatch(resp, 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ _sha256(chunked_body))
+
+ def test_strm_sgnd_pyld_trl_no_trailer(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(TEST_BODY))})
+ self.assertIncompleteBody(resp, len(chunked_body), len(TEST_BODY))
+
+ def test_strm_sgnd_pyld_cl_matches_decoded_cl(self):
+ chunked_body = b''.join(
+ b'%x\r\n%s' % (len(chunk), chunk)
+ for chunk in [TEST_BODY, b''])
+ resp = self.conn.make_request(
+ self.bucket_name,
+ 'test-obj',
+ method='PUT',
+ body=chunked_body,
+ headers={
+ 'x-amz-content-sha256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'content-encoding': 'aws-chunked',
+ 'x-amz-decoded-content-length': str(len(chunked_body))})
+ self.assertSHA256Mismatch(resp, 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ _sha256(chunked_body))
+
+
+class TestV4AuthQuery(InputErrorsMixin,
+ NotV4AuthHeadersMixin,
+ BaseS3TestCaseWithBucket):
session_cls = S3SessionV4Query
-class TestV2AuthHeaders(InputErrorsMixin, BaseS3TestCaseWithBucket):
+class TestV2AuthHeaders(InputErrorsMixin,
+ NotV4AuthHeadersMixin,
+ BaseS3TestCaseWithBucket):
session_cls = S3SessionV2Headers
-class TestV2AuthQuery(InputErrorsMixin, BaseS3TestCaseWithBucket):
+class TestV2AuthQuery(InputErrorsMixin,
+ NotV4AuthHeadersMixin,
+ BaseS3TestCaseWithBucket):
session_cls = S3SessionV2Query
diff -Nru swift-2.35.0/test/s3api/test_mpu.py swift-2.35.1/test/s3api/test_mpu.py
--- swift-2.35.0/test/s3api/test_mpu.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/s3api/test_mpu.py 2025-08-22 17:56:44.000000000 +0200
@@ -793,7 +793,7 @@
with self.assertRaises(ClientError) as cm:
self.get_part(key_name, 3)
- self.assertEqual(416, status_from_error(cm.exception))
+ self.assertEqual(416, status_from_error(cm.exception), cm.exception)
self.assertEqual('InvalidPartNumber', code_from_error(cm.exception))
def test_create_upload_complete_misordered_parts(self):
diff -Nru swift-2.35.0/test/s3api/test_object_checksums.py swift-2.35.1/test/s3api/test_object_checksums.py
--- swift-2.35.0/test/s3api/test_object_checksums.py 1970-01-01 01:00:00.000000000 +0100
+++ swift-2.35.1/test/s3api/test_object_checksums.py 2025-08-22 17:56:44.000000000 +0200
@@ -0,0 +1,600 @@
+# Copyright (c) 2010-2023 OpenStack Foundation
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+# implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import binascii
+import botocore
+import hashlib
+from unittest import SkipTest
+
+from swift.common.utils import base64_str
+from swift.common.utils.checksum import crc32c
+from test.s3api import BaseS3TestCaseWithBucket
+
+TEST_BODY = b'123456789'
+
+
+def boto_at_least(*version):
+ return tuple(int(x) for x in botocore.__version__.split('.')) >= version
+
+
+class ObjectChecksumMixin(object):
+
+ @classmethod
+ def setUpClass(cls):
+ super().setUpClass()
+ cls.client = cls.get_s3_client(1)
+ cls.use_tls = cls.client._endpoint.host.startswith('https:')
+ cls.CHECKSUM_HDR = 'x-amz-checksum-' + cls.ALGORITHM.lower()
+
+ def assert_error(self, resp, err_code, err_msg, obj_name, **extra):
+ self.assertEqual(400, resp['ResponseMetadata']['HTTPStatusCode'])
+ self.assertEqual(err_code, resp['Error']['Code'])
+ self.assertEqual(err_msg, resp['Error']['Message'])
+ self.assertEqual({k: resp['Error'].get(k) for k in extra}, extra)
+
+ # Sanity check: object was not created
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.head_object(Bucket=self.bucket_name, Key=obj_name)
+ resp = caught.exception.response
+ self.assertEqual(404, resp['ResponseMetadata']['HTTPStatusCode'])
+
+ def test_let_sdk_compute(self):
+ obj_name = self.create_name(self.ALGORITHM + '-sdk')
+ resp = self.client.put_object(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ Body=TEST_BODY,
+ ChecksumAlgorithm=self.ALGORITHM,
+ )
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+
+ def test_good_checksum(self):
+ obj_name = self.create_name(self.ALGORITHM + '-with-algo-header')
+ resp = self.client.put_object(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ Body=TEST_BODY,
+ ChecksumAlgorithm=self.ALGORITHM,
+ **{'Checksum' + self.ALGORITHM: self.EXPECTED}
+ )
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+
+ def test_good_checksum_no_algorithm_header(self):
+ obj_name = self.create_name(self.ALGORITHM + '-no-algo-header')
+ resp = self.client.put_object(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ Body=TEST_BODY,
+ **{'Checksum' + self.ALGORITHM: self.EXPECTED}
+ )
+ self.assertEqual(200, resp['ResponseMetadata']['HTTPStatusCode'])
+
+ def test_invalid_checksum(self):
+ obj_name = self.create_name(self.ALGORITHM + '-invalid')
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ Body=TEST_BODY,
+ ChecksumAlgorithm=self.ALGORITHM,
+ **{'Checksum' + self.ALGORITHM: self.INVALID}
+ )
+ self.assert_error(
+ caught.exception.response,
+ 'InvalidRequest',
+ 'Value for %s header is invalid.' % self.CHECKSUM_HDR,
+ obj_name,
+ )
+
+ def test_bad_checksum(self):
+ obj_name = self.create_name(self.ALGORITHM + '-bad')
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ Body=TEST_BODY,
+ ChecksumAlgorithm=self.ALGORITHM,
+ **{'Checksum' + self.ALGORITHM: self.BAD}
+ )
+ self.assert_error(
+ caught.exception.response,
+ 'BadDigest',
+ 'The %s you specified did not match the calculated checksum.'
+ % self.ALGORITHM,
+ obj_name,
+ )
+
+ def test_mpu_upload_part_invalid_checksum(self):
+ obj_name = self.create_name(
+ self.ALGORITHM + '-mpu-upload-part-invalid-checksum')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ ChecksumAlgorithm=self.ALGORITHM)
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=TEST_BODY,
+ **{'Checksum' + self.ALGORITHM: self.INVALID},
+ )
+ self.assert_error(
+ caught.exception.response,
+ 'InvalidRequest',
+ 'Value for %s header is invalid.' % self.CHECKSUM_HDR,
+ obj_name,
+ )
+
+ def test_mpu_upload_part_bad_checksum(self):
+ obj_name = self.create_name(
+ self.ALGORITHM + '-mpu-upload-part-bad-checksum')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ ChecksumAlgorithm=self.ALGORITHM)
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=TEST_BODY,
+ **{'Checksum' + self.ALGORITHM: self.BAD},
+ )
+ self.assert_error(
+ caught.exception.response,
+ 'BadDigest',
+ 'The %s you specified did not match the calculated '
+ 'checksum.' % self.ALGORITHM,
+ obj_name,
+ )
+
+ def test_mpu_upload_part_good_checksum(self):
+ obj_name = self.create_name(self.ALGORITHM + '-mpu-upload-part-good')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ ChecksumAlgorithm=self.ALGORITHM)
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ part_resp = self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=TEST_BODY,
+ **{'Checksum' + self.ALGORITHM: self.EXPECTED},
+ )
+ self.assertEqual(200, part_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+
+ def test_mpu_complete_good_checksum(self):
+ checksum_kwargs = {
+ 'ChecksumAlgorithm': self.ALGORITHM,
+ }
+ if boto_at_least(1, 36):
+ if self.ALGORITHM == 'CRC64NVME':
+ # crc64nvme only allows full-object
+ checksum_kwargs['ChecksumType'] = 'FULL_OBJECT'
+ else:
+ checksum_kwargs['ChecksumType'] = 'COMPOSITE'
+
+ obj_name = self.create_name(self.ALGORITHM + '-mpu-complete-good')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ **checksum_kwargs)
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ part_resp = self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=TEST_BODY,
+ **{'Checksum' + self.ALGORITHM: self.EXPECTED},
+ )
+ self.assertEqual(200, part_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ complete_mpu_resp = self.client.complete_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ MultipartUpload={
+ 'Parts': [
+ {
+ 'ETag': part_resp['ETag'],
+ 'PartNumber': 1,
+ 'Checksum' + self.ALGORITHM: self.EXPECTED,
+ },
+ ],
+ },
+ UploadId=upload_id,
+ )
+ self.assertEqual(200, complete_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+
+
+class TestObjectChecksumCRC32(ObjectChecksumMixin, BaseS3TestCaseWithBucket):
+ ALGORITHM = 'CRC32'
+ EXPECTED = 'y/Q5Jg=='
+ INVALID = 'y/Q5Jh=='
+ BAD = 'z/Q5Jg=='
+
+
+class TestObjectChecksumCRC32C(ObjectChecksumMixin, BaseS3TestCaseWithBucket):
+ ALGORITHM = 'CRC32C'
+ EXPECTED = '4waSgw=='
+ INVALID = '4waSgx=='
+ BAD = '5waSgw=='
+
+ @classmethod
+ def setUpClass(cls):
+ if not botocore.httpchecksum.HAS_CRT:
+ raise SkipTest('botocore cannot crc32c (run `pip install awscrt`)')
+ super().setUpClass()
+
+
+class TestObjectChecksumCRC64NVME(ObjectChecksumMixin,
+ BaseS3TestCaseWithBucket):
+ ALGORITHM = 'CRC64NVME'
+ EXPECTED = 'rosUhgp5mIg='
+ INVALID = 'rosUhgp5mIh='
+ BAD = 'sosUhgp5mIg='
+
+ @classmethod
+ def setUpClass(cls):
+ if [int(x) for x in botocore.__version__.split('.')] < [1, 36]:
+ raise SkipTest('botocore cannot crc64nvme (run '
+ '`pip install -U boto3 botocore`)')
+ if not botocore.httpchecksum.HAS_CRT:
+ raise SkipTest('botocore cannot crc64nvme (run '
+ '`pip install awscrt`)')
+ super().setUpClass()
+
+
+class TestObjectChecksumSHA1(ObjectChecksumMixin, BaseS3TestCaseWithBucket):
+ ALGORITHM = 'SHA1'
+ EXPECTED = '98O8HYCOBHMq32eZZczDTKeuNEE='
+ INVALID = '98O8HYCOBHMq32eZZczDTKeuNEF='
+ BAD = '+8O8HYCOBHMq32eZZczDTKeuNEE='
+
+
+class TestObjectChecksumSHA256(ObjectChecksumMixin, BaseS3TestCaseWithBucket):
+ ALGORITHM = 'SHA256'
+ EXPECTED = 'FeKw08M4keuw8e9gnsQZQgwg4yDOlMZfvIwzEkSOsiU='
+ INVALID = 'FeKw08M4keuw8e9gnsQZQgwg4yDOlMZfvIwzEkSOsiV='
+ BAD = 'GeKw08M4keuw8e9gnsQZQgwg4yDOlMZfvIwzEkSOsiU='
+
+
+class TestObjectChecksums(BaseS3TestCaseWithBucket):
+
+ @classmethod
+ def setUpClass(cls):
+ super().setUpClass()
+ cls.client = cls.get_s3_client(1)
+ cls.use_tls = cls.client._endpoint.host.startswith('https:')
+
+ def test_multi_checksum(self):
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(
+ Bucket=self.bucket_name,
+ Key=self.create_name('multi-checksum'),
+ Body=TEST_BODY,
+ # Note: Both valid! Ought to be able to validate & store both
+ ChecksumCRC32='y/Q5Jg==',
+ ChecksumSHA1='98O8HYCOBHMq32eZZczDTKeuNEE=',
+ )
+ resp = caught.exception.response
+ code = resp['ResponseMetadata']['HTTPStatusCode']
+ self.assertEqual(400, code)
+ self.assertEqual('InvalidRequest', resp['Error']['Code'])
+ self.assertEqual(
+ resp['Error']['Message'],
+ 'Expecting a single x-amz-checksum- header. '
+ 'Multiple checksum Types are not allowed.')
+
+ def test_different_checksum_requested(self):
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(
+ Bucket=self.bucket_name,
+ Key=self.create_name('different-checksum'),
+ Body=TEST_BODY,
+ ChecksumCRC32='y/Q5Jg==',
+ ChecksumAlgorithm='SHA1',
+ )
+ resp = caught.exception.response
+ code = resp['ResponseMetadata']['HTTPStatusCode']
+ self.assertEqual(400, code)
+ self.assertEqual('InvalidRequest', resp['Error']['Code'])
+ if boto_at_least(1, 36):
+ self.assertEqual(
+ resp['Error']['Message'],
+ 'Value for x-amz-sdk-checksum-algorithm header is invalid.')
+ else:
+ self.assertEqual(
+ resp['Error']['Message'],
+ 'Expecting a single x-amz-checksum- header')
+
+ def assert_invalid(self, resp):
+ code = resp['ResponseMetadata']['HTTPStatusCode']
+ self.assertEqual(400, code)
+ self.assertEqual('InvalidRequest', resp['Error']['Code'])
+ self.assertEqual(
+ resp['Error']['Message'],
+ 'Value for x-amz-checksum-crc32 header is invalid.')
+
+ def test_invalid_base64_invalid_length(self):
+ put_kwargs = {
+ 'Bucket': self.bucket_name,
+ 'Key': self.create_name('invalid-bad-length'),
+ 'Body': TEST_BODY,
+ 'ChecksumCRC32': 'short===', # invalid length for base64
+ }
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(**put_kwargs)
+ self.assert_invalid(caught.exception.response)
+
+ def test_invalid_base64_too_short(self):
+ put_kwargs = {
+ 'Bucket': self.bucket_name,
+ 'Key': self.create_name('invalid-short'),
+ 'Body': TEST_BODY,
+ 'ChecksumCRC32': 'shrt', # only 3 bytes
+ }
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(**put_kwargs)
+ self.assert_invalid(caught.exception.response)
+
+ def test_invalid_base64_too_long(self):
+ put_kwargs = {
+ 'Bucket': self.bucket_name,
+ 'Key': self.create_name('invalid-long'),
+ 'Body': TEST_BODY,
+ 'ChecksumCRC32': 'toolong=', # 5 bytes
+ }
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(**put_kwargs)
+ self.assert_invalid(caught.exception.response)
+
+ def test_invalid_base64_all_invalid_chars(self):
+ put_kwargs = {
+ 'Bucket': self.bucket_name,
+ 'Key': self.create_name('purely-invalid'),
+ 'Body': TEST_BODY,
+ 'ChecksumCRC32': '^^^^^^==', # all invalid char
+ }
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(**put_kwargs)
+ self.assert_invalid(caught.exception.response)
+
+ def test_invalid_base64_includes_invalid_chars(self):
+ put_kwargs = {
+ 'Bucket': self.bucket_name,
+ 'Key': self.create_name('contains-invalid'),
+ 'Body': TEST_BODY,
+ 'ChecksumCRC32': 'y^/^Q5^J^g==', # spaced out with invalid chars
+ }
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.put_object(**put_kwargs)
+ self.assert_invalid(caught.exception.response)
+
+ def test_mpu_no_checksum_upload_part_invalid_checksum(self):
+ obj_name = self.create_name('no-checksum-mpu')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name)
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=TEST_BODY,
+ ChecksumCRC32=TestObjectChecksumCRC32.INVALID,
+ )
+ self.assert_invalid(caught.exception.response)
+
+ def test_mpu_has_no_checksum(self):
+ # Clients don't need to be thinking about checksums at all
+ obj_name = self.create_name('no-checksum-mpu')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name)
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ part_resp = self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=TEST_BODY,
+ )
+ complete_mpu_resp = self.client.complete_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ MultipartUpload={
+ 'Parts': [
+ {
+ 'ETag': part_resp['ETag'],
+ 'PartNumber': 1,
+ },
+ ],
+ },
+ UploadId=upload_id,
+ )
+ self.assertEqual(200, complete_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+
+ head_resp = self.client.head_object(
+ Bucket=self.bucket_name, Key=obj_name)
+ self.assertFalse([k for k in head_resp
+ if k.startswith('Checksum')])
+
+ def test_mpu_upload_part_multi_checksum(self):
+ obj_name = self.create_name('multi-checksum-mpu')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ ChecksumAlgorithm='CRC32C')
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=TEST_BODY,
+ # Both valid!
+ ChecksumCRC32=TestObjectChecksumCRC32.EXPECTED,
+ ChecksumCRC32C=TestObjectChecksumCRC32C.EXPECTED,
+ )
+ resp = caught.exception.response
+ self.assertEqual(400, resp['ResponseMetadata']['HTTPStatusCode'])
+ self.assertEqual(resp['Error'], {
+ 'Code': 'InvalidRequest',
+ 'Message': ('Expecting a single x-amz-checksum- header. '
+ 'Multiple checksum Types are not allowed.'),
+ })
+ # You'd think we ought to be able to validate & store both...
+
+ def test_multipart_mpu(self):
+ obj_name = self.create_name('multipart-mpu')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ ChecksumAlgorithm='CRC32C')
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ part_body = b'\x00' * 5 * 1024 * 1024
+ part_crc32c = base64_str(crc32c(part_body).digest())
+
+ upload_part_resp = self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=part_body,
+ ChecksumCRC32C=part_crc32c,
+ )
+ self.assertEqual(200, upload_part_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ # then do another
+ upload_part_resp = self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=2,
+ Body=part_body,
+ ChecksumCRC32C=part_crc32c,
+ )
+ self.assertEqual(200, upload_part_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+
+ complete_mpu_resp = self.client.complete_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ MultipartUpload={
+ 'Parts': [
+ {
+ 'PartNumber': 1,
+ 'ETag': upload_part_resp['ETag'],
+ 'ChecksumCRC32C': part_crc32c,
+ },
+ {
+ 'PartNumber': 2,
+ 'ETag': upload_part_resp['ETag'],
+ 'ChecksumCRC32C': part_crc32c,
+ },
+ ],
+ },
+ UploadId=upload_id,
+ )
+ self.assertEqual(200, complete_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ mpu_etag = '"' + hashlib.md5(binascii.unhexlify(
+ upload_part_resp['ETag'].strip('"')) * 2).hexdigest() + '-2"'
+ self.assertEqual(mpu_etag,
+ complete_mpu_resp['ETag'])
+
+ def test_multipart_mpu_no_etags(self):
+ obj_name = self.create_name('multipart-mpu')
+ create_mpu_resp = self.client.create_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ ChecksumAlgorithm='CRC32C')
+ self.assertEqual(200, create_mpu_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ upload_id = create_mpu_resp['UploadId']
+ part_body = b'\x00' * 5 * 1024 * 1024
+ part_crc32c = base64_str(crc32c(part_body).digest())
+
+ upload_part_resp = self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=1,
+ Body=part_body,
+ ChecksumCRC32C=part_crc32c,
+ )
+ self.assertEqual(200, upload_part_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+ # then do another
+ upload_part_resp = self.client.upload_part(
+ Bucket=self.bucket_name,
+ Key=obj_name,
+ UploadId=upload_id,
+ PartNumber=2,
+ Body=part_body,
+ ChecksumCRC32C=part_crc32c,
+ )
+ self.assertEqual(200, upload_part_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
+
+ with self.assertRaises(botocore.exceptions.ClientError) as caught:
+ self.client.complete_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ MultipartUpload={
+ 'Parts': [
+ {
+ 'PartNumber': 1,
+ 'ChecksumCRC32C': part_crc32c,
+ },
+ {
+ 'PartNumber': 2,
+ 'ChecksumCRC32C': part_crc32c,
+ },
+ ],
+ },
+ UploadId=upload_id,
+ )
+ resp = caught.exception.response
+ self.assertEqual(400, resp['ResponseMetadata']['HTTPStatusCode'])
+ self.assertEqual(resp['Error']['Code'], 'MalformedXML')
+ self.assertEqual(
+ resp['Error']['Message'],
+ 'The XML you provided was not well-formed or did not validate '
+ 'against our published schema'
+ )
+ abort_resp = self.client.abort_multipart_upload(
+ Bucket=self.bucket_name, Key=obj_name,
+ UploadId=upload_id,
+ )
+ self.assertEqual(204, abort_resp[
+ 'ResponseMetadata']['HTTPStatusCode'])
diff -Nru swift-2.35.0/test/unit/common/middleware/s3api/test_multi_upload.py swift-2.35.1/test/unit/common/middleware/s3api/test_multi_upload.py
--- swift-2.35.0/test/unit/common/middleware/s3api/test_multi_upload.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/unit/common/middleware/s3api/test_multi_upload.py 2025-08-22 17:56:44.000000000 +0200
@@ -1136,6 +1136,16 @@
'Content-MD5': base64.b64encode(b'blahblahblahblah').strip()},
fake_memcache)
+ def test_object_multipart_upload_initiate_with_checksum_algorithm(self):
+ fake_memcache = FakeMemcache()
+ fake_memcache.store[get_cache_key(
+ 'AUTH_test', 'bucket+segments')] = {'status': 204}
+ fake_memcache.store[get_cache_key(
+ 'AUTH_test', 'bucket')] = {'status': 204}
+ self._test_object_multipart_upload_initiate(
+ {'X-Amz-Checksum-Algorithm': 'CRC32',
+ 'X-Amz-Checksum-Type': 'COMPOSITE'}, fake_memcache)
+
def test_object_mpu_initiate_with_segment_bucket_mixed_policy(self):
fake_memcache = FakeMemcache()
fake_memcache.store[get_cache_key(
diff -Nru swift-2.35.0/test/unit/common/middleware/s3api/test_s3api.py swift-2.35.1/test/unit/common/middleware/s3api/test_s3api.py
--- swift-2.35.0/test/unit/common/middleware/s3api/test_s3api.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/unit/common/middleware/s3api/test_s3api.py 2025-08-22 17:56:44.000000000 +0200
@@ -15,6 +15,7 @@
# limitations under the License.
import base64
+import io
import unittest
from unittest.mock import patch, MagicMock
import calendar
@@ -242,6 +243,21 @@
self.assertEqual([(b's3api.test-metric:1|c', ('1.2.3.4', 8125))],
client.sendto_calls)
+ def test_init_logs_checksum_implementation(self):
+ with mock.patch('swift.common.middleware.s3api.s3api.get_logger',
+ return_value=self.logger), \
+ mock.patch('swift.common.utils.checksum.crc32c_isal') \
+ as mock_crc32c, \
+ mock.patch('swift.common.utils.checksum.crc64nvme_isal') \
+ as mock_crc64nvme:
+ mock_crc32c.__name__ = 'crc32c_isal'
+ mock_crc64nvme.__name__ = 'crc64nvme_isal'
+ S3ApiMiddleware(None, {})
+ self.assertEqual(
+ {'info': ['Using crc32c_isal implementation for CRC32C.',
+ 'Using crc64nvme_isal implementation for CRC64NVME.']},
+ self.logger.all_log_lines())
+
def test_non_s3_request_passthrough(self):
req = Request.blank('/something')
status, headers, body = self.call_s3api(req)
@@ -319,6 +335,7 @@
'PATH_INFO': path,
'QUERY_STRING': query_string,
'HTTP_AUTHORIZATION': 'AWS X:Y:Z',
+ 'wsgi.input': io.BytesIO(),
}
for header, value in headers.items():
header = 'HTTP_' + header.replace('-', '_').upper()
@@ -706,7 +723,7 @@
date_header = self.get_date_header()
req.headers['Date'] = date_header
with mock.patch('swift.common.middleware.s3api.s3request.'
- 'S3Request.check_signature') as mock_cs:
+ 'SigCheckerV2.check_signature') as mock_cs:
status, headers, body = self.call_s3api(req)
self.assertIn('swift.backend_path', req.environ)
self.assertEqual(
@@ -737,7 +754,7 @@
date_header = self.get_date_header()
req.headers['Date'] = date_header
with mock.patch('swift.common.middleware.s3api.s3request.'
- 'S3Request.check_signature') as mock_cs:
+ 'SigCheckerV2.check_signature') as mock_cs:
status, headers, body = self.call_s3api(req)
self.assertIn('swift.backend_path', req.environ)
self.assertEqual(
@@ -919,23 +936,6 @@
def test_website_redirect_location(self):
self._test_unsupported_header('x-amz-website-redirect-location')
- def test_aws_chunked(self):
- self._test_unsupported_header('content-encoding', 'aws-chunked')
- # https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-streaming.html
- # has a multi-encoding example:
- #
- # > Amazon S3 supports multiple content encodings. For example:
- # >
- # > Content-Encoding : aws-chunked,gzip
- # > That is, you can specify your custom content-encoding when using
- # > Signature Version 4 streaming API.
- self._test_unsupported_header('Content-Encoding', 'aws-chunked,gzip')
- # Some clients skip the content-encoding,
- # such as minio-go and aws-sdk-java
- self._test_unsupported_header('x-amz-content-sha256',
- 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD')
- self._test_unsupported_header('x-amz-decoded-content-length')
-
def test_object_tagging(self):
self._test_unsupported_header('x-amz-tagging')
@@ -1279,6 +1279,7 @@
'Credential=X:Y/20110909/us-east-1/s3/aws4_request, '
'SignedHeaders=content-md5;content-type;date, '
'Signature=x',
+ 'wsgi.input': io.BytesIO(),
}
fake_time = calendar.timegm((2011, 9, 9, 23, 36, 0))
env.update(environ)
@@ -1300,9 +1301,9 @@
patch.object(swift.common.middleware.s3api.s3request,
'SERVICE', 'host'):
req = _get_req(path, environ)
- hash_in_sts = req._string_to_sign().split(b'\n')[3]
+ hash_in_sts = req.sig_checker._string_to_sign().split(b'\n')[3]
self.assertEqual(hash_val, hash_in_sts.decode('ascii'))
- self.assertTrue(req.check_signature(
+ self.assertTrue(req.sig_checker.check_signature(
'wJalrXUtnFEMI/K7MDENG+bPxRfiCYEXAMPLEKEY'))
# all next data got from aws4_testsuite from Amazon
diff -Nru swift-2.35.0/test/unit/common/middleware/s3api/test_s3request.py swift-2.35.1/test/unit/common/middleware/s3api/test_s3request.py
--- swift-2.35.0/test/unit/common/middleware/s3api/test_s3request.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/unit/common/middleware/s3api/test_s3request.py 2025-08-22 17:56:44.000000000 +0200
@@ -12,32 +12,37 @@
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-
+import base64
+import io
from datetime import timedelta
import hashlib
from unittest.mock import patch, MagicMock
import unittest
+import unittest.mock as mock
from io import BytesIO
from swift.common import swob
-from swift.common.middleware.s3api import s3response, controllers
+from swift.common.middleware.s3api import s3request, s3response, controllers
+from swift.common.middleware.s3api.exception import S3InputChecksumMismatch
from swift.common.swob import Request, HTTPNoContent
from swift.common.middleware.s3api.utils import mktime, Config
from swift.common.middleware.s3api.acl_handlers import get_acl_handler
from swift.common.middleware.s3api.subresource import ACL, User, Owner, \
Grant, encode_acl
-from test.unit.common.middleware.s3api.test_s3api import S3ApiTestCase
from swift.common.middleware.s3api.s3request import S3Request, \
S3AclRequest, SigV4Request, SIGV4_X_AMZ_DATE_FORMAT, HashingInput, \
- S3InputSHA256Mismatch
+ ChunkReader, StreamingInput, S3InputSHA256Mismatch, \
+ S3InputChunkSignatureMismatch, _get_checksum_hasher
from swift.common.middleware.s3api.s3response import InvalidArgument, \
NoSuchBucket, InternalError, ServiceUnavailable, \
AccessDenied, SignatureDoesNotMatch, RequestTimeTooSkewed, \
InvalidPartArgument, InvalidPartNumber, InvalidRequest, \
- XAmzContentSHA256Mismatch
-
+ XAmzContentSHA256Mismatch, S3NotImplemented
+from swift.common.utils import checksum
from test.debug_logger import debug_logger
+from test.unit import requires_crc32c, requires_crc64nvme
+from test.unit.common.middleware.s3api.test_s3api import S3ApiTestCase
Fake_ACL_MAP = {
# HEAD Bucket
@@ -97,6 +102,7 @@
def setUp(self):
super(TestRequest, self).setUp()
self.s3api.conf.s3_acl = True
+ s3request.SIGV4_CHUNK_MIN_SIZE = 2
@patch('swift.common.middleware.s3api.acl_handlers.ACL_MAP', Fake_ACL_MAP)
@patch('swift.common.middleware.s3api.s3request.S3AclRequest.authenticate',
@@ -791,8 +797,8 @@
b'Tue, 27 Mar 2007 19:36:42 +0000',
b'/johnsmith/photos/puppy.jpg',
])
- self.assertEqual(expected_sts, sigv2_req._string_to_sign())
- self.assertTrue(sigv2_req.check_signature(secret))
+ self.assertEqual(expected_sts, sigv2_req.sig_checker.string_to_sign)
+ self.assertTrue(sigv2_req.sig_checker.check_signature(secret))
req = Request.blank('/photos/puppy.jpg', method='PUT', headers={
'Content-Type': 'image/jpeg',
@@ -811,8 +817,8 @@
b'Tue, 27 Mar 2007 21:15:45 +0000',
b'/johnsmith/photos/puppy.jpg',
])
- self.assertEqual(expected_sts, sigv2_req._string_to_sign())
- self.assertTrue(sigv2_req.check_signature(secret))
+ self.assertEqual(expected_sts, sigv2_req.sig_checker.string_to_sign)
+ self.assertTrue(sigv2_req.sig_checker.check_signature(secret))
req = Request.blank(
'/?prefix=photos&max-keys=50&marker=puppy',
@@ -832,12 +838,12 @@
b'Tue, 27 Mar 2007 19:42:41 +0000',
b'/johnsmith/',
])
- self.assertEqual(expected_sts, sigv2_req._string_to_sign())
- self.assertTrue(sigv2_req.check_signature(secret))
+ self.assertEqual(expected_sts, sigv2_req.sig_checker.string_to_sign)
+ self.assertTrue(sigv2_req.sig_checker.check_signature(secret))
with patch('swift.common.middleware.s3api.s3request.streq_const_time',
return_value=True) as mock_eq:
- self.assertTrue(sigv2_req.check_signature(secret))
+ self.assertTrue(sigv2_req.sig_checker.check_signature(secret))
mock_eq.assert_called_once()
def test_check_signature_sigv2(self):
@@ -861,7 +867,7 @@
'storage_domains': ['s3.amazonaws.com']}))
# This is a failure case with utf-8 non-ascii multi-bytes charactor
# but we expect to return just False instead of exceptions
- self.assertFalse(sigv2_req.check_signature(
+ self.assertFalse(sigv2_req.sig_checker.check_signature(
u'\u30c9\u30e9\u30b4\u30f3'))
# Test v4 check_signature with multi bytes invalid secret
@@ -877,12 +883,12 @@
})
sigv4_req = SigV4Request(
req.environ, Config({'storage_domains': ['s3.amazonaws.com']}))
- self.assertFalse(sigv4_req.check_signature(
+ self.assertFalse(sigv4_req.sig_checker.check_signature(
u'\u30c9\u30e9\u30b4\u30f3'))
with patch('swift.common.middleware.s3api.s3request.streq_const_time',
return_value=False) as mock_eq:
- self.assertFalse(sigv4_req.check_signature(
+ self.assertFalse(sigv4_req.sig_checker.check_signature(
u'\u30c9\u30e9\u30b4\u30f3'))
mock_eq.assert_called_once()
@@ -908,7 +914,7 @@
sigv4_req = SigV4Request(req.environ)
self.assertTrue(
sigv4_req._canonical_request().endswith(b'UNSIGNED-PAYLOAD'))
- self.assertTrue(sigv4_req.check_signature('secret'))
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
@patch.object(S3Request, '_validate_dates', lambda *a: None)
def test_check_signature_sigv4_url_encode(self):
@@ -935,7 +941,7 @@
canonical_req = sigv4_req._canonical_request()
self.assertIn(b'PUT\n/test/~/file%2C1_1%3A1-1\n', canonical_req)
self.assertTrue(canonical_req.endswith(b'UNSIGNED-PAYLOAD'))
- self.assertTrue(sigv4_req.check_signature('secret'))
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
@patch.object(S3Request, '_validate_dates', lambda *a: None)
def test_check_sigv4_req_zero_content_length_sha256(self):
@@ -979,7 +985,7 @@
sigv4_req = SigV4Request(req.environ)
self.assertTrue(
sigv4_req._canonical_request().endswith(sha256_of_nothing))
- self.assertTrue(sigv4_req.check_signature('secret'))
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
# uppercase sha256 -- signature changes, but content's valid
headers = {
@@ -998,7 +1004,7 @@
sigv4_req = SigV4Request(req.environ)
self.assertTrue(
sigv4_req._canonical_request().endswith(sha256_of_nothing.upper()))
- self.assertTrue(sigv4_req.check_signature('secret'))
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
@patch.object(S3Request, '_validate_dates', lambda *a: None)
def test_v4_req_xmz_content_sha256_mismatch(self):
@@ -1039,7 +1045,7 @@
caught.exception.body)
@patch.object(S3Request, '_validate_dates', lambda *a: None)
- def test_v4_req_xmz_content_sha256_missing(self):
+ def test_v4_req_amz_content_sha256_missing(self):
# Virtual hosted-style
self.s3api.conf.storage_domains = ['s3.test.com']
environ = {
@@ -1183,6 +1189,886 @@
self.assertIn(b'Cannot specify both Range header and partNumber query '
b'parameter', cm.exception.body)
+ @mock.patch('swift.common.middleware.s3api.subresource.ACL.check_owner')
+ def test_sigv2_content_sha256_ok(self, mock_check_owner):
+ good_sha_256 = hashlib.sha256(b'body').hexdigest()
+ req = Request.blank('/bucket/object',
+ method='PUT',
+ body=b'body',
+ headers={'content-encoding': 'aws-chunked',
+ 'x-amz-content-sha256': good_sha_256,
+ 'Content-Length': '4',
+ 'Authorization': 'AWS test:tester:hmac',
+ 'Date': self.get_date_header()})
+
+ status, headers, body = self.call_s3api(req)
+ self.assertEqual(status, '200 OK')
+
+ @mock.patch('swift.common.middleware.s3api.subresource.ACL.check_owner')
+ def test_sigv2_content_sha256_bad_value(self, mock_check_owner):
+ good_sha_256 = hashlib.sha256(b'body').hexdigest()
+ bad_sha_256 = hashlib.sha256(b'not body').hexdigest()
+ req = Request.blank('/bucket/object',
+ method='PUT',
+ body=b'body',
+ headers={'content-encoding': 'aws-chunked',
+ 'x-amz-content-sha256':
+ bad_sha_256,
+ 'Content-Length': '4',
+ 'Authorization': 'AWS test:tester:hmac',
+ 'Date': self.get_date_header()})
+
+ status, headers, body = self.call_s3api(req)
+ self.assertEqual(status, '400 Bad Request')
+ self.assertIn(f'<ClientComputedContentSHA256>{bad_sha_256}'
+ '</ClientComputedContentSHA256>',
+ body.decode('utf8'))
+ self.assertIn(f'<S3ComputedContentSHA256>{good_sha_256}'
+ '</S3ComputedContentSHA256>',
+ body.decode('utf8'))
+
+ @mock.patch('swift.common.middleware.s3api.subresource.ACL.check_owner')
+ def test_sigv2_content_encoding_aws_chunked_is_ignored(
+ self, mock_check_owner):
+ req = Request.blank('/bucket/object',
+ method='PUT',
+ headers={'content-encoding': 'aws-chunked',
+ 'Authorization': 'AWS test:tester:hmac',
+ 'Date': self.get_date_header()})
+
+ status, _, body = self.call_s3api(req)
+ self.assertEqual(status, '200 OK')
+
+ def test_sigv2_content_sha256_streaming_is_bad_request(self):
+ def do_test(sha256):
+ req = Request.blank(
+ '/bucket/object',
+ method='PUT',
+ headers={'content-encoding': 'aws-chunked',
+ 'x-amz-content-sha256': sha256,
+ 'Content-Length': '0',
+ 'x-amz-decoded-content-length': '0',
+ 'Authorization': 'AWS test:tester:hmac',
+ 'Date': self.get_date_header()})
+ status, _, body = self.call_s3api(req)
+ # sig v2 wants that to actually be the SHA!
+ self.assertEqual(status, '400 Bad Request', body)
+ self.assertEqual(self._get_error_code(body),
+ 'XAmzContentSHA256Mismatch')
+ self.assertIn(f'<ClientComputedContentSHA256>{sha256}'
+ '</ClientComputedContentSHA256>',
+ body.decode('utf8'))
+
+ do_test('STREAMING-UNSIGNED-PAYLOAD-TRAILER')
+ do_test('STREAMING-AWS4-HMAC-SHA256-PAYLOAD')
+ do_test('STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER')
+ do_test('STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD')
+ do_test('STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD-TRAILER')
+
+ def test_sigv2_content_sha256_streaming_no_decoded_content_length(self):
+ # MissingContentLength trumps XAmzContentSHA256Mismatch
+ def do_test(sha256):
+ req = Request.blank(
+ '/bucket/object',
+ method='PUT',
+ headers={'content-encoding': 'aws-chunked',
+ 'x-amz-content-sha256': sha256,
+ 'Content-Length': '0',
+ 'Authorization': 'AWS test:tester:hmac',
+ 'Date': self.get_date_header()})
+ status, _, body = self.call_s3api(req)
+ self.assertEqual(status, '411 Length Required', body)
+ self.assertEqual(self._get_error_code(body),
+ 'MissingContentLength')
+
+ do_test('STREAMING-UNSIGNED-PAYLOAD-TRAILER')
+ do_test('STREAMING-AWS4-HMAC-SHA256-PAYLOAD')
+ do_test('STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER')
+ do_test('STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD')
+ do_test('STREAMING-AWS4-ECDSA-P256-SHA256-PAYLOAD-TRAILER')
+
+ def _make_sig_v4_unsigned_payload_req(self, body=None, extra_headers=None):
+ environ = {
+ 'HTTP_HOST': 's3.test.com',
+ 'REQUEST_METHOD': 'PUT',
+ 'RAW_PATH_INFO': '/test/file'}
+ headers = {
+ 'Authorization':
+ 'AWS4-HMAC-SHA256 '
+ 'Credential=test/20220330/us-east-1/s3/aws4_request,'
+ 'SignedHeaders=content-length;host;x-amz-content-sha256;'
+ 'x-amz-date,'
+ 'Signature=d14bba0da2bba545c8275cb75c99b326cbdfdad015465dbaeca'
+ 'e18c7647c73da',
+ 'Content-Length': '27',
+ 'Host': 's3.test.com',
+ 'X-Amz-Content-SHA256': 'UNSIGNED-PAYLOAD',
+ 'X-Amz-Date': '20220330T095351Z',
+ }
+ if extra_headers:
+ headers.update(extra_headers)
+ return Request.blank(environ['RAW_PATH_INFO'], environ=environ,
+ headers=headers, body=body)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def _test_sig_v4_unsigned_payload(self, body=None, extra_headers=None):
+ req = self._make_sig_v4_unsigned_payload_req(
+ body=body, extra_headers=extra_headers)
+ sigv4_req = SigV4Request(req.environ)
+ # Verify header signature
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
+ return sigv4_req
+
+ def test_sig_v4_unsgnd_pyld_no_crc_ok(self):
+ body = b'abcdefghijklmnopqrstuvwxyz\n'
+ sigv4_req = self._test_sig_v4_unsigned_payload(body=body)
+ resp_body = sigv4_req.environ['wsgi.input'].read()
+ self.assertEqual(body, resp_body)
+
+ def test_sig_v4_unsgnd_pyld_crc32_ok(self):
+ body = b'abcdefghijklmnopqrstuvwxyz\n'
+ crc = base64.b64encode(checksum.crc32(body).digest())
+ sigv4_req = self._test_sig_v4_unsigned_payload(
+ body=body,
+ extra_headers={'X-Amz-Checksum-Crc32': crc}
+ )
+ resp_body = sigv4_req.environ['wsgi.input'].read()
+ self.assertEqual(body, resp_body)
+
+ def test_sig_v4_unsgnd_pyld_crc32_mismatch(self):
+ body = b'abcdefghijklmnopqrstuvwxyz\n'
+ crc = base64.b64encode(checksum.crc32(b'not the body').digest())
+ sigv4_req = self._test_sig_v4_unsigned_payload(
+ body=body,
+ extra_headers={'X-Amz-Checksum-Crc32': crc}
+ )
+ with self.assertRaises(S3InputChecksumMismatch):
+ sigv4_req.environ['wsgi.input'].read()
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_unsgnd_pyld_crc32_invalid(self):
+ req = self._make_sig_v4_unsigned_payload_req(
+ extra_headers={'X-Amz-Checksum-Crc32': 'not a crc'}
+ )
+ with self.assertRaises(s3request.InvalidRequest):
+ SigV4Request(req.environ)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_unsgnd_pyld_declares_crc32_trailer(self):
+ req = self._make_sig_v4_unsigned_payload_req(
+ extra_headers={'X-Amz-Trailer': 'x-amz-checksum-crc32'})
+ with self.assertRaises(s3request.MalformedTrailerError):
+ SigV4Request(req.environ)
+
+ def _make_valid_v4_streaming_hmac_sha256_payload_request(self):
+ environ = {
+ 'HTTP_HOST': 's3.test.com',
+ 'REQUEST_METHOD': 'PUT',
+ 'RAW_PATH_INFO': '/test/file'}
+ headers = {
+ 'Authorization':
+ 'AWS4-HMAC-SHA256 '
+ 'Credential=test/20220330/us-east-1/s3/aws4_request,'
+ 'SignedHeaders=content-encoding;content-length;host;x-amz-con'
+ 'tent-sha256;x-amz-date;x-amz-decoded-content-length,'
+ 'Signature=aa1b67fc5bc4503d05a636e6e740dcb757d3aa2352f32e7493f'
+ '261f71acbe1d5',
+ 'Content-Encoding': 'aws-chunked',
+ 'Content-Length': '369',
+ 'Host': 's3.test.com',
+ 'X-Amz-Content-SHA256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'X-Amz-Date': '20220330T095351Z',
+ 'X-Amz-Decoded-Content-Length': '25'}
+ body = 'a;chunk-signature=4a397f01db2cd700402dc38931b462e789ae49911d' \
+ 'c229d93c9f9c46fd3e0b21\r\nabcdefghij\r\n' \
+ 'a;chunk-signature=49177768ee3e9b77c6353ab9f3b9747d188adc11d4' \
+ '5b38be94a130616e6d64dc\r\nklmnopqrst\r\n' \
+ '5;chunk-signature=c884ebbca35b923cf864854e2a906aa8f5895a7140' \
+ '6c73cc6d4ee057527a8c23\r\nuvwz\n\r\n' \
+ '0;chunk-signature=50f7c470d6bf6c59126eecc2cb020d532a69c92322' \
+ 'ddfbbd21811de45491022c\r\n\r\n'
+
+ req = Request.blank(environ['RAW_PATH_INFO'], environ=environ,
+ headers=headers, body=body.encode('utf8'))
+ return SigV4Request(req.environ)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_check_signature_v4_hmac_sha256_payload_chunk_valid(self):
+ s3req = self._make_valid_v4_streaming_hmac_sha256_payload_request()
+ # Verify header signature
+ self.assertTrue(s3req.sig_checker.check_signature('secret'))
+
+ self.assertEqual(b'abcdefghij', s3req.environ['wsgi.input'].read(10))
+ self.assertEqual(b'klmnopqrst', s3req.environ['wsgi.input'].read(10))
+ self.assertEqual(b'uvwz\n', s3req.environ['wsgi.input'].read(10))
+ self.assertEqual(b'', s3req.environ['wsgi.input'].read(10))
+ self.assertTrue(s3req.sig_checker._all_chunk_signatures_valid)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_check_signature_v4_hmac_sha256_payload_no_secret(self):
+ # verify S3InputError if auth middleware does NOT call check_signature
+ # before the stream is read
+ s3req = self._make_valid_v4_streaming_hmac_sha256_payload_request()
+ with self.assertRaises(s3request.S3InputMissingSecret) as cm:
+ s3req.environ['wsgi.input'].read(10)
+
+ # ...which in context gets translated to a 501 response
+ s3req = self._make_valid_v4_streaming_hmac_sha256_payload_request()
+ with self.assertRaises(s3response.S3NotImplemented) as cm, \
+ s3req.translate_read_errors():
+ s3req.environ['wsgi.input'].read(10)
+ self.assertIn(
+ 'Transferring payloads in multiple chunks using aws-chunked is '
+ 'not supported.', str(cm.exception.body))
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_check_signature_v4_hmac_sha256_payload_chunk_invalid(self):
+ environ = {
+ 'HTTP_HOST': 's3.test.com',
+ 'REQUEST_METHOD': 'PUT',
+ 'RAW_PATH_INFO': '/test/file'}
+ headers = {
+ 'Authorization':
+ 'AWS4-HMAC-SHA256 '
+ 'Credential=test/20220330/us-east-1/s3/aws4_request,'
+ 'SignedHeaders=content-encoding;content-length;host;x-amz-con'
+ 'tent-sha256;x-amz-date;x-amz-decoded-content-length,'
+ 'Signature=aa1b67fc5bc4503d05a636e6e740dcb757d3aa2352f32e7493f'
+ '261f71acbe1d5',
+ 'Content-Encoding': 'aws-chunked',
+ 'Content-Length': '369',
+ 'Host': 's3.test.com',
+ 'X-Amz-Content-SHA256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'X-Amz-Date': '20220330T095351Z',
+ 'X-Amz-Decoded-Content-Length': '25'}
+ # second chunk signature is incorrect, should be
+ # 49177768ee3e9b77c6353ab9f3b9747d188adc11d45b38be94a130616e6d64dc
+ body = 'a;chunk-signature=4a397f01db2cd700402dc38931b462e789ae49911d' \
+ 'c229d93c9f9c46fd3e0b21\r\nabcdefghij\r\n' \
+ 'a;chunk-signature=49177768ee3e9b77c6353ab0f3b9747d188adc11d4' \
+ '5b38be94a130616e6d64dc\r\nklmnopqrst\r\n' \
+ '5;chunk-signature=c884ebbca35b923cf864854e2a906aa8f5895a7140' \
+ '6c73cc6d4ee057527a8c23\r\nuvwz\n\r\n' \
+ '0;chunk-signature=50f7c470d6bf6c59126eecc2cb020d532a69c92322' \
+ 'ddfbbd21811de45491022c\r\n\r\n'
+
+ req = Request.blank(environ['RAW_PATH_INFO'], environ=environ,
+ headers=headers, body=body.encode('utf8'))
+ sigv4_req = SigV4Request(req.environ)
+ # Verify header signature
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
+
+ self.assertEqual(b'abcdefghij', req.environ['wsgi.input'].read(10))
+ with self.assertRaises(s3request.S3InputChunkSignatureMismatch):
+ req.environ['wsgi.input'].read(10)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_check_signature_v4_hmac_sha256_payload_chunk_wrong_size(self):
+ environ = {
+ 'HTTP_HOST': 's3.test.com',
+ 'REQUEST_METHOD': 'PUT',
+ 'RAW_PATH_INFO': '/test/file'}
+ headers = {
+ 'Authorization':
+ 'AWS4-HMAC-SHA256 '
+ 'Credential=test/20220330/us-east-1/s3/aws4_request,'
+ 'SignedHeaders=content-encoding;content-length;host;x-amz-con'
+ 'tent-sha256;x-amz-date;x-amz-decoded-content-length,'
+ 'Signature=aa1b67fc5bc4503d05a636e6e740dcb757d3aa2352f32e7493f'
+ '261f71acbe1d5',
+ 'Content-Encoding': 'aws-chunked',
+ 'Content-Length': '369',
+ 'Host': 's3.test.com',
+ 'X-Amz-Content-SHA256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'X-Amz-Date': '20220330T095351Z',
+ 'X-Amz-Decoded-Content-Length': '25'}
+ # 2nd chunk contains an incorrect chunk size (9 should be a)...
+ body = 'a;chunk-signature=4a397f01db2cd700402dc38931b462e789ae49911d' \
+ 'c229d93c9f9c46fd3e0b21\r\nabcdefghij\r\n' \
+ '9;chunk-signature=49177768ee3e9b77c6353ab9f3b9747d188adc11d4' \
+ '5b38be94a130616e6d64dc\r\nklmnopqrst\r\n' \
+ '5;chunk-signature=c884ebbca35b923cf864854e2a906aa8f5895a7140' \
+ '6c73cc6d4ee057527a8c23\r\nuvwz\n\r\n' \
+ '0;chunk-signature=50f7c470d6bf6c59126eecc2cb020d532a69c92322' \
+ 'ddfbbd21811de45491022c\r\n\r\n'
+
+ req = Request.blank(environ['RAW_PATH_INFO'], environ=environ,
+ headers=headers, body=body.encode('utf8'))
+ sigv4_req = SigV4Request(req.environ)
+ # Verify header signature
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
+
+ self.assertEqual(b'abcdefghij', req.environ['wsgi.input'].read(10))
+ with self.assertRaises(s3request.S3InputChunkSignatureMismatch):
+ req.environ['wsgi.input'].read(10)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_check_signature_v4_hmac_sha256_payload_chunk_no_last_chunk(self):
+ environ = {
+ 'HTTP_HOST': 's3.test.com',
+ 'REQUEST_METHOD': 'PUT',
+ 'RAW_PATH_INFO': '/test/file'}
+ headers = {
+ 'Authorization':
+ 'AWS4-HMAC-SHA256 '
+ 'Credential=test/20220330/us-east-1/s3/aws4_request,'
+ 'SignedHeaders=content-encoding;content-length;host;x-amz-con'
+ 'tent-sha256;x-amz-date;x-amz-decoded-content-length,'
+ 'Signature=99759fb2823febb695950e6b75a7a1396b164742da9d204f71f'
+ 'db3a3a52216aa',
+ 'Content-Encoding': 'aws-chunked',
+ 'Content-Length': '283',
+ 'Host': 's3.test.com',
+ 'X-Amz-Content-SHA256': 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD',
+ 'X-Amz-Date': '20220330T095351Z',
+ 'X-Amz-Decoded-Content-Length': '25'}
+ body = 'a;chunk-signature=9c35d0203ce923cb7837b5e4a2984f2c107b05ac45' \
+ '80bafce7541c4b142b9712\r\nabcdefghij\r\n' \
+ 'a;chunk-signature=f514382beed5f287a5181b8293399fe006fd9398ee' \
+ '4b8aed910238092a4d5ec7\r\nklmnopqrst\r\n' \
+ '5;chunk-signature=ed6a54f035b920e7daa378ab2d255518c082573c98' \
+ '60127c80d43697375324f4\r\nuvwz\n\r\n'
+
+ req = Request.blank(environ['RAW_PATH_INFO'], environ=environ,
+ headers=headers, body=body.encode('utf8'))
+ sigv4_req = SigV4Request(req.environ)
+ # Verify header signature
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
+ self.assertEqual(b'abcdefghij', req.environ['wsgi.input'].read(10))
+ self.assertEqual(b'klmnopqrst', req.environ['wsgi.input'].read(10))
+ with self.assertRaises(s3request.S3InputIncomplete):
+ req.environ['wsgi.input'].read(5)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def _test_sig_v4_streaming_aws_hmac_sha256_payload_trailer(
+ self, body):
+ environ = {
+ 'HTTP_HOST': 's3.test.com',
+ 'REQUEST_METHOD': 'PUT',
+ 'RAW_PATH_INFO': '/test/file'}
+ headers = {
+ 'Authorization':
+ 'AWS4-HMAC-SHA256 '
+ 'Credential=test/20220330/us-east-1/s3/aws4_request,'
+ 'SignedHeaders=content-encoding;content-length;host;x-amz-con'
+ 'tent-sha256;x-amz-date;x-amz-decoded-content-length,'
+ 'Signature=bee7ad4f1a4f16c22f3b24155ab749b2aca0773065ccf08bc41'
+ 'a1e8e84748311',
+ 'Content-Encoding': 'aws-chunked',
+ 'Content-Length': '369',
+ 'Host': 's3.test.com',
+ 'X-Amz-Content-SHA256':
+ 'STREAMING-AWS4-HMAC-SHA256-PAYLOAD-TRAILER',
+ 'X-Amz-Date': '20220330T095351Z',
+ 'X-Amz-Decoded-Content-Length': '27',
+ 'X-Amz-Trailer': 'x-amz-checksum-sha256',
+ }
+ req = Request.blank(environ['RAW_PATH_INFO'], environ=environ,
+ headers=headers, body=body.encode('utf8'))
+ sigv4_req = SigV4Request(req.environ)
+ # Verify header signature
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
+ return sigv4_req
+
+ def test_check_sig_v4_streaming_aws_hmac_sha256_payload_trailer_ok(self):
+ body = 'a;chunk-signature=c9dd07703599d3d0bd51c96193110756d4f7091d5a' \
+ '4408314a53a802e635b1ad\r\nabcdefghij\r\n' \
+ 'a;chunk-signature=662dc18fb1a3ddad6abc2ce9ebb0748bedacd219eb' \
+ '223a5e80721c2637d30240\r\nklmnopqrst\r\n' \
+ '7;chunk-signature=b63f141c2012de9ac60b961795ef31ad3202b125aa' \
+ '873b4142cf9d815360abc0\r\nuvwxyz\n\r\n' \
+ '0;chunk-signature=b1ff1f86dccfbe9bcc80011e2b87b72e43e0c7f543' \
+ 'bb93612c06f9808ccb772e\r\n' \
+ 'x-amz-checksum-sha256:EBCn52FhCYCsWRNZyHH3JN4VDyNEDrtZWaxMBy' \
+ 'TJHZE=\r\n' \
+ 'x-amz-trailer-signature:1212d72cb487bf08ed25d1329dc93f65fde0' \
+ 'dcb21739a48f3182c86cfe79737b\r\n'
+ req = self._test_sig_v4_streaming_aws_hmac_sha256_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ req.environ['wsgi.input'].read())
+
+ def test_check_sig_v4_streaming_aws_hmac_sha256_missing_trailer_sig(self):
+ body = 'a;chunk-signature=c9dd07703599d3d0bd51c96193110756d4f7091d5a' \
+ '4408314a53a802e635b1ad\r\nabcdefghij\r\n' \
+ 'a;chunk-signature=662dc18fb1a3ddad6abc2ce9ebb0748bedacd219eb' \
+ '223a5e80721c2637d30240\r\nklmnopqrst\r\n' \
+ '7;chunk-signature=b63f141c2012de9ac60b961795ef31ad3202b125aa' \
+ '873b4142cf9d815360abc0\r\nuvwxyz\n\r\n' \
+ '0;chunk-signature=b1ff1f86dccfbe9bcc80011e2b87b72e43e0c7f543' \
+ 'bb93612c06f9808ccb772e\r\n' \
+ 'x-amz-checksum-sha256:foo\r\n'
+ req = self._test_sig_v4_streaming_aws_hmac_sha256_payload_trailer(body)
+ with self.assertRaises(s3request.S3InputIncomplete):
+ req.environ['wsgi.input'].read()
+
+ def test_check_sig_v4_streaming_aws_hmac_sha256_payload_trailer_bad(self):
+ body = 'a;chunk-signature=c9dd07703599d3d0bd51c96193110756d4f7091d5a' \
+ '4408314a53a802e635b1ad\r\nabcdefghij\r\n' \
+ 'a;chunk-signature=000000000000000000000000000000000000000000' \
+ '0000000000000000000000\r\nklmnopqrst\r\n' \
+ '7;chunk-signature=b63f141c2012de9ac60b961795ef31ad3202b125aa' \
+ '873b4142cf9d815360abc0\r\nuvwxyz\n\r\n' \
+ '0;chunk-signature=b1ff1f86dccfbe9bcc80011e2b87b72e43e0c7f543' \
+ 'bb93612c06f9808ccb772e\r\n' \
+ 'x-amz-checksum-sha256:foo\r\n'
+ req = self._test_sig_v4_streaming_aws_hmac_sha256_payload_trailer(body)
+ self.assertEqual(b'abcdefghij', req.environ['wsgi.input'].read(10))
+ with self.assertRaises(s3request.S3InputChunkSignatureMismatch):
+ req.environ['wsgi.input'].read(10)
+
+ def _make_sig_v4_streaming_unsigned_payload_trailer_req(
+ self, body=None, wsgi_input=None, extra_headers=None):
+ environ = {
+ 'HTTP_HOST': 's3.test.com',
+ 'REQUEST_METHOD': 'PUT',
+ 'RAW_PATH_INFO': '/test/file'}
+ if body:
+ body = body.encode('utf8')
+ elif wsgi_input:
+ environ['wsgi.input'] = wsgi_input
+ headers = {
+ 'Authorization':
+ 'AWS4-HMAC-SHA256 '
+ 'Credential=test/20220330/us-east-1/s3/aws4_request,'
+ 'SignedHeaders=content-encoding;content-length;host;x-amz-con'
+ 'tent-sha256;x-amz-date;x-amz-decoded-content-length,'
+ 'Signature=43727fcfa7765e97cd3cbfc112fed5fedc31e2b7930588ddbca'
+ '3feaa1205a7f2',
+ 'Content-Encoding': 'aws-chunked',
+ 'Content-Length': '369',
+ 'Host': 's3.test.com',
+ 'X-Amz-Content-SHA256': 'STREAMING-UNSIGNED-PAYLOAD-TRAILER',
+ 'X-Amz-Date': '20220330T095351Z',
+ 'X-Amz-Decoded-Content-Length': '27',
+ }
+ if extra_headers:
+ headers.update(extra_headers)
+ return Request.blank(environ['RAW_PATH_INFO'], environ=environ,
+ headers=headers, body=body)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def _test_sig_v4_streaming_unsigned_payload_trailer(
+ self, body=None, x_amz_trailer='x-amz-checksum-sha256'):
+ if x_amz_trailer is None:
+ headers = {}
+ else:
+ headers = {'X-Amz-Trailer': x_amz_trailer}
+
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body, extra_headers=headers)
+ sigv4_req = SigV4Request(req.environ)
+ # Verify header signature
+ self.assertTrue(sigv4_req.sig_checker.check_signature('secret'))
+ return sigv4_req
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_ok(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256:EBCn52FhCYCsWRNZyHH3JN4VDyNEDrtZWaxMB' \
+ 'yTJHZE=\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ s3req.environ['wsgi.input'].read())
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_none_ok(self):
+ # verify it's ok to not send any trailer
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body, x_amz_trailer=None)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ s3req.environ['wsgi.input'].read())
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_undeclared(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256:undeclared\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body, x_amz_trailer=None)
+ self.assertEqual(b'abcdefghijklmnopqrst',
+ s3req.environ['wsgi.input'].read(20))
+ with self.assertRaises(s3request.S3InputIncomplete):
+ s3req.environ['wsgi.input'].read()
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_multiple(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256:undeclared\r\n'
+ with self.assertRaises(s3request.InvalidRequest):
+ self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body,
+ x_amz_trailer='x-amz-checksum-sha256,x-amz-checksum-crc32')
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_with_commas_invalid(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256:undeclared\r\n'
+ with self.assertRaises(s3request.InvalidRequest):
+ self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body,
+ x_amz_trailer=', x-amz-checksum-crc32, ,')
+ with self.assertRaises(s3request.InvalidRequest):
+ self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body,
+ x_amz_trailer=', x-amz-checksum-crc32')
+ with self.assertRaises(s3request.InvalidRequest):
+ self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body,
+ x_amz_trailer=',x-amz-checksum-crc32')
+ with self.assertRaises(s3request.InvalidRequest):
+ self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body,
+ x_amz_trailer='x-amz-checksum-crc32, ,')
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_with_commas_ok(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256:EBCn52FhCYCsWRNZyHH3JN4VDyNEDrtZWaxMB' \
+ 'yTJHZE=\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body, x_amz_trailer='x-amz-checksum-sha256, ')
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ s3req.environ['wsgi.input'].read())
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body, x_amz_trailer='x-amz-checksum-sha256,,')
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ s3req.environ['wsgi.input'].read())
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_unrecognised(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ with self.assertRaises(s3request.InvalidRequest):
+ self._test_sig_v4_streaming_unsigned_payload_trailer(
+ body,
+ x_amz_trailer='x-amz-content-sha256')
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_mismatch(self):
+ # the unexpected footer is detected before the incomplete line
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-not-sha256:foo\r\n' \
+ 'x-'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrst',
+ s3req.environ['wsgi.input'].read(20))
+ # trailers are read with penultimate chunk??
+ with self.assertRaises(s3request.S3InputMalformedTrailer):
+ s3req.environ['wsgi.input'].read()
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_missing(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ '\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrst',
+ s3req.environ['wsgi.input'].read(20))
+ # trailers are read along with the penultimate chunk
+ with self.assertRaises(s3request.S3InputMalformedTrailer):
+ s3req.environ['wsgi.input'].read()
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_extra(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-crc32:foo\r\n' \
+ 'x-amz-checksum-sha32:foo\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrst',
+ s3req.environ['wsgi.input'].read(20))
+ # trailers are read along with the penultimate chunk
+ with self.assertRaises(s3request.S3InputMalformedTrailer):
+ s3req.environ['wsgi.input'].read()
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_duplicate(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256:foo\r\n' \
+ 'x-amz-checksum-sha256:EBCn52FhCYCsWRNZyHH3JN4VDyNEDrtZWaxMB' \
+ 'yTJHZE=\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrst',
+ s3req.environ['wsgi.input'].read(20))
+ # Reading the rest succeeds! AWS would complain about the checksum,
+ # but we aren't looking at it (yet)
+ s3req.environ['wsgi.input'].read()
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_short(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrst',
+ s3req.environ['wsgi.input'].read(20))
+ # trailers are read along with the penultimate chunk
+ with self.assertRaises(s3request.S3InputIncomplete):
+ s3req.environ['wsgi.input'].read()
+
+ def test_sig_v4_strm_unsgnd_pyld_trl_invalid(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n' \
+ 'x-amz-checksum-sha256: not=base-64\r\n'
+ s3req = self._test_sig_v4_streaming_unsigned_payload_trailer(body)
+ self.assertEqual(b'abcdefghijklmnopqrst',
+ s3req.environ['wsgi.input'].read(20))
+ with self.assertRaises(s3request.S3InputChecksumTrailerInvalid):
+ s3req.environ['wsgi.input'].read()
+
+ # ...which in context gets translated to a 400 response
+ with self.assertRaises(s3response.InvalidRequest) as cm, \
+ s3req.translate_read_errors():
+ s3req.environ['wsgi.input'].read()
+ self.assertIn(
+ 'Value for x-amz-checksum-sha256 trailing header is invalid.',
+ str(cm.exception.body))
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_sha256_ok(self):
+ # TODO: do we already have coverage for this?
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ headers = {
+ 'x-amz-checksum-sha256':
+ 'EBCn52FhCYCsWRNZyHH3JN4VDyNEDrtZWaxMByTJHZE=',
+ }
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers=headers
+ )
+ sigv4_req = SigV4Request(req.environ)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ sigv4_req.environ['wsgi.input'].read())
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_sha256_mismatch(self):
+ # TODO: do we already have coverage for this?
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ headers = {
+ 'x-amz-sdk-checksum-algorithm': 'sha256',
+ 'x-amz-checksum-sha256':
+ 'BADBADBADBADWRNZyHH3JN4VDyNEDrtZWaxMByTJHZE=',
+ }
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers=headers
+ )
+ sigv4_req = SigV4Request(req.environ)
+ with self.assertRaises(s3request.BadDigest) as cm, \
+ sigv4_req.translate_read_errors():
+ sigv4_req.environ['wsgi.input'].read()
+ self.assertIn('The SHA256 you specified did not match the calculated '
+ 'checksum.', str(cm.exception.body))
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_crc32_ok(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(
+ checksum.crc32(b'abcdefghijklmnopqrstuvwxyz\n').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-crc32': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ sigv4_req.environ['wsgi.input'].read())
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_crc32_mismatch(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(checksum.crc32(b'not-the-body').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-crc32': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ with self.assertRaises(S3InputChecksumMismatch):
+ sigv4_req.environ['wsgi.input'].read()
+
+ @requires_crc32c
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_crc32c_ok(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(
+ checksum.crc32c(b'abcdefghijklmnopqrstuvwxyz\n').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-crc32c': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ sigv4_req.environ['wsgi.input'].read())
+
+ @requires_crc32c
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_crc32c_mismatch(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(checksum.crc32c(b'not-the-body').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-crc32c': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ with self.assertRaises(S3InputChecksumMismatch):
+ sigv4_req.environ['wsgi.input'].read()
+
+ @requires_crc64nvme
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_crc64nvme_ok(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(
+ checksum.crc64nvme(b'abcdefghijklmnopqrstuvwxyz\n').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-crc64nvme': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ sigv4_req.environ['wsgi.input'].read())
+
+ @requires_crc64nvme
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_crc64nvme_invalid(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(checksum.crc64nvme(b'not-the-body').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-crc64nvme': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ with self.assertRaises(S3InputChecksumMismatch):
+ sigv4_req.environ['wsgi.input'].read()
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_sha1_ok(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(
+ hashlib.sha1(b'abcdefghijklmnopqrstuvwxyz\n').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-sha1': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ self.assertEqual(b'abcdefghijklmnopqrstuvwxyz\n',
+ sigv4_req.environ['wsgi.input'].read())
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_sha1_mismatch(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(hashlib.sha1(b'not-the-body').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-sha1': crc}
+ )
+ sigv4_req = SigV4Request(req.environ)
+ with self.assertRaises(S3InputChecksumMismatch):
+ sigv4_req.environ['wsgi.input'].read()
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_unsupported(self):
+ body = 'a\r\nabcdefghij\r\n' \
+ 'a\r\nklmnopqrst\r\n' \
+ '7\r\nuvwxyz\n\r\n' \
+ '0\r\n'
+ crc = base64.b64encode(
+ checksum.crc32c(b'abcdefghijklmnopqrstuvwxyz\n').digest())
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ body=body,
+ extra_headers={'x-amz-checksum-crc32c': crc}
+ )
+ with patch('swift.common.middleware.s3api.s3request.checksum.'
+ '_select_crc32c_impl', side_effect=NotImplementedError):
+ with self.assertRaises(S3NotImplemented):
+ SigV4Request(req.environ)
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_hdr_and_trailer(self):
+ wsgi_input = io.BytesIO(b'123')
+ self.assertEqual(0, wsgi_input.tell())
+ headers = {
+ 'x-amz-checksum-sha256':
+ 'EBCn52FhCYCsWRNZyHH3JN4VDyNEDrtZWaxMByTJHZE=',
+ 'x-amz-trailer': 'x-amz-checksum-sha256'
+ }
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ wsgi_input=wsgi_input,
+ extra_headers=headers
+ )
+ with self.assertRaises(InvalidRequest) as cm:
+ SigV4Request(req.environ)
+ self.assertIn('Expecting a single x-amz-checksum- header',
+ str(cm.exception.body))
+
+ @patch.object(S3Request, '_validate_dates', lambda *a: None)
+ def test_sig_v4_strm_unsgnd_pyld_trl_checksum_algo_mismatch(self):
+ wsgi_input = io.BytesIO(b'123')
+ self.assertEqual(0, wsgi_input.tell())
+ headers = {
+ 'x-amz-sdk-checksum-algorithm': 'crc32',
+ 'x-amz-checksum-sha256':
+ 'EBCn52FhCYCsWRNZyHH3JN4VDyNEDrtZWaxMByTJHZE=',
+ }
+ req = self._make_sig_v4_streaming_unsigned_payload_trailer_req(
+ wsgi_input=wsgi_input,
+ extra_headers=headers
+ )
+ with self.assertRaises(InvalidRequest) as cm:
+ SigV4Request(req.environ)
+ self.assertIn('Value for x-amz-sdk-checksum-algorithm header is '
+ 'invalid.', str(cm.exception.body))
+
class TestSigV4Request(S3ApiTestCase):
def setUp(self):
@@ -1551,5 +2437,432 @@
self.assertEqual(b'6789', wrapped.readline())
+class TestChunkReader(unittest.TestCase):
+ def test_read_sig_checker_ok(self):
+ raw = '123456789\r\n0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+
+ mock_validator = MagicMock(return_value=True)
+ bytes_input = BytesIO(raw)
+ reader = ChunkReader(
+ bytes_input, 9, mock_validator, 'chunk-signature=signature')
+ self.assertEqual(9, reader.to_read)
+ self.assertEqual(b'123456789', reader.read())
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'123456789').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+ self.assertFalse(bytes_input.closed)
+
+ mock_validator = MagicMock(return_value=True)
+ reader = ChunkReader(
+ BytesIO(raw), 9, mock_validator, 'chunk-signature=signature')
+ self.assertEqual(9, reader.to_read)
+ self.assertEqual(b'12345678', reader.read(8))
+ self.assertEqual(1, reader.to_read)
+ self.assertEqual(b'9', reader.read(8))
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'123456789').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+
+ mock_validator = MagicMock(return_value=True)
+ reader = ChunkReader(
+ BytesIO(raw), 9, mock_validator, 'chunk-signature=signature')
+ self.assertEqual(9, reader.to_read)
+ self.assertEqual(b'123456789', reader.read(10))
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'123456789').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+
+ mock_validator = MagicMock(return_value=True)
+ reader = ChunkReader(
+ BytesIO(raw), 9, mock_validator, 'chunk-signature=signature')
+ self.assertEqual(9, reader.to_read)
+ self.assertEqual(b'123456789', reader.read(-1))
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'123456789').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+
+ def test_read_sig_checker_bad(self):
+ raw = '123456789\r\n0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ mock_validator = MagicMock(return_value=False)
+ bytes_input = BytesIO(raw)
+ reader = ChunkReader(
+ bytes_input, 9, mock_validator, 'chunk-signature=signature')
+ reader.read(8)
+ self.assertEqual(1, reader.to_read)
+ with self.assertRaises(S3InputChunkSignatureMismatch):
+ reader.read(1)
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'123456789').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+ self.assertTrue(bytes_input.closed)
+
+ def test_read_no_sig_checker(self):
+ raw = '123456789\r\n0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ bytes_input = BytesIO(raw)
+ reader = ChunkReader(bytes_input, 9, None, None)
+ self.assertEqual(9, reader.to_read)
+ self.assertEqual(b'123456789', reader.read())
+ self.assertEqual(0, reader.to_read)
+ self.assertFalse(bytes_input.closed)
+
+ def test_readline_sig_checker_ok_newline_is_midway_through_chunk(self):
+ raw = '123456\n7\r\n0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ mock_validator = MagicMock(return_value=True)
+ bytes_input = BytesIO(raw)
+ reader = ChunkReader(
+ bytes_input, 8, mock_validator, 'chunk-signature=signature')
+ self.assertEqual(8, reader.to_read)
+ self.assertEqual(b'123456\n', reader.readline())
+ self.assertEqual(1, reader.to_read)
+ self.assertEqual(b'7', reader.readline())
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'123456\n7').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+ self.assertFalse(bytes_input.closed)
+
+ def test_readline_sig_checker_ok_newline_is_end_of_chunk(self):
+ raw = '1234567\n\r\n0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ mock_validator = MagicMock(return_value=True)
+ bytes_input = BytesIO(raw)
+ reader = ChunkReader(
+ bytes_input, 8, mock_validator, 'chunk-signature=signature')
+ self.assertEqual(8, reader.to_read)
+ self.assertEqual(b'1234567\n', reader.readline())
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'1234567\n').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+ self.assertFalse(bytes_input.closed)
+
+ def test_readline_sig_checker_ok_partial_line_read(self):
+ raw = '1234567\n\r\n0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ mock_validator = MagicMock(return_value=True)
+ bytes_input = BytesIO(raw)
+ reader = ChunkReader(
+ bytes_input, 8, mock_validator, 'chunk-signature=signature')
+ self.assertEqual(8, reader.to_read)
+ self.assertEqual(b'12345', reader.readline(5))
+ self.assertEqual(3, reader.to_read)
+ self.assertEqual(b'67', reader.readline(2))
+ self.assertEqual(1, reader.to_read)
+ self.assertEqual(b'\n', reader.readline())
+ self.assertEqual(0, reader.to_read)
+ self.assertEqual(
+ [mock.call(hashlib.sha256(b'1234567\n').hexdigest(), 'signature')],
+ mock_validator.call_args_list)
+ self.assertFalse(bytes_input.closed)
+
+
+class TestStreamingInput(S3ApiTestCase):
+ def setUp(self):
+ super(TestStreamingInput, self).setUp()
+ # Override chunk min size
+ s3request.SIGV4_CHUNK_MIN_SIZE = 2
+ self.fake_sig_checker = MagicMock()
+ self.fake_sig_checker.check_chunk_signature = \
+ lambda chunk, signature: signature == 'ok'
+
+ def test_read(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ self.assertEqual(b'123456789', wrapped.read())
+ self.assertFalse(wrapped._input.closed)
+ wrapped.close()
+ self.assertTrue(wrapped._input.closed)
+
+ def test_read_with_size(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ self.assertEqual(b'1234', wrapped.read(4))
+ self.assertEqual(b'56', wrapped.read(2))
+ # trying to read past the end gets us whatever's left
+ # (standard file-like read() semantics)
+ self.assertEqual(b'789', wrapped.read(4))
+ # can continue trying to read -- but it'll be empty
+ self.assertEqual(b'', wrapped.read(2))
+
+ self.assertFalse(wrapped._input.closed)
+ wrapped.close()
+ self.assertTrue(wrapped._input.closed)
+
+ def test_read_multiple_chunks(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '7;chunk-signature=ok\r\nabc\ndef\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 16, set(),
+ self.fake_sig_checker)
+ self.assertEqual(b'123456789abc\ndef', wrapped.read())
+ self.assertEqual(b'', wrapped.read(2))
+
+ def test_read_multiple_chunks_with_size(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '7;chunk-signature=ok\r\nabc\ndef\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 16, set(),
+ self.fake_sig_checker)
+ self.assertEqual(b'123456789a', wrapped.read(10))
+ self.assertEqual(b'bc\n', wrapped.read(3))
+ self.assertEqual(b'def', wrapped.read(4))
+ self.assertEqual(b'', wrapped.read(2))
+
+ def test_readline_newline_in_middle_and_at_end(self):
+ raw = 'a;chunk-signature=ok\r\n123456\n789\r\n' \
+ '4;chunk-signature=ok\r\nabc\n\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 14, set(),
+ self.fake_sig_checker)
+ self.assertEqual(b'123456\n', wrapped.readline())
+ self.assertEqual(b'789abc\n', wrapped.readline())
+ self.assertEqual(b'', wrapped.readline())
+
+ def test_readline_newline_in_middle_not_at_end(self):
+ raw = 'a;chunk-signature=ok\r\n123456\n789\r\n' \
+ '3;chunk-signature=ok\r\nabc\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 13, set(),
+ self.fake_sig_checker)
+ self.assertEqual(b'123456\n', wrapped.readline())
+ self.assertEqual(b'789abc', wrapped.readline())
+ self.assertEqual(b'', wrapped.readline())
+
+ def test_readline_no_newline(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '3;chunk-signature=ok\r\nabc\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 12, set(),
+ self.fake_sig_checker)
+ self.assertEqual(b'123456789abc', wrapped.readline())
+ self.assertEqual(b'', wrapped.readline())
+
+ def test_readline_line_spans_chunks(self):
+ raw = '9;chunk-signature=ok\r\nblah\nblah\r\n' \
+ '9;chunk-signature=ok\r\n123456789\r\n' \
+ '7;chunk-signature=ok\r\nabc\ndef\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 25, set(),
+ self.fake_sig_checker)
+ self.assertEqual(b'blah\n', wrapped.readline())
+ self.assertEqual(b'blah123456789abc\n', wrapped.readline())
+ self.assertEqual(b'def', wrapped.readline())
+
+ def test_readline_with_size_line_spans_chunks(self):
+ raw = '9;chunk-signature=ok\r\nblah\nblah\r\n' \
+ '9;chunk-signature=ok\r\n123456789\r\n' \
+ '7;chunk-signature=ok\r\nabc\ndef\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 25, set(),
+ self.fake_sig_checker)
+ self.assertEqual(b'blah\n', wrapped.readline(8))
+ self.assertEqual(b'blah123456789a', wrapped.readline(14))
+ self.assertEqual(b'bc\n', wrapped.readline(99))
+ self.assertEqual(b'def', wrapped.readline(99))
+
+ def test_chunk_separator_missing(self):
+ raw = '9;chunk-signature=ok\r\n123456789' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ with self.assertRaises(s3request.S3InputIncomplete):
+ wrapped.read()
+ self.assertTrue(wrapped._input.closed)
+
+ def test_final_newline_missing(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n\r'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ with self.assertRaises(s3request.S3InputIncomplete):
+ wrapped.read()
+ self.assertTrue(wrapped._input.closed)
+
+ def test_trailing_garbage_ok(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n\r\ngarbage'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ self.assertEqual(b'123456789', wrapped.read())
+
+ def test_good_with_trailers(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n' \
+ 'x-amz-checksum-crc32: AAAAAA==\r\n'.encode('utf8')
+ wrapped = StreamingInput(
+ BytesIO(raw), 9, {'x-amz-checksum-crc32'}, self.fake_sig_checker)
+ self.assertEqual(b'1234', wrapped.read(4))
+ self.assertEqual(b'56', wrapped.read(2))
+ # not at end, trailers haven't been read
+ self.assertEqual({}, wrapped.trailers)
+ # if we get exactly to the end, we go ahead and read the trailers
+ self.assertEqual(b'789', wrapped.read(3))
+ self.assertEqual({'x-amz-checksum-crc32': 'AAAAAA=='},
+ wrapped.trailers)
+ # can continue trying to read -- but it'll be empty
+ self.assertEqual(b'', wrapped.read(2))
+ self.assertEqual({'x-amz-checksum-crc32': 'AAAAAA=='},
+ wrapped.trailers)
+
+ self.assertFalse(wrapped._input.closed)
+ wrapped.close()
+ self.assertTrue(wrapped._input.closed)
+
+ def test_unexpected_trailers(self):
+ def do_test(raw):
+ wrapped = StreamingInput(
+ BytesIO(raw), 9, {'x-amz-checksum-crc32'},
+ self.fake_sig_checker)
+ with self.assertRaises(s3request.S3InputMalformedTrailer):
+ wrapped.read()
+ self.assertTrue(wrapped._input.closed)
+
+ do_test('9;chunk-signature=ok\r\n123456789\r\n'
+ '0;chunk-signature=ok\r\n'
+ 'x-amz-checksum-sha256: value\r\n'.encode('utf8'))
+ do_test('9;chunk-signature=ok\r\n123456789\r\n'
+ '0;chunk-signature=ok\r\n'
+ 'x-amz-checksum-crc32=value\r\n'.encode('utf8'))
+ do_test('9;chunk-signature=ok\r\n123456789\r\n'
+ '0;chunk-signature=ok\r\n'
+ 'x-amz-checksum-crc32\r\n'.encode('utf8'))
+
+ def test_wrong_signature_first_chunk(self):
+ raw = '9;chunk-signature=ko\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ # Can read while in the chunk...
+ self.assertEqual(b'1234', wrapped.read(4))
+ self.assertEqual(b'5678', wrapped.read(4))
+ # But once we hit the end, bomb out
+ with self.assertRaises(s3request.S3InputChunkSignatureMismatch):
+ wrapped.read(4)
+ self.assertTrue(wrapped._input.closed)
+
+ def test_wrong_signature_middle_chunk(self):
+ raw = '2;chunk-signature=ok\r\n12\r\n' \
+ '2;chunk-signature=ok\r\n34\r\n' \
+ '2;chunk-signature=ko\r\n56\r\n' \
+ '2;chunk-signature=ok\r\n78\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ self.assertEqual(b'1234', wrapped.read(4))
+ with self.assertRaises(s3request.S3InputChunkSignatureMismatch):
+ wrapped.read(4)
+ self.assertTrue(wrapped._input.closed)
+
+ def test_wrong_signature_last_chunk(self):
+ raw = '2;chunk-signature=ok\r\n12\r\n' \
+ '2;chunk-signature=ok\r\n34\r\n' \
+ '2;chunk-signature=ok\r\n56\r\n' \
+ '2;chunk-signature=ok\r\n78\r\n' \
+ '0;chunk-signature=ko\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ self.assertEqual(b'12345678', wrapped.read(8))
+ with self.assertRaises(s3request.S3InputChunkSignatureMismatch):
+ wrapped.read(4)
+ self.assertTrue(wrapped._input.closed)
+
+ def test_not_enough_content(self):
+ raw = '9;chunk-signature=ok\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(
+ BytesIO(raw), 33, set(), self.fake_sig_checker)
+ with self.assertRaises(s3request.S3InputSizeError) as cm:
+ wrapped.read()
+ self.assertEqual(33, cm.exception.expected)
+ self.assertEqual(9, cm.exception.provided)
+ self.assertTrue(wrapped._input.closed)
+
+ def test_wrong_chunk_size(self):
+ # first chunk header claims size 0xa (10) but only 9 bytes follow
+ raw = 'a;chunk-signature=ok\r\n123456789\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ with self.assertRaises(s3request.S3InputSizeError) as cm:
+ wrapped.read(4)
+ self.assertEqual(9, cm.exception.expected)
+ self.assertEqual(10, cm.exception.provided)
+ self.assertTrue(wrapped._input.closed)
+
+ def test_small_first_chunk_size(self):
+ raw = '1;chunk-signature=ok\r\n1\r\n' \
+ '8;chunk-signature=ok\r\n23456789\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ with self.assertRaises(s3request.S3InputChunkTooSmall) as cm:
+ wrapped.read(4)
+ # note: the chunk number is the one *after* the short chunk
+ self.assertEqual(2, cm.exception.chunk_number)
+ self.assertEqual(1, cm.exception.bad_chunk_size)
+ self.assertTrue(wrapped._input.closed)
+
+ def test_small_final_chunk_size_ok(self):
+ raw = '8;chunk-signature=ok\r\n12345678\r\n' \
+ '1;chunk-signature=ok\r\n9\r\n' \
+ '0;chunk-signature=ok\r\n\r\n'.encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), self.fake_sig_checker)
+ self.assertEqual(b'123456789', wrapped.read())
+
+ def test_invalid_chunk_size(self):
+ # the actual chunk data doesn't need to match the length in the
+ # chunk header for the test
+ raw = ('-1;chunk-signature=ok\r\n123456789\r\n'
+ '0;chunk-signature=ok\r\n\r\n').encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), None)
+ with self.assertRaises(s3request.S3InputIncomplete) as cm:
+ wrapped.read(4)
+ self.assertIn('invalid chunk header', str(cm.exception))
+ self.assertTrue(wrapped._input.closed)
+
+ def test_invalid_chunk_params(self):
+ def do_test(params, exp_exception):
+ raw = ('9;%s\r\n123456789\r\n'
+ '0;chunk-signature=ok\r\n\r\n' % params).encode('utf8')
+ wrapped = StreamingInput(BytesIO(raw), 9, set(), MagicMock())
+ with self.assertRaises(exp_exception):
+ wrapped.read(4)
+ self.assertTrue(wrapped._input.closed)
+
+ do_test('chunk-signature=', s3request.S3InputIncomplete)
+ do_test('chunk-signature=ok;not-ok', s3request.S3InputIncomplete)
+ do_test('chunk-signature=ok;chunk-signature=ok',
+ s3request.S3InputIncomplete)
+ do_test('chunk-signature', s3request.S3InputIncomplete)
+ # note: underscore not hyphen...
+ do_test('chunk_signature=ok', s3request.S3InputChunkSignatureMismatch)
+ do_test('skunk-cignature=ok', s3request.S3InputChunkSignatureMismatch)
+
+
+class TestModuleFunctions(unittest.TestCase):
+ def test_get_checksum_hasher(self):
+ def do_test(crc):
+ hasher = _get_checksum_hasher('x-amz-checksum-%s' % crc)
+ self.assertEqual(crc, hasher.name)
+
+ do_test('crc32')
+ do_test('crc32c')
+ do_test('sha1')
+ do_test('sha256')
+ try:
+ checksum._select_crc64nvme_impl()
+ except NotImplementedError:
+ pass
+ else:
+ do_test('crc64nvme')
+
+ def test_get_checksum_hasher_invalid(self):
+ def do_test(crc):
+ with self.assertRaises(s3response.S3NotImplemented):
+ _get_checksum_hasher('x-amz-checksum-%s' % crc)
+
+ with mock.patch.object(checksum, '_select_crc64nvme_impl',
+ side_effect=NotImplementedError):
+ do_test('crc64nvme')
+ do_test('nonsense')
+ do_test('')
+
+
if __name__ == '__main__':
unittest.main()
diff -Nru swift-2.35.0/test/unit/common/test_utils.py swift-2.35.1/test/unit/common/test_utils.py
--- swift-2.35.0/test/unit/common/test_utils.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/unit/common/test_utils.py 2025-08-22 17:56:44.000000000 +0200
@@ -2450,6 +2450,28 @@
self.fail('Invalid results from pure function:\n%s' %
'\n'.join(failures))
+ def test_strict_b64decode_allow_line_breaks(self):
+ with self.assertRaises(ValueError):
+ utils.strict_b64decode(b'AA\nA=')
+ self.assertEqual(
+ b'\x00\x00',
+ utils.strict_b64decode(b'AA\nA=', allow_line_breaks=True))
+
+ def test_strict_b64decode_exact_size(self):
+ self.assertEqual(b'\x00\x00',
+ utils.strict_b64decode(b'AAA='))
+ self.assertEqual(b'\x00\x00',
+ utils.strict_b64decode(b'AAA=', exact_size=2))
+ with self.assertRaises(ValueError):
+ utils.strict_b64decode(b'AAA=', exact_size=1)
+ with self.assertRaises(ValueError):
+ utils.strict_b64decode(b'AAA=', exact_size=3)
+
+ def test_base64_str(self):
+ self.assertEqual('Zm9v', utils.base64_str(b'foo'))
+ self.assertEqual('Zm9vZA==', utils.base64_str(b'food'))
+ self.assertEqual('IGZvbw==', utils.base64_str(b' foo'))
+
def test_cap_length(self):
self.assertEqual(utils.cap_length(None, 3), None)
self.assertEqual(utils.cap_length('', 3), '')
diff -Nru swift-2.35.0/test/unit/common/utils/test_checksum.py swift-2.35.1/test/unit/common/utils/test_checksum.py
--- swift-2.35.0/test/unit/common/utils/test_checksum.py 1970-01-01 01:00:00.000000000 +0100
+++ swift-2.35.1/test/unit/common/utils/test_checksum.py 2025-08-22 17:56:44.000000000 +0200
@@ -0,0 +1,417 @@
+# Copyright (c) 2024 NVIDIA
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+# implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import sys
+import unittest
+from unittest import mock
+
+import zlib
+
+from swift.common.utils import checksum
+from test.debug_logger import debug_logger
+from test.unit import requires_crc32c, requires_crc64nvme
+
+
+class TestModuleFunctions(unittest.TestCase):
+ def test_find_isal_sys_package_preferred(self):
+ with mock.patch('ctypes.util.find_library', return_value='my-isal.so'):
+ with mock.patch('ctypes.CDLL', return_value='fake') as mock_cdll:
+ self.assertEqual('fake', checksum.find_isal())
+ self.assertEqual([mock.call('my-isal.so')], mock_cdll.call_args_list)
+
+ @unittest.skipIf(
+ sys.version_info.major == 3 and sys.version_info.minor < 8,
+ "importlib.metadata not available until py3.8")
+ def test_find_isal_pyeclib_install_found(self):
+ mock_pkg = mock.MagicMock()
+ mock_pkg.locate = mock.MagicMock(return_value='fake-pkg')
+ with mock.patch('ctypes.util.find_library', return_value=None):
+ with mock.patch('ctypes.CDLL', return_value='fake') as mock_cdll:
+ with mock.patch('importlib.metadata.files',
+ return_value=[mock_pkg]):
+ self.assertEqual('fake', checksum.find_isal())
+ self.assertEqual([mock.call('fake-pkg')], mock_cdll.call_args_list)
+
+ @unittest.skipIf(
+ sys.version_info.major == 3 and sys.version_info.minor < 8,
+ "importlib.metadata not available until py3.8")
+ def test_find_isal_pyeclib_install_not_found(self):
+ mock_pkg = mock.MagicMock()
+ mock_pkg.locate = mock.MagicMock(return_value='fake-pkg')
+ with mock.patch('ctypes.util.find_library', return_value=None):
+ with mock.patch('importlib.metadata.files', return_value=[]):
+ self.assertIsNone(checksum.find_isal())
+
+ @unittest.skipIf(
+ sys.version_info.major == 3 and sys.version_info.minor < 8,
+ "importlib.metadata not available until py3.8")
+ def test_find_isal_pyeclib_dist_missing_files(self):
+ with mock.patch('ctypes.util.find_library', return_value=None):
+ with mock.patch('importlib.metadata.files', return_value=None):
+ self.assertIsNone(checksum.find_isal())
+
+ @unittest.skipIf(
+ sys.version_info.major == 3 and sys.version_info.minor < 8,
+ "importlib.metadata not available until py3.8")
+ def test_find_isal_pyeclib_dist_info_missing(self):
+ from importlib.metadata import PackageNotFoundError
+ with mock.patch('ctypes.util.find_library', return_value=None):
+ with mock.patch('importlib.metadata.files',
+ side_effect=PackageNotFoundError):
+ self.assertIsNone(checksum.find_isal())
+
+
+# If you're curious about the 0xe3069283, see "check" at
+# https://reveng.sourceforge.io/crc-catalogue/17plus.htm#crc.cat.crc-32-iscsi
+class TestCRC32C(unittest.TestCase):
+ def check_crc_func(self, impl):
+ self.assertEqual(impl(b"123456789"), 0xe3069283)
+ # Check that we can save/continue
+ partial = impl(b"12345")
+ self.assertEqual(impl(b"6789", partial), 0xe3069283)
+
+ @unittest.skipIf(checksum.crc32c_anycrc is None, 'No anycrc CRC32C')
+ def test_anycrc(self):
+ self.check_crc_func(checksum.crc32c_anycrc)
+ # Check preferences -- beats out reference, but not kernel or ISA-L
+ if checksum.crc32c_isal is None and checksum.crc32c_kern is None:
+ self.assertIs(checksum._select_crc32c_impl(),
+ checksum.crc32c_anycrc)
+
+ @unittest.skipIf(checksum.crc32c_kern is None, 'No kernel CRC32C')
+ def test_kern(self):
+ self.check_crc_func(checksum.crc32c_kern)
+ # Check preferences -- beats out reference and anycrc, but not ISA-L
+ if checksum.crc32c_isal is None:
+ self.assertIs(checksum._select_crc32c_impl(), checksum.crc32c_kern)
+
+ @unittest.skipIf(checksum.crc32c_kern is None, 'No kernel CRC32C')
+ def test_kern_socket_close_happy_path(self):
+ mock_crc32c_socket = mock.MagicMock()
+ mock_socket = mock.MagicMock()
+ mock_socket.recv.return_value = b'1234'
+ mock_crc32c_socket.accept.return_value = (mock_socket, None)
+ with mock.patch('swift.common.utils.checksum.socket.socket',
+ return_value=mock_crc32c_socket):
+ checksum.crc32c_kern(b'x')
+ self.assertEqual([mock.call()],
+ mock_socket.close.call_args_list)
+ self.assertEqual([mock.call()],
+ mock_crc32c_socket.close.call_args_list)
+
+ @unittest.skipIf(checksum.crc32c_kern is None, 'No kernel CRC32C')
+ def test_kern_socket_close_after_bind_error(self):
+ mock_crc32c_socket = mock.MagicMock()
+ mock_crc32c_socket.bind.side_effect = OSError('boom')
+ with mock.patch('swift.common.utils.checksum.socket.socket',
+ return_value=mock_crc32c_socket):
+ with self.assertRaises(OSError) as cm:
+ checksum.crc32c_kern(b'x')
+ self.assertEqual('boom', str(cm.exception))
+ self.assertEqual([mock.call()],
+ mock_crc32c_socket.close.call_args_list)
+
+ @unittest.skipIf(checksum.crc32c_kern is None, 'No kernel CRC32C')
+ def test_kern_socket_close_after_setsockopt_error(self):
+ mock_crc32c_socket = mock.MagicMock()
+ mock_crc32c_socket.setsockopt.side_effect = OSError('boom')
+ with mock.patch('swift.common.utils.checksum.socket.socket',
+ return_value=mock_crc32c_socket):
+ with self.assertRaises(OSError) as cm:
+ checksum.crc32c_kern(b'x')
+ self.assertEqual('boom', str(cm.exception))
+ self.assertEqual([mock.call()],
+ mock_crc32c_socket.close.call_args_list)
+
+ @unittest.skipIf(checksum.crc32c_kern is None, 'No kernel CRC32C')
+ def test_kern_socket_close_after_accept_error(self):
+ mock_crc32c_socket = mock.MagicMock()
+ mock_crc32c_socket.accept.side_effect = OSError('boom')
+ with mock.patch('swift.common.utils.checksum.socket.socket',
+ return_value=mock_crc32c_socket):
+ with self.assertRaises(OSError) as cm:
+ checksum.crc32c_kern(b'x')
+ self.assertEqual('boom', str(cm.exception))
+ self.assertEqual([mock.call()],
+ mock_crc32c_socket.close.call_args_list)
+
+ @unittest.skipIf(checksum.crc32c_kern is None, 'No kernel CRC32C')
+ def test_kern_socket_after_sendall_error(self):
+ mock_crc32c_socket = mock.MagicMock()
+ mock_socket = mock.MagicMock()
+ mock_socket.sendall.side_effect = OSError('boom')
+ mock_crc32c_socket.accept.return_value = (mock_socket, None)
+ with mock.patch('swift.common.utils.checksum.socket.socket',
+ return_value=mock_crc32c_socket):
+ with self.assertRaises(OSError) as cm:
+ checksum.crc32c_kern(b'x')
+ self.assertEqual('boom', str(cm.exception))
+ self.assertEqual([mock.call()],
+ mock_socket.close.call_args_list)
+ self.assertEqual([mock.call()],
+ mock_crc32c_socket.close.call_args_list)
+
+ @unittest.skipIf(checksum.crc32c_kern is None, 'No kernel CRC32C')
+ def test_kern_socket_after_recv_error(self):
+ mock_crc32c_socket = mock.MagicMock()
+ mock_socket = mock.MagicMock()
+ mock_socket.recv.side_effect = OSError('boom')
+ mock_crc32c_socket.accept.return_value = (mock_socket, None)
+ with mock.patch('swift.common.utils.checksum.socket.socket',
+ return_value=mock_crc32c_socket):
+ with self.assertRaises(OSError) as cm:
+ checksum.crc32c_kern(b'x')
+ self.assertEqual('boom', str(cm.exception))
+ self.assertEqual([mock.call()],
+ mock_socket.close.call_args_list)
+ self.assertEqual([mock.call()],
+ mock_crc32c_socket.close.call_args_list)
+
+ @unittest.skipIf(checksum.crc32c_isal is None, 'No ISA-L CRC32C')
+ def test_isal(self):
+ self.check_crc_func(checksum.crc32c_isal)
+ # Check preferences -- ISA-L always wins
+ self.assertIs(checksum._select_crc32c_impl(), checksum.crc32c_isal)
+
+
+class TestCRC64NVME(unittest.TestCase):
+ def check_crc_func(self, impl):
+ self.assertEqual(impl(b"123456789"), 0xae8b14860a799888)
+ # Check that we can save/continue
+ partial = impl(b"12345")
+ self.assertEqual(impl(b"6789", partial), 0xae8b14860a799888)
+
+ @unittest.skipIf(checksum.crc64nvme_anycrc is None, 'No anycrc CRC64NVME')
+ def test_anycrc(self):
+ self.check_crc_func(checksum.crc64nvme_anycrc)
+ if checksum.crc64nvme_isal is None:
+ self.assertIs(checksum._select_crc64nvme_impl(),
+ checksum.crc64nvme_anycrc)
+
+ @unittest.skipIf(checksum.crc64nvme_isal is None, 'No ISA-L CRC64NVME')
+ def test_isal(self):
+ self.check_crc_func(checksum.crc64nvme_isal)
+ # Check preferences -- ISA-L always wins
+ self.assertIs(checksum._select_crc64nvme_impl(),
+ checksum.crc64nvme_isal)
+
+
+class TestCRCHasher(unittest.TestCase):
+ def setUp(self):
+ self.logger = debug_logger()
+
+ def test_base_crc_hasher(self):
+ func = mock.MagicMock(return_value=0xbad1)
+ hasher = checksum.CRCHasher('fake', func)
+ self.assertEqual('fake', hasher.name)
+ self.assertEqual(32, hasher.width)
+ self.assertEqual(0, hasher.crc)
+ self.assertEqual(b'\x00\x00\x00\x00', hasher.digest())
+ self.assertEqual('00000000', hasher.hexdigest())
+
+ hasher.update(b'123456789')
+ self.assertEqual(0xbad1, hasher.crc)
+ self.assertEqual(b'\x00\x00\xba\xd1', hasher.digest())
+ self.assertEqual('0000bad1', hasher.hexdigest())
+
+ def test_crc32_hasher(self):
+ # See CRC-32/ISO-HDLC at
+ # https://reveng.sourceforge.io/crc-catalogue/17plus.htm
+ hasher = checksum.crc32()
+ self.assertEqual('crc32', hasher.name)
+ self.assertEqual(4, hasher.digest_size)
+ self.assertEqual(zlib.crc32, hasher.crc_func)
+ self.assertEqual(32, hasher.width)
+ self.assertEqual(0, hasher.crc)
+ self.assertEqual(b'\x00\x00\x00\x00', hasher.digest())
+ self.assertEqual('00000000', hasher.hexdigest())
+
+ hasher.update(b'123456789')
+ self.assertEqual(0xcbf43926, hasher.crc)
+ self.assertEqual(b'\xcb\xf4\x39\x26', hasher.digest())
+ self.assertEqual('cbf43926', hasher.hexdigest())
+
+ def test_crc32_hasher_contructed_with_data(self):
+ hasher = checksum.crc32(b'123456789')
+ self.assertEqual(zlib.crc32, hasher.crc_func)
+ self.assertEqual(0xcbf43926, hasher.crc)
+ self.assertEqual(b'\xcb\xf4\x39\x26', hasher.digest())
+ self.assertEqual('cbf43926', hasher.hexdigest())
+
+ def test_crc32_hasher_initial_value(self):
+ hasher = checksum.crc32(initial_value=0xcbf43926)
+ self.assertEqual(zlib.crc32, hasher.crc_func)
+ self.assertEqual(0xcbf43926, hasher.crc)
+ self.assertEqual(b'\xcb\xf4\x39\x26', hasher.digest())
+ self.assertEqual('cbf43926', hasher.hexdigest())
+
+ def test_crc32_hasher_copy(self):
+ hasher = checksum.crc32(b'123456789')
+ self.assertEqual(4, hasher.digest_size)
+ self.assertEqual('cbf43926', hasher.hexdigest())
+ hasher_copy = hasher.copy()
+ self.assertEqual('crc32', hasher.name)
+ self.assertEqual(zlib.crc32, hasher_copy.crc_func)
+ self.assertEqual('cbf43926', hasher_copy.hexdigest())
+ hasher_copy.update(b'foo')
+ self.assertEqual('cbf43926', hasher.hexdigest())
+ self.assertEqual('04e7e407', hasher_copy.hexdigest())
+ hasher.update(b'bar')
+ self.assertEqual('fe6b0d8c', hasher.hexdigest())
+ self.assertEqual('04e7e407', hasher_copy.hexdigest())
+
+ @requires_crc32c
+ def test_crc32c_hasher(self):
+ # See CRC-32/ISCSI at
+ # https://reveng.sourceforge.io/crc-catalogue/17plus.htm
+ hasher = checksum.crc32c()
+ self.assertEqual('crc32c', hasher.name)
+ self.assertEqual(32, hasher.width)
+ self.assertEqual(0, hasher.crc)
+ self.assertEqual(b'\x00\x00\x00\x00', hasher.digest())
+ self.assertEqual('00000000', hasher.hexdigest())
+
+ hasher.update(b'123456789')
+ self.assertEqual(0xe3069283, hasher.crc)
+ self.assertEqual(b'\xe3\x06\x92\x83', hasher.digest())
+ self.assertEqual('e3069283', hasher.hexdigest())
+
+ @requires_crc32c
+ def test_crc32c_hasher_constructed_with_data(self):
+ hasher = checksum.crc32c(b'123456789')
+ self.assertEqual(0xe3069283, hasher.crc)
+ self.assertEqual(b'\xe3\x06\x92\x83', hasher.digest())
+ self.assertEqual('e3069283', hasher.hexdigest())
+
+ @requires_crc32c
+ def test_crc32c_hasher_initial_value(self):
+ hasher = checksum.crc32c(initial_value=0xe3069283)
+ self.assertEqual(0xe3069283, hasher.crc)
+ self.assertEqual(b'\xe3\x06\x92\x83', hasher.digest())
+ self.assertEqual('e3069283', hasher.hexdigest())
+
+ @requires_crc32c
+ def test_crc32c_hasher_copy(self):
+ hasher = checksum.crc32c(b'123456789')
+ self.assertEqual('e3069283', hasher.hexdigest())
+ hasher_copy = hasher.copy()
+ self.assertEqual('crc32c', hasher_copy.name)
+ self.assertIs(hasher.crc_func, hasher_copy.crc_func)
+ self.assertEqual('e3069283', hasher_copy.hexdigest())
+ hasher_copy.update(b'foo')
+ self.assertEqual('e3069283', hasher.hexdigest())
+ self.assertEqual('6b2fc5b0', hasher_copy.hexdigest())
+ hasher.update(b'bar')
+ self.assertEqual('ae5c789c', hasher.hexdigest())
+ self.assertEqual('6b2fc5b0', hasher_copy.hexdigest())
+
+ def test_crc32c_hasher_selects_kern_impl(self):
+ scuc = 'swift.common.utils.checksum'
+ with mock.patch(scuc + '.crc32c_isal', None), \
+ mock.patch(scuc + '.crc32c_kern') as mock_kern, \
+ mock.patch(scuc + '.crc32c_anycrc', None):
+ mock_kern.__name__ = 'crc32c_kern'
+ self.assertIs(mock_kern, checksum.crc32c().crc_func)
+ checksum.log_selected_implementation(self.logger)
+ self.assertIn('Using crc32c_kern implementation for CRC32C.',
+ self.logger.get_lines_for_level('info'))
+
+ def test_crc32c_hasher_selects_anycrc_impl(self):
+ scuc = 'swift.common.utils.checksum'
+ with mock.patch(scuc + '.crc32c_isal', None), \
+ mock.patch(scuc + '.crc32c_kern', None), \
+ mock.patch(scuc + '.crc32c_anycrc') as mock_anycrc:
+ mock_anycrc.__name__ = 'crc32c_anycrc'
+ self.assertIs(mock_anycrc, checksum.crc32c().crc_func)
+ checksum.log_selected_implementation(self.logger)
+ self.assertIn('Using crc32c_anycrc implementation for CRC32C.',
+ self.logger.get_lines_for_level('info'))
+
+ def test_crc32c_hasher_selects_isal_impl(self):
+ scuc = 'swift.common.utils.checksum'
+ with mock.patch(scuc + '.crc32c_isal') as mock_isal, \
+ mock.patch(scuc + '.crc32c_kern'), \
+ mock.patch(scuc + '.crc32c_anycrc'):
+ mock_isal.__name__ = 'crc32c_isal'
+ self.assertIs(mock_isal, checksum.crc32c().crc_func)
+ checksum.log_selected_implementation(self.logger)
+ self.assertIn('Using crc32c_isal implementation for CRC32C.',
+ self.logger.get_lines_for_level('info'))
+
+ @requires_crc64nvme
+ def test_crc64nvme_hasher(self):
+ # See CRC-64/NVME at
+ # https://reveng.sourceforge.io/crc-catalogue/17plus.htm
+ hasher = checksum.crc64nvme()
+ self.assertEqual('crc64nvme', hasher.name)
+ self.assertEqual(8, hasher.digest_size)
+ self.assertEqual(64, hasher.width)
+ self.assertEqual(0, hasher.crc)
+ self.assertEqual(b'\x00\x00\x00\x00\x00\x00\x00\x00', hasher.digest())
+ self.assertEqual('0000000000000000', hasher.hexdigest())
+
+ hasher.update(b'123456789')
+ self.assertEqual(0xae8b14860a799888, hasher.crc)
+ self.assertEqual(b'\xae\x8b\x14\x86\x0a\x79\x98\x88', hasher.digest())
+ self.assertEqual('ae8b14860a799888', hasher.hexdigest())
+
+ @requires_crc64nvme
+ def test_crc64nvme_hasher_constructed_with_data(self):
+ hasher = checksum.crc64nvme(b'123456789')
+ self.assertEqual(b'\xae\x8b\x14\x86\x0a\x79\x98\x88', hasher.digest())
+ self.assertEqual('ae8b14860a799888', hasher.hexdigest())
+
+ @requires_crc64nvme
+ def test_crc64nvme_hasher_initial_value(self):
+ hasher = checksum.crc64nvme(initial_value=0xae8b14860a799888)
+ self.assertEqual(b'\xae\x8b\x14\x86\x0a\x79\x98\x88', hasher.digest())
+ self.assertEqual('ae8b14860a799888', hasher.hexdigest())
+
+ @requires_crc64nvme
+ def test_crc64nvme_hasher_copy(self):
+ hasher = checksum.crc64nvme(b'123456789')
+ self.assertEqual('ae8b14860a799888', hasher.hexdigest())
+ hasher_copy = hasher.copy()
+ self.assertEqual('crc64nvme', hasher_copy.name)
+ self.assertIs(hasher.crc_func, hasher_copy.crc_func)
+ self.assertEqual('ae8b14860a799888', hasher_copy.hexdigest())
+ hasher_copy.update(b'foo')
+ self.assertEqual('ae8b14860a799888', hasher.hexdigest())
+ self.assertEqual('673ece0d56523f46', hasher_copy.hexdigest())
+ hasher.update(b'bar')
+ self.assertEqual('0991d5edf1b0062e', hasher.hexdigest())
+ self.assertEqual('673ece0d56523f46', hasher_copy.hexdigest())
+
+ def test_crc64nvme_hasher_selects_anycrc_impl(self):
+ scuc = 'swift.common.utils.checksum'
+ with mock.patch(scuc + '.crc64nvme_isal', None), \
+ mock.patch(scuc + '.crc64nvme_anycrc') as mock_anycrc:
+ mock_anycrc.__name__ = 'crc64nvme_anycrc'
+ self.assertIs(mock_anycrc,
+ checksum.crc64nvme().crc_func)
+ checksum.log_selected_implementation(self.logger)
+ self.assertIn(
+ 'Using crc64nvme_anycrc implementation for CRC64NVME.',
+ self.logger.get_lines_for_level('info'))
+
+ def test_crc64nvme_hasher_selects_isal_impl(self):
+ scuc = 'swift.common.utils.checksum'
+ with mock.patch(scuc + '.crc64nvme_isal') as mock_isal, \
+ mock.patch(scuc + '.crc64nvme_anycrc'):
+ mock_isal.__name__ = 'crc64nvme_isal'
+ self.assertIs(mock_isal, checksum.crc64nvme().crc_func)
+ checksum.log_selected_implementation(self.logger)
+ self.assertIn(
+ 'Using crc64nvme_isal implementation for CRC64NVME.',
+ self.logger.get_lines_for_level('info'))
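[ Aside: the 0xe3069283 "check" value these tests assert, and the
save/continue calling convention they exercise, can be reproduced with a few
lines of pure Python. This is only a bitwise sketch of CRC-32/ISCSI (poly
0x1EDC6F41, reflected form 0x82F63B78) for illustration, not Swift's actual
implementation, which dispatches to ISA-L, the kernel, or anycrc: ]

```python
def crc32c(data, crc=0):
    """Bitwise CRC-32/ISCSI (CRC32C): reflected, init/xorout 0xFFFFFFFF.

    Passing a previous result as ``crc`` continues the checksum, the same
    save/continue convention the tests above exercise.
    """
    crc ^= 0xFFFFFFFF          # undo xorout (or apply init on first call)
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # shift right; on a set LSB, fold in the reflected polynomial
            crc = (crc >> 1) ^ 0x82F63B78 if crc & 1 else crc >> 1
    return crc ^ 0xFFFFFFFF

assert crc32c(b"123456789") == 0xE3069283   # the catalogue "check" value
partial = crc32c(b"12345")
assert crc32c(b"6789", partial) == 0xE3069283   # save/continue round-trips
```

Because init and xorout are both 0xFFFFFFFF, feeding a previous digest back
in as ``crc`` recovers the internal register exactly, which is why
``impl(b"6789", partial)`` equals the one-shot result (zlib.crc32 chains the
same way).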
diff -Nru swift-2.35.0/test/unit/__init__.py swift-2.35.1/test/unit/__init__.py
--- swift-2.35.0/test/unit/__init__.py 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/test/unit/__init__.py 2025-08-22 17:56:44.000000000 +0200
@@ -46,7 +46,7 @@
from swift.common.memcached import MemcacheConnectionError
from swift.common.storage_policy import (StoragePolicy, ECStoragePolicy,
VALID_EC_TYPES)
-from swift.common.utils import Timestamp, md5, close_if_possible
+from swift.common.utils import Timestamp, md5, close_if_possible, checksum
from test import get_config
from test.debug_logger import FakeLogger
from swift.common.header_key_dict import HeaderKeyDict
@@ -1084,6 +1084,28 @@
return func(*args, **kwargs)
return wrapper
+
+def requires_crc32c(func):
+ @functools.wraps(func)
+ def wrapper(*args, **kwargs):
+ try:
+ checksum.crc32c()
+ except NotImplementedError as e:
+ raise SkipTest(str(e))
+ return func(*args, **kwargs)
+ return wrapper
+
+
+def requires_crc64nvme(func):
+ @functools.wraps(func)
+ def wrapper(*args, **kwargs):
+ try:
+ checksum.crc64nvme()
+ except NotImplementedError as e:
+ raise SkipTest(str(e))
+ return func(*args, **kwargs)
+ return wrapper
+
class StubResponse(object):
diff -Nru swift-2.35.0/tools/playbooks/multinode_setup/run.yaml swift-2.35.1/tools/playbooks/multinode_setup/run.yaml
--- swift-2.35.0/tools/playbooks/multinode_setup/run.yaml 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/tools/playbooks/multinode_setup/run.yaml 2025-08-22 17:56:44.000000000 +0200
@@ -44,7 +44,6 @@
include_role:
name: tox
vars:
- tox_envlist: func-py3
tox_environment:
TOX_CONSTRAINTS_FILE: https://releases.openstack.org/constraints/upper/yoga
SWIFT_TEST_CONFIG_FILE: /home/{{ ansible_user }}/test.conf
diff -Nru swift-2.35.0/tox.ini swift-2.35.1/tox.ini
--- swift-2.35.0/tox.ini 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/tox.ini 2025-08-22 17:56:44.000000000 +0200
@@ -13,7 +13,7 @@
install_command = pip install {opts} {packages}
setenv = VIRTUAL_ENV={envdir}
deps =
- -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
+ -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/2025.1}
-r{toxinidir}/requirements.txt
-r{toxinidir}/test-requirements.txt
commands =
@@ -85,7 +85,7 @@
[testenv:docs]
deps =
- -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
+ -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/2025.1}
-r{toxinidir}/doc/requirements.txt
commands = sphinx-build -W -b html doc/source doc/build/html
diff -Nru swift-2.35.0/.zuul.yaml swift-2.35.1/.zuul.yaml
--- swift-2.35.0/.zuul.yaml 2025-03-10 20:02:43.000000000 +0100
+++ swift-2.35.1/.zuul.yaml 2025-08-22 17:56:44.000000000 +0200
@@ -16,6 +16,7 @@
name: swift-tox-py36
parent: swift-tox-base
nodeset: ubuntu-bionic
+ ansible-version: 9
description: |
Run unit-tests for swift under cPython version 3.6.
@@ -31,7 +32,6 @@
- job:
name: swift-tox-py37
parent: swift-tox-base
- nodeset: ubuntu-bionic
description: |
Run unit-tests for swift under cPython version 3.7.
@@ -41,6 +41,7 @@
vars:
tox_envlist: py37
bindep_profile: test py37
+ python_use_pyenv: True
python_version: 3.7
post-run: tools/playbooks/common/cover-post.yaml
@@ -398,6 +399,7 @@
Build a 4 node swift cluster and run functional tests
timeout: 5400
vars:
+ tox_envlist: func
bindep_profile: test py39
pre-run:
- tools/playbooks/multinode_setup/pre.yaml
@@ -413,48 +415,56 @@
parent: swift-multinode-rolling-upgrade
vars:
previous_swift_version: wallaby-eom
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-xena
parent: swift-multinode-rolling-upgrade
vars:
previous_swift_version: xena-eom
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-yoga
parent: swift-multinode-rolling-upgrade
vars:
previous_swift_version: yoga-eom
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-zed
parent: swift-multinode-rolling-upgrade
vars:
previous_swift_version: zed-eom
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-antelope
parent: swift-multinode-rolling-upgrade
vars:
previous_swift_version: 2023.1-eom
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-bobcat
parent: swift-multinode-rolling-upgrade
vars:
- previous_swift_version: origin/stable/2023.2
+ previous_swift_version: 2023.2-eol
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-caracal
parent: swift-multinode-rolling-upgrade
vars:
previous_swift_version: origin/stable/2024.1
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-dalmatian
parent: swift-multinode-rolling-upgrade
vars:
previous_swift_version: origin/stable/2024.2
+ tox_envlist: func-py3
- job:
name: swift-multinode-rolling-upgrade-master
@@ -467,6 +477,7 @@
parent: openstack-tox-lower-constraints
# This seems defensible for a l-c job
nodeset: ubuntu-bionic
+ ansible-version: 9
vars:
bindep_profile: test py36
python_version: 3.6