Bug#1009200: pytorch: (autopkgtest) needs update for python3.10: 'float' object cannot be interpreted as an integer



Source: pytorch
Version: 1.8.1-5
Severity: serious
Tags: sid bookworm
User: debian-ci@lists.debian.org
Usertags: needs-update
User: debian-python@lists.debian.org
Usertags: python3.10
Control: affects -1 src:python3-defaults

Dear maintainer(s),

We are in the process of making python3.10 the default Python version [0]. With a recent upload of python3-defaults the autopkgtest of pytorch fails in testing when that autopkgtest is run with the binary packages of python3-defaults from unstable. It passes when run with only packages from testing. In tabular form:

                       pass            fail
python3-defaults       from testing    3.10.4-1
pytorch                from testing    1.8.1-5
all others             from testing    from testing

I copied some of the output at the bottom of this report.

Currently this regression is blocking the migration of python3-defaults to testing [1]. https://docs.python.org/3/whatsnew/3.10.html lists what's new in Python 3.10; it may help to identify what needs to be updated.
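
If I read the traceback at the bottom correctly, the likely culprit is the removal in Python 3.10 of the implicit __int__() fallback for C-level integer conversions (bpo-37999): a Python float passed where an extension module expects an integer now raises TypeError instead of a DeprecationWarning. In the failing StudentT cases only integer tensors are passed, so the float default of the missing parameter (loc=0. or scale=1.) gets turned into a tensor with an integer dtype. Below is a rough, untested guess at a reproducer; the dtype and the exact failing call are my assumptions, not taken from the log:

    import torch

    df = torch.tensor([1, 1])      # int64, as in the failing test case
    loc = torch.tensor([1, 1, 1])  # int64, as in the failing test case
    # scale is not given, so StudentT keeps its default, the Python float 1.0;
    # broadcast_all() then builds a tensor from it using the dtype of the first
    # tensor argument, i.e. roughly:
    torch.tensor(1.0, dtype=torch.int64)  # suspected TypeError on 3.10 with pytorch 1.8.1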

More information about this bug and the reason for filing it can be found on
https://wiki.debian.org/ContinuousIntegration/RegressionEmailInformation

Paul

[0] https://bugs.debian.org/1006836
[1] https://qa.debian.org/excuses.php?package=python3-defaults
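
For context, this is roughly what torch.distributions.utils.broadcast_all() in pytorch 1.8 does with a mix of tensors and plain Python numbers; a simplified reconstruction based on the lines quoted in the log below, not the exact upstream source:

    import torch

    def broadcast_all_sketch(*values):
        # Plain Python numbers inherit dtype/device from the first tensor
        # argument before everything is broadcast together.
        options = dict(dtype=torch.get_default_dtype())
        for v in values:
            if isinstance(v, torch.Tensor):
                options = dict(dtype=v.dtype, device=v.device)
                break
        new_values = [v if isinstance(v, torch.Tensor) else torch.tensor(v, **options)
                      for v in values]
        return torch.broadcast_tensors(*new_values)

    # e.g. broadcast_all_sketch(torch.tensor([1, 1]), torch.tensor([1, 1, 1]), 1.0)
    # reaches torch.tensor(1.0, dtype=torch.int64) before any shape check.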

https://ci.debian.net/data/autopkgtest/testing/amd64/p/pytorch/20675875/log.gz


=================================== FAILURES ===================================
____________ TestDistributions.test_invalid_parameter_broadcasting _____________

self = <test_distributions.TestDistributions testMethod=test_invalid_parameter_broadcasting>

    def test_invalid_parameter_broadcasting(self):
        # invalid broadcasting cases; should throw error
        # example type (distribution class, distribution params)
        invalid_examples = [
            (Normal, {
                'loc': torch.tensor([[0, 0]]),
                'scale': torch.tensor([1, 1, 1, 1])
            }),
            (Normal, {
                'loc': torch.tensor([[[0, 0, 0], [0, 0, 0]]]),
                'scale': torch.tensor([1, 1])
            }),
            (FisherSnedecor, {
                'df1': torch.tensor([1, 1]),
                'df2': torch.tensor([1, 1, 1]),
            }),
            (Gumbel, {
                'loc': torch.tensor([[0, 0]]),
                'scale': torch.tensor([1, 1, 1, 1])
            }),
            (Gumbel, {
                'loc': torch.tensor([[[0, 0, 0], [0, 0, 0]]]),
                'scale': torch.tensor([1, 1])
            }),
            (Gamma, {
                'concentration': torch.tensor([0, 0]),
                'rate': torch.tensor([1, 1, 1])
            }),
            (Kumaraswamy, {
                'concentration1': torch.tensor([[1, 1]]),
                'concentration0': torch.tensor([1, 1, 1, 1])
            }),
            (Kumaraswamy, {
                'concentration1': torch.tensor([[[1, 1, 1], [1, 1, 1]]]),
                'concentration0': torch.tensor([1, 1])
            }),
            (Laplace, {
                'loc': torch.tensor([0, 0]),
                'scale': torch.tensor([1, 1, 1])
            }),
            (Pareto, {
                'scale': torch.tensor([1, 1]),
                'alpha': torch.tensor([1, 1, 1])
            }),
            (StudentT, {
                'df': torch.tensor([1, 1]),
                'scale': torch.tensor([1, 1, 1])
            }),
            (StudentT, {
                'df': torch.tensor([1, 1]),
                'loc': torch.tensor([1, 1, 1])
            })
        ]
        for dist, kwargs in invalid_examples:
            self.assertRaises(RuntimeError, dist, **kwargs)

distributions/test_distributions.py:2871:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/dist-packages/torch/distributions/studentT.py:45: in __init__
    self.df, self.loc, self.scale = broadcast_all(df, loc, scale)
/usr/lib/python3/dist-packages/torch/distributions/utils.py:37: in broadcast_all
    new_values = [v if isinstance(v, torch.Tensor) or has_torch_function((v,)) else torch.tensor(v, **options)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    new_values = [v if isinstance(v, torch.Tensor) or has_torch_function((v,)) else torch.tensor(v, **options)
                  for v in values]
E   TypeError: 'float' object cannot be interpreted as an integer

/usr/lib/python3/dist-packages/torch/distributions/utils.py:37: TypeError
=============================== warnings summary ===============================
../../../../../../usr/lib/python3/dist-packages/torch/testing/_internal/common_cuda.py:9

/usr/lib/python3/dist-packages/torch/testing/_internal/common_cuda.py:9: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils.version import LooseVersion

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_entropy
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_log_prob
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_rsample
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance

/usr/lib/python3/dist-packages/torch/distributions/distribution.py:52: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if not constraint.check(getattr(self, param)).all():

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_entropy
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_log_prob
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance
/usr/lib/python3/dist-packages/torch/distributions/geometric.py:38: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if not self.probs.gt(0).all():

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_log_prob

/usr/lib/python3/dist-packages/torch/distributions/distribution.py:265: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if i != 1 and j != 1 and i != j:

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_log_prob

/usr/lib/python3/dist-packages/torch/distributions/distribution.py:276: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if not support.check(value).all():

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_entropy
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_log_prob
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_rsample
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance
/usr/lib/python3/dist-packages/torch/distributions/utils.py:37: TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
    new_values = [v if isinstance(v, torch.Tensor) or has_torch_function((v,)) else torch.tensor(v, **options)

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_entropy
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_log_prob
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_rsample
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance
/usr/lib/python3/dist-packages/torch/distributions/uniform.py:50: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if self._validate_args and not torch.lt(self.low, self.high).all():

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_entropy
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_log_prob
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_rsample
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance

/usr/lib/python3/dist-packages/torch/distributions/transformed_distribution.py:65: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if base_shape != expanded_base_shape:

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_log_prob

/usr/lib/python3/dist-packages/torch/distributions/distribution.py:258: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if value.size()[event_dim_start:] != self._event_shape:

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_rsample
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance

/usr/lib/python3/dist-packages/torch/distributions/lowrank_multivariate_normal.py:89: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if cov_factor.shape[-2:-1] != event_shape:

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_rsample
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance

/usr/lib/python3/dist-packages/torch/distributions/lowrank_multivariate_normal.py:92: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if cov_diag.shape[-1:] != event_shape:

test/distributions/test_distributions.py::TestJit::test_cdf
test/distributions/test_distributions.py::TestJit::test_enumerate_support
test/distributions/test_distributions.py::TestJit::test_mean
test/distributions/test_distributions.py::TestJit::test_rsample
test/distributions/test_distributions.py::TestJit::test_sample
test/distributions/test_distributions.py::TestJit::test_variance
/usr/lib/python3/dist-packages/torch/tensor.py:587: RuntimeWarning: Iterating over a tensor might cause the trace to be incorrect. Passing a tensor of different shape won't change the number of iterations executed (and might lead to errors or silently give incorrect results).
    warnings.warn('Iterating over a tensor might cause the trace to be incorrect. '

test/distributions/test_distributions.py::TestJit::test_entropy
test/distributions/test_distributions.py::TestJit::test_log_prob
/usr/lib/python3/dist-packages/torch/nn/functional.py:2826: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    if not (target.size() == input.size()):

test/distributions/test_distributions.py::TestJit::test_log_prob
/usr/lib/python3/dist-packages/torch/distributions/gamma.py:66: TracerWarning: torch.as_tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
    value = torch.as_tensor(value, dtype=self.rate.dtype, device=self.rate.device)

test/distributions/test_distributions.py::TestJit::test_log_prob
/usr/lib/python3/dist-packages/torch/distributions/half_cauchy.py:55: TracerWarning: torch.as_tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
    value = torch.as_tensor(value, dtype=self.base_dist.scale.dtype,

test/distributions/test_distributions.py::TestJit::test_log_prob

/usr/lib/python3/dist-packages/torch/distributions/relaxed_categorical.py:80: TracerWarning: Converting a tensor to a Python float might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    log_scale = (torch.full_like(self.temperature, float(K)).lgamma() -

-- Docs: https://docs.pytest.org/en/stable/warnings.html
=========================== short test summary info ============================
FAILED distributions/test_distributions.py::TestDistributions::test_invalid_parameter_broadcasting
=========== 1 failed, 147 passed, 59 skipped, 69 warnings in 21.10s ============
Traceback (most recent call last):
File "/tmp/autopkgtest-lxc.uzyuxcob/downtmp/build.HAJ/src/test/run_test.py", line 926, in <module>
    main()
File "/tmp/autopkgtest-lxc.uzyuxcob/downtmp/build.HAJ/src/test/run_test.py", line 905, in main
    raise RuntimeError(err_message)
RuntimeError: distributions/test_distributions.py failed!
autopkgtest [19:25:23]: test 10_of_49__pytest__test_distributions
