
Re: pytorch 1.8.0 officially supports ROCm (beta status)



Hi Mo,

On 05.03.21 07:11, M. Zhou wrote:
> For your information, pytorch 1.8.0 officially supports ROCm,
> although it's still in the beta status.
> 
> https://pytorch.org/
> (see install pytorch -> stable -> linux -> pip -> python -> rocm)
> 
> This looks like an interesting move for the rocm ecosystem.

While true, it appears that official support for the Polaris 10 chips
(like the RX 580) has been dropped. I'm saddened by this, as I purchased
an RX 580 precisely because it was explicitly listed as officially
supported.

There's unofficial support for Polaris 11 and 12, like the RX 550 and RX
540.
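
For anyone wanting to check whether the ROCm wheel actually picks up
their card, here's a minimal sketch (assuming the 1.8.0 ROCm build from
the page above is installed). The ROCm build reuses the torch.cuda API
via HIP, so the usual calls apply:

    import torch

    # On a ROCm build, torch.version.hip is a version string (it is
    # None on CUDA builds), and torch.cuda calls go through HIP.
    print(torch.__version__)
    print(torch.version.hip)
    print(torch.cuda.is_available())
    if torch.cuda.is_available():
        # Name of the first AMD GPU that ROCm detects and supports.
        print(torch.cuda.get_device_name(0))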

One big difference between CUDA and ROCm, I realized a while ago, is
that CUDA aims to work with all cards, graphics and compute alike,
whereas ROCm aims primarily at compute cards (which is why the previous
flagship, the 5700 XT, was never supported by ROCm).

I think this makes ROCm really difficult to support from a Debian point
of view. Either someone has an officially supported card (one of the
Instinct compute cards), or we'd need to get a porter box with one.

Best,
Christian
