
Re: Bits from /me: A humble draft policy on "deep learning vs. freedom"



Hi,

Some additional data points:

* To train the most widely used convolutional neural network, I use
  4 GTX 1080 Ti cards on an 8-card machine (see the sketch after this
  list). The network occupies around 40 GiB of video memory during
  training.

* The GTX 1080 is roughly the minimum for research or production use.
  Better-funded groups more commonly choose Nvidia Titan X or Tesla
  cards.

* BERT, the state-of-the-art natural language representation model,
  takes 2 weeks to train on a TPU at a cost of about $500:
  https://github.com/google-research/bert
  Training it on CPUs alone would not finish in any practical amount
  of time.
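
For reference, here is a minimal sketch of the 4-of-8-card setup from
the first point above (assuming PyTorch, and using ResNet-50 as a
stand-in model I chose for illustration; neither is specified in this
mail):

    import os

    # Select 4 of the 8 cards before any CUDA context is created.
    os.environ["CUDA_VISIBLE_DEVICES"] = "0,1,2,3"

    import torch
    import torchvision

    model = torchvision.models.resnet50()      # stand-in CNN
    # Replicate the model across the 4 visible cards; each batch is
    # split among them, so per-card memory usage adds up to a total
    # like the ~40 GiB figure above.
    model = torch.nn.DataParallel(model).cuda()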

Regarding the reproducibility problem: in the definition of "Free
Model", I mentioned that the model *should be reproducible* given a
fixed random seed. This is also good practice for ML/DL engineers and
researchers in general.
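
As a minimal sketch of what "fixed random seed" means in practice
(assuming PyTorch purely for illustration; the draft policy itself is
framework-agnostic):

    import random

    import numpy as np
    import torch

    def fix_seed(seed: int = 42) -> None:
        """Pin every common source of randomness to one seed."""
        random.seed(seed)                 # Python stdlib RNG
        np.random.seed(seed)              # NumPy RNG
        torch.manual_seed(seed)           # CPU RNG
        torch.cuda.manual_seed_all(seed)  # all GPU RNGs
        # Trade speed for determinism in cuDNN kernels; without
        # this, GPU results can still differ between runs.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False

    fix_seed(42)

Even with all seeds pinned, some GPU kernels remain non-deterministic,
which is exactly the caveat raised in the quoted message below.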

On 2019-05-21 12:10, julien.puydt@laposte.net wrote:
> Hi
> 
> On 21 May 2019 13:45, Mo Zhou <lumin@debian.org> wrote:
> 
>> It's always good if we can do these things purely with our archive.
>>
>> However sometimes it's just not easy to enforce: datasets used by DL
>> are generally large (several hundred MB to several TB, or even
>> larger).
> 
> And even with the data, the training might need an awfully powerful
> box *and* weeks of computation *and* some of the algorithms aren't
> deterministic, so reproducibility is a problem, not only for Debian
> but for the scientific community at large.
> 
> jpuydt

