
Re: Rethink about who we (Debian) are in rapid dev cycle of deep learning

On 14/01/22 07:01, tomas@tuxteam.de wrote:
On Thu, Jan 13, 2022 at 10:07:05PM +0100, Davide Prina wrote:
On 13/01/22 01:00, M. Zhou wrote:

Cool and useful stuff keeps emerging -- e.g., Facial Authentication for Linux

note that privacy in the EU (European Union) is governed by the GDPR and
by the ePrivacy Directive.
The Directive will be replaced by the ePrivacy Regulation, which will have
stricter rules (it will probably be approved this year).

As far as I understand the GDPR won't restrict the tech itself, but only
its use. Which makes sense. Basically, no consent => no use, except in
very restricted scenarios (e.g. public security).

That said, to have a workable face recognition, you'll need a training
set (at least with current "solutions"), so you'll have to collect
consent from all those face "providers".

I think it is not so simple. A full reply could be very long and detailed, so I will try to be concise and mention some points that I think are quite "interesting".

If you manage biometric data of EU citizens, you must also consider the following:

* citizens can revoke their consent: so you probably must retire your model and generate a new one without the revoked data. But if you have saved your model in a VCS/DVCS or similar... or you have distributed the model... how can you do that?

* under the new ePrivacy legislation, in some cases consent has a limited period of validity (I don't know whether this applies to this type of use as well) and you need to obtain renewed consent... or delete the data (there are some exceptions, but I don't think they apply in this case; and in any case those exceptions can only extend the validity period)

* if you store and use biometric data, you have to notify the national privacy authority and also obtain approval for the use you are declaring. The consent is valid only if you have completed this step beforehand.

* in theory, from the little I know about AI, a model is something similar to an aggregation/anonymization... but for facial recognition, a researcher has been able to extract original faces from a model used to generate faces of non-existent people. Other researchers have demonstrated that, by combining anonymized data with public data, they can identify some of the real people behind the anonymized data. In these cases the biometric data can be stored only within EU territory, and the servers where it is stored must not be accessible from servers outside the EU (as in my previous reply, in reality there are other territories outside the EU that qualify if they are part of the...)
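The consent-revocation problem above is what the ML literature calls "machine unlearning". As a toy illustration only (the dataset, subject names, and the trivial averaging "model" are all invented for this sketch; a real face-recognition model is far more complex, and removing data from an already-trained deep model is an open research problem), honoring an erasure request could look like dropping the subject's samples and retraining from scratch:

```python
# Toy sketch of honoring a GDPR erasure request by retraining without the
# revoked subject's data. The "model" is deliberately trivial: a per-subject
# mean feature template. All names and numbers are made up.

from statistics import mean

# Hypothetical training set: (subject_id, feature_vector) pairs.
dataset = [
    ("alice", [0.9, 0.1]),
    ("alice", [0.8, 0.2]),
    ("bob",   [0.1, 0.9]),
]

def train(samples):
    """Build a per-subject template by averaging feature vectors."""
    grouped = {}
    for subject, features in samples:
        grouped.setdefault(subject, []).append(features)
    return {
        subject: [mean(col) for col in zip(*vectors)]
        for subject, vectors in grouped.items()
    }

def erase_and_retrain(samples, subject_id):
    """Drop every sample of the revoking subject, then retrain from scratch."""
    remaining = [(s, f) for s, f in samples if s != subject_id]
    return train(remaining)

model = train(dataset)

# "alice" revokes consent: her data and her template must go.
model = erase_and_retrain(dataset, "alice")
```

Note that this only works while you still control every copy of the training data and the model; as pointed out above, copies committed to a version-control repository or already distributed to users cannot be retracted this way.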
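The re-identification result mentioned above is, in its simplest form, a "linkage attack": "anonymized" records still carry quasi-identifiers (ZIP code, birth date, sex) that can be joined against a public dataset. A minimal sketch with entirely invented data:

```python
# Toy linkage attack: records with names stripped are re-identified by
# joining their quasi-identifiers against a public register.
# All records here are invented for illustration.

anonymized = [
    {"zip": "10115", "birth": "1980-03-14", "sex": "F", "diagnosis": "X"},
    {"zip": "80331", "birth": "1975-07-02", "sex": "M", "diagnosis": "Y"},
]

public_register = [
    {"name": "A. Example", "zip": "10115", "birth": "1980-03-14", "sex": "F"},
    {"name": "B. Example", "zip": "80331", "birth": "1975-07-02", "sex": "M"},
]

def link(anon_rows, public_rows, keys=("zip", "birth", "sex")):
    """Join on quasi-identifiers; return (name, sensitive attribute) pairs."""
    index = {tuple(p[k] for k in keys): p["name"] for p in public_rows}
    return [
        (index[tuple(r[k] for k in keys)], r["diagnosis"])
        for r in anon_rows
        if tuple(r[k] for k in keys) in index
    ]

# Here every "anonymized" record is re-identified by the join.
matches = link(anonymized, public_register)
```

This is why the GDPR treats data as personal whenever a person is identifiable by "all means reasonably likely to be used", not just when a name is attached.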

All the above said, I'm not a lawyer. Nor do I play one on TV :)

I'm not a law/privacy expert, so I may have gotten something wrong.

