Re: Rethink about who we (Debian) are in rapid dev cycle of deep learning
On 13/01/22 01:00, M. Zhou wrote:
> Cool and useful stuff keeps emerging -- e.g., Facial Authentication for Linux
Note that privacy in the EU (European Union) is governed by the GDPR and
the ePrivacy Directive.
The Directive will be replaced by the ePrivacy Regulation, which will
have stricter rules (it will probably be approved this year).
Note: a directive must be implemented in the national law of each EU
state, and each state can choose how to implement it. A Regulation
becomes effective law in all EU states simultaneously, with the same
rules for all. (In reality this applies not only to EU states, but to
all states in the European Single Market that have not negotiated a
special exception for the field covered by the Regulation. For example,
Norway, which is not an EU state, is subject to the GDPR: companies
have been fined by the Norwegian Data Protection Authority for
violating it.)
I have read that the new ePrivacy Regulation will introduce strict new
rules: for example, no one will be allowed to use AI for facial
recognition (only the police, and only in regulated cases), nor to use
it in more generic ways, for example to profile the people taking part
in a demonstration (to identify whether they are mostly women or men,
their religion, their skin color, their country/region of origin, ...).
Note: in reality, facial recognition in public spaces is already
illegal today.
So facial recognition will be illegal for authenticating workers, for
identifying clients in a shop, and so on.
Note also that some uses of data are already illegal in the EU: for
example, a company used public photos to train an AI and was fined for
it, because it did not have the users' consent for that data
processing.
If I am not mistaken, other non-EU states are also introducing privacy
laws that limit AI usage.
All of this is to say that AI in Debian can introduce not only license
problems, but also legal problems.
I think that if Debian gives users a general AI product that can be
used to train models, then it is the user's responsibility (it is the
user who selects which data to use for training and how the trained
model is used). But if Debian gives users a package that uses a
trained model to do something, then I think there must be at least a
disclaimer... so if there were a package frdm (Facial Recognition
Display Manager) that lets users authenticate with facial recognition
alone, whoever installs/configures it would probably have to be
informed of, and accept, the fact that using this package in some
states can violate the law if it is not used for purely personal
purposes (or something similar).
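As a rough sketch of how such a disclaimer could work in practice, a
package could display a debconf note that the administrator sees at
install time. The package name frdm, the template name, and the wording
below are all hypothetical; only the debconf mechanism itself
(db_input/db_go from confmodule) is standard Debian practice:

```shell
#!/bin/sh
# debian/frdm.config (hypothetical) -- shows a legal notice via debconf.
# The matching debian/frdm.templates entry could look like:
#
#   Template: frdm/legal-notice
#   Type: note
#   Description: Legal notice about facial recognition
#    In some jurisdictions, using facial recognition software outside
#    purely personal use may be illegal. Make sure your use of this
#    package complies with the laws that apply to you.
set -e
. /usr/share/debconf/confmodule

# Queue the note at "medium" priority so it is shown in a default
# installation, then flush the question to the frontend.
db_input medium frdm/legal-notice || true
db_go
```

This only informs the administrator; an acceptance step could instead
use a Type: boolean question that aborts configuration when refused.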
I'm not a legal expert, nor a privacy expert.
But I would be interested to know what other people think about this,
especially if they are legal/privacy experts.