In September 2022, ANITI and Ekitia launched a citizen consultation on the knowledge, acceptability and ethical challenges of AI and data.

This consultation was divided into four separate surveys, each aimed at a specific target audience, and addressed citizens' knowledge of AI and their concerns about key issues such as the use of personal data, human-machine collaboration, non-discrimination and energy sufficiency.

This survey is now closed. After the results were analyzed by a team made up of members of the survey working group, students, members of the Académie de Toulouse, scientists and industrial partners, 6 major observations and 4 recommendations emerged; they are summarized in the document available for download below.

Observations:

  • AI is understood in very different ways by different people
  • AI knowledge is uneven
  • there is strong mistrust of the use of personal data to improve services, including public services
  • purpose is key to AI acceptance
  • AI is seen as indispensable, but not neutral
  • citizens are in favor of AI development, under certain conditions

Based on these findings, the working group that contributed to the analysis of the results made 4 recommendations:

1. Increase knowledge:

  • continue and expand awareness-raising and training initiatives for all ages
  • agree on a shared definition of AI, its properties and the criteria for quantifying it

2. Make the use of data and AI more transparent:

  • strengthen vigilance

3. Demystify the possible applications and uses of AI:

  • focus explanation efforts on sectors identified as priorities for AI deployment

4. Ensure ethical AI applications:

  • anticipate future AI regulations
  • apply a regulatory approach right from the design stage of any AI project
  • involve citizens more closely
