i3, a joint CNRS research unit (UMR 9217)

Institut Interdisciplinaire de l'Innovation

Chaire Valeurs et Politiques des Informations Personnelles
Posted on 18 October 2021

The Connected Cars & Cyber Security Chair is the twelfth teaching and research chair at Télécom Paris to be financed by companies, and the second to deal with cybersecurity. In partnership with Nokia, Renault, Thales, Valeo and Wavestone, it is conducting its work along five main lines:

Risk analysis and operational safety;
Real-time data and data flow protection, cryptography and agility;
Authentication, identity and behavioral fingerprinting;
Resilience by design;
Protection of personal data involved in the connected vehicle (legal and societal aspects).

In this fifth area, the C3S Chair works closely with the VP-IP Chair under the coordination of Claire Levallois-Barth (ITM). In this context, Jonathan Keller, a research engineer at the C3S Chair and associate researcher at the VP-IP Chair, and Claire Levallois-Barth have conducted an in-depth study of the Data Protection Impact Assessment (DPIA) in connected cars. A first article, entitled "Critical approach to the EDPB guidelines on connected vehicles submitted for consultation", was published in April 2020 in newsletter n°17, on the occasion of the release of the first version of the European Data Protection Board (EDPB) guidelines on the "processing of personal data in the context of connected vehicles and mobility applications".

This work has since given rise to a full Research Report (see summary below) and a companion document accessible to a wider audience. Both will be made public at the joint conference of the two chairs, VP-IP and C3S, on the morning of November 18 (see details below).

The DPIA through the prism of a practical case

The digitization of the automobile, and even its autonomization, is a source of hope but also of concern. Chief among these concerns is the fear of a digital panopticon, built up without the knowledge of drivers and passengers, or even of manufacturers, through the collection and use of their personal data.

Such fears are anticipated by the General Data Protection Regulation (GDPR), Article 35 of which requires the data controller, in certain circumstances, to carry out an impact assessment, in consultation with stakeholders (processors and technology providers in particular), prior to the implementation of any processing of personal data likely to result in a high risk to the rights and freedoms of the persons concerned.

As a matter of principle, the DPIA is mandatory for certain types of data processing provided for by the GDPR. In this regard, the European text explicitly lists five distinct criteria: evaluation or scoring of personal aspects; automated decision-making; large-scale systematic monitoring of a publicly accessible area; large-scale processing of sensitive or highly personal data; and data processed on a large scale. In its interpretative guidelines on Article 35 of the GDPR, the European Data Protection Board (EDPB) adds four alternative criteria: cross-referencing or combining of data sets; data relating to vulnerable persons; innovative use of technology or application of new organizational solutions; and processing that prevents data subjects from benefiting from a service or contract.
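The trigger criteria above can be sketched as a simple checklist. This is an illustrative sketch only, not a legal tool: the criterion identifiers are our paraphrase of the nine criteria cited in the text, and the "two or more criteria" threshold is the rule of thumb used in the European DPIA guidelines, not a requirement stated in this report.

```python
# Hypothetical checklist for deciding whether a DPIA is likely required.
# Criterion names paraphrase the nine criteria listed above (assumption,
# not an official taxonomy).

CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_making",
    "systematic_monitoring_public_area",
    "sensitive_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_use_of_technology",
    "processing_prevents_service_or_contract",
}

def dpia_required(flags: set, threshold: int = 2) -> bool:
    """Return True when the processing meets enough trigger criteria.

    The default threshold of 2 follows the guidelines' rule of thumb that
    processing meeting two criteria generally requires a DPIA.
    """
    unknown = flags - CRITERIA
    if unknown:
        raise ValueError(f"unknown criteria: {sorted(unknown)}")
    return len(flags) >= threshold

# A connected-vehicle service combining geolocation with biometric data:
print(dpia_required({"sensitive_or_highly_personal_data",
                     "matching_or_combining_datasets"}))  # True
```

In a real assessment the criteria are a matter of legal analysis, not boolean flags; the sketch only shows why the list of criteria lends itself to a systematic screening step.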

As a prerequisite to the implementation of processing, the requirement to conduct a DPIA should be seen as fulfilling two distinct functions:

To identify and assess, prior to the deployment of a processing operation, the risks it poses to the personal data processed, so that the inherent risks can be mitigated; and
To provide a risk management tool for prioritizing the appropriate measures to be deployed and set up.

These two functions should enable the data controller to demonstrate the compliance of its data processing by highlighting the active preventive measures taken to mitigate the harmful effects of the processing.

Several methodologies are available to assist the data controller in conducting a DPIA. Four of them (CNIL, PRIAM, BSI and NIST) were evaluated in the research report, our main criterion being the following question: "Which methodologies are the most relevant for analyzing the risks incurred in the context of a personal data processing operation involving a plurality of controllers?" Each methodology has advantages and disadvantages, and takes into account a different set of risks. Although the DPIA is a manifestation of the principle of accountability, in practice its performance suffers from numerous shortcomings, making the legal obligation to perform it all the more cumbersome. The research report thus demonstrates the legal and practical limits of the provisions of Article 35 of the GDPR and of its interpretation by the European and French guidelines. Situated between a reinforced obligation of means in its realization and an obligation of result in the demonstration of its documentary proof, the DPIA proves, to say the least, a difficult instrument to implement.
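The second DPIA function described earlier, a risk management tool for prioritizing measures, can be illustrated as follows. This is a hypothetical sketch: the four-level severity and likelihood scales loosely follow the CNIL PIA methodology's vocabulary, and the risk entries are invented examples, not findings from the report.

```python
# Illustrative risk-prioritization step of a DPIA: score each risk by
# severity (impact on data subjects) and likelihood (feasibility of the
# threat), then treat the highest-scoring risks first. Scales and entries
# are assumptions for the example.

from dataclasses import dataclass

SCALE = {1: "negligible", 2: "limited", 3: "significant", 4: "maximum"}

@dataclass
class Risk:
    name: str
    severity: int    # impact on the data subjects, 1-4
    likelihood: int  # feasibility of the threat, 1-4

    @property
    def score(self) -> int:
        return self.severity * self.likelihood

risks = [
    Risk("illegitimate access to geolocation history", severity=3, likelihood=3),
    Risk("unwanted modification of biometric templates", severity=4, likelihood=2),
    Risk("disappearance of consent records", severity=2, likelihood=2),
]

# Rank risks so mitigation measures can be prioritized.
for r in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"{r.score:>2}  severity={SCALE[r.severity]:<12} {r.name}")
```

The design choice mirrors what the methodologies compared in the report have in common: turning a qualitative legal analysis into an ordered list of measures, which is precisely where the methodologies diverge on which risks they take into account.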

In order to study the methodologies related to the DPIA, a practical case ("Biomem") was defined in collaboration with the partners of the C3S Chair, taking into account different hypotheses of personal data processing by different actors. This practical case, the implementation of an on-demand video service in a connected vehicle, offers the possibility of applying the four impact assessment methodologies, gauging their effectiveness in such a context, and defining adequate risk mitigation measures. Three hypotheses are considered, depending on whether the Biomem service is provided by a standalone entity, by the vehicle manufacturer, or by the video-on-demand provider as an ancillary service. The law and provisions applicable to registration data, biometric data and geolocation data are examined. Interviews were also conducted with data controllers outside the Chair, providing practitioner feedback.

In its third part, the research report carries out an exercise in comparative law. Uncertainties, referred to as "risks", are mainly assessed at the European level by two legal systems, that of the Council of Europe and that of the European Union. In practice, this assessment is carried out by the two courts these systems have established: the European Court of Human Rights and the Court of Justice of the European Union. Furthermore, the question of assessing damages under the GDPR requires studying judicial practice in non-EU states. In particular, decisions rendered in the United States and the United Kingdom impose monetary penalties and invite an examination of the concept of harm. These foreign decisions make it possible to explore the possible transposition of such Anglo-Saxon trends into French law.

The research report and the companion document allow us to better situate the Data Protection Impact Assessment process in the regulatory context, and to consider the most relevant DPIA methodologies according to the needs encountered.