i3, a joint CNRS research unit (UMR 9217)

Institut Interdisciplinaire de l'Innovation

Values and Policies of Personal Information (VP-IP) Chair
18 November 2021 • 10:00-12:30 • by videoconference

The Values and Policies of Personal Information (VP-IP) and Connected Cars and Cyber Security (C3S) Chairs are organizing a joint conference on November 18, 2021 dedicated to the processing of personal data in the context of connected vehicles. A research report on Data Protection Impact Assessment (DPIA): the case of connected cars will be presented and its findings discussed.

The Connected Cars & Cyber Security Chair, the twelfth Télécom Paris teaching and research chair financed by companies and the second chair dealing with cybersecurity, in partnership with Nokia, Renault, Thales, Valeo and Wavestone, is conducting its work along five main lines:

1- Risk analysis and operational safety;
2- Data protection and real-time data flow, cryptography and agility;
3- Authentication, identity and behavioral fingerprinting;
4- Resilience by design;
5- Protection of personal data processed in the connected vehicle (legal and societal aspects).

In this fifth area, the C3S Chair works closely with the VP-IP Chair under the coordination of Claire Levallois-Barth (ITM). In this context, Jonathan Keller, a research engineer at the C3S Chair and associate researcher at the VP-IP Chair, and Claire Levallois-Barth have conducted an in-depth study on Data Protection Impact Assessment (DPIA) in connected cars. A first article, entitled "Critical approach to the EDPB guidelines on connected vehicles submitted for consultation", was published in April 2020 in newsletter n°17, on the occasion of the release of the first version of the European Data Protection Board (EDPB) guidelines on the "processing of personal data in the context of connected vehicles and mobility applications".

This work has since given rise to a full research report (see summary below) and a companion document accessible to a wider audience. Both will be made public at the joint conference of the VP-IP and C3S Chairs on the morning of November 18 (see details below).

The DPIA through the prism of a practical case

The digitization of the automobile, and even its autonomization, is a source of hope but also of concern. Chief among these concerns is the fear of a digital panopticon built through the collection and use of personal data, without the knowledge of drivers and passengers, or even of manufacturers.

The General Data Protection Regulation (GDPR) anticipates such fears: Article 35 requires the data controller, in certain circumstances, to assess the impact of any planned processing of personal data likely to entail a high risk to the rights and freedoms of data subjects, in consultation with stakeholders (processors and technology providers in particular), before that processing is implemented.

As a matter of principle, the DPIA is mandatory for certain types of data processing under the GDPR. The European text explicitly provides a list of five distinct criteria: evaluation or scoring of personal aspects, automated decision-making, large-scale systematic monitoring of a publicly accessible area, large-scale processing of sensitive or highly personal data, and large-scale processing in general. In its interpretative guidelines on Article 35 of the GDPR, the European Data Protection Board (EDPB) adds four alternative criteria: matching or combining of data sets, data relating to vulnerable persons, innovative use or application of new organizational solutions, and processing that prevents data subjects from benefiting from a service or a contract.
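To make this screening step concrete, here is a minimal sketch in Python (not taken from the research report) of how a controller might check a planned processing operation against these nine criteria. The criterion names paraphrase the paragraph above, and the "any match means a DPIA is likely required" rule is an illustrative assumption, not a statement of the EDPB's position.

```python
# Criteria listed explicitly by the GDPR (Article 35), as summarized above.
GDPR_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_making",
    "systematic_monitoring_public_area",
    "large_scale_sensitive_or_highly_personal_data",
    "large_scale_processing",
}

# Alternative criteria added by the interpretative guidelines.
GUIDELINE_CRITERIA = {
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_use_or_new_organizational_solutions",
    "processing_blocks_service_or_contract",
}

def dpia_screening(processing_traits: set) -> dict:
    """Flag which DPIA-triggering criteria a planned processing meets."""
    hits = processing_traits & (GDPR_CRITERIA | GUIDELINE_CRITERIA)
    return {
        "criteria_met": sorted(hits),
        # The final call remains a legal assessment by the controller.
        "dpia_likely_required": bool(hits),
    }

# Example: a connected-vehicle service profiling drivers from location data.
print(dpia_screening({"evaluation_or_scoring",
                      "systematic_monitoring_public_area"}))
```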

As a prerequisite to the implementation of the processing, the obligation to conduct a DPIA should be seen as fulfilling two distinct functions:

  • Identify and assess, prior to the deployment of a processing operation, the risks it poses to the personal data processed by the data controller, so as to reduce them, and
  • Provide a risk-management tool for prioritizing the appropriate measures to be deployed and set up.

These two functions should enable the data controller to demonstrate the compliance of its data processing, by showing active prevention measures that mitigate the processing's harmful effects.
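As an illustration of the second function, the following sketch (hypothetical, not the report's method) ranks identified risks by a severity x likelihood score, echoing the four-level scales used by common DPIA methods such as the CNIL's; the risks and ratings below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    severity: int    # 1 (negligible) to 4 (maximum)
    likelihood: int  # 1 (negligible) to 4 (maximum)

    @property
    def score(self) -> int:
        # Simple product heuristic: the higher the score, the sooner
        # the corresponding mitigation measure should be deployed.
        return self.severity * self.likelihood

risks = [
    Risk("unwanted disclosure of geolocation traces", severity=3, likelihood=3),
    Risk("re-identification from viewing-history data", severity=4, likelihood=2),
    Risk("loss of availability of in-vehicle service data", severity=2, likelihood=2),
]

# Print risks in decreasing order of priority.
for risk in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:>2}  {risk.name}")
```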

Several methodologies are available to assist the data controller in conducting a DPIA. Four of them (CNIL, PRIAM, BSI and NIST) were evaluated in the research report, our main criterion being the following question: "which methodologies are the most relevant for analyzing the risks incurred by a personal data processing operation involving a plurality of controllers?" Each methodology has advantages and disadvantages, and takes into account a different set of risks. Although it embodies the accountability principle, in practice the performance of a DPIA suffers from numerous shortcomings, making the legal obligation to perform it all the more cumbersome. The research report thus demonstrates the legal and practical limits of the provisions of Article 35 of the GDPR and of their interpretation by the European and French guidelines. Situated between a reinforced obligation of means in its performance and an obligation of result in the production of its documentary proof, the DPIA proves to be a difficult instrument to implement, to say the least.

To study the DPIA methodologies, a practical case ("Biomem"), covering different hypotheses of personal data processing by different actors, was defined in collaboration with the C3S Chair partners. This practical case, the deployment of a video-on-demand service in a connected vehicle, makes it possible to apply the four impact assessment methodologies, to gauge their effectiveness in such a context and to define adequate risk-mitigation measures. Three scenarios are considered, depending on whether the Biomem service is provided by a standalone entity, by the vehicle manufacturer, or by the video-on-demand provider as an ancillary service. The law and provisions applicable to registration data, biometric data and geolocation data are examined. Interviews were also conducted with data controllers outside the Chair, to gather feedback from practice.

In its third part, the research report carries out an exercise in comparative law. Uncertainties, designated under the terminology of "risks", are mainly assessed at the European level by two legal systems, the Council of Europe and the European Union. This assessment is carried out in practice by the two courts these systems have established, the European Court of Human Rights and the Court of Justice of the European Union. Furthermore, assessing damages under the GDPR requires studying judicial practice in non-EU states. In particular, decisions rendered in the United States and the United Kingdom impose monetary penalties and invite an examination of the concept of harm. These foreign decisions make it possible to explore the possible transposition of such common-law trends into French law.

The research report and the companion document allow us to better situate the Data Protection Impact Assessment process in the regulatory context, and to consider the most relevant DPIA methodologies according to the needs encountered.

Contents of the research report "Data Protection Impact Assessment: the case of connected cars"

  • Introduction
  • Part 1: Definition of the Biomem case study
    ◊ Chapter 1: Presentation of the three hypotheses of the Biomem case study
    ◊ Chapter 2: Law applicable to the processing carried out by Biomem
  • Part 2: Risk analysis methodologies for personal data protection
    ◊ Chapter 1: Position of the European Data Protection Board on data protection impact assessments (DPIA)
    ◊ Chapter 2: Feedback on how to conduct a DPIA
    ◊ Chapter 3: Analysis of the four selected methodologies through the prism of the European Data Protection Board's DPIA Guidelines
  • Part 3: Judicialization of risks related to personal data
    ◊ Chapter 1: Risk assessment by the European Court of Human Rights and the Court of Justice of the European Union
    ◊ Chapter 2: Assessment of personal data breaches by the ordinary courts
  • Conclusion

Details

Date:
18 November 2021
Time:
10:00 - 12:30
Venue

By videoconference
France