Editorial of March 2024 – Official Blog of UNIO

By Alessandra Silveira

On inferred personal data and the difficulties of EU law in dealing with this matter

The right not to be subject to automated decisions was considered for the first time before the Court of Justice of the European Union (CJEU) in the recent SCHUFA judgment. Article 22 GDPR (on individual decisions based solely on automated processing, including profiling) has always raised many doubts among legal scholars:[1] i) what would a decision taken “solely” on the basis of automated processing be?; ii) does this Article provide for a right or, rather, a general prohibition whose application does not require the party concerned to actively invoke a right?; iii) to what extent does such an automated decision produce legal effects or similarly significantly affect the data subject?; iv) do the provisions of Article 22 GDPR apply only where there is no relevant human intervention in the decision-making process?; v) if a human being examines and weighs other factors when making the final decision, is that decision not made “solely” on the basis of the automated processing [and, in that situation, does the prohibition in Article 22(1) GDPR not apply]?

To these doubts a German court has added a few more. SCHUFA is a private company under German law which provides its contractual partners with information on the creditworthiness of third parties, in particular consumers. To that end, it establishes a prognosis on the probability of a future behaviour of a person (‘score’), such as the repayment of a loan, based on certain characteristics of that person, on the basis of mathematical and statistical procedures. The establishment of scores (‘scoring’) is based on the assumption that, by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way, similar behaviour can be predicted.[2]
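SCHUFA’s actual calculation method is, as noted below, a trade secret; but the group-based assumption the Court describes can be illustrated with a minimal, purely hypothetical sketch (all names, fields and figures are invented for illustration):

```python
# Hypothetical sketch of group-based scoring: a person is assigned to a
# group of people with comparable characteristics, and the group's observed
# repayment rate serves as that person's "score". Purely illustrative;
# not SCHUFA's actual (secret) method.
from dataclasses import dataclass

@dataclass(frozen=True)
class Profile:
    age_band: str        # e.g. "30-39"
    region: str          # e.g. "Bavaria"
    has_mortgage: bool

# Invented historical observations per group: (loans repaid, loans defaulted)
HISTORY: dict[Profile, tuple[int, int]] = {
    Profile("30-39", "Bavaria", True): (180, 20),
    Profile("30-39", "Bavaria", False): (120, 80),
}

def score(person: Profile) -> float | None:
    """Return the observed repayment rate of the person's group, if any."""
    outcome = HISTORY.get(person)
    if outcome is None:
        return None  # no comparable group observed
    repaid, defaulted = outcome
    return repaid / (repaid + defaulted)

print(score(Profile("30-39", "Bavaria", True)))   # 0.9
print(score(Profile("30-39", "Bavaria", False)))  # 0.6
```

The sketch makes the legal point visible: the individual never provides the score; it is inferred from how other people in the same group have behaved.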

SCHUFA provided a financial entity with a score for the applicant OQ, which served as the basis for refusing to grant the credit for which the latter had applied. That citizen subsequently asked SCHUFA to erase the entry concerning her and to grant her access to the corresponding data. However, SCHUFA merely informed her of the relevant score and, in general terms, of the principles underlying the method of calculating the score, without informing her of the specific data included in that calculation, or of the relevance attributed to them in that context, arguing that the method of calculation was a trade secret.

However, according to the referring court, it is ultimately the credit score established by credit information agencies that actually decides whether and how a financial entity/bank enters into a contract with the data subject. The referring court proceeds on the assumption that the establishment of a score by a credit information agency does not merely serve to prepare that bank’s decision, but constitutes an independent “decision” within the meaning of Article 22(1) of the GDPR.[3]

As we have highlighted on this blog,[4] this case law is particularly relevant because profiling is often used to make predictions about individuals. It involves collecting information about a person and assessing their characteristics or patterns of behaviour in order to place them in a certain category or group and to draw on that inference or prediction – whether of their ability to perform a task, their interests or presumed behaviour, etc. To this extent, such automated inferences demand protection as inferred personal data, since they also make it possible to identify someone by association of ideas, characteristics, or contents. The crux of the matter is that people are increasingly losing control over such automated inferences and over how they are perceived and evaluated by others.

In the SCHUFA case the CJEU was called upon to clarify the scope of the regulatory powers that certain provisions of the GDPR bestow on the national legislature, namely the exception to the prohibition in Article 22(2)(b) GDPR – according to which that prohibition does not apply if the decision is authorised by European Union or Member State law to which the controller is subject. This is relevant because, if Article 22(1) GDPR were to be interpreted as meaning that the establishment of a score by a credit information agency is an independent decision within the meaning of Article 22(1) of the GDPR, that activity would be subject to the prohibition of automated individual decisions. Consequently, it would require a legal basis under Member State law within the meaning of Article 22(2)(b) of the GDPR.

So, what is new about this ruling? Firstly, the CJEU ruled that Article 22(1) of the GDPR provides for a prohibition tout court whose violation does not have to be invoked individually by the data subject. In other words, this provision rules out the possibility of the data subject being made the object of a decision taken solely on the basis of automated processing, including profiling, and clarifies that active behaviour by the data subject is not necessary to make this prohibition effective.[5] In any case, the prohibition will not apply where the conditions established under Article 22(2) and Recital 71 of the GDPR are met. That is to say, the adoption of a decision based solely on automated processing is authorised only in the cases referred to in Article 22(2), namely where: i) it is necessary for entering into, or performance of, a contract between the data subject and a data controller [paragraph (a)]; ii) it is authorised by Union or Member State law to which the controller is subject [paragraph (b)]; or iii) it is based on the data subject’s explicit consent [paragraph (c)].[6]

Secondly, the CJEU clarified the extent to which Member State law is permitted to establish exceptions to the prohibition under Article 22(2)(b) of the GDPR. According to the CJEU, it follows from the very wording of this provision that national law authorising the adoption of an automated individual decision must provide for appropriate measures to safeguard the rights and freedoms and the legitimate interests of the data subject. In light of Recital 71 of the GDPR, such measures should include the use of appropriate mathematical or statistical procedures for the profiling, the implementation of technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, and the securing of personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons. The SCHUFA case also made clear that the data subject has the right to i) obtain human intervention; ii) express his or her point of view; and iii) challenge the decision. The CJEU has thus dispelled any doubts as to whether the national legislator is bound by the rights provided for in Article 22(3) of the GDPR, despite the somewhat equivocal wording of that provision, which textually refers only to Article 22(2)(a) and (c), seemingly excluding Member States from that obligation.[7] The CJEU also added that Member States may not adopt, under Article 22(2)(b) of the GDPR, rules that authorise profiling in violation of the principles and legal bases imposed by Articles 5 and 6 of the GDPR, as interpreted by CJEU case law.[8]

Lastly, the CJEU acknowledged the broad scope of the concept of “decision” within the meaning of the GDPR, ruling that a profile may in itself be an exclusively automated decision within the meaning of Article 22 of the GDPR. The CJEU explained that there would be a risk of circumventing Article 22 of the GDPR and, consequently, a lacuna in legal protection if a restrictive interpretation of that provision were adopted, according to which the establishment of the probability value must be considered only as a preparatory act and only the act adopted by the third party can, where appropriate, be classified as a “decision” within the meaning of Article 22(1).[9] Indeed, in that situation, the establishment of a probability value such as that at issue in the main proceedings would escape the specific requirements provided for in Article 22(2) to (4) of the GDPR, even though that procedure is based on automated processing and produces effects significantly affecting the data subject, to the extent that the action of the third party to whom that probability value is transmitted draws strongly on it. This would also result in the data subject not being able to assert, against the credit information agency which establishes the probability value concerning him or her, the right of access to the specific information referred to in Article 15(1)(h) of the GDPR, in the absence of automated decision-making by that company. Even assuming that the act adopted by the third party falls within the scope of Article 22(1) in so far as it fulfils the conditions for application of that provision, that third party would not be able to provide that specific information because it generally does not have it.[10]

In short, the fact that the determination of a probability value is covered by Article 22(1) of the GDPR results in its prohibition, unless one of the exceptions set out in Article 22(2) of the GDPR applies – including authorisation by the law of the Member State, a possibility which the CJEU has interpreted restrictively – and the specific requirements set out in Article 22(3) and (4) of the GDPR are complied with.

However, the CJEU’s decision in SCHUFA still leaves many questions without a clear answer. Considering the specific request for a preliminary ruling, the CJEU answered that Article 22(1) of the GDPR must be interpreted as meaning that the automated establishment, by a credit information agency, of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future constitutes “automated individual decision-making” within the meaning of that provision, where a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person (our italics).[11]

Even if the CJEU’s answer results from the specific wording of the question referred for a preliminary ruling – as written by the national judge, who is the “master” of the referral – the question remains as to the scope of the CJEU’s answer. Did the CJEU perhaps admit that profiling is, in itself, an exclusively automated decision – and, in principle, prohibited – but only when the probability value is decisive for the decision on the contractual relationship? Would this not confirm the idea, rejected by the CJEU in paragraph 61 of the SCHUFA judgment, that the determination of the probability value is a mere preparatory act? And if the probability value is not decisive for the decision on the contractual relationship, does the prohibition in Article 22 of the GDPR no longer apply?

As we have previously argued on this blog, the problem should be seen as profiling itself, regardless of whether or not it is decisive for the decision of a third party. When profiling produces legal effects or similarly significantly affects a data subject, it should be regarded as an automated decision in accordance with Article 22 of the GDPR. The purpose of Article 22 of the GDPR is to protect individuals against particular risks to their rights and freedoms arising from the automated processing of personal data, including profiling – as the CJEU explains in paragraph 57 of the judgment in question. And that processing involves, as is clear from Recital 71 of the GDPR, the evaluation of personal aspects relating to the natural person affected by that processing, in particular the analysis and prediction of aspects concerning that person’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements – as the CJEU rightly explains in paragraph 58 of the judgment in question.

It is important to remember that profiling always involves inferences and predictions about the individual, regardless of whether automated individual decisions based on profiling are then taken by a third party. Creating a profile involves three distinct phases: i) data collection; ii) automated analysis to identify correlations; and iii) applying those correlations to an individual to identify present or future behavioural characteristics (see the sketch below). Where automated individual decisions based on profiling do occur, these are also subject to the GDPR – whether exclusively automated or not. That is, profiling is not limited to the mere categorisation of the individual; it also includes inferences and predictions about the person. However, the effective application of the GDPR to inferred data faces several obstacles. This is because the GDPR was designed for data provided directly by the data subject – not for data inferred by digital technologies such as AI systems. That is the challenge behind this judgment.
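To make those three phases concrete, here is a deliberately simplified, hypothetical sketch (all data and field names are invented); it shows how a correlation observed in other people’s records becomes an inference about someone who never disclosed anything of the kind:

```python
# Hypothetical illustration of the three phases of profiling.
# Phase i) data collection: records about many individuals.
records = [
    {"name": "A", "late_payments": 0, "repaid_loan": True},
    {"name": "B", "late_payments": 3, "repaid_loan": False},
    {"name": "C", "late_payments": 2, "repaid_loan": False},
    {"name": "D", "late_payments": 1, "repaid_loan": True},
]

# Phase ii) automated analysis: identify a correlation in the collected data
# (here, the repayment rate of people with few vs. many late payments).
def repayment_rate(group):
    return sum(r["repaid_loan"] for r in group) / len(group)

few_late = [r for r in records if r["late_payments"] <= 1]
many_late = [r for r in records if r["late_payments"] > 1]
correlation = {
    "few_late": repayment_rate(few_late),    # 1.0
    "many_late": repayment_rate(many_late),  # 0.0
}

# Phase iii) application: project the group-level correlation onto a new
# individual, inferring a characteristic he or she never disclosed.
new_person = {"late_payments": 2}
group = "many_late" if new_person["late_payments"] > 1 else "few_late"
inferred_score = correlation[group]
print(inferred_score)  # 0.0: an inference about the new person
```

The value produced in phase iii) is precisely the kind of inferred personal data over which, as argued above, individuals are losing control.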


[1] See Alessandra Silveira, “Profiling and cybersecurity: a perspective from fundamental rights’ protection in the EU”, in Francisco Andrade/Joana Abreu/Pedro Freitas (eds.), Legal developments on cybersecurity and related fields, Springer International Publishing, Cham/Switzerland, 2024.

[2] See Judgment SCHUFA, paragraph 14.

[3] See Request for a preliminary ruling of 1 October 2021, Case C-634/21, paragraph 23.

[4] See Alessandra Silveira, “Finally, the ECJ is interpreting Article 22 GDPR (on individual decisions based solely on automated processing, including profiling)”, https://officialblogofunio.com/2023/04/10/finally-the-ecj-is-interpreting-article-22-gdpr-on-individual-decisions-based-solely-on-automated-processing-including-profiling/

[5] See Judgment SCHUFA, paragraph 52.

[6] See Judgment SCHUFA, paragraph 53.

[7] See Judgment SCHUFA, paragraphs 65 and 66.

[8] See Judgment SCHUFA, paragraph 68. See also the ECJ decision in Joined Cases C‑26/22 and C‑64/22.

[9] See Judgment SCHUFA, paragraph 61.

[10] See Judgment SCHUFA, paragraph 63.

[11] See Judgment SCHUFA, paragraph 73.

Picture credits: Photo by Pixabay on Pexels.com.
