In the recent case C‑203/22 (Dun & Bradstreet Austria), the Court of Justice of the European Union (CJEU) clarified the transparency requirements for automated decision-making under the GDPR.
The Court held that when companies use automated decision-making systems, they must provide meaningful explanations of how those systems work. The ruling thus provides some clarity on the scope of the transparency requirements for automated decision-making.
Under Article 15(1)(h) GDPR, the Court determined that controllers must describe the procedure and principles actually applied in order to use the data subject's personal data to obtain a specific result.
The Court emphasized that the general complexity of automated systems cannot be used as an excuse to avoid providing proper explanations to individuals.
Importantly, the ruling does not require companies to reveal their actual algorithms or source code. Rather, it focuses on providing understandable explanations of the decision-making logic and its consequences for individuals.
The Court rightly stated that the transparency obligation aims to enable the data subject to effectively exercise the rights conferred on them by Article 22(3) GDPR, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.
Balancing Trade Secrets and Transparency
The CJEU also clarified that trade secrets (e.g. algorithm logic) cannot automatically override data subject rights. When a controller claims that providing information would compromise trade secrets:
• They must disclose the allegedly protected information to the competent supervisory authority or court
• The authority or court must then balance these competing interests
• National laws that automatically prioritize trade secrets over data access rights are incompatible with the GDPR
Practical Implications
For businesses using automated decision-making systems, this ruling means:
1. You must be prepared to explain how your automated systems work in understandable terms
2. You need to implement processes that allow for human intervention, enable individuals to express their views, and provide ways to contest decisions
3. You should in any event conduct Data Protection Impact Assessments for high-risk automated processing activities (Art. 35 GDPR)
4. You cannot hide behind complexity or trade secrets to avoid transparency obligations
This ruling reinforces that transparency isn’t just a formal checkbox exercise but an important requirement for automated decision-making.