
The AI & Privacy Explorer #25/2024 (7-23 June)

  • Jul 1, 2024
  • 11 min read

Welcome to the AI, digital and privacy news recap for week 25 of 2024 (7-23 June)! 

🔍 Find more details on each topic below.


📧 LfDI Rhineland-Pfalz Launches Information Campaign on Direct Marketing

The Rhineland-Palatinate data protection authority (LfDI) initiated an information campaign on postal and email advertising to raise awareness of data protection laws. The campaign outlines the balance required under the General Data Protection Regulation (GDPR), particularly under Art. 6(1)(f) GDPR, which allows data processing for legitimate interests, including marketing, provided it meets specific criteria.

👉 Read more here.

 

🤝 California Settles with Tilting Point Media over Children's Data Violations in the Kids' Game "SpongeBob: Krusty Cook-Off"

On 18 June 2024, California Attorney General Rob Bonta and Los Angeles City Attorney Hydee Feldstein Soto finalized a settlement with Tilting Point Media LLC concerning violations of data privacy laws, specifically the California Consumer Privacy Act (CCPA) and the Children's Online Privacy Protection Act (COPPA). The case centered on the mobile game "SpongeBob: Krusty Cook-Off," where the company collected and shared children's data without appropriate parental consent.

Key Violations and Settlement Details

The investigation revealed that Tilting Point Media misconfigured third-party software development kits (SDKs), leading to unauthorized data collection from children. Additionally, the game’s age-screening mechanism failed to effectively determine users' ages.

The settlement requires Tilting Point to pay $500,000 in civil penalties, divided equally between the California Attorney General and the Los Angeles City Attorney. Additionally, the company must adhere to comprehensive injunctive provisions, which include:

  • Compliance with COPPA and CCPA regarding children's data across all games.

  • Not selling or sharing personal information of children under 13 without verified parental consent.

  • Implementing neutral age-screening mechanisms to ensure accurate age input by users.

  • Proper configuration and governance of third-party software development kits (SDKs) to prevent unauthorized data collection.

  • Ensuring clear parental consent for data collection from children under 13.

  • Limiting data sharing and advertising to comply with legal standards.
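To make the "neutral age-screening" requirement concrete, here is a minimal illustrative sketch (my own, not taken from the settlement): a neutral age gate asks for a full date of birth with no pre-filled defaults and no hint that 13 is the qualifying age, then branches on the computed age.

```python
from datetime import date

def is_under_13(birth_year: int, birth_month: int, birth_day: int) -> bool:
    """Return True if the user is under 13 (the COPPA threshold).

    A 'neutral' age screen collects a full date of birth without
    defaults or on-screen hints about the qualifying age, so users
    are not steered toward entering an age that unlocks the game.
    """
    today = date(2024, 6, 18)  # fixed here for a deterministic example; use date.today() in practice
    birthday = date(birth_year, birth_month, birth_day)
    # Subtract one year if this year's birthday has not yet occurred.
    age = today.year - birthday.year - (
        (today.month, today.day) < (birthday.month, birthday.day)
    )
    return age < 13
```

In practice the under-13 branch would route the user into a limited, consent-gated experience (or trigger verifiable parental consent) rather than simply blocking them.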


👉 Read the press release here.

 

💼 R.R. Donnelley Settles SEC Charges Over Third-Party Cybersecurity Failures

On 18 June 2024, the US Securities and Exchange Commission announced a $2.125 million settlement with R.R. Donnelley & Sons Co., a leading provider of business communications and marketing services, due to cybersecurity-related violations. The settlement addressed the company's oversight failures regarding its third-party managed security services provider (MSSP) during a significant cybersecurity incident in late 2021.

Key Points

  • Background: R.R. Donnelley, headquartered in Chicago, provides global business communication services. It stores and transmits sensitive data for a variety of clients, including SEC-registered firms and financial institutions.

  • Third-Party Management Failure: The SEC found that Donnelley failed to adequately manage and supervise its third-party managed security services provider, which was responsible for monitoring cybersecurity alerts. This lack of oversight led to significant delays in response to critical alerts, ultimately resulting in a ransomware attack that compromised 70 GB of client data.

  • Legal Findings: The company was charged with violating Section 13(b)(2)(B) of the Securities Exchange Act and Exchange Act Rule 13a-15(a), both of which mandate robust internal controls and proper disclosure of cybersecurity risks.

  • Settlement and Cooperation: R.R. Donnelley cooperated with the SEC investigation by promptly adopting new cybersecurity measures, revising internal procedures, and increasing cybersecurity staffing. This cooperation influenced the settlement terms, which reflected their proactive remedial actions.

👉 Read the press release here.

 

 

🌐 OECD Releases Report on Digital Safety for Children

The OECD released a comprehensive report on digital safety for children, focusing on embedding safety by design in digital products and services. The report builds on the OECD's 2021 guidelines and recommendations, offering a framework for governments and digital service providers. It emphasizes designing digital environments that cater to children's needs and vulnerabilities.

The report emphasizes eight key components to enhance children's safety in digital environments:

  • Age Assurance: Implementing tech-neutral mechanisms to ensure age-appropriate content and experiences.

  • Child-Centered Design: Considering developmental stages and socio-economic differences among children.

  • Preventing and Detecting Harm: Utilizing default settings, filters, and shared signals for risk mitigation.

  • Privacy Protection: Emphasizing data protection by design, transparency in data collection, and careful monitoring of sensitive data.

  • Child-Friendly Information: Providing clear safety policies, terms, and standards in accessible language for young users.

  • Complaints and Redress: Facilitating mechanisms for flagging unsafe or illegal content by children.

  • Child Participation: Involving children in decision-making processes, respecting their right to be heard.

  • Culture of Safety: Promoting safety awareness, corporate responsibility, and conducting child rights impact assessments.

The report also outlines the global movement towards digital safety for children, urging jurisdictions to adopt cohesive policies to prevent regulatory fragmentation. It includes case studies illustrating best practices and risks in digital platforms, such as:

  • LEGO Life App: A child-focused platform where safety measures include parental consent for account creation, child-centered design, and strict moderation to prevent personal information sharing. Privacy is ensured through anonymous user profiles.

  • Roblox: This platform balances creative and social opportunities with safety. It utilizes a "Trust by Design" process, engaging cross-functional teams in safety assessments. Age-appropriate controls, asset pre-moderation, and user guidance help mitigate risks.

  • Omegle: A platform originally aimed at adults, but accessible to children, exemplifies potential harms due to lack of effective moderation. It presents high risks of inappropriate content and interactions, illustrating the critical need for robust age assurance and harm prevention measures.

The report emphasizes that there is no one-size-fits-all solution; safety measures must be tailored to the specific risk profiles of different services.

👉 Read the report here.


 

 

🇪🇺 EDPB Finalizes Guidelines on Law Enforcement Data Transfers

On 19 June 2024, the European Data Protection Board (EDPB) released the finalized version of Guidelines 01/2023, which addresses Article 37 of the Law Enforcement Directive (LED). These guidelines are designed to assist EU Member States in establishing appropriate safeguards for personal data transfers by competent authorities to third countries or international organizations, ensuring that such transfers maintain a high level of data protection.

Legal Mechanisms and Safeguards

The guidelines detail the requirements for choosing legal mechanisms, emphasizing the importance of legally binding instruments. These instruments should ensure an essentially equivalent level of data protection, offering more legal certainty and transparency compared to self-assessment methods.

Assessment of Transfer Circumstances

Authorities are advised to assess the risks involved in data transfers, considering the impact on the fundamental rights of data subjects. This involves evaluating specific transfer circumstances and ensuring that appropriate safeguards are in place.

Accountability and Compliance

Competent authorities must maintain detailed records of processing activities, regularly review and update their assessments of data transfers, and cooperate with supervisory authorities. The guidelines also stress the need for Member States to align international agreements with LED standards to avoid undermining data protection when transferring personal data outside the EU.

👉 Read more here.


 

 

💸 The Norwegian Data Protection Authority cannot impose daily fines in cross-border cases

On 18 June 2024, the Norwegian Privacy Appeals Board (Personvernnemnda) ruled that the Norwegian Data Protection Authority (DPA) cannot impose daily fines on Meta for not complying with a ban on behavioral marketing on Facebook and Instagram.

Background

  • Ban on Marketing: The Norwegian DPA had banned behavioral marketing due to GDPR violations. Meta failed to comply, leading to the imposition of daily fines in a subsequent decision.

  • Daily Fines: Under Norwegian law, daily fines are permissible to enforce compliance. The DPA applied this to Meta, imposing NOK one million per day.

  • Appeals Board's Decision: The Privacy Appeals Board ruled that these fines could not be applied to international companies like Meta, stating that Norwegian law's provision for daily fines is limited to domestic entities. The reason cited is that the power to impose daily fines stems from national law, and thus cannot apply to foreign entities even in an urgency decision pursuant to Article 66 GDPR.

Implications

Enforcement Disparity: The ruling creates a situation where Norwegian companies can be fined daily, but international firms might escape such penalties, leading to potential inequities in enforcement.

Call for Legal Clarity: The DPA expressed concern over this interpretation and urged lawmakers to clarify the law to ensure consistent application across all companies, regardless of their origin. I agree that the reasoning is strange and the consequences quite ridiculous.

👉 Read the decision here and the DPA statement here (in Norwegian).


 

⚖️ CJEU Ruling on Compensation for Data Breach [C-182/22 and C-189/22 Scalable Capital]

The Court of Justice of the European Union (CJEU) delivered a judgment on 20 June 2024 in joined cases C-182/22 and C-189/22 Scalable Capital, concerning a controller's liability for damage caused to individuals. The ruling focused on the interpretation of Article 82(1) GDPR, which governs compensation for non-material damage resulting from data breaches.

Background

The applicants, who had opened accounts with Scalable Capital, experienced theft of their personal data, including names and ID copies, by unknown third parties. They sought compensation before the Local Court of Munich, which referred questions to the CJEU about the nature of compensation under the GDPR.

CJEU Findings

  1. Compensatory Nature: The CJEU clarified that Article 82(1) serves an exclusively compensatory function, not a punitive one. Compensation aims to fully redress the damage suffered due to GDPR violations.

  2. Severity and Intent: The court held that the severity or intentional nature of the GDPR breach does not need to be considered when determining compensation. Non-material damage from data breaches should not be viewed as inherently less significant than physical harm.

  3. Minimal Compensation: The CJEU stated that when non-material damage is minor, national courts could award minimal compensation, provided it fully compensates the affected individuals.

  4. Identity Theft: The ruling emphasized that 'identity theft' involves actual misuse of personal data by a third party. However, compensation is not limited to cases where such misuse leads to fraud or further identity theft.

This judgment reinforces that individuals must receive full compensation for non-material damages due to data breaches under the GDPR, emphasizing the regulation's protective measures for personal data rights.

👉 Read the decision here.


 

🏛️ CJEU Ruling on Compensation for Non-Material Damages based on fear

On 20 June 2024, the Court of Justice of the European Union (CJEU) issued a ruling in Case C-590/22, concerning the right to compensation for non-material damage under GDPR Article 82(1). This followed a request from the Local Court of Wesel, Germany, about claims for non-material damage stemming from the disclosure of personal data.

Background

The case arose when tax returns were mistakenly sent to an incorrect address, resulting in unauthorized access to sensitive data. Claimants sought compensation, asserting they suffered non-material damage due to this disclosure. The Local Court referred several questions to the CJEU, focusing on whether GDPR infringement alone warrants compensation.

CJEU Findings

The court clarified that infringement alone does not justify compensation; actual damage must be shown, though no specific severity threshold is required. It also established that fear of data disclosure could merit compensation if the negative impact is substantiated. The CJEU emphasized that while fear must be proven, the existence of non-material damage does not need to be extensive.

Criteria for Damages

The Court ruled out the application of the administrative fine criteria from Article 83 when determining compensation amounts, since these are different types of liability. Additionally, it stated that compensation is purely compensatory, not punitive (as in the other decision from 20 June in Cases C-182/22 and C-189/22 Scalable Capital), and thus should not serve a dissuasive function. The court further indicated that simultaneous breaches of national laws unrelated to GDPR specifications should not affect compensation assessments under Article 82(1) GDPR.

👉 Read the decision here.


 

🚦 Spain’s AEPD Explores Identity as a Service vs. a Fundamental Right

On 20 June 2024, the AEPD examined the implications of treating identity as a service rather than a fundamental right, emphasizing the potential threats to personal control over data and privacy.

Identity as a Fundamental Right

  • Recognized in international law, essential for social inclusion and equality.

  • Access to services: Education, healthcare, and financial services rely on legal identity.

  • Spanish Law: Protects identity through the Organic Law on the Protection of Citizen Security, ensuring identity belongs to citizens.

Risks of Treating Identity as a Service

  • Commodification can undermine rights and lead to social exclusion.

  • Examples:

    • World Bank’s ID4D: Marginalized citizens through biometric databases.

    • Kenya: Exclusion from legal recognition affected access to essential services.

Global Examples of Service-Driven Identity

  • India’s Aadhaar: Governance issues led to exclusion of millions from services.

  • ID.me (US) and Verify (UK): Digital systems resulted in service access issues for many.

Conclusion

  • The AEPD advocates for identity frameworks grounded in rights, privacy, and fairness.

  • European Initiatives: Emphasize standardization and interoperability, supporting citizens’ autonomy while preventing surveillance and exclusion.

👉 Read more here.


 

📊 Belgium DPA Publishes 2023 Annual Report

The Belgian Data Protection Authority published its 2023 annual report, showcasing a year of significant renewal and collaboration. The report highlights several key areas:

Focus on Cookies
  • Developed practical compliance tools like the "cookie checklist" – get an automated translation into English here.

  • Aligned guidance with the European Data Protection Board (EDPB).

Support for Data Protection Officers (DPOs)
  • Active participation in DPODay.

  • Engaged in the coordinated European action on DPOs.

Awareness and Enforcement
  • Continued focus on youth data protection through educational initiatives.

  • Addressed ethical and societal aspects of AI in parliamentary discussions.

  • Contributed to significant cases against TikTok and Meta, focusing on international data transfers and privacy rights.

Statistics and Trends
  • Received 694 complaints, a 15% increase from 2022.

  • Handled 214 mediation requests, up 21%.

  • Data breaches primarily caused by human error (43%), with cyber threats like hacking at 32%.

  • Legislative requests increased by 90%.

  • Initiated 86 inspections and issued 171 decisions.

👉 Read more here.


 

 

🛡️ SAFE for Kids Act Signed into Law in New York

The SAFE for Kids Act, signed into law in New York on 20 June 2024, amends the general business law to protect minors from addictive social media feeds. The act addresses concerns about mental health impacts due to prolonged social media use among children and teenagers.

Key Provisions:

  • Definition of Addictive Feeds: The act defines an "addictive feed" as one that employs algorithms to recommend or prioritize content based on user behavior, aiming to enhance engagement.

  • Regulation: Social media platforms are prohibited from offering addictive feeds to users under 18 without verifiable parental consent. The act mandates using commercially reasonable methods to ascertain user age, ensuring compliance without relying solely on sensitive information like biometrics.

  • Penalties and Enforcement: The New York Attorney General is empowered to enforce the act, with potential civil penalties of up to $5,000 per violation. The act aims to hold social media companies accountable for non-compliance, safeguarding minors’ well-being. There is no private right of action.

  • Implementation Timeline: The act will take effect 180 days after the Attorney General’s office finalizes relevant regulations, allowing time for platforms to adapt their systems to meet new legal standards.

  • Rationale: The legislation responds to research linking addictive social media feeds with increased rates of depression, anxiety, and other mental health issues among youth. It underscores the inadequacy of self-regulation by platforms and the need for legal measures to mitigate these risks.

👉 Read the bill here.


 

 

🎮 Nordic DPAs Publish Guidance to Strengthen Children's Data Protection in Online Gaming

The Nordic Data Protection Authorities, led by Denmark's Datatilsynet, released new guidelines to enhance the protection of children's data in online gaming. This initiative was developed in response to the increasing prevalence of gaming among children and the corresponding privacy risks. The document emphasizes four key GDPR principles: fairness, transparency, data minimization, and accountability, along with guidance for game developers on how to comply with them.

Key Elements of the Guidelines

  • Fairness: Emphasizes the importance of equitable data processing, ensuring that the rights and vulnerabilities of children are respected. Developers are urged to avoid manipulative practices and to consider the specific context and impact of their data usage on young players.

  • Transparency: Stresses the need for clear and accessible communication about data collection and processing practices. The guidance advocates for age-appropriate explanations, ensuring that children understand how their data is being used and the implications of such usage.

  • Data Minimization: Advises developers to limit data collection to what is strictly necessary for game functionality. The principle discourages excessive data gathering, promoting privacy-focused default settings to protect children from unnecessary exposure.

  • Accountability: Calls for robust measures to demonstrate compliance with GDPR. This includes conducting Data Protection Impact Assessments (DPIAs) and maintaining thorough documentation of data processing activities. Developers must ensure that their practices align with regulatory requirements and effectively safeguard children's data.

The document does not address conditions for consent by children (Art. 8 GDPR).

👉 The guidance was published in English, here.


That’s it for this edition. Thanks for reading, and subscribe to get the full text in your inbox!


♻️ Share this if you found it useful.

💥 Follow me on Linkedin for updates and discussions on privacy education.

🎓 Take my course to advance your career in privacy – learn to navigate global privacy programs and build a scalable, effective privacy program across jurisdictions.

📍 Subscribe to my newsletter for weekly updates and insights in your mailbox.




