The FTC’s Epic ‘Epic’ Settlement: Parallels with EU Law

Epic Games Settles with FTC for $520 Million Over COPPA Violations and Deceptive 'Dark Patterns'

 

On 19 December 2022, the US Federal Trade Commission (FTC) announced two settlements totalling $520 million with Epic Games, the company behind the game Fortnite.

The FTC accused Epic of violating the Children’s Online Privacy Protection Act (COPPA) and using “dark patterns” (manipulative design) to deceive millions of players into making “unintentional purchases”.

The case provides valuable insights into some of the most prominent current themes in data protection—both in the US and Europe—including children’s privacy, privacy-by-design, and dark patterns (albeit in a consumer law context).

The settlement also arguably shows how European data protection principles are influencing privacy law in the US.

The Background

There are two separate FTC complaints behind Epic’s settlement, alleging that Epic:

  • Violated COPPA, a US federal law that regulates the online collection of personal information from children under 13.
  • Violated Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices” in or affecting commerce.

 

More specifically, the FTC alleged that Epic:

  1. Collected personal information from Fortnite players aged under 13 without providing proper notice to their parents or obtaining their parents’ consent.
  2. Set unreasonable barriers when parents requested that their children’s personal information be deleted (and sometimes failed to honour requests).
  3. Enabled real-time voice and text chat for children and teens by default.
  4. Used “dark patterns” and deceptive interfaces to trick players, including children and teenagers, into making purchases.
  5. Charged consumers for in-game items without clearly disclosing the terms of purchase.
  6. Failed to properly disclose that some in-game items would renew automatically, leading to unexpected charges.
  7. Failed to properly disclose that players could make purchases using their mobile device’s in-app payment system, resulting in unexpected charges.
  8. Used deceptive practices to induce players to make in-game purchases, such as using confusing language, hiding certain information, and making it difficult for users to cancel or opt out of purchases.

 

Children’s Privacy

At $275 million, Epic’s COPPA settlement is the largest in history—far exceeding the previous record-holder, YouTube, which settled for $170 million in 2019.

For the first time in a COPPA settlement, the FTC also required Epic to adopt “strong privacy default settings” for children and teens (despite COPPA not applying to teenagers).

COPPA applies to companies running websites and apps if they direct their services to children under 13, or have “actual knowledge” that they are collecting personal information from children under 13.

Citing several sources, including Epic’s marketing materials, internal discussions, and its (reportedly reluctant) use of Microsoft’s age verification data, the FTC concluded that Epic targeted Fortnite at children under 13, which brings the company within COPPA’s scope.

COPPA imposes five main requirements on covered companies:

  1. Posting a clear and comprehensive privacy policy.
  2. Providing clear and comprehensive notice of their data collection practices to parents.
  3. Obtaining verifiable parental consent before collecting, using, or disclosing personal information from children.
  4. Providing a reasonable means for parents to request access to the personal information collected from their children online.
  5. Deleting personal information collected from children online, upon parental request.

 

In its COPPA complaint, the FTC alleges that Epic violated all but the first of these requirements.

Parental Notice and Consent

As mentioned above, COPPA requires companies to provide parents with “clear and comprehensive” notice of their data collection practices before collecting personal information from children under 13.

The law also requires companies to obtain “verifiable parental consent” before collecting personal information from children under 13.

In its complaint against Epic, the FTC alleges that Epic did provide notice to, and request consent from, some parents, but not others.

The FTC notes that Epic implemented a new age verification process in 2019 after being alerted that many young children played Fortnite.

From this point on, if a player with a US IP address self-declared as being 12 or under, Epic would require that player to provide a parent’s email address. The company would then provide notice to and request consent from the parent via email.
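
As a rough illustration, that flow might look something like the sketch below. Everything here is hypothetical (the names, functions and structure are not Epic’s actual implementation), but note how the age gate only triggers for players with US IP addresses, a limitation that becomes relevant below.

```typescript
// Hypothetical sketch of a COPPA-style age gate, based on the flow the
// FTC describes. Illustrative only; not Epic's actual code.

interface SignupRequest {
  selfDeclaredAge: number;
  ipCountry: string; // derived from the player's IP address
  parentEmail?: string;
}

async function sendParentalNoticeAndConsentRequest(email: string): Promise<void> {
  // Stub: email the parent a notice of data collection practices and a
  // link through which they can grant (or refuse) verifiable consent.
}

async function createAccount(opts: { restricted: boolean }): Promise<void> {
  // Stub: create the account; a restricted account stays feature-limited
  // until parental consent arrives.
}

async function handleSignup(req: SignupRequest): Promise<void> {
  // COPPA covers children under 13; the gate described in the complaint
  // only applied to players with US IP addresses.
  if (req.ipCountry === "US" && req.selfDeclaredAge <= 12) {
    if (!req.parentEmail) {
      throw new Error("A parent's email address is required for players aged 12 or under.");
    }
    await sendParentalNoticeAndConsentRequest(req.parentEmail);
    await createAccount({ restricted: true });
  } else {
    await createAccount({ restricted: false });
  }
}
```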

However, the FTC criticises Epic for having allegedly failed to meet COPPA’s parental notice and consent requirements for children with certain types of accounts, or children using non-US IP addresses.

Parental Consent Under the GDPR

The EU and UK GDPR also include parental consent rules. 

In Europe, as in the US, there is widespread agreement that protecting children’s privacy is important. However, there is less consensus about how to comply with children’s data protection rules, particularly around how to verify someone’s age in a privacy-respecting way.

Article 8 of the GDPR requires controllers to obtain verifiable parental consent before processing children’s data if the controller is:

  • Providing “information society services” (which broadly means online services, and would include Fortnite)
  • Offering those services directly to a child
  • Relying on “consent” as its legal basis for processing

 

Member states can set the age at which a person stops being a “child” for these purposes, as long as this falls between 13 and 16 (the GDPR’s default is 16). Laws about children entering into contracts, which also vary significantly across Europe, can further complicate the processing of children’s data.
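
To make that variance concrete: a controller operating across Europe effectively has to look up a per-country age threshold before it knows whether Article 8 applies. A minimal sketch of that lookup, with example thresholds that should be checked against current national law:

```typescript
// Per-country "digital age of consent" under GDPR Article 8.
// Example values only; national implementations change and must be verified.
const DIGITAL_CONSENT_AGE: Record<string, number> = {
  UK: 13, // UK GDPR / Data Protection Act 2018
  IE: 16,
  FR: 15,
  DE: 16,
  ES: 14,
};

const DEFAULT_AGE = 16; // the GDPR's default where no national law lowers it

// Parental consent is needed where an online service is offered directly
// to a child, "consent" is the legal basis, and the user is under the
// applicable threshold.
function needsParentalConsent(userAge: number, country: string): boolean {
  const threshold = DIGITAL_CONSENT_AGE[country] ?? DEFAULT_AGE;
  return userAge < threshold;
}
```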

Several high-profile enforcement decisions have concerned violations of the GDPR’s rules on children’s data, including the recent €405 million fine against Instagram from the Irish Data Protection Commission (DPC).

Default Settings

A central allegation in the FTC’s COPPA complaint was that certain settings were “on by default”, including voice chat.

The complaint cites an email from Epic’s UX designer urging the company to “avoid voice chat or have it opt-in at the very least” out of concerns over in-game “toxicity” affecting children.

The FTC notes that Fortnite did include a toggle allowing “those who happened to find it” to switch off voice chat. 

However, the chat features remained active by default for all players, reportedly leading to children being “bullied, threatened, and harassed, including sexually” while playing Fortnite.

Data Protection By Design and By Default in EU Law

COPPA does not include any explicit “privacy-by-design” or “data minimisation” requirement.

In the EU, such rules are provided by the GDPR’s doctrines of “data protection by design” and “data protection by default”.

Among other requirements, Article 25(2) of the GDPR obliges controllers to ensure that “by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.”

Turning off voice chat by default—not just for children, but for all players—is one way to meet this requirement.
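
In implementation terms, the difference often comes down to which values ship in the default settings object. A minimal sketch of privacy-protective defaults, using hypothetical names:

```typescript
// Privacy by default: high-risk features ship switched off and require an
// explicit, affirmative action from the user (or a parent) to enable.
interface AccountSettings {
  voiceChatEnabled: boolean;
  textChatEnabled: boolean;
  profileVisibleToStrangers: boolean;
}

// GDPR Article 25(2): by default, personal data should not be made
// accessible to an indefinite number of people without the user's intervention.
const PRIVACY_PROTECTIVE_DEFAULTS: AccountSettings = {
  voiceChatEnabled: false,
  textChatEnabled: false,
  profileVisibleToStrangers: false,
};

function createAccountSettings(overrides: Partial<AccountSettings> = {}): AccountSettings {
  // Any relaxation of the defaults must come from a deliberate user choice.
  return { ...PRIVACY_PROTECTIVE_DEFAULTS, ...overrides };
}
```

The design point is that every relaxation of a default can be traced back to a deliberate user action, which is essentially what Article 25(2) asks for.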

Review and Deletion Requests

COPPA requires that covered businesses provide a means for parents to review or delete their children’s personal information. 

The FTC alleges that Epic made some parents “jump through extraordinary hoops” when making such requests. These hoops included requiring parents to provide an excessive amount of verification data, such as: 

  • All IP addresses from which their child had played Fortnite
  • The child’s account creation date
  • A copy of the parent’s passport, ID card or mortgage statement
  • Information about historic purchases made on the account

 

The FTC implies that the process of parental verification was deliberately designed to deter parents from reviewing or deleting their children’s personal information.

Identity Verification Under EU Law

The issue of parents’ rights over personal data about their children is more nuanced under the GDPR than under COPPA.

However, as under COPPA, the GDPR requires controllers to verify a person’s identity (or, where applicable, the identity of a child’s parent) before complying with a request to access, delete or modify personal data.

Efforts to identify a person who is exercising their data protection rights must be proportionate. The principle of data minimisation applies—controllers should not collect more personal data than is necessary in order to verify a person’s identity.

But this can be a tricky area, and not all controllers appropriately balance data minimisation and verification. 

For example, in a decision against jewellery firm Pandora, the Danish data protection authority (DPA) found that the company’s practice of systematically requiring a passport or driver’s licence before facilitating data subject rights violated the data minimisation principle.
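
A more proportionate approach starts from information the controller already holds and escalates only where genuine doubts remain about the requester’s identity (per Article 12(6) of the GDPR). A rough sketch of that idea, not a prescription from the Danish DPA:

```typescript
// Data minimisation in identity verification: request the minimum needed
// to be reasonably sure who is making the request. Illustrative only.
function requiredVerificationSteps(
  loggedIn: boolean,
  reasonableDoubts: boolean
): string[] {
  // An authenticated account holder has already proved who they are;
  // demanding a passport on top of that is likely disproportionate.
  if (loggedIn) return [];

  // Otherwise, start with data the controller already holds and can match.
  const steps = ["confirmation link sent to the registered email address"];

  // Escalate only where doubts remain (GDPR Article 12(6)), and even then
  // request additional matching details rather than identity documents.
  if (reasonableDoubts) {
    steps.push("one additional detail matched against the account record");
  }
  return steps;
}
```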

Dark Patterns

The FTC’s second complaint against Epic accused the company of using “dark patterns” to deceive players into making unwanted purchases.

Some examples of the “design tricks” used by Epic include:

  • Intentionally reducing the prominence of a “cancel purchase” button in Fortnite’s in-game store.
  • Requiring users to “find and navigate a difficult and lengthy path” to request a refund in the Fortnite app.
  • Forcing users to “navigate several unnecessary steps” to obtain a refund (described by Epic’s own UX designer as adding “friction for friction’s sake”).

 

Dark Patterns in US Privacy Law

While the “dark patterns” allegations relate to consumer protection law, rather than privacy law, the concept of dark patterns is increasingly relevant to privacy law—both in the US and other jurisdictions.

The California Privacy Rights Act (CPRA), effective from 1 January 2023, mentions dark patterns in its consent definition, specifying that “agreement obtained through use of dark patterns does not constitute consent”. 

The CPRA defines “dark patterns” as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decisionmaking, or choice…”

Likewise, the Colorado Privacy Act (CPA), effective from 1 July 2023, provides that consent obtained through dark patterns is invalid, using substantially the same definition as the CPRA.

Dark Patterns in EU Data Protection Law

Dark patterns are not explicitly mentioned in either the GDPR or the ePrivacy Directive, which regulates the use of cookies.

However, the GDPR’s consent definition—which requires that consent is “freely given”, “specific”, “informed”, “unambiguous”, and given via a “clear affirmative action”—means that consent obtained through deceptive or manipulative design is likely to be invalid.

The European Data Protection Board (EDPB) also published guidelines in March 2022 called “Dark patterns in social media platform interfaces: How to recognise and avoid them”.

The EDPB guidelines describe 15 types of dark patterns, which fall into six categories:

  • Overloading
    • Continuous prompting
    • Privacy maze
    • Too many options
  • Skipping
    • Deceptive snugness
    • “Look over there”
  • Stirring
    • Emotional steering
    • Hidden in plain sight
  • Hindering 
    • Dead end
    • Longer than necessary 
    • Misleading information
  • Fickle
    • Lacking hierarchy
    • Decontextualising
  • Left in the Dark 
    • Language discontinuity
    • Conflicting information 
    • Ambiguous wording or information

 

Privacy campaign group noyb launched an all-out assault on dark patterns in 2021. 

Noyb first submitted notices to 516 companies allegedly using manipulative design in their cookie banners. The group followed up with 422 complaints to DPAs against companies that reportedly failed to bring their websites into GDPR compliance.

Many DPA decisions have also hinged on the concept of dark patterns, including the French DPA’s fines against Google (€150 million) and Facebook (€60 million) for implementing their cookie banners in a way that made refusing cookies harder than accepting them.
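
The principle underlying those decisions is symmetry: refusing cookies should take no more effort, and have no less prominence, than accepting them. A minimal sketch of that rule expressed as a design check (hypothetical structure):

```typescript
// A consent banner where "Reject all" sits on the first layer with the
// same prominence as "Accept all", which is the core of the French DPA's
// reasoning in the Google and Facebook decisions. Illustrative only.
interface BannerAction {
  label: string;
  clicksRequired: number;
  prominence: "primary" | "secondary";
}

const banner = {
  accept: { label: "Accept all", clicksRequired: 1, prominence: "primary" },
  reject: { label: "Reject all", clicksRequired: 1, prominence: "primary" },
};

// A simple lint: consent is unlikely to be "freely given" if rejecting
// costs more clicks, or less visual prominence, than accepting.
function isSymmetric(b: typeof banner): boolean {
  return (
    b.reject.clicksRequired <= b.accept.clicksRequired &&
    b.reject.prominence === b.accept.prominence
  );
}
```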

Privacy in the US and the EU

After many decades of lax regulation, the US seems to be getting serious about privacy.

New US state laws are adopting European concepts, language and definitions. And the US Congress is closer than ever to passing comprehensive federal privacy legislation in the form of the American Data Privacy and Protection Act (ADPPA).

The FTC’s settlement with Epic, coupled with the California Attorney General’s privacy settlement with Sephora earlier in the same year, suggests that US regulators are also increasingly willing to hold companies accountable for privacy violations.

It’s also noteworthy that, under the terms of its settlement with the FTC, Epic agreed to go beyond what is required under COPPA (which is, in essence, a simple “notice and consent” law) by applying “privacy-by-design” principles to children and teenagers.

The far-reaching nature of the Epic settlement is arguably further evidence that European data protection doctrines are influencing policy and practice in the US.