Anna Claire Trahan

U.S. Sues TikTok for 3 Main COPPA Violations


On August 2, 2024, the U.S. sued TikTok and its affiliates, alleging violations of the Children’s Online Privacy Protection Act (COPPA) and the Children’s Online Privacy Protection Rule (COPPA Rule). In 2022, two-thirds of all U.S. teens reported using TikTok, including 61% of teens aged 13 or 14. The complaint states that the precise number of violations is difficult to determine; however, COPPA permits a maximum penalty of $51,744 for each violation. If the U.S. Government successfully proves its allegations, the statutory damages permitted under COPPA could easily reach billions of dollars against the social media platform: at the statutory maximum, roughly 20,000 violations alone would exceed $1 billion.

 

The COPPA Rule imposes the following requirements on TikTok: post a privacy policy; give notice of its information practices, identify the information collected, and provide complete notice of those practices directly to the parents of child users; obtain verifiable parental consent; and delete children’s information at a parent’s request.

 

Three main failures were identified in the U.S.’s complaint:

1.     Failure to verify the age of users;

2.     Sharing of information obtained from a child user; and

3.     Failure to delete data after requests.

 

Failure to Age Verify

The platform requires users to report their birthdates when creating an account, but TikTok does not prevent children from using the app. If an account is created by a person under the age of 13, the platform prompts the user to enter a username (containing no personal information) and create a password. The account is then placed in “Kids Mode,” which only allows the user to view videos; Kids Mode does not allow creating or uploading videos, posting information publicly, or messaging others. However, the complaint alleges that Defendants did not notify parents or obtain parental consent for accounts created by children.

 

Until 2020, if a child was routed to account creation in Kids Mode, TikTok did not prevent the child from evading the age gate by restarting the process and entering an older age, even though by that point Defendants knew the user was a child through analytics, the IP address, and the email address submitted.

 

Until May 2022, Defendants allowed consumers to create an account using login credentials from third-party online services such as Instagram and Google. Instagram did not require age disclosure until December 2019, and Google allows children under 13 to create accounts with parental consent. Defendants would identify such an account as “age unknown” and did not close this loophole in the age gate for several years.

 

Strategic Sharing of Children’s Information

After allowing children to bypass or evade the age gate, the U.S. alleges that TikTok knowingly collected and shared personal information from those same child-operated accounts with Facebook and AppsFlyer to encourage existing Kids Mode users to use TikTok more frequently. And, of course, with more use, TikTok collected, stored, and shared more data on the same children.

 

Failure to Delete Data After Requests

The U.S. alleges TikTok failed to create a simple or effective process for parents to submit deletion requests. The word “delete” does not appear in many of Defendants’ online parental guidance materials, making the process difficult and confusing. The complaint describes one parent, seeking to delete a child’s data, who scrolled through multiple webpages, links, and menu options without ever finding a “delete” request option. Instead, the parent was prompted to explain, in a text box, that they were a parent who wanted the child’s information deleted.

 

In response to a parental request, Defendants’ staff would review the account for “objective indicators” that the account holder was under 13 based on the user’s bio, and would only identify an account as underage if there was an explicit admission, such as “I am 9 years old.” If no explicit admission was found, Defendants’ policy was to ask the parent to confirm, on a form completed under penalty of perjury, that they were the child’s parent. If the form was not submitted, the account was not deleted.

 

However, Defendants purportedly did nothing to prevent the same child from re-creating the account with the same information, such as email address, phone number, and device. Additionally, even when a parent did succeed in submitting a request, Defendants often did not honor it or failed to respond in a timely manner.

 

The filing of this suit serves as a warning to all platforms to thoroughly monitor children’s use of their services and to ensure that their policies comply with both COPPA and the various state children’s privacy laws, as this lawsuit will set the tone for future enforcement.
