Julian Mahfouz

TikTok Under Fire After "Blackout Challenge" Death



A case out of the Third Circuit Court of Appeals could redefine the relationship between social media platforms and the content they host. In June, TikTok was sued by the mother of Nylah Anderson, a ten-year-old girl who died after attempting the “blackout challenge.”


The TikTok post encouraging Nylah to try the “blackout challenge” appeared on her personally curated “For You” page (FYP). The challenge “encourages viewers to record themselves engaging in acts of self-asphyxiation.”[1] Attempting to mimic the challenge, Nylah hanged herself and died. Her mother filed suit against TikTok for (1) hosting the video on its platform, (2) continuing to distribute the video, and (3) recommending the video to Nylah.


Normally, platforms like YouTube, Facebook, Instagram, and, more recently, TikTok are shielded from such liability under Section 230 of the Communications Decency Act, a protection commonly referred to as Section 230 immunity. For years, Section 230 has shielded platforms by providing that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[2]


However, in the face of mounting social media skepticism and the pressure of this tragedy, the U.S. Court of Appeals for the Third Circuit took aim at this shield. In a brief, three-page opinion, the Court detailed the history of Section 230 protection and carved out an exception to TikTok’s immunity from suit. The exception boils down to one important distinction: presenting versus providing content.


Under Section 230, those that provide, or merely host, content are protected. The idea stems from the fact that the platforms are not engaging in first-party speech themselves; they are simply providing a forum for others to do so. The Third Circuit concluded that TikTok can no longer claim this protection, given its active participation in selecting the content that each individual user receives:


 TikTok's FYP algorithm decides on the third-party speech that will be included in or excluded from a compilation—and then organizes and presents the included items on users' FYPs. Accordingly, TikTok's algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok's own expressive activity, and thus its first-party speech.[3]

This ruling openly breaks with prior circuit court decisions that allowed social media platforms to escape liability by framing themselves as hosts rather than presenters. Because of this circuit split, the case will likely make its way to the Supreme Court, and if the Court affirms the decision, it will reshape the social media landscape in a way unseen since Section 230 was passed.


The issue with this ruling is simple: in the modern internet age, every platform would be deemed to be engaging in first-party speech. This is no longer the internet of static web pages or of simple video-hosting platforms like Metacafe and Vimeo, which predated YouTube and let users simply upload their videos and share them. The post-2010 internet is the age of the algorithm: every habit is tracked, every preference is matched, and every platform tries to curate and present the most engaging content it can to keep each user on the platform for as long as possible.


Arguably, this change improved the internet user experience. If a user prefers to watch videos about golf, they no longer need to scour YouTube, optimizing their search terms and skipping over irrelevant content; the algorithm will present them with what they wish to see. Under the Court’s new decision, however, this curation and presentation of golf content would constitute first-party speech by the video-hosting platform. The sketch below makes the point concrete.
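
To see what this kind of curation looks like, here is a minimal sketch of preference-based ranking. Everything in it (the videos, tags, and affinity scores) is hypothetical; TikTok’s actual FYP system is vastly more complex.

# Toy preference-based feed ranking (hypothetical data, not TikTok's system).
videos = [
    {"title": "Golf swing basics", "tags": {"golf", "sports"}},
    {"title": "Pasta from scratch", "tags": {"cooking"}},
    {"title": "Golf course tour", "tags": {"golf", "travel"}},
]

# Affinities inferred from a user's watch history (assumed values).
user_interests = {"golf": 0.9, "travel": 0.4}

def score(video):
    # Sum the user's affinity for each tag the video carries.
    return sum(user_interests.get(tag, 0.0) for tag in video["tags"])

# The platform, not the user, chooses the order of presentation --
# the act the Third Circuit treated as first-party speech.
for video in sorted(videos, key=score, reverse=True):
    print(video["title"], round(score(video), 2))

Even this ten-line toy “speaks” in the Third Circuit’s sense: the ordering decision, what to show first, belongs entirely to the platform.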


To avoid liability, platforms would need to curate content in real time, as it is uploaded. Considering that about a million TikTok videos are uploaded every hour[4], roughly 280 every second, this is close to impossible; even automatic content filters let some videos fall through the cracks, as the sketch below illustrates. Otherwise, platforms would need to disable their algorithms and move back to the 2000s, with basic video-hosting capabilities and nothing more.
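
As a crude illustration of why filters leak, consider a naive keyword filter. The blocklist and examples below are hypothetical; real moderation pipelines combine machine-learning classifiers with human review.

# Toy upload-time filter: naive keyword blocking (hypothetical blocklist).
BLOCKED_TERMS = {"blackout challenge"}

def passes_filter(description: str) -> bool:
    # Reject any upload whose description contains a blocked term.
    text = description.lower()
    return not any(term in text for term in BLOCKED_TERMS)

print(passes_filter("Try the blackout challenge!"))   # False -- caught
print(passes_filter("Try the bl@ck0ut challenge!"))   # True -- slips through

A trivial misspelling defeats the filter, and at hundreds of uploads per second there is no human backstop for everything the filter misses.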


Caught between this rock and a hard place, social media platforms will surely be awaiting an appeal of this decision. The direction the courts take on Section 230 immunity will massively shape the landscape of the internet and the future of content presentation.


[1] Anderson v. TikTok, Inc., No. 22-3061, 2024 WL 3948248, at *1 (3d Cir. Aug. 27, 2024).

[2] 47 U.S.C. § 230(c)(1).

[3] Anderson v. TikTok, Inc., No. 22-3061, 2024 WL 3948248, at *3 (3d Cir. Aug. 27, 2024).
