BYU Law Review
Abstract
When dangerous social media challenges go viral and cause harm to adolescents and young children, should the platform be held liable for pushing that content? As it currently stands, Section 230 of the Communications Decency Act of 1996 prevents this from happening. However, Anderson v. TikTok—a recent suit brought on behalf of a ten-year-old girl who died of asphyxiation while participating in the viral “Blackout Challenge”—seeks to change that.
The Third Circuit in Anderson held that a social media platform’s algorithmic recommendations are first-party speech—the platform’s own expressive activity—rather than third-party speech that merely displays its users’ expressive activity. Though treating algorithms as speech that can give rise to liability is a step in the right direction, Anderson may strip away crucial protections Section 230 provides for small businesses and free speech.
This Note proposes that, to prevent Anderson’s overcorrection of Section 230 immunity while still allowing accountability for large social media companies, the Justice Against Malicious Algorithms Act of 2021—a bill specifically addressing social media providers whose services cause physical or mental harm to individuals—should be amended and reintroduced. Specifically, these amendments should (1) narrow the bill’s scope to harm affecting adolescents; (2) refine its definition of the term “personalized algorithm”; and (3) raise its threshold for small-business exemptions. With these changes made and the Act reintroduced in a future legislative session, tech giants can be held accountable for the harmful content their algorithms push to adolescents, and children can be better protected online.
Rights
© 2025 Brigham Young University Law Review
Recommended Citation
Allison Mitton, Did Anderson v. TikTok Get It Right? Holding Social Media Providers Accountable for Harm to Adolescents, 51 BYU L. Rev. 245 (2025).
Available at: https://digitalcommons.law.byu.edu/lawreview/vol51/iss1/7
