No Liability for TikTok Challenge Videos

A California court recently ruled that TikTok is not liable for property damage occurring in the bathrooms of a Florida school district.  The court ruled that the alleged damage was not a foreseeable result of a TikTok challenge video.  It also concluded that Section 230 of the Communications Decency Act would bar the claim in any event.

The complaint alleged that TikTok's algorithms promote challenges that specifically target school districts.  The challenge videos feature stolen urinals and smashed floor tiles.

The Florida school district argued two theories of liability.  First, it contended that the TikTok platform caused emotional and mental harm to the students who used the platform.  In turn, according to the complaint, the students "acted out" by destroying school property.  Second, the complaint alleged that the challenge videos themselves encouraged minor TikTok users to replicate the property destruction.

The court was unimpressed with either argument.  As to the first one, the court noted that foreseeability concerns doomed the complaint.  The court cited California law for the proposition that "there must be a limit imposed on Defendants' liability in order 'to avoid an intolerable burden on society.'"  In the court's view, under the complaint's liability theory, "any company that causes mental or emotional harm through interactions with a customer would be liable to any individuals . . . harmed by that customer" if that harm caused the customer to "act out" in some way.  Recognizing this theory would "throw open the courthouse doors to a deluge of lawsuits that would be both hard to prove and difficult to cull early in the proceedings."  The court was not willing to start the deluge. 

As to the plaintiff's second argument – that the videos encouraged the mischief – the court found that Section 230 of the federal Communications Decency Act precluded that claim.  Section 230 precludes the court from treating social media platforms as the publisher of third-party content.  Here, the plaintiff's theory ran directly counter to the statute.  As the court noted, "the relevant negligence claims would be based on the allegation that certain minor users watched the challenge videos and were then encouraged by the challenge videos' content to engage in 'copycat' actions destroying or defacing school property. . . . Such claims suffered as a result of the challenge videos on TikTok are barred by Section 230."

The plaintiffs argued that a case involving an app called "Snap" – which allegedly encouraged drivers to drive their vehicles at excessive speed and then post video – controlled here.  In that case, the court found Snap was not protected by Section 230, and the parents of two deceased teens were permitted to proceed with their wrongful death lawsuit.  But Snap itself provided the "speed filter" that allowed users to record their actual speed, so it was not a case of third-party content.  In the case before the California court, TikTok supplied none of the content.  It was a classic Section 230 case.

I doubt this is the end of the controversy concerning social media's effects on teens.  It is simply too pervasive.  But for now, Section 230 is a barrier to liability.  And it will remain so unless Congress amends the law.

About The Author

Jack Greiner | Faruki Partner