Game-Changing Lawsuit Targets TikTok Over Blackout Challenge
It is not the first deadly online challenge to gain traction. Viral trends like consuming Tide Pods, injuring oneself with salt and ice, and jumping from moving vehicles have exemplified a troubling form of peer pressure. But the Blackout Challenge, which entices young users to choke themselves until they lose consciousness, has had particularly tragic outcomes: it has been linked to the deaths of seven children, all under the age of 15.
Despite the shocking nature of these incidents, the current legal framework offers limited options for holding tech companies accountable for the harmful content they expose users to on social media. Nevertheless, a recent lawsuit filed on behalf of two minors who lost their lives could potentially reshape this landscape.
Section 230 of the U.S. Communications Decency Act, passed in 1996, protects technology platforms from being held liable for user-generated content. As these platforms merely provide a space for content creation without producing it themselves, they enjoy immunity from legal action. The rationale behind Section 230 is rooted in democratic principles, as lawmakers sought to prevent over-regulation from stifling free speech. This has allowed the internet to flourish as an unrestricted forum for expression.
In recent years, there has been bipartisan pressure on Congress to reassess Section 230. Republicans criticize perceived censorship while Democrats campaign against hate speech and misinformation, but both sides agree that the law has become outdated. When it was enacted in 1996, no one foresaw the pervasive influence social media would have on daily life or the enormous profits tech companies would generate from our engagement.
Even if Section 230 is outdated, it remains the legal framework that governs these claims. Under its provisions, the parents of the deceased children face significant hurdles, as much offensive speech remains protected online.
However, the lawsuit aiming to hold TikTok accountable for the Blackout Challenge does not center on the content itself but rather on the platform's construction. It adopts a product liability perspective, positing that TikTok is a flawed product with the potential to inflict harm. This strategic shift seeks to redefine technology companies as more than just neutral platforms.
If this approach succeeds, its implications could extend far beyond TikTok, prompting a reevaluation of the responsibilities tech platforms have toward their users. A legal victory against TikTok would bolster this emerging framework for expanding tech liability.
There are three key reasons why a product liability claim could introduce new legal risks for tech companies:
It acknowledges our algorithmic reality
Section 230 states: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" (47 U.S.C. § 230). The Electronic Frontier Foundation labels it as "the most important law protecting internet speech." The essence of Section 230 is anchored in the First Amendment, which restricts the government's ability to limit most forms of speech. As a result, U.S.-based tech companies have become safe havens for free expression, while also facilitating rapid technological advancements.
However, Section 230 does not accurately reflect the current social media landscape. It was written with the virtual message board in mind, at a time when blogs and forums represented the forefront of online innovation. The landscape has since evolved.
Today, platforms do not merely act as conduits for others' speech; they curate and customize user experiences. They suggest content tailored to individual users and encourage addictive interactions. For the first time, content consumption resembles a drug-like experience, providing dopamine hits and activating the brain's reward pathways.
In a statement to People Magazine, TikTok asserted, "This disturbing ‘challenge,’ which people seem to learn about from sources other than TikTok, long predates our platform and has never been a TikTok trend. We remain vigilant in our commitment to user safety and would immediately remove related content if found. Our deepest sympathies go out to the family for their tragic loss."
However, TikTok is not facing litigation merely for its content; it is being accused of amplifying harmful content. The Social Media Victims Law Center, representing the victims' families, contends that "TikTok unquestionably knew that the deadly Blackout Challenge was spreading through their app and that their algorithm was specifically feeding the Blackout challenge to children."
Unlike previous lawsuits against tech platforms that focused solely on offensive material, this case confronts the economic reality that technology companies thrive on user addiction to their platforms. It alleges that these companies are willing to promote harmful content for profit.
From this angle, TikTok can be likened to a defective product with inadequate safety features.
It confronts what TikTok knows about its users and its algorithm
TikTok has recently faced scrutiny for its extensive data collection on users. While all social media applications gather user information, TikTok is accused of doing so more aggressively than others. Parents claim that TikTok is aware that "hundreds of thousands of children as young as six years old are currently using its social media product," yet maintains those accounts because deleting them would mean losing ad revenue. Furthermore, the platform is capable of determining a user's age. "In other words," they assert, "TikTok knows when a user claims to be 22 but is really 12."
Here, the plaintiffs emphasize a crucial point: an app that collects extensive data about its users cannot simultaneously claim ignorance of their identities.
Families argue that TikTok was aware that these users were minors and that the platform failed to adequately supervise the content accessible to them. The lawsuit alleges that TikTok "knew or should have known that failing to take immediate and significant action to suppress the spread of the deadly Blackout Challenge would lead to further injuries and deaths, particularly among children."
Similar situations have arisen in other contexts. For example, the Federal Communications Commission (FCC) regulates obscene and indecent speech in broadcast media, prohibiting the airing of such content during hours when children may be present. These regulations acknowledge the societal need for free expression while also recognizing the potential harms associated with certain content.
There is a possibility that the government may take further steps to regulate harmful internet speech. However, technology companies do not have to wait for government intervention. Section 230 empowers them to monitor their platforms and enforce community standards. TikTok could have chosen to promptly eliminate dangerous content but allegedly chose not to. If this is accurate, TikTok failed to safeguard its most vulnerable users. While technology platforms cannot control user-generated content, they are responsible for responding to it effectively.
Increasingly, children rely on social media for information instead of traditional media, and society depends on these platforms to regulate the content they host. Broadcast media has long operated under protocols meant to ensure "decency"; this lawsuit highlights that on TikTok, children are still exposed to inappropriate and harmful behavior.
It reflects a modern conception of harm
For years, lobbyists denied the health risks associated with smoking, despite evidence linking it to cancer and other serious conditions. However, a pivotal lawsuit in 2002 prompted significant changes in the tobacco industry. A California jury awarded a woman with terminal lung cancer $850,000 in compensatory damages, along with $28 billion in punitive damages against Philip Morris.
This case succeeded for two primary reasons: the public began to recognize the dangers of smoking, and internal documents revealed that tobacco companies were aware of the risks yet misled the public.
A similar demand for accountability is emerging against technology companies today, driven by comparable concerns. Critics point to social media's damage to mental health, harm that can be just as serious as more visible injuries.
Whistleblower Frances Haugen, a former Facebook data scientist, testified to Congress that Facebook was aware of the detrimental effects its platform had on children but refused to implement safeguards that would cut into profits. Reporting by The Wall Street Journal, dubbed the Facebook Files, revealed that Facebook knew of a rise in suicidal thoughts, eating disorders, and depression among young Instagram users.
Haugen also emphasized the power of algorithms, revealing that Facebook prioritizes user engagement over user protection. Meta, Facebook’s parent company, is not alone; many tech companies have been criticized for putting profits before people.
TikTok's algorithm, while arguably more aggressive and addictive than its competitors', shares this characteristic. Like other platforms, its content can damage users' mental well-being: the app's short videos have been linked to shorter attention spans, and experts have raised concerns about rising rates of eating disorders, anxiety, and depression.
Some experts acknowledge TikTok's positive aspects, such as content that promotes recovery and community, and no one disputes that valuable material exists on the platform. But regardless of the intent behind user-generated content, the algorithm prioritizes interaction over positivity.
TikTok is designed to satisfy user cravings rather than promote well-being. The lawsuit highlights the mental health ramifications of this design.
TikTok gained immense popularity during the pandemic, offering a means of connection when many were isolated; children, in particular, may have benefited from that engagement. The lawsuit does not deny TikTok's appeal. Its strategy lies in recognizing that an appealing product can also be an addictive one.
As the world returns to normalcy, TikTok continues to soar. This year, the platform is projected to triple its global ad revenues to $11.6 billion, surpassing the combined ad revenues of Snapchat and Twitter. Over 60% of TikTok users belong to Generation Z, born after 1996, a highly sought-after demographic for tech companies. Meanwhile, children remain active in digital spaces.
Recently, TikTok announced plans to implement content filters and maturity ratings, assigning scores to content so that users have more control over what they see. The move signals a more rigorous approach to managing user-generated content; it is also damage control in the face of potential government regulation and lawsuits like this one. And it mirrors the remedies typically applied to defective products, which are either withdrawn from circulation or improved. This case argues that while social media is here to stay, tech companies must respond to harmful incidents and fix their products' flaws.
The names of the victims were Lalani Erika Walton and Arriani Jaileen Arroyo. Should this lawsuit prevail, the repercussions for Big Tech could be profound and enduring.