The social media landscape has become increasingly complex and controversial, especially as platforms like TikTok dominate the lives of young users worldwide. On Tuesday, October 8, 2024, TikTok found itself at the center of legal turmoil yet again, as 13 U.S. states, along with the District of Columbia, filed lawsuits accusing the app of causing significant harm to younger users. The suits, filed separately in jurisdictions including California, New York, and the District of Columbia, allege that TikTok’s practices are designed to exploit young people’s vulnerabilities by encouraging addictive behavior, and that the platform fails to protect them from its risks.
As TikTok continues to expand its user base, particularly among younger populations, it faces increasing scrutiny from regulators and state authorities. The lawsuits assert that the company is engaging in unethical practices to boost user engagement, disregarding the adverse effects it may have on children and teenagers. The platform is further accused of using intentionally addictive algorithms to keep young users on the app for extended periods, leading to negative mental health outcomes and leaving them vulnerable to exploitation.
This article delves deeper into the lawsuits, the allegations brought forward by the states, TikTok’s response to these claims, and the broader implications of this legal battle on the future of social media platforms and their role in protecting young users.
TikTok’s Alleged Harmful Practices: A Legal and Ethical Concern
One of the central allegations in these lawsuits is that TikTok has intentionally designed its platform to be addictive. Attorneys general from the participating states argue that the platform’s algorithms are crafted to maximize user engagement, with a particular focus on young users, who are more susceptible to developing unhealthy social media habits. California Attorney General Rob Bonta described TikTok’s business model as one that “cultivates social media addiction to boost corporate profits,” emphasizing that TikTok targets children, who are less capable of setting healthy boundaries around their screen time.
The mechanics of TikTok’s algorithms have long been a subject of debate. The “For You” page, for instance, curates personalized content based on user activity, continually surfacing videos likely to hold a viewer’s attention. Critics claim this design encourages compulsive behavior: users are served an endless stream of content they find interesting or entertaining, and the resulting cycle of scrolling can lead to excessive screen time. That leaves young users vulnerable to mental health issues such as anxiety, depression, and feelings of inadequacy, problems that research has repeatedly linked to heavy social media use.
Letitia James, the Attorney General of New York, emphasized this point in her statement, pointing out that “young people are struggling with their mental health because of addictive social media platforms like TikTok.” The lawsuits argue that TikTok’s business model prioritizes engagement over user well-being, creating an environment that not only fosters addiction but also fails to address the broader implications of such behavior on mental health.
TikTok’s Defense: Safety Features and Content Moderation
In response to the lawsuits, TikTok has disputed the claims. The company has defended its position by highlighting the safety features it has implemented for younger users, including a default daily screen-time limit for users under 18 and accounts set to private by default for minors under 16. TikTok has also pointed out that it offers a range of parental controls through its “Family Pairing” feature, which allows parents to set screen-time limits, restrict content, and manage privacy settings for their children.
TikTok’s spokesperson stated, “Many of these claims are inaccurate and misleading,” expressing disappointment that the states chose to pursue legal action rather than working with the company on constructive solutions. TikTok maintains that it is committed to creating a safe online environment and is constantly evolving its safety features to address the challenges that come with a rapidly growing platform.
However, the states involved in the lawsuits contend that these measures are insufficient. Despite TikTok’s safeguards, critics argue that the company’s primary goal is to maximize profit by keeping users on the app for as long as possible, and that its business model and content moderation policies are inherently flawed because they do not prioritize user safety over corporate revenue.
Accusations of Sexual Exploitation and Inadequate Content Moderation
Beyond the issue of addiction, the lawsuits also raise serious concerns about TikTok’s failure to protect young users from inappropriate and harmful content. Washington D.C. Attorney General Brian Schwalb went so far as to accuse the platform of facilitating the sexual exploitation of underage users through its live-streaming and virtual currency features. In his words, TikTok operates “like a virtual strip club with no age restrictions,” allowing minors to be exposed to inappropriate content and engage in potentially dangerous interactions.
The concern over TikTok’s live-streaming feature is not new. Critics have long pointed out that the platform’s moderation tools are not equipped to handle the real-time nature of live content. This has led to situations where inappropriate behavior, including sexual exploitation, has occurred without immediate intervention. The lawsuits claim that TikTok’s moderation efforts are reactive rather than proactive, leaving young users vulnerable to exploitation.
Moreover, the platform’s virtual currency system, which allows users to purchase digital “gifts” for content creators during live streams, has raised concerns about the financial exploitation of minors. The lawsuit filed by Washington D.C. alleges that TikTok is operating an unlicensed money transmission business through these features, further exacerbating the risk of harm to young users.