California and New York Take Legal Action Against TikTok, Claiming It Harms Children and Promotes Addiction


TikTok Under Fire: A Legal Battle Over Youth Safety

More than a dozen states, spearheaded by California and New York, have launched a significant legal offensive against TikTok. The lawsuits, filed separately in 13 states and the District of Columbia, accuse the popular video app of misleading the public regarding its safety and deliberately designing features that keep young users hooked. This legal action highlights growing concerns about the impact of social media on mental health, particularly among teenagers.

Why Now?

The lawsuits arrive at a pivotal moment for the company. A bipartisan coalition of attorneys general is pushing for changes to TikTok’s product features, which they argue manipulate and harm young users, and is also seeking financial penalties against the company. The legal challenge comes as TikTok already faces a potential U.S. ban, set to take effect on January 19 unless it severs ties with its China-based parent company, ByteDance.

The Backstory

With approximately half of Americans using TikTok, the app now finds itself defending against a wave of state lawsuits that reflect a broader national unease about social media’s design and its potential contribution to mental health issues. While establishing a direct link between social media use and mental health problems is complex, state authorities assert that TikTok prioritizes its growth and profits over the safety of children. California Attorney General Rob Bonta emphasized this point, stating, “TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content.”

This lawsuit follows a similar legal action taken last year against Meta, the parent company of Instagram and Facebook, which is still pending. The ongoing scrutiny of social media platforms underscores a growing recognition of the need for accountability in the tech industry.

States Allege Harm from Features and Algorithms

The lawsuits specifically target TikTok’s design elements that encourage excessive use, such as its hyper-personalized algorithm, endless scrolling, and push notifications. Attorneys general argue that these features can lead to emotional and behavioral changes in users, particularly adolescents, who may be left feeling inadequate and self-doubting after prolonged use.

One of the most concerning aspects highlighted in the lawsuits is TikTok’s use of beauty filters. These filters allow users to alter their appearance significantly, which can lead to body image issues and contribute to eating disorders and body dysmorphia, particularly among young girls. New York Attorney General Letitia James stated, “Beauty filters have been especially harmful to young girls,” emphasizing the need for TikTok to warn users about these potential harms.

Moreover, the District of Columbia’s lawsuit claims that TikTok traps teens in online bubbles, bombarding them with content that the platform claims to restrict, including videos related to weight loss, body image, and self-harm. The allegations extend to TikTok’s live-streaming feature, which has reportedly been exploited by underage users for financial gain, raising concerns about the sexual exploitation of minors.

TikTok’s Response

In response to the lawsuits, TikTok has characterized the accusations as misleading. Spokesman Alex Haurek stated that the company has implemented robust safeguards, including the proactive removal of suspected underage users and the introduction of safety features like default screen time limits and family pairing options. Haurek expressed disappointment that the states chose to pursue legal action rather than collaborate on constructive solutions to industry-wide challenges.

The lawsuits will proceed in 14 separate state courts, as each complaint relies on specific state consumer protection laws. This means that individual trial dates will be set in the coming months or years, unless the cases are dismissed or settlements are reached.

A Move to Increase Child Safety Tools

In light of these concerns, many social media platforms, including TikTok, have enhanced their child safety tools. Recently, Meta announced new features aimed at improving parental supervision on Instagram and making all teen accounts private to protect young users from potential predators. Similarly, TikTok has implemented measures such as disabling direct messaging for young users and setting accounts to private by default.

Despite these efforts, state officials argue that TikTok’s safety measures are inadequate and amount to little more than public relations gestures. They contend that the company has not effectively verified users’ ages, allowing adolescents to misrepresent how old they are and bypass safety protocols. California Attorney General Bonta criticized TikTok’s protective features, stating, “The harmful effects of the platform are far greater than acknowledged, and TikTok does not prioritize safety over profit.”

The states involved in the lawsuits include New York, California, and the District of Columbia, along with Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, North Carolina, New Jersey, Oregon, South Carolina, Vermont, and Washington. As this legal battle unfolds, it raises critical questions about the responsibilities of social media companies in safeguarding the mental health and well-being of their young users.