Amnesty International Files Complaint Against TikTok Over Mental Health Risks to Youth
Amnesty International has lodged a formal complaint against TikTok for exposing youth to harmful mental health content, challenging the platform's compliance with EU laws.
- Amnesty International accuses TikTok's algorithm of amplifying harmful content related to suicide and self-harm among youth.
- A complaint has been filed with the French regulator Arcom under the EU Digital Services Act, alleging TikTok's non-compliance with its legal obligations.
- Experiments showed fake teen accounts were rapidly exposed to depressive and suicidal content.
- TikTok denies the allegations, citing proactive content removal and efforts to provide a safe user experience.
- The complaint coincides with a broader EU investigation into TikTok's protection of minors.
Key details
Amnesty International France has formally accused TikTok of amplifying young users' exposure to harmful content linked to suicide and self-harm. On October 21, 2025, Amnesty revealed findings that TikTok's algorithm creates a "spiral" effect, steering teenagers who show interest in sad or depressive content toward increasingly distressing videos within an hour. Spokesperson Katia Roux announced a complaint filed with the French regulator Arcom under the EU's Digital Services Act (DSA), citing TikTok's failure to meet the legal obligations to protect minors that have applied to the platform since August 2023.
Amnesty conducted experiments using fake accounts registered as 13-year-olds, which were shown predominantly mental health-related, and at times suicidal, content in less than an hour. The NGO collaborated with the Algorithmic Transparency Institute to compare these manually managed profiles with automated ones, finding that the manual profiles experienced greater amplification of harmful content.
TikTok responded by questioning the study's methodology and emphasizing its moderation efforts, saying it removes 90% of violating videos before they are viewed and aims to provide a safe environment for teens. Amnesty countered that TikTok's measures to identify risks and prevent the spread of harmful content remain insufficient.
The complaint adds to the European Commission's ongoing investigation, opened in February 2024, into TikTok's alleged failures to protect minors on its platform, reinforcing concerns over youth mental health amid broader debates on digital content regulation.