A French parliamentary committee has recommended filing criminal charges against TikTok for allegedly endangering young users' lives. The move follows a months-long investigation into the platform's impact on children and growing concerns over toxic online content.
The committee also advocated a complete ban on social media for children under 15, proposing measures that could reshape how digital platforms are regulated in Europe.
Why Did France Target TikTok With Criminal Charges?
The committee’s investigation concluded TikTok “knowingly exposes children to hazardous and addictive content,” with Socialist chairman Arthur Delaporte announcing plans to refer the case to Paris prosecutors.
The inquiry began after families came forward in 2024, blaming TikTok for exposing children to harmful material such as content encouraging self-harm and suicide.
Lawmakers argue that TikTok’s parent company, ByteDance, has failed to take adequate steps to shield minors from psychological harm. They claim the platform's algorithms and moderation practices have endangered a vulnerable population.
Did you know?
Australia is the first country to ban all social media for children under 16, with its law set to take effect in December 2025.
What Are the Main Findings of the Parliamentary Investigation?
The report, adopted unanimously by all 28 committee members, describes TikTok as “a production line of distress” for young users. Investigators also allege that TikTok executives committed perjury when they denied knowledge of internal harm reports, raising legal risks beyond civil liability.
Emotional testimony from grieving parents, alongside evidence of self-harm videos and other toxic content presented in the hearings, drove home the need for immediate intervention.
How Did Parents and Lawmakers Respond to TikTok’s Safety Measures?
In response, TikTok said its AI-driven moderation catches the majority of harmful content in France, arguing that it cannot be held solely responsible for wider societal issues.
Lawmakers countered that safety efforts remain insufficient and accused the company of complicity in endangering children.
The committee’s actions followed lawsuits from seven families and mounting political pressure, pushing the debate over platform accountability to the forefront of French policymaking.
What Regulatory Steps Are Proposed to Protect Children Online?
Among the committee’s 43 recommendations are a total social media ban for under-15s and a “digital curfew” restricting access between 10 PM and 8 AM for teens aged 15 to 18.
Lawmakers also proposed a digital negligence offense for parents who fail to protect children online, in addition to criminal liability for platforms.
President Emmanuel Macron expressed support, aligning France with Australia and EU efforts to strengthen child safety regulations across the internet.
How Could France’s Case Influence Global Social Media Regulation?
The Paris prosecutor’s decision on whether to proceed with criminal charges could set a precedent for Europe and beyond. Other EU countries are already considering similar age restrictions, and the European Commission plans to convene experts on platform safety by year-end.
If successful, France’s move could inspire broader reforms and encourage governments worldwide to hold tech giants accountable for safeguarding youth on their platforms.