EU Targets Meta in Crackdown on Big Tech’s Impact on Kids’ Mental Health
Facebook’s parent company, Meta, is the subject of a formal inquiry by the European Commission. Regulators worry that children’s mental health and welfare may be harmed by addictive behaviours encouraged by its wildly popular social media platforms, Facebook and Instagram.
This high-stakes investigation will examine whether Meta has breached the Digital Services Act (DSA), a sweeping new law that gives EU authorities the power to act against online harms caused by large tech companies. The DSA, which came into force for the whole 27-nation bloc last year, holds digital platforms responsible for problems such as fraud, disinformation, child exploitation, and other social harms enabled by their algorithms and business models.
Governments around the world are working to rein in the negative effects of social media on children, and the inquiry into whether Facebook and Instagram encourage compulsive use by young people is one such effort. For years, Meta (formerly Facebook) has faced intense criticism and allegations that it built deliberately addictive features and algorithms to maximise user engagement at all costs, particularly among children and teens. In October 2023, a group of more than thirty US state attorneys general sued the tech giant, claiming it had used “psychologically manipulative tactics” to entice young people in breach of consumer protection rules.
In launching its formal investigation, the European Commission raised particular concerns about the core technologies that drive Meta’s platforms. Authorities suspect that Facebook’s and Instagram’s algorithms may foster behavioural addictions in children by creating harmful “rabbit holes”, in which recommendation engines serve up ever more extreme or captivating content to keep young users scrolling endlessly.
Also at issue are the effectiveness of Meta’s age-verification procedures and their capacity to keep minors away from inappropriate or potentially dangerous content. The Commission’s reservations about these safeguards draw on the results of a risk assessment that Meta itself submitted in September 2023.
If the EU inquiry finds that Meta did breach the DSA, the company could face severe penalties, including fines of up to 6% of its global annual turnover, as well as orders to change its algorithms and halt any practices found to be endangering the well-being of young people.