
EU Launches Investigation into Meta Over Child Safety Concerns on Facebook and Instagram

Updated: Jun 6

EU Investigates Potential Breaches of Child Safety Rules by Facebook and Instagram

The European Union (EU) has launched a formal investigation into Meta Platforms (META.O), the parent company of Facebook and Instagram, for potential violations of the bloc's Digital Services Act (DSA) regarding child safety. If Meta is found to be inadequately addressing risks to children on its platforms, the investigation could result in substantial fines; DSA penalties can reach up to 6% of a company's global annual turnover.


DSA Enforces Action on Harmful Content

The EU's landmark DSA, which came into effect last year, imposes stricter obligations on online platforms to combat illegal and harmful content. As part of its compliance efforts, Meta submitted a risk assessment report in September 2023. However, the EU Commission expressed concern that the current measures on Facebook and Instagram fall short of protecting children.


Specific Areas of EU Scrutiny

The EU's investigation focuses on two key areas:

  • Algorithmic Addiction: The EU Commission is concerned that Facebook and Instagram's algorithms, which personalize user experiences, may encourage "behavioral addictions" in children. These algorithms can also trap users in echo chambers, exposing them only to content that reinforces their existing views and potentially leading to harmful online behavior.

  • Age Verification Concerns: The EU also doubts the effectiveness of Meta's age verification methods. Inaccurate or lax age verification can allow children to access inappropriate content that could be detrimental to their well-being.


Meta's Response

Meta maintains that it prioritizes child safety and has implemented various online tools to safeguard young users. A company spokesperson emphasized their decade-long commitment to developing over 50 tools and policies specifically designed to protect children on their platforms.



The Importance of Child Online Safety

The EU's investigation sheds light on the growing global concerns about online safety for children. Social media platforms have a significant responsibility to create safe spaces for young users and prevent them from encountering harmful content or developing unhealthy online behavior.


What's Next?

The EU's investigation into Meta serves as a reminder of the importance of online safety regulations. As the investigation progresses, it will be crucial to determine the effectiveness of Meta's existing safeguards and explore how online platforms can be held accountable for protecting children in the digital age.



Source: Reuters



