The European Commission has launched investigations into Facebook and Instagram over growing concerns about the safety of children on these platforms.
The probe centers on concerns that the platforms' algorithms may foster behavioral addictions and create "rabbit-hole effects" in young users, drawing them into prolonged and potentially harmful online activity.
A significant aspect of the investigation focuses on the age assurance and verification methods employed by Meta, the parent company of Facebook and Instagram.
The commission is scrutinizing whether these methods effectively prevent underage users from accessing content inappropriate for their age.
The investigations are being conducted under the EU's Digital Services Act (DSA), which imposes content-moderation and user-protection obligations on online platforms operating within the EU.
The formal proceedings grant the commission the authority to enforce compliance, implement interim measures, and secure commitments from Meta to rectify any identified issues.
Facebook and Instagram have been designated as Very Large Online Platforms (VLOPs) under the DSA, a classification applied to platforms with more than 45 million average monthly active users in the EU.
This status subjects them to stricter regulatory scrutiny and higher compliance standards than smaller platforms.