The Mental Health Coalition (MHC), a nonprofit organization dedicated to mental health advocacy, has introduced a new initiative named Thrive, aimed at addressing the issue of suicide and self-harm content on digital platforms.
Thrive’s primary objective is to enable online platforms to detect and manage potentially harmful material by sharing unique digital identifiers, or hashes, of graphic content.
The program’s initial participants include major tech companies Meta (which operates Facebook, Instagram, and WhatsApp), Snap, and TikTok.
Meta has provided the technical infrastructure for Thrive, drawing on its experience with the Tech Coalition’s Lantern child safety program.
Thrive is designed to allow participating platforms to share signals about harmful content, receive alerts when matching material surfaces elsewhere, and independently review and take action on anything flagged.
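The exact matching technology behind Thrive has not been published in detail, but the hash-sharing pattern described above is well established. The following is a minimal illustrative sketch of that pattern, not Thrive’s actual implementation: the names (shared_hashes, fingerprint, contribute, check_upload) are hypothetical, and a cryptographic hash stands in for the perceptual hashes (such as Meta’s open-source PDQ) that production systems typically use to tolerate re-encoding and cropping.

```python
import hashlib

# Hypothetical shared database of fingerprints contributed by
# participating platforms. In practice this would be a managed,
# access-controlled service, not an in-memory set.
shared_hashes: set[str] = set()

def fingerprint(media_bytes: bytes) -> str:
    """Compute a digital fingerprint of a media file.

    A cryptographic hash is used here purely to keep the sketch
    self-contained; real systems typically use perceptual hashes,
    which still match after minor edits to the media.
    """
    return hashlib.sha256(media_bytes).hexdigest()

def contribute(media_bytes: bytes) -> None:
    """One platform shares the fingerprint of content it has
    confirmed as violating, without sharing the media itself."""
    shared_hashes.add(fingerprint(media_bytes))

def check_upload(media_bytes: bytes) -> bool:
    """Another platform checks an upload against the shared set.
    A match only flags the content for that platform's own review
    process; it does not automatically remove anything."""
    return fingerprint(media_bytes) in shared_hashes

# Example: platform A contributes a known violating file,
# platform B later sees the same bytes and gets a match.
contribute(b"example violating media bytes")
print(check_upload(b"example violating media bytes"))  # True -> route to human review
```

The key design property this illustrates is that only fingerprints are exchanged, never the underlying media, and a match triggers each platform’s own independent review rather than automatic removal.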
The program will be managed by Dan Reidenberg, Thrive’s director and managing director of the National Council for Suicide Prevention.
Notably, X (formerly Twitter) and Google (which owns YouTube) are not involved in the Thrive initiative. X has substantially reduced its moderation team, while YouTube has been criticized for recommending harmful content.
Large platforms more broadly have faced legal and ethical scrutiny over such material: a British coroner recently ruled that self-harm content viewed on Instagram contributed to the suicide of a 14-year-old girl.
Research indicates a connection between extensive social media use and diminished mental well-being, including heightened risks of depression and anxiety.
The Thrive program represents a significant step toward addressing these challenges and improving online safety.