In a bid to safeguard teenagers from online risks, Snapchat is set to introduce a slate of new features designed to create a safer environment for its younger users. Snap Inc., Snapchat's parent company, has unveiled plans for a comprehensive strike system and new detection technologies intended to remove accounts that promote inappropriate content to teenagers.
As part of these safety measures, Snapchat is taking steps to protect teenagers from unsolicited contact by strangers. A key feature is a pop-up warning that alerts teenagers when someone they haven't interacted with in real life attempts to add them as a friend. The app will also offer a straightforward option for teenagers to report or block unfamiliar individuals who reach out to them.
The platform already enforces strict guidelines for 13- to 17-year-olds, requiring mutual friends before a new contact can be added. Snapchat is now strengthening this safeguard by raising the mutual-connection threshold, aiming to bolster protection against issues such as violence, self-harm, misinformation, sexual exploitation, and pornography.
Snapchat's 'Strike System' will play a central role in promptly removing age-inappropriate content. The platform is also introducing new in-app content covering topics such as responsible sharing, online safety, and mental health. Developed in collaboration with 'Young Leaders for Active Citizenship,' this content will appear in the 'Stories' section and will roll out to all users in the coming weeks.
For parents concerned about their children's safety on Snapchat, the platform has launched a YouTube series explaining how kids can stay safe while using the app. Over the past few months, Snapchat has also introduced several generative-AI-powered features, including the My AI chatbot, new augmented reality lenses, and the ability to try on outfits using AR.