Meta partnered with Lantern Program to improve safety of children in online spaces
Growing concerns about child safety on social media have prompted tech giant Meta, led by Mark Zuckerberg, to join the Lantern program. Lantern is the first cross-platform signal-sharing program dedicated to child safety, launched by the Tech Coalition, a global alliance of tech companies that combat child sexual abuse and exploitation online. Meta's participation as a founding member of Lantern highlights the importance of robust policies that ensure compliance and swift redress when violations occur. The move also signals the growing responsibility of tech giants to address child safety as the use of social media platforms continues to expand.
Meta already uses technologies such as Microsoft's PhotoDNA and its own PDQ hashing to fight the spread of child sexual abuse material online. Despite these efforts, Meta recognizes the need for additional solutions to stop predators who move across multiple apps and websites to target children. The company says it aims to give young people safe and positive experiences online and has spent a decade developing tools and policies to protect them. Lantern itself was launched by the Tech Coalition, a global alliance that works collaboratively to combat online child sexual exploitation and abuse; the coalition's goal is to inspire, guide, and support its members in fostering collaboration to protect children from online threats.
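Hash-matching systems like PhotoDNA and PDQ work by computing a perceptual fingerprint of an image and comparing it against a database of fingerprints of known abuse material; PDQ hashes, for example, are compared by Hamming distance. The sketch below is purely illustrative, not the actual PhotoDNA or PDQ APIs: the fingerprints are made-up 16-bit values and the match threshold is a hypothetical example.

```python
# Illustrative sketch of hash-based matching (NOT the real PhotoDNA/PDQ APIs).
# Real PDQ hashes are 256-bit perceptual fingerprints; these 16-bit values
# and the threshold are invented for demonstration only.

def hamming_distance(a: int, b: int) -> int:
    """Count the number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_hash(candidate: int, known_hashes: set, threshold: int) -> bool:
    """Return True if the candidate is within `threshold` bits of any known hash."""
    return any(hamming_distance(candidate, h) <= threshold for h in known_hashes)

# Hypothetical database of known-bad fingerprints.
known = {0b1010110011010010, 0b0111000110101100}

near_match = 0b1010110011010110   # one bit away from the first known hash
unrelated = 0b0000000000000000

print(matches_known_hash(near_match, known, threshold=4))   # True
print(matches_known_hash(unrelated, known, threshold=4))    # False
```

Using a distance threshold rather than exact equality is what makes perceptual hashing robust to small edits such as resizing or recompression, which would defeat a cryptographic hash.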
Online child sexual exploitation and abuse (CSEA) is a pervasive threat that cuts across platforms and services, making cross-platform collaboration essential. Lantern addresses this need by identifying signals tied to accounts and profiles that violate policies against CSEA, such as inappropriate sexualized contact with a child, online grooming, and financial sextortion of minors. Signals shared through Lantern include email addresses, hashes of child sexual abuse material, and keywords used in grooming or in buying and selling such material. These signals are not definitive proof of abuse, but they give investigators valuable leads for tracing perpetrators in real time.
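To make the signal-sharing idea concrete, here is a minimal sketch of how a member platform might model an incoming signal and surface matching accounts for human review. The field names, schema, and matching logic are hypothetical illustrations, not Lantern's actual data format or API.

```python
# Purely illustrative data model for a cross-platform signal record;
# the schema and values are hypothetical, not Lantern's actual format.
from dataclasses import dataclass

@dataclass
class Signal:
    signal_type: str       # e.g. "email", "media_hash", "keyword"
    value: str             # the shared indicator itself
    source_platform: str   # which member company uploaded it
    policy_violated: str   # e.g. "grooming", "financial_sextortion"

def review_queue(signals, local_account_emails):
    """Flag local accounts whose email matches a shared email signal.
    A match is a lead for human review, not proof of wrongdoing."""
    shared_emails = {s.value for s in signals if s.signal_type == "email"}
    return [email for email in local_account_emails if email in shared_emails]

signals = [Signal("email", "example@invalid.test", "PlatformA", "grooming")]
flagged = review_queue(signals, ["example@invalid.test", "other@invalid.test"])
print(flagged)  # ['example@invalid.test']
```

Note that the matching step only builds a review queue; as the article describes, each company then reviews the flagged activity and acts according to its own enforcement procedures.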
Before Lantern was established, there was no effective way for companies to take collective action against individuals engaging in harmful activities. The program has improved the ability to prevent and detect such activity, expedited the identification of threats, and increased situational awareness across platforms. Participating companies can securely share signals about activity that violates their child safety policies, upload and access those signals, review similar activity on their own platforms, and act on it according to their enforcement procedures.

The program promotes responsible management through safety and privacy safeguards, prioritizing human rights, transparency, and stakeholder engagement. Meta's involvement in Lantern reflects a collaborative industry effort to confront the challenges of child safety in the digital world, underscoring the shared responsibility of tech companies to create a secure online environment for younger users.