Big tech companies such as Meta (formerly Facebook) and Google have come together to address the pressing issue of online child sexual abuse and exploitation. They have launched a new collaborative program called Lantern, which shares signals of activity that violates their child-exploitation policies. By pooling this information, platforms can more quickly detect, take down, and report abusive content.
Protecting child victims of abuse is a crucial concern for regulators and tech companies alike, and these companies are eager to demonstrate their commitment to safeguarding children and teenagers online. Lantern fills a significant gap in existing collaboration efforts by shedding light on cross-platform attempts to exploit and abuse children. With this program, its backers argue, the internet can become a safer space for kids.
The participating tech companies will share various signals of activity that indicate child exploitation. These signals can include email addresses, specific hashtags, or keywords that are commonly used by predators to groom and exploit young people. By identifying and sharing these signals, platforms can swiftly detect potential instances of abuse, take down the content, and report it to the relevant authorities.
Apart from Meta and Google, several other platforms have joined the collaborative effort through the Tech Coalition. This coalition encompasses companies like Snap, Discord, and Mega, a privacy-focused platform based in New Zealand. By uniting under the shared goal of protecting children online, these platforms contribute their expertise and resources to make a significant impact.
During the pilot phase of Lantern, Meta took prompt action based on information shared by Mega. As a result, Meta removed more than 10,000 Facebook profiles, pages, and Instagram accounts associated with child exploitation. The company reported these accounts to the National Center for Missing & Exploited Children in the United States and shared its findings with other platforms for their own investigations.
Child predators do not limit their activities to a single platform; they exploit various apps and websites. Antigone Davis, Global Head of Safety at Meta, emphasized the need for the technology industry to collaborate and protect children across different platforms. By sharing information and working together, tech companies can effectively counter predators’ attempts to harm children online.
A Damning Testimony
Coinciding with the announcement of the Lantern program, a former senior engineer from Meta testified before a Senate hearing in Washington. Arturo Bejar revealed that top executives, including Mark Zuckerberg, ignored his warnings about the safety of teenagers on the company's platforms. An internal survey conducted on Instagram found that 13 percent of 13- to 15-year-olds had received unwanted sexual advances within the previous seven days. Bejar's testimony underscores the urgent need to address shortcomings in existing measures and prioritize the safety of young users.
The launch of Lantern marks a significant collaborative effort by big tech companies to combat online child sexual abuse and exploitation. By sharing signals of activity, platforms can improve their ability to detect, remove, and report harmful content, and the participation of multiple platforms through the Tech Coalition presents a united front in keeping children safe online. The testimony of a former Meta senior engineer, however, serves as a reminder that further improvements are needed to protect young users from harm. With continued collaboration and a deeper commitment to child protection, the internet can become a safer place for children to explore and learn.