
Tech Firms Launch Lantern to Eradicate Child Predators

The Tech Coalition – an industry group whose members include Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch – announced on Tuesday a new initiative for better identifying child predators who hide their activity by hopping across tech platforms.

Lantern allows the companies in the coalition to exchange information about child sexual exploitation and abuse, increasing their detection and prevention capabilities, building situational awareness, and helping them report criminal offenses.

John Litton, the coalition’s executive director, explained on the group’s website that online child exploitation and abuse is a pervasive threat found across different platforms and services.

He said he was particularly concerned about two of the most dangerous threats facing young people today: online grooming (inappropriate sexualized contact with children) and financial sextortion.

He wrote that “predators often begin by contacting young people in public forums and posing as friends or peers.” They then direct their victims to private chats or different platforms to solicit and share child sexual abuse material (CSAM).

He said that because this activity spans multiple platforms, a single company may see only a fraction of the harm a victim faces. “To get the full picture, and to take action properly, companies have to work together.”

Gathering Signals to Combat Child Exploitation

How does the Lantern Program work?

  • Lantern allows participating companies to share “signals” – alerts about activity that violates their policies against child sexual exploitation.
  • Signals can include information tied to policy-violating accounts, such as usernames, CSAM hashes, keywords, or email addresses. Signals do not prove abuse on their own; they provide clues that guide further investigation.
  • Participating companies upload signals to Lantern and run them against their own platforms. They review any content or activity the signals surface against their respective policies and terms of service, then take the appropriate enforcement action – for example, removing an account and reporting criminal activity to law enforcement agencies and the National Center for Missing and Exploited Children (NCMEC). A minimal illustrative sketch of this matching flow follows the infographic below.

Infographic: How the Lantern child safety signal-sharing program works (The Tech Coalition)
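To make the signal-sharing flow described above more concrete, here is a minimal, hypothetical sketch in Python. It is not the actual Lantern API or data model: the names (Signal, SignalBus, make_signal), the choice of SHA-256-hashed identifiers, and the matching logic are all assumptions made for illustration only. The idea is simply that one platform uploads hashed indicators tied to a policy-violating account, and another platform checks its own data against the shared pool, treating any match as a lead for human review rather than proof of abuse.

```python
# Hypothetical sketch only -- not the real Lantern API or data model.
import hashlib
from dataclasses import dataclass


@dataclass(frozen=True)
class Signal:
    """A hashed indicator shared between platforms, e.g. an email address,
    username, or content hash. A signal is a clue, not proof of abuse."""
    kind: str        # e.g. "email", "username", "content_hash"
    value_hash: str  # SHA-256 of the normalized raw value


def make_signal(kind: str, raw_value: str) -> Signal:
    # Hash the raw identifier so platforms exchange indicators rather than
    # plaintext personal data (an assumed design choice for this sketch).
    digest = hashlib.sha256(raw_value.strip().lower().encode("utf-8")).hexdigest()
    return Signal(kind=kind, value_hash=digest)


class SignalBus:
    """Stand-in for the shared signal pool: member platforms upload signals
    and query their own data against it."""

    def __init__(self) -> None:
        self._signals: set[Signal] = set()

    def upload(self, signal: Signal) -> None:
        self._signals.add(signal)

    def matches(self, kind: str, raw_value: str) -> bool:
        return make_signal(kind, raw_value) in self._signals


if __name__ == "__main__":
    bus = SignalBus()

    # Platform A flags an account that violated its child-safety policy and
    # uploads the associated indicators as signals.
    bus.upload(make_signal("email", "flagged-account@example.com"))
    bus.upload(make_signal("username", "flagged_username_123"))

    # Platform B checks a newly registered account against the shared pool.
    # A match is a lead for human review and possible enforcement or a report
    # to law enforcement / NCMEC, not automatic proof of abuse.
    if bus.matches("email", "flagged-account@example.com"):
        print("Signal match: queue account for trust and safety review")
```

Hashing identifiers before sharing is one plausible way to compare indicators without exchanging raw personal data, which lines up with the privacy considerations discussed later in this article.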


Litton explained that, “until now, companies have not been able to cooperate against predatory actors who evade detection by using different services. Lantern helps fill this gap and sheds light on cross-platform attempts at child sexual exploitation, making the internet safer and more secure for children.”

Praise for the Lantern Initiative

Alexandra Popken is vice president of trust and safety at WebPurify, an Irvine, Calif.-based service offering cloud-based web content filtering and online protections for children.

She told TechNewsWorld, “Each platform has its own unique set of challenges in addressing CSAM, whether related to knowledge, resources, or tooling. Lantern represents unity among platforms in combating this issue and provides the infrastructure needed to do so.”

Lantern builds upon work tech companies have already done to share information with law enforcement. Ashley Johnson, an analyst at the Information Technology and Innovation Foundation, a research and public policy organization in Washington, D.C., said Lantern is a continuation of those existing efforts.


She told TechNewsWorld that she hoped to see more collaborations of this nature. “I could see this as a useful tool for fighting terrorist content. But I think that online child sexual abuse would be a good place to begin with this type of information sharing.”

Popken explained that malicious actors weaponize platforms in a wide range of ways and employ various strategies to avoid detection.

She explained that, in the past, platforms were reluctant to share signals because doing so could be read as an admission of abuse on their services. “However…initiatives like this demonstrate a paradigm shift, recognizing that cross-platform sharing enhances collective security and safeguards users’ well-being.”

Tracking Platform Nomads

Chris Hauk is a consumer privacy advocate at Pixel Privacy, a publisher of privacy and security guides for consumers.

He said that “sharing information across networks” will help social platforms better detect this kind of activity.

He added that when predators are shut down on one website or app, they simply move to another. “By sharing the information, social networks are able to stop this type of activity.”

Johnson said it is very common in online grooming cases for perpetrators to ask their victims to move their communications from one platform to another.

A predator might suggest switching to another platform, she said, because it offers more privacy or weaker parental controls, which makes the ability to track activity across different platforms important.

Managing Data Responsibly in Child Safety Initiatives

An important aspect of the program is that Lantern can speed up the detection of threats against children. Popken said that “if data uploaded into Lantern is scanned against other platforms and content can be rejected or surfaced for review in real time,” that represents a meaningful step forward in enforcing against this problem at scale.


Litton noted in his posting that the coalition spent the past two years developing Lantern to ensure the program is not only effective and efficient against online child sexual exploitation and abuse but also managed responsibly. That work included:

  • Engaging Business for Social Responsibility to conduct a Human Rights Impact Assessment (HRIA) to help ensure respect for human rights.
  • Inviting more than 25 experts, organizations, and government agencies focused on digital rights, child safety, and advocacy for marginalized communities to provide feedback and take part in the HRIA.
  • Promoting transparency by including Lantern in The Tech Coalition’s annual transparency report and providing recommendations to participating companies on how to reflect their participation in their own transparency reports.
  • Designing Lantern for safety and privacy.

The Importance of Privacy in Child Protection Measures

Johnson explained that privacy is paramount whenever data is shared, particularly information about children, because children are an especially vulnerable population.

“It is very important that the companies participating in this project protect children’s identities and data and prevent them from falling into the wrong hands,” she continued.

Based on what we have seen from tech firms so far, she said, they have done an excellent job of protecting victims’ privacy, and she hopes they can keep it up.

However, Paul Bischoff, a privacy advocate at Comparitech, an online resource for reviews, information, and advice on consumer security products, warned that Lantern is not perfect.

“An innocent person,” he told TechNewsWorld, “could unwittingly trigger a ‘signal’ that spreads information about them to other social networks.”

Online Grooming: A Comprehensive Overview

The Tech Coalition also released a research paper titled “Considerations for Detecting, Responding to, and Preventing Online Grooming.” The paper sheds light on the complexities of online grooming and outlines the collective measures the tech sector is taking against it.

The document, which is intended for educational purposes, explores established protocols as well as the industry’s efforts to minimize the impact of predatory behavior.

The Tech Coalition offers the paper as a direct download; no registration or form is required.