
Tech Coalition

ADVANCING CHILD SAFETY THROUGH SIGNAL SHARING

TRANSPARENCY REPORT 2024
About this report

We are proud to present our second annual Lantern transparency report. This report provides a review of Lantern's impact in 2024, highlighting key programmatic updates, successes, challenges, and opportunities for future growth.

Many of the core operational activities related to Lantern remain unchanged from those outlined in the 2023 Transparency Report. For broader context, readers are encouraged to refer to last year's transparency report before reviewing the 2024 Transparency Report. The following sections highlight key improvements, process changes, and program updates, ensuring a clear picture of Lantern's evolution while minimizing redundancy.

As industry collaboration deepens, momentum is building in the collective effort to combat OCSEA. The Tech Coalition remains committed to strengthening Lantern in order to create a safer digital world for children.

Contents

Introduction

Lantern Participation
  Lantern Participation Criteria
  Eligibility
  Expanding Lantern's Reach
  Application Process
  Lantern Participant Expectations
  Participating Companies
  2024 Program Enrollment
  2024 Program Engagement
  ThreatExchange Integration
  Annual Compliance Process

2024 Activities
  2024 Program Activities
  Financial Pilot
  Human Rights Due Diligence
  Government Access and Disclosures
  Operational Improvements

Signal Sharing
  How Lantern Works
  Signal Sharing Framework
  Program Taxonomy
  Parameters for Signal Sharing
  Parameters for Signal Use

Metrics and Outcomes
  Uploaded Signals in 2024
  Removed Signals in 2024
  Content-Based Signals
  Incident-Based Signals
  Cross-Platform Flags in 2024
  Measuring Impact and Outcomes

Taxonomy Appendix
  Appendix A - Program Taxonomy

Introduction

Lantern enables technology companies to share critical insights, identify patterns of abuse, and take action in ways that no single company could achieve alone.

Efforts to combat online child sexual exploitation and abuse (OCSEA) face growing complexity and urgency. Perpetrators continue to exploit evolving technologies, leveraging multiple platforms and tools to groom, manipulate, and exploit children.

Companies are under increasing pressure not only to detect and remove child sexual abuse material (CSAM), but also to take proactive measures to prevent and disrupt harm, both at its source and across platforms.

For too long, efforts to combat OCSEA have been siloed, allowing offenders to exploit gaps between platforms and evade detection. As the new Evolving Technologies Horizon Scan from Thorn and WeProtect Global Alliance emphasizes, "to focus on one [technology or platform] at the exclusion of others will only further the game of 'whack-a-mole,' which we have played for the last several decades."

To address these gaps, Lantern officially launched in August 2023 following a two-year pilot, establishing the first cross-platform signal-sharing program to enhance industry collaboration against OCSEA.

Over 1 million signals have been shared to date. In 2024, Lantern signals have led to:

• enforcement actions against 102,082 accounts
• removal of 7,048 pieces of CSAM
• blocking or removal of 135,077 CSEA URLs

all in addition to actions taken by the original signal uploader.

Lantern Participation

Lantern Participation Criteria

Eligibility
Lantern is a voluntary initiative for companies and serves as one of many tools available to combat OCSEA. Each company may decide whether to pursue enrollment based on how the program aligns with its capabilities, practices, and strategic goals for protecting children on its platform(s).

There is no cost for companies to participate in Lantern. The program is fully funded by the Tech Coalition and its members, with generous in-kind support from Meta for technical hosting services.

Both Tech Coalition members and non-members that meet Lantern's eligibility requirements and are committed to collaborating to combat OCSEA are welcome to apply.

Lantern remains an industry-only initiative, not available to NGOs, researchers, law enforcement, governments, or other entities.

Looking ahead to 2025, the Tech Coalition is assessing the feasibility of integrating select trust and safety vendors to explore whether they could offer participating companies more efficient ways to engage with Lantern. Any potential integration will undergo extensive vetting and review before any decisions or implementation occur.

Expanding Lantern's Reach: Financial Institutions

Lantern launched with only technology companies, recognizing their central role in detecting and preventing OCSEA.

However, as suggested in the 2023 Transparency Report, our team conducted an assessment to evaluate additional industries, such as the financial and hospitality sectors, where cross-platform OCSEA cases frequently occur and where signal-sharing could serve as an effective tool.

Following this review, we initiated a financial sector pilot in August 2024, exploring how financial institutions can contribute to disrupting OCSEA-related activities. As a result, Lantern now includes both technology companies and select US-based financial institutions.

A detailed discussion of the financial pilot's scope and objectives is included later in this report.

Application Process
To become a Lantern participant, companies must complete a thorough application process and compliance review before entering into a formal legal agreement with other Lantern participants.

This process helps ensure that companies have the necessary policies, safeguards, and operational procedures to appropriately share and handle signals in accordance with legal, ethical, and security requirements. The Tech Coalition oversees and administers this process to maintain the integrity and effectiveness of the program.

For technology companies, the application process was outlined in last year's 2023 Transparency Report. In 2024, the application was updated to include more granular confirmation that companies have properly staffed teams to manually review signals, investigate cases, and responsibly take action as appropriate and permitted by law.

To accommodate the addition of financial institutions, the Tech Coalition developed a separate application process tailored to the unique responsibilities and regulatory requirements of financial companies based in the US. While mirroring the core structure of the tech company application, this version includes additional considerations such as:

• Tools used to detect suspicious financial activities related to OCSEA,
• Processes for handling accounts flagged for OCSEA-related activity,
• Balancing detection and reporting obligations with the need to prevent over-actioning,
• Required reporting mechanisms (e.g., NCMEC CyberTip Reports, Suspicious Activity Reports aka "SARs", etc.), and
• Protocols for notifying law enforcement or regulators, given financial institutions' distinct regulatory frameworks.

These updates reflect Lantern's commitment to ensuring all participants are equipped to responsibly manage and act on OCSEA-related signals while adhering to the highest standards of privacy, security, and due process regardless of sector.

Lantern Participant Expectations

The Tech Coalition maintains Official Program Expectations (formerly "Commitments") that all applicants agree to uphold before joining Lantern. These expectations establish clear participation principles, promoting responsible and effective engagement in the program.

For context, the human rights impact assessment (HRIA) conducted by Business for Social Responsibility (BSR) highlights several important ways in which Lantern - despite its legitimate goal of combating OCSEA - may inadvertently put certain human rights in tension, e.g., the right to privacy and freedom of expression. However, if implemented and managed carefully, potential impacts can be prevented, addressed, and mitigated.

The expectations stem from recommendations provided in the HRIA across key themes and recommended practices, including:

• Engagement: regularly contributing to Lantern in tangible ways that produce real-world outcomes in the fight against OCSEA;
• Quality assurance: ensuring that shared signals are accurate, relevant, and necessary to effectively combat OCSEA;
• Transparency: promoting accountability and trust among stakeholders through disclosure of processes, metrics, and outcomes where appropriate;
• Human rights: taking a human rights-based approach to signal-sharing, investigations, and handling government requests or inquiries related to Lantern;
• Annual compliance: demonstrating continued commitment by participating in annual training and compliance reviews.

Each expectation has been carefully developed to reflect recommendations in the HRIA, mitigate risks, and uphold fundamental human rights. The Tech Coalition and participating companies will review and update these expectations annually, as needed, to confirm continued relevance and effectiveness.
Participating Companies

By the end of 2023, 12 companies had joined Lantern, demonstrating a strong early commitment to cross-platform collaboration in combating OCSEA.

Participation more than doubled in 2024, with 26 companies now enrolled in the program*. Of these, 23 companies come from the tech sector, while three financial institutions enrolled as part of the financial sector pilot.

* The twenty-fifth participant declined to be included in this report, while the twenty-sixth participant was unable to meet compliance requirements and will no longer be continuing with the program; therefore, neither logo is listed.

2024 Program Enrollment

To ensure accessibility, Lantern remains free and does not require engineering resources for participation.

In 2024, the Tech Coalition focused its outreach efforts on companies that aligned with the following:

• Had prior evidence of cross-platform abuse occurring on their platform that could benefit from signal-sharing, or hosted large volumes of content that could benefit from hash and URL sharing.
• Reached a sufficient level of maturity in their child safety investigation workflows to effectively integrate signal-sharing into their processes.
• Demonstrated interest and willingness to actively engage in the program with other industry partners.

Looking ahead to 2025, the Tech Coalition will continue refining its prospecting strategy with a focus on increasing engagement in specific harm areas or industry sectors, such as the financial sector pilot.

2024 Program Engagement


While overall enrollment in Lantern remains an important metric, the Tech Coalition has also focused on moving beyond enrollment as the sole measure of company participation. Meaningful progress in combating OCSEA stems from active engagement. We recognize companies that have dedicated resources to meet the engagement guidelines outlined in the Official Program Expectations.

It is important to note that these engagement guidelines are voluntarily pursued, and companies vary in their operational readiness and capacity to contribute due to a variety of factors. This acknowledgment is not intended to penalize companies at different stages of implementation but to highlight those making proactive strides in participation.

Currently, engagement is defined as making recurring contributions to Lantern in one or more of the following ways:

• Directly contributing signals to Lantern related to violations of OCSEA.
• Providing feedback and reactions on signals uploaded by other companies to assist with the quality assurance process.
• Sharing outcomes regarding how signals were used in investigations and the results of said investigations.

In 2024, the following companies met the engagement criteria by making regular contributions to the Lantern program:

• Block, Inc.
• Discord
• Dropbox
• MediaLab
• Mega
• Meta
• Niantic
• Reddit
• Snap
• Western Union
• X Corp.
• Yahoo

By recognizing engagement beyond enrollment, we aim to encourage deeper participation while acknowledging the different operational realities companies face as they work toward implementing and scaling their participation in Lantern.

ThreatExchange Integration
Lantern operates on ThreatExchange, a platform developed by Meta to enable organizations to share information securely and in a privacy-compliant manner. Lantern data is securely shared within ThreatExchange and can be accessed via a user interface or API.

In 2024, two companies fully integrated with the API, 19 participants interacted with Lantern through the user interface, and five companies had not yet completed onboarding to access ThreatExchange.
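Since participants interact with Lantern through ThreatExchange's interface or API, it may help to see what a programmatic signal pull could look like. Below is a minimal sketch of querying a ThreatExchange-style endpoint; the endpoint and field names follow Meta's public ThreatExchange Graph API documentation, while the access token, API version, and query values are placeholders, and Lantern's actual private configuration is not shown.

```python
import requests

# Minimal sketch of pulling signals from a ThreatExchange-style Graph
# API endpoint. Endpoint and field names follow Meta's public
# ThreatExchange documentation; the token, API version, and query
# values below are placeholders, not Lantern's real configuration.
ACCESS_TOKEN = "app-id|app-secret"  # placeholder credentials
URL = "https://graph.facebook.com/v19.0/threat_descriptors"

params = {
    "access_token": ACCESS_TOKEN,
    "type": "URI",  # e.g., URL-based signals
    "fields": "raw_indicator,type,status,added_on,tags",
    "limit": 25,
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()

for descriptor in resp.json().get("data", []):
    # Each downloaded signal would be queued for independent human
    # review against the company's own policies -- never auto-enforced.
    print(descriptor["raw_indicator"], descriptor.get("status"))
```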

Annual Compliance Process


As part of the Official Program Expectations, participating companies are required to complete an annual compliance process to help maintain responsible engagement with Lantern. In 2024, this process included the following activities:

• Mandatory personnel training covering human rights due diligence and risk mitigation (in partnership with BSR), data protection principles for handling and sharing signals, an overview of Lantern's purpose and process restrictions, and other operational considerations.
• A mandatory company survey assessing compliance with legal requirements outlined in the Lantern agreement and other relevant obligations.
• A Data Protection Assessment evaluating how signals are used, shared, and protected within each company's workflows.

Out of 26 participants, 25 successfully completed these requirements and remain in good standing for continued participation in 2025. One company was unable to meet compliance requirements despite engagement from the Tech Coalition and therefore will not continue in the program in 2025.

CASE STUDY: Backlog Review Uncovers Parent-on-Child Harm

A newly onboarded Lantern participant began integrating Lantern signals into their enforcement workflows by cross-referencing past signals with recent high-risk interactions on their platform. This process led to the identification of multiple users discussing the sexual abuse of their own children and the production of CSAM, including cases involving originally produced content.

Recognizing the severity of these findings, the company took swift enforcement action, submitted reports to NCMEC, and escalated the cases for urgent review.

This case underscores Lantern's role as a powerful investigative tool, helping companies uncover critical threats that might have otherwise gone undetected and take decisive action to protect children.

2024 Activities

2024 Program Activities

In 2024, the Tech Coalition introduced several initiatives and enhancements to strengthen Lantern's impact in combating OCSEA, with a focus on executing the financial sector pilot, continuing human rights due diligence, and improving operational efficiency.

Financial Sector Pilot


Financially motivated OCSEA includes a range of harmful activities, such as:

• Sextortion cases involving demands for compensation
• The purchase and sale of CSAM
• Child sex trafficking and tourism
• Remote live-streamed abuse

While consumers of these activities are often sexually motivated, research shows that perpetrators frequently engage in these crimes for financial gain, viewing them as a means to make "quick and easy money".

Recognizing this, the Tech Coalition launched a financial sector pilot in August 2024 with select US-based financial institutions to assess whether signal-sharing can help disrupt the financial incentives driving OCSEA. The pilot initially comprised two companies with the internal capacity to investigate OCSEA and a willingness to collaborate.

Block, Inc. was included as a participant due to the risk of OCSEA with a financial component across industry platforms and products, as well as its desire to partner with industry to address emerging risks.

Western Union became a participant due to its global footprint, including in high-risk areas for OCSEA, and its expertise in combating this problem. (See Trends in Financial Sextortion: An investigation of sextortion reports in NCMEC CyberTipline data.)

Later, PayPal joined the pilot in response to reports from Lantern members noting an increase in PayPal handles appearing in financial sextortion schemes targeting minors, broadening the dataset and enabling further analysis of financial patterns associated with OCSEA.

The pilot was developed in collaboration with human rights advocates and financial legal experts, ensuring that appropriate safeguards were in place to mitigate potential risks. Key safeguards include:

• A separate addendum tailored to financial institutions, ensuring compliance with US financial sector regulatory frameworks.
• A dedicated database for financial institutions, preventing them from accessing general Lantern signals.
• Tech companies that are existing participants opt into the pilot voluntarily and only share signals when there is confirmation of a financial component (such as a transaction) linked to an OCSEA case.
• Financial institutions are "consume-only" participants, meaning they cannot contribute signals back to tech companies in Lantern.

Although the pilot launched in August 2024, tech companies that opted in did not begin sharing signals until Q4 2024. As a result, the pilot will continue through summer 2025, at which point the Tech Coalition will evaluate its effectiveness and determine whether to continue engaging financial institutions. The evaluation will consider effectiveness in disrupting financial incentives, risk mitigations, and overall impact.

Early insights suggest that signals - particularly those linked to a confirmed transaction with a known date and amount - are enabling financial institutions to flag and investigate potential OCSEA-related financial activity more effectively. However, an ongoing challenge is ensuring that financial institutions receive enough contextual information to properly investigate and take action on potential violations.

Human Rights Due Diligence

The Tech Coalition remains committed to developing Lantern with human rights and data protection principles embedded in its design, governance, and operations. Ensuring that privacy, security, and due process safeguards are integrated into the program is essential to maintaining trust and effectiveness in combating OCSEA.

As part of this commitment, the Tech Coalition continued its partnership with BSR throughout 2024, incorporating their recommendations into Lantern's governance and strategic development. BSR has provided expert guidance on human rights considerations, helping shape policies, resources, and best practices for responsible signal-sharing.

In addition to this ongoing collaboration, the Tech Coalition conducted independent legal and data risk assessments to inform key program decisions. These assessments reaffirmed the importance of several foundational principles, leading to strategic program refinements, including, but not limited to:

• Purpose Limitation: Keeping Lantern's scope strictly focused on combating OCSEA to uphold data privacy and minimize overreach.
• Security Enhancements: Strengthening data security by implementing mandatory two-factor authentication.
• Data Minimization and Accuracy: Refining upload strategies by prioritizing smaller, high-quality qualitative uploads over large-scale quantitative data to improve the precision and actionability of shared signals.

Government Access and Disclosures

In 2024, the Tech Coalition implemented a government access request policy, outlining clear procedures for Tech Coalition staff in responding to law enforcement or government requests for information in Lantern.

This policy is aligned with the Global Network Initiative (GNI) Principles and Implementation Guidelines on Freedom of Expression and Privacy and includes commitments to reject government requests whenever possible, only respond to government requests where legally required, disclose the minimum amount of data necessary to comply with legal obligations, and more.

Additionally, the Tech Coalition developed a resource for Lantern participants, offering guidance on establishing internal policies for handling government access and disclosure requests.

In 2024, the Tech Coalition was informed by three participating companies that they had received requests for information related to Lantern. In response, the Tech Coalition engaged with each company to gather additional details and, where appropriate or legally permissible, understand the origins and results of each request.

Operational Improvements
In 2024, the Tech Coalition implemented and tested several new initiatives aimed at enhancing participation and improving operational efficiencies. These improvements were designed to streamline processes, foster collaboration, and increase the effectiveness of signal-sharing in combating OCSEA.

Key operational improvements included:

• Launching a dedicated investigator subgroup for child safety analysts to regularly meet and share insights on emerging threats observed across platforms.
• Developing a participant resource center with customized documentation and tutorials to simplify the use of Lantern.
• Adding resources on signal-sharing protocols, internal policy development templates, application preparation documentation, and more.
• Featuring Lantern at Initiate, the Tech Coalition's annual hackathon, offering hands-on engineering and policy support to Lantern participants.
• Collaborating with Meta to refine onboarding procedures, reducing the onboarding time for new participants by nearly five weeks.
• Introducing whitelisted sharing, allowing companies to share signals with select partners rather than all Lantern participants, offering a more targeted and controlled approach to collaboration.

As a result of these improvements and the commitment of participating companies, Lantern saw increased engagement and more meaningful outcomes, strengthening its impact in the fight against OCSEA.

CASE STUDY: Meta x Snap: Disrupting Financial Sextortion Networks

A Meta investigation into Nigerian financial sextortion accounts identified signals linked to confirmed offending accounts, including Snap identifiers. Meta shared these signals with Snap, and the companies subsequently held a coordination call to review additional context. In response, Snap contributed new investigative findings, further strengthening the collaborative response.

At least one other industry partner reported that this intelligence helped them recognize suspicious activity on their platform, link it to financial sextortion activity, and prioritize their investigations.

This case highlights how targeted intelligence sharing through Lantern enhances industry-wide detection and response, improving protections for users across platforms.

Signal Sharing

Signal Sharing Framework


Lantern enables companies to share signals (also known as threat indicators) when they detect violative activity or content that breaches their policies prohibiting OCSEA. These signals can then be used by other participating companies to uncover related abuse on their own platforms and conduct independent reviews against their respective child safety policies.

How Lantern Works

1. A participating company detects a violation on its platform.
2. The company takes action according to its policies; if the activity is an illegal offense, it reports it to the proper authorities.
3. The company adds appropriate signals to Lantern, such as violating URLs or keywords.
4. Another participating company can select from the signals in Lantern to see if they help to surface violating content or activity on its platform.
5. This participating company reviews the content and activity surfaced on its platform from these signals against its policies.
6. The company takes action according to its policies; if the activity is an illegal offense, it reports it to the proper authorities.
7. The company provides feedback to Lantern about the signals used.
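As a rough mental model, the toy sketch below walks one signal through this loop: upload (step 3), selection and review by a second participant (steps 4-5), enforcement (step 6), and feedback (step 7). Every name in it is invented for illustration and does not correspond to any real Lantern or ThreatExchange interface; in practice, steps 4-6 are performed by trained staff, not code.

```python
from dataclasses import dataclass, field

@dataclass
class Signal:
    value: str                                  # e.g., a violating URL or keyword
    signal_type: str                            # "url", "keyword", "hash", ...
    tags: list                                  # at least one Program Taxonomy tag
    feedback: list = field(default_factory=list)

class LanternPool:
    """Toy stand-in for the shared signal pool."""
    def __init__(self):
        self.signals = []

    def upload(self, signal):                   # step 3: uploader shares a signal
        self.signals.append(signal)

    def add_feedback(self, signal, note):       # step 7: consumer reports back
        signal.feedback.append(note)

def review_on_own_platform(signal):
    # Steps 4-5 stub: a real participant matches the signal against its
    # own platform and has trained staff review whatever it surfaces.
    return [f"account linked to {signal.value}"]

def enforce(matches):
    # Step 6 stub: action under the company's own policies, with reports
    # to the proper authorities if the activity is an illegal offense.
    for match in matches:
        print("actioned:", match)

pool = LanternPool()
pool.upload(Signal("https://example.invalid/bad", "url", ["csam"]))
for sig in pool.signals:
    enforce(review_on_own_platform(sig))
    pool.add_feedback(sig, "signal surfaced violating activity")
```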

CASE STUDY: 1:1 Sharing: Enhancing Targeted Enforcement

Two Lantern participants piloted a 1:1 direct sharing initiative, testing username-sharing between platforms to improve detection and enforcement.

• Platform A identified a pattern: offenders repeatedly created new accounts, posted CSAM, and directed other users to contact them via specific usernames for additional material.
• Through the pilot, Platform A shared these confirmed usernames with Platform B, which conducted investigations and then took enforcement action on 73.5% of them.

This pilot demonstrated that more direct, targeted sharing can be highly actionable, helping disrupt systemic patterns of abuse and laying the foundation for stronger industry collaboration.

Program Taxonomy
All signals uploaded to Lantern must include at least one tag from the official Program Taxonomy, which is maintained by the Tech Coalition in collaboration with participating Lantern members. This taxonomy was developed using a variety of inputs and sources and serves multiple key functions, including compliance, quality assurance, and overall usability of shared signals.

As a living document, the taxonomy is continuously refined to address emerging threats. Participating companies can propose updates throughout the year, ensuring the taxonomy remains relevant and adaptable to address new trends.

A portion of the 2024 taxonomy is included as an appendix to this report. The full taxonomy includes concrete examples of how these harms typically manifest on platforms. However, to protect the sensitivity of this content, only tag names and definitions are published in this report.

Parameters for Signal Sharing

Signals may only be shared when they - at a minimum - meet the following key conditions:

1. Sharing of signals must be permitted by applicable laws, including international and national regulations, as well as privacy frameworks;
2. Signals must be shared in alignment with the Lantern legal agreement;
3. Signals must relate to violations of platform policies prohibiting OCSEA;
4. Signals must be shared in accordance with publicly accessible terms of service/privacy policies; and
5. Signals must be necessary and proportional to address the potential violation.

Each participating company is responsible for independently reviewing signals and determining appropriate actions in accordance with its own policies, terms of service, and legal obligations. Once a company confirms it meets the baseline conditions above, it may begin uploading signals to Lantern. To further support the quality of Lantern, companies implement internal processes to validate signals before uploading and must limit Lantern access to dedicated personnel.

When uploading signals, companies include at least one tag from the Program Taxonomy to categorize the nature of the violation as it pertains to OCSEA, provide additional required information regarding the severity of the violation and confirmation of the company's review status, and may optionally provide additional context, such as supplemental tags, as legally permitted.

Lantern does not facilitate automated enforcement actions based on signals.

Parameters for Signal Use

Participating companies must also vet and assess signals they download from Lantern to confirm alignment with their policies before taking enforcement actions. Companies are encouraged to document the usage and outcomes of signals to demonstrate how signals do or do not contribute to combating OCSEA on their platform.

Further, companies are required to maintain user appeals and recourse mechanisms, to remove signals that are no longer deemed applicable or relevant to Lantern, and to notify the Tech Coalition accordingly.
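A minimal sketch of how a participant might encode the sharing parameters above as a pre-upload gate, assuming a dict-based signal record; the field names and the small taxonomy subset are invented for illustration and are not Lantern's actual schema.

```python
# Invented taxonomy subset for illustration; the real Program Taxonomy
# is published only as tag names and definitions in the appendix.
PROGRAM_TAXONOMY = {"csam", "sextortion", "grooming", "trafficking",
                    "contact_offense", "minor_sexualization"}

def may_upload(signal: dict) -> bool:
    """Return True only if the signal meets the baseline conditions."""
    checks = [
        signal.get("legally_shareable", False),           # 1. permitted by law
        signal.get("within_legal_agreement", False),      # 2. Lantern agreement
        signal.get("ocsea_policy_violation", False),      # 3. OCSEA policy violation
        signal.get("tos_disclosed", False),               # 4. public ToS/privacy policy
        signal.get("necessary_and_proportional", False),  # 5. proportionality
        bool(PROGRAM_TAXONOMY & set(signal.get("tags", []))),  # required tag
        signal.get("human_reviewed", False),              # confirmed review status
    ]
    return all(checks)

candidate = {
    "value": "https://example.invalid/bad",
    "tags": ["sextortion", "financial_transaction"],
    "legally_shareable": True, "within_legal_agreement": True,
    "ocsea_policy_violation": True, "tos_disclosed": True,
    "necessary_and_proportional": True, "human_reviewed": True,
}
assert may_upload(candidate)
```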


Metrics and Outcomes

In the following sections, we outline key metrics related to Lantern's signal composition and the outcomes demonstrating its impact. All data is presented in aggregate at the program level and is not attributable to any particular company. The metrics include overall signal counts, as well as notable changes in 2024, providing insight into how Lantern has evolved and its role in combating OCSEA.

During the 2024 compliance check, companies provided official data on the following outcomes:

• 102,082 accounts actioned: number of accounts enforced against for violations related to child sexual exploitation and abuse.
• 7,048 pieces of CSAM removed: number of newly identified pieces of content containing child sexual abuse or exploitation material detected and removed.
• 12,033 CSEA URLs actioned by hosts: number of URLs hosting child sexual exploitation and abuse content that were detected and removed.
• 123,044 CSEA URLs blocked for transmission: number of URLs containing CSEA violations that companies blocked from being shared or transmitted on their platforms.

2024 at a glance: In 2024, companies flagged signals of high-risk CSEA cases, including:

• 81 contact child sexual offenses
• 45 trafficking cases
• A 1:1 username-sharing pilot that enabled one company to take enforcement action on 73.5% of flagged offenders

Uploaded Signals in 2024

From 1 January 2024 to 31 December 2024, 296,336 new signals were uploaded into Lantern, bringing the cumulative number of uploaded signals to 1,064,380.

• The largest category of signals (48%) consists of URLs, primarily representing websites hosting CSAM.
• 34% of the uploaded signals are hashes, further categorized in this report. Note: Lantern's underlying technology, ThreatExchange, has the flexibility to incorporate additional hash types in the future if participating companies find them valuable for sharing and detection.
• 18% are incident-based violations, including account-related data such as email addresses and usernames linked to OCSEA activity.
• Less than 1% are keywords, such as those used in exploitative content and generative AI prompts.

Uploaded Signals by Type in 2024 (total: 296,336):

• 141,923 URLs
• 68,747 MD5 video hashes
• 52,761 account information signals
• 20,250 PDQ image hashes
• 12,288 PhotoDNA image hashes
• 367 GenAI prompts and keywords

Removed Signals in 2024

Companies can only remove signals from Lantern that they have uploaded; they cannot remove signals contributed by other participants. Signals are typically removed when:

1. A company determines that they no longer meet Lantern's Approved Purpose after further review.
2. They have reached their maximum retention period under the Lantern Data Retention Policy.

In 2024, 21,504 signals were removed from Lantern. Once removed, the Tech Coalition retains only the removal date and signal type - no other information is stored.

Removed Signals by Type:

• 11,086 URLs
• 7,950 hashes
• 2,466 account information signals
• 2 keywords

Content-Based Signals

Content-based signals include media shared across the internet, such as images, videos, drawings, and audio recordings. These signals are uploaded into Lantern as hashes or URLs and often include CSAM and other forms of illicit minor sexualization.

All signals in Lantern must be categorized using the official Program Taxonomy (see appendix), ensuring alignment with Lantern's purpose and scope. This year, we analyzed tag categorization by hash type, leveraging the metadata provided by the companies that shared hashes, to better understand whether different hashing technologies are more effective for detecting specific content types or use cases.

[Chart: Taxonomy Categorization by Hash Type - share of tags (Minor Sexualization, Meme: Humor, CSAM: B2, CSAM: B1, CSAM: A2, CSAM: A1) across PDQ image hashes, PhotoDNA image hashes, and MD5 video hashes.]

While more data is needed to determine definitive trends, early insights suggest:

• PDQ hashes are predominantly used for detecting minor sexualization.
• PhotoDNA hashes are more concentrated in the B1 and B2 categories, reflecting different OCSEA content types.
• MD5 video hashes are heavily concentrated in B1.

Notably, most URLs uploaded to Lantern were tagged with "CSAM" but lacked granular subcategorization. This may be because companies flag URLs found in advertisements or discussions related to CSAM and upload them into Lantern without directly accessing them, leading to broader classification as suspected violations.
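To make the distinction between these hash types concrete: MD5 is an exact cryptographic digest (one changed byte produces a different hash), while PDQ and PhotoDNA are perceptual hashes that remain similar under resizing or re-encoding. The sketch below is illustrative only; the MD5 portion uses just the Python standard library, the PDQ call reflects the open-source pdqhash binding and is an assumption about that package's API, and PhotoDNA is omitted because it is proprietary to Microsoft and available only under license.

```python
import hashlib

def md5_video_hash(path: str) -> str:
    """Exact-match digest: any single changed byte yields a new hash."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

# Perceptual hashing sketch (requires the third-party pdqhash and
# opencv-python packages; API assumed from Meta's open-source PDQ):
# import cv2, pdqhash
# image = cv2.cvtColor(cv2.imread("frame.png"), cv2.COLOR_BGR2RGB)
# hash_vector, quality = pdqhash.compute(image)  # 256-bit perceptual hash
```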


Incident-Based Signals
Incident-based signals capture violative behaviors across platforms and are uploaded into Lantern as account-related data (e.g., usernames, email addresses). These signals help participating companies identify individuals or networks engaging in OCSEA-related activities.

Most shared incidents involve attempts to distribute or acquire CSAM. However, in 2024 there was an increase in signals reflecting more direct forms of harm, including:

• 275 cases of grooming (sexual), where an adult introduces sexual content, discussions, or behaviors into interactions with a minor (up from 12 in 2023).
• 81 cases of individuals using online platforms to gain access to children for contact sexual offenses (up from 0 in 2023).
• 45 cases of trafficking (up from 0 in 2023).
• A rise in financial exploitation through sextortion (475 signals involved in financial transactions in 2024, up from 120 in 2023).

For full definitions of these categories, see the taxonomy appendix.

[Chart: Taxonomy Categorization by Incident-Signal Type - signal counts (0 to 20,000) for CSAM: Consumer, CSAM: Distributor, CSAM: Solicitor, Sextortion, CSAM: Coordinator, Grooming, CSAM: Producer, Contact Offense, and Trafficking, ordered from most to fewest.]

Cross-Platform Flags in 2024

In 2024, the Tech Coalition encouraged the use of the "platform:____" tag to enhance cross-platform alerting, allowing participating companies to indicate when activity is detected across multiple platforms. This helps escalate signals efficiently to the appropriate teams.

This year, we are including data on platforms flagged in at least one Lantern upload to support cross-platform trend analysis and provide insights into how OCSEA behaviors manifest across different digital spaces.

Note: The absence of a platform tag does not necessarily indicate that platform behavior was not observed or flagged; instead, these tags represent an additional layer of manual escalation to the platform.

2024 Platform Flags in Lantern (ordered alphabetically):

• Block, Inc.
• Discord
• Facebook Marketplace
• Google Play
• Instagram
• Mega
• PayPal
• Roblox
• Snap
• Telegram

Measuring Impact and Outcomes

As part of the annual compliance process, participating companies report key metrics that reflect their enforcement actions and the impact of Lantern in combating OCSEA. This data helps assess how cross-platform collaboration contributes to the detection and disruption of harmful content and behaviors.

During the 2024 compliance check, companies provided official data on:

• Accounts Actioned: The number of accounts enforced against for violations related to child sexual exploitation and abuse, in accordance with platform policies and applicable laws.
• Pieces of CSAM Removed: The number of newly identified pieces of content containing child sexual abuse or exploitation material detected and removed.
• CSEA URLs Removed (for Hosts): The number of URLs hosting child sexual exploitation and abuse content that were detected and removed by hosting companies.
• CSEA URLs Blocked (for Transmission): The number of URLs containing CSEA violations that companies blocked from being shared or transmitted on their platforms, reducing proliferation and user access.

For each metric, these numbers are in addition to actions already taken by the original signal uploader, representing net new outcomes that would not have been possible without cross-industry collaboration through Lantern.

It is important to note that not all participating companies have fully integrated Lantern into their workflows yet. The 2024 reporting reflects data from 10 actively reporting companies, establishing a baseline for measuring Lantern's impact. As adoption grows, future reports will reflect a more comprehensive view of industry-wide collaboration.

Taxonomy Appendix

Appendix A - Program Taxonomy

The taxonomy is a dynamic and evolving document designed to categorize signals in alignment with the approved purpose of combating online child sexual exploitation and abuse (OCSEA). The definitions provided serve as a reference and are subject to change as needed to accurately reflect evolving detection methods across platforms. Definitions labeled as "Supplemental" may be used to enhance categorization but cannot be applied independently to a signal upload.

• Contact Offense: When an adult openly admits to or provides evidence (e.g., explicit disclosures, documented proof of abuse, etc.) of sexually abusing a child in an offline, real-world setting.

• Child Sexual Abuse Material (CSAM): Any form of media - including images, videos, live-streamed content, or other digital content - that depicts the sexual abuse or exploitation of a child. This includes but is not limited to rape, molestation, sexual intercourse, masturbation, or imagery depicting the lascivious exhibition of genitalia, the anus, or pubic areas.

• CSAM - Animated: Hand-drawn or digitally created animations that depict sexual acts involving characters resembling children or characters originally designed for child audiences engaging in sexually explicit behaviors.

• CSAM - Egregious: CSAM depicting extreme or highly severe situations that may require specialized training and heightened wellness considerations for proper review and triage.

• CSAM - Generative: CSAM created using artificial methods, such as generative models or AI-based tools. The content may or may not be photorealistic but is known to have been artificially generated.

• CSAM - Manipulated: CSAM featuring a real child that has been digitally altered using AI, generative models, or other manipulation tools (e.g., photo editors) to depict the individual as a child in a sexually explicit manner.

• CSAM - Self-generated: Sexually explicit content that a minor has created of themselves, either voluntarily or under coercion, which meets the legal definition of CSAM.

• Industry Classification System of CSAM:
  - A1 CSAM: Any form of media that depicts a prepubescent child engaged in a sexual act.
  - A2 CSAM: Any form of media that depicts a prepubescent child engaging in a lascivious exhibition or being used in connection with sexually explicit conduct.
  - B1 CSAM: Any form of media that depicts a postpubescent child engaged in a sexual act.
  - B2 CSAM: Any form of media that depicts a postpubescent child engaging in a lascivious exhibition or being used in connection with sexually explicit conduct.

• CSAM Coordinator (Actor): An individual who plays an organizing or facilitating role in the creation, distribution, or exchange of CSAM. Rather than directly producing or consuming the material, the coordinator may manage networks, recruit victims, connect perpetrators, or provide technical support to enable the spread of CSAM.

• CSAM Consumer (Actor): An adult who engages in the consumption, possession, or interaction with CSAM/CSEM. This includes viewing, downloading, storing, or otherwise accessing content that depicts children in sexually exploitative or compromising situations.

• CSAM Distributor (Actor): An adult involved in the distribution, sharing, or promotion of CSAM/CSEM. This includes sharing CSEA content publicly or privately; providing access to CSEA content via links, platforms, or services; and enticing or instructing others to seek, access, or distribute CSEA content. Additionally, this definition includes saving or collecting CSAM/CSEM within accounts, groups, or communities where such content is accessed or exchanged.

• CSAM Producer (Actor): An adult involved in the creation or facilitation of new CSAM/CSEM, including capturing or requesting CSEA content through direct interactions with children; generating sexually exploitative images using artificial intelligence, digital manipulation tools, or other nefarious means; or uploading original CSAM/CSEM to the internet, including content they have created or directly facilitated.

• CSAM Solicitor (Actor): An adult who actively seeks out CSAM/CSEM but does not demonstrate possession at the time of the violating incident. This includes soliciting content from other individuals, engaging in forums dedicated to CSAM, or attempting to acquire CSAM through digital means.

• CSEA - Livestream: The sexual abuse or exploitation of children that occurs or is facilitated in real-time through livestreaming methods, such as webcams, video chats, or live social media broadcasts. This includes both direct abuse and instances where minors are coerced or manipulated into engaging in sexually explicit acts on camera.

• CSEA - Manual: Any form of written, digital, or visual content (such as documents, websites, or guides) that provides instructions or techniques for sexually abusing or exploiting children. These materials may include grooming strategies, coercion tactics, or technical methods for evading detection.

• CSEA - Meme (Humor): Images, videos, or digital content that depict CSAM or the sexualization of minors in a format intended to be humorous or satirical. While not necessarily shared for sexual gratification, these materials contribute to normalization, desensitization, and the broader ecosystem of child exploitation content.

• CSEA - Meme (Outrage): Images, videos, or digital content that depict CSAM or the sexualization of minors with the intent of raising awareness or provoking outrage. While often shared to condemn exploitation, these materials can inadvertently contribute to harm by amplifying exploitative content, normalizing exposure, or circumventing platform moderation policies.

• Grooming (Inappropriate Contact): The early stages of grooming in which an adult persistently engages with a child (or multiple children) to establish trust and emotional connection while displaying inappropriate behaviors. Though not yet explicitly sexual, this stage may include excessive compliments, intrusive personal questions (e.g., asking about the child's address or school), encouragement to keep the interaction secret, or attempts to isolate the child from trusted adults.

• Grooming (Sexual): The stage of grooming where an adult introduces sexual content, discussions, or behaviors into their interactions with a child. This may involve exposing the child to explicit material, making sexual comments or requests, or escalating toward direct sexual exploitation, either in a virtual setting (e.g., coercing the child into sharing explicit content) or through an offline contact offense.

• Minor Sexualization: Any form of media, behavior, or communication (including images, videos, digital content, or conversations) that depicts children in a sexually inappropriate, suggestive, or objectifying manner. While it does not necessarily meet the legal definition of CSAM, it contributes to the normalization of child exploitation and can be a precursor to more explicit abuse.

• Sextortion: A form of sexual exploitation in which an adult coerces a child by threatening to distribute private or sensitive material unless the child provides CSAM, engages in sexual acts, provides financial compensation, or complies with other demands. The perpetrator may obtain the material through hacking, social engineering, or direct sharing by the child under coercion. This tag applies to both the abuse type ("Sextortion") and the perpetrator ("Sextortionist").

• Trafficking: The exploitation of a child for a commercial sex act in exchange for something of value, such as money, drugs, shelter, or other goods or services. This may involve a perpetrator recruiting, harboring, transporting, providing for, patronizing, or soliciting a child for the purpose of a commercial sex act. This tag applies to both the abuse type ("Trafficking") and the perpetrator ("Trafficker").

• Bestiality (Supplemental): The sexual abuse of a child involving an animal. This tag is used to indicate the presence of such abuse in conjunction with other classifications.

• Financial Transaction (Supplemental): Communication or evidence regarding a financial transaction - whether completed or not - often involving virtual currencies, which may be linked to the exchange of CSAM, exploitation, or other forms of abuse.

• Incest (Supplemental): Sexual activities involving family members or close relatives. This tag is used alongside primary classifications to highlight instances where familial relationships are a factor in the abuse.

• Infant Toddler (Supplemental): Refers to children from infancy through early walking stages, typically characterized by an unsteady gait. This tag is used in conjunction with classifications such as CSAM to escalate the severity of abuse involving the youngest victims.

• Organized Harm Group (Supplemental): A known organization or network involved in or adjacent to CSEA activities, such as CSAM distribution. While CSAM may not be the group's sole purpose, it is often used to gatekeep entry, show loyalty, or as a desensitization device. Examples include 764, Order of Nine Angles (O9A), and other criminal/extremist organizations that engage in or facilitate child exploitation as part of their broader activities.

• Plans to Meet (Supplemental): Expressed plans for an in-person meeting, whether past, scheduled, or still under discussion. While the intent may not always be explicitly sexual, this tag is commonly used in cases involving grooming, trafficking, or direct sexual encounters between an adult and a child.

• Platform: _______ (Supplemental): Manually identifies the specific platform relevant to a signal or violation. This tag helps highlight platforms requiring further investigation or intervention based on the nature of reported activities.

• Prepubescent (Supplemental): Refers to a child who is no longer an infant or toddler but has not yet developed obvious signs of puberty or secondary sexual characteristics. If the child appears very young (typically up to around five years old), the tag "infant_toddler" should be used instead. This tag is applied alongside classifications such as CSAM to assist in escalating the severity of the abuse.

• Report ID: ______ (Supplemental): A reference tag used to include the case number from a report submitted to NCMEC or another relevant authority.

• Reported to Authority (Supplemental): Indicates that the signal or violation has been formally reported to NCMEC via CyberTipline or another relevant authority.

• Self-Harm (Supplemental): Used when a child expresses intent to self-harm (e.g., cutting) or shows evidence of self-harm behaviors related to incidents of OCSEA. This tag also applies when a perpetrator encourages or coerces a child to engage in self-harm.

• Suicidal Ideation (Supplemental): Applied when a child expresses thoughts of suicide related to incidents of OCSEA. This tag also includes scenarios where a perpetrator encourages or coerces the child to commit suicide.

The Tech Coalition is an alliance of global technology companies of varying sizes and services working together to combat child sexual exploitation and abuse online.

By convening the industry to pool knowledge, share expertise, and strengthen all links in the chain, even the smallest startups can have access to the same level of knowledge and technical expertise as the largest tech companies in the world.

www.technologycoalition.org
