Lantern Transparency Report 2024
Tech Coalition
Contents

Introduction
Lantern Participation
    Lantern Participation Criteria
    Eligibility
    Expanding Lantern's Reach
    Application Process
    Lantern Participant Expectations
    Participating Companies
    2024 Program Enrollment
    2024 Program Engagement
    ThreatExchange Integration
    Annual Compliance Process
2024 Activities
    2024 Program Activities
    Financial Pilot
    Human Rights Due Diligence
    Government Access and Disclosures
    Operational Improvements
Signal Sharing
    How Lantern works
    Signal Sharing Framework
    Program Taxonomy
    Parameters for Signal Sharing
    Parameters for Signal Use
Metrics & Outcomes
Taxonomy Appendix

Introduction

We are proud to present our second annual transparency report. This report provides a review of Lantern's impact in 2024, highlighting key programmatic updates, successes, challenges, and opportunities for future growth.

Many of the core operational activities related to Lantern remain unchanged from those outlined in the 2023 Transparency Report. For broader context, readers are encouraged to refer to last year's transparency report before reviewing the 2024 Transparency Report. The following sections highlight key improvements, process changes, and program updates, providing a clear picture of Lantern's evolution while minimizing redundancy.

As industry collaboration deepens, momentum is building in the collective effort to combat OCSEA. The Tech Coalition remains committed to strengthening Lantern in order to create a safer digital world for children.
Lantern Participation

Eligibility

Lantern is a voluntary initiative for companies and serves as one of many tools available to combat OCSEA. Each company may decide whether to pursue enrollment based on how the program aligns with its capabilities, practices, and strategic goals for protecting children on its platform(s).

There is no cost for companies to participate in Lantern. The program is fully funded by the Tech Coalition and its members, with generous in-kind support from Meta for technical hosting services.

Lantern remains an industry-only initiative, not available to NGOs, researchers, law enforcement, governments, or other entities. Looking ahead to 2025, the Tech Coalition is assessing the feasibility of integrating select trust and safety vendors to explore whether they could offer participating companies more efficient ways to engage with Lantern. Any potential integration will undergo extensive vetting and review before any decisions or implementation occur.
Application Process
To become a Lantern participant, companies must complete a thorough application process and compliance review before entering into a formal legal agreement with other Lantern participants.

This process helps ensure that companies have the necessary policies, safeguards, and operational procedures to appropriately share and handle signals in accordance with legal, ethical, and security requirements. The Tech Coalition oversees and administers this process to maintain the integrity and effectiveness of the program.

For technology companies, the application process was outlined in last year's 2023 Transparency Report. In 2024, the application was updated to include more granular confirmation that companies have properly staffed teams to manually review signals, investigate cases, and responsibly take action as appropriate and permitted by law.

To accommodate the addition of financial institutions, the Tech Coalition developed a separate application process tailored to the unique responsibilities and regulatory requirements of financial companies based in the US. While mirroring the core structure of the tech company application, this version includes additional considerations such as:

• Tools used to detect suspicious financial activities related to OCSEA,
• Processes for handling accounts flagged for OCSEA-related activity,
• Balancing detection and reporting obligations with the need to prevent over-actioning,
• Required reporting mechanisms (e.g., NCMEC CyberTip Reports, Suspicious Activity Reports, aka "SARs", etc.), and
• Protocols for notifying law enforcement or regulators, given financial institutions' distinct regulatory frameworks.

These updates reflect Lantern's commitment to ensuring all participants are equipped to responsibly manage and act on OCSEA-related signals while adhering to the highest standards of privacy, security, and due process, regardless of sector.
Participating Companies
ThreatExchange Integration
Lantern operates on ThreatExchange, a platform
developed by Meta to enable organizations to share
information securely and in a privacy-compliant
manner.
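Lantern's specific ThreatExchange configuration is not public. Purely as an illustration of the kind of interface ThreatExchange exposes, the hypothetical Python sketch below queries the platform's Graph API threat_descriptors endpoint; the API version, app credentials, and query values are placeholders and do not reflect Lantern's private collaboration.

import requests

# Illustrative only: Lantern's private ThreatExchange setup is not public.
# App credentials, API version, and query values below are placeholders.
ACCESS_TOKEN = "APP_ID|APP_SECRET"  # ThreatExchange uses app-level access tokens
URL = "https://graph.facebook.com/v19.0/threat_descriptors"

params = {
    "access_token": ACCESS_TOKEN,
    "text": "example-indicator",  # free-text search over shared indicators (placeholder)
    "type": "URI",                # one of the indicator types the API supports
    "limit": 25,
}

response = requests.get(URL, params=params, timeout=30)
response.raise_for_status()

for descriptor in response.json().get("data", []):
    # Each descriptor represents an indicator shared by a member of the exchange.
    print(descriptor.get("id"), descriptor.get("raw_indicator"), descriptor.get("status"))

In practice, access is limited to vetted members of a privacy group, and any signal retrieved this way would still go through the manual review and policy checks described later in this report.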
2024 Activities
Human Rights Due Diligence

Ensuring that privacy, security, and due process safeguards are integrated into the program is essential to maintaining trust and effectiveness in combating OCSEA.

As part of this commitment, the Tech Coalition continued its partnership with BSR throughout 2024, incorporating their recommendations into Lantern's governance and strategic development. BSR has provided expert guidance on human rights considerations, helping shape policies, resources, and best practices for responsible signal-sharing.

In addition to this ongoing collaboration, the Tech Coalition conducted independent legal and data risk assessments to inform key program decisions. These assessments reaffirmed the importance of several foundational principles, leading to strategic program refinements, including, but not limited to:

• Purpose Limitation: Keeping Lantern's scope strictly focused on combating OCSEA to uphold data privacy and minimize overreach.
• Security Enhancements: Strengthening data security by implementing mandatory two-factor authentication.
• Data Minimization and Accuracy: Refining upload strategies by prioritizing smaller, high-quality qualitative uploads over large-scale quantitative data to improve the precision and actionability of shared signals.

Government Access and Disclosures

Lantern's policy on government access and disclosures is aligned with principles from the Global Network Initiative (GNI) Principles and Implementation Guidelines on Freedom of Expression and Privacy. It includes commitments to reject government requests whenever possible, to respond to government requests only where legally required, to disclose the minimum amount of data necessary to comply with legal obligations, and more.

Additionally, the Tech Coalition developed a resource for Lantern participants, offering guidance on establishing internal policies for handling government access and disclosure requests.

In 2024, the Tech Coalition was informed by three participating companies that they had received requests for information related to Lantern. In response, the Tech Coalition engaged with each company to gather additional details and, where appropriate or legally permissible, understand the origins and results of each request.
Operational Improvements
In 2024, the Tech Coalition implemented and tested several new initiatives aimed at enhancing participation and improving operational efficiencies. As a result of these improvements and the commitment of participating companies, Lantern saw increased engagement and more meaningful outcomes, strengthening its impact in the fight against OCSEA.

In response, Snap contributed new investigative findings, further strengthening the collaborative response. At least one other industry partner reported that this intelligence helped them recognize suspicious activity on their platform, link it to financial sextortion activity, and prioritize their investigations.
Signal Sharing
[Diagram: How Lantern works. A participating company detects a violation of its policies on its platform, takes action according to its policies, and, if the activity is an illegal offense, reports it to the proper authorities; its signals are shared through Lantern. Another participating company reviews content and activity surfaced on its platform from these signals against its policies, takes action according to its policies, reports illegal offenses to the proper authorities, and provides feedback to Lantern about the signals used.]
CASE STUDY
1<>1 Sharing: Enhancing Targeted Enforcement
Program Taxonomy
All signals uploaded to Lantern must include at least one tag from the official Program Taxonomy, which is maintained by the Tech Coalition in collaboration with participating Lantern members. This taxonomy was developed using a variety of inputs and sources and serves multiple key functions, including compliance, quality assurance, and overall usability of shared signals.

As a living document, the taxonomy is continuously refined to address emerging threats. Participating companies can propose updates throughout the year, ensuring the taxonomy remains relevant and adaptable to address new trends.

A portion of the 2024 taxonomy is included as an appendix to this report. The full taxonomy includes concrete examples of how these harms typically manifest on platforms. However, to protect the sensitivity of this content, only tag names and definitions are published in this report.

Parameters for Signal Sharing

Signals may only be shared when they, at a minimum, meet the following key conditions:

1. Sharing of signals must be permitted by applicable laws, including international and national regulations, as well as privacy frameworks;
2. Signals must be shared in alignment with the Lantern legal agreement;
3. Signals must relate to violations of platform policies prohibiting OCSEA;
4. Signals must be shared in accordance with publicly accessible terms of service/privacy policies; and
5. Signals must be necessary and proportional to address the potential violation.

Each participating company is responsible for independently reviewing signals and determining appropriate actions in accordance with its own policies, terms of service, and legal obligations.

Once a company confirms it meets the baseline conditions above, it may begin uploading signals to Lantern. To further support the quality of Lantern, companies implement internal processes to validate signals before uploading and must limit Lantern access to dedicated personnel.

When uploading signals, companies include at least one tag from the Program Taxonomy to categorize the nature of the violation as it pertains to OCSEA, provide additional required information regarding the severity of the violation and confirmation of the company's review status, and may optionally provide additional context, such as supplemental tags, as legally permitted.

Parameters for Signal Use

Participating companies must also vet and assess signals they download from Lantern to confirm alignment with their policies before taking enforcement actions.

Companies are encouraged to document the usage and outcomes of signals to demonstrate how signals do or do not contribute to combating OCSEA on their platform.

Further, companies are required to maintain user appeals and recourse mechanisms, to remove signals that are no longer deemed applicable or relevant to Lantern, and to notify the Tech Coalition accordingly.

Lantern does not facilitate automated enforcement actions based on signals.
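The sharing conditions and upload requirements above are policy and legal obligations rather than a published schema. Purely as a hypothetical sketch of the kind of pre-upload checks a participating company might automate internally, the Python example below models a signal record carrying a Program Taxonomy tag, severity information, and review-status confirmation. The Signal fields, sample tags, and validate_before_upload helper are illustrative assumptions, not Lantern's actual data model or tooling, and the legal conditions (applicable law, the Lantern legal agreement, necessity and proportionality) would still require human and legal review.

from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative sample of tags drawn from the taxonomy appendix; not the full taxonomy.
TAXONOMY_TAGS = {"Sextortion", "Grooming (Sexual)", "Trafficking", "Contact Offense"}

@dataclass
class Signal:
    value: str                     # e.g., a hash, URL, keyword, or account identifier
    taxonomy_tags: List[str]       # at least one tag from the Program Taxonomy
    severity: str                  # required information about the severity of the violation
    human_reviewed: bool           # confirmation of the company's review status
    supplemental_tags: List[str] = field(default_factory=list)  # optional added context

def validate_before_upload(signal: Signal) -> Optional[str]:
    """Return a reason the signal cannot be uploaded yet, or None if the mechanical checks pass."""
    if not signal.human_reviewed:
        return "signal has not been manually reviewed"
    if not any(tag in TAXONOMY_TAGS for tag in signal.taxonomy_tags):
        return "signal must carry at least one Program Taxonomy tag"
    if not signal.severity:
        return "severity information is required"
    return None

candidate = Signal(value="<hash>", taxonomy_tags=["Sextortion"], severity="high", human_reviewed=True)
problem = validate_before_upload(candidate)
if problem:
    print("upload blocked:", problem)
else:
    print("ready for upload")

A check like this can only gate the mechanical requirements; determining that sharing is lawful, proportional, and consistent with the Lantern legal agreement remains a human decision.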
Metrics & Outcomes
Removed Signals in 2024

[Chart: signals removed from Lantern in 2024, by type: 7,950 hashes; 2,466 pieces of account information; 2 keywords.]

Companies can only remove signals from Lantern that they have uploaded; they cannot remove signals contributed by other participants. Signals are typically removed when:
Incident-Based Signals
Incident-based signals capture violative behaviors
across platforms and are uploaded into Lantern
as account-related data (e.g., usernames, email
addresses). These signals help participating
companies identify individuals or networks
engaging in OCSEA-related activities.
Notable incident-based signal trends in 2024 included:

• 275 cases of grooming (sexual), where an adult introduces sexual content, discussions, or behaviors into interactions with a minor (up from 12 in 2023).
• 81 cases of individuals using online platforms to gain access to children for contact sexual offenses (up from 0 in 2023).
• 45 cases of trafficking (up from 0 in 2023).
• A rise in financial exploitation through sextortion (475 signals involved in financial transactions in 2024, up from 120 in 2023).

For full definitions of these categories, see the taxonomy appendix.
[Chart: distribution of 2024 incident-based signals by taxonomy tag: CSAM: Consumer, CSAM: Distributor, CSAM: Solicitor, CSAM: Coordinator, CSAM: Producer, Sextortion, Grooming, Contact Offense, Trafficking.]
Tagging uploads with the platforms where activity was observed helps escalate signals efficiently to the appropriate teams.

This year, we are including data on platforms flagged in at least one Lantern upload to support cross-platform trend analysis and provide insights into how OCSEA behaviors manifest across different digital spaces.

Note: The absence of a platform tag does not necessarily indicate that platform behavior was not observed or flagged; instead, these tags represent an additional layer of manual escalation to the platform.

2024 Platform Flags in Lantern (Ordered Alphabetically)

• Block, Inc.
• Discord
• Facebook Marketplace
• Google Play
• Instagram
• Mega
• PayPal
• Roblox
• Snap
• Telegram

This data helps assess how cross-platform collaboration contributes to the detection and disruption of harmful content and behaviors.

During the 2024 compliance check, companies provided official data on:

• Accounts Actioned: The number of accounts enforced against for violations related to child sexual exploitation and abuse, in accordance with platform policies and applicable laws.
• Pieces of CSAM Removed: The number of newly identified pieces of content containing child sexual abuse or exploitation material detected and removed.
• CSEA URLs Removed (for Hosts): The number of URLs hosting child sexual exploitation and abuse content that were detected and removed by hosting companies.
• CSEA URLs Blocked (for Transmission): The number of URLs containing CSEA violations that companies blocked from being shared or transmitted on their platforms, reducing proliferation and user access.

For each metric, these numbers are in addition to actions already taken by the original signal uploader, representing net new outcomes that would not have been possible without cross-industry collaboration through Lantern.
Taxonomy Appendix
The taxonomy is a dynamic and evolving document designed to categorize signals in alignment with the approved purpose of combating online child sexual exploitation and abuse (OCSEA). Definitions labeled as "Supplemental" may be used to enhance categorization but cannot be applied independently to a signal upload.
• Contact Offense: When an adult openly admits to or provides evidence (e.g., explicit disclosures, documented proof of abuse, etc.) of sexually abusing a child in an offline, real-world setting.

• Child Sexual Abuse Material (CSAM): Any form of media - including images, videos, live-streamed content, or other digital content - that depicts the sexual abuse or exploitation of a child. This includes but is not limited to rape, molestation, sexual intercourse, masturbation, or imagery depicting the lascivious exhibition of genitalia, the anus, or pubic areas.

• CSAM - Animated: Hand-drawn or digitally created animations that depict sexual acts involving characters resembling children or characters originally designed for child audiences engaging in sexually explicit behaviors.

• CSAM - Egregious: CSAM depicting extreme or highly severe situations that may require specialized training and heightened wellness considerations for proper review and triage.

• CSAM - Generative: CSAM created using artificial methods, such as generative models or AI-based tools. The content may or may not be photorealistic but is known to have been artificially generated.

• CSAM - Manipulated: CSAM featuring a real child that has been digitally altered using AI, generative models, or other manipulation tools (e.g., photo editors) to depict the individual as a child in a sexually explicit manner.

• CSAM - Self-generated: Sexually explicit content that a minor has created of themselves, either voluntarily or under coercion, which meets the legal definition of CSAM.

• Industry Classification System of CSAM:
    • A1 CSAM: Any form of media that depicts a prepubescent child engaged in a sexual act.
    • A2 CSAM: Any form of media that depicts a prepubescent child engaging in a lascivious exhibition or being used in connection with sexually explicit conduct.
    • B1 CSAM: Any form of media that depicts a postpubescent child engaged in a sexual act.
    • B2 CSAM: Any form of media that depicts a postpubescent child engaging in a lascivious exhibition or being used in connection with sexually explicit conduct.

• CSAM Coordinator (Actor): An individual who plays an organizing or facilitating role in the creation, distribution, or exchange of CSAM. Rather than directly producing or consuming the material, the coordinator may manage networks, recruit victims, connect perpetrators, or provide technical support to enable the spread of CSAM.

• CSAM Consumer (Actor): An adult who engages in the consumption, possession, or interaction with CSAM/CSEM. This includes viewing, downloading, storing, or otherwise accessing content that depicts children in sexually exploitative or compromising situations.

• CSAM Distributor (Actor): An adult involved in the distribution, sharing, or promotion of CSAM/CSEM. This includes sharing CSEA content publicly or privately; providing access to CSEA content via links, platforms, or services; and enticing or instructing others to seek, access, or distribute CSEA content. Additionally, this definition includes saving or collecting CSAM/CSEM within accounts, groups, or communities where such content is accessed or exchanged.
• CSAM Producer (Actor): An adult involved in the creation or facilitation of new CSAM/CSEM, including capturing or requesting CSEA content through direct interactions with children; generating sexually exploitative images using artificial intelligence, digital manipulation tools, or other nefarious means; or uploading original CSAM/CSEM to the internet, including content they have created or directly facilitated.

• CSAM Solicitor (Actor): An adult who actively seeks out CSAM/CSEM but does not demonstrate possession at the time of the violating incident. This includes soliciting content from other individuals, engaging in forums dedicated to CSAM, or attempting to acquire CSAM through digital means.

• CSEA - Livestream: The sexual abuse or exploitation of children that occurs or is facilitated in real time through livestreaming methods, such as webcams, video chats, or live social media broadcasts. This includes both direct abuse and instances where minors are coerced or manipulated into engaging in sexually explicit acts on camera.

• CSEA - Manual: Any form of written, digital, or visual content (such as documents, websites, or guides) that provides instructions or techniques for sexually abusing or exploiting children. These materials may include grooming strategies, coercion tactics, or technical methods for evading detection.

• CSEA - Meme (Humor): Images, videos, or digital content that depict CSAM or the sexualization of minors in a format intended to be humorous or satirical. While not necessarily shared for sexual gratification, these materials contribute to normalization, desensitization, and the broader ecosystem of child exploitation content.

• CSEA - Meme (Outrage): Images, videos, or digital content that depict CSAM or the sexualization of minors with the intent of raising awareness or provoking outrage. While often shared to condemn exploitation, these materials can inadvertently contribute to harm by amplifying exploitative content, normalizing exposure, or circumventing platform moderation policies.

• Grooming (Inappropriate Contact): The early stages of grooming in which an adult persistently engages with a child (or multiple children) to establish trust and emotional connection while displaying inappropriate behaviors. Though not yet explicitly sexual, this stage may include excessive compliments, intrusive personal questions (e.g., asking about the child's address or school), encouragement to keep the interaction secret, or attempts to isolate the child from trusted adults.

• Grooming (Sexual): The stage of grooming where an adult introduces sexual content, discussions, or behaviors into their interactions with a child. This may involve exposing the child to explicit material, making sexual comments or requests, or escalating toward direct sexual exploitation, either in a virtual setting (e.g., coercing the child into sharing explicit content) or through an offline contact offense.

• Minor Sexualization: Any form of media, behavior, or communication (including images, videos, digital content, or conversations) that depicts children in a sexually inappropriate, suggestive, or objectifying manner. While it does not necessarily meet the legal definition of CSAM, it contributes to the normalization of child exploitation and can be a precursor to more explicit abuse.

• Sextortion: A form of sexual exploitation in which an adult coerces a child by threatening to distribute private or sensitive material unless the child provides CSAM, engages in sexual acts, provides financial compensation, or complies with other demands. The perpetrator may obtain the material through hacking, social engineering, or direct sharing by the child under coercion. This tag applies to both the abuse type ("Sextortion") and the perpetrator ("Sextortionist").

• Trafficking: The exploitation of a child for a commercial sex act in exchange for something of value, such as money, drugs, shelter, or other goods or services. This may involve a perpetrator recruiting, harboring, transporting, providing for, patronizing, or soliciting a child for the purpose of a commercial sex act. This tag applies to both the abuse type ("Trafficking") and the perpetrator ("Trafficker").
Tech Coalition • Lantern Transparency Report 2024
www.technologycoalition.org