

Generative AI (GAI)

Overview

Generative Artificial Intelligence (GAI) technology allows users to create new images, videos, audio and text from requests or prompts. This technology has many benefits, but at the same time, NCMEC is deeply concerned about the numerous ways it is being used to sexually exploit children, including the creation of GAI Child Sexual Abuse Material (CSAM).

  • More than 275 direct victims of GAI CSAM were identified in 2024 and 2025 alone, and in many cases the offenders were people in the child’s life with direct access to them.
  • GAI CSAM is also being used in sextortion cases where the offender is someone the child does not know.
  • AI is also being used to revictimize known victims of CSAM with offenders manipulating previously created CSAM to make new abusive content.

NCMEC has heard from many children and families who have been profoundly impacted by the misuse of this technology, with devastating consequences in some cases.

By the Numbers

NCMEC began tracking GAI-related reports submitted to the CyberTipline in January 2023 and has seen a steady rise.

2023:
4,700 CyberTipline reports of GAI from the public and electronic service providers (ESPs)

2024:
67,000 CyberTipline reports of GAI from the public and ESPs

2025:
1.5 million CyberTipline reports of GAI from the public and ESPs. Of this total, 1.1 million were submitted by Amazon AI Services and contained no actionable information. The remaining GAI-related reports submitted to the CyberTipline can be separated into five categories:

12,000+ reports of CSAM that companies indicated were identified in training data

7,000+ reports of users generating or possessing GAI CSAM

30,000+ reports of users attempting to generate GAI CSAM by uploading a file and using text prompts

145,000+ reports of users employing GAI to engage with or alter a CSAM file without text prompts included

3,000+ reports relating to other forms of GAI use connected to child sexual exploitation, such as chat-based exploitation

Additionally, more than 133,000 reports indicated a GAI nexus but lacked enough information to determine how GAI was being used in the exploitation of a child.

While not all reports with an AI nexus involve the creation or sharing of GAI CSAM, NCMEC staff categorized more than 158,000 images and videos submitted in CyberTipline reports as AI-generated between January 2023 and December 2025. In that same period, electronic service providers annotated just over 11,000 images and videos as AI-generated, even though the CyberTipline report form was updated in October 2023 to give reporters an option to flag reported images and videos as AI-generated. This discrepancy demonstrates that NCMEC is identifying and labeling GAI content – which is critical to law enforcement’s triage and investigation of reports – at a significantly higher rate than the companies that are reporting this content from their own platforms.

Key Risks

Protecting children from the harms of GAI sexual exploitation requires education and guidance from trusted adults. Understanding the risks is a critical first step to being able to help.

GAI risks to children include:

  • GAI Exploitative Imagery: GAI is being used to create child sexual abuse material (CSAM) that depicts children engaged in sexually explicit conduct and nude images of children like content created by “nudify” apps. The creation and distribution of this fake imagery – including synthetic media, digital forgery and nude images of children – can have serious legal consequences and cause severe harm to victims, including harassment, bullying and psychological and emotional harm.
  • Online Enticement: Individuals can use GAI tools to create fake accounts on social media to communicate with a child with the intent to commit a sexual offense.
  • Sextortion: Offenders can use GAI to create explicit images of a child that are used to blackmail the child for additional sexual content, coerce a child to engage in sexual activity or to obtain money. NCMEC has seen cases in which the child refuses to send a nude image to the offender, and the offender then creates an explicit GAI image of that child to blackmail them for more explicit images.
  • AI Bullying and Peer Victimization: GAI technology may be used to create or spread harmful content, such as fake images or videos. This content is often created by a child’s classmates and can end up circulating in schools.

What NCMEC Is Doing About It

Responding to Reports

The use of GAI to create child sexual abuse imagery or nude or exploitative images of a child should always be taken seriously and reported. NCMEC’s CyberTipline provides the public and electronic service providers with the ability to report multiple forms of suspected child sexual exploitation, including CSAM and online enticement. After NCMEC processes a CyberTipline report, it is made available to the appropriate law enforcement agency. To make a CyberTipline report, please visit CyberTipline.org.

Helping Victims Take Back Control

If a sexually explicit image of you or someone you know – whether real or GAI-created – is circulating online, NCMEC’s Take It Down service can help. This tool allows individuals to anonymously request the removal of explicit images from participating platforms.  

NCMEC also has resources to help you report exploitative content to the internet service provider and platform where the content is posted, which can help mitigate the spread of the image or video. Visit Is Your Explicit Content Out There?

Preventing Abuse Through Education

NCMEC’s digital citizenship and safety program, NetSmartz, is an innovative educational program that uses games, animated videos, classroom-based lesson plans, activities and much more to help empower children to make safer choices online. 

Supporting Victims & Families

For families with a missing or sexually exploited child, NCMEC provides support services such as crisis intervention and referrals to appropriate local counseling professionals. Our Team HOPE program connects families with peers who have had similar experiences and can offer coping skills and compassion.

Additional Resources

Blog: Generative AI CSAM is CSAM
Addressing Real Harm Done by Deepfakes
Generative AI CSAM