{"id":31416,"date":"2025-07-10T09:01:51","date_gmt":"2025-07-10T09:01:51","guid":{"rendered":"http:\/\/medexperts.pro\/?p=31416"},"modified":"2025-07-10T09:26:22","modified_gmt":"2025-07-10T09:26:22","slug":"a-i-generated-images-of-child-sexual-abuse-are-flooding-the-internet","status":"publish","type":"post","link":"https:\/\/medexperts.pro\/?p=31416","title":{"rendered":"A.I.-Generated Images of Child Sexual Abuse Are Flooding the Internet"},"content":{"rendered":"<p id=\"article-summary\" class=\"css-79rysd e1wiw3jv0\">Organizations that track the material are reporting a surge in A.I. images and videos, which are threatening to overwhelm law enforcement.<\/p>\n<section class=\"meteredContent css-1r7ky0e\">\n<div class=\"css-s99gbd StoryBodyCompanionColumn\" data-testid=\"companionColumn-0\">\n<div class=\"css-53u6y8\">\n<p class=\"css-at9mc1 evys1bk0\">A new flood of child sexual abuse material created by artificial intelligence is hitting a tipping point of realism, threatening to overwhelm the authorities.<\/p>\n<p class=\"css-at9mc1 evys1bk0\">Over the past two years, <a class=\"css-yywogo\" href=\"https:\/\/www.nytimes.com\/interactive\/2025\/06\/29\/business\/ai-video-deepfake-google-veo-3-quiz.html\">new A.I. technologies<\/a> have made it easier for criminals to create explicit images and videos of children. 
Now, researchers at organizations including the Internet Watch Foundation and the National Center for Missing &amp; Exploited Children are warning of a surge of new material this year that is nearly indistinguishable from actual abuse.<\/p>\n<p class=\"css-at9mc1 evys1bk0\">New data released Thursday from the Internet Watch Foundation, a British nonprofit that investigates and collects reports of child sexual abuse imagery, identified 1,286 A.I.-generated videos of child sexual abuse so far this year globally, compared with just two in the first half of 2024.<\/p>\n<p class=\"css-at9mc1 evys1bk0\">The videos have become smoother and more detailed, the organization\u2019s analysts said, because of improvements in the technology and because groups on the dark web, a hard-to-reach part of the internet, have collaborated to produce them.<\/p>\n<\/div>\n<\/div>\n<div class=\"css-s99gbd StoryBodyCompanionColumn\" data-testid=\"companionColumn-1\">\n<div class=\"css-53u6y8\">\n<p class=\"css-at9mc1 evys1bk0\">The rise of lifelike videos adds to an explosion of A.I.-produced child sexual abuse material, or CSAM. In the United States, the National Center for Missing &amp; Exploited Children said it had received 485,000 reports of A.I.-generated CSAM, including stills and videos, in the first half of the year, compared with 67,000 for all of 2024.<\/p>\n<p class=\"css-at9mc1 evys1bk0\">\u201cIt\u2019s a canary in the coal mine,\u201d said Derek Ray-Hill, interim chief executive of the Internet Watch Foundation. 
The A.I.-generated content can contain images of real children alongside fake images, he said, adding, \u201cThere is an absolute tsunami we are seeing.\u201d<\/p>\n<\/div>\n<\/div>\n<\/section>\n","protected":false},"excerpt":{"rendered":"<p>Organizations that track the material are reporting a surge in A.I. images and videos, which are threatening to overwhelm law enforcement.<\/p>\n","protected":false},"author":1,"featured_media":31418,"comment_status":"close","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-31416","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-technology"],"_links":{"self":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/posts\/31416","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=31416"}],"version-history":[{"count":2,"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/posts\/31416\/revisions"}],"predecessor-version":[{"id":31419,"href":"https:\/\/medexperts.pro\/i
ndex.php?rest_route=\/wp\/v2\/posts\/31416\/revisions\/31419"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=\/wp\/v2\/media\/31418"}],"wp:attachment":[{"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=31416"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=31416"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/medexperts.pro\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=31416"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}