AI update: FBI deepfake warning and new NY law
In New York, an amendment to the NDII law (formerly known as the “revenge porn” law) wraps deepfake images into the definition of “unlawful dissemination or publication of an intimate image” – a victory for victims and those who advocate for them.
We understand this to be the first non-consensually distributed intimate image law in the country to explicitly include “deepfake” images.
This new piece of legislation was created based on the experiences of survivors who suffered image-based abuse through deepfakes. It has historically been a defense to an NDII charge to say “it wasn’t really her, it was partially fake” – a hollow and illegitimate defense, given the disruption and trauma deepfakes can cause, and the easy accessibility of technology that enables anyone to create extremely realistic images that could be construed as real.
It also means that a victim can now get an Order of Protection if someone has distributed or is distributing deepfakes of them. Senate Bill S01042A has passed the Senate and Assembly, and we’re waiting for the governor to sign it.
“I am so proud of survivors and advocates in New York State for taking the lead on image-based abuse and intimate partner violence. Going forward, our NY clients will benefit from the recognition that deep fakes cause a unique harm that can now be addressed when we get an Order of Protection to stop further abuse, and when we advocate with law enforcement during criminal proceedings.”
– C.A. Goldberg, PLLC Managing Associate Annie Seifullah
FBI warns of child sexual abuse material and sextortion risks posed by deepfakes
Last week, the FBI released a warning to the public about malicious actors creating “deepfakes” – manipulated photographs or videos – to sextort or harass victims. In a recent Public Service Announcement, the agency stated, “The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content. The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”
Explaining how deepfakes are being used in image-based sexual abuse of children and adults, the FBI said, “Malicious actors use content manipulation technologies and services to exploit photos and videos—typically captured from an individual’s social media account, open internet, or requested from the victim—into sexually-themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites. Many victims, which have included minors, are unaware their images were copied, manipulated, and circulated until it was brought to their attention by someone else. The photos are then sent directly to the victims by malicious actors for sextortion or harassment.”
The agency also advised how deepfakes could be used in sextortion, saying, “The FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats.”
“Based on recent victim reporting, the malicious actors typically demanded: 1. Payment (e.g., money, gift cards) with threats to share the images or videos with family members or social media friends if funds were not received; or 2. The victim send real sexually-themed images or videos.”
You can read more about how deepfakes are weaponized here.