In January, pornographic deepfake images of Taylor Swift were viewed by millions of people on X and other social media platforms before the sites took action. No one has been identified and prosecuted for using artificial intelligence tools to create the graphic and sometimes violent images of Swift. The response from the tech sector has been limited.
For Melissa Hutchins, the episode was disappointing but not surprising.
Research shows that women and girls are overwhelmingly the victims of this sort of exploitation. But most of the attention on stopping malicious deepfakes focuses on political candidates and issues, Hutchins notes, which, while incredibly important, misses the majority of the technology's victims.
So Hutchins launched Certifi AI, a Seattle startup that is developing technology to identify deepfakes and, she hopes, reduce the harm to girls and women targeted by bad actors using them.
Her motivation is personal.
Five years ago, her roommate's husband began sending her unwanted, inappropriate text messages. The situation escalated, and Hutchins moved out and obtained a restraining order. The man's cyberstalking intensified, and as Hutchins pursued legal action, he widened his attack, sending thousands of messages threatening violence and murder to people close to Hutchins, as well as to law enforcement officers on the case. The man, a cybersecurity consultant, tried to cover his tracks.
Last month, he was convicted of cyberstalking, and awaits sentencing. His crime is punishable by up to five years in prison.
Hutchins said the experience made her feel “powerless.”
“These campaigns of hate and malicious intent — the characters behind these campaigns are usually very committed to their cause and they’re very dedicated to exercising humiliation and exercising power over their victims,” Hutchins said. “The process for healing is difficult when there’s very few ways to make the activity stop.”
The legal recourse for victims is limited. Washington lawmakers this year passed a bill expanding child pornography rules to criminalize the creation, possession and sharing of deepfake images involving minors. At least 10 other states have passed similar measures, according to Crosscut.
Hutchins wants to provide technology that could help address this sort of deepfake abuse. Certifi AI is training a model to identify deepfake images and videos, creating a tool that can support law enforcement and criminal prosecutions and help media platforms quickly recognize and remove inappropriate and illegal deepfake content.
Hutchins eventually would like to offer the tool to the girls and women who are being targeted, so they can protect themselves and their identities.
The startup launched in August 2023. Hutchins, CEO and co-founder of Certifi AI, has bootstrapped the company and is currently its only full-time employee. Her previous roles include product management at The Walt Disney Company and Expedia.
The other two co-founders are the chief technology officer and lead data architect, both of whom still hold full-time day jobs elsewhere. Hutchins is working to raise investment in order to bring them on full time. Certifi AI aims to release an initial product later this year.
There are other organizations working in the deepfake space, including TrueMedia, a Seattle-based nonprofit focused on political disinformation and led by Oren Etzioni, a University of Washington professor and former CEO of the Allen Institute for AI. Hutchins has spoken with Etzioni and is eager to collaborate with others in the field where it makes sense.
Hutchins said her personal experience with the issue gives her valuable, unique insight into the problem. She wants to see more women founding and leading efforts in this area, given that they're the main targets of the attacks.
“Other CEOs and competitors in this space, they’re never going to be able to know what it feels like to not only be a woman in this type of problem space,” Hutchins said, “[or] having gone through the system of trying to get legal protection, and knowing all of the different hoops that you have to go through.”