
Digital Deceit: AI Fakes Objectify Missing People

Among the latest queasy applications of artificial intelligence, families of missing people are using AI to generate digital simulations of their absent loved ones:

Since artificial intelligence development has become available to the masses, interesting, funny, and sometimes odd videos have been created, but AI is also being used for good: raising awareness for those who are missing and murdered.

…Creating AI-generated missing videos, told by the person who disappeared, can be a powerful tool. It’s a way for Tawana Spann to honor her murdered son.

Spann posts the videos online.

“I’ve been an advocate for missing persons and homicide victims ever since. It makes me feel like he didn’t die in vain,” Spann said. “It’s a marketing tool for their cases.”

Video collaborator Laura Bollock, who is also an administrator for the South Dakota Missing Persons Facebook page, helps families looking for loved ones in South Dakota.

“It’s giving her back her voice and letting her kinda speak for herself. What I’m hoping for is really that we can just touch the heart of somebody who maybe has information,” Bollock said [Beth Warden, “Missing and Murdered Victims Tell Their Own Stories Through AI,” KSFY, 2025.11.18].

The fake video of Rachel Cyriacks, who went missing in 2013, that prompted Warden to write this story is not labeled as a fake. Nothing on the If We Could Speak website says the organizers used artificial intelligence to create digital mock-ups of the missing people. Even if you are using AI “for good”, as Warden labels it, you ought to clearly label your AI video, audio, and text as computer-generated products, not actual testimony from actual people.

Of course, the fake video hardly needs a disclaimer for even casually observant viewers to recognize its AI provenance. The simulation speaks with repetitive intonation and makes repetitive gestures with the hands, face, head, and shoulders that all feel disjointed from the words and the varying emotions one might expect to accompany varying human expressions. The computer places a tattoo on the simulation’s neck but fails to accurately recreate the actual tattoo Cyriacks had on her neck, “Brad”, the name of the estranged husband who abused her and is suspected of killing her. (Maybe that detail is too grim for a gentle computer simulation, but if you’re really trying to help find a missing person, accurate details matter.) The computer also fails to recreate the tattoos on Cyriacks’s right forearm. The simulation stands in front of a desert background, which has no apparent connection to Cyriacks’s life in Woonsocket and the James River Valley.

But fake-Cyriacks doesn’t say anything to make clear that “she” isn’t “real” until halfway through the video, when the simulation says, “I went to pick him up. Nobody knows why. Maybe I thought he’d changed. That was the last time anyone saw me alive.” No living human would say those words. At best, this simulation presents a ghost.

I suppose such sloppy fake videos are the next technological step from generating fake photographs using algorithms to show what kidnapped children might look like now, 10 or 20 years after their still-unsolved disappearances. But the language in Warden’s report gives me pause.

Are we sure that sloppily animating absent people is “good”?

Do we really want to call this AI application a “marketing” tool? “Marketing” is what we do to sell products. Using that word seems to commoditize the mystery, objectifying the missing person.

Can we with a straight face say a deepfake (or, given its low quality, should we say shallowfake?) gives any person her voice back and lets her speak for herself? This is not the missing woman’s voice. The missing woman is not speaking for herself. A computer is masquerading as a human. This cheap puppetry does not revive or empower; animating a ghost is merely grotesque.

And presenting AI recreations of humans without explicit statements that they are fakes, regardless of the good causes those fakes are deployed to support, is not good. It is a lie.
