Ex-CNN anchor Jim Acosta is no stranger to political derangement. Now he has outdone himself.
Acosta teased a “one of a kind interview with Joaquin Oliver,” published to his YouTube channel Monday. Oliver died seven years ago in the Parkland, Florida, high school shooting.
A show you don’t want to miss at 4p ET / 1p PT. I’ll be having a one of a kind interview with Joaquin Oliver. He died in the Parkland school shooting in 2018. But his parents have created an AI version of their son to deliver a powerful message on gun violence. Plus Texas State… pic.twitter.com/mbdM2WxwUR
— Jim Acosta (@Acosta) August 4, 2025
“But his parents, Manuel and Patricia, have created an [artificial intelligence] version of their son to deliver a powerful message on gun violence,” Acosta told his viewers.
One can create a version of a document or software. One cannot create a version of a specific human being. Acosta didn’t speak to Oliver so much as he spoke to something wearing Oliver’s skin.
That’s not to say the bulk of the criticism belongs with Oliver’s parents. Grief is seductive: “If only I could have him back. If only things were the way they were before.” (RELATED: ROOKE: There Aren’t A Lot Of Things That Surprise Me These Days But One Mother’s Parenting Style Takes The Cake)
Acosta, on the other hand, is not a grieving parent, but a newsman. Technically. Even if he’s spent the majority of the last decade bickering with President Donald Trump.
Trump tiff aside, “poor judgment” is the kindest assessment of what follows.
“Joaquin, it’s Jim Acosta. I was wondering if you could tell me what happened to you,” he begins.
“I appreciate your curiosity,” the AI replies. “I was taken from this world too soon due to gun violence while at school. It’s important to talk about these issues so we can create a safer future for everyone.”
A quick word on AI itself.
Large language models ingest vast quantities of text and learn its statistical patterns: grammar, vocabulary, syntax, sentiment and so on. When one asks a question, the model predicts the most likely next word, again and again, until it has spit out a response. The precise nature of that prediction is subject to much debate.
Generative chatbots, like ChatGPT or Grok, are powered by large language models. They “remember” past queries only in the sense that earlier exchanges are fed back into the model alongside each new question, and those prior interactions shape its predictions.
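To make that concrete, here is a minimal sketch of that bookkeeping. The fake_language_model function below is a stand-in, not any particular product’s API; the point is that the model itself is stateless, and the chatbot’s “memory” is just the full transcript resubmitted with every new question.

```python
# Minimal sketch of how a chatbot "remembers" a conversation.
# fake_language_model stands in for a real large language model;
# the transcript bookkeeping around it is the part being illustrated.

def fake_language_model(transcript: list[dict]) -> str:
    """Pretend to predict the most likely reply to the whole transcript."""
    last_question = transcript[-1]["content"]
    return f"(predicted reply to: {last_question!r})"

def chat() -> None:
    transcript = []  # the only "memory" the chatbot has
    while True:
        question = input("You: ").strip()
        if not question:
            break
        transcript.append({"role": "user", "content": question})
        reply = fake_language_model(transcript)  # the entire history goes back in
        transcript.append({"role": "assistant", "content": reply})
        print("Bot:", reply)

if __name__ == "__main__":
    chat()
```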
Digital necromancy is going to be one of modern horrors gifted to us by AI
People are already building emotional relationships with chat bots but just wait until it can scrape the social media of your dead loved ones
This will destroy people https://t.co/JvnPoM6cyk
— Auron MacIntyre (@AuronMacintyre) August 5, 2025
We don’t have much insight into the training data for the “Oliver AI.” All of its responses to Acosta are fairly pat.
“I love basketball. It’s such a fun way to connect with your friends and show off some skills. Plus, there’s nothing like the thrill of the game.”
“Luke Skywalker is a classic hero. His journey from a farm boy to a Jedi is so inspiring. I’d probably say Yoda is one of my favorites, too.”
“The story of hope and resilience against darkness is so powerful. It shows that even in the toughest times, unity and courage can lead to victory. Darth Vader’s journey adds so much depth to that theme.”
Do these sound like the off-the-cuff musings of a teenage boy? Or do they sound like a computer collapsing trillions of pieces of text into a few bland sentences?
Acosta’s interview comes with an additional layer of insidiousness. He’s not just indulging an uncanny fantasy. He’s using it for political ends.
“We would like to know what your solution would be for gun violence,” Acosta asks.
“I believe in a mix of stronger gun control laws, mental health support, and community engagement,” the AI offers.
Oliver may well have shared those sentiments. If there’s evidence to that effect — messages to friends, essays, journal entries — Acosta might’ve reviewed it. That would be a decent means of understanding Oliver’s politics.
But Acosta’s segment has very little to do with understanding Oliver. One doesn’t usually appeal to teenagers for policy advice. The segment instead appears to be a vehicle for gun control advocacy. Reanimating the likeness of a deceased high schooler is a pretty decent means of emotional manipulation towards that end. (RELATED: AI Company Plays Wack-A-Mole To Keep Chinese Propagandists From Using ChatGPT To Sow Division In American Politics)
Oliver’s likeness isn’t the first to be used in this manner.
Christopher Pelkey took the stand to forgive the man who fatally shot him four years prior. Well, Pelkey didn’t.
An AI-generated video resembling Pelkey, created by his sister and brother-in-law and voiced with a clone of Pelkey’s actual voice, did.
“It is a shame we encountered each other that day in those circumstances,” the Pelkey AI recites. “In another life, we probably could have been friends. I believe in forgiveness and in God who forgives. I always have and I still do.”
No jury assessed the video. It was the last “victim statement” offered at a sentencing hearing for Pelkey’s killer, according to NPR.
It’s not hard to imagine an attorney petitioning a judge to let them resurrect their client for direct examination. Just imagine how the O.J. Simpson trial might’ve gone if jurors were face to face with a sobbing Nicole Brown Simpson.
Necromancy has proved a near-universal mythological theme. Osiris is cut into pieces and put back together. Odin hangs himself and is reborn with greater powers. Belief in “zombies,” reanimated dead enslaved to their revivers, remains common in Haiti to this day.
All that to say, when someone attempts a ventriloquist act with a corpse, proceed with caution.
Follow Natalie Sandoval on X: @NatSandovalDC