After an Arizona man was shot, an AI video of him addresses his killer in court

A screenshot of the AI-generated video of Christopher Pelkey. / YouTube

For two years, Stacey Wales kept a running list of everything she would say at the sentencing hearing for the man who killed her brother in a road rage incident in Chandler, Ariz.

But when she finally sat down to write her statement, Wales was stuck. She struggled to find the right words, but one voice was clear: her brother's.

"I couldn't help hear his voice in my head of what he would say," Wales told NPR.

That's when the idea came to her: to use artificial intelligence to generate a video of how her late brother, Christopher Pelkey, would address the courtroom and specifically the man who fatally shot him at a red light in 2021.

On Thursday, Wales stood before the court and played the video — in what AI experts say is likely the first time the technology has been used in the U.S. to create an impact statement read by an AI rendering of the deceased victim.

A sister looking for the right words

Wales has been thinking about her victim impact statement since the initial trial in 2023. The case was retried in 2025 because of procedural problems with the first trial.

The chance to speak in court meant a great deal to Wales, who held back her emotions throughout both trials to avoid influencing the jury.

"You're told that you cannot react, you cannot emote, you cannot cry," she said. "We looked forward to [sentencing] because we finally were gonna be able to react."

Wales' attorney told her to humanize Pelkey and offer a complete picture of who he was.

So Wales went on a mission. She said she contacted as many people from Pelkey's life as she could — from his elementary school teacher to his high school prom date to the soldiers he served alongside in Iraq and Afghanistan.

A photo of Christopher Pelkey walking his sister Stacey Wales down the aisle at her wedding. / Stacey Wales

In total, Wales gathered 48 victim impact statements — not counting her own. When it was time to write hers, she was torn between saying how she truly felt and what she thought the judge would want to hear.

"I didn't wanna get up there and say, 'I forgive you,' 'cause I don't, I'm not there yet," she said. "And the dichotomy was that I could hear Chris' voice in my head and he's like, 'I forgive him.'"

Pelkey's mantra had always been to love God and love others, according to Wales. He was the kind of man who would give the shirt off his back, she said. While she struggled to find the right words for herself, Wales said writing from his perspective came naturally.

"I knew what he stood for and it was just very clear to me what he would say," she added.

A digitally trimmed beard and an inserted laugh

That night, Wales turned to her husband Tim, who has experience using AI for work.

"He doesn't get a say. He doesn't get a chance to speak," Wales said, referring to her brother. "We can't let that happen. We have to give him a voice."

Tim and their business partner Scott Yentzer had only a few days to produce the video. The challenge: there's no single program built for a project like this. They also needed a long, clear audio clip of Pelkey's voice and a photo of him looking straight at the camera — neither of which Wales had.

Still, using several AI tools, Wales' husband and Yentzer managed to create a convincing video from a roughly 4.5-minute video of Pelkey, his funeral photo and a script that Wales prepared. They digitally removed the sunglasses perched on Pelkey's hat and trimmed his beard, which had been causing technical issues.

Wales, who was heavily involved in making sure the video felt true to life, said recreating her brother's laugh was especially tough because most clips of Pelkey were filled with background noise.

The experience made Wales reflect on her own mortality. So one evening, Wales stepped into her closet and recorded a nine-minute video of herself talking and laughing — just in case her family ever needed clear audio of her voice someday.

"It was a weird out-of-body experience to think that way about your own mortality, but you never know when you're going to not be here," she said.

The night before the sentencing hearing, Wales called her victim rights attorney, Jessica Gattuso, to tell her about the video. Gattuso told NPR that she was initially hesitant about the idea because she had never heard of it being done in an Arizona court before. She also worried that the video might not be received well. But after seeing it, she felt strongly that it should be shown in court.

"I knew it would have an impact on everyone including the shooter, because it was a message of forgiveness," Gattuso said.

The AI-generated video helped with healing, sister says

Ten people spoke in support of Pelkey at the sentencing hearing. The AI-generated video of him went last.

"Hello. Just to be clear for everyone seeing this, I'm a version of Chris Pelkey recreated through AI that uses my picture and my voice profile," the AI avatar said.

The video went on to thank everyone in Pelkey's life who contributed an impact statement and attended the hearing. Then, the video addressed his shooter, Gabriel Paul Horcasitas.

"It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God who forgives. I always have and I still do," the video said.

The video ended with the avatar encouraging everyone to love one another and live life to the fullest. "Well, I'm gonna go fishing now. Love you all. See you on the other side," it concluded.

Neither the defense nor the judge pushed back. Later in the hearing, Judge Todd Lang said, "I loved that AI. Thank you for that."

A photo of Christopher Pelkey. / Stacey Wales

He added, "It says something about the family because you told me how angry you were and you demanded the maximum sentence. And even though that's what you wanted, you allowed Chris to speak from his heart, as you saw it. I didn't hear him asking for the maximum sentence." Horcasitas received 10.5 years for manslaughter.

Wales said she didn't realize how deeply the video would affect her and her family. For her teenage son, it was a chance to hear his uncle say goodbye. For Wales, it gave her the strength to finally look back at photos of her brother.

"Going through this process of AI and what he'd sound like and trimming his beard and inserting laughs and all these other things, it was very cathartic and it was part of the healing process," she said.

What AI and legal experts say

Over the years, there have been a growing number of examples testing the bounds of AI's role in the courtroom.

For instance, in 2023, President Trump's former lawyer Michael Cohen unwittingly sent his attorney bogus AI-generated legal citations. More recently, last month, a man attempted to use an AI-generated lawyer avatar in court — an effort that was quickly shut down by the judge.

But the use of AI for a victim impact statement appears to be novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey's case.

"Because this is in front of a judge, not a jury, and because the video wasn't submitted as evidence per se, its impact is more limited," she told NPR via email.

Some experts, including Grossman, predict generative AI will become more common in the legal system, but it raises various legal and ethical questions. When it comes to victim impact statements, key concerns include questions around consent, fairness and whether the content was made in good faith.

"Victim statements like this that truly try to represent the dead victim's voice are probably the least objectionable use of AI to create false videos or statements," Gary Marchant, a professor of law, ethics and emerging technologies at Arizona State University's Sandra Day O'Connor College of Law, wrote in an email.

"Many attempts to use AI to create deep fakes will be much more malevolent," he added.

Wales herself cautions people who may follow in her footsteps to act with integrity and not be driven by selfish motives. "I could have been very selfish with it," she said. "But it was important not to give any one person or group closure that could leave somebody else out."

Copyright 2025 NPR

Juliana Kim
Juliana Kim is a weekend reporter for Digital News, where she adds context to the news of the day and brings her enterprise skills to NPR's signature journalism.