
The facsimile of Pelkey thanked the judge and told his killer he believed in forgiveness, saying that “in another life, we probably could have been friends”. He ended the video with a farewell to his family: “Well, I’m going to go fishing now.”
It wasn’t a perfect likeness of Pelkey. His face moved stiffly, and his voice was clipped. But the video moved his family and friends and stirred the judge, who said he “loved that AI” in his closing remarks.
“I feel that that was genuine,” said Todd Lang, the Maricopa County Superior Court judge who ruled in the case. He sentenced Pelkey’s killer to 10-and-a-half years in prison, the maximum for manslaughter – which Wales had asked for.
Wales’ video joins a growing list of cases in which parties have brought generative artificial intelligence into the courtroom. Experts said the AI footage of Pelkey was striking both for its novelty and for how well it was received.
“This definitely caught a number of us by surprise,” said Diana Bowman, a law professor at Arizona State University.
Pelkey was killed in a road rage incident in Chandler, Arizona, in November 2021, court records show. While stopped at a red light, Pelkey left his car and approached another car whose driver had honked repeatedly at him. That driver, Gabriel Horcasitas, shot and killed him as he approached.
A jury convicted Horcasitas of manslaughter in March. As his sentencing approached, Wales contacted Pelkey’s friends and family and gathered dozens of written statements, video clips and photos to show the judge. Then she thought that she could do more.
“I said to myself, ‘Well, what if Chris could make his own impact statement?’,” Wales said.
Wales’ husband, Tim Wales, a tech entrepreneur, had experience using generative AI to animate photos and replicate voices. She proposed creating a video of Pelkey.
“I won’t let it [be published] if it’s hokey or flat,” Stacey Wales recalled reassuring him at the time.
Tim Wales and a friend used AI tools to edit a photo of Pelkey, clone his voice based on old videos of him speaking, and animate his face so his eyes blinked and his mouth moved as he spoke. Wales wrote Pelkey’s speech herself – by hand and without AI, she said – based on what she thought her brother would say.
Wales wanted the toughest sentence allowable for Horcasitas, she said, but she wrote in Pelkey’s voice that he “believed in forgiveness and God who forgives”.
Then she showed the video to her victims’ rights attorney, Jessica Gattuso.
“I thought it was very effective,” Gattuso said. “It was appropriate. I didn’t know what kind of objections we might get or pushback. … I did kind of prepare for that.”
But no one objected when Wales played the video in court, after dozens of other friends and family members had given their own tributes to Pelkey. Wales had kept the video a surprise even from her family. She also did not disclose it to the judge or Horcasitas’ attorneys; Arizona law does not require that, Gattuso said.
The video appeared to resonate with Lang, who praised it before delivering Horcasitas’ sentence. A few days after the hearing, Lang requested a copy of the video to show his peers, Wales and Gattuso said.
Wales fared better in bringing AI-generated video into the courtroom than others who did so in different contexts. A New York man was scolded for using an AI avatar to represent him in an employment dispute in March. A Washington state judge rejected bystander video submitted as evidence in a triple murder case last year because it was enhanced with AI tools.
Bowman, the law professor, said Wales’ case avoided controversy probably because the video was introduced during a sentencing and wasn’t being used to determine the defendant’s guilt. It also helped that Wales, unlike the New York man, clearly introduced her video as AI-generated.
Gary Marchant, a professor of law, ethics and emerging technologies at Arizona State, said attorneys might have objected to showing a video that fabricates a victim’s voice to a jury.
“In most cases, it’s going to be possibly misleading and prejudicial, probably,” Marchant said. “So I think it’s dangerous to start using non-real evidence that is created by an AI, even though, in this particular case, I’m kind of sympathetic to it.”
Arizona’s highest court is open to bringing AI into the legal process, state Supreme Court Chief Justice Ann Timmer said. The court formed an AI committee to investigate the risks of parties fabricating AI-generated evidence but has also begun using AI-generated avatars to explain court rulings on YouTube.
Timmer declined to comment on Wales’ video but said any problems that arise from using AI-generated evidence during a sentencing would be decided under the state’s existing guidelines for victim-impact statements.
“You can make statements that even can be emotional, but you can’t go so far as to deprive someone of a fundamentally fair trial,” Timmer said.
Wales said she didn’t think it was unfair to give her brother a voice in court. The video would help keep his memory alive and had given her family closure after a long criminal trial, she said.
“Of course, AI is uncanny,” Wales said. “But in this moment, for Chris to be able to speak on his behalf, it was absolutely worth it.”