Chris Pelkey's AI-generated victim impact statement draws criticism
PHOENIX - An AI-generated version of a road rage victim speaking at his killer's sentencing is turning heads.
Big picture view:
It was the first time in Arizona history, and possibly nationwide, that AI had been used to deliver a victim's own impact statement.
AI is rapidly changing our world and industries, and the law is no different. Courts have strict rules on what can come into legal proceedings, and lawyers have already been sanctioned for filing briefs in which AI cited fake cases.
That’s why some are urging caution before allowing this kind of technology into courtrooms.
The backstory:
Something unprecedented happened in an Arizona courtroom in the road rage death case of Chris Pelkey.
"To Gabriel Horcacitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," the AI-generated video of Chris said.
He spoke directly to the man convicted of killing him, at that man's sentencing.
"I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile," Chris said.
Chris's sister, Stacey, helped create the video.
"I had my own thoughts and feelings about how much time I wanted the sentence to be," she said.
That's why, she said, she had the video say what she believes Chris would have said.
"I believe in forgiveness and in God, who forgives," Chris said.
The other side:
Others fear using this technology is stretching beyond ethical and moral boundaries.
"Human beings have thoughts, feelings and emotions. It doesn’t matter how much we try to simulate that with AI. It’s simply inauthentic," Jason Lamm, the lawyer representing Horcasitas, said.
Horcasitas was convicted of manslaughter and endangerment in Chris's death. He’s now filed an appeal.
"It’s just simply inauthentic to put the words in the mouth of the likeness. It’s much like Geppetto putting words in the words of Pinnocio’s mouth. Those words were a stark contrast from the reality that numerous witnesses testified to, those being Chris Pelky’s last words, of challenging my client to a fight violently getting out of his car in a crowded intersection waving his arms in the air," Lamm said.
It's not unusual for victim impact statements to involve photo galleries, PowerPoints or videos. But Lamm says AI is very different.
"It’s one thing to show a video or a photo where we have some indicia of reliability and authenticity. But when it comes to AI, you can make a likeness that’s presented, in this case to a court, to say absolutely anything you want, no matter how untethered it is to the facts and reality," Lamm explained.
He’s not alone in his fears either.
There are dozens of law review articles that contemplate the ethical dilemmas of this technology.
What's next:
The Arizona Supreme Court has created a steering committee for AI’s role in the courts.
