
As Darth Vader's voice is replaced by an AI, we ask: at what point do special effects start being more machine than man?

After more than four decades, Darth Vader's voice is being provided by software... and he's not the first Star Wars character that's true for.

Star Wars: Obi-Wan Kenobi
Image credit: Lucasfilm

In the mythology of Star Wars, Darth Vader is a terrifying amalgam of man and machine – a dehumanized (literally) shell of a person with pieces ripped out and replaced with technology in order to better serve his master. In not entirely unrelated news, it emerged recently that James Earl Jones has retired as the voice of Darth Vader – a role he originated in 1977, and has returned to in the years and decades since – and his replacement is Respeecher, artificially intelligent software that draws on recordings of Jones as the character.

This isn’t science fiction; as Vanity Fair revealed, Ukrainian start-up Respeecher provided the voice of Darth Vader in this year’s Star Wars: Obi-Wan Kenobi mini-series on Disney+, and few noticed. No-one should be surprised, though; in the past few years, Lucasfilm in particular has been pushing the envelope when it comes to finding ways to allow technology to evoke the childhood of its eager audience.

That's No Man

Take the appearance of the young Luke Skywalker in 2021’s The Book of Boba Fett; that wasn’t a digitally de-aged Mark Hamill, as had been the case with the character’s appearance in 2020’s The Mandalorian season two finale. Instead, Boba Fett’s Luke was a deepfake built on top of on-set performances from two actors, Scott Lang and Graham Hamilton. In both cases, the voice for Luke didn’t come from Hamill, or any other human being; it, too, came from Respeecher, as revealed by Mandalorian showrunner Jon Favreau in a documentary about the show that aired on Disney+.

One of the Mandalorian sound editors explained the process in the Disney+ documentary. “We had clean recorded ADR from the original films, a book on tape [Hamill had] done from those eras, and then also Star Wars radio plays he had done back in that time,” he said. “I was able to get clean recordings of that, feed it into the system, and they were able to slice it up and feed their neural network to learn this data.” The same process, presumably, explains how James Earl Jones was so ably replaced as the Dark Lord of the Sith – a feat undoubtedly aided by the distortion effects traditionally applied to Darth Vader’s vocalizing. An amalgam of man and machine, indeed.
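For readers curious what “slice it up and feed their neural network” might look like in practice, here is a minimal, purely illustrative Python sketch of that data-prep step: taking clean recordings and cutting them into short, uniform clips that a voice model could train on. The folder names, clip length, sample rate, and choice of libraries (librosa and soundfile) are my own assumptions for illustration – Respeecher’s actual pipeline is proprietary and not described in that level of detail anywhere I’ve seen.

```python
# Illustrative sketch only: slice clean source recordings into short clips
# for a hypothetical voice-model training set. Folder names, clip length,
# and sample rate are assumptions, not Respeecher's real tooling.
from pathlib import Path

import librosa          # audio loading and resampling
import soundfile as sf  # writing the sliced clips back to disk

SOURCE_DIR = Path("clean_recordings")  # e.g. ADR, audiobook, radio-play audio (hypothetical)
OUTPUT_DIR = Path("training_clips")
CLIP_SECONDS = 5.0                     # assumed clip length
SAMPLE_RATE = 22050                    # a common rate for speech models

OUTPUT_DIR.mkdir(exist_ok=True)

for source in sorted(SOURCE_DIR.glob("*.wav")):
    # Load and resample so every clip shares one sample rate.
    audio, _ = librosa.load(source, sr=SAMPLE_RATE, mono=True)

    # Trim leading/trailing silence so clips are mostly speech.
    audio, _ = librosa.effects.trim(audio, top_db=30)

    samples_per_clip = int(CLIP_SECONDS * SAMPLE_RATE)
    for start in range(0, len(audio) - samples_per_clip + 1, samples_per_clip):
        clip = audio[start : start + samples_per_clip]
        out_path = OUTPUT_DIR / f"{source.stem}_{start // samples_per_clip:04d}.wav"
        sf.write(out_path, clip, SAMPLE_RATE)
```

The point of the sketch is simply that the “feed it into the system” step the editor describes begins with unglamorous data preparation; the actual voice conversion happens in a trained model downstream, which is the part companies like Respeecher keep to themselves.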

But again, even this is nothing new for Star Wars. After all, this is a franchise that digitally resurrected Peter Cushing for 2016’s Rogue One: A Star Wars Story, 22 years after the actor’s death. (It also created a CGI version of a young Carrie Fisher for the same movie, using actor Ingvild Deila as a body double, but such things fade into the background when faced with a digital zombie stranded in the uncanny valley in such an important role.) At the time, Lucasfilm and the Rogue One filmmakers faced no small amount of backlash for what they had done, to the point where John Knoll, chief creative officer of Industrial Light & Magic – which created the character – publicly pushed back, telling critics that Cushing’s estate had signed off on its work. “We weren’t doing anything that I think Peter Cushing would’ve objected to,” he said at the time. “I think this work was done with a great deal of affection and care.”

As Above, So Below

It’s something to consider, the idea that the work is done with affection and care. Beyond the implications for filmmakers and, especially, actors, this notion of artificial actors and AI-programmed performances places a great deal of pressure on the programmers, designers, and engineers responsible behind the scenes. One of the more remarkable things in the same Vanity Fair piece that revealed that Respeecher was responsible for Darth Vader’s dialogue was the discovery that those working at the company are not only based in Ukraine, but were working on the show as Russian forces invaded the country – something that, impressively, didn’t stop them from completing the project.

“If everything went bad, we would never make these conversions delivered to Skywalker Sound,” one of the workers is quoted as saying, explaining why they felt compelled to keep working despite everything. Elsewhere in the story, the founder of Respeecher talks about the emotional cost of the project, and others of a similar ilk, on workers. “You’ve seen how united and resilient Ukrainians are at this moment, but in terms of how we live now: We wake up, we go to work, and then we go home and try to get some sleep. I’m currently separated from my family. So my wife and daughter, they’re abroad. I brought them to the border as soon as it all started.”

The idea that visual and audio effects work has a hidden cost is far from new; just this summer, multiple stories appeared across the media about the overwork and burnout suffered by effects houses working on projects for Lucasfilm’s corporate sibling, Marvel Studios, as well as the impact this had on the quality of the work being produced. Obviously, there are extenuating circumstances with regard to the invasion of Ukraine, as well as the need to keep working in order to support family sent out of the country as a result of that invasion; similarly, while Lucasfilm and Marvel share a number of creative staff and a parent company, there’s an argument to be made that it’s unfair to cross the streams in this way, and pull these two separate circumstances together. And yet…

Dinosaurs and Other Impossible Beasts

In addition to Obi-Wan Kenobi, The Mandalorian, The Book of Boba Fett, and everything else I’ve mentioned above, there’s a documentary on Disney+ about Industrial Light & Magic; it’s a six-episode history of the effects shop and how it revolutionized filmmaking on more than one occasion, from its origins creating the effects for the original Star Wars in 1977 to its early, eye-popping CGI visuals in both Terminator 2: Judgment Day and Jurassic Park. It’s a genuinely fun, informative watch that I heartily recommend, not least because there’s a moment, during the discussion of the digital effects experiments for 1993’s Jurassic Park, when those involved talk about the realization that, if done properly, virtual effects work could do so much more than anyone had previously imagined.

That’s still true today; if anything, it’s even more true, with audiences left unsure whether characters onscreen are real or virtual – or, indeed, what the difference actually is. Yet, increasingly often, that work is being done at an excessive cost to those creating the effects, and as AI characters become more and more common (which they almost certainly will; consider the relative failure of Solo: A Star Wars Story, the one time that Lucasfilm tried to recast a high-profile character, and compare it to the success of projects using computer-generated recreations of actors), it’s arguably coming at a cost to actors, as well.

Perhaps Darth Vader isn’t the best character to reference when thinking about the topic, after all. Let’s go with Jurassic Park, instead. What Jeff Goldblum’s Ian Malcolm says in that movie, about cloning, might be equally applicable to the current trend in visual effects work: “Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.”


On the very opposite end of the effects spectrum, why not revisit the Emerald City Comic Con panel for Doug Jones, the performer who’s been bringing monsters, aliens, and the unknown to life for decades at this point?