The sleep and meditation app Calm has released a story using the voice of the late actor, who died in 1997.
Jimmy Stewart’s voice is taking on a life of its own.
On Friday, the Calm meditation and sleep app released a bedtime story narrated by an AI-generated version of the actor’s famous drawl.
While Stewart died in 1997, Calm was able to resurrect his voice with the help of Respeecher, a Ukraine-based company that produces synthetic speech and voice clones with machine learning technology.
“It’s a Wonderful Sleep Story” recounts “a heartwarming new holiday tale” in the style of Stewart’s 1946 Christmas classic “It’s a Wonderful Life” and is available to subscribers of Calm Premium, which costs $69.99 per year.
The use of the “Vertigo” actor’s voice was approved by CMG Worldwide, the company that manages Stewart’s licensing, according to Variety.
In the audio shared by the outlet, Stewart begins by introducing himself as “Jimmy,” before telling listeners to make themselves “nice and comfortable.”
“It’s a heartwarming story of love, of loss, of hope and of joy,” the voice clone continues. “But most of all, it’s a wonderful sleep story.”
Calm has previously enlisted celebrities like Michael Bublé, Idris Elba, Matthew McConaughey, Mandy Moore and Harry Styles to narrate its sleep stories, but the AI-generated content is new territory.
In a statement shared with The New York Times, Respeecher said it was able to re-create Stewart’s Western Pennsylvania phrasing by combining recordings from a voice actor with old clips of the “Rear Window” star, making sure “all the important tiny inflections” were “preserved.”
The company, which worked to reproduce Mark Hamill’s voice as Luke Skywalker for “The Mandalorian” and James Earl Jones’ voice as Darth Vader for “Obi-Wan Kenobi,” said it does not allow its technology to be used for “deceptive” purposes.
As AI technology has improved, there has been increased anxiety around “deepfakes,” a term for AI-altered images and videos that often distort footage of public figures or news events.
“We know voice replication technology could be dangerous in the wrong hands,” Respeecher says on a page of its website titled “Ethics.”
The company says it has pledged never to use the voice of a private person or actor without their consent, but it has used the voices of historical figures like Richard Nixon and Barack Obama to showcase the technology.
Stewart’s daughter Kelly Stewart Harcourt celebrated the project in a statement, saying, “It’s amazing what technology can do and wonderful to see Dad’s legacy live on this holiday season in new ways, like helping people find restful sleep and sweet dreams.”