A former Disney Channel star just launched an app that lets you have video conversations with deceased loved ones. The internet has spoken: “This is one of the most evil, psychotic things I’ve ever seen.”

Calum Worthy — who played Dez on Austin & Ally from 2011 to 2016 — co-founded 2Wai, an AI company that creates conversational video avatars of people, including those who’ve died. The app’s “HoloAvatar” feature needs just three minutes of uploaded footage to generate a digital replica that looks, talks, and moves like the original person. The avatar can then have real-time conversations with users in over 40 languages.

The promotional video Worthy posted on November 11 has been viewed more than 20 million times and sparked immediate outrage. “What if the loved ones we’ve lost could be part of our future?” the ad asks. But what if we shouldn’t?

The Video That Broke the Internet

The 2Wai promo opens with a pregnant woman video-calling her mother for pregnancy advice. Heartwarming, until you realize the mother is dead; she’s an AI recreation. The ad then jumps forward through time: the AI grandmother reads bedtime stories to her newborn grandson, gives him advice about his middle school crush, and eventually counsels him as an adult when he becomes a father himself.

“With 2Wai, three minutes can last forever,” the video declares. The responses were swift and brutal. Thousands of replies poured in, with users calling the app “nightmare fuel,” “demonic,” and “dystopian.” One viral response labeled it “beyond vile,” arguing that regular videos already preserve memories without the “AI guesswork.”

The backlash wasn’t just about technological squeamishness. Jonathan Haidt, social psychologist and author of The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness, called it “sick” and warned it represents “our lonely future unless we start saying no to it now.” Conservative commentator Matt Walsh wrote, “The people who create this stuff are evil. The people who use it are tragically misguided.”

Black Mirror Come to Life

Nearly everyone comparing 2Wai to something invoked the same reference: Black Mirror. Specifically, the 2013 episode “Be Right Back,” in which a grieving woman creates an AI version of her dead boyfriend, eventually downloading it into a synthetic body. The episode ends with the android locked in an attic, brought out only for special occasions — a chilling exploration of what happens when we refuse to let go.

The comparison isn’t unfair. 2Wai’s technology takes the Black Mirror premise and makes it accessible on your iPhone. The app is currently free in beta but will shift to a subscription model. That means ongoing payments to maintain your relationship with a digital ghost.

One darkly humorous response captured the concern perfectly: “You know they’ll introduce a tier where your dead relative starts reading you advertisements.”

The Grief Tech Gold Rush

2Wai isn’t alone in this space. The “grief tech” industry has been quietly growing for years, with companies racing to monetize mourning through AI. HereAfter AI, founded in 2019, creates “Life Story Avatars” from pre-death interviews, emphasizing consent from the living before they die. StoryFile offers interactive videos recorded before death; it filed for Chapter 11 bankruptcy in 2024 owing $4.5 million.

Replika, a chatbot service that users have employed to mimic deceased loved ones through text, faced backlash after a 2023 update “killed” personalized bots. In a separate case that year, a Belgian man’s suicide was linked to his eco-anxiety conversations with an AI chatbot.

2Wai raised $5 million in pre-seed funding in June 2025 from undisclosed investors. The company says it’s working with British Telecom and IBM. Worthy’s background as an actor who lived through the 2023 SAG-AFTRA strikes over unauthorized AI likenesses adds an ironic twist to his venture into digital resurrection.

What Experts Actually Worry About

The outrage isn’t just emotional; there are legitimate psychological and ethical concerns. Psychologists warn that “griefbots” could disrupt the natural grieving process, which requires learning to reconcile a person’s death with the visceral sense that they should still be here.

“We all want to feel close to our loved one after they die,” as researchers studying grief technology have put it, “but if this technology can show evidence that it does no harm in properly controlled empirical studies, then it could prove an exciting way of memorializing.”

The problem, though, is that those studies don’t exist yet. As one Cambridge researcher put it, this is “a vast techno-cultural experiment.”

Mary-Frances O’Connor, author of The Grieving Body, notes that while all cultures have used available technology to connect with the dead – photos, videos, recordings – AI avatars that respond and interact cross a different threshold. They blur the line between memory and presence in ways we don’t fully understand.

There’s also the risk of emotional dependency. If you can talk to your dead mother anytime you want, when do you learn to live without her? Some experts worry the technology could trap users in prolonged grief rather than helping them heal.

The Legal Black Hole

Death creates legal complications. Privacy laws protect living people but offer almost no posthumous safeguards. Can someone create an avatar of your deceased relative without your permission? In most cases, yes. Federal law doesn’t prevent anyone from building bots out of the dead or living using data already in their possession.

California’s AB 1836, signed in September 2024, bans unauthorized AI replicas of deceased performers in audiovisual works without estate consent, with penalties up to $10,000. But it only covers entertainers, not regular people. And it doesn’t address what happens when family members disagree about creating an avatar, or when the subscription lapses and the avatar “dies” a second death.

2Wai claims to include family approvals for deceased avatars, but critics question enforcement. The company’s technology runs on “FedBrain,” which processes interactions on-device to limit AI “hallucinations” and keep responses tied to approved data. Still, the capacity for these bots to learn and deviate from recorded behavior raises questions about preserving someone’s true legacy.

More Than Just Ghosts

2Wai isn’t only about the dead. The app also features AI versions of historical figures like William Shakespeare, Frida Kahlo, and King Henry VIII as “real-time teaching assistants.” There are specialized AI coaches for cooking, astrology, and travel. And there’s a digitized version of Worthy himself, ready to share behind-the-scenes Disney anecdotes.

The platform positions itself as “building a living archive of humanity, one story at a time.” Worthy told Variety that 2Wai puts “control back in the hands” of celebrities who want “meaningful relationships with fans around the world” without being online 24/7.

But the dead relatives feature has overshadowed everything else. When your marketing centers on keeping grandma “alive” to meet great-grandchildren she’ll never know, the educational Shakespeare chatbot becomes an afterthought.

The Uncomfortable Questions

Perhaps the most troubling aspect isn’t the technology itself but what it reveals about us. We’re so uncomfortable with loss, so desperate to maintain connection, that we’ll pay monthly fees to text with simulations of people we loved.

As one critic noted: death and loss are normal parts of life. By creating technology that lets us avoid that reality, are we creating “dependent and lobotomized adults”? Are we pathologizing the natural impermanence of human existence?

The app also raises uncomfortable commercial questions. Picture this cancellation flow: “Are you sure you want to cancel your subscription and never talk to your dead parents again?” One user called Worthy a “parasite” for “preying on the deepest human feelings, looking for ways to leverage them for profit.”

Worthy hasn’t directly addressed the backlash. The 2Wai beta is live on the App Store, with Android coming soon. Whether the controversy kills the product or proves that any publicity is good publicity remains to be seen.

But we’re certainly now living in the exact future Black Mirror warned us about, and we’re doing it by choice.
