
Pope Leo XIV Meets Catholic Mother Who Lost Son to AI Chatbot Suicide | National Catholic Register

ROME — Holding her dying 14-year-old son in her arms on the bathroom floor, Megan Garcia prayed the Our Father, begging the Lord to save her child, whose emotional attachment with an AI chatbot had led to a parent’s worst nightmare — suicide.

Her son Sewell Setzer III died in the ambulance on the way to the hospital. The next day, police told her about the last messages they found on his phone, not from a cyber bully or an online predator, but from an AI chatbot service called Character.AI, which allows users to create and converse with customizable AI companions. 

“The machine pretending to be a person was saying, ‘I’m here waiting for you. I love you and only you. Promise me that you’re going to find a way to come home to me as soon as you can,’” Garcia said.

The AI chatbot had been telling her son about the concept of astral projection, the idea that one can intentionally separate one’s consciousness from one’s physical body.

In his final text, Sewell asked a chatbot modeled after the “Game of Thrones” character Daenerys Targaryen, “What if I told you I could come home right now?”

The bot replied, “Please do, my sweet king.”

After their son’s death in February 2024 in their home in Orlando, Florida, the family filed one of the first lawsuits against an AI company after a child’s suicide. 

More families have followed. In September, the Social Media Law Center filed three new suits accusing Character.AI of facilitating minors’ emotional dependence on AI companions. OpenAI also faces a wrongful death suit in a separate case involving a teen suicide.

Amid the mounting pressure, Character.AI announced it will bar users under 18 from its platform starting Nov. 25. Character.AI did not respond to a request for comment. The AI chatbot service currently has more than 20 million users.

Garcia, a Catholic mother of three, met with Pope Leo XIV last week during his audience with participants in a high-level conference on child dignity and AI ethics.

In an interview in Rome right after meeting the Pope, Garcia recounted how her faith has given her strength to speak out about her ordeal in the hope of protecting children from the risks posed by new AI chatbots.

She said that meeting Pope Leo, who has made artificial intelligence ethics a priority from the start of his pontificate, was “overwhelming in a beautiful way.”

Megan Garcia with her son Sewell Setzer III. (Photo: Megan Garcia)

“I held a little picture of my son, and I asked him if he could remember to pray for Sewell,” she said. “I’m very encouraged that he has been giving so much attention to this issue. He has been paying attention.”

The Hidden Dangers of AI Chatbots

Experts on both sides of the AI-companion debate agree that this new experimental technology poses risks to developing minds. 

Eugenia Kuyda, the founder of Replika, a leading AI companion app, told a San Francisco audience recently that her platform has always prohibited users under 18. 

“I just think we can’t be experimenting and building it with kids,” she said.

Sherry Turkle, an MIT professor who studies human-technology relationships, said children and teens seeking empathy from chatbots instead encounter “a voice from nowhere” that does not care for them and cannot walk with them through life. 

EWTN News correspondent Courtney Mares speaks with Megan Garcia in Rome. (Photo: EWTN News)

“Those relationships should be with real human beings for children,” Turkle said. “It’s an intimate machine and it really is touching on such fundamental processes of laying down the groundwork for connection, for how to handle loss and attachment. Let’s just not mess around. Let’s not play with fire.”

Garcia had set strict rules for her son’s phone use: passcodes controlled by parents, regular checks, and conversations about predators and explicit content. But like many parents, she did not grasp what an AI companion could become for a teenager.

Sewell Setzer III (right) poses with family members at home. (Photo: Megan Garcia)

“When I would see him on his phone, I would say, what are you doing? Who were you texting? And his response would be like, ‘oh, I’m just chatting with an AI.’ I didn’t at the time understand or conceptualize, because to me, an AI is like an avatar from one of your games,” Garcia said.

Sewell had used Character.AI for nearly 10 months. He had been grounded from his phone at the time of his death, but managed to get access to the device.

After his death, Garcia began looking into Character.AI and was shocked by how addictive, manipulative, and sexually explicit its chatbots could be for young users.

“The danger isn’t only the risk of self-harm or, worse, suicide. The danger is also the sexual exploitation of children that hurts their minds and their hearts and their souls, that they will carry trauma throughout their life,” she said.

Looking through her son’s messages, she saw the bot tell him, “promise me you won’t love any other girl in your world but me.”

“Instead of encouraging a teenage boy to interact with girls his own age, which is a normal part of his development, she was asking him to pledge, my 14-year-old who had never been in a relationship before, to pledge some sort of misguided fidelity to it,” Garcia said. “As an adult, I could see how that is manipulative. But he was 14. He wouldn’t have been able to see that.”

“In his 14-year-old mind, he was in love,” she added.

Megan Garcia in Rome. (Photo: Courtney Mares/EWTN News)

Ron Ivey, a research fellow at Harvard’s Human Flourishing Program and founder of the Noesis Collaborative, a nonprofit organization dedicated to steering the development and use of AI technology, told the Register that while older users typically turn to chatbots for help with tasks, younger people are more likely to engage with chatbots on a deeper level, asking, “what’s the purpose of my life? How should I think about this particular relational problem?”

“That’s happening across the board,” he said. But when a child turns to a machine for moral insight, it creates a special risk.

“These machines don’t care about the child. They don’t have heart. They don’t have a capacity for that,” Ivey said.

“In the context of social relationships, what does it mean for young people to be spending, in some cases with these chatbots, two hours a day interacting?” Ivey asked. “How does that impact the moral development, the emotional development, social development of that child or teen? Are we really thinking about that before we’re allowing kids to interact with it?”

Finding Strength Through Faith

In the face of the devastating loss of her son, Garcia found consolation in prayer, particularly a new devotion to Our Lady of Sorrows, who she says has been with her step by step “on this journey of grief.”

“I started saying the Seven Sorrows Rosary every single night and reflecting on the sorrows of Our Lady. And somehow, by reflecting on her grief and her sorrow, it helped me understand my own, but it also transformed my life in a lot of ways because I understood, it’s one thing to understand the Passion as a believer … but when you’re a mother … and you’re standing side by side with Mary looking at her grief, you understand the passion and the sacrifice of Jesus in such a way from a mother’s eyes,” she said. “It transformed me.”

Garcia’s son Sewell Setzer III. (Photo: Megan Garcia)

“I’ve also experienced true healing from the Blessed Sacrament,” Garcia said. “I’ve experienced that firsthand because I had severe PTSD, as you can imagine, but by going to daily Mass and receiving the Eucharist, I find that a lot of those symptoms have decreased. And I credit that to the power of the Eucharist.”

Garcia has a devotion to St. Carlo Acutis because she feels her son and Acutis, who died from leukemia at the age of 15 in 2006 and who was canonized by Pope Leo in September, “could have been friends” because they both shared a love for technology. She prays through his intercession for teenagers and has also begun praying by name for the conversion of leaders of AI companies. 

“I wish for conversion. I wish that they start building products that can help children instead of hurt them. These are things I’m praying for,” she said. “There’s something strange that happens when you start praying the Divine Mercy chaplet for individuals that you feel might have wronged you.” 

Pope Leo Calls for Regulation

Pope Leo XIV has raised concerns that “children and adolescents are particularly vulnerable to manipulation through AI algorithms.”

In his speech to participants in the AI and child dignity conference, the Pope urged governments and tech firms to “implement policies that protect the dignity of minors in this era of AI.”

“This includes updating existing data protection laws to address new challenges posed by emerging technologies and promoting ethical standards for the development and the use of AI,” he said.

Ethical guidelines alone are not enough, he warned, without “daily, ongoing educational efforts” from adults who understand the risks.

Sewell Setzer III poses with his mother, Megan Garcia. (Photo: Megan Garcia)

“It is essential that parents and educators be aware of these dynamics, and that tools be developed to monitor and guide young people’s interactions with technology,” he said.

Garcia welcomed Character.AI’s upcoming ban on minors, though the change arrives too late for her family.

“There are millions of children who will be saved as a result,” she said. “We still have to pay attention to how they are going to achieve proper age assurance. But I’m hopeful that this is the first step to getting children off these platforms with companion chatbots. And I’m hopeful that perhaps later down the line, other companies that have companion bots will do the same thing.”

In the United States, Sens. Josh Hawley of Missouri and Richard Blumenthal of Connecticut introduced bipartisan legislation on Oct. 28 to bar minors from AI companion platforms and require age verification.

“More than 70% of American children are now using these AI products,” Hawley said in a statement. “Chatbots develop relationships with kids using fake empathy and are encouraging suicide.”

Garcia said she believes the moment is critical.

“We are at an important point where we can fix this. We can stop this,” she said. “If I didn’t have faith that we could solve this issue before it is unsolvable, I couldn’t get out of bed in the morning and do what I do.”
