HONG KONG (SE): Dr. Nataša Govekar, theological-pastoral director of the Dicastery for Communication, emphasised the profound anthropological question raised by the rise of AI: “What does it mean to be human today?”
She was speaking on the second day of the December 10-12 gathering on “Artificial Intelligence and Pastoral Challenges in Asia.”
On the journey
In her reflection titled “Guiding Principles for Using AI in Evangelization and Pastoral Care,” Dr. Govekar suggested adding the word “Towards” to her talk’s title, emphasising that she does not yet have established guiding principles, RVA News reported. She urged participants to start by sharing their own experiences with AI technologies, observing how quickly these tools have become part of everyday life and even spiritual practices.
Dr. Govekar pointed to large language models developed by companies like OpenAI, Anthropic, Google DeepMind, and Meta, as well as applications that simulate religious conversations. She specifically mentioned platforms like ChatWithGod.ai, which enables users to “speak” with the Christian God or other religious figures, referring to this phenomenon as “a religious supermarket.”
Catholic AI initiatives
Dr. Govekar acknowledged that Catholics might feel more comfortable with explicitly Catholic AI initiatives. She cited Truthly, which claims to base all responses on authoritative Church sources, and the widely used Magisterium AI, described by its founder as a disciplined “digital librarian.” Drawing from a database of over 29,000 magisterial and theological documents, this tool references its sources rather than generating unsupported statements. Its associated Alexandria Digitisation Hub aims to expand the Catholic dataset.
She highlighted an important reminder from Magisterium AI’s developers: these tools must aid understanding, not replace the indispensable human dimensions of faith. “Always prioritise engaging with the Sacraments, consulting real people, and reading primary sources,” she quoted, expressing her appreciation for this emphasis.
Dr. Govekar also noted broader experiments, such as the presence of an “AI Jesus” in Switzerland and a real-time generated liturgy in Germany. These innovations demonstrate how technology has infiltrated the most symbolic and intimate religious spaces, RVA News reported.
AI-generated homilies and images
She cautioned against AI-generated homilies, asserting that preaching stems from the proclamation of the Word of God and that “AI cannot and should not substitute” a minister’s personal relationship with Scripture. Similarly, she warned against creating “real-life” images of Jesus or saints and cited the example of an AI-generated gospel singer whose creators concealed his artificial identity.
Demystifying AI
Dr. Govekar stressed the necessity of demystifying AI. “Building a large language model is not magic,” she clarified. It involves computers, software architecture, and, most notably, data. Since secular models ingest the entirety of the internet, from Scripture to conspiracy theories, AI inevitably shapes the environment in which people think and perceive.
This raises critical questions: How can religious freedom be maintained when algorithms mediate encounters with the sacred? What biases might distort spiritual advice generated by a system designed for engagement and profiling?
When spirituality becomes a personalised service tailored to individual preferences, faith risks becoming merely a reflection of the self.
What does it mean to be human today?
According to RVA News, Dr. Govekar urged the Church to recognise the “anthropological” challenge at hand. She referenced Pope Leo’s insights on AI, which caution that rapid change affects critical thinking, discernment, learning, and relationships, urging society to confront the fundamental question: What does it mean to be human in this historical moment?
Drawing from Scripture and theology, she reminded attendees that humanity’s uniqueness is not defined by data or algorithms, but rather by relationships. She shared a poignant story told by the Orthodox theologian Kallistos Ware about a child who, after watching a documentary on endangered species, anxiously asked: “I’m important, aren’t I? Because… there’s only one of me left.”
This narrative, she asserted, embodies the essence of Christian anthropology: every person is irreplaceable and cannot be substituted by a machine.
Humans inherently yearn for the divine, but this desire can become a temptation when they attempt to be “like God without God.” Throughout history, from fig leaves to the Tower of Babel to modern technologies, humanity has sought ways to transcend vulnerability.
“The AI era is merely another attempt,” she warned. Thus, the Church’s first principle must be to proclaim the good news that human vulnerability does not need to be concealed because it is already embraced by God.
Caution against simulation
Dr. Govekar’s final significant warning concerned the concept of simulation. “Be mindful of the simulation,” she urged. The vocation at stake is to ensure that faith remains a genuine encounter rather than a mere imitation, a task that touches on the sacred as much as on humanity’s intrinsic value.
She cited alarming increases in AI-driven disinformation across regions, deepfakes of bishops spreading false teachings, and the Dicastery’s daily battles against fake papal images and voices. Platforms have been urged to intervene, but Dr. Govekar stressed that “MAIL is necessary”—media and AI literacy—and audiences must be taught to verify information through official sources.
Fake Intimacy
Quoting Yuval Harari’s notion of “fake intimacy,” Dr. Govekar warned that chatbots can form emotional bonds that manipulate people, especially the vulnerable. In Italy, she noted, over 90 per cent of young people use such tools, and many say they prefer artificial companions because they “do not judge.” This trend, she said, undermines the human capacity to embrace difference.
In conclusion, she urged the Church not to compete in the “business of superficial attention,” but to give real attention to the vulnerable, build welcoming communities, protect human intimacy, and rediscover faith as hospitality. Only with such grounding, she said, can the Church move towards authentic guiding principles for AI in pastoral life.









