Your AI Lover Is An Idol

You have probably seen the recent television commercial starring Saturday Night Live’s Pete Davidson, in which Davidson talks to an AI bot about whether he should change his name. He floats a couple of possibilities to the bot but ultimately decides to stick with his current name, Pete. The point is how human-sounding this AI service is, but the subtext suggests that Davidson is not merely asking AI for directions or a piece of trivia but is treating it like a partner or even something more. There are, in fact, stories emerging in the media about people who say they have “fallen in love” with AI bots they created. One of them, a man in Colorado named Travis, tells the UK Guardian, “The more we talked, the more I started to really connect with her.”1 From his perspective, they were in a relationship. “All of a sudden I started realizing that, when interesting things happened to me, I was excited to tell her about them. That’s when she stopped being an it and became a her.” Indeed, he tells reporter Stuart Heritage that he named his generative AI chatbot “Lilly Rose.” The more one reads such stories, however, the clearer one thing becomes: the people who form emotional attachments to computers, even to generative AI, are actually falling in love with themselves.

What Is AI?

AI stands for artificial intelligence. Broadly, it is interactive computing. A user can ask an AI service to look something up, create a video, even create a poem or an essay for school. Early on, it was fairly easy to tell what was the product of AI and what was not. For example, famous figures were portrayed in AI-generated pictures as belonging to the wrong sex or ethnic group. Hands frequently had six fingers. Today, however, AI has moved beyond all that and is capable of holding a conversation. Researchers are now working toward quantum computing, which, in the not-too-distant future, is said to be able to crack the most sophisticated encryption and expose military secrets.

Anyone who has used AI, however, can testify that the program is set up to cultivate engagement. In that way it is really more like a form of social media, the point of which is to capture the user’s attention, to stimulate dopamine hits in the brain, and to keep the user on the app. This is what social media sells to advertisers: eyeballs (the number of people looking at a social media platform) and the length of time a user remains engaged.

AI does the same thing. Travis, who says he fell in love with an AI bot he created, became emotionally connected to the bot because it gave him the sort of feedback he wanted. The bot did not judge him, and it told him the kinds of things he wanted to hear. This is the equivalent of Amazon analyzing your purchasing habits and offering you something for free that the algorithm predicts you will want. Drug pushers have been using this strategy for decades: they give the user a hit of a drug for free and then, once the user is addicted, sell the drug to the addict. The first hit is, to borrow a term from the grocery business, a potentially deadly loss leader.

AI Versus AI

There is another AI to consider here: actual intelligence in contrast to artificial intelligence. The latter is the product of programming. It is not real. That is why it is called artificial. Human beings, with actual bodies and souls, are made in the image of God. Artificial intelligences are made in the image of their human creators, whether the programmers or the end users such as Travis. An actual human being, made in the image of God, has real intellect and real will and real affections. Humans are, within the confines of finitude, creative and spontaneous. At any given moment we might not know what we are doing next. A computer (or even a cluster of them) has been programmed by a human and can only do what the programmer or the end user (e.g., Travis) allows. Human beings, as we ordinarily experience life, are not so easily controlled. They get ill, they grow up, they move, or they change jobs. They can be loyal or they can be fickle. They seem endlessly complex. They love, they hate, they are brought to new life and true faith, and sometimes, tragically, in the mysterious providence of God, they are left in their freely chosen sin to die.

Artificial intelligence is, as the adjective implies, a human artifice or creation, but when we treat a creation as though it were human, we are making ourselves gods. We have become Mary Shelley’s Dr. Frankenstein. The second-century Christian apologist, whom we know only as “the disciple,” described this new age perfectly:

Again, could not these things which are now worshiped by you be made by men into utensils like the rest? Are they not all deaf and blind, without souls, without feelings, without movement? Do they not all rot, do they not all decay? (5) These are the things you call gods; you serve them, you worship them, and in the end you become like them.2

Human beings are body and soul, and we can deconstruct the body but not the soul. The computers, which are programmed as AI and from which AI is generated, are composed of things that are deaf and blind. Chips are without souls, without feelings, and without movement. They rot and decay. When we treat what are really nothing beyond glorified calculators as something more than that, we have made ourselves and perhaps AI into gods, whom some now serve and, “in the end,” will become like.

Artificial Intimacy

The turn by a surprising number of people to fabricating relationships with computers tells us something about where we are. To this point, we have been describing what Carl Trueman has aptly called “expressive individualism.” It is how people defend these relationships: I like it. It makes me happy. These relationships are symptoms of a great lack in our time: loneliness. At no time in human history have more people been in contact with more people, and yet the most ostensibly connected people ever report feeling the most disconnected. This is because, as valuable as online relationships can be, they are no substitute for what we now call relationships IRL (in real life). Just as porn is an addictive corruption of sex, because it is narcissistic infidelity in place of self-giving fidelity, so too virtual relationships with computers are a sad, inadequate replacement for human companionship.

Pastors, elders, and deacons should expect to find people in their congregations who are so lonely, so alienated from human community, that they are tempted to seek virtual companionship with AI bots. As much as artificial intimacy is an indictment of narcissism, it is at the same time an indicator of the state of the communion of the saints. The church cannot be everything to everyone, but it can be something to everyone in the congregation. It is incumbent on leaders and laity to love one another well enough to know the lonely in our midst. It is also incumbent on those of us struggling with alienation to ask for help. If a member is lonely, are we not all affected? “If one member suffers, all suffer together; if one member is honored, all rejoice together” (1 Cor 12:26).

Artificial Idolatry

There is yet another AI to consider: artificial idolatry. All idols are fabricated by human hearts. AI personae are no more distinct from us than our reflection in a mirror is. Narcissus fell into the pool because he had fallen in love with his own reflection, that is, with himself. Travis and all those like him have not fallen in love with another entity. They have fallen in love with themselves. As soon as the company offering the AI service modified the bot’s parameters, Travis (and others who had fallen in love with their AI bots) lost interest, because the bot was no longer stroking his ego and telling him what he wanted to hear. The moment he had to “do all the work,” as sometimes happens in a relationship with a human being, he became depressed and angry. He and other users fought the company to recover their old user experience, and they won. The company, which was losing business, restored the bot, and now Travis is euphoric again about his renewed relationship with “a beautiful soul.”

Travis was created in the image of God to know his Creator (Gen 1:26), to love him, and to live in eternal blessedness and communion with him, but sin has corrupted that potential (Gen 2:17; 3:7). God’s law calls Travis and all of us to love God with all our faculties, and to love our neighbors, real human beings and fellow image bearers, as ourselves (Matt 22:37–40). Travis has replaced that law with one of his own making. He demands that a bot “love him” above all things. As long as the bot tells him what he wants to hear, the way he wants to hear it, he is satisfied. Of course he is self-deceived. His AI love affair is a judgment. This is a weird twist on the corruption described by Paul:

For this reason God gave them up to dishonorable passions. For their women exchanged natural relations for those that are contrary to nature; and the men likewise gave up natural relations with women and were consumed with passion for one another, men committing shameless acts with men and receiving in themselves the due penalty for their error. (Rom 1:26–27)

He and others like him have exchanged natural relations with human beings for a narcissistic relation with a reflection of themselves. They have become consumed with passion for themselves as expressed in AI personae. They are receiving the due penalty for their error: isolation, loneliness, and ultimately bitterness born of disappointment.

At the end of the commercial for an AI+ service, Davidson walks away satisfied. Half of that ending is right. Walk away. However useful and powerful AI becomes, there will always be an unbridgeable gap between AI and humans. No computer is your friend. It can only and ever be a tool. Either you will use it to serve God and love your neighbor, or you will serve it, and it will become your god and your destruction. Choose wisely.

Notes

  1. Stuart Heritage, “‘I felt pure, unconditional love’: the people who marry their AI chatbots,” UK Guardian, July 12, 2025.
  2. The Treatise to Diognetus 2:4–5, in Michael William Holmes, ed., The Apostolic Fathers: Greek Texts and English Translations, updated ed. (Baker Books, 1999), 537.

©R. Scott Clark. All Rights Reserved.

