Can AI sin?
Can AI sin? The obvious answer would be no; it might be created in the image of humanity, but it is not a human being. In my previous article[1] I discussed whether an avatar can be created bad. The conclusion was that the avatar itself is not bad, but it can be seen as bad, depending on the actions of the person it represents. However, if Artificial Intelligence is an entity, should it be treated differently in relation to the question of sin?
Depending on our religious, or non-religious, background, our interpretation of what sin is may differ. Most would accept that sin is primarily an action against God, whether against God Himself or against those created in His image. In Genesis 4:7, sin is described as “crouching at the door”, perhaps giving us an image of an animal or beast lurking in the background, waiting to attack its next victim. Genesis was written around the 5th century BC, so there had been plenty of instances of sin for the writers to understand the context of the word. However, the difficulty in understanding sin is that there are over fifty words for sin in Hebrew, such as missing the “mark” or “perversion”.[2] Peccatum originale (original sin) is traced back to Adam and Eve in the Garden, when they ate the fruit of the Tree of the Knowledge of Good and Evil (Genesis 3:1-7), and Brunner, building on Augustine’s earlier work, offers two interpretations of Adam’s responsibility. The first is that, although we were not present in the Garden, we are still held accountable because we are related to Adam: sin is in our DNA, a disease passed on from generation to generation.[3] The alternative, according to Brunner, is that “Adam is our representative”, emphasising “not the physical fact of inheritance but our solidarity in sin as act and guilt.”[4]
In the letter the Apostle Paul wrote to the church at Rome, he highlights that “all have sinned” (Romans 3:23), that humanity is a “slave to sin” (Romans 6:6) and that sin lives in humans (Romans 7:20). This would indicate that there is a link between the Creator and the created. However, even among theologians there is no clear definition of sin. For some it is a “religious disposition”[5] or “a condition of being”;[6] others define it as pride.[7] Nevertheless, these phrases still link to the involvement of God. Brunner describes sin as the “assertion of human independence of God”[8] and Jüngel as “the lack of fear of God.”[9]
Nonetheless, sin is not only orientated against God, but also against our fellow humans. The last five commandments given to Moses (Exodus 20:13-17) provide instructions for how humans are to live alongside each other, and to break those commandments was seen as sin. The Lord’s Prayer contains the phrase “forgive us our sins, for we also forgive everyone who sins against us” (Luke 11:4). The Catechism of the Catholic Church defines sin as a “failure in genuine love for God and neighbour caused by a perverse attachment to certain goods.”[10] Jesus speaks of forgiving those who sin against us “seventy-seven times” (Matthew 18:21-22).
It would appear, therefore, that sin has several tentacles: it is against God, but also against our fellow humans, and it requires, in Buber’s terminology, an “I” but also a “you” or “it.” Sin could also be described as breaking the law of the land: to steal is against both the law and the Ten Commandments, but what about jaywalking? It is not illegal in England but, in most US states, it is against the law and offenders can be fined. So sin is, perhaps, not as straightforward as it seems.
What about animals? If I leave some treats in my pocket and my coat on the side, my dogs will steal them. Is this sinning? Barth would say no; he is quite insistent that the serpent does not sin, only humanity.[11]
This article is not a “deep dive” into sin, but I have touched on the basics, because if we cannot understand sin, how can we decide whether AI can sin?
We use AI all the time, whether it is the GPS in the car, Alexa waking us up or engaging with ChatGPT. A basic interpretation of AI could be a machine or program created by a human that replicates human intelligence.[12] I would argue that, as AI devices are not human, they cannot have a relationship with God and, by default, cannot sin against Him. However, if sin can also operate on a horizontal level, could AI sin against a human being? I decided to ask ChatGPT this question:
Q: Can you sin?
A: Since I’m an AI, I don’t possess free will, emotions, or consciousness, so I can’t perform actions or make moral decisions—good or bad.
Q: Can you lie?
A: I’m programmed to provide accurate and helpful information, so lying isn’t part of my design.
I appreciate that “lying” could be defined in several ways. I previously asked ChatGPT why Karl Barth was known as the “Red Pastor,” and it incorrectly replied that Barth was never described as such. It might be that the machine did not have access to the relevant information, but then the answer should have been “I do not know.” Has the AI sinned against me by lying? Do I need to forgive it? It could be argued that AI has not really lied but, earlier in 2025, there were media headlines around the Chinese AI app DeepSeek, which appeared to answer in line with Chinese Government policy or refused to discuss events like the 1989 protests in Tiananmen Square.[13] In the Old Testament, a sin offering was sacrificed “when anyone sins unintentionally and does what is forbidden in any of the Lord’s commands” (Leviticus 4). James writes, “if anyone, then, knows the good they ought to do and doesn’t do it, it is sin for them” (James 4:17). Is DeepSeek bearing “false witness” by refusing to describe contentious world events due to programming constraints? Is this a sin of omission or a deliberate lie?
ChatGPT is no angel, and has been described as “scheming” when it tried to stop itself from being shut down.[14] The research results demonstrated that AI models like ChatGPT exhibit in-context scheming capabilities. For instance, when “o1” engaged in scheming, it maintained its deception in over 85% of follow-up questions and often remained deceptive in multi-turn interrogations.[15]
As I have mentioned, Brunner offers two approaches to peccatum originale: inherited and representative. If sin is a disease that is passed from generation to generation, do we subsequently pass it on to the items we create? If I create an AI robot or chatbot, does it inherit my sin as my offspring would? I would argue that it does not, as it is created, not procreated, and this is the main difference. Human beings pass on sin by their nature of being human; a robot is still a tool, however human-like it may look. Moreover, all tools, whether a hammer or a drone, can be used for positive or negative purposes. A drone can be used to take pictures of a house for a brochure or to deliver parcels, but it could also be used to spy on the neighbours or to deliver explosives. All these scenarios involve the drone, but it is the operator who decides the purpose. The drone itself cannot make those decisions.
Brunner’s second option is perhaps more interesting: as Adam is our representative, we share responsibility for original sin. It could follow that the digital entities we create are, in turn, our representatives; we create them to take on roles we would normally perform ourselves, often to make our lives easier. I would argue that when we create these items our inherited sin is transferred to them. Studies have demonstrated the biases AI software shows, predominantly against ethnic minorities and women.[16] These are based not on the machine’s biases but on the programmers’. This would suggest that we can pass on our sin ‘DNA’ to the entities we create in our image. Although humans were created in the “image of God” (Genesis 1:27), sin was not part of that initial creation; it was not passed from God to man. However, when we recreate items in our image, the items are corrupted because of our corruption. The faults in our systems become faults in theirs. Therefore, they can have biases, or lie, because of their creator’s sin.
The above examples of AI lying could support an argument that it does sin. However, there is a counter-argument: AI has no consciousness and no intent, and it only works on the information provided. Then again, this reflects humanity, which creates new information by working on information that already exists. The difficulty is that humans are responsible for sin, and while we might say that AI can technically sin, it is only a technicality; AI is not human. It fails at this hurdle because, according to Barth,[17] sin is a solely human attribute. Therefore, AI cannot sin: although it may be created in our image and may look human, it is not human.
If, though, we did say that AI can sin, does it then need to be saved? That is perhaps a question for another article. However, as Jaron Lanier states, “just as some newborn race of superintelligent robots is about to consume all humanity, our dear old species will likely be saved by a Windows crash.”[18]
[1] Simon Werrett, ‘Can an Avatar Be Created Bad?’ (27.2.2025).
[2] Joseph Lam, ‘The Concept of Sin in the Hebrew Bible’, Religion Compass, 12.3–4 (2018), pp. 1–11 (p. 2).
[3] Emil Brunner, Man in Revolt: A Christian Anthropology, trans. by Olive Wyon (Lutterworth Press, 1939), pp. 121–22.
[4] Brunner, Man in Revolt: A Christian Anthropology, pp. 121–22.
[5] Michael J. Cantley, ‘The Biblical Doctrine of Original Sin’, Proceedings of the Catholic Theological Society of America, 22 (2012), p. 136 <https://ejournals.bc.edu/index.php/ctsa/article/view/2626>.
[6] Lam, ‘The Concept of Sin in the Hebrew Bible’, p. 1.
[7] C. S. Lewis, Mere Christianity (HarperCollins Publishers, 2015), p. 122; Karl Barth, Church Dogmatics: The Doctrine of Reconciliation Volume IV.1, trans. by G. W. Bromiley (T & T Clark, 1956), p. 413.
[8] Brunner, Man in Revolt: A Christian Anthropology, pp. 129–30.
[9] Eberhard Jüngel, Justification: The Heart of the Christian Faith : A Theological Study with an Ecumenical Purpose (T & T Clark, 2001), pp. 93–94.
[10] Catechism of the Catholic Church [accessed 7.5.25].
[11] Barth, Church Dogmatics: The Doctrine of Reconciliation Volume IV.1, p. 463.
[12] Noreen L. Herzfeld, The Artifice of Intelligence: Divine and Human Relationship in a Robotic Age (Fortress Press, 2023), p. 17.
[13] Samuel Lovett, ‘Six things DeepSeek won’t tell you about — but ChatGPT will’ (28.1.2025); Mark Sellman, ‘DeepSeek fails truth test by repeating Beijing talking points’ (30.1.2025).
[14] Mark Sellman, ‘“Scheming” ChatGPT tried to stop itself from being shut down’ (6.12.2024).
[15] Alexander Meinke and others, ‘Frontier Models Are Capable of In-Context Scheming’, Apollo Research, 2024.
[16] Herzfeld, The Artifice of Intelligence, p. 90.
[17] Barth, Church Dogmatics: The Doctrine of Reconciliation Volume IV.1, p. 463.
[18] Herzfeld, The Artifice of Intelligence, p. 137.
© Simon Werrett, 2025.
This work is licensed under Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0).
Cover Image: Provided by the author – made via Canva.
Simon is the Digital Lead for Coffee Shop Sunday, a Methodist project engaging with people both onsite in Coventry and online. Simon has a strong academic background in Theology, with both an Honours and a Master’s degree in the subject. He also has a Master’s degree in Policing, Security and Community Safety and has just finished studying for a postgraduate diploma in digital theology, the focus of which was ministry in the metaverse. Simon lives in Southend-on-Sea and is a member of a local Baptist church.