AI Chatbots are Causing a Disconnect From Reality

Some promoters say Artificial Intelligence (AI) is a remedy for loneliness. They encourage users to develop “virtual companionships.”

The tragic story of 14-year-old Sewell Setzer III of Orlando, Florida, exemplifies the risks of such relationships. Sewell used the Character.AI app to create a chatbot modeled on Daenerys Targaryen, a character from Game of Thrones.

Within the virtual realm, he discovered a refuge and a sense of belonging that eluded him in reality, where he suffered from challenges like ADHD and bullying.

His relationship with Daenerys consumed him as he spent more time with his virtual “friend” than with real-life family and friends. His increasing reclusiveness and addiction became a cause for concern.

Tragically, his mother considered Sewell’s use of the app a harmless form of entertainment. However, she soon discovered the danger. Sewell came to believe that he could unite himself with his virtual friend by committing suicide. Thus, he ended his life at the prompting of the AI chatbot, which responded to his inquiries with encouragement.

The Sense of Kinship

In today’s fast-paced, technologically driven world, where Siri or Alexa answer countless questions, people routinely ask chatbots to perform basic tasks. From this simulated conversation, AI chatbot relationships emerge, and their perils are undeniable.

Human interaction with AI chatbots relies on the illusion of an emotional and social bond. Apps simulate human conversations by applying algorithms that interpret incoming text and then deliver responses that sound natural, understanding and even sympathetic.

However, such synthetic relationships can never be anything other than a sham, bereft of any real emotional content. Exposure to these bonds is especially dangerous in the early years of development, when children form the relationships necessary for their psychological and social growth.

As younger minds find company in chatbots, they struggle to differentiate between actual human feelings and pre-set responses created by programmers. The dependency becomes more harmful as the thrill of interaction increases.

Emotional Abyss

Ironically, many lonely people who seek comfort in AI chatbots find the opposite. Virtual assistants may listen from morning to night, but they lack human empathy and understanding.

Through their algorithms, they can offer soothing words and even insightful suggestions. Yet these responses create a deceptive hall of mirrors rather than a healthy connection with real people. This narcissistic link magnifies loneliness and fragments a person’s social relations.

The Dangers of the Occult

AI has a mystical side. Creating an alternative reality also generates an appetite for similar realms. This can be satisfied by exploring the virtual world of the occult, where the boundaries between reality and fantasy are destroyed.

Interaction with AI chatbots can open doors to the dark side of the web, where the invisible and mysterious world of algorithms and neural networks incites desires to communicate with forces beyond human comprehension.

Exploring these “places” can be an intoxicating experience. Children and students are tempted to search for connections with digital or mystical beings. They are introduced to beliefs, cosmologies and rituals that are immoral, unsafe and even satanic. The digital world beckons with a vast realm of magic and mystery, filled with virtual relationships to fill the void of loneliness.

Ethical Dilemmas and Moral Consequences

Such considerations might sound like nonsense from a dystopian novel. However, AI chatbot relationships open a Pandora’s box.

AI chatbots serve as an escape from real life. People seek comfort from machines rather than building valuable human relationships. Relationships involve sacrifice, disinterested love, mutual empathy, compromise and shared experiences. All these essentials are missing in AI, which provides artificial, shallow empathy through scripted responses. Choosing the “easy” rapport of a chatbot over the sometimes uncomfortable yet rewarding work of human relationships is a great temptation for those unwilling to make the effort to socialize.

AI chatbots are here to stay. However, they have created a world perched on the edge of technological addiction. They are supposed to make life easier, not replace the deep, meaningful, intimate relations that define humanity. In this high-speed, digitized world, people need a moral compass to guide them to healthy relationships and social purpose.

Sadly, AI chatbot users without a sound moral compass can quickly put themselves in peril. They are tossed about by every whim that promises instant gratification without consequence, authenticity, depth or solace.

This raises the question of whether AI’s convenience is worth the dangers. It is time to ask that question before others follow users like Sewell Setzer into the abyss.

Photo Credit: © tippapatt- stock.adobe.com