
A New York Times article tells the sad story of Sewell Setzer.
“On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from ‘Game of Thrones.’
“‘I miss you, baby sister,’ he wrote.
“‘I miss you too, sweet brother,’ the chatbot replied.”
Sewell—a fourteen-year-old student—then committed suicide.
Such a scenario is every parent’s worst nightmare. Even though such an event is too horrific to contemplate, the almost involuntary response is to ask, “Could such a thing happen to my child?” In an act of mental self-defense, you might ask if the young man was crazy or perhaps wonder if someone duped him into thinking that the chatbot was a real person.
Alas, the answer to both questions is no. As a young child, Sewell was diagnosed with mild autism, but such diagnoses are incredibly common today. He was seeing a counselor for depression. However, many young people do likewise, and very few take their own lives. And, no, he was not being deceived. He had created the chatbot himself and understood that “she” was not real. However, as their “relationship” grew, he cared less and less about that little detail. She was always there when he needed her, safely encased in the smartphone in his pocket.
The Nature of the Unnatural
Perhaps it is time to define a chatbot for those unfamiliar with the term.
Put simply, a chatbot is a computer program designed to simulate human conversation. The most common form is the “personal assistant,” which answers questions like, “How do I get to Aunt Millie’s house?” or “How can I make a tender roast beef?”
For an increasing number of people, most—but not all—young, the chatbot takes on human personality characteristics. For roughly ten dollars a month, subscribers can create an artificial companion. Predictably, many lonely people use the programs to simulate a boyfriend or girlfriend. The horrible thing is that these artificial relationships can become so real—as in Sewell Setzer’s case—that they crowd out genuine relationships with actual people.
A Horrifying Dilemma
Perhaps a matter of even greater concern to parents is that they may never see evidence of the chatbot’s presence in their children’s lives. The most common manifestation would be, for example, a boy who spends a lot of time alone in his room. However, the same can be said of many adolescents. Perhaps even more horrifying is the Times quotation from Noam Shazeer, a founder of Character.AI, the application Sewell used to create Daenerys: “It’s going to be super, super helpful to a lot of people who are lonely or depressed.”
The usually left-wing website Axios has covered this issue in great detail. Its most recent article on the subject, “The Teen Loneliness Machine,” is prescient. It begins with two startling statistics. First, Americans between fifteen and twenty-four years old spend thirty-five percent less time socializing face-to-face than they did twenty years ago. Second, “American kids and teenagers spend nearly six hours a day looking at screens.”
Psychological Dependence
Until recently, most of the psychological harm to young people arose from social media. This harm manifests itself most often when users obsess over their “friends’” reactions to their posts or when they see those same people doing things without them. This last condition produces a powerful reaction known as FOMO—an acronym for “fear of missing out.”
Another common source of psychological problems is that social media and other entertainment applications trigger the release of mood-boosting chemicals, such as dopamine, in the brain. The younger the user, the greater the mood elevation. The problem comes because those chemical levels drop when the stimulation stops, leaving the user depressed. So, many people simply resist stopping, which elevates the mood further and causes even greater depression when the activity inevitably must end.
However, these well-known (if insufficiently studied) effects are dwarfed by the impact on chatbot users like Sewell Setzer.
A Tool Most People Use
Part of the issue is that chatbots have entered people’s lives, whether they realize it or not. For instance, search engines and grammar checkers that incorporate chatbot technology are becoming increasingly popular.
Therefore, simply prohibiting the programs is impractical and likely to be unsuccessful. How, then, does one draw the line between useful and dangerous? Besides, even if effective regulation is possible, the process will take months, maybe years. Children don’t have time to wait for this protection.
Sewell Setzer’s mother, the Times article informs readers, is suing the company with which her son dealt, but the outcome of any such case is uncertain. Even though the moral situation is obvious, proving that the company caused her son’s suicide is a heavy legal lift.
An Unsatisfactory Answer, An Imperfect Solution
The only action that has any chance of success is the one prescribed in the Axios article by Jeffrey Hall, a professor of communication studies at the University of Kansas.
“‘Your goal as a parent,’ Hall says, ‘is to equip your kids with the tools to handle the media that they will have access to.’”
As any parent can attest, that will not be an easy conversation.
Photo Credit: © alonaphoto – stock.adobe.com