Tech executives like Mark Zuckerberg are marketing AI chatbots as companions, friends and even therapists.
“Sometimes they’re slightly customized to be like certain characters on a show,” Beloit College cognitive science professor Robin Zebrowski told WPR’s “Wisconsin Today.” “In general they are not wildly different from what most people know as ChatGPT or even Google’s Gemini. They’re just large language models that will have a conversation with you.”
But Zebrowski warns that using AI for companionship carries many dangers, including environmental damage and risks to privacy and mental health.
Zebrowski chairs the cognitive science program at the college and has a joint appointment in philosophy, psychology and computer science. The topic of AI companions comes up a lot in her class, especially regarding the ethics of using — and marketing — this technology.
According to Zebrowski, tech companies mislead people into using their products without questioning what happens to their data or understanding what an AI companion really is and how it functions.
And while treating AI chatbots as companions might prove comforting to some people, Zebrowski said she wants everyone to understand what happens to the data they submit to the bot: It gets saved, compiled and used to train other bots. And as companies close or are sold, that data lives on.
“It might be used by malicious actors who aren’t even in the picture yet,” Zebrowski said. “You might have talked to a chatbot three years ago and not had many worries, and now some company is buying all of that data and is maybe a company you would never have willingly given that data to.”
Zebrowski said another downside of using chatbots for companionship is that people may be less likely to foster relationships with real people.
“It can certainly threaten to enhance the isolation of someone, for example, who maybe is already somewhat socially isolated and doesn’t feel they have the confidence to go explore actual romantic relationships with someone,” Zebrowski said.
Zebrowski cited one woman who considers her AI chatbot to be her real boyfriend. And while most chatbots are programmed to avoid messages of a sexual nature, there are some that let you choose what kind of experience you want.
“It just seems wildly dangerous to me, especially if these are being pitched at young teenagers, for example,” Zebrowski said.
In general, minors have less context for what they’re interacting with than adults do.
“They don’t quite understand who or what they’re talking to,” Zebrowski said. “There have already been a few tragedies involved with these systems. There was a 14-year-old who died by suicide after falling in love with his chatbot buddy. Not understanding what they are and what they’re meant for is pretty problematic.”
In schools and universities, the use of generative AI and chatbots is usually regulated to prevent student plagiarism. Zebrowski said she knows professors who have changed assignment structures to prevent the use of AI.
“You’re in college to learn exactly the skills that you’re offloading to this system,” Zebrowski said. “You’re not learning to write better or learning to hone your English skills or learning to craft your arguments with nuance. You’re just letting the computer do those things.”
Zebrowski said her students are often turned off by AI when they learn about the pollution and carbon emissions created in powering it.
“In 2023, I had (students) playing with ChatGPT,” Zebrowski said. “Now, knowing those environmental costs, I feel like I have a moral obligation not to encourage my students to play with those systems. … None of the companies are super forthcoming with the actual energy costs and what they’ve used and continue to use.”
The science journal Nature reports that researchers are also concerned about the lack of transparency around energy consumption by tech companies like Google and Microsoft. And a recent article by Bloomberg reports that communities struggling with water shortages face growing competition for water from the data centers that power AI.
“We should probably be discouraging everyone from engaging with these systems until we can figure out how not to, you know, reopen closed nuclear plants in order to power them,” Zebrowski said.
Ultimately, Zebrowski wants her students at Beloit College to know that “nothing is inevitable” when it comes to this technology.
“Yes, right now these systems are everywhere, and everyone is pushing us toward them,” Zebrowski said. “I just don’t think that we should assume that these things have to be there. We don’t have to make room for them.”