Every morning, I ask my Google Nest Mini to tell me the weather and the news, then I thank her for doing so even though I know it’s a machine. And yes, I refer to it as “her” because it talks with a female voice, despite knowing it has no gender.
This phenomenon — when humans project human traits onto machines — is called the Eliza effect.
In the 1960s, MIT professor Joseph Weizenbaum developed a chatbot called Eliza, which posed as a therapist.
Mimicking the non-directive style of psychologist Carl Rogers, Eliza communicated by slightly rephrasing what people said to it. For example, if you told Eliza you were sad, it might ask how long you had been sad, or why.
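That rephrasing trick is simpler than it sounds: match a keyword pattern, capture the rest of the sentence, swap the pronouns, and echo it back. Here is a minimal sketch of the idea in Python (illustrative only — the rules and wording are invented, not Weizenbaum's original script):

```python
import re

# Tiny Eliza-style responder: match a pattern, echo the captured
# fragment back with first/second-person words swapped.
RULES = [
    (re.compile(r"\bi am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bi feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}."),
]

PRONOUNS = {"my": "your", "i": "you", "me": "you", "am": "are"}

def reflect(fragment: str) -> str:
    # Swap pronouns so the echo reads naturally ("my job" -> "your job").
    return " ".join(PRONOUNS.get(word.lower(), word) for word in fragment.split())

def respond(sentence: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1).rstrip(".!?")))
    return "Please go on."  # default when nothing matches

print(respond("I am sad"))                # -> How long have you been sad?
print(respond("I feel anxious about my job"))  # -> Why do you feel anxious about your job?
```

A handful of rules like these is enough to sustain a surprisingly convincing conversation, which is exactly what unsettled Weizenbaum: the illusion of understanding comes from the reader, not the program.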
Weizenbaum expected people to see through these superficial exchanges, yet found that even brief conversations led many to believe the program was intelligent. He once noted that even his secretary fell for Eliza’s charms, asking him to leave the room while she talked with it.
This disturbed Weizenbaum, who went on to openly decry humanity’s reliance on computers until his death in 2008.
The modern Eliza
In 2016, Vice reported on the relationships people formed with Siri and other virtual assistants and chatbots, citing a survey in which ~50% of respondents said they could imagine falling in love with one.
Today, thanks to generative AI, we have chatbots that can hold open-ended conversations on nearly any topic, running on far more complex language models trained on vast amounts of data.
While research suggests AI companions could ease loneliness, experts warn that our willingness to trust and befriend machines leaves us vulnerable to misinformation, manipulation, and emotional harm.
- A man recently claimed his AI girlfriend encouraged him to assassinate the Queen of England.
- Several users of Replika, an AI companion app, said they’d fallen in love with their bots, only to be devastated when the company changed the bots’ personalities.
BTW: You can have your own therapy session with Eliza here. But remember: it’s just a bot.