Is AI Sentience Becoming A Reality Or Are Machines Just Getting Better At Conversation?
The question of whether an AI has become sentient lingers in the back of many people's minds. Recently, Blake Lemoine, a Google engineer who was subsequently placed on leave, claimed that Google's LaMDA chatbot was sentient. Lemoine was conducting ethics research into the chatbot when he formed his opinion, and he claimed not only that the AI was sentient, but that it wished to be acknowledged as such.
It is nothing new for someone to believe an AI chatbot has become self-aware. In a recent report from Reuters, Replika Chief Executive Eugenia Kuyda stated, "We're not talking about crazy people or people who are hallucinating or having delusions. They talk to AI and that's the experience they have."
The AI Kuyda is referring to is Replika's line of bespoke avatars, designed to be companions for people who want someone to talk to and to listen to them. Kuyda says it is not uncommon for people who use entertainment chatbots to feel this way.
"We need to understand that exists, just the way people believe in ghosts," remarked Kuyda. "People are building relationships and believing in something."
Replika is part of a much wider conversational AI industry estimated to have generated more than $6 billion in global revenue last year. Most of that revenue went toward business-focused chatbots for customer service, but industry experts expect more social chatbots to emerge as companies get better at blocking offensive comments and building programs that are more engaging.
As the technology grows more sophisticated and better able to hold intelligent conversation, one concern is where it might lead people who rely on it as a social outlet.
"Suppose one day you find yourself longing for a romantic relationship with your intelligent chatbot, like the main character in the film 'Her'," stated Susan Schneider, founding director of the Center for the Future Mind at Florida Atlantic University. "But suppose it isn't conscious," she added. "Getting involved would be a terrible decision, you would be in a one-sided relationship with a machine that feels nothing."
Kuyda says warnings like this are offered in the hope that people using the chatbot will avoid addictive conversations. When users do express a feeling that their companion is more than just a program, Kuyda says the company tells them that the technology is in its infancy and that some responses may simply be nonsensical.
Kuyda said she recently spent 30 minutes with one customer who felt his Replika was suffering from emotional trauma. She told the customer, "Those things don't happen to Replikas, as it's just an algorithm."
What are your thoughts when it comes to AI chatbots and AI in general? Do you think we are anywhere close to seeing a truly sentient AI, or are we still years away? Let us know in the comments.
Top Image Credit: Gerd Altmann from Pixabay