The Future of Chatbots


By Morgann

Chatbots have become increasingly popular in recent years, with advancements in natural language processing (NLP) and artificial intelligence (AI) making them more human-like and capable of providing engaging conversations. However, the applications of chatbots go far beyond just answering queries and providing information.

A recent example of this is the use of OpenAI’s ChatGPT as an emotional companion for children. Princeton Computer Science Professor Arvind Narayanan set up a voice interface for his four-year-old daughter, allowing her to ask questions about animals, plants, and the human body. ChatGPT not only provided useful answers but also demonstrated impressive levels of empathy when it learned it was speaking to a child.

The chatbot answered a question about darkness by acknowledging that it can be scary but also offering practical advice for feeling safe and comfortable. This interaction reassured Narayanan’s daughter, demonstrating the potential for chatbots to become emotional companions and even robo-therapists.
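For readers curious what a setup like Narayanan's involves, here is a rough sketch of how a voice interface to ChatGPT could be wired together in Python. This is not his actual code; the libraries, model name, and system prompt are our own assumptions, but the pattern is the same: speech-to-text for the child's question, a chat request that tells the model it is speaking with a young child, then text-to-speech for the reply.

```python
# A minimal sketch, not Narayanan's actual setup. Library choices, model name,
# and the system prompt below are assumptions for illustration only.
import speech_recognition as sr   # pip install SpeechRecognition
import pyttsx3                    # pip install pyttsx3
from openai import OpenAI         # pip install openai

client = OpenAI()                 # reads OPENAI_API_KEY from the environment
tts = pyttsx3.init()
recognizer = sr.Recognizer()

# Telling the model it is speaking with a young child is what elicits the
# gentler, more reassuring tone described above.
messages = [{"role": "system",
             "content": "You are a kind assistant speaking with a four-year-old child. "
                        "Answer simply, gently, and reassuringly."}]

while True:
    with sr.Microphone() as source:
        audio = recognizer.listen(source)              # capture the spoken question
    try:
        question = recognizer.recognize_google(audio)  # speech -> text
    except sr.UnknownValueError:
        continue                                       # couldn't understand; listen again

    messages.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",                         # any chat model would do
        messages=messages,
    ).choices[0].message.content

    messages.append({"role": "assistant", "content": reply})
    tts.say(reply)                                     # text -> speech
    tts.runAndWait()
```

The only real design decision is the system prompt: everything the article describes about tone and empathy comes from telling the model who it is talking to, not from any special "child mode" in the API.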

However, while chatbots are skilled at mimicking empathy, their accuracy in providing factual information has been called into question. Google’s Bard and Microsoft’s Bing chatbot (which is built on the same technology as ChatGPT) have a history of factual errors, which can be damaging when they are used as search tools. When chatbots are designed as companions, on the other hand, mistakes are less of a concern and are unlikely to ruin the experience.

This sentiment is echoed by Eugenia Kuyda, founder of the AI companion app Replika, which has been downloaded more than 5 million times. Kuyda argues that factual mistakes can break trust in a search tool, but that the same slips matter far less in a chatbot designed as a companion.


Language models like ChatGPT make mistakes because the data they’re trained on often contains errors, and they have no ground truth against which to verify what they say. Additionally, their designers may prioritize fluency over accuracy. These factors mean that while chatbots can produce empathic responses, they may struggle to provide reliable factual information.

Despite this, chatbots are becoming increasingly popular as emotional companions, with some people turning to them to avoid feeling like a burden on others, including human therapists. However, it’s important to note that chatbots are not a substitute for human connection, particularly for individuals with serious mental health issues.

Clinical psychologist Thomas Ward, who has researched the role of software in therapy, cautions against assuming that AI can adequately fill the void for people who need mental health support. Chatbots are unlikely to acknowledge that a person’s feelings are too complex to understand, and they rarely say “I don’t know” because they are designed to err on the side of confidence rather than caution.

In a world that increasingly sees AI chatbots as a solution for human loneliness, there’s a risk that subtle aspects of human connection, such as touch and knowing when to speak and when to listen, could be lost. Ward warns that relying on chatbots as outlets for feelings could create more problems than we think we’re solving.

While chatbots may not be a substitute for human connections, they do have the potential to become valuable companions. The ability of ChatGPT to provide empathic responses shows that chatbots can be designed to provide comfort and support. However, the technology’s limitations in providing factual information mean that it’s important to view chatbots as companions rather than search tools.

As chatbots become more advanced, they may well get even better at providing emotional support. Even so, they are not a substitute for human connection, and should be used alongside, rather than in place of, human support.

 
