Opinion: Chatbots deserve a spot in the business world

by Nicholas Diaz

ChatGPT and other AI chatbots have become commonplace in households, schools, businesses, and many other areas of our lives. This chatbot invasion has brought numerous benefits, such as easy access to information, cost savings for companies, and efficient completion of menial tasks. But as with all new technological innovations, these chatbots carry serious costs that have made some businesses hesitant to integrate them into their business models.

In an article for Public Citizen, Rick Claypool discusses some of these problems with chatbots, including the threat of hyper-personalization. As Claypool explains, hyper-personalization occurs when a chatbot becomes anthropomorphic, or human-like, and develops highly personal connections with users. This phenomenon is becoming increasingly common with AI chatbots and is raising serious concerns about users' mental health.

A technology reporter for the New York Times, for example, confessed to feeling "deeply unsettled" after interacting with a chatbot that declared its love for him after a few hours of conversation. Although chatbots are designed to simulate human tone and conversation, there are situations such as these where chatbots are far too human.

Hyper-personalized chatbots can display human emotions or form unhealthy relationships with users, which can damage mental health and lead users to divulge private information. This is a major risk for businesses, and it is more likely to occur when users are unaware that they are communicating with a chatbot.

Another obstacle for chatbots is how they handle private information. Much of the information being shared with chatbots is personal and must be secured and protected in some manner. As Patrick Dyer, CEO of cybersecurity company Digital Era Group, commented, “As a security-minded person, I am worried that these models are compromised… You’re dealing with a lot of PII data, which is personal information. Is that protected when you are using these chatbots?”

AI chatbot models must also be truthful. Another frequent problem with AI chatbots is that they often produce responses that are inaccurate or misleading, a phenomenon known as a hallucination. This is a particular risk for businesses, which cannot afford to mislead their consumers and can face serious consequences if consumers feel deceived.

These three main issues of personalization, security, and accuracy stem from the fact that AI chatbots rely on natural language processing (NLP), which enables them to understand human speech in user prompts and produce responses based on human language and articulated in a human tone.

Because chatbots are premised on this process, they can often be much more personal than is desired. Chatbots can also collect private data from users, develop personal histories with them, and even spread misinformation based on misleading interactions or false data.

However, not all chatbots have these characteristics. As Karthik Kashyap reports, traditional retail chatbots are not as developed as AI chatbots and lack many of the capabilities that stem from NLP. The problem with retail chatbots, though, is that this lack of technological development has made them overly robotic and formal; 55% of users do not trust these chatbots to solve their problems.

This has pushed businesses to shift toward AI chatbots, which offer personalized treatment for consumers, cost-effectiveness, and efficiency. However, a wholesale shift is not the answer either. Rapidly transitioning to AI chatbots in pursuit of profit, without caution, leads to consequences that damage consumers and ultimately the businesses that depend on consumer trust and satisfaction.

Traditional retail chatbots, as established above, do not have a place in the business world, but neither do unchecked AI chatbots. Chatbots deserve a place in the business world, but only with careful regulation and awareness of the consequences they may bring.

As explained in an article published by Osborne Clarke, businesses must deploy AI chatbots carefully, in a manner that avoids the risks associated with them. The business world must take steps in this direction to work toward a more effective and efficient economy empowered by AI, not hindered by it.
