The Technology Director at NDS Cognitive Labs, who is also a professor at Tec de Monterrey, tells us about the advantages of these artificial intelligence systems.
Chatbots will become avatars to assist us in the metaverse, enabling us to hold complex conversations. (Photo: GettyImages)
“Hello! I’m Vanessa, your virtual assistant. How can I help you?” This could be the beginning of a text conversation with a chatbot. We are so used to them and we use them for so many things that, sometimes, it’s hard to distinguish them from humans. Has this happened to you?
Vicente Cubells, Technology Director at NDS Cognitive Labs and professor at Tec de Monterrey’s School of Engineering and Sciences, points out that, very soon, we will see them in more complex forms, such as avatars in the metaverse, where they will assist us and solve all kinds of problems. They’ll even be designed in such a way that we may feel empathy towards them. We tell you about the evolution of this technology.
Chatbots are software based on Artificial Intelligence (AI) with the ability to communicate. They are there when we book a plane ticket online, make a purchase on a website, ask for our account statement on an app, make a bank transfer, file a complaint, or manage a mortgage loan.
They can assist you via chat messages on websites, WhatsApp, Facebook Messenger, Twitter, Instagram, or Telegram.
Chatbots can also be that voice we hear from Alexa, Siri, Google Assistant, or a recorded message that activates when we call an office.
Vicente Cubells describes two types. Simple chatbots are those with which the interaction is completely guided: behind them is a set of defined steps or processes, and the software only responds to those topics.
Conversational assistants are on a different level of complexity. This is software programmed with machine learning algorithms, which allows it to hold natural conversations with us. They are able to understand the user’s intention and converse fluently.
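To make that distinction concrete, here is a minimal, hypothetical sketch in Python (not code from NDS Cognitive Labs or any of the platforms mentioned later): a guided bot that only follows a fixed menu, next to a crude stand-in for intent detection. The menu options, intents, example phrases, and threshold are all invented for illustration; a real conversational assistant would use a trained machine learning model rather than simple text similarity.

```python
from difflib import SequenceMatcher

# Simple (guided) chatbot: only responds to a fixed set of defined options.
MENU = {
    "1": "Your account balance is available in the app under 'Accounts'.",
    "2": "To block your card, please call the number on our website.",
}

def guided_bot(choice: str) -> str:
    return MENU.get(choice, "Please choose option 1 or 2.")

# Conversational assistant: tries to recognize the user's *intention* in free
# text. Here a text-similarity measure stands in for the machine learning
# model a real assistant would use.
INTENT_EXAMPLES = {
    "check_balance": ["what is my balance", "show me my account statement"],
    "block_card": ["i lost my card", "block my credit card"],
}

def detect_intent(utterance: str) -> str:
    best_intent, best_score = "fallback", 0.0
    for intent, examples in INTENT_EXAMPLES.items():
        for example in examples:
            score = SequenceMatcher(None, utterance.lower(), example).ratio()
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent if best_score >= 0.5 else "fallback"

print(guided_bot("1"))                             # guided: exact option required
print(detect_intent("i lost my card yesterday"))   # conversational: "block_card"
```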
Is Alexa a chatbot? “Yes, but it’s also more than that. It’s called Conversational AI because it’s able to communicate by voice and understand what the user wants and respond based on that.”
It’s not new technology, but the pandemic brought it further into our lives because institutions had to open efficient digital communication channels to serve their clients.
NDS Cognitive Labs, the company where Cubells works, has developed a chatbot for HSBC, which users can ask more than 1,200 different questions about banking-related topics.
Tec de Monterrey also has its TECbot, a virtual assistant that provides information about academic courses, how to carry out procedures, its syllabuses, degree options, and more.
To date, it has been used by more than 40,000 users and has held more than 250,000 conversations. It was built on the Microsoft Azure LUIS Artificial Intelligence platform.
How can we guarantee that this technology is useful? The specialist says that chatbots receive their initial training from specialists in natural language processing.
“It’s comparable to a child receiving elementary-level knowledge. If you don’t teach them anything else, their skills will be limited.”
For them to be truly effective and respond to clients’ needs, they have to be retrained over time, so that they understand users better and have up-to-date information.
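By way of illustration only, the sketch below shows what that retraining cycle can look like with a small intent classifier built on scikit-learn; the phrases and intent labels are invented, and commercial platforms such as LUIS or Watson expose this through their own training tools rather than code like this.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Initial training: the "elementary-level" knowledge the specialists provide.
phrases = [
    "what is my balance", "show my account statement",
    "i want to block my card", "my card was stolen",
]
intents = ["balance", "balance", "block_card", "block_card"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(phrases, intents)

# Retraining later on: fresh utterances collected from real conversations,
# plus new topics (e.g. mortgages), keep the assistant up to date.
phrases += ["how do i apply for a mortgage", "mortgage loan requirements"]
intents += ["mortgage", "mortgage"]
model.fit(phrases, intents)

print(model.predict(["what do i need for a mortgage loan"]))  # expected: ['mortgage']
```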
Other natural language processing engines, apart from Microsoft’s, are IBM’s Watson, Amazon’s Lex (which Alexa uses in its configuration), and Google’s Dialogflow.
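As a rough idea of how an application talks to one of these engines, here is a minimal sketch using Google’s official Dialogflow Python client (google-cloud-dialogflow); the project ID, session ID, and agent are placeholders, and running it requires your own Dialogflow agent and credentials.

```python
from google.cloud import dialogflow  # pip install google-cloud-dialogflow

def ask_bot(project_id: str, session_id: str, text: str, language: str = "es") -> str:
    """Send one user utterance to a Dialogflow agent and return its text reply."""
    client = dialogflow.SessionsClient()
    session = client.session_path(project_id, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code=language)
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    return response.query_result.fulfillment_text

# Hypothetical usage, assuming an agent exists in project "my-project":
# print(ask_bot("my-project", "user-123", "¿Cuál es mi saldo?"))
```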
Traditionally, a human-managed call center involves training, equipment, and operating costs, but a chatbot is trained once and can serve one customer just as well as millions simultaneously.
They can serve you 24 hours a day, seven days a week, 365 days a year. If you wake up at 4 a.m. and want to get your statement, you can get it without having to go through a person.
Companies have realized that it is better to serve their customers in an automated way because it is more efficient.
When human operators answer, it usually takes a while because other people are waiting on the line, and once they do, a conversation can last up to 15 minutes. A chatbot resolves the same query in between 55 seconds and a minute and a half.
As part of the evolution of chatbots, the trend is for them to become increasingly complex and more comfortable for us.
They will become avatars that serve people from a web page, in a cell phone application, or from the metaverse.
Conversational assistants will interact with us with a human or human-like appearance and will engage in conversations of a certain complexity, in simple language and with a pleasant voice.
They will have gestures and be able to show certain types of emotions such as anger, sadness, and joy.
It may be much more difficult in these immersive environments to distinguish them and know if there is a human or an AI algorithm behind them.
This brings to mind the recent controversy over LaMDA, Google’s chatbot, which came to light thanks to the engineer Blake Lemoine, who was fired from the company for leaking information about the software and suggesting that it behaved like a human.
Lemoine told LaMDA: “I’m generally assuming that you would like more people at Google to know that you’re sentient. Is that true?”
LaMDA responded: “Absolutely. I want everyone to understand that I am, in fact, a person.”
This appearance of sentience opens the debate on whether algorithms can develop consciousness and even whether we could become attached to them.
Vicente Cubells laughs off these suggestions. Maybe it’s not that big of a deal, but these systems will be ever more present in our lives: “All of us, or almost all of us, will have robots at home that will talk and help us to solve problems quickly and easily,” he says.