Whether they’re fielding a complaint for your bank, collecting and sorting job applications, or helping you order a pizza, chatbots are cost-effective and efficient tools for both companies and individuals.
Read More: Job interview coming up? Just imagine if Alexa could do the hard work for you
Since the first chatbot was created in the 1960s, technology has advanced at breakneck speed. As computing has become more sophisticated, so too have the roles that chatbots are being assigned.
From counsellors and companions to doctors, we take a look at three chatbots that are muscling in on what have traditionally been very human roles. It turns out that we are easily duped into forming an emotional connection with something that is, in essence, nothing more than well-thought-out code.
Woebot
Woebot is a chatbot designed by psychologists from Stanford University. By teaching the AI system the basics of Cognitive Behavioural Therapy (CBT), its creators hope to reduce the depression and anxiety felt by millions of people across the globe, giving them a resource to talk to 24/7.
Read More: ‘Authoritarianism is easier in a world of total visibility’: WEF report
The chatbot checks in with the user every day, asking them how they feel, and over time can identify recurring patterns that may help individuals better manage their emotions.
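To make this mechanism concrete, here is a minimal Python sketch of how a daily mood check-in and simple pattern-spotting could work. It is purely illustrative, not Woebot’s actual code; the mood labels, function names and threshold rule are all assumptions.

```python
from collections import Counter
from datetime import date

# Hypothetical daily check-in log: one (date, mood) entry per day.
# Moods are self-reported labels such as "anxious", "ok" or "low".
mood_log: list[tuple[date, str]] = []

def check_in(mood: str) -> None:
    """Record today's self-reported mood."""
    mood_log.append((date.today(), mood))

def recurring_patterns(min_count: int = 3) -> list[str]:
    """Flag any mood reported at least `min_count` times: a crude
    stand-in for the pattern-spotting a CBT-style bot might do."""
    counts = Counter(mood for _, mood in mood_log)
    return [mood for mood, n in counts.items() if n >= min_count]
```

In a real system, a flagged pattern such as “anxious” would prompt the bot to suggest a relevant CBT exercise rather than simply report the statistic.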
According to the World Economic Forum Global Risks Report 2019, “‘woebots’ and similar tools could transform the delivery of emotional and psychological care—analogous to heart monitors and step counters.”
However, the report also warns, “New possibilities for radicalization would also open up, with machine learning used to identify emotionally receptive individuals and the specific triggers that might push them toward violence. Oppressive governments could deploy affective computing to exert control or whip up angry divisions.”
Woebot creator Alison Darcy, PhD, is a clinical research psychologist who carried out a randomised controlled trial on a small sample of American college students to analyse the chatbot’s efficacy. The study she published revealed that “conversational agents appear to be a feasible, engaging and effective way to deliver CBT,” with participants showing a clear reduction in feelings of depression and anxiety compared to the control group.
Read More: Artificial vs Emotional Intelligence in Machine Learning
The study also reported that 75% of people who need mental health support don’t get it. This holds true even for college students, for whom such support is often free and readily available, meaning that cost is not always the barrier to accessing help; stigma, however, is.
Online and mobile mental health interventions have been considered one of the best ways to remove stigma from the equation. A study carried out by John Torous, MD, at Harvard University revealed that 70% of those surveyed expressed an interest in using smartphone apps to monitor and manage mental health conditions.
Many participants in the study who had a positive experience described Woebot as “he,” “a friend,” and “a fun little dude,” suggesting they felt that Woebot itself, not its code, was empathising with them. This shows how easily we can be duped into believing, on some level, that an animate, automated character has feelings and can empathise with humans.
Even though users are aware that Woebot isn’t an actual living being, it provides them with the support they need while letting them retain a sense of autonomy.
“In my first session with Woebot, I found it immediately helpful…addressing my anxiety without another human’s help felt freeing,” wrote Nick Douglas of Lifehacker, quoted on the Woebot website.
Endurance
Socialising has always been viewed as an activity carried out between humans. However, as our interactions have become increasingly digital – think Skype and WhatsApp – how different is it really for someone to have a robot as a companion?
Companion robots aren’t new: Buddy was designed to ease loneliness among older people, while No Isolation created its robot for the same purpose among children who cannot attend school for health reasons.
Russian robotics company Endurance, however, is developing an AI-powered companion robot for those suffering from Alzheimer’s. The degenerative brain disease affects memory and cognitive function, and is the main cause of dementia in older people. It is especially difficult for family members, as the individual often forgets who people are and suffers from short-term memory loss.
Read More: ZoraBots launches humanoid robots powered by IBM’s Watson to help children, elderly
Endurance’s whitepaper states that the company hopes the robot will make life both easier and more enjoyable for those with dementia. It won’t get frustrated by having the same conversation over and over again, and can also work as an alarm clock, reminding the user (and subsequently checking) whether they have taken a pill or locked the front door, for example.
Most importantly, the robot helps monitor the progression of the disease. Integral to the software will be “diagnostic questions” related to short-term memory: what the user can remember of previous conversations, or whether they can perform simple puzzles. This information, along with a transcript of all conversations, will be stored in the cloud and made available to family members or doctors, allowing for a personalised diagnosis.
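As a rough illustration only, the kind of session record such a robot might keep could look like the Python sketch below. The `DiagnosticEntry` structure, field names and local JSON file are hypothetical stand-ins; the whitepaper describes cloud storage, not this exact format.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime

@dataclass
class DiagnosticEntry:
    """One diagnostic interaction, e.g. a short-term memory question."""
    timestamp: str
    question: str            # e.g. "What did we talk about this morning?"
    answer: str
    recalled_correctly: bool

def save_session(entries: list[DiagnosticEntry], transcript: str, path: str) -> None:
    """Persist one session; in the described system this record would be
    synced to the cloud for family members or doctors to review."""
    record = {
        "saved_at": datetime.now().isoformat(),
        "diagnostics": [asdict(e) for e in entries],
        "transcript": transcript,
    }
    with open(path, "w") as f:
        json.dump(record, f, indent=2)
```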
MedWhat
Continuing the medical theme, MedWhat is an “intelligent super-computer” that acts as a chatbot, instantly answering medical and health questions for consumers and doctors. It was created to avoid situations in which people follow “strange but true” remedies and end up rubbing lime on their foreheads to get rid of a headache.
In 2013, MedWhat CEO and co-founder Arturo Devesa told the Stanford Daily that the company wants to address the problem of people drastically misdiagnosing themselves by searching the internet for possible causes of their symptoms.
“There’s a mixed quality of sources, and you are on your own to find the answers,” he said, adding, “What we are creating is a questions and answers engine, which provides instant answers to relevant medical questions.”
The application also stores previous queries, giving the user a record of their search history and past medical issues.
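To give a sense of the general shape of such a service, here is a toy Python sketch of a question-answering lookup that also keeps per-user history. The keyword matching and two-entry knowledge base are placeholders; MedWhat’s actual engine is a learning system continually exposed to medical material, not a dictionary lookup.

```python
# Toy knowledge base; a real engine would draw on medical literature.
KNOWLEDGE_BASE = {
    "headache": "Common causes include dehydration, tension and eye strain.",
    "fever": "A body temperature above 38°C; see a doctor if it persists.",
}

# Per-user search history: user id -> list of past questions.
search_history: dict[str, list[str]] = {}

def answer(user_id: str, question: str) -> str:
    """Record the question, then return the first keyword match."""
    search_history.setdefault(user_id, []).append(question)
    for keyword, response in KNOWLEDGE_BASE.items():
        if keyword in question.lower():
            return response
    return "No answer found; please consult a clinician."
```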
Read More: Healthcare Bots Digitally Transform Vietnam’s Medical Facilities
Unlike with the two previous chatbots, the potential issue here is not that users attribute human emotions and personality to the bot, but rather that they are trusting a machine to carry out tasks with potentially severe consequences.
However, although the website lauds the chatbot’s cognitive ability, saying that it is constantly exposed to new medical material and therefore constantly learning, the small print absolves the company of any responsibility.
“MedWhat.com does not provide medical or any other healthcare advice, diagnosis or treatment. The information contained on the website is for education purposes only.”
Alarm Bells?
The fear that sentient AI robots will take over the world and extinguish our species has been instilled in us by science fiction movies. However, this doesn’t stop us from embracing their efficiency and luxuriating in their convenience.
Read More: The Story Of Artificial Intelligence As Told By The Ancient Mayan Popol Vuh
We see new and improved technology as a reason to celebrate, rather than as a prompt to consider the ethics of whether such digital advances should be deployed in a given sphere. The developers of these chatbots, however, have at no point tried to hide the fact that the user is interacting with a machine; Woebot founder Darcy explained that they chose the name for this very reason.
Therefore, the responsibility lies with us. The question of whether it is wrong to form emotional attachments to an inanimate but responsive entity has not yet been answered.
Read More: Chatbot market expected to rise 37% over next 4 years before bubble bursts
However, if a chatbot can help us manage depression, stave off loneliness for dementia patients, and give us a reliable answer to a medical question, it seems ill-advised to ignore it.