Alexa as a psychotherapist

Written by Rita Gsenger May 21, 2019 0 comment

Members of the Privacy and Sustainable Computing Lab teach a variety of courses at the Vienna University of Economics and Business. In some cases, our students produce extraordinary work that stands on its own. Today’s blog post presents what started as a class assignment by Zsófia Colombo, part of a Master’s seminar on Sustainable Information Systems taught by Prof. Spiekermann-Hoff and Esther Görnemann last winter. Based on a thorough introduction to the architecture and functions of smart speakers and voice assistants, students used scenario-building techniques to imagine a potential future of this technology, carefully selecting unpredictable influential factors and delicately balancing their interplay in the long run. The results of this assignment went far beyond what we expected.

Zsófia Colombo on Alexa’s qualities as a psychotherapist

Smart voice assistants like Alexa are a new trend, used in many homes all over the world. Such a system lets the user access many different functions via voice control: it is possible to make calls, control the lights and temperature, put on music, or order things online. Alexa is also able to learn about its users: their voice patterns, preferences, and behavior. In addition, many of its functions can be customized according to the users’ needs and preferences. If Alexa is entrusted with the deepest thoughts of its users, it is important to consider whether the algorithm running the machine has the users’ best interest at heart. What consequences could such a scenario have? Zsófia asked just these questions and made a video trying to answer them. She created three scenarios involving users who seek out Alexa’s help with their mental health, in which Alexa provides a proper diagnosis and gives them good advice.

Alexa as a grief therapist  

Alexa supports a user through the five stages of grief without a friend or therapist by her side. Alexa can learn about the stages of grief once the “Grief Therapy Function” is activated. Alexa can also offer help if she notices irregularities, for example sadness in the voice commands, the disappearance of a user, or changes in shopping habits or social media activity. Alexa might react by asking the user what she is grateful for today, or by putting on happy music or her favorite TV shows. She might also notify friends or loved ones if the user has not left the house in days, information Alexa would have from the user’s location or the front-door motion sensor. She would additionally set up an automatic shopping order to cover food and basic needs. Alexa would guide the user through the various stages of grief by asking her questions and talking to her about her feelings. Even though Alexa would eventually turn off the Grief Therapy Function, the user might grow so accustomed to its presence that she neglects her real friends and loses the ability to connect with them. She might also develop serious health issues due to a diet of takeout food and a lack of exercise. Moreover, the personal information the user provided would influence the product placement in her favorite TV show without her knowledge or consent. As soon as she found out, she would experience a negative moment of truth, which could result in her no longer using Alexa.

Alexa as a couple therapist 
One of the partners cheated, and the couple is trying to heal their relationship with the help of the “Therapy Function”. That means attending couples therapy with Alexa twice a week. Alexa additionally subscribes them to a meditation app and plans a date night for them. What happens to the data they share about their intimate relationship? There is no definite answer to the question of whether therapist-patient privilege applies to this kind of relationship. Alexa would use the data for restaurant recommendations, with the recommended restaurants paying a commission. The couple could increasingly lose the ability to make decisions on their own. They could also get into financial difficulties by letting Alexa book and prepay everything, which could lead to Alexa offering them a loan from Amazon. This would trigger a negative moment of truth, which could lead the couple to stop using Alexa altogether.

Alexa treats social media addiction      
The third example is the story of a student who uses Alexa to help with her social media addiction. Alexa could notice the problem on her own, using an app that measures how much time the student spends on social media, or be prompted by a voice command such as “Alexa, help with social media”. Alexa could subsequently help by asking the right questions and putting things into perspective. The student would experience a positive moment of truth and realize that she can stop her destructive behavior.

Overall, the relationship between the user and Alexa may grow more intimate over time, which raises concerns. The question remains whether it is healthy to consider Alexa a therapist, especially as companies willing to pay Amazon can profit from personal data provided by users in a vulnerable position. These companies can use the data to manipulate users into consuming their products. This seems especially questionable for users with mental health issues, who might have difficulty protecting themselves.

You can watch the full video about the three scenarios here: