Saluto
A mental health service that leverages artificial intelligence to act as a mediator between patients and psychologists.
Context
Classroom project at Carnegie Mellon University
Timeframe
3 weeks, Fall 2018
Role
Research and synthesis, conceptualization, information architecture, prototyping, visual design, branding
Tools
Sketch, Illustrator, InVision
Team members
Ema Karavdic, Jiasi Tan, Matt Prindible
The Challenge
The prompt for this project was to visualize an online counseling service that uses AI to offer personalized experiences to the patient. We were also tasked with designing the psychologist-facing interface, which would leverage the information gleaned by the AI during therapy sessions for further review and management by the psychologist.
The Solution
The solution consists of two platforms: one patient-facing and the other psychologist-facing. The patient-facing platform is a mobile application that uses AI as an entry point to provide initial mental health support for people who are hesitant or unable to reach out to a human psychologist. If required, and with the patient's permission, it also acts as a mediator, referring the patient to a human psychologist for more advanced therapy.
The psychologist-facing platform is a desktop application that assists psychologists in reviewing and managing their patients' AI counseling history as well as ongoing treatment information.
The visual below illustrates the key steps that a user would go through while using this service. It starts with onboarding on the mobile application and ends with in-person sessions with a human psychologist.
On-boarding
In order to provide a customized experience that would address the patient's specific needs, the patient would be asked a few questions during on-boarding. These answers would give the AI counselor an initial basis for determining the kind of therapy the patient would need. This information would also determine the personality of the AI counselor. Privacy terms and conditions would also be a part of this process.
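To make the idea concrete, here is a rough sketch of how on-boarding answers could map to an initial counselor configuration. It is only an illustration: the field names, therapy approaches, and tones below are hypothetical and were not part of our prototype.

```ts
// Hypothetical sketch: mapping on-boarding answers to an initial AI counselor profile.
// Field names, therapy approaches, and tones are illustrative assumptions.

type TherapyApproach = "CBT" | "mindfulness" | "supportive";
type Tone = "gentle" | "balanced" | "direct";

interface OnboardingAnswers {
  primaryConcerns: string[];        // e.g. ["stress", "self-esteem"]
  prefersStructuredTasks: boolean;  // leans toward action-oriented therapy
  prefersDirectFeedback: boolean;   // shapes the counselor's tone of voice
  consentedToPrivacyTerms: boolean;
}

interface CounselorProfile {
  approach: TherapyApproach;
  tone: Tone;
}

function buildCounselorProfile(answers: OnboardingAnswers): CounselorProfile {
  // A preference for structured, action-oriented work suggests an approach like CBT with regular homework.
  const approach: TherapyApproach = answers.prefersStructuredTasks ? "CBT" : "supportive";
  // Patients who want candid feedback get a more direct tone; others start gentle.
  const tone: Tone = answers.prefersDirectFeedback ? "direct" : "gentle";
  return { approach, tone };
}
```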
Counseling
As the treatment progressed, insights gained from the conversations between the AI counselor and the patient would further shape its course. For an action-oriented approach such as Cognitive Behavioral Therapy (CBT), the AI counselor would regularly assign ‘homework’.
Referral
Slower progress, as indicated by regular assessment test results, would prompt the AI counselor to suggest that the patient meet with a human psychologist. Owing to the understanding and trust built between the patient and the AI counselor, the counselor would suggest a suitable psychologist for the patient. It would also help the patient set up an appointment.
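As a rough sketch of what this referral trigger could look like, the snippet below checks whether recent assessment scores show enough improvement. The score scale, window size, and threshold are assumptions made purely for illustration.

```ts
// Hypothetical sketch of the referral trigger: if recent assessment scores show
// little improvement, the AI counselor suggests meeting a human psychologist.
// The score scale, window size, and threshold are illustrative assumptions.

interface Assessment {
  date: Date;
  score: number; // higher = better progress, on this assumed scale
}

function shouldSuggestReferral(
  history: Assessment[],
  windowSize = 4,
  minImprovement = 2
): boolean {
  if (history.length < windowSize) return false; // not enough data yet
  const recent = history.slice(-windowSize);
  const improvement = recent[recent.length - 1].score - recent[0].score;
  return improvement < minImprovement; // slow progress -> suggest a human psychologist
}
```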
Data handoff
Through regular conversations, exercises, and access to third-party apps, the AI would gather a lot of information about the patient. With the patient’s permission, the AI would hand over this information to the human psychologist. The patient would decide which information is to be shared with the psychologist and which information is to be kept confidential. This step would also include adding a personal health history.
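A minimal sketch of this patient-controlled handoff, assuming a small set of hypothetical data categories, might look like the following; only categories the patient has explicitly consented to would be included in the package shared with the psychologist.

```ts
// Hypothetical sketch of the data handoff. The categories and names are assumptions;
// the point is that only consented categories are handed off to the psychologist.

type DataCategory =
  | "conversationTranscripts"
  | "thirdPartyAppData"
  | "homework"
  | "personalHealthHistory";

type ConsentMap = Record<DataCategory, boolean>;

interface PatientRecord {
  category: DataCategory;
  payload: unknown;
}

function buildHandoffPackage(records: PatientRecord[], consent: ConsentMap): PatientRecord[] {
  // Keep only the records whose category the patient agreed to share.
  return records.filter((record) => consent[record.category]);
}

// Example: the patient shares homework and health history but keeps transcripts private.
const consent: ConsentMap = {
  conversationTranscripts: false,
  thirdPartyAppData: false,
  homework: true,
  personalHealthHistory: true,
};
// buildHandoffPackage(allRecords, consent) would then include only the homework
// and personal health history records.
```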
In-person therapy
Based on the permissions granted by the patient, the psychologist would be able to view the information obtained from the AI counseling sessions to understand the patient's case in detail. Additionally, other features would be available on the platform to help the psychologist record and manage the information obtained from ongoing sessions at the clinic.
Weekly tasks
The psychologist would assign weekly tasks to the patient that would be accessible to him or her on the Saluto mobile app.
The Process
Given that the design brief was open-ended, kickstarting the project with research helped us frame the problem statement more clearly.
How might we design a service that uses artificial intelligence to provide mental health services to those who are hesitant or unable to seek help?
Research
Mental health is a socially sensitive topic, and we realized that we had to be tactful while designing for such a delicate subject. Designing for AI was also new to us, and it was necessary to understand its strengths and limitations to inform our design decisions over the course of the project. We used several research methods, including psychologist interviews, market analysis, literature reviews, and secondary research, to gain useful insights.
Research methods: literature review, psychologist interview, secondary research, market analysis, and persona creation.
The following questions guided our research:
- What are the barriers to seeking medical help?
- How can we create a personalized user experience through AI?
- What are the strengths of AI that can be leveraged?
- What are the limitations of AI, and how might we design around them?
Insights
We gained three important insights that guided our project further:
Barriers to treatment
The major barriers included stigma, cost, a shortage of psychologists, and lack of mobility. Prejudicial attitudes and discriminatory behavior toward individuals with mental health problems, along with self-perceived stigma (the internalized stigma that the person with the mental illness experiences), prevent people from seeking the medical help they require. Mental disorders top the list of medical conditions with the highest estimated spending in the United States.
AI as an easy access point
It is easier to share potentially embarrassing information with a virtual therapist. In human-to-human interaction, there is often a degree of self-restraint; shame can prevent people from sharing openly with another person. However, when sitting with a virtual therapist, patients are more willing to express themselves, which can be an important therapeutic advantage.
AI as a mediator, not a replacement
During our research, psychologists suggested that mental health apps are an excellent way to help people stay connected, outside of sessions, to the work they are doing in therapy, but they cannot be an alternative or replacement for traditional treatment. Mental health apps are very useful for people who cannot have sessions as often as they would like, but AI cannot take the place of a human because it cannot offer individualized interpretations and insights. AI might not always be able to understand the user and his or her intent. Experts suggest that this medium should be used in conjunction with a human therapist.
The scenario
As a part of the design brief, we were given a transcript of a counseling session to help us imagine the scenario that we would be designing for and to understand the personalities of the patient and the AI counselor.
Mary is a history major in college. She has recognized the need for counseling because she frequently feels hopeless and defeated. She describes herself as being ‘stressed’ all the time. She is hesitant to seek help from the mental health professional on her university campus because she is afraid that her classmates might judge her. She has developed a dislike for herself for various reasons: being unable to score straight A’s in her classes, not being able to keep her house clean, and being overweight. She is extremely worried about paying off her college loan and about getting a job after graduation.
Mary, college student
Joan, AI counselor
Joan recognizes that Mary’s negative feelings about her situation and self-image arise from distorted thinking patterns. In Joan’s opinion, Mary is experiencing difficulties because she has been lying to herself. Joan uses the approaches and tools of Cognitive Behavioral Therapy (CBT) to help Mary tackle this problem of ‘distorted thinking’. She tries to help Mary be more accurate about her problems so that they can work together to change Mary’s patterns of thinking and behavior and, in turn, her feelings. Given Mary’s personality, Joan tries to strike a balance between being forbearing and using a stern tone of voice.
Their relationship
The conversation transcript reveals a unique relationship between Mary and Joan, the AI counselor. The transcript gives the sense that the interaction between Mary and Joan is a comfortable one, owing to the bond formed between them over time. Even though the conversation between Mary and Joan can seem adversarial, it is actually productive, as Mary responds well to the ‘tough love’ Joan gives her. Mary seems invested in the conversation, and her responses show a willingness to continue the sessions with Joan. She regularly completes the weekly ‘homework’ that Joan assigns to her.
Design imperatives
From our research insights and definition of personas, we arrived at the objectives that the service should fulfill:
Entry point for seeking medical help
The AI counselor should act as an entry point for the patient to seek the required medical help. It should be able to recognize instances where the user might require advanced treatment and should act as a mediator, encouraging the patient to seek an expert’s assistance.
Personalized and context responsive
The AI counselor should detect the individual’s needs and accordingly suggest and implement an appropriate course of treatment for them.
Ensuring privacy and confidentiality
The AI counselor should assure the user that their information is private and safe. At every step, for any action that requires the AI counselor to use the data provided, it should ask for the user’s permission.
Trustworthy and approachable
The personality of the AI assistant should be professional, yet friendly enough for the AI assistant to gain the user’s trust and lay the groundwork for easy rapport building.
User journey
Information architecture
The information architecture of the mobile app was fairly straightforward, as the AI counselor was the central feature of the app. However, the referral process was an interesting workflow, and we decided to focus on visualizing what that scenario would look like.
For developing the information architecture of the doctor dashboard, we started by asking the following questions:
- What information was important for the psychologist to see?
- What kind of data was being brought in from the AI counseling sessions?
- What would be the hierarchy of the information?
- How would that information be translated into wireframes?
High-fidelity designs
After multiple sessions of establishing the information architecture and creating many iterations of the wireframes, we moved on to creating high-fidelity visual designs for our concept.
Design decisions
Here are some of the important decisions we made while designing the two platforms:
On-boarding questions for personalization
In order to provide a personalized experience that would address the patient’s specific needs, the patient would be asked a few questions during onboarding. These answers would give the AI an initial basis for determining the course of treatment and the kind of therapy the patient would need. This information would also determine the personality and tone of the AI counselor.
Humanizing the mental health professional
The patient might be hesitant at first to seek therapy from a human psychologist. However, if they knew who they were being asked to see, the anticipatory anxiety and fear of the unknown would be reduced. Learning about a prospective professional’s personality and human qualities would help put the patient at ease. For example, the AI counselor might suggest, “Dr. Gelding has experience in mental health counseling for about 15 years. He’s nice, maybe about 40 years old, has a client focus on teenagers and college students. I think he also collects antiques. He has a dry sense of humor. I think you would like him.”
Data privacy
The patient would have complete control over the information that would be handed off by the AI to the human psychologist. The patient would be able to review and consent to the various types of information, such as conversation transcripts, third-party app data, and homework, before it would be shared with the psychologist.
Psychologist having access to the AI counselor
The psychologist would be able to interact with the AI assistant at the click of a button to surface any required information from the patient’s past therapy sessions. However, the type of information that could be shared would depend on the permissions given by the user during the hand-off process.
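A minimal sketch of this permission-aware lookup, again using hypothetical type and function names, could look like this: the assistant searches past session summaries, but only within the categories the patient agreed to share during the hand-off.

```ts
// Hypothetical sketch of the psychologist-facing query path: the assistant only
// surfaces session information from categories the patient consented to share.
// Types and function names are illustrative assumptions.

type SharedCategory = "conversationTranscripts" | "thirdPartyAppData" | "homework";

interface SessionEntry {
  category: SharedCategory;
  sessionDate: Date;
  summary: string;
}

function queryPatientHistory(
  entries: SessionEntry[],
  permittedCategories: Set<SharedCategory>,
  searchTerm: string
): SessionEntry[] {
  const term = searchTerm.toLowerCase();
  return entries
    .filter((entry) => permittedCategories.has(entry.category)) // honor hand-off permissions
    .filter((entry) => entry.summary.toLowerCase().includes(term)); // then match the query
}
```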
Reflection
Given more time, I would like to dig deeper into the psychologist's requirements and develop more relevant features for the psychologist-facing platform. More interviews and user-testing might help in achieving this. While this platform is designed for the psychologist to use after therapy sessions, I would also like to visualize a platform that would be used by the psychologist during therapy sessions for recording observations and taking down notes. I would like to work on the visual form of the AI counselor and explore its motion during different moments in a conversation.