AIGHT


Topics in Design Capstone IBM Project

Designed AIGHT, a mobile mental health tracker that leverages AI while addressing user hesitancy in trusting artificial intelligence.

We developed AIGHT, a mobile mental health application leveraging AI, as part of the first-ever "Topics in Design" course taught by Professor McKinney at UT Dallas. Our task, assigned by IBM, was to address user hesitancy in trusting artificial intelligence for mental health support. As lead researcher, I worked alongside my talented teammates, Kristina Smolyakova, Angela Wang, and Linh Nguyen, to create an innovative app that was recognized as the highest-scoring case study project in the class.

role

Lead Researcher

skills

User Research, Research Methodology, Data Analysis, User-Centered Design

timeline

Aug - Dec 2023

team

Project Manager, Researcher, Designer, Developer

project outcome

A mobile application designed to help individuals manage their mental health with AI-driven solutions and personalized support. Whether you're tracking your mood, journaling, or seeking advice from the chatbot, "AIGHT" is here to support you!

tools

Figma


project brief

Professor McKinney launched his first class titled "Topics in Design." The course covers topics in user experience design, such as accessibility, interaction, typography, and usability, as well as other forms of design thinking and creative production involving technology.

key results

Prompt

Why did we create AIGHT?

Develop a mobile app that addresses mental health by leveraging artificial intelligence, fostering increased user trust and confidence in AI capabilities.

Mental health challenges affect over 60% of the youth population in the United States, a significant and urgent concern. Among various demographics, college students are the largest group experiencing mental health challenges. Cultural, social, and financial barriers impede young people's access to mental health resources and services.

Why: There is a lack of trust regarding AI in mental health services.

Who: College students need help with managing their mental health.

How: We designed a mobile application featuring AI to increase accessibility and effectiveness.

What: Provide comfort with exercises and tools to keep track of progress.

problem

How can artificial intelligence be utilized to address and alleviate the challenges associated with college students accessing mental health resources?

So why aren't they getting help?

In class, we gained deeper insight into the seven main problems we had identified. Here is a breakdown of our secondary research:

Why is it hard for college students to access mental health services?

Mental health has been a controversial topic for years. In many countries, it is still considered taboo to talk about it or to mention that you have a mental illness. College students make up the largest portion of the population suffering from mental illness.

Who is our user?

To understand what our users need, we conducted ten in-depth qualitative interviews and collected over 100 data points through quantitative research. Before diving in, we defined the basic demographics we were targeting:

  • College students - age 18-25

  • Users seeking mental health advice and guidance 

  • Users looking to track their mood and record their thoughts

Major Stressors

Based on those results, we found three main stressors that university/college students face:

  • Education - 83%

  • Work - 56%

  • Finance - 61%

Primary Research

How do college students perceive the reliability and trustworthiness of AI-powered apps in addressing mental health concerns?

  1. College students, accustomed to technology from a young age, may be readily open to AI-driven apps for mental health support.

  2. College students appreciate the convenience of apps.

  3. Culture affects various attitudes toward mental health and technology. AI-powered apps that are culturally sensitive and inclusive in their approach might gain more trust.

  4. There are concerns about the long-term benefits of AI-driven mental health support.

  5. Users must be well-informed about how their data will be collected, used, and shared.

“Despite having access to mental health resources, many people are not likely to utilize those resources, such as on-campus mental health centers.”

surveys

Our team conducted a survey, and we analyzed the results from the 62 responses:

  • 77% University students; 23% Community college

  • 50% - Senior

  • 87% have health insurance

  • 50% don't know the details of their health insurance plan, so our app could help users search for plans.

“I’ve made others uncomfortable in the past by trauma dumping.”

Interviews 📄

We interviewed 8 UTD students, focusing our questions on the following:

  • How does your family perceive mental health?

  • What type of stressors are you currently dealing with?

  • What are your alternative methods to alleviate your mental health?

  • How do you feel about using a mental health app that uses AI?

Key takeaways:

“AI is still a sensitive topic, and not many people are willing to trust it.”

“There is a disparity between wanting anonymity versus having a personal experience with the AI.”

“However, the majority are open to the discussion and the use of it for mental health.”

“The majority of people struggle with at least one type of stress or have mental health issues.”

“Suggested using the AI app to suggest coping mechanisms and stress-alleviating methods.”

Competitor Analysis 🗯

Pros

  • Appealing UI

  • Can track habits

  • Offer literature on mental health topics

Cons

  • No personalization

  • Misrepresentation of therapy as mental health coaching

  • Public domains = no moderation

  • Many features behind paywall

App Store Reviews

“Feel empty, cold and artificial” - Wysa

“Dumped all of my feelings and it was completely ignored. No feedback” - Wysa

“Many trolls and bullies” - Humans Anonymous

“No moderation and no feedback” - Humans Anonymous

Introducing Seraphine, our persona.

Seraphine’s User Journey

What about therapy?

AIGHT is a valuable tool for self-reflection and coping strategies, but it cannot replace the personalized support and expertise provided by trained mental health professionals.

While AIGHT can complement therapy by aiding in mood tracking and offering coping skills, it is not a substitute for the in-depth guidance and tailored treatment plans offered by therapists.

Our features are not meant to eliminate professional help. They are there to serve as a tool for those who feel unsafe or uncomfortable talking about their feelings and emotions with other people.

What about privacy?

We prioritize user data security and employ various measures to protect personal information:

  • We are transparent about our use of artificial intelligence from the very beginning, when we introduce Puff.

  • Since we use AI to tailor responses and journal prompts, the user can choose whether the AI may use their past chats to generate journaling prompts. We ask what the user wants to do with the information they provide (see the sketch after this list):

  • Delete = information is permanently removed from the platform

  • Save = information is saved (it can still be deleted at any point) and is used for statistical graphs and AI journal prompts

  • The user can always access their privacy settings and adjust them.
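
To make the delete/save choice concrete, here is a minimal TypeScript sketch of how that retention logic could work. Everything in it (ChatRecord, applyRetentionChoice, the in-memory store) is a hypothetical name of ours; the actual project was a Figma prototype with no backend.

```typescript
// Hypothetical sketch of the delete/save retention choice described above.

type RetentionChoice = "delete" | "save";

interface ChatRecord {
  id: string;
  text: string;
  // Whether the record may be used for statistical graphs and AI journal prompts.
  savedForInsights: boolean;
}

// In-memory stand-in for wherever the user's chats would actually be stored.
const chatStore = new Map<string, ChatRecord>();

function applyRetentionChoice(recordId: string, choice: RetentionChoice): void {
  const record = chatStore.get(recordId);
  if (!record) return;

  if (choice === "delete") {
    // Delete = the information is permanently removed from the platform.
    chatStore.delete(recordId);
  } else {
    // Save = the information is kept (still deletable later) and opted in to
    // statistical graphs and AI-generated journaling prompts.
    record.savedForInsights = true;
  }
}
```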

Addressing Sensitive Topics

Questions about mental health vary in sensitivity. No matter how well an AI is trained, there are still many legal and ethical questions about whether it can be qualified to give professional psychological advice.

If a user's messages suggest, or the AI suspects, that they are planning to hurt themselves or others, they are immediately shown a large pop-up with a hotline that links them directly to a call. If they decline help, they are prompted once more, just to make sure.

We use the color yellow in our app only during these sensitive moments, to emphasize urgency without being as alarming and uninviting as red.
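
Below is a minimal TypeScript sketch of the escalation flow described above: a message check, an immediate hotline pop-up, and a single follow-up prompt if the user declines. The names (detectsCrisisLanguage, showHotlinePopup) and the keyword list are purely illustrative assumptions; the real prototype exists only as a Figma design, and a production version would rely on a trained model rather than static trigger words.

```typescript
// Hypothetical sketch of the crisis-escalation flow described above.

type HotlineResponse = "call" | "dismiss";

// Illustrative trigger-word check; a real implementation would use a trained
// classifier instead of a hard-coded phrase list.
function detectsCrisisLanguage(message: string): boolean {
  const triggerPhrases = ["hurt myself", "hurt someone", "end my life"];
  const lowered = message.toLowerCase();
  return triggerPhrases.some((phrase) => lowered.includes(phrase));
}

// Placeholder for the large yellow pop-up that links the user directly to the
// hotline; here it simply resolves with the user's choice.
async function showHotlinePopup(): Promise<HotlineResponse> {
  return "dismiss";
}

async function handleIncomingMessage(message: string): Promise<void> {
  if (!detectsCrisisLanguage(message)) return;

  // Immediately surface the hotline pop-up.
  const first = await showHotlinePopup();
  if (first === "call") return;

  // If the user declines help, prompt once more just to make sure.
  await showHotlinePopup();
}
```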

Judging and Feedback

In our UX design project, my team and I demonstrated a commitment to success in every aspect of our work. We prepared complete, detailed materials and delivered an outstanding presentation, balancing information across all team members. The professionalism reflected in our presentation design was acknowledged, showcasing our dedication to a polished and effective communication style.

Our primary research efforts were recognized for the depth of understanding we achieved from the target audience's perspective. The emphasis on our proposed solution's accessibility resonated well with professors and IBM professionals, effectively addressing a significant barrier. Including mind mapping in our process was noted as a thoughtful touch, offering insight into our methodology. Our comprehensive brand guide, including the tagline and mascot explanation, demonstrated thoroughness and intelligence in our design choices.

The completeness of our prototype received high praise, with particular recognition for anticipating and addressing potential user issues. Incorporating AI for trigger word identification was acknowledged as a smart move, even catching the attention of IBM. Practical features like searchable diary entries and user-selectable chat forms were highlighted for their innovation and user-centric approach, contributing to building trust in the application. The clarity and coherence of our KPIs and their tie-back to the initially identified issues were commended, providing a logical and compelling conclusion to our presentation.

Earning our professor's highest ranking among all teams in the capstone class, along with a grade of "A+" and a numerical score of 98, underscored our recognition as a high-functioning and impressive team. Collaborating and showcasing our hard work this semester was truly a pleasure.