Mindful Connect: AI for Mental Health
Make it happen.
-
Role
UX Designer/Researcher
-
Tools
Figma
-
Team
1 MSI HCI and UX
Overview & Problem
The Mindful Connect capstone project explores how AI can help monitor and support mental health by analyzing social media activity. The goal was to create a user-friendly tool that helps people understand how their online interactions impact their mental state, offering insights and resources to complement traditional therapy.
Research & Initial Questionnaire
Stages
-
Empathize (Research and Initial Questionnaire)
-
Low Fi Prototype: Sketching
-
Mid-Fidelity Prototype
-
User Interviews
Research
Young Adults & Social Media
Prolonged social media use is linked to depression, anxiety, and loneliness.
Need for interventions to support mental health in digital contexts.
Excessive Use & Psychiatric Disorders
Overuse is tied to anxiety, depression, insomnia, appearance-based anxiety, and emotional dysregulation.
Stresses the need for public awareness and tailored care.
Older Adults & Social Media
Social media both helps and hurts—improves connectivity, but can worsen depression.
Calls for more research on emotional impacts.
AI & Sentiment Analysis
AI tools can identify depressive language in real time, aiding in proactive mental health care.
Must consider ethical issues (privacy, nuance, false alarms).
Initial Questionnaire Findings and Insights on AI-Powered Mental Health Tools:
• Mood patterns require context.
• Ethical concerns: privacy, consent, risk of harm.
• Clinicians: supportive of self-monitoring, cautious about over-reliance.
• Could complement therapy by offering early alerts and behavioral insights outside of sessions.
• Challenges: interpreting sarcasm, diverse forms of expression, and the risk of false alarms from social media data.
Ideation and Sketching
User Persona
Name: Lisa Snow
Age: 34
Occupation: Freelancer / Digital
Location: Portland, Oregon
Tech Comfort Level: High
Background: Lisa is a creative professional who thrives on visual storytelling and digital engagement. As a freelancer, she relies heavily on social media to showcase her work and connect with clients and peers. However, the constant digital presence and the pressure to stay visible online often leave her feeling emotionally drained. She worries about the long-term effects of this digital burnout on her well-being and wants a way to understand better how her social media use impacts her mental health.
Goals:
• Better understand how her social media use impacts her mental health and creativity.
• Find accessible, engaging tools to support daily emotional well-being.
• Consistently integrate mindfulness practices into her busy schedule.
Frustrations:
• Overwhelmed by apps that lack transparency about data use and privacy.
• Finds generic mental health advice uninspiring or disconnected from her creative lifestyle.
• Easily loses motivation without engaging prompts or reminders.
Needs:
• Clear, intuitive interfaces that reduce cognitive load rather than add stress.
• Personalized emotional tracking that reflects nuanced states like "inspired," "frustrated," or "burned out."
• Trustworthy platforms that clearly communicate how user data is collected, stored, and analyzed.
Preferred Features:
• Creative journaling prompts and inspirational daily messages.
• Interactive visualizations and personalized trend analyses linking moods to social media activity.
• Direct access to mindfulness and stress-relief exercises tailored to creative professionals.
Technology Expectations:
• High level of transparency regarding AI insights and algorithms.
• Strong data privacy assurances when integrating social media accounts.
• Engaging, visually appealing interfaces that inspire regular interaction.
Mid-Fidelity Prototype
The UI design for Mindful Connect focused on creating a clean, intuitive, and engaging experience. Key elements included:
A minimalist layout to reduce cognitive overload and make it easy for users to track emotional states linked to social media habits.
Clear sections for journaling and resources, complemented by interactive visualizations and dynamic graphs to enhance engagement.
Visual clarity balanced with transparency—participants appreciated the aesthetic appeal but stressed the need for clear explanations of AI insights and processes.
Supportive features like creative journaling prompts and daily reminders were integrated to encourage self-monitoring.
Ethical considerations guided the UI, ensuring user autonomy, data privacy, and a user-centered approach.
This design approach prioritized user trust and accessibility while also addressing the nuanced emotional challenges users face in the digital space.
Interview & Findings
Conducted a series of user interviews after sharing the mid-fidelity prototype; all participants gave prior consent, and sessions were held in person and over Zoom.
Participants appreciated the clean, visually appealing interface.
They understood the app’s purpose—tracking emotional patterns linked to social media use—and found features like journaling and interactive visualizations helpful and engaging.
Concerns included data privacy, the accuracy of AI-generated insights, and the potential for misinterpretation.
Some felt the “Insights” and “Connections” features needed clearer explanations.
There was skepticism about trusting AI alone without human oversight.
Users suggested adding stress-relief exercises, using friendlier language to avoid harsh labeling, and ensuring the app felt like a supportive companion rather than a clinical replacement for therapy.
These insights shaped the app’s design refinements, improved clarity, and reinforced the focus on human-centered ethics.
AI Proof of Concept
Data Collection: Social media posts (e.g., Reddit, tweets).
Preprocessing: Cleaned and labeled data as ‘depression,’ ‘anxiety,’ or ‘neutral.’
Model Training: Used no-code AI tools (e.g., MonkeyLearn) to train a classifier.
Visualization: Mapped sentiment trends in an interactive dashboard, showing how users’ emotions shifted over time (a small sketch of this idea follows below).
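To illustrate the trend-visualization step, the sketch below aggregates labeled posts by day and plots how often each emotional tone appears over time. It is a minimal, hypothetical example using pandas and matplotlib with invented data, not the dashboard actually built with the no-code tools.

```python
# Hypothetical sketch of the sentiment-trend view; the real project used
# no-code tooling, so the data and column names here are invented.
import pandas as pd
import matplotlib.pyplot as plt

# Example labeled posts with timestamps (invented data).
posts = pd.DataFrame({
    "date": pd.to_datetime([
        "2024-03-01", "2024-03-01", "2024-03-02",
        "2024-03-03", "2024-03-03", "2024-03-04",
    ]),
    "label": ["neutral", "anxiety", "depression",
              "neutral", "anxiety", "neutral"],
})

# Count how many posts of each emotional tone appear per day.
trend = (
    posts.groupby([posts["date"].dt.date, "label"])
    .size()
    .unstack(fill_value=0)
)

# Plot the daily counts as a simple line chart, one line per label.
trend.plot(marker="o")
plt.title("Emotional tone of posts over time")
plt.xlabel("Date")
plt.ylabel("Number of posts")
plt.tight_layout()
plt.show()
```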
This is a small set of example social media posts, each labeled as depression, anxiety, or neutral. By training an AI model on these examples, we can teach it to recognize patterns in language that signal mental health concerns. Once trained, the model can analyze new posts and predict whether they express depression, anxiety, or neutral emotions, essentially helping us use AI for early mental health detection.
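The same classification idea can be mirrored in a few lines of scikit-learn. The sketch below is a minimal, hypothetical version: it trains a TF-IDF plus logistic-regression pipeline on a handful of invented example posts and predicts the label of a new one. It is illustrative only, not the MonkeyLearn model actually used in the project.

```python
# Minimal sketch of the classification idea; the project itself used a
# no-code tool, so this pipeline and the example posts are assumptions
# made purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set of labeled posts.
posts = [
    "I can't get out of bed and nothing feels worth doing",
    "My heart races every time I think about tomorrow's meeting",
    "Had a great walk in the park and coffee with a friend",
    "I feel so empty and alone lately",
    "I'm constantly worried I'll mess everything up",
    "Just finished a new painting, feeling pretty good about it",
]
labels = ["depression", "anxiety", "neutral",
          "depression", "anxiety", "neutral"]

# TF-IDF features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(posts, labels)

# Predict the emotional tone of a new, unseen post.
new_post = ["I haven't wanted to talk to anyone all week"]
print(model.predict(new_post))  # e.g. ['depression']
```

In practice a real model would need far more data, careful handling of sarcasm and diverse expression, and human oversight before any insight reaches a user, which is exactly the concern participants raised in the interviews.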