Conference Paper

Efficient Mental Health Therapist Chatbot Assisted by Artificial Intelligence

... Mental health coaching programs have demonstrated significant improvements in outcomes, such as a 66% reduction in depression symptoms and a 32% enhancement in work productivity, indicating their efficacy in stress reduction and emotional support [37]. AI-based chatbots further augment this support by offering safe spaces for emotional expression and tailored advice, facilitating immediate support and connecting users with appropriate resources [38]. Therapist-assisted management systems enhance care plans through interactive learning modules and regular consultations, ensuring continuous improvement and personalized attention [39]. ...
Article
Full-text available
The global mental health crisis is compounded by barriers such as cost, accessibility, and stigma, leaving millions without adequate support. FASSLING (fassling.ai), an innovative artificial intelligence (AI)-powered platform, addresses these challenges by providing free, 24/7 multilingual emotional and coaching support through text and audio interactions. Grounded in inclusivity and compassion, FASSLING bridges gaps in traditional mental health systems by offering immediate, non-clinical support while complementing professional services. This paper explores FASSLING's design and implementation, emphasizing its user-centered features, including cultural adaptability, trauma-informed care principles, and active listening techniques. The platform not only empowers users to navigate emotional challenges but also fosters resilience and empathy, creating a ripple effect of societal compassion. Ethical considerations, such as ensuring user privacy and managing the limitations of AI, are central to FASSLING's mission. By integrating advanced AI technologies with psychological best practices, FASSLING sets a new standard for accessible and inclusive mental health support, positioning itself as a transformative tool for global well-being. This case study highlights FASSLING's potential to redefine emotional support systems and drive positive change in mental health care worldwide.
Article
Full-text available
Several studies have demonstrated a critical association between cardiovascular disease (CVD) and mental health, revealing that approximately one-third of individuals with CVD also experience depression. This comorbidity significantly increases the risk of cardiac complications and mortality, a risk that persists independently of traditional risk factors. Addressing this issue, our study pioneers a straightforward, explainable, and data-driven pipeline for predicting depression in CVD patients. Methods: Our study was conducted at a cardiac surgical intensive care unit. A total of 224 participants who were scheduled for elective coronary artery bypass graft surgery (CABG) were enrolled in the study. Prior to surgery, each patient underwent psychiatric evaluation to identify major depressive disorder (MDD) based on the DSM-5 criteria. An advanced data curation workflow was applied to eliminate outliers and inconsistencies and improve data quality. An explainable AI-empowered pipeline was developed, where sophisticated machine learning techniques, including the AdaBoost, random forest, and XGBoost algorithms, were trained and tested on the curated data based on a stratified cross-validation approach. Results: Our findings identified a significant correlation between the biomarker “sRAGE” and depression (r = 0.32, p = 0.038). Among the applied models, the random forest classifier demonstrated superior accuracy in predicting depression, with notable scores in accuracy (0.62), sensitivity (0.71), specificity (0.53), and area under the curve (0.67). Conclusions: This study provides compelling evidence that depression in CVD patients, particularly those with elevated “sRAGE” levels, can be predicted with a 62% accuracy rate. Our AI-driven approach offers a promising route to early identification and intervention, potentially revolutionizing care strategies in this vulnerable population.
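The evaluation protocol this abstract describes (tree-based classifiers scored under stratified cross-validation on accuracy, sensitivity, specificity, and AUC) can be sketched as follows. The synthetic data, feature count, fold count, and hyperparameters below are illustrative assumptions, not the study's actual dataset or configuration:

```python
# Sketch: stratified k-fold evaluation of a random forest classifier,
# reporting accuracy, sensitivity, specificity, and ROC AUC.
# All data here is synthetic; only the protocol mirrors the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=224, n_features=10, weights=[0.6, 0.4],
                           random_state=0)  # 224 mirrors the study's cohort size

accs, sens, specs, aucs = [], [], [], []
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in cv.split(X, y):
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    prob = clf.predict_proba(X[test_idx])[:, 1]
    pred = (prob >= 0.5).astype(int)
    tn, fp, fn, tp = confusion_matrix(y[test_idx], pred).ravel()
    accs.append((tp + tn) / (tp + tn + fp + fn))
    sens.append(tp / (tp + fn))    # sensitivity = recall on the positive class
    specs.append(tn / (tn + fp))   # specificity = recall on the negative class
    aucs.append(roc_auc_score(y[test_idx], prob))

print(f"acc={np.mean(accs):.2f} sens={np.mean(sens):.2f} "
      f"spec={np.mean(specs):.2f} auc={np.mean(aucs):.2f}")
```

Stratification keeps the class balance of each fold close to the full cohort's, which matters when, as here, the positive class is the minority.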
Article
Full-text available
The utilization of artificial intelligence (AI) in psychiatry has risen over the past several years to meet the growing need for improved access to mental health solutions. Additionally, shortages of mental health providers during the COVID-19 pandemic have continued to exacerbate the burden of mental illness worldwide. AI applications already in existence include those enabled to assist with psychiatric diagnoses, symptom tracking, disease course prediction, and psychoeducation. Modalities of AI mental health care delivery include availability through the internet, smartphone applications, and digital gaming. Here we review emerging AI-based interventions in the form of chat and therapy bots, specifically conversational applications that teach the user emotional coping mechanisms and provide support for people with communication difficulties, computer generated images of faces that form the basis of avatar therapy, and intelligent animal-like robots with new advances in digital psychiatry. We discuss the implications of incorporating AI chatbots into clinical practice and offer perspectives on how these AI-based interventions will further impact the field of psychiatry.
Article
Full-text available
Background: Suicide poses a significant health burden worldwide. In many cases, people at risk of suicide do not engage with their doctor or community due to concerns about stigmatisation and forced medical treatment; worse still, people with mental illness (who form a majority of people who die from suicide) may have poor insight into their mental state, and not self-identify as being at risk. These issues are exacerbated by the fact that doctors have difficulty in identifying those at risk of suicide when they do present to medical services. Advances in artificial intelligence (AI) present opportunities for the development of novel tools for predicting suicide. Method: We searched Google Scholar and PubMed for articles relating to suicide prediction using artificial intelligence from 2017 onwards. Conclusions: This paper presents a qualitative narrative review of research focusing on two categories of suicide prediction tools: medical suicide prediction and social suicide prediction. Initial evidence is promising: AI-driven suicide prediction could improve our capacity to identify those at risk of suicide, and, potentially, save lives. Medical suicide prediction may be relatively uncontroversial when it pays respect to ethical and legal principles; however, further research is required to determine the validity of these tools in different contexts. Social suicide prediction offers an exciting opportunity to help identify suicide risk among those who do not engage with traditional health services. Yet, efforts by private companies such as Facebook to use online data for suicide prediction should be the subject of independent review and oversight to confirm safety, effectiveness and ethical permissibility.
Article
Full-text available
Purpose of Review Treatments in psychiatry have been rapidly changing over the last century, following the development of psychopharmacology and new research achievements. However, with advances in technology, the practice of psychiatry in the future will likely be influenced by new trends based on computerized approaches and digital communication. We examined four major areas that will probably impact clinical practice in the next few years: telepsychiatry; social media; mobile applications and the internet of things; and artificial intelligence and machine learning. Recent Findings Developments in these four areas will benefit patients throughout the journey of the illness, encompassing early diagnosis, even before the patients present to a clinician; personalized treatment on demand at anytime and anywhere; better prediction on patient outcomes; and even how mental illnesses are diagnosed in the future. Summary Though the evidence for many technology-based interventions or mobile applications is still insufficient, it is likely that such advances in technology will play a larger role in the way that patients receive mental health interventions in the future, leading to easier access to them and improved outcomes.
Article
Full-text available
Background Web-based cognitive-behavioral therapeutic (CBT) apps have demonstrated efficacy but are characterized by poor adherence. Conversational agents may offer a convenient, engaging way of getting support at any time. Objective The objective of the study was to determine the feasibility, acceptability, and preliminary efficacy of a fully automated conversational agent to deliver a self-help program for college students who self-identify as having symptoms of anxiety and depression. Methods In an unblinded trial, 70 individuals age 18-28 years were recruited online from a university community social media site and were randomized to receive either 2 weeks (up to 20 sessions) of self-help content derived from CBT principles in a conversational format with a text-based conversational agent (Woebot) (n=34) or were directed to the National Institute of Mental Health ebook, “Depression in College Students,” as an information-only control group (n=36). All participants completed Web-based versions of the 9-item Patient Health Questionnaire (PHQ-9), the 7-item Generalized Anxiety Disorder scale (GAD-7), and the Positive and Negative Affect Scale at baseline and 2-3 weeks later (T2). Results Participants were on average 22.2 years old (SD 2.33), 67% female (47/70), mostly non-Hispanic (93%, 54/58), and Caucasian (79%, 46/58). Participants in the Woebot group engaged with the conversational agent an average of 12.14 (SD 2.23) times over the study period. No significant differences existed between the groups at baseline, and 83% (58/70) of participants provided data at T2 (17% attrition). Intent-to-treat univariate analysis of covariance revealed a significant group difference on depression such that those in the Woebot group significantly reduced their symptoms of depression over the study period as measured by the PHQ-9 (F=6.47; P=.01) while those in the information control group did not. In an analysis of completers, participants in both groups significantly reduced anxiety as measured by the GAD-7 (F(1,54)=9.24; P=.004). Participants’ comments suggest that process factors were more influential on their acceptability of the program than content factors mirroring traditional therapy. Conclusions Conversational agents appear to be a feasible, engaging, and effective way to deliver CBT.
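The between-group test this trial reports is an analysis of covariance: post-treatment symptom scores are regressed on group membership while adjusting for baseline scores. A minimal sketch with statsmodels follows; the simulated PHQ-9-style scores, group labels, effect size, and sample sizes are all invented for illustration:

```python
# Sketch: ANCOVA comparing post-treatment depression scores between two
# groups while adjusting for baseline scores. Data is simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 35                                                # per-group size (assumed)
baseline = rng.normal(12, 4, 2 * n)                   # PHQ-9-like baseline scores
group = np.repeat(["woebot", "control"], n)
effect = np.where(group == "woebot", -3.0, 0.0)       # assumed treatment effect
post = baseline + effect + rng.normal(0, 3, 2 * n)    # post-treatment scores

df = pd.DataFrame({"post": post, "baseline": baseline, "group": group})
model = smf.ols("post ~ baseline + C(group)", data=df).fit()
print(model.f_test("C(group)[T.woebot] = 0"))         # F-test on the group term
```

Adjusting for baseline in this way is what distinguishes ANCOVA from a plain comparison of change scores: it accounts for the correlation between baseline severity and outcome before testing the group effect.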
Article
Full-text available
Deep learning (DL) is a family of machine learning methods that has gained considerable attention in the scientific community, breaking benchmark records in areas such as speech and visual recognition. DL differs from conventional machine learning methods by virtue of its ability to learn the optimal representation from the raw data through consecutive nonlinear transformations, achieving increasingly higher levels of abstraction and complexity. Given its ability to detect abstract and complex patterns, DL has been applied in neuroimaging studies of psychiatric and neurological disorders, which are characterised by subtle and diffuse alterations. Here we introduce the underlying concepts of DL and review studies that have used this approach to classify brain-based disorders. The results of these studies indicate that DL could be a powerful tool in the current search for biomarkers of psychiatric and neurologic disease. We conclude our review by discussing the main promises and challenges of using DL to elucidate brain-based disorders, as well as possible directions for future research.
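The core idea the review describes, stacking nonlinear transformations so that each layer produces a more abstract representation of the raw input, can be shown in a few lines of NumPy. The layer sizes, ReLU activation, and random data below are arbitrary illustrative choices, not a model from any cited study:

```python
# Sketch: a forward pass through two nonlinear layers, showing how raw
# inputs are mapped to successively more abstract representations.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 32))             # batch of 4 raw inputs (e.g. image features)

def layer(inp, out_dim):
    """One affine transform followed by a ReLU nonlinearity."""
    w = rng.normal(scale=0.1, size=(inp.shape[1], out_dim))
    b = np.zeros(out_dim)
    return np.maximum(inp @ w + b, 0.0)  # ReLU: without it, stacked layers
                                         # collapse into a single linear map

h1 = layer(x, 16)                        # first-level representation
h2 = layer(h1, 8)                        # higher-level, more abstract representation
logits = h2 @ rng.normal(scale=0.1, size=(8, 2))   # linear read-out to 2 classes
print(h1.shape, h2.shape, logits.shape)
```

In a trained network the weights are learned by gradient descent rather than drawn at random, but the structure — consecutive nonlinear transformations ending in a classifier — is exactly what the review means by learning representations from raw data.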
Article
Psychiatric disorders are now responsible for the largest proportion of the global burden of disease, and even more challenges have been seen during the COVID-19 pandemic. Artificial intelligence (AI) is commonly used to facilitate the early detection of disease, understand disease progression, and discover new treatments in the fields of both physical and mental health. The present review provides a broad overview of AI methodology and its applications in data acquisition and processing, feature extraction and characterization, psychiatric disorder classification, potential biomarker detection, real-time monitoring, and interventions in psychiatric disorders. We also comprehensively summarize AI applications with regard to the early warning, diagnosis, prognosis, and treatment of specific psychiatric disorders, including depression, schizophrenia, autism spectrum disorder, attention-deficit/hyperactivity disorder, addiction, sleep disorders, and Alzheimer's disease. The advantages and disadvantages of AI in psychiatry are clarified. We foresee a new wave of research opportunities to facilitate and improve AI technology and its long-term implications in psychiatry during and after the COVID-19 era.
User Acceptance of AI Mental Health Chatbots: A Comparative Study
  • Emma White
  • Maria Rodriguez
  • Michael Brown
Artificial Intelligence in Psychiatry: Potential Uses of Machine Learning Include Predicting the Risk of Suicide, Psychosis
  • H Kalanderian
  • H A Nasrallah
AI Chatbots for Student Wellbeing: A Case Study in Educational Settings
  • Lucas Miller
  • Julia Bennett
Natural Language Processing for Emotional Assessment in AI Mental Health Chatbots
  • Reyna Patel
  • Rajiv Kumar
  • Anika Sharma
Woebot: A Conversational Agent to Combat Mental Health Challenges
  • A Darcy
  • A K J Walls
  • K Gilstad-Hayden
AI-Driven Predictive Modeling for Mental Health Assessment
  • William King
  • Olivia Turner
  • Ethan Mitchell
AI-Based Mental Health Chatbots: State-of-the-Art Review
  • Sarah Johnson
  • David Lee
  • Mark Davis
AI-Enhanced Conversational Agents in Mental Health: Ethical Considerations
  • Sophia Hall
  • Alex Turner
  • Emily Patel