Generative AI Could Help Solve the U.S. Mental Health Crisis

Ashley Andreou & ChatGPT

At ten years old, I considered I, Robot a scary movie. To this day, it lives on as a flashbulb memory in my mind—loss of control and existential threat don’t sit well with me. A similar uneasiness took hold when I recently read the news: ChatGPT can handle questions on the US medical licensing exam (USMLE).1 Great, I thought. Now I not only need to fear robotic sentience but must also wonder whether I paid almost half a million dollars for medical school for no reason. And with Microsoft announcing a $10 billion investment in ChatGPT’s maker, OpenAI, and Google unveiling its own version, Bard, the future looms large and inevitable.2 Now, as I prepare to match into psychiatry, I can’t help but wonder whether ChatGPT could do my future job.

The app-based chatbot, released in December 2022, immediately popularized generative AI. Generative AI analyzes patterns in written words from across the web, building complex mathematical and computational models, to produce human-like text. With its dizzying number of potential uses and rapidly growing user base, generative AI platforms could be instrumental in providing psychiatric care, especially given the field’s stark supply-and-demand imbalance.
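For readers unfamiliar with the mechanics, the core of these systems is autoregressive next-word prediction: given everything written so far, the model assigns probabilities to possible next words and samples one, repeating until it decides to stop. The toy Python sketch below illustrates only that sampling loop, with a tiny hand-written word table standing in for the billions of parameters a real model learns from web-scale text.

```python
import random

# Toy "language model": hand-written probabilities for the next word given the
# previous word. A real LLM learns distributions like these (over sub-word
# tokens, conditioned on long contexts) from vast amounts of text.
BIGRAM_MODEL = {
    "<start>": {"i": 0.6, "patients": 0.4},
    "i": {"feel": 0.7, "cannot": 0.3},
    "feel": {"anxious": 0.5, "hopeful": 0.5},
    "patients": {"feel": 1.0},
    "cannot": {"sleep": 1.0},
    "anxious": {"<end>": 1.0},
    "hopeful": {"<end>": 1.0},
    "sleep": {"<end>": 1.0},
}

def generate(model, max_tokens=10):
    """Sample one word at a time until the model emits an end token."""
    word, output = "<start>", []
    for _ in range(max_tokens):
        choices = model[word]
        word = random.choices(list(choices), weights=list(choices.values()))[0]
        if word == "<end>":
            break
        output.append(word)
    return " ".join(output)

print(generate(BIGRAM_MODEL))  # e.g. "i feel hopeful"
```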

Today, 21% of US adults report experiencing a mental illness, and one in ten youths report a mental illness that severely impacts their lives. Yet there is currently only one mental healthcare professional for every 350 people.3 Trained on clinical data, generative AI could aid in psychiatric diagnosis, medication management, and psychotherapy. The technology could act as a patient-facing chatbot or as a back-end assistant that provides the physician with insights garnered from its large language model (LLM) processing capabilities. But at what cost?

Generative AI Could Improve Patient Health, Autonomy, and Equity

For a hospital or clinic to safely integrate generative AI, a certified mental healthcare professional would need to provide oversight, in a hybrid care model, with all data falling under HIPAA’s jurisdiction. First, ChatGPT could help analyze patients’ linguistic and communication patterns to improve physicians’ diagnostic accuracy and identification of patients in crisis. For instance, ChatGPT could pick up on verbal subtleties that emerged before a patient’s past manic episodes. ChatGPT could also make pharmacological and interventional treatments more effective by analyzing patients’ language patterns to discern early signs of treatment responses. In conversation, it could quantify how much less a patient may perseverate or exhibit negative cognitive distortions. These conversational statistics, which hold mental status exam clues, could be analyzed and trended in real time. When presented in conjunction with the clinician’s findings, this process may protect against human error to improve diagnostic accuracy and enhance proactive treatment adjustments.
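To make the back-end assistant idea concrete, here is a minimal sketch of what such a pipeline might look like, assuming a de-identified transcript, the current OpenAI Python SDK, and a hypothetical prompt; the model choice and the framing of the task are illustrative assumptions rather than a validated clinical tool, and any real deployment would need HIPAA-compliant infrastructure and clinician review.

```python
from openai import OpenAI  # assumes the OpenAI Python SDK (v1.x) is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_session_language(transcript: str) -> str:
    """Ask an LLM to flag linguistic patterns (e.g., perseveration, negative
    cognitive distortions) in a de-identified session transcript. The output
    is a draft for clinician review, never an autonomous assessment."""
    prompt = (
        "You are assisting a psychiatrist. From the de-identified therapy "
        "transcript below, list any notable linguistic patterns (perseveration, "
        "negative cognitive distortions, pressured or disorganized speech) with "
        "short supporting quotes. Do not diagnose.\n\n" + transcript
    )
    response = client.chat.completions.create(
        model="gpt-4",  # hypothetical model choice for this sketch
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # keep the draft deterministic for review
    )
    return response.choices[0].message.content
```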

ChatGPT could also provide psychiatrists with the most up-to-date and relevant research on treatment options that address a patient’s particular symptoms (not just their broad diagnosis), supporting individualized care. Such individualization is integral in psychiatry given the multiple options within each medication class (e.g., second-generation antipsychotics or SSRIs) and the numerous overlapping symptoms across DSM-5 diagnoses.

In terms of generative AI’s use in psychotherapy, data shows a mounting exodus of psychiatrists from this realm: 50% of psychiatrists today report they do no psychotherapy.4 As many modalities of psychotherapy have become manualized (i.e., performed according to specific administration guidelines that maximize the probability of the intervention being conducted consistently across settings, therapists, and clients), studies show that the average psychiatrist does not offer a substantial comparative advantage in overall outcomes compared to less costly therapy practitioners. As supply and demand forces edge psychiatrists out of therapy, ChatGPT could be key in the search for cheaper alternatives.

If ChatGPT were to help craft therapeutic responses, it may be most beneficial in assisting therapists with manualized therapies that leverage specific tools. ChatGPT’s absent humanity is at the crux of its limited ability to autonomously provide psychotherapy. In the words of the psychiatrist Phil Stutz, the common denominators of the human experience are pain, uncertainty, and constant work, none of which ChatGPT has felt. These universal elements add authenticity to a human therapist’s words. When they say, “I understand… I care about you… I want you to feel safe,” their verbal and non-verbal communication conveys true empathy and emotional support derived from parallel lived experiences.

Countless studies show that the transcendent factor in the efficacy of all psychotherapy is the strength of the therapeutic alliance, built through attunement, non-verbal gestures, and positive regard.5 Here, ChatGPT falls short. The chatbot also hasn’t cared for a patient through their darkest times over years, even decades. A longitudinal and storied therapist-patient relationship builds an incomparable alliance.

To overcome ChatGPT’s lack of human experience and long-term rapport, it may be best suited for therapy modalities that depend minimally on emotional and supportive statements (which are central to supportive psychotherapy and motivational interviewing) or on the interpersonal relationship between patient and therapist (central to psychodynamic psychotherapy). Generative AI could primarily assist in manual-based treatment modalities like cognitive behavioral therapy or interpersonal therapy, where ready-made tools can be taught and applied. Ultimately, IRB-approved research will be needed to understand a chatbot’s ability to develop a therapeutic alliance and execute certain psychotherapy modalities.

In addition to clinical benefits, ChatGPT could improve patient empowerment. Bettering patients’ autonomy and dignity is crucial in psychiatric care: psychiatry has a history of coercive institutionalization, and involuntary admission and treatment can make patients feel devalued or diminish their sense of agency.6 Despite significant progress, mental illness is still highly stigmatized. More than half of people with mental illness don’t receive care, citing stigma as the main barrier.7 Thus, accessing behavioral healthcare virtually, on one’s own terms and time, may empower patients. ChatGPT’s ability to democratize the source of medical information may also help deconstruct the historically paternalistic medical hierarchy and center patients in their mental health treatment. In addition, ChatGPT’s analysis of speech and language patterns may more accurately identify, rather than circumvent, patients’ concerns (e.g., medication side effects that the physician may consider relatively minor) to optimize shared decision-making.

Finally, ChatGPT could benefit not only individual patient care but also the entire behavioral healthcare system. ChatGPT could help distribute mental health resources more equitably by reducing costs and increasing access, alleviating mental health disparities. Because of psychotherapy’s high cost (and often insufficient coverage by Medicaid and Medicare) and lengthy wait times, substantial equity issues exist in accessing it. ChatGPT could save psychiatrists time, for example through aided telepsychiatry responses and automation of administrative tasks such as routine patient progress tracking via screening tools like the PHQ-9 or Y-BOCS, allowing clinicians to care for more patients at a lower cost.
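As a concrete illustration of the kind of routine tracking that could be automated, the sketch below scores a PHQ-9 questionnaire (nine items, each rated 0 to 3) and maps the total to the standard severity bands. It is a plain scoring helper written for this piece, not a ChatGPT feature, and a clinician would still interpret the result in context.

```python
from typing import Sequence

def score_phq9(item_scores: Sequence[int]) -> tuple[int, str]:
    """Total a PHQ-9 questionnaire (nine items, each scored 0-3) and return
    the total with its standard severity band. Illustrative helper only."""
    if len(item_scores) != 9 or any(not 0 <= s <= 3 for s in item_scores):
        raise ValueError("PHQ-9 requires nine item scores between 0 and 3")
    total = sum(item_scores)
    if total <= 4:
        severity = "minimal"
    elif total <= 9:
        severity = "mild"
    elif total <= 14:
        severity = "moderate"
    elif total <= 19:
        severity = "moderately severe"
    else:
        severity = "severe"
    return total, severity

print(score_phq9([1, 2, 1, 0, 2, 1, 3, 0, 1]))  # (11, 'moderate')
```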

(Here is an independent example of how ChatGPT can write an informative blog post; it was prompted to write about “Mental Health and substance abuse in New York.” – J. Riley)

Mental health and substance abuse are two interconnected issues that continue to plague many people in New York City. According to the New York State Office of Alcoholism and Substance Abuse Services, about 1.9 million New Yorkers struggle with a substance use disorder, and many of them also suffer from co-occurring mental health conditions. In this blog post, we will explore the impact of mental health and substance abuse in New York and the resources available to those who need help.

The Impact of Mental Health and Substance Abuse in New York
Mental health and substance abuse are often intertwined. Substance abuse can lead to the development of mental health disorders, and vice versa. In New York, many people struggling with mental health issues turn to drugs or alcohol as a way to cope with their symptoms. Unfortunately, this can exacerbate their condition and lead to addiction.

Substance abuse can have a devastating impact on individuals, families, and communities. It can lead to health problems, financial instability, relationship issues, and legal problems. In New York City, substance abuse is a leading cause of overdose deaths. In 2020, there were 2,317 overdose deaths in New York City, an increase of 25% from the previous year.

The pandemic has also exacerbated mental health and substance abuse issues in New York. Many people are struggling with isolation, financial stress, and other challenges, which can trigger or worsen mental health conditions and substance abuse. According to a survey conducted by the New York City Department of Health and Mental Hygiene, 53% of New Yorkers reported symptoms of anxiety or depression in September 2020, up from 29% in 2018.

Resources Available for Mental Health and Substance Abuse in New York
Fortunately, there are resources available for people struggling with mental health and substance abuse in New York. The state has a network of treatment providers, including hospitals, clinics, and community-based organizations, that offer a range of services, including detoxification, counseling, medication-assisted treatment, and peer support.

The New York State Office of Addiction Services and Supports (OASAS) operates a 24/7 toll-free helpline, 1-877-8-HOPENY, where individuals can get information about substance abuse treatment and referrals to local providers. OASAS also offers an online directory of treatment providers in New York.

The New York City Department of Health and Mental Hygiene operates a range of mental health programs and services, including crisis intervention, outpatient treatment, and mobile treatment teams. The NYC Well program provides free and confidential mental health support and referrals 24/7 through phone, text, and chat.

In addition to these resources, there are also many community-based organizations that provide support and services to people struggling with mental health and substance abuse in New York. For example, the Partnership for Drug-Free Kids offers resources and support to families affected by substance abuse, and the National Alliance on Mental Illness (NAMI) provides education, advocacy, and support for individuals and families affected by mental illness.

Conclusion
Mental health and substance abuse are complex issues that require a comprehensive approach to address. In New York, there are many resources available to individuals and families struggling with these challenges, including treatment providers, helplines, and community-based organizations. If you or someone you know is struggling with mental health or substance abuse, don’t hesitate to reach out for help. There is hope, and recovery is possible.

Generative AI’s Legal/Ethical Pandora’s Box and Therapeutic Limitations

The potential for ChatGPT to cause harm in psychiatric care would depend largely on the level of oversight by certified healthcare professionals. A collaborative approach could become a slippery slope in a healthcare system focused on efficiency and cost. Increasing generative AI’s autonomy and therapeutic responsibilities in pursuit of substantial profits could erode critical behavioral healthcare infrastructure.

ChatGPT may not be sensitive or perceptive enough to patients’ communication signals (i.e., tone, inflection, prosody, fluency, and non-verbal cues) to pick up on subtle markers of physical, sexual, or emotional abuse or of crisis (e.g., suicidality, homicidal tendencies, mania, or malignant catatonia). The inability to identify such cues and ask questions to discern whether the patient or others in their life are safe could lead to catastrophic outcomes and a legal deluge; in many states, ChatGPT would become a mandated reporter (e.g., for abuse) with legally binding responsibilities.

Gaming the Chatbot?

Generative AI also has enormous implications for involuntary admission and treatment. Predicting which patients are high-risk and therefore poor candidates for AI-assisted care is not straightforward. Perhaps most frighteningly, patients may learn to game AI in a way they couldn’t a real provider. For instance, a patient may know what to say to avoid hospitalization when suicidal. Patient risk assessment requires a thorough, moment-to-moment physical and mental status exam as well as longitudinal knowledge of the individual. ChatGPT may not only miss crucial nuances but also incorrectly influence the thinking of the psychiatrist reviewer. For instance, if an EKG machine’s reading differs from a doctor’s manual analysis, the doctor often second-guesses themselves even when the machine is wrong; humans tend to defer to technology. Altogether, the malpractice concerns for misdiagnosis are staggering if AI evaluations are incorporated into safety assessments and involuntary admission. Patients could claim that generative AI got it wrong, leading to their unlawful admission.

ChatGPT may also jeopardize informed consent. ChatGPT is programmed to present information in a confident, highly factual tone, which could lead patients to miscalibrate evidence in their decision-making. Further, informed consent and feeling authentically supported by a chatbot therapist may be incompatible. Was it the psychiatrist or ChatGPT who wrote the virtual therapeutic response? And if both were involved, what did the co-construction process look like?

In addition to potential individual patient harm, ChatGPT’s implementation could negatively transform the behavioral healthcare landscape. It is entirely possible that ChatGPT is overhyped and not that effective compared to a practitioner. If chatbot-assisted care is less efficacious, its implementation could exacerbate mental health disparities by offering less effective treatment to low-income, rural, and adolescent patients. For instance, people may assume that adolescents, who face the greatest shortage of mental health providers, prefer chat-based psychiatric care because of their generation’s technological savviness. Paradoxically, recent studies have found that younger age groups are less likely to use digital mental health interventions and report a low preference for online mental healthcare compared with face-to-face treatment.8

Overestimating this market fix would create further harm as funding is diverted from more evidence-based alternatives. The availability of ChatGPT may also reduce incentives for real people to offer care in medically underserved communities (even less pay, more competition) and further reduce the quality of available services. Touting this shiny new option could distract from the political, systems-based work needed to fix disparities, such as improving access to and financing of psychotherapy for medically underserved communities. Altogether, more research is needed to determine for which patient populations, if any, ChatGPT may be most effective before we pour resources into it.

Finally, disseminating ChatGPT would require a private company, such as OpenAI or Google, to handle sensitive, HIPAA-protected data. Psychiatric notes are extremely personal, and psychotherapy notes carry additional protections compared with other clinical documentation.9 Concentrating control over this data and its processing creates the risk that it could be stolen in a data breach. Transforming generative AI into an accurate clinical assistant will require training its LLM on clinical data (not just general internet text like ChatGPT). Such training will need access to past notes and mass data collection, processing, and storage, substantially increasing the risk of mishandling high-stakes information.

Ultimately, physicians’ egos or fears of extinction should not hinder their ability to help patients. Medicine can join, research, regulate, and symbiotically work with generative AI companies, or we can stick our heads in the sand and hope Will Smith will save us from the robots.

References

1. USMLE Program Discusses ChatGPT | USMLE. (n.d.). Retrieved March 1, 2023, from https://www.usmle.org/usmle-program-discusses-chatgpt

2. Google Announces Bard, Its Rival To Microsoft-Backed ChatGPT. (n.d.). Retrieved March 1, 2023, from https://www.forbes.com/sites/qai/2023/02/08/google-announces-bard-its-rival-to-microsoft-backed-chatgpt/?sh=2388e5043791

3. NIMH, Mental Illness. (n.d.). Retrieved March 1, 2023, from https://www.nimh.nih.gov/health/statistics/mental-illness

4. Trends in Outpatient Psychotherapy Provision by U.S. Psychiatrists: 1996–2016 | American Journal of Psychiatry. (n.d.). Retrieved March 1, 2023, from https://ajp.psychiatryonline.org/doi/full/10.1176/appi.ajp.2021.21040338

5. Alliance: A common factor of psychotherapy modeled by structural theory—PMC. (n.d.). Retrieved March 1, 2023, from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4404724/

6. Understanding psychiatric institutionalization: A conceptual review—PMC. (n.d.). Retrieved March 1, 2023, from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3702490/

7. Psychiatry.org—Stigma, Prejudice and Discrimination Against People with Mental Illness. (n.d.). Retrieved March 1, 2023, from https://www.psychiatry.org/patients-families/stigma-and-discrimination

8. Evidence on Digital Mental Health Interventions for Adolescents and Young People: Systematic Overview—PMC. (n.d.). Retrieved March 1, 2023, from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8120421/

9. 2088-Does HIPAA provide extra protections for mental health information compared with other health information? | HHS.gov. (n.d.). Retrieved March 1, 2023, from https://www.hhs.gov/hipaa/for-professionals/faq/2088/does-hipaa-provide-extra-protections-mental-health-information-compared-other-health.html
