Bradley Merrill Thompson, Strategic Advisor with EBG Advisors and Member of the Firm at Epstein Becker Green, was quoted in Axios, in “Growth of AI in Mental Health Raises Fears of Its Ability to Run Wild,” by Sabrina Moreno.

Following is an excerpt:

The rise of AI in mental health care has providers and researchers increasingly concerned over whether glitchy algorithms, privacy gaps and other perils could outweigh the technology's promise and lead to dangerous patient outcomes.
Why it matters: As the Pew Research Center recently found, there's widespread skepticism over whether using AI to diagnose and treat conditions will complicate a worsening mental health crisis.

  • Mental health apps are also proliferating so quickly that regulators are hard-pressed to keep up.
  • The American Psychiatric Association estimates there are more than 10,000 mental health apps circulating on app stores. Nearly all are unapproved.

What's happening: AI-enabled chatbots like Wysa and FDA-approved apps are helping ease a shortage of mental health and substance use counselors.

  • The technology is being deployed to analyze patient conversations and sift through text messages to make recommendations based on what we tell doctors.
  • It's also predicting opioid addiction risk, detecting mental health disorders like depression and could soon design drugs to treat opioid use disorder.

Driving the news: The fear is now concentrated around whether the technology is beginning to cross a line and make clinical decisions, and what the Food and Drug Administration is doing to prevent safety risks to patients.

  • KoKo, a mental health nonprofit, recently used ChatGPT as a mental health counselor for about 4,000 people who weren't aware the answers were generated by AI, sparking criticism from ethicists.
  • Other people are turning to ChatGPT as a personal therapist despite warnings from the platform saying it's not intended to be used for treatment.

Catch up quick: The FDA has been updating app and software guidance to manufacturers every few years since 2013 and launched a digital health center in 2020 to help evaluate and monitor AI in health care.

  • Early in the pandemic, the agency relaxed some premarket requirements for mobile apps that treat psychiatric conditions, to ease the burden on the rest of the health system.
  • But its process for reviewing updates to digital health products is still slow, a top official acknowledged last fall.
  • A September FDA report found the agency's current framework for regulating medical devices is not equipped to handle "the speed of change sometimes necessary to provide reasonable assurance of safety and effectiveness of rapidly evolving devices."

That's incentivized some digital health companies to skirt costly and time-consuming regulatory hurdles such as supplying clinical evidence — which can take years — to support the app's safety and efficacy for approval, said Bradley Thompson, a lawyer at Epstein Becker Green specializing in FDA enforcement and AI.

  • And despite the guidance, "the FDA has really done almost nothing in the area of enforcement in this space," Thompson told Axios.
  • "It's like the problem is so big, they don't even know how to get started on it and they don’t even know what they should be doing."
  • That's left the task of determining whether a mental health app is safe and effective largely up to users and online reviews.

Related reading:

Bradley Merrill Thompson Quoted in Globe Live Media, in “Artificial Intelligence Applied to Mental Health: Lack of Control Can Have Serious Consequences,” by Melissa Galbraith.
