Therapists: Be Mindful of the Risks of AI Use and Confidentiality
Artificial Intelligence (AI) is becoming an increasingly common tool in the mental health field, from generating treatment plans and progress notes to analyzing clients' progress in therapy. While AI can improve efficiency, it also raises serious confidentiality and ethical considerations that therapists cannot ignore.
The trust between therapist and client depends on protecting sensitive information, and using AI without proper safeguards could jeopardize that trust and potentially violate HIPAA.

The Appeal of AI in Therapy
AI tools can help therapists:
- Automate note taking and documentation.
- Generate treatment plan ideas.
- Identify patterns in client mood or behavior data.
- Provide supplemental skill-building resources.
The American Psychological Association (APA) acknowledges that AI can enhance mental health care, but only when it is used with caution and ethical safeguards are in place.
The Confidentiality Risks of AI Use
While AI can be a helpful addition, it comes with risks:
Data Privacy Concerns
- Not all AI platforms are HIPAA-compliant, meaning client data may be stored, shared, or used without the necessary protections.
- Information entered into AI systems may be retained on the provider's servers, where it is potentially accessible to third parties.
Data Breaches and Cybersecurity
- AI tools connected to the internet are vulnerable to hacking, as breaches at many major platforms over the past few years have shown.
- Even anonymized data can sometimes be re-identified through advanced analytics.
Inadvertent Disclosure
- Uploading session notes or transcripts to AI tools without safeguards can expose sensitive personal details.
- "Free" AI tools often monetize user inputs by using them to train other models, cycling client information through additional systems and compounding privacy concerns.
Best Practices for Therapists Using AI
To protect client confidentiality while utilizing AI:
- Use HIPAA-Compliant Tools Only – Confirm that the platform provides a Business Associate Agreement (BAA).
- Avoid Uploading Identifiable Information – Remove names, dates, locations, and other protected health information (PHI) before entering text into AI tools; when in doubt, less is more.
- Get Informed Consent – Discuss potential AI use with clients, informing them of the risks and limitations.
- Remain Informed – Keep up to date on AI regulations and ethical guidelines through professional literature and associations.
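For practices that build their own documentation tooling, the "remove identifiers before entering text" step above can be partially automated. The sketch below is a minimal, illustrative example only: the regex patterns are assumptions for demonstration, and simple pattern matching is nowhere near sufficient for HIPAA-grade de-identification, which still requires manual review and validated tooling.

```python
import re

# Illustrative sketch: scrub a few obvious identifier patterns from text
# before it is pasted into an AI tool. These patterns are assumptions for
# demonstration; they do NOT constitute HIPAA de-identification, and names,
# locations, and other free-text identifiers still need manual review.
REDACTION_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),          # SSN-style numbers
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),  # US phone numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # email addresses
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),   # dates like 3/14/2024
]

def scrub(text: str) -> str:
    """Replace obvious identifier patterns with placeholder tags."""
    for pattern, placeholder in REDACTION_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

note = "Session on 3/14/2024; reach client at jane.doe@example.com or 555-867-5309."
print(scrub(note))
```

Even with a filter like this in place, a clinician should read the scrubbed text before submitting it, since no automated pass catches every identifying detail.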
