The way to make ChatGPT HIPAA compliant is to deploy anonymizing software between users and ChatGPT so that no Protected Health Information is disclosed to the program. However, when using this solution, the anonymizing software itself must be HIPAA compliant, and a Business Associate Agreement must be in place with the software vendor.
ChatGPT can be beneficial in any industry where it can be used to automate repetitive tasks, optimize workflows, and summarize data. In the healthcare industry, ChatGPT also has the potential to revolutionize the delivery of healthcare by analyzing symptoms, suggesting diagnoses, and recommending courses of treatment. The issue with using ChatGPT in healthcare is that the program is not currently HIPAA compliant.
Recently, OpenAI – the developer of ChatGPT – published an article about HIPAA compliance in which the company stated it is willing to enter into a Business Associate Agreement for some API services, but not for ChatGPT. The article noted OpenAI is working on offering a HIPAA compliant version of ChatGPT in the future, but this will be rolled out to Enterprise clients first. Smaller healthcare organizations will have to wait.
How to Make ChatGPT HIPAA Compliant
Healthcare organizations that want to benefit from ChatGPT’s capabilities sooner rather than later can make ChatGPT HIPAA compliant by deploying anonymizing software such as CompliantChatGPT and BastionGPT between users and the ChatGPT program. This prevents Protected Health Information (PHI) from being disclosed impermissibly to ChatGPT so the program can be used in compliance with HIPAA.
The way the software works eliminates the need for users to be careful about how prompts are worded. Any information that identifies the subject of the prompt is removed and replaced with tokens. So – for example – if a prompt discusses “John, 35, under the care of Dr. Richardson”, what ChatGPT sees is “name, age, under the care of Dr. A”. When a response is received from ChatGPT, the tokens are replaced with the original information before the response is shown to the user.
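The tokenize-and-restore process can be sketched in a few lines of code. The example below is a hypothetical illustration of the general technique, not the actual implementation used by CompliantChatGPT or BastionGPT; the function names and token format are assumptions for demonstration purposes only.

```python
# Hypothetical sketch of PHI tokenization. This is NOT the implementation
# used by CompliantChatGPT or BastionGPT; it only illustrates the concept.

def anonymize(prompt, phi_values):
    """Replace each PHI value with a placeholder token, and return the
    anonymized prompt plus the mapping needed to restore it later."""
    mapping = {}
    for i, value in enumerate(phi_values):
        token = f"[PHI_{i}]"
        mapping[token] = value
        prompt = prompt.replace(value, token)
    return prompt, mapping

def deanonymize(response, mapping):
    """Swap the placeholder tokens in the AI response back for the
    original PHI before the response is shown to the user."""
    for token, value in mapping.items():
        response = response.replace(token, value)
    return response

prompt = "John, 35, under the care of Dr. Richardson, reports chest pain."
anonymized, mapping = anonymize(prompt, ["John", "35", "Dr. Richardson"])
# anonymized == "[PHI_0], [PHI_1], under the care of [PHI_2], reports chest pain."
```

In practice, commercial anonymizing tools detect PHI automatically (for example with named-entity recognition) rather than relying on a supplied list, but the round trip – strip identifiers, query the model, restore identifiers – follows the same pattern.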
Because PHI is being disclosed to the anonymizing software, it is necessary to enter into a Business Associate Agreement with the software vendor. Both CompliantChatGPT and BastionGPT offer a “one-size-fits-all” Business Associate Agreement to healthcare organizations. When evaluating other HIPAA compliant software vendors, be aware that some connect to AI platforms other than ChatGPT.
Other Considerations when Using ChatGPT
While it is possible to make ChatGPT HIPAA compliant by deploying anonymizing software, it is important to consider the consequences of anonymizing PHI. In the above example, the age of the patient may be critical to producing an accurate output – especially if the patient is very young or very old. The same could be said for the gender, ethnicity, or biometric information of the patient. In some cases, anonymized data could produce contradictory outputs.
Therefore, while it is important to comply with the HIPAA Rules, it is more important that any output produced by any Generative Pre-trained Transformer (GPT) is checked for accuracy before being relied on to diagnose a health condition or prescribe a course of treatment for a patient. This guidance applies whether ChatGPT is used with anonymizing software or whether healthcare organizations wait until OpenAI makes ChatGPT HIPAA compliant.