Tech & Innovation in Healthcare

Are You Compliant When Using ChatGPT With Office Tasks?

Question: A staff member was writing a prior authorization request to the patient’s payer and used ChatGPT to help them craft the letter. Is using artificial intelligence (AI) to help with office work compliant with HIPAA regulations?

North Dakota Subscriber

Answer: The answer depends on how the employee used the technology to write the prior authorization letter. If they asked ChatGPT to generate a prior authorization letter template with blank areas that they could then populate with the essential information the payer needs to review, then the employee probably remained compliant with HIPAA regulations.

On the other hand, if the employee shared protected health information (PHI) with the generative AI technology to produce a customized prior authorization request letter that the staff member only had to review, then that is a HIPAA violation. ChatGPT, like many other generative AI models, is an internet-connected service: the information a user enters is transmitted to the technology's servers as the output is generated, and the model may learn from that input. So while a system like ChatGPT can be helpful for completing simple tasks, you should never include PHI in your requests.

Best practice: Speak with your healthcare organization's IT team or cybersecurity professionals to establish AI use and operational policies that ensure staff members use AI technologies safely while protecting patients' data.

Mike Shaughnessy, BA, CPC, Production Editor, AAPC