How To Use AI While Staying HIPAA Compliant

Industries across the board are incorporating artificial intelligence into their business models to improve customer satisfaction and optimize return on investment, and healthcare is no exception. But while AI's rapid rise in popularity has brought innovation to how we work, it has also raised concerns about potential pitfalls and risks.

In this article, we will discuss how AI is rapidly changing the healthcare landscape for the better. We will also outline its potential pitfalls and risks, and how to protect patient privacy when using AI.

The Growing Role of AI in Healthcare: The Benefits and Risks

The world population hit the 8 billion mark in 2022 (1). Advanced technologies and better living conditions have increased human lifespans. This is to our advantage in many ways, but it has also created unique problems. For example, the world now has a growing aging population that needs access to healthcare for age-related chronic conditions. Because of this, the demand for health workers is at an all-time high (2). How can the world meet this need in a timely and cost-effective manner?

Enter AI. It can save time, cut costs, minimize waste, and make healthcare delivery more efficient overall. Here are just a few ways AI is rapidly making healthcare more efficient:

  1. Virtual nurse assistants (which are chatbots) provide round-the-clock support to patients. They attend to patients’ queries, forward lab reports and case notes to clinicians, and schedule hospital appointments. In doing so, they relieve clinicians of the drudgery of repetitive tasks, freeing up time to meet and interact with patients (3).
  2. Virtual administrative assistants automate administrative and clerical roles. With this, they reduce the risk of errors, fraud, and waste (3).
  3. Virtual personal assistants can work with patients one-to-one to provide support in the form of medication and appointment reminders, and can even flag incorrect administration of common self-administered treatments like insulin and inhalers (3).

Pioneers and visionaries are currently pushing the boundaries of AI by improving its ability to reason clinically so that it can take on clinical roles such as patient consultations, surgery, and medical and nursing care (4). This can reduce hospital waiting times and improve access to healthcare in remote areas.

However, the risks of AI are concerning and potentially unprecedented. Misinformation, security breaches, surveillance, and existential threats are just a few examples. In the healthcare context, AI poses these risks (5):

  1. Breaches in data security, privacy, and integrity.
  2. Lack of transparency in how data is used.
  3. Misinformation and misdiagnosis.
  4. Over-reliance on AI and loss of skills.
  5. Displacement of human resources.
  6. Increased inequity and inequality along lines of race, color, gender, religion, and other socioeconomic and cultural factors.
  7. Encouragement of monopoly.
  8. Existential threats, such as enabling cyberterrorism and the development of weapons of mass destruction, amongst others.

The Importance of HIPAA Compliance for AI Development

To prevent these risks, AI must protect the privacy, security, and integrity of users’ data. It must also be transparent in how it uses and analyzes data.

Is AI currently compliant with these principles? Not exactly.

For example, OpenAI’s ChatGPT, the most popular AI chatbot, doesn’t offer business associate agreements (6). What does this mean? Covered entities that use such AI tools in their healthcare services remain liable for noncompliance with the standards of the Health Insurance Portability and Accountability Act (HIPAA) on protected health information. The same goes for business associates and subcontractors.

The consequences of failing to use AI responsibly in healthcare are serious. Apart from increasing the likelihood of the risks outlined above, erring parties face legal and financial penalties.

Key Considerations for Using AI With Protected Health Information (PHI)

The risks of AI use in healthcare shouldn’t deter you from making the most of its benefits. If you decide to use AI as a healthcare provider, there are questions you must ask first. We created the following outline to help you protect patient privacy and screen prospective AI model providers:

  1. Do I have a business associate agreement with the AI model provider?
  2. What are the security options of my AI model provider? For example, do they host their services on a secure server? Do they use encryption for data at rest and in transit? (See the encryption sketch after this list.)
  3. What measures do they use to protect patient privacy with AI? For example, do they use de-identified patient information? Do they use differential privacy to prevent the extraction of a user’s data? (See the de-identification sketch after this list.)
  4. What measures do they use to preserve the integrity of data at rest and in transit? For example, how is data shared across networks? Do they use the same data-sharing agreement across all networks?
  5. What algorithm is used? Supervised or unsupervised? How does the type of algorithm influence the extent of HIPAA compliance?
  6. How transparent is my AI model provider? Do they clearly state how data is used and protected, as per HIPAA’s Notice of Privacy Practices (NPP) requirement?
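
To make the encryption question concrete, here is a minimal sketch of what encryption at rest could look like, written in Python using the widely available cryptography package. The key handling, record contents, and workflow are illustrative assumptions rather than a recommended implementation; in production, keys belong in a dedicated key-management system, and data in transit is protected separately, typically with TLS.

    from cryptography.fernet import Fernet  # pip install cryptography

    # Illustration only: in practice the key is generated once and stored in a
    # key-management system, never alongside the data it protects.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # A hypothetical PHI field as it might exist before storage.
    note = b"Patient reports chest pain; ECG scheduled."

    # Encrypt before writing to disk or a database (data at rest).
    ciphertext = cipher.encrypt(note)

    # Decrypt only inside an authorized, audited workflow.
    assert cipher.decrypt(ciphertext) == note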

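The privacy question bundles two distinct techniques, so this second sketch separates them: stripping direct identifiers in the spirit of HIPAA’s Safe Harbor method, and adding Laplace noise to an aggregate count, the basic mechanism behind differential privacy. The record, field names, and epsilon value are hypothetical, and a real de-identification pipeline must handle all eighteen Safe Harbor identifiers, not just the subset shown here.

    import random

    # Hypothetical patient record mixing direct identifiers and clinical data.
    record = {
        "name": "Jane Doe",
        "phone": "555-0134",
        "zip": "30301",
        "age": 58,
        "diagnosis": "Type 2 diabetes",
    }

    DIRECT_IDENTIFIERS = {"name", "phone"}  # a small subset of Safe Harbor identifiers

    def de_identify(rec: dict) -> dict:
        """Drop direct identifiers and generalize quasi-identifiers."""
        clean = {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}
        clean["zip"] = clean["zip"][:3] + "**"                    # keep only the first 3 digits
        clean["age"] = "90+" if clean["age"] >= 90 else clean["age"]
        return clean

    def dp_count(true_count: int, epsilon: float = 1.0) -> float:
        """Release a count with Laplace(1/epsilon) noise, the core differential privacy step."""
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return true_count + noise

    print(de_identify(record))       # identifiers removed, quasi-identifiers generalized
    print(dp_count(true_count=42))   # e.g. "How many patients had this diagnosis?"

Neither step eliminates re-identification risk on its own; they complement, rather than replace, a business associate agreement and the other safeguards above.
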
Do This To Maintain HIPAA Compliance for AI Development

Do not underestimate the power of a thorough and regular risk assessment. To ensure the responsible use of AI in healthcare, risk assessment must be two-way. First, perform a risk assessment on your AI model provider using the questions outlined above. Then, use the same set of questions to assess the risks in your own use of AI for healthcare. For example, after assessing your model provider’s approach to data integrity, you should also assess your own commitment to integrity, such as restricting access to qualified personnel only.

A thorough risk assessment requires a solid knowledge of HIPAA rules and regulations. This knowledge must go beyond a general overview of HIPAA laws. You must also know the practical application of these laws and how they affect your interaction with patients.

Health workers are often pressed for time. That is why we created this 90-minute HIPAA compliance training for health workers. The course covers the latest HIPAA requirements, including the Privacy, Security, and Enforcement Rules. It also includes a standalone exam that tests your knowledge.

After completion, we offer a free, accredited, and instantly downloadable certificate of completion. Click here to get started!

 

References

1. United States Census Bureau (2024). World Population Estimated at 8 Billion.

2. World Health Organization (2024). Health Workforce.

3. IBM (2024). AI Healthcare Benefits.

4. Bodenstedt, S., Wagner, M., Müller-Stich, B. P., Weitz, J., & Speidel, S. (2020). Artificial intelligence-assisted surgery: potential and challenges. Visceral Medicine, 36(6), 450-455.

5. Forbes (2024). The 15 Biggest Risks of Artificial Intelligence.

6. OpenAI (2024). How can I get a Business Associate Agreement (BAA) with OpenAI?