Thursday, Mar 9, 2023

From Diagnosis to Treatment: Why Reducing Bias in Healthtech Matters

HLTH

Technology offers an exciting and seemingly limitless path to better, faster, and more accessible healthcare for everyone. This shift is long overdue, but there is growing recognition that its potential won’t be fulfilled without understanding and guarding against technology’s power to widen disparities. Left unchecked, implicit bias can shape the design, development, and use of health technology in ways that cause harm.


The electronic medical record (EMR) is a digital record of a patient’s medical history, used to standardize medical data and improve patient outcomes by ensuring that providers have access to complete and accurate patient information. But implicit bias, our unconscious attitudes and stereotypes, can influence how clinicians collect and interpret that information. For example, a doctor or nurse may unconsciously make assumptions or judgments based on a patient’s race, gender, age, income, or other social determinants of health, and those assumptions shape how they document the patient’s symptoms or medical history.


A recent article published in JAMA Network Open [1] examined emergency department (ED) visits across a large urban academic health system to understand physicians’ use of behavioral flag notifications intended to identify potentially unsafe or aggressive patients. These flags can help healthcare providers identify patients who may need additional support or interventions. However, the study found that Black patients were almost twice as likely as White patients to be flagged, even when they had similar symptoms and medical histories. Flagged patients waited longer to be seen, experienced delays in care, and received less lab testing and imaging. Studies also show that Black patients are 2.54 times more likely than White patients to have negative comments in their patient notes [2]. While the EMR has revolutionized the way healthcare providers collect and manage patient information, it also exemplifies how technology can exacerbate the impact of implicit bias and racism.


Healthcare innovation has historically been driven by people who are not representative of the diverse communities they serve. This lack of diversity has led to products and services that may not be effective for patients with different health needs and experiences. Clinical trials, which are crucial for testing the safety and efficacy of new treatments and technologies, also lack diversity. The result is products and technologies that are less effective for some populations or carry higher risks of adverse effects.


To help address these issues, the HLTH Foundation has launched the Techquity for Health Coalition, and Tegria is proud to be part of it. The Coalition is committed to developing and integrating health equity standards into healthcare technology and data practices. We define techquity as the “strategic design, development, and deployment of technology to advance health equity,” a definition that encompasses the notion that technology can inhibit advancements in health equity if not implemented intentionally and inclusively. We see it “not as an individual or consumer-level problem, but rather a campaign requiring collaboration, transparency, inclusivity and a commitment to organizational and systemic transformation,” to borrow a quote from Ipsos’ Alexis Anderson.


We need to prioritize efforts to reduce bias and maximize the benefit of healthcare technology for everyone. Here are five steps we can start taking today: 

  • Conduct user research to understand the needs and experiences of different patient populations to help identify potential biases in the design and development of the technology.
  • Collect diverse data to train the technology and ensure it is representative of the population. This can help avoid biases in the algorithms and outcomes.
  • Regularly test the technology for bias by measuring its impact on different patient populations. This can help identify and address biases in the technology (a simple sketch of such an audit appears after this list).
  • Be transparent about the technology's limitations and potential biases to build trust with patients and users.
  • Engage with the community and seek input from diverse patient populations to ensure that the technology meets their needs and is accessible to all.
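As an illustration of the third step, the sketch below shows one very simple way a team might audit a flag or prediction for disparate impact across patient groups: compute the rate at which each group receives the flag, then compare the highest and lowest rates. This is a minimal, hypothetical example; the column names, toy data, and choice of metric are assumptions for illustration, not a prescribed methodology or the Coalition’s standard.

```python
# Hypothetical bias-audit sketch (illustrative only; column names and data are made up).
import pandas as pd


def rate_by_group(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Share of patients receiving the outcome (e.g., a behavioral flag), by group."""
    return df.groupby(group_col)[outcome_col].mean()


def disparity_ratio(rates: pd.Series) -> float:
    """Ratio of the highest to the lowest group rate; 1.0 indicates parity."""
    return float(rates.max() / rates.min())


if __name__ == "__main__":
    # Toy records standing in for a real audit dataset.
    records = pd.DataFrame({
        "race": ["Black", "Black", "Black", "White", "White", "White"],
        "flagged": [1, 1, 0, 1, 0, 0],
    })
    rates = rate_by_group(records, "race", "flagged")
    print(rates)  # per-group flag rates
    print("Disparity ratio:", round(disparity_ratio(rates), 2))
```

In practice, an audit like this would run on real, representative data, use multiple fairness metrics, and feed into a documented remediation process rather than a single pass/fail number.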


The growing recognition of the consequences of implicit bias in healthtech is cause for hope. The Coalition’s work is to translate that awareness and concern into meaningful action. To all the leaders developing and applying new technologies in healthcare settings: let’s hold ourselves accountable to rigorous, measurable standards and keep moving forward.


References:

1. Agarwal AK, Seeburger E, O’Neill G, et al. Prevalence of Behavioral Flags in the Electronic Health Record Among Black and White Patients Visiting the Emergency Department. JAMA Netw Open. 2023;6(1):e2251734. doi:10.1001/jamanetworkopen.2022.51734

2. Sun M, Oliwa T, Peek ME, Tung EL. Negative Patient Descriptors: Documenting Racial Bias in the Electronic Health Record. Health Affairs. 2022. https://doi.org/10.1377/hlthaff.2021.01423

