Improving Pulse Oximeters: An Engineering Endeavor

Whether the racial bias in pulse oximeters can be effectively corrected is still up for debate. Meanwhile, the approval of better devices might be slowed by a heightened performance benchmark. At a recent meeting, committee members reviewed a proposal that manufacturers test their devices on a minimum of 24 people representing the complete spectrum of a 10-shade skin-tone scale. The existing rule requires only that a device be tested on ten individuals, including at least two with “darkly pigmented” skin.

While health professionals work out how to use the current tools and whether they can be trusted, questions persist. During a recent advisory committee meeting, a representative of Medtronic, a leading pulse oximeter supplier, was asked whether the company had considered a voluntary device recall. In response, Sam Ajizian, Medtronic’s chief medical officer for patient monitoring, said, “Our devices meet the FDA standards without a doubt.” He argued that recalling the devices would jeopardize public safety, since they are critical in operating rooms, ICUs, ERs, and ambulances.

Despite this, not everyone agrees that the advantages outweigh the disadvantages. A community health center in Oakland, California, sued several leading pulse oximeter manufacturers and vendors, asking the court to halt sales of the devices in the state until their accuracy for dark-skinned patients is verified or the devices carry warning labels.

Noha Aboelata, CEO of Roots Community Health Center, said, “Pulse oximeters are a tragic example of the harm caused when the health-care industry and its regulatory authorities prioritize the health of the white population over that of non-white patients.” She added, “The making, use, and marketing of biased pulse oximeters is a clear reflection of our health care system’s flaws.”

Earlier reporting from MIT Technology Review’s archives sheds light on bias embedded in technology. Articles by Melissa Heikkilä and Charlton McIlwain argued that technology is not neutral, and that the prejudice baked into AI could perpetuate racism if left uncorrected. That argument draws on instances in which deep-learning models have proven as capable as health workers at imaging tasks but have also propagated biases. A 2021 report by Karen Hao noted that one potential solution is to stop fine-tuning algorithms to mimic experts.

In other news, high lead levels found in applesauce pouches were traced back to a single cinnamon-processing plant in Ecuador. (NBC)