Top News, Articles, and Interviews in Philosophy

Bias & Medical Devices

It might seem like woke madness to claim that medical devices can be biased. Are there white supremacist stethoscopes? Misogynistic MRI machines? Extremely racist X-ray machines? Obviously not: medical devices do not have beliefs or ideologies (yet). But they can still be biased in their accuracy and effectiveness. One example of a biased device is the pulse oximeter, which measures blood oxygen using light. You have probably had one clipped on your finger during a visit to your doctor, or you might even own one. The bias in this device is that it is three times more likely to miss low oxygen levels in dark-skinned patients than in light-skinned patients. As would be expected, other devices also have accuracy problems when used on people with darker skin.

These are essentially sensor biases (or defects). In most cases, they can be addressed by improving the sensors or by developing alternative devices that do not suffer from the bias, which would require testing the devices on a diverse group of people. While this is crudely put, much of medical technology is made by white men for white men. This is not to claim that these are all cases of intentional racism and misogyny; there is not, one assumes, a conspiracy against women and people of color in the field. But there is a bias problem.

In addition to hardware bias, there is also software bias. Many medical devices use software, and software is often used in medical diagnosis. There is a misguided tendency to think that software is unbiased, perhaps as the result of science fiction tropes about objective and unfeeling machines. While it is true that our current software does not feel or think, bias can easily make its way into the code. For example, software used to analyze chest X-rays would work less well on women than on men if it had been “trained” only on X-rays of men. If you have seen Prometheus, it has an excellent (fictional) example of a gender-biased auto-doc that. . .
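The training-data failure described above can be illustrated with a toy sketch (entirely hypothetical numbers, not any real diagnostic system): a simple threshold classifier is "trained" only on one group, and its accuracy drops when applied to a second group whose baseline measurements are shifted.

```python
import random
import statistics

random.seed(0)

def sample(n, healthy_mean, sick_mean, spread=1.0):
    """Generate (measurement, label) pairs; label 1 = condition present."""
    data = [(random.gauss(healthy_mean, spread), 0) for _ in range(n)]
    data += [(random.gauss(sick_mean, spread), 1) for _ in range(n)]
    return data

# Hypothetical distributions: the two groups have shifted baselines
# for the same imaging measurement.
group_a = sample(500, healthy_mean=10.0, sick_mean=14.0)  # training group
group_b = sample(500, healthy_mean=12.0, sick_mean=16.0)  # unseen group

# "Train" on group A only: pick the midpoint between class means
# as the diagnostic threshold.
healthy = [x for x, y in group_a if y == 0]
sick = [x for x, y in group_a if y == 1]
threshold = (statistics.mean(healthy) + statistics.mean(sick)) / 2

def accuracy(data, t):
    """Fraction of cases correctly classified by threshold t."""
    return sum((x > t) == bool(y) for x, y in data) / len(data)

print(f"accuracy on training group: {accuracy(group_a, threshold):.2f}")
print(f"accuracy on unseen group:   {accuracy(group_b, threshold):.2f}")
```

The threshold sits in the right place for the training group, but for the shifted group it falls inside the healthy population, so many healthy members are misdiagnosed. Real machine-learning diagnostics are far more complex, but the underlying failure mode is the same: the model only calibrates to the people it was shown.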

Continue reading . . .

News source: A Philosopher's Blog
