Correcting unintentional biases in medical device product design is a big step toward improving equity in medtech.

[Image from Pixabay]

Achuta Kadambi, an assistant professor at the UCLA Samueli School of Engineering, says that when it comes to bias in medical devices, there is plenty of media narrative about what the problem is. It’s just as important, he argues, to address what that narrative doesn’t show.

“One that doesn’t show up is that it’s a really challenging technical problem and an exciting technical problem to address,” he said, citing as an example how light doesn’t interact well with darker surfaces, such as darker skin tones.

Kadambi, who recently published a column in the journal Science about achieving fairness in medical devices, says there has to be a technical passion for solving these problems. (He also discussed his views during a recent DeviceTalks Weekly podcast.)

He adds that the social impact is equally crucial in making devices fair.

“When inventing a life-saving medical device, it’s important to make sure it doesn’t disadvantage a certain class of people,” Kadambi said. “You want to invent for humanity, not subsets of humanity.”

He added that there are lessons to be learned from computer science, and that its methodology can be brought to medical device design as well.

How can medical device product designers ensure their product design isn’t biased? Kadambi said there are several ways to avoid unintentional biases and improve equity in medical device design:

1. Include stakeholders as inventors.

Kadambi said it may sound obvious, but design teams should make sure stakeholders are among the inventors. For example, if you are working on a device intended for darker skin tones, don’t just rely on people with darker skin as test subjects. He said to try to include them as co-inventors.

2. Include diverse groups.

If you want to make a fair device, make sure the inclusion criteria for your study cover diverse groups. It’s vital to have minority populations and a range of demographics in clinical research studies and clinical trials, Kadambi said.

3. Quantify sample fairness.

Once you have the study subjects and the technical authors on the paper lined up, the next thing to do to make a medical device fair is to quantify fairness mathematically, according to Kadambi. “What does it mean for a device to be fair?” he said. “You need some sort of numerical way to score bias or fairness.”

In his column, he suggested medical device journals require authors to quantify sample fairness in experiments. “Scientists are currently only required to provide statistics relating to a device’s performance without the human factor,” he said in that piece.
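Kadambi’s column doesn’t prescribe a particular formula, but as a rough illustration of what a numerical sample-fairness score could look like, one option is to compare a study cohort’s demographic makeup against a reference population. The sketch below is hypothetical; the group labels, counts and population shares are invented purely for illustration.

```python
# A minimal sketch of one way to put a number on "sample fairness":
# compare the demographic makeup of a study cohort with a reference
# population. All group names and figures here are hypothetical.

def representation_gap(sample_counts: dict, population_shares: dict) -> float:
    """Largest absolute gap between a group's share of the study sample
    and its share of the reference population (0 = perfect match)."""
    total = sum(sample_counts.values())
    return max(
        abs(sample_counts.get(group, 0) / total - share)
        for group, share in population_shares.items()
    )

# Hypothetical cohort of 200 subjects vs. a reference population.
sample = {"group_a": 150, "group_b": 30, "group_c": 20}
population = {"group_a": 0.60, "group_b": 0.25, "group_c": 0.15}

print(f"Worst-case representation gap: {representation_gap(sample, population):.2f}")
```

A gap of 0 means the cohort mirrors the population; larger values flag under- or over-represented groups, which could be reported alongside the usual performance statistics.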

4. Start inventing fairly.

Once you have a score for bias, use a diverse group of subjects to evaluate that fairness score. Then report it as a result that is just as important as performance.

“This is where the invention process begins — start to invent things that boost your fairness metric,” Kadambi said. “These could be inventions that take unconventional forms. For example, you realize light doesn’t reflect well off darker skin. Maybe you don’t use light. You use something else. So maybe combine light with another modality that makes the situation more fair.”
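Kadambi doesn’t spell out how such a combination would work, but as a toy illustration of the general idea, one common way to fuse two noisy sensing modalities is inverse-variance weighting, so that whichever channel is less reliable for a given patient contributes less to the final estimate. The function and numbers below are hypothetical, not a description of any real device.

```python
# A toy illustration (not Kadambi's method) of fusing two sensing
# modalities when one is less reliable, e.g. an optical channel that
# degrades on darker skin. All values are hypothetical.

def fuse_readings(optical: float, optical_var: float,
                  other: float, other_var: float) -> float:
    """Combine two noisy estimates of the same quantity, weighting each
    by the inverse of its estimated measurement variance."""
    w_optical = 1.0 / optical_var
    w_other = 1.0 / other_var
    return (w_optical * optical + w_other * other) / (w_optical + w_other)

# If the optical channel is assumed noisier for this patient,
# the fused estimate leans on the second modality.
print(fuse_readings(optical=95.0, optical_var=9.0, other=97.0, other_var=1.0))
```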

5. Read and cite others’ work.

Read and cite work from other authors who have been working on algorithmic fairness. “Read these papers and, to an extent, be aware of them because they’ll help you design these metrics in a better way,” Kadambi said. He added that algorithmic fairness is a very popular area in computer science, and there is existing literature that device designers should read.

6. Weigh fairness appropriately.

It’s also important to report your results in a way that gives fairness similar weight and importance to performance, he said. “Even if your device is not fair, report the shortcomings and let the community decide how they can improve that in future work,” he said.

7. Recalibrate measurement of performance.

Kadambi also suggests recalibrating how medical device performance is measured: determine how existing devices perform when their evaluation is broken down along quantifiable demographic lines, such as race or ethnicity.
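He doesn’t name a specific metric, but a simple way to make such a recalibration concrete is to evaluate the same device separately for each demographic subgroup and report the spread together with the headline figure. The sketch below uses hypothetical error readings purely for illustration.

```python
# A minimal sketch of reporting device performance stratified by subgroup,
# so a fairness gap can be read alongside the overall number.
# The error values and group labels are hypothetical.

from statistics import mean

errors_by_group = {
    "lighter_skin": [1.2, 0.8, 1.5, 1.0],
    "darker_skin": [2.4, 3.1, 2.8, 2.0],
}

overall_error = mean(e for errs in errors_by_group.values() for e in errs)
per_group_error = {g: mean(errs) for g, errs in errors_by_group.items()}
fairness_gap = max(per_group_error.values()) - min(per_group_error.values())

print(f"Overall mean error: {overall_error:.2f}")
for group, err in per_group_error.items():
    print(f"  {group}: {err:.2f}")
print(f"Fairness gap (worst minus best subgroup): {fairness_gap:.2f}")
```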

Liz Hughes is an award-winning digital media editor with more than two decades of experience in newspaper, magazine and online media industries. Liz has produced content and offered editorial support for a variety of web publications, including Fast Company, NBC Boston, Street Fight, AOL/Patch Media, IoT World Today and Design News.