Dr. David Rhew, Microsoft’s global chief medical officer and VP of healthcare, has been studying how technology can improve health outcomes for nearly three decades.
His focus has long been on access to care, quality of care, patient safety, improving experiences for patients and providers, and finding ways to improve the overall efficiency of care.
He was previously CMO at Samsung, and at Zynx Health before that. But his role at Redmond, Washington-based Microsoft (NSDQ:MSFT) — one of the world’s cloud-computing leaders — offers his best opportunity yet to shape the future of medtech.
Rhew spoke with Medical Design & Outsourcing as part of an ongoing series of conversations about cloud computing’s contributions to medtech and the potential ahead. The discussion that follows has been edited for space and clarity.
MDO: What’s an example of an existing cloud-connected device that might inspire medtech designers and engineers?
RHEW: A company we’ve been working with, Sensoria, makes a boot for diabetics. One of the biggest challenges with diabetics is foot ulcers. This boot’s sensors can help track pressure, movement and a variety of other things, and they allow us to pull all that data into the cloud and apply advanced analytics and AI to potentially predict pressure ulcers.
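The kind of cloud-side analytics Rhew describes could start as simply as watching for sustained pressure in a sensor stream. The sketch below is purely illustrative — the threshold, window size and function names are assumptions, not details of Sensoria’s actual product:

```python
from statistics import mean

# Hypothetical sketch: flag sustained high plantar pressure from a
# boot's sensor stream. Threshold and window size are illustrative,
# not clinically validated values.
PRESSURE_LIMIT_KPA = 200   # assumed threshold for this example
WINDOW = 5                 # consecutive readings to average

def at_risk(readings, limit=PRESSURE_LIMIT_KPA, window=WINDOW):
    """Return True if any rolling window's mean pressure exceeds the limit."""
    if len(readings) < window:
        return False
    return any(
        mean(readings[i:i + window]) > limit
        for i in range(len(readings) - window + 1)
    )

normal = [120, 130, 125, 140, 135, 128]
sustained = [120, 210, 220, 215, 230, 225]
print(at_risk(normal))      # → False
print(at_risk(sustained))   # → True
```

A production system would of course replace this rule-of-thumb with a trained model and stream the readings through a cloud ingestion pipeline, but the shape of the problem — windowed readings in, risk flag out — is the same.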
MDO: What about cloud-connected devices of the future? From your perspective, what’s possible in the next five or 10 years?
RHEW: There are sensors all over the place that we can use, and we can think of putting sensors of any type on any device — something you wear or implant in your body. From there, the question is what do you do with the data? And that’s where artificial intelligence and machine learning can come in.

In a surgical arena or perhaps a procedure room, if you’re a clinician or a surgeon, you’re using a set of devices that don’t really have any sensors. If we can gather information about these devices — which are now not just mechanical, but becoming electronic — we can manage those devices in ways that we haven’t in the past. It’s sort of like a car back when you didn’t have any sensors to tell you when your oil was low, and your car just started smoking. Today, we have a lot of devices where you don’t know when they’re going to conk out, probably the last thing you want to happen in the middle of a surgery.

With a lot of these devices, when you’re working on a particular patient, there’s activity specific to that patient that should be captured. It could be how long the procedure was, the number of pushes of certain buttons … when you think about how that information can be used, and you’re looking to determine the optimal ways to perform a procedure, that’s valuable information. We know the duration of a surgery has a direct impact on surgical outcomes. We now can get more granular.
And that’s what’s exciting, because we’re starting to gather patient-specific information but can also start thinking about different devices and different procedures being done within, say, the surgical unit. We can then start pulling all that together and analyzing across the board, comparing hospitals to hospitals and providers to providers. There is so much variation in care, and a lot of times it’s unwarranted, and there are often best practices that can be gleaned. If we can understand what the benchmarks are and strive toward them, we can drive real quality improvements. And that can only be done when you have data that’s accessible at the granular level, roll it up into different areas, benchmark it and allow AI to be applied on top of it to provide real-time guidance and support and improve patient outcomes.
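The hospital-to-hospital benchmarking Rhew describes can be sketched with nothing more than summary statistics: aggregate procedure durations per site, establish a cross-site benchmark and flag outliers. The data and the one-standard-deviation rule below are made up for illustration:

```python
from statistics import mean, pstdev

# Illustrative sketch: benchmark procedure durations (minutes) across
# hospitals and flag sites well above the cross-hospital mean.
# All figures are invented for the example.
durations_min = {
    "Hospital A": [62, 58, 65, 60],
    "Hospital B": [90, 95, 88, 99],
    "Hospital C": [61, 63, 59, 64],
}

site_means = {site: mean(d) for site, d in durations_min.items()}
overall = mean(site_means.values())     # cross-site benchmark
spread = pstdev(site_means.values())

# Flag any site more than one standard deviation slower than the benchmark.
outliers = [s for s, m in site_means.items() if m > overall + spread]
print(outliers)  # → ['Hospital B']
```

In practice the benchmark would sit on top of granular, de-identified device telemetry rather than four hand-typed numbers, but the roll-up-then-compare pattern is the same one the interview describes.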
MDO: What can the cloud offer medtech companies from the earliest stages of device design and prototyping through manufacturing and having them in use?
RHEW: Start with supply chain: your ability to understand at any particular point where certain things are, your ability to predict demand. We’re already seeing that outside of healthcare. That is a critical part of manufacturing, a critical part of logistics, and will be critical for getting devices built in a timely manner.

Then start thinking about the actual device itself and the storage and compute capabilities. There’s a lot of data coming in, potentially in real time, maybe continuously. That information will be hard for a lot of existing systems to manage, and you certainly don’t want that to pour into the EHR — it’s just going to overload the system. We need high-performance-compute capability for some of these activities, cloud computing to synthesize information in real time. We’re going to need edge computing as well, because a lot of that information doesn’t even need to go to the cloud immediately but can be stored and managed locally and then ultimately pushed up at different times.

Beyond the advanced analytics and prediction capabilities, we can look at things such as image analysis and video analysis — ways that we can meta-tag different images so we can find other images, search them, and even search reports tied to those images using text analytics. All of this information can be analyzed using tools for natural language processing and visual cloud computing, which gives us a whole new, robust set of data to apply on top of the quantitative data sets. Combine that with the interoperability that allows us to pull in electronic health record data and more, and this creates a brand-new opportunity for us to start answering questions that we’ve always had about how to improve care.
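The edge-computing pattern Rhew sketches — store and manage readings on the device, then push summaries to the cloud on a schedule — can be illustrated with a small buffer. Everything here, from the class name to the batch size, is an assumption for the sake of the sketch:

```python
from statistics import mean

# Hedged sketch of the edge pattern described above: buffer raw readings
# locally and forward only periodic summaries to the cloud. Names and
# batch size are illustrative, not any vendor's API.
class EdgeBuffer:
    def __init__(self, batch_size=4):
        self.batch_size = batch_size
        self.readings = []
        self.uploads = []  # stands in for messages sent to the cloud

    def ingest(self, value):
        """Buffer one reading; flush a summary when the batch fills."""
        self.readings.append(value)
        if len(self.readings) >= self.batch_size:
            self.flush()

    def flush(self):
        """Summarize and 'upload' whatever is buffered, then clear it."""
        if self.readings:
            summary = {
                "count": len(self.readings),
                "mean": mean(self.readings),
                "max": max(self.readings),
            }
            self.uploads.append(summary)  # real code would POST this
            self.readings = []

edge = EdgeBuffer(batch_size=4)
for v in [98, 101, 99, 103, 97]:
    edge.ingest(v)
print(edge.uploads)  # one summary covering the first four readings
```

The payoff is exactly the one described in the answer: the cloud sees compact, periodic summaries instead of a continuous raw stream, and nothing floods the EHR.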
MDO: Are there any cloud-enabled tools that are uniquely relevant for medtech?
RHEW: The ability for us to navigate with voice to access particular pieces of information is critical, and we’re starting to see it being used in clinical environments. Microsoft and Nuance are looking to come together, and we see natural language processing — in particular voice — as a key part of how healthcare professionals will be interfacing with their devices, with patients and with the data. If we have an opportunity to leverage voice to improve and streamline that workflow, we can gain user acceptance from clinicians, because it’s very tedious for clinicians to find information by typing it into a computer or asking somebody else to. But with voice as that intermediary, we now have an opportunity to access a lot of the information that’s in there — it’s just hard to find.
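Stripped to its essentials, voice as an intermediary means mapping an already-transcribed utterance to a piece of the record. The toy below uses plain keyword matching — real systems like Nuance’s pair full speech recognition with clinical NLP, and every field and value here is invented:

```python
# Toy sketch of voice-driven lookup: match an (already transcribed)
# spoken query to a record field by keyword. The record contents and
# field names are entirely hypothetical.
RECORD = {
    "allergies": "penicillin",
    "last a1c": "6.8%",
    "medications": "metformin, lisinopril",
}

def answer(transcript):
    """Return the first record field whose key appears in the transcript."""
    text = transcript.lower()
    for field, value in RECORD.items():
        if field in text:
            return f"{field}: {value}"
    return "No matching record field found."

print(answer("What are this patient's allergies?"))  # → allergies: penicillin
```

The point is the workflow, not the matching: the clinician asks instead of typing, and the hard-to-find information comes back in one step.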
MDO: How can companies get started with the cloud?
RHEW: We work with companies that are small, medium and large. On the small side, we have our Microsoft startups program, working directly with companies trying to build out their applications. When you’re a small company, and even a medium-sized company, there’s a lot you do just because you have to create an end-to-end solution, but it’s not necessarily something that’s worthwhile for you to build out yourself — for example, mechanisms to ensure the security of your data and mechanisms to build FHIR (Fast Healthcare Interoperability Resources) APIs to connect into different systems. Yes, it should be done, but it’s a bit of a commodity and something that’s probably best outsourced to a company that spends its entire day doing it. Microsoft can provide those types of platforms and the security, and we can provide the tools that allow you to build off of that.

Now, if you’re a large company and you’re thinking about moving to the cloud, there are a lot of capabilities we’ve been building that are fairly unique and perhaps represent a significant opportunity to leverage. Some of the AI capabilities and the high-performance compute could be very difficult for organizations that aren’t in the space to build and keep extending with capabilities such as voice, video and text analytics. But that’s the type of thing that is often essential if you want to take advantage of these newer data sets.
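To make the FHIR mention concrete: FHIR servers return search results as a `Bundle` resource containing `entry` items, each wrapping a resource such as `Patient`. The sketch below parses a minimal hand-written bundle — the data is invented, and a real client would fetch the JSON from a FHIR endpoint rather than inline it:

```python
# Hedged sketch: extract patient names from a FHIR R4 search Bundle.
# The bundle below is a minimal hand-written example, not real data.
bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {"resourceType": "Patient", "id": "p1",
                      "name": [{"family": "Example", "given": ["Ana"]}]}},
        {"resource": {"resourceType": "Patient", "id": "p2",
                      "name": [{"family": "Sample", "given": ["Luis"]}]}},
    ],
}

def patient_names(bundle):
    """Yield 'Given Family' for each Patient resource in a search Bundle."""
    for entry in bundle.get("entry", []):
        res = entry.get("resource", {})
        if res.get("resourceType") == "Patient":
            for name in res.get("name", []):
                given = " ".join(name.get("given", []))
                yield f"{given} {name.get('family', '')}".strip()

print(list(patient_names(bundle)))  # → ['Ana Example', 'Luis Sample']
```

This is exactly the kind of plumbing Rhew calls a commodity: correct but undifferentiated, and a reasonable thing to hand to a platform that does it all day.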
Microsoft is partnering with Johnson & Johnson on the cloud; read more about what the future holds in MDO’s conversation with two leaders in Johnson & Johnson’s medical device business.