Examples of healthcare artificial intelligence have multiplied in recent years as interest in health technology grows. This article explores how healthcare artificial intelligence could improve the way lab test results are explained to patients.
Medical lab results aren't exactly reader-friendly. Most patients just glance at the numbers and, at most, check whether follow-up is recommended. Even if they try to understand the values, they probably won't get much out of them. If the numbers fall just within the normal range, patients figure everything is OK; if they fall even slightly outside it, patients may panic.
Of course, test results lie on a continuum from ideal values to a dire need for emergency treatment. A value that doesn't indicate a change in treatment could still be a cautionary flag. Some patients are statistical outliers, with abnormal numbers despite being perfectly healthy. In a perfect world, physicians would have time to walk patients through their lab results and map out follow-up instructions that ease their minds.
Unfortunately, in the healthcare industry, that's pretty much impossible to do with every patient, given the time constraints of office visits.
Recent developments in healthcare artificial intelligence may change that in the near future, though.
Simplifying Doctor-Patient Communication: Explaining Lab Test Results to Patients
What if a software application could read a patient's lab results, put them in the context of the patient's age and medical history, and produce a plain-language explanation of what's good, bad, or just a little ugly?
Much of the work is number crunching, which computers have always done better than people. With a bit of help from artificial intelligence for the healthcare industry, software that lets people really understand those numbers is becoming possible.
One such healthcare artificial intelligence project is doc.ai. It's intended as the basis of conversational applications that will discuss results with patients. The creator, who has a strong background in computational linguistics, has entered a partnership with Deloitte to market the product to healthcare providers. The initial target is blood tests, to be followed by genetic and other types of testing.
The healthcare software puts lab test results together with biometric data, known conditions, and current medications to arrive at a meaningful interpretation to help in explaining lab test results to patients.
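The kind of input such software combines can be pictured as a structured record. The sketch below is illustrative only: the field names and the `interpret` helper are invented for this article, not taken from doc.ai or any real schema, and a real system would adjust reference ranges using the patient's context rather than ignoring it.

```python
from dataclasses import dataclass

@dataclass
class PatientContext:
    age: int
    conditions: list     # known diagnoses, e.g. ["type 2 diabetes"]
    medications: list    # current prescriptions
    biometrics: dict     # e.g. {"weight_kg": 82, "bp": "130/85"}

@dataclass
class LabResult:
    test_name: str
    value: float
    unit: str
    ref_low: float       # lower bound of the normal range
    ref_high: float      # upper bound of the normal range

def interpret(result: LabResult, patient: PatientContext) -> str:
    """Produce a one-line, plain-language reading of a single result.
    (A real system would tailor the range to the patient's age,
    conditions, and medications instead of using fixed bounds.)"""
    if result.ref_low <= result.value <= result.ref_high:
        return f"{result.test_name} is within the normal range."
    direction = "above" if result.value > result.ref_high else "below"
    return (f"{result.test_name} is {direction} the normal range; "
            f"worth discussing at your next visit.")
```

For example, a glucose reading of 110 mg/dL against a hypothetical 70-99 range would produce "Glucose is above the normal range; worth discussing at your next visit."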
The real value of AI comes from identifying patterns. A comparison of the latest test readings with prior ones is more informative to patients and providers than looking at the latest numbers alone.
A sudden jump in a number might indicate the onset of a new problem. A slow, long-term trend tells a different story, possibly requiring different treatment. Several numbers close to the edge of the normal range might indicate an issue, even if each one appears like a minor concern by itself.
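The three patterns just described — a sudden jump, a slow trend, and a result sitting near the edge of normal — can be sketched as simple checks over a time-ordered series of one lab value. This is a minimal illustration; the thresholds are invented for demonstration, and a clinical system would use validated limits, not these numbers.

```python
def flag_series(values, low, high, jump_pct=0.30, border_pct=0.10):
    """Return plain-language flags for a time-ordered series of one
    lab value, given the (assumed) normal range [low, high]."""
    flags = []
    latest = values[-1]

    # Sudden jump: a large relative change from the previous reading.
    if len(values) >= 2 and values[-2] != 0:
        change = abs(latest - values[-2]) / abs(values[-2])
        if change >= jump_pct:
            flags.append("sudden change since last test")

    # Slow trend: every reading moves in the same direction.
    if len(values) >= 3:
        diffs = [b - a for a, b in zip(values, values[1:])]
        if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
            flags.append("steady long-term trend")

    # Borderline: within the normal range but close to its edge.
    margin = (high - low) * border_pct
    if low <= latest <= high and (latest - low < margin or high - latest < margin):
        flags.append("normal, but close to the edge of the range")

    return flags
```

A series like `[90, 92, 95, 99]` against a 70-100 range would raise both the trend flag and the borderline flag, even though the latest number alone looks unremarkable — which is exactly the point of comparing readings over time.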
Healthcare software doesn't get bored or distracted, so it might spot problems that even a medical professional could overlook.
The doc.ai design is intended to be decentralized. The company's materials include references to "blockchain," though it isn't clear how this technology is incorporated. What is clear is that the project aims to avoid putting all the patient data or intelligence in one place.
This doesn't mean that healthcare artificial intelligence or software can replace medical advice. Physicians have to make the final judgment, and they have to examine and talk with a patient to do that. They need to decide whether to start immediate treatment, recommend additional tests, or wait and see.
True robo-doctors won't be a viable option for many years, if ever, and will likely serve as an extension of a human physician rather than a full replacement.
Regulatory Issues in Using Healthcare Artificial Intelligence
Does healthcare software that interprets patient lab results count as a medical device and require FDA certification?
Questions like this pose challenges to the progress of artificial intelligence and healthcare technology in general.
The answer in this case is probably not, as long as the healthcare technology doesn't claim to do more than clarify existing information. If it recommends treatment, it could be getting into dangerous regulatory ground.
Even as a strictly analytical tool, such healthcare software will need access to electronic protected health information to do its job, so HIPAA's privacy and security rules will come into play. The software will have to prevent unauthorized access, just as it would if it presented the raw data.
Liability is a more complex issue.
If an application fails to point out a serious problem, is the publisher liable? Is the healthcare provider? Or is it just not reasonable to expect it to identify every potential issue?
These questions won't be fully resolved until there are actual legal cases to establish precedent. Disclaimers may help to limit liability, but they don't always hold up, especially when patient well-being is impacted.
Machine learning and artificial intelligence are becoming a growing part of the healthcare industry's work.
At best, such healthcare technology gives people better access to useful information and lets healthcare professionals concentrate on the more difficult tasks.
Technology is sure to become increasingly important in improving patient communication and letting people understand their own health.