Eric Topol’s Deep Medicine is a significant contribution to American medicine and should be required reading for anyone interested in the present and future of healthcare delivery. I read the book as a student and teacher of digital health technologies and quickly discovered that my understanding of AI and medicine was superficial at best. In my quest to overcome this deficit, I found the following observations from Topol’s book to be of particular relevance.
The first major entry of AI into the practice of medicine was automated systems for reading ECGs, which were first applied in the 1970s and became routine in the 1980s.
Deep neural networks (DNNs) are the driving force behind AI innovations in health. The DNN era was made possible by four components: massive data sets, dedicated graphics processing units (GPUs), cloud computing, and open-source algorithm development.
Deep medicine incorporates three components:
- Deep definition of the medical essence of an individual, including medical, social, and behavioral subcomponents, family histories, and biology
- Deep learning, involving pattern recognition and machine learning as well as access to virtual health coaches
- Deep empathy and connection between patients and clinicians (Topol would argue this is the most important of the three).
In contrast, shallow medicine is arguably where much of the practice of medicine is today. Patients exist in a world of insufficient data, time, context, and presence.
Most of the published deep learning examples represent only in silico, or computer-based, validation, as opposed to prospective clinical trials involving real patients. This is an important distinction!
Radiology, pathology, and dermatology practices all involve pattern recognition, a task at which AI excels. Topol believes that eventually all medical scans will be interpreted by machines. The machines will also produce an initial draft of the scan report, and the radiologist will make the finding official by signing off on it. An interesting aside: pigeons can discriminate between complex visual stimuli. A 2015 study showed that pigeons could be trained to read x-ray images and that flock-sourced readings were remarkably accurate!
It seems likely that machines will outperform humans on specific, narrowly focused tasks, suggesting that narrow AI will take hold.
For many clinicians, the workflow can improve due to a faster and more accurate reading of scans and slides. Narrow AI can see things humans would miss and can improve communication and presence by eliminating the keyboard during a clinic visit.
Virtual medical assistants show promise, but no randomized controlled trials have shown improved clinical outcomes. For now, the products have utilized small retrospective or observational studies to demonstrate their worth.
We cannot realize the full potential of deep medicine unless we have a virtual medical assistant helping us out. Neither patient nor doctor is going to be able to process all of the continually expanding medical and biological data.
The electronic health record is a narrow, incomplete, error-laden view of an individual’s health. This situation presents a significant bottleneck for the virtual medical assistant of the future.
The contrast between the levels of self-driving cars (Level 0/No Automation to Level 5/Full Automation) and the practice of AI medicine is worthy of consideration. Topol believes that while Level 4 (High Automation) may be achievable for vehicles under ideal driving conditions, it is unlikely that AI medicine can move beyond Level 3 (Conditional Automation). He asserts that patients would never tolerate a lack of oversight by human clinicians across all conditions all the time.
In summation, Topol contends that the potential gains of deep medicine can serve to bring back “real medicine,” which, in his view, consists of presence, empathy, trust, caring, and being human.
Source: Eric Topol, Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again (New York: Basic Books, 2019).
Michael Bice served as a senior hospital executive for 25 years at academic medical centers and healthcare systems and has held teaching and administrative positions at universities. He has an avid interest in aging services technology. His Fall 2019 class, Artificial Intelligence and Health Care, is November 4 from 10:00 a.m. to noon at the Microsoft Store at International Plaza. This article was originally published on his blog, Digital Health Technologies: An ongoing review, assessment and commentary on digital health technologies.