James Weinstein, Senior Vice President and Head of Innovation and Health Equity for Microsoft Healthcare, spent much of the past 30 years working on the Dartmouth Atlas, investigating unwarranted and warranted variation in health care utilization across the United States. The Atlas maps that utilization data by zip code.
“It’s been of tremendous interest to me to understand how we as a society deliver that care,” Weinstein says. “Over the last 30 years of studying United States health care, I’ve come to believe that we have greater opportunities with artificial intelligence (AI) than maybe we appreciate right now.”
Weinstein, a spine surgeon by training with a special interest in cancer of the spine, describes an image showing a large, cancerous growth. “I saw many patients like this, unfortunately,” he says. “I wondered to myself how long this had been there, why there weren’t any symptoms. What could have been done for this 37-year-old mother of three before she came to me with a case that was almost inoperable? How can you save this person’s life?”
“We think of health care as having a lot of data,” says Weinstein. “Today, we can collect data on almost everything we do, on every action we take, from our own phones and from sensor devices on our wrists, and I believe they will help us make earlier diagnoses for patients like this one, who died 3 years after diagnosis.”
In addition to enabling earlier diagnoses, today’s health care data reveals inequities across the United States, including tremendous differences in life expectancy based simply on zip code and where people live. “We should be able to understand — not through just social determinants of health, but also [a person’s] environment, [e.g.,] exposure to electrical systems — what’s possible, and what’s causing that difference in mortality,” Weinstein says.
“Those who think that we can find the answers by collecting data on just the social determinants may be misunderstanding how complicated it is to change the social systems, to change the life expectancy of people who are affected by these things, just by collecting data,” Weinstein says. “Making the kind of changes that need to be made in those communities in education, housing, food, pales in comparison to what the opportunities are in understanding them from the time they’re born to the time they die.”
“We in medicine should never forget why we’re here,” Weinstein adds. “We never should forget our patients and their families.” He describes how his daughter, who was diagnosed with leukemia at 3 months old, died after nearly 12 years of experimental radiation and chemotherapy. Then in 2018, he read a paper that had found the marker for her genetic defect and the treatment that would have saved her life. “Why did it take so long?” he asks. “What could we have done for other children, had we had the things like artificial intelligence and algorithms from massive amounts of data — not just from a single hospital, but from around the world?”
With artificial intelligence we can leverage data early and almost anywhere throughout a person’s life. “With CRISPR gene opportunities and other things to manipulate life, we have to be cautious about how we, as humans, start to intervene in nature, but we do have the possibility to change that course now, with more and more information,” says Weinstein.
“Through a cc of blood we can start to understand, with artificial intelligence, what your T cells are gearing up to do before you realize you have a disease,” says Weinstein. “And we have the new digital superpowers of the cloud.” The potential of the cloud to hold massive amounts of data securely, and AI’s ability to use algorithms and develop hundreds of machine learning models to help health care practitioners, remain to be seen.
The “health superpowers” are exciting, but Weinstein cautions that we need to consider ethics when using them. “When do I tell you you’re at risk to develop congestive heart failure? What will that do to your life?” he asks. For a woman with the BRCA gene and at risk for breast cancer, when should providers share that information? “It’s fun to think about these things, but we need to think about the ramifications and implications on people’s lives as we become smarter and smarter about telling you what’s going to happen to you. When you might die.”
The future of health care is moving away from bricks-and-mortar medicine and toward home health, with digital engagement and early detection. “As our population continues to age, I, too, am not worried about the health care system having business,” Weinstein says. “I’m more worried about how we’re going to manage that population of older people with the dollars we have to spend.”
Early detection, using algorithms and machine learning, has a prominent place in screening for diabetic retinopathy, for example, with the goal of catching the disease before it progresses to blindness. The area under the curve (AUC) for such models is high, maybe 0.7 to 0.9, according to Weinstein. But while that’s interesting intelligence, it’s not necessarily clinically relevant to how the patient should be treated.
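For readers unfamiliar with the metric, the AUC figures Weinstein cites can be read as a probability: the chance that a model scores a randomly chosen diseased case higher than a randomly chosen healthy one. A minimal sketch, using invented risk scores (not data from any real retinopathy model), shows the rank-based computation:

```python
# Minimal sketch of AUC as the Mann-Whitney rank statistic.
# Labels and scores below are invented for illustration only.

def auc(labels, scores):
    """Probability that a random positive case scores higher than a
    random negative case (ties count as half a win)."""
    pos = [s for lab, s in zip(labels, scores) if lab == 1]
    neg = [s for lab, s in zip(labels, scores) if lab == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical screening scores: label 1 = disease present, 0 = absent.
labels = [1, 1, 1, 0, 0, 0, 0, 1]
scores = [0.9, 0.8, 0.35, 0.4, 0.2, 0.1, 0.5, 0.7]
print(auc(labels, scores))  # → 0.875
```

An AUC of 0.875 sits in the 0.7 to 0.9 band Weinstein mentions; as he notes, a model can rank patients well in aggregate and still say little about how to treat any individual.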
“We need to be careful to understand the difference between intelligence systems and the wisdom that you have as practitioners, and the experience you have, to understand whether these models make sense or not. We can’t just believe everything to be true,” he says. “Safety and outcomes are critical, and we’re still at a point where we need to validate some of these opportunities.”
In many AI models, high false-positive or false-negative rates become a real issue when adversarial “noise” that humans cannot perceive, or will not notice, is inserted. He points to an image from a fundoscopic exam that indicates a healthy eye, with no retinopathy. But when an algorithm injects adversarial noise, the model reads that same image as positive for diabetic retinopathy, even though that is not true. “How will we recognize these things with artificial intelligence and machine learning?” he asks.
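The mechanism behind the flipped retinopathy reading can be illustrated with a toy model (not the actual system Weinstein describes). Here a simple linear scorer stands in for the classifier, and a fast-gradient-sign-style perturbation nudges each "pixel" by an amount far too small to see, yet enough to cross the decision threshold; all weights and pixel values are invented for illustration:

```python
# Toy illustration of an adversarial perturbation flipping a decision.
# Weights, pixels, and the step size eps are all hypothetical.
import math

weights = [0.8, -0.6, 0.4, -0.2, 0.5]  # hypothetical learned weights
threshold = 0.0                         # score > 0 means "retinopathy present"

def score(pixels):
    return sum(w * p for w, p in zip(weights, pixels))

image = [0.3, 0.8, 0.2, 0.5, 0.2]       # scores below threshold: "healthy"
eps = 0.07                               # tiny, visually imperceptible step
# Nudge each pixel in the direction that most increases the score
# (the sign of the gradient, which for a linear model is the weight's sign).
adversarial = [p + eps * math.copysign(1.0, w) for p, w in zip(image, weights)]

print(score(image) > threshold)        # → False (read as healthy)
print(score(adversarial) > threshold)  # → True  (read as diseased)
```

The same image plus a 7% nudge per pixel yields the opposite diagnosis, which is why Weinstein argues that clinical deployment demands validation against exactly this kind of manipulation.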
Demonstrating the advantages of using AI in clinical trials, but also where we need to take caution, Weinstein describes a study he conducted on back pain in thousands of patients across 17 U.S. states. Early on, the patients who’d undergone surgery did a little better than the patients who hadn’t, and as the researchers followed these patients out for 10+ years, the difference in outcomes narrowed.
“One could presume that surgery is better than non-surgery for clearly indicated patients who were randomized in a clinical trial. But can you apply this kind of knowledge to the general population with back and leg pain? The answer is no,” Weinstein says. “If you do, we will do too much surgery, and over-treat. We have to understand how to use the data, as good as it is, in different populations.”
When Weinstein was CEO at Dartmouth-Hitchcock, the only picture hanging in his office other than photos of family was a 1995 cover of The New Yorker showing a glass that is half full or half empty, depending on your view. He asks the audience for their view. “With artificial intelligence, machine learning, and the wisdom of you, the people who take care of our patients and their families, we need to think about a new glass — a new glass that provides great opportunities to change lives in ways that I wish I would have had for my own daughter.”
From the NEJM Catalyst event Provider-Driven Data Analytics to Improve Outcomes, held at Cedars-Sinai Medical Center, January 31, 2019.