Building digital strategies in healthcare is, for anyone involved in the field, like playing a complicated, fast-moving chess tournament with continually changing pieces. Although the ultimate goal is better patient care at affordable cost, competing interests among subsectors (industry, academia, providers, government, and others) often lead to different priorities and approaches.
The number of large-scale initiatives aimed at sorting through this miasma highlights the urgency. Digitization is the fourth industrial revolution, blurring the lines among biology, physics, and data science, as Robert Califf said in a recent talk. Coming from one of the nation's top clinician-researcher cardiologists and a former FDA commissioner, that is not hyperbole.
Califf and other prominent physician-scientists spoke recently about their approaches to building clinical-grade big data initiatives in order to delve deeper into human health, one key piece of the digitization puzzle. While these are high-level snapshots of a burgeoning field, not specific to any particular industry, medtech or otherwise, the efforts have the potential to drive the field forward, affecting all players in different ways.
Now vice chancellor for health data science at Duke University and an advisor to Google-funded Verily Life Sciences, Califf this spring at New York Bio discussed some of the ways that digitization gives us an unprecedented opportunity to study how drugs, devices and healthcare systems perform in the real world. Thanks to NIH initiatives, such as The Health Care Systems Research Collaboratory (the Collaboratory), we have good, curated data on a third of the US population—more than 100 million people—which can be queried for scientific purposes without ever moving it outside of the healthcare system where it was generated, he explained. That is noteworthy for the promise it holds for sharing of electronic health records across institutions and also because it does not butt heads with HIPAA.
Califf has been speaking on this topic regularly, so his comments were hardly news. Still, they carried weight, coming from a man who, before assuming leadership at FDA, was the founding director of the Duke Clinical Research Institute and one of the nation’s most powerful advocates for advancing data science in clinical research.
A long-time proponent of overhauling the current, inefficient, and bloated clinical trials research paradigm, Califf spoke about progress made by the Collaboratory, a six-year-old project that aims to foster innovative approaches to integrating research with care delivery using advances in information technology and healthcare delivery. The Collaboratory's ultimate goal is a proactive system in which constant surveillance of the use of medical products, coordinated with motivated patients and committed industry, can produce high-quality clinical evidence faster and at lower cost.
Funding for the program, which has completed Phase 1, was recently renewed. FDA Commissioner Scott Gottlieb, MD, is on board with its aims, Califf said. Much has been learned from its efforts and, yes, it can achieve its goals, but there is still hard work to be done before the full vision is realized. The Collaboratory has sponsored a dozen demonstration projects aimed at coordinating the generation of evidence from disparate healthcare systems with real-world medical practice. The cost of obtaining answers from shared EHR data in the Collaboratory is $10 million, compared to $50 million to $100 million for standard regulated clinical trials, Califf told the New York Bio audience.
Phase 1 demonstrated that 34 independent healthcare systems can curate their information to a high-quality standard on at least a quarterly basis, and that rare disease trials can be done by identifying people rapidly through electronic health systems, where there may be only a few potential subjects in any one system. Common disease trials also benefit from speedier enrollment opportunities afforded by continuous surveillance.
Thanks to digital advances, dramatic geographic discrepancies in life expectancy and health outcomes can be detailed at extremely local, as well as national, levels. For the third year in a row, life expectancy in the US is declining, a metric that is easier to measure because of digitization, Califf told the New York audience. Startling examples of geospatial discrepancies abound: a five-year difference in average life expectancy between Westchester and New York City; an eight-year gap between San Francisco and nearby counties; and an 11-year difference depending on where one lives in New York City. While every economically developed country is moving forward, the US, which is the engine of biomedical innovation in the world, uniquely "is going backwards," he said, making research reform all the more imperative.
Taking a different approach, but also seeking to optimize big data for research purposes and, ultimately, overall patient care, Verily's 15-month-old Baseline Project is tracking data from 10,000 people over four years, under a collaboration among Verily, Duke, Stanford, and Google. The plan is to create a baseline of human health as well as a computational platform that can, at low cost, collect continuous information on it for research purposes. Such a platform will make it easier to tackle the bigger barriers to making big data meaningful, which center on analysis and interpretation. That was the vision presented by Jessica Mega, MD, chief medical officer at Verily Life Sciences, at the Cambridge HealthTech Tricon Molecular Meeting this winter in San Francisco.
Ultimately, the group expects to collect, "in ways that respect individual wishes," about six terabytes of data per person. Compliance involves four on-site checkups a year, the use of new technologies and wearables, and completion of surveys and diaries about individual health. The participants are armed with tools of the digital trade, in some cases specially designed for Project Baseline: watches, sleep sensors kept under the mattress so they do not disrupt sleep, and cell phones that measure certain parameters.
“Why would a cardiologist get a call from Google?” Mega asked the Tricon audience. As a faculty member at Harvard, “I ran large, traditional clinical trials… About 15 years ago, we decided to collect genetic information and started a biobank looking at pretty basic markers. We were getting better on the molecular side, but we were not closing the loop on phenotype. Also, we had an infrastructure problem, since genomics came from one channel and phenotyping from another,” she explained, comparing mapping of the human health system to another Google endeavor—building a driverless car. To understand this map, the collaborators have had to build devices and tools that are impactful and can accelerate the work.
If digitization is turning clinical research trials topsy-turvy, it is also enabling other disruptions, according to Harlan Krumholz, MD, the Harold H. Hines Jr. Professor of Medicine and director of the Center for Outcomes Research and Evaluation at Yale University, and a frequent speaker on medical futurism. His comments below stem from a talk at the ISPOR annual meeting in May. The computational power of real-world evidence achievable through digitization is enabling what he and others refer to as the "learning healthcare system." In this concept, the collective wisdom of everyday interactions within healthcare systems becomes an "inexhaustible source of knowledge to fuel improvement and make us smarter about medical care," much the way every consumer interaction with Amazon increases the online retailer's knowledge of its customers.
All three speakers emphasized the importance of patient engagement and the formation of a new culture that treats research trial participants, both scientists and patients, as partners on a collaborative journey. At the heart of this cultural transformation is control of data, which Krumholz argued should aggregate at the level of the patient and be owned by that patient. "People have a right to their own data by law," he said. To this end, a company he co-founded two years ago, HUGOPHR, offers a secure cloud-based personal health platform that enables people to access their EHRs from multiple healthcare systems and synchronize them with a research database. It also allows people to contribute information to their EHRs from wearables and questionnaires. Developed with the Yale New Haven Health System, it is being used in a number of studies involving healthcare delivery and medical devices. This includes projects underway through NESTcc, a real-world evidence organization that I recently wrote about in MedTech Strategist and that is one of the medical device industry's most active proponents of new ways to improve clinical evidence. "It is a health data asset that increases in value over time," Krumholz said.
Have comments on this post, or suggestions for topics you’d like us to cover in the Community Blog? Contact firstname.lastname@example.org.
Further Reading in MedTech Strategist:
“In Medtech, Real-World Evidence is Moving Beyond Theory to Regulatory Use,” by Wendy Diller
“Digital Innovation: Consumer Activation Comes of Age in Medtech,” by Ajay Gupta, Karen Passmore, Sundar Ganapathy, and Chris Smith
Start-Ups to Watch - “Medley Genomics: Tackling Complexities of Genomic Heterogeneity to Advance Precision Medicine,” by Wendy Diller
Start-Ups To Watch - “Personal Genome Diagnostics: One-Stop Cancer Genome Testing,” by Wendy Diller
Precision Medicine - “How the Device Industry Can Win in Cancer Immunotherapy: Devices as Immunoadjuvants,” by Wendy Diller
FDA Performance Snapshot - “2017 Regulatory Data Shows a Friendlier FDA,” by Wendy Diller
#CommunityBlog #MedTechStrategist #realworldevidence #Collaboratory #FDA #HUGOPHR #ScottGottlieb #medicaldevice #medtech #digitalhealth #RWE #Verily #Google #TheCollaboratory