8 Nov 2021

A Remarkable Life in Science: David Baltimore on The Long Run

Today’s guest on The Long Run is David Baltimore.

David is one of the most accomplished biomedical scientists – and scientific citizens – of the past 50 years. He recently won the Lasker~Koshland Award for Special Achievement. The award was granted “for the breadth and beauty of his discoveries in virology, immunology, and cancer; for his academic leadership; for his mentorship of prominent scientists; and for his influence as a public advocate for science.”

David Baltimore, president emeritus, distinguished professor of biology, Caltech

It would take way more than an hour to discuss all of this work in depth.

His fundamental work in virology led him down a whole set of interesting paths in immunology and cancer biology.

He won the Nobel Prize in 1975 at age 37.

Now at age 83, Baltimore is in a position to reflect.

In this conversation, we talk about his upbringing, the value of humanities training for scientists, some early career turning points, how he got involved with biotech, and the kind of opportunities he’d like to see open up for young scientists in the future. I think this conversation pairs quite well with the last episode with Tony Kulesa.

Now, before we get started with the conversation with David Baltimore, a word from the sponsor of The Long Run – Answerthink.

Today’s sponsor, Answerthink, has been consistently recognized by SAP, one of the largest enterprise software companies, as a top business partner for delivering and implementing SAP solutions for small and midsized life science companies. Their SAP-certified solutions, designed for the life science industry, are preconfigured, rapidly deployable, and address fundamental business and IT challenges such as:

  • Integrating your business applications
  • Delivering validated reporting
  • Increasing your speed to market
  • Supporting global rollouts
  • Delivering a fully compliant solution that meets the FDA’s strict standards

Explore how Answerthink can streamline your business processes to ensure growth.

Visit Answerthink.com/timmerman and get a copy of their e-book, “Top Three Barriers to Growth for Life Science Organizations.”

That’s Answerthink.com/timmerman

And, I’m pleased to welcome a new sponsor this week – Absci.

Absci is all about creating new possibilities in the realm of protein-based therapeutics. What does this mean?

Absci has a fundamentally different approach to drug discovery. It designs and develops next-gen biologics of any modality, from antibodies to T-cell engagers to completely novel protein scaffolds, including a futuristic format it calls “Bionic Proteins.”

Because Absci conducts its screens in its scalable production cell line, it collapses several steps of biologics discovery into one integrated, efficient process. Absci also has a unique computational antibody and antigen discovery approach for isolating fully-human antibodies from disease tissues and using these antibodies to identify novel drug targets.

Absci does all this with a powerful combination of deep learning AI and synthetic biology technologies. Absci is already helping some of the best partners in biopharma translate their ideas into drugs. Check them out at absci.com and absci.ai.

Now, please join me and David Baltimore on The Long Run.

7 Nov 2021

Aspiring Healthtech Companies Require A Nuanced Understanding Of Healthcare’s Human Dynamics

David Shaywitz

The best concise summary I’ve seen of tech industry travails in healthcare comes from Blake Dodge, a reporter at Business Insider. Dodge had just written a piece about Apple Health, and took to LinkedIn to share an additional quote that didn’t quite make the final copy. 

“I think they came to it pure of heart, really thinking that they could design something that would be ultimately logical,” a source at Apple told Dodge. “But the problem is, is that the actual way that healthcare is delivered in the US is not a one size fits all experience. It’s actually incredibly heterogeneous.”

Bingo. 

What’s more, this heterogeneity is expressed on every level, from the services and care provided to the way health information is managed – as Google engineer Adam Connors recently discovered.

An unexpected illness brought Connors face-to-face with the healthcare system, an experience that shocked him as a patient and troubled him as an engineer who thinks about how to improve systems. Writing at CNN, Connors poignantly described the very personal consequences of a system unable to consistently access coherent and complete data, deficiencies that delayed diagnosis and complicated treatment. Being sick was stressful enough; dealing with a dysfunctional healthcare system made it even more frustrating.

Connors wondered how much waste and confusion could have been averted if his healthcare providers had a longitudinal health record, which would include all of an individual’s health information, independent of where care had been received. (I couldn’t agree more, and at a recent seminar at Duke, I pointed to the need for a comprehensive account of our health journeys as the single most important requirement for data-driven medicine of the future.)

It’s easy to feel discouraged about information sharing. For those working in healthcare, the obstacles are all-too familiar.

Consider an example that emerged from a panel discussion I recently moderated on digital health and diabetes co-hosted by the Digital Medicine Society (DiME) and the Veterans Administration (video here).

To facilitate the care of diabetic populations, a startup called Glooko developed a platform to aggregate a range of diabetes-related information. Glooko’s hope is to integrate data in a way that enables healthcare providers to remotely monitor the health of their diabetic patients based on the data the patients regularly collect. Dr. Varman Samuel, an endocrinologist at Yale University and the VA, highlighted two conspicuous compatibility problems for Glooko: data from Medtronic insulin pumps and from Abbott’s continuous glucose monitoring system. Moreover, it seems that Glooko used to have access to data from Abbott devices, until it was “unilaterally blocked” by Abbott, according to this Medscape report.

An Abbott scientist on the panel defended the company, saying it collaborates with a number of manufacturers. Yet, as one of the panelists suggested, it appears Abbott has opted for a closed system, developing a data platform, LibreView, that’s analogous in intent to Glooko, yet which seems only to work with Abbott devices. It’s a familiar “walled garden” strategy utilized by a number of medical device manufacturers (for example, those who make ICU equipment), intended to motivate purchasers to remain loyal to the brand. Fellow residents of Apple’s walled garden will recognize the approach (as the Wall Street Journal’s Joanna Stern discusses here).

Harvard bioinformatics professor Dr. Zak Kohane has seen many such examples, where well-intended data sharing schemes collide with more parochial interests.

Zak Kohane, Chair of the Department of Biomedical Informatics, Harvard Medical School

Kohane recently offered an especially thoughtful perspective on the challenges in healthcare data in a just-released podcast hosted by Apple AI expert Adriel Saporta and Harvard AI researcher Pranav Rajpurkar.

The whole podcast is required listening, including an excellent, concise overview at the top of the show by the hosts of the history of health data legislation, moving briskly from HIPAA to HITECH to MACRA to 21st Century Cures. 

In the subsequent discussion with Kohane, seven themes stood out.

  1. Flying blind

According to Kohane, despite decades of lamentation, “we have an uninstrumented healthcare system” meaning that “things happen in our healthcare system and we don’t even count them.”  This matters, he says, because “you can’t have accountable care if you can’t count.” 

  2. Data access

“The beginning of making our data work for us,” Kohane says, is to have our data “in a systematic, computable, form, or at least to have it,” so that various statistical methods can potentially “transform it into something computable” which can be used to generate “insights at a population level, and actions on the individual, clinical level.” In Kohane’s mind, while having computable data is helpful, he would absolutely prioritize gaining access to data in the first place, whether these data are immediately computable or not.

  3. Data silos and human behavior

Despite years of handwringing and public opprobrium, data silos remain a fixture of the healthcare landscape, as Google engineer Adam Connors’s experience painfully captures. The fundamental issue, Kohane suggests, isn’t technology; it’s people.

“Lack of sharing,” Kohane says, “is core to the human condition,” and is exhibited by stakeholders throughout the system, from companies to healthcare systems to academic researchers to patient advocacy groups (yes, you heard that right). Kohane cites an example of “three different breast cancer groups not sharing data with one another,” an observation that would surprise no one familiar with the space. (I’ve discussed challenges of data sharing, including contrast between stated and revealed preferences, here.)

  4. Challenge of incentives

From the perspective of patients, the benefits of improved health data sharing are clear. Kohane shared a published example from 2009 showing that fairly coarse hospital discharge data could be used to detect domestic abuse in emergency room patients. Yet according to Kohane, “twelve years after we published that study, no one is doing it.” It’s “a huge health issue,” Kohane points out, but “is not an obvious income generating thing.”

Kohane also highlights the challenges faced by hospital CEOs, who might need to balance the desire of a physician to share data with what can feel like asymmetric downside risk, such as a data leak, that puts the executive on “the front page” of the paper. In other words, the risks associated with sharing data tend to feel larger (and certainly more palpable) to decision-makers than the consequences of not sharing. 

Thus, while data sharing may be something that clinicians, researchers, and hospitals should do, actually persuading people to do things they don’t perceive as serving their self-interest can be a challenge. As Upton Sinclair famously said, “It is difficult to get a man to understand something when his salary depends on his not understanding it.”

Incentives impacting the adoption of emerging technologies have also been a preoccupation of thoughtful healthcare leaders including Stanford’s Dr. Kevin Schulman, who has emphasized the need for “innovative business models,” data that are “actionable by patients,” and an economic model that leverages data but focuses on service.

  5. Multiple paths forward

Kohane discussed several potential paths forward, advocating the pursuit of all of them simultaneously. 

For starters, Kohane leads several data sharing initiatives, such as the i2b2 consortium, and the 4CE consortium to study COVID-19. By demonstrating the clinical and scientific value of data sharing, and learning what the sticking points are, the hope is to pave the way for broader changes. Kohane also highlights the role of regulation in compelling institutions to share data more substantively — as well as the need for institutions to understand and effectively manage the many regulations that seem to inhibit data sharing.

But ultimately, Kohane seems to have the greatest faith in what he describes as an “end run” – consumerism.

“Our data may or may not be made to work for us by regulators, by medical leadership, and by other august leaders,” he says. “But the consumerist wave, which will continue to flow over healthcare” may drive progress. He cites the example of the ability to download information from an increasing number of hospitals to Apple Health using standard APIs (application programming interfaces) as an indication of where the future may be headed. “In the long term, I think it’s all going to be in the consumer play because it’s where you have the scale and where you have alignment of interest.”

  6. No database to rule them all

Asked whether he’d like to see a single, consolidated database with all healthcare information, Kohane said he much prefers a federated approach, meaning that data live locally, and can be selectively shared using agreed-upon standards. As Kohane and Dr. Ken Mandl wrote in 2016, a distributed model enables “each institution’s data” to be “maintained separately,” which “afford local control over data.”

Why the federated approach? As Kohane explained on the recent podcast, “Medicine itself is so heterogeneous… it’s a very labor intensive, still, artisanal practice in many, many ways that if you just shove things up into a shared database, you’re actually missing a lot of the differences in practice.” 

In essence, co-host Saporta summarized, “without having the context of where the data is coming from, you’re losing a lot of very important information.”

Instead, Kohane advocates the creation of “purpose-built” federated databases. He emphasizes that “no interpretation of clinical data … is universal for all purposes.” It’s critical to understand the context in which data are generated, he says, echoing a point that both Dr. Amy Abernethy and I have repeatedly made. Context matters.  

As Kohane explains, “It’s just like science: a good biologist who refuses to understand the system in which they measure the proteins is as unlikely to come up with an interesting and robust finding as a machine learning expert who doesn’t want to get down and deep into how the data was generated.”

  7. Trust

The notion of sharing health data faces increasingly strong headwinds, as discussions of tech companies’ questionable use of data and penchant for surveillance have attracted national attention. The opportunity, says Kohane, is “to leverage the intuition that many patients have, which is they trust their doctor and they trust their healthcare system more than they trust some anonymous large data company.” He suggests we should “focus on the sociology of where trust lies and maybe try to widen it to different spaces.”

The importance of the trusted relationship between provider and patient, he noted, was evident in his experience assisting his elderly mother, who had struggled with her fluid balance and was hospitalized several times for diuresis. As Kohane had originally described to NPR, he had managed to help his mother stay out of the hospital simply by monitoring her daily weight, and, critically, engaging with her effectively when the numbers were off. 

The key, Kohane emphasizes, wasn’t the sophistication of his (incredibly basic) treatment algorithm, but rather his relationship with the patient. He notes that while previous peer-reviewed studies of this approach had generally found it ineffective, he was able to succeed, he believes, because of the robust daily dialog.

Bottom line: 

Healthcare is still struggling to effectively utilize technologies that have transformed other industries. It seems clear by now that the most significant hurdle isn’t technology, but people and human behavior. At the same time, the greatest hope for progress in health also lies with people. 

The challenge for technology is to leverage the best of medicine – including the often-powerful therapeutic relationship between caregiver and patient – while remaining cognizant of the very real anxieties. Most patients don’t want to be monitored all the time, or to continuously think about their health; hospitals want to provide excellent care but not reveal information that places them at a competitive disadvantage; researchers want to contribute to science but also manage their careers; companies have shareholders to consider. 

Emerging digital and data technologies continue to offer exceptional promise for healthcare, but delivering on this potential remains elusive. Success will depend less on the elegance of a particular technology, and more on the emotional and organizational intelligence of the health technology team attempting to wield it.


26 Oct 2021

Young Biotech Entrepreneurs Finding Community: Tony Kulesa on The Long Run

Today’s guest on The Long Run is Tony Kulesa.

Tony is the co-founder of Boston-based Petri. It’s a seed and pre-seed investor in biotech startups. But that doesn’t quite fully describe it.

Tony Kulesa, founder, Petri.bio

It’s also a community for young founders trying to figure out how to get new enterprises off the ground and connect with a network of seasoned entrepreneurs who can provide helpful advice. The Petri community is part of what some call the “founder-led biotech movement,” as opposed to the more traditional VC-led startup world.

Petri is a type of VC, but it has some differences in how it relates with founders, as Tony describes in this episode. As for its tastes, it gravitates toward companies at or near the intersection of biology and engineering – which can lead to therapeutics, industrial applications, and more. The advisors in the Petri network include founders and leaders behind companies like Ginkgo Bioworks, Twist BioScience, Exact Sciences, insitro, Beyond Meat, and more.

Petri is part of a larger fund called Pillar, and it recently raised Pillar Fund III, which is a combined $190 million across two funds.

Tony got his PhD from MIT. He has a history of working in small businesses. When he and a few classmates in grad school saw a need for more of a startup community at MIT, they got to work creating the MIT Biotech Group. It’s now a thriving ecosystem of young entrepreneurs, and part of a larger cross-campus effort called Nucleate.

I got to know Tony a few years ago when I was puzzled by the challenges of young people breaking into biotech, and wrote a few articles about this odd phenomenon at a time of maximum possibility in biotech.

For background, see previous Timmerman Report columns on young people in biotech.

One brief announcement: Petri and Pillar are organizing the Founder-Led Biotech Summit. It’s a free, virtual event held Nov. 1-5 with a pitch competition, awards, and a lineup of interesting speakers made up of young entrepreneurs, and some of the older generation that supports the work. Check it out at founderledbio.com.

And, before we get started, a word from the sponsor of The Long Run – Answerthink.

Answerthink has been consistently recognized by SAP, one of the largest enterprise software companies, as a top business partner for delivering and implementing SAP solutions for small and midsized life science companies. Their SAP-certified solutions, designed for the life science industry, are preconfigured, rapidly deployable, and address fundamental business and IT challenges such as:

  • Integrating your business applications
  • Delivering validated reporting
  • Increasing your speed to market
  • Supporting global rollouts
  • Delivering a fully compliant solution that meets the FDA’s strict standards

Explore how Answerthink can streamline your business processes to ensure growth.

Visit Answerthink.com/timmerman and get a copy of their e-book, “Top Three Barriers to Growth for Life Science Organizations.”

That’s Answerthink.com/timmerman

 

In the Gibco ‘Art of Cells’ project, we paired artists from around the world with research scientists, then tasked them with creating a piece of cell-artwork inspired by their scientist’s unique perspectives.

Meredith Woolnough is an embroidery artist from Newcastle, Australia. Her elegant and intricate artistic style takes inspiration from the organic structures of the natural world.

Paired with Dr. Marietta Hartl, a postdoctoral researcher specializing in embryonic development, Meredith’s amazing artwork takes inspiration from nature in a completely new way, as she endeavors to capture the earliest stages of life, in knitted threads.

Discover Chapter 1 of the ‘Art of Cells’ project at thermofisher.com/GibcoLoveYourCells.

Now, please join me and Tony Kulesa on The Long Run.

12 Oct 2021

Reimagining DNA Sequencing With Long Reads: Christian Henry on The Long Run

Today’s guest on The Long Run is Christian Henry.

Christian is the CEO of Menlo Park, Calif.-based PacBio. It makes DNA sequencing instruments that are used by scientists around the world.

Christian Henry, CEO, PacBio

The company has long toiled in the shadow of the market leader in DNA sequencing – some would say monopolist – Illumina.

PacBio was way overhyped in its early days, and crashed hard. It almost went out of business. But it always had a hard-core band of loyal customers that allowed it to hang in there. I wrote a column in 2014 calling PacBio the “post hype sleeper” of genomics, predicting a comeback.

It has taken a long time, but that’s what happened. PacBio found a way to hang in there so it could continue to improve its technology on key parameters that matter for its scientific customers – cost, throughput, and accuracy.

PacBio’s most compelling advantage is that it offers “long read” sequencing. That means that its instrument can read long segments of DNA from a sample, before they get assembled back together into a whole genome. Long-read technology enables PacBio to scan especially tricky parts of the genome – long repeats and structural variations especially – and piece them together with a high degree of accuracy. Illumina, in contrast, built its empire with “short read” technology, which had important advantages in terms of cost and speed.

A couple years ago, Illumina sought to acquire PacBio for $1.2 billion. That deal was scrapped after antitrust regulators in the US and Europe raised monopoly concerns.

This is where the story gets really interesting. Henry is a former Illumina executive, one of the key players in its rise to dominance. He competed against PacBio for years. He had retired from Illumina, and was sailing around the world with family. When the Illumina acquisition fell through, he took on a new challenge as CEO of PacBio. If PacBio was going to be independent, it would need to find a way to compete.

Over the past year, under new leadership, PacBio has executed on a bold growth strategy. Henry brought in some key recruits from his former employer. PacBio has raised a ton of cash, and now has more than $1 billion in the bank. It acquired San Diego-based Omniome – a company with short-read sequencing technology that’s supposed to deliver a higher degree of accuracy. That could help fill a void in the product lineup PacBio offers to customers who want short-read technology as well as long-read technology.

PacBio is now worth more than $5 billion.

It looks like a much stronger competitor now than it did a year ago.

It’s a classic story for The Long Run.

Now, before we dive in, a word from the sponsor of The Long Run – Answerthink.

Answerthink has been consistently recognized by SAP, one of the largest enterprise software companies, as a top business partner for delivering and implementing SAP solutions for small and midsized life science companies. Their SAP-certified solutions, designed for the life science industry, are preconfigured, rapidly deployable, and address fundamental business and IT challenges such as:

  • Integrating your business applications
  • Delivering validated reporting
  • Increasing your speed to market
  • Supporting global rollouts
  • Delivering a fully compliant solution that meets the FDA’s strict standards

Explore how Answerthink can streamline your business processes to ensure growth.

Visit Answerthink.com/timmerman and get a copy of the e-book, “Top Three Barriers to Growth for Life Science Organizations.”

That’s Answerthink.com/timmerman

 

 

‘Art of Cells’ is the latest installment of Gibco’s campaign, which explores the wonderful relationship between research scientists and their cells. We’ve searched far and wide for talented artists who express creativity in their own, incredibly unique style, and paired them with one of our research scientists.

 Each artist was then tasked to create a piece of artwork that portrays the true beauty of cells. From poetry to embroidery, 3D animation to music, the ‘Art of Cells’ project promises to explore cell science in a way that has never been done before.

Discover the ‘Art of Cells’ at thermofisher.com/GibcoLoveYourCells.

Now, please join me and Christian Henry on The Long Run.

4 Oct 2021

Pharma’s Digital Transformation: Enduring Challenges, Sustained Hopes, And A Progress Report From Lilly’s CEO

David Shaywitz

When it comes to emerging digital and data technologies, most pharma CEOs today are singing from the same hymnal.

They all emphasize their commitment to digital transformation, and assert that the adoption of digital processes is key to their companies’ success, and vital for the industry’s future.

A recent Lazard survey of healthcare leaders echoes this message. The investment bank found that most senior executives believe “advances in digital technologies, data analytics, AI and ML” represent the force that will “most transform the healthcare industry over the next 5-10 years.”

Outside of the C-suite, however, the view of emerging digital technologies seems noticeably more pessimistic. As one physician managing a large team at a big pharma wrote me this week, reacting to a STAT article on a disappointing clinical sepsis-prediction algorithm:

“Hard for me to mentally reconcile real world experience like this with all the hype around how we probably won’t need doctors in the near future because ‘AI will do the same work, but better,’ and we won’t really need drug hunters either because AI is going to solve that too.”

This operator candidly captures the current sentiments of most practicing physicians and drug developers, who’ve noticed that the promise of emerging digital technologies doesn’t yet appear to have found consistent, meaningful, impactful expression in their work. 

If digital technologies are going to transform the industry, where is this change? And when are we going to start to see palpable and persuasive examples, rather than continuing to lean on theoretical use cases and rosy consultant forecasts?

Time Course Of Innovation

For starters, we should consider the life cycle of innovation (a topic I’ve discussed at length here), based on the model of economist Carlota Perez. 

Short version: it characteristically takes a long time — a really long time, on the order of decades — to go from a raw discovery to implementation at scale. It’s hard for people to get their heads around something that’s new. Often, a series of incremental tweaks are required before a novel concept can achieve widespread use. 

This phase – trying to get our hands around emerging digital and data technologies – is exactly where we are now in healthcare. Figuring out how to apply these technologies effectively is both our abiding challenge and our remarkable opportunity.

Often, we seem to have a magical view of technology in general and of AI in particular. There’s even a phrase – “enchanted determinism” – that’s been used to lampoon the nearly religious view that some have of AI, the view that AI will somehow solve what ails us.

But AI is a tool, not a deity, despite the reverential way it is spoken of by some impassioned advocates, entrepreneurs, and investors. It’s a useful tool for solving certain types of problems.

The next question: how do we begin to unlock and access the exceptional potential of emerging digital and data technologies in biopharma?

Here, we seem to be trapped between two extremes – magic and nihilism:

Magic in the sense of the belief that a thorny problem can be resolved simply by applying a powerful technology, generally AI. It can often seem like AI is pitched as the solution for every problem, including those that haven’t yet been identified. The tendency to exaggerate the potential of a new technology to promote adoption only adds to the challenge. The almost inevitable failures here tend to reinforce a sense of…

Nihilism, a belief that emerging digital and data technologies are so overhyped as to be functionally useless at best, and a waste of time and resources at worst. Busy drug developers often find themselves figuring out how to avoid getting dragged into what they see as the latest innovation initiative so that they can instead use the time to get their actual work done.

Slope of Enlightenment?

So why am I still so optimistic about the potential for emerging digital and data technologies to radically improve the way new medicines are discovered, developed, and delivered?

Short answer: because we’re learning.

  1. Tech seems to have developed a more nuanced appreciation for the complexity of healthcare, drug development, and the human organism. Deep domain expertise is critically required for success – one of the reasons that leading technology companies have recruited and organizationally empowered some of the world’s most thoughtful and integrative physician-scientists, including Dr. Taha Kass-Hout, now at Amazon, and Dr. Amy Abernethy, who recently joined Alphabet’s Verily.
  2. Healthcare organizations and the biopharmaceutical industry wrestle every day with a range of challenges that digital and data technologies should be able to help address. Data obviously are critical to the durable maintenance of health, the effective treatment of disease, and the efficient development of meaningful new therapies. There remains an urgent, desperate need to upgrade the way we gather, utilize, and share information. This was a critical learning from the pandemic, including in a soon-to-be-published analysis that Abernethy, Microsoft’s Peter Lee, and I, along with an extraordinary team of collaborators, conducted as part of a more comprehensive initiative organized by the National Academies of Sciences, Engineering, and Medicine (stay tuned!).
  3. Tech, once seen largely as an entrepreneurial force for good, is now evaluated far more critically – this is essentially the theme of every AI book I’ve reviewed for the Wall Street Journal in the last several years (see here, here). The good news here is that this fall from grace has prompted many tech companies to engage more thoughtfully and collegially with healthcare and biopharma companies (as I allude to here).
  4. Perhaps most importantly, digital and data technologies are becoming less exotic and more normalized – capabilities that are starting to be more routinely incorporated into the training of physicians, scientists, and business executives. I remember how excited I was when I bought my first smartphone; in contrast, my teenage kids view smartphones as ordinary, an established component of their world. For them, smartphone use is unremarkable and routine. Future generations of physicians and drug developers are likely to view the application of today’s emerging digital and data technologies in much the same way.
Worked Example: Lilly

To see how these trends are starting to play out, we can consider the example of Lilly, a company that, for the first time, was ranked top in innovation in Idea Pharma’s annual Pharmaceutical Innovation Index, released in April 2021.

Dave Ricks, CEO, Eli Lilly

Last week, at the “Future of Health Data Summit” in Washington, DC organized by Datavant, Lilly’s CEO Dave Ricks was interviewed onstage by Martin Chavez about Lilly’s approach to digital and data technologies. Ricks took the reins at Lilly in January 2017; Chavez is the former chief financial officer and chief information officer of Goldman Sachs and currently partner and vice chair of the global investment firm Sixth Street Partners.  (Note: I have no relevant disclosures related to either Datavant or Lilly.)

Chavez’s interview of Ricks covered two general areas:

– Examples of impact / examples of continued challenge

– Approach to organization and talent

Lilly: Examples of Impact

In 2019, about a year after Vas Narasimhan became CEO of Novartis and told the Wall Street Journal he aimed to make it “a focused medicines company that’s powered by data science and digital technologies,” Narasimhan reflected on his early learnings.

On the positive side, he cited AI’s contribution to the company’s finance and clinical trial operations. But he also acknowledged that “The Holy Grail of having unstructured machine learning go into big clinical data lakes and then suddenly finding new insights” remained elusive.

What was perhaps most striking about Chavez’s recent discussion with Ricks was how similar many of the themes were. Ricks similarly emphasized the use of data on the commercialization side – “pricing problems…spend optimization problems” – while also pointing out that this use case is “pretty well-trodden in pharma.”

Also like Narasimhan, Ricks highlighted the value of data in improving clinical trial operations, calling out in particular a clinical trial optimization capability – “a virtual tool” Lilly uses to run simulations that seek to optimize trial protocols by examining how tweaking a range of parameters affects outputs such as projected recruitment speed and screen failure rate.
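Lilly hasn’t published how its tool works, but the kind of simulation Ricks describes can be sketched in a few lines: a hypothetical Monte Carlo model that estimates how tightening eligibility criteria trades recruitment speed against screen-failure rate. Every parameter, number, and function name below is invented for illustration.

```python
import random

def simulate_protocol(eligibility_rate, sites, patients_per_site_month,
                      target_n, trials=1000):
    """Monte Carlo estimate of months-to-full-enrollment and
    screen-failure rate for one protocol configuration
    (all numbers hypothetical)."""
    months_needed, failure_rates = [], []
    for _ in range(trials):
        enrolled = screened = months = 0
        while enrolled < target_n:
            months += 1
            # Each month, every site screens a fixed number of candidates
            for _ in range(sites * patients_per_site_month):
                screened += 1
                if random.random() < eligibility_rate:
                    enrolled += 1
        months_needed.append(months)
        failure_rates.append(1 - enrolled / screened)
    return (sum(months_needed) / trials, sum(failure_rates) / trials)

# Compare two protocol variants: stricter criteria screen out more
# candidates, slowing recruitment and raising the screen-failure rate
loose = simulate_protocol(eligibility_rate=0.5, sites=20,
                          patients_per_site_month=3, target_n=300)
strict = simulate_protocol(eligibility_rate=0.2, sites=20,
                           patients_per_site_month=3, target_n=300)
print("loose criteria:", loose)
print("strict criteria:", strict)
```

A real protocol simulator would model many more knobs (site activation timing, dropout, inclusion/exclusion interactions), but the structure – sweep protocol parameters, simulate, compare projected outputs – is the same.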

Ricks seemed far more cautious about the contribution of AI to the discovery phase of their work, although he acknowledged the apparent success of BenevolentAI’s approach in identifying Lilly’s marketed rheumatoid arthritis drug baricitinib (a JAK inhibitor marketed as Olumiant) as having potential application in the treatment of COVID-19.

Subsequent clinical studies have documented the utility of the medication in a subset of hospitalized patients, as MGH infectious disease physician Rajesh Gandhi concisely summarizes here. Gandhi tells me the data suggest “baricitinib has an important role in hospitalized patients who are rapidly worsening and have evidence of inflammation.”

More generally, however, Ricks remained guarded about the application of AI to discovery. “The idea that you could just give a biology problem to a computer and it will tell you what a drug design is for it is not a reality, and it’s not going to be a reality for maybe a decade or more,” he said.

Asked by Chavez to identify the key obstacles in leveraging data, Ricks explained that “in life sciences, in general, the problems…are the big ones, which is we don’t have enough data or enough understanding about how to organize it to simulate the human organism. I think that’s a big problem.”

In contrast, Ricks said, “The easier problems are from the market working in.” In Ricks’s view, many of the commercialization challenges in pharma are not fundamentally different from those in other sectors: “there’s just different data.”

Manufacturing challenges in pharma, Ricks suggested, are also similar to other industries. As Siebel mentions in his book, Ricks pointed to the application of data tools to optimize predictive maintenance.

While clinical trials are obviously not a component of most other industries, they also represent, Ricks says, an “operational problem” and thus are also ripe for data-driven optimization that doesn’t rely on deep biological understanding. “We don’t need to envision how liver cells interact with heart cells to understand this,” he explains. “We just need to know our own operation and then work on ways to optimize that.”

Ricks argues Lilly’s efforts here over the last decade have significantly reduced drug development cycle time (the time between IND filing and FDA approval), from over 12 years to under six years on average. He said Lilly, and the industry more generally, could get even faster, adding that five years from IND to NDA is an appropriate goal.

In contrast, on the discovery side, Ricks doesn’t expect similar holistic, data-enabled improvements in the overall process, but believes “the discovery side will be more about tools that become platforms that eventually become solutions.” He adds, “we have to start somewhere.” In other words, AI might not comprise the whole “toolbox,” but might offer “a tool or two…that can help the human medicinal chemist be better at their job, speed up certain processes, and allow you to find success.”

Several specific data tools have yielded promising results in discovery, Ricks says. For example, the company has had some success “applying ML tools to very specific organic chemistry problems.”

He also highlighted the company’s positive experience using computers to profile protein structures, and predict which chemical entities in a known library are likely to have the best fit – a useful shortcut, he says. 

One pleasant surprise: the utility of algorithms in monoclonal antibody development. Although biologics tend to be large molecules, he says, the relevant interface space is much smaller, and thus more manageable and predictable.

“It seems paradoxical,” he acknowledges, “because biologics seem more complicated, but in this way they’re less complicated.” He asserts this sort of approach saves “maybe 15, 20% of the time on monoclonal antibody discovery,” adding “we use it routinely.”   

Lilly: Approach to Organization and Talent

Since becoming CEO in 2017, Ricks says, he has focused on the company’s digital and data strategy and associated company organization. He’s placed someone specifically in charge of data and data strategy, and established a centralized group that’s in charge of all enterprise data management and analytics and yet does not operate through a “centralized model.”

Instead, Ricks describes the group as working through something more like a “consulting model”: data scientists’ careers belong to the Center (for data and analytics), but their day-to-day work reports to the function they’re embedded in (such as clinical pharmacology or new product planning). They roll back to the Center when a project is complete, he explains, and then wait to go out to work on the next suitable problem.

Ricks notes that the market for data scientists is “hotter than anything” and “so competitive,” adding that “10% of our [data scientist] roles are open at any given time.”

One of the most interesting observations shared by Ricks was the apparent success of a program that enabled interested, traditionally trained scientists to “go back to school while they work,” receiving additional training (on Lilly’s dime) as data scientists. When they return, Ricks says, “they become the data scientist partner of their former lab mates.”

According to Ricks, this represents “a much better approach than just dropping in theoretically trained data scientists who then have to learn everything about chemistry and biology, which is complicated.” Because the model has been so successful, Ricks says, it’s been extended from discovery scientists to include clinical development researchers as well.

In other words, according to Ricks, it seems to work out better, at least at Lilly, to teach traditional (but motivated) pharma researchers about data science than to teach traditional data scientists about pharma – suggesting that industry domain expertise is what’s most difficult to learn.

It’s not clear this choice will even be necessary in the future. 

As Chavez, the former Goldman Sachs executive, pointed out, data science first came to finance several decades ago. Today, he says, “people who are traders are also data scientists. So rather than having people in different roles with different experiences collaborating together, you wanted to get it all in the same body.” 

Ricks emphasized that he didn’t think everyone at Lilly needed to become a data scientist, though he said data are part of everyone’s job. Consequently, over the last three years, Lilly has “run a pretty comprehensive retraining of the entire workforce on basic tools,” with an emphasis on teaching employees how to “self-serve your own data queries and how to access the data you need to access in this enterprise dataset – and know when you’re at your edge and when you need to call in an expert.”

Bottom Line

Between the breathless promises and ensuing disappointments, it’s easy for clinicians and researchers to become disillusioned, and write off the application of digital and data technologies as just the latest innovation trend. 

This would be a mistake.

Emerging digital and data technologies are following a familiar innovation journey, and we are incredibly fortunate to be at one of medical history’s most exciting inflection points. Clinical providers and medical researchers in universities and industry – those on healthcare’s front line — have the remarkable opportunity – and arguably also the obligation — to figure out how to leverage powerful but still relatively raw technologies, and come up with a way to apply or adapt them to our most pressing health challenges. 

Bridging the gap between technology and application absolutely requires the insight, experience, and expertise of those in the trenches, actively wrestling with problems and intensively searching for more effective solutions. Both the engineers developing new technology, and the lead users in healthcare and biopharma who seek to apply it, increasingly recognize the urgent need for partnership and creativity if the potential associated with emerging technologies is to be translated into durable improvements in human health.