In a recently published article, David Shaywitz, an executive at Takeda, discusses the impact of data science and analytics in the pharmaceutical industry. He begins by noting that data science and analytics has been in the “dancing bear stage” and that it’s time for it to “demonstrate its ability to materially impact health and disease.”

Much of the article focuses on parsing an Andreessen Horowitz podcast interview with Vas Narasimhan, chief executive officer at Novartis, and his views on the (limited) successes and (undoubtedly expensive) failures of data science and analytics deployments in pharma.

Early in the article, Shaywitz quotes a paper by Dr. Sachin Jain, former CMIO at Merck and now CEO of CareMore Health: “The conference circuit is now exploding with ‘AI and pharma’ conferences… consultants excitedly discuss pharma’s digital transformation (and convince each pharma brand they’re distinctly behind), and exuberant stories about the power of data and AI resound almost daily across social media.”

The Progress of Artificial Intelligence in Pharma

The question of progress in applying artificial intelligence in pharma is not solely a digital transformation issue. It’s also a broader innovation issue. Innovation is the result of disciplined experimentation, and in most use cases pharmaceutical manufacturers are still experimenting with emerging technologies. In the a16z podcast, Novartis CEO Narasimhan acknowledges that there’s a significant hype factor around AI in pharma and that Novartis has discovered the challenge of just cleaning its data well enough to get to a practical training data set.

One might be forgiven for thinking they’re implementing proven tech, given the consumer applications of what we commonly think of as machine learning, such as Alexa and Siri. But even in those consumer examples, we can see the fundamental problem embedded in the application of these technologies: you need a LOT of data to properly train the algorithms. And even with a lot of data, there’s no guarantee that you’ll get an effective result.

We’ve seen this inflection point before, in the eighties and nineties, when there was a surge in econometric modeling (AKA multivariate regression). The mathematical methodologies had been around for centuries, and the emergence of data and computing power allowed them to move from theory to practice. In a few instances, companies had both the data and the computing power. But often, models yielded questionable results due to the quality of the data, or systems weren’t powerful enough to process the high volume of data and live up to promises of real-time analytics. At the moment, AI in healthcare is not being held back by systems or analytic approaches; it’s the fragmentation and inaccessibility of data, which is why the AMA has launched its Integrated Health Model Initiative.
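The point about data quality is easy to demonstrate. The sketch below (illustrative only, using synthetic data and NumPy, not anything from the article) fits an ordinary least squares multivariate regression twice: once on clean predictors and once on the same predictors corrupted with heavy measurement noise, showing how poor data quality alone can turn a strong model into a questionable one.

```python
# Illustrative sketch: multivariate regression (ordinary least squares)
# and how noisy predictor data degrades model quality. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# "Clean" data: y depends linearly on three predictors plus small noise.
n = 500
X = rng.normal(size=(n, 3))
true_coefs = np.array([2.0, -1.0, 0.5])
y = X @ true_coefs + rng.normal(scale=0.1, size=n)

def r_squared(X, y):
    """Fit OLS via least squares and return in-sample R^2."""
    X1 = np.column_stack([np.ones(len(X)), X])  # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

clean_r2 = r_squared(X, y)

# Simulate poor data quality: heavy measurement error in the predictors.
X_noisy = X + rng.normal(scale=2.0, size=X.shape)
noisy_r2 = r_squared(X_noisy, y)

print(f"R^2 with clean predictors: {clean_r2:.2f}")
print(f"R^2 with noisy predictors: {noisy_r2:.2f}")
```

The same mathematics that worked in the clean case yields a far weaker fit on the noisy data, which is the eighties-and-nineties lesson restated: the bottleneck was rarely the methodology.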

Demystifying Artificial Intelligence and Delivering Impact

Dusty Mujumdar, former CMO for IBM Watson Health, recently cited a few organizations making progress in demystifying AI and delivering real impact. Recursion Pharmaceuticals is one of them: its data scientists, software engineers, biologists, and automation specialists collaborate to combine artificial intelligence with automation and conduct experimental biology at scale, testing thousands of compounds on hundreds of cellular disease models in parallel and publishing peer-reviewed articles that clearly define their scientific advancements with life-sciences companies. It’s in these very focused, definable spaces that AI keeps showing its value.

In pharma this is, in part, a vendor-driven problem. Large technology vendors and systems integrators frequently market AI-driven solutions as robust and proven. And we’ve seen where that can lead (paywall). There’s no question that manufacturers do need to explore AI because it holds the promise of revolutionizing drug discovery as well as identifying new indications through analysis of real-world data sets. However, they must proceed with the full understanding that they’re experimenting and that, just like the traditional drug discovery process, there will be millions, if not billions, spent before there’s a viable commercial outcome.

Final Thoughts

As one begins experimenting, there are three places to watch. The first is single-payer markets, the larger the better. China is a great example: in a recent New York Times article, Dr. Kang Zhang, chief of ophthalmic genetics at UC San Diego, notes that the quantity of accessible, clean, and complete data sets in China, the size of its population, and its low clinician-to-patient ratio make it an ideal market for deep-learning diagnostic tools. The second place to watch is start-ups, whose survival depends on making AI work.

Inspirata and Arterys are two great examples of start-ups finding rapid success with AI, particularly by addressing the often-overlooked need for longitudinal data sets, which are in short supply in the US. Start-ups like these are important to watch, as they are literally betting everything on AI’s success and are putting sweat equity into unlocking its value. Lastly, look outside of healthcare. Leading analysts such as Altimeter’s Susan Etlinger regularly write about examples of the successes and struggles with AI.

Learn more about AI, digital transformation and innovation within the pharmaceutical industry.

