October 28, 2022
In the 1990s, biomarkers were the future of personalized medicine and a “lab on a chip” was the holy grail of understanding patient health. Yet while past predictions such as whole organ transplants and intra-body cameras have been realized, the notion of measuring an array of proteins to better understand a patient’s individual health, current and future, has still eluded us. But maybe for not much longer.
This fall, a panel of researchers convened by STATnews[1] discussed this very topic as it applies to oncology, focusing on where we are in using a person’s chemistry to optimize their diagnosis and treatment. No two oncology patients are the same, with differences that extend beyond patient and tumour characteristics, so each patient’s path to diagnosis and treatment is different. Biomarkers in blood, fluids, and tissues have significant potential to guide targeted therapies individualized to each patient and their disease.
The panel discussion was a tantalizing look at what’s in reach now, both in terms of diagnostic and therapeutic oncology targets, but also how we can use emerging technologies to re-evaluate how we frame personalized medicine.
It’s not just about getting the data—it’s about developing better analysis tools
We finally have the capacity to interrogate “Big Data”, meaning we can look at biomarkers in terms of systems, which is critical for how we understand epigenomics and early-stage cancer detection. In terms of current treatments, biomarkers can offer personalized clinical information that physicians can use to make treatment decisions. The challenge is that no single biomarker, or set of biomarkers, can completely predict a patient’s response to a particular treatment, which reflects the heterogeneity and complexity of disease pathways. So deciding which blood chemistry tests to order for a patient is often not clear, at least not yet. And that’s where we need to involve the broader research community.
We need mathematicians and engineers, too
Frankly, biomarker research tends to attract biologists interested in areas such as genomics and proteomics rather than engineers. But things are changing, and a sweet spot between research science and engineering is the emerging field of synthetic biomarkers. As one panellist explained, the framework around synthetic biomarkers starts with understanding the limitations of natural biomarkers. Engineers can then synthesize new solutions that overcome the transient nature of natural biomarkers, and that has the potential to take medicine in a different direction altogether.
What comes next?
Applying large-scale computing to cancer data is now a reality. What is missing, however, is elevating that work from observational data to practical insights. It’s an area in which mathematicians and physicists have much experience, but those researchers don’t typically get involved in this type of work.
More than ever, we need to look at a shared model of research and analysis. Pharma can’t do this on its own, and neither can physicians. We need to find ways of bringing multidisciplinary teams of doctors and nurses, researchers, statisticians, mathematicians, engineers, and more together to work on these complex datasets. It’s not the way we have worked in the past, or are working now. But if we truly want to glean learnings from large biomarker datasets that are only now possible, much less apply those learnings to individual patients, we need to rethink who analyses the data and how.
Vanitha Sankaran, PhD, Medical Director, is part of the global medical strategy team at Evoke Mind + Matter. She works with her fellow matter makers to develop medical insights and strategy that drive our business with pharmaceutical and life sciences clients. We strive to make health more human by creating meaningful experiences to bring meaningful change.
[1] Sponsored by Astellas Pharma
This content was provided by Inizio