By Michael Fejtl
Cell-based and in vitro assays are cornerstones of successful drug discovery and development, informing critical decision points at every stage of the process, from target identification through to pre-clinical testing. Poor assay choices can lead to irrelevant, variable or misleading results that translate into delays and costly program failures further down the line. Here we look at some recent assay technology trends that promise to improve productivity and reduce attrition rates in drug discovery and development. With them come new or more intense challenges for successful assay development and implementation, but in the long run, their added information content may make them the more cost-efficient alternatives.
Discovery of new medicines depends on successful development of high-tech assays that address more complex biological questions.
More complex assays for more sophisticated questions
To keep drug development pipelines flowing, even as many of the big revenue-generating drugs come off-patent and the ‘low-hanging fruit’ in drug discovery diminishes, investigators are turning their attention to more complex diseases and asking increasingly sophisticated questions. The need to address these more difficult questions is driving demand for more informative and biologically relevant assays. This in turn is giving rise to novel tools and analytical approaches that are reshaping assay development.
With that in mind, here are seven emerging trends you probably can’t afford to ignore when considering new assays for drug discovery and development.
1. More relevant cell models. Regardless of the assay approach—target-based or phenotypic; biochemical, genetic or cell-based—results can only be as biologically meaningful as the model systems upon which they are based. For this reason, cell-based assays have gradually become a mainstay in drug discovery and development because they provide additional physiological context compared to more traditional in vitro biochemical methods.
More recently, breakthroughs in stem cell technology have enabled investigators to move away from immortalized and animal-derived cell models towards more relevant, differentiated, human cell types. In parallel, the emergence of efficient gene editing tools like CRISPR/Cas9 has enabled precision genetic engineering to generate much better disease models, along with reference ‘normal’ cells that share the same genetic background.
The ability to expand human embryonic stem cell (hESC) and induced-pluripotent stem cell (hiPSC) cultures before differentiation into the desired somatic cell type enables large batches of cells to be reproducibly generated in quantities sufficient for higher throughput applications. While stem cell-derived models and precision-engineered lines were once prohibitively costly and time-consuming to produce and validate, the barrier to entry has dropped dramatically, and there are now a good many human stem cell-derived models commercially available.
2. Phenotypic cell-based assays. With the advent of target-based approaches, phenotypic assays were largely phased out of drug discovery and development. However, over the past decade there has been a strong resurgence in their use, following retrospective studies (e.g. Swinney and Anthony¹) indicating that phenotypic investigations have been more successful than target-centric approaches in the discovery of ‘first-in-class’ small molecule drugs.
Better cell models, together with more robust and sensitive detection technologies, have facilitated more widespread implementation of phenotypic cell-based assays. Increasingly, these assays are being successfully miniaturized to 384-well formats. To characterize complex phenotypes, multiple end-points are often assessed within the same sample. In contrast to target-based approaches, phenotypic assays enable compounds to be studied and screened in a target-agnostic way, which can support discovery of novel therapeutic targets and identification of drugs with more favorable molecular mechanisms of action. Following identification of compounds that induce the desired phenotypic effect, a battery of ‘deconvolution’ assays is typically needed to elucidate underlying molecular mechanisms and potential drug targets.
Phenotypic Assay: “An assay where the measured signal corresponds to a complex response such as cell survival, proliferation, localization of a protein, nuclear translocation etc. The molecular target is not assumed.” — NCATS Assay Guidance Manual
3. Early phase predictive toxicity testing. The two main reasons for compound attrition in drug development are inefficacy and toxicity. Late-stage failures can cost drug developers millions of dollars, while undetected toxicities can lead to even more devastating financial liabilities, as well as presenting serious health risks to patients. Conventional toxicology testing is typically performed on animals at the pre-clinical candidate selection stage, after millions of dollars may already have been invested.
With the availability of better in vitro cell models and phenotypic analysis capabilities, predictive cell toxicity assays are being scaled up to enable screening of more compounds in the early phases of drug development, when the cost of failure is relatively low. Typically, cell-based in vitro toxicity testing monitors a number of different indicators of cell health and viability, along with specific toxicity markers, and may employ complex analytical methods to identify phenotypic signatures predictive of on- and off-target toxicities.
4. Live-cell interrogation. The term “cell-based assay” can refer to any assay based on the use of whole cells—whether the end-point is a lysate, a population of fixed (dead) cells, or a living cell culture. As probes and methodologies for analysis of living cells improve, live-cell assays are becoming more popular, since they preserve structure and physiological context, and enable measurement of probes and processes that are difficult to capture by destructive methods.
5. Kinetic data. Whether for in vitro characterization of enzyme inhibitors, capture of rapid calcium signaling, or the study of dynamic responses in living cells, kinetic measurements offer an additional informative assay dimension to elucidate the mode-of-action (MOA). Increasing attention on kinases as therapeutic targets has led to an improvement in assays that measure enzyme kinetics, along with the emergence of affordable kits such as those employing TR-FRET probes (e.g. HTRF®).
Automated cell-based assays are benefiting from the incorporation of temporal measurements. The ability to measure the kinetics of cell behaviors and specific markers has opened up new target classes, while temporal analysis of phenotypic drug responses enables investigators to distinguish between primary and secondary effects.
“Temporal analysis of cellular behaviors can lead to improved understanding of the phenomena underlying observations made in a phenotypic screen and allow for better differentiation between primary and secondary consequences of treatment.” —NCATS Assay Guidance Manual
6. Design of Experiments (DOE) methodology. As assays become more complex and multi-dimensional, assay development and optimization becomes exponentially more difficult, with a myriad of variables that can impact assay performance. In recent years DOE methodology, a statistical experimental design approach originally developed for engineering applications, has become a valuable tool in assay optimization. DOE approaches are helping assay developers home in on the most influential parameters and optimal assay conditions with a minimal number of experiments. That being said, DOE methods often require a large number of experimental combinations to be run with precision and accuracy to get meaningful results. This is becoming easier and more commonplace with advances in automation.
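As a simple illustration of the experimental-design bookkeeping involved, the sketch below enumerates a full-factorial grid over three hypothetical assay factors. The factor names and levels are invented examples, not recommendations; fractional designs, which DOE software typically generates, sample only a structured subset of such a grid.

```python
# Minimal sketch of a full-factorial design grid for assay optimization.
# Factor names and levels are hypothetical, chosen only to illustrate
# how quickly the number of conditions grows.
from itertools import product

factors = {
    "cell_density": [5_000, 10_000, 20_000],  # cells per well
    "serum_pct":    [0.5, 2.0],               # % FBS
    "incubation_h": [24, 48],                 # hours
}

# One run per combination of factor levels: 3 * 2 * 2 = 12 conditions
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))
for run in runs[:3]:
    print(run)
```

Even this toy three-factor design produces 12 conditions; add replicates and a few more factors and the appeal of both fractional designs and automated liquid handling becomes obvious.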
7. Data integration and standardization. Assay development in the 21st century is more data-rich than ever before. The potential to correlate data from multiple sources, such as target-based screens, phenotypic assays and ‘omics’ analyses, is both an exciting opportunity and a daunting challenge. As we run more assays in parallel and in diverse models, there is an increasing need to integrate all these data in order to gain insights that will help us make better decisions during the drug development process. For this to be possible, more concerted effort will be needed to standardize methodologies and ensure data are comparable from experiment to experiment and lab to lab. And of course we will also need to make the most of informatics and AI to process all the data and find the answers more quickly.
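One concrete example of a comparability aid is the Z′-factor (Zhang et al., 1999), a widely used metric for judging assay quality and plate-to-plate consistency from positive and negative control wells. The sketch below uses invented control values purely for illustration.

```python
# Sketch: computing the Z'-factor, a standard assay-quality metric.
# Control-well readings below are invented, not real data.
from statistics import mean, stdev

def z_prime(pos_controls, neg_controls):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    return 1 - 3 * (stdev(pos_controls) + stdev(neg_controls)) / abs(
        mean(pos_controls) - mean(neg_controls)
    )

pos = [980, 1010, 995, 1005]  # e.g. max-signal control wells
neg = [105, 95, 100, 98]      # e.g. background control wells
print(f"Z' = {z_prime(pos, neg):.2f}")
```

By convention, Z′ values above 0.5 indicate an excellent assay window, while values near zero or below suggest the assay cannot reliably separate signal from background from run to run.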
While there can be a significant payoff from incorporating more complex and sophisticated assays into drug discovery, in the short term it can increase assay development and validation time if you’re not prepared. In the next article, we look at steps you can take to reduce assay development cycle times and ensure successful implementation.
Assay development for drug discovery is a rapidly changing discipline. To keep up with these and other trends, the NCATS Assay Guidance Manual, produced by the National Center for Advancing Translational Sciences (NCATS) and Eli Lilly & Company, is an excellent resource that contains practical guidance contributed by leading experts around the world, and is updated quarterly.
Tecan’s Application Guide for Multimode Readers is another valuable source of information to help you optimize a wide range of assays commonly used in drug discovery and development. A free copy is available to download from Tecan’s website.