In the first article in this series, we looked at how HMGB1 has taken an increasingly important position as a key mediator in the immune response, playing a major role in many diseases, from cancer to coronavirus. There is now significant evidence that HMGB1 is essential for SARS-CoV-2 replication, and that it is a potential therapeutic target in severe cases of COVID-19.¹ In this article we examine how HMGB1 can be measured accurately in serum and other samples, and begin the journey from research to clinic.
Low drug efficacy and safety concerns are the main reasons for late-stage withdrawal of drugs in clinical trials, and account for 87% of all phase III submission failures. Toxicity towards certain organs, such as the heart, liver or kidneys, plays a central role in many of these unsuccessful trials. For monitoring tissue-specific toxicity, human induced pluripotent stem cells (hiPSC) are increasingly used as a powerful tool to develop cell models, since they are more relevant, scalable and reproducible model systems than traditional animal models and standard immortalized cell lines. Production, handling and differentiation of hiPSC and other stem cell-derived models is very time-consuming and greatly benefits from automation. This article explores some of the factors to consider when automating stem cell handling and differentiation.
How the human body deals with infection depends on an individual’s immune response. When looking at the body’s response to SARS-CoV-2, the state of the immune system has a crucial impact on the clinical outcome. For example, HMGB1 (High Mobility Group Box 1) protein is a key mediator of the immune system, and as such it has been shown to be critical in the replication of SARS-CoV-2. This article outlines the potential roles of HMGB1 in the race to find solutions to the coronavirus pandemic.
Research using stem cells and stem cell-derived models holds huge promise for drug discovery and therapeutic applications. However, creating, characterizing, maintaining and expanding stem cell-derived models and therapeutics can be a time-consuming and error-prone bottleneck. The emergence of genetically engineered induced pluripotent stem cells (iPSC) has opened the door to more relevant and reproducible human model systems and scale-up strategies, yet many challenges remain when it comes to the practical application of iPSC in the lab. In this article we take a look at the advantages that iPSC technology brings as well as some of the main challenges that must be addressed to increase research output and quality.
Live cell imaging is one of the most important techniques in the life sciences today. But behind every great imaging assay, pity the poor scientist grappling with the demands of biological variability and complex kinetic cell assays. Live cell experiments are often synonymous with unsociable working hours, tedious protocols and unrepeatable results. In this blog we explore what it takes to tame automated cell imaging assays and take back control of kinetic experiments to get reliable results more quickly, with fewer errors, and less aggravation.
What happens when lab automation projects are unsuccessful? One takeaway is learning what makes for a stronger process and methodology. That's exactly what we found at Tecan after working on lab automation projects with several hundred customers. This presentation reveals the top 5 pitfalls of custom automation based on real experience.
With “fake news” topping the headlines these days, we’re painfully aware that hearing just part of the whole story can lead to seriously wrong ideas that can have embarrassing or even disastrous consequences. The same is true when analyzing cell populations. Every individual cell has its own story to tell, so population averages and random samples are often misleading. When running assays on a cell imaging system, microplate reader or flow cytometer, can you be sure you are getting the whole truth? If not, it may be time to consider whole-well imaging.
Today’s hematology labs are faced with escalating demands to deliver robust and accurate blood test results quickly. At the heart of automated diagnostic systems for blood analysis are liquid handling pumps, which must deliver precise and accurate results every time. As well as being reliable, they must also be affordable and easy to maintain. Unfortunately, not all pumps deliver to these exacting standards. What are the most important factors for an engineer to consider when selecting a pump to meet the stringent performance required for a hematology automation system?
Automated lab analytics solutions are increasingly taking to the cloud to give labs real-time visibility of instrument and consumables usage. This is valuable information – for example, to understand what throughput is available to scale up and complete programs in weeks or even hours rather than months. But what about the worry of data security when implementing cloud-based software? Here are seven steps you can take to make sure your data stays safe in the cloud.
If you’ve decided you need to incorporate phenotypic screening into your discovery program and you know that one of the new generation of automation platforms is the way forward, what factors should influence your choice?
Automation, miniaturization, cell-based assays and 3D cell culture
Improving lab procurement processes involves more than just putting e-procurement or lab management software in place. In most cases accessing, managing and analyzing the data that you use to support purchase decisions and feed into e-procurement tools is still a big challenge. In previous articles, we explored the value of automated collection of usage data from lab instruments and robotics. What capabilities and features should you look for when deciding which tools will best support your needs? Here are our top picks.
As labs face tighter profit margins and the need to minimize cost of goods, there is increasing pressure to implement more efficient and responsive mechanisms for procurement and inventory management. A large proportion of annual spend goes towards consumables like disposable pipette tips, microplates and kits. Procurement strategies based on lean and ‘just-in-time’ principles can improve cost-efficiency by reducing overhead and warehousing expenses. However, this often comes with a significant risk: without enough data about both availability of consumables and what you have in stock, you could run into costly unexpected out-of-stock scenarios. Here are three essential questions to ask when looking to reduce the risks of creating leaner, ‘just-in-time’ procurement processes.
As a procurement planner in the competitive life sciences sector, how do you ensure your organization adapts swiftly to the rapidly changing demands of customers and stakeholders? Whether supporting a CRO, pharmaceutical company, clinical lab, biotech business or academic department, procurement teams are under constant pressure to manage risk, reduce costs and keep their organizations profitable. Advancements in technology and business practices are widening the influence of procurement on business operations, requiring procurement teams to collaborate even more closely with other functions, including lab management. Here are three major trends that are transforming procurement management:
Congratulations. It took you quite some time and effort to convince your management or institution of the value of investing in automating your experimental or clinical workflow. The applications were submitted, the presentations were made, and the wheeling and dealing to secure the budget resulted in you and your team landing the investment. You've arrived. Now all you have to do is choose the robot and get it up and running.
In the first article in this series, we looked at how HMGB1 has taken an increasingly important position as a key mediator in the immune response and as such plays a major role in a large number of diseases – from sepsis to cancer. As Professor Helena Erlandsson Harris, a pioneer in HMGB1 research, says, “I am convinced that the next step will be even better data to demonstrate the usefulness of HMGB1 as a prognostic/diagnostic biomarker. This has been hampered by the need to understand the isoforms that control different functions and also the methods for measuring HMGB1. It would be even better if HMGB1 detection were included in larger biomarker panels.” HMGB1 has indeed been included as a necessary biomarker in consensus guidelines for the detection of immunogenic cell death. The question is how to measure it. In this article, we will look at the development of increasingly sensitive, reliable and easy-to-use assays for clinical research and routine use and how this has been complicated by the need to resolve the isoforms, and also overcome interference caused by auto-antibodies and other proteins that naturally interact with HMGB1 to modulate its function.
The demand for advanced medical and diagnostic testing continues to accelerate. Laboratories, hospitals and emerging consumer genomics companies are demanding faster test turnaround, driving the design and development of innovative, responsive new test protocols. These new tests involve handling a wide array of fluids. Measuring, monitoring, mixing and controlling solvents, salts, detergents, acids, bases, reagents and additives is critical in every liquid handling lab environment.
As a nuclear protein present in most cell types, HMGB1 (high mobility group box 1) is a key mediator of the immune system in health and disease. Interest in HMGB1 has increased dramatically as the protein has been shown to be critical to the cell’s response to stress and plays a major role in many disease states, including infectious diseases, ischemia, immune disorders, neurodegenerative diseases, metabolic disorders, and, not least, cancer. Highly conserved in mammals, HMGB1 (also known as HMG-1 and amphoterin) is primarily located in the chromatin where it stabilizes chromosome structure and plays a key role in controlling gene expression.
When it’s time to move your biotechnology breakthrough towards commercialization, your specific application workflows may require a custom approach to lab automation. If your requirements are uncommon, there may be no off-the-shelf products available for you to compare and test. Even custom configuration of off-the-shelf components may not be suitable. What is the best approach to finding a custom solution that meets your unique needs?
The answer is to use a defined process that ensures each step is thoroughly explored and evaluated. Consider these four “I’s” of custom engineering: Investigate, Ideate, Invent and Integrate.
Are you guilty of making decisions without the data to back them up? In today’s busy labs, mission-critical decisions about laboratory equipment purchases, service contract renewals, consumables spending, and staffing are often made on the basis of incomplete information. Having a clear picture of instrument usage and burn rates of associated reagents and consumables can help you uncover new ways to cut costs and improve performance in the laboratory. In the previous article we highlighted how crucial it can be for labs to monitor instrument utilization data. Now let’s consider more specifically what you can learn from analyzing all this data.
You’ve done your testing on the benchtop and proven that your new biotechnology innovation works in your hands. Now comes the exciting part – turning your solution into a breakthrough product that is ready for broader use and commercial launch. To get there, you need to optimize your processes so that you can ensure they are robust, operate within defined tolerances, and facilitate scale-up. What’s the fastest and most efficient way to get this done so that you can focus on your next bioscience advancements?
As we move into the 2019 budget cycle with signs of a global economic slowdown on the horizon, laboratory administrators are no doubt feeling the heat. A combination of poor forecasting, inefficient use of resources, and a sudden economic downturn could create the perfect storm to capsize operations. Despite these high stakes, critical decisions about budget allocation, expensive equipment purchases, workflow optimization and cost-cutting strategies are often made based on incomplete information or even pure guesswork about laboratory asset utilization.
With biotechnology advancing at an astounding rate, last year’s innovations often become routine tools for today’s breakthroughs. For example, next generation sequencing (NGS) is now an integral step in CRISPR/Cas9 constructions. The interplay between hardware, software, and biotechnologies is continually in flux, as some developments see payoff more quickly than others, and emerging breakthroughs can suddenly change the game altogether. With such constant and unpredictable change, how can you ensure that your own innovations move smoothly from concept to solution as quickly as possible?
As we saw in the previous article in this series, detecting differences in your cell-based fluorescence experiments requires high assay sensitivity and reproducibility, which come from high quality optics and intelligent measurement methods. All this can be achieved using the Spark™ multimode microplate reader.
Ever wish you could turn your microplate reader into an imager, so you can see exactly what your cells are doing in the well? Conventional plate readers are a ‘black box’ for cell-based assays. Your plate goes into the box, numbers come out, but you can never be certain that the results reflect physiological reality.
If you thought automated cell imaging and confluence determinations were just for “high-content” microscopy, think again. “All-in-one” microplate readers are shifting into top gear with the addition of robust imaging capability.
Last night you were up until midnight tending to your live-cell experiment. This morning you woke up with great expectations, only to find that your cells are sick and the entire experiment must be repeated. Sound familiar? It happens all too often, and the consequences can be heartbreaking – deadlines missed, expensive reagents wasted, precious samples lost.
Cell-based assays are a core research tool, offering an informative and cost-effective counterpart to in vitro and animal tests. Where destructive methods involving cell lysis once predominated, live cell assays are now commonplace, with measurements collected in real time, either at a single time point (end-point assays) or repeatedly over the course of minutes, hours or even days (kinetic assays).
Cell-based assays are giving us deeper insight into cellular mechanisms in a true biological context, and fluorescence assays are playing a leading role. Applications range from cytotoxicity, proliferation, apoptosis and G-protein-coupled receptor (GPCR) signaling assays to high-throughput screening (HTS) drug discovery.
All researchers performing cellular assays – research or clinical – need a cell counting solution. Cell counters are used to count cells in a culture to determine density, concentration or viability. Having established the need to count cells, how do you navigate the many cell counting technologies available? Manual or automatic? Non-imaging (electrical resistance, flow, spectrophotometry) or imaging?
Imagine life science research without cell-based assays. Or without cultured cells of all types to power those assays. Healthy, high-quality cells at the right point of confluence are vital for proliferation, kinetics, cytotoxicity, and gene expression studies particularly during long-term experiments. With so many different cell types, assay formats, and detection methods the variability inherent in cell-based assays can be enormous. There’s no room for inconsistency in cell counts and confluence assessments — it’s counterproductive and just wastes time. What’s the best way to improve counting accuracy in your cell-based assays?
Successful assay development is of utmost importance for cost-efficient drug discovery. In vitro and cell-based assays serve as a first step to evaluate the biological effects of chemical compounds by cellular, molecular or biochemical approaches. The derived assay readouts may be relevant to human health and disease and can identify potential therapeutic candidates in the drug development pipeline. Ensuring minimal cycle times for assay development is an essential step in making a drug discovery program more cost-efficient. In this article, we present the key challenges for reducing cycle time of assay development and what it takes to solve them.
In the pharmaceutical industry, stem cells play a growing role in all phases of drug discovery, from disease modeling and early target discovery to their use in developing innovative cell therapies. Increasingly, a major factor in their impact on development is their capacity to serve as a self-renewing, sustainable source of differentiated cell models to support predictive toxicity testing in early stage drug discovery.
Cell-based and in vitro assays are cornerstones of successful drug discovery and development, informing critical decision points at every stage of the process, from target identification through to pre-clinical testing. Poor assay choices can lead to irrelevant, variable or misleading results that translate into delays and costly program failures further down the line. Here we look at some recent assay technology trends that promise to improve productivity and reduce attrition rates in drug discovery and development. With them come new or more intense challenges for successful assay development and implementation, but in the long run, their added information content may make them the more cost-efficient alternatives.
The trend towards more automated workflows in research is helping to significantly improve data quality as well as laboratory productivity. But when it comes to choosing an automated system for liquid handling and dispensing, it can be difficult to decide between the large range of technologies and platforms currently available. Here are a few pointers to help you select the features that are most important for your lab.
Designing an effective biological screen is always a case of knowing when to quit versus when to keep going, so you don’t miss potentially important factors. When working with complex biological systems, rational screen design becomes even more of a challenge. A main presentation track at SLAS 2018 will focus on that question. Entitled "Assay Development and Screening", the track will include a number of relevant sessions, including "Screening to Optimize Chemical and Biological Space," chaired by Fred King, Ph.D., Genomics Institute of the Novartis Research Foundation. We spoke to Dr. King to learn more.
A main presentation track at SLAS2018 entitled "Cellular Technologies" will include the session "Development of Cellular Models for Phenotypic Screening," chaired by Kristen Brennand, Ph.D., New York Stem Cell Foundation-Robertson Investigator and Associate Professor, Departments of Genetics and Genomics, Neuroscience and Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY. We spoke to Dr. Brennand about the key topics, highlighted trends, and target audience for the session she has prepared.
In the rapidly evolving, data-driven life sciences sector, it is increasingly common to see labs developing their own in-house solutions to enable scale-up of novel methods, and to bridge technology gaps not yet filled by automation providers. The track "Automation and High-Throughput Technologies" at SLAS 2018 includes the session "In-House Automation: Devices and Software Developed Internally," which will explore this growing trend. We interviewed the session chair, Louis Scampavia, Ph.D., of The Scripps Research Institute to learn more.
From phenotypic assays to 4D cell tracking, high-tech methods are of increasing importance for complex screens. This expanding area will be a main presentation track at SLAS 2018 entitled "Assay Development and Screening" and co-chaired by Dr. Ralph Garippa, Memorial Sloan-Kettering Cancer Center and Dr. Edward Ainscow, Carrick Therapeutics. Dr. Garippa provides more insight on this timely and broad-ranging track, which will highlight case histories in assay development, implementation for high throughput screening (HTS) campaigns, and triaging for hit confirmation.
High throughput screening methods for phenotypic drug discovery are in demand, as novel disease models arise and increase in complexity. A main presentation track at SLAS2018 entitled "Automation and High-throughput Technologies" will include the session "Automating Target-Based and Complex Phenotypic Drug Discovery," chaired by Shane Horman, Ph.D. of the Genomics Institute of the Novartis Research Foundation. We spoke with Dr. Horman to learn more about the key topics, highlighted trends, and target audience for the session.
Phenotypic screening is back, with exciting implications for the discovery of new and more effective drugs. The reason? Constantly improving cellular technologies and instrumentation, and drug discovery and development programs that bring us closer to truly realizing the potential of precision medicine.
Like gravity, some phenomena are so integral to our existence that we’re barely conscious of them. Maybe that’s why the research community was largely taken by surprise when it was announced that this year’s Nobel Prize in Physiology or Medicine was awarded to three American scientists for their seminal work on circadian clocks¹. But consider the synergies with next gen sequencing (NGS) and gene editing technologies, and it becomes clear that the implications of their work are far-reaching.
The reproducibility of biomedical research has become a major issue, and reproducible research results can only be as good as the liquid handling performance behind them. Automation is now a given in the drive to generate reproducible data, so how well can automated liquid handling perform in, for example, genomics applications?
Cognitive computing and artificial intelligence have the power to save us from drowning in the vast and growing sea of data needed for precision medicine, but what will it take to achieve a timely return on investment? Experts from multiple disciplines will gather to share their perspectives on this challenging problem at the upcoming Tecan Symposium in Salt Lake City on November 14th.
Data driven decision-making depends on generating reliable data in a timely fashion. But the reproducibility of biomedical research results, or rather lack of it, has become a big issue. A recent Nature survey¹ revealed a “reproducibility crisis” in the research community, with 70% of respondents having failed to reproduce the work of other researchers, and over half even failing to reproduce their own results.
You may be convinced that your academic research laboratory is humming along just fine and cannot benefit from, find the time to consider, or, perhaps most of all, afford adding automation to your workflow.
Scinomix, Inc., founded in 2001, creates customized solutions for labeling tubes, vials and plates in many life science applications. We took the chance to ask Nigel Malterer (CEO) and Jonathan King (Automation Software Engineer) at Scinomix about how automated barcode labeling solutions are helping to improve productivity, reduce errors and costs, and increase control over lab workflows.
As we have seen in the previous posts in this series, implementing fluorescence detection is a quick and effective route to improving the quality and sensitivity of your assays. Achieving optimal fluorescence assays requires an optics system with both sensitivity and flexibility.
Fluorescence detection can give you the ability to develop assays with extreme sensitivity, high robustness and a broad dynamic range. Success involves addressing several challenges, such as the careful choice of excitation (Ex) and emission (Em) wavelengths and the selection of flexible and sensitive optics, as we will see here.
Barcodes play a central role in minimizing the risk of error in lab automation by providing secure tracking of components throughout the workflow. Barcode-guided lab automation can be simple and cost-effective, with significant paybacks thanks to productivity increases.
As we have learned in previous posts in this series, only pipette tips marked ‘sterile’ are guaranteed with a sterility assurance level (SAL) of 10⁻⁶. Pipette tips labeled ‘pre-sterile’ do not offer such sterility assurance.
Compared to many other detection technologies, fluorescence provides hard-to-beat performance and flexibility. Fluorescent labels are stable for months, deliver high sensitivity and the diversity in available dyes gives nearly unlimited possibilities in assay design. This and many other advantages make implementing fluorescence detection one of the easiest and safest ways for you to improve the quality and sensitivity of your assays.
The life science industry is constantly fighting to improve throughput and reduce costs through the ‘industrialization’ of research and development. You have to strike a balance between moving quickly (productivity) and ensuring that you are actually moving in the right direction (quality). Lab automation, including automated liquid handling, plays an essential role in ramping up productivity. Ensuring high quality liquid handling is therefore the key to securing the reliable data you need to meet your program goals.
How can we improve upon the completely artificial situation that we have today for screening drugs? We spoke to Dr. Christopher Millan, Co-Founder and CTO of the up-and-coming company CellSpring. Christopher Millan and his business partner, CEO Kramer Schmidt, are both Americans, yet their company is based in Zürich. We also asked Chris how two Americans came to establish a biotech start-up in Switzerland.
With today's demands for throughput and flexibility, how can you perform screening better? We spoke to Dr. Bernhard Ellinger, Principal Scientist at the Fraunhofer Institute for Molecular Biology and Applied Ecology. Dr. Ellinger is one of the first testers of the Fluent® 780 liquid handling automation platform from Tecan. He put the system through its paces in a diverse range of applications for over 3 years. Here’s what he learnt.
With multiple tests to perform on a tiny volume, samples are getting more precious. And as Next Generation Sequencing pushes the envelope on cost and throughput, scientists are looking for ways of reducing reagent volumes without compromising on quality. Tecan has a tip.
The industrialization of biology has become possible thanks to the automation of repetitive tasks such as liquid handling, providing several benefits. It allows customers to extend their window of operations, achieve greater assay consistency and refocus expertise away from repetitive processes. In addition, moving manual steps, such as pipetting, into the control of robots enables secure downsizing of formats, including sample and reagent volumes.
How do cancer cells die? Necrosis of a tumor, or unscheduled cell death, has been linked to tumors outgrowing their blood supply. But now it is believed that the release of HMGB1 promotes the survival of the remaining tumor cells.
Robotics and automation have become essential to the future plans of drug discovery and clinical diagnostic companies. Executives are looking to increase productivity and reduce costs, and automation fits the bill in every respect.
The hemocytometer has been around for 140 years. It’s an easy, reliable, and trusty tool for all kinds of cell counting applications. It’s beautiful and simple. But measuring the well-being of your cells one click at a time is slow and tedious, and can be near impossible for adherent cells. Shouldn’t you be doing something else with your time?
“When you can measure what you are speaking about, and express it in numbers, you know something about it.” Lord Kelvin knew that. To be confident in your results, to quickly move your studies forward, and to be the first to publish your conclusions, you need to know that your numbers are right. The proof you need lies in reproducibility, and reproducibility in any cell-based assay starts with accurate cell counts.
When it comes to drug development, the challenge is always to generate as much in vivo-relevant data as possible. The more relevant in vivo data you can gather, the lower the risk of the drug failing in clinical trials.
What are the benefits of the new Spark® 20M when it comes to accelerating the drug discovery process? This presentation from SLAS2016 goes beyond discussing typical microplate readers and washers to covering processes for optimizing assay development.