ACHEMA Trend Report: How Laboratory Automation Tackles the Big Big Data Challenge

Editor: Lea Meißner

Data makes the lab go round: Laboratory automation enables the handling and evaluation of thousands of samples. The result: data galore. The question is how to make good use of this information.


Big Data: Problem or solution for lab operators? Gathering more information does not always lead to a better understanding of the process.
(© hainichfoto - Fotolia)

Time and cost pressure, heterogeneous hardware, rapidly expanding data volumes and a diversity of formats are typical of the IT environment in automated lab workflows. Because the lab-specific tasks involved are highly complex, a single uniform long-term standard for pharmaceutical, biotechnology or clinical diagnostics labs is unlikely in the foreseeable future. What is needed instead are scalable systems, integrative platforms and standardized interfaces.

Substance libraries made up of potential active-ingredient molecules for drug development currently contain well over a hundred thousand compounds. Lab robots build these libraries from a pre-defined set of synthesis rules. With the aid of lab automation systems, lab teams use high-throughput screening to process on the order of 10,000 samples per day. Ultra-high-throughput screening was developed in the early 1990s by Evotec in collaboration with international pharmaceutical producers such as Novartis and SmithKline Beecham.


Using this technique, more than 100,000 samples can be processed per day in active-ingredient research for the development of new pharmaceutical products. Each microtiter plate can contain up to 3,456 wells, enabling efficient handling and archiving by automated systems.
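To put these throughput figures in perspective, the short Python sketch below assumes the common 48 x 72 layout of a 3,456-well plate and works out how many plates an automated line has to handle per day at a given sample load. The class and function names are purely illustrative and are not part of any instrument or scheduling software.

# Minimal sketch (not any vendor's API): a screening plate as a row/column
# grid, and the plate count implied by a given daily sample throughput.
from dataclasses import dataclass
from math import ceil


@dataclass(frozen=True)
class PlateFormat:
    rows: int
    columns: int

    @property
    def wells(self) -> int:
        return self.rows * self.columns


# Common formats in (ultra) high-throughput screening.
HTS_384 = PlateFormat(rows=16, columns=24)     # 384 wells
UHTS_3456 = PlateFormat(rows=48, columns=72)   # 3,456 wells (assumed layout)


def plates_per_day(samples_per_day: int, plate: PlateFormat) -> int:
    """Number of plates a robot must process to cover the daily sample load."""
    return ceil(samples_per_day / plate.wells)


if __name__ == "__main__":
    # Classic HTS: ~10,000 samples/day; uHTS: >100,000 samples/day.
    print(plates_per_day(10_000, HTS_384))      # 27 plates
    print(plates_per_day(100_000, UHTS_3456))   # 29 plates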


Laboratory information management systems (LIMS) have been in use since the 1980s to integrate laboratory workflows into the IT landscape, but the higher levels of modern automation greatly increase the complexity of the demands placed on lab IT. This is particularly true when automated workflows extend beyond PC- and microcontroller-based control of a single piece of equipment. Attention is currently focused primarily on process and data management as well as overall lab management.
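What a standardized interface between an automated workstation and a LIMS could look like is sketched below, purely as an illustration. The field names and the JSON format are assumptions for this example and are not taken from any existing LIMS product or interface standard.

# Illustrative only: one way an automated workstation might report a result
# record back to a LIMS. All field names here are assumptions.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class AssayResult:
    sample_id: str        # LIMS-assigned sample identifier
    plate_barcode: str    # barcode of the microtiter plate
    well: str             # well position, e.g. "A01"
    assay: str            # assay or method name
    value: float          # measured readout
    unit: str             # unit of the readout
    instrument_id: str    # which workstation produced the value
    timestamp: str        # ISO 8601 timestamp, UTC

    def to_json(self) -> str:
        """Serialize to a format any LIMS importer could parse."""
        return json.dumps(asdict(self))


result = AssayResult(
    sample_id="S-2015-004711",
    plate_barcode="PLT-0001",
    well="A01",
    assay="kinase-inhibition",
    value=0.42,
    unit="relative fluorescence",
    instrument_id="reader-03",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(result.to_json())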

The Big Data Challenge

Data volumes in the world of medicine and research now routinely extend well into the terabyte and petabyte range. The term "big data" describes the enormous amount of data generated during automated processing. Insiders expect data volumes worldwide to double roughly every two years. The rapid growth of genome-sequencing data is just one example from the life sciences. At the same time, this abundance of data creates huge potential for the development of new drugs and for life-sustaining medical analysis.
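The rule of thumb that data volumes double about every two years translates into a simple exponential projection, sketched below in Python. The starting volume and the time horizon are made-up example figures, used only to show how quickly a terabyte-scale archive reaches the petabyte range.

# Back-of-the-envelope projection for "data volumes double every two years".
def projected_volume(start_tb: float, years: float, doubling_period: float = 2.0) -> float:
    """Volume after `years` if it doubles every `doubling_period` years."""
    return start_tb * 2 ** (years / doubling_period)


# Example: a 100 TB archive today grows to about 3.2 PB within ten years.
print(projected_volume(100.0, 10))  # 3200.0 TB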

Companies including Boehringer Ingelheim, CHDI, Evotec, Genentech, MedImmune/AstraZeneca, Ono Pharmaceutical and UCB have forged research alliances to develop new pathways for treating Alzheimer's, diabetes and cancer as well as for palliative medicine. Insilico Biotechnology operates one of the world's leading systems-biology platforms, which draws together proprietary databases, cell models and computer-based analysis. The goals are validation of active ingredients and production of biochemicals and pharmaceuticals.
