CMMS - Center for Multiscale Modelling in Life Sciences

CMMS is the Frankfurt centre for multi-scale modelling, analysis and simulation of biological processes.

The long-term goal of CMMS is a comprehensive understanding both of simple molecular biological processes, such as the mode of action of an enzyme, and of the complex behaviour of whole organisms.

Such an understanding is the basis for adapting cell functions for biotechnological use as well as for developing biomedical, pharmacological and agricultural applications. Advances in high-resolution methods for the atomistic description of molecules, cells and cell systems using cryo-EM and light microscopy provide insights into molecular mechanisms and processes. By integrating this information into models and simulations, basic mechanisms and causalities can be identified. This requires new technical, algorithmic and informatic solutions to overcome scale constraints and to predict information that is missing from experimental data sets.

Merging theoretical competences and interlinking them with data from diverse experiments carried out independently on several scales is essential in order to develop new concepts for describing biological systems and for deciphering the causes of diseases.


Prof. Dr. Volker Lindenstruth
Tel.: +49 69 798 44101

Prof. Dr. Enrico Schleiff
Tel.: +49 69 798 29287

Prof. Dr. Franziska Matthäus
Tel.: +49 69 798 47509

Prof. Dr. Gerhard Hummer
Tel.: +49 69 6303 2500


CMMS is a LOEWE Schwerpunkt of FIAS together with partners from Goethe University Frankfurt, the Max Planck Institute for Brain Research and the Max Planck Institute of Biophysics (both Frankfurt). In this function, it bundles and decisively advances the various activities in the field of multi-scale modelling.



The research at CMMS is structured in three pillars, each of which is described in detail below.

Pillar 1 - Development of integrated theoretical and experimental approaches

A current challenge is interlinking theory and experiment to formulate common approaches for describing biological processes. One difficulty so far has been convincing experimenters that the different levels of hypothesis generation are equivalent and complementary. Therefore, experiments in CMMS are planned jointly by experimenters and theoreticians, so that i) a sufficiently large amount of diverse, usable data flows into modelling and simulation, ii) model predictions are checked experimentally, and iii) theoretical methods and experimental approaches are optimized. This circular approach yields data that can be used in models, theoretical descriptions and algorithms, new predictions, and new information as a basis for further optimization steps. It necessarily involves the efficient use, refinement and new development of data-analysis methods (e.g. image processing, omics analysis, dimension reduction, fitting, correlation analysis, statistics, machine learning). Relevant biological and medical experiments also serve as model cases in the joint planning process.
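As an illustration of one of the data-analysis methods named above, a minimal dimension-reduction sketch (PCA via NumPy's SVD); the data here is synthetic and purely illustrative, not from any CMMS experiment:

```python
import numpy as np

# Synthetic, purely illustrative data: 200 samples, 5 correlated features
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))            # hidden two-dimensional structure
mixing = rng.normal(size=(2, 5))
data = latent @ mixing + 0.05 * rng.normal(size=(200, 5))

# PCA: center the data, then project onto the leading principal components
centered = data - data.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:2].T                 # 5D -> 2D embedding

explained = (s[:2] ** 2).sum() / (s ** 2).sum()
print(f"variance explained by 2 components: {explained:.3f}")
```

Because the synthetic data has a two-dimensional latent structure with little noise, the first two components capture almost all of the variance; on real experimental data, the number of retained components is itself an analysis decision.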

Pillar 2 - Multi-scale Modelling and Analysis

In multi-scale analysis, data from independently conducted experiments, possibly recorded with different contrasts and spanning several spatio-temporal orders of magnitude, are combined. While data analysis connects the individual levels, modelling and simulation must lead to hypotheses that can be tested on the scale of a specific experiment. Examples are the use of atomic coordinates in a coarse-grained method to simulate the dynamics of biomolecules in large complexes and membranes, or the inclusion of data from image analysis of high-resolution microscopy in mathematical models describing the dynamics and structure of these systems. A fundamental problem is that the required accuracy of the data is initially unknown. The development of multi-scale analysis, modelling and simulation methods is therefore continuously driven forward.
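The coarse-graining idea can be sketched in a few lines: map groups of atoms onto beads placed at mass-weighted centroids. The coordinates, masses and residue grouping below are toy values chosen only to make the mapping concrete, not any particular CMMS system:

```python
import numpy as np

# Toy all-atom structure: 6 atoms in 2 "residues" (coordinates are illustrative)
coords = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [4.0, 0.0, 0.0],
                   [5.0, 0.0, 0.0],
                   [4.0, 1.0, 0.0]])
masses = np.array([12.0, 16.0, 14.0, 12.0, 16.0, 14.0])   # e.g. C, O, N
residue = np.array([0, 0, 0, 1, 1, 1])                     # atom -> bead map

# One coarse-grained bead per residue, at the mass-weighted centroid
n_beads = residue.max() + 1
beads = np.array([
    np.average(coords[residue == b], axis=0, weights=masses[residue == b])
    for b in range(n_beads)
])
print(beads)
```

Production coarse-grained force fields of course also define bead interactions; this sketch covers only the geometric mapping step from atomic to bead coordinates.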

Multi-scale approaches are based on a combination of methods, e.g. the numerical solution of partial differential equations on complex domains, or the coupling of agent-based models describing the function, structure or dynamics of an object with networks or systems of differential equations describing internal regulatory mechanisms. The models resulting from the coupling of different approaches are numerically demanding, and there are often no studies of the stability and accuracy of such coupled numerical methods or of their implementation on high-performance computers. Progress in this area requires in-depth mathematical analysis of the models and procedures, the development of efficient implementation strategies, and the optimal choice of computer architecture. Furthermore, existing methods for multi-scale integration must be further developed, tested and improved through concrete application in Pillar 1 to the modelling and simulation of specific systems.
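A minimal sketch of such a coupling, with dynamics and parameter values chosen purely for illustration: each agent performs a random walk (the agent-based level) whose motility is driven by an internal concentration following a simple relaxation ODE, integrated with explicit Euler steps (the internal-regulation level):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative coupled model: each agent has a position (agent-based level)
# and an internal concentration c governed by dc/dt = k * (c_target - c)
n_agents, n_steps, dt = 50, 100, 0.1
k, c_target = 0.5, 1.0

pos = np.zeros((n_agents, 2))     # agent positions
c = np.zeros(n_agents)            # internal state, starts at 0

for _ in range(n_steps):
    c += dt * k * (c_target - c)                  # internal ODE (Euler step)
    step = rng.normal(size=(n_agents, 2))
    pos += dt * c[:, None] * step                 # motility scales with c

print(f"mean internal state after {n_steps} steps: {c.mean():.3f}")
```

Even this toy coupling shows the numerical issue the text raises: the Euler step for the internal ODE and the random-walk update share one time step dt, and the stability and accuracy of the combined scheme must be analysed jointly, not per sub-model.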

Pillar 3 – High performance computing

A template for integrated data streams from the microscope to the computer (model) is being developed, in which high-resolution microscopes are networked with high-performance computers. The data and metadata of different light and electron microscopes are stored in a standardized format on the mass storage of the HPC systems, which enables efficient access. In the long term, neural networks will be involved at every level.

Data security is guaranteed, especially with regard to possible medical data. Databases are being developed for efficient and standardized access to these data. The first stage of the data path is automatic pattern recognition and the removal of irrelevant regions for physical data reduction. Further steps, such as segmentation, incorporate experience from particle physics.
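In outline, that first reduction stage might look like the following thresholding sketch on a synthetic image; a real pipeline would use trained pattern-recognition models rather than a simple noise-floor cut:

```python
import numpy as np

# Synthetic "micrograph": mostly background noise with one bright object
rng = np.random.default_rng(2)
image = rng.normal(0.0, 0.1, size=(256, 256))
image[100:140, 100:140] += 1.0                 # the "relevant" region

# Physical data reduction: keep only pixels well above the noise floor
threshold = image.mean() + 3 * image.std()
mask = image > threshold
kept_fraction = mask.mean()
print(f"kept {kept_fraction:.1%} of the pixels")
```

Discarding the masked-out background before storage is what makes the reduction "physical": only a few percent of the raw pixels need to travel further down the data path.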

Another aspect is the visualization of complex data. To this end, standardized tools are to be developed and integrated into an analysis platform that efficiently presents various formats. A generic simulation and analysis platform should integrate data import and export with appropriate conversion routines, data analysis and modelling, statistical packages, visualization, and a standardized scripting language. Since multi-scale models require a lot of computing time, the efficiency of algorithms is of great importance. Various methods from computer science are used here, e.g. the optimization of data structures, vectorization, the generation of high parallelism, and the use of GPGPUs. These approaches are further developed and combined in libraries.
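As a small illustration of the vectorization point, the same pairwise-distance computation written as nested Python loops and as a single broadcast NumPy expression; on large arrays the vectorized form is typically orders of magnitude faster, and the same pattern carries over to GPU libraries:

```python
import numpy as np

rng = np.random.default_rng(3)
points = rng.normal(size=(200, 3))             # e.g. particle positions

# Naive version: nested Python-level loops
n = len(points)
dist_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        dist_loop[i, j] = np.sqrt(((points[i] - points[j]) ** 2).sum())

# Vectorized version: one broadcast expression, no Python-level loop
diff = points[:, None, :] - points[None, :, :]
dist_vec = np.sqrt((diff ** 2).sum(axis=-1))

print("max deviation:", np.abs(dist_loop - dist_vec).max())
```

The two results agree to floating-point precision; the speed difference comes from moving the inner loops from the interpreter into compiled array kernels, the same principle behind the data-structure and GPGPU optimizations mentioned above.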

Participating Scientists

Volker Lindenstruth

Scientific Coordinator CMMS
Professor of High Performance Computing, GU Frankfurt
Senior Fellow, Frankfurt Institute for Advanced Studies

Enrico Schleiff

Head of the FIAS Board of Directors
Director of the Buchmann Institute for Molecular Life Sciences
Professor of Molecular Cell Biology, GU Frankfurt
Senior Fellow, Frankfurt Institute for Advanced Studies

Gerhard Hummer

Director, Max Planck Institut für Biophysik
Senior Fellow, Frankfurt Institute for Advanced Studies

Franziska Matthäus

Giersch Endowed Professor of Bioinformatics, GU Frankfurt
Fellow, Frankfurt Institute for Advanced Studies

Nadine Flinner

Junior Group Leader, Frankfurt Institute for Advanced Studies

Achilleas Frangakis

Professor of Biophysics, GU Frankfurt

Matthias Kaschube

Professor of Computational Neuroscience / Computational Vision, GU Frankfurt
Fellow, Frankfurt Institute for Advanced Studies

Gilles Laurent

Director, Max Planck Institute for Brain Research

Ulrich Meyer

Professor of Algorithm Engineering, GU Frankfurt

Gaby Schneider

Professor of Mathematics, GU Frankfurt

Ernst Stelzer

Professor of Physical Biology, GU Frankfurt

Tatjana Tchumatchenko

Head of the research group "Theory of Neuronal Dynamics", Max Planck Institute for Brain Research

Michael Wand

Professor of Computer Science and Visual Computing, JGU Mainz