Microscopic simulations: forecasting the next two decades

Theoretical, methodological and algorithmic developments require long-term research investments. It is well accepted that developing a simulation code takes 10 to 20 years of effort. As in many other research fields (materials, fluid mechanics, climate modeling, etc.), this is also observed in the field of microscopic simulations (where systems like finite-size nanodrops, polymers/bio-polymers or bulk/vapor interfaces are simulated at the atomic or molecular scale). For instance, the molecular modeling code CHARMM [1] and the quantum chemistry package GAUSSIAN [2] were first developed in the seventies, and both are still among the most popular codes used today. The popular codes NAMD [3], GROMACS [4] and LAMMPS [5], as well as the quantum molecular code CPMD [6], are also still under development after more than 20 years. The development teams of these codes now comprise at least 20 people each (counting developers and contributors), but they were all initiated by very small teams (usually no more than two to three people). Regarding methods, we obviously have to mention the never-ending race to refine force fields (version 36 of the CHARMM force field was released in July 2017 [7]) and the long efforts devoted to developing accurate coarse-grained approaches (like the ongoing improvements to the MARTINI approach [8]). Lastly, we should also mention the case of polarizable force fields, which were proposed and discussed as early as the seventies [9] but have only started to be used intensively during the last decade.

A priori, developing a theoretical method and writing a simulation code may be considered two distinct, standalone activities. Nevertheless, codes can be developed to enable the use of new theoretical methods that still need further improvement. For instance, we may cite the long-standing challenge of performing reliable and efficient geometry optimization with Quantum Monte Carlo (QMC) approaches [10], or the need to build specific barostats well suited to new multi-scale coarse-grained approaches like that proposed in Ref. 11. Moreover, we also have to consider the outstanding increase in available computational resources. In about 15 years, the power of a typical computing system available in national centers has increased from dozens of Tflops to hundreds of Pflops, with Exaflops systems on the horizon. While the computing-system panorama was stable over the last decade, dominated by standard and almost monolithic Intel CPU-based architectures, we are now facing an important evolution. First, more than 56% of the computational power of the fastest available supercomputing systems now comes from GPU units [12]. Moreover, new actors are emerging, proposing new architectures based on ARM computational units (see the recent announcement from Fujitsu and RIKEN [13]) and even computing systems specifically devoted to molecular dynamics (the ANTON machine, developed by D. E. Shaw Research [14]). This means that we have to be aware of the forthcoming new generation of computing systems in order to propose new theoretical methods and algorithms that are not only interesting but also efficient. This is already particularly challenging on the present massively parallel CPU architectures, on which obtaining the highest level of performance from a simulation code, in particular in the microscopic field, is far from obvious [15].

On the experimental side, the main features and capacities of experimental apparatus also evolve rapidly. Decoding a full human genome can today be performed within a week by a single team, whereas only 20 years ago it required years of effort by a large international network of research centers. This means that bioinformatics developments today have to account for the “Niagara Falls” of data generated by these new sequencing tools, i.e., the enormous storage capacity needed to host these genetic data and all the problems tied to analyzing them [16], problems that were far from critical 20 years ago. We may also mention new approaches emerging in biology, like the high-throughput screening of proteins by phage display and by droplet microfluidics, which allow one to map sequences to specific protein properties (like binding affinity or catalytic activity) for libraries comprising 10^5 to 10^6 different proteins. This means that we are able today to experimentally benchmark protein libraries comprising about as many proteins as are hosted in the entire Protein Data Bank [17]. This suggests that new computational techniques coupling high-throughput docking methods with efficient simulation approaches, to further interpret the results of this kind of experimental method, could be of great interest.

The problem of analyzing data generated by new experimental techniques is also known in other research fields where microscopic simulations are routinely used. For instance, new experimental techniques (like surface-sensitive photoelectron spectroscopy used in conjunction with the liquid microjet technique [18], or the gaseous ion nanocalorimetry technique [19]) allow one to investigate interface phenomena or to estimate thermodynamic quantities at bulk/vapor interfaces and in gas-phase nanodrops, like the properties of the hydrated electron and of halide anions. The latter two kinds of charged species are pivotal for understanding many processes specific to aerosols, which are known to have a deep impact on atmospheric pollution and hence on global climate, air quality and public health [20], [21], [22]. However, the experimental data generated using these new techniques are still difficult to interpret, leading to intense disagreements even over an important property like the ability of the water surface to accept protons [23].

For obvious safety reasons, we may also mention the difficulty of experimentally investigating the behavior of chemical species pivotal in the nuclear energy field (like heavy ions and metals), in particular in the liquid phase over a large range of temperature and pressure conditions, for instance to study contamination in reactor plant primary circuits. Typical experiments to investigate the latter processes have to be performed on at least the year scale [24], which makes it extremely demanding to acquire the exhaustive ion thermochemistry data sets needed by macroscopic chemistry codes like the program PhreeqC [25]. Similarly, we may also cite the example of working fluids for refrigeration. To replace fluids with high global-warming potentials, new families of refrigerants [26] have been proposed; however, experimental measurement of the thermodynamic and transport properties of new fluid candidates is expensive and time-consuming, especially in the case of fluid mixtures with several possible compositions.

Microscopic simulations are considered a promising alternative route not only to interpret experiments but also to complement them (for instance, to “feed” data banks of ion thermodynamic properties in the liquid phase). As discussed above, the development of simulation codes and of theoretical methods cannot be considered standalone activities. They have to be driven to complement experiments, to match potential new needs, and to be well suited to the forthcoming high-performance computing systems, in order to reach the highest level of efficiency when performing simulations. It is thus pivotal to anticipate as soon as possible (and as far into the future as possible) what the new potential needs and the computational state of the art will be in the forthcoming decades, in order to initiate the development of the new generation of codes and theoretical methods that will be used by large communities, from basic research to industry.



1 Brooks, B.R., Bruccoleri, R.E., Olafson, B.D., States, D.J., Swaminathan, S., and Karplus, M. Charmm – a Program for Macromolecular Energy, Minimization, and Dynamics Calculations. J Comput Chem 4 (1983) 187-217 ; Brooks, B.R., Brooks, C.L., Mackerell, A.D., Nilsson, L., Petrella, R.J., Roux, B., Won, Y., Archontis, G., Bartels, C., Boresch, S., et al. CHARMM: The biomolecular simulation program. J Comput Chem 30 (2009) 1545-1614.
2 Gaussian 16, Revision B.01, Frisch, M. J.; Trucks, G. W.; Schlegel, H. B.; Scuseria, G. E.; Robb, M. A.; Cheeseman, J. R.; Scalmani, G.; Barone, V.; Petersson, G. A.; Nakatsuji, H.; Li, X.; Caricato, M.; Marenich, A. V.; Bloino, J.; Janesko, B. G.; Gomperts, R.; Mennucci, B.; Hratchian, H. P.; Ortiz, J. V.; Izmaylov, A. F.; Sonnenberg, J. L.; Williams-Young, D.; Ding, F.; Lipparini, F.; Egidi, F.; Goings, J.; Peng, B.; Petrone, A.; Henderson, T.; Ranasinghe, D.; Zakrzewski, V. G.; Gao, J.; Rega, N.; Zheng, G.; Liang, W.; Hada, M.; Ehara, M.; Toyota, K.; Fukuda, R.; Hasegawa, J.; Ishida, M.; Nakajima, T.; Honda, Y.; Kitao, O.; Nakai, H.; Vreven, T.; Throssell, K.; Montgomery, J. A., Jr.; Peralta, J. E.; Ogliaro, F.; Bearpark, M. J.; Heyd, J. J.; Brothers, E. N.; Kudin, K. N.; Staroverov, V. N.; Keith, T. A.; Kobayashi, R.; Normand, J.; Raghavachari, K.; Rendell, A. P.; Burant, J. C.; Iyengar, S. S.; Tomasi, J.; Cossi, M.; Millam, J. M.; Klene, M.; Adamo, C.; Cammi, R.; Ochterski, J. W.; Martin, R. L.; Morokuma, K.; Farkas, O.; Foresman, J. B.; Fox, D. J. Gaussian, Inc., Wallingford CT, 2016.
3 James C. Phillips, Rosemary Braun, Wei Wang, James Gumbart, Emad Tajkhorshid, Elizabeth Villa, Christophe Chipot, Robert D. Skeel, Laxmikant Kale, and Klaus Schulten. Scalable molecular dynamics with NAMD. J Comput Chem, 26 (2005) 1781-1802
4 H.J.C. Berendsen, D. van der Spoel, R. van Drunen, GROMACS: A message-passing parallel molecular dynamics implementation, Comput Phys Commun, 91 (1995) 43-56
5 S. Plimpton, Fast Parallel Algorithms for Short-Range Molecular Dynamics, J Comp Phys, 117 (1995) 1-19
6 http://www.cpmd.org/
7 http://mackerell.umaryland.edu/charmm_ff.shtml
8 http://cgmartini.nl/index.php/about
9 Vesely F.J., J. Comput. Phys., 24 (1977) 361
10 Motta M and Shiwei Zhang, Calculation of interatomic forces and optimization of molecular geometry with auxiliary-field quantum Monte Carlo, J Chem Phys 148 (2018) 181101
11 Masella M, Borgis D. and Cuniasse P., J Comput Chem, 34 (2013) 1112-1124
12 https://www.top500.org/news/new-gpu-accelerated-supercomputers-change-the-balance-of-power-on-the-top500/
13 https://spectrum.ieee.org/tech-talk/computing/hardware/japan-tests-silicon-for-exascale-computing-in-2021
14 Shaw D.E., Deneroff M.M., Dror R.O., Kuskin J.S., Larson R.H., Salmon J.K., Young C., Batson B., Bowers K.J., Chao J.C., Eastwood M.P., Gagliardo J., Grossman J.P., Ho C.R., Ierardi D.J., et al., Anton, a special-purpose machine for molecular dynamics simulation, Communications of the ACM, 51 (2008) 91-97
15 Jalby W., Kuck D., Malony A.D., Masella M., Mazouz A., and Popov M., The Long and Winding Road Towards Efficient High-Performance Computing, Proceedings of the IEEE, in press
16 Kahn S D, On the Future of Genomic Data, Science, 331 (2011) 728-729 
17 https://www.rcsb.org/
18 Winter B and Faubel M, Photoemission from liquid aqueous solution, Chem Rev, 106 (2006) 1176 ; Winter B, Liquid microjet for photoelectron spectroscopy, Nucl Instrum Methods Phys Res Sect A, 601 (2009) 139
19 Leib R D, Donald W A, Bush M F, O’Brien J T, Williams E R, Internal energy deposition in electron capture dissociation measured using hydrated divalent metal ions as nanocalorimeters. J Am Chem Soc 129 (2007) 4894–4895
20 Petersen P B & Saykally R J, On the nature of ions at the liquid water surface. Annu Rev Phys Chem 57, (2006) 333–364
21 Carpenter L J et al. Atmospheric iodine levels influenced by sea surface emissions of inorganic iodine. Nat Geosci, 6 (2013) 108–111
22 Kanakidou M, Seinfeld J H, Pandis S N, Barnes I, Dentener F J, Facchini M C, Van Dingenen R, Ervens B, Nenes A, Nielsen C J, Swietlicki E, Putaud J P, Balkanski Y, Fuzzi S, Horth J, Moortgat G K, Winterhalter R, Myhre C E L, Tsigaridis K, Vignati E, Stephanou E G, and Wilson J, Organic aerosol and global climate modelling: a review, Atmos Chem Phys (2005) 1053–1123 ; Pöschl U, Atmospheric aerosols: composition, transformation, climate and health effects, Angew Chem Int Ed Engl, 46 (2005) 7520-7540
23 Saykally R J, Nature Chemistry, 5 (2013) 82-84
24 You D, Pancque G and Lovera P, New Data for Thermodynamic and Kinetic Behaviour of Nickel Phases in PWR Physicochemical Conditions, Proceedings of nuclear plant chemistry conference 2014 Sapporo (NPC 2014), 1 (2014) 351-362
25 https://wwwbrr.cr.usgs.gov/projects/GWC_coupled/phreeqc/
26 McLinden M O et al., Limited options for low-global-warming-potential refrigerants, Nature Communications, 8 (2017) 14476


Jessica Andreani (CEA Saclay, Gif-sur-Yvette)
Marc Barrachin (IRSN)
Bernd Grambow (Nantes University)
Fabien Leonforte (L'Oréal Research)
Gabriel Stolz (École des Ponts)
Francesca Ingrosso (Laboratoire de Physique et Chimie Théoriques UMR 7019 Université de Lorraine-CNRS. Institut Jean Barriol)
Chloé Quignot (CEA)
André Severo Pereira Gomes (CNRS)
Luiz Angelo Steffenel (Université de Reims Champagne-Ardenne)
Céline Toubin (PhLAM Laboratory)
Seydou Traoré (Neoxia)
William Jalby (University of Versailles-St Quentin)


Thomas Fox (Boehringer Ingelheim Pharma GmbH & Co KG)
Gunnar Buckau (EC)
Pieter in 't Veld (BASF SE)
Jonathan Coles (Technical University of Munich)


Radovan Bast (UiT The Arctic University of Norway)


Enrique Sanchez Marcos (University of Sevilla)


Markus Ammann (Paul Scherrer Institut)


Lynne Soderholm (Argonne National Laboratory)

Program

Wednesday April 24th 2019 – Day 1
The point of view of experimentalists

10:00 to 11:00 – Welcome
11:00 to 11:30 – Presentation
11:30 to 12:15 – Presentation
12:15 to 12:30 – Discussion
12:30 to 14:00 – Lunch
14:00 to 14:45 – Presentation
14:45 to 15:30 – Presentation
15:30 to 15:45 – Coffee Break
16:00 to 16:45 – Presentation
16:45 to 17:30 – Presentation
17:30 to 18:15 – Presentation
18:15 to 18:45 – Discussion
19:00 to 21:00 – Dinner
Thursday April 25th 2019 – Day 2
The point of view of modeling experts

10:15 to 11:00 – Presentation
11:00 to 11:30 – Coffee Break
11:30 to 12:15 – Presentation
12:15 to 13:00 – Presentation
13:00 to 14:00 – Lunch
14:00 to 14:30 – Presentation
14:30 to 15:00 – Presentation
15:00 to 15:30 – Presentation
15:30 to 16:00 – Coffee Break
16:00 to 16:30 – Presentation
16:30 to 17:00 – Presentation
17:00 to 17:30 – Presentation
17:30 to 19:00 – Poster Session
19:30 to 22:00 – Dinner
Friday April 26th 2019 – Day 3
The point of view of mathematicians and computer scientists

09:30 to 10:15 – Presentation
10:15 to 11:00 – Presentation
11:00 to 11:30 – Coffee Break
11:30 to 12:15 – Presentation
12:15 to 13:00 – Presentation
13:00 to 14:00 – Lunch
14:00 to 15:00 – Closing Word