
The Matter Simulation (R)evolution.

Alán Aspuru-Guzik1,2, Roland Lindh3, Markus Reiher4.

Abstract

To date, the program for the development of methods and models for atomistic and continuum simulation directed toward chemicals and materials has reached an incredible degree of sophistication and maturity. Currently, one can witness an increasingly rapid emergence of advances in computing, artificial intelligence, and robotics. This drives us to consider the future of computer simulation of matter from the molecular to the human length and time scales in a radical way that deliberately dares to go beyond the foreseeable next steps in any given discipline. This perspective article presents a view on this future development that we believe is likely to become a reality during our lifetime.


Year:  2018        PMID: 29532014      PMCID: PMC5832995          DOI: 10.1021/acscentsci.7b00550

Source DB:  PubMed          Journal:  ACS Cent Sci        ISSN: 2374-7943            Impact factor:   14.553


Matter Simulation Methods

You are probably reading this article on your mobile phone.[1] The materials that compose the electronic circuits that made it possible owe much to digital computer simulations carried out by several thousands of scientists over the span of decades. Peter Galison in Image and Logic: A Material Culture of Microphysics (1997)[2] discusses the inseparability of the physical reality around us and the virtual matter that is simulated on modern computers: Without the computer-based simulation, the material culture of late-twentieth-century microphysics is not merely inconvenienced – it does not exist. [...] Machines [...] are inseparable from their virtual counterparts–all are bound to simulations. Several subfields of digital simulation have contributed to the design of our physical reality. These include quantum physics, quantum chemistry, condensed matter physics, computational statistical mechanics, computational materials science, continuum modeling, circuit layout, etc. In this paper, we reflect on the status of these fields, which we collectively call matter simulation methods, from the early days of computing to the current era. Our goal is to answer the question, What are the emerging challenges and opportunities for these fields in the twenty-first century? By answering this question, we arrive at the conclusion that redefining the scope of their main mission may be necessary due to the rapid advances in the drivers of our society. Writing about all these fields would go beyond our domain of expertise and would rely on too many examples.
Therefore, for the remainder of the paper, we will use the field of quantum chemistry as an example, but the reader can have in mind that very similar arguments could be made for any of the other theoretical/computational fields mentioned above, as they are all connected to the developments in fundamental, physical theory in the first half of the twentieth century and the dramatic hardware and algorithm developments in its second half.

Quantum Chemistry in the 20th Century

In 1928, Paul Dirac proposed his fundamental covariant equation of motion that governs the relativistic dynamics of a single electron in a classical electromagnetic field. This equation became the basis not only of more fundamental theories such as quantum electrodynamics but also of endeavors to solve many-electron problems in chemistry, molecular physics, and materials science[3]—even if they originally set out from Schrödinger’s (nonrelativistic) formulation of quantum mechanics. In a famous quote from his 1929 account on many-electron systems,[4] Dirac emphasized the importance of his discovery and then continued to state that it “becomes desirable that approximate practical methods of applying quantum mechanics should be developed, which can lead to an explanation of the main features of complex atomic systems without too much computation.” This desire has been the motivation for generations of computational chemists and physicists to devise algorithms that allow us to solve the differential equations governing the dynamics of many-electron systems. The aspiration to quantitatively assess molecular properties and to qualitatively understand their implications has been a driving force of computer simulations of molecular matter ever since. In quantum chemistry, the mission in the previous century was to calculate the energy and the molecular properties of a given, isolated molecular structure. As Per-Olov Löwdin put it in a visionary perspective on the field,[5] “There seems to be a rather long way to go before we reach the mathematical goal of quantum chemistry, which is to be able to predict accurately the properties of a hypothetic polyatomic molecule....” Löwdin states in his “1967-Program”[6] written for the newly created International Journal of Quantum Chemistry that quantum chemistry “uses physical and chemical experience, deep going mathematical analysis and high speed electronic computers to achieve its results”.
Gavroglu and Simões[7] identify these as the four pillars on which the field has rested since its conception (see Figure 1).
Figure 1

Quantum chemistry[8] is an interdisciplinary field that lies at the intersection of chemistry, physics, applied math, and computer science. It borrows from several other subfields, some of which are mentioned at the borders of the diagram.

In the 20th century, the interdisciplinary field of quantum chemistry therefore drew on thematic aspects of the different founding disciplines. It has taken from physics the laws of quantum mechanics and light-matter interaction, from chemistry the conceptual laws of molecular structure and inspiration for problems and applications, from applied math mostly computational linear algebra, and finally, from computer science a focus on high-performance computing, both parallel and using accelerators such as general-purpose graphical processing units. Sixty years later, this enormous effort has been made and the mission has basically been accomplished. As a collective, theoretical chemists and physicists have developed methods that simulate the electronic structure of molecules of up to hundreds of atoms, both in vacuum and in solvents, and that can calculate practically all observables of interest to experimentalists. Furthermore, although the so-called “chemical accuracy” cannot be obtained for all calculations, quantum chemistry calculations have become a useful everyday tool in the arsenal of chemists. In 1957, Löwdin[5] presented an overview of various methods used in molecular and solid-state theory for the solution of the Schrödinger equation. Amazingly, most of the principles and hierarchy set out in that overview have prevailed to the present day. Propelled by developments in other fields (most notably the design and construction of efficient and affordable computer hardware), a huge part of Dirac’s and Löwdin’s programs set for future generations has become a reality.
In fact, we now have computational tools at our disposal that allow us to do such calculations routinely and with relatively well-known accuracy.[7] All remaining open issues of this original mission are well identified and understood. In other words, we are now able to solve the complex high-dimensional partial differential equations that govern quantum many-electron systems for arbitrary nuclear frameworks (even with rigorous error assessment[9−14]). Clearly, the dimension that scales with the number of elementary particles in the molecular system cannot be arbitrarily large for feasibility reasons, but the field has made the remarkable achievement of providing routine solution methods for partial differential equations whose dimension is set on input of a computation by providing the number of electrons of a reasonably sized molecular structure or unit cell. It therefore appears evident that what remains from the original program is rather straightforward to accomplish in light of the past achievements. And, in fact, it is most likely to take only a fraction of the effort invested so far to bring the original mission to an end for practical purposes. According to his former students, Nicholas Handy, John Pople, and Isaiah Shavitt, during the 1959 Conference on Molecular Quantum Mechanics held in Boulder, Colorado,[2] Samuel F. Boys [...] produced a paper tape of his whole computer program and unrolled it along the length of the chemical lecture bench. There, in one roll, was something, of which one could ask a chemical question at one end and it would produce an answer at the other! .... [M]ost of the audience probably thought the demonstration bizarre. But it was prescient.
In the narrow sense of the quote above, the field of quantum chemistry has practically solved the mission of twentieth-century quantum chemistry, which we may summarize as follows: Given a molecular geometry, obtain its energy and/or other molecular properties in the gas phase or a solvent model efficiently on a modern digital computer. Having considered the great accomplishments of the field to date, we are ready to present our vision of what is possible next due to the rapid developments in the fields of computer science and robotics. Depending on one’s own progressive mindset, this vision may be considered either linear, and therefore an Evolution of the current state of the art, or rather radical, and hence a potential Revolution of the field.

Matter Simulation in the 21st Century

Societal Drivers

The connection between science and the current drivers of society is deep and cannot be ignored. This century poses several severe challenges that range from the rapid rise of income inequality and the apparent cracks in the neoliberal structure to the stresses on the environment due to industrialization. The work of simulation scientists is therefore linked directly or indirectly to this societal context. In particular, the solutions to many of the challenges of this century, ranging from the discovery of novel materials for renewable energy to that of environmentally friendly pesticides or next-generation antibiotics, require tools to be developed by our field. A characteristic of these challenges is that their solution is time-critical, which turns them into societal threats. For instance, if the search for new antibiotics cannot keep pace with the development and spread of bacterial resistance, or if the development of sustainable energy cycles cannot outpace global warming, severe consequences for our society will be unavoidable. Whereas we have seen in the past decades a remarkably accurate prediction of our computational capabilities by Moore’s law, originally a prediction of their exponential growth with time, nothing similar has been found so far for scientific discovery. However, the time-critical threats to society in the 21st century require us to find viable solutions for pressing global problems at a faster pace than was sufficient in the past. In a sense, as suggested by Benji Maruyama,[15] a “Moore’s law for scientific discovery is required to increase the success for systematic as well as serendipitous discovery”. However, tied to the current exponential growth of technology, such an exponential pace of scientific discovery could be achieved! An exponential increase in scientific throughput and a lowering of cost via automation have already been achieved in the field of gene sequencing.
The National Institutes of Health likewise documents an exponential decrease in the cost of sequencing a human genome.[16] We hope that, in certain areas of chemical discovery, an increase in the throughput of calculation, synthesis, and characterization will result in an exponential increase in the rate of discovery.

Science and Technology Drivers

In this century, we are witnessing the introduction of novel technologies at a pace never seen before. Technoeconomically, the twenty-first century is deeply linked to what Klaus Schwab from the World Economic Forum (WEF) has coined as “The Fourth Industrial Revolution”.[17] The WEF identified six technology drivers[18] that will significantly impact society. To focus on the accelerated discovery of matter,[19] we modify and expand this list to nine science and technology drivers that will deeply transform the speed and way of discovery of new chemicals and matter.

Driver 1. Human–Machine Interaction and the Internet

Technology will continue to enable the connection between people, enhancing their digital presence by enabling them to interact with objects and one another in new ways. Emerging technologies that enhance human–machine interaction will allow for an unprecedented immersion into the virtual atomistic world that is otherwise inaccessible to the human senses. Already today, we see haptic and tracking devices, virtual realities, and caves adding a new level of intuition to the virtual experience of the molecular world that goes far beyond its archaic and fractured perception through computer mouse and keyboard.[20−41] A perfect and seamless immersion of the scientist into the mesoscopic environment of her/his object of study is key to an accelerated understanding and manipulation of (molecular) matter in a virtual laboratory. For instance, immersion is about literally feeling the softness of a functional group in a molecule and about experiencing how it feels to push a hydrogen atom into a metal surface. Enhanced immersion and real-time data flow make it possible to cope with the immense data provided by computations finishing in real time (see Figure 2).
Figure 2

Examples of already existing tools that enhance the immersion of professional and amateur scientists into a molecular world. (A) A cave for data exploration (Electron density of a molecular data set image provided by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago and Argonne National Laboratory. Photo: L. Long, EVL). (B) An operator’s pair of hands manipulate a peptide during a molecular dynamics simulation (taken from ref (27); Creative Common License). (C) A simple haptic force-feedback device by which the tactile human sense can be addressed (Reproduced with permission from ref (42). Copyright 2011 Wiley-VCH Verlag GmbH & Co. KgaA). (D) Interactive atmospheric molecular dynamics simulation in an immersive projection dome (taken from ref (27); Creative Common License).


Driver 2. Computing, Communication, and Storage Everywhere

Everybody will have access to a supercomputer in their pocket (already a reality through the link of powerful existing smartphones to cloud services), with nearly unlimited storage capacity. This driver will grant the research community access to cheap computational power and the ability to store and share chemical information. It will also allow amateur scientists to access important research problems much more easily, which will strengthen the general acceptance of the field in society and increase the pace of discovery (recent web-based online platforms, serious gaming, and gamification are existing examples).[43,44] Considering that the chemical space is unfathomably large and much information for a given problem may not be available, such restricted, unavailable, or nonexistent information will be computed on the spot as needed.

Driver 3. The Internet of Things

The introduction of sensors and processors that are connected to the Internet for most of the objects around us will offer society and the science community a trivial route to build, for example, networks of off-the-shelf sensors and processors to monitor and control artificial intelligence (AI) environments and robotics systems. For example, it is not too hard to imagine a 3D printer that prints parts of a synthesis robot that itself may then be put online to receive control commands from a virtual reality devoted to chemical or materials synthesis to produce a specific chemical or material. This is closely connected to the next driver.

Driver 4. Artificial Intelligence, Big Data, and Robotics

The exponential creation of more data from the sensors and processors around us requires its organization and processing using artificial intelligence. Already now, we witness the rapid rise of machine learning tools for such purposes (see refs (45−57) for some examples). Robotics, empowered by such tools, is already making an impact in the automation of jobs, decision making, and research. It is only a matter of time before synthesis robots, which are already employed in many chemistry laboratories, are generalized (see the work by Burke on his synthesis machine[58−60]) and coupled to adequate software. Much work has been devoted to devising and implementing algorithms for the automated exploration of chemical reaction networks (see, e.g., refs (61−67)). In the future, these achievements may team up with the latest expert systems for the planning of chemical synthesis (see driver 6) to generate a reliable platform that can map out complex chemical reaction networks (based on the big data provided by reaction libraries and vast quantum chemical explorations) under predefined conditions that are then eventually realized by a relatively general synthesis robot (see Figure 3).
Figure 3

Left: A MakerBot 3D printer (picture from https://en.wikipedia.org/wiki/3D_printing). Middle: HERMAN the High-throughput Experimentation Robot for the Multiplexed Automation of Nanochemistry (taken from https://www.youtube.com/watch?v=J0VlCItpI5s). Right: Martin Burke’s synthesis machine (taken from ref (68); permission to print this picture granted by L. Brian Stauffer, University of Illinois at Urbana–Champaign).


Driver 5. The Sharing Economy and Distributed Trust

This driver enables new social and business models. Tools like the blockchain promise to change the way we think about money and transactions in the real world. This will have an impact on how data is acquired or generated, stored, and managed in a completely reproducible and controlled way: smart evidence. It will ensure that data is immutable and protected; the user will no longer have to, for example, trust service providers such as Google, Dropbox, etc. not to manipulate the data. Blockchaining will also affect how research results are published. For instance, controversial data or results can be published anonymously while the integrity of the presented data acquisition remains completely verifiable. Blockchaining will also create new ways of funding: smart contracts that will potentially disrupt the way science is funded.
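The tamper-evidence property invoked above does not require any particular blockchain platform; it already follows from chaining cryptographic hashes, so that altering any stored record invalidates every later link. A minimal sketch in Python (the record fields and the chain layout are our own invented illustration, not a real blockchain protocol):

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a data record together with the previous block's hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Chain records so each block commits to everything before it."""
    chain, prev = [], "0" * 64  # arbitrary genesis hash
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    """Recompute every hash; tampering with any record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

# Hypothetical measurement records for illustration only.
chain = build_chain([{"sample": "A1", "yield": 0.82}, {"sample": "A2", "yield": 0.47}])
assert verify(chain)
chain[0]["record"]["yield"] = 0.99  # tamper with the stored data
assert not verify(chain)
```

A real distributed ledger adds consensus and replication on top of this structure; the hash chain alone is what makes stored scientific data tamper-evident.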

Driver 6. The Digitization of Matter from the Macroscopic to the Atomic Scale

The continued development of 3D printing and additive manufacturing technologies for physical objects allows for new creative opportunities at scales from meters down to micrometers. The increased control over the synthesis and characterization of precise nanoscale assemblies of matter, through techniques such as nanoparticle synthesis, chemical and atomic layer deposition, and lithography, provides a bottom-up approach to create matter that can further be probed efficiently. Purely synthetic approaches in chemistry have reached a level of sophistication such that, for any molecule that is calculated to be stable and viable, a synthetic path can be designed. Expert systems for such purposes have a long history[69−72] and have experienced a fresh boost from modern technology.[73−77]

Driver 7. Precise Control of Molecular Biology and Nanochemistry

Our increasing ability to manipulate and control the molecular biology of cells is an example of successful molecular engineering that exploits an existing molecular machinery to produce new molecules, often with predefined function. The recently developed CRISPR/Cas gene editing technique[78−80] is an easy-to-use approach with incredibly far-reaching consequences. It is not only an amazing example of our understanding of and capability to modify molecular processes in functional and living matter, it is also a remarkable advance in bioengineering technology that allows us to modify the genome of an organism at single-nucleotide precision. In this respect, it straightforwardly extends the more traditional biochemical toolbox of bioengineering that can be used to produce specific molecules. Combined with experimental automation and control through a virtual reality setup, this is a way to actually produce molecular structures designed in silico. Naturally, such a level of sophistication in understanding and manipulating the biochemical machinery of cells is an example of what one would like to achieve in artificial soft and hard nanochemistry settings. And surely, matter simulation will be a decisive tool to point the way.

Driver 8. Disruptive Computing Technologies

Quantum computers employ the rules of quantum physics, such as entanglement and superposition, to carry out computations. The earliest suggested application for them is the simulation of quantum matter. Quantum computers can provide an exponential advantage over classical computers for the simulation of chemistry.[81−83] This has been shown theoretically and realized experimentally on several occasions in demonstration experiments.[84−86] Although no quantum computer with error-corrected qubits has been built so far, the intense efforts of major companies, start-ups, and academia make the emergence of moderately sized quantum computers for actual applications in chemistry in the near future rather likely.

Driver 9. Matter Imaging Technology at the Atomistic and Larger Length Scales

Coupled to driver 6, the emergence of advanced spectroscopy techniques for the study of matter has led to an increase in the available experimental data in terms of resolution, area studied, and variety of signals that are recorded. This makes characterization a prime source of data that can directly interact with the output of simulation. In this way, it can help to constrain models “on the fly” and refine digital chemical hypotheses. The nine science and technology drivers mentioned above could help us generate a contemporary close reading of the recollection of Samuel Boys’s students. They mention a computer program where “one could ask a chemical question at one end and it would produce an answer at the other!”[2] In the context of the new frontiers of human–computer interaction and artificial intelligence, this quote can be expanded into a revised mission for the field of computer simulation. The questions posed could imply complex tasks and a dialogue with a computer program (see also ref (87)) such as the following one:

Jane, the chemist: Dear Organa, good morning. Could you please suggest to me an organic molecule with an estimated synthetic cost of less than a hundred dollars per gram and three synthetic steps from available synthons that has an emissive color of 450 nm and stability against oxidation? I am thinking of organic light emitting diode emitters.

Organa: Jane, I will get back to you in 2 h.

Organa: Hello Jane, I am back! I have a set of 50 potential candidates based on your constraints and inspired by the recent literature on the subject. You can purchase 20 of them and use the matter computer to synthesize the other 30 with the synthon library available in the matter computer.

Jane: Dear Organa, can you display them for me in AR (augmented reality)?

Organa: My pleasure, Jane.

Jane: (As she waves her hand and flips through compounds and their associated properties shown to her in augmented reality) Could you synthesize all 30 compounds and test them for emission properties? Can you order the other 20? When the compounds arrive, I will install cartridges for them so you can use them for automated characterization. Thank you, Organa!

To enable such a computer program and its associated “matter computer”, or synthesis and characterization machine, the simulation community should overcome six grand challenges that we identify in the following.

Grand Challenges for the Simulation of Matter in the 21st Century

Contemplating the nine drivers, we may condense the goals of future research efforts in the field of matter simulation to six grand challenges.

Challenge 1. The Designer Challenge

While the mission of the 20th century was related to providing answers to questions pertaining to properties of specific chemical structures, the questions of the 21st century revolve around the inverse design problem:[88−94] finding the best chemical structures that are associated with desired and requested properties. A potential solution for this challenge is the use of invertible models from machine learning such as generative models (GANs, autoencoders, ...)[48,89] or inverting molecules from families of Hamiltonians.[90−93]
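Whatever machinery ultimately solves it, the inverse design problem can be phrased as optimization over a space of structures: score each candidate against the requested property and search for the best match. A deliberately tiny sketch of this framing (the fragment “alphabet”, the additive property model, and the 450 target are invented placeholders for the real generative or Hamiltonian-inversion methods cited above, which search vastly larger spaces with learned models):

```python
import itertools

# Toy "chemical space": short strings over a fragment alphabet (invented).
FRAGMENTS = "ABCD"

def property_of(structure: str) -> float:
    """Invented additive surrogate property (think: an emission wavelength in nm)."""
    weights = {"A": 100.0, "B": 120.0, "C": 90.0, "D": 140.0}
    return sum(weights[f] for f in structure)

def inverse_design(target: float, length: int = 4):
    """Direct problem: structure -> property. Inverse problem: search the
    space for the structure whose predicted property is closest to the target."""
    best = min(
        ("".join(s) for s in itertools.product(FRAGMENTS, repeat=length)),
        key=lambda s: abs(property_of(s) - target),
    )
    return best, property_of(best)

structure, value = inverse_design(target=450.0)
```

Generative models replace the brute-force enumeration here with a learned, invertible map between structures and properties, which is what makes the approach scale beyond toy spaces of a few hundred candidates.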

Challenge 2. The Chemical Turing Test

The classical Turing test[95] is a gedanken experiment designed by Alan Turing to answer the question, “What is intelligence, and is it exclusive to humans?” In the test, a human communicates with another human or a computer and is ultimately asked to tell whether the interlocutor is a human or a machine. The new goal of theoretical chemistry should be that of providing access to a chemical “oracle”: an AI environment which can help humans solve problems associated with the fundamental chemical questions of the fourth industrial revolution (clean energy, efficient drugs, smart materials, green chemistry, etc.), in a way such that the human cannot distinguish between this and communicating with a human expert.

Challenge 3. The Feynman Test

In 1982, R. Feynman[96] stated, “I want to talk about the possibility that there is to be an exact simulation, that the computer will do exactly the same as nature”, in his visionary article “Simulating Physics with Computers”. A computer simulation that applies the known rules of physics to computer-modeled particles is, to the best of our knowledge, an exact one-to-one mapping to reality, such that experimental and virtual data, for all practical purposes, are indistinguishable. The remaining discrepancy between experiment and computer simulation is an ongoing battle that computational chemists are convincingly winning inch by inch. In the not too distant future, computer simulations will be a fully adequate alternative to experiments, at which point questions like cost efficiency, environmental considerations, or other aspects will be the grounds for the choice between theory and experiment. Quantum computers[97] may offer the way forward, as they are known to simulate matter exactly if a suitable input state is provided.[95] Enormous progress in the field has led to current experiments involving several qubits and simulating molecules as large as BeH2.[81−86]
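Feynman's point can be made concrete with the smallest possible case: a single qubit evolving under the Hamiltonian H = X (the Pauli-X matrix, with ħ = 1) has the closed-form propagator U(t) = exp(−iXt) = cos(t)·I − i·sin(t)·X, which a classical computer evaluates exactly. A minimal sketch (our own illustration, not from the article):

```python
import math

def evolve_qubit(t: float):
    """Exact state of |0> evolved under H = Pauli-X for time t (hbar = 1):
    U(t)|0> = cos(t)|0> - i*sin(t)|1>."""
    amp0 = complex(math.cos(t), 0.0)   # amplitude of |0>
    amp1 = complex(0.0, -math.sin(t))  # amplitude of |1>
    return amp0, amp1

def prob_one(t: float) -> float:
    """Born rule: probability of measuring |1> is |amp1|^2."""
    _, amp1 = evolve_qubit(t)
    return abs(amp1) ** 2

# One qubit is trivial to simulate exactly; an n-qubit register, however,
# needs 2**n complex amplitudes, which is Feynman's exponential wall.
```

The catch is precisely that scaling: the 2^n growth of the classical state vector is what makes quantum hardware attractive for the exact simulation of matter.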

Challenge 4. The Matter Computer

Let us now provide the complete 1957 quote of Per-Olov Löwdin[5] mentioned in the section on quantum chemistry in the 20th century: There seems to be a rather long way to go before we reach the mathematical goal of quantum chemistry, which is to be able to predict accurately the properties of a hypothetic polyatomic molecule before it has been synthesized in the laboratories. The aim is also to obtain such knowledge of the electronic structure of matter that one can construct new substances having properties of particular value to mankind. To learn to think in terms of electrons and their quantum mechanical behaviors is probably of greater technical importance than we can now anticipate. We can now reinterpret this quote in the context of the development of an integrated molecular discovery platform that we name a “matter computer”. These computers process chemicals instead of information. Their “registers” consist of actual chemicals, solvents, nanoparticles, etc. Their information processing subroutines are actual synthesis and characterization tools. The output of the computer is matter that can be readily analyzed and acted upon. They are the analogues of 3D printers but with molecular building blocks. All of these components should be connected to a central control system and database that can make artificial-intelligence-driven decisions about what to synthesize next. Characterization tools should also be integrated in the platform. To understand the chemical space to be explored, these matter computers employ large computer-generated compound screening libraries or generative models based on their data. This loop requires the integration, in one platform, of several technologies that are currently emerging.[15] Physical constraints, such as the number of building blocks, solvents, and catalysts, the capacities of the synthetic hardware, and the current knowledge or prediction of reaction mechanisms, are likely to make these matter computers explore a relatively small fraction of chemical space.
Computers themselves can help design the right robotic synthesis platforms for targeting a region of chemical space.

Challenge 5. The Immersive Chemistry Challenge

A full immersion into the virtual world of a molecular system, with seamless integration of fellow researchers, will boost research and education. This seamless integration is also a necessary condition to promote instantaneous computing, i.e., the observation that the computing time for a given problem has been constantly shrinking over the years, up to the point where starting a calculation and receiving its results can no longer be separated on the time scale of human perception, which is about 60 ms for vision and 1 ms for our sense of touch. It will be necessary to accomplish an ultimate integrated hardware and software implementation of the perfect immersion of a human researcher into the molecular world. Coming back to Dirac’s quote from 1929, which ends “... without too much computation”, we need to rewrite this into “... by immediate and interactive computation” to meet the demands of the time to come.

Challenge 6. The Machine and Human Molecular Representation Learning Challenge

The natural representation of molecules from the point of view of quantum information theory is the content of a full quantum tomography experiment[98] for their wave function, or of a quantum process tomography to explore their dynamical processes.[99] The wave function, however, contains too much information to be processed by a human brain. As we advance in our program of machine learning and automation, the question of representation learning will arise. In the field of machine learning, this pertains to finding the representation of the molecular data that the machine can learn from. At the other end of the spectrum, a human reaches for concepts that help her or him make rational decisions. We hope that, in the future, humans and machines will meet in the middle. Emerging conceptual breakthroughs in chemistry may arise from humans training their computer helpers, and the computer helpers, in return, providing the raw source of inspiration to elaborate new concepts.

Toward a Unified Computation Science of Matter

Science as we know it today owes its classifications to how scientists viewed and understood it in the 19th century and before. As scientific understanding has progressed, the fundamental laws of chemistry and physics, as well as of other areas of science, have been unified. Nowhere is this more evident than in computational science: chemistry and physics simulations use the very same fundamental numerical methods and strategies. This calls for reflection and a possible reclassification of the fields of chemistry and physics.

At first sight this might seem destructive: why destroy an infrastructure that has served mankind so well in the past? Consider that stone and brick were the preferred building materials in New York after the Great Fire of 1835. As building space was exhausted, however, more living and office space could not be added by simply putting a new floor on an existing house; the stone-and-brick technology had come to a dead end. Buildings had to be torn down to make space for structures based on a completely different paradigm: skyscrapers built on steel frames. We suggest that the matter simulation (r)evolution will likewise call for a healthy reclassification of the scientific fields, away from classification based on structure and toward classification based on functionality.

At the opposite end of the spectrum of chemical understanding lie the scribblings of organic chemists on blackboards, also known as arrow-pushing diagrams. These conceptual tools have led to several advances in the field of organic chemistry. At the interface of quantum theory with conceptual learning, ideas such as the Woodward-Hoffmann rules[100] or frontier orbital theory have proven invaluable for our understanding of chemical processes. It goes without saying that these developments in science will also have paramount implications for the educational system.
Can we today be so complacent about the expected changes in science that we continue educating students in skills that we expect to be automated in the not so distant future? Of course not! There will have to be a parallel development in the educational system. Not only will the curriculum have to adapt to the new scientific reality; the educational system will also have to adapt to the new information technology (IT) reality, in which all information is available at the touch of a screen, a voice command, or even a thought. Pedagogic adaptation will be natural: active rather than passive learning will be an instrumental part of a progressive educational system, and the flipped classroom will be the norm rather than the exception. As a final remark, we emphasize the massive effort in future research and education that it will take to push the existing theories and technologies further and to accomplish the grand challenges ahead of us. Undoubtedly, a collaborative international, interdisciplinary effort in research and education is necessary; it cannot be carried out by a few research groups alone. This is a call to arms for computational scientists to continue innovating with all the new tools available to us on a daily basis.
References (showing 10 of 58)

1. Luehr, N.; Jin, A. G. B.; Martínez, T. J. Ab initio interactive molecular dynamics on graphical processing units (GPUs). J. Chem. Theory Comput. 2015.
2. Vaucher, A. C.; Haag, M. P.; Reiher, M. Real-time feedback from iterative electronic structure calculations. J. Comput. Chem. 2015.
3. Haag, M. P.; Reiher, M. Studying chemical reactivity in a virtual environment. Faraday Discuss. 2014.
4. Iakovou, G.; Hayward, S.; Laycock, S. A real-time proximity querying algorithm for haptic-based molecular docking. Faraday Discuss. 2014.
5. Simm, G. N.; Proppe, J.; Reiher, M. Error Assessment of Computational Models in Chemistry. Chimia 2017.
6. Proppe, J.; Reiher, M. Reliable Estimation of Prediction Uncertainty for Physicochemical Property Models. J. Chem. Theory Comput. 2017.
7. Kowalik, M.; Gothard, C. M.; Drews, A. M.; Gothard, N. A.; Weckiewicz, A.; Fuller, P. E.; Grzybowski, B. A.; Bishop, K. J. M. Parallel optimization of synthetic pathways within the network of organic chemistry. Angew. Chem. Int. Ed. 2012.
8. Simm, G. N.; Reiher, M. Systematic Error Estimation for Chemical Reaction Energies. J. Chem. Theory Comput. 2016.
9. Wang, H.; La Russa, M.; Qi, L. S. CRISPR/Cas9 in Genome Editing and Beyond. Annu. Rev. Biochem. 2016.
10. De Luna, P.; Wei, J.; Bengio, Y.; Aspuru-Guzik, A.; Sargent, E. Use machine learning to find energy materials. Nature 2017.