Results 1 - 3 of 3
1.
Nano Lett; 17(7): 4453-4460, 2017 Jul 12.
Article in English | MEDLINE | ID: mdl-28640634

ABSTRACT

Structure determination and prediction pose a major challenge to computational materials science, demanding efficient global structure search techniques tailored to identify promising and relevant candidates. A major bottleneck is that the many combinatorial possibilities leave far too many candidate geometries to be sampled exhaustively. Here, an innovative computational approach to overcome this problem is presented that explores the potential energy landscape of commensurate organic/inorganic interfaces in which the orientation and conformation of the molecules in the tightly packed layer are close to a favorable geometry adopted by isolated molecules on the surface. It is specifically designed to sample the energetically lowest-lying structures, including the thermodynamic minimum, in order to survey the particularly rich and intricate polymorphism in such systems. The approach combines a systematic discretization of the configuration space, which leads to a huge reduction of the combinatorial possibilities, with an efficient exploration of the potential energy surface inspired by the Basin-Hopping method. Interfacing the algorithm with first-principles calculations, the power and efficiency of this approach are demonstrated for the example of the organic molecule TCNE (tetracyanoethylene) on Au(111). For the pristine metal surface, the global minimum structure is found to be at variance with the geometry observed by scanning tunneling microscopy. Rather, our results suggest the presence of surface adatoms or vacancies that are not imaged in the experiment.
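As an illustration only, the toy Python sketch below shows the general basin-hopping pattern over a discretized configuration space (discrete adsorption sites and molecular orientations). The energy function, discretization sizes, and move set are invented placeholders for this sketch, not the paper's first-principles setup or its actual algorithm.

```python
# Toy sketch of basin-hopping-style search over a discretized configuration
# space. The energy function below is a placeholder; a real implementation
# would evaluate first-principles (e.g. DFT) interface energies instead.
import math
import random

# Hypothetical discretization: a configuration is (adsorption site, rotation).
N_SITES = 12        # discrete adsorption sites in the surface unit cell (assumed)
N_ROTATIONS = 6     # discrete in-plane molecular orientations (assumed)

def toy_energy(config):
    """Placeholder energy landscape over the discrete configurations."""
    site, rot = config
    return math.cos(2 * math.pi * site / N_SITES) + 0.5 * math.sin(2 * math.pi * rot / N_ROTATIONS)

def random_hop(config):
    """Hop to a neighbouring point of the discretized configuration space."""
    site, rot = config
    if random.random() < 0.5:
        site = (site + random.choice([-1, 1])) % N_SITES
    else:
        rot = (rot + random.choice([-1, 1])) % N_ROTATIONS
    return (site, rot)

def basin_hopping(steps=500, temperature=0.1):
    current = (random.randrange(N_SITES), random.randrange(N_ROTATIONS))
    current_e = toy_energy(current)
    best, best_e = current, current_e
    for _ in range(steps):
        trial = random_hop(current)
        trial_e = toy_energy(trial)
        # Metropolis criterion: always accept downhill hops, sometimes uphill ones.
        if trial_e < current_e or random.random() < math.exp(-(trial_e - current_e) / temperature):
            current, current_e = trial, trial_e
            if current_e < best_e:
                best, best_e = current, current_e
    return best, best_e

print(basin_hopping())
```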

2.
Nat Commun; 15(1): 120, 2024 Jan 02.
Article in English | MEDLINE | ID: mdl-38168035

ABSTRACT

Deep neural networks have become a highly accurate and powerful wavefunction ansatz in combination with variational Monte Carlo methods for solving the electronic Schrödinger equation. However, despite their success and favorable scaling, these methods are still computationally too costly for wide adoption. A significant obstacle is the requirement to optimize the wavefunction from scratch for each new system, which leads to long optimization runs. In this work, we propose a neural network ansatz that effectively maps uncorrelated, computationally cheap Hartree-Fock orbitals to correlated, high-accuracy neural network orbitals. This ansatz is inherently capable of learning a single wavefunction across multiple compounds and geometries, as we demonstrate by successfully transferring a wavefunction model pre-trained on smaller fragments to larger compounds. Furthermore, we provide ample experimental evidence to support the idea that extensive pre-training of such a generalized wavefunction model across different compounds and geometries could lead to a foundation wavefunction model. Such a model could yield high-accuracy ab initio energies using only minimal computational effort for fine-tuning and evaluation of observables.
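The numpy sketch below illustrates, under assumed toy shapes and a made-up multiplicative correction, the general idea of feeding cheap Hartree-Fock orbital values through a small shared network to obtain corrected orbitals that enter a Slater determinant. It is not the architecture proposed in the paper; all sizes and the random inputs are illustrative assumptions.

```python
# Toy sketch: map Hartree-Fock orbital values to "neural-network orbitals"
# via a small shared MLP correction, then form a Slater-determinant amplitude.
import numpy as np

rng = np.random.default_rng(0)
n_electrons = 4          # toy closed-shell example: electrons = orbitals
hidden = 16

# Shared weights: the same mapping is applied to every electron and orbital,
# which is the part of the sketch meant to illustrate transferability.
W1 = rng.normal(scale=0.1, size=(1, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=(hidden, 1))
b2 = np.zeros(1)

def nn_orbitals(hf_orbitals):
    """Map an (n_electrons, n_orbitals) matrix of HF orbital values to
    corrected orbital values via a small shared MLP."""
    x = hf_orbitals[..., None]              # (n_el, n_orb, 1)
    h = np.tanh(x @ W1 + b1)                # (n_el, n_orb, hidden)
    correction = (h @ W2 + b2)[..., 0]      # (n_el, n_orb)
    return hf_orbitals * (1.0 + correction) # multiplicative correction (assumed form)

def log_abs_determinant(orbitals):
    """log|det| of the orbital matrix, i.e. the Slater-determinant amplitude."""
    sign, logdet = np.linalg.slogdet(orbitals)
    return logdet

# Fake HF orbital values evaluated at one electron configuration.
phi_hf = rng.normal(size=(n_electrons, n_electrons))
print(log_abs_determinant(nn_orbitals(phi_hf)))
```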

3.
Nat Comput Sci; 2(5): 331-341, 2022 May.
Article in English | MEDLINE | ID: mdl-38177815

ABSTRACT

The Schrödinger equation describes the quantum-mechanical behaviour of particles, making it the most fundamental equation in chemistry. A solution for a given molecule allows computation of any of its properties. Finding accurate solutions for many different molecules and geometries is thus crucial to the discovery of new materials such as drugs or catalysts. Despite its importance, the Schrödinger equation is notoriously difficult to solve even for single molecules, as established methods scale exponentially with the number of particles. Combining Monte Carlo techniques with unsupervised optimization of neural networks was recently discovered as a promising approach to overcome this curse of dimensionality, but the corresponding methods do not exploit synergies that arise when considering multiple geometries. Here we show that sharing the vast majority of weights across neural network models for different geometries substantially accelerates optimization. Furthermore, weight-sharing yields pretrained models that require only a small number of additional optimization steps to obtain high-accuracy solutions for new geometries.
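The numpy sketch below illustrates only the parameter layout implied by weight-sharing: a large shared part optimized jointly over several geometries plus small geometry-specific parts, followed by a short fine-tuning phase on a new geometry. The quadratic loss is a toy stand-in for the variational Monte Carlo objective; none of the numbers or update rules come from the paper.

```python
# Toy sketch of weight-sharing across geometries: shared parameters are
# pretrained over several geometries, then a new geometry reuses them and
# only optimizes a small geometry-specific part for a few steps.
import numpy as np

rng = np.random.default_rng(1)
train_geometries = [0.8, 1.0, 1.2]        # toy geometry parameters (e.g. bond lengths)

shared = rng.normal(size=3)                # weights shared across all geometries
specific = {g: rng.normal(size=1) for g in train_geometries}

def loss(shared, spec, g):
    """Toy objective whose minimum depends on the geometry g."""
    return float(np.sum((shared - g) ** 2) + np.sum((spec - 0.5 * g) ** 2))

def step(shared, spec, g, lr=0.1, update_shared=True):
    """One gradient step on the toy loss (analytic gradients); a VMC code would
    instead estimate gradients from Monte Carlo samples of the local energy."""
    if update_shared:
        shared = shared - lr * 2.0 * (shared - g)
    spec = spec - lr * 2.0 * (spec - 0.5 * g)
    return shared, spec

# "Pretraining": cycle through the training geometries so the shared weights
# are optimized against all of them.
for i in range(300):
    g = train_geometries[i % len(train_geometries)]
    shared, specific[g] = step(shared, specific[g], g)

# "Fine-tuning" on a new geometry: keep the pretrained shared weights and
# optimize only the small geometry-specific part for a few steps.
g_new = 1.1
spec_new = rng.normal(size=1)
for _ in range(20):
    _, spec_new = step(shared, spec_new, g_new, update_shared=False)

print(loss(shared, spec_new, g_new))
```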
