Crystal Composition Transformer: Self-Learning Neural Language Model for Generative and Tinkering Design of Materials.
Adv Sci (Weinh); 11(36): e2304305, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39101275
ABSTRACT
Self-supervised neural language models have recently achieved unprecedented success, from natural language processing to learning the languages of biological sequences and organic molecules. These models have demonstrated superior performance in generation, structure classification, and functional prediction for proteins and molecules using learned representations. However, most masking-based pre-trained language models are not designed for generative design, and their black-box nature makes it difficult to interpret their design logic. Here a Blank-filling Language Model for Materials (BLMM) Crystal Transformer is proposed, a neural network-based probabilistic generative model for generative and tinkering design of inorganic materials. The model is built on the blank-filling language model for text generation and has demonstrated unique advantages in learning the "materials grammars" together with high-quality generation, interpretability, and data efficiency. It can generate chemically valid materials compositions with as high as 89.7% charge neutrality and 84.8% balanced electronegativity, which are more than four and eight times higher, respectively, than a pseudo-random sampling baseline. The probabilistic generation process of BLMM allows it to recommend materials tinkering operations based on learned materials chemistry, which makes it useful for materials doping. The model is applied to discover a set of new materials, as validated by Density Functional Theory (DFT) calculations. This work thus brings unsupervised transformer-language-model-based generative artificial intelligence to inorganic materials. A user-friendly web app for tinkering materials design has been developed and can be accessed freely at www.materialsatlas.org/blmtinker.
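The abstract reports the fraction of generated compositions that are charge-neutral and electronegativity-balanced. The paper's exact validity checks are not given here, but charge neutrality for an inorganic composition is conventionally tested by searching for an assignment of common oxidation states that sums to zero. The sketch below illustrates that idea; the `OX_STATES` table is a small hypothetical subset for demonstration (a real screening pipeline would use a full oxidation-state table, e.g. from a materials informatics library).

```python
from itertools import product

# Illustrative subset of common oxidation states (assumption; a real check
# would use a complete reference table of known oxidation states).
OX_STATES = {
    "Sr": [2], "Ti": [2, 3, 4], "O": [-2, -1],
    "Na": [1], "Cl": [-1],
}

def charge_neutral(composition):
    """Return True if some combination of common oxidation states balances
    the total charge of the composition (a dict: element -> atom count)."""
    elems = list(composition)
    # Try every combination of oxidation states across the elements.
    for states in product(*(OX_STATES[e] for e in elems)):
        if sum(s * composition[e] for e, s in zip(elems, states)) == 0:
            return True
    return False

print(charge_neutral({"Sr": 1, "Ti": 1, "O": 3}))  # SrTiO3: 2 + 4 - 6 = 0 -> True
print(charge_neutral({"Na": 1, "Cl": 2}))          # NaCl2 cannot balance -> False
```

A generated composition that passes such a check (and an analogous electronegativity-ordering check between the assigned cations and anions) would count toward the validity percentages quoted in the abstract.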
Full text:
1
Database:
MEDLINE
Language:
En
Publication year:
2024
Document type:
Article