nach0: multimodal natural and chemical languages foundation model.
Livne, Micha; Miftahutdinov, Zulfat; Tutubalina, Elena; Kuznetsov, Maksim; Polykovskiy, Daniil; Brundyn, Annika; Jhunjhunwala, Aastha; Costa, Anthony; Aliper, Alex; Aspuru-Guzik, Alán; Zhavoronkov, Alex.
Affiliation
  • Livne M; NVIDIA 2788 San Tomas Expressway Santa Clara 95051 CA USA.
  • Miftahutdinov Z; Insilico Medicine Canada Inc. 3710-1250 René-Lévesque West Montreal Quebec Canada.
  • Tutubalina E; Insilico Medicine Hong Kong Ltd. Unit 310, 3/F, Building 8W, Phase 2, Hong Kong Science Park, Pak Shek Kok New Territories Hong Kong alex@insilicomedicine.com.
  • Kuznetsov M; Insilico Medicine Canada Inc. 3710-1250 René-Lévesque West Montreal Quebec Canada.
  • Polykovskiy D; Insilico Medicine Canada Inc. 3710-1250 René-Lévesque West Montreal Quebec Canada.
  • Brundyn A; NVIDIA 2788 San Tomas Expressway Santa Clara 95051 CA USA.
  • Jhunjhunwala A; NVIDIA 2788 San Tomas Expressway Santa Clara 95051 CA USA.
  • Costa A; NVIDIA 2788 San Tomas Expressway Santa Clara 95051 CA USA.
  • Aliper A; Insilico Medicine AI Ltd. Level 6, Unit 08, Block A, IRENA HQ Building, Masdar City Abu Dhabi United Arab Emirates.
  • Aspuru-Guzik A; University of Toronto Lash Miller Building 80 St. George Street Toronto Ontario Canada alan@aspuru.com.
  • Zhavoronkov A; Insilico Medicine Hong Kong Ltd. Unit 310, 3/F, Building 8W, Phase 2, Hong Kong Science Park, Pak Shek Kok New Territories Hong Kong alex@insilicomedicine.com.
Chem Sci; 15(22): 8380-8389, 2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38846388
ABSTRACT
Large Language Models (LLMs) have substantially driven scientific progress in various domains, and many papers have demonstrated their ability to tackle complex problems with creative solutions. Our paper introduces a new foundation model, nach0, capable of solving various chemical and biological tasks: biomedical question answering, named entity recognition, molecular generation, molecular synthesis, attribute prediction, and others. nach0 is a multi-domain and multi-task encoder-decoder LLM pre-trained on unlabeled text from scientific literature, patents, and molecule strings to incorporate a range of chemical and linguistic knowledge. We employed instruction tuning, where specific task-related instructions are utilized to fine-tune nach0 for the final set of tasks. To train nach0 effectively, we leveraged the NeMo framework, enabling efficient parallel optimization of both base and large model versions. Extensive experiments demonstrate that our model outperforms state-of-the-art baselines on single-domain and cross-domain tasks. Furthermore, it can generate high-quality outputs in molecular and textual formats, showcasing its effectiveness in multi-domain setups.
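The instruction-tuning recipe the abstract describes, task-specific prompts in natural language paired with textual or SMILES targets fed to an encoder-decoder model, can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the stand-in checkpoint (google/flan-t5-base), the prompt templates, and the toy examples are all assumptions, and nach0 itself is trained at scale with the NeMo framework.

import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Stand-in encoder-decoder checkpoint; nach0's own weights and NeMo
# training setup are not reproduced here.
model_name = "google/flan-t5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative multi-task instruction examples mixing natural and
# chemical language; prompts and targets are hypothetical.
examples = [
    # molecular generation: instruction -> SMILES string
    ("Generate a SMILES string for a lipophilic molecule.",
     "CCCCCCCCCCCCCCCC"),
    # named entity recognition: instruction + sentence -> entity
    ("Extract drug names from the text: Aspirin reduced inflammation.",
     "aspirin"),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for prompt, target in examples:
    inputs = tokenizer(prompt, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids
    # Standard seq2seq cross-entropy loss on the target tokens.
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

The same loop covers every task in the mixture; only the prompt template changes, which is what lets a single encoder-decoder model serve both textual and molecular outputs.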

Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: Chem Sci Year: 2024 Document type: Article Country of publication: United Kingdom
