Results 1 - 4 of 4
1.
JAMIA Open ; 5(4): ooac087, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36380848

ABSTRACT

Objective: Healthcare data such as clinical notes are primarily recorded in an unstructured manner. If adequately translated into structured data, they can be used for health economics and lay the groundwork for better individualized patient care. To structure clinical notes, deep-learning methods, particularly transformer-based models such as Bidirectional Encoder Representations from Transformers (BERT), have recently received much attention. Currently, biomedical applications are primarily focused on the English language. While general-purpose German-language models such as GermanBERT and GottBERT have been published, adaptations for biomedical data are unavailable. This study evaluated the suitability of existing and novel transformer-based models for the German biomedical and clinical domain.

Materials and Methods: We used 8 transformer-based models, pre-trained 3 new models on a newly generated biomedical corpus, and systematically compared them with each other. We annotated a new dataset of clinical notes and used it with 4 other corpora (BRONCO150, CLEF eHealth 2019 Task 1, GGPONC, and JSynCC) to perform named entity recognition (NER) and document classification tasks.

Results: General-purpose language models can be used effectively for biomedical and clinical natural language processing (NLP) tasks; still, our newly trained BioGottBERT model outperformed GottBERT on both clinical NER tasks. However, training new biomedical models from scratch proved ineffective.

Discussion: The domain-adaptation strategy's potential is currently limited by a lack of pre-training data. Since general-purpose language models are only marginally inferior to domain-specific models, both options are suitable for developing German-language biomedical applications.

Conclusion: General-purpose language models perform remarkably well on biomedical and clinical NLP tasks. If larger corpora become available in the future, domain-adapting these models may improve performance.
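The NER comparisons described above are typically scored with strict entity-level F1 over BIO-tagged token sequences. The following is a minimal, self-contained sketch of that metric; the tag sequences and the `DIS` entity type are illustrative, not taken from the paper's corpora.

```python
def extract_entities(tags):
    """Return the set of (start, end, type) spans in a BIO tag sequence."""
    entities, start, etype = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and etype != tag[2:]
        ):
            if start is not None:  # close the span that just ended
                entities.add((start, i, etype))
                start, etype = None, None
        if tag.startswith("B-"):
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            start, etype = i, tag[2:]  # tolerate I- without a preceding B-
    return entities

def entity_f1(gold, pred):
    """Strict micro F1: a predicted span counts only on an exact match."""
    g, p = extract_entities(gold), extract_entities(pred)
    tp = len(g & p)
    prec = tp / len(p) if p else 0.0
    rec = tp / len(g) if g else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

gold = ["B-DIS", "I-DIS", "O", "B-DIS"]
pred = ["B-DIS", "I-DIS", "O", "O"]
score = entity_f1(gold, pred)  # precision 1.0, recall 0.5 -> F1 = 2/3
```

Exact-match span scoring is deliberately strict: a prediction that clips or extends a gold entity by one token scores zero for that span, which is why entity-level F1 is usually well below token-level accuracy.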

2.
J Biomed Semantics ; 13(1): 26, 2022 Oct 27.
Article in English | MEDLINE | ID: mdl-36303237

ABSTRACT

BACKGROUND: Intense research has been done in the area of biomedical natural language processing. Since the breakthrough of transfer learning-based methods, BERT models have been used in a variety of biomedical and clinical applications. On the available data sets, these models show excellent results, partly exceeding the inter-annotator agreement. However, biomedical named entity recognition applied to COVID-19 preprints shows a performance drop compared to the results on test data. This raises the question of how well trained models are able to predict on completely new data, i.e., to generalize.

RESULTS: Using disease named entity recognition as an example, we investigate the robustness of different machine learning-based methods, among them transfer learning, and show that current state-of-the-art methods work well for a given training set and the corresponding test set but suffer a significant lack of generalization when applied to new data.

CONCLUSIONS: We argue that larger annotated data sets are needed for training and testing. We therefore foresee the curation of further data sets and, moreover, the investigation of continual learning processes for machine learning-based models.


Subjects
COVID-19, Data Mining, Humans, Data Mining/methods, Natural Language Processing, Machine Learning
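The generalization gap the abstract describes can be illustrated with a deliberately naive baseline: a dictionary "tagger" that memorizes training mentions scores perfectly on its own corpus but misses unseen disease names. All names and scores below are illustrative toy data, not results from the paper.

```python
def dictionary_tagger(tokens, lexicon):
    """Label a token as a disease mention iff it was seen during 'training'."""
    return ["B-DIS" if t.lower() in lexicon else "O" for t in tokens]

def mention_recall(gold, pred):
    """Fraction of gold entity tokens the tagger recovered."""
    hits = sum(1 for g, p in zip(gold, pred) if g != "O" and p != "O")
    total = sum(1 for g in gold if g != "O")
    return hits / total if total else 1.0

lexicon = {"diabetes", "asthma"}  # mentions memorized from the training corpus

train_tokens = ["patient", "with", "diabetes"]
train_gold   = ["O", "O", "B-DIS"]
new_tokens   = ["suspected", "covid-19", "pneumonia"]  # unseen mentions
new_gold     = ["O", "B-DIS", "B-DIS"]

in_domain  = mention_recall(train_gold, dictionary_tagger(train_tokens, lexicon))
out_domain = mention_recall(new_gold, dictionary_tagger(new_tokens, lexicon))
# in_domain is 1.0, out_domain is 0.0: held-out test sets drawn from the same
# corpus hide this failure mode; genuinely new data (e.g. COVID-19 preprints)
# exposes it.
```

Trained models degrade far more gracefully than a lookup table, but the mechanism is the same: evaluation on a test split from the training distribution overstates performance on new text.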
3.
Chembiochem ; 18(22): 2222-2225, 2017 Nov 16.
Article in English | MEDLINE | ID: mdl-28898524

ABSTRACT

Squalene-hopene cyclases (SHCs) catalyze the polycyclization of squalene into a mixture of hopene and hopanol. Recently, amino-acid residues lining the catalytic cavity of the SHC from Alicyclobacillus acidocaldarius were replaced by small and large hydrophobic amino acids. The alteration of leucine 607 to phenylalanine resulted in increased enzymatic activity towards the formation of an intermolecular farnesyl-farnesyl ether product from farnesol. Furthermore, the addition of small-chain alcohols acting as nucleophiles led to the formation of non-natural ether-linked terpenoids and, thus, to significant alteration of the product pattern relative to that obtained with the wild type. It is proposed that the mutation of leucine at position 607 may facilitate premature quenching of the intermediate by small alcohol nucleophiles. This mutagenesis-based study opens the field for further intermolecular bond-forming reactions and the generation of non-natural products.


Subjects
Alcohols/metabolism, Intramolecular Transferases/metabolism, Terpenes/metabolism, Alcohols/chemistry, Alicyclobacillus/enzymology, Genetic Variation/genetics, Intramolecular Transferases/genetics, Molecular Structure, Mutagenesis, Site-Directed, Terpenes/chemistry
4.
Chem Commun (Camb) ; 48(42): 5115-7, 2012 May 25.
Article in English | MEDLINE | ID: mdl-22513828

ABSTRACT

CYP153A from Marinobacter aquaeolei has been identified as a fatty acid ω-hydroxylase with a broad substrate range. Two hotspots predicted to influence substrate specificity and selectivity were exchanged. Mutant G307A is 2- to 20-fold more active towards fatty acids than the wild type. Residue L354 is a determinant of the enzyme's ω-regioselectivity.


Subjects
Bacterial Proteins/metabolism, Fatty Acids/biosynthesis, Mixed Function Oxygenases/metabolism, Bacterial Proteins/genetics, Fatty Acids/chemistry, Hydroxylation, Marinobacter/enzymology, Mixed Function Oxygenases/genetics, Mutation, Stereoisomerism, Substrate Specificity