Results 1 - 3 of 3
1.
Nat Chem Biol ; 16(4): 387-390, 2020 04.
Article in English | MEDLINE | ID: mdl-31873222

ABSTRACT

Here, we report a rapid CRISPR-Cas9-mediated gene knock-in strategy that uses Cas9 ribonucleoprotein and 5'-modified double-stranded DNA donors with 50-base-pair homology arms, achieving unprecedented knock-in rates of 65% and 40% for 0.7- and 2.5-kilobase inserts, respectively, in human embryonic kidney 293T cells. The identified 5'-end modification increased gene knock-in rates up to fivefold at various genomic loci in human cancer and stem cells.


Subject(s)
Gene Knock-In Techniques/methods , 5' Flanking Region/genetics , CRISPR-Cas Systems/genetics , Clustered Regularly Interspaced Short Palindromic Repeats/genetics , DNA/genetics , Genome/genetics , HEK293 Cells , Humans , RNA, Guide, Kinetoplastida/genetics , Sequence Homology, Nucleic Acid
2.
Nat Chem Biol ; 16(4): 479, 2020 04.
Article in English | MEDLINE | ID: mdl-31942048

ABSTRACT

An amendment to this paper has been published and can be accessed via a link at the top of the paper.

3.
Protein Eng Des Sel ; 36, 2023 Jan 21.
Article in English | MEDLINE | ID: mdl-37883472

ABSTRACT

Self-supervised pretraining on protein sequences has led to state-of-the-art performance on protein function and fitness prediction. However, sequence-only methods ignore the rich information contained in experimental and predicted protein structures. Meanwhile, inverse folding methods reconstruct a protein's amino-acid sequence given its structure, but do not take advantage of sequences that lack known structures. In this study, we train a masked inverse folding protein masked language model parameterized as a structured graph neural network. During pretraining, this model learns to reconstruct corrupted sequences conditioned on the backbone structure. We then show that using the outputs of a pretrained sequence-only protein masked language model as input to the inverse folding model further improves pretraining perplexity. We evaluate both models on downstream protein engineering tasks and analyze how using information from experimental or predicted structures affects performance.


Subject(s)
Protein Engineering , Protein Folding , Amino Acid Sequence
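The sequence-corruption step this abstract describes (masking residues that the model must then reconstruct, conditioned on backbone structure) can be sketched in isolation. This is a minimal illustration, not the paper's implementation; the mask token, mask rate, and function name are assumptions:

```python
import random

# Placeholder mask token; the actual vocabulary and mask symbol
# used in the paper's model are not specified here (assumption).
MASK = "#"

def corrupt_sequence(seq, mask_rate=0.15, rng=None):
    """Randomly mask a fraction of residues, BERT-style.

    Returns the corrupted sequence and the indices the model would
    be trained to reconstruct (in the paper's setup, conditioned on
    the protein's backbone structure).
    """
    rng = rng or random.Random()
    n_mask = max(1, round(mask_rate * len(seq)))
    targets = sorted(rng.sample(range(len(seq)), n_mask))
    corrupted = list(seq)
    for i in targets:
        corrupted[i] = MASK
    return "".join(corrupted), targets

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
corrupted, targets = corrupt_sequence(seq, rng=random.Random(0))
```

During pretraining, the loss would be computed only at the `targets` positions, comparing the model's structure-conditioned predictions against the original residues.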