Results 1 - 2 of 2
1.
Arch Microbiol ; 206(6): 249, 2024 May 07.
Article in English | MEDLINE | ID: mdl-38713385

ABSTRACT

Escherichia coli (E. coli) can induce severe clinical bovine mastitis, which causes substantial losses on dairy farms. Macrophages polarize into distinct states in response to pathogen infection. Lycopene, a naturally occurring hydrocarbon carotenoid, relieves inflammation by controlling the M1/M2 status of macrophages. We therefore explored the effect of lycopene on macrophage polarization states in E. coli-induced mastitis. Macrophages were cultured with lycopene for 24 h before inoculation with E. coli for 6 h. Lycopene at 0.5 µmol/L significantly enhanced cell viability and significantly reduced lactate dehydrogenase (LDH) levels in macrophages, whereas 2 and 3 µmol/L lycopene significantly enhanced LDH activity. Lycopene treatment significantly reduced the E. coli-induced increases in LDH release and in iNOS, CD86, TNF-α, IL-1β and phosphatase and tensin homolog (PTEN) expression. Lycopene at 0.5 µmol/L significantly reversed the E. coli-induced downregulation of CD206, arginase I (ARG1), indoleamine 2,3-dioxygenase (IDO), chitinase 3-like 3 (YM1), PI3K, AKT, p-AKT, mammalian target of rapamycin (mTOR), p-mTOR, jumonji domain-containing protein-3 (JMJD3) and interferon regulatory factor 4 (IRF4). Moreover, ginkgolic acid C17:1 (a specific PTEN inhibitor), 740YPDGFR (a specific PI3K activator), SC79 (a specific AKT activator) and CHPG sodium salt (a specific NF-κB activator) each significantly decreased CD206, ARG1, IDO and YM1 expression in macrophages treated with lycopene and E. coli. Therefore, lycopene increased M2 macrophage polarization by inhibiting the NOTCH1-PI3K-mTOR-NF-κB-JMJD3-IRF4 pathway in response to E. coli infection in macrophages. These results help reveal the pathogenesis of E. coli-induced bovine mastitis and provide a new perspective on the prevention and management of mastitis.


Subjects
Escherichia coli Infections , Escherichia coli , Lycopene , Macrophages , Signal Transduction , Animals , Cattle , Female , Mice , Cell Line , Escherichia coli Infections/microbiology , Escherichia coli Infections/immunology , Interferon Regulatory Factors/metabolism , Interferon Regulatory Factors/genetics , Lycopene/pharmacology , Macrophages/drug effects , Macrophages/microbiology , Macrophages/immunology , Macrophages/metabolism , Mastitis, Bovine/microbiology , NF-kappa B/metabolism , Phosphatidylinositol 3-Kinases/metabolism , Phosphatidylinositol 3-Kinases/genetics , Receptor, Notch1/metabolism , Receptor, Notch1/genetics , Signal Transduction/drug effects , TOR Serine-Threonine Kinases/metabolism
2.
IEEE Trans Pattern Anal Mach Intell ; 45(7): 8176-8192, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37018677

ABSTRACT

Attention-based neural networks, such as Transformers, have become ubiquitous in numerous applications, including computer vision, natural language processing, and time-series analysis. In all attention networks, the attention maps are crucial because they encode semantic dependencies between input tokens. However, most existing attention networks perform modeling or reasoning based on representations, and the attention maps of different layers are learned separately without explicit interaction. In this paper, we propose a novel and generic evolving attention mechanism that directly models the evolution of inter-token relationships through a chain of residual convolutional modules. The motivations are twofold. On the one hand, the attention maps in different layers share transferable knowledge, so adding a residual connection can facilitate the flow of inter-token relationship information across layers. On the other hand, there is a natural evolutionary trend among attention maps at different abstraction levels, so it is beneficial to exploit a dedicated convolution-based module to capture this process. Equipped with the proposed mechanism, the convolution-enhanced evolving attention networks achieve superior performance in various applications, including time-series representation, natural language understanding, machine translation, and image classification. On time-series representation tasks in particular, the Evolving Attention-enhanced Dilated Convolutional (EA-DC-) Transformer significantly outperforms state-of-the-art models, achieving an average improvement of 17% over the best SOTA baseline. To the best of our knowledge, this is the first work that explicitly models the layer-wise evolution of attention maps. Our implementation is available at https://github.com/pkuyym/EvolvingAttention.
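The core idea described above, each layer computing attention logits and refining them with a residual convolution over the previous layer's attention map, can be illustrated with a minimal single-head sketch. This is not the authors' implementation (see their repository for that); the mixing weight `alpha`, the 3x3 filter, and all tensor shapes here are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def conv3x3_same(m, kernel):
    """Apply a 3x3 filter with 'same' padding to a (T, T) attention-logit map."""
    T = m.shape[0]
    p = np.pad(m, 1)
    out = np.zeros_like(m)
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + T, j:j + T]
    return out

def evolving_attention_layer(x, Wq, Wk, Wv, kernel, prev_logits=None, alpha=0.5):
    """One single-head attention layer. If the previous layer's logits are
    given, they are convolved and mixed in residually (illustrative sketch;
    alpha is an assumed hyperparameter, not taken from the paper)."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    logits = q @ k.T / np.sqrt(x.shape[1])          # (T, T) attention map
    if prev_logits is not None:
        # evolve: mix current logits with a convolved copy of the previous map
        logits = alpha * logits + (1 - alpha) * conv3x3_same(prev_logits, kernel)
    return softmax(logits) @ v, logits

# usage: chain two layers, passing each layer's logits on to the next
rng = np.random.default_rng(0)
T, d = 5, 8
x = rng.standard_normal((T, d))
Ws = [rng.standard_normal((d, d)) * 0.1 for _ in range(6)]
kernel = rng.standard_normal((3, 3)) * 0.1
h1, a1 = evolving_attention_layer(x, Ws[0], Ws[1], Ws[2], kernel)
h2, a2 = evolving_attention_layer(h1, Ws[3], Ws[4], Ws[5], kernel, prev_logits=a1)
```

The convolution treats the (T, T) attention map as an image, so local patterns in token-pair relationships can be propagated and refined from one layer to the next rather than relearned from scratch.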
