Results 1 - 5 of 5
1.
JMIR Form Res; 6(12): e23422, 2022 Dec 19.
Article in English | MEDLINE | ID: mdl-36534457

ABSTRACT

BACKGROUND: Real-time air pollution monitoring is a valuable tool for public health and environmental surveillance. In recent years, there has been a dramatic increase in air pollution forecasting and monitoring research using artificial neural networks. Most prior work relied on modeling pollutant concentrations collected from ground-based monitors and meteorological data for long-term forecasting of outdoor ozone (O3), oxides of nitrogen, and fine particulate matter (PM2.5). Given that traditional, highly sophisticated air quality monitors are expensive and not universally available, these models cannot adequately serve people who do not live near pollutant monitoring sites. Furthermore, because prior models were built on physical measurement data collected from sensors, they may not be suitable for predicting the public health effects of pollution exposure.

OBJECTIVE: This study aimed to develop and validate models to nowcast observed pollution levels using web search data, which are publicly available in near real time from major search engines.

METHODS: We developed novel machine learning-based models using both traditional supervised classification methods and state-of-the-art deep learning methods to detect elevated air pollution levels at the US city level, using generally available meteorological data and aggregate web-based search volume data derived from Google Trends. We validated the performance of these methods by predicting 3 critical air pollutants (O3, nitrogen dioxide, and PM2.5) across 10 major US metropolitan statistical areas in 2017 and 2018. We also explored different variations of the long short-term memory (LSTM) model and proposed a novel search term dictionary learner-LSTM model that learns sequential patterns across multiple search terms for prediction.

RESULTS: The top-performing model was a deep neural sequence model (LSTM) using meteorological and web search data; it reached an accuracy of 0.82 (F1-score 0.51) for O3, 0.74 (F1-score 0.41) for nitrogen dioxide, and 0.85 (F1-score 0.27) for PM2.5 when detecting elevated pollution levels. Compared with using only meteorological data, the proposed method achieved superior accuracy by incorporating web search data.

CONCLUSIONS: The results show that incorporating web search data alongside meteorological data improves nowcasting performance for all 3 pollutants and suggest promising novel applications for tracking global physical phenomena using web search data.
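A minimal sketch (in PyTorch) of the kind of LSTM classifier the METHODS section describes: a recurrent network over a window of daily meteorological and web-search-volume features that outputs a binary "elevated pollution" flag. The class name, window length, and feature counts are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, not the authors' code: an LSTM that flags elevated pollution
# days from a window of daily meteorological and web-search-volume features.
import torch
import torch.nn as nn

class PollutionLSTM(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # binary: elevated vs. not elevated

    def forward(self, x):                 # x: (batch, days, n_features)
        _, (h, _) = self.lstm(x)          # keep only the final hidden state
        return self.head(h[-1])           # raw logit per sequence

# Toy usage: 7-day windows, 5 meteorological + 10 search-volume features.
model = PollutionLSTM(n_features=15)
x = torch.randn(32, 7, 15)               # 32 hypothetical city-day windows
labels = torch.randint(0, 2, (32, 1)).float()
loss = nn.BCEWithLogitsLoss()(model(x), labels)
loss.backward()                           # one illustrative training step
```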

2.
NPJ Precis Oncol; 2: 24, 2018.
Article in English | MEDLINE | ID: mdl-30417117

ABSTRACT

Oligodendrogliomas are diffusely infiltrative gliomas defined by IDH mutation and co-deletion of 1p/19q. They have highly variable clinical courses, with survival times ranging from 6 months to over 20 years, but little is known about the pathways involved in their progression or the optimal markers for stratifying risk. We utilized machine-learning approaches with genomic data from The Cancer Genome Atlas to objectively identify molecular factors associated with clinical outcomes of oligodendroglioma and extended these findings to study signaling pathways implicated in oncogenesis and clinical endpoints associated with glioma progression. Our multi-faceted computational approach uncovered key genetic alterations associated with disease progression and shorter survival in oligodendroglioma, and specifically identified Notch pathway inactivation and PI3K pathway activation as the alterations most strongly associated with MRI and pathology findings of advanced disease and poor clinical outcome. Our findings that Notch pathway inactivation and PI3K pathway activation are associated with advanced disease and survival risk will pave the way for clinically relevant markers of disease progression and therapeutic targets to improve clinical outcomes. Furthermore, our approach demonstrates the strength of machine learning and computational methods for identifying genetic events critical to disease progression in the era of big data and precision medicine.
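The abstract does not name the specific learning algorithms used; the sketch below illustrates, on synthetic stand-in data, one generic way to rank genomic alterations by their association with a clinical endpoint such as progression (here a random forest and its feature importances), not the study's actual pipeline.

```python
# Generic illustration only: rank genomic alterations by association with a
# clinical endpoint using feature importances from a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_patients, n_alterations = 200, 50
X = rng.integers(0, 2, size=(n_patients, n_alterations))  # binary alteration calls
y = rng.integers(0, 2, size=n_patients)                    # progressed vs. not

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = np.argsort(model.feature_importances_)[::-1]      # most informative first
print("alterations most associated with progression:", ranked[:5].tolist())
```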

3.
Proc Natl Acad Sci U S A; 115(13): E2970-E2979, 2018 Mar 27.
Article in English | MEDLINE | ID: mdl-29531073

ABSTRACT

Cancer histology reflects underlying molecular processes and disease progression and contains rich phenotypic information that is predictive of patient outcomes. In this study, we present a computational approach for learning patient outcomes from digital pathology images, using deep learning to combine the power of adaptive machine learning algorithms with traditional survival models. We illustrate how these survival convolutional neural networks (SCNNs) can integrate information from both histology images and genomic biomarkers into a single unified framework to predict time-to-event outcomes, and we show prediction accuracy that surpasses the current clinical paradigm for predicting the overall survival of patients diagnosed with glioma. We use statistical sampling techniques to address challenges in learning survival from histology images, including tumor heterogeneity and the need for large training cohorts. We also provide insights into the prediction mechanisms of SCNNs, using heat map visualization to show that SCNNs recognize important structures, such as microvascular proliferation, that are related to prognosis and that are used by pathologists in grading. These results highlight the emerging role of deep learning in precision medicine and suggest an expanding utility for computational analysis of histology in the future practice of pathology.


Subject(s)
Brain Neoplasms/genetics; Brain Neoplasms/pathology; Genomics/methods; Glioma/genetics; Glioma/pathology; Histological Techniques/methods; Neural Networks, Computer; Algorithms; Brain Neoplasms/therapy; Glioma/therapy; Humans; Image Processing, Computer-Assisted; Precision Medicine; Prognosis
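A minimal sketch of a survival network in the spirit of the SCNN described in item 3, not the published implementation: a small CNN over a histology patch is concatenated with a genomic-marker vector and trained with a negative Cox partial likelihood so the scalar output behaves as a risk score. The SurvivalCNN name, layer sizes, and input shapes are illustrative assumptions.

```python
# Minimal sketch, not the published SCNN: CNN features from a histology patch
# concatenated with genomic markers, trained with a Cox partial likelihood.
import torch
import torch.nn as nn

class SurvivalCNN(nn.Module):
    def __init__(self, n_genomic: int):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.risk = nn.Linear(32 + n_genomic, 1)

    def forward(self, image, genomic):
        feats = torch.cat([self.cnn(image), genomic], dim=1)
        return self.risk(feats).squeeze(1)        # higher value = higher risk

def cox_partial_loss(risk, time, event):
    # Negative Cox partial log-likelihood over the batch (Breslow-style).
    order = torch.argsort(time, descending=True)  # risk sets by descending time
    risk, event = risk[order], event[order]
    log_cumsum = torch.logcumsumexp(risk, dim=0)
    return -((risk - log_cumsum) * event).sum() / event.sum().clamp(min=1)

# Toy usage with hypothetical shapes: 8 patients, 64x64 patches, 20 markers.
model = SurvivalCNN(n_genomic=20)
risk = model(torch.randn(8, 3, 64, 64), torch.randn(8, 20))
loss = cox_partial_loss(risk, torch.rand(8), torch.randint(0, 2, (8,)).float())
loss.backward()
```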
4.
Sci Rep; 7(1): 11707, 2017 Sep 15.
Article in English | MEDLINE | ID: mdl-28916782

ABSTRACT

Translating the vast data generated by genomic platforms into accurate predictions of clinical outcomes is a fundamental challenge in genomic medicine. Many prediction methods face limitations in learning from the high-dimensional profiles generated by these platforms and rely on experts to hand-select a small number of features for training prediction models. In this paper, we demonstrate how deep learning and Bayesian optimization methods that have been remarkably successful in general high-dimensional prediction tasks can be adapted to the problem of predicting cancer outcomes. We perform an extensive comparison of Bayesian-optimized deep survival models and other state-of-the-art machine learning methods for survival analysis, and we describe a framework for interpreting deep survival models using a risk backpropagation technique. Finally, we illustrate that deep survival models can successfully transfer information across diseases to improve prognostic accuracy. We provide an open-source software implementation of this framework, called SurvivalNet, that enables automatic training, evaluation, and interpretation of deep survival models.


Subject(s)
Deep Learning; Genomics/methods; Prognosis; Software; Survival; Bayes Theorem; Datasets as Topic; Humans; Neoplasms/genetics; Neoplasms/mortality; Neural Networks, Computer; Treatment Outcome
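A minimal sketch of the general idea, not the SurvivalNet package's API: a fully connected survival network over high-dimensional genomic features, with a "risk backpropagation"-style attribution obtained by taking the gradient of the predicted risk with respect to each input feature. In practice the layer widths, depth, and dropout shown here would be chosen by Bayesian optimization rather than fixed by hand.

```python
# Minimal sketch of the general idea, not the SurvivalNet API: a deep survival
# network over genomic features, interpreted by backpropagating the risk score.
import torch
import torch.nn as nn

net = nn.Sequential(                      # depth/width/dropout would normally
    nn.Linear(500, 128), nn.ReLU(),       # be selected by Bayesian optimization
    nn.Dropout(0.3),
    nn.Linear(128, 32), nn.ReLU(),
    nn.Linear(32, 1),                     # scalar risk score per patient
)

x = torch.randn(1, 500, requires_grad=True)   # one hypothetical genomic profile
risk = net(x).sum()
risk.backward()                                # backpropagate the risk score
feature_attribution = x.grad.squeeze()         # per-feature contribution to risk
top = feature_attribution.abs().topk(5).indices
print("most influential feature indices:", top.tolist())
```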
5.
IEEE Trans Image Process; 24(12): 5074-85, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26259084

ABSTRACT

Feature coding has received great attention in recent years as a building block of many image processing algorithms. In particular, the importance of the locality assumption in coding approaches has been studied in many previous works. We revisit this assumption and argue that using the similarity of data points to a more global set of anchor points does not necessarily weaken the coding method, as long as the underlying structure of the anchor points is taken into account. We propose to capture this underlying structure by assuming a random walker over the anchor points. We also show that our method is a fast approximation to the diffusion map kernel. Experiments on various data sets show that, given knowledge of the underlying structure of the anchor points, different state-of-the-art coding algorithms may boost their performance on different learning tasks by utilizing the proposed method.
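A hedged sketch of the idea as described in the abstract, not necessarily the authors' exact formulation: each sample is coded by its similarity to a set of anchor points, and that code is then diffused with a few random-walk steps over an anchor-point graph, which approximates a diffusion-map-style kernel on the anchors. The Gaussian similarity, bandwidth, and step count are assumptions for illustration.

```python
# Hedged sketch of the idea above, not the authors' exact algorithm: code each
# sample by anchor similarity, then diffuse via random-walk steps over anchors.
import numpy as np

def rw_coding(X, anchors, sigma=1.0, steps=3):
    # Gaussian similarity of each sample to each anchor point.
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
    S = np.exp(-d2 / (2 * sigma**2))                   # (n_samples, n_anchors)
    # Random-walk transition matrix over the anchor-point graph.
    A = np.exp(-((anchors[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
               / (2 * sigma**2))
    P = A / A.sum(axis=1, keepdims=True)
    # Diffuse the initial codes for a few steps.
    return S @ np.linalg.matrix_power(P, steps)

# Toy usage: 100 samples, 16 anchors, 8-dimensional features.
rng = np.random.default_rng(0)
codes = rw_coding(rng.normal(size=(100, 8)), rng.normal(size=(16, 8)))
print(codes.shape)   # (100, 16)
```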
