Results 1 - 16 of 16
1.
Environ Int ; 184: 108473, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38340404

ABSTRACT

Uncertainty in ammonia (NH3) emissions causes inaccuracy in simulations of fine particulate matter, which is associated with human health. To address this uncertainty, we employ the iterative finite difference mass balance (iFDMB) technique to revise NH3 emissions over East Asia using Cross-track Infrared Sounder (CrIS) satellite retrievals for July, August, and September 2019. Compared to the original emissions, the revised NH3 emissions show an increase in China, particularly in the North China Plain (NCP) region, corresponding to agricultural land use in July, August, and September, and a decrease in South Korea in September. The enhancement in NH3 emissions resulted in a remarkable increase in NH3 concentrations of 5 ppb. In July and September, ammonium (NH4+) and nitrate (NO3-) concentrations increased by 5 µg/m3, particularly in the NCP region, while in August both NH4+ and NO3- concentrations decreased. For sulfate (SO42-), concentrations in August and September decreased over most regions of China and Taiwan as a result of the production of ammonium sulfate; increased SO42- concentrations, however, were simulated over South Korea, Japan, and the southern region of Chengdu, caused by higher relative humidity (RH). In contrast, during July our simulations showed an increase in SO42- concentrations over most regions of China. To gain a more comprehensive understanding, we defined a sulfur conversion ratio ( [Formula: see text] ), which explains how changes in gas-phase sulfur affect changes in sulfate concentrations. A subsequent sensitivity analysis indicated the same relationship between changes in ammonia and their effect on inorganic fine particulate matter (PM2.5). This study highlights the challenge of controlling and managing inorganic PM2.5 and indicates that reducing the emissions of air pollutants does not necessarily lead to a reduction in their concentrations.
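As an illustration of the general idea behind such top-down emission updates (a sketch only; the paper's iterative finite-difference formulation differs in detail), mass-balance inversions rescale the prior emissions by the ratio of the observed to the modeled column until the two agree:

    E^{(k+1)} = E^{(k)} \times \frac{\Omega_{\mathrm{CrIS}}}{\Omega_{\mathrm{model}}^{(k)}},

where E is the NH3 emission rate, Omega_CrIS is the satellite-retrieved NH3 column, and Omega_model^(k) is the column simulated with the k-th emission estimate.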


Subject(s)
Air Pollutants, Ammonia, Humans, Ammonia/analysis, Particulate Matter/analysis, Air Pollutants/analysis, East Asia, China, Sulfates/analysis, Sulfur, Environmental Monitoring/methods
2.
Environ Pollut ; 334: 122223, 2023 Oct 01.
Article in English | MEDLINE | ID: mdl-37481031

ABSTRACT

Ozone concentrations in Houston, Texas, are among the highest in the United States, posing significant risks to human health. This study evaluated the impact of various emission sources and meteorological factors on ozone formation in Houston from 2017 to 2021 using a comprehensive PMF-SHAP approach. First, we distinguished the unique sources of VOCs in each area and identified differences in the local chemistry that affect ozone production. At the urban station, the primary sources were n_decane, biogenic/industrial/fuel evaporation, oil and gas flaring/production, industrial emissions/evaporation, and ethylene/propylene/aromatics. At the industrial site, the main sources were industrial emissions/evaporation, fuel evaporation, vehicle-related sources, oil and gas flaring/production, biogenic sources, aromatics, and ethylene/propylene. We then performed SHAP analysis to determine the importance and impact of each emission factor and meteorological variable. Shortwave radiation (SHAP values of ∼5.74 and ∼6.3 for Milby Park and Lynchburg, respectively) and humidity (∼4.87 and ∼4.71, respectively) were the most important variables at both sites. For the urban station, the most important emission sources were n_decane (∼2.96), industrial emissions/evaporation (∼1.89), and ethylene/propylene/aromatics (∼1.57), while for the industrial site they were oil and gas flaring/production (∼1.38), ethylene/propylene (∼1.26), and industrial emissions/evaporation (∼0.95). NOx had a negative impact on ozone production at the urban station due to the NOx-rich chemical regime, whereas it had a positive impact at the industrial site. The findings suggest that the PMF-SHAP approach is efficient, inexpensive, and transferable to other settings for identifying the factors that contribute to ozone-exceedance events. The results can be used to develop more effective air quality management strategies for Houston and other cities with high ozone levels.
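A rough sketch of how a PMF-SHAP style importance ranking can be computed is shown below; the file name, column layout, and model choice are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical table: one column per PMF source factor and meteorological
# variable (e.g., shortwave radiation, humidity), plus the hourly ozone target.
df = pd.read_csv("pmf_factors_and_met.csv")    # assumed file name and layout
X, y = df.drop(columns=["ozone"]), df["ozone"]

model = GradientBoostingRegressor().fit(X, y)  # any tree ensemble would do here
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)         # shape: (n_samples, n_features)

# Mean absolute SHAP value per feature gives a global importance ranking of the
# kind reported in the abstract (e.g., ~5.74 for shortwave radiation).
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False))
```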


Subject(s)
Air Pollutants, Ozone, Volatile Organic Compounds, Humans, Ozone/analysis, Air Pollutants/analysis, Texas, Meteorology, Ethylenes/analysis, Machine Learning, Environmental Monitoring/methods, Volatile Organic Compounds/analysis, China, Vehicle Emissions/analysis
3.
Sci Total Environ ; 891: 164694, 2023 Sep 15.
Article in English | MEDLINE | ID: mdl-37290661

ABSTRACT

Since the outbreak of the COVID-19 pandemic, many studies using computational fluid dynamics (CFD) have focused on the dynamics of air masses, which are believed to carry respiratory pathogens, in enclosed indoor environments. Although outdoor air may seem to pose a smaller exposure risk, it does not necessarily provide adequate ventilation, which varies with micro-climate settings. To assess the fluid dynamics of outdoor environments and the efficiency of outdoor ventilation, we simulated the outdoor transmission of a sneeze plume in "hot spots", areas in which the air is not quickly ventilated. We began by simulating the airflow over buildings at the University of Houston using an OpenFOAM computational fluid dynamics solver driven by the 2019 seasonal atmospheric velocity profile from an on-site station. Next, we defined a new variable to quantify how long it takes for incoming fresh air to replace the existing air in the domain and used it to select the hot spots. Finally, we conducted a large-eddy simulation of a sneeze in outdoor conditions and then simulated a sneeze plume and particles in a hot spot. The results show that fresh incoming air takes as long as 1000 s to ventilate the hot-spot area in some regions of the campus. We also found that even a slight upward wind causes a sneeze plume to dissipate almost instantaneously at lower elevations. However, downward wind provides a stable condition for the plume, and forward wind can carry a plume beyond six feet, the recommended social distance for preventing infection. Additionally, the simulation of sneeze droplets shows that the majority of the particles adhere to the ground or body immediately, but airborne particles can be transported more than six feet, even in minimal ambient airflow.
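For intuition about what an air-replacement (ventilation) time means, a zero-dimensional, well-mixed analogy is sketched below; the paper resolves this quantity spatially with CFD, and the volume and flux used here are made-up numbers, not values from the study.

```python
import numpy as np

# Well-mixed zone: a tracer decays as C(t) = C0 * exp(-(Q/V) * t), so the time
# to flush a given fraction of the original air is -(V/Q) * ln(fraction_left).
V = 5.0e4          # hypothetical hot-spot air volume, m^3
Q = 50.0           # hypothetical fresh-air volume flux through the zone, m^3/s
fraction_left = 0.05
t_flush = -(V / Q) * np.log(fraction_left)
print(f"~{t_flush:.0f} s to replace 95% of the original air")
```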


Subject(s)
Indoor Air Pollution, COVID-19, Humans, Indoor Air Pollution/analysis, Pandemics, COVID-19/epidemiology, Computer Simulation, Wind
4.
Environ Pollut ; 306: 119419, 2022 Aug 01.
Article in English | MEDLINE | ID: mdl-35526647

ABSTRACT

Vegetation plays an important role both as a sink of air pollutants via dry deposition and as a source of biogenic VOC (BVOC) emissions, which often provide the precursors of air pollutants. To identify the vegetation-driven offset between the deposition and formation of air pollutants, this study examines the responses of ozone and PM2.5 concentrations to changes in the leaf area index (LAI) over East Asia and its neighboring seas, using up-to-date satellite-derived LAI and green vegetation fraction (GVF) products. Two LAI scenarios, (1) table-prescribed LAI and GVF from 1992-1993 AVHRR and 2001 MODIS products and (2) reprocessed 2019 MODIS LAI and 2019 VIIRS GVF products, were used in WRF-CMAQ modeling to simulate ozone and PM2.5 concentrations for June 2019. The use of up-to-date LAI and GVF products resulted in monthly mean LAI differences ranging from -56.20% to 96.81% over the study domain. The increase in LAI produced differences in hourly mean ozone and PM2.5 concentrations ranging from 0.27 to -7.17 ppbV and from 0.89 to -2.65 µg/m3 over inland areas, and from 0.69 to -2.86 ppbV and from 3.41 to -7.47 µg/m3 over the adjacent sea surface. The decreases in inland ozone and PM2.5 concentrations mainly resulted from dry deposition accelerated by increases in LAI, which outweighed ozone and PM2.5 formation via BVOC-driven chemistry. Some inland regions showed further decreases in PM2.5 concentrations due to reduced reactions of PM2.5 precursors with hydroxyl radicals depleted by BVOCs. The reductions in sea-surface ozone and PM2.5 concentrations accompanied the reductions in upwind inland regions, which led to less ozone and PM2.5 inflow. The results suggest the importance of the careful selection of vegetation parameters for air quality modeling.


Subject(s)
Air Pollutants, Air Pollution, Ozone, Air Pollutants/analysis, Air Pollution/analysis, Environmental Monitoring/methods, Ozone/analysis, Particulate Matter/analysis, Plant Leaves/chemistry
5.
Methods Mol Biol ; 2432: 167-185, 2022.
Article in English | MEDLINE | ID: mdl-35505215

ABSTRACT

High-throughput assays have been developed to measure DNA methylation, among which bisulfite-based sequencing (BS-seq) and microarray technologies are the most popular for genome-wide profiling. A major goal in DNA methylation analysis is the detection of genomic regions that are differentially methylated between two conditions. Many state-of-the-art methods have been proposed for this task in the past few years, though only a handful of them can analyze both types of data (BS-seq and microarray). Moreover, covariates such as sex and age are known to influence DNA methylation, so it is important to adjust for their effects in differential methylation analysis. In this chapter, we describe a Bayesian curve credible bands approach and the accompanying software, BCurve, for detecting differentially methylated regions in data generated from either microarrays or BS-seq. The unified theme underlying the analysis of these two different types of data is a model that accounts for correlation between DNA methylation at nearby sites, covariates, and between-sample variability. The BCurve R package also provides tools for simulating both microarray and BS-seq data, which can facilitate comparisons of methods against the known "gold standard" in the simulated data. We provide a detailed description of the main functions in BCurve and demonstrate the utility of the package by analyzing simulated data from both platforms generated with the functions provided in the package. Analyses of two real datasets, one from BS-seq and one from microarray, are also included to further illustrate the capability of BCurve.


Subject(s)
DNA Methylation, Software, Bayes Theorem, Genomics, DNA Sequence Analysis/methods
6.
Atmos Res ; 270: 1-14, 2022 Jun 01.
Article in English | MEDLINE | ID: mdl-35370333

ABSTRACT

To investigate changes in the ozone (O3) chemical production regime over the contiguous United States (CONUS) with accurate knowledge of the concentrations of its precursors, we applied an inverse modeling technique with Ozone Monitoring Instrument (OMI) tropospheric nitrogen dioxide (NO2) and total formaldehyde (HCHO) retrieval products for the summers of 2011, 2014, and 2017, the base years of the United States National Emissions Inventory. The inclusion of dynamic chemical lateral boundary conditions and lightning-induced nitric oxide emissions largely accounts for the contribution of background sources in the free troposphere. Satellite-constrained nitrogen oxide (NOx) and non-methane volatile organic compound (NMVOC) emissions reduce the discrepancy between satellite and modeled columns: the inversion suggested 2.33-2.84 (1.07-1.34) times higher NOx emissions over the CONUS (over urban regions) and 0.28-0.81 times lower NMVOC emissions over the southeastern United States. The model-derived HCHO/NO2 column ratio shows gradual spatial changes in the O3 production regime near urban cores relative to previously defined threshold values separating NOx-sensitive and VOC-sensitive conditions. We also found apparent shifts from the NOx-saturated regime to the transition regime (or from the transition regime to the NOx-limited regime) over major cities in the western United States. In contrast, rural areas, especially in the east-southeastern United States, exhibit a change in the HCHO/NO2 column ratio of -1.30 ± 1.71, with a reduction in the HCHO column primarily driven by meteorology, becoming more sensitive to VOC emissions. The results show that incorporating satellite observations into numerical modeling can help policymakers implement appropriate emission control policies for O3 pollution.
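A minimal sketch of how an HCHO/NO2 column ratio can be turned into a regime map is shown below; the 1.0 and 2.0 cutoffs are commonly cited illustrative thresholds, not the values used in this study.

```python
import numpy as np

def o3_regime(hcho_col, no2_col, low=1.0, high=2.0):
    """Classify the O3 production regime from gridded HCHO and NO2 columns.
    Thresholds are illustrative placeholders for the regime transition."""
    fnr = hcho_col / no2_col
    regime = np.full(fnr.shape, "transition", dtype=object)
    regime[fnr < low] = "NOx-saturated (VOC-limited)"
    regime[fnr > high] = "NOx-limited"
    return fnr, regime

# Example with a toy 2x2 grid of column amounts (arbitrary units).
fnr, regime = o3_regime(np.array([[0.8, 1.5], [2.5, 3.0]]),
                        np.array([[1.0, 1.0], [1.0, 1.0]]))
print(fnr, regime, sep="\n")
```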

7.
Brief Bioinform ; 20(4): 1205-1214, 2019 07 19.
Article in English | MEDLINE | ID: mdl-29091999

ABSTRACT

How chromosomes fold and how distal genomic elements interact with one another at a genomic scale have been actively pursued over the past decade, following the seminal work describing the Chromosome Conformation Capture (3C) assay. Essentially, 3C-based technologies produce two-dimensional (2D) contact maps that capture interactions between genomic fragments. Accordingly, a plethora of analytical methods have been proposed that take a 2D contact map as input and recapitulate the underlying genome-wide three-dimensional (3D) structure of the chromatin. However, their performance with respect to several factors, including data resolution and the ability to handle contact map features, has not been sufficiently evaluated. This task is taken up in this article, in which we consider several recent and/or well-regarded methods, both optimization-based and model-based, for their ability to produce 3D structures from contact maps generated from a population of cells. These methods are evaluated and compared using both simulated and real data, according to several criteria. For simulated data sets, the focus is on accurate recapitulation of the entire structure given the existence of a gold standard. For real data sets, comparison with distances measured by fluorescence in situ hybridization and consistency with several genomic features of known biological function are examined.


Subject(s)
Chromatin/chemistry, Chromatin/genetics, Animals, Chromatin/ultrastructure, Human Chromosomes/chemistry, Human Chromosomes/genetics, Human Chromosomes/ultrastructure, Computational Biology/methods, Computer Simulation, Genetic Databases, Human Genome, Humans, Three-Dimensional Imaging/methods, Fluorescence In Situ Hybridization, Mice, Genetic Models, Molecular Conformation
8.
Biometrics ; 73(1): 52-62, 2017 03.
Article in English | MEDLINE | ID: mdl-27214023

ABSTRACT

A gene may be controlled by distal enhancers and repressors, not merely by regulatory elements in its promoter. Spatial organization of chromosomes is the mechanism that brings genes and their distal regulatory elements into close proximity. Recent molecular techniques, coupled with Next Generation Sequencing (NGS) technology, enable genome-wide detection of physical contacts between distant genomic loci. In particular, Hi-C is an NGS-aided assay for the study of genome-wide spatial interactions. The availability of such data makes it possible to reconstruct the underlying three-dimensional (3D) spatial chromatin structure. In this article, we present the Poisson Random effect Architecture Model (PRAM) for such an inference. The main feature of PRAM that separates it from previous methods is that it addresses the issue of over-dispersion and takes correlations among contact counts into consideration, thereby achieving greater consistency with observed data. PRAM was applied to Hi-C data to illustrate its performance and to compare the predicted distances with those measured by a Fluorescence In Situ Hybridization (FISH) validation experiment. Further, PRAM was compared to other methods in the literature based on both real and simulated data.
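For readers unfamiliar with this model class, a generic form of the Poisson contact-count model used for 3D reconstruction is sketched below; PRAM's exact random-effect structure and priors are given in the paper.

    c_{ij} \sim \mathrm{Poisson}(\lambda_{ij}), \qquad \log \lambda_{ij} = \alpha + \beta \log \lVert \mathbf{s}_i - \mathbf{s}_j \rVert_2 + u_{ij}, \qquad \beta < 0,

where c_ij is the Hi-C contact count between loci i and j, s_i are the 3D coordinates to be estimated, and the random effects u_ij absorb over-dispersion and correlation among counts.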


Subject(s)
Chromatin/chemistry, Biological Models, Statistical Models, Spatial Analysis, Gene Expression Regulation, Fluorescence In Situ Hybridization, Poisson Distribution
9.
J Interv Cardiol ; 29(2): 216-24, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26927366

ABSTRACT

OBJECTIVES: To compare outcomes and rates of optimal stent placement between optical coherence tomography (OCT)- and intravascular ultrasound (IVUS)-guided percutaneous coronary intervention (PCI). BACKGROUND: Unlike IVUS-guided PCI, rates of clinical outcomes and optimal stent placement have not been well characterized for OCT-guided PCI. METHODS: The study enrolled 290 patients who underwent implantation of a second-generation drug-eluting stent under OCT (122 patients) or IVUS (168 patients) guidance. The two groups were compared after adjusting for baseline differences using 1:1 propensity score matching (PSM) (114 patients in each group). Optimal stent placement was defined as achieving an adequate lumen (optimal minimum stent area [MSA > 4.85 mm² for OCT, > 5 mm² for IVUS] or a final MSA ≥ 90% of the distal reference lumen area, without edge dissection, incomplete stent apposition, or tissue prolapse), or otherwise performing additional interventions to address suboptimal post-stenting OCT or IVUS findings. The primary endpoint was the one-year cumulative incidence of major adverse cardiac events (MACE; cardiac death, myocardial infarction, and target lesion revascularization). Definite or probable stent thrombosis (ST) rates were also evaluated. RESULTS: In adjusted comparisons between the OCT and IVUS groups, there was no significant difference in rates of MACE (3.5% vs. 3.5%, P = 1.000) and ST (0% vs. 0.9%, P = 1.000) at 1 year, optimal stent placement (89.5% vs. 92.1%, P = 0.492), or further intervention (7.9% vs. 13.2%, P = 0.234), despite OCT detecting tissue prolapse significantly more frequently (97.4% vs. 47.4%, P < 0.001) and detecting numerically more edge dissection (10.5% vs. 4.4%, P = 0.078) and incomplete stent apposition (48.2% vs. 36.8%, P = 0.082). CONCLUSIONS: OCT guidance showed mid-term clinical outcomes comparable to IVUS guidance, suggesting that OCT can be an alternative tool for optimizing stent placement.
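A minimal sketch of 1:1 propensity-score matching of the kind used here (greedy nearest-neighbor on the logit of the score, with replacement; the caliper and covariate set are illustrative assumptions, not the study's specification):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_1to1(X, treated, caliper=0.2):
    """X: (n, p) baseline covariates; treated: 0/1 group indicator (e.g., OCT = 1).
    Returns (treated_index, control_index) pairs within the caliper."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    logit = np.log(ps / (1 - ps))
    cal = caliper * logit.std()                 # caliper as a fraction of the logit SD
    t_idx = np.flatnonzero(treated == 1)
    c_idx = np.flatnonzero(treated == 0)
    nn = NearestNeighbors(n_neighbors=1).fit(logit[c_idx].reshape(-1, 1))
    dist, pos = nn.kneighbors(logit[t_idx].reshape(-1, 1))
    return [(t, c_idx[p[0]]) for t, d, p in zip(t_idx, dist, pos) if d[0] <= cal]
```

Outcomes (MACE, ST) are then compared within the matched pairs.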


Subject(s)
Coronary Artery Disease/surgery, Drug-Eluting Stents/adverse effects, Percutaneous Coronary Intervention/methods, Optical Coherence Tomography/methods, Interventional Ultrasonography/methods, Aged, Coronary Angiography/methods, Female, Humans, Incidence, Male, Middle Aged, Percutaneous Coronary Intervention/adverse effects, Propensity Score, Retrospective Studies, Treatment Outcome, Interventional Ultrasonography/adverse effects
10.
Clin Cardiol ; 39(5): 276-84, 2016 May.
Article in English | MEDLINE | ID: mdl-27028303

ABSTRACT

BACKGROUND: Despite the improved long-term safety of biodegradable polymer (BP) drug-eluting stents (DES) compared with first-generation durable polymer (DP) DES, data on the safety and efficacy of BP-DES compared with second-generation (2G) DP-DES in patients with acute myocardial infarction (AMI) are limited. HYPOTHESIS: To evaluate the safety and efficacy of BP-DES compared with 2G-DP-DES in the higher stent thrombosis (ST) risk setting of AMI. METHODS: A total of 3359 AMI patients who received either BP-DES (n = 261) or 2G-DP-DES (n = 3098) were included from the Korea Acute Myocardial Infarction Registry (KAMIR). Differences in baseline clinical and angiographic characteristics were adjusted using a 1:5 propensity score matching analysis (n = 261 for BP-DES and n = 1305 for 2G-DP-DES). The primary outcome was the incidence of major adverse cardiac events (MACE), including all-cause death, recurrent myocardial infarction (re-MI), and target vessel revascularization (TVR). The rate of definite or probable ST was also investigated. RESULTS: In the adjusted analysis, there was no significant difference between the two groups in baseline clinical and angiographic characteristics; 2-year MACE (10.7% vs. 9.9% in the BP-DES and 2G-DP-DES groups, respectively; P = 0.679); ST incidence (0.8% vs. 0.9%; P = 1.0); or rates of all-cause death, re-MI, and TVR. By multivariate analysis, old age, diabetes mellitus, renal dysfunction, and left ventricular dysfunction were independent predictors of MACE after BP-DES or 2G-DP-DES implantation. CONCLUSIONS: BP-DES and 2G-DP-DES appear to have comparable 2-year safety and efficacy for the treatment of AMI. However, longer-term follow-up is needed.


Subject(s)
Absorbable Implants, Drug-Eluting Stents, Myocardial Infarction/therapy, Percutaneous Coronary Intervention/instrumentation, Polymers, Aged, Chi-Square Distribution, Coronary Angiography, Coronary Thrombosis/etiology, Female, Humans, Kaplan-Meier Estimate, Logistic Models, Male, Middle Aged, Multivariate Analysis, Myocardial Infarction/diagnostic imaging, Myocardial Infarction/mortality, Percutaneous Coronary Intervention/adverse effects, Percutaneous Coronary Intervention/mortality, Propensity Score, Proportional Hazards Models, Prospective Studies, Prosthesis Design, Recurrence, Registries, Republic of Korea, Risk Factors, Time Factors, Treatment Outcome
11.
BMC Bioinformatics ; 17: 70, 2016 Feb 06.
Article in English | MEDLINE | ID: mdl-26852142

ABSTRACT

BACKGROUND: Assays capable of detecting genome-wide chromatin interactions have produced massive amounts of data and greatly advanced understanding of three-dimensional (3D) chromosomal structure. As the technology becomes more sophisticated, data of ever higher resolution are being produced, going from the initial 1 Megabase (Mb) resolution to the current 10 Kilobases (Kb) or even 1 Kb. The availability of genome-wide interaction data necessitates the development of analytical methods to recover the underlying 3D spatial chromatin structure, but challenges abound. Most methods were proposed for analyzing data at low resolution (1 Mb), so their behavior on higher-resolution data is unknown. For such data, one of the key features is the high proportion of "0" contact counts among all available data, in other words, an excess of zeros. RESULTS: To address the excess of zeros, we propose a truncated Random effect EXpression (tREX) method that can handle data at various resolutions. We then assess the performance of tREX and a number of leading existing methods for recovering the underlying chromatin 3D structure. This was accomplished by creating in-silico data that mimic multiple levels of resolution and subjecting the methods to a "stress test". Finally, we applied tREX and the comparison methods to a Hi-C dataset for which FISH measurements are available to evaluate estimation accuracy. CONCLUSION: The proposed tREX method achieves consistently good performance in all 30 simulated settings considered. It is not only robust to resolution level and underlying parameters, but also insensitive to model misspecification. This conclusion is based on observations of 3D structure estimation accuracy and preservation of topologically associated domains. Application of the methods to the human lymphoblastoid cell line data on chromosomes 14 and 22 further substantiates the superior performance of tREX: the 3D structure constructed by tREX is consistent with the FISH measurements, and the corresponding distances predicted by tREX have higher correlation with the FISH measurements than any of the comparison methods. SOFTWARE: An open-source R package is available at http://www.stat.osu.edu/~statgen/Software/tRex.


Subject(s)
Chromatin/chemistry, Human Chromosomes/chemistry, Computer Simulation, Lymphocytes/chemistry, Theoretical Models, Software, Cultured Cells, Human Genome, Humans, Fluorescence In Situ Hybridization
12.
Comput Struct Biotechnol J ; 13: 366-9, 2015.
Article in English | MEDLINE | ID: mdl-26106460

ABSTRACT

BOG (Bacterium and virus analysis of Orthologous Groups) is a package for identifying groups of differentially regulated genes in light of gene function for various virus and bacteria genomes. It is designed to identify Clusters of Orthologous Groups (COGs) that are enriched among genes that have undergone significant changes under different conditions. This contributes to the detection of pathogens, a research area relevant, among other things, to uncovering bioterrorism. The statistical analyses include hypergeometric tests, Mann-Whitney rank-sum tests, and gene set enrichment analysis. Results are organized and presented in tabular and graphical forms for ease of understanding and dissemination. BOG is implemented as an R package, which is available from CRAN or can be downloaded from http://www.stat.osu.edu/~statgen/SOFTWARE/BOG/.
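A minimal sketch of the hypergeometric (over-representation) test for a single COG category, in the spirit of the first of these analyses; the counts below are made up for illustration.

```python
from scipy.stats import hypergeom

def cog_enrichment_p(M, n, N, k):
    """One-sided over-representation p-value P(X >= k).
    M: all annotated genes, n: genes in the COG category,
    N: differentially regulated genes, k: overlap between the two sets."""
    return hypergeom.sf(k - 1, M, n, N)

print(cog_enrichment_p(M=4000, n=120, N=300, k=22))   # illustrative numbers only
```

The Mann-Whitney and gene set enrichment analyses play an analogous role for rank-based evidence.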

13.
IEEE Trans Image Process ; 24(3): 1101-14, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25532185

ABSTRACT

Being able to predict the degree of visual discomfort felt when viewing stereoscopic 3D (S3D) images is an important step toward ameliorating causative factors such as excessive horizontal disparity, misalignments or mismatches between the left and right views of stereo pairs, or conflicts between different depth cues. Ideally, such a model should account for factors such as capture and viewing geometries, the distribution of disparities, and the responses of visual neurons. When viewing modern 3D displays, visual discomfort is caused primarily by changes in binocular vergence while accommodation is held fixed at the viewing distance to a flat 3D screen. This results in unnatural mismatches between ocular fixation and ocular focus that do not occur in normal direct 3D viewing. This accommodation-vergence conflict can cause adverse effects such as headaches, fatigue, eye strain, and reduced visual ability. Binocular vision is ultimately realized by means of neural mechanisms that subserve the sensorimotor control of eye movements. Recognizing that neuronal responses are directly implicated in both the control and the experience of 3D perception, we have developed a model-based neuronal and statistical framework called the 3D visual discomfort predictor (3D-VDP) that automatically predicts the level of visual discomfort experienced when viewing S3D images. 3D-VDP extracts two types of features: 1) coarse features derived from the statistics of binocular disparities and 2) fine features derived by estimating the neural activity associated with the processing of horizontal disparities. In particular, we deploy a model of horizontal disparity processing in the extrastriate middle temporal region of the occipital lobe. We compare the performance of 3D-VDP with other recent discomfort prediction algorithms with respect to correlation against recorded subjective visual discomfort scores, and show that 3D-VDP is statistically superior to the other methods.


Subject(s)
Three-Dimensional Imaging/adverse effects, Neurological Models, Photic Stimulation/adverse effects, Vision Disorders/physiopathology, Visual Perception/physiology, Adult, Brain/physiology, Humans, Young Adult
14.
IEEE Trans Image Process ; 23(12): 5428-39, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25350928

ABSTRACT

The tremendous explosion of image-, video-, and audio-enabled mobile devices, such as tablets and smartphones, in recent years has led to a dramatic increase in the volume of captured and distributed multimedia content. In particular, the number of digital photographs captured annually is approaching 100 billion in the U.S. alone. These pictures are increasingly being acquired by inexperienced, casual users under highly diverse conditions, leading to a plethora of distortions, including blur induced by camera shake. In order to automatically detect, correct, or cull images impaired by shake-induced blur, it is necessary to develop distortion models specific to and suitable for assessing the sharpness of camera-shaken images. Toward this goal, we have developed a no-reference framework for automatically predicting the perceptual quality of camera-shaken images based on their spectral statistics. Two kinds of features are defined that capture blur induced by camera shake. The first is a directional feature, which measures the variation of the image spectrum across orientations. The second captures the shape, area, and orientation of the spectral contours of camera-shaken images. We demonstrate the performance of an algorithm derived from these features on new and existing databases of images distorted by camera shake.
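A rough sketch of the kind of directional spectral feature described (orientation-binned log-magnitude spectrum and its variation across orientations); the binning and statistic here are illustrative, not the paper's exact definition.

```python
import numpy as np

def directional_spectral_variation(img, n_bins=12):
    """img: 2D grayscale array. Returns the variance of mean log-spectral
    energy across orientation bins; camera-shake blur makes the spectrum
    anisotropic, which tends to inflate this value."""
    F = np.fft.fftshift(np.fft.fft2(img))
    mag = np.log1p(np.abs(F))
    h, w = img.shape
    y, x = np.indices((h, w))
    theta = np.arctan2(y - h // 2, x - w // 2) % np.pi       # orientation in [0, pi)
    bins = np.clip((theta / np.pi * n_bins).astype(int), 0, n_bins - 1)
    energy = np.array([mag[bins == b].mean() for b in range(n_bins)])
    return energy.var()
```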

15.
Bioinformatics ; 30(24): 3567-74, 2014 Dec 15.
Article in English | MEDLINE | ID: mdl-25178460

ABSTRACT

MOTIVATION: DNA methylation is an epigenetic change occurring in genomic CpG sequences that contributes to the regulation of gene transcription in both normal and malignant cells. Next-generation sequencing has been used to characterize DNA methylation status at the genome scale, but it suffers from high sequencing cost in the case of whole-genome bisulfite sequencing, or from reduced resolution (the inability to define precisely which CpGs are methylated) with capture-based techniques. RESULTS: Here we present a computational method that computes nucleotide-resolution methylation values from capture-based data by incorporating fragment-length profiles into a model of methylation analysis. We demonstrate that it compares favorably with nucleotide-resolution bisulfite sequencing and has better predictive power with respect to a reference than window-based methods often used for enrichment data. The method was used to produce the methylation data that, in tandem with gene expression, yielded a novel and clinically significant gene signature in acute myeloid leukemia. In addition, we introduce a complementary statistical method that uses these nucleotide-resolution methylation data to detect differentially methylated features.


Subject(s)
DNA Methylation, High-Throughput Nucleotide Sequencing/methods, DNA Sequence Analysis/methods, Algorithms, CpG Islands, Genomics/methods, Humans, Acute Myeloid Leukemia/genetics, Nucleotides/metabolism, Sulfites
16.
IEEE Trans Image Process ; 22(2): 610-20, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23008260

ABSTRACT

It is generally recognized that severe video distortions that are transient in space and/or time have a large effect on overall perceived video quality. To understand this phenomenon, we study the distribution of spatio-temporally local quality scores obtained from several video quality assessment (VQA) algorithms on videos suffering from compression and lossy transmission over communication channels. We propose a content-adaptive spatial and temporal pooling strategy based on the observed distribution. Our method adaptively emphasizes the "worst" scores along both the spatial and temporal dimensions of a video sequence and also considers the perceptual effect of large-area cohesive motion flow such as egomotion. We demonstrate the efficacy of the method by testing it with three different VQA algorithms on the LIVE Video Quality database and the EPFL-PoliMI video quality database.
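A minimal sketch of worst-percentile spatio-temporal pooling, the basic idea behind emphasizing the "worst" scores; the percentile values are placeholders, lower local scores are assumed to mean worse quality, and the paper's content-adaptive scheme additionally accounts for motion.

```python
import numpy as np

def worst_percentile_pool(local_scores, spatial_pct=5, temporal_pct=10):
    """local_scores: (n_frames, n_blocks) array of local quality scores."""
    frame_scores = []
    for frame in local_scores:                              # spatial pooling
        k = max(1, int(frame.size * spatial_pct / 100))
        frame_scores.append(np.sort(frame)[:k].mean())      # mean of the worst k blocks
    frame_scores = np.sort(frame_scores)                    # temporal pooling
    k = max(1, int(frame_scores.size * temporal_pct / 100))
    return frame_scores[:k].mean()
```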
