Results 1 - 20 of 35
1.
Genes Genomics ; 2024 May 11.
Article in English | MEDLINE | ID: mdl-38733519

ABSTRACT

BACKGROUND: Exposure to particulate matter (PM) and house dust mite (HDM) can change the expression patterns of inflammation-, oxidative stress-, and cell death-related genes. OBJECTIVE: This study examined the changes in gene expression patterns following PM exposure. METHODS: We searched for differentially expressed genes (DEGs) following PM exposure using five cell line-based RNA-seq or microarray datasets and six human-derived datasets, and assessed the enrichment terms of the DEGs. RESULTS: DEG analysis yielded two gene sets. Enrichment analysis was performed for each gene set, and enrichment terms related to respiratory diseases were identified. The intersection of the six human-derived datasets and the two gene sets was obtained, and the expression patterns following PM exposure were examined. CONCLUSIONS: Two gene sets were obtained from cells treated with PM, and their expression patterns were verified in human-derived cells. Our findings suggest that exposure to PM2.5 and HDM may alter the expression of genes associated with diseases such as allergies, highlighting the importance of mitigating PM2.5 and HDM exposure for disease prevention.
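The dataset-intersection step described in this abstract can be illustrated with a minimal sketch; the gene symbols and set contents below are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of intersecting DEG sets across datasets (hypothetical gene symbols).
cell_line_deg_sets = [
    {"IL6", "CXCL8", "HMOX1", "TNF"},      # e.g., DEGs from one PM-treated cell line dataset
    {"IL6", "CXCL8", "NQO1", "CASP3"},     # e.g., DEGs from another dataset
]

human_derived_deg_sets = [
    {"IL6", "CXCL8", "HMOX1"},
    {"IL6", "CXCL8", "TNF", "NQO1"},
]

# Genes shared by every cell line-based gene set.
core_gene_set = set.intersection(*cell_line_deg_sets)

# Genes from the core set that are also differentially expressed in every human-derived dataset.
verified_genes = core_gene_set.intersection(*human_derived_deg_sets)

print(sorted(verified_genes))  # e.g., ['CXCL8', 'IL6']
```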

2.
Discov Oncol ; 15(1): 64, 2024 Mar 06.
Article in English | MEDLINE | ID: mdl-38443516

ABSTRACT

BACKGROUND: This study aimed to assess the effects of consolidative high-dose radiotherapy on clinical outcomes in patients with localized metastatic non-small cell lung cancer (NSCLC) who showed a favorable tumor response after systemic treatment. METHODS: We retrospectively reviewed the medical records of 83 patients with localized metastatic NSCLC who received systemic therapy followed by consolidative local radiotherapy at the Korea University Guro Hospital between March 2017 and June 2022. Localized metastatic disease was defined as the presence of one to three metastatic sites at the time of diagnosis. Patients who showed a favorable tumor response after systemic treatment, including those with oligo-progressive disease at a thoracic site amenable to curative high-dose local radiotherapy, were included. The planned total dose and fraction size depended mainly on the location of the lesions. RESULTS: The median follow-up time after consolidative radiotherapy was 16 months (range: 5-52 months). The overall 2-year progression-free survival rate was 81.4%. Of the 83 patients, only four (4.3%), treated with intensity-modulated radiation therapy, showed in-field local recurrence. Notably, only one local failure occurred among the 20 patients with oligo-progressive disease at a thoracic site on tumor response evaluation after systemic treatment. Regarding treatment-related pulmonary toxicity, three patients developed grade-3 and one patient developed grade-4 radiation pneumonitis. CONCLUSIONS: If the disease is sufficiently controlled and localized by systemic therapy, local consolidative radiotherapy appears to improve local control rates with acceptable treatment-related toxicity in patients with localized metastatic NSCLC, especially those with oligo-progressive disease.
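A 2-year progression-free survival rate such as the 81.4% reported here is normally read off a Kaplan-Meier estimate. A minimal sketch with the lifelines library, using synthetic follow-up times rather than the study's records:

```python
# Sketch: Kaplan-Meier estimate of 2-year progression-free survival (synthetic data).
from lifelines import KaplanMeierFitter

# Months from consolidative radiotherapy to progression or censoring (hypothetical values).
durations = [5, 8, 12, 16, 20, 24, 30, 36, 40, 52]
# 1 = progression observed, 0 = censored at last follow-up.
events = [1, 0, 1, 0, 0, 0, 1, 0, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="PFS")

# Survival probability at 24 months (the 2-year progression-free survival rate).
pfs_24m = kmf.survival_function_at_times(24).iloc[0]
print(f"2-year PFS: {pfs_24m:.1%}")
```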

4.
J Thorac Imaging ; 39(2): 79-85, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-37889567

ABSTRACT

PURPOSE: This study aimed to determine the association between functional impairment in small airways and symptoms of dyspnea in patients with Long-coronavirus disease (COVID), using imaging and computational modeling analysis. PATIENTS AND METHODS: Thirty-four patients with Long-COVID underwent thoracic computed tomography and hyperpolarized Xenon-129 magnetic resonance imaging (HP Xe MRI) scans. Twenty-two answered dyspnea-12 questionnaires. We used a computed tomography-based full-scale airway network (FAN) flow model to simulate pulmonary ventilation. The ventilation distribution projected on a coronal plane and the percentage lobar ventilation modeled in the FAN model were compared with the HP Xe MRI data. To assess the ventilation heterogeneity in small airways, we calculated the fractal dimensions of the impaired ventilation regions in the HP Xe MRI and FAN models. RESULTS: The ventilation distribution projected on a coronal plane showed an excellent resemblance between HP Xe MRI scans and FAN models (structure similarity index: 0.87 ± 0.04). In both the image and the model, the existence of large clustered ventilation defects was not identifiable regardless of dyspnea severity. The percentage lobar ventilation of the HP Xe MRI and FAN model showed a strong correlation (ρ = 0.63, P < 0.001). The difference in the fractal dimension of impaired ventilation zones between the low and high dyspnea-12 score groups was significant (HP Xe MRI: 1.97 [1.89 to 2.04] and 2.08 [2.06 to 2.14], P = 0.005; FAN: 2.60 [2.59 to 2.64] and 2.64 [2.63 to 2.65], P = 0.056). CONCLUSIONS: This study has identified a potential association of small airway functional impairment with breathlessness in Long-COVID, using fractal analysis of HP Xe MRI scans and FAN models.
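The fractal dimension of impaired-ventilation regions referred to in this abstract is commonly estimated by box counting on a binary defect mask; the numpy sketch below illustrates the general idea and is not the authors' implementation.

```python
# Sketch: box-counting fractal dimension of a 2D binary ventilation-defect mask.
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate fractal dimension as the slope of log N(s) vs log(1/s)."""
    counts = []
    for s in box_sizes:
        n_boxes = 0
        for i in range(0, mask.shape[0], s):
            for j in range(0, mask.shape[1], s):
                if mask[i:i + s, j:j + s].any():  # box contains at least one defect pixel
                    n_boxes += 1
        counts.append(n_boxes)
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Toy example: random defect mask (a real analysis would use segmented MRI or model output).
rng = np.random.default_rng(0)
mask = rng.random((128, 128)) > 0.7
print(f"Estimated fractal dimension: {box_counting_dimension(mask):.2f}")
```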


Subject(s)
Post-Acute COVID-19 Syndrome , Xenon Isotopes , Humans , Lung/diagnostic imaging , Lung/pathology , Respiration , Magnetic Resonance Imaging/methods , Dyspnea/diagnostic imaging
5.
Nutrition ; 119: 112304, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38154397

ABSTRACT

OBJECTIVE: Optical spectroscopy-measured skin carotenoid status (SCS) has been validated for estimating fruit and vegetable (F&V) intake; however, there is limited research addressing SCS kinetics in whole-diet interventions. The aim of this controlled feeding trial was to explore the response of SCS to changes in carotenoid intake via a whole-diet intervention, evaluating its potential as a biomarker. METHODS: Eighty participants aged 20 to 49 y, without underlying diseases, were randomly allocated to the high-carotenoid group (HG; n = 40) or control group (CG; n = 40). The HG consumed a high-carotenoid diet (21 mg total carotenoids/2000 kcal), whereas the CG consumed a control diet (13.6 mg total carotenoids/2000 kcal) for 6 wk. Subsequently, skin and blood carotenoid concentrations were tracked without intervention for 4 wk. SCS was measured weekly via resonance Raman spectroscopy, and serum carotenoid concentrations were analyzed biweekly using high-performance liquid chromatography. Baseline carotenoid and F&V intakes were assessed via a 3-d diet record. The kinetics of SCS and serum carotenoid concentrations were analyzed using a weighted generalized estimating equation. Pearson's correlation analyses were used to examine baseline correlations between SCS and dietary carotenoid and F&V intakes, as well as serum carotenoid concentrations. RESULTS: During the intervention, the HG showed a faster and greater SCS increase than the CG (difference in slope per week = 8.87 AU, P for interaction < 0.001). Baseline SCS was positively correlated with total carotenoid intake (r = 0.45), total F&V intake (r = 0.49), and total serum carotenoid concentration (r = 0.79; P < 0.001 for all). CONCLUSION: These results suggest that SCS is a valid biomarker for monitoring changes in carotenoid intake through the whole diet, which supports using SCS to assess carotenoid-rich F&V intake.
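The slope difference reported here corresponds to the group-by-time interaction term of a generalized estimating equation. A minimal, hypothetical sketch with statsmodels (the study used a weighted GEE, which would additionally supply observation weights; the column names and values below are made up):

```python
# Sketch: GEE with an exchangeable working correlation to compare SCS slopes between groups.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant per weekly SCS measurement.
df = pd.DataFrame({
    "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "week":    [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2],
    "group":   ["HG", "HG", "HG", "CG", "CG", "CG", "HG", "HG", "HG", "CG", "CG", "CG"],
    "scs":     [250, 262, 271, 248, 252, 255, 240, 255, 268, 251, 253, 257],
})

# The week:group interaction coefficient is the difference in slope per week between groups.
model = smf.gee("scs ~ week * group", groups="subject", data=df,
                family=sm.families.Gaussian(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())
```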


Subject(s)
Fruit , Vegetables , Humans , Biomarkers , Carotenoids/analysis , Diet/methods , Eating , Fruit/chemistry , Skin/chemistry , Vegetables/chemistry , Young Adult , Adult , Middle Aged
6.
BMJ Open ; 13(12): e074381, 2023 12 13.
Article in English | MEDLINE | ID: mdl-38097233

ABSTRACT

OBJECTIVES: The COVID-19 pandemic resulted in suboptimal care for ischaemic stroke. Patients with diabetes mellitus (DM), a high-risk group for stroke, had compromised routine care during the pandemic, which increases the chance of stroke. We examined the influence of the COVID-19 pandemic on the management of ischaemic stroke in patients with DM in South Korea. DESIGN: Retrospective, nationwide, population-based cohort study. SETTING: Data from the National Emergency Department Information System. PARTICIPANTS: We analysed 11 734 patients diagnosed with acute ischaemic stroke who underwent intravenous thrombolysis or endovascular thrombectomy between 2019 (the reference year) and 2020 (the pandemic year). Among them, 1014 subjects with DM were analysed separately. OUTCOME MEASURES: The frequency of emergency department (ED) visits, time from symptom onset to ED arrival, time from ED visit to admission and in-hospital mortality were compared between the two periods in the overall population and in patients with DM. RESULTS: During the pandemic, the incidence of ischaemic stroke requiring urgent procedures increased by 7.57% overall and by 9.03% in patients with DM. Delays from symptom onset to ED arrival (reference vs pandemic, total: 1.50 vs 1.55 hours; p<0.01) and from ED visit to admission (total: 3.88 vs 3.92 hours; p=0.02) occurred during the pandemic in the overall population, but not significantly in patients with DM specifically. Older patients with DM showed higher rates of intensive care unit (ICU) admission during the pandemic: 53.5% vs 62.8% at age 70-79, 60.5% vs 71.9% at age 80-89 and 20.0% vs 70.8% at age ≥90 years (all p=0.01). There was no significant difference in in-hospital mortality between the two periods (total: 8.2% vs 8.4%, p=0.65; DM: 8.1% vs 6.7%, p=0.25). CONCLUSIONS: During the COVID-19 pandemic, the incidence of ischaemic stroke requiring urgent procedures increased, and older patients with DM showed a higher ICU admission rate. However, the pandemic was not associated with increased in-hospital stroke mortality.


Subject(s)
Brain Ischemia , COVID-19 , Diabetes Mellitus , Ischemic Stroke , Stroke , Humans , Aged , Aged, 80 and over , COVID-19/epidemiology , Retrospective Studies , Brain Ischemia/epidemiology , Brain Ischemia/therapy , Pandemics , Cohort Studies , Stroke/epidemiology , Stroke/therapy , Ischemic Stroke/epidemiology , Ischemic Stroke/therapy , Emergency Service, Hospital , Diabetes Mellitus/epidemiology
7.
BMC Cancer ; 23(1): 992, 2023 Oct 17.
Article in English | MEDLINE | ID: mdl-37848850

ABSTRACT

BACKGROUND: We aimed to identify the multifaceted risk factors that can affect the development of severe radiation pneumonitis (RP) in patients with non-small cell lung cancer (NSCLC) treated with curative high-dose radiotherapy with or without concurrent chemotherapy. METHODS: We retrospectively reviewed the medical records of 175 patients with stage-I-III NSCLC treated with curative thoracic X-ray radiotherapy at the Korea University Guro Hospital between June 2019 and June 2022. Treatment-related complications were evaluated using the Common Terminology Criteria for Adverse Events (version 4.03). RESULTS: The median follow-up duration was 15 months (range: 3-47 months). Idiopathic pulmonary fibrosis (IPF) as an underlying lung disease (P < 0.001) and clinical stage, which reflected the concurrent use of chemotherapy (P = 0.009), were associated with a high rate of severe RP. In multivariate analyses adjusting for confounding variables, the presence of IPF as an underlying disease was significantly associated with severe RP (odds ratio [95% confidence interval] = 48.4 [9.09-347]; P < 0.001). In a subgroup analysis of stage-I-II NSCLC, the incidence of severe RP in the control, chronic obstructive pulmonary disease (COPD), and IPF groups was 3.2%, 4.3%, and 42.9%, respectively (P < 0.001). In the stage-III NSCLC group, the incidence of severe RP was 15.2%, 10.7%, and 75.0% in the control, COPD, and IPF groups, respectively (P < 0.001). CONCLUSIONS: This study revealed that IPF as an underlying lung disease and the concurrent use of chemotherapy are associated with a high rate of severe RP. In contrast, COPD did not increase the risk of pulmonary toxicity after curative high-dose radiotherapy.


Subject(s)
Carcinoma, Non-Small-Cell Lung , Idiopathic Pulmonary Fibrosis , Lung Diseases , Lung Neoplasms , Pulmonary Disease, Chronic Obstructive , Radiation Pneumonitis , Humans , Carcinoma, Non-Small-Cell Lung/complications , Carcinoma, Non-Small-Cell Lung/radiotherapy , Carcinoma, Non-Small-Cell Lung/drug therapy , Lung Neoplasms/complications , Lung Neoplasms/radiotherapy , Lung Neoplasms/drug therapy , Radiation Pneumonitis/epidemiology , Radiation Pneumonitis/etiology , Retrospective Studies , Risk Factors , Pulmonary Disease, Chronic Obstructive/complications
8.
J Stroke Cerebrovasc Dis ; 32(11): 107348, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37783139

ABSTRACT

BACKGROUND: Air pollutant concentrations in South Korea vary greatly by region and time. To assess temporal and spatial associations of stroke subtypes with long-term air pollution effects on stroke mortality, we studied ischemic stroke (IS), intracerebral hemorrhage (ICH), and subarachnoid hemorrhage (SAH). METHODS: This was an observational study conducted in South Korea from 2001 to 2018. Concentrations of carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particulate matter ≤10 µm in diameter (PM10) were determined from 332 stations. Average air pollutant concentrations in each district were determined by distance-weighted linear interpolation. The nationwide stroke mortality rates in 249 districts were obtained from the Korean Statistical Information Service. Time intervals were divided into three consecutive 6-year periods: 2001-2006, 2007-2012, and 2013-2018. RESULTS: The concentrations of air pollutants gradually decreased from 2001 to 2018, along with decreases in IS and ICH mortality rates. However, mortality rates associated with SAH remained constant. From 2001 to 2006, NO2 (adjusted odds ratio [aOR]: 1.13, 95% confidence interval: 1.08-1.19), SO2 (aOR: 1.10, 1.07-1.13), and PM10 (aOR: 1.12, 1.06-1.18) concentrations were associated with IS mortality, and SO2 (aOR: 1.07, 1.02-1.13) and PM10 (aOR: 1.11, 1.06-1.22) concentrations were associated with SAH-associated mortality. Air pollution was no longer associated with stroke mortality from 2007 onward, as air pollution concentrations continued to decline. Throughout the entire 18-year period, ICH-associated mortality was not associated with air pollution. CONCLUSIONS: Considering temporal and spatial trends, high concentrations of air pollutants were most likely to be associated with IS mortality. Our results strengthen the existing evidence of the deleterious effects of air pollution on IS mortality.
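The distance-weighted interpolation of station measurements to district values described here can be sketched as inverse-distance weighting; the exact weighting scheme, coordinates, and readings below are illustrative, not the authors' specification.

```python
# Sketch: inverse-distance-weighted estimate of a pollutant concentration at a district centroid.
import numpy as np

def idw_concentration(station_coords, station_values, target, power=1.0):
    """Weight each station by 1 / distance**power to the target location."""
    station_coords = np.asarray(station_coords, dtype=float)
    station_values = np.asarray(station_values, dtype=float)
    distances = np.linalg.norm(station_coords - np.asarray(target, dtype=float), axis=1)
    if np.any(distances == 0):                 # target sits exactly on a station
        return float(station_values[distances == 0][0])
    weights = 1.0 / distances ** power
    return float(np.sum(weights * station_values) / np.sum(weights))

# Hypothetical PM10 readings (ug/m3) at three monitoring stations around one district centroid.
stations = [(126.90, 37.50), (126.95, 37.55), (127.00, 37.48)]
pm10 = [42.0, 55.0, 38.0]
print(idw_concentration(stations, pm10, target=(126.93, 37.52)))
```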


Subject(s)
Air Pollutants , Air Pollution , Stroke , Humans , Nitrogen Dioxide/adverse effects , Air Pollution/adverse effects , Air Pollutants/adverse effects , Republic of Korea/epidemiology , Stroke/diagnosis
9.
Sensors (Basel) ; 23(17)2023 Sep 04.
Article in English | MEDLINE | ID: mdl-37688110

ABSTRACT

Resonance Raman spectroscopy (RRS) has been used as a reference method for measuring skin carotenoid levels (SCL), which indicate vegetable and fruit intake. However, RRS is not easy to use for SCL measurement because of its complicated implementation. In this study, a commercial spectrophotometer based on reflection spectroscopy (RS), which is relatively simple and inexpensive, was evaluated against RRS to assess its usability for measuring SCL. To investigate the agreement between RS and RRS, eighty participants were randomly assigned to a high-carotenoid diet group (21 mg/day of total carotenoids) or a control-carotenoid diet group (14 mg/day of total carotenoids) during a 6-week whole-diet intervention period and a 4-week tracking period. Strong correlations between the RS and RRS methods were observed at baseline (r = 0.944) and over the entire period (r = 0.930). The rate of SCL increase was similar during the diet intervention; however, the onset of the SCL decrease was slower for RS than for RRS during the tracking period. To assess the agreement between RS and RRS from various perspectives, additional visualization tools and indices were applied, which confirmed the similar response patterns of the two methods. The results indicate that the proposed RS method could be an alternative to RRS for SCL measurement.


Subject(s)
Skin , Spectrum Analysis, Raman , Humans , Carotenoids , Cognition , Vegetables
10.
Nutrients ; 15(9)2023 Apr 30.
Article in English | MEDLINE | ID: mdl-37432294

ABSTRACT

Despite consistent evidence that greater consumption of fruits and vegetables (FV) is associated with significant reductions in chronic disease morbidity and mortality, the majority of adults in the United States consume less than the amounts recommended by public health agencies. As such, there is a critical need to design and implement effective programs and policies to facilitate increases in FV consumption for the prevention of these diseases. To accomplish this, an accurate, inexpensive, and convenient method for estimating the dietary FV intake is required. A promising method for quantifying the FV intake via proxy that has gained interest in recent years is the measurement of skin carotenoid levels via spectroscopy-based devices. However, there exist certain dietary and non-dietary factors that may affect the skin carotenoid levels independently of the dietary intake of carotenoids. In order to validate the ability of this method to accurately estimate the FV intake among diverse demographics, these factors must be identified and taken into consideration. Therefore, this narrative review seeks to summarize the available research on factors that may affect the skin carotenoid levels, determine current gaps in knowledge, and provide guidance for future research efforts seeking to validate spectroscopy-measured skin carotenoid levels as a means of accurately estimating the FV intake among various populations.


Subject(s)
Carotenoids , Skin , Adult , Humans , Eating , Fruit , Knowledge , Vegetables
11.
Br J Nutr ; : 1-9, 2023 May 15.
Article in English | MEDLINE | ID: mdl-37184085

ABSTRACT

Blood carotenoid concentration measurement is considered the gold standard for fruit and vegetable (F&V) intake estimation; however, this method is invasive and expensive. Recently, skin carotenoid status (SCS) measured by optical sensors has been evaluated as a promising parameter for F&V intake estimation. In this cross-sectional study, we aimed to validate the utility of resonance Raman spectroscopy (RRS)-assessed SCS as a biomarker of F&V intake in Korean adults. We used data from 108 participants aged 20-69 years who completed SCS measurements, blood collection and 3-d dietary recordings. Serum carotenoid concentrations were quantified using HPLC, and dietary carotenoid and F&V intakes were estimated via 3-d dietary records using a carotenoid database for common Korean foods. The correlations of the SCS with serum carotenoid concentrations, dietary carotenoid intake and F&V intake were examined to assess SCS validity. SCS was positively correlated with total serum carotenoid concentration (r = 0·52, 95 % CI = 0·36, 0·64, P < 0·001), serum ß-carotene concentration (r = 0·60, 95 % CI = 0·47, 0·71, P < 0·001), total carotenoid intake (r = 0·20, 95 % CI = 0·01, 0·37, P = 0·04), ß-carotene intake (r = 0·30, 95 % CI = 0·11, 0·46, P = 0·002) and F&V intake (r = 0·40, 95 % CI = 0·23, 0·55, P < 0·001). These results suggest that SCS can be a valid biomarker of F&V intake in Korean adults.
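The 95% confidence intervals attached to these correlation coefficients are conventionally obtained with the Fisher z-transformation; a short sketch, using the reported SCS vs total serum carotenoid correlation as the worked example:

```python
# Sketch: 95% confidence interval for a Pearson correlation via the Fisher z-transformation.
import numpy as np

def pearson_ci(r, n, z_crit=1.959963984540054):
    z = np.arctanh(r)                  # Fisher transform of r
    se = 1.0 / np.sqrt(n - 3)          # approximate standard error on the z scale
    lo, hi = z - z_crit * se, z + z_crit * se
    return float(np.tanh(lo)), float(np.tanh(hi))   # back-transform to the r scale

# r = 0.52 with n = 108 (values taken from this abstract).
print(pearson_ci(0.52, 108))           # roughly (0.37, 0.65), close to the reported 0.36-0.64
```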

12.
Front Public Health ; 11: 1151506, 2023.
Article in English | MEDLINE | ID: mdl-37181708

ABSTRACT

Background: Although acute myocardial infarction (AMI) requires timely intervention, limited nationwide data are available regarding the association between disruption of emergency services and outcomes of patients with AMI during the coronavirus disease 2019 (COVID-19) pandemic. Moreover, whether diabetes mellitus (DM) adversely affects disease severity in these patients has not yet been investigated. Methods: This nationwide population-based study analyzed 45,648 patients with AMI, using data from the national registry of emergency departments (ED) in Korea. The frequency of ED visits and disease severity were compared between the COVID-19 outbreak period (2020) and the control period (the previous year, 2019). Results: The number of ED visits by patients with AMI decreased during the first, second, and third waves of the outbreak period compared with the corresponding time periods in the control period (all p-values < 0.05). A longer duration from symptom onset to ED visit (p = 0.001), a longer ED stay (p = 0.001), and higher rates of resuscitation, ventilation care, and extracorporeal membrane oxygenation were observed during the outbreak period than during the control period (all p-values < 0.05). These findings were exacerbated in patients with comorbid DM; compared with patients without DM, patients with DM demonstrated delayed ED visits, longer ED stays, more intensive care unit admissions (p < 0.001), longer hospitalizations (p < 0.001), and higher rates of resuscitation, intubation, and hemodialysis (all p-values < 0.05) during the outbreak period. While in-hospital mortality was similar in AMI patients with and without comorbid DM during the two periods (4.3 vs. 4.4%; p = 0.671), patients with DM who had other comorbidities such as chronic kidney disease or heart failure, or who were aged ≥ 80 years, had higher in-hospital mortality compared with those without any of these comorbidities (3.1 vs. 6.0%; p < 0.001). Conclusion: During the pandemic, the number of patients with AMI presenting to the ED decreased compared with the previous year, while disease severity increased, particularly in patients with comorbid DM.


Subject(s)
COVID-19 , Diabetes Mellitus , Emergency Medical Services , Myocardial Infarction , Humans , COVID-19/epidemiology , COVID-19/therapy , Pandemics , Retrospective Studies , Myocardial Infarction/epidemiology , Myocardial Infarction/therapy , Diabetes Mellitus/epidemiology
13.
Sci Rep ; 13(1): 3867, 2023 03 08.
Article in English | MEDLINE | ID: mdl-36890192

ABSTRACT

Central line-related bloodstream infection (CRBSI) is a common complication during hospital admissions; however, there are insufficient data regarding CRBSI in the emergency department. Therefore, we evaluated the incidence and clinical impact of CRBSI in a single-center retrospective study analyzing the medical data of 2189 adult patients (median age: 65 years, 58.8% males) who underwent central line insertion in the ED from 2013 to 2015. CRBSI was defined as identification of the same pathogen in peripheral blood and catheter-tip cultures or a differential time to positivity > 2 h. CRBSI-related in-hospital mortality and risk factors were evaluated. CRBSI occurred in 80 patients (3.7%), of whom 51 survived and 29 died; those with CRBSI had a higher incidence of subclavian vein insertion and higher retry rates. Staphylococcus epidermidis was the most common pathogen, followed by Staphylococcus aureus, Enterococcus faecium, and Escherichia coli. Using multivariate analysis, we found that CRBSI development was an independent risk factor for in-hospital mortality (adjusted odds ratio: 1.93, 95% confidence interval: 1.19-3.14, p < 0.01). Our findings suggest that CRBSI after central line insertion in the emergency department is common and associated with poor outcomes. Infection prevention and management measures to reduce CRBSI incidence are essential to improve clinical outcomes.
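The two CRBSI criteria quoted in this abstract (matching pathogens in peripheral-blood and catheter-tip cultures, or a differential time to positivity above 2 h) can be written as a simple rule; the organisms and values below are hypothetical, and the sketch follows the definition only as stated here.

```python
# Sketch of the CRBSI definition quoted above (hypothetical organisms and values).
def is_crbsi(tip_organism, peripheral_organism, dtp_hours, threshold_hours=2.0):
    """CRBSI if the same pathogen grows from the catheter tip and peripheral blood,
    or if the differential time to positivity exceeds the threshold."""
    same_pathogen = tip_organism is not None and tip_organism == peripheral_organism
    return same_pathogen or dtp_hours > threshold_hours

# Same organism in both cultures -> CRBSI regardless of DTP.
print(is_crbsi("Staphylococcus epidermidis", "Staphylococcus epidermidis", dtp_hours=1.0))  # True
# Organisms differ, but the catheter-drawn culture turned positive 3.5 h earlier -> CRBSI by DTP.
print(is_crbsi("Escherichia coli", "Enterococcus faecium", dtp_hours=3.5))                  # True
```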


Subject(s)
Bacteremia , Catheter-Related Infections , Catheterization, Central Venous , Sepsis , Male , Adult , Humans , Aged , Female , Retrospective Studies , Incidence , Catheterization, Central Venous/adverse effects , Emergency Service, Hospital , Sepsis/complications , Catheter-Related Infections/epidemiology , Catheter-Related Infections/etiology , Bacteremia/epidemiology , Bacteremia/etiology
14.
Food Sci Anim Resour ; 43(2): 319-330, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36909850

ABSTRACT

Some preservatives occur naturally in raw food materials, while in other cases they may be introduced into food by careless handling or fermentation. However, it is difficult to distinguish between intentionally added preservatives and preservatives naturally produced in food. The objective of this study was to evaluate the minimum inhibitory concentration (MIC) of propionic acid, sorbic acid, and benzoic acid for inhibiting food spoilage microorganisms in animal products, which can be useful in determining whether preservatives detected in food are of natural origin. The broth microdilution method was used to determine the MIC of the preservatives for 57 microorganisms. Five bacteria that were the most sensitive to propionic acid, benzoic acid, and sorbic acid were inoculated into unprocessed and processed animal products. One hundred microliters of each preservative was then spiked into the samples. After storage, the cells were counted to determine the MIC of the preservatives. The MIC of the preservatives in animal products ranged from 100 to 1,500 ppm for propionic acid, from 100 to >1,500 ppm for benzoic acid, and from 100 to >1,200 ppm for sorbic acid. Thus, if the concentration of a preservative in a food is below its MIC, the preservative may not have been added intentionally. Therefore, the MIC results will be useful in determining whether preservatives have been intentionally added to food.
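The broth microdilution readout described here reduces to finding the lowest concentration in a dilution series at which no growth is observed; a minimal sketch with hypothetical concentrations and growth calls, not the study's protocol.

```python
# Sketch: MIC as the lowest concentration in a dilution series with no visible growth.
def minimum_inhibitory_concentration(wells):
    """wells: iterable of (concentration_ppm, growth_observed) pairs."""
    for concentration, growth in sorted(wells):          # scan from lowest to highest
        if not growth:
            return concentration
    return None  # growth at every tested concentration: MIC exceeds the highest level tested

# Hypothetical two-fold dilution series for one organism and one preservative.
series = [(94, True), (188, True), (375, True), (750, False), (1500, False)]
print(minimum_inhibitory_concentration(series))  # 750 ppm
```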

15.
Foods ; 12(4)2023 Feb 13.
Article in English | MEDLINE | ID: mdl-36832871

ABSTRACT

This study estimated the risk of hepatitis A virus (HAV) foodborne illness outbreaks through the consumption of fermented clams in South Korea. HAV prevalence in fermented clams was obtained from the Ministry of Food and Drug Safety Report, 2019. Fermented clam samples (2 g) were inoculated with HAV and stored at -20 to 25 °C. Based on the HAV titer (determined using plaque assay) in fermented clams during storage, the Baranyi predictive models provided by ComBase were applied to describe the kinetic behavior of HAV in fermented clams. The initial estimated HAV contamination level was -3.7 Log PFU/g. The developed predictive models revealed that the number of HAV plaques decreased as the storage temperature increased. The Beta-Poisson model was chosen to describe the dose-response of HAV, and the simulation revealed a 6.56 × 10⁻¹¹/person/day probability of contracting HAV foodborne illness from eating fermented clams. However, when only regular consumers of fermented clams were assumed as the population, the probability of HAV foodborne illness increased to 8.11 × 10⁻⁸/person/day. These results suggest that, while there is a low likelihood of HAV foodborne illness from consuming fermented clams across the country, regular consumers should be aware of the possibility of foodborne illness.
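The Beta-Poisson dose-response step uses the approximate form P(d) = 1 - (1 + d/β)^(-α); the sketch below shows how a per-serving dose is converted to a daily risk, with placeholder parameter values rather than the study's fitted ones.

```python
# Sketch: approximate Beta-Poisson dose-response for per-serving infection/illness risk.
def beta_poisson_risk(dose, alpha, beta):
    """P(response | dose) = 1 - (1 + dose / beta) ** (-alpha)."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Illustrative parameters and an average ingested dose (PFU per serving) - placeholders only.
alpha, beta = 0.2, 1000.0
dose_per_serving = 1e-3
per_serving_risk = beta_poisson_risk(dose_per_serving, alpha, beta)

# Daily per-person risk scales with the probability of eating a contaminated serving that day.
p_contaminated_serving_per_day = 1e-4   # placeholder
print(per_serving_risk * p_contaminated_serving_per_day)
```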

16.
Radiology ; 307(2): e221488, 2023 04.
Article in English | MEDLINE | ID: mdl-36786699

ABSTRACT

Background: Low-dose chest CT screening is recommended for smokers with the potential for lung function abnormality, but its role in predicting lung function remains unclear. Purpose: To develop a deep learning algorithm to predict pulmonary function with low-dose CT images in participants using health screening services. Materials and Methods: In this retrospective study, participants underwent health screening with same-day low-dose CT and pulmonary function testing with spirometry at a university affiliated tertiary referral general hospital between January 2015 and December 2018. The data set was split into a development set (model training, validation, and internal test sets) and a temporally independent test set according to first visit year. A convolutional neural network was trained to predict the forced expiratory volume in the first second of expiration (FEV1) and forced vital capacity (FVC) from low-dose CT. The mean absolute error and concordance correlation coefficient (CCC) were used to evaluate agreement between spirometry as the reference standard and deep-learning prediction as the index test. FVC and FEV1 percent predicted (hereafter, FVC% and FEV1%) values less than 80% and percent of FVC exhaled in the first second (hereafter, FEV1/FVC) less than 70% were used to classify participants at high risk. Results: A total of 16 148 participants were included (mean age, 55 years ± 10 [SD]; 10 981 men) and divided into a development set (n = 13 428) and a temporally independent test set (n = 2720). In the temporally independent test set, the mean absolute error and CCC were 0.22 L and 0.94, respectively, for FVC and 0.22 L and 0.91 for FEV1. For the prediction of the respiratory high-risk group, FVC%, FEV1%, and FEV1/FVC had respective accuracies of 89.6% (2436 of 2720 participants; 95% CI: 88.4, 90.7), 85.9% (2337 of 2720 participants; 95% CI: 84.6, 87.2), and 90.2% (2453 of 2720 participants; 95% CI: 89.1, 91.3) in the same testing data set. The sensitivities were 61.6% (242 of 393 participants; 95% CI: 59.7, 63.4), 46.9% (226 of 482 participants; 95% CI: 45.0, 48.8), and 36.1% (91 of 252 participants; 95% CI: 34.3, 37.9), respectively. Conclusion: A deep learning model applied to volumetric chest CT predicted pulmonary function with relatively good performance.
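The agreement metrics used here, mean absolute error and the concordance correlation coefficient (Lin's CCC), can be computed directly from paired reference and predicted values; a minimal numpy sketch with synthetic FVC values.

```python
# Sketch: mean absolute error and Lin's concordance correlation coefficient (CCC).
import numpy as np

def mean_absolute_error(y_true, y_pred):
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

def concordance_correlation_coefficient(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()             # population variances (ddof=0)
    covariance = np.mean((y_true - mean_t) * (y_pred - mean_p))
    return float(2 * covariance / (var_t + var_p + (mean_t - mean_p) ** 2))

# Synthetic FVC values (litres): spirometry reference vs deep-learning prediction.
fvc_spirometry = [3.1, 4.0, 2.7, 3.6, 4.4, 2.9]
fvc_predicted  = [3.3, 3.8, 2.9, 3.5, 4.2, 3.1]
print(mean_absolute_error(fvc_spirometry, fvc_predicted),
      concordance_correlation_coefficient(fvc_spirometry, fvc_predicted))
```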


Subject(s)
Deep Learning , Male , Humans , Middle Aged , Retrospective Studies , Lung/diagnostic imaging , Vital Capacity , Forced Expiratory Volume , Spirometry/methods , Tomography, X-Ray Computed
18.
Sci Rep ; 12(1): 17307, 2022 10 15.
Article in English | MEDLINE | ID: mdl-36243746

ABSTRACT

Realistic image synthesis based on deep learning is an invaluable technique for developing high-performance computer-aided diagnosis systems while protecting patient privacy. However, training a generative adversarial network (GAN) for image synthesis remains challenging because of the large amounts of data required for training various kinds of image features. This study aims to synthesize retinal images indistinguishable from real images and to evaluate the efficacy of synthesized images of a specific disease for augmenting class-imbalanced datasets. The synthesized images were validated via image Turing tests, qualitative analysis by retinal specialists, and quantitative analyses of the amounts and signal-to-noise ratios (SNRs) of vessels. The efficacy of the synthesized images was verified by deep learning-based classification performance. The image Turing test showed an accuracy, sensitivity, and specificity of 54.0 ± 12.3%, 71.1 ± 18.8%, and 36.9 ± 25.5%, respectively. Here, sensitivity represents the rate at which real images were correctly identified as real. Comparisons of vessel amounts and average SNR showed differences of 0.43% and 1.5%, respectively, between real and synthesized images. Classification performance after augmentation with synthesized images outperformed that obtained with every ratio of imbalanced real datasets. Our study shows that realistic retinal images can be generated with negligible differences between real and synthesized images, and demonstrates great potential for practical applications.


Subject(s)
Image Processing, Computer-Assisted , Retina , Humans , Image Processing, Computer-Assisted/methods , Retina/diagnostic imaging , Signal-To-Noise Ratio
19.
Sci Rep ; 12(1): 16288, 2022 09 29.
Article in English | MEDLINE | ID: mdl-36175527

ABSTRACT

Birthweight is a strong determinant of a neonate's health. The impact of the SARS-CoV-2 pandemic on birthweight has not been investigated in depth, and initial studies have reached inconsistent conclusions. We aimed to assess changes in preterm birth and inappropriate birthweight between the SARS-CoV-2 pandemic and pre-pandemic periods. Nationwide birth microdata comprising an exhaustive census of all births from 2011 to 2020 in South Korea were accessed to examine whether the mean birthweight and rates of under/overweight births changed significantly during the SARS-CoV-2 pandemic year (2020) compared with those of the pre-pandemic period (2011-2019). A total of 3,736,447 singleton births were analyzed. Preterm birth was defined as < 37 weeks of gestation. Low birthweight (LBW) and macrosomia were defined as birthweights < 2.5 kg and ≥ 4.0 kg, respectively. Small for gestational age (SGA) and large for gestational age (LGA) were defined as birthweights below the 10th and above the 90th percentiles for sex and gestational age, respectively. Inappropriate birthweight was defined as one or more of LBW, macrosomia, SGA, or LGA. Generalized linear models were used to predict birth outcomes and were adjusted for parental age and education level, marital status, parity, gestational age, and months from January 2011. There were 3,481,423 and 255,024 singleton births during the pre-pandemic and pandemic periods, respectively. Multivariable generalized linear models estimated negative associations between the pandemic and preterm birth (odds ratio [OR], 0.968; 95% confidence interval [CI] 0.958-0.978), LBW (OR: 0.967, 95% CI 0.956-0.979), macrosomia (OR: 0.899, 95% CI 0.886-0.912), SGA (OR: 0.974, 95% CI 0.964-0.983), LGA (OR: 0.952, 95% CI 0.945-0.959), and inappropriate birthweight (OR: 0.958, 95% CI 0.952-0.963), indicating a decline during the pandemic compared with the pre-pandemic period. An 8.98 g decrease in birthweight (95% CI 7.98-9.99) was estimated during the pandemic. This is the largest and most comprehensive nationwide study to date on the impact of the SARS-CoV-2 pandemic on preterm birth and inappropriate birthweight. Birth during the pandemic was associated with lower odds of being preterm, underweight, and overweight. Further studies are required to understand the dynamics underlying this phenomenon.
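The odds ratios and confidence intervals reported here are the exponentiated coefficients of binomial generalized linear models; the sketch below shows that conversion. The coefficient and standard error are illustrative values chosen to reproduce the reported preterm-birth OR, not numbers taken from the paper.

```python
# Sketch: converting a logistic-regression coefficient to an odds ratio with a 95% CI.
import math

def odds_ratio_with_ci(coef, se, z=1.959963984540054):
    """OR = exp(beta); CI bounds = exp(beta -/+ z * SE)."""
    return math.exp(coef), math.exp(coef - z * se), math.exp(coef + z * se)

# Illustrative values for a 'pandemic period' indicator in a model of preterm birth.
coefficient, standard_error = -0.0325, 0.0053
or_, lo, hi = odds_ratio_with_ci(coefficient, standard_error)
print(f"OR = {or_:.3f} (95% CI {lo:.3f}-{hi:.3f})")   # OR = 0.968 (95% CI 0.958-0.978)
```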


Subject(s)
COVID-19 , Premature Birth , Birth Weight , COVID-19/epidemiology , Female , Fetal Macrosomia/epidemiology , Humans , Infant, Newborn , Overweight , Pandemics , Pregnancy , Premature Birth/epidemiology , Republic of Korea/epidemiology , SARS-CoV-2 , Weight Gain
20.
Digit Health ; 8: 20552076221120319, 2022.
Article in English | MEDLINE | ID: mdl-36003315

ABSTRACT

Objective: Given the rapid growth of the wearable healthcare device market, we examined the associations between health-related and technology-related characteristics and the use of wearable healthcare devices, and demonstrated how the associations differ between US and Korean users. Methods: Online self-administered surveys were conducted with 4098 participants (3035 in the US and 1063 in Korea) who were recruited through two online survey service providers based on quota sampling. The primary outcome was the use of wearable healthcare devices. Seven health-related, two technology-related, and five socio-demographic factors were included as explanatory variables. Binary logistic regression analyses and a Chow test were conducted. Results: The health-related characteristics significantly associated with using wearable healthcare devices included disease-related worries (β = 0.11**), health information seeking (β = 0.26***), physical activity (β = 0.62***), and health-related expenditures ($50-$199, β = 0.38***; $200 or more, β = 0.56***). Hedonic (β = 0.33***), social (β = 0.31***), and cognitive innovativeness (β = 0.14*) also exhibited positive relationships. Younger individuals, higher earners, and individuals with a child were more likely to use wearable healthcare devices. However, for Korean users, several associations disappeared, including health information seeking, hedonic and social innovativeness, age, and household income. Conclusions: Key drivers of wearable healthcare device use include greater concern about a specific illness, active engagement in health-promoting behaviors, and hedonic and social motivation to adopt new technologies. However, more country-specific considerations are needed in future studies to identify the main benefits for target markets.
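The Chow test mentioned here asks whether regression coefficients differ between two subsamples (US and Korean users) by comparing pooled and group-specific residual sums of squares. Below is a minimal sketch of the classical linear-model form of the test with synthetic data; the study paired it with binary logistic regressions, so this illustrates the general idea rather than the authors' exact procedure.

```python
# Sketch: classical Chow test comparing regression coefficients between two subsamples.
import numpy as np
from scipy import stats

def chow_test(X1, y1, X2, y2):
    """F = [(SSR_pooled - SSR_1 - SSR_2) / k] / [(SSR_1 + SSR_2) / (n1 + n2 - 2k)]."""
    def ssr(X, y):
        X = np.column_stack([np.ones(len(X)), X])        # add intercept column
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta
        return float(residuals @ residuals), X.shape[1]
    ssr1, k = ssr(X1, y1)
    ssr2, _ = ssr(X2, y2)
    ssr_pooled, _ = ssr(np.vstack([X1, X2]), np.concatenate([y1, y2]))
    df1, df2 = k, len(y1) + len(y2) - 2 * k
    f_stat = ((ssr_pooled - ssr1 - ssr2) / df1) / ((ssr1 + ssr2) / df2)
    return f_stat, stats.f.sf(f_stat, df1, df2)          # F statistic and p-value

rng = np.random.default_rng(1)
X_us, X_kr = rng.normal(size=(60, 2)), rng.normal(size=(40, 2))
y_us = 0.5 * X_us[:, 0] + 0.3 * X_us[:, 1] + rng.normal(scale=0.5, size=60)   # synthetic US outcome
y_kr = 0.1 * X_kr[:, 0] + 0.3 * X_kr[:, 1] + rng.normal(scale=0.5, size=40)   # different slope in Korea
print(chow_test(X_us, y_us, X_kr, y_kr))
```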
