Results 1 - 20 of 194
1.
Br J Nutr ; 131(3): 429-437, 2024 02 14.
Article in English | MEDLINE | ID: mdl-37694674

ABSTRACT

Although the cardiovascular benefits of increased urinary potassium excretion have been suggested, little is known about the potential cardiac association of urinary potassium excretion in patients with chronic kidney disease. In addition, whether any cardiac association of urinary potassium excretion is mediated by serum potassium levels has not yet been studied. We reviewed the data of 1633 patients from a large-scale multicentre prospective Korean study (2011-2016). The spot urinary potassium to creatinine ratio was used as a surrogate for urinary potassium excretion. Cardiac injury was defined as a high-sensitivity troponin T ≥ 14 ng/l. OR and 95 % CI for cardiac injury were calculated using logistic regression analyses. Among the 1633 patients, the mean spot urinary potassium to creatinine ratio was 49·5 (sd 22·6) mmol/g Cr and the overall prevalence of cardiac injury was 33·9 %. Although serum potassium levels were not associated with cardiac injury, each 10 mmol/g Cr increase in the spot urinary potassium to creatinine ratio was associated with decreased odds of cardiac injury (OR 0·917; 95 % CI 0·841, 0·998; P = 0·047) in multivariate logistic regression analysis. In mediation analysis, approximately 6·4 % of the relationship between the spot urinary potassium to creatinine ratio and cardiac injury was mediated by serum potassium levels, which was not statistically significant (P = 0·368). Higher urinary potassium excretion was associated with lower odds of cardiac injury, and this association was not mediated by serum potassium levels.
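As a side note on the statistics used in this abstract: an odds ratio and its Wald confidence interval come from exponentiating a logistic-regression coefficient and its interval limits. A minimal Python sketch; the coefficient and standard error below are hypothetical values chosen only to illustrate the scale of the reported estimate, not numbers from the study.

```python
import math

def odds_ratio(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald confidence interval:
    OR = exp(beta), CI = exp(beta -/+ z * SE)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient per 10 mmol/g Cr increase (illustrative only).
or_, lo, hi = odds_ratio(-0.0866, 0.044)
```

The same conversion applies to any log-scale regression estimate, which is why ORs and HRs are reported with asymmetric confidence intervals.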


Subject(s)
Potassium , Renal Insufficiency, Chronic , Humans , Cohort Studies , Potassium/urine , Creatinine/urine , Prospective Studies , Renal Insufficiency, Chronic/complications , Republic of Korea/epidemiology
2.
BMC Nephrol ; 25(1): 172, 2024 May 20.
Article in English | MEDLINE | ID: mdl-38769500

ABSTRACT

BACKGROUND: Diabetic kidney disease (DKD) is the predominant cause of chronic kidney disease and end-stage kidney disease. Its diverse range of manifestations complicates the treatment approach. Although kidney biopsy is considered the gold standard for diagnosis, it lacks precision in predicting the progression of kidney dysfunction. Herein, we addressed whether the presence of glomerular crescents is linked to outcomes in patients with biopsy-confirmed type 2 DKD. METHODS: We performed a retrospective evaluation of 327 patients diagnosed with biopsy-confirmed DKD in the context of type 2 diabetes, excluding cases with other glomerular diseases, from nine tertiary hospitals. Hazard ratios (HRs) were calculated using a Cox regression model to assess the risk of kidney disease progression, defined as either a ≥ 50% decrease in estimated glomerular filtration rate or the development of end-stage kidney disease, based on the presence of glomerular crescents. RESULTS: Of the 327 patients selected, ten had glomerular crescents observed in their biopsied tissues. Over the follow-up period (median 19 months, maximum 18 years), the crescent group exhibited a higher risk of kidney disease progression than the no-crescent group, with an adjusted HR of 2.82 (95% CI 1.32-6.06; P = 0.008). The presence of heavy proteinuria was associated with an increased risk of developing glomerular crescents. CONCLUSION: The presence of glomerular crescents is indeed linked to the progression of type 2 DKD. Therefore, it is important to determine whether an additional immune-mediated glomerulonephritis requiring immunomodulation is present, and it may be prudent to monitor the histology with a repeat biopsy.


Subject(s)
Diabetes Mellitus, Type 2 , Diabetic Nephropathies , Disease Progression , Kidney Glomerulus , Humans , Diabetic Nephropathies/pathology , Retrospective Studies , Male , Female , Middle Aged , Diabetes Mellitus, Type 2/complications , Kidney Glomerulus/pathology , Aged , Glomerular Filtration Rate , Cohort Studies , Biopsy , Kidney Failure, Chronic , Risk Factors
3.
Biosci Biotechnol Biochem ; 87(7): 696-706, 2023 Jun 23.
Article in English | MEDLINE | ID: mdl-37024271

ABSTRACT

Obesity is caused by the accumulation of excess lipids due to an energy imbalance. Differentiation of pre-adipocytes induces abnormal lipid accumulation, and reactive oxygen species (ROS) generated in this process promote the differentiation of pre-adipocytes through mitogen-activated protein kinase (MAPK) signaling. Peroxiredoxin (Prx) is a potent antioxidant enzyme, and peroxiredoxin 5 (Prx5), which is mainly expressed in cytosol and mitochondria, inhibits adipogenesis by regulating ROS levels. Based on previous findings, the present study was performed to investigate whether cytosolic Prx5 (CytPrx5) or mitochondrial Prx5 (MtPrx5) has a greater effect on the inhibition of adipogenesis. In this study, MtPrx5 decreased insulin-mediated ROS levels to reduce adipogenic gene expression and lipid accumulation more effectively than CytPrx5. In addition, we found that p38 MAPK mainly participates in adipogenesis. Furthermore, we verified that MtPrx5 overexpression suppressed the phosphorylation of p38 during adipogenesis. Thus, we suggest that MtPrx5 inhibits insulin-induced adipogenesis more effectively than CytPrx5.


Subject(s)
Adipogenesis , Insulin , p38 Mitogen-Activated Protein Kinases , Animals , Mice , 3T3-L1 Cells , Cell Differentiation , Insulin/metabolism , Lipids/pharmacology , Mitochondria/metabolism , Peroxiredoxins/genetics , Peroxiredoxins/metabolism , Peroxiredoxins/pharmacology , Phosphorylation , Reactive Oxygen Species/metabolism , p38 Mitogen-Activated Protein Kinases/metabolism
4.
J Korean Med Sci ; 38(13): e96, 2023 Apr 03.
Article in English | MEDLINE | ID: mdl-37012684

ABSTRACT

In mid-2022, as the wave of pediatric coronavirus disease 2019 (COVID-19) cases escalated in South Korea, a public-private partnership was formed to establish a Pediatric COVID-19 Module Clinic (PMC). We describe the utilization of the first prototype children's modular clinic in Korea University Anam Hospital functioning as the COVID-19 PMC. Between August 1 and September 30, 2022, a total of 766 children visited the COVID-19 PMC. The daily number of patient visits ranged between 10 and 47 in August and fell below 13 per day in September 2022. Not only did the model provide timely care for pediatric COVID-19 patients, but it also enabled safe and effective care for non-COVID-19 patients in the main hospital building while minimizing the risk of severe acute respiratory syndrome coronavirus 2 transmission. The current description highlights the importance of spatial measures for mitigating in-hospital transmission of COVID-19, specifically in pediatric care.


Subject(s)
COVID-19 , Child , Humans , Pandemics , SARS-CoV-2 , Ambulatory Care Facilities , Hospitals
5.
Korean J Chem Eng ; : 1-8, 2023 May 17.
Article in English | MEDLINE | ID: mdl-37363782

ABSTRACT

Municipal solid waste (MSW) management is an essential municipal service, and proper waste treatment is an important part of it. Thermocatalytic waste upcycling has recently gained great interest as a method to extract value from waste, potentially substituting for traditional waste treatment methods. This study aims to demonstrate the potential of thermocatalytic waste upcycling using spent disposable wipes as an MSW surrogate. Two Ni/Al2O3 catalysts were prepared, treated under two different atmospheres (N2 and CO2). The catalyst treated in N2 (Ni/Al2O3-N2) exhibited a higher density of surface metallic Ni sites than the catalyst treated in CO2 (Ni/Al2O3-CO2). The use of Ni/Al2O3-N2 increased the yield of gas pyrolysate and decreased the yield of byproducts (e.g., wax), compared with no catalyst and with Ni/Al2O3-CO2. In particular, the Ni/Al2O3-N2 catalyst enhanced the generation of gaseous hydrogen (H2), increasing the H2 yield by up to 102% in comparison with the other thermocatalytic systems. The highest H2 yield, obtained with Ni/Al2O3-N2, was attributed to its having the most surface metallic Ni sites. However, the Ni/Al2O3-N2 catalyst led to char with a lower higher heating value than the other systems because of its lowest carbon content. The results indicate that the reduction treatment environment for a Ni/Al2O3 catalyst influences the thermocatalytic conversion product yields of spent disposable wipes, including enhanced H2 production. Electronic Supplementary Material: Supplementary material is available in the online version of this article at 10.1007/s11814-023-1461-8.

6.
J Appl Clin Med Phys ; 23(8): e13699, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35856943

ABSTRACT

PURPOSE: Well-designed routine multileaf collimator (MLC) quality assurance (QA) is important to assure external-beam radiation treatment delivery accuracy. This study evaluates the clinical necessity of a comprehensive weekly (C-Weekly) MLC QA program compared to the American Association of Physicists in Medicine (AAPM)-recommended weekly picket fence test (PF-Weekly), based on our seven-year experience with weekly MLC QA. METHODS: The C-Weekly MLC QA program used in this study includes 5 tests to analyze: (1) absolute MLC leaf position; (2) interdigitation MLC leaf position; (3) picket fence MLC leaf positions at static gantry angle; (4) minimum leaf-gap setting; and (5) volumetric-modulated arc therapy (VMAT) delivery. A total of 20,226 QA images from 16,855 tests (3,371 tests × 5) for 11 linacs at 5 photon clinical sites from May 2014 to June 2021 were analyzed. Failure mode and effects analysis was performed with 5 failure modes related to the 5 tests. For each failure mode, a risk probability number (RPN) was calculated for the C-Weekly and the PF-Weekly MLC QA programs. The probability of occurrence was evaluated from statistical analyses of the C-Weekly MLC QA. RESULTS: The total number of failures across the 16,855 tests was 143 (0.9%): 39 (27.3%) for absolute MLC leaf position, 13 (9.1%) for interdigitation position, 9 (6.3%) for static gantry picket fence, 2 (1.4%) for minimum leaf-gap setting, and 80 (55.9%) for VMAT delivery. RPN scores ranged from 60 to 192 for PF-Weekly MLC QA and from 48 to 96 for C-Weekly MLC QA. CONCLUSION: RPNs for the 5 failure modes of MLC QA tests were quantitatively determined and analyzed. A comprehensive weekly MLC QA program is imperative to lower the RPNs of the 5 failure modes to the desired level (<125); those from the PF-Weekly MLC QA program were found to be higher (>125). This supports the clinical necessity of comprehensive weekly MLC QA.
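For readers unfamiliar with failure mode and effects analysis, the risk probability number is simply the product of severity, occurrence, and detectability scores; a QA program with better detection lowers it. A minimal sketch; the individual 1-10 scores below are hypothetical illustrations, not values from the study.

```python
def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Risk probability number from failure mode and effects analysis:
    the product of the three scores (each conventionally 1-10)."""
    return severity * occurrence * detectability

# Hypothetical scorings: improved detectability alone can move a failure
# mode below the desired RPN level (<125) cited in the abstract.
weekly_picket_fence_only = rpn(8, 4, 6)   # poorer detection
comprehensive_weekly = rpn(8, 4, 3)       # better detection
```

Only the detectability score differs between the two calls, mirroring how a more comprehensive weekly test suite changes the risk ranking without changing the underlying failure rate.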


Subject(s)
Particle Accelerators , Radiotherapy, Intensity-Modulated , Electrical Equipment and Supplies , Humans , Radiotherapy, Intensity-Modulated/methods
7.
Environ Geochem Health ; 44(11): 3953-3965, 2022 Nov.
Article in English | MEDLINE | ID: mdl-34766236

ABSTRACT

Endosulfan was widely used as an insecticide in the agricultural sector before its environmental persistence was fully understood. Although its fate and transport in the environment have been studied, the effects of historic endosulfan residues in soil and its bioaccumulation in crops are not well understood. This knowledge gap was addressed by investigating the dissipation and bioaccumulation of endosulfan in ginseng, a perennial crop, in fresh and aged endosulfan-contaminated fields. In addition, the effect of granular biochar (GBC) treatment on the bioaccumulation factor (BAF) of endosulfan residue in ginseng was assessed. The 50% dissipation time (DT50) of total endosulfan was over 770 days in both the fresh and aged soils under mulching conditions, at least twofold greater than the range reported for arable soil (6 to >200 days). Among the endosulfan congeners, the main contributor to the soil residue was endosulfan sulfate, as observed from 150 days after treatment. The BAF for the 2-year-old ginseng was similar in the fresh (1.682-2.055) and aged (1.372-2.570) soils, whereas the BAF for the 3-year-old ginseng in the aged soil (1.087-1.137) was lower than that in the fresh soil (1.771-2.387). Treatment with 0.3 wt% GBC extended the DT50 of endosulfan in soil; nevertheless, it successfully suppressed endosulfan uptake, reducing the BAFs by 66.5-67.7% in the freshly contaminated soil and 32.3-41.4% in the aged soil. Thus, this adsorbent treatment could be an effective, financially viable, and sustainable option to protect human health by reducing plant uptake of endosulfan from contaminated soils.


Subject(s)
Insecticides , Panax , Soil Pollutants , Humans , Child, Preschool , Endosulfan , Insecticides/analysis , Farms , Soil Pollutants/analysis , Soil/chemistry , Crops, Agricultural
8.
Crit Care ; 25(1): 29, 2021 01 18.
Article in English | MEDLINE | ID: mdl-33461588

ABSTRACT

BACKGROUND: A prediction model of mortality for patients with acute poisoning has to consider both poisoning-related characteristics and patients' physiological conditions; moreover, it must be applicable to patients of all ages. This study aimed to develop a scoring system for predicting in-hospital mortality of patients with acute poisoning at the emergency department (ED). METHODS: This was a retrospective analysis of the Injury Surveillance Cohort generated by the Korea Centers for Disease Control and Prevention (KCDC) during 2011-2018. We developed the new Poisoning Mortality Scoring system (new-PMS) as a prediction model using the derivation group (2011-2017 KCDC cohort). Points were computed for categories of each variable, and the sum of these points was the new-PMS. The validation group (2018 KCDC cohort) was subjected to external temporal validation. The performance of the new-PMS in predicting mortality was evaluated using the area under the receiver operating characteristic curve (AUROC) for both groups. RESULTS: Of 57,326 poisoning cases, 42,568 were selected. Of these, 34,352 (80.7%) and 8216 (19.3%) were enrolled in the derivation and validation groups, respectively. The new-PMS was the sum of the points for each category of 10 predictors, with a possible range of 0-137 points. The Hosmer-Lemeshow goodness-of-fit test showed adequate calibration for the new-PMS, with p values of 0.093 and 0.768 in the derivation and validation groups, respectively. AUROCs of the new-PMS were 0.941 (95% CI 0.934-0.949, p < 0.001) and 0.946 (95% CI 0.929-0.964, p < 0.001) in the derivation and validation groups, respectively. The sensitivity, specificity, and accuracy of the new-PMS (cutoff value: 49 points) were 86.4%, 87.2%, and 87.2% in the derivation group and 85.9%, 89.5%, and 89.4% in the validation group, respectively. CONCLUSIONS: We developed the new-PMS based on demographic and poisoning-related variables and vital signs observed among patients at the ED.
The new-PMS showed good performance for predicting in-hospital mortality in both the derivation and validation groups. The probability of death increased according to the increase in the new-PMS. The new-PMS accurately predicted the probability of death for patients with acute poisoning. This could contribute to clinical decision making for patients with acute poisoning at the ED.
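The performance metrics above can be reproduced from first principles: AUROC is the probability that a randomly chosen death scores higher than a randomly chosen survivor (ties counting one half), and sensitivity/specificity follow from applying a score cutoff. A self-contained sketch with invented toy scores and labels; only the 49-point cutoff echoes the abstract.

```python
def auroc(scores, labels):
    """Rank-based AUROC: probability that a random positive (label 1)
    outscores a random negative (label 0), with ties counting 1/2."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity when scores >= cutoff predict death."""
    tp = sum(s >= cutoff and y == 1 for s, y in zip(scores, labels))
    fn = sum(s < cutoff and y == 1 for s, y in zip(scores, labels))
    tn = sum(s < cutoff and y == 0 for s, y in zip(scores, labels))
    fp = sum(s >= cutoff and y == 0 for s, y in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: two deaths with high scores, two survivors with low scores.
toy_scores, toy_labels = [60, 55, 40, 30], [1, 1, 0, 0]
```

With a realistic overlapping score distribution the same functions would yield values below 1.0, as in the abstract's 0.94 AUROC and mid-80s sensitivity.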


Subject(s)
Mortality/trends , Poisoning/mortality , APACHE , Adult , Aged , Aged, 80 and over , Area Under Curve , Emergency Service, Hospital/organization & administration , Emergency Service, Hospital/trends , Female , Humans , Logistic Models , Male , Middle Aged , Population Surveillance/methods , ROC Curve , Republic of Korea , Research Design/standards , Retrospective Studies
9.
Kidney Blood Press Res ; 46(4): 460-468, 2021.
Article in English | MEDLINE | ID: mdl-34091449

ABSTRACT

INTRODUCTION: The renal hazard of polypharmacy has never been evaluated in predialysis chronic kidney disease (CKD) patients. OBJECTIVE: We aimed to analyze the renal hazard of polypharmacy in patients with stage 1-5 predialysis CKD. METHODS: We reviewed the data of 2,238 patients from a large-scale multicenter prospective Korean study (2011-2016), excluding 325 patients with various missing data. Polypharmacy was defined as taking 6 or more medications at the time of enrollment; renal events were defined as a ≥50% decrease in kidney function from baseline, doubling of serum creatinine levels, or initiation of renal replacement treatment. Hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional-hazard regression analysis. RESULTS: Among the 1,913 patients analyzed, the mean estimated glomerular filtration rate was 53.6 mL/min/1.73 m2. The mean medication count was 4.1, and the prevalence of polypharmacy was 27.1%. During an average follow-up of 3.6 years, 520 patients (27.2%) developed renal events. Although an increased medication count was associated with increased renal hazard (HR 1.056, 95% CI 1.007-1.107, p = 0.025) even after adjusting for various confounders, adding comorbidity score and kidney function nullified the statistical significance. In mediation analysis, 55.6% (p = 0.016) of the renal hazard of increased medication counts was mediated by kidney function, and there was no direct effect of medication counts on renal event development. In subgroup analysis, the renal hazard of medication counts was evident only in patients with CKD stages 1-3 (p for interaction = 0.014). CONCLUSIONS: We could not identify a direct renal hazard of multiple medications; most of the potential renal hazard derived from the intimate relationship with disease burden and kidney function.
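The "proportion mediated" reported in such mediation analyses is, in the standard effect decomposition, the indirect effect divided by the total (indirect plus direct) effect. A minimal sketch; the effect sizes below are made-up numbers chosen only so the share lands near the abstract's 55.6%, not the study's coefficients.

```python
def proportion_mediated(indirect: float, direct: float) -> float:
    """Share of the total effect carried through the mediator:
    indirect / (indirect + direct), assuming effects on the same scale."""
    return indirect / (indirect + direct)

# Hypothetical indirect (via kidney function) and direct effects.
share = proportion_mediated(0.5, 0.4)
```

When the direct effect is statistically indistinguishable from zero, as the abstract reports, the mediator accounts for essentially the whole observed association.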


Subject(s)
Kidney/drug effects , Polypharmacy , Renal Insufficiency, Chronic/drug therapy , Adult , Aged , Disease Progression , Female , Glomerular Filtration Rate , Humans , Kidney/physiopathology , Male , Middle Aged , Proportional Hazards Models , Prospective Studies , Renal Insufficiency, Chronic/epidemiology , Renal Insufficiency, Chronic/physiopathology , Republic of Korea/epidemiology
10.
BMC Pulm Med ; 21(1): 110, 2021 Apr 01.
Article in English | MEDLINE | ID: mdl-33794844

ABSTRACT

BACKGROUND: Hyperglycemic conditions are associated with respiratory dysfunction. Although several studies have reported that insulin resistance (IR) is related to decreased lung function, the association between IR and change in lung function has rarely been studied. This study aimed to investigate the potential association of IR with annual change in lung function using a community-based prospective cohort in Korea. METHODS: We selected 4827 Korean participants whose serial lung functions were assessed over 4 years, using 1:3 propensity score matching. The exposure was baseline IR estimated with the homeostatic model assessment (HOMA-IR), and the outcomes were annual changes in lung function determined by calculating the regression coefficient using least-squares linear regression analysis. RESULTS: In the multivariate linear regression, each one-unit increase in log-transformed HOMA-IR was associated with declines in FEV1 %-predicted (ß: -0.23, 95% CI: -0.36 to -0.11) and FVC %-predicted (ß: -0.20, 95% CI: -0.33 to -0.08), respectively. In the generalized additive model plot, HOMA-IR showed a negative linear association with annual changes in FEV1 %-predicted and FVC %-predicted. The suggested HOMA-IR thresholds for decline in lung function were 1.0 units for annual change in FEV1 %-predicted and 2.2 units for annual change in FVC %-predicted. Age showed statistically significant effect modification of the relationship between HOMA-IR and annual change in FEV1 %-predicted: increased HOMA-IR was associated with a decreased annual change in FEV1 %-predicted, particularly in older people. CONCLUSIONS: In South Korea, increased HOMA-IR was associated with decline in lung function. Since IR was related to decline in FEV1 %-predicted particularly in older people, tailored approaches are needed in these populations. The potential pulmonary hazard of IR needs to be confirmed in future studies.
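HOMA-IR itself is a simple closed-form index: fasting glucose times fasting insulin divided by a normalizing constant (405 when glucose is in mg/dL and insulin in µU/mL). A sketch with hypothetical fasting values for a single participant; the abstract models the log-transformed index.

```python
import math

def homa_ir(glucose_mg_dl: float, insulin_uU_ml: float) -> float:
    """Homeostatic model assessment of insulin resistance:
    fasting glucose (mg/dL) * fasting insulin (uU/mL) / 405."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

# Hypothetical fasting measurements for one participant.
ir = homa_ir(100.0, 8.1)   # 100 * 8.1 / 405 = 2.0
log_ir = math.log(ir)      # the log-transformed exposure used in the models
```

The equivalent SI-unit form divides by 22.5 with glucose in mmol/L; both conventions describe the same index.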


Subject(s)
Forced Expiratory Volume/physiology , Insulin Resistance/physiology , Adult , Aged , Diabetes Mellitus , Female , Humans , Hyperglycemia , Linear Models , Male , Middle Aged , Multivariate Analysis , Prospective Studies , Republic of Korea , Respiratory Function Tests/methods , Risk Factors
11.
J Korean Med Sci ; 36(25): e173, 2021 Jun 28.
Article in English | MEDLINE | ID: mdl-34184437

ABSTRACT

BACKGROUND: Survival and post-cardiac arrest care vary considerably by hospital, region, and country. In the current study, we aimed to analyze mortality in patients who suffered cardiac arrest by hospital level, and to reveal differences in patient characteristics and hospital factors, including post-cardiac arrest care, hospital costs, and adherence to changes in resuscitation guidelines. METHODS: We enrolled adult patients (≥ 20 years) who suffered non-traumatic cardiac arrest from 2006 to 2015. Patient demographics, insurance type, admission route, comorbidities, treatments, and hospital costs were extracted from the National Health Insurance Service database. We categorized patients into tertiary hospital, general hospital, and hospital groups according to the level of the hospital where they were treated. We analyzed the patients' characteristics, hospital factors, and mortalities among the three groups. We also analyzed post-cardiac arrest care before and after the 2010 guideline changes. The primary end-points were the 30-day and 1-year mortality rates. RESULTS: The tertiary hospital, general hospital, and hospital groups represented 32.6%, 49.6%, and 17.8% of the 337,042 patients, respectively. The tertiary and general hospital groups were younger, had a lower proportion of medical aid coverage, and had fewer comorbidities, compared to the hospital group. Post-cardiac arrest care, such as percutaneous coronary intervention, targeted temperature management, and extracorporeal membrane oxygenation, was provided more frequently in the tertiary and general hospital groups. After adjusting for age, sex, insurance type, urbanization level, admission route, comorbidities, defibrillation, resuscitation medications, angiography, and guideline changes, the tertiary and general hospital groups showed lower 1-year mortality (tertiary hospital vs. general hospital vs. hospital, adjusted odds ratios, 0.538 vs. 0.604 vs. 1; P < 0.001). After the 2010 guideline changes, a marked decline in atropine use and an increase in post-cardiac arrest care were observed in the tertiary and general hospital groups. CONCLUSION: The tertiary and general hospital groups showed lower 30-day and 1-year mortality rates than the hospital group after adjusting for patient characteristics and hospital factors. Higher-level hospitals provided more post-cardiac arrest care, which led to higher hospital costs, and showed good adherence to the guideline changes after 2010.


Subject(s)
Heart Arrest/mortality , Adult , Aged , Cardiopulmonary Resuscitation/mortality , Extracorporeal Membrane Oxygenation , Female , Hospital Costs , Hospital Mortality , Hospitals , Humans , Hypothermia, Induced , Korea , Male , Middle Aged , Percutaneous Coronary Intervention
12.
J Korean Med Sci ; 36(25): e172, 2021 Jun 28.
Article in English | MEDLINE | ID: mdl-34184436

ABSTRACT

BACKGROUND: Inter-hospital transfer (IHT) for emergency department (ED) admission is a burden to high-level EDs. This study aimed to evaluate the prevalence and ED utilization patterns of patients who underwent single and double IHTs at high-level EDs in South Korea. METHODS: This nationwide cross-sectional study analyzed data from the National Emergency Department Information System for the period 2016-2018. All patients who underwent IHT at Level I and II emergency centers during this period were included. The patients were categorized into single-transfer and double-transfer groups, and their clinical characteristics and ED utilization patterns were compared. RESULTS: We found that 2.1% of ED patients (n = 265,046) underwent IHTs; 18.1% of the pediatric patients (n = 3,556) and 24.2% of the adult patients (n = 59,498) underwent double transfers. Both pediatric (median, 141.0 vs. 208.0 minutes, P < 0.001) and adult (median, 189.0 vs. 308.0 minutes, P < 0.001) patients in the double-transfer group had a longer duration of stay in the EDs. Patient request was the reason for transfer in 41.9% of all IHTs (111,076 of 265,046). Unavailability of medical resources was the reason for transfer in 30.0% of the double transfers (18,920 of 64,054). CONCLUSION: The incidence of double transfers is increasing. The main reasons for double transfers were patient request and unavailability of medical resources at the first-transfer hospitals. Emergency physicians and policymakers should focus on lowering the number of preventable double transfers.


Subject(s)
Emergency Service, Hospital/statistics & numerical data , Length of Stay/statistics & numerical data , Patient Transfer/statistics & numerical data , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Cross-Sectional Studies , Delivery of Health Care , Emergency Service, Hospital/organization & administration , Humans , Infant , Middle Aged , Patient Transfer/organization & administration , Prevalence , Prospective Studies , Republic of Korea , Young Adult
13.
J Appl Clin Med Phys ; 22(3): 234-245, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33705604

ABSTRACT

PURPOSE: The recently published AAPM TG-275 and the public review version of TG-315 list new recommendations for comprehensive and minimum physics initial chart checks, respectively. This article addresses the potential development and benefit of initial chart check automation when these recommendations are implemented for clinical photon/electron EBRT. METHODS: Eight board-certified physicists with 2-20 years of clinical experience performed initial chart checks using checklists from TG-275 and TG-315. Manual check times were estimated for three types of plans (IMRT/VMAT, 3D, and 2D) and for prostate, whole pelvis, lung, breast, head and neck, and brain cancers. An expert development team of three physicists re-evaluated the automation feasibility of the TG-275 checklist based on their experience developing and implementing the in-house and commercial automation tools in our institution. Three levels of initial chart check automation were simulated: (1) Auto_UMMS_tool (which consists of an in-house program and commercially available software); (2) Auto_TG275 (with full and partial automation as indicated in TG-275); and (3) Auto_UMMS_exp (with full and partial automation as determined by our experts' re-evaluation). RESULTS: With no automation of initial chart checks, the ranges of manual check times were 29-56 min (full TG-315 list) and 102-163 min (full TG-275 list); times varied significantly among physicists but varied little across tumor sites. Of the 71 checks, the 69 considered "not fully automated" in TG-275 were re-evaluated as having greater automation feasibility. Compared to no automation, the higher levels of automation yielded large reductions in both manual check times (by 44%-98%) and potentially residual detectable errors (by 15%-85%). CONCLUSION: Initial chart check automation greatly improves the practicality and efficiency of implementing the new TG recommendations. Revisiting the TG reports with new technology/practice updates may help develop and utilize more automation clinically.


Subject(s)
Electrons , Radiotherapy Planning, Computer-Assisted , Automation , Humans , Male , Photons , Quality Assurance, Health Care
14.
BMC Emerg Med ; 21(1): 71, 2021 06 16.
Article in English | MEDLINE | ID: mdl-34134648

ABSTRACT

BACKGROUND: In-hospital mortality and short-term mortality are indicators commonly used to evaluate the outcome of emergency department (ED) treatment. Although several scoring systems and machine learning-based approaches have been suggested to grade the severity of the condition of ED patients, methods for comparing severity-adjusted mortality in general ED patients between different systems have yet to be developed. The aim of the present study was to develop a scoring system to predict mortality in ED patients using data collected at the initial evaluation, and to validate its usefulness for comparing severity-adjusted mortality between institutions with different severity distributions. METHODS: The study was based on the registry of the National Emergency Department Information System, which is maintained by the National Emergency Medical Center of the Republic of Korea. Data from 2016 were used to construct the prediction model, and data from 2017 were used for validation. Logistic regression was used to build the mortality prediction model, and receiver operating characteristic curves were used to evaluate its performance. We calculated the standardized W statistic (Ws) and its 95% confidence intervals using the newly developed mortality prediction model. RESULTS: The area under the receiver operating characteristic curve of the developed scoring system for the prediction of mortality was 0.883 (95% confidence interval [CI]: 0.882-0.884). The Ws score calculated from the 2016 dataset was 0.000 (95% CI: -0.021 to 0.021); that from the 2017 dataset was 0.049 (95% CI: 0.030-0.069). CONCLUSIONS: The scoring system developed in the present study utilizing the parameters gathered in initial ED evaluations has acceptable performance for the prediction of in-hospital mortality.
Standardized W statistics based on this scoring system can be used to compare the performance of an ED with the reference data or with the performance of other institutions.


Subject(s)
Emergency Service, Hospital , Hospital Mortality , Humans , Logistic Models , ROC Curve , Republic of Korea
15.
Nephrol Dial Transplant ; 35(1): 147-154, 2020 01 01.
Article in English | MEDLINE | ID: mdl-30053139

ABSTRACT

BACKGROUND: Few studies have examined the association between hepcidin, iron indices, and bone mineral metabolism in non-dialysis chronic kidney disease (CKD) patients. METHODS: We reviewed the data of 2238 patients from a large-scale multicenter prospective Korean study (2011-16) and excluded 214 patients with missing data on markers and related medications of iron and bone mineral metabolism, hemoglobin, blood pressure, and causes of CKD. Multivariate linear regression analysis was used to identify associations between iron and bone mineral metabolism. RESULTS: The proportions of CKD stages 1-5 were 16.2%, 18.7%, 37.1%, 21.6%, and 6.4%, respectively. Per 10% increase in transferrin saturation (TSAT), there was a 0.013 mmol/L decrease in phosphorus [95% confidence interval (CI) -0.021 to -0.004; P = 0.003] and a 0.022 nmol/L increase in logarithmic 25-hydroxyvitamin D (Ln-25OHD) levels (95% CI 0.005-0.040; P = 0.019). A 1 pmol/L increase in Ln-ferritin was associated with a 0.080 ng/L decrease in Ln-intact parathyroid hormone (Ln-iPTH; 95% CI -0.122 to -0.039; P < 0.001). Meanwhile, the beta coefficients (95% CI) for the square root of serum hepcidin per 1-unit increase in phosphorus, Ln-25OHD, and Ln-iPTH were 0.594 (0.257-0.932; P = 0.001), -0.270 (-0.431 to -0.108; P = 0.001), and 0.115 (0.004-0.226; P = 0.042), respectively. In subgroup analysis, the relationships between phosphorus, 25OHD, and hepcidin were strongest in the positive-inflammation group. CONCLUSIONS: Markers of bone mineral metabolism and iron status, including hepcidin, were closely correlated with each other. Potential mechanisms of the relationship warrant further studies.


Subject(s)
Anemia/diagnosis , Biomarkers/blood , Bone Diseases, Metabolic/diagnosis , Hepcidins/blood , Inflammation/diagnosis , Iron/blood , Renal Insufficiency, Chronic/complications , Anemia/blood , Anemia/etiology , Bone Diseases, Metabolic/blood , Bone Diseases, Metabolic/etiology , Female , Ferritins/blood , Hemoglobins/analysis , Humans , Inflammation/blood , Inflammation/etiology , Male , Middle Aged , Minerals/analysis , Prospective Studies
16.
Kidney Blood Press Res ; 45(3): 442-454, 2020.
Article in English | MEDLINE | ID: mdl-32369813

ABSTRACT

INTRODUCTION: Thyroid function is evaluated by thyroid stimulating hormone (TSH) and free thyroxine (fT4). Although many studies have indicated an intimate relationship between thyroid hormones and kidney function, reports simultaneously evaluating TSH and fT4 are rare. OBJECTIVE: We aimed to analyze the association between TSH and kidney function, with emphasis on a potential nonlinear relationship, and to identify an independent relationship between fT4 and kidney function. METHODS: We reviewed the data of 7,061 subjects in the Korea National Health and Nutrition Examination Surveys who were randomly subsampled for thyroid function evaluation between 2013 and 2015. A total of 5,578 subjects were included in the final analysis after excluding those <18 years old and those with a short fasting time, abnormal fT4 levels, or thyroid disease or related medications. Creatinine-based estimated glomerular filtration rate (eGFR) was used to define kidney function. RESULTS: In multivariate linear regression analysis, a 1-unit increase in logarithmic TSH was associated with decreased eGFR (β: -1.8; 95% CI -2.3 to -1.2; p < 0.001). On the multivariate generalized additive model plot, TSH demonstrated an L-shaped relationship with eGFR, with a steeper slope over 0-4 mIU/L of TSH. A 1 µg/dL increase in fT4 was also associated with decreased eGFR (β: -7.0; 95% CI -9.4 to -4.7; p < 0.001) in the multivariate linear regression analysis; this association was reversed after adjusting for age. In the mediation analysis, the indirect effect via age and the direct effect per 1 µg/dL increase in fT4 on eGFR were 9.9 (8.1 to 11.7; p < 0.001) and -7.1 (-9.3 to -4.8; p < 0.001), respectively. CONCLUSIONS: Increased TSH was associated with decreased eGFR, particularly within the reference range. The direct effect of increased fT4 was decreased eGFR, an association partly masked by an indirect effect via age.
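The age-mediation result can be sanity-checked with the standard mediation decomposition, total effect = direct effect + indirect effect, using the point estimates reported above (illustrative arithmetic only, not a re-analysis of the data):

```python
# Reported point estimates from the abstract (eGFR change per 1 ug/dL fT4).
indirect_via_age = 9.9   # effect mediated by age
direct_effect = -7.1     # direct effect of fT4 on eGFR

# Standard mediation identity: total = direct + indirect.
total_effect = direct_effect + indirect_via_age
print(round(total_effect, 1))  # 2.8

# The positive total vs. negative direct effect explains the sign reversal
# of the fT4-eGFR association after adjusting for age.
```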


Subject(s)
Kidney Function Tests/standards , Thyroid Function Tests/standards , Adult , Female , History, 21st Century , Humans , Male , Nutrition Surveys , Reference Values , Republic of Korea
17.
J Korean Med Sci ; 35(1): e2, 2020 Jan 06.
Article in English | MEDLINE | ID: mdl-31898431

ABSTRACT

BACKGROUND: Few studies have examined the relationship between cardiac function and geometry and serum hepcidin levels in patients with chronic kidney disease (CKD). We aimed to identify this relationship. METHODS: We reviewed the data of 1,897 patients in a large-scale multicenter prospective Korean study. Logistic regression analysis was used to identify the relationship between cardiac function and geometry and serum hepcidin levels. RESULTS: The mean relative wall thickness (RWT) and left ventricular mass index (LVMI) were 0.38 and 42.0 g/m2.7, respectively. The mean ejection fraction (EF) and early diastolic mitral inflow to annulus velocity ratio (E/e') were 64.1% and 9.9, respectively. Although EF and E/e' were not associated with high serum hepcidin, RWT and LVMI were significantly associated with high serum hepcidin levels in univariate logistic regression analysis. However, in multivariate logistic regression analysis adjusting for variables related to anemia, bone mineral metabolism, comorbidities, and inflammation, only RWT remained associated: each 0.1-unit increase in RWT was associated with increased odds of high serum hepcidin (odds ratio, 1.989; 95% confidence interval, 1.358-2.916; P < 0.001). In subgroup analysis, the independent relationship between RWT and high serum hepcidin was valid only in women and in patients with low transferrin saturation (TSAT). CONCLUSION: Although causality cannot be inferred, increased RWT was independently associated with high serum hepcidin, particularly in women and patients with low TSAT. The relationship between cardiac geometry and serum hepcidin in CKD patients needs confirmation in future studies.
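Because the odds ratio is reported per 0.1-unit RWT increase, the underlying logistic coefficient and odds ratios for other increments follow by simple rescaling; a minimal sketch using the reported point estimate (illustrative arithmetic, not from the study):

```python
import math

or_per_0_1_rwt = 1.989  # reported OR per 0.1-unit increase in RWT

# Logistic coefficient per 1-unit RWT: OR = exp(beta * increment).
beta_per_unit = math.log(or_per_0_1_rwt) / 0.1

# Odds multiplier implied for a smaller 0.05-unit RWT increase:
or_per_0_05_rwt = math.exp(beta_per_unit * 0.05)
print(round(or_per_0_05_rwt, 2))  # 1.41
```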


Subject(s)
Hepcidins/blood , Renal Insufficiency, Chronic/diagnosis , Ventricular Function, Left/physiology , Adult , Aged , Anemia/complications , Echocardiography , Female , Heart Ventricles/anatomy & histology , Heart Ventricles/physiopathology , Humans , Logistic Models , Male , Middle Aged , Odds Ratio , Prospective Studies , Renal Insufficiency, Chronic/pathology , Risk Factors , Sex Factors , Stroke Volume , Transferrin/analysis
18.
Perfusion ; 35(1): 39-47, 2020 01.
Article in English | MEDLINE | ID: mdl-31146644

ABSTRACT

BACKGROUND: The objectives of this study were to 1) identify risk factors predicting re-arrest and 2) determine whether extracorporeal cardiopulmonary resuscitation results in better outcomes than conventional cardiopulmonary resuscitation for managing re-arrest in out-of-hospital cardiac arrest patients. METHODS: This retrospective analysis was based on a prospective cohort. We included adult patients with non-traumatic out-of-hospital cardiac arrest who achieved a survival event. The primary measurement was re-arrest, defined as recurrent cardiac arrest within 24 hours after the survival event. Multiple logistic regression analysis was used to predict re-arrest. Subgroup analysis was performed to evaluate the effect of extracorporeal cardiopulmonary resuscitation on survival to discharge in out-of-hospital cardiac arrest patients who experienced re-arrest. RESULTS: Of 534 patients suitable for inclusion, 203 (38.0%) were enrolled in the re-arrest group. Old age, prolonged advanced cardiac life support duration and hypotension at 0 hours after the survival event were independent variables predicting re-arrest. In the re-arrest group, the extracorporeal cardiopulmonary resuscitation group (n = 25) showed better outcomes than the conventional cardiopulmonary resuscitation group. However, multiple logistic regression for predicting survival to discharge revealed that extracorporeal cardiopulmonary resuscitation was not an independent factor, whereas a hypotensive state at re-arrest was an independent risk factor for survival. CONCLUSION: Alternative methods that reduce the advanced cardiac life support duration should be considered to prevent re-arrest and attain good outcomes in out-of-hospital cardiac arrest patients. Extracorporeal cardiopulmonary resuscitation for re-arrest tended to show better outcomes than conventional cardiopulmonary resuscitation. Avoiding or immediately correcting hypotension may prevent re-arrest and improve the outcomes of re-arrested patients.


Subject(s)
Cardiopulmonary Resuscitation , Extracorporeal Circulation , Out-of-Hospital Cardiac Arrest/therapy , Adolescent , Adult , Aged , Cardiopulmonary Resuscitation/adverse effects , Cardiopulmonary Resuscitation/mortality , Extracorporeal Circulation/adverse effects , Extracorporeal Circulation/mortality , Female , Hospital Mortality , Humans , Male , Middle Aged , Out-of-Hospital Cardiac Arrest/diagnosis , Out-of-Hospital Cardiac Arrest/mortality , Out-of-Hospital Cardiac Arrest/physiopathology , Recovery of Function , Recurrence , Registries , Retreatment , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , Young Adult
19.
Am Heart J ; 213: 73-80, 2019 07.
Article in English | MEDLINE | ID: mdl-31129440

ABSTRACT

BACKGROUND: The false positive rate (FPR) of the current basic life support (BLS) termination of resuscitation (TOR) rule for out-of-hospital cardiac arrest (OHCA) patients (arrest not witnessed, no return of spontaneous circulation prior to transport, and no shocks delivered) has raised ethical concerns. We validated the current BLS TOR rule using the nationwide Korean Cardiac Arrest Research Consortium (KoCARC) registry and identified factors for modifying the rule. METHODS: This prospective, multicenter, registry-based study was performed using nontraumatic OHCA registry data collected between October 2015 and June 2017. Independent factors associated with poor neurologic outcome were identified by multivariable analysis to propose new KoCARC TOR rules. The diagnostic performance of each TOR rule was then calculated. RESULTS: Among 4,360 OHCA patients, 2,801 (64.2%) satisfied all 3 criteria of the BLS TOR rule. The FPR and positive predictive value of the BLS TOR rule were 5.9% and 99.3%, respectively. Asystole as the initial rhythm and age > 60 years were identified as new factors for modifying the TOR rule. The new KoCARC TOR rules, which combine asystole and age > 60 years with the current TOR rule, showed a lower FPR (0.3%-2.1%) and higher positive predictive value (99.7%-99.9%) for predicting poor neurologic outcome at discharge. CONCLUSIONS: In this recent nationwide cohort, the current BLS TOR rule showed a high FPR (5.9%) for predicting poor neurologic outcome. We anticipate that our new KoCARC TOR rules, which apply 2 new factors (asystole as the initial rhythm and age > 60 years) to the BLS TOR rule, could reduce unwarranted deaths.
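The decision rules described above can be sketched as simple predicates. Function names and the exact way the 2 new factors are combined with the BLS rule are assumptions for illustration (the abstract's FPR range suggests several combinations were evaluated):

```python
def meets_bls_tor(witnessed: bool, rosc_before_transport: bool,
                  shocks_delivered: bool) -> bool:
    """Current BLS TOR rule: all 3 criteria must hold (arrest not witnessed,
    no ROSC prior to transport, no shocks delivered)."""
    return not witnessed and not rosc_before_transport and not shocks_delivered


def meets_kocarc_tor(witnessed: bool, rosc_before_transport: bool,
                     shocks_delivered: bool, initial_rhythm: str,
                     age: int) -> bool:
    """One possible KoCARC variant: the BLS rule plus asystole as the
    initial rhythm and age > 60 years (hypothetical combination)."""
    return (meets_bls_tor(witnessed, rosc_before_transport, shocks_delivered)
            and initial_rhythm == "asystole" and age > 60)


# A patient meeting the BLS rule but aged 55 is excluded by the stricter rule.
print(meets_bls_tor(False, False, False))                     # True
print(meets_kocarc_tor(False, False, False, "asystole", 55))  # False
```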


Subject(s)
Cardiopulmonary Resuscitation , Out-of-Hospital Cardiac Arrest/therapy , Registries , Resuscitation Orders , Withholding Treatment , Age Factors , Aged , Blood Circulation , Cardiopulmonary Resuscitation/statistics & numerical data , Electric Countershock , False Positive Reactions , Female , Heart Arrest , Humans , Male , Medical Futility , Middle Aged , Multivariate Analysis , Out-of-Hospital Cardiac Arrest/etiology , Predictive Value of Tests , Republic of Korea , Treatment Outcome
20.
Kidney Blood Press Res ; 44(5): 1089-1100, 2019.
Article in English | MEDLINE | ID: mdl-31505490

ABSTRACT

BACKGROUND: Urine osmolality indicates the ability of the kidney to concentrate the urine and reflects the antidiuretic action of vasopressin. However, results on the association between urine osmolality and adverse renal outcomes in chronic kidney disease (CKD) are conflicting. We investigated this association in a nationwide prospective CKD cohort. METHODS: A total of 1,999 CKD patients were categorized into 3 groups according to urine osmolality tertiles. The primary outcome was a composite of a 50% decline in the estimated glomerular filtration rate (eGFR), initiation of dialysis, or kidney transplantation. RESULTS: During a mean follow-up of 35.2 ± 19.0 months, the primary outcome occurred in 432 (21.6%) patients: 240 (36.4%), 162 (24.3%), and 30 (4.5%) in the lowest, middle, and highest tertiles, respectively. Low urine osmolality was independently associated with a greater risk of CKD progression (hazard ratio [HR], 1.71; 95% confidence interval [CI], 1.12-2.59). This association was particularly evident in patients with CKD stages 3-4 (per 10 mosm/kg decrease: HR, 1.02; 95% CI, 1.00-1.03). Adding urine osmolality to a base model of conventional factors significantly increased the ability to predict CKD progression (C-statistic, 0.86; integrated discrimination improvement [IDI], 0.021; both p < 0.001). However, adding both urine osmolality and eGFR did not further improve predictive ability compared with adding eGFR alone (C-statistic, p = 0.29; IDI, p = 0.09). CONCLUSIONS: Low urine osmolality was an independent risk factor for adverse renal outcomes in CKD patients, but its predictive ability did not surpass that of eGFR. Thus, kidney function should be considered when interpreting the clinical significance of urine osmolality.
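Under the proportional-hazards model, a per-10 mosm/kg hazard ratio compounds multiplicatively over larger decreases; a quick illustrative calculation with the reported point estimate (not a re-analysis of the cohort):

```python
# Reported HR per 10 mosm/kg decrease in urine osmolality (CKD stages 3-4).
hr_per_10 = 1.02

# Proportional-hazards scaling: a 50 mosm/kg decrease compounds five
# 10 mosm/kg steps multiplicatively.
hr_per_50 = hr_per_10 ** 5
print(round(hr_per_50, 3))  # 1.104
```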


Subject(s)
Glomerular Filtration Rate/physiology , Osmolar Concentration , Renal Insufficiency, Chronic/urine , Female , Humans , Male , Middle Aged , Prospective Studies