ABSTRACT
Introduction: Statin treatment can reduce the risk of cardiovascular disease (CVD). Paradoxically, previous studies have shown that statin use is associated with the progression of coronary artery calcification (CAC), a well-known predictor of CVD, in individuals with preserved renal function or in patients on dialysis. However, little is known about this association in patients with predialysis chronic kidney disease (CKD). The aim of this study was to characterize the relationship between statin use and CAC progression in a CKD cohort of Korean adults. Methods: We analyzed 1177 participants registered in the Korean Cohort Study for Outcome in Patients with Chronic Kidney Disease (KNOW-CKD) cohort. The coronary artery calcium score (CACS) was assessed using cardiac computed tomography at baseline and 4 years after enrollment. CAC progression was defined using the Sevrukov method. Statin users were defined as those who used statins for 50% or more of the follow-up period. Results: The median (interquartile range) CACS was 0 (0-30.33), and 318 (44.2%) participants had a CACS above 0 at baseline. There were 447 (38.0%) statin users and 730 (62.0%) statin nonusers. After 4 years, 374 patients (52.0%) demonstrated CAC progression, which was significantly more frequent in statin users than in statin nonusers (218 [58.3%] vs. 156 [41.7%], P < 0.001). The multivariate-adjusted odds ratio for CAC progression in statin users compared with statin nonusers was 1.78 (95% confidence interval, 1.26-2.50). Conclusion: Statin use is significantly and independently associated with CAC progression in Korean patients with predialysis CKD. Further research is warranted to clarify the prognostic implications of statin-related CAC progression.
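As an illustration of the type of model behind the adjusted odds ratio reported above, the sketch below fits a multivariable logistic regression for CAC progression with statin use as the exposure. The data, variable names, and covariate set are hypothetical assumptions, not the study's actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical analysis dataset: one row per participant with follow-up CT.
df = pd.DataFrame({
    "cac_progression": np.random.binomial(1, 0.5, 500),   # 1 = progression by Sevrukov criteria
    "statin_user":     np.random.binomial(1, 0.38, 500),  # >=50% of follow-up on statins
    "age":             np.random.normal(58, 10, 500),
    "male":            np.random.binomial(1, 0.6, 500),
    "egfr":            np.random.normal(50, 20, 500),
    "ldl":             np.random.normal(100, 30, 500),
})

X = sm.add_constant(df[["statin_user", "age", "male", "egfr", "ldl"]])
fit = sm.Logit(df["cac_progression"], X).fit(disp=False)

or_ci = np.exp(fit.conf_int().loc["statin_user"])
print(f"Adjusted OR for statin use: {np.exp(fit.params['statin_user']):.2f} "
      f"(95% CI {or_ci[0]:.2f}-{or_ci[1]:.2f})")
```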
ABSTRACT
BACKGROUND: To clarify whether blood proteins can predict disease progression among individuals at clinical high risk of severe mental illness (CHR-SMI), we developed a statistical model incorporating clinical and blood protein markers to distinguish, at baseline, the transition group (who developed severe mental illness; CHR-SMI-T) from the non-transition group (CHR-SMI-NT). METHODS: Ninety individuals (74 at CHR-SMI and 16 patients) were monitored for ≤4 years and were the focus of the predictive models. Three predictive models (model 1 [100 clinical variables], model 2 [158 peptides], and model 3 [100 clinical variables + 158 peptides]) were evaluated using area under the receiver operating characteristic curve (AUROC) values. Clinical and protein feature patterns were evaluated by linear mixed-effects analysis within the model at 12 and 24 months among individuals who did (CHR-SMI-T) and did not transition (CHR-SMI-NT) and in the entire group. RESULTS: Eighteen CHR-SMI individuals developed major psychiatric disorders (first-episode psychosis: 2; bipolar II disorder: 13; major depressive disorder: 3) over an average of 17.7 months. The combined model showed the highest discriminatory performance (AUROC = 0.73). Cytosolic malate dehydrogenase and transgelin-2 levels were lower in the CHR-SMI-T than in the CHR-SMI-NT group. Complement component C9, inter-alpha-trypsin inhibitor heavy chain H4, von Willebrand factor, and C-reactive protein were lower in the patient group than in the CHR-SMI-NT group. These differences were non-significant after FDR adjustment. LIMITATIONS: Small sample size and no control for medication use. CONCLUSION: This exploratory study identified clinical and proteomic markers that might predict the early onset of severe mental illness, which could aid early detection and intervention. Future studies with larger samples and controlled variables are needed to validate these findings.
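The comparison of the three predictive models could, in principle, be set up as sketched below: three feature sets evaluated by cross-validated AUROC. The classifier choice, feature encoding, and simulated data are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 90
X_clin = rng.normal(size=(n, 100))    # 100 clinical variables (assumed encoding)
X_pep  = rng.normal(size=(n, 158))    # 158 peptide levels
y      = rng.binomial(1, 0.2, n)      # 1 = transition to severe mental illness

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for name, X in [("clinical", X_clin), ("peptides", X_pep),
                ("combined", np.hstack([X_clin, X_pep]))]:
    model = make_pipeline(StandardScaler(), LogisticRegression(penalty="l2", max_iter=1000))
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: AUROC = {auc.mean():.2f} (+/- {auc.std():.2f})")
```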
ABSTRACT
BACKGROUND: Decreased lean body mass or muscle mass is associated with decreased bone mineral density in individuals with preserved renal function. However, the association between muscle mass and bone mineral density in patients with chronic kidney disease (CKD) is not well known. The aim of this study was to assess the relationship between muscle mass estimated from urine creatinine (UCr) and bone mineral density in Korean CKD patients. METHODS: This cross-sectional study analyzed 1872 participants from the Korean Cohort Study for Outcome in Patients With Chronic Kidney Disease (KNOW-CKD) cohort. Participants underwent 24-h UCr measurement (g/day) and bone mineral density measurement at the lumbar spine, total hip, and femoral neck by dual-energy X-ray absorptiometry. Patients were divided into three groups according to tertiles of 24-h UCr (T1-T3). RESULTS: The mean 24-h UCr values of T1, T2, and T3 were 0.83 ± 0.23 g, 1.18 ± 0.24 g, and 1.55 ± 0.38 g, respectively. A total of 172 patients were diagnosed with osteoporosis: 92 (14.4%) in T1, 45 (7.3%) in T2, and 35 (5.7%) in T3. The odds ratio (95% confidence interval) for osteoporosis was 0.37 (0.20-0.69) per 1 g/day increase in UCr. Compared with T1, the odds ratios (95% confidence interval) for osteoporosis were 0.58 (0.39-0.87) for T2 and 0.51 (0.32-0.80) for T3. CONCLUSION: Low 24-h UCr was associated with low bone mineral density and was significantly and independently associated with osteoporosis in Korean pre-dialysis CKD patients. Further research is warranted to verify the influence of muscle mass on bone health in CKD.
ABSTRACT
BACKGROUND: The ultimate goal of successful schizophrenia treatment is not just to alleviate psychotic symptoms, but also to reduce distress and achieve subjective well-being (SWB). We aimed to identify the determinants of SWB and their interrelationships in schizophrenia. METHODS: Data were obtained from 637 patients with schizophrenia enrolled in multicenter, open-label, non-comparative clinical trials. The Subjective Well-being under Neuroleptic Treatment scale (SWN) was used; a cut-off score of 80 indicated a high level of SWB at baseline and at 6 months. Various machine learning (ML) algorithms were employed to identify the determinants of SWB. Furthermore, network analysis and structural equation modeling (SEM) were conducted to explore detailed relationship patterns. RESULTS: The random forest (RF) model had the highest area under the curve (AUC) of 0.794 at baseline. Obsessive-compulsive symptoms (OCS) had the most significant impact on high levels of SWB, followed by somatization, cognitive deficits, and depression. The network analysis demonstrated robust connections among SWB, OCS, and somatization. SEM analysis revealed that OCS exerted the strongest direct effect on SWB, as well as an indirect effect mediated by depression. Furthermore, the contribution of OCS at baseline to SWB was maintained 6 months later. CONCLUSIONS: OCS, somatization, cognition, and depression, rather than psychotic symptoms, had significant effects on SWB in schizophrenia. Notably, OCS made the most significant contribution not only to the current state of well-being but also to follow-up SWB, implying that OCS was predictive of SWB. These findings demonstrate that OCS management is critical in the treatment of schizophrenia.
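A minimal sketch of the random forest step described above: fit the classifier, estimate the AUC by cross-validation, and rank predictors by importance. The predictor set and simulated data are illustrative; the trial's actual scales and preprocessing are not reproduced here.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(1)
n = 637
# Hypothetical predictor set (one summary score per domain).
X = pd.DataFrame({
    "ocs": rng.normal(size=n),            # obsessive-compulsive symptoms
    "somatization": rng.normal(size=n),
    "cognitive_deficit": rng.normal(size=n),
    "depression": rng.normal(size=n),
    "psychotic_symptoms": rng.normal(size=n),
})
y = rng.binomial(1, 0.5, n)               # 1 = SWN >= 80 (high subjective well-being)

rf = RandomForestClassifier(n_estimators=500, random_state=1)
auc = cross_val_score(rf, X, y, cv=StratifiedKFold(5, shuffle=True, random_state=1),
                      scoring="roc_auc").mean()
rf.fit(X, y)
importance = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
print(f"Cross-validated AUC: {auc:.3f}")
print(importance)
```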
ABSTRACT
Handgrip strength (HGS) has been suggested as an indirect measure of nutritional status in chronic kidney disease (CKD) patients, but evidence is limited for non-dialysis-dependent CKD (NDD-CKD) patients. This cross-sectional study included 404 patients from the Phase II KoreaN Cohort Study for Outcome in Patients With CKD. HGS, measured twice in each hand, was the exposure, and malnutrition status was defined as a malnutrition-inflammation score (MIS) of 6 or higher. Logistic regression adjusted for age, sex, diabetes mellitus (DM), hypertension, CKD stage, smoking, overhydration, education, and income status was used to assess malnutrition risk. The ability of HGS to predict malnutrition was evaluated using the area under the curve (AUC). Patients with lower HGS were older, had a higher prevalence of DM, and had a lower estimated glomerular filtration rate. Higher HGS was significantly associated with lower malnutrition risk after adjustment (adjusted odds ratio per 1 standard deviation increase, 0.47 [0.30-0.75]). Subgroup analyses showed no significant interaction between HGS and malnutrition risk across age, sex, DM, and CKD stage. HGS showed fair predictability for malnutrition in men (AUC 0.64 [0.46-0.83]) and women (AUC 0.71 [0.55-0.86]). In conclusion, HGS is a useful diagnostic indicator of malnutrition in NDD-CKD patients.
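AUC estimates with confidence intervals of the kind reported above could be obtained along the lines of the bootstrap sketch below; the data are simulated and the resampling scheme is an assumption, not the study's exact procedure.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 404
hgs = rng.normal(28, 9, n)                      # handgrip strength, kg (illustrative)
malnutrition = rng.binomial(1, 0.2, n)          # 1 = MIS >= 6

# Lower HGS is expected with malnutrition, so score with the negated value.
auc = roc_auc_score(malnutrition, -hgs)

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    if malnutrition[idx].min() == malnutrition[idx].max():
        continue                                # skip resamples with a single class
    boot.append(roc_auc_score(malnutrition[idx], -hgs[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"AUC {auc:.2f} (95% bootstrap CI {lo:.2f}-{hi:.2f})")
```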
Subject(s)
Hand Strength; Malnutrition; Nutritional Status; Renal Insufficiency, Chronic; Humans; Male; Female; Renal Insufficiency, Chronic/physiopathology; Renal Insufficiency, Chronic/epidemiology; Middle Aged; Cross-Sectional Studies; Aged; Malnutrition/epidemiology; Malnutrition/diagnosis; Nutrition Assessment; Risk Factors; Republic of Korea/epidemiology; Glomerular Filtration Rate
ABSTRACT
Background: Transferrin saturation (TSAT) has been used as an indicator of iron deficiency. However, there is no consensus regarding its optimal range for patients with chronic kidney disease (CKD). We aimed to analyze the effect of TSAT on the prognosis of patients with non-dialysis CKD (NDCKD). Methods: From 2011 to 2016, 2,157 NDCKD patients with baseline TSAT measurements were followed for 10 years. Patients were divided into three groups based on baseline TSAT values: <25%, 25% to <45%, and ≥45%. All-cause mortality and 4-point major adverse cardiovascular events (MACE) were analyzed using multivariable Cox regression analysis. Other iron biomarkers and mortality were also analyzed. Results: During a mean follow-up of 7.1 ± 2.9 years, 182 of 2,157 patients (8.4%) died. Compared with the 25% to <45% group, the TSAT <25% group showed significantly increased all-cause mortality (hazard ratio [HR], 1.44; 95% confidence interval [CI], 1.02-2.03; p = 0.04). The occurrence of 4-point MACE was significantly increased in the TSAT <25% group in univariable analysis (HR, 1.48; 95% CI, 1.02-2.15; p = 0.04), but not in multivariable analysis (HR, 1.38; 95% CI, 0.89-2.15; p = 0.15). Tertile comparisons of the iron-to-log-ferritin ratio showed increased mortality in the first tertile. Conclusion: TSAT <25% is an independent risk factor for all-cause mortality in patients with NDCKD, and care should be taken to prevent TSAT values of <25%. Other indicators, such as serum iron and the iron-to-log-ferritin ratio, may also be used to assess iron deficiency.
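A sketch of a multivariable Cox model with the TSAT categories coded against the 25% to <45% reference group, using the lifelines library; the simulated data and covariate list are illustrative assumptions, not the study's actual model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2157
tsat = rng.uniform(5, 60, n)                      # transferrin saturation, %
df = pd.DataFrame({
    "time": rng.exponential(7, n),                # years of follow-up
    "death": rng.binomial(1, 0.084, n),           # all-cause mortality indicator
    "tsat_low": (tsat < 25).astype(int),          # vs. reference 25% to <45%
    "tsat_high": (tsat >= 45).astype(int),
    "age": rng.normal(58, 12, n),
    "egfr": rng.normal(50, 25, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs
```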
ABSTRACT
BACKGROUND: Late hospital arrival prevents patients with stroke from receiving recanalization therapy and is associated with poor outcomes. This study used a nationwide acute stroke registry to investigate trends and regional disparities in prehospital delay and to identify factors associated with late arrival. METHODS: Patients with acute ischemic stroke or transient ischemic attack between January 2012 and December 2021 were included. Prehospital delay was measured, and its regional disparity was evaluated using the Gini coefficient across nine administrative regions. Multivariate models were used to identify factors significantly associated with prehospital delays of >4.5 h. RESULTS: A total of 144,014 patients from 61 hospitals were included. The median prehospital delay was 460 min (interquartile range, 116-1912), and only 36.8% of patients arrived at hospitals within 4.5 h. Long prehospital delays and high regional inequality (Gini coefficient > 0.3) persisted throughout the observation period. After adjusting for confounders, age > 65 years (adjusted odds ratio [aOR] = 1.23; 95% confidence interval [CI], 1.19-1.27), female sex (aOR = 1.09; 95% CI, 1.05-1.13), hypertension (aOR = 1.12; 95% CI, 1.08-1.16), diabetes mellitus (aOR = 1.38; 95% CI, 1.33-1.43), smoking (aOR = 1.15; 95% CI, 1.11-1.20), premorbid disability (aOR = 1.44; 95% CI, 1.37-1.52), and mild stroke severity (aOR = 1.55; 95% CI, 1.50-1.61) independently predicted prehospital delays of >4.5 h. CONCLUSION: Prehospital delays in Korea were long, did not improve over the study period, and showed high regional disparity. A deeper understanding of regional characteristics and further research are warranted to address the identified vulnerabilities and overcome these inequalities.
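The Gini coefficient used above to quantify regional disparity can be computed as in the sketch below. The nine regional values are hypothetical, and the exact quantity to which the registry analysis applied the coefficient is not reproduced here.

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative array (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    # Equivalent to 1 - sum of Lorenz-curve trapezoids.
    return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

# Hypothetical share of patients arriving within 4.5 h in nine regions.
arrival_within_4_5h = np.array([0.45, 0.41, 0.38, 0.36, 0.33, 0.30, 0.28, 0.22, 0.18])
print(f"Gini coefficient: {gini(arrival_within_4_5h):.2f}")
```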
ABSTRACT
BACKGROUND: Several genetic studies have been undertaken to elucidate the intricate interplay between genetics and drug response in bipolar disorder (BD). However, there has been notably limited research on biomarkers specifically linked to valproate, with only a few studies investigating integrated proteomic and genomic factors in the response to valproate treatment. Therefore, this study aimed to identify biological markers of the therapeutic response to valproate treatment in BD. METHODS: Patients with BD in remission were assessed only at baseline, whereas those experiencing acute mood episodes were evaluated at three points (baseline, 8 ± 2 weeks, and 6 ± 1 months). The response to valproate treatment was measured using the Alda scale, with individuals scoring an Alda A score ≥ 5 categorized into the acute-valproate responder (acute-VPAR) group. We analyzed 158 peptides (92 proteins) from peripheral blood samples using multiple reaction monitoring mass spectrometry, and proteomic result-guided candidate gene association analyses with 1,627 single nucleotide variants (SNVs) were performed using the Korean chip. RESULTS: Markers for 37 peptides (27 proteins) showed temporal upregulation, indicating a possible association with the response to valproate treatment. A total of 58 SNVs in 22 genes and 37 SNVs in 16 genes showed nominally significant associations with the Alda A continuous score and with the acute-VPAR group, respectively. No SNVs reached the genome-wide significance threshold; however, three SNVs (rs115788299, rs11563197, and rs117669164) in the secreted phosphoprotein 2 gene reached gene-based false discovery rate-corrected significance for association with the response to valproate treatment. Significant markers were associated with pathophysiological processes of BD, including the immune response, acute phase reaction, and coagulation cascade. These results suggest that valproate effectively suppresses mechanisms associated with disease progression. CONCLUSIONS: The markers identified in this study could be valuable indicators of the mechanisms underlying the response to valproate treatment.
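Multiple-testing control of the kind described above is commonly done with Benjamini-Hochberg false discovery rate adjustment; the sketch below shows that adjustment step on hypothetical SNV association p-values (the study's gene-based procedure may differ).

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(4)
# Hypothetical association p-values for 1,627 SNVs against the Alda A score.
pvals = rng.uniform(size=1627)
pvals[:3] = [1e-6, 5e-6, 2e-5]     # a few strong signals for illustration

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} SNVs pass the FDR-corrected threshold")
print("smallest adjusted p-values:", np.sort(p_adj)[:5])
```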
ABSTRACT
OBJECTIVE: The aim of this study was to identify high- and low-risk subgroups for lymph node (LN) metastasis in patients with presumed early-stage endometrioid endometrial cancer (EC). METHODS: Clinicopathologic data of presumed early-stage endometrioid EC patients (n=361) treated with lymphadenectomy between March 2000 and July 2022 were analyzed. None of the patients had definite evidence of LN metastasis on preoperative magnetic resonance imaging (MRI). A receiver operating characteristic curve analysis was conducted to define the sensitivity and specificity of the combined preoperative risk factors for LN metastasis identified by multivariate analysis. RESULTS: Nineteen patients (5.3%) had LN metastasis. Multivariate analysis identified cervical stromal invasion on MRI (odds ratio [OR]=4.386; 95% confidence interval [CI]=1.020-18.852; p=0.047), cornual location of tumor on MRI (OR=36.208; 95% CI=7.902-165.913; p<0.001), and lower uterine segment/isthmic location of tumor on MRI (OR=8.454; 95% CI=1.567-45.610; p=0.013) as independent factors associated with LN metastasis. Patients were categorized into low- and high-risk groups according to these risk criteria. Significant differences in the rates of LN metastasis were observed between the two groups (0.4% vs. 22.2%, p<0.001). CONCLUSION: Approximately 95% of presumed early-stage endometrioid EC patients did not have LN metastasis. A model using tumor location was significantly correlated with the risk of LN metastasis. Therefore, even in presumed early-stage endometrioid EC patients, tumor location should be investigated to determine whether LN assessment should be performed.
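The sensitivity and specificity of a combined preoperative risk score could be derived from an ROC analysis as below, here using the Youden index to select a cutoff; the risk score and data are simulated assumptions, not the study's fitted model.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(5)
n = 361
ln_metastasis = rng.binomial(1, 0.053, n)          # ~5.3% prevalence, as above
# Hypothetical combined preoperative risk score (e.g., from a multivariate model).
risk_score = rng.normal(0, 1, n) + 1.5 * ln_metastasis

fpr, tpr, thresholds = roc_curve(ln_metastasis, risk_score)
best = np.argmax(tpr - fpr)                        # Youden index J = sensitivity + specificity - 1
print(f"AUC = {roc_auc_score(ln_metastasis, risk_score):.2f}")
print(f"Cutoff {thresholds[best]:.2f}: sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```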
Subject(s)
Carcinoma, Endometrioid; Endometrial Neoplasms; Lymph Node Excision; Lymph Nodes; Lymphatic Metastasis; Magnetic Resonance Imaging; Neoplasm Staging; Humans; Female; Endometrial Neoplasms/pathology; Lymphatic Metastasis/pathology; Middle Aged; Carcinoma, Endometrioid/pathology; Aged; Lymph Nodes/pathology; Lymph Nodes/diagnostic imaging; Adult; Retrospective Studies; Risk Factors; Neoplasm Invasiveness; ROC Curve
ABSTRACT
Background: The natural course of chronic kidney disease (CKD) progression in children varies according to the underlying condition. This study aimed to identify distinct patterns of decline in kidney function and to investigate factors associated with different estimated glomerular filtration rate (eGFR) trajectories. Methods: We analyzed data from the KNOW-Ped CKD (KoreaN cohort study for Outcomes in patients With Pediatric Chronic Kidney Disease), a longitudinal, prospective cohort study. A latent class linear mixed model was applied to identify trajectory groups. Results: In a total of 287 patients, the median baseline eGFR was 63.3 mL/min/1.73 m2, and the median age was 11.5 years. The eGFR decline rate was -1.54 mL/min/1.73 m2 per year during a 6.0-year follow-up. The eGFR trajectory over time was classified into four groups. Classes 1 (n = 103) and 2 (n = 11) had a slightly reduced eGFR at enrollment, with a stable trend (ΔeGFR, 0.2/year) and a rapidly declining eGFR over time (ΔeGFR, -10.5/year), respectively. Class 3 had a normal eGFR (n = 16), and class 4 had a moderately reduced eGFR (n = 157); both of these classes showed a linear decline in eGFR over time (ΔeGFR, -4.1 and -2.4/year, respectively). In a comparison of classes 1 and 2, after adjusting for age, cause of primary renal disease, and baseline eGFR, nephrotic-range proteinuria was associated with a rapid decline in eGFR (odds ratio, 8.13). Conclusion: We identified four clinically relevant subgroups of kidney function trajectories in children with CKD. Most children showed a linear decline in eGFR; however, different patterns of eGFR trajectories exist.
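The study fitted a latent class linear mixed model, which is typically estimated with dedicated software. As a simplified, self-contained stand-in for that approach, the sketch below uses a two-stage procedure on simulated data: estimate each child's eGFR slope by least squares, then cluster the (intercept, slope) pairs into four trajectory groups.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n_children, n_visits = 287, 6

# Hypothetical annual eGFR measurements (mL/min/1.73 m2) over 6 years.
years = np.arange(n_visits)
true_slope = rng.normal(-2.0, 3.0, n_children)
baseline = rng.normal(63, 20, n_children)
egfr = baseline[:, None] + true_slope[:, None] * years + rng.normal(0, 4, (n_children, n_visits))

# Stage 1: per-child slope and intercept by least squares.
coefs = np.array([np.polyfit(years, egfr[i], deg=1) for i in range(n_children)])  # [slope, intercept]

# Stage 2: cluster the standardized (intercept, slope) pairs into four groups.
features = StandardScaler().fit_transform(coefs[:, [1, 0]])
km = KMeans(n_clusters=4, n_init=10, random_state=6).fit(features)
for k in range(4):
    sel = km.labels_ == k
    print(f"class {k}: n={sel.sum()}, mean baseline eGFR={coefs[sel, 1].mean():.1f}, "
          f"mean slope={coefs[sel, 0].mean():.1f}/year")
```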
ABSTRACT
OBJECTIVES: This study aimed to evaluate the effect of time-updated ambulatory blood pressure on chronic kidney disease (CKD) progression in patients with hypertension. METHODS: Among patients with hypertension and CKD stages 3 and 4 enrolled in a clinical trial in which hypertension was treated based on office or ambulatory blood pressure (BP), participants assigned to the ambulatory BP arm were included in this study. Ambulatory BP was measured at the start of the study and at 3, 6, and 18 months. Renal events were defined as a decrease in the estimated glomerular filtration rate (eGFR) of at least 30%, dialysis, or transplantation. RESULTS: A total of 21 renal events were observed. For baseline BP, a multivariate Cox model revealed that neither office SBP nor any component of ambulatory SBP, including mean, daytime, and night-time BP, was associated with the risk of renal events. For time-updated BP, a marginal structural model revealed that office SBP was not associated with renal events [hazard ratio 1.03, 95% confidence interval (CI) 0.99-1.07, P = 0.117], whereas higher ambulatory SBPs, including daytime (hazard ratio 1.05, 95% CI 1.01-1.10, P = 0.014), night-time (hazard ratio 1.05, 95% CI 1.02-1.08, P = 0.001), and mean (hazard ratio 1.06, 95% CI 1.02-1.10, P = 0.002) ambulatory SBP, were significantly associated with an increased risk of renal events. CONCLUSION: A higher time-updated ambulatory BP was associated with an increased risk of renal events in patients with hypertension and CKD, whereas baseline office and ambulatory BP and time-updated office BP were not.
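A time-updated exposure of this kind can be analyzed with a Cox model on start-stop formatted data, as sketched below with the lifelines library. The study additionally used marginal structural (inverse-probability-weighted) modeling, which is omitted here for brevity; the simulated data and interval structure are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(7)
rows = []
for pid in range(200):
    event_time = rng.exponential(24)                 # months to renal event (illustrative)
    for start, stop in [(0, 3), (3, 6), (6, 18), (18, 24)]:
        if start >= event_time:
            break
        rows.append({
            "id": pid,
            "start": start,
            "stop": min(stop, event_time),
            "event": int(event_time <= stop),        # >=30% eGFR decline, dialysis, or transplant
            "ambulatory_sbp": rng.normal(135, 15),   # time-updated mean ambulatory SBP, mmHg
        })
df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio per 1-mmHg higher time-updated ambulatory SBP
```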
Subject(s)
Hypertension; Renal Insufficiency, Chronic; Humans; Blood Pressure; Blood Pressure Monitoring, Ambulatory; Renal Insufficiency, Chronic/complications; Renal Dialysis
ABSTRACT
BACKGROUND: Although albuminuria is the gold standard for defining chronic kidney disease (CKD), total proteinuria has also been widely used in real-world clinical practice. Moreover, whether albuminuria offers better prognostic performance than proteinuria in patients with CKD remains inconclusive. Therefore, we aimed to compare the predictive performances of albuminuria and proteinuria in these patients. METHODS: From the Korean Cohort Study for Outcome in Patients with CKD, we included 2099 patients with CKD grades 1-5 who did not require kidney replacement therapy. We measured the spot urine albumin:creatinine ratio (mACR) and protein:creatinine ratio (PCR) and estimated the ACR (eACR) from the PCR. Kidney failure risk equation (KFRE) scores were calculated using the mACR, PCR and eACR. The primary outcome was the 5-year risk of kidney failure with replacement therapy (KFRT). RESULTS: The eACR significantly underestimated the mACR in patients with low albuminuria. The time-dependent area under the receiver operating characteristic curve showed excellent predictive performance for all KFRE scores derived from the mACR, PCR and eACR. However, the eACR was inferior to the mACR based on the continuous net reclassification index (cNRI) and integrated discrimination improvement index (IDI) in all CKD cause groups except the group with an unclassified aetiology. Moreover, the cNRI and IDI statistics indicated that both the eACR and the PCR were inferior to the mACR in patients with low albuminuria (<30 mg/g). Conversely, the predictive performance of the PCR was superior in severe albuminuria and nephrotic-range proteinuria, in which the IDI and cNRI of the PCR were greater than those of the mACR. CONCLUSIONS: The mACR, eACR and PCR showed excellent performance in predicting KFRT in patients with CKD. However, the eACR was inferior to the mACR in patients with low albuminuria, indicating that measuring rather than estimating albuminuria is preferred for these patients.
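The continuous (category-free) net reclassification index used above compares predicted risks from two models among events and non-events; a sketch of that computation on hypothetical KFRE-style risks follows. The risk values are simulated assumptions.

```python
import numpy as np

def continuous_nri(y, p_old, p_new):
    """Category-free net reclassification improvement of p_new over p_old."""
    y, p_old, p_new = map(np.asarray, (y, p_old, p_new))
    up = p_new > p_old
    down = p_new < p_old
    nri_events = up[y == 1].mean() - down[y == 1].mean()
    nri_nonevents = down[y == 0].mean() - up[y == 0].mean()
    return nri_events + nri_nonevents

rng = np.random.default_rng(8)
n = 2099
kfrt = rng.binomial(1, 0.1, n)                                            # 5-year KFRT (illustrative)
risk_macr = np.clip(rng.normal(0.1 + 0.2 * kfrt, 0.1, n), 0.001, 0.999)   # KFRE with measured ACR
risk_eacr = np.clip(risk_macr + rng.normal(0, 0.05, n), 0.001, 0.999)     # KFRE with estimated ACR

print(f"cNRI (eACR vs. mACR): {continuous_nri(kfrt, risk_macr, risk_eacr):.3f}")
```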
Subject(s)
Albuminuria; Renal Insufficiency, Chronic; Humans; Albuminuria/diagnosis; Albuminuria/etiology; Albuminuria/urine; Cohort Studies; Creatinine/urine; Proteinuria/diagnosis; Proteinuria/etiology; Proteinuria/urine; Renal Insufficiency, Chronic/complications; Renal Insufficiency, Chronic/diagnosis; Renal Insufficiency, Chronic/urine; Glomerular Filtration Rate
ABSTRACT
In a semi-competing risks model, in which a terminal event censors a non-terminal event but not vice versa, the conventional approach predicts clinical outcomes by maximum likelihood estimation. However, this approach can produce unreliable or biased estimators when the number of events in the dataset is small: parameter estimates may converge to infinity, or their standard errors can be very large. Moreover, terminal and non-terminal event times may be correlated, which can be accounted for by a frailty term. Here, we adapt the penalized likelihood with Firth's correction method to gamma frailty models with semi-competing risks data to reduce the bias caused by rare events. The proposed method is evaluated in terms of relative bias, mean squared error, standard error, and standard deviation compared with conventional methods through simulation studies. The results of the proposed method are stable and robust even when the data contain only a few events and the baseline hazard function is misspecified. We also illustrate the method with a real example from a multi-centre, patient-based cohort study to identify risk factors for chronic kidney disease progression or adverse clinical outcomes. This study provides a better understanding of semi-competing risks data in which specific diseases or events of interest are rare.
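Firth's correction adds half the log-determinant of the Fisher information to the log-likelihood. The paper applies this penalization to gamma frailty models for semi-competing risks; as a simpler, self-contained illustration of the same principle, the sketch below implements a Firth-penalized likelihood for logistic regression on simulated rare-event data (the frailty-model information matrix is not shown).

```python
import numpy as np
from scipy.optimize import minimize

def firth_logistic(X, y):
    """Maximize the Firth-penalized log-likelihood l(b) + 0.5*log|I(b)| for logistic regression."""
    X = np.column_stack([np.ones(len(y)), X])          # add intercept

    def neg_penalized_loglik(beta):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        p = np.clip(p, 1e-10, 1 - 1e-10)               # numerical safety
        loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
        info = X.T @ (X * (p * (1 - p))[:, None])      # Fisher information X'WX
        _, logdet = np.linalg.slogdet(info)
        return -(loglik + 0.5 * logdet)

    return minimize(neg_penalized_loglik, np.zeros(X.shape[1]), method="BFGS").x

# Small, rare-event example where ordinary maximum likelihood can be unstable.
rng = np.random.default_rng(9)
X = rng.normal(size=(40, 2))
y = rng.binomial(1, 0.1, 40)
print("Firth-penalized estimates (intercept, b1, b2):", firth_logistic(X, y))
```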
Subject(s)
Frailty; Humans; Cohort Studies; Risk Factors; Computer Simulation; Republic of Korea/epidemiology; Likelihood Functions
ABSTRACT
Pneumonia is a significant adverse drug reaction (ADR) associated with clozapine, characterized by high mortality and potential linkage with other inflammatory responses. Despite its clinical importance, research on the development of pneumonia during initial clozapine titration remains limited. This retrospective study included 1408 Korean inpatients with schizophrenia spectrum disorders. Data were collected from January 2000 to January 2023. Pneumonia developed in 3.5% of patients within 8 weeks of clozapine initiation. Patients who developed pneumonia were taking a greater number of antipsychotics and higher antipsychotic doses at baseline (2.14 vs. 1.58, p < 0.001; 25.64 vs. 19.34, p = 0.012). Pneumonia developed an average of 17.24 days after initiation, at an average clozapine dose of 151.28 mg/day. Titration was either paused or slowed in most of these patients, with no reported fatalities. The types of pneumonia included aspiration pneumonia, mycoplasma pneumonia, bronchopneumonia, and COVID-19 pneumonia. Myocarditis, drug reaction with eosinophilia and systemic symptoms (DRESS) syndrome, and urinary tract infections were also identified. Logistic regression analysis revealed that a greater number of concomitant antipsychotics (odds ratio [OR] = 1.59, p = 0.027) and concomitant benzodiazepine use (OR = 2.33, p = 0.005) at baseline were associated with an increased risk of pneumonia. Overall, pneumonia development during clozapine titration is linked with other inflammatory ADRs, suggesting a shared immunological mechanism. Close monitoring is recommended, especially for patients taking multiple antipsychotics and benzodiazepines. Further studies involving repeated measures of clozapine concentrations at trough and steady state, along with a more detailed description of pneumonia types, are warranted.
ABSTRACT
Background: Studies on the effect of dietary salt intake on cardiovascular (CV) outcomes in chronic kidney disease (CKD) patients are insufficient, and there is no consensus on the sodium (Na) intake level that increases the risk of CV disease in this population. Therefore, we investigated the association between dietary salt intake and CV outcomes in CKD patients. Methods: In the Korean cohort study for Outcome in patients with CKD (KNOW-CKD), 1,937 patients were eligible for the study, and their dietary Na intake was estimated from measured 24-h urinary Na excretion. The primary outcome was a composite of CV events and/or all-cause death. The secondary outcome was major adverse cardiac events (MACE). Results: Among the 1,937 subjects, there were 205 (10.5%) composite outcome events and 110 (5.6%) MACE. Compared with the reference group (urinary Na excretion <2.0 g/day), the group with the highest measured 24-h urinary Na excretion (≥8.0 g/day) had an increased risk of both the composite outcome (hazard ratio 3.29 [95% confidence interval 1.00-10.81]; P = 0.049) and MACE (hazard ratio 6.28 [95% confidence interval 1.45-27.20]; P = 0.013) in a cause-specific hazard model. Subgroup analysis also showed a pronounced association between dietary salt intake and the composite outcome in patients with abdominal obesity, female sex, lower estimated glomerular filtration rate (<60 mL/min per 1.73 m2), no overt proteinuria, or a lower urinary potassium-to-creatinine ratio (<46 mmol/g). Conclusion: A high-salt diet is associated with adverse CV outcomes in non-dialysis CKD patients.
ABSTRACT
Prognostic models relating survival outcomes to clinical risk factors, such as the Cox model, have been developed over the past decades in the medical field. Although datasets containing subjects' information are steadily growing, the number of events is often relatively low as medical technology develops. Accordingly, such models may show poor discrimination and low predictive ability between low- and high-risk groups. The main goal of this study was to evaluate the predicted probabilities of three existing competing risks models under varying censoring rates. The three methods were illustrated and compared in a longitudinal study of a nationwide prospective cohort of patients with chronic kidney disease in Korea. Their prediction accuracy and discrimination ability were compared in terms of the concordance index (C-index), integrated Brier score (IBS), and calibration slope. In addition, we found that these methods perform differently depending on whether covariate effects are linear or nonlinear under various censoring rates.
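The discrimination and accuracy metrics named above can be computed roughly as in the sketch below: Harrell's C-index and a Brier score at a fixed horizon for a Cox model on simulated data. The integrated Brier score and calibration slope used in the study involve additional censoring weighting and time integration, which are omitted here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(10)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(55, 12, n),
    "egfr": rng.normal(50, 20, n),
})
risk = 0.02 * df["age"] - 0.03 * df["egfr"]
df["time"] = rng.exponential(np.exp(-risk) * 8)       # years; shorter with higher risk
df["event"] = rng.binomial(1, 0.7, n)                 # 1 = progression, 0 = censored

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")

# Discrimination: higher predicted hazard should correspond to shorter survival.
c = concordance_index(df["time"], -cph.predict_partial_hazard(df), df["event"])

# Brier score at a 5-year horizon, ignoring censoring weights for simplicity.
surv5 = cph.predict_survival_function(df, times=[5.0]).iloc[0].values
observed_event_by_5 = ((df["time"] <= 5) & (df["event"] == 1)).astype(float)
brier5 = np.mean(((1 - surv5) - observed_event_by_5) ** 2)

print(f"C-index: {c:.3f}, Brier score at 5 years: {brier5:.3f}")
```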
Subject(s)
Renal Insufficiency, Chronic; Humans; Longitudinal Studies; Prospective Studies; Calibration; Renal Insufficiency, Chronic/diagnosis; Risk Factors
ABSTRACT
Safe and effective administration of clozapine requires careful monitoring for inflammatory reactions during the initial titration. The concentration-to-dose (C/D) ratio must also be taken into account, as it may vary among ethnicities. In this retrospective study, 1408 Korean schizophrenia inpatients were examined during the first 8 weeks of clozapine titration. The average doses of clozapine administered during weeks 1, 2, 4, and 8 were 77.37, 137.73, 193.20, and 212.83 mg/day, with significantly lower doses for females than males. The average C/D ratio was significantly higher in females than in males (1.75 ± 1.04 vs. 1.11 ± 0.67 ng/mL per mg/day). Patients with higher C/D ratios were more likely to experience fever and were prescribed lower doses of clozapine from week 4 onward. In total, 22.1% of patients developed a fever, at an average of 15.74 days after initiating clozapine. Patients who developed a fever were younger, used more antipsychotics at baseline, had a higher C/D ratio, and had a higher incidence of elevated C-reactive protein levels. A higher C/D ratio, use of a greater number of antipsychotics at baseline, and concomitant olanzapine use were risk factors for the development of inflammatory reactions. The incidences of pneumonia, agranulocytosis, and myocarditis within 8 weeks were 3.7%, 0.3%, and 0.1%, respectively. In summary, the target dose for clozapine titration is lower for Korean schizophrenia patients, who show a higher C/D ratio and more frequent fever compared with Western patients; however, myocarditis occurs rarely. Our findings may inform clozapine titration methods for the East Asian population.
ABSTRACT
BACKGROUND: Despite efforts to treat critically ill patients who require continuous renal replacement therapy (CRRT) due to acute kidney injury (AKI), their mortality risk remains high. This excess risk may be partly attributable to complications of CRRT, such as arrhythmias. Here, we addressed the occurrence of ventricular tachycardia (VT) during CRRT and its relationship with patient outcomes. METHODS: This study retrospectively enrolled 2,397 patients who started CRRT due to AKI from 2010 to 2020 at Seoul National University Hospital in Korea. The occurrence of VT was evaluated from the initiation of CRRT until weaning from CRRT. The odds ratios (ORs) of mortality outcomes were estimated using logistic regression models after adjustment for multiple variables. RESULTS: VT occurred in 150 patients (6.3%) after starting CRRT. Among them, 95 cases were defined as sustained VT (i.e., lasting ≥30 seconds), and the other 55 cases were defined as non-sustained VT (i.e., lasting <30 seconds). The occurrence of sustained VT was associated with a higher mortality rate than nonoccurrence (OR, 2.04; 95% confidence interval [CI], 1.23-3.39 for 30-day mortality; OR, 4.06; 95% CI, 2.04-8.08 for 90-day mortality). The mortality risk did not differ between patients with non-sustained VT and those without VT. A history of myocardial infarction, vasopressor use, and certain trends in blood laboratory findings (such as acidosis and hyperkalemia) were associated with the subsequent risk of sustained VT. CONCLUSION: Sustained VT after starting CRRT is associated with increased patient mortality. Monitoring of electrolytes and acid-base status during CRRT is essential because of their relationship with the risk of VT.
ABSTRACT
BACKGROUND: The new Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equations without a race coefficient have gained recognition across the United States. We aimed to test whether these new equations performed well in Korean patients with chronic kidney disease (CKD). METHODS: This study included 2,149 patients with CKD G1-G5 without kidney replacement therapy from the Korean Cohort Study for Outcome in Patients with CKD (KNOW-CKD). The estimated glomerular filtration rate (eGFR) was calculated using the new CKD-EPI equations with serum creatinine and cystatin C. The primary outcome was 5-year risk of kidney failure with replacement therapy (KFRT). RESULTS: When we adopted the new creatinine equation [eGFRcr (NEW)], 81 patients (23.1%) with CKD G3a based on the current creatinine equation (eGFRcr) were reclassified as CKD G2. Accordingly, the number of patients with eGFR of <60 mL/min/1.73 m2 decreased from 1,393 (64.8%) to 1,312 (61.1%). The time-dependent area under the receiver operating characteristic curve for 5-year KFRT risk was comparable between the eGFRcr (NEW) (0.941; 95% confidence interval [CI], 0.922-0.960) and eGFRcr (0.941; 95% CI, 0.922-0.961). The eGFRcr (NEW) showed slightly better discrimination and reclassification than the eGFRcr. However, the new creatinine and cystatin C equation [eGFRcr-cys (NEW)] performed similarly to the current creatinine and cystatin C equation. Furthermore, eGFRcr-cys (NEW) did not show better performance for KFRT risk than eGFRcr (NEW). CONCLUSION: Both the current and the new CKD-EPI equations showed excellent predictive performance for 5-year KFRT risk in Korean patients with CKD. These new equations need to be further tested for other clinical outcomes in Koreans.
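For reference, the sketch below implements the commonly published form of the 2021 race-free CKD-EPI creatinine equation [eGFRcr (NEW)] used above; the coefficients should be checked against the original publication before any clinical use, and the cystatin C-based equations are not shown.

```python
def egfr_ckd_epi_2021_cr(scr_mg_dl: float, age: float, female: bool) -> float:
    """2021 race-free CKD-EPI creatinine equation (mL/min/1.73 m2).

    Coefficients as commonly published; verify against the original
    2021 CKD-EPI publication before clinical use.
    """
    kappa = 0.7 if female else 0.9
    alpha = -0.241 if female else -0.302
    ratio = scr_mg_dl / kappa
    egfr = 142 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.200 * 0.9938 ** age
    return egfr * 1.012 if female else egfr

# Example: 60-year-old woman with serum creatinine 1.1 mg/dL.
print(f"eGFRcr (NEW) = {egfr_ckd_epi_2021_cr(1.1, 60, True):.1f} mL/min/1.73 m2")
```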