ABSTRACT
Despite advances in the understanding of the pathophysiology of cytomegalovirus (CMV) infection, it remains one of the most common infectious complications after allogeneic hematopoietic stem cell transplantation (allo-HSCT). The aim of this study was to determine the genotype of cytokines and chemokines in donors and recipients and their association with CMV reactivation. Eighty-five patients receiving an allo-HSCT from an HLA-identical sibling donor were included in the study. Fifty genes were selected for their potential role in the pathogenesis of CMV infection. CMV DNAemia was evaluated until day 180 after allo-HSCT. CMV reactivation was observed in 51/85 (60%) patients. Of the 213 genetic variants selected, 11 polymorphisms in 7 different genes (CXCL12, IL12A, KIR3DL1, TGFB2, TNF, IL1RN, and CD48) were associated with development of or protection from CMV reactivation. A predictive model using five of these polymorphisms (CXCL12 rs2839695, IL12A rs7615589, KIR3DL1 rs4554639, and TGFB2 rs5781034 for the recipient and CD48 rs2295615 for the donor), together with the development of acute GVHD grade III/IV, improved risk stratification of CMV reactivation. In conclusion, these data suggest that screening for five polymorphisms in recipient and donor before transplantation could help predict the individual risk of developing CMV infection after HLA-identical allo-HSCT.
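The published model's coefficients are not reported in this abstract; purely as an illustration of how such a pre-transplant score might be assembled, the sketch below counts risk factors additively (one hypothetical point per adverse genotype plus one for acute GVHD grade III/IV). The risk-genotype assignments and equal weights are assumptions, not the study's model.

```python
# Hypothetical sketch of an additive CMV-reactivation risk score.
# Risk-genotype assignments and equal weights are illustrative assumptions;
# the published model's coefficients are not given in the abstract.

RECIPIENT_SNPS = ["CXCL12_rs2839695", "IL12A_rs7615589",
                  "KIR3DL1_rs4554639", "TGFB2_rs5781034"]
DONOR_SNPS = ["CD48_rs2295615"]

def cmv_risk_score(recipient_risk_genotypes: dict,
                   donor_risk_genotypes: dict,
                   acute_gvhd_grade_3_4: bool) -> int:
    """Count risk factors present (one point each, hypothetical weighting)."""
    score = sum(recipient_risk_genotypes.get(snp, False) for snp in RECIPIENT_SNPS)
    score += sum(donor_risk_genotypes.get(snp, False) for snp in DONOR_SNPS)
    score += int(acute_gvhd_grade_3_4)
    return score

# Example: a recipient carrying two risk genotypes, a donor carrying none,
# and grade III acute GVHD accumulate 3 of 6 possible points.
example = cmv_risk_score(
    {"CXCL12_rs2839695": True, "TGFB2_rs5781034": True},
    {"CD48_rs2295615": False},
    acute_gvhd_grade_3_4=True,
)
print(example)  # 3
```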
Subject(s)
Cytomegalovirus Infections, Hematopoietic Stem Cell Transplantation, Cytomegalovirus/genetics, Cytomegalovirus Infections/etiology, Cytomegalovirus Infections/genetics, Hematopoietic Stem Cell Transplantation/adverse effects, Humans, Immunogenetics, Retrospective Studies, Transplantation, Homologous/adverse effects
ABSTRACT
OBJECTIVES: To assess the safety of enteral nutrition (EN) in children on extracorporeal membrane oxygenation (ECMO), and to describe nutritional status and the characteristics of nutritional support in this population. METHODS: A retrospective single-center analysis (2006-2016) including children <18 years on ECMO. Demographic data, nutritional status, characteristics of nutritional support, and development of gastrointestinal (GI) complications were recorded. RESULTS: One hundred children, with a median age of 9.7 months (interquartile range [IQR] 3.9-63.1), were enrolled. Undernutrition was prevalent among children on ECMO (33.3%), mainly in patients <2 years (P = 0.042). Most patients (64%) received EN at some point during ECMO therapy. EN was administered in the first 48 hours after ECMO initiation (48HEN) to 60.3% of the children. The mortality rate in the Pediatric Intensive Care Unit was lower in patients who received EN as the initial artificial nutrition support (ANS) (37.7% vs 51%, P = 0.005) and in children on 48HEN (34% vs 50%, P = 0.04). In the logistic regression analysis, duration of ECMO support and a low cardiac output indication were the only factors associated with mortality. Although 45% of patients on ECMO developed digestive complications, these were mostly mild, with constipation the most prevalent. In the logistic regression analysis, EN was not associated with an increase in GI complications (P = 0.09). Only three patients developed intestinal ischemia (one without EN and two on EN). CONCLUSIONS: Undernutrition is prevalent among children on ECMO, mainly in infants <2 years. EN is not associated with severe gastrointestinal complications or higher mortality in these children.
Subject(s)
Extracorporeal Membrane Oxygenation, Gastrointestinal Diseases, Child, Enteral Nutrition/adverse effects, Extracorporeal Membrane Oxygenation/adverse effects, Gastrointestinal Diseases/etiology, Humans, Infant, Nutritional Status, Retrospective Studies, Treatment Outcome
ABSTRACT
Lung resection surgery (LRS) causes an intense local and systemic inflammatory response, and there is a relationship between inflammation and postoperative complications (POCs). It has also been proposed that the inflammation and complications related to the surgery may promote the recurrence of cancer and therefore worsen survival. We investigated the association between inflammatory biomarkers, severity of POCs, and long-term outcome in patients who were discharged after LRS. This is a prospective substudy of a randomized controlled trial. We established three groups based on the presence of POCs evaluated with the Clavien-Dindo (C-D) classification: patients with no postoperative complications (No-POCs group) (C-D = 0), patients who developed light POCs (L-POCs group) (C-D = I-II), and patients with major POCs (M-POCs group) (C-D = III, IV, or V). Kaplan-Meier curves and a Cox regression model were used to compare survival and oncologic recurrence among these groups. Patients who developed POCs (light or major) had an increase in several inflammatory biomarkers (TNF-α, IL-6, IL-7, IL-8) compared with the No-POCs group. This pro-inflammatory status plays a fundamental role in the appearance of POCs and therefore in a shorter life expectancy. Individuals in the M-POCs group had a higher risk of death (HR = 3.59, 95% CI 1.69 to 7.63) compared with individuals in the No-POCs group (p = 0.001). Patients in the L-POCs group showed better survival than those in the M-POCs group (HR = 2.16, 95% CI 1.00 to 4.65, p = 0.049). In addition, M-POCs patients had a higher risk of recurrence in the first 2 years compared with L-POCs (p = 0.008) or No-POCs (p = 0.002) patients. In patients who are discharged after undergoing oncologic LRS, there is an association between the occurrence of POCs and long-term outcome. Oncologists should pay special attention to patients who develop POCs after LRS.
Subject(s)
Lung, Postoperative Complications, Humans, Prospective Studies, Retrospective Studies
ABSTRACT
BACKGROUND: Most circulating vitamin D (VitD) is transported bound to vitamin D-binding protein (DBP), and several DBP single nucleotide polymorphisms (SNPs) have been related to circulating VitD concentration and disease. In this study, we evaluated the association between DBP SNPs and AIDS progression in antiretroviral treatment (ART)-naïve HIV-infected patients. METHODS: We performed a retrospective study in 667 patients who were classified according to their pattern of AIDS progression (183 long-term non-progressors (LTNPs), 334 moderate progressors (MPs), and 150 rapid progressors (RPs)) and 113 healthy blood donors (HIV-, HCV-, and HBV-negative subjects). We genotyped seven DBP SNPs (rs16846876, rs12512631, rs2070741, rs2282679, rs7041, rs1155563, rs2298849) using Agena Bioscience's MassARRAY platform. Genetic associations were evaluated by generalized linear models (GLMs) adjusted for age at HIV diagnosis, gender, risk group, and the VDR rs2228570 SNP. Multiple testing correction was performed by the false discovery rate (Benjamini-Hochberg procedure; q-value). RESULTS: All SNPs were in Hardy-Weinberg equilibrium (p > 0.05), and genotypic frequencies of the DBP SNPs were similar in healthy controls and HIV-infected patients. In unadjusted GLMs, we found a significant association with AIDS progression only for the rs16846876 and rs12512631 SNPs. In adjusted GLMs, the DBP rs16846876 SNP showed a significant association under the recessive inheritance model [LTNPs vs. RPs (adjusted odds ratio (aOR) = 3.53; q-value = 0.044) and LTNPs vs. MPs (aOR = 3.28; q-value = 0.030)] and the codominant model [LTNPs vs. RPs (aOR = 4.92; q-value = 0.030) and LTNPs vs. MPs (aOR = 3.15; q-value = 0.030)]. The DBP rs12512631 SNP showed a significant association under the dominant [LTNPs vs. RPs (aOR = 0.49; q-value = 0.031) and LTNPs vs. MPs (aOR = 0.6; q-value = 0.047)], additive [LTNPs vs. RPs (aOR = 0.61; q-value = 0.031)], overdominant [LTNPs vs. MPs (aOR = 0.55; q-value = 0.032)], and codominant [LTNPs vs. RPs (aOR = 0.52; q-value = 0.036) and LTNPs vs. MPs (aOR = 0.55; q-value = 0.032)] models. Additionally, we found a significant association between DBP haplotypes (composed of rs16846876 and rs12512631) and AIDS progression (LTNPs vs. RPs): DBP haplotype AC (aOR = 0.63; q-value = 0.028) and DBP haplotype TT (aOR = 1.64; q-value = 0.028). CONCLUSIONS: The DBP rs16846876 and rs12512631 SNPs are related to the patterns of clinical AIDS progression (LTNP, MP, and RP) in ART-naïve HIV-infected patients. Our findings provide new knowledge about AIDS progression that may be relevant to understanding the pathogenesis of HIV infection.
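The Benjamini-Hochberg correction used here converts per-comparison p-values into q-values; a minimal, software-agnostic sketch of that calculation (illustrative p-values only) follows.

```python
import numpy as np

def benjamini_hochberg(p_values):
    """Return Benjamini-Hochberg q-values for an array of p-values."""
    p = np.asarray(p_values, dtype=float)
    n = p.size
    order = np.argsort(p)                       # ascending p-values
    ranked = p[order] * n / np.arange(1, n + 1)  # p_(i) * n / i
    # Enforce monotonicity from the largest rank downwards, cap at 1.
    q_sorted = np.minimum.accumulate(ranked[::-1])[::-1].clip(max=1.0)
    q = np.empty_like(q_sorted)
    q[order] = q_sorted
    return q

# Example: p-values from several inheritance-model tests (illustrative numbers).
print(benjamini_hochberg([0.004, 0.012, 0.020, 0.300, 0.800]))
```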
Subject(s)
Acquired Immunodeficiency Syndrome/genetics, Anti-Retroviral Agents/therapeutic use, DNA-Binding Proteins/genetics, Disease Progression, HIV/physiology, Polymorphism, Single Nucleotide, Transcription Factors/genetics, Acquired Immunodeficiency Syndrome/virology, Adult, Cohort Studies, DNA-Binding Proteins/metabolism, Female, Humans, Male, Middle Aged, Retrospective Studies, Spain, Transcription Factors/metabolism
ABSTRACT
Early detection of patients with a high risk of postoperative pulmonary complications (PPCs) could improve postoperative strategies. We investigated the role of monitoring systemic and lung inflammatory biomarkers during surgery and the early postoperative period to detect patients at high risk of PPCs after lung resection surgery (LRS). This is a substudy of a randomized controlled trial on the inflammatory effects of anaesthetic drugs during LRS. We classified patients into two groups, depending on whether or not they developed PPCs. We constructed three multivariate logistic regression models to analyse the power of the biomarkers to predict PPCs. Model 1 included only the usual clinical variables; Model 2 included lung and systemic inflammatory biomarkers; and Model 3 combined Models 1 and 2. Comparisons between the models were based on the area under the receiver operating characteristic curve (AUROC) and tests of integrated discrimination improvement (IDI). Statistical significance was set at p < 0.05. PPCs were detected in 37 (21.3%) patients during admission. The AUROC for Models 1, 2, and 3 was 0.79 (95% CI 0.71-0.87), 0.80 (95% CI 0.72-0.88), and 0.93 (95% CI 0.88-0.97), respectively. Comparison of the AUROC between Models 1 and 2 did not reveal a statistically significant difference (p = 0.79). However, Model 3 was superior to Model 1 (p < 0.001). Model 3 had an IDI of 0.29 (p < 0.001) and a net reclassification index of 0.28 (p = 0.007). A mathematical model combining inflammation biomarkers with clinical variables predicts PPCs after LRS better than a model that includes only clinical data. Clinical trial registration: NCT02168751; EudraCT 2011-002294-29.
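As a rough illustration of the model comparison described above, the following sketch computes a rank-based AUROC and the integrated discrimination improvement (IDI) between two sets of predicted probabilities; the data and variable names are invented, and the trial's actual modelling pipeline is not reproduced.

```python
import numpy as np

def auroc(y_true, y_score):
    """Rank-based AUROC: probability a random event outranks a random non-event."""
    y_true, y_score = np.asarray(y_true), np.asarray(y_score)
    pos, neg = y_score[y_true == 1], y_score[y_true == 0]
    # Count pairwise wins; ties count as 0.5.
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def idi(y_true, p_old, p_new):
    """Integrated discrimination improvement: change in discrimination slope."""
    y_true = np.asarray(y_true)
    p_old, p_new = np.asarray(p_old), np.asarray(p_new)
    slope_new = p_new[y_true == 1].mean() - p_new[y_true == 0].mean()
    slope_old = p_old[y_true == 1].mean() - p_old[y_true == 0].mean()
    return slope_new - slope_old

# Illustrative use with made-up predicted probabilities for 6 patients.
y = [1, 0, 1, 0, 0, 1]
clinical_only = [0.4, 0.3, 0.5, 0.2, 0.4, 0.6]           # "Model 1"-like
clinical_plus_markers = [0.7, 0.2, 0.8, 0.1, 0.3, 0.9]   # "Model 3"-like
print(auroc(y, clinical_plus_markers), idi(y, clinical_only, clinical_plus_markers))
```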
Subject(s)
Lung/surgery, Postoperative Complications/diagnosis, Aged, Anesthesia/methods, Area Under Curve, Biomarkers/metabolism, Bronchoalveolar Lavage Fluid, Cytokines/metabolism, Female, Forced Expiratory Volume, Hemodynamics, Humans, Inflammation, Lung/metabolism, Male, Middle Aged, Models, Theoretical, Multivariate Analysis, Probability, Prospective Studies, ROC Curve, Risk Factors, Thoracic Surgery
ABSTRACT
BACKGROUND: Little is known about the utility of transient elastography (TE) for assessing the prognosis of patients with decompensated cirrhosis (DC). METHODS: We analyzed HIV/HCV-coinfected patients with DC who underwent TE as part of their routine follow-up between 2006 and 2015. We also calculated the liver stiffness-spleen diameter-to-platelet ratio score (LSPS), FIB-4 index, albumin, MELD score, and Child-Pugh score. The primary outcome was death. RESULTS: The study population comprised 65 patients. After a median follow-up of 32 months after the first TE, 17 patients had received anti-HCV therapy and 31 patients had died. The highest area under the receiver operating characteristic curve (AUROC) value for prediction of death was observed with albumin (0.695), followed by the Child-Pugh score (0.648), both with P values < .05. Lower AUROC values were observed with the MELD score (0.633), TE (0.618), LSPS (0.595), and FIB-4 (0.569), all with P values > .05. In the univariate Cox regression analysis, albumin, FIB-4, Child-Pugh score, and MELD score, but not TE, were associated with death. In the multivariate analysis, albumin and Child-Pugh score were the only baseline variables associated with death. CONCLUSIONS: Our results suggest that TE is not useful for assessing the prognosis of HIV-infected patients with decompensated HCV-related cirrhosis. Albumin concentration and the Child-Pugh score were the most consistent predictors of death in this population.
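The non-invasive indices computed alongside TE follow widely published formulas; the sketch below reproduces two of them (FIB-4 and LSPS) under assumed input units (AST/ALT in IU/L, platelets in 10^9/L, liver stiffness in kPa, spleen diameter in cm). The study's exact implementation may differ.

```python
import math

def fib4(age_years: float, ast_iu_l: float, alt_iu_l: float, platelets_10e9_l: float) -> float:
    """FIB-4 index = (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_iu_l) / (platelets_10e9_l * math.sqrt(alt_iu_l))

def lsps(liver_stiffness_kpa: float, spleen_diameter_cm: float, platelets_10e9_l: float) -> float:
    """LSPS = liver stiffness x spleen diameter / platelet count."""
    return liver_stiffness_kpa * spleen_diameter_cm / platelets_10e9_l

# Example with illustrative values only.
print(round(fib4(52, 80, 60, 110), 2))   # ~4.44e2 / 852 ... prints ~4.88
print(round(lsps(25.0, 14.0, 90), 2))    # prints ~3.89
```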
Subject(s)
AIDS-Related Opportunistic Infections/diagnostic imaging, Elasticity Imaging Techniques/methods, HIV Infections/diagnostic imaging, Hepatitis C/diagnostic imaging, Liver/diagnostic imaging, AIDS-Related Opportunistic Infections/complications, AIDS-Related Opportunistic Infections/mortality, Adult, Area Under Curve, Female, HIV Infections/complications, HIV Infections/mortality, Hepatitis C/complications, Hepatitis C/mortality, Humans, Liver/pathology, Liver/virology, Liver Cirrhosis/diagnostic imaging, Liver Cirrhosis/virology, Male, Middle Aged, Prognosis, Proportional Hazards Models, ROC Curve
ABSTRACT
BACKGROUND: Our aim was to determine whether α-chain of the IL-7 receptor (IL7RA) polymorphisms (rs10491434, rs6897932 and rs987106) are associated with the clinical pattern of AIDS progression in ART-naïve HIV-infected patients. MATERIALS AND METHODS: We carried out a cross-sectional study in 673 HIV-infected patients who were classified into three groups according to the clinical pattern of AIDS progression (188 long-term nonprogressors (LTNPs), 334 moderate progressors (MPs) and 151 rapid progressors (RPs)). Additionally, 134 healthy blood donors participated as a control group. We selected three IL7RA polymorphisms located in three regulatory regions [rs6897932 (exon 6), rs987106 (intronic region) and rs10491434 (3'UTR)]. DNA genotyping was performed using Sequenom's MassARRAY platform. RESULTS: The control group and the HIV-infected patients were similar in age and percentage of males. The LTNP group was older at HIV diagnosis and at inclusion in the study and had a higher percentage of intravenous drug users (IDU) (P < 0.001). In addition, the LTNP group had a lower proportion of male patients and of homosexual HIV transmission than the MP and RP groups (P < 0.001). Allelic, genotypic and haplotype frequencies of the IL7RA polymorphisms were similar between healthy controls and HIV-infected patients (P > 0.05), and among the subgroups of HIV patients defined by AIDS progression (LTNPs, MPs and RPs) (P > 0.05). The adjusted logistic regression did not show any significant association between IL7RA polymorphisms and AIDS progression. CONCLUSIONS: IL7RA polymorphisms (rs6897932, rs987106 and rs10491434) were not associated with AIDS progression in a Spanish population. Therefore, IL7RA polymorphisms do not appear to help explain HIV pathogenesis in untreated HIV-infected patients with different clinical evolution.
Subject(s)
Gene Expression Regulation, HIV Infections/genetics, HIV Infections/mortality, Interleukin-7 Receptor alpha Subunit/genetics, Polymorphism, Single Nucleotide, Acquired Immunodeficiency Syndrome/diagnosis, Acquired Immunodeficiency Syndrome/genetics, Acquired Immunodeficiency Syndrome/mortality, Adult, Age Factors, Cross-Sectional Studies, Disease Progression, Female, HIV Infections/diagnosis, Humans, Logistic Models, Male, Middle Aged, Prognosis, Reference Values, Risk Assessment, Sensitivity and Specificity, Sex Factors, Spain, Survival Analysis
ABSTRACT
BACKGROUND: IL7RA polymorphisms have recently been associated with CD4+ T-cell decline in untreated HIV-infected subjects and CD4+ T-cell recovery in patients on combination antiretroviral therapy (cART). The aim of this study was to evaluate whether IL7RA polymorphisms are associated with CD4+ T-cell recovery in HIV-infected patients on long-term cART. STUDY DESIGN: We performed a retrospective study in 151 cART-naïve patients with severe immunodeficiency (CD4+ counts ≤200 cells/mm³). IL7RA polymorphism genotyping was performed using Sequenom's MassARRAY platform. The outcome variable was the time to achieve the first CD4+ count ≥500 cells/mm³ during follow-up. RESULTS: Two different trends of CD4+ T-cell recovery were found in the Kaplan-Meier analysis. During the first 48 months, 60 of 151 (39.7%) patients reached CD4+ T-cell values ≥500 cells/mm³, and no differences were observed between IL7RA genotypes. After the first 48 months of follow-up, 27 of 151 (17.8%) patients reached CD4+ T-cell values ≥500 cells/mm³, with a different pattern of CD4+ recovery depending on IL7RA genotype. Patients with the rs10491434 TT genotype and the rs6897932 TT genotype were more likely to achieve a CD4+ value ≥500 cells/mm³ than patients with the rs10491434 CT/CC genotype (adjusted hazard ratio (aHR) = 3.59; P = 0.005) and patients with the rs6897932 CC/CT genotype (aHR = 11.7; P < 0.001). CONCLUSIONS: IL7RA polymorphisms appear to be associated with CD4+ T-cell recovery in HIV-infected patients who started cART with severe immunodeficiency, in the second phase of CD4+ T-cell recovery after long-term cART.
Subject(s)
HIV Infections/drug therapy, HIV Protease Inhibitors/therapeutic use, Receptors, Interleukin-7/genetics, Reverse Transcriptase Inhibitors/therapeutic use, Adult, Antiretroviral Therapy, Highly Active, CD4 Lymphocyte Count, CD4-Positive T-Lymphocytes/immunology, Female, Genotype, HIV Infections/genetics, HIV Infections/immunology, Humans, Kaplan-Meier Estimate, Male, Middle Aged, Polymorphism, Genetic, Polymorphism, Single Nucleotide, Proportional Hazards Models, Retrospective Studies, Spain, Treatment Outcome, Viral Load
ABSTRACT
Previous studies have shown the reproducibility of the 2008 World Health Organization (WHO) classification in myelodysplastic syndromes (MDS), especially when multilineage dysplasia or excess of blasts is present. However, there are few data regarding the reproducibility of MDS with unilineage dysplasia. The revised International Prognostic Scoring System (R-IPSS) described two new morphological categories, distinguishing bone marrow (BM) blast cell counts of 0-2% and >2% to <5%. This distinction is critical for establishing prognosis, but the reproducibility of this threshold has not yet been demonstrated. The objectives of our study were to explore the reliability of the 2008 WHO classification regarding unilineage vs. multilineage dysplasia, by reviewing 110 cases previously diagnosed with MDS, and to study whether the threshold of ≤2% BM blasts is reproducible among different observers. We used the same methodology as in our previous paper [Font et al. (2013) Ann Hematol 92:19-24], encouraging investigators to include patients with <5% BM blasts. Samples were collected from 11 hospitals and were evaluated by 11 morphologists. Each observer evaluated 20 samples, and each sample was analyzed independently by two morphologists. Discordance was observed in 36/108 suitable cases (33%; kappa = 0.503). A diagnosis of MDS with unilineage dysplasia (refractory cytopenia with unilineage dysplasia (RCUD), refractory anemia with ring sideroblasts (RARS) or unclassifiable MDS) was made in 33 patients by either of the two observers. We combined this series with the cases of RCUD or RARS included in our 2013 paper, thus obtaining 50 cases with unilineage dysplasia according to at least one of the observers. The whole series showed very low agreement for RCUD (5/23, 21%) and RARS (5/28, 18%). Regarding the BM blast count, the threshold of ≤2% was not reproducible (discordance rate 32/108 cases; kappa = 0.277). Our study shows that among the MDS WHO 2008 categories, interobserver discordance seems to be high in cases with unilineage dysplasia. We also show that the threshold of ≤2% BM blasts established by the R-IPSS may not be easy to reproduce by morphologists in real practice.
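Interobserver agreement in this study is summarized with the kappa statistic; a minimal sketch of unweighted Cohen's kappa for paired categorical diagnoses is shown below (the labels and data are illustrative, and the statistical software actually used is not specified).

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters' categorical calls."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(freq_a) | set(freq_b)
    # Chance agreement expected from each rater's marginal frequencies.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Example: two morphologists classifying 10 samples (illustrative labels only).
a = ["RCUD", "RCMD", "RCMD", "RARS", "RCUD", "RCMD", "RARS", "RCMD", "RCUD", "RCMD"]
b = ["RCMD", "RCMD", "RCMD", "RCUD", "RCUD", "RCMD", "RARS", "RCMD", "RCMD", "RCMD"]
print(round(cohens_kappa(a, b), 3))
```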
Subject(s)
Blast Crisis/pathology, Bone Marrow/pathology, Myelodysplastic Syndromes/diagnosis, Myelodysplastic Syndromes/pathology, Cell Count/statistics & numerical data, Cell Lineage, Cytodiagnosis/statistics & numerical data, Female, Humans, Male, Observer Variation, Prognosis, Reproducibility of Results
ABSTRACT
BACKGROUND: Hepatic venous pressure gradient (HVPG) is the best indicator of prognosis in patients with compensated cirrhosis. We compared HVPG and transient elastography (TE) for the prediction of liver-related events (LREs) in patients with hepatitis C virus (HCV)-related cirrhosis with or without human immunodeficiency virus (HIV) coinfection. METHODS: This was a retrospective review of all consecutive patients with compensated HCV-related cirrhosis who were assessed simultaneously using TE and HVPG between January 2005 and December 2011. We used receiver operating characteristic (ROC) curves to determine the ability of TE and HVPG to predict the first LRE (liver decompensation or hepatocellular carcinoma). RESULTS: The study included 60 patients, 36 of whom were coinfected with HIV. After a median follow-up of 42 months, 6 patients died, 8 experienced liver decompensations, and 7 were diagnosed with hepatocellular carcinoma. The area under the ROC curve (AUROC) of TE and HVPG for prediction of LREs in all patients was 0.85 (95% confidence interval [CI], .73-.97) and 0.76 (95% CI, .63-.89) (P = .13); for HIV-infected patients, the AUROC was 0.85 (95% CI, .67-1.00) and 0.81 (95% CI, .64-.97) (P = .57); and for non-HIV-infected patients, the AUROC was 0.88 (95% CI, .75-1.00) and 0.77 (95% CI, .57-.97) (P = .19). Based on the AUROC values, 2 TE cutoff points were chosen to predict the absence (<25 kPa) or presence (≥40 kPa) of LREs, thus enabling correct classification of 82% of patients. CONCLUSIONS: Our data suggest that TE is at least as valid as HVPG for predicting LREs in patients with compensated HCV-related cirrhosis with or without concomitant HIV coinfection.
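The two-cutoff rule derived from the AUROC analysis can be written as a simple three-way classification; the sketch below merely restates that rule and is not a clinical tool.

```python
def classify_te(te_kpa: float) -> str:
    """Classify liver stiffness against the two cutoffs reported in the study."""
    if te_kpa < 25:
        return "low risk of liver-related events (<25 kPa)"
    if te_kpa >= 40:
        return "high risk of liver-related events (>=40 kPa)"
    return "indeterminate (25 to <40 kPa)"

# Illustrative values only.
for value in (18.0, 31.5, 46.2):
    print(value, "->", classify_te(value))
```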
Subject(s)
Carcinoma, Hepatocellular/diagnosis, Elasticity Imaging Techniques/methods, Hepatitis C, Chronic/complications, Liver Cirrhosis/complications, Liver Failure/diagnosis, Liver Neoplasms/diagnosis, Portal Pressure, Adult, Carcinoma, Hepatocellular/pathology, Coinfection/complications, Coinfection/pathology, Female, HIV Infections/complications, Hepatitis C, Chronic/pathology, Humans, Liver Cirrhosis/pathology, Liver Failure/pathology, Liver Neoplasms/pathology, Male, Middle Aged, Prognosis, ROC Curve, Retrospective Studies
ABSTRACT
BACKGROUND: To study hormonal changes associated with severe hyperglycemia in critically ill children and their relationship with prognosis and length of stay in intensive care. METHODS: Observational study in twenty-nine critically ill children with severe hyperglycemia, defined as two blood glucose measurements greater than 180 mg/dL. Severity of illness was assessed using the pediatric index of mortality (PIM2), pediatric risk of mortality (PRISM) score, and pediatric logistic organ dysfunction (PELOD) scales. Blood glucose, glycosuria, insulin, C-peptide, cortisol, corticotropin, insulin-like growth factor-1, growth hormone, thyrotropin, thyroxine, and treatment with insulin were recorded. β-cell function and insulin sensitivity and resistance were determined on the basis of the homeostatic model assessment (HOMA), using blood glucose and C-peptide levels. RESULTS: The initial blood glucose level was 249 mg/dL and fell gradually to 125 mg/dL at 72 hours. Initial β-cell function (49.2%) and insulin sensitivity (13.2%) were low. At the time of diagnosis of hyperglycemia, 50% of the patients presented insulin resistance and β-cell dysfunction, 46% presented isolated insulin resistance, and 4% isolated β-cell dysfunction. β-cell function improved rapidly, but insulin resistance persisted. Initial glycemia did not correlate with any other factor, and there was no relationship between glycemia and mortality. Patients who died had higher cortisol and growth hormone levels at diagnosis. Length of stay was correlated by univariate analysis, but not by multivariate analysis, with C-peptide and glycemic control at 24 hours, insulin resistance, and severity of illness scores. CONCLUSIONS: Critically ill children with severe hyperglycemia initially present decreased β-cell function and insulin sensitivity. Nonsurvivors had higher cortisol and growth hormone levels and developed hyperglycemia later than survivors.
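When C-peptide is used, HOMA-derived β-cell function and insulin sensitivity are normally obtained with the HOMA2 computer model, which has no closed-form equation; for orientation only, the classic HOMA1 formulas (which use insulin rather than C-peptide) are sketched below. They are not the exact calculation used in this study.

```python
# Classic HOMA1 approximations (Matthews et al.); the study used C-peptide with
# the HOMA model, for which the HOMA2 calculator (no closed formula) is standard.

def homa_ir(glucose_mmol_l: float, insulin_uu_ml: float) -> float:
    """HOMA1-IR = glucose (mmol/L) x insulin (uU/mL) / 22.5."""
    return glucose_mmol_l * insulin_uu_ml / 22.5

def homa_beta_percent(glucose_mmol_l: float, insulin_uu_ml: float) -> float:
    """HOMA1-%B = 20 x insulin / (glucose - 3.5)."""
    return 20 * insulin_uu_ml / (glucose_mmol_l - 3.5)

# Example: glucose 249 mg/dL (about 13.8 mmol/L) with insulin 10 uU/mL (illustrative).
glucose_mmol = 249 / 18.0
print(round(homa_ir(glucose_mmol, 10), 2))            # ~6.15
print(round(homa_beta_percent(glucose_mmol, 10), 1))  # ~19.4 %
```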
ABSTRACT
BACKGROUND: MELD3.0 has been proposed to stratify patients on the liver transplant waiting list (WL) to reduce the historical disadvantage of women in accessing liver transplant. Our aim was to validate MELD3.0 in 2 unique populations. METHODS: This study is a 2-center retrospective cohort study from Toronto, Canada, and Valencia, Spain, of all adults added to the liver transplant WL between 2015 and 2019. Listing indications whose short-term survival outcome is not adequately captured by the MELD score were excluded. All patients analyzed had a minimum follow-up of 3 months after inclusion in the WL. RESULTS: Six hundred nineteen patients were included; 61% were male, with a mean age of 56 years. Mean MELD at inclusion was 18.00 ± 6.88, Model for End-Stage Liver Disease Sodium (MELDNa) 19.78 ± 7.00, and MELD3.0 20.25 ± 7.22. AUC to predict 90-day mortality on the WL was 0.879 (95% CI: 0.820, 0.939) for MELD, 0.921 (95% CI: 0.876, 0.967) for MELDNa, and 0.930 (95% CI: 0.888, 0.973) for MELD3.0. MELDNa and MELD3.0 were better predictors than MELD (p = 0.055 and p = 0.024, respectively), but MELD3.0 was not statistically superior to MELDNa (p = 0.144). The same was true when stratified by sex, although the difference between MELD3.0 and MELD was only significant for women (p = 0.032), while no statistical significance was found in either sex when compared with MELDNa. In women, AUC was 0.835 (95% CI: 0.744, 0.926) for MELD, 0.873 (95% CI: 0.785, 0.961) for MELDNa, and 0.886 (95% CI: 0.803, 0.970) for MELD3.0; differences for the comparison between AUC in women versus men for all 3 scores were nonsignificant. Compared to MELD, MELD3.0 was able to reclassify 146 patients (24%), the majority of whom belonged to the MELD 10-19 interval. Compared to MELDNa, it reclassified 68 patients (11%), most of them in the MELDNa 20-29 category. CONCLUSIONS: MELD3.0 has been validated in centers with significant heterogeneity and offers the highest mortality prediction for women on the WL without disadvantaging men. However, in these cohorts, it was not superior to MELDNa.
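For reference, the three scores compared here follow published formulas; the sketch below reproduces their commonly cited forms with the usual variable bounds. It is an approximation for illustration and should be checked against the official OPTN definitions before any real use.

```python
import math

def _clamp(x, lo, hi):
    return max(lo, min(hi, x))

def meld(bili, inr, creat):
    """Original MELD, as commonly cited (lab values floored at 1.0, creatinine capped at 4.0)."""
    bili, inr, creat = max(bili, 1.0), max(inr, 1.0), _clamp(creat, 1.0, 4.0)
    return 3.78 * math.log(bili) + 11.2 * math.log(inr) + 9.57 * math.log(creat) + 6.43

def meld_na(bili, inr, creat, sodium):
    """MELD-Na as commonly cited; sodium bounded to 125-137 mmol/L."""
    m = meld(bili, inr, creat)
    na = _clamp(sodium, 125, 137)
    return m + 1.32 * (137 - na) - 0.033 * m * (137 - na)

def meld_3_0(bili, inr, creat, sodium, albumin, female):
    """MELD 3.0 (Kim et al. 2021) in its commonly cited form."""
    bili = max(bili, 1.0)
    inr = max(inr, 1.0)
    creat = _clamp(creat, 1.0, 3.0)
    na = _clamp(sodium, 125, 137)
    alb = _clamp(albumin, 1.5, 3.5)
    return (1.33 * (1 if female else 0)
            + 4.56 * math.log(bili) + 0.82 * (137 - na)
            - 0.24 * (137 - na) * math.log(bili)
            + 9.09 * math.log(inr)
            + 11.14 * math.log(creat)
            + 1.85 * (3.5 - alb)
            - 1.83 * (3.5 - alb) * math.log(creat)
            + 6.0)

# Illustrative patient: bilirubin 3.2 mg/dL, INR 1.8, creatinine 1.1 mg/dL,
# sodium 131 mmol/L, albumin 2.8 g/dL, female.
print(round(meld(3.2, 1.8, 1.1), 1),
      round(meld_na(3.2, 1.8, 1.1, 131), 1),
      round(meld_3_0(3.2, 1.8, 1.1, 131, 2.8, female=True), 1))
```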
Subject(s)
End Stage Liver Disease, Liver Transplantation, Severity of Illness Index, Waiting Lists, Humans, Female, Male, Middle Aged, Retrospective Studies, Liver Transplantation/statistics & numerical data, Waiting Lists/mortality, End Stage Liver Disease/mortality, End Stage Liver Disease/surgery, Spain, Aged, Adult, Sex Factors
ABSTRACT
BACKGROUND: There is substantial interindividual variability in the rate and extent of CD4+ T cell recovery after starting combination antiretroviral therapy (cART). The aim of our study was to determine whether mitochondrial DNA (mtDNA) haplogroups are associated with CD4+ recovery in HIV-infected patients on cART. METHODS: We carried out a retrospective study in 275 cART-naive patients with CD4+ counts <350 cells/mm³, who were followed up for at least 24 months after initiating cART. mtDNA genotyping was performed using Sequenom's MassARRAY platform. RESULTS: Patients within cluster JT and haplogroup J had a lower chance of achieving a CD4+ count ≥500 cells/mm³ than patients within cluster HV and haplogroup H [hazard ratio (HR) = 0.68 (P = 0.058) and HR = 0.48 (P = 0.010), respectively]. The time of follow-up during which the CD4+ count was ≥500 cells/mm³ was longer in haplogroups HV and H than in haplogroups JT and J [20 months versus 6.2 months (P = 0.029) and 20 months versus 0 months (P = 0.024), respectively]. Additionally, patients within haplogroups HV and H had a greater chance of achieving a CD4+ count ≥500 cells/mm³ during at least 12, 36, 48 and 60 months post-cART initiation than patients within haplogroups JT and J. Patients within haplogroup T had a lower chance of achieving a CD4+ count ≥500 cells/mm³ only during at least 48 and 60 months post-cART initiation. CONCLUSION: European mitochondrial haplogroups might influence CD4+ recovery in HIV-infected patients after initiation of cART. Haplogroups J and T appear to be associated with a worse profile of CD4+ recovery, whereas haplogroup H was associated with better CD4+ reconstitution.
Subject(s)
Anti-Retroviral Agents/therapeutic use, Antiretroviral Therapy, Highly Active, CD4-Positive T-Lymphocytes/immunology, DNA, Mitochondrial/genetics, HIV Infections/drug therapy, HIV Infections/immunology, Adult, CD4 Lymphocyte Count, Female, Follow-Up Studies, Haplotypes, Humans, Male, Middle Aged, Retrospective Studies
ABSTRACT
Infections remain a common complication in patients with multiple myeloma (MM) and are associated with morbidity and mortality. A risk score to predict the probability of early severe infection could help to identify the patients who would benefit from preventive measures. We undertook a post hoc analysis of infections in four clinical trials from the Spanish Myeloma Group, involving a total of 1347 patients (847 transplant candidates). In the GEM2010>65 trial antibiotic prophylaxis was mandatory, so this trial was excluded from the final analysis. The incidence of severe infection episodes within the first 6 months was 13.8%, with the majority of patients experiencing their first episode before 4 months (11.1%). Overall, 1.2% of patients died because of infections within the first 6 months (1% before 4 months). Variables associated with an increased risk of severe infection in the first 4 months included serum albumin ≤30 g/L, ECOG >1, male sex, and non-IgA type MM. A simple risk score based on these variables identified three risk groups with different probabilities of severe infection within the first 4 months: low risk (score 0-2), 8.2%; intermediate risk (score 3), 19.2%; and high risk (score 4), 28.3%. Patients with intermediate or high risk could be candidates for prophylactic antibiotic therapy.
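The abstract implies one point per adverse factor (score range 0-4); the sketch below encodes that reading, which is our assumption, together with the reported risk groups and their 4-month severe-infection rates.

```python
def infection_risk_score(albumin_g_l: float, ecog: int, male: bool, iga_subtype: bool) -> int:
    """One point per adverse factor (assumed weighting): albumin <=30 g/L,
    ECOG >1, male sex, non-IgA myeloma."""
    return (int(albumin_g_l <= 30) + int(ecog > 1)
            + int(male) + int(not iga_subtype))

def risk_group(score: int) -> str:
    """Risk groups and 4-month severe-infection rates reported in the abstract."""
    if score <= 2:
        return "low risk (8.2%)"
    if score == 3:
        return "intermediate risk (19.2%)"
    return "high risk (28.3%)"

# Example: male patient, ECOG 2, albumin 28 g/L, IgG myeloma -> score 4, high risk.
s = infection_risk_score(albumin_g_l=28, ecog=2, male=True, iga_subtype=False)
print(s, risk_group(s))
```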
Subject(s)
Multiple Myeloma, Antibiotic Prophylaxis, Humans, Male, Multiple Myeloma/complications, Multiple Myeloma/diagnosis, Multiple Myeloma/therapy
ABSTRACT
BACKGROUND: Children living with HIV are reaching adulthood and transitioning to adult clinics. This study aimed to describe clinical and immunovirological status after transition in patients with perinatal HIV. METHODS: Patients participating in the Spanish multicenter pediatric HIV cohort (CoRISpe) transferred to adult care (FARO cohort) from 1997 to 2016 were included. Clinical and immunovirological data were collected from 12 years of age to the last follow-up after transition (up to December 2017). We used mixed-effects models to analyze changes in CD4 counts and viral suppression, and multivariate analysis to identify risk factors for virological failure (VF) and immune status after transition. Transition years were classified into 5-year periods. RESULTS: Three hundred thirty-two youths were included. The median age at transition was 18 years (interquartile range: 16.3-18.9), and 58.1% were women. The median follow-up time after transition was 6.6 years (interquartile range: 4.6-9.8), and 11 patients (3.3%) died. The immunovirological status at transition improved over the later transition periods. Globally, VF decreased from 27.7% at transition to 14.4% at 3 years post-transition (P < 0.001), but no changes were observed in the last 2 transition periods. There were no significant differences in CD4 over the transition period. Risk factors for VF after transition were female sex, being born abroad, and VF at transition; risk factors for lower CD4 after transition were Romani heritage, younger age at transition, lower CD4 nadir, and CD4 count at transition. CONCLUSIONS: After transition, virological suppression improved in the early transition periods, and immunological status remained stable. Nevertheless, some patients had a higher risk of worse outcomes. Identifying these patients may aid management during transition.
Subject(s)
HIV Infections/immunology, HIV Infections/transmission, HIV Infections/virology, Adolescent, Adult, Anti-HIV Agents/therapeutic use, CD4 Lymphocyte Count, Cohort Studies, Female, HIV Infections/mortality, Humans, Male, Pregnancy, Risk Factors, Spain, Viral Load, Young Adult
ABSTRACT
UNLABELLED: Human immunodeficiency virus (HIV) infection modifies the natural history of chronic hepatitis C, promoting more rapid progression to cirrhosis and end-stage liver disease. The objective of our study was to determine whether hepatitis C virus (HCV) clearance is associated with improved clinical outcomes in patients positive for HIV and HCV. This was an ambispective cohort study carried out in 11 HIV units in Spain involving 711 consecutive patients positive for HIV/HCV who started interferon plus ribavirin therapy between 2000 and 2005. We measured sustained virologic response (SVR), i.e., undetectable HCV RNA at 24 weeks after the end of treatment, and clinical outcomes, defined as death (liver-related or non-liver-related), liver decompensation, hepatocellular carcinoma, and liver transplantation. Of the 711 patients positive for HIV/HCV, 31% had an SVR. During a mean follow-up of 20.8 months (interquartile range: 12.2-38.7), the incidence rates per 100 person-years of overall mortality, liver-related mortality, and liver decompensation were 0.46, 0.23, and 0.23 among patients with SVR and 3.12, 1.65, and 4.33 among those without SVR (P = 0.003, 0.028, and <0.001 by the log-rank test), respectively. Cox regression analysis adjusted for fibrosis, HCV genotype, HCV RNA viral load, Centers for Disease Control and Prevention clinical category, and nadir CD4+ cell count showed that the adjusted hazard ratio of liver-related events was 8.92 (95% confidence interval, 1.20-66.11; P = 0.032) for nonresponders in comparison with responders and 4.96 (95% confidence interval, 2.27-10.85; P < 0.001) for patients with fibrosis grade F3-F4 versus those with F0-F2. Because this was not a prospective study, selection and survival biases may influence estimates of effect. CONCLUSION: Our results suggest that the achievement of an SVR after interferon-ribavirin therapy in patients coinfected with HIV/HCV reduces liver-related complications and mortality.
Subject(s)
Antiviral Agents/therapeutic use, HIV Infections/drug therapy, Hepatitis C, Chronic/drug therapy, Interferons/therapeutic use, Ribavirin/therapeutic use, Adult, Cohort Studies, Drug Therapy, Combination, Female, HIV Infections/complications, HIV Infections/mortality, Hepatitis C, Chronic/complications, Hepatitis C, Chronic/mortality, Humans, Male, Treatment Outcome
ABSTRACT
A prospective observational study was performed to analyze the clinical course of critically ill children requiring continuous renal replacement therapy (CRRT). Variables associated with prolonged CRRT were analyzed. Of the 174 children treated with CRRT, 32 (18.3%) required CRRT for >14 days and 20 (11.5%) for >21 days. Prolonged CRRT was more common in patients with heart disease and in those requiring mechanical ventilation, hemodiafiltration, and higher doses of heparin. The same factors were found when patients on CRRT for >14 days and >21 days were studied. The overall mortality rate was 35.6%; it was slightly higher in patients on prolonged CRRT (43.7% with CRRT >14 days and 45% with CRRT >21 days), though the differences were not statistically significant. We conclude that there were no differences in pre-CRRT clinical characteristics, severity of illness, or renal function in critically ill children requiring prolonged CRRT. Prolonged CRRT was more frequently required by patients with heart disease and those on mechanical ventilation. Patients with prolonged CRRT required hemodiafiltration more frequently and higher doses of heparin. Mortality was slightly higher in children with longer CRRT, though this difference did not reach statistical significance.
Subject(s)
Acute Kidney Injury/therapy, Hemofiltration, Renal Replacement Therapy, Acute Kidney Injury/complications, Acute Kidney Injury/mortality, Analysis of Variance, Anticoagulants/adverse effects, Anticoagulants/therapeutic use, Child, Child, Preschool, Critical Illness, Female, Heart Diseases/complications, Heparin/adverse effects, Heparin/therapeutic use, Humans, Infant, Kidney Function Tests, Male, Prognosis, Proportional Hazards Models, Prospective Studies, Regression Analysis, Renal Replacement Therapy/mortality, Respiration, Artificial, Treatment Outcome
ABSTRACT
INTRODUCTION: Pharmacologic studies have shown a relationship between plasma antiretroviral levels and toxicity/viral activity. Nevertheless, pharmacokinetic and pharmacodynamic data are inconsistent and limited in HIV-infected children. We analyzed plasma antiretroviral concentrations in clinical practice and their influence on therapy efficacy in HIV-infected children. METHODS: Observational, prospective, multicenter study including HIV-infected children followed up at 5 reference hospitals between March 2006 and June 2008. Pre-dose plasma antiretroviral levels were determined, and their relationships with various clinical and analytical variables were investigated. RESULTS: A total of 129 patients were included, and 41.3% had antiretroviral plasma levels outside the established range. No differences were found between sexes. Children younger than 1 year had a higher rate of suboptimal levels and a higher viral load than the remaining children. CONCLUSION: Antiretroviral plasma concentrations are more frequently suboptimal in children younger than 1 year. This finding is associated with greater virological failure and poses a considerable challenge in this population, which requires very long-term treatment.
Subject(s)
Anti-Retroviral Agents/blood, HIV Infections/blood, Age Factors, Anti-Retroviral Agents/therapeutic use, Child, Female, HIV Infections/drug therapy, Humans, Male, Prospective Studies, Sex Factors
ABSTRACT
Acute kidney injury is a frequent complication in patients requiring extracorporeal membrane oxygenation. We performed a single-center retrospective analysis of a prospective observational database to assess the incidence of acute kidney injury in children undergoing extracorporeal membrane oxygenation, the use of continuous renal replacement therapy, and its association with outcomes. One hundred children were studied. Creatinine was normal in 33.3% of children at the beginning of extracorporeal membrane oxygenation, between 1.5 and 2 times baseline levels in 18.4% of children (stage I acute kidney injury), between 2 and 3 times baseline levels (stage II) in 20.7%, and over 3 times baseline levels or requiring continuous renal replacement therapy (stage III) in 27.6% of the patients. Eighteen patients were on continuous renal replacement therapy before the beginning of extracorporeal membrane oxygenation, 81 required continuous renal replacement therapy during extracorporeal membrane oxygenation, and 38 after weaning from extracorporeal membrane oxygenation, but none required it at discharge from the pediatric intensive care unit. Fifty-one children survived to pediatric intensive care unit discharge. Mortality was lower in children with normal kidney function or with stage I acute kidney injury at the beginning of extracorporeal membrane oxygenation than in those with stage II or III acute kidney injury (33.3% vs 58.3%, p = 0.021). Mortality in children requiring continuous renal replacement therapy during extracorporeal membrane oxygenation was 54.3%, versus 21.1% in the remaining patients (p < 0.01). We conclude that kidney function is significantly impaired in a high percentage of children undergoing extracorporeal membrane oxygenation, and many of them are treated with continuous renal replacement therapy. Patients treated with continuous renal replacement therapy have higher mortality than those with normal kidney function or stage I acute kidney injury at the beginning of extracorporeal membrane oxygenation. Most patients surviving to pediatric intensive care unit discharge recover normal renal function after weaning from extracorporeal membrane oxygenation.
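The staging used above maps the creatinine-to-baseline ratio (or the need for continuous renal replacement therapy) onto stages I-III; a minimal sketch of that mapping follows, with boundary handling chosen by us where the abstract is not explicit.

```python
def aki_stage(creatinine_mg_dl: float, baseline_mg_dl: float, on_crrt: bool = False) -> str:
    """Stage acute kidney injury from the creatinine/baseline ratio, as used above."""
    if on_crrt:
        return "stage III"
    ratio = creatinine_mg_dl / baseline_mg_dl
    if ratio > 3:
        return "stage III"
    if ratio >= 2:
        return "stage II"
    if ratio >= 1.5:
        return "stage I"
    return "no AKI (by creatinine criterion)"

# Example: creatinine 1.2 mg/dL against a baseline of 0.5 mg/dL -> ratio 2.4, stage II.
print(aki_stage(1.2, 0.5))
```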
Subject(s)
Acute Kidney Injury, Extracorporeal Membrane Oxygenation, Renal Replacement Therapy, Acute Kidney Injury/diagnosis, Acute Kidney Injury/epidemiology, Acute Kidney Injury/therapy, Child, Preschool, Extracorporeal Membrane Oxygenation/methods, Extracorporeal Membrane Oxygenation/statistics & numerical data, Female, Heart Failure/therapy, Humans, Incidence, Infant, Intensive Care Units, Pediatric/statistics & numerical data, Male, Organ Dysfunction Scores, Renal Replacement Therapy/methods, Renal Replacement Therapy/statistics & numerical data, Respiratory Insufficiency/therapy, Retrospective Studies, Spain/epidemiology
ABSTRACT
BACKGROUND: A recently developed global indicator of oxidative stress (OXY-SCORE), which combines individual plasma biomarkers of oxidative damage and antioxidant capacity, has been validated in several pathologies but not in left ventricular hypertrophy (LVH). The aim of this study was to design and calculate a plasma oxidative stress global index for patients with LVH. METHODS: A total of 70 consecutive adult patients were recruited in our institution and assigned to one of the two study groups (control group/LVH group) on the basis of an echocardiographic study. We evaluated plasma biomarkers of oxidative damage (malondialdehyde and thiolated proteins) and antioxidant defense (total thiols, reduced glutathione, total antioxidant capacity, catalase, and superoxide dismutase activities) by spectrophotometry/fluorimetry in order to calculate a plasma oxidative stress global index (OXY-SCORE) in relation to LVH. RESULTS: The OXY-SCORE differed significantly between the groups (p < 0.001). The area under the receiver operating characteristic curve was 0.74 (95% confidence interval (CI), 0.62-0.85; p < 0.001). At a cut-off value of -1, the sensitivity of 68.6% and specificity of 68.6% suggest that the OXY-SCORE could be used to screen for LVH. A multivariable logistic regression model showed a significant association (p = 0.001) between the OXY-SCORE and LVH [odds ratio = 0.55 (95% CI, 0.39-0.79)], independent of gender, age, smoking, glucose, systolic and diastolic arterial pressure, dyslipidemia, estimated glomerular filtration rate, body mass index, and valvular/coronary disease. CONCLUSION: The OXY-SCORE could help in the diagnosis of LVH and could be used to monitor treatment response.
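The abstract does not give the arithmetic behind the OXY-SCORE; composite oxidative-stress indices of this kind are typically built by standardizing each biomarker against the control group and subtracting the summed damage z-scores from the summed antioxidant z-scores. The sketch below follows that generic construction and is only an assumption about this study's method.

```python
def oxy_score(antioxidant_markers: dict, damage_markers: dict,
              control_means: dict, control_sds: dict) -> float:
    """Generic composite index: sum of antioxidant z-scores minus sum of
    oxidative-damage z-scores, standardized against the control group.
    This construction is an assumption, not the study's published algorithm."""
    def z(name, value):
        return (value - control_means[name]) / control_sds[name]

    protective = sum(z(k, v) for k, v in antioxidant_markers.items())
    damage = sum(z(k, v) for k, v in damage_markers.items())
    return protective - damage

# Illustrative values only (arbitrary units).
controls_mean = {"total_thiols": 350, "catalase": 12, "malondialdehyde": 2.0}
controls_sd = {"total_thiols": 40, "catalase": 3, "malondialdehyde": 0.5}
patient = oxy_score(
    {"total_thiols": 300, "catalase": 9},
    {"malondialdehyde": 2.8},
    controls_mean, controls_sd,
)
print(round(patient, 2))  # negative values indicate a more oxidized profile
```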