ABSTRACT
It is crucial to understand factors associated with COVID-19 booster uptake in the U.S. given the updated COVID-19 vaccine recommendations. Using data from a national prospective cohort (N=4,616) collected between September 2021 and October 2022, we examined socioeconomic, demographic, and behavioral factors associated with initial booster uptake among participants fully vaccinated with the primary COVID-19 vaccine series. Cox proportional hazards models were used to estimate the association of each factor with time to initial booster uptake. Most participants (86.5%) reported receiving their initial booster. After adjusting for age, race/ethnicity, education, region, and employment, participants at greater risk for severe COVID-19 had booster uptake similar to those at lower risk (aHR: 1.04; 95% CI: 0.95, 1.14). Participants with greater barriers to healthcare (aHR: 0.89; 95% CI: 0.84, 0.96), food insecurity (aHR: 0.82; 95% CI: 0.75, 0.89), and housing instability (aHR: 0.81; 95% CI: 0.73, 0.90) were less likely to report receiving an initial booster than those without these barriers. Factors motivating the decision to vaccinate shifted from safety-related concerns for the primary series to perceived need for the booster. Addressing economic and healthcare access barriers is key to achieving equitable COVID-19 vaccine uptake and continued protection against COVID-19.
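The aHRs and intervals above come from Cox proportional hazards models. As a minimal sketch (not the study's code; the coefficient and standard error below are assumed for illustration), a hazard ratio and Wald 95% CI can be recovered from a fitted log-hazard coefficient:

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Exponentiate a Cox log-hazard coefficient and its Wald
    interval to obtain a hazard ratio with a 95% CI."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient/SE chosen to land near the reported
# healthcare-barriers estimate (aHR ~0.89):
hr, lo, hi = hazard_ratio_ci(-0.117, 0.034)
```

This is how a point estimate of about 0.89 with a CI of roughly (0.83, 0.95) arises from a single model coefficient and its standard error.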
ABSTRACT
OBJECTIVES: The COVID-19 pandemic threatened standard hospital operations. We sought to understand how this stress was perceived and manifested within individual hospitals and in relation to local viral activity. DESIGN: Prospective weekly hospital stress survey, November 2020-June 2022. SETTING: Society of Critical Care Medicine's Discovery Severe Acute Respiratory Infection-Preparedness multicenter cohort study. SUBJECTS: Thirteen hospitals across seven U.S. health systems. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: We analyzed 839 hospital-weeks of data over 85 pandemic weeks and five viral surges. Perceived overall hospital, ICU, and emergency department (ED) stress due to severe acute respiratory infection patients during the pandemic was reported by a mean of 43% (sd, 36%), 32% (30%), and 14% (22%) of hospitals per week, respectively, and perceived care deviations by a mean of 36% (33%). Overall hospital stress was highly correlated with ICU stress (ρ = 0.82; p < 0.0001) but only moderately correlated with ED stress (ρ = 0.52; p < 0.0001). An increase of 10 severe acute respiratory syndrome coronavirus 2 cases per 100,000 county residents was associated with increases in the odds of overall hospital, ICU, and ED stress of 9% (95% CI, 5-12%), 7% (3-10%), and 4% (2-6%), respectively. During the Delta variant surge, overall hospital stress persisted for a median of 11.5 weeks (interquartile range, 9-14 wk) after the local case peak. ICU stress had a similar pattern of resolution (median 11 wk [6-14 wk] after local case peak; p = 0.59), while ED stress resolved earlier (median 6 wk [5-6 wk] after local case peak; p = 0.003). A similar but attenuated pattern occurred during the Omicron BA.1 subvariant surge. CONCLUSIONS: During the COVID-19 pandemic, perceived care deviations were common, and potentially avoidable patient harm was rare. Perceived hospital stress persisted for weeks after surges peaked.
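The "increase in the odds ... of 9%" phrasing reflects rescaling a logistic-regression coefficient from a 1-case to a 10-case increment. A small sketch (the per-unit coefficient below is hypothetical, back-derived from the reported 9%, not the study's fitted model):

```python
import math

def pct_odds_change(beta_per_unit, delta):
    """Percent change in odds for a delta-unit covariate increase,
    given a per-unit logistic-regression coefficient."""
    return (math.exp(beta_per_unit * delta) - 1.0) * 100.0

# Per-unit coefficient implied by a 9% odds increase per 10 cases:
beta = math.log(1.09) / 10
pct = pct_odds_change(beta, 10)
```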
Subject(s)
COVID-19, Humans, COVID-19/epidemiology, SARS-CoV-2, Pandemics, Cohort Studies, Prospective Studies, Hospitals

ABSTRACT
OBJECTIVES: Assess clinical outcomes following PICU Liberation ABCDEF Bundle utilization. DESIGN: Prospective, multicenter, cohort study. SETTING: Eight academic PICUs. PATIENTS: Children older than 2 months with an expected PICU stay greater than 2 days and a need for mechanical ventilation (MV). INTERVENTIONS: ABCDEF Bundle implementation. MEASUREMENT AND MAIN RESULTS: Over an 11-month period (3-mo baseline, 8-mo implementation), Bundle utilization was measured for 622 patients totaling 5,017 PICU days. Risk of mortality was quantified for 532 patients (4,275 PICU days) to assess correlations between Bundle utilization and MV duration, PICU length of stay (LOS), delirium incidence, and mortality. Utilization was analyzed as subject-specific (entire PICU stay) and day-specific (single PICU day). Median overall subject-specific utilization increased from 50% during the 3-month baseline to 63.9% during the last four implementation months (p < 0.001). Subject-specific utilization for elements A and C did not change; utilization improved for B (0-12.5%; p = 0.007), D (22.2-61.1%; p < 0.001), E (17.7-50%; p = 0.003), and F (50-79.2%; p = 0.001). We observed no association between Bundle utilization and MV duration, PICU LOS, or delirium incidence. In contrast, on adjusted analysis, every 10% increase in subject-specific utilization correlated with a 34% reduction in the odds ratio (OR) for mortality (p < 0.001); every 10% increase in day-specific utilization correlated with a 1.4% reduction in mortality OR (p = 0.006). CONCLUSIONS: The ABCDEF Bundle is applicable to children. Although enhanced Bundle utilization correlated with decreased mortality, increased utilization did not correlate with duration of MV, PICU LOS, or delirium incidence. Additional research in the domains of comparative effectiveness, implementation science, and human factors engineering is required to understand this clinical inconsistency and optimize PICU Liberation concept integration into clinical practice.
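The "34% OR reduction per 10% utilization increase" is simply the complement of an odds ratio below 1. A trivial sketch (the odds ratios here are assumed values consistent with the reported reductions, not the fitted models):

```python
def pct_or_reduction(odds_ratio):
    """Percent reduction in odds implied by an odds ratio < 1."""
    return (1.0 - odds_ratio) * 100.0

# Assumed per-10%-utilization odds ratios matching the reported
# 34% (subject-specific) and 1.4% (day-specific) reductions:
subject_reduction = pct_or_reduction(0.66)
day_reduction = pct_or_reduction(0.986)
```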
Subject(s)
Critical Illness, Delirium, Humans, Child, Cohort Studies, Prospective Studies, Critical Illness/therapy, Critical Illness/epidemiology, Intensive Care Units, Delirium/epidemiology, Intensive Care Units, Pediatric

ABSTRACT
PURPOSE: To examine National Cancer Database (NCDB) data to comparatively evaluate overall survival (OS) between patients undergoing transarterial radioembolization (TARE) and those undergoing systemic therapy for hepatocellular carcinoma with major vascular invasion (HCC-MVI). METHODS: One thousand five hundred fourteen patients with HCC-MVI undergoing first-line TARE or systemic therapy were identified from the NCDB. OS was compared using propensity score-matched Cox regression and landmark analysis. Efficacy was also compared within a target trial framework. RESULTS: TARE usage doubled between 2010 and 2015. Intervals before treatment were longer for TARE than for systemic therapy (mean [median], 66.5 [60] days vs 46.8 [35] days; P < .0001). In propensity score-matched and landmark-time-adjusted analyses, TARE was associated with a hazard ratio of 0.74 (95% CI, 0.60-0.91; P = .005) and a median OS of 7.1 months (95% CI, 5.0-10.5) versus 4.9 months (95% CI, 3.9-6.5) for systemically treated patients. In an emulated target trial involving 236 patients with unilobular HCC-MVI, a low number of comorbidities, creatinine levels <2.0 mg/dL, bilirubin levels <2.0 mg/dL, and international normalized ratio <1.7, TARE was associated with a hazard ratio of 0.57 (95% CI, 0.39-0.83; P = .004) and a median OS of 12.9 months (95% CI, 7.6-19.2) versus 6.5 months (95% CI, 3.6-11.1) for the systemic therapy arm. CONCLUSIONS: In propensity score-matched analyses involving pragmatic and target trial HCC-MVI cohorts, TARE was associated with significant survival benefits compared with systemic therapy. Although not a substitute for prospective trials, these findings suggest that the increasing use of TARE for HCC-MVI is accompanied by improved OS. Further trials of TARE in patients with HCC-MVI are needed, especially to compare with newer systemic therapies.
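Propensity score matching, as used above, pairs each TARE patient with a systemic-therapy patient whose estimated propensity score is closest. A toy greedy 1:1 nearest-neighbor matcher with a caliper (a simplified sketch; the matching details and the scores below are illustrative, not from the NCDB analysis):

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor propensity matching.
    `treated`/`controls` map patient id -> propensity score.
    Each control is used at most once; pairs whose score
    difference exceeds `caliper` are discarded."""
    pairs, available = [], dict(controls)
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

# Hypothetical scores for two treated and three control patients:
pairs = greedy_match({"T1": 0.30, "T2": 0.70},
                     {"C1": 0.32, "C2": 0.68, "C3": 0.10})
```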
Subject(s)
Carcinoma, Hepatocellular, Liver Neoplasms, Carcinoma, Hepatocellular/radiotherapy, Humans, Liver Neoplasms/therapy, Propensity Score, Prospective Studies, Yttrium Radioisotopes

ABSTRACT
BACKGROUND: The Mayo Clinic imaging classification of autosomal dominant polycystic kidney disease (ADPKD) uses height-adjusted total kidney volume (htTKV) and age to identify patients at highest risk for disease progression. However, this classification applies only to patients with typical diffuse cystic disease (class 1). Because htTKV poorly predicts eGFR decline for the 5%-10% of patients with atypical morphology (class 2), imaging-based risk modeling remains unresolved. METHODS: Of 558 adults with ADPKD in the HALT-A study, we identified 25 patients of class 2A with prominent exophytic cysts (class 2Ae) and 43 patients of class 1 with prominent exophytic cysts; we recalculated their htTKVs to exclude exophytic cysts. Using original and recalculated htTKVs in association with imaging classification in logistic and mixed linear models, we compared predictions for developing CKD stage 3 and for eGFR trajectory. RESULTS: Using recalculated htTKVs increased specificity for developing CKD stage 3 in all participants from 82.6% to 84.2% after adjustment for baseline age, eGFR, BMI, sex, and race. The predicted proportion of class 2Ae patients developing CKD stage 3 using a cutoff of 0.5 for predicting case status was better calibrated to the observed value of 13.0% with recalculated htTKVs (45.5%) versus original htTKVs (63.6%). Using recalculated htTKVs reduced the mean paired difference between predicted and observed eGFR from 17.6 (using original htTKVs) to 4.0 ml/min per 1.73 m2 for class 2Ae, and from -1.7 (using original htTKVs) to 0.1 ml/min per 1.73 m2 for class 1. CONCLUSIONS: Use of a recalculated htTKV measure that excludes prominent exophytic cysts facilitates inclusion of class 2 patients and reclassification of class 1 patients in the Mayo classification model.
Subject(s)
Kidney/pathology, Polycystic Kidney, Autosomal Dominant/classification, Polycystic Kidney, Autosomal Dominant/diagnostic imaging, Renal Insufficiency, Chronic/etiology, Adult, Body Height, Disease Progression, Female, Glomerular Filtration Rate, Humans, Magnetic Resonance Imaging, Male, Middle Aged, Organ Size, Polycystic Kidney, Autosomal Dominant/complications, Polycystic Kidney, Autosomal Dominant/pathology, Predictive Value of Tests, ROC Curve, Risk Assessment/methods, Young Adult

ABSTRACT
INTRODUCTION: Oral antiviral medications are important tools for preventing severe COVID-19 outcomes. However, their uptake remains low for reasons that are not entirely understood. Our study aimed to assess the association between perceived risk for severe COVID-19 outcomes and oral antiviral use among those who were eligible for treatment based on Centers for Disease Control and Prevention (CDC) guidelines. METHODS: We surveyed 4034 non-institutionalized US adults in April 2023, and report findings from 934 antiviral-eligible participants with at least one confirmed SARS-CoV-2 infection since December 1, 2021 and no current long COVID symptoms. Survey weights were used to yield nationally representative estimates. The primary exposure of interest was whether participants perceived themselves to be "at high risk for severe COVID-19." The primary outcome was use of a COVID-19 oral antiviral within 5 days of suspected SARS-CoV-2 infection. RESULTS: Only 18.5% of antiviral-eligible adults considered themselves to be at high risk for severe COVID-19 and 16.8% and 15.9% took oral antivirals at any time or within 5 days of SARS-CoV-2 infection, respectively. In contrast, 79.8% were aware of antiviral treatments for COVID-19. Perceived high-risk status was associated with being more likely to be aware (adjusted prevalence ratio [aPR]: 1.11 [95% confidence interval (CI) 1.03-1.20]), to be prescribed (aPR 1.47 [95% CI 1.08-2.01]), and to take oral antivirals at any time (aPR 1.61 [95% CI 1.16-2.24]) or within 5 days of infection (aPR 1.72 [95% CI 1.23-2.40]). CONCLUSIONS: Despite widespread awareness of the availability of COVID-19 oral antivirals, more than 80% of eligible US adults did not receive them. Our findings suggest that differences between perceived and actual risk for severe COVID-19 (based on current CDC guidelines) may partially explain this low uptake.
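Prevalence ratios like the aPRs above compare the prevalence of an outcome between exposure groups; the adjusted versions come from regression models, but the crude version is just a ratio of prevalences (the counts below are hypothetical):

```python
def prevalence_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    """Crude (unadjusted) prevalence ratio: outcome prevalence in
    the exposed group divided by prevalence in the unexposed."""
    return (cases_exp / n_exp) / (cases_unexp / n_unexp)

# Hypothetical example: 30/100 exposed vs 20/100 unexposed -> PR 1.5
pr = prevalence_ratio(30, 100, 20, 100)
```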
ABSTRACT
For people with HIV (PWH) who have psychological comorbidities, effective management of mental health issues is crucial to achieving and maintaining viral suppression. Care coordination programs (CCPs) have been shown to improve outcomes across the HIV care continuum, but little research has focused on the role of care coordination in supporting the mental health of PWH. This study reports qualitative findings from the Program Refinements to Optimize Model Impact and Scalability based on Evidence (PROMISE) study, which evaluated a revised version of an HIV CCP for Ryan White Part A clients in New York City. Semistructured interviews were conducted with 30 providers and 27 clients from 6 CCP-implementing agencies to elucidate barriers and facilitators of program engagement. Transcripts were analyzed for key themes related to clients' mental health needs and providers' successes and challenges in meeting these needs. Providers and clients agreed that insufficiently managed mental health issues are a common barrier to achieving and maintaining viral suppression. Although the CCP model calls for providers to address clients' unmet mental health needs primarily through screening and referrals to psychiatric and/or psychological care, both clients and providers reported that the routine provision of emotional support is a major part of providers' role that is highly valued by clients. Some concerns raised by providers included insufficient training to address clients' mental health needs and an inability to document the provision of emotional support as a delivered service. These findings suggest the potential value of formally integrating mental health services into HIV care coordination provision. ClinicalTrials.gov protocol number: NCT03628287.
Subject(s)
HIV Infections, Mental Health Services, Humans, Continuity of Patient Care, Counseling, HIV Infections/psychology, Mental Health

ABSTRACT
Objective: To estimate risk of being unvaccinated against COVID-19 by experience of intimate partner violence (IPV). Methods: Among 3,343 partnered individuals in a community-based U.S. cohort, we quantified emotional and physical IPV experienced between March and December 2020 and estimated risk of being unvaccinated against COVID-19 through June 2021 by experience of IPV. Experience of recent IPV was defined as endorsement of more frequent or severe IPV since the start of the pandemic or report of any past-month IPV in at least one of four follow-up surveys conducted by the end of December 2020. We created a three-level composite variable: no experience of IPV, experience of emotional but not physical IPV, and experience of physical IPV. Results: Cisgender women, non-binary, or transgender individuals who reported experiencing emotional, but not physical, IPV and those who reported experiencing physical IPV were both at significantly higher risk of being unvaccinated for COVID-19 compared to those who reported experiencing no IPV (ARR for emotional violence: 1.28 [95% CI: 1.09-1.51]; ARR for physical violence: 1.70 [95% CI: 1.41-2.05]). Cisgender men who reported experiencing physical IPV were also at significantly higher risk of being unvaccinated for COVID-19 (ARR for physical violence: 1.52 [95% CI: 1.15-2.02]). Conclusions: IPV may increase the risk of low vaccine uptake. Results highlight the need to incorporate IPV prevention and support into public health responses, with targeted resources and consideration for reducing barriers to public health interventions among those impacted.
ABSTRACT
This study used repeat serologic testing to estimate infection rates and risk factors in two overlapping cohorts of SARS-CoV-2 N protein-seronegative U.S. adults. One mostly unvaccinated sub-cohort was tracked from April 2020 to March 2021 (pre-vaccine/wild-type era, n = 3421), and the other, mostly vaccinated sub-cohort from March 2021 to June 2022 (vaccine/variant era, n = 2735). Vaccine uptake was 0.53% and 91.3% in the pre-vaccine and vaccine/variant cohorts, respectively. Corresponding seroconversion rates were 9.6 and 25.7 per 100 person-years. In both cohorts, sociodemographic and epidemiologic risk factors for infection were similar, though new risk factors emerged in the vaccine/variant era, such as having a child in the household. Despite higher incidence rates in the vaccine/variant cohort, vaccine boosters, masking, and social distancing were associated with substantially reduced infection risk, even through major variant surges.
Subject(s)
COVID-19, Vaccines, Adult, Child, Humans, COVID-19/epidemiology, COVID-19/prevention & control, Prospective Studies, SARS-CoV-2, Immunization, Secondary

ABSTRACT
Background: We describe oral nirmatrelvir/ritonavir (NMV/r) and molnupiravir (MOV) uptake among a subgroup of highly vaccinated adults in a US national prospective cohort who were infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) between 12/2021 and 10/2022. Methods: We estimated antiviral uptake within 5 days of SARS-CoV-2 infection, as well as age- and gender-adjusted antiviral uptake prevalence ratios by antiviral eligibility (based on age and comorbidities), sociodemographic characteristics, and clinical characteristics including vaccination status and history of long coronavirus disease 2019 (COVID). Results: NMV/r uptake was 13.6% (95% CI, 11.9%-15.2%) among 1594 participants, and MOV uptake was 1.4% (95% CI, 0.8%-2.1%) among 1398 participants. NMV/r uptake increased over time: 1.9% (95% CI, 1.0%-2.9%) between 12/2021 and 3/2022, 16.5% (95% CI, 13.0%-20.0%) between 4/2022 and 7/2022, and 25.3% (95% CI, 21.6%-29.0%) between 8/2022 and 10/2022. Participants aged ≥65 and those with comorbidities conferring risk for severe COVID-19 had higher NMV/r uptake. NMV/r uptake was lower among non-Hispanic Black participants (7.2%; 95% CI, 2.4%-12.0%; relative to other racial/ethnic groups) and among individuals in the lowest income groups (10.6%; 95% CI, 7.3%-13.8%; relative to higher income groups). Among a subset of 278 participants with SARS-CoV-2 infection after 12/2021 who also had a history of prior SARS-CoV-2 infection, those with (vs without) a history of long COVID reported greater NMV/r uptake (22.0% vs 7.9%; P = .001). Among those prescribed NMV/r (n = 216), 137 (63%; 95% CI, 57%-70%) reported that NMV/r was helpful for reducing COVID-19 symptoms. Conclusions: Despite proven effectiveness against severe outcomes, COVID-19 antiviral uptake remains low among those with SARS-CoV-2 infection in the United States. Further outreach to providers and patients is needed to improve awareness of COVID-19 oral antivirals and their indications.
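The proportions and intervals above are standard binomial summaries; for example, the 63% (95% CI, 57%-70%) figure for 137 of 216 participants is consistent with a plain Wald interval:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and Wald 95% CI for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, p - half, p + half

p, lo, hi = proportion_ci(137, 216)
```

Rounded to whole percentages this gives 63% (57%-70%); the study may have used a different interval method, but the Wald form agrees here.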
ABSTRACT
IMPORTANCE: The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic has evolved through multiple phases in the United States, with significant differences in patient-centered outcomes as hospital strain, medical countermeasures, and overall understanding of the disease improved. We describe how patient characteristics changed and care progressed over the various pandemic phases; we also emphasize the need for an ongoing clinical network to improve the understanding of known and novel respiratory viral diseases. OBJECTIVES: To describe how patient characteristics and care evolved across the various COVID-19 pandemic periods in those hospitalized with viral severe acute respiratory infection (SARI). DESIGN: Severe Acute Respiratory Infection-Preparedness (SARI-PREP) is a Centers for Disease Control and Prevention Foundation-funded, Society of Critical Care Medicine Discovery-housed, longitudinal multicenter cohort study of viral pneumonia. We defined SARI patients as those hospitalized with laboratory-confirmed respiratory viral infection and an acute syndrome of fever, cough, and radiographic infiltrates or hypoxemia. We collected patient-level data including demographic characteristics, comorbidities, acute physiologic measures, serum and respiratory specimens, therapeutics, and outcomes. Outcomes were described across four pandemic variant periods based on a SARS-CoV-2 sequenced subsample: pre-Delta, Delta, Omicron BA.1, and Omicron post-BA.1. SETTING: Multicenter cohort of adult patients admitted to an acute care ward or ICU from seven hospitals representing diverse geographic regions across the United States. PARTICIPANTS: Patients with SARI caused by infection with respiratory viruses. MAIN OUTCOMES AND RESULTS: Eight hundred seventy-four adult patients with SARI were enrolled at seven study hospitals between March 2020 and April 2023. Most patients (780, 89%) had SARS-CoV-2 infection.
Across the COVID-19 cohort, median age was 60 years (interquartile range, 48.0-71.0 yr) and 66% were male. Almost half (430, 49%) of the study population belonged to underserved communities. Most patients (76.5%) were admitted to the ICU, 52.5% received mechanical ventilation, and observed hospital mortality was 25.5%. As the pandemic progressed, we observed decreases in ICU utilization (94% to 58%), hospital length of stay (median, 26.0 to 8.5 d), and hospital mortality (32% to 12%), while the number of comorbid conditions increased. CONCLUSIONS AND RELEVANCE: We describe increasing comorbidities but improved outcomes across pandemic variant periods, in the setting of multiple factors, including evolving care delivery, countermeasures, and viral variants. An understanding of patient-level factors may inform treatment options for subsequent variants and future novel pathogens.
Subject(s)
COVID-19, SARS-CoV-2, Humans, COVID-19/epidemiology, Male, Female, Middle Aged, United States/epidemiology, Longitudinal Studies, Aged, Pandemics, Adult, Hospitalization/statistics & numerical data, Intensive Care Units, Cohort Studies

ABSTRACT
Background: Infectious disease surveillance systems, which largely rely on diagnosed cases, underestimate the true incidence of SARS-CoV-2 infection due to under-ascertainment and underreporting. We used repeat serologic testing to measure N-protein seroconversion in a well-characterized cohort of U.S. adults with no serologic evidence of SARS-CoV-2 infection to estimate the incidence of SARS-CoV-2 infection and characterize risk factors, with comparisons before and after the start of the SARS-CoV-2 vaccine and variant eras. Methods: We assessed the incidence rate of infection and risk factors in two sub-groups (cohorts) that were SARS-CoV-2 N-protein seronegative at the start of each follow-up period: 1) the pre-vaccine/wild-type era cohort (n=3,421), followed from April to November 2020; and 2) the vaccine/variant era cohort (n=2,735), followed from November 2020 to June 2022. Both cohorts underwent repeat serologic testing with an assay for antibodies to the SARS-CoV-2 N protein (Bio-Rad Platelia SARS-CoV-2 total Ab). We estimated crude incidence and sociodemographic/epidemiologic risk factors in both cohorts. We used multivariate Poisson models to compare the risk of SARS-CoV-2 infection in the pre-vaccine/wild-type era cohort (referent group) to that in the vaccine/variant era cohort, within strata of vaccination status and epidemiologic risk factors (essential worker status, child in the household, case in the household, social distancing). Findings: In the pre-vaccine/wild-type era cohort, only 18 of the 3,421 participants (0.53%) had ≥1 vaccine dose by the end of follow-up, compared with 2,497/2,735 (91.3%) in the vaccine/variant era cohort. We observed 323 and 815 seroconversions in the pre-vaccine/wild-type era and vaccine/variant era cohorts, respectively, with corresponding incidence rates of 9.6 (95% CI: 8.3-11.5) and 25.7 (95% CI: 24.2-27.3) per 100 person-years.
Associations of sociodemographic and epidemiologic risk factors with SARS-CoV-2 incidence were largely similar in the pre-vaccine/wild-type and vaccine/variant era cohorts. However, some new epidemiologic risk factors emerged in the vaccine/variant era cohort, including having a child in the household and never wearing a mask while using public transit. Adjusted incidence rate ratios (aIRR), with the entire pre-vaccine/wild-type era cohort as the referent group, showed markedly higher incidence in the vaccine/variant era cohort, but with more vaccine doses associated with lower incidence: aIRR 5.3 (95% CI: 4.2-6.7) for un-/under-vaccinated; aIRR 5.1 (95% CI: 4.2-7.3) for primary series only; aIRR 2.5 (95% CI: 2.1-3.0) for boosted once; and aIRR 1.65 (95% CI: 1.3-2.1) for boosted twice. These associations were essentially unchanged in risk factor-stratified models. Interpretation: In SARS-CoV-2 N protein-seronegative individuals, large increases in incidence and newly emerging epidemiologic risk factors in the vaccine/variant era likely resulted from multiple co-occurring factors, including policy changes, behavior changes, surges in transmission, and changes in SARS-CoV-2 variant properties. While SARS-CoV-2 incidence increased markedly in most groups in the vaccine/variant era, being up to date on vaccines and using non-pharmaceutical interventions (NPIs), such as masking and social distancing, remained reliable strategies to mitigate the risk of SARS-CoV-2 infection, even through major surges due to immune-evasive variants. Repeat serologic testing in cohort studies is a useful and complementary strategy to characterize SARS-CoV-2 incidence and risk factors.
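Incidence rates per 100 person-years, as reported above, are event counts divided by accumulated follow-up time. A sketch (the person-year denominator below is hypothetical; the cohort's actual person-time is not given in the abstract):

```python
def rate_per_100py(events, person_years):
    """Crude incidence rate expressed per 100 person-years."""
    return events / person_years * 100.0

# 323 seroconversions over an assumed ~3,365 person-years would
# yield roughly the reported 9.6 per 100 person-years:
rate = rate_per_100py(323, 3365)
```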
ABSTRACT
Immune cell-based therapies are promising strategies to facilitate immunosuppression withdrawal after organ transplantation. Regulatory dendritic cells (DCreg) are innate immune cells that down-regulate alloimmune responses in preclinical models. Here, we performed clinical monitoring and comprehensive assessment of peripheral and allograft tissue immune cell populations in DCreg-infused live-donor liver transplant (LDLT) recipients up to 12 months (M) after transplant. Thirteen patients were given a single infusion of donor-derived DCreg 1 week before transplant (STUDY) and were compared with 40 propensity-matched standard-of-care (SOC) patients. Donor-derived DCreg infusion was well tolerated in all STUDY patients. There were no differences in postoperative complications or biopsy-confirmed acute rejection compared with SOC patients up to 12M. DCreg administration was associated with lower frequencies of effector T-bet+Eomes+CD8+ T cells and CD16bright natural killer (NK) cells and an increase in putative tolerogenic CD141+CD163+ DCs compared with SOC at 12M. Antidonor proliferative capacity of interferon-γ+ (IFN-γ+) CD4+ and CD8+ T cells was lower compared with antithird party responses in STUDY participants, but not in SOC patients, at 12M. In addition, lower circulating concentrations of interleukin-12p40 (IL-12p40), IFN-γ, and CXCL10 were detected in STUDY participants compared with SOC patients at 12M. Analysis of 12M allograft biopsies revealed lower frequencies of graft-infiltrating CD8+ T cells, as well as attenuation of cytolytic TH1 effector genes and pathways among intragraft CD8+ T cells and NK cells, in DCreg-infused patients. These reductions may be conducive to reduced dependence on immunosuppressive drug therapy or immunosuppression withdrawal.
Subject(s)
CD8-Positive T-Lymphocytes, Liver Transplantation, Humans, Dendritic Cells/metabolism, Living Donors, Killer Cells, Natural, Interferon-gamma/metabolism, Graft Rejection

ABSTRACT
Vascular dysfunction and capillary leak are common in critically ill COVID-19 patients, but identification of endothelial pathways involved in COVID-19 pathogenesis has been limited. Angiopoietin-like 4 (ANGPTL4) is a protein secreted in response to hypoxic and nutrient-poor conditions that has a variety of biological effects, including vascular injury and capillary leak. OBJECTIVES: To assess the role of ANGPTL4 in COVID-19-related outcomes. DESIGN, SETTING, AND PARTICIPANTS: Two hundred twenty-five COVID-19 ICU patients were enrolled from April 2020 to May 2021 in a prospective, multicenter cohort study at three medical centers: University of Washington, University of Southern California, and New York University. MAIN OUTCOMES AND MEASURES: Plasma ANGPTL4 was measured on days 1, 7, and 14 after ICU admission. We used previously published tissue proteomic data and lung single-nucleus RNA (snRNA) sequencing data from specimens collected from COVID-19 patients to determine the tissues and cells that produce ANGPTL4. RESULTS: Higher plasma ANGPTL4 concentrations were significantly associated with worse hospital mortality (adjusted odds ratio per log2 increase, 1.53; 95% CI, 1.17-2.00; p = 0.002). Higher ANGPTL4 concentrations were also associated with higher proportions of venous thromboembolism and acute respiratory distress syndrome. Longitudinal ANGPTL4 concentrations were significantly different during the first 2 weeks of hospitalization in patients who subsequently died compared with survivors (p for interaction = 8.1 × 10⁻⁵). Proteomics analysis demonstrated abundance of ANGPTL4 in lung tissue compared with other organs in COVID-19. ANGPTL4 snRNA expression was significantly increased in pulmonary alveolar type 2 epithelial cells and fibroblasts in COVID-19 lung tissue compared with controls.
CONCLUSIONS AND RELEVANCE: ANGPTL4 is expressed in pulmonary epithelial cells and fibroblasts and is associated with clinical prognosis in critically ill COVID-19 patients.
ABSTRACT
BACKGROUND AND OBJECTIVES: The progression of polycystic liver disease is not well understood. The purpose of the study was to evaluate the associations of polycystic liver progression with other disease progression variables and to classify liver progression on the basis of a patient's age, height-adjusted liver cystic volume, and height-adjusted liver volume. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Prospective longitudinal magnetic resonance images from 670 patients with early autosomal dominant polycystic kidney disease, with up to 14 years of follow-up, were evaluated to measure height-adjusted liver cystic volume and height-adjusted liver volume. Among them, 245 patients with liver cyst volume >50 ml at baseline were included in the longitudinal analysis. Linear mixed models on log-transformed height-adjusted liver cystic volume and height-adjusted liver volume were fitted to approximate the mean annual rate of change for each outcome. The association of sex, body mass index, genotype, baseline height-adjusted total kidney volume, and Mayo imaging class was assessed. We calculated height-adjusted liver cystic volume ranges for each specific age and divided them into five classes on the basis of annual percentage increase in height-adjusted liver cystic volume. RESULTS: The mean annual growth rate of height-adjusted liver cystic volume was 12% (95% confidence interval, 11.1% to 13.1%; P<0.001), whereas that for height-adjusted liver volume was 2% (95% confidence interval, 1.9% to 2.6%; P<0.001). Women had higher baseline height-adjusted liver cystic volume than men, but men had a height-adjusted liver cystic volume growth rate 2% higher than women (95% confidence interval, 0.4% to 4.5%; P=0.02). Whereas the height-adjusted liver cystic volume growth rate decreased in women after menopause, no decrease was observed in men at any age.
Body mass index, genotype, and baseline height-adjusted total kidney volume were not associated with the growth rate of height-adjusted liver cystic volume or height-adjusted liver volume. According to the height-adjusted liver cystic volume growth rate, patients were classified into five classes (number of women, men in each class): A (24, 6); B (44, 13); C (43, 48); D (28, 17); and E (13, 9). CONCLUSIONS: Compared with height-adjusted liver volume, the use of height-adjusted liver cystic volume showed greater separation in volumetric progression of polycystic liver disease. Similar to the Mayo imaging classification for the kidney, the progression of polycystic liver disease may be categorized on the basis of a patient's age and height-adjusted liver cystic volume.
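Annual percent growth rates such as the 12% above are obtained by back-transforming the slope of a linear mixed model fitted on log-transformed volume:

```python
import math

def annual_growth_percent(log_slope_per_year):
    """Back-transform a per-year slope on log(volume) into an
    annual percent growth rate: 100 * (exp(slope) - 1)."""
    return (math.exp(log_slope_per_year) - 1.0) * 100.0

# A log-volume slope of ln(1.12) per year corresponds to 12%/yr:
growth = annual_growth_percent(math.log(1.12))
```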
Subject(s)
Polycystic Kidney, Autosomal Dominant , Cysts , Disease Progression , Female , Glomerular Filtration Rate , Humans , Kidney/diagnostic imaging , Kidney/pathology , Liver/diagnostic imaging , Liver/pathology , Liver Diseases , Magnetic Resonance Imaging , Male , Polycystic Kidney, Autosomal Dominant/complications , Polycystic Kidney, Autosomal Dominant/diagnostic imaging , Polycystic Kidney, Autosomal Dominant/genetics , Prospective Studies
ABSTRACT
Respiratory virus infections cause significant morbidity and mortality, ranging from mild uncomplicated acute respiratory illness to severe complications such as acute respiratory distress syndrome, multiple organ failure, and death during epidemics and pandemics. We present a protocol to systematically study patients with severe acute respiratory infection (SARI), including severe acute respiratory syndrome coronavirus 2, due to respiratory viral pathogens, in order to evaluate the natural history, prognostic biomarkers, and characteristics, including hospital stress, associated with clinical outcomes and severity. DESIGN: Prospective cohort study. SETTING: Multicenter cohort of patients admitted to an acute care ward or ICU from at least 15 hospitals representing diverse geographic regions across the United States. PATIENTS: Patients with SARI caused by infection with respiratory viruses that can cause outbreaks, epidemics, and pandemics. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Measurements include patient demographics, signs, symptoms, and medications; microbiology, imaging, and associated tests; mechanical ventilation, hospital procedures, and other interventions; and clinical outcomes and hospital stress, with specimens collected on days 0, 3, and 7-14 after enrollment and at discharge. The primary outcome measure is the number of consecutive days alive and free of mechanical ventilation (VFD) in the first 30 days after hospital admission. Important secondary outcomes include organ failure-free days (before acute kidney injury, shock, hepatic failure, or disseminated intravascular coagulation), 28-day mortality, and immunologic and microbiologic outcomes, including adaptive immunity.
CONCLUSIONS: SARI-Preparedness is a multicenter study under the collaboration of the Society of Critical Care Medicine Discovery, Resilience Intelligence Network, and National Emerging Special Pathogen Training and Education Center, which seeks to improve understanding of prognostic factors associated with worse outcomes and increased resource utilization. This can lead to interventions to mitigate the clinical impact of respiratory virus infections associated with SARI.
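The primary outcome above, ventilator-free days, is a composite that jointly captures survival and ventilator duration; a common scoring convention (assumed here for illustration, not quoted from the protocol) assigns 0 to patients who die within the window. A minimal sketch:

```python
def vent_free_days(alive_at_end, days_on_vent, horizon=30):
    """Days alive and free of mechanical ventilation within `horizon` days.

    Assumes the common convention that death within the window scores 0;
    the study's exact definition may differ.
    """
    if not alive_at_end:
        return 0
    return max(horizon - days_on_vent, 0)

print(vent_free_days(True, 5))    # 25
print(vent_free_days(False, 5))   # 0 (died within the window)
print(vent_free_days(True, 32))   # 0 (ventilated past the window)
```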
ABSTRACT
BACKGROUND AND OBJECTIVES: Root cause analysis involves the evaluation of causal relationships between exposures (or interventions) and adverse outcomes, such as identification of the direct causes (eg, medication orders missed) and root causes (eg, clinician fatigue and workload) of rare adverse events. Assessing causality requires either randomization or sophisticated methods applied to carefully designed observational studies. In most cases, randomized trials are not feasible in the context of root cause analysis. Using observational data for causal inference, however, presents many challenges in both the design and analysis stages. Methods for observational causal inference often fall outside the toolbox of even well-trained statisticians, thus necessitating workforce training. METHODS: This article synthesizes the key concepts and statistical perspectives for causal inference and describes available educational resources, with a focus on observational clinical data. The target audience for this review is clinical researchers with training in fundamental statistics or epidemiology, and statisticians collaborating with those researchers. RESULTS: The available literature includes a number of textbooks and thousands of review articles. However, using this literature for independent study or clinical training programs is extremely challenging, for numerous reasons. First, published articles often assume an advanced technical background and use differing notation and terminology. Second, they may be written from any of several perspectives across statistics, epidemiology, computer science, or philosophy. Third, the methods are rapidly expanding and thus difficult to capture within traditional publications. Fourth, even the most fundamental aspects of causal inference (eg, framing the causal question as a target trial) often receive little or no coverage.
This review presents an overview of (1) key concepts and frameworks for causal inference and (2) publicly available online resources that can help researchers gain the perspectives necessary to function effectively within a multidisciplinary team. CONCLUSION: Familiarity with causal inference methods can help risk managers empirically verify, from observed events, the true causes of adverse sentinel events.
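As a concrete illustration of the observational causal-inference problem this review describes, the sketch below simulates a single binary confounder that inflates the crude exposure-outcome association, then recovers the true effect by standardization (the g-formula). All variables and effect sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical data: confounder L raises both exposure A and outcome Y
L = rng.binomial(1, 0.4, n)                       # e.g., high-workload shift
A = rng.binomial(1, 0.2 + 0.5 * L)                # exposure more likely when L = 1
Y = rng.binomial(1, 0.05 + 0.10 * A + 0.10 * L)   # true causal effect of A: +0.10

# Crude (confounded) risk difference
crude = Y[A == 1].mean() - Y[A == 0].mean()

# Standardization (g-formula): average L-stratum risk differences over P(L)
adjusted = sum(
    (Y[(L == l) & (A == 1)].mean() - Y[(L == l) & (A == 0)].mean()) * (L == l).mean()
    for l in (0, 1)
)
print(round(crude, 2), round(adjusted, 2))  # crude is biased upward (~0.15 vs ~0.10)
```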
Subject(s)
Causality , Risk Assessment/methods , Humans , Observational Studies as Topic , Propensity Score , Publications
ABSTRACT
BACKGROUND: Left subclavian revascularization has become an integral part of thoracic endovascular aortic repair to extend the proximal landing zone. This is most commonly achieved via carotid-subclavian bypass; however, it can also be achieved via vessel transposition. METHODS: All patients who underwent zone 2 thoracic endovascular aortic repair without branched grafts from 2007 to 2018 were included in the study. The primary outcomes were adverse events, including operative mortality, paraplegia, left arm ischemia, and stroke. Multivariable regression analysis was performed for baseline characteristics associated with adverse events. RESULTS: A total of 58 patients underwent left subclavian artery transposition for zone 2 thoracic endovascular aortic repair coverage. Operative (30-day) mortality occurred in 3 patients (5.2%). The majority of patients were operated on under urgent (N = 25; 43.1%) or emergency (N = 12; 20.7%) status. Indications for thoracic endovascular aortic repair included aneurysmal disease (34.5%) and type B aortic dissection (chronic [13.8%]; acute [51.7%]). Major adverse events included paraplegia (N = 1; 1.7%), transient paraparesis (N = 3; 5.2%), and stroke (N = 2; 3.4%). Over a mean follow-up of 2.8 years, there were 5 deaths (8.6%). On multivariable analysis, prior stroke (odds ratio, 31.4; 95% CI, 1.95-506.72; P = .02) was an independent predictor of adverse events. CONCLUSIONS: Carotid-subclavian transposition offers patients a safe and effective method for left subclavian artery revascularization during thoracic endovascular aortic repair with zone 2 coverage, with no increased operative risk and a low complication rate.
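Odds ratios like the prior-stroke result above are symmetric on the log scale: given a logistic-regression coefficient β and its standard error, the OR is exp(β) and the Wald 95% CI is exp(β ± 1.96·SE). A sketch of that arithmetic (the SE below is back-calculated from the reported interval width, purely for illustration):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Logistic coefficient and SE -> (OR, 95% CI lower, 95% CI upper)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Reproduce the shape of the reported result (OR 31.4, 95% CI 1.95-506.72);
# the SE is recovered from the log-scale interval width for illustration only
beta = math.log(31.4)
se = (math.log(506.72) - math.log(1.95)) / (2 * 1.96)
or_, lo, hi = odds_ratio_ci(beta, se)
print(round(or_, 1), round(lo, 2), round(hi, 1))
```

The very wide interval reflects how few events sit behind the estimate, which is typical for a rare predictor in a 58-patient cohort.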