1.
AIDS Patient Care STDS ; 38(3): 107-114, 2024 Mar.
Article En | MEDLINE | ID: mdl-38471091

For people with HIV (PWH) who have psychological comorbidities, effective management of mental health issues is crucial to achieving and maintaining viral suppression. Care coordination programs (CCPs) have been shown to improve outcomes across the HIV care continuum, but little research has focused on the role of care coordination in supporting the mental health of PWH. This study reports qualitative findings from the Program Refinements to Optimize Model Impact and Scalability based on Evidence (PROMISE) study, which evaluated a revised version of an HIV CCP for Ryan White Part A clients in New York City. Semistructured interviews were conducted with 30 providers and 27 clients from 6 CCP-implementing agencies to elucidate barriers and facilitators of program engagement. Transcripts were analyzed for key themes related to clients' mental health needs and providers' successes and challenges in meeting these needs. Providers and clients agreed that insufficiently managed mental health issues are a common barrier to achieving and maintaining viral suppression. Although the CCP model calls for providers to address clients' unmet mental health needs primarily through screening and referrals to psychiatric and/or psychological care, both clients and providers reported that the routine provision of emotional support is a major part of providers' role that is highly valued by clients. Some concerns raised by providers included insufficient training to address clients' mental health needs and an inability to document the provision of emotional support as a delivered service. These findings suggest the potential value of formally integrating mental health services into HIV care coordination provision. ClinicalTrials.gov protocol number: NCT03628287.


HIV Infections , Mental Health Services , Humans , Continuity of Patient Care , Counseling , HIV Infections/psychology , Mental Health
2.
Open Forum Infect Dis ; 11(2): ofad674, 2024 Feb.
Article En | MEDLINE | ID: mdl-38344131

Background: We described the oral nirmatrelvir/ritonavir (NMV/r) and molnupiravir (MOV) uptake among a subgroup of highly vaccinated adults in a US national prospective cohort who were infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) between 12/2021 and 10/2022. Methods: We estimated antiviral uptake within 5 days of SARS-CoV-2 infection, as well as age- and gender-adjusted antiviral uptake prevalence ratios by antiviral eligibility (based on age and comorbidities), sociodemographic characteristics, and clinical characteristics including vaccination status and history of long coronavirus disease 2019 (COVID). Results: NMV/r uptake was 13.6% (95% CI, 11.9%-15.2%) among 1594 participants, and MOV uptake was 1.4% (95% CI, 0.8%-2.1%) among 1398 participants. NMV/r uptake increased over time (1.9% [95% CI, 1.0%-2.9%] between 12/2021 and 3/2022; 16.5% [95% CI, 13.0%-20.0%] between 4/2022 and 7/2022; and 25.3% [95% CI, 21.6%-29.0%] between 8/2022 and 10/2022). Participants aged ≥65 years and those with comorbidities associated with severe COVID-19 had higher NMV/r uptake. There was lower NMV/r uptake among non-Hispanic Black participants (7.2%; 95% CI, 2.4%-12.0%; relative to other racial/ethnic groups) and among individuals in the lowest income groups (10.6%; 95% CI, 7.3%-13.8%; relative to higher income groups). Among a subset of 278 participants with SARS-CoV-2 infection after 12/2021 who also had a history of prior SARS-CoV-2 infection, those with (vs without) a history of long COVID reported greater NMV/r uptake (22.0% vs 7.9%; P = .001). Among those prescribed NMV/r (n = 216), 137 (63%; 95% CI, 57%-70%) reported that NMV/r was helpful for reducing COVID-19 symptoms. Conclusions: Despite proven effectiveness against severe outcomes, COVID-19 antiviral uptake remains low among those with SARS-CoV-2 infection in the United States. Further outreach to providers and patients is needed to improve awareness of COVID-19 oral antivirals and their indications.

3.
Sci Rep ; 14(1): 644, 2024 01 05.
Article En | MEDLINE | ID: mdl-38182731

This study used repeat serologic testing to estimate infection rates and risk factors in two overlapping cohorts of SARS-CoV-2 N protein seronegative U.S. adults. One mostly unvaccinated sub-cohort was tracked from April 2020 to March 2021 (pre-vaccine/wild-type era, n = 3421), and the other, mostly vaccinated cohort, from March 2021 to June 2022 (vaccine/variant era, n = 2735). Vaccine uptake was 0.53% and 91.3% in the pre-vaccine and vaccine/variant cohorts, respectively. Corresponding seroconversion rates were 9.6 and 25.7 per 100 person-years. In both cohorts, sociodemographic and epidemiologic risk factors for infection were similar, though new risk factors emerged in the vaccine/variant era, such as having a child in the household. Despite higher incidence rates in the vaccine/variant cohort, vaccine boosters, masking, and social distancing were associated with substantially reduced infection risk, even through major variant surges.


COVID-19 , Vaccines , Adult , Child , Humans , COVID-19/epidemiology , COVID-19/prevention & control , Prospective Studies , SARS-CoV-2 , Immunization, Secondary
4.
medRxiv ; 2023 Oct 02.
Article En | MEDLINE | ID: mdl-37873066

Background: Infectious disease surveillance systems, which largely rely on diagnosed cases, underestimate the true incidence of SARS-CoV-2 infection due to under-ascertainment and underreporting. We used repeat serologic testing to measure N-protein seroconversion in a well-characterized cohort of U.S. adults with no serologic evidence of SARS-CoV-2 infection to estimate the incidence of SARS-CoV-2 infection and characterize risk factors, with comparisons before and after the start of the SARS-CoV-2 vaccine and variant eras. Methods: We assessed the incidence rate of infection and risk factors in two sub-groups (cohorts) that were SARS-CoV-2 N-protein seronegative at the start of each follow-up period: 1) the pre-vaccine/wild-type era cohort (n=3,421), followed from April to November 2020; and 2) the vaccine/variant era cohort (n=2,735), followed from November 2020 to June 2022. Both cohorts underwent repeat serologic testing with an assay for antibodies to the SARS-CoV-2 N protein (Bio-Rad Platelia SARS-CoV-2 total Ab). We estimated crude incidence and sociodemographic/epidemiologic risk factors in both cohorts. We used multivariate Poisson models to compare the risk of SARS-CoV-2 infection in the pre-vaccine/wild-type era cohort (referent group) to that in the vaccine/variant era cohort, within strata of vaccination status and epidemiologic risk factors (essential worker status, child in the household, case in the household, social distancing). Findings: In the pre-vaccine/wild-type era cohort, only 18 of the 3,421 participants (0.53%) had ≥1 vaccine dose by the end of follow-up, compared with 2,497/2,735 (91.3%) in the vaccine/variant era cohort. We observed 323 and 815 seroconversions in the pre-vaccine/wild-type era and the vaccine/variant era cohorts, respectively, with corresponding incidence rates of 9.6 (95% CI: 8.3-11.5) and 25.7 (95% CI: 24.2-27.3) per 100 person-years.
Associations of sociodemographic and epidemiologic risk factors with SARS-CoV-2 incidence were largely similar in the pre-vaccine/wild-type and vaccine/variant era cohorts. However, some new epidemiologic risk factors emerged in the vaccine/variant era cohort, including having a child in the household, and never wearing a mask while using public transit. Adjusted incidence rate ratios (aIRR), with the entire pre-vaccine/wild-type era cohort as the referent group, showed markedly higher incidence in the vaccine/variant era cohort, but with more vaccine doses associated with lower incidence: aIRR (un/undervaccinated) = 5.3 (95% CI: 4.2-6.7); aIRR (primary series only) = 5.1 (95% CI: 4.2-7.3); aIRR (boosted once) = 2.5 (95% CI: 2.1-3.0); and aIRR (boosted twice) = 1.65 (95% CI: 1.3-2.1). These associations were essentially unchanged in risk factor-stratified models. Interpretation: In SARS-CoV-2 N protein seronegative individuals, large increases in incidence and newly emerging epidemiologic risk factors in the vaccine/variant era likely resulted from multiple co-occurring factors, including policy changes, behavior changes, surges in transmission, and changes in SARS-CoV-2 variant properties. While SARS-CoV-2 incidence increased markedly in most groups in the vaccine/variant era, being up to date on vaccines and the use of non-pharmaceutical interventions (NPIs), such as masking and social distancing, remained reliable strategies to mitigate the risk of SARS-CoV-2 infection, even through major surges due to immune evasive variants. Repeat serologic testing in cohort studies is a useful and complementary strategy to characterize SARS-CoV-2 incidence and risk factors.
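The crude incidence rates and their contrast quoted above reduce to standard person-time arithmetic. As a minimal sketch, the person-year denominators below are hypothetical values chosen only to reproduce the reported rates (they are not stated in this listing), and the Wald-type CI on the crude rate ratio is a textbook approximation, not the study's adjusted Poisson model:

```python
import math

def incidence_rate(events, person_years, per=100):
    """Crude incidence rate per `per` person-years."""
    return events / person_years * per

def rate_ratio(e1, py1, e0, py0, z=1.96):
    """Crude incidence rate ratio with a log-normal Wald CI for Poisson counts."""
    irr = (e1 / py1) / (e0 / py0)
    se = math.sqrt(1 / e1 + 1 / e0)  # standard error of log(IRR)
    return (irr,
            math.exp(math.log(irr) - z * se),
            math.exp(math.log(irr) + z * se))

# Hypothetical person-year totals chosen to match the reported rates
rate_pre = incidence_rate(323, 3365)   # ~9.6 per 100 person-years
rate_post = incidence_rate(815, 3171)  # ~25.7 per 100 person-years
irr, lo, hi = rate_ratio(815, 3171, 323, 3365)
```

Note that the crude ratio here is well below the reported aIRRs, which are adjusted and stratified by vaccination status; the sketch only illustrates the rate arithmetic, not the modeling.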

5.
Sci Transl Med ; 15(717): eadf4287, 2023 10 11.
Article En | MEDLINE | ID: mdl-37820009

Immune cell-based therapies are promising strategies to facilitate immunosuppression withdrawal after organ transplantation. Regulatory dendritic cells (DCreg) are innate immune cells that down-regulate alloimmune responses in preclinical models. Here, we performed clinical monitoring and comprehensive assessment of peripheral and allograft tissue immune cell populations in DCreg-infused live-donor liver transplant (LDLT) recipients up to 12 months (M) after transplant. Thirteen patients were given a single infusion of donor-derived DCreg 1 week before transplant (STUDY) and were compared with 40 propensity-matched standard-of-care (SOC) patients. Donor-derived DCreg infusion was well tolerated in all STUDY patients. There were no differences in postoperative complications or biopsy-confirmed acute rejection compared with SOC patients up to 12M. DCreg administration was associated with lower frequencies of effector T-bet+Eomes+CD8+ T cells and CD16bright natural killer (NK) cells and an increase in putative tolerogenic CD141+CD163+ DCs compared with SOC at 12M. Antidonor proliferative capacity of interferon-γ+ (IFN-γ+) CD4+ and CD8+ T cells was lower compared with antithird party responses in STUDY participants, but not in SOC patients, at 12M. In addition, lower circulating concentrations of interleukin-12p40 (IL-12p40), IFN-γ, and CXCL10 were detected in STUDY participants compared with SOC patients at 12M. Analysis of 12M allograft biopsies revealed lower frequencies of graft-infiltrating CD8+ T cells, as well as attenuation of cytolytic TH1 effector genes and pathways among intragraft CD8+ T cells and NK cells, in DCreg-infused patients. These reductions may be conducive to reduced dependence on immunosuppressive drug therapy or immunosuppression withdrawal.


CD8-Positive T-Lymphocytes , Liver Transplantation , Humans , Dendritic Cells/metabolism , Living Donors , Killer Cells, Natural , Interferon-gamma/metabolism , Graft Rejection
6.
Pediatr Crit Care Med ; 24(8): 636-651, 2023 08 01.
Article En | MEDLINE | ID: mdl-37125798

OBJECTIVES: Assess clinical outcomes following PICU Liberation ABCDEF Bundle utilization. DESIGN: Prospective, multicenter, cohort study. SETTING: Eight academic PICUs. PATIENTS: Children older than 2 months with an expected PICU stay greater than 2 days and a need for mechanical ventilation (MV). INTERVENTIONS: ABCDEF Bundle implementation. MEASUREMENTS AND MAIN RESULTS: Over an 11-month period (3-mo baseline, 8-mo implementation), Bundle utilization was measured for 622 patients totaling 5,017 PICU days. Risk of mortality was quantified for 532 patients (4,275 PICU days) to assess the correlation between Bundle utilization and MV duration, PICU length of stay (LOS), delirium incidence, and mortality. Utilization was analyzed as subject-specific (entire PICU stay) and day-specific (single PICU day). Median overall subject-specific utilization increased from 50% during the 3-month baseline to 63.9% during the last four implementation months (p < 0.001). Subject-specific utilization for elements A and C did not change; utilization improved for B (0-12.5%; p = 0.007), D (22.2-61.1%; p < 0.001), E (17.7-50%; p = 0.003), and F (50-79.2%; p = 0.001). We observed no association between Bundle utilization and MV duration, PICU LOS, or delirium incidence. In contrast, on adjusted analysis, every 10% increase in subject-specific utilization correlated with a 34% reduction in the odds of mortality (p < 0.001), and every 10% increase in day-specific utilization correlated with a 1.4% reduction in the odds of mortality (p = 0.006). CONCLUSIONS: The ABCDEF Bundle is applicable to children. Although enhanced Bundle utilization correlated with decreased mortality, increased utilization did not correlate with duration of MV, PICU LOS, or delirium incidence. Additional research in the domains of comparative effectiveness, implementation science, and human factors engineering is required to understand this clinical inconsistency and optimize PICU Liberation concept integration into clinical practice.


Critical Illness , Delirium , Humans , Child , Cohort Studies , Prospective Studies , Critical Illness/therapy , Critical Illness/epidemiology , Intensive Care Units , Delirium/epidemiology , Intensive Care Units, Pediatric
7.
Crit Care Med ; 51(4): 445-459, 2023 04 01.
Article En | MEDLINE | ID: mdl-36790189

OBJECTIVES: The COVID-19 pandemic threatened standard hospital operations. We sought to understand how this stress was perceived and manifested within individual hospitals and in relation to local viral activity. DESIGN: Prospective weekly hospital stress survey, November 2020-June 2022. SETTING: Society of Critical Care Medicine's Discovery Severe Acute Respiratory Infection-Preparedness multicenter cohort study. SUBJECTS: Thirteen hospitals across seven U.S. health systems. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: We analyzed 839 hospital-weeks of data over 85 pandemic weeks and five viral surges. Perceived overall hospital, ICU, and emergency department (ED) stress due to severe acute respiratory infection patients during the pandemic was reported by a mean of 43% (SD, 36%), 32% (30%), and 14% (22%) of hospitals per week, respectively; perceived care deviations were reported by a mean of 36% (33%) of hospitals per week. Overall hospital stress was highly correlated with ICU stress (ρ = 0.82; p < 0.0001) but only moderately correlated with ED stress (ρ = 0.52; p < 0.0001). A county increase of 10 severe acute respiratory syndrome coronavirus 2 cases per 100,000 residents was associated with an increase in the odds of overall hospital, ICU, and ED stress of 9% (95% CI, 5-12%), 7% (3-10%), and 4% (2-6%), respectively. During the Delta variant surge, overall hospital stress persisted for a median of 11.5 weeks (interquartile range, 9-14 wk) after the local case peak. ICU stress had a similar pattern of resolution (median, 11 wk [6-14 wk] after local case peak; p = 0.59), while the resolution of ED stress (median, 6 wk [5-6 wk] after local case peak; p = 0.003) occurred earlier. There was a similar but attenuated pattern during the Omicron BA.1 subvariant surge. CONCLUSIONS: During the COVID-19 pandemic, perceived care deviations were common and potentially avoidable patient harm was rare. Perceived hospital stress persisted for weeks after surges peaked.


COVID-19 , Humans , COVID-19/epidemiology , SARS-CoV-2 , Pandemics , Cohort Studies , Prospective Studies , Hospitals
8.
Crit Care Explor ; 5(1): e0827, 2023 Jan.
Article En | MEDLINE | ID: mdl-36600780

Vascular dysfunction and capillary leak are common in critically ill COVID-19 patients, but identification of endothelial pathways involved in COVID-19 pathogenesis has been limited. Angiopoietin-like 4 (ANGPTL4) is a protein secreted in response to hypoxic and nutrient-poor conditions that has a variety of biological effects, including vascular injury and capillary leak. OBJECTIVES: To assess the role of ANGPTL4 in COVID-19-related outcomes. DESIGN, SETTING, AND PARTICIPANTS: Two hundred twenty-five COVID-19 ICU patients were enrolled from April 2020 to May 2021 in a prospective, multicenter cohort study from three medical centers: University of Washington, University of Southern California, and New York University. MAIN OUTCOMES AND MEASURES: Plasma ANGPTL4 was measured on days 1, 7, and 14 after ICU admission. We used previously published tissue proteomic data and lung single-nucleus RNA (snRNA) sequencing data from specimens collected from COVID-19 patients to determine the tissues and cells that produce ANGPTL4. RESULTS: Higher plasma ANGPTL4 concentrations were significantly associated with worse hospital mortality (adjusted odds ratio per log2 increase, 1.53; 95% CI, 1.17-2.00; p = 0.002). Higher ANGPTL4 concentrations were also associated with higher proportions of venous thromboembolism and acute respiratory distress syndrome. Longitudinal ANGPTL4 concentrations were significantly different during the first 2 weeks of hospitalization in patients who subsequently died compared with survivors (p for interaction = 8.1 × 10^-5). Proteomics analysis demonstrated abundance of ANGPTL4 in lung tissue compared with other organs in COVID-19. ANGPTL4 single-nucleus RNA gene expression was significantly increased in pulmonary alveolar type 2 epithelial cells and fibroblasts in COVID-19 lung tissue compared with controls.
CONCLUSIONS AND RELEVANCE: ANGPTL4 is expressed in pulmonary epithelial cells and fibroblasts and is associated with clinical prognosis in critically ill COVID-19 patients.

9.
Crit Care Explor ; 4(10): e0773, 2022 Oct.
Article En | MEDLINE | ID: mdl-36284548

Respiratory virus infections cause significant morbidity and mortality ranging from mild uncomplicated acute respiratory illness to severe complications, such as acute respiratory distress syndrome, multiple organ failure, and death during epidemics and pandemics. We present a protocol to systematically study patients with severe acute respiratory infection (SARI), including severe acute respiratory syndrome coronavirus 2, due to respiratory viral pathogens to evaluate the natural history, prognostic biomarkers, and characteristics, including hospital stress, associated with clinical outcomes and severity. DESIGN: Prospective cohort study. SETTING: Multicenter cohort of patients admitted to an acute care ward or ICU from at least 15 hospitals representing diverse geographic regions across the United States. PATIENTS: Patients with SARI caused by infection with respiratory viruses that can cause outbreaks, epidemics, and pandemics. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Measurements include patient demographics, signs, symptoms, and medications; microbiology, imaging, and associated tests; mechanical ventilation, hospital procedures, and other interventions; and clinical outcomes and hospital stress, with specimens collected on days 0, 3, and 7-14 after enrollment and at discharge. The primary outcome measure is the number of consecutive days alive and free of mechanical ventilation (VFD) in the first 30 days after hospital admission. Important secondary outcomes include organ failure-free days (before acute kidney injury, shock, hepatic failure, or disseminated intravascular coagulation), 28-day mortality, and immunologic and microbiologic outcomes, including adaptive immunity.
CONCLUSIONS: SARI-Preparedness is a multicenter study under the collaboration of the Society of Critical Care Medicine Discovery, Resilience Intelligence Network, and National Emerging Special Pathogen Training and Education Center, which seeks to improve understanding of prognostic factors associated with worse outcomes and increased resource utilization. This can lead to interventions to mitigate the clinical impact of respiratory virus infections associated with SARI.

10.
Clin J Am Soc Nephrol ; 17(3): 374-384, 2022 03.
Article En | MEDLINE | ID: mdl-35217526

BACKGROUND AND OBJECTIVES: The progression of polycystic liver disease is not well understood. The purpose of the study is to evaluate the associations of polycystic liver progression with other disease progression variables and classify liver progression on the basis of patient's age, height-adjusted liver cystic volume, and height-adjusted liver volume. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Prospective longitudinal magnetic resonance images from 670 patients with early autosomal dominant polycystic kidney disease for up to 14 years of follow-up were evaluated to measure height-adjusted liver cystic volume and height-adjusted liver volume. Among them, 245 patients with liver cyst volume >50 ml at baseline were included in the longitudinal analysis. Linear mixed models on log-transformed height-adjusted liver cystic volume and height-adjusted liver volume were fitted to approximate mean annual rate of change for each outcome. The association of sex, body mass index, genotype, baseline height-adjusted total kidney volume, and Mayo imaging class was assessed. We calculated height-adjusted liver cystic volume ranges for each specific age and divided them into five classes on the basis of annual percentage increase in height-adjusted liver cystic volume. RESULTS: The mean annual growth rate of height-adjusted liver cystic volume was 12% (95% confidence interval, 11.1% to 13.1%; P<0.001), whereas that for height-adjusted liver volume was 2% (95% confidence interval, 1.9% to 2.6%; P<0.001). Women had higher baseline height-adjusted liver cystic volume than men, but men had higher height-adjusted liver cystic volume growth rate than women by 2% (95% confidence interval, 0.4% to 4.5%; P=0.02). Whereas the height-adjusted liver cystic volume growth rate decreased in women after menopause, no decrease was observed in men at any age. 
Body mass index, genotype, and baseline height-adjusted total kidney volume were not associated with the growth rate of height-adjusted liver cystic volume or height-adjusted liver volume. According to the height-adjusted liver cystic volume growth rate, patients were classified into five classes (number of women, men in each class): A (24, 6); B (44, 13); C (43, 48); D (28, 17); and E (13, 9). CONCLUSIONS: Compared with height-adjusted liver volume, the use of height-adjusted liver cystic volume showed greater separations in volumetric progression of polycystic liver disease. Similar to the Mayo imaging classification for the kidney, the progression of polycystic liver disease may be categorized on the basis of patient's age and height-adjusted liver cystic volume.
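The annual growth rates above come from linear mixed models fitted to log-transformed volumes, so a slope on the log scale back-transforms to an annual percentage change. A minimal sketch of that back-transformation (the slope value here is illustrative, chosen to match the reported 12% cystic-volume growth):

```python
import math

def annual_growth_pct(log_slope):
    """Convert a per-year slope from a model of log(volume) into annual % change."""
    return (math.exp(log_slope) - 1) * 100

# A log-scale slope of ln(1.12) per year corresponds to 12% annual growth,
# matching the reported height-adjusted liver cystic volume rate.
growth = annual_growth_pct(math.log(1.12))
```

The same conversion gives the 2% annual rate for total liver volume from a slope of ln(1.02).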


Polycystic Kidney, Autosomal Dominant , Cysts , Disease Progression , Female , Glomerular Filtration Rate , Humans , Kidney/diagnostic imaging , Kidney/pathology , Liver/diagnostic imaging , Liver/pathology , Liver Diseases , Magnetic Resonance Imaging , Male , Polycystic Kidney, Autosomal Dominant/complications , Polycystic Kidney, Autosomal Dominant/diagnostic imaging , Polycystic Kidney, Autosomal Dominant/genetics , Prospective Studies
11.
J Vasc Interv Radiol ; 32(9): 1258-1266.e6, 2021 09.
Article En | MEDLINE | ID: mdl-34242775

PURPOSE: To examine National Cancer Database (NCDB) data to comparatively evaluate overall survival (OS) between patients undergoing transarterial radioembolization (TARE) and those undergoing systemic therapy for hepatocellular carcinoma with major vascular invasion (HCC-MVI). METHODS: One thousand five hundred fourteen patients with HCC-MVI undergoing first-line TARE or systemic therapy were identified from the NCDB. OS was compared using propensity score-matched Cox regression and landmark analysis. Efficacy was also compared within a target trial framework. RESULTS: TARE usage doubled between 2010 and 2015. Intervals before treatment were longer for TARE than for systemic therapy (mean [median], 66.5 [60] days vs 46.8 [35] days, respectively; P < .0001). In propensity-score-matched and landmark-time-adjusted analyses, TARE was found to be associated with a hazard ratio of 0.74 (95% CI, 0.60-0.91; P = .005) and median OS of 7.1 months (95% CI, 5.0-10.5) versus 4.9 months (95% CI, 3.9-6.5) for systemically treated patients. In an emulated target trial involving 236 patients with unilobular HCC-MVI, a low number of comorbidities, creatinine levels <2.0 mg/dL, bilirubin levels <2.0 mg/dL, and international normalized ratio <1.7, TARE was found to be associated with a hazard ratio of 0.57 (95% CI, 0.39-0.83; P = .004) and a median OS of 12.9 months (95% CI, 7.6-19.2) versus 6.5 months (95% CI, 3.6-11.1) for the systemic therapy arm. CONCLUSIONS: In propensity-score-matched analyses involving pragmatic and target trial HCC-MVI cohorts, TARE was found to be associated with significant survival benefits compared with systemic therapy. Although not a substitute for prospective trials, these findings suggest that the increasing use of TARE for HCC-MVI is accompanied by improved OS. Further trials of TARE in patients with HCC-MVI are needed, especially to compare with newer systemic therapies.
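Propensity score-matched comparisons like the one above first pair treated and control patients with similar estimated treatment probabilities. The abstract does not specify the matching algorithm, so this is only an illustrative sketch of one common variant, greedy 1:1 nearest-neighbor matching with a caliper, applied to precomputed propensity scores:

```python
def greedy_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on precomputed propensity scores.

    Returns (treated_index, control_index) pairs; each control is used at most
    once, and pairs farther apart than `caliper` are discarded. Illustrative
    only: the NCDB analysis's actual matching procedure is not described here.
    """
    available = dict(enumerate(control_ps))  # control index -> score
    pairs = []
    # Matching highest-score treated units first is a common heuristic;
    # with greedy matching, the processing order affects the result.
    for i, ps in sorted(enumerate(treated_ps), key=lambda t: -t[1]):
        if not available:
            break
        j = min(available, key=lambda k: abs(available[k] - ps))
        if abs(available[j] - ps) <= caliper:
            pairs.append((i, j))
            del available[j]
    return pairs
```

The matched pairs would then feed a stratified Cox model; optimal (rather than greedy) matching is another common choice when pair quality matters more than speed.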


Carcinoma, Hepatocellular , Liver Neoplasms , Carcinoma, Hepatocellular/radiotherapy , Humans , Liver Neoplasms/therapy , Propensity Score , Prospective Studies , Yttrium Radioisotopes
12.
Qual Manag Health Care ; 29(4): 260-269, 2020.
Article En | MEDLINE | ID: mdl-32991545

BACKGROUND AND OBJECTIVES: Root cause analysis involves evaluation of causal relationships between exposures (or interventions) and adverse outcomes, such as identification of direct (eg, medication orders missed) and root causes (eg, clinician's fatigue and workload) of adverse rare events. To assess causality requires either randomization or sophisticated methods applied to carefully designed observational studies. In most cases, randomized trials are not feasible in the context of root cause analysis. Using observational data for causal inference, however, presents many challenges in both the design and analysis stages. Methods for observational causal inference often fall outside the toolbox of even well-trained statisticians, thus necessitating workforce training. METHODS: This article synthesizes the key concepts and statistical perspectives for causal inference, and describes available educational resources, with a focus on observational clinical data. The target audience for this review is clinical researchers with training in fundamental statistics or epidemiology, and statisticians collaborating with those researchers. RESULTS: The available literature includes a number of textbooks and thousands of review articles. However, using this literature for independent study or clinical training programs is extremely challenging for numerous reasons. First, the published articles often assume an advanced technical background with different notations and terminology. Second, they may be written from any number of perspectives across statistics, epidemiology, computer science, or philosophy. Third, the methods are rapidly expanding and thus difficult to capture within traditional publications. Fourth, even the most fundamental aspects of causal inference (eg, framing the causal question as a target trial) often receive little or no coverage. 
This review presents an overview of (1) key concepts and frameworks for causal inference and (2) online documents that are publicly available for better assisting researchers to gain the necessary perspectives for functioning effectively within a multidisciplinary team. CONCLUSION: A familiarity with causal inference methods can help risk managers empirically verify, from observed events, the true causes of adverse sentinel events.


Causality , Risk Assessment/methods , Humans , Observational Studies as Topic , Propensity Score , Publications
13.
J Am Soc Nephrol ; 31(7): 1640-1651, 2020 07.
Article En | MEDLINE | ID: mdl-32487558

BACKGROUND: The Mayo Clinic imaging classification of autosomal dominant polycystic kidney disease (ADPKD) uses height-adjusted total kidney volume (htTKV) and age to identify patients at highest risk for disease progression. However, this classification applies only to patients with typical diffuse cystic disease (class 1). Because htTKV poorly predicts eGFR decline for the 5%-10% of patients with atypical morphology (class 2), imaging-based risk modeling remains unresolved. METHODS: Of 558 adults with ADPKD in the HALT-A study, we identified 25 patients of class 2A with prominent exophytic cysts (class 2Ae) and 43 patients of class 1 with prominent exophytic cysts; we recalculated their htTKVs to exclude exophytic cysts. Using original and recalculated htTKVs in association with imaging classification in logistic and mixed linear models, we compared predictions for developing CKD stage 3 and for eGFR trajectory. RESULTS: Using recalculated htTKVs increased specificity for developing CKD stage 3 in all participants from 82.6% to 84.2% after adjustment for baseline age, eGFR, BMI, sex, and race. The predicted proportion of class 2Ae patients developing CKD stage 3 using a cutoff of 0.5 for predicting case status was better calibrated to the observed value of 13.0% with recalculated htTKVs (45.5%) versus original htTKVs (63.6%). Using recalculated htTKVs reduced the mean paired difference between predicted and observed eGFR from 17.6 (using original htTKVs) to 4.0 ml/min per 1.73 m2 for class 2Ae, and from -1.7 (using original htTKVs) to 0.1 ml/min per 1.73 m2 for class 1. CONCLUSIONS: Use of a recalculated htTKV measure that excludes prominent exophytic cysts facilitates inclusion of class 2 patients and reclassification of class 1 patients in the Mayo classification model.
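The specificity figures above (for predicting development of CKD stage 3 at a 0.5 predicted-probability cutoff) reduce to a confusion-matrix calculation on model outputs. A minimal sketch with hypothetical labels and probabilities (the study's data are not reproduced here):

```python
def specificity_at_cutoff(y_true, y_prob, cutoff=0.5):
    """Specificity of predicted case status at a probability cutoff.

    y_true: 1 if the patient developed CKD stage 3, else 0.
    y_prob: model-predicted probability of case status.
    """
    tn = sum(1 for y, p in zip(y_true, y_prob) if y == 0 and p < cutoff)
    fp = sum(1 for y, p in zip(y_true, y_prob) if y == 0 and p >= cutoff)
    return tn / (tn + fp)

# Hypothetical example: three non-cases, one flagged above the cutoff
spec = specificity_at_cutoff([0, 0, 0, 1], [0.2, 0.6, 0.4, 0.9])
```

The calibration comparison in the abstract is the complementary check: among predicted probabilities, how the mean prediction for a subgroup (e.g., class 2Ae) tracks its observed event proportion.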


Kidney/pathology , Polycystic Kidney, Autosomal Dominant/classification , Polycystic Kidney, Autosomal Dominant/diagnostic imaging , Renal Insufficiency, Chronic/etiology , Adult , Body Height , Disease Progression , Female , Glomerular Filtration Rate , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Organ Size , Polycystic Kidney, Autosomal Dominant/complications , Polycystic Kidney, Autosomal Dominant/pathology , Predictive Value of Tests , ROC Curve , Risk Assessment/methods , Young Adult
14.
J Thorac Cardiovasc Surg ; 159(4): 1222-1227, 2020 04.
Article En | MEDLINE | ID: mdl-31030960

BACKGROUND: Left subclavian revascularization has become an integral part of thoracic endovascular aortic repair to extend the proximal landing zone. This is most commonly achieved via carotid-subclavian bypass; however, it can also be achieved via vessel transposition. METHODS: All patients who had zone 2 thoracic endovascular aortic repairs without branched grafts from 2007 to 2018 were included in the study. The primary outcomes were adverse events, including operative mortality, paraplegia, left arm ischemia, and stroke. Multivariable regression analysis was performed for baseline characteristics associated with adverse events. RESULTS: A total of 58 patients underwent left subclavian artery transposition for zone 2 thoracic endovascular aortic repair coverage. Operative (30-day) mortality occurred in 3 patients (5.2%). The majority of patients were operated on under urgent (N = 25; 43.1%) or emergency (N = 12; 20.7%) status. Indications for thoracic endovascular aortic repair included aneurysmal disease (34.5%) and type B aortic dissection (chronic [13.8%]; acute [51.7%]). Major adverse events included paraplegia (N = 1; 1.7%), transient paraparesis (N = 3; 5.2%), and stroke (N = 2; 3.4%). Over a mean follow-up of 2.8 years, there were 5 deaths (8.6%). On multivariable analysis, prior stroke (odds ratio, 31.4; 95% CI, 1.95-506.72; P = .02) was an independent predictor of adverse events. CONCLUSIONS: Carotid-subclavian transposition offers patients a safe and effective method for left subclavian artery revascularization during thoracic endovascular aortic repair with zone 2 coverage with no increased operative risk and a low complication rate.


Aorta, Thoracic/surgery , Aortic Diseases/surgery , Endovascular Procedures , Subclavian Artery/surgery , Anastomosis, Surgical , Aortic Dissection/surgery , Aortic Aneurysm, Thoracic/surgery , Blood Vessel Prosthesis , Female , Humans , Male , Middle Aged , Postoperative Complications , Risk Factors
...