Results 1 - 20 of 149
1.
Article in English | MEDLINE | ID: mdl-38649110

ABSTRACT

OBJECTIVE: Despite guideline recommendations, cardiac rehabilitation (CR) after cardiac surgery remains underused, and the extent of interhospital variability is not well understood. This study evaluated determinants of interhospital variability in CR use and outcomes. METHODS: This retrospective cohort study included 166,809 Medicare beneficiaries undergoing cardiac surgery who were discharged alive between July 1, 2016, and December 31, 2018. CR participation was identified in outpatient facility claims within a year of discharge. Hospital-level CR rates were tabulated, and multilevel models evaluated the extent to which patient, organizational, and regional factors accounted for interhospital variability. Adjusted 1-year mortality and readmission rates were also calculated for each hospital quartile of CR use. RESULTS: Overall, 90,171 (54.1%) participated in at least 1 CR session within a year of discharge. Interhospital CR rates ranged from 0.0% to 96.8%. Hospital factors that predicted CR use included nonteaching status and lower hospital volume. Before adjustment for patient, organizational, and regional factors, 19.3% of interhospital variability was attributable to the admitting hospital. After accounting for covariates, 12.3% of variation was attributable to the admitting hospital. Patient (0.5%), structural (2.8%), and regional (3.7%) factors accounted for the remaining explained variation. Hospitals in the lowest quartile of CR use had greater adjusted 1-year mortality rates (Q1 = 6.7%, Q4 = 5.2%, P < .001) and readmission rates (Q1 = 37.6%, Q4 = 33.9%, P < .001). CONCLUSIONS: Identifying best practices among high CR use facilities and barriers to access in low CR use hospitals may reduce interhospital variability in CR use and advance national improvement efforts.
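
The variance-partitioning step described above (19.3% of variability attributable to the admitting hospital before adjustment) corresponds to the intraclass correlation of a random-intercept logistic model. A minimal sketch, assuming a hypothetical hospital-level intercept variance rather than the study's estimate:

```python
import numpy as np

# Intraclass correlation (ICC) for a random-intercept logistic model: the share
# of outcome variation attributable to the admitting hospital. The hospital-level
# intercept variance below is a hypothetical value chosen for illustration.
sigma2_hospital = 0.79
residual_variance = np.pi ** 2 / 3  # fixed level-1 variance on the logit scale

icc = sigma2_hospital / (sigma2_hospital + residual_variance)
print(f"Variation attributable to the admitting hospital: {icc:.1%}")
```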

2.
Article in English | MEDLINE | ID: mdl-38522574

ABSTRACT

BACKGROUND: Cardiac rehabilitation (CR) is a guideline-recommended risk-reduction program offered to cardiac surgical patients. Despite CR's association with better outcomes, attendance remains poor. The relationship between discharge location and CR use is poorly understood. METHODS: This study was a nationwide, retrospective cohort analysis of Medicare fee-for-service claims for beneficiaries undergoing coronary artery bypass grafting and/or surgical aortic valve repair between July 1, 2016, and December 31, 2018. The primary outcome was attendance of any CR session. Discharge location was categorized as home discharge or discharge to extended care facility (ECF) (including skilled nursing facility, inpatient rehabilitation, and long-term acute care). Multivariable logistic regression models evaluated the association between discharge location, CR attendance, and 1-year mortality. RESULTS: Of the 167,966 patients who met inclusion criteria, 34.1% were discharged to an ECF. The overall CR usage rate was 53.9%. Unadjusted and adjusted CR use was lower among patients discharged to ECFs versus those discharged home (42.1% vs 60.0%; adjusted odds ratio, 0.66; P < .001). Patients discharged to long-term acute care were less likely to use CR than those discharged to skilled nursing facility or inpatient rehabilitation (reference category: home; adjusted odds ratio for long-term acute care, 0.36; adjusted odds ratio for skilled nursing facility, 0.69; and adjusted odds ratio for inpatient rehabilitation, 0.71; P < .001). CR attendance was associated with a greater reduction in adjusted 1-year mortality in patients discharged to ECFs (9.7% reduction) versus those discharged home (4.3% reduction). CONCLUSIONS: In this national analysis of Medicare beneficiaries, discharge to ECF was associated with lower CR use, despite a greater association with improved 1-year mortality. Interventions aimed at increasing CR enrollment at ECFs may improve CR use and advance surgical quality.
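
A hedged sketch of the multivariable logistic model the abstract describes, using statsmodels; the DataFrame and column names (cr_attend, discharge_ecf, and the covariates) are illustrative assumptions, not the study's actual variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df is assumed to hold one row per beneficiary with an indicator for CR
# attendance, discharge destination, and adjustment covariates.
fit = smf.logit("cr_attend ~ discharge_ecf + age + female + diabetes + heart_failure",
                data=df).fit()

odds_ratios = np.exp(fit.params)   # the paper reports an adjusted OR of ~0.66 for ECF discharge
or_ci = np.exp(fit.conf_int())
print(pd.concat([odds_ratios, or_ci], axis=1))
```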

3.
Transfusion ; 64(3): 457-465, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38314476

ABSTRACT

BACKGROUND: The Mirasol® Pathogen Reduction Technology System was developed to reduce transfusion-transmitted diseases in platelet (PLT) products. STUDY DESIGN AND METHODS: The MiPLATE trial was a prospective, multicenter, controlled, randomized, non-inferiority (NI) study of the clinical effectiveness of conventional versus Mirasol-treated Apheresis PLTs in participants with hypoproliferative thrombocytopenia. The novel primary endpoint was days of ≥Grade 2 bleeding with an NI margin of 1.6. RESULTS: After 330 participants were randomized, a planned interim analysis of 297 participants (145 MIRASOL, 152 CONTROL) receiving ≥1 study transfusion found a relative rate (RR) of 2.79 for the number of days with ≥Grade 2 bleeding in MIRASOL compared with CONTROL (95% confidence interval [CI] 1.67-4.67). The proportion of subjects with ≥Grade 2 bleeding was 40.0% (n = 58) in MIRASOL and 30.3% (n = 46) in CONTROL (RR = 1.32, 95% CI 0.97-1.81, p = .08). Corrected count increments were lower (p < .01) and the number of PLT transfusion episodes per participant was higher (RR = 1.22, 95% CI 1.05-1.41) in MIRASOL. There was no difference in the days of PLT support (hazard ratio = 0.86, 95% CI 0.68-1.08) or total number of red blood cell transfusions (RR = 1.12, 95% CI 0.91-1.37) between MIRASOL and CONTROL. Transfusion-emergent adverse events were reported in 119 MIRASOL participants (84.4%) compared to 133 (82.6%) participants in CONTROL (p = NS). DISCUSSION: This study did not support non-inferiority of MIRASOL relative to conventional platelets for the novel endpoint of number of days with ≥Grade 2 bleeding.
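
The non-inferiority comparison above reduces to checking the confidence interval for the rate ratio against the prespecified margin; a minimal sketch using the figures reported in the abstract:

```python
# Non-inferiority check on the primary endpoint: MIRASOL would be non-inferior
# only if the upper 95% CI bound of the rate ratio (MIRASOL vs CONTROL) for days
# with >=Grade 2 bleeding fell below the prespecified margin of 1.6.
rate_ratio, ci_lower, ci_upper = 2.79, 1.67, 4.67
ni_margin = 1.6

non_inferior = ci_upper < ni_margin
print(f"RR = {rate_ratio} (95% CI {ci_lower}-{ci_upper}); non-inferior: {non_inferior}")
# -> non-inferior: False, consistent with the trial's conclusion
```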


Subjects
Blood Component Removal, Thrombocytopenia, Humans, Prospective Studies, Blood Platelets, Thrombocytopenia/therapy, Thrombocytopenia/etiology, Hemorrhage/therapy, Hemorrhage/etiology, Platelet Transfusion/adverse effects, Treatment Outcome
4.
J Subst Use Addict Treat ; 157: 209271, 2024 02.
Article in English | MEDLINE | ID: mdl-38135120

ABSTRACT

INTRODUCTION: Overdose deaths are increasing disproportionately for minoritized populations in the United States. Disparities in substance use disorder treatment access and use have been a key contributor to this phenomenon. However, little is known about the magnitude of these disparities and the role of social determinants of health (SDOH) and provider characteristics in driving them. Our study measures the association between race and ethnicity and visits with Medication for Opioid Use Disorder (MOUD) providers, MOUD treatment conditional on a provider visit, and opioid overdose following MOUD treatment in Medicare. We also evaluate the role of social determinants of health and provider characteristics in modifying disparities. METHODS: Using a population of 230,198 US Medicare fee-for-service beneficiaries diagnosed with opioid use disorder (OUD), we estimate logistic regression models to quantify the association between belonging to a racial or ethnic group and the probability of visiting a buprenorphine or naltrexone provider, receiving a prescription or medication administration during or after a visit, and experiencing an opioid overdose after treatment with MOUD. Data included Medicare claims data and the Agency for Healthcare Research and Quality Social Determinants of Health Database files between 2013 and 2017. RESULTS: Compared to Non-Hispanic White Medicare beneficiaries, Asian/Pacific Islander, American Indian/Alaska Native, Black, Hispanic, and Other/Unknown Race beneficiaries were between 3.0 and 9.3 percentage points less likely to have a visit with a buprenorphine or naltrexone provider. Conditional on having a buprenorphine or naltrexone provider visit, Asian/Pacific Islander, American Indian/Alaska Native, Black, Hispanic, and Other/Unknown Race beneficiaries were between 2.6 and 8.1 percentage points less likely to receive buprenorphine or naltrexone than White beneficiaries. Controlling for provider characteristics and SDOH increased disparities in visits and MOUD treatment for all groups besides American Indians/Alaska Natives. Conditional on treatment, only Black Medicare beneficiaries were at greater associated risk of overdose than non-Hispanic White beneficiaries, although differences became statistically insignificant after controlling for SDOH and including provider fixed effects. CONCLUSION: Ongoing equity programming and measurement efforts by CMS should include explicit consideration for disparities in access and use of MOUD. This may help ensure greater MOUD utilization by minoritized Medicare beneficiaries and reduce rising disparities in overdose deaths.
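
Since the abstract reports disparities as percentage-point differences, the logistic models above are presumably summarized as average marginal effects; a hedged sketch with illustrative column names (moud_visit, race_eth, and the covariates are assumptions):

```python
import statsmodels.formula.api as smf

# Logistic model of any visit with a buprenorphine or naltrexone provider,
# with race/ethnicity indicators plus adjustment covariates (names assumed).
fit = smf.logit("moud_visit ~ C(race_eth) + age + female + dual_eligible",
                data=df).fit()

# Average marginal effects convert log-odds coefficients into probability-scale
# differences; multiplied by 100 they read as percentage points.
ame = fit.get_margeff(at="overall")
print(ame.summary())
```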


Subjects
Buprenorphine, Opiate Overdose, Opioid-Related Disorders, Aged, Humans, United States/epidemiology, Naltrexone/therapeutic use, Medicare, Opiate Overdose/drug therapy, Opioid-Related Disorders/drug therapy, Buprenorphine/therapeutic use, Treatment Outcome
5.
Transfusion ; 63(7): 1354-1365, 2023 07.
Article in English | MEDLINE | ID: mdl-37255467

ABSTRACT

BACKGROUND: The true burden of COVID-19 in low- and middle-income countries remains poorly characterized, especially in Africa. Even prior to the availability of SARS-CoV-2 vaccines, countries in Africa had lower numbers of reported COVID-19 related hospitalizations and deaths than other regions globally. METHODS: Ugandan blood donors were evaluated between October 2019 and April 2022 for IgG antibodies to SARS-CoV-2 nucleocapsid (N), spike (S), and five variants of the S protein using multiplexed electrochemiluminescence immunoassays (MesoScale Diagnostics, Rockville, MD). Seropositivity for N and S was assigned using manufacturer-provided cutoffs, and trends in seroprevalence were estimated by quarter. Statistically significant associations between N and S antibody seropositivity and donor characteristics in November-December 2021 were assessed by chi-square tests. RESULTS: A total of 5393 blood unit samples from donors were evaluated. N and S seropositivity increased throughout the pandemic to 82.6% in January-April 2022. Among seropositive individuals, N and S antibody levels increased ≥9-fold over the study period. In November-December 2021, seropositivity to N and S antibody was higher among repeat donors (61.3%) compared with new donors (55.1%; p = .043) and among donors from Kampala (capital city of Uganda) compared with rural regions (p = .007). Seropositivity to S antibody was significantly lower among HIV-seropositive individuals (58.8% vs. 84.9%; p = .009). CONCLUSIONS: Despite previously reported low numbers of COVID-19 cases and related deaths in Uganda, high SARS-CoV-2 seroprevalence and increasing antibody levels among blood donors indicated that the country experienced high levels of infection over the course of the pandemic.
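
A minimal sketch of the chi-square comparison described above (repeat vs. new donors in November-December 2021); the cell counts are reconstructed for illustration from the reported percentages, not the study's exact table:

```python
from scipy.stats import chi2_contingency

# Rows: repeat donors, new donors; columns: seropositive, seronegative.
# Counts are illustrative reconstructions (~61.3% and ~55.1% positive).
table = [[613, 387],
         [551, 449]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```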


Assuntos
Doadores de Sangue , COVID-19 , Humanos , Uganda/epidemiologia , SARS-CoV-2 , Vacinas contra COVID-19 , Estudos Soroepidemiológicos , COVID-19/epidemiologia , Anticorpos Antivirais
7.
JAMA Netw Open ; 5(11): e2240646, 2022 11 01.
Article in English | MEDLINE | ID: mdl-36342716

ABSTRACT

Importance: In 2020, the Centers for Medicare & Medicaid Services revised its national coverage determination, removing the requirement to obtain review from a Medicare-approved heart transplant center to implant a durable left ventricular assist device (LVAD) for bridge-to-transplant (BTT) intent at an LVAD-only center. The association between center-level transplant availability and access to heart transplant, the gold-standard therapy for advanced heart failure (HF), is unknown. Objective: To investigate the association of center transplant availability with LVAD implant strategies and subsequent heart transplant following LVAD implant before the Centers for Medicare & Medicaid Services policy change. Design, Setting, and Participants: A retrospective cohort study of the Society of Thoracic Surgeons Intermacs multicenter US registry database was conducted from April 1, 2012, to June 30, 2020. The population included patients with HF receiving a primary durable LVAD. Exposures: LVAD center transplant availability (LVAD/transplant vs LVAD only). Main Outcomes and Measures: The primary outcomes were implant strategy as BTT and subsequent transplant by 2 years. Covariates that might affect listing strategy and outcomes (eg, patient demographic characteristics, comorbidities) were included in multivariable models. Parameters for BTT listing were estimated using logistic regression with center-level random effects and for receipt of a transplant using a Cox proportional hazards regression model with death as a competing event. Results: The sample included 22 221 LVAD recipients with a median age of 59.0 (IQR, 50.0-67.0) years, of whom 17 420 (78.4%) were male and 3156 (14.2%) received implants at LVAD-only centers. Receiving an LVAD at an LVAD/transplant center was associated with a 79% increased adjusted odds of BTT LVAD designation (odds ratio, 1.79; 95% CI, 1.35-2.38; P < .001). The 2-year transplant rate following LVAD implant was 25.6% at LVAD/transplant centers and 11.9% at LVAD-only centers. There was an associated 33% increased rate of transplant at LVAD/transplant centers compared with LVAD-only centers (adjusted hazard ratio, 1.33; 95% CI, 1.17-1.51) with a similar hazard for death at 2 years (adjusted hazard ratio, 0.99; 95% CI, 0.90-1.08). Conclusions and Relevance: Receiving an LVAD at an LVAD/transplant center was associated with increased odds of BTT intent at implant and of subsequent transplant receipt at 2 years. The findings of this study suggest that the Centers for Medicare & Medicaid Services policy change may have the unintended consequence of further increasing inequities in access to transplant among patients at LVAD-only centers.
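
The transplant outcome above is modeled with a Cox regression that treats death as a competing event; a simplified, hedged cause-specific version (censoring at death rather than a full competing-risks analysis) can be sketched with lifelines, using assumed column names:

```python
from lifelines import CoxPHFitter

# One row per LVAD recipient; columns are illustrative assumptions. A full
# competing-risks analysis would use a method such as Fine-Gray; here deaths
# are simply censored to show the cause-specific structure.
cols = ["months_to_event", "transplanted", "lvad_transplant_center", "age", "female"]
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="months_to_event", event_col="transplanted")
cph.print_summary()  # the paper reports an adjusted HR of ~1.33 for transplant-capable centers
```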


Subjects
Heart Failure, Heart Transplantation, Heart-Assist Devices, Humans, Male, Aged, United States/epidemiology, Middle Aged, Female, Retrospective Studies, Medicare, Heart Failure/surgery
8.
Sci Rep ; 12(1): 19397, 2022 11 12.
Article in English | MEDLINE | ID: mdl-36371591

ABSTRACT

Vitamin D deficiency has long been associated with reduced immune function that can lead to viral infection. Several studies have shown that Vitamin D deficiency is associated with an increased risk of infection with COVID-19. However, it is unknown if treatment with Vitamin D can reduce the associated risk of COVID-19 infection, which is the focus of this study. In the population of US veterans, we show that Vitamin D2 and D3 fills were associated with reductions in COVID-19 infection of 28% and 20%, respectively (D3 hazard ratio (HR) = 0.80 [95% CI 0.77, 0.83]; D2 HR = 0.72 [95% CI 0.65, 0.79]). Mortality within 30 days of COVID-19 infection was similarly 33% lower with Vitamin D3 and 25% lower with D2 (D3 HR = 0.67 [95% CI 0.59, 0.75]; D2 HR = 0.75 [95% CI 0.55, 1.04]). We also find that after controlling for vitamin D blood levels, veterans receiving higher dosages of Vitamin D obtained greater benefits from supplementation than veterans receiving lower dosages. Veterans with Vitamin D blood levels between 0 and 19 ng/ml exhibited the largest decrease in COVID-19 infection following supplementation. Black veterans received greater associated COVID-19 risk reductions with supplementation than White veterans. As a safe, widely available, and affordable treatment, Vitamin D may help to reduce the severity of the COVID-19 pandemic.


Subjects
COVID-19, Vitamin D Deficiency, Humans, Pandemics, Dietary Supplements, Vitamin D Deficiency/complications, Vitamin D Deficiency/drug therapy, Vitamin D Deficiency/epidemiology, Cholecalciferol, Vitamin D/therapeutic use, Vitamins/therapeutic use
9.
JAMA Psychiatry ; 79(12): 1173-1179, 2022 12 01.
Article in English | MEDLINE | ID: mdl-36197659

ABSTRACT

Importance: Nonadherence to buprenorphine may increase patient risk of opioid overdose and increase health care spending. Quantifying the impacts of nonadherence can help inform clinician practice and policy. Objective: To estimate the association between buprenorphine treatment gaps, opioid overdose, and health care spending. Design, Setting, and Participants: This longitudinal case-control study compared patient opioid overdose and health care spending in buprenorphine-treated months with treatment gap months. Individuals who were US Medicare fee-for-service beneficiaries diagnosed with opioid use disorder who received at least 1 two-week period of continuous buprenorphine treatment between 2010 and 2017 were included. Analysis took place between January 2010 and December 2017. Interventions: A gap in buprenorphine treatment in a month lasting more than 15 consecutive days. Main Outcomes and Measures: Opioid overdose and total, medical, and drug spending (combined patient out-of-pocket and Medicare spending). Results: Of 34 505 Medicare beneficiaries (17 927 [52.0%] male; 16 578 [48.1%] female; mean [SD] age, 49.5 [12.7] years; 168 [0.5%] Asian; 2949 [8.5%] Black; 2089 [6.0%] Hispanic; 266 [0.8%] Native American; 28 525 [82.7%] White; 508 [1.5%] other race), 11 524 beneficiaries (33.4%) experienced 1 or more buprenorphine treatment gaps. Treatment gap beneficiaries, compared with nontreatment gap beneficiaries, were more likely to be younger, be male, have a disability, and be Medicaid dual-eligible while less likely to be White, close to a buprenorphine prescriber, and treated with buprenorphine monotherapy (ie, buprenorphine hydrochloride). Beneficiaries were 2.89 (95% CI, 2.20-3.79) times more likely to experience an opioid overdose during buprenorphine treatment gap months compared with treated months. During treatment gap months, spending was $196.41 (95% CI, $110.53-$282.30) more than in treated months. Patients who continued to take buprenorphine dosages of greater than 8 mg/d and 16 mg/d were 2.61 and 2.84 times more likely to overdose in a treatment gap month, respectively, while patients taking buprenorphine dosages of 8 mg/d or less were 3.62 times more likely to overdose in a treatment gap month (maintenance of >16 mg/d: hazard ratio (HR), 2.64 [95% CI, 1.80-3.87]; maintenance of >8 mg/d: HR, 2.84 [95% CI, 2.13-3.78]; maintenance of ≤8 mg/d: HR, 3.62 [95% CI, 1.54-8.50]). Buprenorphine monotherapy was associated with greater risk of overdose and higher spending during treatment gap months than buprenorphine/naloxone. Conclusions and Relevance: Medicare patients treated with buprenorphine between 2010 and 2017 had a lower associated opioid overdose risk and spending during treatment months than treatment gap months.
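
A hedged sketch of how a treatment-gap month (more than 15 consecutive days without buprenorphine coverage) might be flagged from a beneficiary's daily coverage indicator; the helper is illustrative and, for simplicity, only looks for gaps within a single calendar month:

```python
import pandas as pd

def longest_uncovered_run(covered_days: pd.Series) -> int:
    # Length of the longest run of consecutive uncovered days; uncovered days
    # between covered days share a group id derived from the cumulative sum.
    runs = (~covered_days).astype(int).groupby(covered_days.cumsum()).sum()
    return int(runs.max()) if len(runs) else 0

def flag_gap_months(covered: pd.Series) -> pd.Series:
    # `covered` is a daily boolean Series with a DatetimeIndex (True = covered).
    # Returns one boolean per month: True if that month contains a gap > 15 days.
    return covered.resample("MS").apply(lambda month: longest_uncovered_run(month) > 15)
```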


Subjects
Buprenorphine, Opiate Overdose, United States, Humans, Aged, Female, Male, Middle Aged, Buprenorphine/therapeutic use, Case-Control Studies, Health Expenditures, Medicare
10.
JAMA Netw Open ; 5(7): e2223080, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35895063

ABSTRACT

Importance: While left ventricular assist devices (LVADs) increase survival for patients with advanced heart failure (HF), racial and sex access and outcome inequities remain and are poorly understood. Objectives: To assess risk-adjusted inequities in access and outcomes for both Black and female patients and to examine heterogeneity in treatment decisions among patients for whom clinician discretion has a more prominent role. Design, Setting, and Participants: This retrospective cohort study of 12 310 Medicare beneficiaries used 100% Medicare Fee-for-Service administrative claims. Included patients had been admitted for heart failure from 2008 to 2014. Data were collected from July 2007 to December 2015 and analyzed from August 23, 2020, to May 15, 2022. Exposures: Beneficiary race and sex. Main Outcomes and Measures: The propensity for LVAD implantation was based on clinical risk factors from the 6 months preceding HF admission using XGBoost and the synthetic minority oversampling technique. Beneficiaries with a 5% or greater probability of receiving an LVAD were included. Logistic regression models were estimated to measure associations of race and sex with LVAD receipt adjusting for clinical characteristics and social determinants of health (eg, distance from LVAD center, Medicare low-income subsidy, neighborhood deprivation). Next, 1-year mortality after LVAD was examined. Results: The analytic sample included 12 310 beneficiaries, of whom 22.9% (n = 2819) were Black and 23.7% (n = 2920) were women. In multivariable models, Black beneficiaries were 3.0% (0.2% to 5.8%) less likely to receive LVAD than White beneficiaries, and women were 7.9% (5.6% to 10.2%) less likely to receive LVAD than men. Individual poverty and worse neighborhood deprivation were associated with reduced use, 2.9% (0.4% to 5.3%) and 6.7% (2.9% to 10.5%), respectively, but these measures did little to explain observed disparities. The racial disparity was concentrated among patients with a low propensity score (propensity score <0.52). One-year survival by race and sex was similar on average, but Black patients with a low propensity score experienced improved survival (7.2% [95% CI, 0.9% to 13.5%]). Conclusions and Relevance: In this cohort study of Medicare beneficiaries hospitalized for HF, disparities in LVAD use by race and sex existed and were not explained by clinical characteristics or social determinants of health. The treatment and post-LVAD survival by race were equivalent among the most obvious LVAD candidates. However, there was differential use and outcomes among less clear-cut LVAD candidates, with lower use but improved survival among Black patients. Inequity in LVAD access may have resulted from differences in clinician decision-making because of systemic racism and discrimination, implicit bias, or patient preference.
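
A hedged sketch of the propensity step described above (XGBoost trained on SMOTE-balanced data, then a ≥5% predicted probability threshold to define the analytic cohort); the DataFrame and column names are assumptions:

```python
import xgboost as xgb
from imblearn.over_sampling import SMOTE

# df is assumed to hold one row per beneficiary: prior clinical risk factors in
# `risk_factor_cols` and a binary indicator `lvad` for device receipt.
X, y = df[risk_factor_cols], df["lvad"]
X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)

clf = xgb.XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
clf.fit(X_res, y_res)

df["propensity"] = clf.predict_proba(X)[:, 1]   # score the original, unbalanced data
cohort = df[df["propensity"] >= 0.05]           # the abstract's >=5% probability cutoff
```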


Subjects
Heart Failure, Heart-Assist Devices, Aged, Cohort Studies, Female, Heart Failure/therapy, Humans, Male, Medicare, Retrospective Studies, United States/epidemiology
11.
Trials ; 23(1): 257, 2022 Apr 04.
Article in English | MEDLINE | ID: mdl-35379302

ABSTRACT

BACKGROUND: Transfusion-transmitted infections (TTIs) are a global health challenge. One new approach to reduce TTIs is the use of pathogen reduction technology (PRT). In vitro, Mirasol PRT reduces the infectious load in whole blood (WB) by at least 99%. However, there are limited in vivo data on the safety and efficacy of Mirasol PRT. The objective of the Mirasol Evaluation of Reduction in Infections Trial (MERIT) is to investigate whether Mirasol PRT of WB can prevent seven targeted TTIs (malaria, bacteria, human immunodeficiency virus, hepatitis B virus, hepatitis C virus, hepatitis E virus, and human herpesvirus 8). METHODS: MERIT is a randomized, double-blinded, controlled clinical trial. Recruitment started in November 2019 and is expected to end in 2024. Consenting participants who require transfusion as medically indicated at three hospitals in Kampala, Uganda, will be randomized to receive either Mirasol-treated WB (n = 1000) or standard WB (n = 1000). TTI testing will be performed on donor units and recipients (pre-transfusion and day 2, day 7, week 4, and week 10 after transfusion). The primary endpoint is the cumulative incidence of one or more targeted TTIs from the Mirasol-treated WB vs. standard WB in a previously negative recipient for the specific TTI that is also detected in the donor unit. Log-binomial regression models will be used to estimate the relative risk reduction of a TTI by 10 weeks associated with Mirasol PRT. The clinical effectiveness of Mirasol WB compared to standard WB products in recipients will also be evaluated. DISCUSSION: Screening infrastructure for TTIs in low-resource settings has gaps, even for major TTIs. PRT presents a fast, potentially cost-effective, and easy-to-use technology to improve blood safety. MERIT is the largest clinical trial designed to evaluate the use of Mirasol PRT for WB. In addition, this trial will provide data on TTIs in Uganda. TRIAL REGISTRATION: Mirasol Evaluation of Reduction in Infections Trial (MERIT) NCT03737669 . Registered on 9 November 2018.
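
A hedged sketch of the log-binomial model the protocol names for estimating relative risk, using statsmodels (a GLM with binomial family and log link); the outcome and arm column names are assumptions:

```python
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per transfusion recipient; `tti_by_week10` is a binary indicator for a
# targeted TTI by 10 weeks and `mirasol_arm` indicates the treatment arm
# (column names assumed). exp(coefficient) is the relative risk.
fit = smf.glm("tti_by_week10 ~ mirasol_arm", data=df,
              family=sm.families.Binomial(link=sm.families.links.Log())).fit()
relative_risk = np.exp(fit.params["mirasol_arm"])
```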


Subjects
Transfusion Reaction, Blood Platelets, Blood Safety/methods, Humans, Randomized Controlled Trials as Topic, Uganda
12.
JAMA Netw Open ; 5(3): e225484, 2022 03 01.
Article in English | MEDLINE | ID: mdl-35357448

ABSTRACT

Importance: During the COVID-19 pandemic, many primary care practices adopted telehealth in place of in-person care to preserve access to care for patients with acute and chronic conditions. The extent to which this change was associated with their rates of acute care visits (ie, emergency department visits and hospitalizations) for these conditions is unknown. Objective: To examine whether a primary care practice's level of telehealth use is associated with a change in their rate of acute care visits for ambulatory care-sensitive conditions (ACSC visits). Design, Setting, and Participants: This retrospective cohort analysis used a difference-in-differences study design to analyze insurance claims data from 4038 Michigan primary care practices from January 1, 2019, to September 30, 2020. Exposures: Low, medium, or high tertile of practice-level telehealth use based on the rate of telehealth visits from March 1 to August 31, 2020, compared with prepandemic visit volumes. Main Outcomes and Measures: Risk-adjusted ACSC visit rates before (June to September 2019) and after (June to September 2020) the start of the COVID-19 pandemic, reported as an annualized average marginal effect. The study examined overall, acute, and chronic ACSC visits separately and controlled for practice size, in-person visit volume, zip code-level attributes, and patient characteristics. Results: A total of nearly 1.5 million beneficiaries (53% female; mean [SD] age, 40 [22] years) were attributed to 4038 primary care practices. Compared with 2019 visit volumes, median telehealth use was 0.4% for the low telehealth tertile, 14.7% for the medium telehealth tertile, and 39.0% for the high telehealth tertile. The number of ACSC visits decreased in all tertiles, with adjusted rates changing from 24.3 to 14.9 per 1000 patients per year (low), 23.9 to 15.3 per 1000 patients per year (medium), and 27.5 to 20.2 per 1000 patients per year (high). In difference-in-differences analysis, high telehealth use was associated with a higher ACSC visit rate (2.10 more visits per 1000 patients per year; 95% CI, 0.22-3.97) compared with low telehealth practices; practices in the middle tertile did not differ significantly from the low tertile. No difference was found in ACSC visits across tertiles when acute and chronic ACSC visits were examined separately. Conclusions and Relevance: In this cohort study that used a difference-in-differences analysis, the association between practice-level telehealth use and ACSC visits was mixed. High telehealth use was associated with a slightly higher overall ACSC visit rate than low telehealth practices. The association of telehealth with downstream care use should be closely monitored going forward.
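
A hedged sketch of the difference-in-differences specification implied above: the interaction of the post-pandemic period with telehealth tertile estimates the change in ACSC visit rates relative to low-telehealth practices; variable names are assumptions:

```python
import statsmodels.formula.api as smf

# Practice-period panel (column names assumed): ACSC visits per 1000 patients,
# telehealth tertile, a post-pandemic indicator, and practice-level controls.
did = smf.ols("acsc_visits_per_1000 ~ C(tertile) * post + practice_size + in_person_volume",
              data=panel).fit(cov_type="cluster", cov_kwds={"groups": panel["practice_id"]})

# Interaction terms carry the difference-in-differences estimates; the paper
# reports ~2.1 extra visits per 1000 patients per year for the high tertile.
print(did.params.filter(like=":post"))
```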


Subjects
COVID-19, Telemedicine, Adult, Ambulatory Care, COVID-19/epidemiology, Cohort Studies, Female, Humans, Male, Pandemics, Primary Health Care, Retrospective Studies
13.
Ann Thorac Surg ; 113(5): 1544-1551, 2022 05.
Article in English | MEDLINE | ID: mdl-35176258

ABSTRACT

BACKGROUND: Patients undergoing left ventricular assist device (LVAD) implantation are at risk for death and postoperative adverse outcomes. Interhospital variability and concordance of quality metrics were assessed using the Society of Thoracic Surgeons Interagency Registry for Mechanically Assisted Circulatory Support (Intermacs). METHODS: A total of 22 173 patients underwent primary, durable LVAD implantation across 160 hospitals from 2012 to 2020, excluding hospitals performing <10 implant procedures. Observed and risk-adjusted operative mortality rates were calculated for each hospital. Outcomes included operative and 90-day mortality, a composite of adverse events (operative mortality, bleeding, stroke, device malfunction, renal dysfunction, respiratory failure), and secondarily failure to rescue. Rates are presented as median (interquartile range [IQR]). Hospital performance was evaluated using observed-to-expected (O/E) ratios for mortality and the composite outcome. RESULTS: Interhospital variability existed in observed (median, 7.2% [IQR, 5.1%-9.6%]) mortality. The rates of adverse events varied across hospitals: major bleeding, 15.6% (IQR, 11.4%-22.4%); stroke, 3.1% (IQR, 1.6%-4.7%); device malfunction, 2.4% (IQR, 0.8%-3.7%); respiratory failure, 10.5% (IQR, 4.6%-15.7%); and renal dysfunction, 6.4% (IQR, 3.2%-9.6%). The O/E ratio for operative mortality varied from 0.0 to 6.1, whereas the O/E ratio for the composite outcome varied from 0.28 to 1.99. Hospital operative mortality O/E ratios were more closely correlated with the 90-day mortality O/E ratio (r = 0.74) than with the composite O/E ratio (r = 0.12). CONCLUSIONS: This study reported substantial interhospital variability in performance for hospitals implanting durable LVADs. These findings support the need to (1) report hospital-level performance (mortality, composite) and (2) undertake benchmarking activities to reduce unwarranted variability in outcomes.
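
A minimal sketch of the observed-to-expected (O/E) ratio used above for hospital profiling: expected events are the sum of patient-level predicted risks from the risk-adjustment model; the DataFrame and column names are assumptions:

```python
import pandas as pd

# df is assumed to hold one row per LVAD recipient with the implanting hospital,
# the observed outcome (death = 1), and the model-predicted risk of death.
oe_ratio = (df.groupby("hospital_id")
              .apply(lambda g: g["died"].sum() / g["predicted_risk"].sum())
              .rename("oe_ratio"))
# O/E > 1 indicates more deaths than expected given case mix; < 1 indicates fewer.
```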


Subjects
Heart Failure, Heart-Assist Devices, Kidney Diseases, Respiratory Insufficiency, Stroke, Surgeons, Benchmarking, Female, Heart-Assist Devices/adverse effects, Humans, Kidney Diseases/etiology, Male, Registries, Respiratory Insufficiency/etiology, Retrospective Studies, Stroke/etiology, Treatment Outcome
15.
J Heart Lung Transplant ; 41(1): 95-103, 2022 01.
Article in English | MEDLINE | ID: mdl-34666942

ABSTRACT

BACKGROUND: The United States National Organ Procurement Transplant Network (OPTN) implemented changes to the adult heart allocation system to reduce waitlist mortality by improving access for those at greater risk of pre-transplant death, including patients on short-term mechanical circulatory support (sMCS). While sMCS increased, it is unknown whether the increase occurred equitably across centers. METHODS: The OPTN database was used to assess changes in use of sMCS at time of transplant in the 12 months before (pre-change) and after (post-change) implementation of the allocation system in October 2018 among 5,477 heart transplant recipients. An interrupted time series analysis comparing use of bridging therapies pre- and post-change was performed. Variability in the proportion of sMCS use at the center level pre- and post-change was determined. RESULTS: In the month pre-change, 9.7% of patients were transplanted with sMCS. There was an immediate increase in sMCS transplant the following month to 32.4% - an absolute and relative increase of 22.7% and 312% (p < 0.001). While sMCS use was stable pre-change (monthly change 0.0%, 95% CI [-0.1%,0.1%]), there was a continuous 1.2%/month increase post-change ([0.6%,1.8%], p < 0.001). Center-level variation in sMCS use increased substantially after implementation, from a median (interquartile range) of 3.85% (10%) pre-change to 35.7% (30.6%) post-change (p < 0.001). CONCLUSIONS: Use of sMCS at time of transplant increased immediately and continued to expand following heart allocation policy changes. Center-level variation in use of sMCS at the time of transplant increased compared to pre-change, which may have negatively impacted equitable access to heart transplantation.
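
A hedged sketch of the interrupted time series model described above (segmented regression on monthly sMCS use with a level shift and slope change at the October 2018 policy date); the monthly data frame and variable names are assumptions:

```python
import statsmodels.formula.api as smf

# `monthly` is assumed to hold one row per month: percent of transplants bridged
# with sMCS, a running month counter, a post-policy indicator, and months elapsed
# since the October 2018 allocation change (0 before the change).
its = smf.ols("pct_smcs ~ month + post_policy + months_since_policy",
              data=monthly).fit()

# post_policy         -> immediate level shift (~22.7 percentage points in the paper)
# months_since_policy -> post-change slope (~1.2 percentage points per month)
print(its.params)
```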


Subjects
Health Equity, Health Policy, Heart Transplantation, Heart-Assist Devices, Tissue and Organ Procurement/statistics & numerical data, Tissue and Organ Procurement/standards, Adult, Aged, Female, Humans, Male, Middle Aged, Retrospective Studies, Time Factors, United States
16.
Ann Thorac Surg ; 114(4): 1307-1317, 2022 10.
Article in English | MEDLINE | ID: mdl-34619136

ABSTRACT

BACKGROUND: Although the current wide-scale adoption of the HeartMate 3 left ventricular assist device can be attributed to favorable clinical trial outcomes, restrictive clinical trial eligibility criteria may result in lack of generalizability to real-world populations. We assessed the generalizability of left ventricular assist device clinical trial outcomes and evaluated the prognostic value of specific inclusion and exclusion criteria. METHODS: The Multicenter Study of MagLev Technology in Patients Undergoing Mechanical Circulatory Therapy With HeartMate 3 (MOMENTUM 3) eligibility criteria were applied to patients identified in The Society of Thoracic Surgeons Interagency Registry for Mechanically Assisted Circulatory Support (Intermacs) who underwent HeartMate 3 implantation (n = 4610) between August 2017 and March 2020. Patients were categorized as trial-eligible or trial-ineligible and by number of ineligibility criteria. The effect of trial eligibility on mortality was estimated using Cox models. RESULTS: Indications for HeartMate 3 implant included destination therapy (n = 2827, 61%), bridge to candidacy (n = 969, 21%), and bridge to transplant (n = 702, 15%). A total of 1941 recipients (42%) were trial-ineligible, with 1245 (27%) meeting one ineligibility criterion, 470 (10%) meeting two, and 226 (5%) meeting three or more. Estimated 1-year mortality for trial-ineligible recipients was higher than for trial-eligible recipients (17% ± 1% vs 10% ± 1%, P < .001). Compared with trial-eligible patients, 1-year mortality was incrementally higher for patients meeting one ineligibility criterion (15% ± 1%), two criteria (16% ± 2%), and three or more criteria (30% ± 3%). Thrombocytopenia and elevated creatinine, bilirubin, and international normalized ratio in trial-ineligible patients were independently associated with increased mortality. CONCLUSIONS: Despite differences in mortality, both trial-eligible and trial-ineligible HeartMate 3 recipients had excellent outcomes in real-world practice, suggesting future trial eligibility criteria could be expanded.


Subjects
Heart Failure, Heart-Assist Devices, Surgeons, Bilirubin, Creatinine, Heart Failure/surgery, Humans, Registries, Retrospective Studies, Treatment Outcome
17.
Transfusion ; 62(1): 227-246, 2022 01.
Article in English | MEDLINE | ID: mdl-34870335

ABSTRACT

Standard platelet concentrates (PCs) stored at 22°C have a limited shelf life of 5 days. Because of the storage temperature, bacterial contamination of PCs can result in life-threatening infections in transfused patients. The potential of blood components to cause infections through contaminating pathogens or transmitting blood-borne diseases has always been a concern. The current safety practice to prevent pathogen transmission through blood transfusion starts with a stringent screening of donors and regulated testing of blood samples to ensure that known infections cannot reach transfusion products. Pathogen reduction technologies (PRTs), initially implemented to ensure the safety of plasma products, have been adapted to treat platelet products. In addition to reducing bacterial contamination, PRT applied to PCs can extend their shelf life up to 7 days, alleviating the impact of their shortage, while providing an additional safety layer against emerging blood-borne infectious diseases. While a deleterious effect of PRTs on the quantitative and qualitative properties of plasma is accepted, the impact of PRTs on the quality, function, and clinical efficacy of PCs has been under constant examination. The potential of PRTs to prevent new emerging diseases from reaching cellular blood components has been considered more hypothetical than real. In 2019, a new coronavirus-related disease (COVID-19) emerged and became a pandemic. This episode should help when reconsidering the possibility of future blood-transmissible threats. The following text intends to evaluate the impact of different PRTs on the quality, function, and clinical effectiveness of platelets within the perspective of a developing pandemic.


Subjects
Blood Platelets, Blood Preservation, Blood-Borne Pathogens, COVID-19, Humans, Pandemics, Platelet Transfusion/adverse effects, Treatment Outcome
18.
J Am Heart Assoc ; 10(21): e021629, 2021 11 02.
Article in English | MEDLINE | ID: mdl-34689581

ABSTRACT

Background Public reporting of transcatheter aortic valve replacement (TAVR) claims-based outcome measures is used to identify high- and low-performing centers. Whether claims-based TAVR outcomes can reliably be used for center-level comparisons is unknown. In this study, we sought to evaluate center variability in claims-based TAVR outcomes used in public reporting. Methods and Results The study sample included 119 554 Medicare beneficiaries undergoing TAVR between January 2014 and October 2018 based on procedure codes in 100% Medicare inpatient claims. Multivariable hierarchical logistic regression was used to estimate center-specific adjusted rates and reliability (R) of 30-day mortality, discharge not to home/self-care, 30-day stroke, and 30-day readmission. Reliability was defined as the ratio of between-hospital variation to the sum of the between- and within-hospital variation. The median (interquartile range [IQR]) center-level adjusted outcome rates were 3.1% (2.9%-3.4%) for 30-day mortality, 41.4% (31.3%-53.4%) for discharge not to home, 2.5% (2.3%-2.7%) for 30-day stroke, and 14.9% (14.4%-15.5%) for 30-day readmission. Median reliability was highest for the discharge not to home measure (R=0.95; IQR, 0.94-0.97), followed by the 30-day stroke (R=0.92; IQR, 0.87-0.94), 30-day mortality (R=0.86; IQR, 0.81-0.91), and 30-day readmission measures (R=0.42; IQR, 0.35-0.51). Across outcomes, there was an inverse relationship between center volume and measure reliability. Conclusions Claims-based TAVR outcome measures for mortality, discharge not to home, and stroke were reliable measures for center-level comparisons, but readmission measures were unreliable. Stakeholders should consider these findings when evaluating claims-based measures to compare center-level TAVR performance.
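
The reliability definition used above can be written as a small helper: the ratio of between-hospital variance to the sum of between- and within-hospital variance from the hierarchical model; the variance values below are hypothetical, chosen only to echo the reported high and low reliability measures:

```python
def reliability(var_between: float, var_within: float) -> float:
    # Ratio of between-hospital variation to total variation.
    return var_between / (var_between + var_within)

print(round(reliability(0.20, 0.01), 2))  # high-reliability measure, e.g., discharge not to home (~0.95)
print(round(reliability(0.03, 0.04), 2))  # low-reliability measure, e.g., readmission (~0.43)
```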


Subjects
Aortic Valve Stenosis, Transcatheter Aortic Valve Replacement, Aged, Aortic Valve/surgery, Aortic Valve Stenosis/surgery, Humans, Medicare, Outcome Assessment, Health Care, Reproducibility of Results, Risk Factors, Stroke/epidemiology, Transcatheter Aortic Valve Replacement/adverse effects, Treatment Outcome, United States/epidemiology
19.
Int J Cardiol Heart Vasc ; 36: 100864, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34522766

ABSTRACT

BACKGROUND: Aortic stenosis is a prevalent valvular heart disease treated primarily by surgical aortic valve replacement (SAVR) or transcatheter aortic valve replacement (TAVR) to address symptoms secondary to the disease. This narrative review article focuses on the existing literature comparing recovery and cost-effectiveness for SAVR and TAVR. METHODS: Major databases were searched for relevant literature discussing health-related quality of life (HRQOL) and cost-effectiveness of TAVR and SAVR. We also searched for studies analyzing the use of wearable devices to monitor post-discharge recovery patterns. RESULTS: The literature focusing on quality of life following TAVR and SAVR has been limited primarily to single-center observational studies and randomized controlled trials. Studies focused on TAVR report consistent and rapid improvement relative to baseline status. Common HRQOL instruments (SF-36, EQ-5D, KCCQ, MLHFQ) have been used to document that TF-TAVR is advantageous over SAVR at 1-month follow-up, with the benefits leveling off after 1 year. TF-TAVR is economically favorable relative to SAVR, with estimated incremental cost-effectiveness ratio values ranging from $50,000 to $63,000/QALY gained. TA-TAVR has not been reported to be advantageous from an HRQOL or cost-effectiveness perspective. CONCLUSIONS: While real-world experiences are less described, large-scale trials have advanced our understanding of recovery and cost-effectiveness of aortic valve replacement treatment strategies. Future work should focus on scalable wearable device technology, such as smartwatches and heart-rate monitors, to facilitate real-world evaluation of TAVR and SAVR to support clinical decision-making and outcomes ascertainment.
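
The incremental cost-effectiveness ratio (ICER) behind the $50,000-$63,000/QALY figures is incremental cost divided by incremental QALYs; the numbers in this small sketch are illustrative, not taken from any cited trial:

```python
def icer(cost_new: float, cost_comparator: float, qaly_new: float, qaly_comparator: float) -> float:
    # Incremental cost per additional quality-adjusted life-year gained.
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

# Illustrative inputs only: a $10,000 added cost for 0.2 extra QALYs.
print(f"${icer(65_000, 55_000, 2.2, 2.0):,.0f} per QALY gained")  # -> $50,000 per QALY
```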

20.
JAMA Intern Med ; 181(8): 1065-1070, 2021 08 01.
Article in English | MEDLINE | ID: mdl-34152373

ABSTRACT

Importance: The Epic Sepsis Model (ESM), a proprietary sepsis prediction model, is implemented at hundreds of US hospitals. The ESM's ability to identify patients with sepsis has not been adequately evaluated despite widespread use. Objective: To externally validate the ESM in the prediction of sepsis and evaluate its potential clinical value compared with usual care. Design, Setting, and Participants: This retrospective cohort study was conducted among 27 697 patients aged 18 years or older admitted to Michigan Medicine, the academic health system of the University of Michigan, Ann Arbor, with 38 455 hospitalizations between December 6, 2018, and October 20, 2019. Exposure: The ESM score, calculated every 15 minutes. Main Outcomes and Measures: Sepsis, as defined by a composite of (1) the Centers for Disease Control and Prevention surveillance criteria and (2) International Statistical Classification of Diseases and Related Health Problems, Tenth Revision diagnostic codes accompanied by 2 systemic inflammatory response syndrome criteria and 1 organ dysfunction criterion within 6 hours of one another. Model discrimination was assessed using the area under the receiver operating characteristic curve at the hospitalization level and with prediction horizons of 4, 8, 12, and 24 hours. Model calibration was evaluated with calibration plots. The potential clinical benefit associated with the ESM was assessed by evaluating the added benefit of the ESM score compared with contemporary clinical practice (based on timely administration of antibiotics). Alert fatigue was evaluated by comparing the clinical value of different alerting strategies. Results: We identified 27 697 patients who had 38 455 hospitalizations (21 904 women [57%]; median age, 56 years [interquartile range, 35-69 years]) meeting inclusion criteria, of whom sepsis occurred in 2552 (7%). The ESM had a hospitalization-level area under the receiver operating characteristic curve of 0.63 (95% CI, 0.62-0.64). The ESM identified 183 of 2552 patients with sepsis (7%) who did not receive timely administration of antibiotics, highlighting the low sensitivity of the ESM in comparison with contemporary clinical practice. The ESM also did not identify 1709 patients with sepsis (67%) despite generating alerts for an ESM score of 6 or higher for 6971 of all 38 455 hospitalized patients (18%), thus creating a large burden of alert fatigue. Conclusions and Relevance: This external validation cohort study suggests that the ESM has poor discrimination and calibration in predicting the onset of sepsis. The widespread adoption of the ESM despite its poor performance raises fundamental concerns about sepsis management on a national level.
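
A hedged sketch of the hospitalization-level discrimination check reported above: the area under the ROC curve computed from the maximum ESM score per hospitalization against the sepsis label; the score and outcome tables and their column names are assumptions:

```python
from sklearn.metrics import roc_auc_score

# `scores` is assumed to hold the ESM score recalculated every 15 minutes, and
# `outcomes` one row per hospitalization with the composite sepsis label.
max_score = scores.groupby("hospitalization_id")["esm_score"].max()
labels = outcomes.set_index("hospitalization_id")["sepsis"]

auc = roc_auc_score(labels.loc[max_score.index], max_score)
print(f"Hospitalization-level AUROC: {auc:.2f}")  # the study reports 0.63
```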


Subjects
Anti-Bacterial Agents/therapeutic use, Hospitalization/statistics & numerical data, Intensive Care Units/statistics & numerical data, Sepsis, Decision Support Systems, Clinical/standards, Female, Hospital Mortality, Humans, Male, Michigan/epidemiology, Middle Aged, Organ Dysfunction Scores, Predictive Value of Tests, Prognosis, Reproducibility of Results, Retrospective Studies, Sepsis/diagnosis, Sepsis/epidemiology, Sepsis/prevention & control