ABSTRACT
Background: Anterior cruciate ligament (ACL) injuries are prevalent musculoskeletal conditions often resulting in long-term degenerative outcomes such as osteoarthritis (OA). Despite surgical advances in ACL reconstruction, a significant number of patients develop OA within ten years post-surgery, providing a patient population that may present early markers of cartilage degeneration detectable using noninvasive imaging. Purpose: This study aims to investigate the temporal evolution of cartilage strain and relaxometry post-ACL reconstruction using displacements under applied loading MRI (dualMRI) and quantitative MRI. Specifically, we examined the correlations between MRI metrics and pain, as well as knee loading patterns during gait, to identify early candidate markers of cartilage degeneration. Materials and Methods: Twenty-five participants (female/male = 15/10; average age = 25.6 years) undergoing ACL reconstruction were enrolled in a prospective longitudinal cohort study between 2022 and 2023. MRI scans were conducted at 6 and 12 months post-surgery, assessing T2, T2*, and T1ρ relaxometry values and intratissue cartilage strain. Changes in pain were evaluated using standard outcome scores, and gait analysis assessed the knee adduction moment (KAM). Regressions were performed to evaluate relationships between MRI metrics in cartilage contact regions, patient-reported pain, and knee loading metrics. Results: Increases in axial and transverse strains in the tibial cartilage were significantly correlated with increased pain, while decreases in shear strain were associated with increased pain. Changes in strain metrics were also significantly related to KAM at 12 months. Conclusions: Changes in cartilage strain and relaxometry are related to heightened pain and altered knee loading patterns, indicating potential early markers of osteoarthritis progression. These findings underscore the importance of using advanced MRI for early monitoring in ACL-reconstructed patients to optimize treatment outcomes, while also highlighting KAM as a modifiable target for gait-retraining interventions that may positively impact the evolution of cartilage health and patient pain. Key Results: Increased axial and transverse strains in the tibial cartilage from 6 to 12 months post-ACL reconstruction were significantly correlated with increased pain, suggesting evolving changes in cartilage biomechanical properties over time. Decreases in shear strain in inner femoral and central tibial cartilage regions were linked to increased pain, indicating alterations in joint loading patterns. Decreases in shear strain in the inner femoral cartilage were significantly associated with decreased 12-month knee adduction moment (KAM), a surrogate for medial knee cartilage loading during walking.
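To illustrate the kind of regression described above, here is a minimal Python sketch of regressing the change in patient-reported pain on the 6-to-12-month change in a strain metric, adjusting for covariates. All data, column names, and effect sizes are hypothetical stand-ins, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 25  # cohort size reported above
df = pd.DataFrame({
    "delta_axial_strain": rng.normal(0.02, 0.01, n),  # hypothetical 6-to-12-month change
    "age": rng.normal(25.6, 4.0, n),
    "sex": rng.choice(["F", "M"], n, p=[0.6, 0.4]),
})
# Hypothetical generative link so the example runs end to end
df["delta_pain"] = 50 * df["delta_axial_strain"] + rng.normal(0, 0.5, n)

# Linear regression of pain change on strain change, adjusting for covariates
fit = smf.ols("delta_pain ~ delta_axial_strain + age + C(sex)", data=df).fit()
print(fit.summary().tables[1])
```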
ABSTRACT
Key terms: Multicontrast and Multiparametric, Magnetic Resonance Imaging, Osteoarthritis, Functional Biomechanical Imaging, Knee Joint Degeneration. What is known about the subject: dualMRI has been used to quantify strains in a healthy human population in vivo and in cartilage explant models. Previously, OA severity, as determined by histology, has been positively correlated with increased shear and transverse strains in cartilage explants. What this study adds to existing knowledge: This is the first in vivo use of dualMRI in a participant demographic post-ACL reconstruction and at risk for developing osteoarthritis. This study shows that dualMRI-derived strains are more significantly correlated with patient-reported outcomes than any MRI relaxometry metric. Background: Anterior cruciate ligament (ACL) injuries lead to an increased risk of osteoarthritis, characterized by altered cartilage tissue structure and function. Displacements under applied loading by magnetic resonance imaging (dualMRI) is a novel MRI technique that can be used to quantify mechanical strain in cartilage under a physiological load. Purpose: To determine whether strains derived by dualMRI and relaxometry measures correlate with patient-reported outcomes at six months post unilateral ACL reconstruction. Study Design: Cohort study. Methods: Quantitative MRI (T2, T2*, T1ρ) measurements and transverse, axial, and shear strains were quantified in the medial articular tibiofemoral cartilage of 35 participants at six months post unilateral ACL reconstruction. The relationships between patient-reported outcomes (WOMAC, KOOS, MARS) and all qMRI relaxation times were quantified using general linear mixed-effects models. A combined best-fit multicontrast MRI model was then developed using backwards regression to determine the patient features and MRI metrics that are most predictive of patient-reported outcome scores. Results: Higher femoral strains were significantly correlated with worse patient-reported functional outcomes. Femoral shear and transverse strains were positively correlated with six-month KOOS and WOMAC scores, after controlling for covariates. No relaxometry measures were correlated with patient-reported outcome scores. We identified the best-fit model for predicting WOMAC score using multiple MRI measures and patient-specific information, including sex, age, graft type, femoral transverse strain, femoral axial strain, and femoral shear strain. The best-fit model significantly predicted WOMAC score (p<0.001), better than any one individual MRI metric alone. When we regressed the model-predicted WOMAC scores against the patient-reported WOMAC scores, we found that our model achieved a goodness of fit exceeding 0.52. Conclusions: This work presents the first use of dualMRI in vivo in a cohort of participants at risk for developing osteoarthritis. Our results indicate that shear and transverse strains are highly correlated with patient-reported outcome severity and could serve as novel imaging biomarkers to predict the development of osteoarthritis.
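The backwards-regression step described above can be sketched as iterative removal of the least significant predictor. A minimal illustration with hypothetical data and an assumed retention criterion of p < 0.05 (the study's exact stopping rule is not given here):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_select(df, response, predictors, alpha=0.05):
    """Drop the least significant predictor until all remaining p-values < alpha."""
    kept = list(predictors)
    while kept:
        X = sm.add_constant(pd.get_dummies(df[kept], drop_first=True, dtype=float))
        fit = sm.OLS(df[response], X).fit()
        pvals = fit.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha:
            return fit, kept
        # map a dummy column (e.g. "sex_M") back to its source variable before dropping
        kept = [k for k in kept if not worst.startswith(k)]
    return None, []

rng = np.random.default_rng(1)
n = 35  # cohort size reported above
df = pd.DataFrame({
    "shear_strain": rng.normal(0.03, 0.01, n),
    "transverse_strain": rng.normal(0.02, 0.01, n),
    "axial_strain": rng.normal(-0.02, 0.01, n),
    "age": rng.normal(25, 5, n),
    "sex": rng.choice(["F", "M"], n),
})
# Hypothetical generative link so the example runs end to end
df["WOMAC"] = 400 * df["shear_strain"] + 300 * df["transverse_strain"] + rng.normal(0, 5, n)

fit, kept = backward_select(df, "WOMAC",
                            ["shear_strain", "transverse_strain", "axial_strain", "age", "sex"])
if fit is not None:
    print("retained:", kept)
    print("R-squared:", round(fit.rsquared, 2))
```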
ABSTRACT
Individuals who have undergone anterior cruciate ligament reconstruction (ACLR) are at greater risk of developing knee osteoarthritis (OA). This elevated risk of knee OA is associated with high tibiofemoral (TF) compressive force, due to a combination of low knee flexion angles and increased co-contraction of the hamstrings and quadriceps during limb loading. Prolonged vibration of the hamstrings fatigues the intrafusal muscle fibers, which reduces autogenic reflexive excitation of the hamstrings and alleviates reciprocal inhibition of the quadriceps. The aim of this study was to examine the effect of prolonged hamstrings vibration on TF compressive force in individuals who have undergone ACL reconstruction. Fourteen participants with unilateral ACLR and 14 participants without knee injury performed a single-leg drop-land task before and after prolonged (20 min) vibration of the hamstrings. Peak TF compressive force, knee flexion angle, and hamstrings/quadriceps co-contraction were calculated during the deceleration phase of the drop-land task before and after vibration. The ACLR group experienced an 18% decrease in TF compressive force, a 32% increase in knee flexion angle, and a 38% decrease in hamstrings/quadriceps co-contraction after hamstrings vibration. There was no difference in any of the parameters in the noninjured group after vibration. These data suggest that acute prolonged hamstrings vibration has the potential to mitigate TF compressive force, which may protect the knee joint in the long term. Clinical significance: The results of this research are expected to lead to improved clinical care for ACLR patients because this approach holds promise for mitigating altered joint mechanics and perhaps slowing the onset of posttraumatic knee osteoarthritis.
Subjects
Anterior Cruciate Ligament Injuries, Anterior Cruciate Ligament Reconstruction, Osteoarthritis, Knee, Humans, Osteoarthritis, Knee/surgery, Vibration, Anterior Cruciate Ligament Injuries/surgery, Biomechanical Phenomena, Knee Joint/physiology, Quadriceps Muscle
ABSTRACT
Resistance training with low loads in combination with blood flow restriction (BFR) facilitates increases in muscle size and strength comparable with high-intensity exercise. We investigated the effects of BFR on single motor unit discharge behavior throughout a sustained low-intensity isometric contraction. Ten healthy individuals attended two experimental sessions: one with, the other without, BFR. Motor unit discharge rates from the tibialis anterior (TA) were recorded with intramuscular fine-wire electrodes throughout the duration of a sustained fatigue task. Three 5-s dorsiflexion maximal voluntary contractions (MVC) were performed before and after the fatigue task. Each participant held a target force of 20% MVC until their endurance limit. A significant decrease in motor unit discharge rate was observed in both the non-BFR condition (from 13.13 ± 0.87 Hz to 11.95 ± 0.43 Hz, P = 0.03) and the BFR condition (from 12.95 ± 0.71 Hz to 10.9 ± 0.75 Hz, P = 0.03). BFR resulted in significantly shorter endurance time and time-to-minimum discharge rates and greater end-stage motor unit variability. Thus, low-load BFR causes an immediate steep decline in motor unit discharge rate that is greater than during contractions performed without BFR. This shortened time-to-minimum discharge rate likely contributes to the rapid neuromuscular fatigue observed during BFR.
Subjects
Patient Discharge, Quadriceps Muscle, Humans, Quadriceps Muscle/physiology, Muscle, Skeletal/physiology, Hemodynamics, Isometric Contraction/physiology, Regional Blood Flow/physiology, Electromyography
ABSTRACT
Persistent quadriceps strength deficits in individuals with anterior cruciate ligament reconstruction (ACLr) have been attributed to arthrogenic muscle inhibition (AMI). The purpose of the present study was to investigate the effect of vibration-induced hamstrings fatigue on AMI in patients with ACLr. Eight participants with unilateral ACLr (post-surgery time: M = 46.5, SD = 23.5 months; age: M = 21.4, SD = 1.4 years) and eight individuals with no previous history of knee injury (age: M = 22.5, SD = 2.5 years) were recruited. A fatigue protocol, consisting of 10 min of prolonged local hamstrings vibration, was applied to both the ACLr and control groups. The central activation ratio (CAR) of the quadriceps was measured with a superimposed burst of electrical stimulation, and hamstrings/quadriceps coactivation was assessed using electromyography (EMG) during isometric knee extension exercises, both before and after prolonged local vibration. For the ACLr group, hamstrings strength, measured by a load cell on a purpose-built chair, was significantly (P = 0.016) reduced by about 14.5%, confirming that fatigue was induced in the hamstrings. At baseline, the ACLr group showed a trend (P = 0.051) toward a lower quadriceps CAR (M = 93.2%, SD = 6.2% versus M = 98.1%, SD = 1.1%) and significantly (P = 0.001) higher hamstrings/quadriceps coactivation (M = 15.1%, SD = 6.2% versus M = 7.5%, SD = 4.0%) during knee extension compared to the control group. The fatigue protocol significantly (P = 0.001) increased quadriceps CAR (from M = 93.2%, SD = 6.2% to M = 97.9%, SD = 2.8%) and significantly (P = 0.006) decreased hamstrings/quadriceps coactivation during knee extension (from M = 15.1%, SD = 6.2% to M = 9.5%, SD = 4.5%) in the ACLr group. In conclusion, vibration-induced hamstrings fatigue can alleviate AMI of the quadriceps in patients with ACLr. This finding has clinical implications in the management of recovery for ACLr patients with quadriceps strength deficits and dysfunction.
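The central activation ratio referenced above is conventionally computed as the voluntary force divided by the peak force produced when the electrical burst is superimposed. A short worked sketch with hypothetical force values:

```python
def central_activation_ratio(f_voluntary_n, f_superimposed_peak_n):
    """CAR (%) = voluntary MVC force / peak force during the superimposed burst."""
    return 100.0 * f_voluntary_n / f_superimposed_peak_n

# Hypothetical example: 620 N voluntary force, 665 N with the superimposed burst
print(round(central_activation_ratio(620.0, 665.0), 1))  # -> 93.2, i.e. incomplete activation
```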
Subjects
Anterior Cruciate Ligament Reconstruction/trends, Hamstring Muscles/physiology, Knee Joint/physiology, Muscle Fatigue/physiology, Muscle Strength/physiology, Quadriceps Muscle/physiology, Vibration/therapeutic use, Adult, Anterior Cruciate Ligament/physiopathology, Anterior Cruciate Ligament/surgery, Anterior Cruciate Ligament Injuries/physiopathology, Anterior Cruciate Ligament Injuries/surgery, Electromyography/methods, Female, Humans, Male, Physical Therapy Modalities/trends, Young Adult
ABSTRACT
Arthrogenic muscle inhibition, an inability to fully activate the quadriceps muscles, has been persistently observed after anterior cruciate ligament reconstruction (ACLr) surgery. Reductions in quadriceps activation may be partly due to the flexion reflex pathway, hamstrings activation, and reciprocal quadriceps inhibition. Since central fatigue has been shown to modify hamstring excitability and change the hamstring reflex response, hamstring fatigue might alleviate quadriceps muscle inhibition by counteracting the flexion reflex. In this study, nine young adult athletes (age: M = 19.9 years, SD = 1.7) with unilateral ACLr and nine control athletes (age: M = 24.0 years, SD = 2.4) with no previous history of knee injury performed tempo squats to induce fatigue. The ACLr group tended to use hamstrings for more hip flexion and trunk forward flexion than the control group. We assessed each participant's quadriceps inhibition through the central activation ratio (CAR), measured by twitch interpolation, before and after the induced fatigue. A mixed analysis of variance was used to examine the effect of fatigue on the CAR between pre- and post-fatigue and for both ACLr and control groups. The ACLr group showed significantly (p = .010) greater CAR of the quadriceps post-fatigue (M = 96.0%, SD = 7.6%) than pre-fatigue (M = 81.2%, SD = 15.8%), while the control group showed no significant (p = .969) pre-fatigue (M = 96.9%, SD = 9.6%) and post-fatigue (M = 97.0%, SD = 17.1%) differences. Thus, fatigue training may be used as a rehabilitation strategy to restore normal quadriceps function at the knee joint following ACL reconstruction by relaxing the hamstrings and overcoming quadriceps inhibition.
Subjects
Anterior Cruciate Ligament Injuries/physiopathology, Anterior Cruciate Ligament Reconstruction/rehabilitation, Knee Joint/physiopathology, Muscle Fatigue/physiology, Quadriceps Muscle/physiopathology, Adolescent, Anterior Cruciate Ligament Injuries/surgery, Athletes, Female, Humans, Male, Range of Motion, Articular, Young Adult
ABSTRACT
BACKGROUND Risk adjustment is needed to fairly compare central-line-associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes. METHODS Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and standardized infection ratio (SIR). Hospitals were ranked by SIR for each model to examine and compare the changes in rank. RESULTS Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51-0.59) for the ICU-type model and 0.64 (95% CI, 0.60-0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model. CONCLUSIONS Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals. Infect Control Hosp Epidemiol 2017;38:1019-1024.
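The two quantities at the core of this comparison, model discrimination (the C statistic, i.e. ROC AUC) and the standardized infection ratio (SIR = observed / expected infections), can be sketched as follows. All data, covariates, and effect sizes below are hypothetical; this is not the CDC's or the authors' actual model.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 20_000
df = pd.DataFrame({
    "hospital": rng.integers(0, 22, n),          # 22 hospitals, as above
    "age": rng.normal(60, 15, n),
    "coagulopathy": rng.binomial(1, 0.08, n),
    "renal_failure": rng.binomial(1, 0.12, n),
    "malnutrition": rng.binomial(1, 0.05, n),
})
# Hypothetical risk so the example runs end to end (event rate near 0.2%)
logit = -6.5 + 0.01 * df["age"] + 0.8 * df["coagulopathy"] + 0.6 * df["renal_failure"]
df["clabsi"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = df[["age", "coagulopathy", "renal_failure", "malnutrition"]]
model = LogisticRegression(max_iter=1000).fit(X, df["clabsi"])
df["expected"] = model.predict_proba(X)[:, 1]
print("C statistic:", round(roc_auc_score(df["clabsi"], df["expected"]), 2))

# SIR per hospital: observed infections / sum of model-expected probabilities
sir = df.groupby("hospital")["clabsi"].sum() / df.groupby("hospital")["expected"].sum()
print(sir.sort_values().round(2).head())
```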
Subjects
Catheter-Related Infections/epidemiology, Catheterization, Central Venous/adverse effects, Comorbidity, Cross Infection/epidemiology, Cross Infection/etiology, Risk Adjustment/methods, Age Factors, Centers for Disease Control and Prevention, U.S., Cross Infection/ethnology, Equipment Contamination, Hospitals/statistics & numerical data, Humans, Intensive Care Units, Proportional Hazards Models, Retrospective Studies, United States
ABSTRACT
BACKGROUND: Healthcare-associated infections such as surgical site infections (SSIs) are used by the Centers for Medicare and Medicaid Services (CMS) as pay-for-performance metrics. Risk adjustment allows a fairer comparison of SSI rates across hospitals. Until 2016, Centers for Disease Control and Prevention (CDC) risk adjustment models for pay-for-performance SSI did not adjust for patient comorbidities. New 2016 CDC models only adjust for body mass index and diabetes. METHODS: We performed a multicenter retrospective cohort study of patients undergoing surgical procedures at 28 US hospitals. Demographic data and International Classification of Diseases, Ninth Revision codes were obtained on patients undergoing colectomy, hysterectomy, and knee and hip replacement procedures. Complex SSIs were identified by infection preventionists at each hospital using CDC criteria. Model performance was evaluated using measures of discrimination and calibration. Hospitals were ranked by SSI proportion and risk-adjusted standardized infection ratios (SIR) to assess the impact of comorbidity adjustment on public reporting. RESULTS: Of 45,394 patients at 28 hospitals, 573 (1.3%) developed a complex SSI. A model containing procedure type, age, race, smoking, diabetes, liver disease, obesity, renal failure, and malnutrition showed good discrimination (C-statistic, 0.73) and calibration. When comparing hospital rankings by crude proportion to risk-adjusted ranks, 24 of 28 (86%) hospitals changed ranks, 16 (57%) changed by ≥2 ranks, and 4 (14%) changed by >10 ranks. CONCLUSIONS: We developed a well-performing risk adjustment model for SSI using electronically available comorbidities. Comorbidity-based risk adjustment should be strongly considered by the CDC and CMS to adequately compare SSI rates across hospitals.
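The rank-shift comparison described above amounts to ranking hospitals by crude SSI proportion versus by risk-adjusted SIR and counting how many move. A minimal sketch with hypothetical hospital counts:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
hosp = pd.DataFrame({
    "observed": rng.integers(5, 40, 28),   # hypothetical complex SSIs per hospital
    "n": rng.integers(800, 3000, 28),      # procedures performed
    "expected": rng.uniform(8, 35, 28),    # sum of model-predicted risks
})
hosp["crude_rank"] = (hosp["observed"] / hosp["n"]).rank(method="first")
hosp["sir_rank"] = (hosp["observed"] / hosp["expected"]).rank(method="first")

moved = (hosp["crude_rank"] != hosp["sir_rank"]).sum()
big_moves = ((hosp["crude_rank"] - hosp["sir_rank"]).abs() >= 2).sum()
print(f"{moved}/28 hospitals changed rank; {big_moves} moved by >= 2 ranks")
```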
Subjects
Surgical Wound Infection/epidemiology, Adult, Aged, Comorbidity, Female, Humans, Male, Middle Aged, Retrospective Studies, Risk Adjustment, Risk Factors, United States/epidemiology
ABSTRACT
OBJECTIVE: This study examines nurse-related clinical nonlicensed personnel (CNLP) in U.S. hospitals between 2010 and 2014, including job categories, trends in staffing levels, and the possible relationship of substitution between this group of workers and registered nurses (RNs) and/or licensed practical nurses (LPNs). DATA SOURCE: We used 5 years of data (2010-2014) from an operational database maintained by Premier, Inc. that tracks labor hours, hospital units, and facility characteristics. STUDY DESIGN: We assessed changes over time in the average number of total hours worked by RNs, LPNs, and CNLP, adjusted by total patient days. We then conducted linear regressions to estimate the relationships between nurse and CNLP staffing, controlling for patient acuity, volume, and hospital fixed effects. PRINCIPAL FINDINGS: The overall use of CNLP and LPN hours per patient day declined from 2010 to 2014, while RN hours per patient day remained stable. We found no evidence of substitution between CNLP and nurses during the study period: Nurse-related CNLP hours were positively associated with RN hours and not significantly related to LPN hours, holding other factors constant. CONCLUSIONS: Findings point to the importance of examining where and why CNLP hours per patient day have declined and to understanding the effects of these changes on outcomes.
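The substitution analysis described above can be sketched as a linear regression of CNLP hours per patient day on RN and LPN hours with hospital fixed effects. The data, the case-mix proxy, and the coefficients below are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_hosp, n_years = 50, 5
df = pd.DataFrame({
    "hospital": np.repeat(np.arange(n_hosp), n_years),
    "year": np.tile(np.arange(2010, 2015), n_hosp),
    "rn_hppd": rng.normal(9.0, 1.0, n_hosp * n_years),   # RN hours per patient day
    "lpn_hppd": rng.normal(0.8, 0.3, n_hosp * n_years),
    "cmi": rng.normal(1.5, 0.2, n_hosp * n_years),       # case-mix index proxy for acuity
})
# Hypothetical positive CNLP-RN association (i.e., no substitution)
df["cnlp_hppd"] = 0.5 + 0.3 * df["rn_hppd"] + rng.normal(0, 0.4, len(df))

# Hospital fixed effects enter as C(hospital) indicator variables
fit = smf.ols("cnlp_hppd ~ rn_hppd + lpn_hppd + cmi + C(hospital)", data=df).fit()
print(fit.params[["rn_hppd", "lpn_hppd"]].round(3))
```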
Subjects
Allied Health Personnel/supply & distribution, Allied Health Personnel/trends, Certification/statistics & numerical data, Nursing Staff, Hospital/supply & distribution, Nursing Staff, Hospital/trends, Personnel Staffing and Scheduling/statistics & numerical data, Personnel Staffing and Scheduling/trends, Allied Health Personnel/statistics & numerical data, Cross-Sectional Studies, Forecasting, Humans, Nursing Staff, Hospital/statistics & numerical data, United States
ABSTRACT
BACKGROUND: In 2008 Premier (Premier, Inc., Charlotte, North Carolina) began its Quality, Efficiency, and Safety with Transparency (QUEST®) collaborative, which is an acute health care organization program focused on improving quality and reducing patient harm. METHODS: Retrospective performance data for QUEST hospitals were used to establish trends from the third quarter (Q3; July-September) of 2006 through Q3 2015. The study population included past and present members of the QUEST collaborative (N = 356), with each participating hospital considered a member. The QUEST program engages with member hospitals through a routine-coaching structure, sprints, minicollaboratives, and face-to-face meetings. RESULTS: Cost and efficiency data showed reductions in adjusted cost per discharge for hospitals between Q3 2013 (mean, $8,296; median, $8,459) and Q3 2015 (mean, $8,217; median, $7,895). Evidence-based care (EBC) measures showed improvement from baseline (Q3 2006; mean, 77%; median, 79%) to Q3 2015 (mean, 95%; median, 96%). Observed-to-expected (O/E) mortality improved from 1% to 22% better-than-expected outcomes on average. The QUEST safety harm composite score showed moderate reduction from Q1 2009 to Q3 2015, as did the O/E readmission rates from Q1 2010 to Q3 2015, with improvement from a 5% to an 8% better-than-expected score. CONCLUSION: Quantitative and qualitative evaluation of QUEST collaborative hospitals indicated that for the 2006-2015 period, QUEST facilities reduced cost per discharge, improved adherence with evidence-based practice, reduced safety harm composite score, improved patient experience, and reduced unplanned readmissions.
Subjects
Cooperative Behavior, Hospitalization/statistics & numerical data, Patient Safety/statistics & numerical data, Quality Improvement, Cost-Benefit Analysis, Hospitalization/economics, Humans, Patient Safety/economics, Patient Satisfaction, Program Evaluation, United States
ABSTRACT
The purpose of this study was to provide a novel stochastic assessment of inhomogeneous distribution of bone mineral density (BMD) from the Dual-energy X-ray Absorptiometry (DXA) scans of human lumbar vertebrae and identify the stochastic predictors that were correlated with the microarchitecture parameters of trabecular bone. Eighteen human lumbar vertebrae with intact posterior elements from 5 cadaveric spines were scanned in the posterior-anterior projection using a Hologic densitometer. The BMD map of human vertebrae was obtained from the raw data of DXA scans by directly operating on the transmission measurements of low- and high-energy X-ray beams. Stochastic predictors were calculated by fitting theoretical models onto the experimental variogram of the BMD map, rather than grayscale images, from DXA scans. In addition, microarchitecture parameters of trabecular bone were measured from the 3D images of human vertebrae acquired using a Micro-CT scanner. Significant correlations were observed between stochastic predictors and microarchitecture parameters. The sill variance, representing the standard deviation of the BMD map to some extent, had significantly positive correlations with bone volume, trabecular thickness, trabecular number and connectivity density. The sill variance was also negatively associated with bone surface to volume ratio and trabecular separation. This study demonstrates that the stochastic assessment of the inhomogeneous distribution of BMD from DXA scans of human lumbar vertebrae can reveal microarchitecture information of trabecular bone. However, future studies are needed to examine the potential of stochastic predictors from routine clinical DXA scans in providing bone fragility information complementary to BMD.
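The variogram-fitting step described above can be sketched as: compute an empirical variogram gamma(h) = 0.5 * E[(z_i - z_j)^2] from pixel pairs of the BMD map, then fit a theoretical model to recover the sill. The synthetic map, the exponential model choice, and all parameter values below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
bmd = rng.normal(1.0, 0.1, (64, 64))             # synthetic stand-in for a BMD map (g/cm^2)
bmd = 0.5 * bmd + 0.5 * np.roll(bmd, 1, axis=0)  # add mild spatial correlation

ys, xs = np.indices(bmd.shape)
pts = np.column_stack([ys.ravel(), xs.ravel()]).astype(float)
vals = bmd.ravel()

# Empirical variogram from random pixel pairs, binned by separation distance h
i = rng.integers(0, vals.size, 20_000)
j = rng.integers(0, vals.size, 20_000)
h = np.linalg.norm(pts[i] - pts[j], axis=1)
sq = 0.5 * (vals[i] - vals[j]) ** 2
edges = np.arange(1.0, 20.0)
centers = 0.5 * (edges[:-1] + edges[1:])
gamma = np.array([sq[(h >= lo) & (h < hi)].mean() for lo, hi in zip(edges[:-1], edges[1:])])

# Fit an exponential variogram model to estimate the sill and correlation range
def exp_model(h, sill, corr_range):
    return sill * (1.0 - np.exp(-h / corr_range))

ok = np.isfinite(gamma)
(sill, corr_range), _ = curve_fit(exp_model, centers[ok], gamma[ok], p0=[np.nanmax(gamma), 5.0])
print(f"sill variance ~ {sill:.4f}, correlation range ~ {corr_range:.1f} pixels")
```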
Subjects
Absorptiometry, Photon, Bone Density, Lumbar Vertebrae/cytology, Lumbar Vertebrae/physiology, Aged, Aged, 80 and over, Female, Humans, Imaging, Three-Dimensional, Lumbar Vertebrae/diagnostic imaging, Male, Middle Aged, Stochastic Processes, Tomography, X-Ray Computed
ABSTRACT
BACKGROUND: During a myocardial infarction, no single best approach of systemic anticoagulation is recommended, likely due to a lack of comparative effectiveness studies and trade-offs between treatments. METHODS AND RESULTS: We investigated the patterns of use and site-level variability in anticoagulant strategies (unfractionated heparin [UFH] only, low-molecular-weight heparin [LMWH] only, UFH+LMWH, any bivalirudin) of 63 796 patients with a principal diagnosis of myocardial infarction treated with an early invasive strategy with percutaneous coronary intervention at 257 hospitals. About half (47%) of patients received UFH only, 6% UFH+LMWH, 7% LMWH only, and 40% bivalirudin. Compared with UFH, the median odds ratio was 2.90 for LMWH+UFH, 4.70 for LMWH only, and 3.09 for bivalirudin, indicating that 2 "identical" patients would have a 3- to 4-fold greater likelihood of being treated with anticoagulants other than UFH at one hospital compared with another. We then categorized hospitals as low- or high-users of LMWH and bivalirudin. Using hierarchical, multivariate regression models, we found that low bivalirudin-using hospitals had higher unadjusted bleeding rates, but the risk-adjusted and anticoagulant-adjusted bleeding rates did not differ across the hospital anticoagulation phenotypes. Risk-standardized mortality and risk-standardized length of stay also did not differ across hospital phenotypes. CONCLUSIONS: We found substantial site-level variability in the choice of anticoagulants for invasively managed acute myocardial infarction patients, even after accounting for patient factors. No single hospital-use pattern was found to be clinically superior. More studies are needed to determine which patients would derive the greatest benefit from various anticoagulants and to support consistent treatment of patients with the optimal anticoagulant strategy.
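The median odds ratio quoted above is typically derived from the variance of the hospital-level random intercept in a hierarchical logistic model, MOR = exp(sqrt(2 * sigma^2) * PHI_inv(0.75)). A short worked example with a hypothetical variance:

```python
from math import exp, sqrt
from scipy.stats import norm

def median_odds_ratio(random_intercept_variance):
    """MOR: median odds of a given anticoagulant at a higher-use vs lower-use hospital."""
    return exp(sqrt(2.0 * random_intercept_variance) * norm.ppf(0.75))

# A hypothetical random-intercept variance of ~1.40 reproduces an MOR near the
# 3.09 reported above for bivalirudin
print(round(median_odds_ratio(1.40), 2))  # -> 3.09
```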
Subjects
Anticoagulants/therapeutic use, Hospitals/statistics & numerical data, Myocardial Infarction/drug therapy, Aged, Female, Heparin/therapeutic use, Heparin, Low-Molecular-Weight/therapeutic use, Hirudins, Humans, Length of Stay/statistics & numerical data, Male, Middle Aged, Myocardial Infarction/mortality, Myocardial Infarction/therapy, Peptide Fragments/therapeutic use, Percutaneous Coronary Intervention/methods, Percutaneous Coronary Intervention/statistics & numerical data, Recombinant Proteins/therapeutic use, Retrospective Studies, Treatment Outcome
ABSTRACT
Bone mineral density (BMD) measurements from Dual-energy X-ray Absorptiometry (DXA) alone cannot account for all factors associated with the risk of hip fractures. For example, the inhomogeneity of bone mineral density in the hip region also contributes to bone strength. In the stochastic assessment of bone inhomogeneity, the BMD map in the hip region is considered as a random field and stochastic predictors can be calculated by fitting a theoretical model onto the experimental variogram of the BMD map. The objective of this study was to compare the ability of bone mineral density and stochastic assessment of inhomogeneous distribution of bone mineral density in predicting hip fractures for postmenopausal women. DXA scans in the hip region were obtained from postmenopausal women with hip fractures (N=47, Age: 71.3±11.4 years) and without hip fractures (N=45, Age: 66.7±11.4 years). Comparison of BMD measurements and stochastic predictors in assessing bone fragility was based on the area under the receiver operating characteristic curves (AUC) from logistic regression analyses. Although stochastic predictors offered higher accuracy (AUC=0.675) in predicting the risk of hip fractures than BMD measurements (AUC=0.625), this difference was not statistically significant (p=0.548). Nevertheless, the combination of stochastic predictors and BMD measurements had significantly (p=0.039) higher prediction accuracy (AUC=0.748) than BMD measurements alone. This study demonstrates that stochastic assessment of bone mineral distribution from DXA scans can serve as a valuable tool in enhancing the prediction of hip fractures for postmenopausal women in addition to BMD measurements.
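The AUC comparison described above can be sketched as fitting logistic models with and without the stochastic predictors and comparing ROC areas. All data and effect sizes below are hypothetical stand-ins:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 92  # 47 fracture + 45 control, as above
bmd = rng.normal(0.80, 0.12, n)                  # hypothetical hip BMD values
sill = rng.normal(0.010, 0.003, n)               # hypothetical stochastic predictor
logit = -1.0 - 6.0 * (bmd - 0.80) + 150.0 * (sill - 0.010)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # 1 = hip fracture

for name, X in [("BMD only", bmd[:, None]),
                ("BMD + stochastic", np.column_stack([bmd, sill]))]:
    p = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
    print(name, "AUC =", round(roc_auc_score(y, p), 3))
```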
Subjects
Absorptiometry, Photon, Bone Density/physiology, Hip Fractures/physiopathology, Hip Joint/physiology, Postmenopause/physiology, Aged, Aged, 80 and over, Diagnosis, Differential, Female, Hip Fractures/diagnosis, Humans, Logistic Models, Middle Aged, ROC Curve, Stochastic Processes
ABSTRACT
BACKGROUND: Overutilization of antimicrobial therapy places patients at risk for harm and contributes to antimicrobial resistance and escalating healthcare costs. Focusing on redundant or duplicate antimicrobial therapy is 1 recommended strategy to reduce overutilization and its attendant effects on patient safety and hospital costs. OBJECTIVE: This study explored the incidence and economic impact of potentially redundant antimicrobial therapy. METHODS: We conducted a retrospective analysis of inpatient administrative data drawn from 505 nonfederal US hospitals. All hospitalized patients discharged between January 1, 2008, and December 31, 2011, were eligible for study inclusion. Potentially redundant antimicrobial therapy was identified from pharmacy records and was defined as patients receiving treatment with overlapping antibiotic spectra for 2 or more consecutive days. RESULTS: We found evidence of potentially inappropriate, redundant antimicrobial coverage for 23 different antimicrobial combinations in 394 of the 505 (78%) hospitals, representing a total of 32,507 cases. High-frequency redundancies were observed in 3 antianaerobic regimens, accounting for 22,701 (70%) of the cases. Of these, metronidazole and piperacillin-tazobactam accounted for 53% (n = 17,326) of all potentially redundant cases. Days of redundant therapy totaled 148,589, representing greater than $12 million in potentially avoidable healthcare costs. CONCLUSIONS: Our study suggests that there may be pervasive use of redundant antimicrobial therapy within US hospitals. Appropriate use of antimicrobials may reduce the risk of harm to patients and lower healthcare costs.
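The case definition described above, overlapping antibiotic spectra on 2 or more consecutive days, can be sketched as a scan over pharmacy administration records. The records and the single drug pair below are illustrative:

```python
import pandas as pd

# One of the high-frequency redundant pairs highlighted above
REDUNDANT_PAIR = ("metronidazole", "piperacillin-tazobactam")

rx = pd.DataFrame({  # hypothetical pharmacy administration records
    "patient": [1, 1, 1, 1, 2, 2],
    "drug": ["metronidazole", "piperacillin-tazobactam",
             "metronidazole", "piperacillin-tazobactam",
             "metronidazole", "piperacillin-tazobactam"],
    "day": [3, 3, 4, 4, 5, 7],  # hospital day of administration
})

def redundant_days(group, pair):
    """Count overlap days that belong to a run of >= 2 consecutive days."""
    both = sorted(set(group.loc[group["drug"] == pair[0], "day"]) &
                  set(group.loc[group["drug"] == pair[1], "day"]))
    return len([d for k, d in enumerate(both)
                if (k > 0 and both[k - 1] == d - 1)
                or (k + 1 < len(both) and both[k + 1] == d + 1)])

for pid, grp in rx.groupby("patient"):
    days = redundant_days(grp, REDUNDANT_PAIR)
    print(f"patient {pid}: {days} redundant day(s)", "-> flagged case" if days >= 2 else "")
```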
Subjects
Anti-Infective Agents/economics, Economics, Hospital, Inappropriate Prescribing/economics, Anti-Bacterial Agents/economics, Anti-Bacterial Agents/therapeutic use, Anti-Infective Agents/therapeutic use, Clostridioides difficile, Clostridium Infections/drug therapy, Clostridium Infections/economics, Drug Costs/statistics & numerical data, Economics, Hospital/statistics & numerical data, Hospital Costs/statistics & numerical data, Humans, Inappropriate Prescribing/statistics & numerical data, Methicillin-Resistant Staphylococcus aureus, Retrospective Studies, Staphylococcal Infections/drug therapy, Staphylococcal Infections/economics, United States/epidemiology
ABSTRACT
IMPORTANCE Current guidelines allow substantial discretion in use of noninvasive cardiac imaging for patients without acute myocardial infarction (AMI) who are being evaluated for ischemia. Imaging use may affect downstream testing and outcomes. OBJECTIVE To characterize hospital variation in use of noninvasive cardiac imaging and the association of imaging use with downstream testing, interventions, and outcomes. DESIGN, SETTING, AND PARTICIPANTS Cross-sectional study of hospitals using 2010 administrative data from Premier, Inc, including patients with suspected ischemia on initial evaluation who were seen in the emergency department, observation unit, or inpatient ward; received at least 1 cardiac biomarker test on day 0 or 1; and had a principal discharge diagnosis for a common cause of chest discomfort, a sign or symptom of cardiac ischemia, and/or a comorbidity associated with coronary disease. We excluded patients with AMI. MAIN OUTCOMES AND MEASURES At each hospital, the proportion of patients who received noninvasive imaging to identify cardiac ischemia and the subsequent rates of admission, coronary angiography, and revascularization procedures. RESULTS We identified 549,078 patients at 224 hospitals. The median (interquartile range) hospital noninvasive imaging rate was 19.8% (10.9%-27.7%); range, 0.2% to 55.7%. Median hospital imaging rates by quartile were Q1, 6.0%; Q2, 15.9%; Q3, 23.5%; Q4, 34.8%. Compared with Q1, Q4 hospitals had higher rates of admission (Q1, 32.1% vs Q4, 40.0%), downstream coronary angiogram (Q1, 1.2% vs Q4, 4.9%), and revascularization procedures (Q1, 0.5% vs Q4, 1.9%). Hospitals in Q4 had a lower yield of revascularization for noninvasive imaging (Q1, 7.6% vs Q4, 5.4%) and for angiograms (Q1, 41.2% vs Q4, 38.8%). P <.001 for all comparisons. Readmission rates to the same hospital for AMI within 2 months were not different by quartiles (P = .51). Approximately 23% of variation in imaging use was attributable to the behavior of individual hospitals. CONCLUSIONS AND RELEVANCE Hospitals vary in their use of noninvasive cardiac imaging in patients with suspected ischemia who do not have AMI. Hospitals with higher imaging rates did not have substantially different rates of therapeutic interventions or lower readmission rates for AMI but were more likely to admit patients and perform angiography.
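The "proportion of variation attributable to hospitals" quoted above is commonly reported as an intraclass correlation from a random-intercept logistic model, ICC = sigma_u^2 / (sigma_u^2 + pi^2 / 3), where pi^2/3 is the latent-scale residual variance. A worked example with a hypothetical hospital-level variance chosen to land near the reported 23%:

```python
from math import pi

def icc_logistic(random_intercept_variance):
    """Latent-scale ICC for a two-level random-intercept logistic model."""
    return random_intercept_variance / (random_intercept_variance + pi ** 2 / 3)

# A hypothetical hospital-level variance of ~0.98 yields an ICC near 23%
print(round(icc_logistic(0.98), 2))  # -> 0.23
```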
Subjects
Cardiovascular Diseases/diagnosis, Diagnostic Imaging/statistics & numerical data, Hospitalization, Practice Patterns, Physicians'/statistics & numerical data, Biomarkers/analysis, Cardiovascular Diseases/therapy, Cross-Sectional Studies, Female, Humans, Male, Outcome and Process Assessment, Health Care, United States
ABSTRACT
The authors developed 8 measures of waste associated with cardiac procedures to assist hospitals in comparing their performance with peer facilities. Measure selection was based on review of the research literature, clinical guidelines, and consultation with key stakeholders. Development and validation used the data from 261 hospitals in a split-sample design. Measures were risk adjusted using Premier's CareScience methodologies or mean peer value based on Medicare Severity Diagnosis-Related Group assignment. High variability was found in resource utilization across facilities. Validation of the measures using item-to-total correlations (range = 0.27-0.78), Cronbach α (.88), and Spearman rank correlation (0.92) showed high reliability and discriminatory power. Because of the level of variability observed among hospitals, this study suggests that there is opportunity for facilities to design successful waste reduction programs targeting cardiac-device procedures.
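The reliability statistics reported above, Cronbach's alpha and item-to-total correlations, can be computed directly from a facilities-by-measures matrix. A self-contained sketch with hypothetical data sized to the study (261 facilities, 8 measures):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_facilities, n_measures). alpha = k/(k-1) * (1 - sum(var_i)/var_total)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def item_total_correlations(items):
    """Correlation of each measure with the total of the remaining measures."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return [np.corrcoef(items[:, j], total - items[:, j])[0, 1]
            for j in range(items.shape[1])]

rng = np.random.default_rng(7)
common = rng.normal(0, 1, 261)                        # shared 'waste' factor
data = common[:, None] + rng.normal(0, 1, (261, 8))   # 8 correlated measures
print("alpha:", round(cronbach_alpha(data), 2))
print("item-to-total:", np.round(item_total_correlations(data), 2))
```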
Subjects
Cardiovascular Diseases/therapy, Hospital Costs, Unnecessary Procedures/economics, Databases, Factual, Efficiency, Organizational/economics, Equipment and Supplies/economics, Health Resources/statistics & numerical data, Hospital Administrators, Hospitals, General/economics, Humans, Medical Staff, Hospital, Qualitative Research, Quality Assurance, Health Care/methods, United States
ABSTRACT
The authors developed 15 measures and a comparative index to assist acute care facilities in identifying and monitoring clinical and administrative functions for health care waste reduction. Primary clinical and administrative data were collected from 261 acute care facilities contained within a database maintained by Premier Inc, spanning October 1, 2010, to September 30, 2011. The measures and 4 index models were tested using the Cronbach α coefficient and item-to-total and Spearman rank correlations. The final index model was validated using 52 facilities that had complete data. Analysis of the waste measures showed good internal reliability (α = .85) with some overlap. Index modeling found that data transformation using the standard deviation and adjusting for the proportional contribution of each measure normalized the distribution and produced a Spearman rank correlation of 0.95. The waste measures and index methodology provide a simple and reliable means to identify and reduce waste and compare and monitor facility performance.
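The index construction described above, standardizing each measure by its standard deviation and adjusting for each measure's proportional contribution, can be sketched as below; the exact weighting rule is an assumption, since the abstract does not reproduce the formula:

```python
import numpy as np

rng = np.random.default_rng(8)
measures = rng.lognormal(0, 0.5, (52, 15))   # 15 hypothetical waste measures, 52 facilities

# Standardize each measure by its standard deviation
z = (measures - measures.mean(axis=0)) / measures.std(axis=0, ddof=1)

# Assumed weighting: each measure's share of the overall mean (proportional contribution)
weights = measures.mean(axis=0) / measures.mean(axis=0).sum()
waste_index = z @ weights                    # one comparative index value per facility
print("index range:", np.round([waste_index.min(), waste_index.max()], 2))
```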
Subjects
Efficiency, Organizational, Hospitals/statistics & numerical data, Benchmarking/methods, Efficiency, Organizational/standards, Efficiency, Organizational/statistics & numerical data, Hospital Administration/methods, Hospitals/standards, Humans, Models, Statistical, Quality Indicators, Health Care, Reproducibility of Results, United States
ABSTRACT
OBJECTIVES: The objective of this observational study was to compare 48-h all-cause mortality (as well as hospital stay mortality) among critically ill patients who underwent echocardiography either with or without an ultrasound contrast agent (UCA). BACKGROUND: The safety of perflutren-based UCAs has been questioned by the U.S. Food and Drug Administration (particularly when administered to critically ill patients) following rare reports of deaths or life-threatening adverse reactions that occurred in close temporal relationship to UCA administration. METHODS: This was a retrospective observational outcome study conducted in critically ill patients to compare all-cause 48-h and hospital stay mortality subsequent to echocardiography procedures performed either with or without a UCA. The study utilized discharge data from a database maintained by Premier, Inc. (Charlotte, North Carolina). Premier's database is the largest U.S. hospital-based, service-level comparative database for quality and outcomes research, and provides detailed resource utilization data along with patients' primary and secondary diagnoses and procedure billing codes. A propensity score-matching algorithm between UCA-enhanced echocardiography patients and non-contrast-enhanced echocardiography patients was utilized to reduce the potential for imbalance in covariates of selected patients in the comparison of mortality between groups. RESULTS: Patients undergoing echocardiography with a UCA had lower mortality at 48 h compared with patients undergoing non-contrast-enhanced echocardiography (1.70% vs. 2.50%), with an odds ratio = 0.66 (95% confidence interval [CI]: 0.54 to 0.80). Patients undergoing echocardiography with a UCA had lower hospital stay mortality compared with patients undergoing noncontrast echocardiography (14.85% vs. 15.66%), with an odds ratio = 0.89 (95% CI: 0.84 to 0.96). CONCLUSIONS: In critically ill, propensity-matched hospitalized patients undergoing echocardiography, use of a UCA is associated with a 28% lower mortality at 48 h in comparison with patients undergoing non-contrast-enhanced echocardiography. These results are reassuring, given previous reports suggesting an association between UCAs and increased mortality in critically ill patients.
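The propensity score-matching algorithm mentioned above can be sketched as: model each patient's probability of UCA exposure from covariates, then greedily pair each exposed patient with the nearest unexposed patient on the logit of the propensity score, within a caliper. The covariates, caliper rule, and matching details below are assumptions, not the study's documented algorithm:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n = 5000
X = np.column_stack([
    rng.normal(65, 15, n),       # age (hypothetical covariates)
    rng.binomial(1, 0.4, n),     # mechanical ventilation
    rng.binomial(1, 0.3, n),     # vasopressor use
])
treat_logit = -1.2 + 0.01 * X[:, 0] + 0.5 * X[:, 1]
treated = rng.binomial(1, 1 / (1 + np.exp(-treat_logit)))  # 1 = UCA-enhanced echo

# Propensity score: modeled probability of receiving contrast
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
logit_ps = np.log(ps / (1 - ps))
caliper = 0.2 * logit_ps.std()   # a common choice: 0.2 SD of the logit of the PS

# Greedy 1:1 nearest-neighbor matching without replacement
controls = dict(zip(np.flatnonzero(treated == 0), logit_ps[treated == 0]))
pairs = []
for i in np.flatnonzero(treated == 1):
    if not controls:
        break
    j = min(controls, key=lambda c: abs(controls[c] - logit_ps[i]))
    if abs(controls[j] - logit_ps[i]) <= caliper:
        pairs.append((i, j))
        del controls[j]
print(f"matched {len(pairs)} treated/control pairs")
```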
Subjects
Contrast Media, Critical Illness/mortality, Echocardiography/methods, Adolescent, Adult, Aged, Aged, 80 and over, Cause of Death/trends, Female, Follow-Up Studies, Hospital Mortality/trends, Humans, Length of Stay/trends, Male, Middle Aged, Odds Ratio, Prognosis, Propensity Score, Retrospective Studies, Risk Factors, Survival Rate/trends, United States/epidemiology, Young Adult
ABSTRACT
PURPOSE: Hyponatremia is associated with higher morbidity and mortality rates among hospitalized patients. Our study evaluated health care utilization and associated costs of patients hospitalized with a primary diagnosis of hyponatremia. METHODS: Hospitalized patients with a primary discharge diagnosis of hyponatremia (aged ≥ 18 years) were identified from the Premier Perspective™ database (January 1, 2007-March 31, 2010) and matched to non-hyponatremic (non-HN) patients using a combination of exact patient characteristic matching and propensity score matching. Univariate and multivariate statistics were used to compare hospital resource usage, costs, and 30-day readmission rates between cohorts. RESULTS: Hospital length of stay (LOS) (± standard deviation) (3.78 ± 3.19 vs 3.54 ± 3.26 days; P < 0.001) and cost ($5396 ± $6500 vs $4979 ± $6152; P < 0.001 for the hyponatremic [HN] and non-HN patient cohorts, respectively) were greater for the HN cohort, but intensive care unit (ICU) costs ($3554 ± $6463 vs $3484 ± $8510; P = 0.828) and ICU LOS (2.37 ± 3.47 vs 2.52 ± 3.87; P = 0.345) did not differ between cohorts. The ICU admission rate (7.9% vs 4.4%; P < 0.001), as well as the 30-day readmission rate (12.1% vs 2.9%; P < 0.001) were greater for the HN cohort. After adjustment for key patient characteristics, hyponatremia was associated with a 7.6% increase in hospital LOS, an 8.9% increase in hospital costs, and a 9% increase in ICU costs. Hyponatremia was associated with an increased risk of ICU admission (odds ratio, 1.89, confidence limits, 1.72, 2.07; P < 0.001) and 30-day hospital readmission for hyponatremia (odds ratio, 4.76; confidence limits, 4.31, 5.26; P < 0.001). CONCLUSION: Compared with non-HN patients, patients with a primary diagnosis of hyponatremia use a greater amount of hospital resources and represent a challenge to hospital profitability due to the increased likelihood of 30-day readmission.
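The adjusted percentage increases quoted above follow the standard interpretation of a coefficient on a log-transformed outcome: a coefficient b on the hyponatremia indicator implies a (exp(b) - 1) * 100% change in LOS or cost. A short worked example with a hypothetical coefficient chosen to match the reported 7.6% LOS increase:

```python
from math import exp

def pct_change_from_log_coef(b):
    """Percent change in a log-transformed outcome per unit change in the predictor."""
    return (exp(b) - 1.0) * 100.0

# A hypothetical coefficient b ~ 0.0733 corresponds to a ~7.6% increase in LOS
print(round(pct_change_from_log_coef(0.0733), 1))  # -> 7.6
```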
Subjects
Health Services/economics, Hospital Costs/statistics & numerical data, Hyponatremia/economics, Patient Readmission/economics, Adolescent, Adult, Aged, Aged, 80 and over, Analysis of Variance, Case-Control Studies, Female, Health Services/statistics & numerical data, Hospital Costs/trends, Humans, Hyponatremia/mortality, Hyponatremia/therapy, Intensive Care Units/economics, Intensive Care Units/statistics & numerical data, Length of Stay/economics, Length of Stay/statistics & numerical data, Male, Middle Aged, Patient Discharge/economics, Patient Discharge/statistics & numerical data, Patient Readmission/statistics & numerical data, Propensity Score, Retrospective Studies, Young Adult
ABSTRACT
The beneficial effects of exercise and a healthy diet are well documented in the general population but poorly understood in elite athletes. Previous research in subelite athletes suggests that regular training and an antioxidant-rich diet enhance antioxidant defenses but not performance. PURPOSE: To investigate whether habitual diet and/or exercise (training status or performance) affect antioxidant status in elite athletes. METHODS: Antioxidant blood biomarkers were assessed before and after a 30-min ergometer time trial in 28 male and 34 female rowers. The antioxidant blood biomarkers included ascorbic acid, uric acid, total antioxidant capacity (TAC), erythrocyte superoxide dismutase, glutathione peroxidase (GPx), and catalase. Rowers completed a 7-d food diary and an antioxidant-intake questionnaire. Effects of diet, training, and performance on resting biomarkers were assessed with Pearson correlations, and their effect on exercise-induced changes in blood biomarkers was assessed by a method of standardization. RESULTS: With the exception of GPx, there were small to moderate increases with exercise for all markers. Blood resting TAC had a small correlation with total antioxidant intake (correlation .29; 90% confidence limits, ±.27), and the exercise-induced change in TAC had a trivial to small association with dietary antioxidant intake from vitamin C (standardized effect .19; ±.22), vegetables (.20; ±.23), and vitamin A (.25; ±.27). Most other dietary intakes had trivial associations with antioxidant biomarkers. Years of training had a small inverse correlation with TAC (-.32; ±.19) and a small association with the exercise-induced change in TAC (.27; ±.24). CONCLUSION: Training status correlates more strongly with antioxidant status than diet does.