2.
Article in English | MEDLINE | ID: mdl-38953178

ABSTRACT

PURPOSE: The aim of this study was to investigate the cost-effectiveness of revision total knee arthroplasty compared to primary total knee arthroplasty in terms of cost per quality-adjusted life year (QALY). METHODS: Data were retrieved for all primary and revision total knee arthroplasty (TKA) procedures performed at a tertiary Swiss hospital between 2006 and 2019. A Markov model was created to evaluate revision risk, and lifetime QALY gain and lifetime procedure costs were calculated from individual EuroQol 5 dimension (EQ-5D) scores, hospital costs, national life expectancy tables and standard discounting. Cost per QALY gained was calculated for primary and revision procedures. RESULTS: EQ-5D data were available for 1343 primary and 103 revision procedures. Significant QALY gains were seen following surgery in all cases. QALY gains were similar, though significantly greater following primary TKA (PTKA) (5.67 ± 3.98) than following revision TKA (RTKA) (4.67 ± 4.20). Cost per QALY was €4686 for PTKA and €10,364 for RTKA. The highest average cost per QALY was seen in two-stage RTKA (€12,292), followed by one-stage RTKA (€8982). CONCLUSION: RTKA results in a QALY gain similar to that of PTKA. The cost of achieving this health gain is two to three times higher for RTKA, but both procedures are highly cost-effective. LEVEL OF EVIDENCE: Economic level II.
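As a rough illustration of the cost-per-QALY arithmetic described above, the sketch below sums a yearly EQ-5D utility gain over remaining life expectancy with a standard annual discount and divides the procedure cost by the result. The 3% discount rate and the input values are illustrative assumptions, not figures taken from this study.

```python
# Minimal sketch: lifetime discounted QALY gain and cost per QALY.
# The 3% discount rate and example inputs are illustrative assumptions,
# not values taken from the study.

def discounted_qaly_gain(utility_gain: float, life_expectancy_years: float,
                         annual_discount: float = 0.03) -> float:
    """Sum the yearly utility gain over remaining life expectancy,
    discounting each future year at the given annual rate."""
    full_years = int(life_expectancy_years)
    total = sum(utility_gain / (1 + annual_discount) ** t for t in range(full_years))
    # credit the fractional final year, discounted at the last full year
    total += (life_expectancy_years - full_years) * utility_gain / (1 + annual_discount) ** full_years
    return total

def cost_per_qaly(procedure_cost: float, qaly_gain: float) -> float:
    return procedure_cost / qaly_gain

qalys = discounted_qaly_gain(utility_gain=0.30, life_expectancy_years=14.0)
print(f"Discounted QALY gain: {qalys:.2f}")
print(f"Cost per QALY: EUR {cost_per_qaly(26_000, qalys):,.0f}")
```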

3.
Asian J Neurosurg ; 19(2): 317-320, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38974458

ABSTRACT

Objectives Expandable transforaminal lumbar interbody fusion (TLIF) devices have been developed to introduce more segmental lordosis through a narrow operative corridor, but there are concerns about the degree of achievable correction with a small graft footprint. In this report, we describe the technical nuances associated with placing bilateral expandable cages for correction of iatrogenic deformity. Materials and Methods A 60-year-old female with symptomatic global sagittal malalignment and a severe lumbar kyphotic deformity after five prior lumbar surgeries presented to our institution. We performed multilevel posterior column osteotomies and an L3-4 intradiscal osteotomy, and placed bilateral lordotic expandable TLIF cages at the level of maximum segmental kyphosis. Results We achieved a 21-degree correction of the patient's focal kyphotic deformity and restoration of the patient's global sagittal alignment. Conclusion This case demonstrates both the feasibility and the utility of placing bilateral expandable TLIF cages at a single disc space in the setting of severe focal sagittal malalignment. This technique expands the implant footprint and, when coupled with an intradiscal osteotomy, allows for a significant restoration of segmental lordosis.

5.
Scand J Trauma Resusc Emerg Med ; 32(1): 57, 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38886775

ABSTRACT

BACKGROUND: Limited research has explored the effect of Circle of Willis (CoW) anatomy on outcomes among blunt cerebrovascular injuries (BCVI). It remains unclear whether current BCVI screening and scanning practices are sufficient to identify concomitant CoW anomalies and how those anomalies affect outcomes. METHODS: This retrospective cohort study included adult traumatic BCVIs at 17 level I-IV trauma centers (08/01/2017-07/31/2021). The objectives were to compare screening criteria, scanning practices, and outcomes among those with and without CoW anomalies. RESULTS: Of 561 BCVIs, 65% were male and the median age was 48 years; 17% (n = 93) had a CoW anomaly. Compared to those with normal CoW anatomy, those with CoW anomalies had significantly higher rates of any stroke (10% vs. 4%, p = 0.04), ICH (38% vs. 21%, p = 0.001), and clinically significant bleeding (CSB) before antithrombotic initiation (14% vs. 3%, p < 0.0001). Compared to patients with a normal CoW, those with a CoW anomaly also had ischemic strokes more often after antithrombotic interruption (13% vs. 2%, p = 0.02). Patients with CoW anomalies were screened significantly more often because of some other head/neck indication not outlined in BCVI screening criteria than patients with normal CoW anatomy (27% vs. 18%, p = 0.04). Scans identifying CoW anomalies included both the head and neck significantly more often than scans identifying normal CoW anatomy (53% vs. 29%, p = 0.0001). CONCLUSIONS: While previous studies suggested universal scanning for BCVI detection, this study found that patients with BCVI and CoW anomalies more often had some other head/neck injury not captured by BCVI scanning criteria than patients with a normal CoW, which may suggest that BCVI screening across all patients with a head/neck injury could improve the simultaneous detection of CoW anomalies and BCVIs. When screening for BCVI, scans including both the head and neck are superior to a single region for detecting concomitant CoW anomalies. Worse outcomes (stroke, ICH, and clinically significant bleeding before antithrombotic initiation) were observed in patients with CoW anomalies compared to those with a normal CoW. Those with a CoW anomaly experienced strokes at a higher rate than patients with normal CoW anatomy specifically when antithrombotic therapy was interrupted. This emphasizes the need for stringent antithrombotic therapy regimens among patients with CoW anomalies, suggests these patients may benefit from tailored treatment, and highlights the need to include the CoW when scanning for BCVI. LEVEL OF EVIDENCE: Level III, Prognostic/Epidemiological.


Subject(s)
Cerebrovascular Trauma; Circle of Willis; Wounds, Nonpenetrating; Humans; Circle of Willis/abnormalities; Circle of Willis/anatomy & histology; Circle of Willis/diagnostic imaging; Retrospective Studies; Female; Male; Middle Aged; Cerebrovascular Trauma/diagnostic imaging; Wounds, Nonpenetrating/complications; Adult; Trauma Centers
6.
PLoS One ; 19(6): e0298317, 2024.
Article in English | MEDLINE | ID: mdl-38913647

ABSTRACT

BACKGROUND: Although sports-related brachial plexus injuries are common, there is a lack of published primary data to inform their clinical management. METHODS: A systematic search of the Medline, CINAHL, PubMed, SPORTDiscus and Web of Science databases and Google Scholar was completed from inception to August 2023 according to the PRISMA-ScR guidelines. Methodological quality of included articles was assessed with the Joanna Briggs Institute tool. Studies providing primary data on the rehabilitative management of diagnosed or suspected brachial plexus injuries sustained while playing contact sports were included. RESULTS: Sixty-five studies were identified and screened, of which eight case reports were included, incorporating 10 participants with a mean age of 19.8 (±4.09) years. There was wide heterogeneity in the injury severity, injury reporting, physical examination and imaging approaches documented. Nine of 10 participants returned to competitive sport, though follow-up periods also varied widely. Whilst return-to-play criteria varied between studies, the most consistent indicator was pain-free shoulder range of motion and strength. CONCLUSIONS: There is a distinct lack of data available to inform evidence-based rehabilitation management of sports-related brachial plexus injury; only eight individual case reports containing published data on 10 athletes were identified. Further reporting is critical to inform clinical management.


Subject(s)
Athletic Injuries; Brachial Plexus; Humans; Brachial Plexus/injuries; Athletic Injuries/rehabilitation; Young Adult; Male; Female; Range of Motion, Articular; Adult; Return to Sport; Brachial Plexus Neuropathies/rehabilitation; Brachial Plexus Neuropathies/etiology; Adolescent
7.
Synapse ; 78(3): e22291, 2024 May.
Article in English | MEDLINE | ID: mdl-38733105

ABSTRACT

Spinal serotonin enables neuro-motor recovery (i.e., plasticity) in patients with debilitating paralysis. While there are time-of-day fluctuations in serotonin-dependent spinal plasticity, it is unknown, in humans, whether this is due to dynamic changes in spinal serotonin levels or in downstream signaling processes. The primary objective of this study was to determine whether time-of-day variations in spinal serotonin levels exist in humans. To assess this, intrathecal drains were placed in seven adults, with cerebrospinal fluid (CSF) collected at diurnal (05:00 to 07:00) and nocturnal (17:00 to 19:00) intervals. High performance liquid chromatography with mass spectrometry was used to quantify CSF serotonin levels, with comparisons made using univariate analysis. From the 7 adult patients, 21 distinct CSF samples were collected: 9 during the diurnal interval and 12 during the nocturnal interval. Diurnal CSF samples demonstrated an average serotonin level of 216.6 ± 67.7 nM; nocturnal CSF samples demonstrated an average serotonin level of 206.7 ± 75.8 nM. There was no significant difference between diurnal and nocturnal CSF serotonin levels (p = .762). Within this small cohort of spine-healthy adults, there were no differences between diurnal and nocturnal spinal serotonin levels. These observations exclude spinal serotonin levels as the etiology of time-of-day fluctuations in serotonin-dependent spinal plasticity expression.
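The univariate comparison described above can be illustrated with a simple two-sample test on simulated diurnal and nocturnal serotonin values; the choice of Welch's t-test and the simulated data are assumptions for illustration, since the abstract does not name the exact test used.

```python
# Sketch of the kind of univariate comparison described (diurnal vs. nocturnal
# CSF serotonin, nM). The test choice (Welch's t-test) and the sample values
# are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
diurnal = rng.normal(216.6, 67.7, size=9)     # 9 diurnal samples
nocturnal = rng.normal(206.7, 75.8, size=12)  # 12 nocturnal samples

t_stat, p_value = stats.ttest_ind(diurnal, nocturnal, equal_var=False)
print(f"mean diurnal {diurnal.mean():.1f} nM, nocturnal {nocturnal.mean():.1f} nM, p = {p_value:.3f}")
```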


Subject(s)
Circadian Rhythm; Serotonin; Humans; Serotonin/cerebrospinal fluid; Male; Adult; Female; Circadian Rhythm/physiology; Middle Aged; Spinal Cord/metabolism; Chromatography, High Pressure Liquid; Aged
8.
Inj Epidemiol ; 11(1): 18, 2024 May 13.
Article in English | MEDLINE | ID: mdl-38741167

ABSTRACT

BACKGROUND: There has been an epidemic of firearm injuries in the United States since the mid-2000s. We therefore sought to examine whether hospitalizations from firearm injuries have increased over time and to examine temporal changes in patient demographics, firearm injury intent, and injury severity. METHODS: This was a multicenter, retrospective, observational cohort study of patients hospitalized with a traumatic injury at six US level I trauma centers between 1/1/2016 and 6/30/2022. ICD-10-CM cause codes were used to identify and describe firearm injuries. Temporal trends were compared for demographics (age, sex, race, insured status), intent (assault, unintentional, self-harm, legal intervention, and undetermined), and severity (death, ICU admission, severe injury (injury severity score ≥ 16), receipt of blood transfusion, mechanical ventilation, and hospital and ICU LOS (days)). Temporal trends were examined over 13 six-month intervals (H1, January-June; H2, July-December) using joinpoint regression and reported as semi-annual percent change (SPC); significance was defined as p < 0.05. RESULTS: Firearm injuries accounted for 2.6% (1908 of 72,474) of trauma hospitalizations. The rate of firearm injuries initially declined from 2016-H1 to 2018-H2 (SPC = -4.0%, p = 0.002), followed by increased rates from 2018-H2 to 2020-H1 (SPC = 9.0%, p = 0.005), before stabilizing from 2020-H1 to 2022-H1 (0.5%, p = 0.73). NH Black patients had the greatest hospitalization rate from firearm injuries (14.0%) and were the only group to demonstrate a temporal increase (SPC = 6.3%, p < 0.001). The proportion of uninsured patients increased (SPC = 2.3%, p = 0.02), but there were no temporal changes by age or sex. ICU admission rates declined (SPC = -2.2%, p < 0.001), but ICU LOS increased (SPC = 2.8%, p = 0.04). There were no significant changes over time in rates of death (SPC = 0.3%), severe injury (SPC = 1.6%), blood transfusion (SPC = 0.6%), and mechanical ventilation (SPC = 0.6%). When examined by intent, self-harm injuries declined over time (SPC = -4.1%, p < 0.001); assaults declined through 2019-H2 (SPC = -5.6%, p = 0.01) before increasing through 2022-H1 (SPC = 6.5%, p = 0.01), while undetermined injuries increased through 2019-H1 (SPC = 24.1%, p = 0.01) and then stabilized (SPC = -4.5%, p = 0.39); there were no temporal changes in unintentional injuries or legal intervention. CONCLUSIONS: Hospitalizations from firearm injuries are increasing following a period of decline, driven by increases among NH Black patients. Trauma systems need to consider these changing trends to best address the needs of the injured population.
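The semi-annual percent change (SPC) reported above is, within a single joinpoint segment, derived from the slope of a log-linear fit of the rate against the half-year index. The sketch below shows that transformation on made-up rates; real joinpoint software also estimates where the segments break.

```python
# Sketch of how a semi-annual percent change (SPC) is derived within one
# joinpoint segment: fit log(rate) against the half-year index and transform
# the slope. Example rates are made up for illustration.
import numpy as np

half_year_index = np.arange(6)                      # e.g. 2016-H1 .. 2018-H2
rate = np.array([3.0, 2.9, 2.8, 2.75, 2.6, 2.55])   # firearm injuries per 100 trauma hospitalizations

slope, intercept = np.polyfit(half_year_index, np.log(rate), 1)
spc = (np.exp(slope) - 1) * 100
print(f"SPC = {spc:.1f}% per half-year")            # negative -> declining trend
```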

9.
J Clin Med ; 13(8)2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38673475

ABSTRACT

Background: The objective of this study was to evaluate whether imbalance influences complication rates, radiological outcomes, and patient-reported outcome measures (PROMs) following adult spinal deformity (ASD) surgery. Methods: ASD patients with baseline and 2-year radiographic and PROM data were included. Patients were grouped according to whether they answered yes or no to a recent history of pre-operative loss of balance. The groups were propensity-matched by age, pelvic incidence-lumbar lordosis (PI-LL), and surgical invasiveness score. Results: In total, 212 patients were examined (106 in each group). Patients with gait imbalance had worse baseline PROMs, including Oswestry disability index (45.2 vs. 36.6), SF-36 mental component score (44 vs. 51.8), and SF-36 physical component score (p < 0.001 for all). After 2 years, patients with gait imbalance had less pelvic tilt correction (-1.2 vs. -3.6°, p = 0.039) for a comparable PI-LL correction (-11.9 vs. -15.1°, p = 0.144). Patients with gait imbalance had higher rates of radiographic proximal junctional kyphosis (PJK) (26.4% vs. 14.2%) and implant-related complications (47.2% vs. 34.0%). After controlling for age, baseline sagittal parameters, PI-LL correction, and comorbidities, patients with imbalance had 2.2-times-increased odds of PJK after 2 years. Conclusions: Patients with a self-reported loss of balance/unsteady gait have significantly worse PROMs and a higher risk of PJK.
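A minimal sketch of the propensity-matching step described above: estimate a propensity score for reporting loss of balance from the three matching variables, then greedily pair each imbalance patient with the nearest unused control. The data and the greedy nearest-neighbour routine are illustrative assumptions, not the study's actual matching procedure.

```python
# Rough sketch of 1:1 propensity matching on the three stated variables (age,
# PI-LL, surgical invasiveness score); data and greedy matching are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 400
X = np.column_stack([rng.normal(64, 9, n),      # age
                     rng.normal(15, 12, n),     # PI-LL mismatch (degrees)
                     rng.normal(100, 40, n)])   # surgical invasiveness score
imbalance = rng.integers(0, 2, n)               # self-reported loss of balance

ps = LogisticRegression().fit(X, imbalance).predict_proba(X)[:, 1]
treated, controls = np.where(imbalance == 1)[0], np.where(imbalance == 0)[0]

pairs, used = [], set()
for t in treated:                               # greedy nearest-neighbour match on the score
    order = controls[np.argsort(np.abs(ps[controls] - ps[t]))]
    match = next((c for c in order if c not in used), None)
    if match is not None:
        used.add(match)
        pairs.append((t, match))
print(f"Matched {len(pairs)} imbalance patients to controls")
```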

10.
JMIR Perioper Med ; 7: e52125, 2024 Apr 04.
Article in English | MEDLINE | ID: mdl-38573737

ABSTRACT

BACKGROUND: Pip is a novel digital health platform (DHP) that combines human health coaches (HCs) and technology with patient-facing content. This combination has not been studied in perioperative surgical optimization. OBJECTIVE: This study's aim was to test the feasibility of the Pip platform for deploying perioperative, digital, patient-facing optimization guidelines to elective surgical patients, assisted by an HC, at predefined intervals in the perioperative journey. METHODS: We conducted an institutional review board-approved, descriptive, prospective feasibility study of patients scheduled for elective surgery and invited to enroll in Pip from 2.5 to 4 weeks preoperatively through 4 weeks postoperatively at an academic medical center between November 22, 2022, and March 27, 2023. Descriptive primary end points were patient-reported outcomes, including patient satisfaction and engagement, and Pip HC evaluations. Secondary end points included mean or median length of stay (LOS), readmission at 7 and 30 days, and emergency department use within 30 days. Secondary end points were compared between patients who received Pip and patients who did not, using stabilized inverse probability of treatment weighting. RESULTS: A total of 283 patients were invited, of whom 172 (60.8%) enrolled in Pip. Of these, 80.2% (138/172) had ≥1 HC session and proceeded to surgery, and 70.3% (97/138) of enrolled patients engaged with Pip postoperatively. Engagement began a mean of 27 days before surgery. Pip demonstrated an 82% weekly engagement rate with HCs, and patients attended an average of 6.7 HC sessions. Among patients who completed surveys (95/138, 68.8%), satisfaction scores were high (mean 4.8/5; n=95), and patients strongly agreed that HCs helped them throughout the perioperative process (mean 4.97/5; n=33). The average net promoter score was 9.7 out of 10. A total of 268 patients in the non-Pip group and 128 patients in the Pip group had appropriate overlapping distributions of stabilized inverse probability of treatment weights for the analytic sample. The Pip cohort was associated with an LOS reduction compared to the non-Pip cohort (mean 2.4 vs 3.1 days; median 1.9, IQR 1.0-3.1 vs median 3.0, IQR 1.1-3.9 days; mean ratio 0.76; 95% CI 0.62-0.93; P=.009). The Pip cohort experienced a 49% lower risk of 7-day readmission (relative risk [RR] 0.51, 95% CI 0.11-2.31; P=.38) and a 17% lower risk of 30-day readmission (RR 0.83, 95% CI 0.30-2.31; P=.73), though these did not reach statistical significance. Both cohorts had similar 30-day emergency department returns (RR 1.06, 95% CI 0.56-2.01; P=.85). CONCLUSIONS: Pip is a novel mobile DHP combining human HCs and perioperative optimization content; it is feasible for engaging patients in their perioperative journey and is associated with reduced hospital LOS. Further studies assessing the impact of Pip or similar DHP-HC combinations on clinical and patient-reported outcomes during the perioperative journey are required.
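Stabilized inverse probability of treatment weighting, as used above to compare Pip and non-Pip patients, can be sketched as follows: fit a propensity model for enrollment, then weight each patient by the marginal treatment probability divided by their propensity (or its complement). The covariates, outcome, and propensity model below are illustrative assumptions, not the study's actual variables.

```python
# Minimal sketch of stabilized inverse probability of treatment weighting (IPTW).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "age": rng.normal(60, 12, n),
    "asa_class": rng.integers(1, 4, n),     # hypothetical comorbidity proxy
    "enrolled": rng.integers(0, 2, n),      # 1 = received the DHP program
    "los_days": rng.gamma(2.0, 1.5, n),     # hospital length of stay
})

# Propensity score: probability of enrollment given covariates
covariates = df[["age", "asa_class"]]
ps = LogisticRegression().fit(covariates, df["enrolled"]).predict_proba(covariates)[:, 1]

# Stabilized weight: marginal enrollment probability over the individual propensity
p_enrolled = df["enrolled"].mean()
df["sw"] = np.where(df["enrolled"] == 1, p_enrolled / ps, (1 - p_enrolled) / (1 - ps))

for arm, grp in df.groupby("enrolled"):     # weighted mean LOS per arm
    print(arm, round(np.average(grp["los_days"], weights=grp["sw"]), 2))
```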

11.
Injury ; : 111523, 2024 Apr 09.
Article in English | MEDLINE | ID: mdl-38614835

ABSTRACT

BACKGROUND: In patients with severe traumatic brain injury (TBI), clinicians must balance preventing venous thromboembolism (VTE) against the risk of intracranial hemorrhagic expansion (ICHE). We hypothesized that low molecular weight heparin (LMWH) would not increase the risk of ICHE or VTE as compared to unfractionated heparin (UH) in patients with severe TBI. METHODS: Patients ≥ 18 years of age with isolated severe TBI (AIS ≥ 3), admitted to 24 level I and II trauma centers between January 1, 2014 and December 31, 2020, and who received subcutaneous UH or LMWH injections for chemical venous thromboembolism prophylaxis (VTEP) were included. Primary outcomes were VTE and ICHE after VTEP initiation. Secondary outcomes were mortality and neurosurgical interventions. Entropy balancing (EBAL) weighted competing risk or logistic regression models were estimated for all outcomes with the chemical VTEP agent as the predictor of interest. RESULTS: 984 patients received chemical VTEP, 482 UH and 502 LMWH. Patients on LMWH more often had pre-existing conditions such as liver disease (UH vs LMWH 1.7% vs. 4.4%, p = 0.01) and coagulopathy (UH vs LMWH 0.4% vs. 4.2%, p < 0.001). There were no differences in VTE or ICHE after VTEP initiation, and no differences in neurosurgical interventions performed. There were a total of 29 VTE events (3%) in the cohort that received VTEP. A Cox proportional hazards model with a random effect for facility demonstrated no statistically significant difference in time to VTE across the two agents (p = 0.44). The LMWH group had a 43% lower risk of overall ICHE compared to the UH group (HR = 0.57; 95% CI 0.32-1.03, p = 0.062), although this was not statistically significant. CONCLUSION: In this multi-center analysis, patients who received LMWH had a decreased risk of overall ICHE, with no differences in VTE, ICHE after VTEP initiation, or neurosurgical interventions compared to those who received UH. There were no safety concerns when using LMWH compared to UH. LEVEL OF EVIDENCE: Level III, Therapeutic Care Management.

12.
Oecologia ; 204(4): 943-957, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38619585

ABSTRACT

Top carnivores can influence the structure of ecological communities, primarily through competition and predation; however, communities are also shaped by bottom-up forces such as anthropogenic habitat disturbance. Top carnivore declines will likely alter competitive dynamics within and amongst sympatric carnivore species. Increasing intraspecific competition is generally predicted to drive niche expansion and/or individual specialisation, while interspecific competition tends to constrain niches. Using stable isotope analysis of whiskers, we studied the effects of Tasmanian devil (Sarcophilus harrisii) declines upon the population- and individual-level isotopic niches of Tasmanian devils and sympatric spotted-tailed quolls (Dasyurus maculatus subsp. maculatus). We investigated whether time since the onset of devil decline (a proxy for severity of decline) and landscape characteristics affected the isotopic niche breadth and overlap of devil and quoll populations. We quantified individual isotopic niche breadth for a subset of Tasmanian devils and spotted-tailed quolls and assessed whether between-site population niche variation was driven by individual-level specialisation. Tasmanian devils and spotted-tailed quolls demonstrated smaller population-level isotopic niche breadths with increasing human-modified habitat, while time since the onset of devil decline had no effect on population-level niche breadth or interspecific niche overlap. Individual isotopic niche breadths of Tasmanian devils and spotted-tailed quolls were narrower in human-modified landscapes, likely driving population isotopic niche contraction; however, the degree of individuals' specialisation relative to one another remained constant. Our results suggest that across varied landscapes, mammalian carnivore niches can be more sensitive to the bottom-up forces of anthropogenic habitat disturbance than to the top-down effects of top carnivore decline.
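One common way to quantify a population-level isotopic niche breadth from whisker δ13C-δ15N data is the standard ellipse area (SEA); the abstract does not state the exact metric used, so the sketch below, on simulated values, is only an assumed illustration of the general approach.

```python
# Sketch of one common isotopic niche breadth metric, the standard ellipse
# area (SEA) of bivariate d13C-d15N data; metric choice and values are assumptions.
import numpy as np

rng = np.random.default_rng(11)
d13c = rng.normal(-24.0, 1.2, 30)                     # simulated whisker d13C values
d15n = rng.normal(7.5, 0.8, 30) + 0.3 * (d13c + 24)   # mildly correlated d15N values

cov = np.cov(np.vstack([d13c, d15n]))                 # 2x2 covariance of the bivariate data
eigenvalues = np.linalg.eigvalsh(cov)
sea = np.pi * np.sqrt(eigenvalues[0] * eigenvalues[1])  # ellipse area from semi-axis lengths
print(f"Standard ellipse area: {sea:.2f} per-mil^2")
```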


Subject(s)
Ecosystem; Animals; Marsupials; Humans; Carnivora
13.
J Laryngol Otol ; : 1-8, 2024 Apr 22.
Article in English | MEDLINE | ID: mdl-38644734

ABSTRACT

OBJECTIVE: Evidence for necrotising otitis externa (NOE) diagnosis and management is limited, and outcome reporting is heterogeneous. International best practice guidelines were used to develop consensus diagnostic criteria and a core outcome set (COS). METHODS: The study was pre-registered on the Core Outcome Measures in Effectiveness Trials (COMET) database. A systematic literature review identified candidate items. Patient-centred items were identified via a qualitative study. Items and their definitions were refined by multidisciplinary stakeholders in a two-round Delphi exercise and a subsequent consensus meeting. RESULTS: The final COS incorporates 36 items within 12 themes: Signs and symptoms; Pain; Advanced disease indicators; Complications; Survival; Antibiotic regimes and side effects; Patient comorbidities; Non-antibiotic treatments; Patient compliance; Duration and cessation of treatment; Relapse and readmission; and Multidisciplinary team management. The consensus diagnostic criteria include 12 items within 6 themes: Signs and symptoms (oedema, otorrhoea, granulation); Pain (otalgia, nocturnal otalgia); Investigations (microbiology [does not have to be positive], histology [malignancy excluded], positive CT and MRI); Persistent symptoms despite local and/or systemic treatment for at least two weeks; At least one risk factor for impaired immune response; and Indicators of advanced disease (not obligatory but must be reported when present at diagnosis). Stakeholders were unanimous that there is no role for secondary, graded, or optional diagnostic items. The consensus meeting identified themes for future research. CONCLUSION: The adoption of consensus-defined diagnostic criteria and a COS facilitates standardised research reporting and robust data synthesis. Inclusion of patient and professional perspectives ensures best practice stakeholder engagement.

14.
Proc Natl Acad Sci U S A ; 121(12): e2307780121, 2024 Mar 19.
Article in English | MEDLINE | ID: mdl-38466855

ABSTRACT

Coevolution is common and frequently governs host-pathogen interaction outcomes. Phenotypes underlying these interactions often manifest as the combined products of the genomes of interacting species, yet traditional quantitative trait mapping approaches ignore these intergenomic interactions. Devil facial tumor disease (DFTD), an infectious cancer afflicting Tasmanian devils (Sarcophilus harrisii), has decimated devil populations due to universal host susceptibility and a fatality rate approaching 100%. Here, we used a recently developed joint genome-wide association study (i.e., co-GWAS) approach, 15 y of mark-recapture data, and 960 genomes to identify intergenomic signatures of coevolution between devils and DFTD. Using a traditional GWA approach, we found that both devil and DFTD genomes explained a substantial proportion of variance in how quickly susceptible devils became infected, although genomic architectures differed across devils and DFTD; the devil genome had fewer loci of large effect whereas the DFTD genome had a more polygenic architecture. Using a co-GWA approach, devil-DFTD intergenomic interactions explained ~3× more variation in how quickly susceptible devils became infected than either genome alone, and the top genotype-by-genotype interactions were significantly enriched for cancer genes and signatures of selection. A devil regulatory mutation was associated with differential expression of a candidate cancer gene and showed putative allele matching effects with two DFTD coding sequence variants. Our results highlight the need to account for intergenomic interactions when investigating host-pathogen (co)evolution and emphasize the importance of such interactions when considering devil management strategies.
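The intergenomic (genotype-by-genotype) association idea behind the co-GWAS can be sketched for a single host-tumour locus pair as a regression with an interaction term, as below; the simulated genotypes and phenotype are assumptions, and a real co-GWAS scans many such locus pairs with appropriate multiple-testing correction.

```python
# Sketch of a genotype-by-genotype association test for one host-tumour locus
# pair; the phenotype here stands in for how quickly a susceptible devil
# becomes infected. All values are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 960
df = pd.DataFrame({
    "host_geno": rng.integers(0, 3, n),    # 0/1/2 copies of a host allele
    "tumour_geno": rng.integers(0, 3, n),  # 0/1/2 copies of a tumour variant
})
df["months_to_infection"] = (
    18 - 1.0 * df["host_geno"] - 0.5 * df["tumour_geno"]
    - 1.5 * df["host_geno"] * df["tumour_geno"] + rng.normal(0, 3, n)
)

fit = smf.ols("months_to_infection ~ host_geno * tumour_geno", data=df).fit()
print("interaction beta:", round(fit.params["host_geno:tumour_geno"], 3),
      "p:", round(fit.pvalues["host_geno:tumour_geno"], 5))
```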


Subject(s)
Communicable Diseases; Daunorubicin/analogs & derivatives; Facial Neoplasms; Marsupials; Animals; Facial Neoplasms/genetics; Facial Neoplasms/veterinary; Genome-Wide Association Study; Marsupials/genetics
15.
Article in English | MEDLINE | ID: mdl-38462731

ABSTRACT

STUDY DESIGN: Retrospective cohort. OBJECTIVE: To evaluate factors associated with the long-term durability of cost-effectiveness (CE) in ASD patients. BACKGROUND: A substantial increase in costs associated with the surgical treatment of adult spinal deformity (ASD) has prompted scrutiny of the value and utility it provides. METHODS: We included 327 operative ASD patients with 5-year (5Y) follow-up. Published methods were used to determine costs based on CMS.gov definitions and average DRG reimbursement rates. Utility was calculated using quality-adjusted life-years (QALYs), with the Oswestry Disability Index (ODI) converted to the Short-Form Six-Dimension (SF-6D) and a 3% discount applied for its decline with life expectancy. A CE threshold of $150,000 was used for the primary analysis. RESULTS: Major and minor complication rates were 11% and 47%, respectively, with 26% undergoing reoperation by 5Y. The mean cost associated with surgery was $91,095 ± $47,003, with a utility gain of 0.091 ± 0.086 at 1Y and QALYs gained of 0.171 ± 0.183 at 2Y and 0.42 ± 0.43 at 5Y. The cost per QALY was $414,885 at 2Y, which decreased to $142,058 at 5Y. With the threshold of $150,000 for CE, 19% met CE at 2Y and 56% at 5Y. Among those in whom revision was avoided, 87% met cumulative CE through life expectancy. Controlled analysis showed higher baseline CCI and pelvic tilt (PT) to be the strongest predictors of not maintaining durable CE to 5Y (CCI OR: 1.821 [1.159-2.862], P=0.009; PT OR: 1.079 [1.007-1.155], P=0.030). CONCLUSIONS: Most patients achieved cost-effectiveness after four years postoperatively, with 56% meeting the threshold at five years postoperatively. When revision was avoided, 87% of patients met cumulative cost-effectiveness through life expectancy. Mechanical complications were predictive of failure to achieve cost-effectiveness at 2Y, while comorbidity burden and medical complications were predictive at 5Y.
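The threshold logic described above, cumulative cost per QALY checked against a $150,000 willingness-to-pay threshold at each follow-up, can be sketched as below. The cost and QALY inputs are hypothetical; the study's figures come from per-patient data and would not be reproduced by this ratio-of-means shortcut.

```python
# Sketch of a cost-effectiveness threshold check at successive follow-ups.
# Inputs are hypothetical, not the study's per-patient data.
surgery_cost = 90_000
qaly_gained = {1: 0.10, 2: 0.25, 5: 0.65}   # years since surgery -> cumulative QALYs gained
THRESHOLD = 150_000                          # willingness-to-pay per QALY

for year, qalys in qaly_gained.items():
    cost_per_qaly = surgery_cost / qalys
    verdict = "cost-effective" if cost_per_qaly <= THRESHOLD else "not yet cost-effective"
    print(f"Year {year}: ${cost_per_qaly:,.0f} per QALY -> {verdict}")
```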

16.
Health Technol Assess ; 28(10): 1-213, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38477237

ABSTRACT

Background: The indications for septoplasty are practice-based, rather than evidence-based. In addition, internationally accepted guidelines for the management of nasal obstruction associated with nasal septal deviation are lacking. Objective: The objective was to determine the clinical effectiveness and cost-effectiveness of septoplasty, with or without turbinate reduction, compared with medical management, in the management of nasal obstruction associated with a deviated nasal septum. Design: This was a multicentre randomised controlled trial comparing septoplasty, with or without turbinate reduction, with defined medical management; it incorporated a mixed-methods process evaluation and an economic evaluation. Setting: The trial was set in 17 NHS secondary care hospitals in the UK. Participants: A total of 378 eligible participants aged > 18 years were recruited. Interventions: Participants were randomised on a 1:1 basis, stratified by baseline severity and gender, to either (1) septoplasty, with or without turbinate surgery (n = 188) or (2) medical management with intranasal steroid spray and saline spray (n = 190). Main outcome measures: The primary outcome was the Sino-nasal Outcome Test-22 items score at 6 months (patient-reported outcome). The secondary outcomes were as follows: patient-reported outcomes - Nasal Obstruction Symptom Evaluation score at 6 and 12 months, Sino-nasal Outcome Test-22 items subscales at 12 months, Double Ordinal Airway Subjective Scale at 6 and 12 months, the Short Form questionnaire-36 items and costs; objective measurements - peak nasal inspiratory flow and rhinospirometry. The number of adverse events experienced was also recorded. A within-trial economic evaluation from an NHS and Personal Social Services perspective estimated the incremental cost per (1) improvement (of ≥ 9 points) in Sino-nasal Outcome Test-22 items score, (2) adverse event avoided and (3) quality-adjusted life-year gained at 12 months. An economic model estimated the incremental cost per quality-adjusted life-year gained at 24 and 36 months. A mixed-methods process evaluation was undertaken to understand and address recruitment issues and to examine the acceptability of trial processes and treatment arms. Results: At the 6-month time point, 307 participants provided primary outcome data (septoplasty, n = 152; medical management, n = 155). An intention-to-treat analysis revealed a greater and more sustained improvement in the primary outcome measure in the surgical arm. The 6-month mean Sino-nasal Outcome Test-22 items score was 20.0 points lower (better) for participants randomised to septoplasty than for those randomised to medical management [19.9 in the septoplasty arm versus 39.5 in the medical management arm (95% confidence interval -23.6 to -16.4; p < 0.0001)]. This was confirmed by sensitivity analyses and by the analysis of secondary outcomes. Outcomes were statistically significantly related to baseline severity, but not to gender or turbinate reduction. In the surgical and medical management arms, 132 and 95 adverse events occurred, respectively; 14 serious adverse events occurred in the surgical arm and nine in the medical management arm. On average, septoplasty was more costly and more effective in improving Sino-nasal Outcome Test-22 items scores and quality-adjusted life-years than medical management, but incurred a larger number of adverse events. Septoplasty had a 15% probability of being considered cost-effective at 12 months at a £20,000 willingness-to-pay threshold for an additional quality-adjusted life-year. This probability increased to 99% and 100% at 24 and 36 months, respectively. Limitations: COVID-19 had an impact on participant-facing data collection from March 2020. Conclusions: Septoplasty, with or without turbinate reduction, is more effective than medical management with a nasal steroid and saline spray. Baseline severity predicts the degree of improvement in symptoms. Septoplasty has a low probability of cost-effectiveness at 12 months, but may be considered cost-effective at 24 months. Future work should focus on developing a septoplasty patient decision aid. Trial registration: This trial is registered as ISRCTN16168569 and EudraCT 2017-000893-12. Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: 14/226/07) and is published in full in Health Technology Assessment; Vol. 28, No. 10. See the NIHR Funding and Awards website for further award information.
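The probability of cost-effectiveness quoted above is typically read off a cost-effectiveness acceptability curve: the share of simulated incremental cost and QALY pairs with a positive incremental net monetary benefit at the willingness-to-pay threshold. The distributions below are hypothetical, not NAIROS trial data.

```python
# Sketch of a probability of cost-effectiveness at a willingness-to-pay
# threshold, via incremental net monetary benefit over simulated draws.
import numpy as np

rng = np.random.default_rng(7)
wtp = 20_000  # GBP per QALY

# hypothetical uncertainty distributions for incremental cost and QALYs
inc_cost = rng.normal(1_500, 400, 10_000)    # surgery costs more on average
inc_qaly = rng.normal(0.05, 0.04, 10_000)    # small, uncertain QALY gain

nmb = wtp * inc_qaly - inc_cost              # incremental net monetary benefit per draw
prob_cost_effective = (nmb > 0).mean()
print(f"P(cost-effective at GBP {wtp:,}/QALY) = {prob_cost_effective:.0%}")
```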


Septoplasty is an operation to straighten the septum, the partition wall between the nostrils inside the nose. Septoplasty can be used as a treatment for people who have a bent septum and symptoms of a blocked nose, such as difficulty sleeping and exercising. Medical management (a saltwater spray to clear the nose followed by a nose steroid spray) is an alternative treatment to septoplasty. The Nasal AIRway Obstruction Study (NAIROS) aimed to find out whether septoplasty or medical management is a better treatment for people with a bent septum and symptoms of a blocked nose. We recruited 378 patients with at least moderately severe nose symptoms from 17 hospitals in England, Scotland and Wales to take part in the NAIROS. Participants were randomly put into one of two groups: septoplasty or medical management. Participants' nose symptoms were measured both when they joined the study and after 6 months, using a questionnaire called the Sino-nasal Outcome Test-22 items. This questionnaire was chosen because patients reported that it included symptoms that were important to them. Other studies have shown that a 9-point change in the Sino-nasal Outcome Test-22 items score is significant. After 6 months, on average, people in the septoplasty group improved by 25 points, whereas people in the medical management group improved by 5 points. We saw improvement after septoplasty among patients with moderate symptoms and among those with severe symptoms. Most patients who we spoke to after a septoplasty were happy with their treatment, but some would have liked more information about what to expect after their nose surgery. In the short term, septoplasty is more costly than medical management. However, over the longer term, when all the costs and benefits of treatment are taken into account, septoplasty would be considered good value for money for the NHS.


Subject(s)
Nasal Obstruction; Adult; Humans; Nasal Obstruction/diagnosis; Nasal Obstruction/surgery; Treatment Outcome; Surveys and Questionnaires; Cost-Benefit Analysis; Nasal Septum/surgery; Steroids; Quality of Life
18.
Front Immunol ; 15: 1286352, 2024.
Article in English | MEDLINE | ID: mdl-38515744

ABSTRACT

The world's largest extant carnivorous marsupial, the Tasmanian devil, is challenged by Devil Facial Tumor Disease (DFTD), a fatal, clonally transmitted cancer. In two decades, DFTD has spread across 95% of the species' distributional range. A previous study has shown that factors such as season, geographic location, and infection with DFTD can impact the expression of immune genes in Tasmanian devils. To date, no study has investigated within-individual immune gene expression changes prior to and throughout the course of DFTD infection. To explore possible changes in immune response, we investigated four locations across Tasmania that differed in DFTD exposure history, ranging between 2 and >30 years. Our study demonstrated considerable complexity in the immune responses to DFTD. The same factors (sex, age, season, location and DFTD infection) affected immune gene expression both across and within devils, although seasonal and location-specific variations were diminished in DFTD-affected devils. We also found that expression of both adaptive and innate immune genes starts to alter early in DFTD infection and continues to change as DFTD progresses. A novel finding was that lower expression of the immune genes MHC-II, NKG2D and CD8 may predict susceptibility to earlier DFTD infection. A case study of a single devil with a regressed tumor showed contrasting immune gene expression patterns compared to the general trends observed across devils with DFTD infection. Our study highlights the complexity of DFTD's interactions with the host immune system and the need for long-term studies to fully understand how DFTD alters the evolutionary trajectory of devil immunity.


Subject(s)
Daunorubicin/analogs & derivatives; Facial Neoplasms; Marsupials; Animals; Facial Neoplasms/genetics; Facial Neoplasms/veterinary; Immune System/pathology; Gene Expression; Marsupials/genetics
19.
Neurosurgery ; 2024 Mar 29.
Article in English | MEDLINE | ID: mdl-38551355

ABSTRACT

BACKGROUND AND OBJECTIVES: Nearly 30% of older adults presenting with isolated spine fractures will die within 1 year. Attempts to ameliorate this alarming statistic are hindered by our inability to identify relevant risk factors. The primary objective of this study was to develop a prediction model that identifies feasible targets to limit 1-year mortality. METHODS: This retrospective cohort study included 703 older adults (65 years or older) admitted to a level I trauma center with isolated spine fractures, without neural deficit, from January 2013 to January 2018. Multivariable analysis was used to select independently significant patient demographics, frailty variables, injury metrics, and management decisions to incorporate into distinct logistic regression models predicting 1-year mortality. Variables were considered significant if P < .05. RESULTS: Of the 703 older adults, 199 (28.3%) died after hospital discharge but within 1 year of the index trauma. Risk Analysis Index (RAI; odds ratio [OR]: 1.116; 95% CI: 1.087-1.149; P < .001) and ambulation requiring a cane (OR: 2.601; 95% CI: 1.151-5.799; P = .02) or walker (OR: 4.942; 95% CI: 2.698-9.196; P < .001), ie, frailty variables, were associated with increased odds of 1-year mortality. Spine trauma scales were not associated with 1-year mortality. Longer hospital stays (OR: 1.112; 95% CI: 1.034-1.196; P = .004) and nursing home discharge (OR: 3.881; 95% CI: 2.070-7.378; P < .001) were associated with increased odds, while discharge to rehab (OR: 0.361; 95% CI: 0.155-0.799; P = .014) decreased 1-year mortality odds. A "preinjury" regression model incorporating the Risk Analysis Index and ambulation status resulted in an area under the receiver operating characteristic curve (AUROCC) of 0.914 (95% CI: 0.863-0.965). A "postinjury" model incorporating Glasgow Coma Scale, hospital stay duration, and discharge disposition resulted in an AUROCC of 0.746 (95% CI: 0.642-0.849). Combining elements of the preinjury and postinjury models into an "integrated model" produced an AUROCC of 0.908 (95% CI: 0.852-0.965). CONCLUSION: Preinjury frailty measures are most strongly associated with 1-year mortality outcomes in older adults with isolated spine fractures. Incorporating injury metrics or management decisions did not enhance predictive accuracy. Further work is needed to understand how targeting frailty may reduce mortality.
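The modelling approach above, logistic regression on frailty variables summarised by the area under the ROC curve, can be sketched on synthetic data as follows; the predictors and effect sizes are illustrative stand-ins, not the study dataset.

```python
# Sketch of a logistic regression predicting 1-year mortality from frailty
# variables, summarised by AUROC. All data below are simulated stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 703
frailty = rng.normal(25, 10, n)            # hypothetical frailty score (RAI-like)
walker = rng.integers(0, 2, n)             # ambulation requiring a walker (0/1)
logit = -6 + 0.12 * frailty + 1.4 * walker # assumed effect sizes for simulation
died_1yr = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([frailty, walker])
model = LogisticRegression().fit(X, died_1yr)
auroc = roc_auc_score(died_1yr, model.predict_proba(X)[:, 1])
print(f"OR per frailty point: {np.exp(model.coef_[0][0]):.2f}, AUROC = {auroc:.2f}")
```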

20.
Spine (Phila Pa 1976) ; 49(11): 743-751, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38375611

ABSTRACT

STUDY DESIGN: Retrospective review of prospectively collected data. OBJECTIVE: To investigate the effect of lower extremity osteoarthritis on sagittal alignment and compensatory mechanisms in adult spinal deformity (ASD). BACKGROUND: Spine, hip, and knee pathologies often overlap in ASD patients. Limited data exist on how lower extremity osteoarthritis impacts sagittal alignment and compensatory mechanisms in ASD. PATIENTS AND METHODS: In total, 527 preoperative ASD patients with full-body radiographs were included. Patients were grouped by Kellgren-Lawrence grade of the bilateral hips and knees and stratified by quartile of T1-pelvic angle (T1PA) severity into low-, mid-, high-, and severe-T1PA groups. Full-body alignment and compensation were compared across quartiles. Regression analysis examined the incremental impact of hip and knee osteoarthritis severity on compensation. RESULTS: The mean T1PA for the low-, mid-, high-, and severe-T1PA groups was 7.3°, 19.5°, 27.8°, and 41.6°, respectively. Mid-T1PA patients with severe hip osteoarthritis had an increased sagittal vertical axis and global sagittal alignment (P < 0.001). Increasing hip osteoarthritis severity resulted in decreased pelvic tilt (P = 0.001) and sacrofemoral angle (P < 0.001), but increased knee flexion (P = 0.012). Regression analysis revealed that with increasing T1PA, pelvic tilt correlated inversely with hip osteoarthritis and positively with knee osteoarthritis (r2 = 0.812). Hip osteoarthritis decreased compensation through the sacrofemoral angle (ß-coefficient = -0.206). Knee and hip osteoarthritis contributed to greater knee flexion (ß-coefficients = 0.215 and 0.101, respectively). For pelvic shift, only hip osteoarthritis contributed significantly to the model (ß-coefficient = 0.100). CONCLUSIONS: For the same magnitude of spinal deformity, increased hip osteoarthritis severity was associated with worse truncal and full-body alignment, with posterior translation of the pelvis. Patients with severe hip and knee osteoarthritis exhibited decreased hip extension and pelvic tilt but increased knee flexion. This study examines sagittal alignment and compensation in ASD patients with hip and knee arthritis and may help delineate whether hip and knee flexion is due to spinal deformity compensation or to lower extremity osteoarthritis.


Subject(s)
Osteoarthritis, Hip; Osteoarthritis, Knee; Humans; Male; Female; Osteoarthritis, Knee/diagnostic imaging; Osteoarthritis, Knee/physiopathology; Osteoarthritis, Knee/surgery; Middle Aged; Osteoarthritis, Hip/diagnostic imaging; Osteoarthritis, Hip/physiopathology; Aged; Retrospective Studies; Adult; Spinal Curvatures/diagnostic imaging; Spinal Curvatures/physiopathology; Radiography