ABSTRACT
OBJECTIVE: To evaluate the quality of operative reports for endometriosis surgeries performed by fellowship-trained, high-volume endometriosis surgeons. METHODS: In this retrospective review, 5 consecutive deidentified surgical reports per surgeon were evaluated by two reviewers. Each dictation was assigned a quality score (between 0 and 28) based on the number of components from the American Association of Gynecologic Laparoscopists (AAGL) classification system that were documented. The primary outcome was the proportion of reports for which the AAGL 2021 endometriosis stage could be assigned. Secondary outcomes included median dictation quality scores, the proportion of dictations for fertility-preserving cases for which an Endometriosis Fertility Index (EFI) score could be assigned, individual quality score components, and quality score variation between surgeons, institutions, and reporting methods. RESULTS: A total of 82 operative reports were reviewed from 16 surgeons across 7 sites in Ontario. AAGL stage could be assigned in 48/82 (59%) of cases, and EFI score could be assigned in 31/45 (69%) of fertility-preserving cases. The median quality score was 57% (range 18%-86%). Only 13% of operative reports included a comment on residual disease. Quality score consistency between reports for a given surgeon was poor (ICC = 0.22, 95% CI 0.03-0.49). Quality scores differed significantly between surgeons (chi-square = 30.6, df = 16, P = .015) and institutions (chi-square = 19.59, df = 7, P = .007). Operative report quality score did not differ based on completion by trainee or staff, template use, or whether the report was completed by telephone or typed. CONCLUSION: There is significant variability and inconsistency in endometriosis surgery documentation. Surgical documentation for endometriosis surgeries needs to be standardized to enhance communication and, ultimately, patient care.
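For readers unfamiliar with the consistency statistic reported above, a one-way random-effects intraclass correlation can be computed from per-surgeon report scores roughly as follows. This is a minimal sketch with invented scores, not the study's data, and the study may have used a different ICC formulation.

```python
import numpy as np
import pandas as pd

# Hypothetical quality scores (0-28) for 5 reports from each of 3 surgeons.
scores = pd.DataFrame({
    "surgeon": ["A"] * 5 + ["B"] * 5 + ["C"] * 5,
    "score":   [16, 20, 9, 22, 14, 18, 17, 21, 19, 15, 8, 24, 12, 10, 19],
})

groups = [g["score"].to_numpy() for _, g in scores.groupby("surgeon")]
k = np.mean([len(g) for g in groups])          # average reports per surgeon
grand_mean = scores["score"].mean()

# One-way ANOVA mean squares: between surgeons and within surgeons.
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
ms_between = ss_between / (len(groups) - 1)
ms_within = ss_within / (len(scores) - len(groups))

# ICC(1,1): consistency of single reports within a surgeon.
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1,1) = {icc:.2f}")
```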
ABSTRACT
BACKGROUND: Exposure to air pollution post-lung transplant has been shown to decrease graft and patient survival. This study examines the impact of air pollution exposure in the first 3 months post-transplant on the baseline (i.e., highest) forced expiratory volume in 1 second (FEV1) achieved and on the development of chronic lung allograft dysfunction (CLAD). METHODS: Double-lung transplant recipients (n = 82) were prospectively enrolled for comprehensive indoor and personal environmental monitoring at 6 and 12 weeks post-transplant and followed for >4 years. Associations between clinical and exposure variables were investigated using an exposomics approach followed by analysis with a Cox proportional hazards model. Multivariable analyses were used to examine the impact of air pollution on baseline % predicted FEV1 (defined as the average of the 2 highest values post-transplant) and risk of CLAD. RESULTS: Multivariable analysis revealed a significant inverse relationship between personal black carbon (BC) levels and baseline % FEV1. The multivariable model indicated that patients with higher-than-median exposure to BC (>350 ng/m3) attained a baseline % FEV1 that was 8.8% lower than that of patients with lower-than-median BC exposure (p = 0.019). Cox proportional hazards analysis revealed that patients with high personal BC exposure had a 2.4 times higher hazard of CLAD than patients with low BC exposure (p = 0.045). CONCLUSIONS: Higher personal BC levels during the first 3 months post-transplant decrease baseline FEV1 and double the risk of CLAD. Strategies to reduce BC exposure early after lung transplantation may help improve lung function and long-term outcomes.
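A minimal sketch of how a dichotomized exposure such as higher-than-median black carbon can be related to time to CLAD with a Cox proportional hazards model. The column names and simulated values are assumptions for illustration, not the study's dataset or its exact model specification.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 60

# Simulated recipients: high_bc = personal black carbon above the median
# (>350 ng/m3) during the first 3 months post-transplant (assumed variable names).
high_bc = rng.integers(0, 2, n)
age = rng.normal(55, 10, n)
# Higher-exposure patients get shorter times to CLAD on average in this toy data.
time_to_clad = rng.exponential(scale=np.where(high_bc == 1, 3.0, 6.0))
observed = (time_to_clad < 5.0).astype(int)          # administrative censoring at 5 years
time_to_clad = np.minimum(time_to_clad, 5.0)

df = pd.DataFrame({"high_bc": high_bc, "age": age,
                   "years": time_to_clad, "clad": observed})

# Multivariable Cox proportional hazards model; exp(coef) for high_bc is the
# hazard ratio analogous to the ~2.4x estimate reported in the abstract.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="clad")
cph.print_summary()
```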
ABSTRACT
Introduction: Prior studies assessing outcomes of lung transplants from cigarette-smoking donors have found mixed results. Oscillometry, a non-invasive test of respiratory impedance, detects changes in the lung function of smokers prior to a diagnosis of COPD and identifies spirometrically silent episodes of rejection post-transplant. We hypothesise that oscillometry could identify abnormalities in recipients of smoking-donor lungs and discriminate them from recipients of non-smoking-donor lungs. Methods: This prospective single-centre cohort study analysed 233 double-lung recipients. Oscillometry was performed alongside routine conventional pulmonary function tests (PFT) post-transplant. Multivariable regression models were constructed to compare oscillometry and conventional PFT parameters between recipients of lungs from smoking versus non-smoking donors. Results: The analysis included 109 patients who received lungs from non-smokers and 124 who received lungs from smokers. Multivariable analysis identified significant differences between recipients of smoking and non-smoking lungs in the oscillometric measurements R5-19, X5, AX, R5z and X5z, but no differences in %predicted FEV1, FEV1/FVC, %predicted TLC or %predicted DLCO. An analysis of the smoking group also demonstrated associations between increasing smoke exposure, quantified in pack-years, and all of the oscillometry parameters, but not the conventional PFT parameters. Conclusion: An interaction was identified between donor-recipient sex match and the effect of smoking: the association between donor smoking and oscillometry outcomes was significant predominantly in the female donor/female recipient group.
ABSTRACT
Background: Longitudinal data on the detectability of monkeypox virus (MPXV) genetic material in different specimen types are scarce. Methods: We describe MPXV-specific polymerase chain reaction (PCR) results from adults with confirmed mpox infection from Toronto, Canada, including a cohort undergoing weekly collection of specimens from multiple anatomic sites until 1 week after skin lesions had fully healed. We quantified the time from symptom onset to resolution of detectable viral DNA (computed tomography [Ct] ≥ 35) by modeling exponential decay in Ct value as a function of illness day for each site, censoring at the time of tecovirimat initiation. Results: Among 64 men who have sex with men, the median (interquartile range [IQR]) age was 39 (32.75-45.25) years, and 49% had HIV. Twenty received tecovirimat. Viral DNA was detectable (Ct < 35) at baseline in 74% of genital/buttock/perianal skin swabs, 56% of other skin swabs, 44% of rectal swabs, 37% of throat swabs, 27% of urine, 26% of nasopharyngeal swabs, and 8% of semen samples. The median time to resolution of detectable DNA (IQR) was longest for genital/buttock/perianal skin and other skin swabs at 30.0 (23.0-47.9) and 22.4 (16.6-29.4) days, respectively, and shortest for nasopharyngeal swabs and semen at 0 (0-12.1) and 0 (0-0) days, respectively. We did not observe an effect of tecovirimat on the rate of decay in viral DNA detectability in any specimen type (all P > .05). Conclusions: MPXV DNA detectability varies by specimen type and persists for over 3-4 weeks in skin specimens. The rate of decay did not differ by tecovirimat use in this nonrandomized study.
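As an illustration of the time-to-resolution idea, one simple approach (an assumption for this sketch, not necessarily the authors' exact censored decay model) is to fit Ct against illness day for a given specimen type and solve for the day the fitted line crosses the Ct = 35 detectability threshold.

```python
import numpy as np

# Hypothetical serial Ct values from one anatomic site (e.g., skin swabs),
# collected on the given illness days; values are illustrative only.
illness_day = np.array([3, 7, 10, 14, 17, 21, 24, 28])
ct_value    = np.array([19.0, 22.5, 24.0, 27.5, 29.0, 31.5, 33.0, 34.5])

# Fit Ct as a linear function of illness day (viral DNA load decays roughly
# exponentially, so Ct -- a log-scale quantity -- rises roughly linearly).
slope, intercept = np.polyfit(illness_day, ct_value, deg=1)

# Day at which the fitted line reaches the detectability threshold Ct = 35.
threshold = 35.0
day_resolved = (threshold - intercept) / slope
print(f"Estimated resolution of detectable DNA around day {day_resolved:.1f}")
```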
ABSTRACT
OBJECTIVE: To evaluate the effect of hormonal suppression of endometriosis on the size of endometriotic ovarian cysts. DATA SOURCES: The authors searched MEDLINE, PubMed, Cochrane Central Register of Controlled Trials, Embase, and ClinicalTrials.gov from January 2012 to December 2022. METHODS OF STUDY SELECTION: The authors included studies of premenopausal women undergoing hormonal treatment of endometriosis for ≥3 months and excluded studies involving surgical intervention in the follow-up period and those using hormones to prevent endometrioma recurrence after endometriosis surgery. Risk of bias was assessed with the Newcastle-Ottawa Scale and the Cochrane Risk of Bias Tool. The protocol was registered in PROSPERO (CRD42022385612). TABULATION, INTEGRATION, AND RESULTS: The primary outcome was the mean change in endometrioma volume, expressed as a percentage, from baseline to at least 6 months. Secondary outcomes were the change in volume at 3 months and analyses by class of hormonal therapy. The authors included 16 studies (15 cohort studies, 1 randomized controlled trial) of 888 patients treated with dienogest (7 studies), other progestins (4), combined hormonal contraceptives (2), and other suppressive therapy (3). Globally, the decrease in endometrioma volume became statistically significant at 6 months, with a mean reduction of 55% (95% confidence interval, -40 to -71; 18 treatment groups; 730 patients; p <.001; I2 = 96%). The reduction was greatest with dienogest and with norethindrone acetate plus letrozole, followed by relugolix and leuprolide acetate. The volume reduction was not statistically significant with combined hormonal contraceptives or other progestins. Heterogeneity was high, and studies were at risk of selection bias. CONCLUSION: Hormonal suppression can substantially reduce endometrioma size, but there is uncertainty in the exact reduction patients may experience.
Subjects
Endometriosis, Humans, Female, Endometriosis/drug therapy, Endometriosis/surgery, Endometriosis/pathology, Nandrolone/analogs & derivatives, Nandrolone/therapeutic use, Ovarian Diseases/drug therapy, Ovarian Diseases/surgery, Ovarian Diseases/pathology, Leuprolide/therapeutic use, Letrozole/therapeutic use, Ovarian Cysts/drug therapy, Ovarian Cysts/surgery, Treatment Outcome
ABSTRACT
Introduction: Variable transplant-related knowledge may contribute to inequitable access to living donor kidney transplant (LDKT). We compared transplant-related knowledge between African, Caribbean, and Black (ACB) and White Canadian patients with kidney failure using the Knowledge Assessment of Renal Transplantation (KART) questionnaire. Methods: This was a cross-sectional study of a convenience sample of adults with kidney failure in Toronto. Participants also answered an exploratory question about their distrust in the kidney allocation system, and clinical characteristics were abstracted from medical records. The potential contribution of distrust to differences in transplant knowledge was assessed in a mediation analysis. Results: Among 577 participants (mean [SD] age 57 [14] years, 63% male), 25% were ACB and 43% were White Canadians. Overall, 45% of ACB versus 26% of White participants scored in the lowest tertile of the KART score. After multivariable adjustment, the relative risk ratio for being in the lowest tertile for ACB compared with White participants was 2.22 (95% confidence interval [CI]: 1.11, 4.43). About half of the difference in the knowledge score between ACB and White patients was mediated by distrust in the kidney allocation system. Conclusion: Participants with kidney failure from ACB communities have less transplant-related knowledge than White participants, and distrust potentially contributes to this difference.
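A minimal difference-in-coefficients sketch of how a mediation estimate like "about half of the difference is mediated by distrust" can be obtained. The simulated variables and the continuous-outcome simplification are assumptions for illustration; the study's outcome was categorical and its mediation method may have differed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Simulated data loosely mimicking the setting: group (1 = ACB, 0 = White),
# distrust score (mediator), and transplant knowledge score (outcome).
group = rng.integers(0, 2, n)
distrust = 2 + 1.5 * group + rng.normal(0, 1, n)
knowledge = 20 - 2.0 * distrust - 1.0 * group + rng.normal(0, 2, n)
df = pd.DataFrame({"group": group, "distrust": distrust, "knowledge": knowledge})

# Total effect of group on knowledge, then the direct effect adjusting for the mediator.
total = smf.ols("knowledge ~ group", data=df).fit().params["group"]
direct = smf.ols("knowledge ~ group + distrust", data=df).fit().params["group"]

# Proportion of the group difference explained (mediated) by distrust.
prop_mediated = (total - direct) / total
print(f"Proportion mediated ~ {prop_mediated:.0%}")
```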
ABSTRACT
INTRODUCTION: The aim of our study was to assess the initial impact of COVID-19 on total publicly funded direct healthcare costs and health services use in two Canadian provinces, Ontario and British Columbia (BC). METHODS: This retrospective repeated cross-sectional study used population-based administrative datasets, linked within each province, from January 1, 2018 to December 27, 2020. Interrupted time series analysis was used to estimate changes in the level and trend of weekly resource use and costs, with March 16-22, 2020 as the first pandemic week. In addition, for each week of 2020, we identified cases with their first positive SARS-CoV-2 test and estimated their healthcare costs until death or December 27, 2020. RESULTS: The resources with the largest level declines (95% confidence interval) in use in the first pandemic week compared with the previous week were physician services [Ontario: -43% (-49%, -37%); BC: -24% (-30%, -19%) (both p<0.001)] and emergency department visits [Ontario: -41% (-47%, -35%); BC: -29% (-35%, -23%) (both p<0.001)]. Hospital admissions declined by 27% (-32%, -23%) in Ontario and 21% (-26%, -16%) in BC (both p<0.001). Resource use subsequently rose but did not return to pre-pandemic levels. Only home care and dialysis clinic visits did not decrease significantly compared with pre-pandemic levels. Costs for COVID-19 cases represented 1.3% and 0.7% of total direct healthcare costs in 2020 in Ontario and BC, respectively. CONCLUSIONS: The reduction in healthcare utilization in the overall population outweighed the utilization attributable to COVID-19 patients in 2020. Meeting the needs of all patients across all services is essential to maintain resilient healthcare systems.
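The interrupted time series estimates above come from segmented regression with level-change and trend-change terms at the interruption. A minimal sketch on simulated weekly counts follows; the variable names and data are illustrative assumptions, not the study's models.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical weekly visit counts; week 0 is the first pandemic week
# (March 16-22, 2020), negative weeks are pre-pandemic.
weeks = np.arange(-20, 21)
rng = np.random.default_rng(1)
visits = (1000 + 2 * weeks
          + np.where(weeks >= 0, -400 + 8 * weeks, 0)
          + rng.normal(0, 20, weeks.size))

df = pd.DataFrame({
    "time": np.arange(weeks.size),            # overall linear trend
    "post": (weeks >= 0).astype(int),         # level change at the pandemic onset
    "time_after": np.clip(weeks, 0, None),    # trend change after onset
    "visits": visits,
})

# Segmented (interrupted time series) regression: level and slope shifts at the interruption.
model = smf.ols("visits ~ time + post + time_after", data=df).fit()
print(model.params[["post", "time_after"]])   # immediate drop and post-onset trend change
```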
Subjects
COVID-19, Pandemics, Humans, Interrupted Time Series Analysis, Cross-Sectional Studies, Retrospective Studies, COVID-19/epidemiology, SARS-CoV-2, Renal Dialysis, British Columbia, Health Care Costs
ABSTRACT
INTRODUCTION: Total knee arthroplasty is associated with significant postoperative pain. Continuous adductor canal blocks via an inserted adductor canal catheter are effective analgesic interventions, with the advantages of less quadriceps weakness and the potential to extend the analgesic effect. The classical adductor canal catheter insertion technique may have a high likelihood of catheter dislodgement out of the canal. The interfascial plane between the sartorius muscle and femoral artery (ISAFE) approach has the potential to decrease adductor canal catheter migration. The purpose of this study was to compare the incidence of catheter dislodgement outside the adductor canal between the ISAFE and classical approaches. We hypothesized that the ISAFE approach would result in a lower dislodgement rate. METHODS: Ninety-seven patients undergoing unilateral total knee arthroplasty were included and randomized to either the ISAFE intervention group or the conventional group. The primary outcome was the incidence of adductor canal catheter dislodgement outside the adductor canal on ultrasound evaluation 24 hours after surgery. Secondary outcomes were pain scores, opioid consumption, and continuous adductor canal block-related complications for the first 48 hours after surgery. RESULTS: Catheters placed using the ISAFE approach had a lower rate of dislodgement at 24 hours after surgery than those in the control group (18.6% vs 44.9%, p=0.01), and the ISAFE group had lower resting pain scores on the first two postoperative days. CONCLUSIONS: The ISAFE group had a significantly lower rate of dislodgement at 24 hours. The analgesic benefit of a continuous adductor canal block for knee arthroplasty depends on the position of the catheter tip inside the adductor canal.
ABSTRACT
BACKGROUND: Approximately 20% of patients who are discharged from hospital after an acute exacerbation of COPD (AECOPD) are readmitted within 30 days. To reduce this, it is important both to identify all individuals admitted with AECOPD and to predict those who are at higher risk of readmission. OBJECTIVES: To develop two clinical prediction models using data available in electronic medical records: 1) identifying patients admitted with AECOPD and 2) predicting 30-day readmission in patients discharged after AECOPD. METHODS: Two datasets were created using all admissions to General Internal Medicine from 2012 to 2018 at two hospitals: one cohort to identify AECOPD and a second cohort to predict 30-day readmissions. We fit and internally validated models with four algorithms. RESULTS: Of the 64,609 admissions, 3,620 (5.6%) were diagnosed with an AECOPD. Of those discharged, 518 (15.4%) were readmitted to hospital within 30 days. For identification of patients with a diagnosis of AECOPD, the top-performing models were LASSO and a four-variable regression model based on specific medications ordered within the first 72 hours of admission. For 30-day readmission prediction, the top-performing model was a two-variable regression model consisting of the number of COPD admissions and the number of non-COPD admissions in the previous year. CONCLUSION: We generated clinical prediction models to identify AECOPD during hospitalization and to predict 30-day readmission after an acute exacerbation from a dataset derived from available EMR data. Further work is needed to improve and externally validate these models.
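A minimal sketch of one of the modeling approaches described, an L1-penalised (LASSO) logistic regression internally validated with cross-validated AUC, using simulated stand-in features rather than the actual EMR variables or the study's exact algorithms.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n = 2000

# Simulated admission-level features (stand-ins for medication orders and prior
# admission counts); the outcome is 1 = AECOPD (or 30-day readmission).
X = rng.normal(size=(n, 6))
logits = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2] - 2.5
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

# L1-penalised (LASSO) logistic regression with 5-fold cross-validated AUC
# as a simple form of internal validation.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.5),
)
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```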
Subjects
Patient Readmission, Chronic Obstructive Pulmonary Disease, Humans, Chronic Obstructive Pulmonary Disease/epidemiology, Chronic Obstructive Pulmonary Disease/therapy, Chronic Obstructive Pulmonary Disease/diagnosis, Retrospective Studies, Electronic Health Records, Risk Factors, Hospitalization, Hospitals, Disease Progression
ABSTRACT
BACKGROUND: The severity of sleep-disordered breathing is known to worsen postoperatively and is associated with increased cardiopulmonary complications and greater resource use. In the general population, the semi-upright position has been used in the management of OSA. We hypothesized that use of a semi-upright position versus a non-elevated position would reduce postoperative worsening of OSA in patients undergoing non-cardiac surgeries. METHODS: This was a prospective randomized controlled trial of patients undergoing elective non-cardiac inpatient surgeries. Patients underwent a preoperative sleep study using a portable polysomnography device. Patients with OSA (apnea-hypopnea index [AHI] > 5 events/hr) underwent a sleep study on postoperative night 2 (N2) after being randomized to an intervention group (Group I: semi-upright position, 30 to 45 degrees incline) or a control group (Group C: zero degrees from horizontal). The primary outcome was postoperative AHI on N2. The secondary outcomes were the obstructive apnea index (OAI), central apnea index (CAI), hypopnea index (HI), obstructive apnea hypopnea index (OAHI), and oxygenation parameters. RESULTS: Thirty-five patients were included: 21 in Group I (14 females [67%]; mean age 65 ± 12 years) and 14 in Group C (5 females [36%]; mean age 63 ± 10 years). The semi-upright position resulted in a significant reduction in OAI in the intervention arm (Group C vs Group I postoperative OAI: 16.6 ± 19.0 vs 8.6 ± 11.2 events/hr; overall p = 0.01), but there were no significant differences in the overall AHI or other parameters between the two groups. Subgroup analysis of patients with supine-related OSA revealed a decreasing trend in postoperative AHI with the semi-upright position, but the sample size was too small to evaluate statistical significance. CONCLUSION: In patients with newly diagnosed OSA, the semi-upright position resulted in improvement in obstructive apneas, but not in the overall AHI. TRIAL REGISTRATION: This trial was retrospectively registered at clinicaltrials.gov (NCT02152202) on 02/06/2014.
Assuntos
Obstrução das Vias Respiratórias , Síndromes da Apneia do Sono , Apneia Obstrutiva do Sono , Feminino , Humanos , Pessoa de Meia-Idade , Idoso , Estudos Prospectivos , Apneia Obstrutiva do Sono/diagnóstico , Polissonografia/efeitos adversos , Polissonografia/métodos , Obstrução das Vias Respiratórias/complicaçõesRESUMO
PURPOSE: Thoracic epidural analgesia (TEA) is a well-established technique for pain management in major thoracic and abdominal surgeries; however, it has considerable failure rates. Local anesthetic (LA) administration and subsequent assessment of the sensory block through physical examination (e.g., decreased temperature perception determined via an LA temperature dissociation test [LATDT]) have been the historical standard for evaluation of thoracic epidural placement. Nevertheless, newer methods to objectively evaluate successful placement have recently been developed, e.g., the epidural electrical stimulation test (EEST) and epidural pressure waveform analysis (EWA). The purpose of this study was to evaluate the effectiveness of preoperative TEA catheter testing (LATDT, EEST, and EWA) in reducing TEA failure. METHODS: After obtaining institutional research ethics board approval for a retrospective study, we conducted a single-institution retrospective review of all TEAs performed between January 2016 and December 2021. Patients were assigned to one of four groups based on the method used to verify placement of the TEA catheter: no test, LATDT, EEST, or EWA. A TEA was deemed successful if it provided a bilateral dermatomal sensory block to an ice test in the postoperative period and was used for patient analgesia for at least 24 hr. RESULTS: One thousand two hundred forty-one patients who received a preoperative TEA were included; twenty-eight patients were excluded. Tested and untested epidurals had failure rates of 3.8% (95% confidence interval [CI], 1.8 to 6.2) and 11.5% (95% CI, 5.2 to 17.1), respectively (P < 0.001). CONCLUSION: Objective preoperative testing after placement of thoracic epidurals was associated with a reduction in failure rates.
ABSTRACT
OBJECTIVE: Planned hysterectomy at the time of cesarean delivery may be reasonable in cases other than placenta accreta spectrum disorders. Our objective was to synthesize the published literature on the indications and outcomes of planned cesarean hysterectomy. DATA SOURCES: We performed a systematic review of published literature from the following databases from inception (1946) to June 2021: MEDLINE, PubMed, EMBASE, Cochrane CENTRAL, DARE, and clinicaltrials.gov. STUDY SELECTION: We included all study designs in which subjects underwent planned cesarean delivery with simultaneous hysterectomy. Emergency procedures and those performed for placenta accreta spectrum disorders were excluded. DATA EXTRACTION AND SYNTHESIS: The primary outcome was surgical indication, though other surgical outcomes were evaluated when data permitted. Quantitative analysis was limited to studies published in 1990 or later. Risk of bias was assessed using an adaptation of the ROBINS-I tool. CONCLUSION: The most common indication for planned cesarean hysterectomy was malignancy, with cervical cancer being the most frequent. Other indications included permanent contraception, uterine fibroids, menstrual disorders, and chronic pelvic pain. Common complications included bleeding, infection, and ileus. The surgical skill set for cesarean hysterectomy remains relevant in contemporary obstetrical practice for reproductive malignancy and several benign indications. Although the data indicate relatively safe outcomes, these studies show significant publication bias; therefore, further systematic study of this procedure is justified. PROSPERO REGISTRATION NUMBER: CRD42021260545, registered June 16, 2021.
Subjects
Neoplasms, Placenta Accreta, Pregnancy, Female, Humans, Placenta Accreta/surgery, Retrospective Studies, Risk Factors, Hysterectomy/methods
ABSTRACT
Background: Chronic lung allograft dysfunction (CLAD) is the major cause of death post-lung transplantation, with acute cellular rejection (ACR) being the largest contributing risk factor. Although patients are routinely monitored with spirometry, FEV1 is stable or improving in most ACR episodes. In contrast, oscillometry is highly sensitive to respiratory mechanics and has been shown to track the graft injury associated with ACR and its improvement following treatment. We hypothesized that intra-subject variability in oscillometry measurements correlates with ACR and the risk of CLAD. Methods: Of 289 bilateral lung recipients enrolled for oscillometry prior to laboratory-based spirometry between December 2017 and March 2020, 230 had ≥3 months and 175 had ≥6 months of follow-up. While 37 patients developed CLAD, only 29 had oscillometry at the time of CLAD onset and were included for analysis. These 29 CLAD patients were time-matched with 129 CLAD-free recipients. We performed multivariable regression to investigate the associations between variance in spirometry/oscillometry and the A-score, a cumulative index of ACR, as our predictor of primary interest. Conditional logistic regression models were built to investigate associations with CLAD. Results: Multivariable regression showed that the A-score was positively associated with the variance in oscillometry measurements. Conditional logistic regression models revealed that higher variance in the oscillometry metrics of ventilatory inhomogeneity, X5, AX, and R5-19 was independently associated with an increased risk of CLAD (p < 0.05); no association was found for variance in %predicted FEV1. Conclusion: Oscillometry tracks graft injury and recovery post-transplant. Monitoring with oscillometry could facilitate earlier identification of graft injury, prompting investigation to identify treatable causes and decrease the risk of CLAD.
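A minimal sketch of relating intra-subject variability in an oscillometry metric to CLAD within matched sets via conditional logistic regression. The simulated matched sets, the 1:3 case-control ratio, and the variable names are assumptions for illustration only, not the study's matching scheme or covariate set.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(3)

# Simulated matched sets: each CLAD case is grouped with time-matched CLAD-free
# controls; the exposure is each patient's intra-subject variance in an
# oscillometry metric (e.g., X5) over serial post-transplant visits.
rows = []
for matched_set in range(30):
    for is_case in (1, 0, 0, 0):                       # 1 case : 3 controls per set (arbitrary)
        x5_series = rng.normal(0, 1.0 + 0.8 * is_case, size=10)
        rows.append({
            "set": matched_set,
            "clad": is_case,
            "x5_variance": x5_series.var(ddof=1),      # intra-subject variability
        })
df = pd.DataFrame(rows)

# Conditional logistic regression stratified on the matched set; a positive
# coefficient means higher variance is associated with higher odds of CLAD.
result = ConditionalLogit(df["clad"], df[["x5_variance"]], groups=df["set"]).fit()
print(result.params)
```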
ABSTRACT
BACKGROUND: Excessive use of CT pulmonary angiography (CTPA) to investigate pulmonary embolism (PE) in the emergency department (ED) contributes to adverse patient outcomes. Non-invasive D-dimer testing, in the context of a clinical algorithm, may help decrease unnecessary imaging, but this has not been widely implemented in Canadian EDs. AIM: To improve the diagnostic yield of CTPA for PE by 5% (absolute) within 12 months of implementing the YEARS algorithm. MEASURES AND DESIGN: Single-centre study of all ED patients >18 years investigated for PE with D-dimer and/or CTPA between February 2021 and January 2022. Primary and secondary outcomes were the diagnostic yield of CTPA and the frequency of CTPA ordered, each compared with baseline. Process measures included the percentage of D-dimer tests ordered with CTPA and of CTPAs ordered with D-dimers <500 µg/L Fibrinogen Equivalent Units (FEU). The balancing measure was the number of PEs identified on CTPA within 30 days of the index visit. Multidisciplinary stakeholders developed plan-do-study-act cycles based on the YEARS algorithm. RESULTS: Over 12 months, 2695 patients were investigated for PE, of whom 942 had a CTPA. Compared with baseline, the CTPA yield increased by 2.9% (12.6% vs 15.5%, 95% CI -0.06% to 5.9%) and the proportion of patients who underwent CTPA decreased by 11.4% (46.4% vs 35%, 95% CI -14.1% to -8.8%). The percentage of CTPAs ordered with a D-dimer increased by 26.3% (30.7% vs 57%, 95% CI 22.2% to 30.3%), and there were two missed PEs (2/2695, 0.07%). IMPACT: Implementing the YEARS criteria may safely improve the diagnostic yield of CTPA and reduce the number of CTPAs completed, without an associated increase in missed clinically significant PEs. This project provides a model for optimising the use of CTPA in the ED.
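For context, the YEARS rule combines three clinical items with a variable D-dimer threshold: with no items present the commonly cited threshold is 1000 µg/L FEU, and with one or more items it drops to 500 µg/L FEU. The sketch below is a simplified illustration of that decision logic only; it is not this project's implemented pathway and is not clinical advice.

```python
def years_recommends_ctpa(dvt_signs: bool, hemoptysis: bool,
                          pe_most_likely: bool, d_dimer_ug_per_l: float) -> bool:
    """Apply a simplified YEARS decision rule for suspected pulmonary embolism.

    The three YEARS items are clinical signs of DVT, hemoptysis, and
    'PE is the most likely diagnosis'. With zero items, CTPA is indicated
    only when D-dimer >= 1000 ug/L FEU; with one or more items, the
    threshold drops to 500 ug/L FEU. Illustrative only -- not clinical advice.
    """
    items = sum([dvt_signs, hemoptysis, pe_most_likely])
    threshold = 500.0 if items >= 1 else 1000.0
    return d_dimer_ug_per_l >= threshold


# No YEARS items and D-dimer 800 ug/L FEU -> PE considered excluded without CTPA.
print(years_recommends_ctpa(False, False, False, 800.0))   # False
# 'PE most likely' with D-dimer 600 ug/L FEU -> CTPA recommended.
print(years_recommends_ctpa(False, False, True, 600.0))    # True
```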
Subjects
Pulmonary Embolism, Humans, Canada, Pulmonary Embolism/diagnostic imaging, Hospital Emergency Service, Algorithms
ABSTRACT
INTRODUCTION AND HYPOTHESIS: The objective was to assess the impact of total excision of polypropylene midurethral slings (MUS) on patient pain levels and to report on functional outcomes, including recurrent/de novo stress urinary incontinence (SUI), sexual function, and quality of life measures. METHODS: This is a retrospective analysis of patients who underwent total MUS excision from March 2017 to December 2019. The primary outcome was the impact on pain, assessed by a Numeric Rating Scale (NRS). Questionnaires analyzed were the Pain Catastrophizing Scale, Pelvic Floor Distress Inventory Short Form-20, Female Sexual Function Index, and McGill Pain Index. RESULTS: Thirty-two women underwent total mesh excision within the inclusion period, with follow-up data available for 31 of 32; 14 (43.8%) had previously undergone one or more partial vaginal mesh excision procedures. The types of MUS removed were 14 (43.8%) transobturator midurethral slings, 12 (37.5%) retropubic midurethral slings, 4 (12.5%) mini-slings, and 2 (6.3%) mesh slings placed by laparotomy. Pain was the main reason for referral in 31 patients (96.9%). Mean pain NRS decreased from 6.1 preoperatively to 3.3 postoperatively, with paired comparison showing a significant difference (p<0.01). Qualitatively, complete symptom resolution was observed in 10 of 31 (32.3%), another 9 of 31 (29.0%) patients experienced clinically significant improvement, 2 of 31 (6.5%) did not experience improvement in pain, and 10 of 31 (32.3%) reported new or worsening pain. Postoperative complications occurred in 9 (29.0%) patients; all were Clavien-Dindo grade II. Nineteen (61.3%) reported de novo/recurrent SUI postoperatively. CONCLUSION: Total MUS mesh excision yields high complication and SUI recurrence rates, counterbalanced by a 61.3% pain resolution/improvement rate. These data are pertinent for patient counseling.
Subjects
Suburethral Slings, Stress Urinary Incontinence, Humans, Female, Retrospective Studies, Urologic Surgical Procedures/methods, Surgical Mesh/adverse effects, Quality of Life, Suburethral Slings/adverse effects, Stress Urinary Incontinence/surgery, Stress Urinary Incontinence/etiology, Pain/etiology, Treatment Outcome
ABSTRACT
BACKGROUND: Emergency department (ED) utilization is a significant concern in many countries, but few population-based studies have compared ED use. Our objective was to compare ED utilization in New York (United States), Ontario (Canada), and New Zealand (NZ). METHODS: A retrospective cross-sectional analysis of all ED visits between January 1, 2016, and September 30, 2017, for adults ≥18 years using data from the State Emergency Department and Inpatient Databases (New York), the National Ambulatory Care Reporting System and Discharge Abstract Database (Ontario), and the National Non-Admitted Patient Collection and the National Minimum Data Set (New Zealand). Outcomes included age- and sex-standardized per-capita ED utilization (overall and stratified by neighborhood income), ED disposition, and ED revisit and hospitalization within 30 days of ED discharge. RESULTS: There were 10,998,371 ED visits in New York, 8,754,751 in Ontario, and 1,547,801 in New Zealand. Patients were older in Ontario (mean age 51.1 years) than in New Zealand (50.3) and New York (48.7). Annual sex- and age-standardized per-capita ED utilization was higher in Ontario than in New York or New Zealand (443.2 vs. 404.0 and 248.4 visits per 1000 population/year, respectively). In all countries, ED utilization was highest for residents of the lowest-income quintile neighborhoods. The proportion of ED visits resulting in hospitalization was higher in New Zealand (34.5%) than in New York (20.8%) and Ontario (12.8%). Thirty-day ED revisits were more frequent in Ontario (27.0%) than in New Zealand (18.6%) or New York (21.4%). CONCLUSIONS: Patterns of ED utilization differed widely across three high-income countries. These differences highlight the varying approaches that our countries take with respect to urgent visits, suggest opportunities for shared learning through international comparisons, and raise important questions about optimal approaches for all countries.
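The per-capita rates above were age- and sex-standardized. A minimal sketch of direct standardization, with hypothetical stratum-specific rates and an assumed common standard population (not the study's actual strata or reference population), looks like this:

```python
import pandas as pd

# Hypothetical stratum-specific ED visit rates (visits per 1,000 population per year)
# for one jurisdiction, plus a shared standard population applied to all jurisdictions.
strata = pd.DataFrame({
    "age_sex":      ["18-44 F", "18-44 M", "45-64 F", "45-64 M", "65+ F", "65+ M"],
    "crude_rate":   [420, 390, 330, 360, 520, 560],      # jurisdiction's observed rates
    "standard_pop": [1_200_000, 1_250_000, 900_000, 880_000, 700_000, 560_000],
})

# Direct standardization: weight each stratum's rate by the standard population share.
weights = strata["standard_pop"] / strata["standard_pop"].sum()
standardized_rate = (strata["crude_rate"] * weights).sum()
print(f"Age- and sex-standardized rate: {standardized_rate:.1f} per 1,000/year")
```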
Subjects
Hospital Emergency Service, Hospitalization, Adult, Humans, United States, Middle Aged, Ontario, Retrospective Studies, New York, Cross-Sectional Studies, New Zealand/epidemiology
ABSTRACT
Peripartum cardiomyopathy is the development of heart failure toward the end of pregnancy or in the months after delivery in the absence of other attributable causes, with left ventricular systolic dysfunction and a left ventricular ejection fraction (LVEF) generally <45%. Given that patients are relatively young at the time of diagnosis, this study was performed to summarize current evidence surrounding the long-term cardiac outcomes. MEDLINE, Embase, Cochrane CENTRAL, and CINAHL were searched for original studies that reported long-term (>1 year) patient outcomes. Of the 3,144 total records identified, 62 studies involving 4,282 patients met the selection criteria. The mean LVEF was 28% at diagnosis and 47% at the time of the last follow-up. Approximately half of the patients achieved myocardial recovery (47%), most commonly defined as an LVEF >50% (n = 21). The prevalence of implantable cardioverter-defibrillator use, left ventricular assist device implantation, and heart transplantation was 12%, 7%, and 11%, respectively. The overall all-cause mortality was 9%, and despite having more cardiovascular risk factors, patients residing in high-income countries had superior outcomes, including reduced rates of mortality.
Subjects
Cardiomyopathies, Implantable Defibrillators, Heart Failure, Pregnancy, Female, Humans, Stroke Volume, Left Ventricular Function, Peripartum Period, Heart Failure/epidemiology, Heart Failure/therapy
ABSTRACT
BACKGROUND: Income disparities may affect patients' care transitions home. Evidence among patients who have access to publicly funded healthcare coverage remains limited. OBJECTIVE: To evaluate the association between low income and post-discharge health outcomes and to explore patient and caregiver perspectives on the role of income disparities. DESIGN: Mixed-methods secondary analysis conducted among participants in a double-blind randomized controlled trial. PARTICIPANTS: Participants from a multicenter study in Ontario, Canada, were classified as low income if their annual self-reported salary was below $29,000 CAD, or was between $30,000 and $50,000 CAD while supporting ≥3 individuals. MAIN MEASURES: The associations between low income and the following self-reported outcomes were evaluated using multivariable logistic regression: patient experience; adherence to medications, diet, activity, and follow-up; and the aggregate of emergency department (ED) visits, readmission, or death up to 3 months post-discharge. A deductive direct content analysis of patient and caregiver perspectives on the role of income-related disparities during care transitions was conducted. KEY RESULTS: Individuals had similar odds of reporting a high patient experience and adherence to instructions regardless of reported income. Compared with higher-income individuals, low-income individuals also had similar odds of ED visits, readmissions, and death within 3 months post-discharge. Low-income individuals were more likely than high-income individuals to report understanding their medications completely (OR 1.9, 95% CI: 1.0-3.4) in fully adjusted regression models. Two themes emerged from 25 interviews: (1) constraints of publicly funded services and costs incurred by patients or their caregivers, and (2) the various ways patients adapt through caregiver support, private services, or prioritizing finances over health. CONCLUSIONS: There were few quantitative differences in patient experience, adherence, ED visits, readmissions, and death post-discharge between individuals reporting low versus higher income. However, several hidden costs for transportation, medications, and home care were reported and warrant further research.
Subjects
Patient Discharge, Patient Transfer, Humans, Aftercare, Salaries and Fringe Benefits, Delivery of Health Care, Ontario/epidemiology, Patient Readmission
ABSTRACT
BACKGROUND: Chronic neuropathic pain is often debilitating and can have a significant impact on sleep health and quality of life. There is limited information on the impact of cannabinoids on sleep health when treating neuropathic pain. OBJECTIVE: The objectives of this systematic review and meta-analysis were to determine the effect of cannabinoids on sleep quality, pain intensity, and patient impression of treatment efficacy in patients with neuropathic pain. EVIDENCE REVIEW: Nine medical literature databases were searched for randomized controlled trials comparing synthetic and natural cannabinoids to placebo in patients with neuropathic pain syndromes. Data on validated tools for sleep quality, pain intensity, patients' global impression of change (PGIC), and the incidence of adverse effects of cannabinoids were extracted and synthesized. FINDINGS: Of the 3491 studies screened, eight randomized controlled trials satisfied the inclusion criteria for this review. Analyses were performed in R 4.1.2 using the metafor package and interpreted using alpha=0.05 as the threshold for statistical significance. Validated measures of sleep health were not used in most studies. Meta-analysis of data from six studies showed that cannabinoids were associated with a significant improvement in sleep quality (standardized mean difference (SMD): 0.40; 95% CI: 0.19 to 0.61; 95% prediction interval (PI): -0.12 to 0.88; p-value=0.002; I2=55.26; τ2=0.05; Q-statistic=16.72; GRADE: moderate certainty). Meta-analysis of data from eight studies showed a significant reduction in daily pain scores in the cannabinoid (CB) group (SMD: -0.55; 95% CI: -0.69 to -0.19; 95% PI: -1.51 to 0.39; p=0.003; I2=82.49; τ2=0.20; Q-statistic=47.69; GRADE: moderate certainty). However, these sleep and analgesic benefits were accompanied by a higher likelihood of daytime somnolence, nausea, and dizziness. CONCLUSIONS: Cannabinoids have a role in treating chronic neuropathic pain, as evidenced by significant improvements in sleep quality, pain intensity, and PGIC. More research is needed to comprehensively evaluate the impact of cannabinoids on sleep health and analgesic efficacy. PROSPERO REGISTRATION NUMBER: CRD42017074255.
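The pooled estimates above were produced in R with the metafor package. A rough Python analogue of a DerSimonian-Laird random-effects meta-analysis of standardized mean differences, using invented study-level inputs rather than the review's data, computes the pooled SMD, I², τ², and an approximate 95% prediction interval as follows:

```python
import numpy as np
from scipy import stats

# Hypothetical per-study standardized mean differences (SMD) for sleep quality
# and their variances; illustrative values only.
smd = np.array([0.25, 0.55, 0.30, 0.60, 0.35, 0.45])
var = np.array([0.040, 0.060, 0.035, 0.080, 0.050, 0.045])

# DerSimonian-Laird random-effects model.
w_fixed = 1 / var
q = np.sum(w_fixed * (smd - np.average(smd, weights=w_fixed)) ** 2)
df = len(smd) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                      # between-study variance
i2 = max(0.0, (q - df) / q) * 100                  # heterogeneity (%)

w_random = 1 / (var + tau2)
pooled = np.average(smd, weights=w_random)
se_pooled = np.sqrt(1 / np.sum(w_random))
ci = pooled + np.array([-1, 1]) * 1.96 * se_pooled

# Approximate 95% prediction interval for the effect in a new study.
t_crit = stats.t.ppf(0.975, df - 1)
pi = pooled + np.array([-1, 1]) * t_crit * np.sqrt(tau2 + se_pooled ** 2)

print(f"Pooled SMD {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f}), "
      f"I2 = {i2:.0f}%, tau2 = {tau2:.3f}, 95% PI {pi[0]:.2f} to {pi[1]:.2f}")
```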