Results 1 - 20 of 93
1.
Article in English | MEDLINE | ID: mdl-38730126

ABSTRACT

PURPOSE: To explore whether a day 7 blastocyst is predictive of the reproductive potential of sibling day 5 or day 6 blastocysts. METHODS: Retrospective cohort of autologous frozen embryo transfers (FET), February 2019 to April 2022. Cycles were divided into groups 1 to 5 according to the day of embryo cryopreservation and the presence of a day 7 blastocyst sibling within the cohort: group 1/group 2, day 5 blastocyst without/with a day 7 sibling; group 3/group 4, day 6 blastocyst without/with a day 7 sibling; group 5, day 7 blastocyst. Clinical pregnancy, ongoing pregnancy, and miscarriage rates, as well as cycle and patient characteristics, are reported. Multivariable generalized estimating equation (GEE) logistic regression analysis accounts for confounders and assesses the effect of a sibling day 7 blastocyst on the ongoing pregnancy rates of day 5 or day 6 blastocyst FETs. RESULTS: Ongoing pregnancy rates are 38.4%, 59.5%, 30.8%, 32.7%, and 4.4% in groups 1-5, respectively. When correcting for maternal age, number of oocytes retrieved and discarded per cohort, and ploidy, embryos cryopreserved on either day 6 or day 7 have reduced odds of ongoing pregnancy after FET compared with day 5 blastocysts (OR = 0.76, 95% CI 0.61-0.95, p = 0.01). However, the presence of a day 7 sibling does not significantly affect the odds of ongoing pregnancy of day 5 or day 6 blastocysts compared with same-day blastocysts without a day 7 sibling (p = 0.20 and 0.46, respectively). This finding is consistent within both the Preimplantation Genetic Testing for Aneuploidy (PGT-A) unscreened and screened (euploid) embryo subgroups. CONCLUSIONS: Day of embryo cryopreservation significantly affects ongoing pregnancy rates. However, day 7 embryos within a cohort do not affect the reproductive potential of sibling day 5 and day 6 blastocysts, suggesting that slow embryo development is an embryo-specific trait.
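
For readers unfamiliar with the approach, the sketch below shows how a clustered logistic model of this general kind could be specified in Python with statsmodels, using a GEE with an exchangeable working correlation to account for multiple transfers per patient. The data frame, column names, and simulated effects are entirely hypothetical and are not drawn from the study.

```python
# Hypothetical GEE logistic regression sketch; all columns and effects are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "patient_id": rng.integers(0, 100, n),          # cluster: transfers from the same patient
    "cryo_day": rng.choice(["d5", "d6", "d7"], n),  # day of embryo cryopreservation
    "day7_sibling": rng.integers(0, 2, n),          # 1 = a day 7 sibling exists in the cohort
    "maternal_age": rng.normal(35, 4, n),
})
# Simulate an ongoing-pregnancy outcome that depends on cryopreservation day only
logit = 0.3 - 0.4 * (df.cryo_day == "d6") - 2.0 * (df.cryo_day == "d7") - 0.03 * (df.maternal_age - 35)
df["ongoing_pregnancy"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Exchangeable working correlation accounts for correlated outcomes within a patient
model = smf.gee(
    "ongoing_pregnancy ~ C(cryo_day, Treatment(reference='d5')) + day7_sibling + maternal_age",
    groups="patient_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())  # coefficients are log-odds; exponentiate to obtain odds ratios
```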

2.
Osteoporos Int ; 2024 May 14.
Article in English | MEDLINE | ID: mdl-38740589

ABSTRACT

PURPOSE: This systematic review seeks to evaluate the proportion of fragility fracture patients screened in secondary fracture prevention programs who were indicated for pharmacological treatment, received prescriptions for bone-active medications, and initiated the prescribed medication. Additionally, the study aims to analyze equity in pharmacological treatment by examining equity-related variables including age, sex, gender, race, education, income, and geographic location. METHODS: We conducted a systematic review to ascertain the proportion of fragility fracture patients indicated for treatment who received prescriptions and/or initiated bone-active medication through secondary fracture prevention programs. We also examined the treatment indications and eligibility criteria reported in the studies to confirm which patients were eligible for treatment. To compute the pooled proportions for medication prescription and initiation, we carried out a single-group proportional meta-analysis. We also extracted the proportions of patients who received a prescription and/or began treatment based on age, sex, race, education, socioeconomic status, location, and chronic conditions. RESULTS: This review included 122 studies covering 114 programs. The pooled prescription rate was 77%, and the estimated medication initiation rate was 71%. Subgroup analysis revealed no significant difference in treatment initiation between Fracture Liaison Services and other programs. Across all studies, age, sex, and socioeconomic status were the only equity variables reported in relation to treatment outcomes. CONCLUSION: Our systematic review emphasizes the need for standardized reporting guidelines in post-fracture interventions. Moreover, considering equity stratifiers in the analysis of health outcomes will help address inequities and improve the overall quality and reach of secondary fracture prevention programs.
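
The pooled rates above come from a single-group proportional meta-analysis. As a rough illustration of the underlying arithmetic, the sketch below pools logit-transformed proportions with a DerSimonian-Laird random-effects estimate of between-study variance; the study counts are invented, and the review itself may have used different software and transformations.

```python
# Illustrative random-effects pooling of proportions (logit transform + DerSimonian-Laird).
# The per-study counts below are made up, not data from the review.
import numpy as np

events = np.array([80, 150, 40, 200, 65])   # patients prescribed medication
totals = np.array([100, 210, 55, 260, 90])  # patients indicated for treatment

p = events / totals
y = np.log(p / (1 - p))                     # logit-transformed proportions
v = 1 / events + 1 / (totals - events)      # approximate within-study variances

# DerSimonian-Laird estimate of between-study variance tau^2
w = 1 / v
fixed_mean = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - fixed_mean) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_re = 1 / (v + tau2)                       # random-effects weights
pooled_logit = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))

pooled = 1 / (1 + np.exp(-pooled_logit))    # back-transform to a proportion
ci = 1 / (1 + np.exp(-(pooled_logit + np.array([-1.96, 1.96]) * se)))
print(f"pooled prescription rate ≈ {pooled:.2%} (95% CI {ci[0]:.2%}-{ci[1]:.2%})")
```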

3.
BJOG ; 2024 May 09.
Article in English | MEDLINE | ID: mdl-38725333

ABSTRACT

OBJECTIVE: To identify which components of maternal vascular malperfusion (MVM) pathology are associated with adverse pregnancy outcomes and to investigate the morphological phenotypes of MVM placental pathology and their relationship with distinct clinical presentations of pre-eclampsia and/or fetal growth restriction (FGR). DESIGN: Retrospective cohort study. SETTING: Tertiary care hospital in Toronto, Canada. POPULATION: Pregnant individuals with low circulating maternal placental growth factor (PlGF) levels (<100 pg/mL) and placental pathology analysis between March 2017 and December 2019. METHODS: Associations between each pathological finding and the outcomes of interest were assessed using the chi-square test. Cluster analysis and logistic regression were used to identify phenotypic clusters and their associations with adverse pregnancy outcomes. Cluster analysis was performed using the K-modes unsupervised clustering algorithm. MAIN OUTCOME MEASURES: Preterm delivery <34+0 weeks of gestation, early-onset pre-eclampsia with delivery <34+0 weeks of gestation, birthweight <10th percentile (small for gestational age, SGA) and stillbirth. RESULTS: The diagnostic features of MVM most strongly associated with delivery <34+0 weeks of gestation were: infarction, accelerated villous maturation, distal villous hypoplasia and decidual vasculopathy. Two dominant phenotypic clusters of MVM pathology were identified. The larger cluster (n = 104) was characterised by both reduced placental mass and hypoxic ischaemic injury (infarction and accelerated villous maturation), and was associated with combined pre-eclampsia and SGA. The second dominant cluster (n = 59) was characterised by infarction and accelerated villous maturation alone, and was associated with pre-eclampsia and average birthweight for gestational age. CONCLUSIONS: Patients with placental MVM disease are at high risk of pre-eclampsia and FGR, and distinct pathological findings correlate with different clinical phenotypes, suggestive of distinct subtypes of MVM disease.
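
K-modes is a k-means analogue for categorical data that assigns cluster "modes" rather than means. The sketch below shows what such a clustering of binary lesion indicators might look like using the third-party Python kmodes package; the feature names echo the abstract, but the data are simulated and the study's exact implementation is not specified here.

```python
# Sketch of k-modes clustering of binary placental-pathology features.
# Requires the third-party `kmodes` package; feature names and data are invented.
import numpy as np
from kmodes.kmodes import KModes

rng = np.random.default_rng(1)
# rows = placentas, columns = presence/absence of MVM lesions (hypothetical)
features = ["infarction", "accel_villous_maturation",
            "distal_villous_hypoplasia", "decidual_vasculopathy"]
X = rng.integers(0, 2, size=(200, len(features)))

km = KModes(n_clusters=2, init="Huang", n_init=5, verbose=0)
clusters = km.fit_predict(X)

print("cluster sizes:", np.bincount(clusters))
print("cluster modes (lesion profiles):")
print(km.cluster_centroids_)
```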

4.
Article in English | MEDLINE | ID: mdl-38663465

ABSTRACT

BACKGROUND: Long-term survival after lung transplantation (LTx) remains limited by chronic lung allograft dysfunction (CLAD), which includes two main phenotypes: bronchiolitis obliterans syndrome (BOS) and restrictive allograft syndrome (RAS), with possible overlap. We aimed to detail and quantify pathological features of these CLAD sub-types. METHODS: Peripheral and central paraffin-embedded explanted lung samples were obtained from 3 lobes in each of 20 consecutive patients undergoing a second LTx for CLAD. Thirteen lung samples, collected from non-transplant lobectomies or donor lungs, were used as controls. Blinded semi-quantitative grading was performed to assess airway fibrotic changes, parenchymal and pleural fibrosis, as well as epithelial and vascular abnormalities. RESULTS: CLAD lung samples had higher scores for all airway- and lung-related parameters compared to controls. There was a notable overlap in pathological scores between BOS and RAS, with a wide range of scores in both conditions. Parenchymal and vascular fibrosis scores were significantly higher in RAS compared to BOS (p=0.003 for both). We observed significant positive correlations among the degree of inflammation around each airway, the severity of epithelial changes, and airway fibrosis. Immunofluorescence staining demonstrated a trend towards a lower frequency of club cells in CLAD, and a higher frequency of apoptotic club cells in BOS samples (p=0.01). CONCLUSIONS: CLAD represents a spectrum of airway, parenchymal, and pleural fibrosis, as well as epithelial, vascular, and inflammatory pathological changes, in which BOS and RAS overlap significantly. Our semi-quantitative grading score showed generally high inter-reader reliability and may be useful for future CLAD pathological assessments.
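
The abstract reports inter-reader reliability for the semi-quantitative grading score without specifying the statistic. One common choice for ordinal grades is a weighted Cohen's kappa, sketched below with invented reader scores; this is illustrative only and may not match the study's actual method.

```python
# Toy illustration of inter-reader agreement on a semi-quantitative (0-3) fibrosis
# grading scale using weighted Cohen's kappa; the two readers' scores are invented.
from sklearn.metrics import cohen_kappa_score

reader_1 = [0, 1, 2, 3, 2, 1, 0, 3, 2, 2, 1, 0]
reader_2 = [0, 1, 2, 2, 2, 1, 1, 3, 2, 3, 1, 0]

# Quadratic weights penalise large disagreements more than adjacent-grade ones
kappa = cohen_kappa_score(reader_1, reader_2, weights="quadratic")
print(f"weighted kappa: {kappa:.2f}")
```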

5.
J Patient Rep Outcomes ; 8(1): 47, 2024 Apr 29.
Article in English | MEDLINE | ID: mdl-38683439

ABSTRACT

BACKGROUND: The EvalUation of goal-diRected activities to prOmote well-beIng and heAlth (EUROIA) scale is a novel patient-reported measure that was administered to individuals with chronic heart failure (CHF). It assesses goal-directed activities that are self-reported as being personally meaningful and commonly used to optimize health-related quality of life (HRQL). Our aim was to evaluate the psychometric properties of the EUROIA and to determine whether it accounted for novel variance in its association with clinical outcomes. METHODS: This study was a secondary analysis of the CHF-CePPORT trial, which enrolled 231 CHF patients: median age = 59.5 years, 23% women. Baseline assessments included the EUROIA, the Kansas City Cardiomyopathy Questionnaire-Overall Summary (KCCQ-OS), the Patient Health Questionnaire-9 for depression (PHQ-9), and the Generalized Anxiety Disorder-7 (GAD-7). Twelve-month outcomes included health status (a composite index of incident hospitalization or emergency department [ED] visit) and mental health (PHQ-9 and GAD-7). RESULTS: Exploratory principal axis factoring identified four EUROIA factors with satisfactory internal reliability: activities promoting eudaimonic well-being (McDonald's ω = 0.79), social affiliation (α = 0.69), self-affirmation (α = 0.73), and fulfillment of social roles/responsibilities (Spearman-Brown coefficient = 0.66). Multivariable logistic regression indicated that not only was the EUROIA inversely associated with the incidence of 12-month hospitalization/ED visits independent of the KCCQ-OS (odds ratio, OR = 0.95; 95% confidence interval, CI, 0.91-0.98), but it was also associated with 12-month PHQ-9 (OR = 0.91, 95% CI 0.86-0.97) and GAD-7 (OR = 0.94, 95% CI 0.90-0.99) scores, whereas the KCCQ-OS was not. CONCLUSION: The EUROIA provides a preliminary taxonomy of goal-directed activities that promote HRQL among CHF patients, independently of a current gold-standard, state-based measure. CLINICAL TRIAL REGISTRATION: NCT01864369; https://classic.clinicaltrials.gov/ct2/show/NCT01864369.
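
As an illustration of the factor-extraction step, the sketch below runs an exploratory principal-axis-style factor analysis with an oblique rotation using the third-party factor_analyzer package. The item responses are simulated with a known four-factor structure and are not EUROIA data; the trial's exact software, rotation, and reliability calculations may differ.

```python
# Illustrative exploratory factor analysis on simulated questionnaire items.
# Requires the third-party `factor_analyzer` package; data and item names are invented.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(7)
n_resp, n_factors, items_per_factor = 231, 4, 3
latent = rng.normal(size=(n_resp, n_factors))
pattern = np.kron(np.eye(n_factors), np.ones((1, items_per_factor)))  # items 1-3 load on factor 1, etc.
noise = rng.normal(scale=0.6, size=(n_resp, n_factors * items_per_factor))
items = pd.DataFrame(latent @ pattern + noise,
                     columns=[f"item_{i + 1}" for i in range(n_factors * items_per_factor)])

fa = FactorAnalyzer(n_factors=4, rotation="oblimin", method="principal")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=[f"factor_{k + 1}" for k in range(4)])
print(loadings.round(2))
print("proportion of variance per factor:", fa.get_factor_variance()[1].round(2))
```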


Subject(s)
Goals , Heart Failure , Psychometrics , Quality of Life , Aged , Female , Humans , Male , Middle Aged , Depression/psychology , Depression/epidemiology , Depression/diagnosis , Health Status , Heart Failure/psychology , Mental Health , Patient Reported Outcome Measures , Psychometrics/methods , Psychometrics/instrumentation , Quality of Life/psychology , Reproducibility of Results , Surveys and Questionnaires
6.
J Neurotrauma ; 2024 Apr 05.
Article in English | MEDLINE | ID: mdl-38468550

ABSTRACT

It is important for patients and clinicians to know the potential for recovery from concussion as soon as possible after injury, especially in patients who do not recover completely in the first month and have concussion with persisting concussion symptoms (C+PCS). We assessed the association between the causes of concussion and recovery from C+PCS in a consecutive retrospective and prospective cohort of 600 patients referred to the Canadian Concussion Center (CCC) at Toronto Western Hospital. Data were obtained from clinical records and follow-up questionnaires and not from a standardized database. A novel method was used to assess long-term recovery, and multivariable Cox proportional hazards models were used to assess relationships between cause of concussion and time to recovery. We examined the subsequent recovery of patients who had not recovered after at least one month from the time of concussion. Patients were grouped according to the following four causes: sports and recreation (S&R, n = 312, 52%); motor vehicle collisions (MVC, n = 103, 17%); falls (n = 100, 17%); and being struck by an object, including violence (SBOV, n = 85, 14%). The MVC group had the highest percentage of females (75.7%), the oldest participants (median: 40.0 [interquartile range (IQR): 30.5-49.0] years), the most symptoms (median: 11.0 [IQR: 8.5-15.0]), and the longest symptom duration (median: 28.0 [IQR: 12.0-56.0] months). In contrast, the S&R group had the highest percentage of males (58.1%), the youngest participants (median: 20.0 [IQR: 17.0-30.0] years), the best recovery outcome, and the shortest symptom duration (median: 22.0 [IQR: 8.0-49.5] months). Significant differences among the four causes included age (p < 0.001), sex (p < 0.001), number of previous concussions (p < 0.001), history of psychiatric disorders (p = 0.002), and migraine (p = 0.001). Recovery from concussion was categorized into three groups: (1) Complete Recovery occurred in only 60 (10%) patients, with a median time of 8.0 (IQR: 3.5-18.0) months, and included 42 S&R, 7 MVC, 8 falls, and 3 SBOV; (2) Incomplete Recovery occurred in 408 (68.0%) patients, with a persisting median symptom time of 5.0 (IQR: 2.0-12.0) months; and (3) Unknown Recovery occurred in 132 (22.0%) patients and was due to lack of follow-up. In summary, the cause of C+PCS was associated with the type, number, and duration of symptoms and the time required for recovery, although all causes of C+PCS produced prolonged symptoms in a large percentage of patients, which emphasizes the importance of concussion as a public health concern necessitating improved prevention and treatment strategies.
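
For orientation, the sketch below fits a Cox proportional hazards model for time to recovery with cause-of-concussion indicators using the lifelines package. All variables and the simulated cohort are hypothetical, and the abstract's "novel method" for assessing long-term recovery is not reproduced here.

```python
# Hypothetical Cox proportional-hazards sketch relating cause of concussion to
# time to recovery; requires `lifelines`, and the data are simulated, not the CCC cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 400
cause = pd.Series(rng.choice(["SR", "MVC", "fall", "SBOV"], size=n, p=[0.52, 0.17, 0.17, 0.14]),
                  name="cause")
df = pd.DataFrame({
    "months_to_recovery": rng.exponential(scale=12, size=n),  # time to recovery or censoring
    "recovered": rng.integers(0, 2, size=n),                  # 1 = recovered, 0 = censored
    "age": rng.normal(30, 10, size=n),
    "prior_concussions": rng.poisson(1, size=n),
})
# One-hot encode cause (first category becomes the reference)
df = pd.concat([df, pd.get_dummies(cause, drop_first=True).astype(float)], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_recovery", event_col="recovered")
cph.print_summary()  # hazard ratios for each cause relative to the reference category
```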

7.
Am J Transplant ; 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38307417

ABSTRACT

Although cytomegalovirus (CMV) viremia/DNAemia has been associated with reduced survival after lung transplantation, its association with chronic lung allograft dysfunction (CLAD) and its phenotypes is unclear. We hypothesized that, in a modern era of CMV prophylaxis, CMV DNAemia would still remain associated with death, but would also represent a risk factor for CLAD and specifically the restrictive allograft syndrome (RAS)/mixed phenotype. This was a single-center retrospective cohort study of all consecutive adult, first, bilateral- or single-lung transplants performed between 2010 and 2016, comprising 668 patients. Risks for death/retransplantation, CLAD, or RAS/mixed were assessed by adjusted cause-specific Cox proportional-hazards models. CMV viral load (VL) was primarily modeled as a categorical variable: undetectable, detectable to 999, 1000 to 9999, and ≥10 000 IU/mL. In multivariable models, CMV VL was significantly associated with death/retransplantation (≥10 000 IU/mL: HR = 2.65 [1.78-3.94]; P < .01), but was not associated with CLAD, whereas CMV serostatus mismatch was (D+R-: HR = 2.04 [1.30-3.21]; P < .01). CMV VL was not associated with RAS/mixed in univariable analysis. Secondary analyses with a 7-level categorical or 4-level ordinal CMV VL confirmed similar results. In conclusion, CMV DNAemia is a significant risk factor for death/retransplantation, but not for CLAD or RAS/mixed. CMV serostatus mismatch may have an impact on CLAD through a pathway independent of DNAemia.

8.
J Heart Lung Transplant ; 43(6): 973-982, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38211836

ABSTRACT

BACKGROUND: Aspiration is a known risk factor for adverse outcomes post-lung transplantation. Airway bile acids are the gold-standard biomarker of aspiration; however, they are released into the duodenum and likely reflect concurrent gastrointestinal dysmotility. Previous studies investigating total airway pepsin have found conflicting results on its relationship with adverse outcomes post-lung transplantation. These studies measured total pepsin and pepsinogen in the airways. Certain pepsinogens are constitutively expressed in the lungs, while others, such as pepsinogen A4 (PGA4), are not. We sought to evaluate the utility of measuring airway PGA4 as a biomarker of aspiration and predictor of adverse outcomes in lung transplant recipients (LTRs) early post-transplant. METHODS: Expression of PGA4 was compared to that of other pepsinogens in lung tissue. Total pepsin and PGA4 were measured in large airway bronchial washings and compared to preexisting markers of aspiration. Two independent cohorts of LTRs were used to assess the relationship between airway PGA4 and chronic lung allograft dysfunction (CLAD). Changes in airway PGA4 after antireflux surgery were assessed in a third cohort of LTRs. RESULTS: PGA4 was expressed in healthy human stomach but not in lung tissue. Airway PGA4, but not total pepsin, was associated with aspiration. Airway PGA4 was associated with an increased risk of CLAD in two independent cohorts of LTRs. Antireflux surgery was associated with reduced airway PGA4. CONCLUSIONS: Airway PGA4 is a marker of aspiration that predicts CLAD in LTRs. Measuring PGA4 at surveillance bronchoscopies can help triage high-risk LTRs for antireflux surgery.


Subject(s)
Allografts , Biomarkers , Lung Transplantation , Humans , Lung Transplantation/adverse effects , Male , Female , Middle Aged , Biomarkers/metabolism , Respiratory Aspiration/diagnosis , Respiratory Aspiration/etiology , Respiratory Aspiration/metabolism , Pepsinogen C/metabolism , Pepsinogen C/blood , Adult , Primary Graft Dysfunction/diagnosis , Primary Graft Dysfunction/metabolism , Primary Graft Dysfunction/etiology , Chronic Disease , Lung/metabolism , Lung/physiopathology , Postoperative Complications/diagnosis , Predictive Value of Tests
9.
Ther Adv Respir Dis ; 18: 17534666231221750, 2024.
Article in English | MEDLINE | ID: mdl-38179653

ABSTRACT

BACKGROUND: Interstitial lung diseases (ILD) unresponsive to medical therapy often require lung transplantation (LTx), which improves quality of life and prolongs survival. Determining the ideal timing of referral for LTx remains challenging, with late referral associated with significant morbidity and mortality. Among other criteria, patients with ILD should be considered for LTx if forced vital capacity (FVC) is less than 80% or diffusion capacity for carbon monoxide (DLCO) is less than 40%. However, data on referral rates are lacking. OBJECTIVES: To evaluate referral rates for LTx based on pulmonary function tests (PFTs) and to identify barriers associated with non-referral. DESIGN: A single-center retrospective cohort study. METHODS: The study included ILD patients who underwent PFTs between 2014 and 2020. Patients with an FVC < 80% or a DLCO < 40% were included. Patients with absolute contraindications to LTx were excluded. Referral rates were computed, and a comparison was made between referred and non-referred subjects. RESULTS: Of 114 ILD patients meeting criteria for referral to LTx, 35 were referred (30.7%), and 7 proceeded to undergo LTx. Median time from PFT to referral for assessment was 255 days [interquartile range (IQR) 35-1077]. Median time from referral to LTx was 89 days (IQR 59-143). Referred patients were younger (p = 0.003), had lower FVC (p < 0.001) and DLCO (p < 0.001), and had a higher rate of pulmonary hypertension (p = 0.04). Relatively better PFT results and older age were significantly associated with non-referral. CONCLUSION: ILD patients who are eligible for LTx are under-referred, including those with severe disease, resulting in missed opportunities for LTx. Further research is required to validate these findings.


Lung transplants: addressing referral gaps for lung disease patients

Patients with severe lung diseases that are unresponsive to medical treatments often require lung transplants to enhance their quality of life and survival. Determining the optimal timing for considering a transplant is challenging, as delaying it can lead to complications. Our study aimed to assess how frequently individuals with lung problems, particularly interstitial lung diseases, were referred for lung transplants based on lung function tests. We conducted a retrospective analysis of medical records for patients with lung diseases who underwent lung function tests between 2014 and 2020. We selected patients whose test results indicated impaired lung function, excluding those who were ineligible for lung transplants due to other medical reasons. Subsequently, we examined the number of patients referred for a lung transplant and compared them to those who were not referred. Our findings revealed that out of 114 patients eligible for a lung transplant, only 35 were referred, representing a referral rate of approximately 31%. Among these, only 7 patients actually underwent the transplant procedure. The time elapsed between the lung function test and the referral for a transplant assessment was notably long, with a median of around 255 days. Additionally, once referred, patients waited a median of 89 days for the transplant itself. Referred patients tended to be younger and had more severe lung disease, characterized by lower lung function test results and a higher likelihood of pulmonary hypertension. Conversely, patients who were not referred generally had better lung function and were older. This discrepancy highlights the missed opportunities for patients to improve their health and quality of life through lung transplantation. Further research is essential to verify these findings, but this study represents a crucial step toward ensuring that individuals with lung diseases receive the appropriate care they require.


Subject(s)
Lung Diseases, Interstitial , Lung Transplantation , Humans , Retrospective Studies , Quality of Life , Lung , Lung Diseases, Interstitial/diagnosis , Lung Diseases, Interstitial/surgery , Lung Transplantation/adverse effects , Referral and Consultation
10.
Am J Transplant ; 24(1): 89-103, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37625646

ABSTRACT

The acute rejection score (A-score) in lung transplant recipients, calculated as the average of acute cellular rejection A-grades across transbronchial biopsies, summarizes the cumulative burden of rejection over time. We assessed the association between A-score and transplant outcomes in 2 geographically distinct cohorts. The primary cohort included 772 double lung transplant recipients. The analysis was repeated in 300 patients from an independent comparison cohort. Time-dependent multivariable Cox models were constructed to evaluate the association between A-score and chronic lung allograft dysfunction or graft failure. Landmark analyses were performed with A-score calculated at 6 and 12 months posttransplant. In the primary cohort, no association was found between A-score and graft outcome. However, in the comparison cohort, time-dependent A-score was associated with chronic lung allograft dysfunction both as a time-dependent variable (hazard ratio, 1.51; P < .01) and when calculated at 6 months posttransplant (hazard ratio, 1.355; P = .031). The A-score can be a useful predictor of lung transplant outcomes in some settings but is not generalizable across all centers; its utility as a prognostication tool is therefore limited.
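
Because the A-score is defined as the running average of A-grades across transbronchial biopsies, it can be computed with a simple grouped expanding mean, as in the hypothetical sketch below; the biopsy records are invented, and the 6-month landmark value mirrors the landmark analyses described above only in spirit.

```python
# Toy illustration of computing an acute rejection score (A-score) as the running
# average of transbronchial-biopsy A-grades per recipient; biopsy data are invented.
import pandas as pd

biopsies = pd.DataFrame({
    "patient_id":  [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "months_post": [1, 3, 6, 1, 6, 1, 3, 6, 12],
    "a_grade":     [0, 2, 1, 0, 0, 1, 2, 0, 1],  # ISHLT A0-A4 acute cellular rejection grade
})

biopsies = biopsies.sort_values(["patient_id", "months_post"])
# Running A-score after each biopsy (cumulative mean of A-grades to date)
biopsies["a_score"] = (
    biopsies.groupby("patient_id")["a_grade"]
    .expanding().mean()
    .reset_index(level=0, drop=True)
)

# Landmark A-score at 6 months: average of all grades from biopsies up to month 6
landmark_6m = (
    biopsies[biopsies.months_post <= 6]
    .groupby("patient_id")["a_grade"].mean()
    .rename("a_score_6m")
)
print(biopsies)
print(landmark_6m)
```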


Subject(s)
Lung Transplantation , Humans , Prognosis , Retrospective Studies , Lung Transplantation/adverse effects , Lung , Proportional Hazards Models , Graft Rejection/diagnosis , Graft Rejection/etiology
11.
J Heart Lung Transplant ; 43(3): 414-419, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37813131

ABSTRACT

BACKGROUND: Our program uses a desensitization protocol that includes intraoperative therapeutic plasma exchange (iTPE) for crossmatch-positive lung transplants, which improves access to lung transplant for sensitized candidates while mitigating immunologic risk. Although we have reported excellent outcomes for sensitized patients with the use of this protocol, concern about perioperative bleeding appears to have hindered its broader adoption at other programs. We conducted a retrospective cohort study to quantify the impact of iTPE on perioperative bleeding in lung transplantation. METHODS: All first-time lung transplant recipients from 2014 to 2019 who received iTPE were compared to those who did not. Multivariable logistic regression was used to determine the association between iTPE and large-volume perioperative transfusion requirements (≥5 packed red blood cell [pRBC] units within 24 hours of transplant start), adjusted for disease type, transplant type, and extracorporeal membrane oxygenation or cardiopulmonary bypass use. The incidence of hemothorax (requiring reoperation within 7 days of lung transplant) and 30-day posttransplant mortality were compared between the 2 groups using the chi-square test. RESULTS: One hundred forty-two patients (16%) received iTPE, and 755 patients (84%) did not. The mean number of perioperative pRBC transfusions was 4.2 among patients who received iTPE and 2.9 among patients who did not. iTPE was associated with increased odds of requiring large-volume perioperative transfusion (odds ratio 1.9; 95% confidence interval: 1.2-2.9, p-value = 0.007) but was not associated with an increased incidence of hemothorax (5% in both groups, p = 0.99) or 30-day posttransplant mortality (3.5% among patients who received iTPE vs 2.1% among patients who did not, p = 0.31). CONCLUSIONS: This study demonstrates that the use of iTPE in lung transplantation may increase perioperative bleeding, but not to a degree that impacts important posttransplant outcomes.


Subject(s)
Lung Transplantation , Plasma Exchange , Humans , Retrospective Studies , Hemothorax/etiology , Treatment Outcome , Lung Transplantation/adverse effects , Hemorrhage/etiology
12.
Reg Anesth Pain Med ; 2023 Nov 08.
Article in English | MEDLINE | ID: mdl-37940349

ABSTRACT

INTRODUCTION: Point-of-care ultrasound can assess diaphragmatic function and rule in or rule out paresis of the diaphragm. While this is a useful bedside tool, established methods have significant limitations. This study explores a new method to assess diaphragmatic motion by measuring the excursion of the uppermost point of the zone of apposition (ZOA) at the mid-axillary line using a high-frequency linear ultrasound probe and compares it with two previously established methods: the assessment of the excursion of the dome of the diaphragm (DOD) and the thickening ratio at the ZOA. METHODS: This is a single-centre, prospective comparative study. Following research ethics board approval and written patient consent, 75 elective surgical patients with normal diaphragmatic function were evaluated preoperatively. Three ultrasound methods were compared: (1) assessment of the excursion of the DOD using a curvilinear probe through an abdominal window; (2) assessment of the thickening fraction of the ZOA; and (3) assessment of the excursion of the ZOA. The last two methods were performed with a linear probe on the lateral aspect of the chest. RESULTS: Seventy-five patients were studied. We found that the evaluation of the excursion of the ZOA was more consistently successful (100% bilaterally) than the evaluation of the excursion of the DOD (98.7% and 34.7% on the right and left sides, respectively). The absolute values of the excursion of the ZOA were greater than, and well correlated with, the values of the DOD. CONCLUSION: Our preliminary data from this exploratory study suggest that the evaluation of the excursion of the ZOA on the lateral aspect of the chest using a linear probe is consistently successful on both the right and left sides. Future studies are needed to establish the distribution of normal values and suggest diagnostic criteria for diaphragmatic paresis or paralysis. TRIAL REGISTRATION NUMBER: NCT03225508.

13.
J Obstet Gynaecol Can ; : 102286, 2023 Nov 14.
Article in English | MEDLINE | ID: mdl-37972692

ABSTRACT

OBJECTIVES: To determine whether reinforcing cerclage following ultrasound evidence of cerclage failure before 24 weeks is an effective method to delay gestational age at delivery and to decrease the rate of preterm and peri-viable delivery. METHODS: A retrospective review was conducted of all patients who underwent any cervical cerclage procedure at a single tertiary care centre in Toronto, Canada between 1 December 2007 and 31 December 2017. RESULTS: Of 1482 cerclage procedures completed during the study period, 40 pregnant persons who underwent reinforcing cerclage were compared with 40 pregnant persons who were found to have cerclage failure before 24 weeks but were managed expectantly. After adjusting for the shortest cervical length measured prior to 24 weeks, there was no significant difference between the reinforcing cerclage and control groups in gestational age at delivery, preterm birth, or peri-viable birth (P = 0.52, P = 0.54, and P = 0.74, respectively). In an unadjusted model, there was a statistically significant increase in placental infection identified on postpartum placental pathology in the reinforcing cerclage group compared with the expectant management group (92.9% vs. 66.7%; P = 0.028). CONCLUSION: Reinforcing cerclage is unlikely to successfully delay gestational age at delivery or reduce rates of preterm and peri-viable birth, especially if irreversible and progressive cervical change has begun. Future work should examine the role of preoperative amniocentesis to explore the impact of pre-existing intra-amniotic infection on reinforcing cerclage success.

14.
Am J Perinatol ; 2023 Nov 07.
Article in English | MEDLINE | ID: mdl-37935374

ABSTRACT

OBJECTIVE: Animal literature has suggested that the impact of antenatal corticosteroids (ACS) may vary by infant sex. Our objective was to assess the impact of infant sex on the use of multiple courses versus a single course of ACS and on perinatal outcomes. STUDY DESIGN: We conducted a secondary analysis of the Multiple Courses of Antenatal Corticosteroids for Preterm Birth trial, which randomly allocated pregnant people to multiple courses versus a single course of ACS. Our primary outcome was a composite of perinatal mortality or clinically significant neonatal morbidity (including neonatal death, stillbirth, severe respiratory distress syndrome, intraventricular hemorrhage [grade III or IV], cystic periventricular leukomalacia, and necrotizing enterocolitis [stage II or III]). Secondary outcomes included individual components of the primary outcome as well as anthropometric measures. Baseline characteristics were compared between participants who received multiple courses versus a single course of ACS. An interaction between exposure to ACS and infant sex was assessed for significance, and multivariable regression analyses were conducted with adjustment for predefined covariates, when feasible. RESULTS: Data on 2,300 infants were analyzed. The interaction term between treatment status (multiple courses vs. a single course of ACS) and infant sex was not significant for the primary outcome (p = 0.86) or for any of the secondary outcomes (p > 0.05). CONCLUSION: Infant sex did not modify the association between exposure to ACS and perinatal outcomes, including perinatal mortality, neonatal morbidity, and anthropometric outcomes. However, animal literature indicates that sex-specific differences after exposure to ACS may emerge over time, and thus investigating long-term sex-specific outcomes warrants further attention. KEY POINTS: · We explored the impact of infant sex on perinatal outcomes after multiple versus a single course of ACS. · Infant sex was not a significant effect modifier of the association between ACS exposure and perinatal outcomes. · Animal literature indicates that sex-specific differences after ACS exposure may emerge over time. · Further investigation of long-term sex-specific outcomes is warranted.
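
Testing effect modification of this kind typically amounts to including a treatment-by-sex interaction term in a regression model. The sketch below shows one hedged way to do this with statsmodels; the outcome, covariates, and simulated data are hypothetical and are not the trial's analysis.

```python
# Hypothetical sketch of testing an ACS-treatment-by-infant-sex interaction with
# logistic regression; simulated data, invented column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2300
df = pd.DataFrame({
    "multiple_courses": rng.integers(0, 2, n),          # 1 = multiple courses, 0 = single course
    "infant_sex": rng.choice(["male", "female"], n),
    "gest_age_weeks": rng.normal(32, 3, n),
})
# Simulate a composite adverse outcome with no true interaction effect
logit = -1.0 + 0.1 * df.multiple_courses - 0.15 * (df.gest_age_weeks - 32)
df["composite_outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit(
    "composite_outcome ~ multiple_courses * C(infant_sex) + gest_age_weeks", data=df
).fit()
print(model.summary())  # the 'multiple_courses:C(infant_sex)[T.male]' row is the interaction term
```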

15.
ERJ Open Res ; 9(5)2023 Sep.
Article in English | MEDLINE | ID: mdl-37817870

ABSTRACT

Background: Morbidity and mortality in lung transplant recipients are often triggered by recurrent aspiration events, potentiated by oesophageal and gastric disorders. Previous small studies have shown conflicting associations between oesophageal function and the development of chronic lung allograft dysfunction (CLAD). Herein, we sought to investigate the relationship between oesophageal motility disorders and long-term outcomes in a large retrospective cohort of lung transplant recipients. Methods: All lung transplant recipients at the Toronto Lung Transplant Program from 2012 to 2018 with available oesophageal manometry testing within the first 7 months post-transplant were included in this study. Patients were categorised according to the Chicago Classification of oesophageal disorders (v3.0). Associations between oesophageal motility disorders with the development of CLAD and allograft failure (defined as death or re-transplantation) were assessed. Results: Of 487 patients, 57 (12%) had oesophagogastric junction outflow obstruction (OGJOO) and 47 (10%) had a disorder of peristalsis (eight major, 39 minor). In a multivariable analysis, OGJOO was associated with an increased risk of CLAD (HR 1.71, 95% CI 1.15-2.55, p=0.008) and allograft failure (HR 1.69, 95% CI 1.13-2.53, p=0.01). Major disorders of peristalsis were associated with an increased risk of CLAD (HR 1.55, 95% CI 1.01-2.37, p=0.04) and allograft failure (HR 3.33, 95% CI 1.53-7.25, p=0.002). Minor disorders of peristalsis were not significantly associated with CLAD or allograft failure. Conclusion: Lung transplant recipients with oesophageal stasis characterised by OGJOO or major disorders of peristalsis were at an increased risk of adverse long-term outcomes. These findings will help with risk stratification of lung transplant recipients and personalisation of treatment for aspiration prevention.

16.
J Geriatr Oncol ; 14(8): 101601, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37597295

ABSTRACT

INTRODUCTION: Differences in symptom distress among older (age 65-74) and very old (age 75+) patients with cancer, compared to younger patients, have not been well explored. These differences are important to understand given the heterogeneity of older populations and may have implications for age-appropriate symptom detection and management. MATERIALS AND METHODS: We examined routinely collected Edmonton Symptom Assessment System Revised (ESAS-r) scores from 9,143 patients age 40+ initiating chemotherapy for solid malignancies at a single academic cancer centre between September 2011 and May 2019. We used multivariable logistic regression models to determine associations between the most common symptoms and age group (ages 40-64, 65-74, 75-84, and 85+), cancer site, clinical stage, sex, and income levels. We focused our findings on patients with the five most common cancers, breast (n = 1,532), prostate (n = 923), lung (n = 889), pancreatic (n = 429), and colorectal (n = 368), prior to receiving treatment. RESULTS: Within our sample, 58.0% were age 40-64, 27.3% age 65-74, 11.8% age 75-84, and 2.9% age 85+. Among the nine symptoms in the ESAS-r (anxiety, depression, tiredness, well-being, nausea, pain, drowsiness, appetite, and shortness of breath), the most common symptoms overall were anxiety (moderate-severe scores [ESAS-r 4 or higher] were reported by 33.8% of patients), lack of well-being (38.3%), and tiredness (38.3%). Older age was associated with lower odds of moderate/severe anxiety (odds ratio [OR] 0.81, 95% confidence interval [CI] 0.73-0.90 for age 65-74; OR 0.81, 95% CI 0.70-0.93 for age 75-84; OR 0.62, 95% CI 0.47-0.82 for age 85+; referent is 40-64-year-olds for all analyses), and increased odds of tiredness (OR 1.00, 95% CI 0.90-1.11 for age 65-74; OR 1.19, 95% CI 1.04-1.37 for age 75-84; and OR 1.34, 95% CI 1.04-1.72 for age 85+). Advanced stage, female sex, and lower income levels were associated with higher odds of moderate/severe tiredness, anxiety, and lack of well-being in adjusted models. Patients with pancreatic and lung cancers reported worse scores for these three symptoms than patients with other cancers. DISCUSSION: Older age was associated with differences in symptom experiences, such as increased tiredness and reduced anxiety. Supportive care interventions and future research should focus on addressing these symptoms to improve patient quality of life.


Subject(s)
Lung Neoplasms , Neoplasms , Male , Humans , Female , Aged , Aged, 80 and over , Quality of Life , Depression/epidemiology , Depression/diagnosis , Neoplasms/therapy , Pain , Anxiety/epidemiology , Lung Neoplasms/drug therapy , Lung Neoplasms/complications , Fatigue/epidemiology , Fatigue/etiology , Palliative Care
17.
JMIR Res Protoc ; 12: e48666, 2023 Jul 12.
Article in English | MEDLINE | ID: mdl-37436794

ABSTRACT

BACKGROUND: Chronic obstructive pulmonary disease (COPD) is a progressive condition associated with physical and cognitive impairments contributing to difficulty in performing activities of daily living (ADLs) that require dual tasking (eg, walking and talking). Despite evidence showing that cognitive decline occurs among patients with COPD and may contribute to functional limitations and decreased health-related quality of life (HRQL), pulmonary rehabilitation continues to focus mainly on physical training (ie, aerobic and strength exercises). An integrated cognitive and physical training program compared to physical training alone may be more effective in increasing dual-tasking ability among people living with COPD, leading to greater improvements in performance of ADLs and HRQL. OBJECTIVE: The aims of this study are to evaluate the feasibility of an 8-week randomized controlled trial of home-based, cognitive-physical training versus physical training for patients with moderate to severe COPD and to derive preliminary estimates of cognitive-physical training intervention efficacy on measures of physical and cognitive function, dual task performance, ADLs, and HRQL. METHODS: A total of 24 participants with moderate to severe COPD will be recruited and randomized into cognitive-physical training or physical training. All participants will be prescribed an individualized home physical exercise program comprising 5 days of moderate-intensity aerobic exercise (30-50 minutes/session) and 2 days of whole-body strength training per week. The cognitive-physical training group will also perform cognitive training for approximately 60 minutes, 5 days per week via the BrainHQ platform (Posit Science Corporation). Participants will meet once weekly with an exercise professional (via videoconference) who will provide support by reviewing the progression of their training and addressing any queries. Feasibility will be assessed through the recruitment rate, program adherence, satisfaction, attrition, and safety. The intervention efficacy regarding dual task performance, physical function, ADLs, and HRQL will be evaluated at baseline and at 4 and 8 weeks. Descriptive statistics will be used to summarize intervention feasibility. Paired 2-tailed t tests and independent-samples 2-tailed t tests will be used to compare the changes in the outcome measures over the 8-week study period within and between the 2 randomized groups, respectively. RESULTS: Enrollment started in January 2022. It is estimated that the enrollment period will be 24 months long, with data collection to be completed by December 2023. CONCLUSIONS: A supervised home-based cognitive-physical training program may be an accessible intervention to improve dual-tasking ability in people living with COPD. Evaluating the feasibility and effect estimates is a critical first step to inform future clinical trials evaluating this approach and its effects on physical and cognitive function, ADL performance, and HRQL. TRIAL REGISTRATION: ClinicalTrials.gov NCT05140226; https://clinicaltrials.gov/ct2/show/NCT05140226. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/48666.

18.
Assist Technol ; : 1-8, 2023 Jul 18.
Article in English | MEDLINE | ID: mdl-37463511

ABSTRACT

Research evidence demonstrates the negative effects of whole-body vibration (WBV) and a correlation between WBV exposure and detriment to health. ISO Standard 2631-1 (1997) is the accepted standard for human exposure to WBV from vehicle vibration and provides vibration guidelines for health and comfort. These standards have not been applied to power wheelchairs (PWC), and no clinical tool exists that measures vibration levels during live power wheelchair driving. This study measures WBV and shock levels during PWC driving, considering the impact of terrains, base configurations, and seat cushions. A sensor tag accelerometer was used to measure vibration and shock in three different PWC configurations driven over seven different terrains. Data were collected for two runs per wheelchair, per terrain type, per cushion type. Differences were significant (p < .001) for overall mean and median peak vibration compared across the seven terrains, and for overall mean vibration between basic and enhanced cushions. Differences were also noted in mean and peak vibration among the three different base configurations (p = .0052). Results were compared with ISO 2631-1 guidelines. Mechanical shock on certain terrains created peak vibration levels that pose a likely health risk. Results from this study can inform the PWC prescription process.
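
ISO 2631-1's basic evaluation method is built around frequency-weighted RMS acceleration, with additional metrics recommended when shocks dominate (high crest factor). The sketch below computes unweighted RMS and crest factor on a synthetic signal as a simplified illustration; the required Wk/Wd frequency weighting and the study's actual processing are omitted.

```python
# Simplified sketch of summarising accelerometer data with ISO 2631-1-style metrics
# (RMS acceleration and crest factor); the signal is synthetic, not wheelchair data,
# and frequency weighting is deliberately omitted for brevity.
import numpy as np

fs = 100.0                                 # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)               # one 60 s run over a terrain
rng = np.random.default_rng(5)
# Synthetic vertical acceleration: broadband vibration plus occasional shocks
accel = 0.4 * rng.standard_normal(t.size)
accel[rng.integers(0, t.size, 10)] += 4.0  # add shock events

rms = np.sqrt(np.mean(accel ** 2))         # ISO 2631-1 basic evaluation uses weighted RMS
peak = np.max(np.abs(accel))
crest_factor = peak / rms                  # ISO suggests extra metrics (e.g. VDV) if > 9

print(f"RMS acceleration: {rms:.2f} m/s^2")
print(f"Peak acceleration: {peak:.2f} m/s^2, crest factor: {crest_factor:.1f}")
```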

19.
Front Med (Lausanne) ; 10: 1158870, 2023.
Article in English | MEDLINE | ID: mdl-37305133

ABSTRACT

Background: Chronic lung allograft dysfunction (CLAD) is the major cause of death post-lung transplantation, with acute cellular rejection (ACR) being the biggest contributing risk factor. Although patients are routinely monitored with spirometry, FEV1 is stable or improving in most ACR episodes. In contrast, oscillometry is highly sensitive to respiratory mechanics and has been shown to track graft injury associated with ACR and its improvement following treatment. We hypothesize that intra-subject variability in oscillometry measurements correlates with ACR and the risk of CLAD. Methods: Of 289 bilateral lung recipients enrolled for oscillometry prior to laboratory-based spirometry between December 2017 and March 2020, 230 had ≥ 3 months and 175 had ≥ 6 months of follow-up. While 37 patients developed CLAD, only 29 had oscillometry at the time of CLAD onset and were included for analysis. These 29 CLAD patients were time-matched with 129 CLAD-free recipients. We performed multivariable regression to investigate the associations between variance in spirometry/oscillometry and the A-score, a cumulative index of ACR, as our predictor of primary interest. Conditional logistic regression models were built to investigate associations with CLAD. Results: Multivariable regression showed that the A-score was positively associated with the variance in oscillometry measurements. Conditional logistic regression models revealed that higher variance in the oscillometry metrics of ventilatory inhomogeneity, X5, AX, and R5-19, was independently associated with an increased risk of CLAD (p < 0.05); no association was found for variance in %predicted FEV1. Conclusion: Oscillometry tracks graft injury and recovery post-transplant. Monitoring with oscillometry could facilitate earlier identification of graft injury, prompting investigation to identify treatable causes and decrease the risk of CLAD.
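
Conditional logistic regression handles the time-matched case-control design described above by conditioning on each matched set. The sketch below uses statsmodels' ConditionalLogit on simulated matched sets with a single oscillometry-variability predictor; the variable names and matching ratio are assumptions, not the study's specification.

```python
# Hedged sketch of a conditional (matched-set) logistic regression relating variability
# in an oscillometry metric to CLAD; requires statsmodels >= 0.10, data are simulated.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(9)
n_strata, per_stratum = 30, 5      # each CLAD case time-matched to 4 CLAD-free recipients (assumed)
rows = []
for s in range(n_strata):
    for i in range(per_stratum):
        clad = 1 if i == 0 else 0  # one case per matched set
        x5_variance = rng.normal(1.0 + 0.8 * clad, 0.5)  # higher variance in cases, by construction
        rows.append({"stratum": s, "clad": clad, "x5_variance": x5_variance})
df = pd.DataFrame(rows)

model = ConditionalLogit(df["clad"], df[["x5_variance"]], groups=df["stratum"])
result = model.fit()
print(result.summary())  # exponentiate the coefficient for an odds ratio per unit of variance
```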

20.
Clin Transplant ; 37(10): e15053, 2023 10.
Article in English | MEDLINE | ID: mdl-37350742

ABSTRACT

BACKGROUND: Acute kidney injury (AKI) is a frequent adverse outcome following liver transplantation (LT) with a multifactorial etiology. Identifying modifiable risk factors is critical to mitigating this risk. One key area of interest is the role of intraoperative hypotension, which remains relatively unexplored in liver transplant cohorts. METHODS: This was a retrospective observational cohort study of 1292 adult patients who underwent LT between 2009 and 2019. Multivariable logistic regression analysis was used to explore the association between intraoperative hypotension, quantified as the time (in minutes) spent below various mean arterial pressure (MAP) thresholds, and the primary outcome of early postoperative AKI according to the KDIGO criteria. RESULTS: AKI occurred in 40% of patients and was independently associated with more than 20 minutes spent below MAP thresholds of 55 mm Hg (adjusted OR = 1.866; 95% CI = 1.037-3.44; P = 0.041) and 50 mm Hg (adjusted OR = 1.801; 95% CI = 1.087-2.992; P = 0.023). Further sensitivity analyses demonstrated that the association between intraoperative hypotension and postoperative AKI was accentuated when the analysis was restricted to patients with normal preoperative renal function. CONCLUSIONS: Prolonged (>20 min) intraoperative hypotension (below a MAP of 55 mm Hg) was independently associated with AKI following LT, after adjusting for several known confounders.
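
Deriving the hypotension exposure is essentially a matter of counting minutes below each MAP threshold in the intraoperative record. The sketch below illustrates this with a simulated once-per-minute MAP trace; the sampling interval and thresholds, and the >20-minute cut-off, mirror the abstract, but the data and any interpolation rules are assumptions.

```python
# Sketch of deriving cumulative time below MAP thresholds from intraoperative records;
# the once-per-minute MAP trace and values are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
# MAP recorded once per minute over an 8-hour transplant (hypothetical)
maps = pd.Series(rng.normal(70, 12, size=8 * 60)).clip(lower=35)

# Each reading represents one minute, so the count of readings below a threshold
# equals the number of minutes spent below it
minutes_below = {thr: int((maps < thr).sum()) for thr in (65, 60, 55, 50)}
print("minutes below each MAP threshold:", minutes_below)

# Flag the exposure used in the analysis above: > 20 min below 55 mm Hg
prolonged_hypotension = minutes_below[55] > 20
print("prolonged hypotension (>20 min below 55 mm Hg):", prolonged_hypotension)
```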


Subject(s)
Acute Kidney Injury , Hypotension , Liver Transplantation , Adult , Humans , Cohort Studies , Liver Transplantation/adverse effects , Retrospective Studies , Postoperative Complications/etiology , Hypotension/complications , Risk Factors , Acute Kidney Injury/epidemiology , Acute Kidney Injury/etiology