Results 1 - 20 of 52
1.
Diabetes Obes Metab ; 25(12): 3529-3537, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37646197

ABSTRACT

BACKGROUND: Donor hyperglycaemia following brain death has been attributed to reversible insulin resistance. However, our islet and pancreas transplant data suggest that other mechanisms may be predominant. We aimed to determine the relationships between donor insulin use and markers of beta-cell death and beta-cell function in pancreas donors after brain death. METHODS: In pancreas donors after brain death, we compared clinical and biochemical data in 'insulin-treated' and 'not insulin-treated' donors (IT vs. not-IT). We measured plasma glucose, C-peptide and levels of circulating unmethylated insulin gene promoter cell-free DNA (INS-cfDNA) and microRNA-375 (miR-375), as measures of beta-cell death. Relationships between markers of beta-cell death and islet isolation outcomes and post-transplant function were also evaluated. RESULTS: Of 92 pancreas donors, 40 (43%) required insulin. Glycaemic control and beta-cell function were significantly poorer in IT donors versus not-IT donors [median (IQR) peak glucose: 8 (7-11) vs. 6 (6-8) mmol/L, p = .016; C-peptide: 3280 (3159-3386) vs. 3195 (2868-3386) pmol/L, p = .046]. IT donors had significantly higher levels of INS-cfDNA [35 (18-52) vs. 30 (8-51) copies/ml, p = .035] and miR-375 [1.05 (0.19-1.95) vs. 0.73 (0.32-1.10) copies/nl, p = .05]. Circulating donor miR-375 was highly predictive of recipient islet graft failure at 3 months [adjusted area under the receiver operating characteristic curve (SE) = 0.813 (0.149)]. CONCLUSIONS: In pancreas donors, hyperglycaemia requiring IT is strongly associated with beta-cell death. This provides an explanation for the relationship of donor IT with post-transplant beta-cell dysfunction in transplant recipients.


Subject(s)
Cell-Free Nucleic Acids , Hyperglycemia , Islets of Langerhans Transplantation , MicroRNAs , Humans , C-Peptide , Brain Death , Insulin/genetics , Tissue Donors , Cell Death
2.
Thorax ; 77(4): 357-363, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34301741

ABSTRACT

BACKGROUND: Lung clearance index (LCI) is a valuable research tool in cystic fibrosis (CF), but clinical application has been limited by technical challenges and uncertainty about how to interpret longitudinal change. To help inform clinical practice, this study aimed to assess feasibility, repeatability and longitudinal LCI change in children and adults with CF with predominantly mild baseline disease. METHODS: Prospective, 3-year, multicentre, observational study of repeated LCI measurement at time of clinical review in patients with CF aged >5 years, delivered using a rapid wash-in system. RESULTS: 112 patients completed at least one LCI assessment and 98 (90%) were still under follow-up at study end. The median (IQR) age was 14.7 (8.6-22.2) years and the mean (SD) FEV1 z-score was -1.2 (1.3). Of 81 subjects with normal FEV1 (>-2 z-scores), 63% had raised LCI (indicating worse lung function). For repeat stable measurements within 6 months, the mean (limits of agreement) change in LCI was 0.9% (-18.8% to 20.7%). A latent class growth model analysis identified four discrete clusters with high accuracy, differentiated by baseline LCI and FEV1. Baseline LCI was the strongest factor associated with longitudinal change. The median total test time was under 19 min. CONCLUSIONS: Most patients with CF with well-preserved lung function show stable LCI over time. Cluster behaviours can be identified and baseline LCI is a risk factor for future progression. These results support the use of LCI in clinical practice in identifying patients at risk of lung function decline.
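The repeatability figure above ("mean (limits of agreement) change in LCI") is a Bland-Altman-style calculation: the mean within-subject percentage change plus limits at ±1.96 SD. A minimal sketch, using hypothetical paired LCI values rather than the study's data:

```python
import statistics

def limits_of_agreement(first, second):
    """Mean within-subject % change and mean +/- 1.96 SD limits of agreement."""
    diffs = [100 * (b - a) / a for a, b in zip(first, second)]  # % change
    mean = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

first_visit  = [7.0, 8.2, 9.5, 7.8, 10.1]   # hypothetical repeat LCI values
second_visit = [7.2, 8.0, 9.9, 7.7, 10.4]
mean_change, lower, upper = limits_of_agreement(first_visit, second_visit)  # ~+1.3%
```

Wide limits relative to the mean change, as reported in the abstract, indicate that a single small change in LCI may lie within measurement noise.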


Subject(s)
Cystic Fibrosis , Adolescent , Adult , Child , Disease Progression , Forced Expiratory Volume , Humans , Lung , Prospective Studies , Young Adult
3.
Heart Lung Circ ; 31(7): 1015-1022, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35301985

ABSTRACT

PURPOSE: Cardiac catheter ablations are an established treatment for supraventricular tachycardia (SVT) involving prolonged cannulation of the common femoral vein with multiple catheters. This study aimed to identify the risk of deep vein thrombosis (DVT) by studying the frequency of this complication after catheter ablation. METHODS: This was a prospective multi-centre cohort study of patients undergoing cardiac ablation for atrioventricular nodal re-entry tachycardia or right-sided accessory atrioventricular connection. Those taking anticoagulation or antiplatelet therapy prior to the procedure were excluded. Following the procedure, bilateral venous duplex ultrasonography from the popliteal vein to the inferior vena cava for DVT was undertaken at 24 hours and again between 10 and 14 days. RESULTS: Eighty patients (mean age 47.6 years [SD 13.4]; 67% female) underwent cardiac ablation (median duration 70 min). Seven patients developed acute DVT in either the femoral or external iliac vein of the intervention leg, giving a frequency of 8.8% (95% CI 3.6-17.2%). No thrombus was seen in the contralateral leg (p=0.023). An elevated D-dimer prior to the procedure was significantly more frequent in patients developing DVT (42.9% vs 4.1%, p=0.0081; OR 17.0). No other patient or procedural characteristics significantly influenced the risk of DVT. CONCLUSION: In patients without peri-procedural anticoagulation, catheter ablation precipitated DVT in the catheterised femoral or iliac veins in 8.8% of patients. Peri-procedure prophylactic anticoagulation may be considered for all patients undergoing catheter ablation for SVT. CLINICAL TRIAL REGISTRATION: https://clinicaltrials.gov/ct2/show/NCT03877770.
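The 95% CI quoted for the 8.8% DVT frequency (7 of 80) is presumably an exact binomial interval. The Wilson score interval, sketched below, is a common alternative that gives similar but not identical limits:

```python
import math

def wilson_ci(events, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = events / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(7, 80)   # roughly 4.3% to 17.0%
```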


Subject(s)
Catheter Ablation , Venous Thrombosis , Anticoagulants , Catheter Ablation/adverse effects , Cohort Studies , Female , Fibrin Fibrinogen Degradation Products , Humans , Male , Middle Aged , Prospective Studies , Venous Thrombosis/diagnosis , Venous Thrombosis/etiology
4.
Diabetologia ; 64(6): 1375-1384, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33665687

ABSTRACT

AIMS/HYPOTHESIS: Approximately 50% of organ donors develop hyperglycaemia in intensive care, which is managed with insulin therapy. We aimed to determine the relationships between donor insulin use (DIU) and graft failure in pancreas transplantation. METHODS: UK Transplant Registry organ donor data were linked with national data from the UK solid pancreas transplant programme. All pancreas transplants performed between 2004 and 2016 with complete follow-up data were included. Logistic regression models determined associations between DIU and causes of graft failure within 3 months. Area under the receiver operating characteristic curve (aROC) and net reclassification improvement (NRI) assessed the added value of DIU as a predictor of graft failure. RESULTS: In 2168 pancreas transplant recipients, 1112 (51%) donors were insulin-treated. DIU was associated with a higher risk of graft loss from isolated islet failure: OR (95% CI), 1.79 (1.05, 3.07), p = 0.03, and this relationship was duration/dose dependent. DIU was also associated with a higher risk of graft loss from anastomotic leak (2.72 [1.07, 6.92], p = 0.04) and a lower risk of graft loss from thrombosis (0.62 [0.39, 0.96], p = 0.03), although duration/dose-dependent relationships were only identified in pancreas transplant alone/pancreas after kidney transplant recipients with grafts failing due to thrombosis (0.86 [0.74, 0.99], p = 0.03). The relationships between donor insulin characteristics and isolated islet failure remained significant after adjusting for potential confounders: DIU 1.75 (1.02, 2.99), p = 0.04; duration 1.08 (1.01, 1.16), p = 0.03. In multivariable analyses, donor insulin characteristics remained significant predictors of lower risk of graft thrombosis in pancreas transplant alone/pancreas after kidney transplant recipients: DIU, 0.34 (0.13, 0.90), p = 0.03; insulin duration/dose, 0.02 (0.001, 0.85), p = 0.04. 
When data on insulin were added to models predicting isolated islet failure, a significant improvement in discrimination and risk reclassification was observed in all models: no DIU aROC 0.56; DIU aROC 0.57, p = 0.86; NRI 0.28, p < 0.00001; insulin duration aROC 0.60, p = 0.47; NRI 0.35, p < 0.00001. CONCLUSIONS/INTERPRETATION: DIU predicts graft survival in pancreas transplant recipients. This assessment could help improve donor selection and thereby improve patient and graft outcomes.
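The net reclassification improvement (NRI) reported above measures whether adding donor insulin data moves predicted risks in the right direction: up for patients whose grafts failed, down for those whose grafts survived. A minimal sketch of the continuous NRI, using hypothetical predicted risks rather than registry data:

```python
def nri(old_risk, new_risk, outcome):
    """Continuous net reclassification improvement for a binary outcome.
    Positive values mean the new model moves risks in the right direction
    more often than the wrong one (range -2 to +2)."""
    up_e = dn_e = up_n = dn_n = 0
    n_events = sum(outcome)
    n_nonevents = len(outcome) - n_events
    for old, new, y in zip(old_risk, new_risk, outcome):
        if new > old:          # risk reclassified upward
            up_e += y
            up_n += 1 - y
        elif new < old:        # risk reclassified downward
            dn_e += y
            dn_n += 1 - y
    return (up_e - dn_e) / n_events + (dn_n - up_n) / n_nonevents

# Hypothetical risks of isolated islet failure, without/with donor insulin data:
old = [0.10, 0.20, 0.30, 0.40]
new = [0.20, 0.15, 0.35, 0.30]
outcome = [1, 0, 1, 0]          # 1 = graft failed
improvement = nri(old, new, outcome)   # 2.0 here: every risk moved correctly
```

As in the abstract, NRI can show a significant improvement even when the aROC barely moves, because it credits correct directional shifts rather than rank discrimination.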


Subject(s)
Critical Care , Graft Survival , Hyperglycemia/drug therapy , Insulin/therapeutic use , Pancreas Transplantation , Adult , Female , Humans , Male , Middle Aged , Prognosis , Registries , Young Adult
5.
Diabetes Obes Metab ; 23(1): 49-57, 2021 Jan.
Article in English | MEDLINE | ID: mdl-32893472

ABSTRACT

AIMS: The relationship between peri-transplant glycaemic control and outcomes following pancreas transplantation is unknown. We aimed to relate peri-transplant glycaemic control to pancreas graft survival and to develop a framework for defining early graft dysfunction. METHODS: Peri-transplant glycaemic control profiles over the first 5 days postoperatively were determined by an area under the curve [AUC; average daily glucose level (mmol/L) × time (days)] and the coefficient of variation of mean daily glucose levels. Peri-transplant hyperglycaemia was defined as an AUC ≥35 mmol·day/L (daily mean blood glucose ≥7 mmol/L). Risks of graft failure associated with glycaemic control and variability and peri-transplant hyperglycaemia were determined using covariate-adjusted Cox regression. RESULTS: We collected 7606 glucose readings over 5 days postoperatively from 123 pancreas transplant recipients. Glucose AUC was a significant predictor of graft failure during 3.6 years of follow-up (unadjusted HR [95% confidence interval] 1.17 [1.06-1.30], P = .002). Death-censored non-technical graft failure occurred in eight (10%) recipients with peri-transplant normoglycaemia, and eight (25%) recipients with peri-transplant hyperglycaemia such that hyperglycaemia predicted a 3-fold higher risk of graft failure [HR (95% confidence interval): 3.0 (1.1-8.0); P = .028]. CONCLUSION: Peri-transplant hyperglycaemia is strongly associated with graft loss and could be a valuable tool guiding individualized graft monitoring and treatment. The 5-day peri-transplant glucose AUC provides a robust and responsive framework for comparing graft function.
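With the AUC defined above as mean daily glucose (mmol/L) × time (days), each postoperative day contributes its daily mean × 1 day, so the 5-day AUC reduces to the sum of the five daily means. A sketch with hypothetical daily means, not study data:

```python
HYPERGLYCAEMIA_THRESHOLD = 35.0   # mmol·day/L over 5 days, i.e. mean >= 7 mmol/L

def glucose_auc(daily_mean_glucose):
    """5-day peri-transplant glucose AUC: sum of daily means (mmol/L) x 1 day each."""
    return sum(daily_mean_glucose)

daily_means = [8.2, 7.5, 7.0, 6.8, 7.4]   # hypothetical means, days 1-5 post-transplant
auc = glucose_auc(daily_means)             # 36.9 mmol·day/L
peri_transplant_hyperglycaemia = auc >= HYPERGLYCAEMIA_THRESHOLD   # True
```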


Subject(s)
Pancreas Transplantation , Blood Glucose , Glycemic Control , Graft Survival , Humans , Pancreas
6.
Diabetes Obes Metab ; 22(10): 1874-1879, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32452110

ABSTRACT

Insulin is routinely used to manage hyperglycaemia in organ donors and during the peri-transplant period in islet transplant recipients. However, it is unknown whether donor insulin use (DIU) predicts beta-cell dysfunction after islet transplantation. We reviewed data from the UK Transplant Registry and the UK Islet Transplant Consortium; all first-time transplants during 2008-2016 were included. Linear regression models determined associations between DIU, median and coefficient of variation (CV) peri-transplant glucose levels and 3-month islet graft function. In 91 islet cell transplant recipients, DIU was associated with lower islet function assessed by BETA-2 scores (β [SE] -3.5 [1.5], P = .02), higher 3-month post-transplant HbA1c levels (5.4 [2.6] mmol/mol, P = .04) and lower fasting C-peptide levels (-107.9 [46.1] pmol/l, P = .02). Glucose at 10 512 time points was recorded during the first 5 days peri-transplant: the median (IQR) daily glucose level was 7.9 (7.0-8.9) mmol/L and glucose CV was 28% (21%-35%). Neither median glucose levels nor glucose CV predicted outcomes post-transplantation. DIU predicts beta-cell dysfunction 3 months after islet transplantation and could help improve donor selection and transplant outcomes.


Subject(s)
Diabetes Mellitus, Type 1 , Insulin-Secreting Cells , Islets of Langerhans Transplantation , Blood Glucose , C-Peptide , Glucose , Humans , Insulin , Tissue Donors
8.
Paediatr Anaesth ; 29(2): 161-168, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30447167

ABSTRACT

BACKGROUND: The local anesthetic, levobupivacaine, is the safer enantiomer of racemic bupivacaine. Present protocols for levobupivacaine are based on studies and pharmacokinetic modeling with racemic bupivacaine. AIMS: To investigate total serum levobupivacaine concentrations after a caudal epidural loading dose followed by a maintenance infusion over 48 hours in infants aged 3-6 months. METHODS: The clinical trial was conducted in eight infants aged 3-6 months, undergoing bladder exstrophy repair. Pharmacokinetic modeling allowed optimization of clinical sampling to measure total levobupivacaine and α1-acid glycoprotein and prediction of the effect of α1-acid glycoprotein on levobupivacaine plasma protein binding. RESULTS: The observed median total levobupivacaine serum concentration was 0.30 mg/L (range: 0.20-0.70 mg/L) at 1 hour after the loading dose of 2 mg/kg. The median total levobupivacaine concentration after 47 hours of infusion, at 0.2 mg/kg/h, was 1.21 mg/L (0.07-1.85 mg/L). Concentrations of α1-acid glycoprotein were found to rise throughout the study period. Pharmacokinetic modeling suggested that unbound levobupivacaine quickly reached steady state at a concentration of approximately 0.03 mg/L. CONCLUSION: The study allows the development of a pharmacokinetic model, combining levobupivacaine and α1-acid glycoprotein data. Modeling indicates that unbound levobupivacaine quickly reaches steady state once the infusion is started. Simulations suggest that it may be possible to continue the infusion beyond 48 hours.
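The observation that a constant-rate infusion "quickly reaches steady state" follows from first-order kinetics: concentration approaches Css = rate/clearance exponentially, governed by the elimination half-life. The sketch below uses a simple one-compartment model with hypothetical parameters; these are illustrative only and are not the study's fitted values:

```python
import math

def conc_during_infusion(rate, cl, v, t_hours):
    """Total concentration (mg/L) at time t during a constant-rate infusion,
    one-compartment model: rate in mg/h, clearance cl in L/h, volume v in L."""
    css = rate / cl              # steady-state concentration
    k = cl / v                   # elimination rate constant (1/h)
    return css * (1 - math.exp(-k * t_hours))

# Hypothetical values for a 6 kg infant on the 0.2 mg/kg/h maintenance infusion:
rate = 0.2 * 6               # mg/h
cl, v = 2.0, 10.0            # hypothetical clearance and volume of distribution
c48 = conc_during_infusion(rate, cl, v, 48)   # essentially at Css = rate/cl
```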


Subject(s)
Anesthesia, Epidural/methods , Anesthetics, Local/administration & dosage , Levobupivacaine/administration & dosage , Orosomucoid/metabolism , Analgesia, Epidural/methods , Anesthetics, Local/blood , Anesthetics, Local/pharmacokinetics , Bladder Exstrophy/surgery , Humans , Infant , Levobupivacaine/blood , Levobupivacaine/pharmacokinetics , Pain Measurement , Pain, Postoperative/drug therapy , Pain, Postoperative/metabolism , Prospective Studies
9.
Clin Otolaryngol ; 44(6): 1045-1058, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31544346

ABSTRACT

OBJECTIVE: To characterise the burden of voice disorders in teachers in a UK population, compare it with non-teachers and identify groups of teachers who may be particularly at risk of developing a voice problem. DESIGN: Questionnaire-based survey of primary and secondary school teachers and non-teachers. Questions consisted of general demographics, VHI-10 and questions relating to voice problems. METHODS: Distribution of questionnaires to teachers and non-teachers and statistical analysis of the responses. SETTING: University teaching hospital. PARTICIPANTS: Teachers and non-teachers in a region of North West England. MAIN OUTCOME MEASURES: Identification of risk factors for voice problems in teachers, compared to non-teachers. RESULTS: A total of 210 primary and 244 secondary school teachers and 304 non-teachers participated in the questionnaire survey. Response rates were 67.9% from primary schools, 41.2% from secondary schools and 40.0% from the non-teachers. 30.0% of teachers and 9.0% of non-teachers reported problems with their voice. 12.8% of teachers and 2.0% of non-teachers had missed work due to voice problems. 14.1% of teachers and 5.3% of non-teachers had seen a general practitioner for voice-related problems, whilst 7.1% of teachers and 6.3% of non-teachers had been referred to an otolaryngologist or speech therapist for voice problems. Factors significantly associated with VHI-10 scores (P < .05) were identified. CONCLUSIONS: Voice disorders are an occupational health problem for teachers, with a significant burden of these disorders in this group of teachers in the UK. We have identified risk factors that could be exploited to identify groups of teachers who would benefit from early intervention.


Subject(s)
Faculty , Voice Disorders/epidemiology , Adult , England/epidemiology , Female , Humans , Male , Risk Factors , Surveys and Questionnaires , Voice Quality
13.
Pediatr Res ; 76(2): 184-9, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24819374

ABSTRACT

BACKGROUND: Permissive hypercapnia is routinely practiced in neonatal intensive care units. The effect of permissive hypercapnia on the preterm brain and brain electrical activity is unknown. In this study, we aimed to determine the effect of chronic changes in the partial pressure of carbon dioxide in blood (PCO2) on brain electrical activity in preterm newborn babies born at or before 32 wk gestation. METHODS: Eighty-three 1-h long digital electroencephalography (EEG) recordings were performed once a week for 4 wk on 25 babies with median gestational age of 29 wk (range: 23-32) after 48 h of age. Capillary blood gas measurements were performed midway through EEG recordings. RESULTS: There are associations between EEG parameters and blood pH, PCO2, and blood glucose concentration. However, there are also strong and complex associations with gestational age and substantial individual patient effects that make it difficult to demonstrate predictive associations. PCO2 and bicarbonate are significantly correlated with the relative power of the θ and δ EEG bands, respectively, after adjustment for age and intrababy correlations, but after allowing for multiple testing these relationships are of borderline statistical significance. CONCLUSION: Compensated respiratory acidosis may affect the EEG through increased δ-band activity in preterm babies born before 32 wk gestation.
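"Relative power" of an EEG band is the spectral power inside that band divided by total spectral power. A self-contained periodogram sketch on a synthetic two-component signal (a plain discrete Fourier transform, for illustration only; real EEG pipelines use windowed estimators such as Welch's method):

```python
import math

def relative_band_power(signal, fs, band):
    """Fraction of total spectral power in [band[0], band[1]) Hz,
    computed with a plain DFT periodogram (skipping the DC term)."""
    n = len(signal)
    band_p = total_p = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im
        total_p += p
        if band[0] <= freq < band[1]:
            band_p += p
    return band_p / total_p

fs = 64                                   # Hz, hypothetical sampling rate
t = [i / fs for i in range(256)]          # 4 s of signal
# Synthetic "EEG": a 2 Hz delta component plus a weaker 6 Hz theta component
eeg = [math.sin(2 * math.pi * 2 * ti) + 0.5 * math.sin(2 * math.pi * 6 * ti) for ti in t]
delta_rel = relative_band_power(eeg, fs, (0.5, 4.0))   # ~0.8 of total power
```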


Subject(s)
Brain/physiology , Hypercapnia/physiopathology , Infant, Premature/physiology , Age Factors , Bicarbonates/blood , Blood Glucose , Brain Mapping , Carbon Dioxide/blood , Electroencephalography , Gestational Age , Humans , Hydrogen-Ion Concentration , Infant, Newborn , Prospective Studies
14.
World J Surg ; 38(10): 2558-70, 2014 Oct.
Article in English | MEDLINE | ID: mdl-24770891

ABSTRACT

BACKGROUND: The aim of this study was to conduct a comprehensive systematic review comparing tissue glue (TG) against tacks/staples for mesh fixation in laparoscopic (totally extra-peritoneal and trans-abdominal pre-peritoneal) groin hernia repair with the incidence of post-operative chronic pain as the primary outcome measure. METHODS: A computerized search of MEDLINE, EMBASE, and Cochrane databases for the period from 1 January 1990 to 30 June 2013 produced 39 reports. The quality of reports was assessed according to criteria reported by the Cochrane communication review group. RESULTS: Five randomized controlled trials (RCTs, 491 patients) and five non-RCTs (1,034 patients) fulfilled the selection criteria. A meta-analysis of chronic pain from the five RCTs gave a statistically significant Peto odds ratio (OR) of 0.40 (0.21-0.76; p = 0.005) indicating that the TG group experienced less chronic pain. Although the studies are underpowered to detect recurrence, the meta-analysis of the recurrence rates from the RCTs identified no difference between tacks/staples and glue fixation (OR 2.36; 0.67-8.37). There were also no differences found in meta-analysis of seroma and hematoma formation between the two methods of fixation. The wide variation in time points regarding pain score meant it was not possible to combine the studies and perform analysis for pain score with earlier time points. CONCLUSIONS: Meta-analysis of RCTs comparing TG with tack fixation in laparoscopic inguinal hernia surgery demonstrates a significant reduction in chronic pain with no increase in recurrence rates. Early post-operative outcome is similar after both methods of mesh fixation, although larger RCTs are required, with long-term pain as the primary endpoint.
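The Peto odds ratio used above pools per-trial observed-minus-expected event counts: ψ = exp(Σ(O−E)/ΣV), with hypergeometric variance V per trial. A sketch using hypothetical per-trial 2×2 counts, not the review's data:

```python
import math

def peto_or(trials):
    """Pooled Peto odds ratio with 95% CI.
    trials: list of (events_tg, n_tg, events_tacks, n_tacks) tuples."""
    sum_oe = sum_v = 0.0
    for a, n1, c, n2 in trials:
        n = n1 + n2
        m = a + c                                   # total events in the trial
        e = n1 * m / n                              # expected events in TG arm
        v = n1 * n2 * m * (n - m) / (n * n * (n - 1))  # hypergeometric variance
        sum_oe += a - e
        sum_v += v
    log_or = sum_oe / sum_v
    se = 1 / math.sqrt(sum_v)
    return (math.exp(log_or),
            math.exp(log_or - 1.96 * se),
            math.exp(log_or + 1.96 * se))

# Hypothetical chronic-pain counts: (TG events, TG n, tacks events, tacks n)
trials = [(3, 50, 9, 48), (2, 40, 6, 42), (4, 60, 8, 58)]
or_, lo, hi = peto_or(trials)   # OR < 1 favours tissue glue
```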


Subject(s)
Hernia, Inguinal/surgery , Herniorrhaphy/methods , Surgical Mesh , Surgical Stapling , Tissue Adhesives/therapeutic use , Chronic Pain , Hematoma/etiology , Humans , Laparoscopy/methods , Pain, Postoperative/etiology , Recurrence , Seroma/etiology , Surgical Mesh/adverse effects , Surgical Stapling/adverse effects , Tissue Adhesives/adverse effects
15.
BMC Nephrol ; 15: 84, 2014 May 29.
Article in English | MEDLINE | ID: mdl-24885247

ABSTRACT

BACKGROUND: AKI is common among hospital in-patients and places a substantial financial burden on the UK National Health Service, causing increased length of hospital stay and use of critical care services, with increased requirement for complex interventions including dialysis. This may account for up to 0.6% of the total Health Service budget. To investigate the incidence and consequences of AKI, all unselected emergency admissions to a large acute UK single centre University Teaching Hospital over two separate 7 day periods were reviewed. METHODS: A retrospective audit of 745 case records was undertaken (54.6% male) including laboratory data post-discharge or death, with classification of AKI by RIFLE, AKIN and AKIB criteria. Participants were included whether admitted via their general practitioners, the emergency department, or as tertiary specialty transfers. Outcome measures were presence or absence of AKI recorded using each of the three AKI criteria, length of hospital stay (LOS), admission to, and LOS in, critical care, and mortality. The most severe grade of AKI only, at any time during the admission, was recorded to prevent double counting. Renal outcome was determined by requirement for renal replacement therapy (RRT), and whether those receiving RRT remained dialysis dependent or not. RESULTS: AKI incidence was 25.4% overall, with approximately one-third present on admission and two-thirds developing after admission. The AKI group had a LOS almost three times that of the non-AKI group (10 vs 4 days). Requirement for critical care beds was 8.1% in the AKI group compared to 1.7% in the non-AKI group. Overall mortality was 5.5%: 11.4% in the AKI group versus 3.3% in the non-AKI group. CONCLUSIONS: AKI in acute unselected hospital admissions is more common than the existing literature suggests, affecting 25% of unselected admissions. In many patients it is relatively mild and may resolve spontaneously, but it is associated with increased LOS, likelihood of admission to critical care, and risk of death. If targeted, effective interventions can be developed, it seems likely that substantial clinical benefits for the patient, as well as financial and structural benefits for the healthcare organisation, may accrue.


Subject(s)
Acute Kidney Injury/mortality , Acute Kidney Injury/therapy , Emergency Service, Hospital/statistics & numerical data , Hospital Mortality , Kidney Transplantation/mortality , Length of Stay/statistics & numerical data , Renal Replacement Therapy/mortality , Adult , Aged , Aged, 80 and over , Female , Health Impact Assessment , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Risk Factors , Survival Rate , Treatment Outcome , United Kingdom/epidemiology
16.
J Pediatr Urol ; 2024 Apr 29.
Article in English | MEDLINE | ID: mdl-38762401

ABSTRACT

INTRODUCTION: Bladder Exstrophy and Epispadias Complex (BEEC) is associated with psychosocial difficulties. Screening questionnaires, alongside consultation with a clinical psychologist, can help identify children/young people for further assessment and track trends over time to improve service delivery. OBJECTIVE: To screen paediatric BEEC patients for a range of general psychosocial difficulties in a multi-disciplinary out-patient clinic setting and compare these results with our previous study and against published norms. STUDY DESIGN: The current service evaluation collected data at outpatient BEEC clinic appointments between 2016 and 2022 (2016-2022 sample). Children aged 4-5, 10-11 and 14-15 years and their parents/proxy were asked to complete two standardised psychosocial questionnaires: Paediatric Quality of Life Inventory (PedsQL 4.0 Generic Core and Family Impact Module) and the Strengths and Difficulties Questionnaires (SDQ). 79 children (CYP) and 93 parent/proxy (P/P) responses were recorded. The sample included paired CYP and P/P responses for the PedsQL (n = 67) and SDQ (n = 35). The mean age for CYP was 9.9 years (SD 3.99, range 2-17), 69.8% (n = 120) of responses for male children. RESULTS: The percentage of total CYP scores falling within the 'At Risk' category on the PedsQL increased in the 2016-2022 sample compared to our 2015 sample, indicating the prevalence of greater difficulties. Differences between P/P and CYP responses on both the PedsQL and SDQ favoured CYP. Age, gender, and diagnosis appeared to influence certain questionnaire responses, depending on respondent (CYP or P/P). A significant difference between P/P and CYP in the emotional domain of the PedsQL for those aged 13-18 was observed (p = 0.020), with P/P reporting greater difficulties, but this was not seen in the younger age ranges. 
Physical Health scores on the PedsQL were significantly lower for children with a cloacal exstrophy diagnosis than for those with bladder exstrophy or epispadias. P/P SDQ scores for boys were significantly higher in several domains. CONCLUSIONS: The results demonstrate the need for psychosocial screening, providing benchmarking for psychosocial difficulties within this patient group. Results indicate that patients accessing our clinic are reporting a higher level of challenge across psychosocial domains in recent years reflected by the percentage within the 'At Risk' category for psychosocial difficulty. Linked questionnaire data with condition specific information and surgical history would improve service evaluations. CYP reaching clinical thresholds are offered further psychological assessment within the service.

17.
J Diabetes Sci Technol ; : 19322968241245923, 2024 Apr 14.
Article in English | MEDLINE | ID: mdl-38616550

ABSTRACT

INTRODUCTION: Patients with congenital hyperinsulinism (HI) require constant glucose monitoring to detect and treat recurrent and severe hypoglycemia. Historically, this has been achieved with intermittent self-monitoring blood glucose (SMBG), but patients are increasingly using continuous glucose monitoring (CGM). Given the rapidity of CGM device development, and increasing calls for CGM use from HI families, it is vital that new devices are evaluated early. METHODS: We provided two months of supplies for the new Dexcom G7 CGM device to 10 patients with HI who had recently finished using the Dexcom G6. SMBG was performed concurrently, with paired readings used for accuracy calculations. Patients and families completed questionnaires about device use at the end of the two-month study period. RESULTS: Compared to the G6, the G7 showed a significant reduction in mean absolute relative difference (from 25% to 18%, P < .001) and in the over-read error (Bland-Altman +1.96 SD: from 3.54 mmol/L to 2.95 mmol/L). This resulted in an improvement in hypoglycemia detection from 42% to 62% (P < .001). Families reported an overall preference for the G7 but highlighted concerns about high sensor failure rates. DISCUSSION: The reduction in mean absolute relative difference and over-read error and the improvement in hypoglycemia detection implies that the G7 is a safer and more useful device in the management of hypoglycemia for patients with HI. Accuracy, while improved from previous devices, remains suboptimal with 40% of hypoglycemia episodes not detected.
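Mean absolute relative difference (MARD), the accuracy metric above, is the average of |CGM − reference|/reference across paired readings, expressed as a percentage. A sketch with hypothetical paired readings, not study data:

```python
def mard(cgm, reference):
    """Mean absolute relative difference (%) between paired CGM and
    reference (fingerstick) glucose readings; lower means more accurate."""
    rel_diffs = [abs(c - r) / r for c, r in zip(cgm, reference)]
    return 100 * sum(rel_diffs) / len(rel_diffs)

cgm_readings  = [4.0, 5.5, 3.1, 7.9, 6.2]   # hypothetical CGM values, mmol/L
smbg_readings = [3.6, 5.0, 3.5, 7.0, 6.0]   # paired fingerstick values, mmol/L
accuracy = mard(cgm_readings, smbg_readings)   # ~9.7%
```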

18.
Front Endocrinol (Lausanne) ; 15: 1282925, 2024.
Article in English | MEDLINE | ID: mdl-38567303

ABSTRACT

Background: Encapsulating peritoneal sclerosis (EPS) is a rare complication of prolonged peritoneal dialysis (PD) exposure, characterised by peritoneal thickening, calcification, and fibrosis, ultimately presenting with life-threatening bowel obstruction. The presence or role of peritoneal calcification in the pathogenesis of EPS is poorly characterised. We hypothesise that significantly aberrant bone mineral metabolism in patients on PD can cause peritoneal calcification which may trigger the development of EPS. We compared the temporal evolution of bone mineral markers during PD in EPS patients with non-EPS long-term PD controls. Methods: Linear mixed model and logistic regression analysis were used to compare four-monthly serum levels of calcium, phosphate, parathyroid hormone, and alkaline phosphatase (ALP) over the duration of PD exposure in 46 EPS and 46 controls (PD, non-EPS) patients. Results: EPS patients had higher mean calcium (2.51 vs. 2.41 mmol/L) and ALP (248.00 vs. 111.13 IU/L) levels compared with controls (p=0.01 and p<0.001, respectively, maximum likelihood estimation). Logistic regression analysis demonstrated that high serum calcium and phosphate levels during PD were associated with 4.5- and 2.9-fold increases in the risk of developing EPS, respectively. Conclusion: High levels of calcium and phosphate in patients on PD were identified to be risk factors for EPS development. Possible reasons for this may be an imbalance of pro-calcifying factors and calcification inhibitors promoting peritoneal calcification which increases peritoneal stiffness. Mechanical alterations may trigger unregulated fibrosis and the subsequent development of EPS. Improved management of secondary hyperparathyroidism during PD may ultimately diminish the EPS risk.


Subject(s)
Calcinosis , Hyperparathyroidism , Peritoneal Fibrosis , Humans , Peritoneal Fibrosis/etiology , Calcium , Risk Factors , Calcinosis/etiology , Minerals , Phosphates
19.
Transplant Cell Ther ; 30(5): 488.e1-488.e15, 2024 May.
Article in English | MEDLINE | ID: mdl-38369017

ABSTRACT

The majority of established KIR clinical assessment algorithms used for donor selection for hematopoietic progenitor cell transplantation (HPCT) evaluate gene content (presence/absence) of the KIR gene complex. In comparison, relatively little is known about the impact of KIR allelic polymorphism. By analyzing donors of T cell depleted (TcD) reduced intensity conditioning (RIC) HPCT, this study investigated the influence on post-transplant outcome of 2 polymorphic residues of the inhibitory KIR2DL1. The effects of allele groups upon transplant outcomes were investigated within a patient cohort using a defined treatment protocol of RIC with TcD. Using phylogenetic data, KIR2DL1 allelic polymorphism was categorized into groups on the basis of variation within codons 114 and 245 (positive or negative for the following groups: KIR2DL1*002/001g, KIR2DL1*003, KIR2DL1*004g) and the identification of null alleles. The influence of these KIR2DL1 allele groups in HPCT donors was assessed in the post-transplant data of 86 acute myelogenous leukemia patients receiving RIC TcD HPCT at a single center. KIR2DL1 allele groups in the donor significantly impacted upon 5-year post-transplant outcomes in RIC TcD HPCT. Donor KIR2DL1*003 presented the greatest influence upon post-transplant outcomes, with KIR2DL1*003 positive donors severely reducing 5-year post-transplant overall survival (OS) compared to those receiving a transplant from a KIR2DL1*003 negative donor (KIR2DL1*003 pos versus neg: 27.0% versus 60.0%, P = .008, pc = 0.024) and disease-free survival (DFS) (KIR2DL1*003 pos versus neg: 23.5% versus 60.0%, P = .004, pc = 0.012), and increasing 5-year relapse incidence (KIR2DL1*003 pos versus neg: 63.9% versus 27.2%, P = .009, pc = 0.027).
KIR2DL1*003 homozygous and KIR2DL1*003 heterozygous grafts did not present significantly different post-transplant outcomes. Donors possessing the KIR2DL1*002/001 allele group were found to significantly improve post-transplant outcomes, with donors positive for the KIR2DL1*004 allele group presenting a trend towards improvement. KIR2DL1*002/001 allele group (KIR2DL1*002/001g) positive donors improved 5-year OS (KIR2DL1*002/001g pos versus neg: 56.4% versus 27.2%, P = .009, pc = 0.024) and DFS (KIR2DL1*002/001g pos versus neg: 53.8% versus 25.5%, P = .018, pc = 0.036). KIR2DL1*004 allele group (KIR2DL1*004g) positive donors trended towards improving 5-year OS (KIR2DL1*004g pos versus neg: 53.3% versus 35.5%, P = .097, pc = 0.097) and DFS (KIR2DL1*004g pos versus neg: 50.0% versus 33.9%, P = .121, pc = 0.121), and reducing relapse incidence (KIR2DL1*004g pos versus neg: 33.1% versus 54.0%, P = .079, pc = 0.152). The presented findings suggest donor selection algorithms for TcD RIC HPCT should consider avoiding KIR2DL1*003 positive donors, where possible, and contributes to the mounting evidence that KIR assessment in donor selection algorithms should reflect the conditioning regime protocol used.


Subject(s)
Alleles , Hematopoietic Stem Cell Transplantation , Polymorphism, Genetic , Receptors, KIR2DL1 , Transplantation Conditioning , Adult , Female , Humans , Male , Hematopoietic Stem Cell Transplantation/methods , Leukemia, Myeloid, Acute/genetics , Leukemia, Myeloid, Acute/therapy , Lymphocyte Depletion , Receptors, KIR2DL1/genetics , T-Lymphocytes/immunology , Tissue Donors , Treatment Outcome
20.
ESC Heart Fail ; 2024 May 07.
Article in English | MEDLINE | ID: mdl-38712903

ABSTRACT

AIMS: Clinical pathways have been shown to improve outcomes in patients with heart failure (HF). Although patients with HF often have a cardiac implantable electronic device, few studies have reported the utility of device-derived risk scores to augment and organize care. TriageHF Plus is a device-based HF clinical pathway (DHFP) that uses remote monitoring alerts to trigger structured telephone assessment for HF stability and optimization. We aimed to evaluate the impact of TriageHF Plus on hospitalizations and describe the associated workforce burden. METHODS AND RESULTS: TriageHF Plus was a multi-site, prospective study that compared outcomes for patients recruited between April 2019 and February 2021. All alert-triggered assessments were analysed to determine the appropriateness of the alert and the workload burden. A negative-binomial regression with inverse probability treatment weighting using a time-matched usual care cohort was applied to estimate the effect of TriageHF Plus on non-elective hospitalizations. A post hoc pre-COVID-19 sensitivity analysis was also performed. The TriageHF Plus cohort (n = 443) had a mean age of 68.8 ± 11.2 years, 77% male (usual care cohort: n = 315, mean age of 66.2 ± 14.5 years, 65% male). In the TriageHF Plus cohort, an acute medical issue was identified following an alert in 79/182 (43%) cases. Fifty assessments indicated acute HF, requiring clinical action in 44 cases. At 30 day follow-up, 39/66 (59%) of initially symptomatic patients reported improvement, and 20 (19%) initially asymptomatic patients had developed new symptoms. On average, each assessment took 10 min. The TriageHF Plus group had a 58% lower rate of hospitalizations across full follow-up [incidence rate ratio: 0.42, 95% confidence interval (CI): 0.23-0.76, P = 0.004]. Across the pre-COVID-19 window, hospitalizations were 31% lower (0.69, 95% CI: 0.46-1.04, P = 0.077).
CONCLUSIONS: These data represent the largest real-world evaluation of a DHFP based on multi-parametric risk stratification. The TriageHF Plus clinical pathway was associated with an improvement in HF symptoms and reduced all-cause hospitalizations.
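Inverse probability treatment weighting, as used in the analysis above, weights each patient by the inverse of the probability of the treatment they actually received, so the TriageHF Plus and usual-care groups balance on measured covariates before the outcome regression. A minimal sketch with hypothetical propensity scores, not study data:

```python
def iptw_weights(treated, propensity):
    """Inverse probability of treatment weights: 1/p for treated patients,
    1/(1-p) for controls, where p = P(treatment | covariates)."""
    return [1 / p if t else 1 / (1 - p) for t, p in zip(treated, propensity)]

treated    = [1, 1, 0, 0]            # 1 = TriageHF Plus, 0 = usual care
propensity = [0.8, 0.5, 0.5, 0.2]    # hypothetical propensity scores
weights = iptw_weights(treated, propensity)   # ~[1.25, 2.0, 2.0, 1.25]
```

The weighted sample behaves like a pseudo-population in which treatment is independent of the measured covariates; the outcome model (here, negative-binomial regression on hospitalization counts) is then fitted with these weights.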
