ABSTRACT
Introduction: Autosomal dominant polycystic kidney disease (ADPKD) is the most common monogenic nephropathy and shows striking familial variability in disease severity. Methods: To better understand familial phenotypic variability, we analyzed clinical and pedigree data on 92 unrelated ADPKD kindreds with ≥2 affected individuals (N = 292) from an Irish population. All probands underwent genetic sequencing. Age at onset of kidney failure (KF), decline in estimated glomerular filtration rate (eGFR), the predicting renal outcome in polycystic kidney disease (PROPKD) score, and imaging criteria were used to assess and grade disease severity as mild, intermediate, or severe. One mild and one severe case per family defined marked intrafamilial variability of disease severity. Results: Marked intrafamilial variability was observed in at least 13% of the 92 families, with a higher proportion of families carrying PKD1-nontruncating (PKD1-NT) variants. In families with ≥2 members affected by KF, the average intrafamilial age difference was 7 years, and there was no observed difference in intrafamilial variability of age at KF between allelic groups. The prespecified criteria showed marked familial variability in 7.7%, 8.4%, and 24% of families for age at KF, the PROPKD score, and imaging criteria, respectively. In our multivariate mixed-effects model, the intrafamilial variability in kidney survival was independent of the measured genotypic factors associated with prognosis and survival (P < 0.001). Conclusion: Using objective measures, we quantified marked intrafamilial variability in ADPKD disease phenotype in at least 13% of families. Our findings indicate that intrafamilial phenotypic variability remains incompletely understood and necessitates more thorough identification of relevant clinical and genotypic factors.
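The family-level definition above (at least one mild and one severe case within the same kindred) lends itself to a simple grouped check. The sketch below is a minimal illustration with invented data; the column names and grading labels are assumptions, not the authors' dataset.

```python
import pandas as pd

# Simulated severity grades; "family_id" and "severity" are assumed names.
df = pd.DataFrame({
    "family_id": [1, 1, 1, 2, 2, 3, 3],
    "severity": ["mild", "severe", "intermediate",
                 "mild", "intermediate", "severe", "severe"],
})

def marked_variability(grades: pd.Series) -> bool:
    # At least one mild AND one severe member in the same kindred.
    return ("mild" in grades.values) and ("severe" in grades.values)

flags = df.groupby("family_id")["severity"].apply(marked_variability)
print(flags)                                  # only family 1 qualifies
print(f"{flags.mean():.0%} of families show marked variability")
```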
ABSTRACT
BACKGROUND: Early-onset scoliosis (EOS) is frequently associated with complex spine and chest wall deformities that may lead to severe cardiopulmonary impairment and malnutrition. The aim of this study was to evaluate the change in nutritional status of EOS patients after treatment with magnetically controlled growing rod (MCGR) instrumentation in a single center. METHODS: We prospectively collected data on patients treated with MCGR for EOS in a single center. Exclusion criteria were <2 years' follow-up and incomplete weight-for-age Z-score (WAZ) data. Preoperative and postoperative WAZ, radiographic parameters (major coronal curve, kyphosis angle, space available for lung ratio, and thoracic height), and unplanned returns to the operating room (UPROR) were analyzed. SDs and 95% confidence intervals (CI) are presented with means. RESULTS: Sixty-eight patients (37 males/31 females) were included. The mean age at surgery was 8.2 years (SD 2.8, range 1.8-14.2), and the mean follow-up time was 3.8 years (SD 1.0, range 2.1-6.8). The study population was categorized by primary diagnosis as follows: 23 neuromuscular, 18 idiopathic, 15 congenital, and 12 syndromic patients. The major coronal curve improved between the preoperative and latest visits by 40% (P < 0.005, SD 27, CI 33-47), while the space available for lung ratio improved by 8% (P < 0.005, SD 13, CI 5-12). Thoracic height increased by 25% (P < 0.005, SD 13, CI 22-28), and the kyphosis angle decreased by 25% (P < 0.005, SD 26, CI 9-39). Eighteen patients (27%) required a total of 53 UPRORs. WAZ improved significantly between the preoperative and latest follow-up visits (P = 0.005). Regression analysis showed that WAZ improvements were most significant in underweight patients and in idiopathic or syndromic EOS patients. UPROR was not associated with deterioration in WAZ. CONCLUSIONS: Treatment of EOS patients with MCGR resulted in an improvement in nutritional status, as evidenced by the significant increase in WAZ. Underweight patients, idiopathic and syndromic EOS patients, and those who required UPROR all had significant improvement in WAZ with MCGR treatment. LEVEL OF EVIDENCE: Therapeutic Study-Level II.
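Weight-for-age Z-scores such as WAZ are conventionally computed with the LMS method against reference growth data. The sketch below shows that standard calculation; the LMS values are invented placeholders, not actual WHO/CDC reference-table entries.

```python
import math

def lms_z_score(x: float, L: float, M: float, S: float) -> float:
    """Z-score via the LMS method (L = skewness, M = median, S = CV)."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Illustrative LMS values for one age/sex stratum (NOT real reference data):
L, M, S = -0.35, 25.0, 0.13            # M = median weight in kg (assumed)
print(round(lms_z_score(20.0, L, M, S), 2))   # -> about -1.79 (underweight)
```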
Subjects
Kyphosis; Scoliosis; Male; Female; Humans; Infant; Child, Preschool; Child; Adolescent; Scoliosis/surgery; Follow-Up Studies; Thinness; Treatment Outcome; Kyphosis/surgery; Weight Gain; Retrospective Studies
ABSTRACT
BACKGROUND: Solid organ transplant recipients are recognized to carry a high burden of malignancy, and this cancer frequently develops in the head and neck region. Furthermore, cancer of the head and neck post-transplant carries significantly increased mortality. In this study, we conducted a national retrospective cohort study to investigate the frequency of and mortality from head and neck cancer in a large group of solid organ transplant recipients over a 20-year span, and to compare mortality between transplant and non-transplant patients with head and neck cancer. METHODS: Patients in the Republic of Ireland who underwent solid organ transplantation between 1994 and 2014 and developed post-transplant head and neck malignancy were identified from the records of two prospective national databases (the National Cancer Registry of Ireland (NCRI) and the Irish Transplant Cancer Group database) working in conjunction with each other. The incidence of head and neck malignancy post-transplant was compared with the general population by means of standardised incidence ratios (SIR). The cumulative incidence of all-cause and cancer-related mortality from head and neck keratinocytic cancer was assessed by competing risks analysis. RESULTS: A total of 3346 solid organ transplant recipients were identified: 2382 (71.2%) kidney, 562 (16.8%) liver, 214 (6.4%) cardiac, and 188 (5.6%) lung. During the follow-up period, 428 patients developed head and neck cancer, representing 12.8% of the population. Of these patients, 97% developed keratinocytic cancers of the head and neck. The frequency of post-transplant head and neck cancer was related to the duration of immunosuppression, with 14% of patients developing cancer at 10 years and 20% having developed at least one cancer by 15 years. Twelve (3%) patients developed non-cutaneous head and neck malignancy. Ten (0.3%) patients died due to head and neck keratinocytic malignancy post-transplant. Competing risks analysis demonstrated that organ transplantation conferred a strong independent risk of death compared with non-transplant patients with head and neck keratinocyte cancer. This applied specifically to kidney (HR 4.4, 95% CI 2.5-7.8) and heart transplants (HR 6.5, 95% CI 2.1-19.9), and overall across the four transplant categories (P < 0.001). The SIR of developing keratinocyte cancer varied based on primary tumor site, gender, and type of transplant organ. CONCLUSION: Transplant patients demonstrate a particularly high rate of head and neck keratinocyte cancer with a very high rate of associated mortality. Physicians should be cognizant of the increased rate of malignancy in this population and monitor for red flag signs/symptoms.
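A standardised incidence ratio (SIR) compares observed cases in a cohort against the number expected if general-population rates applied to the cohort's person-years. A minimal sketch with invented strata and counts; Byar's approximation is one common choice for the Poisson confidence interval (the authors' exact method is not stated).

```python
import math

# (person_years, general-population rate per person-year) per stratum;
# all numbers invented for illustration.
strata = [(12000, 0.0010), (8000, 0.0025), (5000, 0.0050)]
observed = 70

expected = sum(py * rate for py, rate in strata)        # 12 + 20 + 25 = 57
sir = observed / expected

# Byar's approximation for a Poisson 95% CI on the observed count:
o = observed
lo = o * (1 - 1/(9*o) - 1.96/(3*math.sqrt(o)))**3 / expected
hi = (o+1) * (1 - 1/(9*(o+1)) + 1.96/(3*math.sqrt(o+1)))**3 / expected
print(f"SIR = {sir:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```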
Subjects
Head and Neck Neoplasms; Organ Transplantation; Humans; Cohort Studies; Retrospective Studies; Prospective Studies; Ireland/epidemiology; Head and Neck Neoplasms/epidemiology; Head and Neck Neoplasms/etiology; Organ Transplantation/adverse effects; Incidence; Risk Factors
ABSTRACT
BACKGROUND: A fundamental tenet of treating developmental dysplasia of the hip is to identify patients with dislocated hips early, so as to avoid the long-term sequelae of late diagnosis. The aim of this study was to develop a readily usable triage tool for patients with suspected hip dislocation, based on the clinical history and examination findings of the referring practitioner. METHODS: All primary care referrals (n=934) over a 3-year period for suspected developmental dysplasia of the hip to a tertiary pediatric center were evaluated. Defined parameters with respect to history and clinical examination were evaluated. Multivariable logistic regression was used to establish predictors of hip dislocation, and from this a predictive model was derived that incorporated significant predictors of dislocation. An illustrative nomogram translated this predictive model into a usable numerical scoring system, the Children's Hip Prediction score, which estimates the probability of hip dislocation. RESULTS: There were 97 dislocated hips in 85 patients. The final predictive model included age, sex, family history, breech presentation, gait concerns, decreased abduction, leg length discrepancy, and medical/neurological syndrome. The area under the receiver operating characteristic curve for the model was 0.761. A Children's Hip Prediction score of ≥5 corresponds to a sensitivity of 76.3%, and a score of ≥15 has a specificity of 97.8%, corresponding to an odds ratio of 27.3 for increased risk of dislocation. CONCLUSION: We found that a novel clinical prediction score, based on readily available history and examination parameters, strongly predicted the risk of dislocation in hip dysplasia referrals. It is hoped that this tool can be used to optimize resource allocation and may be of particular benefit in less well-resourced health care systems. LEVEL OF EVIDENCE: Level II.
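The pathway from a multivariable logistic model to an integer triage score can be illustrated with simulated data. The sketch below is hypothetical throughout: predictors, weights, point scaling, and the cut-off are placeholders, not the published Children's Hip Prediction score.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 934
X = pd.DataFrame({
    "breech": rng.integers(0, 2, n),
    "family_history": rng.integers(0, 2, n),
    "gait_concern": rng.integers(0, 2, n),
    "decreased_abduction": rng.integers(0, 2, n),
})
# Simulated "truth": dislocation risk driven by a few predictors.
logit = -4 + 1.0*X["breech"] + 0.7*X["family_history"] + 2.2*X["decreased_abduction"]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).to_numpy().astype(int)

model = LogisticRegression().fit(X, y)
print("AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))

# Nomogram step: rescale coefficients to convenient integer points.
points = np.rint(2 * model.coef_[0]).astype(int)
score = X.to_numpy() @ points

threshold = 4                                 # illustrative cut-off only
pred = score >= threshold
sens = (pred & (y == 1)).sum() / (y == 1).sum()
spec = (~pred & (y == 0)).sum() / (y == 0).sum()
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```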
Subjects
Developmental Dysplasia of the Hip; Hip Dislocation, Congenital; Hip Dislocation; Joint Dislocations; Child; Hip Dislocation/diagnosis; Hip Dislocation, Congenital/diagnosis; Hip Dislocation, Congenital/therapy; Humans; Referral and Consultation; Retrospective Studies; Risk Factors; Triage
ABSTRACT
INTRODUCTION: Fragility hip fractures are common and costly. Secondary fracture prevention is a treatment goal following hip fracture; however, the number of patients who proceed to fracture their contralateral hip in Ireland is unknown. There are plans to introduce a Fracture Liaison Service Database in Ireland, which will aim to prevent secondary fractures. To establish a baseline figure for secondary hip fractures, the injury radiographs of 1284 patients from 6 teaching hospitals over a 1-year period were reviewed. METHODS: Irish Hip Fracture Datasheets and corresponding injury radiographs were reviewed locally for all hip fractures within each respective teaching hospital for a 1-year period (2019). RESULTS: A total of 8.7% of all fragility hip fractures across the 6 hospitals were secondary hip fractures (range 4.9-11.5%). Of these, 46% occurred within years 1 to 3 following the index hip fracture. Forty-eight percent of patients were started on bone protection medications following their second hip fracture. DISCUSSION/CONCLUSION: Approximately 1 in 11 hip fractures treated across the 6 teaching hospitals assessed in 2019 was a patient's second hip fracture. We advocate for the widespread availability of Fracture Liaison Services to patients throughout Ireland to assist secondary fracture prevention.
Subjects
Hip Fractures; Osteoporotic Fractures; Hip Fractures/diagnostic imaging; Hip Fractures/epidemiology; Hospitals, Teaching; Humans; Ireland/epidemiology; Osteoporotic Fractures/therapy; Secondary Prevention
ABSTRACT
BACKGROUND AND PURPOSE: Currently, the Irish Hip Fracture Standards (IHFS) recommend a time-to-surgery (TTS) of within 48 h of admission. The aim of our research was to determine whether there was a statistically significant relationship between TTS and 30-day or one-year mortality, and to assess whether a 48-h window for surgery is still the most appropriate recommendation. METHODS USED: This was a single-hospital retrospective review of all fragility hip fractures between 1st January 2013 and 31st December 2017. Patient demographics were described using descriptive statistics. Dependent variables of interest were 30-day mortality and one-year mortality. Independent predictor variables analysed included age, ASA grade, fracture type, surgery performed, anaesthesia administered, length of stay, TTS (hours, as an interval variable), TTS within 36 h (binary variable), and TTS within 48 h (binary variable). Once significant predictor variables were identified, a multivariate regression analysis was performed to determine which predictors remained significantly associated with the outcome variables after controlling for all other known confounders. RESULTS: In total, 806 patients were identified. TTS within 36 h was predictive of significantly lower 30-day mortality compared with surgery after 36 h (p = 0.031). In contrast, TTS within 48 h did not demonstrate significantly lower 30-day mortality compared with surgery after 48 h (p = 0.104). On multivariate regression analysis, TTS <36 h (p = 0.011) and age (p < 0.0001) were both independently predictive of 30-day mortality. On multivariate regression analysis, both age (p < 0.0001) and TTS <36 h (p = 0.002) were significantly predictive of one-year mortality. CONCLUSION: Performing hip fracture surgery within 36 h confers a significant reduction in both 30-day and one-year mortality rates compared with surgery outside this time frame. A 36-h window also appears superior to a 48-h window, because performing surgery within 48 h had no significant impact on 30-day mortality. We recommend that national guidelines reflect these important findings.
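A hedged sketch of the kind of multivariate analysis described: logistic regression of 30-day mortality on age, ASA grade, and a binary TTS < 36 h indicator. The data are simulated and the variable names are assumptions; statsmodels stands in for whatever software the authors used.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 806
df = pd.DataFrame({
    "age": rng.normal(80, 8, n).round(),
    "asa": rng.integers(1, 5, n),
    "tts_hours": rng.gamma(2.0, 18.0, n),     # right-skewed times to surgery
})
df["tts_lt_36"] = (df["tts_hours"] < 36).astype(int)
# Simulated mortality with a protective effect of early surgery:
risk = -9 + 0.08*df["age"] + 0.3*df["asa"] - 0.5*df["tts_lt_36"]
df["death_30d"] = (rng.random(n) < 1/(1 + np.exp(-risk))).astype(int)

fit = smf.logit("death_30d ~ age + asa + tts_lt_36", data=df).fit(disp=False)
print(np.exp(fit.params))                     # odds ratios
print(fit.pvalues)
```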
Subjects
Hip Fractures; Hip Fractures/surgery; Hospital Mortality; Hospitalization; Humans; Retrospective Studies; Risk Factors
ABSTRACT
BACKGROUND: The role of kidney volume measurement in predicting donor and recipient kidney function is not clear. METHODS: We measured kidney volume bilaterally in living kidney donors using CT angiography and assessed its association with the donor's remaining kidney function and the recipient's (donated) kidney function at 1 year after kidney transplantation. Donor kidney volume was categorized into tertiles (lowest, middle, and highest volume). RESULTS: There were 166 living donor and recipient pairs. The mean donor age was 44.8 years (SD ± 10.8), and mean donor BMI was 25.5 (SD ± 2.9). The recipients of living donor kidneys were 64% male and had a mean age of 43.5 years (SD ± 13.3). Six percent of patients experienced an episode of cellular rejection. Recipients had been maintained on dialysis for a mean of 18 months (13-32) prior to transplant. Median (range) kidney volume in tertiles 1, 2, and 3 was 124 ml (89-135), 155 ml (136-164), and 184 ml (165-240), with corresponding donor eGFR (adjusted for body surface area, expressed as ml/min/1.73 m2) at the time of donation of 109 (93-129), 110 (92-132), and 101 (84-117) ml/min. The median (IQR) eGFR in tertiles 1 to 3 in kidney recipients at 1 year after donation was 54 (44-67), 62 (50-75), and 63 (58-79) ml/min, respectively. The median (IQR) eGFR in tertiles 1 to 3 in the remaining kidney of donors at 1 year after donation was 59 (53-66), 65 (57-72), and 65 (56-73) ml/min, respectively. CONCLUSION: Larger kidney volume was associated with better eGFR at 1 year after transplant in the recipient, and marginally so in the donor's remaining kidney.
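The tertile analysis reduces to a quantile cut followed by per-group summaries. A minimal sketch with simulated values and assumed column names:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "donor_kidney_volume_ml": rng.normal(155, 25, 166),
    "recipient_egfr_1y": rng.normal(60, 12, 166),
})
df["volume_tertile"] = pd.qcut(df["donor_kidney_volume_ml"], 3,
                               labels=["T1", "T2", "T3"])
summary = (df.groupby("volume_tertile", observed=True)["recipient_egfr_1y"]
             .describe()[["25%", "50%", "75%"]])
print(summary)        # median (IQR) eGFR per donor-volume tertile
```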
ABSTRACT
BACKGROUND: Non-traditional cardiovascular risk factors, including calcium and phosphate derangement, may play a role in mortality after renal transplant. The data regarding this effect are conflicting. Our aim was to assess the impact of calcium and phosphate derangements in the first 90 days post-transplant on allograft and recipient outcomes. METHODS: We performed a retrospective cohort review of all adult, first renal transplants in the Republic of Ireland between 1999 and 2015. We divided patients into tertiles based on serum phosphate and calcium levels post-transplant and assessed their effect on death-censored graft survival and all-cause mortality. Statistical analysis was performed in Stata, using survival analysis and spline curves to assess the associations. RESULTS: We included 1525 renal transplant recipients. Of the total, 86.3% had hypophosphataemia and 36.1% had hypercalcaemia. Patients in the lowest phosphate tertile were younger, more likely to be female, had lower weight, spent more time on dialysis, received a kidney from a younger donor, had less delayed graft function, and had better transplant function compared with the other tertiles. Patients in the highest calcium tertile were younger, more likely to be male, had higher body mass index, spent more time on dialysis, and had better transplant function. Adjusting for differences between groups, we were unable to show any difference in death-censored graft failure [phosphate = 1.14, 95% confidence interval (CI) 0.92-1.41; calcium = 0.98, 95% CI 0.80-1.20] or all-cause mortality (phosphate = 1.10, 95% CI 0.91-1.32; calcium = 0.96, 95% CI 0.81-1.13) based on tertiles of calcium or phosphate in the initial 90 days. CONCLUSIONS: Hypophosphataemia and hypercalcaemia are common occurrences post-kidney transplant. We have identified different risk factors for these metabolic derangements. Calcium and phosphate levels exhibit no independent association with death-censored graft failure or mortality.
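The authors report using Stata; purely as an illustration, the sketch below reproduces the shape of the analysis in Python with lifelines: phosphate tertiles entered into a Cox model of death-censored graft failure, on simulated data with assumed column names.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 1525
df = pd.DataFrame({
    "phosphate": rng.normal(0.7, 0.2, n),   # early post-transplant, mmol/L
    "age": rng.normal(45, 13, n),
    "time_years": rng.exponential(9, n),
    "graft_failed": rng.integers(0, 2, n),  # death-censored failure event
})
df["phos_tertile"] = pd.qcut(df["phosphate"], 3, labels=False)  # 0/1/2

cph = CoxPHFitter()
cph.fit(df[["phos_tertile", "age", "time_years", "graft_failed"]],
        duration_col="time_years", event_col="graft_failed")
cph.print_summary()    # hazard ratio per tertile step, adjusted for age
```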
ABSTRACT
BACKGROUND: The survival of incident dialysis patients with end-stage kidney disease has been reported to have improved in the modern era in some European and American countries. In Ireland, however, this has not been well documented. AIM: To investigate the survival outcomes of incident end-stage kidney failure dialysis patients in a tertiary center over a 24-year period, 1993-2017. METHODS: A retrospective analysis was carried out utilizing the Beaumont Hospital Renal Database. Consecutive adults commencing dialysis were analyzed. Kaplan-Meier methods and estimated mean survival times were used to evaluate survival over successive 4-year periods. RESULTS: In total, 2106 patients were included, of whom 830 underwent subsequent renal transplantation during follow-up. Over the study period, mean patient age increased from 56.3 ± 17.4 years in 1993-1996 to 60.6 ± 18.3 years in 2014-2017. There was an overall decrement in mortality over successive time intervals, mirrored by improvements in median survival after commencement of dialysis from 6.14 years during 1993-1996 to 8.01 years during 2009-2012. Patient survival steadily improved, with 5-year survival rising by almost 15% over time. This positive signal persisted and became more pronounced after adjusting the Kaplan-Meier curves for age, with 5-year survival estimates exceeding 80% in 2014-2017. CONCLUSION: Survival rates among incident dialysis patients improved progressively between 1993 and 2017 at Beaumont Hospital in Dublin, Ireland. The factors underlying this improvement are not entirely clear but are likely multifactorial.
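A minimal sketch of the era-stratified Kaplan-Meier analysis, with simulated data and assumed field names: incident patients are binned into successive 4-year cohorts and a survival curve is fitted per cohort.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(4)
n = 2106
df = pd.DataFrame({
    "start_year": rng.integers(1993, 2018, n),
    "years_on_rrt": rng.exponential(7, n),
    "died": rng.integers(0, 2, n),
})
df["era"] = pd.cut(df["start_year"], bins=list(range(1993, 2022, 4)),
                   right=False)              # 1993-96, 1997-2000, ...

kmf = KaplanMeierFitter()
for era, grp in df.groupby("era", observed=True):
    kmf.fit(grp["years_on_rrt"], grp["died"], label=str(era))
    print(era, "median survival:", round(kmf.median_survival_time_, 2))
```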
Subjects
Kidney Failure, Chronic; Kidney Transplantation; Humans; Ireland/epidemiology; Kaplan-Meier Estimate; Kidney Failure, Chronic/therapy; Renal Dialysis; Retrospective Studies; Risk Factors; Survival Rate
ABSTRACT
BACKGROUND: Internationally, the number of computed tomographic pulmonary angiographies (CTPAs) being performed to rule out pulmonary embolism (PE) has caused some concern. AIM: This study was performed to assess whether application of the Pulmonary Embolism Rule-out Criteria (PERC) in an Irish Emergency Department (ED) would have helped to safely reduce the number of D-dimer assays and CTPAs ordered. METHODS: The PERC was retrospectively calculated for all patients who underwent CTPA for possible PE. It was then established whether application of the PERC as per the American College of Physicians (ACP) guidelines would have safely ruled out the need for further imaging. RESULTS: Of the 529 patients who underwent CTPA in the study, 63 (12%) had PE on CTPA. Had the PERC been applied, no patient with a PE would have been missed. In this study, the PERC had 100% sensitivity and 14% specificity. DISCUSSION/CONCLUSION: Application of the PERC, as per the ACP guidelines, would have reduced the number of CTPAs performed by 32 (6%) without missing any patient with a proven pulmonary embolus.
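The PERC is commonly stated as eight bedside criteria (Kline et al.): in a low-pretest-probability patient, PE is ruled out without further testing only if all eight are met. A sketch of that rule as a function; the field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    age: int
    heart_rate: int
    sao2_percent: float
    hemoptysis: bool
    estrogen_use: bool
    prior_vte: bool
    unilateral_leg_swelling: bool
    recent_surgery_or_trauma: bool   # requiring hospitalisation, <4 weeks

def perc_negative(p: Patient) -> bool:
    """True only if all eight PERC criteria are satisfied."""
    return (p.age < 50
            and p.heart_rate < 100
            and p.sao2_percent >= 95
            and not p.hemoptysis
            and not p.estrogen_use
            and not p.prior_vte
            and not p.unilateral_leg_swelling
            and not p.recent_surgery_or_trauma)

# PERC-negative example: no D-dimer or CTPA needed per the rule.
print(perc_negative(Patient(34, 88, 98, False, False, False, False, False)))
```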
Subjects
Physicians; Pulmonary Embolism; Angiography; Decision Support Techniques; Emergency Service, Hospital; Hospitals; Humans; Pulmonary Embolism/diagnostic imaging; Retrospective Studies; Tomography, X-Ray Computed; United States
ABSTRACT
BACKGROUND: This study aims to evaluate allograft and patient outcomes among recipients of kidney transplants performed after non-renal solid organ transplants, and to compare these outcomes with those of repeat kidney transplant recipients. METHODS: We analyzed kidney transplant recipients who underwent kidney transplantation after a non-renal solid organ transplant. Survival data were stratified into 2 groups: Group A (n = 37) consisted of recipients of a kidney transplant after a prior non-renal solid organ transplant, and Group B (n = 330) consisted of recipients of a repeat kidney transplant. RESULTS: The 1-, 5-, and 10-year death-censored graft survival rates for recipients of a kidney transplant after a non-renal solid organ transplant (Group A) were 97.3%, 91.5%, and 86.9%, compared with 97.9%, 90.2%, and 83.4% for recipients of a repeat kidney transplant (Group B) (p = .32). The 1-, 5-, and 10-year patient survival rates were 97.3%, 82.7%, and 79.1% in Group A, compared with 97.9%, 90.2%, and 83.4% in Group B. Unadjusted overall patient survival was significantly lower in Group A (p = .017). CONCLUSION: Kidney transplant recipients who have undergone a previous non-renal solid organ transplant have similar allograft survival outcomes, but higher long-term mortality, compared with repeat kidney transplant recipients.
Subjects
Kidney Transplantation; Organ Transplantation; Graft Survival; Humans; Retrospective Studies; Transplantation, Homologous
ABSTRACT
BACKGROUND: Measurement of late-night salivary cortisol (LNSF) is useful in the identification of cyclical Cushing's syndrome (CS); the usefulness of its metabolite cortisone (late-night salivary cortisone, LNSE) is less well described. AIM: The aim of this study was to determine the utility of measuring LNSE in patients with confirmed CS compared with other diagnostic tests, and to analyse serial LNSF measurements for evidence of variable hormonogenesis. METHODS: This was a retrospective observational study including patients with confirmed CS in whom LNSF and LNSE were measured. RESULTS: Twenty-three patients with confirmed CS were included, 21 with Cushing's disease. LNSF had a sensitivity of 92% per sample, LNSE 87%, and combined LNSF/LNSE 94%. Four patients had cyclical hormonogenesis when the definition of one trough and two peaks was applied to LNSF measurements, and a fifth patient fell just outside the criteria. Six patients had evidence of variable hormonogenesis, defined as a doubling of LNSF concentration on serial measurements. The sensitivity of 24-h urinary free cortisol (UFC) was 89% per collection. Sixteen patients had simultaneous measurements of LNSF and UFC; in three patients, these provided discordant results. CONCLUSION: LNSF appears more sensitive than LNSE and UFC in the diagnosis of CS, and combining LNSF and LNSE results leads to superior sensitivity. Half of our cohort had evidence of cyclical or variable hormonogenesis. Fluctuations in LNSF did not always correlate with changes in UFC concentration, emphasising the importance of performing more than one screening test, particularly if pretest clinical suspicion is high.
Subjects
Circadian Rhythm/physiology; Cortisone/metabolism; Cushing Syndrome/diagnosis; Hydrocortisone/metabolism; Saliva/chemistry; Adolescent; Adult; Aged; Child; Female; Humans; Male; Middle Aged; Retrospective Studies; Young Adult
ABSTRACT
CONTEXT: Fluid restriction (FR) is the recommended first-line treatment for the syndrome of inappropriate antidiuresis (SIAD), despite the lack of prospective data to support its efficacy. DESIGN: A prospective nonblinded randomized controlled trial of FR versus no treatment in chronic SIAD. INTERVENTIONS AND OUTCOME: A total of 46 patients with chronic asymptomatic SIAD were randomized to either FR (1 liter/day) or no specific hyponatremia treatment (NoTx) for 1 month. The primary endpoints were change in plasma sodium concentration (pNa) at days 4 and 30. RESULTS: Median baseline pNa was similar in the 2 groups [127 mmol/L (interquartile range [IQR] 126-129) FR and 128 mmol/L (IQR 126-129) NoTx, P = 0.36]. Plasma sodium rose by 3 mmol/L (IQR 2-4) after 3 days of FR, compared with 1 mmol/L (IQR 0-3) with NoTx, P = 0.005. There was minimal additional rise in pNa by day 30; median pNa increased from baseline by 4 mmol/L (IQR 2-6) with FR, compared with 1 mmol/L (IQR 0-1) with NoTx, P = 0.04. After 3 days, 17% of the FR group had a rise in pNa of ≥5 mmol/L, compared with 4% of the NoTx group, RR 4.0 (95% CI 0.66-25.69), P = 0.35. After 3 days, 61% of the FR group had corrected pNa to ≥130 mmol/L, compared with 39% of the NoTx group, RR 1.56 (95% CI 0.87-2.94), P = 0.24. CONCLUSION: FR induces a modest early rise in pNa in patients with chronic SIAD, with minimal additional rise thereafter, and is well tolerated. More than one-third of patients fail to reach a pNa ≥130 mmol/L after 3 days of FR, emphasizing the clinical need for additional therapies for SIAD in some patients.
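The RR of 4.0 for a pNa rise ≥5 mmol/L corresponds to roughly 4 versus 1 events if the 46 patients split evenly (23 per arm, an assumption). A worked sketch using the common log-method (Katz) confidence interval, which will not exactly reproduce the paper's interval (their CI method is not stated):

```python
import math

def risk_ratio(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    """Risk ratio with a log-method (Katz) confidence interval."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    return rr, math.exp(math.log(rr) - z*se), math.exp(math.log(rr) + z*se)

# 4/23 vs 1/23 events reproduces RR = 4.0; the CI differs from the paper's.
print(risk_ratio(4, 23, 1, 23))
```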
Subjects
Fluid Therapy/methods; Inappropriate ADH Syndrome/therapy; Water Deprivation; Aged; Aged, 80 and over; Body Fluids/metabolism; Chronic Disease; Female; Humans; Hyponatremia/etiology; Hyponatremia/therapy; Inappropriate ADH Syndrome/complications; Male; Middle Aged; Prospective Studies; Water Deprivation/physiology
ABSTRACT
Background: Solid organ transplantation is associated with an increased risk of non-melanoma skin cancer (NMSC). Studies with short follow-up times have suggested a reduced occurrence of these cancers in recipients treated with mammalian target of rapamycin (mTOR) inhibitors as maintenance immunosuppression. We aimed to describe the occurrence of skin cancers in renal and liver transplant recipients switched from calcineurin inhibitor to sirolimus-based regimens. Methods: We performed a retrospective study of sirolimus conversion within the Irish national kidney and liver transplant programs. These data were linked with the National Cancer Registry Ireland to determine the incidence of NMSC among these recipients. The incidence rate ratio (IRR) for post- versus pre-conversion NMSC rates is referred to in this study as an effect size, reported with a [95% confidence interval]. Results: Of 4,536 kidney transplants and 574 liver transplants functioning on 1 January 1994 or transplanted between 1 January 1994 and 1 January 2015, 85 kidney and 88 liver transplant recipients were transitioned to sirolimus-based immunosuppression. In renal transplants, the rate of NMSC was 131 per 1000 patient-years pre-switch to sirolimus and 68 per 1000 patient-years post-switch, with an adjusted effect size of 0.48 [0.31 - 0.74] (p = .001) following the switch. For liver transplant recipients, the rate of NMSC was 64 per 1000 patient-years pre-switch and 30 per 1000 patient-years post-switch, with an adjusted effect size of 0.49 [0.22 - 1.09] (p = .081). Kidney transplant recipients were followed for a median of 3.4 years, and liver transplant recipients for a median of 6.6 years. Conclusions: In this study, conversion of maintenance immunosuppression from calcineurin inhibitors to mTOR inhibitors for clinical indications did appear to reduce the incidence of NMSC in kidney and liver transplant recipients.
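An incidence rate ratio of this kind can be estimated with Poisson regression using a person-years offset. A hedged sketch with invented counts chosen to echo the reported renal rates (131 vs 68 per 1000 patient-years); the authors' adjusted model would include covariates not shown here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "period": ["pre", "post"],
    "events": [55, 25],                 # illustrative NMSC counts
    "person_years": [420.0, 368.0],     # illustrative follow-up
})
df["post"] = (df["period"] == "post").astype(int)

# Poisson GLM with log(person-years) offset: exp(coef) is the IRR.
fit = smf.glm("events ~ post", data=df, family=sm.families.Poisson(),
              offset=np.log(df["person_years"])).fit()
print(np.exp(fit.params["post"]))       # unadjusted IRR, post vs pre
```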
Subjects
Kidney Transplantation/adverse effects; Liver Transplantation/adverse effects; Postoperative Complications/prevention & control; Sirolimus/therapeutic use; Skin Neoplasms/prevention & control; TOR Serine-Threonine Kinases/antagonists & inhibitors; Adolescent; Adult; Aged; Aged, 80 and over; Calcineurin Inhibitors/therapeutic use; Child; Drug Substitution; Female; Humans; Immunosuppressive Agents/therapeutic use; Incidence; Ireland/epidemiology; Male; Middle Aged; Postoperative Complications/epidemiology; Retrospective Studies; Skin Neoplasms/epidemiology; Skin Neoplasms/etiology; Young Adult
ABSTRACT
BACKGROUND: Transplantation is a well-known risk factor for malignancy. However, outcomes of cancer in transplant recipients compared with non-transplant patients are less well studied. We aimed to study survival in kidney transplant recipients who develop cancer and compare this with cancer outcomes in the general population. METHODS: We linked data from the National Cancer Registry Ireland with the National Kidney Transplant Database. The period of observation was from 1 January 1994 until 31 December 2014. Transplant recipients were considered at risk from the time of cancer diagnosis. We administratively censored data at 10 years post-cancer diagnosis. Survival was compared with that of all patients in the general population who had a recorded diagnosis of cancer. RESULTS: There were 907 renal transplant recipients and 426,679 individuals in the general population diagnosed with cancer between 1 January 1994 and 31 December 2014. In those with non-melanoma skin cancer, the hazard ratio (HR) for 10-year all-cause mortality [HR = 3.06, 95% confidence interval (CI) 2.66-3.52] and cancer-specific mortality (HR = 3.91, 95% CI 2.57-5.96) was significantly higher among transplant recipients than in the general population. Patients who developed non-Hodgkin lymphoma (HR = 2.89, 95% CI 1.96-4.25) or prostate cancer (HR = 4.32, 95% CI 2.39-7.82) had increased all-cause but not cancer-specific mortality. Colorectal, lung, breast and renal cell cancer did not show an increased risk of death in transplant recipients. CONCLUSION: Cancer-attributable mortality is higher in kidney transplant recipients with non-melanoma skin cancer compared with non-transplant patients. American Joint Committee on Cancer staging should reflect the increased hazard of death in these immunosuppressed patients.
Subjects
Kidney Transplantation/adverse effects; Neoplasms/epidemiology; Registries/statistics & numerical data; Transplant Recipients/statistics & numerical data; Adult; Aged; Female; Humans; Ireland/epidemiology; Male; Middle Aged; Neoplasms/etiology; Retrospective Studies; Risk Factors; Time Factors
ABSTRACT
INTRODUCTION: Few studies have investigated significant perioperative predictors of long-term renal allograft survival after second kidney transplant (SKT). We compared long-term survival following SKT with that following primary kidney transplant and determined predictors of renal allograft failure after SKT. METHODS: Outcomes of all primary or second kidney transplant recipients at a national kidney transplant center between 1993 and 2017 were reviewed. The primary outcome measurements were renal allograft survival for both first and second kidney transplants. Secondary outcome measurements were the incidence of delayed graft function (DGF), the incidence of acute rejection (AR), and predictors of renal allograft survival in SKT recipients. RESULTS: In total, 392 SKTs and 2748 primary kidney transplants were performed between 1993 and 2017. The 1-, 5-, and 10-year death-censored graft survival for deceased-donor recipients was 95.3%, 88.7%, and 78.2% for primary kidney transplants and 94.9%, 87.1%, and 74.9% for SKTs (P = .0288). Primary renal allograft survival of <6 years (HR 0.6, P = .017), AR episodes (HR 1.6, P = .031), DGF (HR 2.0, P = .005), and HLA-DR mismatch (HR 1.7, P = .018) were independent predictors of long-term renal allograft failure after SKT. CONCLUSION: These findings may provide important information on long-term survival outcomes after SKT and help identify patients at risk of long-term renal allograft failure after SKT.
Subjects
Kidney Transplantation; Allografts; Graft Rejection/etiology; Graft Survival; Humans; Kidney; Retrospective Studies; Risk Factors
ABSTRACT
INTRODUCTION: Autosomal dominant tubulointerstitial kidney disease (ADTKD) is a rare genetic cause of chronic kidney disease (CKD) and end-stage renal disease (ESRD). We aimed to compare renal transplant outcomes in people with ESRD due to ADTKD with those in people with other causes of renal failure. METHODS: Patients with clinical characteristics consistent with ADTKD by the criteria outlined in the 2015 KDIGO consensus were included. We compared ADTKD transplant outcomes with those of 4633 non-ADTKD renal transplant recipients. RESULTS: We included 31 patients who met the diagnostic criteria for ADTKD in this analysis, 23 of whom had an identified mutation (28 were categorized as definite ADTKD and 3 as suspected ADTKD). Five patients received a second transplant during follow-up; in total, 36 grafts were included. We did not identify significant differences between groups in graft or patient survival after transplantation. Twenty-five transplant biopsies were performed during follow-up, and none showed signs of recurrent ADTKD post-transplant. CONCLUSION: In patients with ESRD due to ADTKD, we demonstrate that transplant outcomes are comparable with those of the general transplant population. There is no evidence that ADTKD can recur after transplantation.
Subjects
Kidney Failure, Chronic; Kidney Transplantation; Polycystic Kidney, Autosomal Dominant; Graft Survival; Humans; Kidney Failure, Chronic/surgery; Mutation; Uromodulin/genetics
ABSTRACT
BACKGROUND: New-onset diabetes after transplant (NODAT) confers a risk of diabetes-related complications as well as a threat to graft function and overall patient survival. The reported incidence of NODAT varies from 14% to 37% in renal transplant recipients worldwide; however, NODAT has yet to be studied in the Irish renal transplant population. AIMS: The primary aims of this project were to estimate the incidence, determine associated risk factors, and assess the long-term consequences of NODAT on graft and patient survival in the Irish renal transplant population. METHODS: Retrospective data were collected for 415 renal transplant recipients over a 12-year period to record the presence of NODAT, patient characteristics, and perioperative management. Preoperative screening was reviewed in a subgroup of patients to determine concordance with the International Consensus Guidelines. Statistical analysis was performed using Kaplan-Meier survival functions to estimate NODAT detection over time as well as graft and patient survival. Risk factor associations were determined using Cox proportional-hazards models. RESULTS: NODAT incidence was 10.2% in the first 5 years post-transplant. Risk factors for developing NODAT were recipient age and body weight. The risk of NODAT was highest in the first year post-transplant and conferred decreased patient survival; however, it did not significantly affect graft survival. Only 7 of a subgroup of 21 patients who developed NODAT had undergone preoperative testing for diabetes. CONCLUSIONS: NODAT incidence in the Irish renal transplant population is slightly below international figures. This project has highlighted current deficits in the national transplant guidelines for the detection of NODAT and NODAT-related risk factors.
Subjects
Diabetes Mellitus/diet therapy; Diabetes Mellitus/etiology; Kidney Transplantation/adverse effects; Adult; Diabetes Mellitus/diagnosis; Female; Humans; Kidney Transplantation/mortality; Male; Middle Aged; Retrospective Studies; Risk Factors
ABSTRACT
OBJECTIVES: Delayed graft function after kidney transplant can affect patient and graft survival, resulting in prolonged hospital stay and the need for dialysis. Ischemia times during organ procurement and reanastomosis at transplant are key factors in delayed graft function. MATERIALS AND METHODS: We analyzed all living- and deceased-donor renal transplants in Ireland over a 33-month period, with the effect of warm ischemia time during anastomosis on delayed graft function as the primary outcome. We performed statistical regression analyses to account for confounding variables. Patients had identical surgical technique and immunosuppression protocols. RESULTS: Of 481 transplants during the study period, 20 patients were excluded because of paired-kidney exchange, nephron dosing transplant, or simultaneous pancreas-kidney transplant. In the donor pool, 70% were donors after brainstem death, 3.6% were donors after cardiac death, and 26% were living donors. All living donors were direct altruistic donors and underwent stringent assessment via the ethics committee and multidisciplinary team meeting. Of the living donors, 8% were not related; these were true altruistic donors who were acquaintances of the recipients and volunteered themselves for assessment. They were assessed in accordance with the Declaration of Istanbul and received no compensation of any kind for donation. In total, 18% of patients had delayed graft function, defined as the need for dialysis within 7 days of transplant. Warm ischemia time during anastomosis significantly affected the risk of delayed graft function but not graft survival or function at 3 months, and did not correlate with hospital stay duration. Time on dialysis and recipient weight significantly correlated with the risk of delayed graft function. CONCLUSIONS: Our findings support a role for minimizing warm ischemia time during anastomosis to reduce delayed graft function and the need for dialysis in the perioperative period. However, a longer anastomosis time does not appear to affect creatinine levels, and therefore graft function, at 3 months.
Subjects
Delayed Graft Function/etiology; Kidney Failure, Chronic/surgery; Kidney Transplantation/adverse effects; Vascular Surgical Procedures/adverse effects; Warm Ischemia/adverse effects; Adult; Anastomosis, Surgical; Body Weight; Databases, Factual; Delayed Graft Function/diagnosis; Female; Humans; Ireland; Kidney Failure, Chronic/diagnosis; Living Donors; Male; Middle Aged; Renal Dialysis/adverse effects; Risk Assessment; Risk Factors; Time Factors; Treatment Outcome
ABSTRACT
BACKGROUND: The Kidney Donor Risk Index (KDRI)/Kidney Donor Profile Index (KDPI) is relied upon for donor organ allocation in the USA, based on its association with graft failure in time-to-event models. However, the KDRI/KDPI has not been extensively evaluated outside of the USA in terms of predictive metrics for graft failure and allograft estimated glomerular filtration rate (eGFR). METHODS: We performed a retrospective analysis of outcomes in the Irish National Kidney Transplant Service Registry for the years 2006-13. Associations of the KDRI/KDPI score with eGFR at various time points over follow-up and with ultimate graft failure were modelled. RESULTS: A total of 772 patients had complete data for KDRI/KDPI calculation, and 148 of these allografts failed over follow-up. The median (25th-75th centile) KDRI/KDPI was 51 (26-75). On repeated-measures analysis with linear mixed-effects models, the KDRI/KDPI (as a fixed-effect covariate) was associated with eGFR over 5 years: coefficient -0.25 (standard error 0.02; P < 0.001). The variability in eGFR mathematically accounted for by the KDRI/KDPI score was only 21%. The KDRI/KDPI score did not add significantly to graft failure prediction above donor age alone (categorized as > or <50 years of age) when assessed by the categorical net reclassification index. CONCLUSIONS: In this cohort, while the KDRI/KDPI was predictive of eGFR over follow-up, it did not provide additional discrimination above donor age alone for graft failure prediction. It is therefore unlikely to help inform decisions regarding kidney organ allocation in Ireland.
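A hedged sketch of the repeated-measures model described: eGFR regressed on the KDRI/KDPI score as a fixed effect with a random intercept per transplant. The data are simulated so that the fixed-effect slope lands near the reported -0.25; statsmodels is an assumed stand-in for the authors' software.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_grafts, n_visits = 200, 5
kdpi = rng.integers(1, 100, n_grafts)

rows = []
for i in range(n_grafts):
    # Graft-specific baseline (random intercept) plus a KDPI effect:
    base = 75 - 0.25 * kdpi[i] + rng.normal(0, 8)
    for year in range(1, n_visits + 1):
        rows.append({"graft_id": i, "kdpi": kdpi[i],
                     "egfr": base - 1.0 * year + rng.normal(0, 5)})
df = pd.DataFrame(rows)

fit = smf.mixedlm("egfr ~ kdpi", data=df, groups=df["graft_id"]).fit()
print(fit.params["kdpi"])       # fixed-effect slope, cf. -0.25 in the text
```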