ABSTRACT
PURPOSE: Incisional hernias (IH) after kidney transplantation (KTx) can cause significant morbidity in kidney transplant recipients (KTR). We aimed to report the outcomes of surgical repair of IH in KTR at our centre. METHODS: We retrospectively analysed all IH repairs in KTR from May 2018 to May 2023. We documented pre-transplant baseline characteristics, peri- and post-KTx events and outcomes, and post-IH repair complications. We also documented length of stay, survival, and hernia recurrence after IH repair. RESULTS: We performed 35 incisional hernia repairs in 34 KTR from May 2018 to May 2023, an overall incidence of symptomatic IH of 1.63%. Mean patient age was 56.7 ± 10.1 years and mean body mass index (BMI) was 29.7 ± 6.49 kg/m². A history of previous hernia operation and of open abdominal operations was present in 11.4% and 22.9% of the population, respectively. The types of repair performed were primary (5.7%), onlay (62.9%), inlay (2.9%) and retromuscular sublay (28.6%). Mean hernia neck size was 8.9 ± 5.6 cm. After IH repair there was no perioperative mortality, with an average length of stay of 5.5 ± 3.9 days. There were seven episodes (20%) of IH recurrence. The superficial wound dehiscence rate was 6% and the surgical site infection rate was 3%. Pearson's correlation testing showed that post-operative hernia recurrence was not related to neck size, post-transplant complications, pre- or post-transplant characteristics, or post-transplant outcome. CONCLUSIONS: The recurrence rate in our cohort was 20%. Known risk factors for IH in KTR, as well as post-KTx events, were not correlated with hernia recurrence or other post-hernia repair complications.
ABSTRACT
Live donor kidney transplantation (LDKT) is the optimal treatment modality for end-stage renal disease (ESRD), enhancing patient and graft survival. Pre-emptive LDKT, performed before renal replacement therapy (RRT) is required, provides further advantages by avoiding uraemia and dialysis. There are a number of potential barriers to, and opportunities for, promoting pre-emptive LDKT. Significant infrastructure is needed to deliver robust programmes, and this varies with socio-economic conditions. National frameworks can influence the national prioritisation of pre-emptive LDKT and its supporting education programmes. A focus on other programme components, including deceased donor kidney transplantation and RRT, can also hamper uptake. LDKT programmes are designed to provide maximal benefit to the recipient, which is especially true for pre-emptive transplantation. Health care providers need to be educated to maximise early LDKT referral. Equitable access for different population groups, without socio-economic bias, also requires prioritisation. Cultural barriers, including religious influences, need to be considered in order to achieve successful outcomes. In addition, the benefit of pre-emptive LDKT needs to be emphasised, and opportunities provided to potential donors, to ensure timely and safe work-up processes. Recipient education and preparation for pre-emptive LDKT are needed to ensure increased uptake. Awareness of the benefits of pre-emptive transplantation requires prioritisation for this population group. We recommend an approach in which patients approaching ESRD are referred early to pre-transplant clinics, facilitating early discussion of pre-emptive LDKT, and in which potential living donors are prioritised for work-up to ensure success. Education regarding pre-emptive LDKT should be the norm for patients approaching ESRD, tailored to each patient's cultural needs and physical status. Pre-emptive transplantation maximises benefit to potential recipients and can be achieved within successful service delivery. To fully embrace pre-emptive transplantation as the norm, investment in infrastructure, increased awareness, and donor and recipient support are required.
ABSTRACT
BACKGROUND: Despite technical refinements, early pancreas graft loss due to thrombosis continues to occur. Conventional coagulation tests (CCT) do not detect hypercoagulability, and hence the hypercoagulable state due to diabetes is left untreated. Thromboelastogram (TEG) is an in vitro diagnostic test used in liver transplantation and in various intensive care settings to guide anticoagulation. TEG is better than CCT because it is dynamic and provides a global hemostatic profile, including fibrinolysis. AIM: To compare the outcomes of TEG-directed and CCT-directed (prothrombin time, activated partial thromboplastin time and international normalized ratio) anticoagulation in simultaneous pancreas and kidney (SPK) transplant recipients. METHODS: A single-center retrospective analysis comparing the outcomes of TEG-directed and CCT-directed anticoagulation in SPK recipients matched for donor age and graft type (donors after brainstem death and donors after circulatory death). Anticoagulation consisted of intravenous (IV) heparin titrated up to a maximum of 500 IU/h, either based on CCT in conjunction with clinical parameters or directed by TEG results. Graft loss due to thrombosis, anticoagulation-related bleeding, the radiological incidence of partial thrombi in the pancreas graft, the thrombus resolution rate after anticoagulation dose escalation, length of hospital stay, and 1-year pancreas and kidney graft survival were compared between the two groups. RESULTS: Seventeen patients who received TEG-directed anticoagulation were compared against 51 contemporaneous SPK recipients (ratio 1:3) anticoagulated based on CCT. No graft losses occurred in the TEG group, whereas 11 grafts (7 pancreases and 4 kidneys) were lost due to thrombosis in the CCT group (P = 0.06, Fisher's exact test). The overall incidence of anticoagulation-related bleeding (hematoma/gastrointestinal bleeding/hematuria/nose bleeding/re-exploration for bleeding/post-operative blood transfusion) was 17.65% in the TEG group and 45.10% in the CCT group (P = 0.05, Fisher's exact test). The incidence of radiologically confirmed partial thrombus in the pancreas allograft was 41.18% in the TEG group and 25.50% in the CCT group (P = 0.23, Fisher's exact test). All recipients with partial thrombi detected on computed tomography (CT) scan had an anticoagulation dose escalation. The thrombus resolution rates on subsequent scans were 85.71% in the TEG group vs 63.64% in the CCT group (P = 0.59, Fisher's exact test). The TEG group had reduced blood product usage [10 units of packed red blood cells (PRBC) and 2 units of fresh frozen plasma (FFP)] compared with the CCT group (71 PRBC, 10 FFP, 2 cryoprecipitate and 2 platelet units). The proportion of patients requiring transfusion was 17.65% in the TEG group vs 39.25% in the CCT group (P = 0.14, Fisher's exact test). The median length of hospital stay was 18 days in the TEG group vs 31 days in the CCT group (P = 0.03, Mann-Whitney test). The 1-year pancreas graft survival was 100% in the TEG group vs 82.35% in the CCT group (P = 0.07, log-rank test), and the 1-year kidney graft survival was 100% in the TEG group vs 92.15% in the CCT group (P = 0.23, log-rank test). CONCLUSION: TEG is a promising tool for guiding the judicious use of anticoagulation, preventing graft loss due to thrombosis, and reducing the length of hospital stay.
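For readers who want to reproduce the kind of group comparison reported above, the sketch below runs a Fisher's exact test on a 2x2 contingency table built from the reported group sizes (17 TEG vs 51 CCT recipients) and thrombotic graft losses (0 vs 11). It assumes each lost graft occurred in a distinct CCT recipient, which the abstract does not state, so the computed value may not exactly match the published P = 0.06.

```python
# Illustrative only: Fisher's exact test on the graft-loss comparison.
# Assumes per-recipient counting of graft loss (0/17 TEG vs 11/51 CCT),
# which is not stated in the abstract; the published value was P = 0.06.
from scipy.stats import fisher_exact

#        graft loss, no graft loss
table = [
    [0, 17],   # TEG-directed anticoagulation (n = 17)
    [11, 40],  # CCT-directed anticoagulation (n = 51)
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.3f}, two-sided P = {p_value:.3f}")
```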
ABSTRACT
Although health care is encouraged to follow an evidence-based approach, there are perceived instances where suboptimal practice persists despite better options, owing to an inherent resistance to change within many health care systems. To continue striving for clinical excellence, it is important to identify deficient practices and make appropriate corrections by implementing new and improved techniques and treatments. Bringing about change, however, tends to be a long, arduous process consisting of several small and successive deviations from the norm, analogous to "turning the oil tanker". Analyzing the methods employed by successful health care innovators has allowed the development of a "three-pronged" approach to overcoming resistance to change: 1) a determined opinion leader with a network of like-minded opinion leaders; 2) the presentation of hard evidence, with adequate praise for current practice, and the generation of clearly worded, specific guidelines; and 3) the use of simple reminders and continuous analysis of outcomes. Employing this three-pronged approach could lead to faster and more successful implementation of change within the health care system.
ABSTRACT
BACKGROUND AND OBJECTIVES: Acute rejection is a significant complication detrimental to kidney transplant function. The currently accepted means of diagnosis is percutaneous renal biopsy, a costly and invasive procedure. There is an urgent need to identify and validate non-invasive biomarkers capable of replacing the biopsy. DESIGN, SETTING, PARTICIPANTS AND MEASUREMENTS: Comprehensive literature searches of the Medline, EMBASE and Cochrane Central Register of Controlled Trials databases were performed. Eligible studies were included as per the inclusion criteria and assessed for quality using the GRADE quality-of-evidence tool. Outcomes evaluated included biomarker diagnostic performance, number of patients/samples, mean age and gender ratio, and immunosuppression regimen, in addition to the clinical applications of the biomarker(s) tested. PRISMA guidelines were followed. Where possible, statistical analysis of comparative performance data was performed. RESULTS: Twenty-three studies were included in this review: 19 adult, 3 paediatric and 1 mixed-population study. A total of 2858 participants and 50 candidate non-invasive tests were identified. Sensitivity, specificity and area under the curve values ranged from 36% to 100%, 30% to 100% and 0.55 to 0.98, respectively. CONCLUSIONS: Although larger, more robust multi-centre validation studies are needed before non-invasive biomarkers can replace the biopsy, numerous candidate tests have shown significant promise for various facets of postoperative management. Suggested uses include: ruling out acute rejection in low-risk patients to avoid the need for biopsy; non-invasive testing where biopsy is contraindicated and a prompt diagnosis is needed; and integration into a serial blood-monitoring protocol in conjunction with serum creatinine.
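For reference, the performance figures quoted above follow the standard definitions (a general recap, not specific to any of the included studies):

\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{Specificity} = \frac{TN}{TN + FP},
\]

where TP, FN, TN and FP are true positives, false negatives, true negatives and false positives judged against the biopsy reference standard. The area under the receiver operating characteristic curve (AUC) equals the probability that a randomly chosen rejection case receives a higher biomarker score than a randomly chosen non-rejection case, so 0.5 indicates a non-informative test and values approaching 1.0 indicate near-perfect discrimination.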
ABSTRACT
BACKGROUND: The T-cell composition within the lymph node (LN) of end-stage renal disease (ESRD) patients differs from that within the circulation. Activation of the alloreactive T-cell response within secondary lymphoid organs is important after organ transplantation. However, to date no data are available on LN T-cell subsets and the risk of acute rejection after kidney transplantation. METHODS: T cells from LNs of ESRD patients were analyzed for the frequency of recent thymic emigrants, relative telomere length and expression of differentiation markers, and these measures were related to the development of early acute rejection (EAR), occurring within 3 months after renal transplantation (RT). Furthermore, the alloreactive potential of mononuclear cells isolated from the LN and peripheral blood of 10 patients was analyzed. Measures of alloreactive potential included proliferation, cytokine production, frequencies of interferon-gamma-producing cells, and the presence of cytotoxic molecules. RESULTS: Patients with EAR were younger (p = 0.019), more often cytomegalovirus-seropositive (p = 0.037), and more often received dialysis prior to RT (p = 0.030). In addition, patients with EAR showed a lower CD4:CD8 ratio (p = 0.027) within the LN. T cells from the LN were similar in alloreactive capacity to those in the circulation. Univariate regression analysis showed that the CD4:CD8 ratio (OR: 0.67, p = 0.039), patient age (OR: 0.93, p = 0.024) and preemptive RT (OR: 0.11, p = 0.046) were associated with EAR. In multivariate analysis, only the CD4:CD8 ratio (OR: 0.58, p = 0.019) and preemptive RT (OR: 0.05, p = 0.012) remained associated with EAR. CONCLUSION: A lower CD4:CD8 ratio in the LN is associated with a higher risk of rejection within 3 months after RT.
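To read the reported odds ratios: assuming a standard logistic regression model (the modelling details are not given in the abstract), the multivariate OR of 0.58 for the CD4:CD8 ratio corresponds to

\[
\log\frac{P(\text{EAR})}{1-P(\text{EAR})} = \beta_0 + \beta_1\,(\text{CD4:CD8}) + \beta_2\,(\text{preemptive RT}), \qquad
e^{\beta_1} = 0.58 \;\Rightarrow\; \beta_1 = \ln 0.58 \approx -0.54,
\]

i.e. each one-unit increase in the LN CD4:CD8 ratio multiplies the odds of EAR by 0.58 (roughly a 42% reduction), holding preemptive RT status fixed.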
ABSTRACT
Ischemia/reperfusion injury (IRI), an inherent component of transplantation, affects organ quality and transplant outcomes. Although the complexity of the pathophysiology is recognized, detailed mechanisms remain unclear, and strategies to prevent the consequences of IRI have proven challenging. The link between IRI, the initiation of innate immune responses, and the (potential) augmentation of adaptive immunity appears to be of critical significance. An improved understanding of these complex mechanisms and interactions may pave the way for more effective treatment strategies.