ABSTRACT
Introduction: We assess if ultrasound surveillance of newly-created arteriovenous fistulas (AVFs) can predict nonmaturation sufficiently reliably to justify randomized controlled trial (RCT) evaluation of ultrasound-directed salvage intervention. Methods: Consenting adults underwent blinded fortnightly ultrasound scanning of their AVF after creation, with scan characteristics that predicted AVF nonmaturation identified by logistic regression modeling. Results: Of 333 AVFs created, 65.8% matured by 10 weeks. Serial scanning revealed that maturation occurred rapidly, whereas consistently lower fistula flow rates and venous diameters were observed in those that did not mature. Wrist and elbow AVF nonmaturation could be optimally modeled from week 4 ultrasound parameters alone, but with only moderate positive predictive values (PPVs) (wrist, 60.6% [95% confidence interval, CI: 43.9-77.3]; elbow, 66.7% [48.9-84.4]). Moreover, 40 (70.2%) of the 57 AVFs that thrombosed by week 10 had already failed by the week 4 scan, thus limiting the potential of salvage procedures initiated by that scan's findings to alter overall maturation rates. Modeling of the early ultrasound characteristics could also predict primary patency failure at 6 months; however, that model performed poorly at predicting assisted primary failure (those AVFs that failed despite a salvage attempt), partly because patency of at-risk AVFs was maintained by successful salvage performed without recourse to the early scan data. Conclusion: Early ultrasound surveillance may predict fistula maturation, but is likely, at best, to result in only very modest improvements in fistula patency. Power calculations suggest that an impractically large number of participants (>1700) would be required for formal RCT evaluation.
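The positive predictive values quoted above can be illustrated with a simple normal-approximation (Wald) confidence interval for a proportion. The counts below are hypothetical, chosen only because 20 of 33 reproduces the published wrist PPV of 60.6%; they are not the study's actual data:

```python
from math import sqrt

def wald_ci(p_hat, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    half_width = z * sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - half_width, p_hat + half_width

# Hypothetical counts: 20 true non-maturations among 33 wrist AVFs
# flagged by the week 4 model, chosen to match the reported PPV of 60.6%.
true_positives, flagged = 20, 33
ppv = true_positives / flagged
lo, hi = wald_ci(ppv, flagged)   # close to the published 43.9%-77.3%
```

With these assumed counts the interval comes out near the reported 43.9% to 77.3%, which suggests roughly how many flagged fistulas sat behind the published estimate.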
ABSTRACT
Background: Arteriovenous fistulas are considered the best option for haemodialysis provision, but as many as 30% fail to mature or suffer early failure. Objective: To assess the feasibility of performing a randomised controlled trial that examines whether, by informing early and effective salvage intervention of fistulas that would otherwise fail, Doppler ultrasound surveillance of developing arteriovenous fistulas improves longer-term arteriovenous fistula patency. Design: A prospective multicentre observational cohort study (the 'SONAR' study). Setting: Seventeen haemodialysis centres in the UK. Participants: Consenting adults with end-stage renal disease who were scheduled to have an arteriovenous fistula created. Intervention: Participants underwent Doppler ultrasound surveillance of their arteriovenous fistulas at 2, 4, 6 and 10 weeks after creation, with clinical teams blinded to the ultrasound surveillance findings. Main outcome measures: Fistula maturation at week 10 defined according to ultrasound surveillance parameters of representative venous diameter and blood flow (wrist arteriovenous fistulas: ≥ 4 mm and > 400 ml/minute; elbow arteriovenous fistulas: ≥ 5 mm and > 500 ml/minute). Mixed multivariable logistic regression modelling of the early ultrasound scan data was used to predict arteriovenous fistula non-maturation by 10 weeks and fistula failure at 6 months. Results: A total of 333 arteriovenous fistulas were created during the study window (47.7% wrist, 52.3% elbow). By 2 weeks, 37 (11.1%) arteriovenous fistulas had failed (thrombosed), but by 10 weeks, 219 of 333 (65.8%) of created arteriovenous fistulas had reached maturity (60.4% wrist, 67.2% elbow). Persistently lower flow rates and venous diameters were observed in those fistulas that did not mature.
Models for arteriovenous fistula non-maturation could be optimally constructed using the week 4 scan data, with fistula venous diameter and flow rate the most significant variables in explaining wrist fistula maturity failure (positive predictive value 60.6%, 95% confidence interval 43.9% to 77.3%), whereas resistance index and flow rate were most significant for elbow arteriovenous fistulas (positive predictive value 66.7%, 95% confidence interval 48.9% to 84.4%). In contrast to non-maturation, both models predicted fistula maturation much more reliably [negative predictive values of 95.4% (95% confidence interval 91.0% to 99.8%) and 95.6% (95% confidence interval 91.8% to 99.4%) for wrist and elbow, respectively]. Additional follow-up and modelling on a subset (n = 192) of the original SONAR cohort (the SONAR-12M study) revealed that the rates of primary, assisted primary and secondary arteriovenous fistula patency at 6 months were 76.5%, 80.7% and 83.3%, respectively. Fistula vein size, flow rate and resistance index could identify primary patency failure at 6 months, with similar predictive power as for 10-week arteriovenous fistula maturity failure, but with wide confidence intervals for wrist (positive predictive value 72.7%, 95% confidence interval 46.4% to 99.0%) and elbow (positive predictive value 57.1%, 95% confidence interval 20.5% to 93.8%). These models, moreover, performed poorly at identifying assisted primary and secondary patency failure, likely because a subset of those arteriovenous fistulas identified on ultrasound surveillance as at risk underwent subsequent successful salvage intervention without recourse to early ultrasound data. Conclusions: Although early ultrasound can predict fistula maturation and longer-term patency very effectively, it was only moderately good at identifying those fistulas likely to remain immature or to fail within 6 months.
Allied to the better-than-expected fistula patency rates achieved (which are further improved by successful salvage), we estimate that a randomised controlled trial comparing early ultrasound-guided intervention against standard care would require at least 1300 fistulas and would achieve only minimal patient benefit. Trial Registration: This trial is registered as ISRCTN36033877 and ISRCTN17399438. Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: NIHR135572) and is published in full in Health Technology Assessment; Vol. 28, No. 24. See the NIHR Funding and Awards website for further award information.
For people with advanced kidney disease, haemodialysis is best provided by an 'arteriovenous fistula', which is created surgically by joining a vein onto an artery at the wrist or elbow. However, these take about 2 months to develop fully ('mature'), and as many as 3 out of 10 fail to do so. We asked whether we could use early ultrasound scanning of the fistula to identify those that are unlikely to mature. This would allow us to decide whether it would be practical to run a large, randomised trial to find out if using early ultrasound allows us to 'rescue' fistulas that would otherwise fail. We invited adults to undergo serial ultrasound scanning of their fistula in the first few weeks after it was created. We then analysed whether we could use the data from the early scans to identify those fistulas that were not going to mature by week 10. Of the 333 fistulas that were created, about two-thirds reached maturity by week 10. We found that an ultrasound scan 4 weeks after fistula creation could reliably identify those fistulas that were going to mature. However, of those fistulas predicted to fail, about one-third did eventually mature without further intervention, and even without knowing what the early scans showed, another third were successfully rescued by surgery or X-ray-guided treatment at a later stage. Performing an early ultrasound scan on a fistula can provide reassurance that it will mature and deliver trouble-free dialysis. However, because scans are poor at identifying fistulas that are unlikely to mature, we would not recommend their use to justify early surgery or X-ray-guided treatment in the expectation that this will improve outcomes.
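The sample-size claims above (at least 1300 fistulas, >1700 participants) can be sketched with the standard normal-approximation formula for comparing two proportions. The 80% vs. 85% patency rates below are assumptions chosen purely for illustration, not figures taken from the study's own power calculation:

```python
from math import ceil

def n_per_arm(p1, p2):
    """Participants per arm to detect p1 vs. p2 with two-sided alpha = 0.05
    and 80% power (normal approximation for two proportions)."""
    z_alpha, z_beta = 1.96, 0.8416   # N(0,1) quantiles: alpha = 0.05, power = 0.80
    var_sum = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * var_sum / (p1 - p2) ** 2)

# Assumed rates: 80% patency under standard care vs. 85% with
# ultrasound-directed salvage (illustrative values only).
per_arm = n_per_arm(0.80, 0.85)
total = 2 * per_arm   # comfortably above the >1700 quoted in the abstract
```

Under these assumptions a 5-percentage-point improvement needs roughly 900 fistulas per arm, which illustrates why the authors judged a formal RCT impractically large.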
Subject(s)
Arteriovenous Shunt, Surgical; Kidney Failure, Chronic; Renal Dialysis; Ultrasonography, Doppler; Vascular Patency; Humans; Female; Male; Middle Aged; Arteriovenous Shunt, Surgical/adverse effects; Prospective Studies; Kidney Failure, Chronic/therapy; Aged; United Kingdom; Adult
ABSTRACT
BACKGROUND & AIMS: The National Liver Offering Scheme (NLOS) was introduced in the UK in 2018 to offer livers from deceased donors to patients on the national waiting list based, for most patients, on calculated transplant benefit. Before NLOS, livers were offered to transplant centres by geographic donor zones and, within centres, by estimated recipient need for a transplant. METHODS: UK Transplant Registry data on patient registrations and transplants were analysed to build statistical models for survival on the list (M1) and survival post-transplantation (M2). A separate cohort of registrations - not seen by the models before - was analysed to simulate what liver allocation would have been under M1, M2 and a transplant benefit score (TBS) model (combining both M1 and M2), and to compare these allocations to what had been recorded in the UK Transplant Registry. The number of deaths on the waiting list and patient life years were used to compare the different simulation scenarios and to select the optimal allocation model. Registry data were monitored, pre- and post-NLOS, to understand the performance of the scheme. RESULTS: The TBS was identified as the optimal model to offer donation after brain death (DBD) livers to adult and large paediatric elective recipients. In the first 2 years of NLOS, 68% of DBD livers were offered using the TBS to this type of recipient. Monitoring data indicate that mortality on the waiting list post-NLOS significantly decreased compared with pre-NLOS (p <0.0001), and that patient survival post-listing was significantly greater post- compared to pre-NLOS (p = 0.005). CONCLUSIONS: In the first two years of NLOS offering, waiting list mortality fell while post-transplant survival was not negatively impacted, delivering on the scheme's objectives. 
IMPACT AND IMPLICATIONS: The National Liver Offering Scheme (NLOS) was introduced in the UK in 2018 to increase transparency of the deceased donor liver offering process, maximise the overall survival of the waiting list population, and improve equity of access to liver transplantation. To our knowledge, it is the first scheme that offers organs based on statistical prediction of transplant benefit: the transplant benefit score. The results are important to the transplant community - from healthcare practitioners to patients - and demonstrate that, in the first two years of NLOS offering, waiting list mortality fell while post-transplant survival was not negatively impacted, thus delivering on the scheme's objectives. The scheme continues to be monitored to ensure that the transplant benefit score remains up-to-date and that signals that suggest the possible disadvantage of some patients are investigated.
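The core idea of a transplant benefit score, combining a waiting-list survival model (M1) and a post-transplant survival model (M2), can be sketched crudely as a difference in expected life years. The survival curves below are made up for a single hypothetical candidate; the real scheme derives them from registry-fitted statistical models over a 5-year horizon:

```python
def transplant_benefit(surv_post_tx, surv_on_list):
    """Crude transplant benefit: difference in expected life years,
    approximated by summing annual survival probabilities (area under
    each survival curve, one value per year)."""
    return sum(surv_post_tx) - sum(surv_on_list)

# Made-up 5-year survival curves for a single candidate:
m2_post_transplant = [0.92, 0.88, 0.85, 0.82, 0.78]   # M2-style prediction
m1_waiting_list    = [0.70, 0.45, 0.30, 0.20, 0.12]   # M1-style prediction
tbs = transplant_benefit(m2_post_transplant, m1_waiting_list)
```

Offering each liver to the candidate with the highest such score is what distinguishes benefit-based allocation from need-based (M1-only) or utility-based (M2-only) allocation.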
Subject(s)
Liver Transplantation; Registries; Tissue Donors; Tissue and Organ Procurement; Waiting Lists; Humans; Liver Transplantation/methods; Liver Transplantation/statistics & numerical data; United Kingdom; Tissue and Organ Procurement/methods; Tissue and Organ Procurement/statistics & numerical data; Registries/statistics & numerical data; Tissue Donors/statistics & numerical data; Tissue Donors/supply & distribution; Adult; Male; Female; Middle Aged; Child; Adolescent
ABSTRACT
Prognostic model building is a process that begins much earlier than data analysis and ends later than when a model is reached. It requires careful delineation of the clinical question, methodical planning of the approach and attentive exploration of the data before any attempt at model building. Once these initial steps are complete, the researcher may postulate a model to describe the process of interest and build it. Once built, the model must be checked and validated, and this exercise may take the researcher back a few steps - for instance, to adapt the model to fit a variable that displays a 'curved' pattern - before returning to check and validate the model again. To interpret and report the results it is vital to relate the output to the original question, to be transparent about the methodology followed and to understand the limitations of the data and the approach.
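The 'curved' pattern mentioned above is commonly handled by adding a quadratic (or spline) term to the model's linear predictor. A minimal logistic-model sketch, with coefficients invented purely for illustration:

```python
from math import exp

def predicted_risk(x, b0=-2.0, b1=0.8, b2=-0.1):
    """Logistic model whose linear predictor includes a quadratic term,
    so predicted risk rises and then falls as x increases (a 'curved'
    relationship). Coefficients are invented for illustration."""
    linear_predictor = b0 + b1 * x + b2 * x * x
    return 1.0 / (1.0 + exp(-linear_predictor))

# Risk peaks at x = -b1 / (2 * b2) = 4 and is symmetric around that point.
risk_low, risk_peak, risk_high = predicted_risk(0), predicted_risk(4), predicted_risk(8)
```

A purely linear term would force risk to be monotone in x; the quadratic term is the simplest adaptation when model checking reveals curvature.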
Subject(s)
Models, Statistical; Prognosis; Humans
ABSTRACT
INTRODUCTION: Therapeutic plasma exchange (TPE) is used for several chronic conditions, with little evidence on the efficacy and safety of different choices of replacement fluid. Measurement of haemostasis, particularly in vitro thrombin generation, could play a role in determining the immediate efficacy of different fluid replacements. AIM: To determine the impact of different TPE replacement fluid regimens on haemostatic assays. METHODS: Prospective observational multi-centre cohort study in adult patients 18 years and older evaluating haemostatic changes between four different TPE regimens: (1) 5% human albumin solution (Alb) only, (2) 50:50 mix of 5% Alb + modified gelatin, (3) 70:30 mix of 5% Alb and normal saline (NS), and (4) solvent-detergent, virus-inactivated fresh frozen plasma (FFP) (either alone or combined with other fluids). Twenty-one haemostasis variables (procoagulant, anticoagulant and fibrinolytic factors) were analysed pre- and post-TPE session, including in vitro thrombin generation. Linear mixed modelling and canonical discriminant analyses were used to examine the effect of TPE fluid type on haemostatic variables. RESULTS: A total of 31 patients with up to 5 TPE sessions each (131 sessions in total) were enrolled. Of the 21 markers analysed using linear mixed modelling, the main effects of fluid type were significant for 19 markers (P < 0.05), excluding plasminogen activator inhibitor-1 antigen and thrombin-anti-thrombin. Multivariate analysis of variance showed significant differences between the fluid types (Wilks' lambda = 0.07; F(63, 245.61) = 5.50; P < 0.0001), and this was supported by a canonical discriminant analysis, which identified the 4 most discriminating markers for fluid type as thrombin generation (lag time, time to peak), fibrinogen and Factor V. In our analyses, the effect of FFP on haemostasis was significantly greater compared with the other fluid types.
Of the non-FFP fluids, 5% Alb + NS had a smaller effect on haemostasis than the other fluid types (Alb alone and modified gelatin + 5% Alb). CONCLUSION: Thrombin generation and fibrinogen best discriminated the effect of different TPE fluids on haemostasis and should be considered as potential markers to evaluate the immediate haemostatic effect of TPE procedures. The use of NS as a TPE replacement fluid had a distinctive impact on thrombin generation and fibrinogen responses compared to other non-FFP fluids.
Subject(s)
Hemostatics; Plasma Exchange; Adult; Humans; Plasma Exchange/methods; Gelatin; Cohort Studies; Hemostasis/physiology; Fibrinogen; Thrombin
ABSTRACT
BACKGROUND AND OBJECTIVES: Irradiation of red cell components is indicated for recipients at risk of transfusion-associated graft vs. host disease. Current technologies comprise either a gamma (γ) or an x radiation source. The benefits of x vs. γ include non-radioactivity and hence no decay of the source. We aimed to compare the effect of the two technologies on red cell component storage quality post-irradiation. MATERIALS AND METHODS: Paired units of red cell concentrates (RCC), neonatal red cell splits (RCS), red cells for intra-uterine transfusion (IUT) or neonatal exchange transfusion (ExTx) were either γ- or x-irradiated. Units were sampled and tested for five storage parameters until the end of shelf life. Equivalence analysis of storage quality parameters was performed for pairs of the same components (RCC, RCS, IUT or ExTx) that were either γ- or x-irradiated. RESULTS: Nearly all component comparisons studied showed equivalence between γ and x irradiation for haemolysis, ATP, 2,3-DPG, potassium release and lactate production. The exceptions deemed non-equivalent were higher haemolysis with x irradiation for ExTx, lower 2,3-DPG with x irradiation for RCS irradiated early, and higher ATP with x irradiation for IUT. However, these differences were considered not clinically significant. CONCLUSION: This study has demonstrated that a range of red cell components for use in different age groups are of acceptable quality following x irradiation, with only small, clinically insignificant differences in a few of the measured parameters.
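The equivalence analysis described can be sketched with the confidence-interval formulation of two one-sided tests (TOST): the two irradiation methods are declared equivalent for a parameter when the 90% confidence interval for the mean difference lies wholly inside a pre-specified margin. The difference, standard error and margin below are hypothetical, not values from the study:

```python
def equivalent(mean_diff, se, margin, z=1.645):
    """Confidence-interval version of two one-sided tests (TOST):
    declare equivalence when the 90% CI for the mean difference lies
    entirely inside (-margin, +margin). Normal approximation."""
    lo, hi = mean_diff - z * se, mean_diff + z * se
    return -margin < lo and hi < margin

# Hypothetical haemolysis example: x-minus-gamma mean difference of 0.02%
# with standard error 0.015%, against a pre-specified margin of 0.1%.
is_equivalent = equivalent(0.02, 0.015, 0.1)
```

Note the asymmetry with ordinary significance testing: a non-significant difference is not evidence of equivalence, which is why a margin must be fixed in advance.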
Subject(s)
Erythrocytes; Hemolysis; Blood Preservation; Blood Transfusion; Gamma Rays; Humans; Potassium
ABSTRACT
BACKGROUND: There is renewed interest in the use of whole blood (WB) for the resuscitation of trauma patients. Platelet function in stored WB compared to platelet concentrates is not well established and was assessed in vitro in this study. METHODS: Leucocyte-depleted cold-stored WB (CS-WB) was prepared using a Terumo WB-SP Imuflex kit and held at 2-6°C for 21 days alongside: (A) UK standard pooled platelets stored at 20-24°C (RT-PLTS), (B) pooled platelets stored at 2-6°C (CS-PLTS), and (C) platelet-rich plasma produced using the Terumo kit (CS-PRP). A series of in vitro assays was used to assess platelet function. RESULTS: Platelet count was maintained at 57 ± 14% of the starting number at day 21 in CS-WB. Over time, CS-WB platelets became more activated, with increased CD62P expression (day 1: 7 ± 3.7% vs. day 21: 59 ± 17.1%) and annexin V binding (day 1: 2 ± 0.2% vs. day 21: 21 ± 15.1%). For comparison, 18.6 ± 6% of platelets in RT-PLTS demonstrated CD62P expression at day 7, whereas annexin V binding in RT-PLTS at day 7 was 2.6 ± 0.5%. Over storage, the aggregatory response to agonists decreased in all arms. Functional platelet microparticles increased steadily in CS-WB throughout storage. CONCLUSION: During storage, platelet count reduced in CS-WB, whereas CD62P expression and annexin V binding increased. This was accompanied by a reduced aggregatory response, although, compared to 7-day-old RT-PLTS, CS-WB maintained a maximal response to agonists for longer, suggesting that a shelf life of up to 21 days can be considered for CS-WB.
Subject(s)
Blood Preservation; Platelet Function Tests; Annexin A5/metabolism; Blood Platelets/metabolism; Hemostasis; Humans
ABSTRACT
BACKGROUND: In the United Kingdom, liver transplantation (LT) is undertaken in 7 supraregional centers. Until March 2018, liver grafts were offered to a center and allocated to a patient on their elective waiting list (WL) based on unit prioritization. Patients in Newcastle, Leeds, and Edinburgh with a United Kingdom Model for End-Stage Liver Disease (UKELD) score ≥62 were registered on a common WL and prioritized for deceased-donor liver allocation. This was known as the Northern Liver Alliance (NLA) "top-band scheme." Organs were shared between the 3 centers, with a "payback" scheme ensuring no patient in any center was disadvantaged. We investigated whether the NLA had improved WL survival and waiting time (WT) to transplantation. METHODS: Data for this study were obtained from the UK Transplant Registry maintained by National Health Service Blood and Transplant. This study was based on adult patients registered for first elective liver transplant between April 2013 and December 2016. Non-NLA centers were controls. The Kaplan-Meier method was used to estimate WL survival and median WT to transplant, with the log-rank test used to make comparisons; a Bonferroni correction was applied post hoc to determine pairwise differences. RESULTS: WT was significantly lower at NLA centers compared with non-NLA centers for top-band patients (23 versus 99 days, P < 0.001). However, WL survival was not significantly different for top-band patients (P > 0.999) comparing NLA with non-NLA centers. WL survival for nontop-band patients was no different (P > 0.999) comparing NLA with non-NLA centers. CONCLUSIONS: The NLA achieved its aim, providing earlier transplantation to patients with the greatest need. Nontop-band patients did not experience inferior survival.
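The Kaplan-Meier method used above to estimate waiting-list survival and median waiting time can be sketched in a few lines of pure Python. The follow-up data below are invented for illustration; the study's actual registry data are not reproduced here:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator. times: follow-up (e.g. days on the waiting
    list); events: 1 = event occurred (e.g. transplant or death),
    0 = censored. Returns {time: estimated survival just after that time}."""
    survival, s = {}, 1.0
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        at_risk = sum(1 for ti in times if ti >= t)
        if d:
            s *= 1.0 - d / at_risk   # multiply in the conditional survival at t
        survival[t] = s
    return survival

# Invented follow-up for five patients (0 marks a censored observation).
surv = kaplan_meier([5, 10, 10, 20, 30], [1, 1, 0, 1, 0])
```

Censored patients contribute to the at-risk denominator until they leave follow-up, which is what distinguishes this estimator from a naive proportion.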
Subject(s)
End Stage Liver Disease/mortality; End Stage Liver Disease/surgery; Liver Transplantation; Patient Selection; Tissue and Organ Procurement/standards; Waiting Lists; Adult; Health Services Accessibility; Humans; Kaplan-Meier Estimate; Liver/surgery; Living Donors; Registries; Resource Allocation; Severity of Illness Index; Time-to-Treatment; Tissue and Organ Procurement/organization & administration; Transplants; Treatment Outcome; United Kingdom
ABSTRACT
OBJECTIVES: Increased morbidity and mortality have been associated with weekend and night-time clinical activity. We sought to compare the outcomes of liver transplantation (LT) between weekdays and weekends, and between night-time and day-time, to determine whether 'out-of-hours' LT has acceptable results compared with 'in-hours'. DESIGN, SETTING AND PARTICIPANTS: We conducted a retrospective analysis of patient outcomes for all 8816 adult, liver-only transplants (2000-2014) from the UK Transplant Registry. OUTCOME MEASURES: Outcome measures were graft failure (loss of the graft with or without death) and transplant failure (either graft failure or death with a functioning graft) at 30 days, 1 year and 3 years post-transplantation. The association of these outcomes with weekend versus weekday and day versus night transplantation was explored, following the construction of a risk-adjusted Cox regression model. RESULTS: Similar patient and donor characteristics were observed between weekend and weekday transplantation. Unadjusted graft failure estimates were 5.7% at 30 days, 10.4% at 1 year and 14.6% at 3 years; transplant failure estimates were 7.9%, 15.3% and 21.3%, respectively. A risk-adjusted Cox regression model demonstrated a significantly lower adjusted HR (95% CI) of transplant failure for weekend transplant of 0.77 (0.66 to 0.91) within 30 days, 0.86 (0.77 to 0.97) within 1 year and 0.89 (0.81 to 0.99) within 3 years, and for graft failure of 0.81 (0.67 to 0.97) within 30 days. For patients without transplant failure within 30 days, there was no weekend effect on transplant failure. Neither night-time procurement nor night-time transplantation was associated with an increased hazard of transplant or graft failure. CONCLUSIONS: Weekend and night-time LT outcomes were non-inferior to weekday or day-time transplantation, and we observed a possible small beneficial effect of weekend transplantation.
The structure of LT services in the UK delivers acceptable outcomes 'out-of-hours' and may offer wider lessons for weekend working structures.
Subject(s)
Liver Transplantation/mortality; Liver Transplantation/trends; Adult; Female; Graft Survival; Humans; Male; Middle Aged; Proportional Hazards Models; Registries; Retrospective Studies; Time Factors; Treatment Outcome; United Kingdom/epidemiology
Subject(s)
Animal Culling/legislation & jurisprudence; Mustelidae; Public Policy; Tuberculosis, Bovine/prevention & control; Animal Culling/methods; Animal Culling/statistics & numerical data; Animal Welfare/standards; Animals; Cattle; Humans; Licensure; United Kingdom; Veterinary Medicine/organization & administration
ABSTRACT
BACKGROUND & AIMS: Donation after circulatory death (DCD) in the UK has tripled in the last decade. However, outcomes following DCD liver transplantation are worse than for donation after brainstem death (DBD) liver transplants. This study examines whether a recipient should accept a "poorer quality" DCD organ or wait longer for a "better" DBD organ. METHODS: Data were collected on 5,825 patients who were registered on the elective waiting list for a first adult liver-only transplant and 3,949 patients who received a liver-only transplant in the UK between 1 January 2008 and 31 December 2015. Survival following deceased donor liver transplantation performed between 2008 and 2015 was compared by Cox regression modelling to assess the impact on patient survival of accepting a DCD liver compared to deferring for a potential DBD transplant. RESULTS: A total of 953 (23%) of the 3,949 liver transplantations performed utilised DCD donors. Five-year post-transplant survival was worse following DCD than DBD transplantation (69.1% [DCD] vs. 78.3% [DBD]; p <0.0001: adjusted hazard ratio [HR] 1.65; 95% CI 1.40-1.94). Of the 5,798 patients registered on the transplant list, 1,325 (23%) died or were removed from the list without receiving a transplant. Patients who received DCD livers had a lower risk-adjusted hazard of death than those who remained on the waiting list for a potential DBD organ (adjusted HR 0.55; 95% CI 0.47-0.65). The greatest survival benefit was in those with the most advanced liver disease (adjusted HR 0.19; 95% CI 0.07-0.50). CONCLUSIONS: Although DCD liver transplantation leads to worse transplant outcomes than DBD transplantation, the individual's survival is enhanced by accepting a DCD offer, particularly for patients with more severe liver disease. DCD liver transplantation improves overall survival for UK listed patients and should be encouraged. 
LAY SUMMARY: This study looks at patients who require a liver transplant to save their lives; this liver can be donated by a person who has died either after their heart has stopped (donation after cardiac death [DCD]) or after the brain has been injured and can no longer support life (donation after brainstem death [DBD]). We know that livers donated after brainstem death function better than those after cardiac death, but there are not enough of these livers for everyone, so we wished to help patients decide whether it was better for them to accept an early offer of a DCD liver than waiting longer to receive a "better" liver from a DBD donor. We found that patients were more likely to survive if they accepted the offer of a liver transplant as soon as possible (DCD or DBD), especially if their liver disease was very severe.
Subject(s)
Liver Transplantation/mortality; Tissue and Organ Procurement; Adult; Brain Death; Death; Female; Humans; Male; Middle Aged
ABSTRACT
Livers from controlled donation after circulatory death (DCD) donors suffer a higher incidence of nonfunction, poor function, and ischemic cholangiopathy. In situ normothermic regional perfusion (NRP) restores a blood supply to the abdominal organs after death using an extracorporeal circulation for a limited period before organ recovery. We undertook a retrospective analysis to evaluate whether NRP was associated with improved outcomes of livers from DCD donors. NRP was performed on 70 DCD donors, from whom 43 livers were transplanted. These were compared with 187 non-NRP DCD donor livers transplanted at the same two UK centers in the same period. The use of NRP was associated with a reduction in early allograft dysfunction (12% for NRP vs. 32% for non-NRP livers, P = .0076) and 30-day graft loss (2% NRP livers vs. 12% non-NRP livers, P = .0559), greater freedom from ischemic cholangiopathy (0% vs. 27% for non-NRP livers, P < .0001), and fewer anastomotic strictures (7% vs. 27% non-NRP, P = .0041). After adjusting for other factors in a multivariable analysis, NRP remained significantly associated with freedom from ischemic cholangiopathy (P < .0001). These data suggest that NRP during organ recovery from DCD donors leads to superior liver outcomes compared to conventional organ recovery.
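The unadjusted comparisons reported above can be approximated with a two-proportion z-test. The event counts below are reconstructed from the published percentages (roughly 5 of 43 NRP vs. 60 of 187 non-NRP livers with early allograft dysfunction) and are therefore approximate, not the study's exact tabulated counts:

```python
from math import sqrt, erfc

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test with pooled standard error.
    Returns (z statistic, two-sided p-value from the normal distribution)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))

# Early allograft dysfunction counts implied by the reported rates:
# approximately 5/43 NRP vs. 60/187 non-NRP.
z, p = two_proportion_z(5, 43, 60, 187)
```

With these reconstructed counts the p-value lands close to the published P = .0076; the small discrepancy is expected, since the original analysis may have used an exact or continuity-corrected test.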
Subject(s)
Liver Transplantation/methods; Organ Preservation/methods; Adolescent; Adult; Aged; Bile Duct Diseases/prevention & control; Bile Ducts/blood supply; Child; Death; Delayed Graft Function/prevention & control; Extracorporeal Circulation; Female; Graft Survival; Humans; Ischemia/prevention & control; Liver Transplantation/adverse effects; Male; Middle Aged; Organ Preservation/adverse effects; Perfusion/methods; Retrospective Studies; Temperature; Tissue and Organ Harvesting/adverse effects; Tissue and Organ Harvesting/methods; Tissue and Organ Procurement/methods; Young Adult
ABSTRACT
Phosphorus cycling exerts significant influence upon soil fertility and productivity - processes largely controlled by microbial activity. We adopted phenotypic and metagenomic approaches to investigate phosphatase genes within soils. Microbial communities in bare fallowed soil showed a marked capacity to utilise phytate for growth compared with arable or grassland soil communities. Bare fallowed soil contained lowest concentrations of orthophosphate. Analysis of metagenomes indicated phoA, phoD and phoX, and histidine acid and cysteine phytase genes were most abundant in grassland soil which contained the greatest amount of NaOH-EDTA extractable orthophosphate. Beta-propeller phytase genes were most abundant in bare fallowed soil. Phylogenetic analysis of metagenome sequences indicated the phenotypic shift observed in the capacity to mineralise phytate in bare fallow soil was accompanied by an increase in phoD, phoX and beta-propeller phytase genes coding for exoenzymes. However, there was a remarkable degree of genetic similarity across the soils despite the differences in land-use. Predicted extracellular ecotypes were distributed across a greater range of soil structure than predicted intracellular ecotypes, suggesting that microbial communities subject to the dual stresses of low nutrient availability and reduced access to organic material in bare fallowed soils rely upon the action of exoenzymes.