Results 1 - 20 of 40
1.
Transplant Direct; 10(6): e1630, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38769984

ABSTRACT

Background: Small stature and female sex correlate with decreased deceased donor liver transplant (DDLT) access and higher waitlist mortality. However, efforts are being made to improve access and equity of allocation under the new continuous distribution (CD) system. Liver anteroposterior diameter (APD) is used by many centers to determine size compatibility for DDLT but is not recorded systematically, so it cannot be used in allocation algorithms. We therefore sought to correlate body surface area (BSA) and height with APD in donors and recipients and to compare waitlist outcomes by these factors to support their use in the CD system. Methods: APD was measured from single-center DDLT recipients and donors with cross-sectional imaging. Linear, Pearson, and PhiK correlation coefficients were used to correlate BSA and height with APD. Competing risk analysis of waitlist outcomes was performed using United Network for Organ Sharing data. Results: For 143 pairs, donor BSA correlated better with APD than height did (PhiK = 0.63 versus 0.20). For recipients overall, neither BSA nor height was a good correlate of APD, except in recipients without ascites, where BSA correlated well (PhiK = 0.63) but height did not. However, among female recipients, BSA, but not height, correlated strongly with APD regardless of ascites status (PhiK = 0.80 without, PhiK = 0.70 with). Among male recipients, BSA correlated with APD only in those without ascites (PhiK = 0.74). In multivariable models, both BSA and height were predictive of waitlist outcomes, with higher values associated with increased access, decreased delisting for death/clinical deterioration, and decreased living donor transplant (model concordance 0.748 and 0.747, respectively). Conclusions: Taken together, BSA is a good surrogate for APD and can therefore be used in allocation decision making in the upcoming CD era to offset size- and gender-based disparities among certain candidate populations.
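The abstract above leans on body surface area as a size surrogate but does not say which BSA formula the center used. A minimal sketch, assuming the common Mosteller formula (other formulas such as Du Bois would give slightly different values), might look like:

```python
import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) via the Mosteller formula:
    sqrt(height_cm * weight_kg / 3600)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def dr_bsa_ratio(donor_bsa: float, recipient_bsa: float) -> float:
    """Donor-to-recipient BSA ratio, the size-matching quantity
    discussed in the abstract."""
    return donor_bsa / recipient_bsa
```

For a 170 cm, 70 kg individual this gives roughly 1.82 m^2; the choice of formula is an assumption here, not something the study specifies.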

2.
Transpl Infect Dis; 26(3): e14229, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38214192

ABSTRACT

The Comparison of Antiviral Preventative Strategies In Liver Transplant (CAPSIL) study showed pre-emptive therapy (PET) to be superior to antiviral prophylaxis for cytomegalovirus (CMV) disease prevention in high-risk CMV seronegative liver transplant recipients (LTRs) with seropositive donors (D+R-). Despite the statistical superiority of PET over prophylaxis in research settings, PET is perceived as a logistically more complex strategy that requires careful coordination of weekly CMV PCR testing, prompt initiation of CMV antivirals upon viremia detection, and timely cessation of antivirals following viremia resolution. Transplant centers may be hesitant to use PET for CMV disease prevention in D+R- LTRs out of concern that PET coordination is not feasible in clinical practice. We recently described our experience using PET in CMV D+R- LTRs in a real-world setting, and found it to be as effective for CMV disease prevention as PET performed as part of a clinical trial. Here, we describe a systematic approach for PET implementation in real-world settings and provide practical tools to address anticipated challenges. This framework can support transplant programs in overcoming logistical barriers to PET and incorporating an evidence-based and cost-effective CMV prevention strategy into routine care for high-risk CMV D+R- LTRs.


Subjects
Antiviral Agents , Cytomegalovirus Infections , Cytomegalovirus , Liver Transplantation , Tissue Donors , Humans , Cytomegalovirus Infections/prevention & control , Liver Transplantation/adverse effects , Antiviral Agents/therapeutic use , Antiviral Agents/administration & dosage , Cytomegalovirus/drug effects , Cytomegalovirus/isolation & purification , Transplant Recipients , Viremia/prevention & control
3.
Transpl Immunol; 81: 101943, 2023 12.
Article in English | MEDLINE | ID: mdl-37866670

ABSTRACT

BACKGROUND: The presence of anti-Glutathione S-transferase T1 (GSTT1) antibodies (abs) has been hypothesized to be a pathogenic contributor in antibody-mediated rejection (AMR). METHODS: We aimed to evaluate the relationship between genetic variants of GSTT1, anti-GSTT1 abs, and AMR in a cohort of 87 kidney transplant (KTx) patients using Immucor's non-HLA Luminex assay. Patients were classified according to biopsy-proven AMR and HLA-DSA status: AMR with positive anti-HLA-DSAs (AMR/DSA+, n = 29), AMR but no detectable anti-HLA-DSAs (AMR/DSA-, n = 28), and control patients with stable allograft function and no evidence of rejection (n = 30). RESULTS: At an MFI cut-off of 3000, the overall prevalence of anti-GSTT1 abs was 18.3%. The proportion of patients with anti-GSTT1 abs was higher in the AMR/DSA- group (25%) compared with the control (13.3%) and AMR/DSA+ groups (3.4%) (p = 0.06). Among patients with anti-GSTT1 abs, the MFI was higher in AMR/DSA- and GSTT1-Null patients. Of 81 patients who underwent GSTT1 genotyping, 19.8% were homozygous for the null allele (GSTT1-Null). GSTT1-Null status in the transplant recipients was associated with the development of anti-GSTT1 abs (OR, 4.49; 95% CI, 1.2-16.7). In addition, GSTT1-Null genotype (OR, 26.01; 95% CI, 1.63-404) and anti-GSTT1 ab positivity (OR, 14.8; 95% CI, 1.1-190) were associated with AMR. Within AMR/DSA- patients, the presence of anti-GSTT1 abs did not confer a higher risk of failure within the study observation period. CONCLUSION: The presence of anti-GSTT1 abs and the GSTT1-Null genotype are associated with AMR but do not appear to lead to accelerated graft injury in this cohort with early allograft injury changes and a limited period of follow-up.


Subjects
Kidney Transplantation , Humans , HLA Antigens/genetics , Graft Rejection/genetics , Antibodies , Genotype , Isoantibodies , Tissue Donors
4.
Front Immunol; 14: 1194338, 2023.
Article in English | MEDLINE | ID: mdl-37457719

ABSTRACT

Objective: There is an unmet need for optimizing hepatic allograft allocation from nondirected living liver donors (ND-LLD). Materials and methods: Using OPTN living donor liver transplant (LDLT) data (1/1/2000-12/31/2019), we identified 6328 LDLTs (4621 right, 644 left, 1063 left-lateral grafts). Random forest survival models were constructed to predict 10-year graft survival for each of the 3 graft types. Results: Donor-to-recipient body surface area ratio was an important predictor in all 3 models. Other predictors in all 3 models were: malignant diagnosis, medical location at LDLT (inpatient/ICU), and moderate ascites. Biliary atresia was important in left and left-lateral graft models. Re-transplant was important in right graft models. C-indices for 10-year graft survival predictions for the 3 models were: 0.70 (left-lateral); 0.63 (left); 0.61 (right). Similar C-indices were found for 1-, 3-, and 5-year graft survival. Comparison of model predictions to actual 10-year graft survival demonstrated that the predicted upper-quartile survival group in each model had significantly better actual 10-year graft survival than the lower quartiles (p<0.005). Conclusion: When applied in clinical context, our models assist with the identification and stratification of potential recipients for hepatic grafts from ND-LLD based on predicted graft survival, while accounting for complex donor-recipient interactions. These analyses highlight the unmet need for granular data collection and machine learning modeling to identify potential recipients who have the best predicted transplant outcomes with ND-LLD grafts.
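The C-index reported above (Harrell's concordance) measures how often a model ranks pairs of patients consistently with their observed outcomes. A simplified sketch, not the study's code, that conveys the idea:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-statistic (simplified): among comparable pairs,
    the fraction where the higher predicted risk goes with the
    earlier observed event. A pair (i, j) is comparable when
    subject i had an event strictly before subject j's time;
    ties in predicted risk count as half-concordant."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable
```

A C-index of 0.5 is chance-level ranking and 1.0 is perfect; the 0.61-0.70 values above sit in the typical range for survival models on registry data.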


Subjects
Liver Failure , Liver Transplantation , Humans , Liver Transplantation/adverse effects , Living Donors , Retrospective Studies
5.
Proc Natl Acad Sci U S A; 120(18): e2120251119, 2023 05 02.
Article in English | MEDLINE | ID: mdl-37094119

ABSTRACT

Scientific knowledge related to quantifying the monetized benefits for landscape-wide water quality improvements does not meet current regulatory and benefit-cost analysis needs in the United States. In this study we addressed this knowledge gap by incorporating the Biological Condition Gradient (BCG) as a water quality metric into a stated preference survey capable of estimating the total economic value (use and nonuse) for aquatic ecosystem improvements. The BCG is grounded in ecological principles and generalizable and transferable across space. Moreover, as the BCG translates available data on biological condition into a score on a 6-point scale, it provides a simple metric that can be readily communicated to the public. We applied our BCG-based survey instrument to households across the Upper Mississippi, Ohio, and Tennessee river basins and report values for a range of potential improvements that vary by location, spatial scale, and the scope of the water quality change. We found that people are willing to pay twice as much for an improvement policy that targets their home watershed (defined as a four-digit hydrologic unit) versus a more distant one. We also found that extending the spatial scale of a local policy beyond the home watershed does not generate additional benefits to the household. Finally, our results suggest that nonuse sources of value (e.g., bequest value, intrinsic aesthetic value) are an important component of overall benefits.


Subjects
Ecosystem , Rivers , Humans , United States , Ohio , Mississippi
7.
Proc Natl Acad Sci U S A; 120(15): e2210417120, 2023 Apr 11.
Article in English | MEDLINE | ID: mdl-37011190

ABSTRACT

High-quality water resources provide a wide range of benefits, but the value of water quality is often not fully represented in environmental policy decisions, due in large part to an absence of water quality valuation estimates at large, policy-relevant scales. Using data on property values with nationwide coverage across the contiguous United States, we estimate the benefits of lake water quality as measured through capitalization in housing markets. We find compelling evidence that homeowners place a premium on improved water quality. This premium is largest for lakefront property and decays with distance from the waterbody. In aggregate, we estimate that a 10% improvement in water quality for the contiguous United States has a value of $6 to $9 billion to property owners. This study provides credible evidence for policymakers to incorporate lake water quality value estimates into environmental decision-making.

8.
JAMA Surg; 158(6): 610-616, 2023 06 01.
Article in English | MEDLINE | ID: mdl-36988928

ABSTRACT

Importance: Small waitlist candidates are significantly less likely than larger candidates to receive a liver transplant. Objective: To investigate the magnitude of the size disparity and test potential policy solutions. Design, Setting, and Participants: A decision analytical model was generated to match liver transplant donors to waitlist candidates based on predefined body surface area (BSA) ratio limits (donor BSA divided by recipient BSA). Participants included adult deceased liver transplant donors and waitlist candidates in the Organ Procurement and Transplantation Network database from June 18, 2013, to March 20, 2020. Data were analyzed from January 2021 to September 2021. Exposures: Candidates were categorized into 6 groups according to BSA from smallest (group 1) to largest (group 6). Waitlist outcomes were examined. A match run was created for each donor under the current acuity circle liver allocation policy, and the proportion of candidates eligible for a liver based on BSA ratio was calculated. Novel allocation models were then tested. Main Outcomes and Measures: Time on the waitlist, assigned Model for End-Stage Liver Disease (MELD) score, and proportion of patients undergoing a transplant were compared by BSA group. Modeling under the current allocation policies was used to determine baseline access to transplant by group. Simulation of novel allocation policies was performed to examine change in access. Results: There were 41 341 donors (24 842 [60.1%] male and 16 499 [39.9%] female) and 84 201 waitlist candidates (53 724 [63.8%] male and 30 477 [36.2%] female) in the study. The median age of the donors was 42 years (IQR, 28-55) and of the waitlist candidates, 57 years (IQR, 50-63). Females were overrepresented in the 2 smallest BSA groups (7100 [84.0%] and 7922 [61.1%] in groups 1 and 2, respectively). For each increase in group number, waitlist time decreased (234 days [IQR, 48-700] for group 1 vs 179 days [IQR, 26-503] for group 6; P < .001) and the proportion of the group undergoing transplant likewise improved (3890 [46%] in group 1 vs 4932 [57%] in group 6; P < .001). The smallest 2 groups of candidates were disadvantaged under the current acuity circle allocation model, with 37% and 7.4% fewer livers allocated relative to their proportional representation on the waitlist. Allocation of the smallest 10% of donors (by BSA) to the smallest 15% of candidates overcame this disparity, as did performing split liver transplants. Conclusions and Relevance: In this study, liver waitlist candidates with the smallest BSAs had a disadvantage due to size. Prioritizing allocation of smaller liver donors to smaller candidates may help overcome this disparity.


Subjects
End Stage Liver Disease , Liver Transplantation , Tissue and Organ Procurement , Adult , Humans , Male , Female , Middle Aged , End Stage Liver Disease/surgery , Body Surface Area , Severity of Illness Index , Living Donors , Tissue Donors , Waiting Lists
9.
Transpl Infect Dis; 25(2): e14015, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36734631

ABSTRACT

BACKGROUND: Despite superiority of preemptive therapy (PET) compared to universal prophylaxis for prevention of cytomegalovirus (CMV) disease in the CAPSIL randomized trial among CMV D+R- liver transplant recipients (LTxRs), real-world effectiveness may be lower because of logistical concerns about feasibility of PET. METHODS: We retrospectively assessed PET as standard clinical care at a single transplant center among 50 consecutive adult CMV D+R- LTxRs undergoing a first liver transplant between 4/4/2019 and 5/18/2021 and compared outcomes and adherence to those randomized to PET in the CAPSIL study (N = 100). The primary outcome was CMV disease and secondary outcomes were biopsy-confirmed acute allograft rejection, retransplant, invasive fungal infections, and death, all assessed by 1-year post-transplant. Exploratory outcomes included virologic parameters and measures of adherence to protocol-specified CMV qPCR monitoring. RESULTS: Baseline characteristics were similar between groups. The cumulative incidence of CMV disease at 1-year post-transplant was 4/50 (8%) versus 9/100 (9%) in the real-world and CAPSIL cohorts, respectively, p = 1.0. The rate of breakthrough CMV disease during the 100-day PET period was low (2/50 [4%]) and similar to the PET cohort from the CAPSIL study (3/100 [3%]).  All secondary and exploratory outcomes were not significantly different between the real-world and CAPSIL PET cohorts. CONCLUSIONS: In this first reported study of real-world PET, the feasibility and effectiveness for CMV disease prevention and for other clinical outcomes in CMV D+R- LTxRs were similar to those reported with PET in a clinical trial. Additional studies to confirm feasibility and generalizability in other settings are warranted.


Subjects
Cytomegalovirus Infections , Liver Transplantation , Adult , Humans , Cytomegalovirus , Antiviral Agents/therapeutic use , Liver Transplantation/adverse effects , Retrospective Studies , Treatment Outcome , Cytomegalovirus Infections/epidemiology , Cytomegalovirus Infections/prevention & control , Cytomegalovirus Infections/drug therapy , Positron-Emission Tomography/adverse effects , Transplant Recipients , Ganciclovir/therapeutic use
11.
Am J Transplant; 22(12): 3087-3092, 2022 12.
Article in English | MEDLINE | ID: mdl-36088649

ABSTRACT

The kidney donor risk index (KDRI) and its percentile conversion, the kidney donor profile index (KDPI), provide a continuous measure of donor quality. Kidneys with a KDPI >85% (KDPI85) are referred to as "high KDPI." The KDPI85 cutoff changes every year, impacting which kidneys are labeled as high KDPI. We examine kidney utilization around the KDPI85 cutoff and explore the "high KDPI" labeling effect. KDRI to KDPI Mapping Tables from 2012 to 2020 were used to determine the yearly KDRI85 value. Organ Procurement and Transplantation Network data were used to calculate discard rates and model organ use. KDRI85 varied between 1.768 and 1.888. In a multivariable analysis, kidney utilization was lower for KDPI 86% compared with KDPI 85% kidneys (p = .046). Kidneys with a KDRI between 1.785 and 1.849 were classified as high KDPI in the years 2015-2017 and low KDPI in the years 2018-2020. The discard rate was 44.9% when labeled as high KDPI and 39.1% when labeled as low KDPI (p < .01). For kidneys with the same KDRI, the high KDPI label is associated with increased discard. We should reconsider the appropriateness of the "high KDPI" label.
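The KDRI-to-KDPI conversion is, at heart, a percentile lookup against a reference cohort. The sketch below is an illustrative stand-in for the published OPTN mapping tables (the function names and reference list are hypothetical), plus a helper showing how a fixed KDRI can flip labels as the yearly cutoff moves:

```python
from bisect import bisect_right

def kdpi_from_kdri(kdri, reference_kdri):
    """Percentile rank of a raw KDRI against a sorted reference
    cohort: an illustrative stand-in for the OPTN mapping table."""
    ref = sorted(reference_kdri)
    return 100.0 * bisect_right(ref, kdri) / len(ref)

def is_high_kdpi(kdri, kdri85_cutoff):
    """The 'high KDPI' label depends on the yearly KDRI85 cutoff,
    which the abstract reports varied between 1.768 and 1.888."""
    return kdri > kdri85_cutoff
```

Using the cutoffs implied by the abstract, a kidney with KDRI 1.80 is labeled high KDPI against a 1.785 cutoff but low KDPI against a 1.849 cutoff: the same organ, two labels.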


Subjects
Kidney Transplantation , Tissue and Organ Procurement , Humans , Donor Selection , Graft Survival , Risk Factors , Tissue Donors , Kidney , Retrospective Studies
13.
Transplant Direct; 8(2): e1282, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35047664

ABSTRACT

BACKGROUND: The current Model for End-Stage Liver Disease (MELD)-based liver allocation system in the United States prioritizes the sickest patients first at the expense of long-term graft survival. In a continuous distribution model, a measure of posttransplant survival will also be included. We aimed to use mathematical optimization to match donors and recipients based on quality to examine the potential impact of an allocation system designed to maximize long-term graft survival. METHODS: Cox proportional hazards models using Organ Procurement and Transplantation Network data from 2008 to 2012 were used to place donors and waitlist candidates into 5 groups of increasing risk for graft loss (1 = lowest to 5 = highest). A mixed integer programming optimization model was then used to generate allocation rules that maximized graft survival at 5 and 8 y. RESULTS: Allocation based on mathematical optimization improved 5-y survival by 7.5% (78.2% versus 70.7% in the historic cohort), avoiding 2271 graft losses, and 8-y survival by 9% (71.8% versus 62.8%), avoiding 2725 graft losses. Long-term graft survival for recipients within a quality group is highly dependent on donor quality. All candidates in groups 1 and 2 and 43% of group 3 were transplanted, whereas none of the candidates in groups 4 and 5 were transplanted. CONCLUSIONS: Long-term graft survival can be improved using a model that allocates livers based on both donor and recipient quality, and the interaction between donor and recipient quality is an important predictor of graft survival. Considerations for incorporation into a continuous distribution model are discussed.
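Stripped to its core, the mixed integer program above is an assignment problem: pick a donor-to-recipient matching that maximizes total predicted survival. A brute-force toy sketch (illustrative only, not the authors' model, which scales to real cohorts via MIP solvers):

```python
from itertools import permutations

def best_assignment(survival):
    """Exhaustive donor-to-recipient matching that maximizes total
    predicted graft survival. survival[d][r] is the predicted
    survival for donor d transplanted into recipient r."""
    n = len(survival)
    best_total, best_perm = float("-inf"), None
    for perm in permutations(range(n)):
        total = sum(survival[d][perm[d]] for d in range(n))
        if total > best_total:
            best_total, best_perm = total, perm
    return best_perm, best_total
```

With two donors and two candidates where matched-quality pairs survive better, the optimizer pairs like with like, which is exactly the donor-recipient interaction effect the abstract highlights.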

14.
Ecosystems; 25(3): 697-711, 2022.
Article in English | MEDLINE | ID: mdl-34512142

ABSTRACT

The increasing frequency of extreme events, exogenous and endogenous, poses challenges for our societies. The current pandemic is a case in point, but "once-in-a-century" weather events are also becoming more common, leading to erosion, wildfire, and even volcanic events that change ecosystems and disturbance regimes, threaten the sustainability of our life-support systems, and challenge the robustness and resilience of societies. Dealing with extremes will require new approaches and large-scale collective action. Preemptive measures can increase general resilience, a first line of protection, while more specific reactive responses are developed. Preemptive measures can also minimize the negative effects of events that cannot be avoided. In this paper, we first explore approaches to prevention, mitigation, and adaptation, drawing inspiration from how evolutionary challenges have made biological systems robust and resilient, and from the general theory of complex adaptive systems. We argue further that proactive steps that go beyond these measures will be necessary to reduce unacceptable consequences.

15.
Proc Natl Acad Sci U S A; 118(28)2021 07 13.
Article in English | MEDLINE | ID: mdl-34260382

ABSTRACT

Despite decades of policy that strives to reduce nutrient and sediment export from agricultural fields, surface water quality in intensively managed agricultural landscapes remains highly degraded. Recent analyses show that current conservation efforts are not sufficient to reverse widespread water degradation in Midwestern agricultural systems. Intensifying row crop agriculture and increasing climate pressure require a more integrated approach to water quality management that addresses diverse sources of nutrients and sediment and off-field mitigation actions. We used multiobjective optimization analysis and integrated three biophysical models to evaluate the cost-effectiveness of alternative portfolios of watershed management practices at achieving nitrate and suspended sediment reduction goals in an agricultural basin of the Upper Midwestern United States. Integrating watershed-scale models enabled the inclusion of near-channel management alongside more typical field management and thus the direct comparison of cost-effectiveness across portfolios. The optimization analysis revealed that fluvial wetlands (i.e., wide, slow-flowing, vegetated water bodies within the riverine corridor) are the single most cost-effective management action to reduce both nitrate and sediment loads and will be essential for meeting moderate to aggressive water quality targets. Although highly cost-effective, wetland construction was costly compared to other practices, and it was not selected in portfolios at low investment levels. Wetland performance was sensitive to placement, emphasizing the importance of watershed-scale planning to realize the potential benefits of wetland restorations. We conclude that extensive interagency cooperation and coordination at a watershed scale is required to achieve substantial, economically viable improvements in water quality under intensive row crop agricultural production.
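The portfolio-selection logic can be caricatured with a greedy cost-effectiveness ranking. This is a deliberately simplified stand-in (hypothetical function and made-up numbers) for the multiobjective optimization the authors ran, but it reproduces the budget effect noted above: practices are picked by reduction per dollar, and an expensive practice like wetland construction drops out when the budget is small even if it is highly cost-effective overall.

```python
def greedy_portfolio(practices, budget):
    """Select watershed practices by pollutant reduction per dollar
    until the budget runs out. practices is a list of
    (name, cost, reduction) tuples."""
    ranked = sorted(practices, key=lambda p: p[2] / p[1], reverse=True)
    chosen, spent = [], 0.0
    for name, cost, reduction in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent
```

With a toy list where "wetland" costs 100 for a reduction of 50, a budget of 60 forces the cheaper "cover_crop" and "buffer" options into the portfolio instead, mirroring the paper's low-investment result.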


Subjects
Agriculture/economics , Agriculture/standards , Cost-Benefit Analysis , Models, Theoretical , Water Quality/standards , Budgets , Cooperative Behavior , Geography , Minnesota
17.
Exp Clin Transplant; 19(1): 8-13, 2021 01.
Article in English | MEDLINE | ID: mdl-32133939

ABSTRACT

OBJECTIVES: Kidney transplant is the optimal treatment for patients with end-stage renal disease. The effects of using machine perfusion for donor kidneys with varying Kidney Donor Profile Index scores are unknown. We sought to assess the impact of machine perfusion on the incidence of delayed graft function in different score groups of kidney grafts classified with the Kidney Donor Profile Index. MATERIALS AND METHODS: We conducted a retrospective analysis from January 2008 through September 2017 of adult recipients (≥ 18 years old) undergoing kidney-only transplant from deceased donors. All transplant recipients were followed until December 2017. Recipients who received multiorgan transplants or kidneys from living donors were excluded from our analyses. Recipients were divided according to 5 donor categories of Kidney Donor Profile Index scores (0-20, 21-40, 41-60, 61-80, and 81-100). Logistic regression analysis was performed for each score group to determine the effects of machine perfusion on development of delayed graft function within each score group. RESULTS: Our study included 101 222 recipients who met the inclusion criteria. Multivariate analysis revealed that machine perfusion was associated with significantly decreased development of delayed graft function only in donors with high-risk profiles: the 61 to 80 score group (odds ratio = 0.83; confidence interval, 0.78-0.89) and the 81 to 100 score group (odds ratio = 0.72; confidence interval, 0.67-0.78). CONCLUSIONS: Machine perfusion is beneficial in reducing delayed graft function only in donor kidneys with a higher risk profile.
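Odds ratios with Wald confidence intervals, like those reported above, come from a standard 2x2-table calculation. The counts in the example are made up for illustration, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    odds_ratio = (a * d) / (b * c)
    log_se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * log_se)
    upper = math.exp(math.log(odds_ratio) + z * log_se)
    return odds_ratio, (lower, upper)
```

An interval that excludes 1.0, as for the 0.72 (0.67-0.78) estimate above, indicates a statistically significant association; the adjusted estimates in the study come from logistic regression rather than a raw 2x2 table.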


Subjects
Delayed Graft Function , Kidney Transplantation , Adult , Delayed Graft Function/etiology , Delayed Graft Function/prevention & control , Humans , Kidney Transplantation/adverse effects , Perfusion , Retrospective Studies
18.
Proc Natl Acad Sci U S A; 117(49): 30900-30906, 2020 12 08.
Article in English | MEDLINE | ID: mdl-33234568

ABSTRACT

Massive wildlife losses over the past 50 y have brought new urgency to identifying both the drivers of population decline and potential solutions. We provide large-scale evidence that air pollution, specifically ozone, is associated with declines in bird abundance in the United States. We show that an air pollution regulation limiting ozone precursor emissions has delivered substantial benefits to bird conservation. Our estimates imply that air quality improvements over the past 4 decades have stemmed the decline in bird populations, averting the loss of 1.5 billion birds, ∼20% of current totals. Our results highlight that in addition to protecting human health, air pollution regulations have previously unrecognized and unquantified conservation cobenefits.


Subjects
Air Pollution/analysis , Birds/physiology , Conservation of Natural Resources , Air Pollutants/toxicity , Animals , Geography , Ozone/toxicity , United States
19.
Clin Transplant; 33(8): e13662, 2019 08.
Article in English | MEDLINE | ID: mdl-31283049

ABSTRACT

The impact of size mismatch in deceased donor liver transplantation is unknown. BSA has been demonstrated to be an accurate indicator of liver volume. We developed a model to match livers by BSA and estimate the impact of size mismatch on graft survival. Using the Standard Transplant Analysis and Research (STAR) database, we selected solitary primary liver transplant recipients of any age, transplanted between 3/6/2002 and 12/31/2016. Using a Cox proportional hazards model and controlling for donor and recipient factors, we determined the relative risk for graft survival for four donor/recipient body surface area ratio groups (≤0.68, 0.69-0.90, 0.91-1.25, 1.26-1.5). We studied two groups: recipients with a BSA > 1.6 (adults) and ≤1.6 (children), and a subgroup with a BSA ≤ 0.53 (small infants). In recipients with BSA > 1.6 (adults [n = 71 365]), D/R ratios ≤ 0.68 and > 1.25 had a negative impact on graft survival. In recipients with BSA ≤ 1.6 (children [n = 8339]), D/R ratios <0.75 and >1.25 had a negative impact on graft survival. In the 1725 recipients with BSA ≤ 0.53 (small infants), D/R ratios <1 and >2.3 had a negative impact on graft survival. In deceased donor liver transplantation, the D/R ratio is a significant yet underestimated predictor of graft survival that should be considered in donor and recipient selection.
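Assigning a donor/recipient pair to one of the four BSA-ratio bands from the Cox model is a simple threshold check. A small sketch (the function name is hypothetical, and ratios above 1.5 are assumed to fall outside the studied bands):

```python
def dr_ratio_band(donor_bsa, recipient_bsa):
    """Assign a donor/recipient BSA ratio to one of the four
    bands examined in the study."""
    ratio = donor_bsa / recipient_bsa
    if ratio <= 0.68:
        return "<=0.68"
    if ratio <= 0.90:
        return "0.69-0.90"
    if ratio <= 1.25:
        return "0.91-1.25"
    if ratio <= 1.5:
        return "1.26-1.5"
    return ">1.5"
```

Per the results above, the outer bands (≤0.68 and >1.25) are the ones associated with worse graft survival in adult recipients.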


Subjects
Death , Graft Rejection/mortality , Kidney/anatomy & histology , Liver Transplantation/mortality , Postoperative Complications/mortality , Tissue Donors/statistics & numerical data , Adolescent , Child , Child, Preschool , Female , Follow-Up Studies , Graft Rejection/etiology , Graft Rejection/metabolism , Graft Rejection/pathology , Graft Survival , Humans , Infant , Infant, Newborn , Liver Transplantation/adverse effects , Male , Postoperative Complications/etiology , Postoperative Complications/metabolism , Postoperative Complications/pathology , Prognosis , Retrospective Studies , Risk Factors , Serum Albumin, Bovine/analysis