Results 1 - 20 of 252
1.
Clin Transplant; 38(3): e15269, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38445531

ABSTRACT

INTRODUCTION: Thoracoabdominal normothermic regional perfusion (TA-NRP) following cardiac death is an emerging multivisceral organ procurement technique. Recent national studies on outcomes of presumptive TA-NRP-procured organs are limited by potential misclassification, since TA-NRP is not differentiated from donation after cardiac death (DCD) in registry data. METHODS: We studied 22 donors whose designees consented to TA-NRP and organ procurement performed at our institution between January 20, 2020 and July 3, 2022. We identified these donors in the SRTR to describe organ utilization and recipient outcomes and compared them to recipients of traditional DCD (tDCD) and donation after brain death (DBD) organs during the same timeframe. RESULTS: All 22 donors progressed to cardiac arrest and underwent TA-NRP followed by heart, lung, kidney, and/or liver procurement. Median donor age was 41 years, 55% had anoxic brain injury, 45% were hypertensive, none were diabetic, and median kidney donor profile index was 40%. TA-NRP utilization was high across all organ types (88%-100%), with a higher percentage of kidneys procured via TA-NRP compared to tDCD (88% vs. 72%, p = .02). Recipient and graft survival ranged from 89% to 100% and were comparable to those of tDCD and DBD recipients (p ≥ .2). Delayed graft function was lower for kidneys procured from TA-NRP donors compared to tDCD donors (27% vs. 44%, p = .045). CONCLUSION: Procurement from TA-NRP donors yielded high organ utilization, with outcomes comparable to those of tDCD and DBD recipients across organ types. Further large-scale study of TA-NRP donors, facilitated by its capture in the national registry, will be critical to fully understand its impact as an organ procurement technique.


Subject(s)
Benzidines; Heart; Tissue and Organ Procurement; Humans; Adult; Perfusion; Tissue Donors; Brain Death
2.
Am J Transplant; 2024 Mar 19.
Article in English | MEDLINE | ID: mdl-38514013

ABSTRACT

Xenotransplantation offers the potential to meet the critical need for heart and lung transplantation that is presently constrained by the human donor organ supply. Much has been learned over the past decades regarding gene editing to prevent the immune activation and inflammation that cause early organ injury, and about strategies for maintenance of immunosuppression to promote longer-term xenograft survival. However, many scientific questions remain regarding further requirements for genetic modification of donor organs, appropriate contexts for xenotransplantation research (including nonhuman primates, recently deceased humans, and living human recipients), and the risk of xenozoonotic disease transmission. Related ethical questions include the appropriate selection of clinical trial participants, challenges with obtaining informed consent, animal rights and welfare considerations, and cost. Research involving recently deceased humans has also emerged as a potentially novel way to understand how xeno-organs will impact the human body. Clinical xenotransplantation and research involving decedents also raise ethical questions and will require consensus regarding regulatory oversight and protocol review. These considerations and the related opportunities for xenotransplantation research were discussed in a workshop sponsored by the National Heart, Lung, and Blood Institute and are summarized in this meeting report.

3.
Am J Transplant; 24(4): 526-532, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38341026

ABSTRACT

The first 2 living recipients of pig hearts died unexpectedly within 2 months, despite both recipients receiving what over 30 years of nonhuman primate (NHP) research would suggest were the optimal gene edits and immunosuppression to ensure success. These results prompt us to question how faithfully data from the NHP model translate into human outcomes. Before attempting any further heart xenotransplants in living humans, it is highly advisable to gain a more comprehensive understanding of why the promising preclinical NHP data did not accurately predict outcomes in humans. It is also unlikely that additional NHP data will provide more information that would de-risk a xenoheart clinical trial because these cases were based on the best practices from the most successful NHP results to date. Although imperfect, the decedent model offers a complementary avenue to determine appropriate treatment regimens to control the human immune response to xenografts and better understand the biologic differences between humans and NHP that could lead to such starkly contrasting outcomes. Herein, we explore the potential benefits and drawbacks of the decedent model and contrast it to the advantages and disadvantages of the extensive body of data generated in the NHP xenoheart transplantation model.


Subject(s)
Immunosuppression Therapy; Humans; Animals; Swine; Transplantation, Heterologous; Heterografts
4.
Am J Transplant; 24(3): 350-361, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37931753

ABSTRACT

The XVIth Banff Meeting for Allograft Pathology was held in Banff, Alberta, Canada, from September 19 to 23, 2022, as a joint meeting with the Canadian Society of Transplantation. In addition to a key focus on the impact of microvascular inflammation and biopsy-based transcript analysis on the Banff Classification, further sessions were devoted to other aspects of kidney transplant pathology, in particular T cell-mediated rejection, activity and chronicity indices, digital pathology, xenotransplantation, clinical trials, and surrogate endpoints. Although the output of these sessions has not led to any changes in the classification, the key role of Banff Working Groups in phrasing unanswered questions, and coordinating and disseminating results of investigations addressing these unanswered questions was emphasized. This paper summarizes the key Banff Meeting 2022 sessions not covered in the Banff Kidney Meeting 2022 Report paper and also provides an update on other Banff Working Group activities relevant to kidney allografts.


Subject(s)
Kidney Transplantation; Canada; Graft Rejection/etiology; Graft Rejection/pathology; Kidney/pathology; Allografts
5.
Am J Transplant; 24(3): 328-337, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38072121

ABSTRACT

Obesity is a chronic, relapsing disease that increases the risks of living kidney donation; at the same time, transplant centers have liberalized body mass index constraints for donors. With the increasing number of antiobesity medications available, the treatment of obesity with antiobesity medications may increase the pool of potential donors and enhance donor safety. Antiobesity medications are intended for long-term use given the chronic nature of obesity. Cessation of treatment can be expected to lead to weight regain and increase the risk of comorbidity rebound/development. In addition, antiobesity medications are meant to be used in conjunction with, rather than in replacement of, diet and physical activity optimization. Antiobesity medication management includes selecting medications that may ameliorate any coexisting medical conditions, avoiding those that are contraindicated in such conditions, and being sensitive to any out-of-pocket expenses that may be incurred by the potential donor. A number of questions remain regarding who will and should shoulder the costs of long-term obesity treatment for donors. In addition, future studies are needed to quantify the degree of weight loss and duration of weight loss maintenance needed to normalize the risk of adverse kidney outcomes relative to comparable nondonors and lower-weight donors.


Subject(s)
Tissue Donors; Tissue and Organ Harvesting; Humans; Kidney; Obesity/drug therapy; Weight Loss
6.
J Am Soc Nephrol; 35(3): 347-360, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38147137

ABSTRACT

SIGNIFICANCE STATEMENT: There is no standardized desensitization regimen for kidney transplant candidates. CD38, expressed by plasma cells, could be targeted for desensitization to deplete the plasma cells that produce alloantibodies and donor-specific antibodies. Few studies and case reports are available regarding the use of CD38 antibodies for desensitization in patients awaiting kidney transplant. This study shows that isatuximab, a CD38-targeting therapy, was well tolerated in kidney transplant candidates, with a durable decrease in anti-HLA antibodies and partial desensitization activity. The short treatment period and long follow-up of this study allowed for an understanding of the mechanism and timing of any antibody rebound. Isatuximab could be further investigated as an adjunct to existing desensitization therapy for patients on the kidney transplant waitlist. BACKGROUND: Patients with calculated panel reactive antibody (cPRA) ≥80.00%, particularly those with cPRA ≥99.90%, are considered highly sensitized and are underserved by the Kidney Allocation System. Desensitization removes circulating reactive antibodies and/or suppresses antibody production to increase the chances of a negative crossmatch. CD38 is highly expressed on plasma cells and is thus a potential target for desensitization. METHODS: This was an open-label, single-arm, phase 1/2 study investigating the safety, pharmacokinetics, and preliminary efficacy of isatuximab in patients awaiting kidney transplantation. There were two cohorts: cohort A enrolled patients with cPRA ≥99.90%, and cohort B enrolled patients with cPRA 80.00% to <99.90%. RESULTS: Twenty-three patients (12 in cohort A, 11 in cohort B) received isatuximab 10 mg/kg weekly for 4 weeks, then every 2 weeks for 8 weeks. Isatuximab was well tolerated, with pharmacokinetic and pharmacodynamic profiles indicating exposure similar to that in multiple myeloma trials. Treatment resulted in decreases in CD38+ plasmablasts, plasma cells, and NK cells, and in significant reductions in HLA-specific IgG-producing memory B cells. The overall response rate, based on a predefined composite desensitization end point, was 83.3% in cohort A and 81.8% in cohort B. Most responders had decreases in anti-HLA antibodies that were maintained for 26 weeks after the last dose. However, cPRA values were minimally affected overall, with only 9 of 23 patients (39%) achieving cPRA decreases to target levels. By study cutoff (median follow-up of 68 weeks), six patients had received transplant offers, of which four were accepted. CONCLUSIONS: In this open-label trial, isatuximab was well tolerated and resulted in a durable decrease in anti-HLA antibodies with partial desensitization activity. CLINICAL TRIAL REGISTRATION NUMBER: NCT04294459.


Subject(s)
Kidney Transplantation; Humans; Antibodies, Monoclonal, Humanized; Kidney; Isoantibodies; Antilymphocyte Serum
7.
J Mammal; 104(6): 1338-1352, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38059008

ABSTRACT

The Coyote (Canis latrans) is one of the most studied species in North America, with at least 445 papers on its diet alone. While this research has yielded excellent reviews of what coyotes eat, it has been inadequate for drawing deeper conclusions because no synthesis to date has considered prey availability. We accounted for prey availability by investigating the prey selection of coyotes across the species' distribution using the traditional Jacobs' index method, as well as the new iterative preference averaging (IPA) method, on scats and biomass. We found that coyotes selected for Dall's Sheep (Ovis dalli), White-tailed Deer (Odocoileus virginianus), Eastern Cottontail Rabbit (Sylvilagus floridanus), and California Vole (Microtus californicus), which yielded a predator-to-preferred prey mass ratio of 1:2. We also found that coyotes avoided preying on other small mammals, including carnivorans and arboreal species. There was strong concordance between the traditional and IPA methods on scats, but this pattern weakened when biomass was considered. General linear models revealed that coyotes preferred to prey upon larger species that were riskier to hunt, reflecting their ability to hunt in groups, and were least likely to hunt solitary species. Coyotes increasingly selected Mule Deer (O. hemionus) and Snowshoe Hare (Lepus americanus) at higher latitudes, whereas Black-tailed Jackrabbit (L. californicus) were increasingly selected toward the tropics. Mule Deer were increasingly selected at higher coyote densities, while Black-tailed Jackrabbit were increasingly avoided at higher coyote densities. Coyote predation could constrain the realized niche of prey species at the distributional limits of the predator through increased efficiency of predation, reflected in increased prey selection values. These results are integral to an improved understanding of Coyote ecology and can inform predictive analyses allowing for spatial variation, which ultimately will lead to a better understanding of the ecological role of the coyote across different ecosystems.


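For readers less familiar with the selection metric named above, here is a minimal sketch of the traditional Jacobs' index, assuming the standard formulation D = (r - p) / (r + p - 2rp); the species and proportions are hypothetical, and the paper's IPA method is not reproduced here.

```python
# Minimal sketch of the Jacobs' index for prey selection, assuming the standard
# formulation D = (r - p) / (r + p - 2*r*p), where r is the proportion of a prey
# species in the diet (e.g., from scats) and p is its proportional availability.
# D ranges from -1 (complete avoidance) to +1 (complete preference).
# Species names and proportions below are hypothetical, for illustration only.

def jacobs_index(r: float, p: float) -> float:
    """Return Jacobs' selection index D for diet proportion r and availability p."""
    return (r - p) / (r + p - 2 * r * p)

if __name__ == "__main__":
    hypothetical_data = {
        # species: (proportion of diet, proportional availability)
        "Odocoileus virginianus": (0.30, 0.10),
        "Sylvilagus floridanus": (0.25, 0.15),
        "Microtus californicus": (0.10, 0.30),
    }
    for species, (r, p) in hypothetical_data.items():
        print(f"{species}: D = {jacobs_index(r, p):+.2f}")
```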

8.
Ecol Evol; 13(11): e10658, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37915808

ABSTRACT

Investigating spatial patterns of animal occupancy and reproduction in peripheral populations can provide insight into factors that form species range boundaries. Following historical extirpation, American black bears (Ursus americanus) recolonized the western Great Basin in Nevada from the Sierra Nevada during the late 1900s. This range expansion, however, has not continued further into the Great Basin despite the presence of additional habitat. We aimed to quantify whether reduced reproduction toward the range edge contributes to this range boundary. We analyzed black bear detections from 100 camera traps deployed across the black bear distribution in western Nevada using a multistate occupancy model that quantified the probability of occupancy and reproduction (i.e., occupancy by female bears with cubs) in relation to changes in habitat type and habitat amount toward the range boundary. We detected a strong effect of habitat amount and habitat type on the probability of black bear occupancy and reproduction. At similar levels of landscape-scale habitat amount (e.g., 50%), the estimated probability of occupancy for adult bears in piñon-juniper woodlands near the range boundary was 0.39, compared to ~1.0 in Sierra Nevada mixed-conifer forest (i.e., core habitat). Furthermore, the estimated probability of cub occupancy, conditional on adult bear occupancy, in landscapes with 50% habitat was 0.32 in Great Basin piñon-juniper woodlands, compared to 0.92 in Sierra Nevada mixed-conifer forest. Black bear range in the western Great Basin conforms to the center-periphery hypothesis, with piñon-juniper woodland at the range edge supporting ecologically marginal habitat for the species compared to habitat in the Sierra Nevada. Further geographic expansion of black bears in the Great Basin may be limited by lower occupancy of reproducing females in piñon-juniper woodland. Center-periphery range dynamics may be common in large carnivore species, as their dispersal ability allows them to colonize low-quality habitat near range edges.
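To make the conditional structure of these estimates concrete, the short calculation below combines the abstract's reported point estimates into an unconditional probability of reproduction for each habitat type; this is a back-of-the-envelope illustration, not the authors' occupancy model code.

```python
# Illustrative calculation only (not the authors' model code): in a multistate
# occupancy framework, the unconditional probability that a site hosts reproduction
# is the product of adult occupancy (psi) and the conditional probability of cub
# presence given adult occupancy (R). Values are the point estimates reported in
# the abstract for landscapes with 50% habitat.

estimates = {
    # habitat type: (adult occupancy psi, cub occupancy R given adult occupancy)
    "Great Basin pinyon-juniper woodland": (0.39, 0.32),
    "Sierra Nevada mixed-conifer forest": (1.00, 0.92),
}

for habitat, (psi, r_given_occ) in estimates.items():
    print(f"{habitat}: P(occupied) = {psi:.2f}, "
          f"P(occupied and reproducing) = {psi * r_given_occ:.2f}")
```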

9.
Am J Transplant; 23(12): 1980-1989, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37748554

ABSTRACT

Older compatible living donor kidney transplant (CLDKT) recipients have higher mortality and death-censored graft failure (DCGF) compared to younger recipients. These risks may be amplified in older incompatible living donor kidney transplant (ILDKT) recipients who undergo desensitization and intense immunosuppression. In a 25-center cohort of ILDKT recipients transplanted between September 24, 1997, and December 15, 2016, we compared mortality, DCGF, delayed graft function (DGF), acute rejection (AR), and length of stay (LOS) between 234 older (age ≥60 years) and 1172 younger (age 18-59 years) recipients. To investigate whether the impact of age was different for ILDKT recipients compared to 17 542 CLDKT recipients, we used an interaction term to determine whether the relationship between posttransplant outcomes and transplant type (ILDKT vs CLDKT) was modified by age. Overall, older recipients had higher mortality (hazard ratio 2.07, 95% CI 1.63-2.65, P < .001), lower DCGF (hazard ratio 0.53, 95% CI 0.36-0.77, P = .001), lower AR (odds ratio 0.54, 95% CI 0.39-0.74, P < .001), and similar DGF (odds ratio 1.03, 95% CI 0.46-2.33, P = .9) and LOS (incidence rate ratio 0.98, 95% CI 0.88-1.10, P = .8) compared to younger recipients. The impact of age on mortality (interaction P = .052), DCGF (interaction P = .7), AR (interaction P = .2), DGF (interaction P = .9), and LOS (interaction P = .5) was similar in ILDKT and CLDKT recipients. Age alone should not preclude eligibility for ILDKT.
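The interaction analysis described above can be illustrated with a small, self-contained sketch: a Cox proportional hazards model with an age-by-transplant-type interaction term, fit here to simulated data with invented effect sizes (this is not the study's code or data; the lifelines library is assumed).

```python
# Hedged sketch of the analytic idea: test whether the effect of older age on
# post-transplant mortality differs between ILDKT and CLDKT via an interaction term
# in a Cox proportional hazards model. Data and effect sizes are simulated and
# invented for illustration; this is not the study's code or data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000
older = rng.binomial(1, 0.2, n)        # 1 = recipient age >= 60 years
ildkt = rng.binomial(1, 0.1, n)        # 1 = incompatible living donor transplant
interaction = older * ildkt            # effect-modification term

# Exponential survival times whose hazard depends on the covariates.
log_hazard = np.log(0.03) + 0.7 * older + 0.3 * ildkt + 0.1 * interaction
time_to_event = rng.exponential(1.0 / np.exp(log_hazard))
censor_time = rng.uniform(1, 10, n)    # administrative censoring

df = pd.DataFrame({
    "years": np.minimum(time_to_event, censor_time),
    "died": (time_to_event <= censor_time).astype(int),
    "older": older,
    "ildkt": ildkt,
    "older_x_ildkt": interaction,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
# The p-value on older_x_ildkt tests whether the age effect differs by transplant type.
cph.print_summary()
```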


Subject(s)
Kidney Transplantation; Humans; Aged; Middle Aged; Adolescent; Young Adult; Adult; Kidney Transplantation/adverse effects; Living Donors; Graft Survival; Graft Rejection/etiology; HLA Antigens; Risk Factors
10.
Proc Biol Sci; 290(2007): 20231261, 2023 Sep 27.
Article in English | MEDLINE | ID: mdl-37752836

ABSTRACT

The various debates around model selection paradigms are important, but in the absence of a consensus, there is a demonstrable need for a deeper appreciation of existing approaches, at least among the end-users of statistics and model selection tools. In the ecological literature, the Akaike information criterion (AIC) dominates model selection practices, and while it is a relatively straightforward concept, there exist what we perceive to be some common misunderstandings around its application. Two specific questions arise with surprising regularity among colleagues and students when interpreting and reporting AIC model tables. The first relates to the issue of 'pretending' variables, and specifically to a muddled understanding of what this means. The second relates to p-values and what constitutes statistical support when using AIC. There exists a wealth of technical literature describing AIC and the relationship between p-values and AIC differences. Here, we complement this technical treatment and use simulation to develop some intuition around these important concepts. In doing so, we aim to promote better statistical practices when it comes to using, interpreting, and reporting models selected using AIC.
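In the spirit of the simulation-based intuition the authors advocate, the sketch below adds a pure-noise ('pretending') predictor to a linear model and tracks how often AIC favors it versus how often its p-value falls below 0.05; the setup is illustrative and not taken from the paper.

```python
# Simulation sketch, not from the paper: add a pure-noise ("pretending") predictor to
# a linear model and compare how often AIC favors it with how often its p-value is
# below 0.05.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_sims, n = 2000, 100
delta_aic, pvals = [], []

for _ in range(n_sims):
    x1 = rng.normal(size=n)
    noise = rng.normal(size=n)                 # unrelated to y: a "pretending" variable
    y = 1.0 + 0.5 * x1 + rng.normal(size=n)

    base = sm.OLS(y, sm.add_constant(np.column_stack([x1]))).fit()
    full = sm.OLS(y, sm.add_constant(np.column_stack([x1, noise]))).fit()

    delta_aic.append(base.aic - full.aic)      # > 0 means the larger model "wins" on AIC
    pvals.append(full.pvalues[-1])             # Wald p-value for the noise predictor

delta_aic, pvals = np.array(delta_aic), np.array(pvals)
print(f"Noise variable lowers AIC in {np.mean(delta_aic > 0):.1%} of simulations")
print(f"Noise variable has p < 0.05 in {np.mean(pvals < 0.05):.1%} of simulations")
# For one added parameter, AIC favours the larger model roughly when p < ~0.157, so
# apparent AIC 'support' for a pretending variable is weaker evidence than p < 0.05.
```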


Subject(s)
Intuition; Students; Humans; Computer Simulation; Consensus
11.
Transpl Infect Dis; 25(6): e14122, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37707287

ABSTRACT

BACKGROUND: Understanding immunogenicity and alloimmune risk following severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) vaccination in kidney transplant recipients is imperative for understanding the correlates of protection and informing clinical guidelines. METHODS: We studied 50 kidney transplant recipients following SARS-CoV-2 vaccination and quantified their anti-spike protein antibody, donor-derived cell-free DNA (dd-cfDNA), gene expression profiling (GEP), and alloantibody formation. RESULTS: Participants were stratified using nucleocapsid testing as either SARS-CoV-2-naïve or SARS-CoV-2-experienced prior to vaccination. One of 34 (3%) SARS-CoV-2-naïve participants developed anti-spike protein antibodies. In contrast, the odds ratio for the association of a prior history of SARS-CoV-2 infection with vaccine response was 18.3 (95% confidence interval 3.2, 105.0, p < 0.01). Pre- and post-vaccination levels did not change for median dd-cfDNA (0.23% vs. 0.21%, respectively, p = 0.13), GEP scores (9.85 vs. 10.4, respectively, p = 0.45), calculated panel reactive antibody, de novo donor-specific antibody status, or estimated glomerular filtration rate. CONCLUSIONS: SARS-CoV-2 vaccines do not appear to trigger alloimmunity in kidney transplant recipients. The degree of vaccine immunogenicity was associated most strongly with a prior history of SARS-CoV-2 infection.
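As a generic illustration of how an odds ratio of this kind is obtained, the sketch below computes an odds ratio and Wald 95% confidence interval from a 2x2 table using statsmodels; the counts are hypothetical placeholders and the authors' exact method may differ.

```python
# Generic illustration of computing an odds ratio with a Wald 95% CI from a 2x2 table.
# The counts below are hypothetical placeholders, not the study's data, and the
# authors' exact CI method may differ.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

# Rows: prior SARS-CoV-2 infection (yes, no); columns: anti-spike response (yes, no).
table = np.array([
    [8, 8],    # hypothetical infection-experienced: responders, non-responders
    [1, 33],   # hypothetical infection-naive: responders, non-responders
])

t22 = Table2x2(table)
low, high = t22.oddsratio_confint()
print(f"Odds ratio: {t22.oddsratio:.1f} (95% CI {low:.1f}, {high:.1f})")
```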


Subject(s)
COVID-19; Cell-Free Nucleic Acids; Kidney Transplantation; Humans; Antibodies, Viral; COVID-19/prevention & control; COVID-19 Vaccines/adverse effects; Immunity; SARS-CoV-2; Transplant Recipients; Vaccination
12.
Lancet; 402(10408): 1158-1169, 2023 Sep 30.
Article in English | MEDLINE | ID: mdl-37598688

ABSTRACT

BACKGROUND: Cross-species immunological incompatibilities have hampered pig-to-human xenotransplantation, but porcine genome engineering recently enabled the first successful experiments. However, little is known about the immune response after the transplantation of pig kidneys to human recipients. We aimed to precisely characterise the early immune responses to the xenotransplantation using a multimodal deep phenotyping approach. METHODS: We did a complete phenotyping of two pig kidney xenografts transplanted to decedent humans. We used a multimodal strategy combining morphological evaluation, immunophenotyping (IgM, IgG, C4d, CD68, CD15, NKp46, CD3, CD20, and von Willebrand factor), gene expression profiling, and whole-transcriptome digital spatial profiling and cell deconvolution. Xenografts before implantation, wild-type pig kidney autografts, as well as wild-type, non-transplanted pig kidneys with and without ischaemia-reperfusion were used as controls. FINDINGS: The data collected from xenografts suggested early signs of antibody-mediated rejection, characterised by microvascular inflammation with immune deposits, endothelial cell activation, and positive xenoreactive crossmatches. Capillary inflammation was mainly composed of intravascular CD68+ and CD15+ innate immune cells, as well as NKp46+ cells. Both xenografts showed increased expression of genes biologically related to a humoral response, including monocyte and macrophage activation, natural killer cell burden, endothelial activation, complement activation, and T-cell development. Whole-transcriptome digital spatial profiling showed that antibody-mediated injury was mainly located in the glomeruli of the xenografts, with significant enrichment of transcripts associated with monocytes, macrophages, neutrophils, and natural killer cells. This phenotype was not observed in control pig kidney autografts or in ischaemia-reperfusion models. INTERPRETATION: Despite favourable short-term outcomes and absence of hyperacute injuries, our findings suggest that antibody-mediated rejection in pig-to-human kidney xenografts might be occurring. Our results suggest specific therapeutic targets towards the humoral arm of rejection to improve xenotransplantation results. FUNDING: OrganX and MSD Avenir.


Subject(s)
Graft Rejection; Kidney; Animals; Swine; Humans; Transplantation, Heterologous; Antibodies; Immunity; Inflammation; Ischemia
13.
Nat Med; 29(8): 1989-1997, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37488288

ABSTRACT

Genetically modified xenografts are one of the most promising solutions to the discrepancy between the numbers of available human organs for transplantation and potential recipients. To date, a porcine heart has been implanted into only one human recipient. Here, using 10-gene-edited pigs, we transplanted porcine hearts into two brain-dead human recipients and monitored xenograft function, hemodynamics and systemic responses over the course of 66 hours. Although both xenografts demonstrated excellent cardiac function immediately after transplantation and continued to function for the duration of the study, cardiac function declined postoperatively in one case, attributed to a size mismatch between the donor pig and the recipient. For both hearts, we confirmed transgene expression and found no evidence of cellular or antibody-mediated rejection, as assessed using histology, flow cytometry and a cytotoxic crossmatch assay. Moreover, we found no evidence of zoonotic transmission from the donor pigs to the human recipients. While substantial additional work will be needed to advance this technology to human trials, these results indicate that pig-to-human heart xenotransplantation can be performed successfully without hyperacute rejection or zoonosis.


Subject(s)
Antibodies; Graft Rejection; Animals; Humans; Swine; Transplantation, Heterologous/methods; Heterografts; Heart; Animals, Genetically Modified
14.
JAMA Surg; 158(8): 787-788, 2023 Aug 01.
Article in English | MEDLINE | ID: mdl-37223921

ABSTRACT

This Viewpoint describes the organ shortage for patients with end-stage kidney disease despite increases in kidney donations between 2000 and 2021.


Subject(s)
Kidney Failure, Chronic; Tissue and Organ Procurement; Humans; Tissue Donors; Kidney Failure, Chronic/therapy; Living Donors
15.
J Card Fail; 29(7): 986-996, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37044281

ABSTRACT

BACKGROUND: Evidence for modulating the sodium chloride (NaCl) intake of patients hospitalized with acute heart failure (AHF) is inconclusive: salt restriction may not be beneficial, whereas hypertonic saline may aid diuresis. OBJECTIVE: To compare the safety and efficacy of oral NaCl during intravenous (IV) diuretic therapy with respect to renal function and weight. METHODS: Seventy hospitalized patients with AHF being treated with IV furosemide infusion consented to receive, in randomized double-blind fashion, 2 grams of oral NaCl or placebo 3 times a day during diuresis. Treatment efficacy (bivariate primary endpoints of change in serum creatinine level and change in weight) was measured at 96 hours, and adverse safety events were tracked for 90 days. RESULTS: Sixty-five patients (34 NaCl, 31 placebo) were included in the analysis after 5 withdrew. A median of 13 grams of NaCl was administered in the NaCl group. At 96 hours, there was no significant difference between treatment groups with respect to the primary endpoint (P = 0.33); however, the trial was underpowered, and the standard deviation in weight change was greater than expected. The mean changes in creatinine level and weight were 0.15 ± 0.44 mg/dL and 4.6 ± 4.2 kg in the placebo group, compared with 0.04 ± 0.40 mg/dL and 4.0 ± 4.3 kg in the NaCl group (P = 0.30 and 0.57, respectively). Across efficacy and safety endpoints, we observed no significant differences between the 2 groups other than the changes in serum sodium (-2.6 ± 2.7 mEq/L in the placebo group vs -0.3 ± 3.3 mEq/L in the NaCl group; P < 0.001) and blood urea nitrogen (11 ± 15 mg/dL in the placebo group vs 3.1 ± 13 mg/dL in the NaCl group; P = 0.025). CONCLUSIONS: In this single-center study, liberal vs restrictive oral sodium chloride intake strategies did not impact the safety or efficacy of intravenous diuretic therapy in patients with AHF. (ClinicalTrials.gov registration NCT04334668.)
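As a rough cross-check of the reported between-group p-values, the sketch below applies an unpaired t-test to the summary statistics quoted in the abstract (assuming such a test was used; the trial's actual analysis may have differed).

```python
# Rough cross-check of the reported p-values using an unpaired t-test on the summary
# statistics quoted in the abstract (assumed test; the trial's analysis may differ).
from scipy.stats import ttest_ind_from_stats

# Change in serum creatinine (mg/dL): placebo (n = 31) vs oral NaCl (n = 34).
creatinine = ttest_ind_from_stats(0.15, 0.44, 31, 0.04, 0.40, 34)
# Change in weight (kg): placebo vs oral NaCl.
weight = ttest_ind_from_stats(4.6, 4.2, 31, 4.0, 4.3, 34)

print(f"Creatinine change: p = {creatinine.pvalue:.2f}")  # close to the reported 0.30
print(f"Weight change:     p = {weight.pvalue:.2f}")      # close to the reported 0.57
```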


Subject(s)
Heart Failure; Humans; Heart Failure/drug therapy; Sodium Chloride/therapeutic use; Double-Blind Method; Furosemide; Diuretics/therapeutic use; Treatment Outcome; Sodium; Kidney/physiology
16.
Clin Chest Med; 44(1): 201-214, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36774165

ABSTRACT

Xenotransplantation promises to alleviate the issue of donor organ shortages and to decrease waiting times for transplantation. Recent advances in genetic engineering have allowed for the creation of pigs with up to 16 genetic modifications. Several combinations of genetic modifications have been associated with extended graft survival and life-supporting function in experimental heart and kidney xenotransplants. Lung xenotransplantation carries specific challenges related to the large surface area of the lung vascular bed, its innate immune system's intrinsic hyperreactivity to perceived 'danger', and its anatomic vulnerability to airway flooding after even localized loss of alveolocapillary barrier function. This article discusses the current status of lung xenotransplantation, and challenges related to immunology, physiology, anatomy, and infection. Tissue engineering as a feasible alternative to develop a viable lung replacement solution is discussed.


Subject(s)
Lung Transplantation; Tissue and Organ Procurement; Animals; Humans; Swine; Transplantation, Heterologous; Lung/surgery; Bioengineering
18.
Commun Med (Lond); 2(1): 150, 2022 Nov 23.
Article in English | MEDLINE | ID: mdl-36418380

ABSTRACT

BACKGROUND: Clinical decisions are driven mainly by the ability of physicians to apply risk stratification to patients. However, this task is difficult, as it requires complex integration of numerous parameters and is affected by patient heterogeneity. We sought to evaluate the ability of transplant physicians to predict the risk of long-term allograft failure and to compare their performance with that of a validated artificial intelligence (AI) prediction algorithm. METHODS: We randomly selected 400 kidney transplant recipients from a qualified dataset of 4000 patients. For each patient, 44 features routinely collected during the first year post-transplant were compiled in an electronic health record (EHR). We enrolled 9 transplant physicians at various career stages. At 1 year post-transplant, they blindly predicted long-term graft survival, assigning a probability to each patient. Their predictions were compared with those of a validated prediction system (iBox). We assessed the determinants of each physician's predictions using a random survival forest model. RESULTS: Among the 400 patients included, 84 graft failures occurred by 7 years post-evaluation. The iBox system demonstrated the best predictive performance, with a discrimination of 0.79 and a median calibration error of 5.79%, whereas physicians tended to overestimate the risk of graft failure. Physicians' risk predictions showed wide heterogeneity, with a moderate intraclass correlation of 0.58. The determinants of physicians' predictions were disparate, with poor agreement regardless of clinical experience. CONCLUSIONS: This study shows the limited overall performance and consistency of physicians in predicting the risk of long-term graft failure, as demonstrated by the superior performance of the iBox system. It supports the use of a companion tool to help physicians in their prognostic judgement and decision-making in clinical care.


The ability to predict the risk of a particular event is key to clinical decision-making, for example when predicting the risk of a poor outcome to help decide which patients should receive an organ transplant. Computer-based systems may help to improve risk prediction, particularly with the increasing volume and complexity of patient data available to clinicians. Here, we compare predictions of the risk of long-term kidney transplant failure made by clinicians with those made by our computer-based system (the iBox system). We observe that clinicians' overall performance in predicting individual long-term outcomes is limited compared to the iBox system, and demonstrate wide variability in clinicians' predictions, regardless of level of experience. Our findings support the use of the iBox system in the clinic to help clinicians predict outcomes and make decisions surrounding kidney transplants.
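For readers unfamiliar with the discrimination metric cited above (0.79 for the iBox system), the sketch below computes a concordance index (C-index) on simulated risk predictions and follow-up times; the data are invented and this is not the iBox algorithm.

```python
# Toy example of the discrimination metric (concordance index); the data are simulated
# and this is not the iBox algorithm. The lifelines library is assumed.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(7)
n = 400
predicted_risk = rng.uniform(0, 1, n)                            # hypothetical risk scores at 1 year
true_time = rng.exponential(scale=10 * (1.1 - predicted_risk))   # riskier patients fail sooner
censor_time = rng.uniform(1, 10, n)                              # administrative censoring
follow_up = np.minimum(true_time, censor_time)
observed = (true_time <= censor_time).astype(int)                # 1 = graft failure observed

# concordance_index expects scores where larger values imply longer survival,
# so the predicted risk is negated.
c_index = concordance_index(follow_up, -predicted_risk, observed)
print(f"C-index (discrimination): {c_index:.2f}")
```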

20.
Bioscience; 72(6): 549-559, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35677291

ABSTRACT

Because biodiversity loss has largely been attributed to human actions, people, particularly those in the Global South, are regularly depicted as threats to conservation. This context has facilitated rapid growth in green militarization, with fierce crackdowns against real or perceived environmental offenders. We designed an undergraduate course to assess student perspectives on biodiversity conservation and social justice and positioned those students to contribute to a human heritage-centered conservation (HHCC) initiative situated in Uganda. We evaluated changes in perspectives using pre- and postcourse surveys and reflection instruments. Although the students started the course prioritizing biodiversity conservation, even when it was costly to human well-being, by the end of the course, they were recognizing and remarking on the central importance of social justice within conservation. We present a framework for further integration of HHCC approaches into higher education courses so as to conserve the integrity of coupled human and natural systems globally.
