Results 1 - 20 of 39
1.
Kidney Int ; 106(4): 712-722, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39074554

ABSTRACT

Current kidney perfusion protocols are not optimized to address the ex vivo physiological and metabolic needs of the kidney. Ex vivo normothermic perfusion may be utilized to identify high-risk kidneys and determine suitability for transplantation. Here, we assessed the association of tissue metabolic changes with changes in a kidney injury biomarker and functional parameters in eight deceased donor kidneys deemed unsuitable for transplantation during a 12-hour ex vivo normothermic perfusion. The kidneys were grouped into good and poor performers based on blood flow and urine output. The mean age of the deceased kidney donors was 43 years, with an average cold ischemia time of 37 hours. Urine output and creatinine clearance progressively increased and peaked at six hours post-perfusion among good performers. Poor performers had 71 ng/ml greater (95% confidence interval 1.5, 140) urinary neutrophil gelatinase-associated lipocalin at six hours compared to good performers, corresponding to peak functional differences. Organ performance was distinguished by tissue metabolic differences in branched-chain amino acid metabolism, and tissue levels of these amino acids negatively correlated with urine output among all kidneys at six hours. Tissue lipid profiling showed that poor performers were characterized by accumulation of membrane structural components, including glycerolipids and sphingolipids, at early perfusion time points. Thus, we showed that six hours is needed for kidney function recovery during ex vivo normothermic perfusion and that branched-chain amino acid metabolism may be a major determinant of organ function and resilience.


Subject(s)
Amino Acids, Branched-Chain , Biomarkers , Kidney Transplantation , Kidney , Lipocalin-2 , Organ Preservation , Perfusion , Tissue Donors , Humans , Perfusion/methods , Adult , Kidney Transplantation/methods , Male , Kidney/metabolism , Kidney/blood supply , Middle Aged , Female , Organ Preservation/methods , Amino Acids, Branched-Chain/metabolism , Biomarkers/urine , Biomarkers/metabolism , Lipocalin-2/urine , Lipocalin-2/metabolism , Time Factors , Cold Ischemia/adverse effects , Donor Selection/methods , Creatinine/blood , Creatinine/urine
2.
Clin Transplant ; 38(8): e15436, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39158959

ABSTRACT

BACKGROUND: Efforts to address the shortage of donor organs include increasing the use of renal allografts from donors after circulatory death (DCD). While warm ischemia time (WIT) is thought to be an important factor in DCD kidney evaluation, few studies have examined the relationship between WIT and DCD kidney outcomes, and WIT acceptance practices remain variable. METHODS: We conducted a single-center retrospective review of all adult patients who underwent deceased donor kidney transplantation from 2000 to 2021. We evaluated the impact of varied functional warm ischemia time (fWIT) in controlled DCD donors by comparing donor and recipient characteristics and posttransplant outcomes between high-fWIT (>60 min) DCD, low-fWIT (≤60 min) DCD, and donation after brain death (DBD) kidneys. RESULTS: A total of 2811 patients were identified: 638 received low-fWIT DCD kidneys, 93 received high-fWIT DCD kidneys, and 2080 received DBD kidneys. There was no significant difference in 5-year graft survival between the low-fWIT DCD, high-fWIT DCD, and DBD groups, with 84%, 83%, and 83% of grafts functioning, respectively. Five-year patient survival was 91% in the low-fWIT group, 92% in the high-fWIT group, and 90% in the DBD group. Increased kidney donor risk index (KDRI) (HR 3.37, 95% CI = 2.1-5.7) and high compared to low cold ischemia time (HR 2.12, 95% CI = 1.4-3.1) were associated with higher hazards of 1-year graft failure. CONCLUSIONS: Increased acceptance of kidneys from selected DCD donors with prolonged fWIT may present an opportunity to increase kidney utilization while preserving outcomes. Our group specifically prioritizes the use of kidneys from younger donors with lower KDPI and without acute kidney injury or risk factors for underlying chronic kidney disease.


Subject(s)
Graft Survival , Kidney Transplantation , Tissue Donors , Tissue and Organ Procurement , Warm Ischemia , Humans , Male , Female , Retrospective Studies , Middle Aged , Follow-Up Studies , Tissue Donors/supply & distribution , Tissue and Organ Procurement/methods , Prognosis , Adult , Risk Factors , Survival Rate , Glomerular Filtration Rate , Kidney Function Tests , Graft Rejection/etiology , Kidney Failure, Chronic/surgery , Donor Selection
3.
Clin Transplant ; 33(7): e13628, 2019 07.
Article in English | MEDLINE | ID: mdl-31173413

ABSTRACT

BACKGROUND: Postoperative severe cardiopulmonary failure carries a high rate of mortality. Extracorporeal membrane oxygenation (ECMO) can be used as a salvage therapy when conventional therapies fail. METHODS: We retrospectively reviewed our experience with ECMO support in the early postoperative period after liver transplant between September 2011 and May 2016. RESULTS: Out of 537 liver transplants performed at our institution, seven patients required ECMO support with a median age of 52 and a median MELD score of 28. Veno-venous ECMO was used in four patients with severe respiratory failure while the rest required veno-arterial ECMO for circulatory failure. The median time from transplant to cannulation was 3 days with a median duration of ECMO support of 7 days. All patients except one were successfully decannulated. The median hospital length of stay was 58 days with an in-hospital mortality of 28.6%. CONCLUSION: Extracorporeal membrane oxygenation can be considered a viable rescue therapy in the setting of severe postoperative cardiopulmonary failure. Extracorporeal membrane oxygenation therapy was successful in saving patients who were otherwise unsalvageable.


Subject(s)
Extracorporeal Membrane Oxygenation/methods , Graft Rejection/therapy , Heart Arrest/therapy , Hospital Mortality/trends , Liver Transplantation/adverse effects , Postoperative Complications/therapy , Respiratory Insufficiency/therapy , Adult , Female , Follow-Up Studies , Graft Rejection/etiology , Graft Rejection/mortality , Graft Survival , Heart Arrest/etiology , Heart Arrest/mortality , Humans , Male , Middle Aged , Postoperative Complications/etiology , Postoperative Complications/mortality , Respiratory Insufficiency/etiology , Respiratory Insufficiency/mortality , Retrospective Studies , Survival Rate , Treatment Outcome
4.
Hepatobiliary Pancreat Dis Int ; 17(2): 149-154, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29709218

ABSTRACT

BACKGROUND: Consequences of incidental gallbladder cancer (iGBC) following cholecystectomy may include repeat operation (depending on T stage) and worse survival (if bile spillage occurred), both avoidable if iGBC were suspected preoperatively. METHODS: A retrospective single-institution review was performed. Ultrasound images for cases and controls were blindly reviewed by a radiologist. Chi-square and Student's t tests, as well as logistic regression and Kaplan-Meier analyses, were used. P ≤ 0.01 was considered significant. RESULTS: Among 5796 cholecystectomies performed from 2000 to 2013, 26 (0.45%) were iGBC cases. These patients were older (75.61 versus 52.27 years), had more laparoscopic-to-open conversions (23.1% versus 3.9%), underwent more imaging tests, and had larger common bile duct diameter (7.13 versus 5.04 mm) and higher alkaline phosphatase. Ultrasound imaging showed that gallbladder wall thickening (GBWT) without pericholecystic fluid (PCCF), but not focal-versus-diffuse GBWT, was significantly associated with iGBC (73.9% versus 47.4%). On multivariable logistic regression analysis, GBWT without PCCF and age were the strongest predictors of iGBC. The consequences of iGBC depended significantly on intraoperative bile spillage, with nearly all such patients developing carcinomatosis and significantly worse survival. CONCLUSIONS: Besides age, GBWT, dilated common bile duct, and elevated alkaline phosphatase, the number of preoperative imaging modalities and the presence of GBWT without PCCF are useful predictors of iGBC. Bile spillage leads to poor survival in patients with iGBC.


Subject(s)
Cholecystectomy , Gallbladder Neoplasms/pathology , Gallbladder/surgery , Incidental Findings , Adult , Aged , Aged, 80 and over , Alkaline Phosphatase/blood , Baltimore , Bile/cytology , Chi-Square Distribution , Common Bile Duct/diagnostic imaging , Female , Gallbladder/diagnostic imaging , Gallbladder/pathology , Gallbladder Neoplasms/diagnostic imaging , Gallbladder Neoplasms/mortality , Gallbladder Neoplasms/surgery , Humans , Kaplan-Meier Estimate , Logistic Models , Male , Middle Aged , Multivariate Analysis , Neoplasm Staging , Peritoneal Neoplasms/secondary , Reoperation , Retrospective Studies , Risk Factors , Time Factors , Ultrasonography , Up-Regulation
5.
Hepatobiliary Pancreat Dis Int ; 16(4): 405-411, 2017 Aug 15.
Article in English | MEDLINE | ID: mdl-28823371

ABSTRACT

BACKGROUND: Minimally invasive surgery is increasingly used for gallbladder cancer resection. Postoperative mortality at 30 days is low, but 90-day mortality is underreported. METHODS: Using the National Cancer Database (1998-2012), all resection patients were included. Thirty- and 90-day mortality rates were compared. RESULTS: A total of 36 067 patients were identified, 19 139 (53%) of whom underwent resection. Median age was 71 years and 70.7% were female. Ninety-day mortality following surgical resection was 2.3-fold higher than 30-day mortality (17.1% vs 7.4%). There was a statistically significant increase in 30- and 90-day mortality with poorly differentiated tumors, presence of lymphovascular invasion, higher tumor stage, incomplete surgical resection, and low-volume centers (P<0.001 for all). Even for the 1885 patients who underwent minimally invasive resection between 2010 and 2012, the 90-day mortality was 2.8-fold higher than the 30-day mortality (12.0% vs 4.3%). CONCLUSIONS: Ninety-day mortality following gallbladder cancer resection is significantly higher than 30-day mortality. Postoperative mortality is associated with tumor grade, lymphovascular invasion, tumor stage, type and completeness of surgical resection, and type and volume of facility.


Subject(s)
Cholecystectomy, Laparoscopic/mortality , Cholecystectomy/mortality , Gallbladder Neoplasms/surgery , Aged , Cholecystectomy/adverse effects , Cholecystectomy, Laparoscopic/adverse effects , Databases, Factual , Female , Gallbladder Neoplasms/mortality , Gallbladder Neoplasms/pathology , Humans , Male , Middle Aged , Neoplasm Grading , Neoplasm Invasiveness , Neoplasm Staging , Postoperative Complications/etiology , Postoperative Complications/mortality , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , United States/epidemiology
6.
Hepatobiliary Pancreat Dis Int ; 16(2): 197-201, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28381385

ABSTRACT

BACKGROUND: Despite the increasing use of a fatty meal (FM) as a substitute for cholecystokinin (CCK) for pain reproduction during hepato-imino-diacetic acid (HIDA) scanning in functional gallbladder disorder (FGBD), no studies have compared CCK and FM directly. The present study compared the efficacy of FM with that of CCK in the evaluation of FGBD. METHODS: Patients undergoing HIDA scans from August 2013 to May 2014 were divided into two groups: those undergoing CCK-stimulated versus FM-stimulated HIDA scans. The groups were compared according to demographics and HIDA results. RESULTS: Of 153 patients, 70 received CCK and 83 received FM. There was no difference regarding age, gender, gallstones, gallbladder ejection fraction, or time to visualization. However, significantly more of the patients receiving CCK than FM experienced pain reproduction (61% vs 30%, P<0.01). CONCLUSIONS: Stimulation of gallbladder contractility with a FM during HIDA scanning is less than half as likely to reproduce biliary symptoms as CCK, despite similar ejection fractions and other parameters. Providers should account for this difference when counseling patients regarding cholecystectomy for FGBD.


Subject(s)
Biliary Dyskinesia/diagnostic imaging , Gallbladder/diagnostic imaging , Imino Acids/administration & dosage , Radiopharmaceuticals/administration & dosage , Abdominal Pain/etiology , Adolescent , Adult , Aged , Aged, 80 and over , Biliary Dyskinesia/physiopathology , Biliary Dyskinesia/surgery , Cholecystectomy , Cholecystokinin/administration & dosage , Cholecystokinin/adverse effects , Dietary Fats/administration & dosage , Dietary Fats/adverse effects , Female , Gallbladder/physiopathology , Gallbladder/surgery , Humans , Male , Middle Aged , Predictive Value of Tests , Vitamin K/administration & dosage , Vitamin K/adverse effects , Young Adult
7.
Hepatobiliary Pancreat Dis Int ; 16(5): 545-551, 2017 Oct 15.
Article in English | MEDLINE | ID: mdl-28992888

ABSTRACT

BACKGROUND: Postoperative pancreatic fistula (POPF) remains common and morbid after pancreaticoduodenectomy (PD). A major advance in the study of POPF is the fistula risk score (FRS). METHODS: We analyzed 48 consecutive patients undergoing PD. The "Colonial Wig" pancreaticojejunostomy (CWPJ) technique was used in the last 22 PDs; we compared these 22 CWPJ cases to the 26 conventional PDs. RESULTS: Postoperative morbidity was 49% (27% Clavien grade >2). The median length of hospital stay was 11 days. In the first 26 PDs, the PJ was performed according to standard techniques, and the clinically relevant POPF (CR-POPF) rate was 15%, similar to the FRS-predicted rate (14%). In the next 22 PJs, the CWPJ was employed. Although the FRS-predicted rates were similar in the two groups (14% vs 13%), the CR-POPF rate in the CWPJ group was 0 (P=0.052). CONCLUSION: Early experience with the CWPJ is encouraging, and this anastomosis may be a safe and effective way to lower POPF rates.


Subject(s)
Pancreatic Fistula/prevention & control , Pancreaticoduodenectomy/adverse effects , Pancreaticojejunostomy/methods , Postoperative Complications/prevention & control , Humans , Morbidity , Retrospective Studies
9.
J Surg Res ; 202(1): 43-8, 2016 May 01.
Article in English | MEDLINE | ID: mdl-27083946

ABSTRACT

BACKGROUND: The gastrografin (GG) challenge is a diagnostic and therapeutic tool used to treat patients with small bowel obstruction (SBO); however, long-term data on SBO recurrence after the GG challenge remain limited. We hypothesized that patients treated with GG would have the same long-term recurrence as those treated before the implementation of the GG challenge protocol. METHODS: Patients ≥18 years who were treated for SBO between July 2009 and December 2012 were identified. We excluded patients with contraindications to the GG challenge (i.e., signs of strangulation), patients having SBO within 6 wk of previous abdominal or pelvic surgery, and patients with malignant SBO. All patients were followed a minimum of 1 y or until death. The Kaplan-Meier method and Cox regression models were used to describe the time-dependent outcomes. RESULTS: A total of 202 patients were identified, of whom 114 (56%) received the challenge. Mean patient age was 66 y (range, 19-99 y), and 110 (54%) were female. A total of 184 patients (91%) were followed a minimum of 1 y or until death (18 patients were lost to follow-up). Median follow-up of living patients was 3 y (range, 1-5 y). During follow-up, 50 patients (25%) experienced SBO recurrence, and 24 (12%) underwent exploration for SBO recurrence. The 3-year cumulative rate of SBO recurrence in patients who received the GG was 30% (95% confidence interval [CI], 21%-42%) compared to 27% (95% CI, 18%-38%) for those who did not (P = 0.4). The 3-year cumulative rate of exploration for SBO recurrence in patients who received the GG was 15% (95% CI, 8%-26%) compared to 12% (95% CI, 6%-22%) for those who did not (P = 0.6). CONCLUSIONS: The GG challenge is a clinically useful tool for treating SBO, with long-term recurrence rates comparable to those of traditional management of SBO.


Subject(s)
Contrast Media , Diatrizoate Meglumine , Intestinal Obstruction/diagnostic imaging , Intestine, Small/diagnostic imaging , Adult , Aged , Aged, 80 and over , Combined Modality Therapy , Female , Follow-Up Studies , Humans , Intestinal Obstruction/therapy , Intestine, Small/surgery , Intubation, Gastrointestinal , Kaplan-Meier Estimate , Male , Middle Aged , Proportional Hazards Models , Radiography , Recurrence , Retrospective Studies , Treatment Outcome , Young Adult
10.
J Surg Res ; 204(2): 428-434, 2016 08.
Article in English | MEDLINE | ID: mdl-27565079

ABSTRACT

BACKGROUND: An anatomic severity schema for small bowel obstruction (SBO) has been described by the American Association for the Surgery of Trauma (AAST). Although the importance of physiological and comorbid parameters was acknowledged, these factors were not included in the developed system. Thus, we sought to validate the AAST SBO scoring system and evaluate the effect of adding the patient's physiology and comorbidity on the predictive value of the proposed system. METHODS: Patients aged ≥18 y who were treated for SBO at our institution between 2009 and 2012 were identified. The physiology, comorbidity, and AAST anatomic scores were determined, squared, and added to calculate a score that we termed the Acute General Emergency Surgical Severity-Small Bowel Obstruction (AGESS-SBO) score. Area under the receiver operating characteristic (AUROC) curve analyses were performed for the AAST anatomic score and compared with the AGESS-SBO score as predictors of in-hospital mortality, extended hospital stay, and in-hospital complications. RESULTS: A total of 351 patients with a mean age of 66 ± 17 years were identified, of whom 145 (41%) underwent operation to treat bowel obstruction. Extended hospital stay (>9 d) occurred in 86 patients (25%), in-hospital complications in 73 (21%), and in-hospital mortality in 8 (2%). The median AAST anatomic score was 1 point (interquartile range [IQR]: 1-2), the physiology score was 0 points (IQR: 0-1), and the comorbidity score was 1 point (IQR: 1-3), for an overall median AGESS-SBO score of 5 points (IQR: 3-13). The AUROC curve analyses demonstrated that the AGESS-SBO system, with measures of presenting physiology and comorbidities in addition to AAST anatomic criteria, could be beneficial in predicting key outcomes, including in-hospital mortality (AUROC curve: 0.80 versus 0.54, P = 0.03). CONCLUSIONS: The AAST anatomic score is a reliable system that assists care providers in categorizing SBO. Adding physiology and comorbidity parameters to the described anatomic criteria can help predict outcomes, including mortality. Further studies evaluating its usefulness for research and quality improvement across institutions are required.
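The score construction described in the abstract (each component score squared, then summed) can be sketched in a few lines; the function name and example values are illustrative, not taken from the study's data:

```python
def agess_sbo(anatomic: int, physiology: int, comorbidity: int) -> int:
    """AGESS-SBO score as described in the abstract: the AAST anatomic,
    physiology, and comorbidity component scores are each squared
    and then summed."""
    return anatomic ** 2 + physiology ** 2 + comorbidity ** 2

# Example using the median component scores reported above
# (a sum of medians need not equal the reported median total of 5):
print(agess_sbo(anatomic=1, physiology=0, comorbidity=1))  # 2
```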


Subject(s)
Intestinal Obstruction , Severity of Illness Index , Adult , Aged , Aged, 80 and over , Female , Humans , Intestine, Small , Male , Middle Aged , Young Adult
11.
JOP ; 16(2): 125-35, 2015 Mar 20.
Article in English | MEDLINE | ID: mdl-25791545

ABSTRACT

The objective of this review is to summarize the current state of the art of the management of necrotizing pancreatitis, and to clarify some confusing points regarding the terminology and diagnosis of necrotizing pancreatitis, as these points are essential for management decisions and communication between providers and within the literature. Acute pancreatitis varies widely in its clinical presentation. Despite the publication of the Atlanta guidelines, misuse of pancreatitis terminology continues in the literature and in clinical practice, especially regarding the local complications associated with severe acute pancreatitis. Necrotizing pancreatitis is a manifestation of severe acute pancreatitis associated with significant morbidity and mortality. Diagnosis is aided by pancreas-protocol computed tomography or magnetic resonance imaging, ideally 72 h after onset of symptoms to achieve the most accurate characterization of pancreatic necrosis. The extent of necrosis correlates well with the incidence of infected necrosis, organ failure, need for debridement, and morbidity and mortality. Having established the diagnosis of pancreatic necrosis, goals of appropriately aggressive resuscitation should be established and adhered to in a multidisciplinary approach, ideally at a high-volume pancreatic center. The role of antibiotics is determined by the presence of infected necrosis. Early enteral feeds improve outcomes compared with parenteral nutrition. Pancreatic necrosis is associated with a multitude of complications which can lead to long-term morbidity or mortality. Interventional therapy should be guided by available resources and the principle of a minimally invasive approach. When open debridement is necessary, it should be delayed at least 3-6 weeks to allow demarcation of necrotic from viable tissue.

13.
Digestion ; 90(3): 147-54, 2014.
Article in English | MEDLINE | ID: mdl-25278145

ABSTRACT

BACKGROUND: Motility disorders of the biliary tree [biliary dyskinesia, including both gallbladder dysfunction (GBD) and sphincter of Oddi dysfunction] are difficult to diagnose and to treat. SUMMARY: There is controversy in the literature, in particular regarding the criteria that should be used to select patients for cholecystectomy (CCY) in cases of suspected GBD. The current review covers the history, diagnosis, and treatment of GBD. KEY MESSAGES: Only about 85% of patients with suspected GBD have relief following CCY, a much lower rate than the nearly 100% success rate following CCY for gallstone disease. Unfortunately, the literature is lacking, and there are no universally agreed-upon criteria for selecting which patients to refer for operation, although the cholecystokinin (CCK)-enhanced hepatobiliary iminodiacetic acid scan is often used, with emphasis on an abnormally low gallbladder ejection fraction or pain reproduction on CCK administration. There is a clear need for large, well-designed, more definitive, prospective studies to better identify the indications for and efficacy of CCY in cases of GBD.


Subject(s)
Biliary Dyskinesia , Cholecystectomy , Biliary Dyskinesia/diagnosis , Biliary Dyskinesia/etiology , Biliary Dyskinesia/surgery , Cholagogues and Choleretics , Cholecystectomy/trends , Cholecystokinin , Gallbladder Diseases/diagnosis , Gallbladder Diseases/etiology , Gallbladder Diseases/surgery , Humans , Sphincter of Oddi Dysfunction/diagnosis , Sphincter of Oddi Dysfunction/etiology , Sphincter of Oddi Dysfunction/surgery
14.
JAMA Surg ; 159(1): 60-68, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37910090

ABSTRACT

Importance: Despite the unmet need, many deceased-donor kidneys are discarded or not recovered. Inefficient allocation and prolonged ischemia time are contributing factors, and early detection of high-risk donors may reduce organ loss. Objective: To evaluate the feasibility of machine learning (ML) and natural language processing (NLP) classification of donors whose kidneys are used vs not used for organ transplant. Design, Setting, and Participants: This retrospective cohort study used donor information (structured donor characteristics and unstructured donor narratives) from the United Network for Organ Sharing (UNOS). All donor offers to a single transplant center between January 2015 and December 2020 were used to train and validate ML models to predict donors who had at least 1 kidney transplanted (at our center or another center). The donor data from 2021 were used to test each model. Exposures: Donor information was provided by UNOS to the transplant centers with potential transplant candidates. Each center evaluated the donor and decided within an allotted time whether to accept the kidney for organ transplant. Main Outcomes and Measures: Outcome metrics for the test cohort included area under the receiver operating characteristic curve (AUROC), F1 score, accuracy, precision, and recall of each ML classifier. Feature importance and Shapley additive explanation (SHAP) summaries were assessed for model explainability. Results: The training/validation cohort included 9555 donors (median [IQR] age, 50 [36-58] years; 5571 male [58.3%]), and the test cohort included 2481 donors (median [IQR] age, 52 [40-59] years; 1496 male [60.3%]). Only 20% to 30% of potential donors had at least 1 kidney transplanted. The ML model with a single variable (Kidney Donor Profile Index) showed an AUROC of 0.69, F1 score of 0.42, and accuracy of 0.64. Multivariable ML models based on basic a priori structured donor data showed similar metrics (logistic regression: AUROC = 0.70, F1 score = 0.42, accuracy = 0.62; random forest classifier: AUROC = 0.69, F1 score = 0.42, accuracy = 0.64). The classic NLP model (bag-of-words) achieved its best metrics (AUROC = 0.60, F1 score = 0.35, accuracy = 0.59) with the logistic regression classifier. The advanced Bidirectional Encoder Representations From Transformers model showed comparable metrics (AUROC = 0.62, F1 score = 0.39, accuracy = 0.69) only after basic donor information was appended. Feature importance and SHAP identified the variables (and words) that most affected the models. Conclusions and Relevance: Results of this cohort study suggest that ML models can be applied to predict donors with high-risk kidneys not used for organ transplant, but the models need further refinement. The use of unstructured data is likely to expand the possibilities; further exploration of new approaches will be necessary to develop models with better predictive metrics.
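The evaluation metrics reported above (AUROC, F1 score) are standard and framework-independent; a minimal stdlib-only sketch of two of them, included purely to illustrate the definitions, not as the study's code:

```python
def auroc(y_true, scores):
    """Area under the ROC curve via its rank interpretation: the
    probability that a randomly chosen positive outscores a randomly
    chosen negative (ties counted as one half)."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def f1_score(y_true, y_pred):
    """Harmonic mean of precision and recall for binary labels."""
    tp = sum(y == 1 and p == 1 for y, p in zip(y_true, y_pred))
    fp = sum(y == 0 and p == 1 for y, p in zip(y_true, y_pred))
    fn = sum(y == 1 and p == 0 for y, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# A perfectly ranked toy example: every positive outscores every negative.
print(auroc([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.4]))  # 1.0
```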


Subject(s)
Kidney Transplantation , Humans , Male , Middle Aged , Cohort Studies , Retrospective Studies , Kidney , Tissue Donors
15.
Am J Surg ; 229: 129-132, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38110322

ABSTRACT

BACKGROUND: Functional gallbladder disorder (FGBD) remains a controversial indication for cholecystectomy. METHODS: A prospective cohort study enrolled patients strictly meeting Rome criteria for FGBD, and cholecystectomy was performed. Patients were assessed preoperatively and at 3 and 6 months postoperatively with surveys of abdominal pain and quality of life (RAPID and SF-12 surveys, respectively). An interim analysis was performed. RESULTS: Although neither ejection fraction nor pain reproduction predicted success after cholecystectomy, the vast majority of enrolled patients had a successful outcome after undergoing cholecystectomy for FGBD: of a planned 100 patients, 46 were enrolled. Of 31 evaluable patients, 26 (83.9%) reported RAPID improvement and 28 (93.3%) reported SF-12 improvement at 3- or 6-month follow-up. CONCLUSION: FGBD, strictly diagnosed, should perhaps no longer be a controversial indication for cholecystectomy, since its success rate for biliary pain in this study was similar to that for symptomatic cholelithiasis. Larger-scale studies or randomized trials may confirm these findings.


Subject(s)
Biliary Dyskinesia , Gallbladder Diseases , Humans , Gallbladder , Prospective Studies , Quality of Life , Gallbladder Diseases/surgery , Gallbladder Diseases/diagnosis , Abdominal Pain/etiology , Biliary Dyskinesia/surgery , Retrospective Studies , Treatment Outcome
16.
Clin J Am Soc Nephrol ; 19(3): 292-300, 2024 03 01.
Article in English | MEDLINE | ID: mdl-37930674

ABSTRACT

BACKGROUND: Use of eGFR to determine preemptive waitlisting eligibility, which can occur only when the eGFR falls to ≤20 ml/min per 1.73 m2, may contribute to racial/ethnic disparities in access to waitlisting. Use of an alternative risk-based strategy for waitlisting (e.g., a kidney failure risk equation [KFRE]-estimated 2-year risk of kidney failure), rather than the standard eGFR threshold, may reduce these inequities. Our objective was to model the amount of preemptive waittime that could be accrued by race and ethnicity under two different strategies for determining waitlist eligibility. METHODS: Using electronic health record data, linear mixed models were used to compare racial/ethnic differences in accruable preemptive waittime under two strategies: the time between an eGFR of ≤20 and 5 ml/min per 1.73 m2 versus the time between a 25% 2-year predicted risk of kidney failure (using the KFRE, which incorporates age, sex, albuminuria, and eGFR) and an eGFR of 5 ml/min per 1.73 m2. RESULTS: Among 1290 adults with CKD stages 4-5, using the Chronic Kidney Disease Epidemiology Collaboration equation yielded shorter preemptive waittime between an eGFR of 20 and 5 ml/min per 1.73 m2 in Black (-6.8 months; 95% confidence interval [CI], -11.7 to -1.9), Hispanic (-10.2 months; 95% CI, -15.3 to -5.1), and Asian/Pacific Islander (-10.3 months; 95% CI, -15.3 to -5.4) patients compared with non-Hispanic White patients. Use of a KFRE threshold to determine waittime yielded smaller differences by race and ethnicity than a single eGFR threshold, with shorter time still noted for Black (-2.5 months; 95% CI, -7.8 to 2.7), Hispanic (-4.8 months; 95% CI, -10.3 to 0.6), and Asian/Pacific Islander (-5.4 months; 95% CI, -10.7 to -0.1) individuals compared with non-Hispanic White individuals, although findings met statistical significance criteria only for Asian/Pacific Islander individuals. When we compared potential waittime availability using a KFRE versus an eGFR threshold, use of the KFRE yielded more equity in waittime for Black (P = 0.02), Hispanic (P = 0.002), and Asian/Pacific Islander (P = 0.002) patients. CONCLUSIONS: Use of a risk-based strategy was associated with greater racial equity in waittime accrual compared with use of a standard single eGFR threshold to determine eligibility for preemptive waitlisting.
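The two eligibility strategies compared in this study reduce to different threshold rules. A hypothetical sketch follows: the thresholds (eGFR ≤20, 25% 2-year risk) come from the abstract, but the function names and example patient are invented for illustration, and the actual KFRE coefficients are not reproduced here:

```python
def eligible_by_egfr(egfr: float) -> bool:
    """Standard strategy: preemptive waitlisting is permitted only once
    eGFR falls to <=20 ml/min per 1.73 m2."""
    return egfr <= 20.0

def eligible_by_kfre(predicted_2yr_risk: float) -> bool:
    """Risk-based strategy: eligible once the KFRE-predicted 2-year risk
    of kidney failure reaches 25%."""
    return predicted_2yr_risk >= 0.25

# A hypothetical patient with eGFR 24 but high predicted risk would be
# eligible under the risk-based rule while still ineligible under the
# eGFR rule -- the source of the waittime difference studied above.
print(eligible_by_egfr(24.0), eligible_by_kfre(0.30))  # False True
```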


Subject(s)
Renal Insufficiency, Chronic , Renal Insufficiency , Adult , Humans , Black or African American , Ethnicity , Hispanic or Latino , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/therapy , Asian American Native Hawaiian and Pacific Islander , White
17.
Transplant Direct ; 10(3): e1581, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38380346

ABSTRACT

Background: Few studies have evaluated the efficacy of transverse abdominis plane (TAP) block in patients undergoing hand-assisted laparoscopic live-donor nephrectomy (HALN). We aimed to evaluate the analgesic effectiveness of TAP block as part of a multimodal pain management regimen in patients undergoing HALN. Methods: We retrospectively reviewed the medical records of living kidney donors at our center between June 2016 and February 2020. HALNs were performed via a transperitoneal approach through a suprapubic incision. Additional laparoscopic ports were used in the upper midabdomen. In consenting donors, TAP block was performed postoperatively under ultrasound guidance with either a single-shot or continuous infusion of long-acting local anesthetic (0.2%-0.5% ropivacaine). All patients received postoperative around-the-clock ketorolac and acetaminophen. Results: Overall, 72 donors received the block (block group: 38 single-shot, 34 continuous), whereas 86 donors did not (control group). Baseline characteristics were comparable between the groups except for body weight (control: 71.8 ± 13.3 versus block: 77.8 ± 17.3 kg; P = 0.01) and intraoperative opioid dose (32.1 ± 9.6 versus 26.6 ± 10.7 morphine milligram equivalents; P < 0.001). After adjusting for baseline differences, postoperative opioid requirements were similar between the groups. When the baseline pain scale was adjusted for, there was no difference in overall pain scale scores between the groups (P = 0.242). Subgroup analyses comparing single-shot or continuous TAP versus control did not show any differences. Conclusions: With the caveat of the retrospective nature of the study, the adjunctive effect of TAP block after transabdominal HALN was limited when other multimodal analgesia was used.

18.
Transplant Proc ; 56(8): 1712-1720, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39198066

ABSTRACT

BACKGROUND: The mismatch between the number of patients awaiting kidney transplantation and the supply of donor organs has contributed to the increase in kidney transplantation from donors after circulatory death (DCD). Persistently long waiting times have led the transplant community to continue to explore the use of expanded-criteria DCD kidneys. In parallel, advances in organ preservation strategies have contributed to an overall increase in DCD organ transplantation and are altering the transplant landscape. Some of these techniques may improve kidney allograft outcomes and affect how DCD kidneys are used. We aimed to better understand current practices in accepting DCD kidney offers. METHODS: Directors of 196 US kidney transplant centers were emailed a link to an online survey over a 5-week period. RESULTS: Forty-eight of the 364 directors (13%) responded, with all United Network for Organ Sharing regions represented. Definitions of warm ischemia time (WIT) used in DCD kidney evaluation varied widely among the respondents. The maximum total WIT limit also varied: 19 respondents (39.6%) reported <60 minutes, 16 (33%) <90 minutes, and 10 (20.8%) <120 minutes. CONCLUSIONS: Despite increasing DCD kidney transplantation volumes in the United States, there are no standardized procurement biopsy practices, no standard organ procurement organization preoperative protocols, and no consensus definition or limits of WIT. Agreement on terminology may facilitate rapid clinical communication, efficiency of organ allocation and utilization, recording of data, research, and improvements in policy.


Subject(s)
Kidney Transplantation , Tissue Donors , Tissue and Organ Procurement , Humans , United States , Tissue and Organ Procurement/methods , Surveys and Questionnaires , Tissue Donors/supply & distribution , Organ Preservation/methods , Warm Ischemia
20.
Transplant Direct ; 8(1): e1277, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34966844

ABSTRACT

BACKGROUND: Outcomes of liver transplantation (LT) from donation after circulatory death (DCD) have been improving; however, ischemic cholangiopathy (IC) continues to be a problem. In 2014, measures to minimize donor hepatectomy time (DHT) and cold ischemia time (CIT) were adopted to improve DCD LT outcomes. METHODS: A retrospective review of all patients who underwent DCD LT between 2005 and 2017 was performed. We compared outcomes of patients transplanted before 2014 (historic group) with those transplanted between 2014 and 2017 (modern group). RESULTS: We identified 112 patients: 44 in the historic group and 68 in the modern group. Donors in the historic group were younger (26.5 versus 33 years, P = 0.007) and had a lower body mass index (26.2 versus 28.2, P = 0.007). DHT (min) and CIT (h) were significantly longer in the historic group (21.5 versus 14, P < 0.001 and 5.3 versus 4.2, P < 0.001, respectively). Fourteen patients (12.5%) developed IC, with a significantly higher incidence in the historic group (23.3% versus 6.1%, P = 0.02). There was no difference in graft or patient survival between the two groups. CONCLUSION: In appropriately selected recipients, minimization of DHT and CIT may decrease the incidence of IC. These changes can potentially expand the DCD donor pool.
