Results 1 - 20 of 64
1.
Ann Surg ; 279(1): 112-118, 2024 01 01.
Article in English | MEDLINE | ID: mdl-37389573

ABSTRACT

OBJECTIVE: To determine the association of sex with access to liver transplantation among candidates with the highest possible Model for End-Stage Liver Disease score (MELD 40). BACKGROUND: Women with end-stage liver disease are less likely than men to receive liver transplantation, due in part to MELD's underestimation of renal dysfunction in women. The extent of the sex-based disparity among patients with high disease severity and equally high MELD scores is unclear. METHODS: Using national transplant registry data, we compared liver offer acceptance (offers received at match MELD 40) and waitlist outcomes (transplant vs death/delisting) by sex for 7654 waitlisted liver transplant candidates from 2009 to 2019 who reached MELD 40. Multivariable logistic and competing-risks regressions were used to estimate the association of sex with each outcome, adjusting for candidate and donor factors. RESULTS: Women (N = 3019, 39.4%) spent equal time active at MELD 40 (median: 5 vs 5 days, P = 0.28) but had lower offer acceptance (9.2% vs 11.0%, P < 0.01) compared with men (N = 4635, 60.6%). Adjusting for candidate/donor factors, offers to women were less likely to be accepted (odds ratio = 0.87, P < 0.01). Adjusting for candidate factors, once they reached MELD 40, women were less likely to be transplanted (subdistribution hazard ratio = 0.90, P < 0.01) and more likely to die or be delisted (subdistribution hazard ratio = 1.14, P = 0.02). CONCLUSIONS: Even among candidates with high disease severity and equally high MELD scores, women have reduced access to liver transplantation and worse outcomes compared with men. Policies addressing this disparity should consider factors beyond MELD score adjustments alone.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Male , Humans , Female , End Stage Liver Disease/surgery , End Stage Liver Disease/complications , Severity of Illness Index , Tissue Donors , Waiting Lists
2.
Ann Surg ; 277(1): 57-65, 2023 Jan 01.
Article in English | MEDLINE | ID: mdl-33914483

ABSTRACT

OBJECTIVE: To examine potential disparities in patient access to elective procedures during the recovery phase of the COVID-19 pandemic. SUMMARY OF BACKGROUND DATA: Elective surgeries were acutely limited during the pandemic. Access to surgical care was restored in a recovery phase, but backlogs and societal shifts are hypothesized to have impacted surgical access. METHODS: Adults with electronic health record orders for procedures ("procedure requests"), from March 16 to August 25, 2019 and March 16 to August 25, 2020, were included. Logistic regression was performed for requested procedures that were not scheduled. Linear regression was performed for wait time from request to scheduled or completed procedure. RESULTS: The number of patients with procedure requests decreased 20.8%, from 26,789 in 2019 to 21,162 in 2020. Patients aged 36-50 and >65 years, those speaking non-English languages, those with Medicare or no insurance, and those living >100 miles away had disproportionately larger decreases. Requested procedures had significantly increased adjusted odds ratios (aORs) of not being scheduled for patients with primary languages other than English, Spanish, or Cantonese [aOR 1.60, 95% confidence interval (CI) 1.12-2.28]; unpartnered marital status (aOR 1.21, 95% CI 1.07-1.37); and uninsured or self-pay status (aOR 2.03, 95% CI 1.53-2.70). Significantly longer wait times were seen for patients aged 36-65 years; with Medi-Cal insurance; from ZIP codes with lower incomes; and from ZIP codes >100 miles away. CONCLUSIONS: Patient access to elective surgeries decreased during the pandemic recovery phase, with disparities based on patient age, language, marital status, insurance, socioeconomic status, and distance from care. Steps to address modifiable disparities have been taken.


Subject(s)
COVID-19 , Medicare , Adult , Humans , Aged , United States , Pandemics , Elective Surgical Procedures , Medically Uninsured , Healthcare Disparities
3.
Ann Surg ; 276(5): 860-867, 2022 11 01.
Article in English | MEDLINE | ID: mdl-35894428

ABSTRACT

OBJECTIVE: To define benchmark cutoffs for redo liver transplantation (redo-LT). BACKGROUND: In the era of organ shortage, redo-LT is frequently discussed in terms of expected poor outcome and wasteful resources. However, there is a lack of benchmark data to reliably evaluate outcomes after redo-LT. METHODS: We collected data on redo-LT between January 2010 and December 2018 from 22 high-volume transplant centers. Benchmark cases were defined as recipients with a Model for End-Stage Liver Disease (MELD) score ≤25, absence of portal vein thrombosis, no mechanical ventilation at the time of surgery, and receipt of a graft from a donor after brain death. High-urgent priority and early redo-LT, including those for primary nonfunction (PNF) or hepatic artery thrombosis, were excluded. Benchmark cutoffs were derived from the 75th percentile of the medians of all benchmark centers. RESULTS: Of 1110 redo-LT, 373 (34%) cases qualified as benchmark cases. Among these cases, the rate of postoperative complications was 76% until discharge and increased to 87% at 1 year. The 1-year overall survival rate was excellent at 90%. Benchmark cutoffs included a Comprehensive Complication Index (CCI®) at 1 year of ≤72, and in-hospital and 1-year mortality rates of ≤13% and ≤15%, respectively. In contrast, patients who received a redo-LT for PNF showed worse outcomes, with some values dramatically outside the redo-LT benchmarks. CONCLUSION: This study shows that redo-LT achieves good outcomes in benchmark scenarios. However, this figure changes in high-risk redo-LT, as for example in PNF. This analysis objectifies, for the first time, results and efforts for redo-LT and can serve as a basis for discussion about the use of scarce resources.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Tissue and Organ Procurement , Benchmarking , End Stage Liver Disease/surgery , Graft Survival , Humans , Retrospective Studies , Treatment Outcome
4.
Liver Transpl ; 28(7): 1144-1157, 2022 07.
Article in English | MEDLINE | ID: mdl-35226793

ABSTRACT

Living donor liver transplantation (LDLT) is an attractive option to decrease waitlist dropout, particularly for patients with hepatocellular carcinoma (HCC) who face lengthening waiting times. Using the United Network for Organ Sharing (UNOS) national database, trends in LDLT utilization for patients with HCC were evaluated, and post-LT outcomes for LDLT versus deceased donor liver transplantation (DDLT) were compared. From 1998 to 2018, LT was performed in 20,161 patients with HCC including 726 (3.6%) who received LDLT. The highest LDLT utilization was prior to the 2002 HCC Model for End-Stage Liver Disease (MELD) exception policy (17.5%) and dropped thereafter (3.1%) with a slight increase following the 6-month wait policy in 2015 (3.8%). LDLT was more common in patients from long-wait UNOS regions with blood type O, in those with larger total tumor diameter (2.3 vs. 2.1 cm, p = 0.02), and higher alpha-fetoprotein at LT (11.5 vs. 9.0 ng/ml, p = 0.04). The 5-year post-LT survival (LDLT 77% vs. DDLT 75%), graft survival (72% vs. 72%), and HCC recurrence (11% vs. 13%) were similar between groups (all p > 0.20). In conclusion, LDLT utilization for HCC has remained low since 2002 with only a slight increase after the 6-month wait policy introduction in 2015. Given the excellent post-LT survival, LDLT appears to be an underutilized but valuable option for patients with HCC, especially those at high risk for waitlist dropout.


Subject(s)
Carcinoma, Hepatocellular , End Stage Liver Disease , Liver Neoplasms , Liver Transplantation , End Stage Liver Disease/etiology , End Stage Liver Disease/surgery , Humans , Liver Transplantation/adverse effects , Living Donors , Neoplasm Recurrence, Local , Retrospective Studies , Severity of Illness Index , Treatment Outcome
5.
Clin Transplant ; 36(3): e14539, 2022 03.
Article in English | MEDLINE | ID: mdl-34791697

ABSTRACT

BACKGROUND: Most patients are listed for liver transplant (LT) following extensive workup as outpatients ("conventional evaluation"). Some patients undergo urgent evaluation as inpatients after being transferred to a transplant center ("expedited evaluation"). We hypothesized that expedited patients would have inferior survival due to disease severity at the time of transplant and shorter workup time. METHODS: Patients who underwent evaluation for LT at our institution between 2012 and 2016 were retrospectively reviewed. The expedited and conventional cohorts were defined as above. Living donor LT recipients, combined liver-kidney recipients, acute liver failure patients, and re-transplant patients were excluded. We compared patient characteristics and overall survival between patients who received a transplant following expedited evaluation and those who did not, and between LT recipients based on expedited or conventional evaluation. RESULTS: Five-hundred and nine patients were included (110 expedited, 399 conventional). There was no difference in graft or patient survival at 1 year for expedited versus conventional LT recipients. In multivariable analysis of overall survival, only Donor Risk Index (HR 1.97, CI 1.04-3.73, P = .037, per unit increase) was associated with increased risk of death. CONCLUSIONS: Patients who underwent expedited evaluation for LT had significant demographic and clinical differences from patients who underwent conventional evaluation, but comparable post-transplant survival.


Subject(s)
Liver Transplantation , Graft Survival , Humans , Liver Transplantation/adverse effects , Living Donors , Retrospective Studies , Risk Factors , Transplant Recipients , Treatment Outcome
6.
Clin Transplant ; 36(6): e14610, 2022 06.
Article in English | MEDLINE | ID: mdl-35143698

ABSTRACT

This study used the prospective National Surgical Quality Improvement Program (NSQIP) Transplant pilot database to analyze surgical complications after liver transplantation (LT) in LT recipients from 2017 to 2019. The primary outcome was surgical complication requiring intervention (Clavien-Dindo grade II or greater) within 90 days of transplant. Of the 1684 deceased donor and 109 living donor LT cases included from 29 centers, 38% of deceased donor liver recipients and 47% of living donor liver recipients experienced a complication. The most common complications included biliary complications (19% DDLT; 31% LDLT), hemorrhage requiring reoperation (14% DDLT; 9% LDLT), and vascular complications (6% DDLT; 9% LDLT). Management of biliary leaks (35.3% ERCP, 38.0% percutaneous drainage, 26.3% reoperation) and vascular complications (36.2% angioplasty/stenting, 31.2% medication, 29.8% reoperation) was variable. Biliary (aHR 5.14, 95% CI 2.69-9.8, P < .001), hemorrhage (aHR 2.54, 95% CI 1.13-5.7, P = .024) and vascular (aHR 2.88, 95% CI .85-9.7, P = .089) complication status at 30 days post-transplant were associated with lower 1-year patient survival. We conclude that biliary, hemorrhagic, and vascular complications continue to be significant sources of morbidity and mortality for LT recipients. Understanding the different risk factors for complications between deceased and living donor liver recipients and standardizing complication management represent avenues for continued improvement.


Subject(s)
Liver Transplantation , Living Donors , Humans , Liver Transplantation/adverse effects , Postoperative Complications/etiology , Prospective Studies , Quality Improvement , Retrospective Studies , Treatment Outcome
7.
Am J Transplant ; 21(4): 1633-1636, 2021 04.
Article in English | MEDLINE | ID: mdl-33171017

ABSTRACT

Living donor liver transplantation (LDLT) enjoys widespread use in Asia, but remains limited to a handful of centers in North America and comprises only 5% of liver transplants performed in the United States. In contrast, living donor kidney transplantation is used frequently in the United States, and has evolved to commonly include paired exchanges, particularly for ABO-incompatible pairs. Liver paired exchange (LPE) has been utilized in Asia, and was recently reported in Canada; here we report the first LPE performed in the United States, and the first LPE to be performed on consecutive days. The LPE performed at our institution was initiated by a nondirected donor who enabled the exchange for an ABO-incompatible pair, and the final recipient was selected from our deceased donor waitlist. The exchange was performed over the course of 2 consecutive days, and relied on the use and compliance of a bridge donor. Here, we show that LPE is feasible at centers with significant LDLT experience and affords an opportunity to expand LDLT in cases of ABO incompatibility or when nondirected donors arise. To our knowledge, this represents the first exchange of its kind in the United States.


Subject(s)
Liver Transplantation , Living Donors , ABO Blood-Group System , Blood Group Incompatibility , Canada , Humans , North America
8.
J Surg Res ; 265: 153-158, 2021 09.
Article in English | MEDLINE | ID: mdl-33940238

ABSTRACT

BACKGROUND: Kidney transplant recipients are frequently prescribed excess opioids at discharge relative to their inpatient requirements. Recipients who fill prescriptions after transplant have an increased risk of death and graft loss. This study examined the impact of standardized prescriptions on discharge amount and number of outpatient refills. MATERIALS AND METHODS: A historical cohort (Group 1) was compared to a cohort without patient-controlled analgesia (Group 2) and a cohort in which providers prescribed no opioids to patients who required none on the day prior to discharge, and 10 pills to those who required opioids on the day prior (Group 3). Demographics, oral morphine equivalents (OMEs) prescribed on the day prior to and at discharge, and outpatient refills were collected. RESULTS: 270 recipients were included. There was a nonsignificant trend towards lower OMEs on the day prior to discharge in Groups 2 and 3. Nonopioid adjunct use increased (P < 0.001). Discharge OMEs significantly decreased (mean 87.2 in Group 1, 62.8 in Group 2, 26.6 in Group 3, P < 0.001). The number of patients discharged without opioids increased (23.8% of Group 1, 37.5% of Group 2, 60.6% of Group 3, P < 0.001). Group 3, Asian descent, and lower OMEs on the day prior were factors significantly associated with decreased discharge OMEs on multivariable linear regression. Twelve percent of Group 2 and 2% of Group 3 patients received an outpatient refill (P = 0.02). CONCLUSIONS: A protocol targeting discharge opioids significantly reduced the amount of opioids prescribed in kidney transplant recipients; most patients subsequently received no opioids at discharge.


Subject(s)
Analgesics, Opioid/administration & dosage , Drug Prescriptions/standards , Kidney Transplantation/adverse effects , Pain, Postoperative/prevention & control , Adult , Aged , Clinical Protocols , Female , Humans , Male , Middle Aged , Pain, Postoperative/etiology , Patient Discharge , Retrospective Studies
9.
Clin Transplant ; 35(9): e14413, 2021 09.
Article in English | MEDLINE | ID: mdl-34196437

ABSTRACT

BACKGROUND: Postoperative pain after living donor hepatectomy is significant. Postoperative coagulopathy may limit the use of epidural analgesia, the gold standard for pain control in abdominal surgery. The erector spinae plane block (ESPB) is a novel regional anesthesia technique that has been shown to provide effective analgesia in abdominal surgery. In this study, we examined the effect of continuous ESPB, administered via catheters, on perioperative opioid requirements after right living donor hepatectomies for liver transplantation. METHODS: We performed a retrospective cohort study in patients undergoing right living donor hepatectomy. Twenty-four patients who received preoperative ESPB were compared to 51 historical controls who did not receive regional anesthesia. The primary endpoint was the total amount of oral morphine equivalents (OMEs) required on the day of surgery and postoperative day (POD) 1. RESULTS: Patients in the ESPB group required a lower total amount of OMEs on the day of surgery and POD 1 [141 (107-188) mg] compared with the control group [293 (220-380) mg; P < .001]. CONCLUSIONS: The use of continuous ESPB significantly reduced opioid consumption following right living donor hepatectomy.


Subject(s)
Analgesia, Epidural , Nerve Block , Feasibility Studies , Hepatectomy , Humans , Living Donors , Retrospective Studies
10.
Clin Transplant ; 35(3): e14195, 2021 03.
Article in English | MEDLINE | ID: mdl-33340143

ABSTRACT

Lower extremity (LE) vascular disease and adverse cardiovascular events (ACEs) cause significant long-term morbidity after simultaneous pancreas-kidney (SPK) transplantation. This study's purpose was to describe the incidence of, and risk factors associated with, LE vascular complications and related ACEs following SPK. All SPKs performed at the authors' institution from 2000 to 2019 were retrospectively analyzed. The primary outcome was any LE vascular event, defined as LE endovascular intervention, open surgery, amputation, or invasive podiatry intervention. Secondary outcomes included post-SPK ACE. A total of 363 patients were included, of whom 54 (14.9%) required at least one LE vascular intervention following SPK. Only 3 patients received pre-SPK ankle brachial indices (ABIs). A history of peripheral artery disease (PAD) (HR 2.95, CI 1.4-6.2) was a risk factor for post-SPK LE vascular intervention even after adjustment for other factors. Fifty-nine (16.3%) patients experienced an ACE in follow-up. Requiring a LE intervention post-SPK was associated with a subsequent ACE (HR 2.3, CI 1.2-4.5). LE vascular and cardiovascular complications continue to be significant sources of morbidity for SPK patients, especially for patients with preexisting PAD. The highest risk patients may benefit from more intensive pre- and post-SPK workup with ABIs and follow-up with a vascular surgeon.


Subject(s)
Diabetes Mellitus, Type 1 , Kidney Transplantation , Pancreas Transplantation , Graft Survival , Humans , Kidney , Kidney Transplantation/adverse effects , Lower Extremity , Pancreas , Pancreas Transplantation/adverse effects , Retrospective Studies
11.
Curr Opin Organ Transplant ; 26(2): 139-145, 2021 04 01.
Article in English | MEDLINE | ID: mdl-33595983

ABSTRACT

PURPOSE OF REVIEW: This review describes the history and current state of left lobe living donor liver transplantation (LDLT). The transplant community continues to face an organ shortage on a global scale, and the expansion of LDLT is attractive because it allows us to provide life-saving liver transplants to individuals without drawing from, or depending on, the limited deceased donor pool. Donor safety is paramount in LDLT, and for this reason, left lobe LDLT is particularly attractive because the donor is left with a larger remnant. RECENT FINDINGS: This article reviews the donor and recipient evaluations for left lobe LDLT, discusses small for size syndrome and the importance of portal inflow modification, and reviews recipient outcomes in right lobe versus left lobe LDLT. SUMMARY: Left lobe LDLT was the first adult-to-adult LDLT ever to be performed in Japan in 1993. Since that time, the use of both right and left lobe LDLT has expanded immensely. Recent work in left lobe LDLT has emphasized the need for inflow modification to reduce portal hyperperfusion and early graft dysfunction following transplant. Accumulating evidence suggests, however, that even though early graft dysfunction following LDLT may prolong hospitalization, it does not predict graft or patient survival.


Subject(s)
Liver Transplantation , Adult , Humans , Liver , Liver Transplantation/adverse effects , Living Donors , Treatment Outcome
12.
Alcohol Alcohol ; 53(2): 178-183, 2018 Mar 01.
Article in English | MEDLINE | ID: mdl-29370340

ABSTRACT

AIMS: Alcoholic liver disease (ALD) is now a well-recognized indication for liver transplantation. This paper reviews existing literature on living donor liver transplantation (LDLT) for ALD and presents data from a single high-volume United States liver transplant center. METHODS: For the literature review, a PubMed search was undertaken using the search terms 'living donor' and 'alcoholic liver disease'. Studies were included that presented outcome data for patients who underwent LDLT for ALD. For the single-center data collection, all patients who underwent LDLT from 2003 to 2016 at our center were reviewed, and the data for recipients with ALD were subsequently analyzed and compared with those of patients who underwent LDLT for other indications. RESULTS: Of 110 studies that resulted from the PubMed query, only 5 contained data relevant to this manuscript. These studies represented data collected from two Asian countries: one single center in Korea and a collection of centers in Japan. The relapse rate following LDLT for ALD ranged from 7.9% to 22%, and pre-transplant abstinence did not impact post-transplant relapse in any of these studies. For the single-center data, of 136 LDLT performed at our institution during the time period, 22 were performed for ALD. There was no difference in 1- or 5-year survival between patients transplanted for ALD and those transplanted for other etiologies (94.7% vs. 93.4%, P = 0.79 and 78.9% vs. 87.5%, P = 0.6). CONCLUSION: There is a very limited amount of data available on LDLT for ALD. Existing data suggest that LDLT for ALD results in excellent outcomes. SHORT SUMMARY: Published data on living donor liver transplantation (LDLT) for alcoholic liver disease (ALD) are limited. One- and five-year survival rates range from 82% to 100% and 78% to 87%, respectively. Rates of alcohol relapse following transplant appear low, ranging from 7% to 23%; 6-month abstinence periods prior to LDLT for ALD do not appear to have a significant impact on relapse.


Subject(s)
Liver Diseases, Alcoholic/surgery , Liver Transplantation/statistics & numerical data , Living Donors/statistics & numerical data , Alcohol Abstinence , Humans , Recurrence , Survival Analysis
13.
Pediatr Transplant ; 21(1), 2017 Feb.
Article in English | MEDLINE | ID: mdl-27862714

ABSTRACT

A subset of children who receive a liver and/or kidney transplant develop de novo inflammatory bowel disease-like chronic intestinal inflammation, not explained by infection or medications, following transplant. We have conducted a single-center, retrospective case series describing the unique clinical and histologic features of this IBD-like chronic intestinal inflammation following solid organ transplant. At our center, nine of 327 kidney or liver recipients developed de novo IBD following transplant (six liver, two kidney, one liver-kidney). Most children presented with prolonged hematochezia and diarrhea and were treated with aminosalicylates. At the time of diagnosis, five were not currently using mycophenolate mofetil for transplant immunosuppression. Histologic and endoscopic findings at IBD diagnosis included inflammation, ulcerations, granulomas, and chronic colitis. Since diagnosis, no patient has required surgical intervention or escalation to biologic therapy, nor developed stricturing or perianal disease. In this case series, de novo post-transplant IBD developed in 4% of pediatric liver and/or kidney recipients; however, it often does not fit the classic patterns of Crohn's disease or ulcerative colitis.


Subject(s)
Inflammatory Bowel Diseases/diagnosis , Inflammatory Bowel Diseases/etiology , Kidney Transplantation/adverse effects , Liver Failure/surgery , Liver Transplantation/adverse effects , Renal Insufficiency/surgery , Adolescent , Aminosalicylic Acid/therapeutic use , Child , Child, Preschool , Diarrhea/complications , Female , Gastrointestinal Hemorrhage/complications , Humans , Immunosuppressive Agents/therapeutic use , Infant , Inflammation , Liver Failure/complications , Male , Renal Insufficiency/complications , Retrospective Studies
14.
Curr Opin Organ Transplant ; 21(4): 399-404, 2016 08.
Article in English | MEDLINE | ID: mdl-27258578

ABSTRACT

PURPOSE OF REVIEW: With continued optimization of islet isolation and immunosuppression protocols, the medium-term rates of insulin independence following islet transplantation have improved significantly. This review evaluates the most up-to-date outcomes data for both solid organ pancreas and islet transplantation to develop an algorithm for selection of ß-cell replacement in type 1 diabetes patients. RECENT FINDINGS: Solid organ pancreas and islet transplantation have both displayed improved rates of 5-year insulin independence, largely attributable to improvements in immunosuppressive regimens. The medium-term rates of insulin independence following islet transplantation in highly selected type 1 nonuremic diabetic recipients is beginning to approach the success rates observed following solitary pancreas transplantation. SUMMARY: Although pancreas transplantation has historically been favored for ß-cell replacement, current outcomes following islet transplantation justify the use of this minimally invasive therapy in carefully selected patients. Pancreas transplant remains the procedure of choice for ß-cell replacement in uremic patients. Islet transplantation should be considered in nonuremic patients with low BMI and low insulin requirements, patients lacking the cardiovascular reserve to undergo open abdominal surgery, or patients who elect to forego the risks of a major operation in exchange for an increased risk of islet graft failure.


Subject(s)
Diabetes Mellitus, Type 1/surgery , Immunosuppression Therapy/methods , Islets of Langerhans Transplantation/methods , Pancreas Transplantation/methods , Humans , Treatment Outcome
15.
Knee Surg Sports Traumatol Arthrosc ; 23(4): 1065-70, 2015 Apr.
Article in English | MEDLINE | ID: mdl-24493257

ABSTRACT

PURPOSE: Previous investigations have revealed a greater incidence of anterior cruciate ligament (ACL) injuries in female lacrosse versus field hockey players. Lacrosse is played in an upright posture with overhead throwing and catching, while field hockey is almost exclusively played in a crouched, forward-flexed position. Biomechanical factors, including decreased knee, hip, and trunk flexion angles, have been identified as risk factors for ACL injury. The purpose of this study was to assess ACL biomechanical risk factors in female field hockey and lacrosse players to determine whether sport-specific posture might contribute to the increased incidence of ACL injury observed in lacrosse athletes. METHODS: Thirty-one Division I NCAA females from field hockey and lacrosse completed four tasks, three times per leg: bilateral drop jump, single-leg drop jump (SDJ), single-leg jump onto a Bosu ball (SDB), and a 45° anticipated cut. Kinematic and force plate data were used to evaluate knee flexion angle, knee adduction moment, hip flexion angle, and trunk flexion and sway angles. Muscle activity of the lateral hamstrings and vastus lateralis was used to estimate peak hamstring activity and the quadriceps/hamstring ratio at the time of peak quadriceps activity (co-contraction ratio). RESULTS: During the SDJ and SDB, peak knee flexion angles were greater in field hockey compared with lacrosse. During cutting, field hockey players were more flexed at the trunk and had greater trunk sway, compared with the lacrosse players. No significant difference was observed for the co-contraction ratio for any of the tasks. CONCLUSIONS: Decreased knee flexion angle during landing, consistent with sport-specific playing postures, may contribute to the higher incidence of ACL injury in lacrosse players relative to field hockey. Sport-specific training injury prevention programmes may benefit from considering these differences between specialized athletes. LEVEL OF EVIDENCE: II.


Subject(s)
Anterior Cruciate Ligament Injuries , Athletes , Hockey/injuries , Knee Injuries/physiopathology , Knee Joint/physiopathology , Racquet Sports/injuries , Biomechanical Phenomena , California/epidemiology , Female , Humans , Incidence , Knee Injuries/epidemiology , Knee Injuries/surgery , Risk Factors , Young Adult
16.
Knee Surg Sports Traumatol Arthrosc ; 21(6): 1468-74, 2013 Jun.
Article in English | MEDLINE | ID: mdl-22717739

ABSTRACT

PURPOSE: While the effect of local anaesthetics on chondrocyte viability is widely documented, the effect of these medications on synoviocytes is largely unknown. The purpose of this study was to understand the effect of 0.5 % bupivacaine and 0.5 % bupivacaine with epinephrine on synoviocyte viability, cytokine and growth factor release, and breakdown product formation. METHODS: Rabbit fibroblast-like synoviocyte (Type B) cultures were perfused with 0.5 % bupivacaine or 0.5 % bupivacaine with epinephrine (1:200,000) for 24 h. Cell viability was evaluated using a two-colour fluorescence assay. The supernatant was analysed using multiplex inflammatory and matrix metalloproteinase assays. RESULTS: Synoviocytes treated for 24 h with 0.5 % bupivacaine with epinephrine demonstrated a significant decrease in viability (31.3 ± 19.4 % cell death) when compared with synoviocytes cultured in control media (3.8 ± 1.3 % cell death, p = 0.000) and those cultured in 0.5 % bupivacaine alone (12.6 ± 11.1 % cell death, p = 0.003). No significant decrease in cell viability was observed in synoviocytes treated with 0.5 % bupivacaine compared to those in control media (12.6 ± 11.1 % vs 3.8 ± 1.3 % cell death, p = 0.194). Significantly greater amounts of MMP-1 (47.0 ± 9.2 pg/ml) and MMP-3 (250.0 ± 68.8 pg/ml) were observed in 0.5 % bupivacaine cultures compared with controls (14.3 ± 14.3, p = 0.023 and 72.0 ± 84.9, p = 0.045, respectively). CONCLUSIONS: 0.5 % bupivacaine with epinephrine caused a significant increase in cell death of the synoviocytes, while 0.5 % bupivacaine alone produced cell injury and a significant release of matrix metalloproteinases, which may also lead to indirect injury of the surrounding chondrocytes. These results may help explain the onset of chondrolysis observed in patients who have been treated with intra-articular local anaesthetics.


Subject(s)
Anesthetics, Local/pharmacology , Bupivacaine/pharmacology , Synovial Membrane/cytology , Synovial Membrane/drug effects , Animals , Cell Culture Techniques , Cell Survival , Cytokines/biosynthesis , Drug Combinations , Epinephrine/pharmacology , Injections, Intra-Articular , Intercellular Signaling Peptides and Proteins/biosynthesis , Rabbits , Synovial Membrane/metabolism , Vasoconstrictor Agents/pharmacology
17.
Transplant Direct ; 9(6): e1488, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37250489

ABSTRACT

Although steroid avoidance (SA) has been studied in deceased donor liver transplant, little is known about SA in living donor liver transplant (LDLT). We report the characteristics and outcomes, including the incidence of early acute rejection (AR) and complications of steroid use, in 2 cohorts of LDLT recipients. Methods: Routine steroid maintenance (SM) after LDLT was stopped in December 2017. Our single-center retrospective cohort study spans 2 eras. Two hundred forty-two adult recipients underwent LDLT with SM (January 2000-December 2017), and 83 adult recipients (December 2017-August 2021) underwent LDLT with SA. Early AR was defined as a biopsy showing pathologic characteristics within 6 mo after LDLT. Univariate and multivariate logistic regressions were performed to evaluate the effects of relevant recipient and donor characteristics on the incidence of early AR in our cohort. Results: Neither the difference in early AR rate between cohorts (SA 19/83 [22.9%] versus SM 41/242 [17%]; P = 0.46) nor a subset analysis of patients with autoimmune disease (SA 5/17 [29.4%] versus SM 19/58 [22.4%]; P = 0.71) reached statistical significance. Univariate and multivariate logistic regressions for early AR identified recipient age to be a statistically significant risk factor (P < 0.001). Of the patients without diabetes before LDLT, 3 of 56 (5.4%) on SA versus 26 of 200 (13%) on SM needed medications prescribed for glucose control at the time of discharge (P = 0.11). Patient survival was similar between SA and SM cohorts (SA 94% versus SM 91%, P = 0.34) 3 y after transplant. Conclusions: LDLT recipients treated with SA do not exhibit significantly higher rates of rejection or increased mortality than patients treated with SM. Notably, this result is similar for recipients with autoimmune disease.

18.
Knee Surg Sports Traumatol Arthrosc ; 20(9): 1809-14, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22186921

ABSTRACT

PURPOSE: Corticosteroids are commonly injected into the joint space. However, studies have not examined the chondrotoxicity of one-time injection doses. The purpose of this study was to evaluate the effect of dexamethasone sodium phosphate (Decadron), methylprednisolone acetate (Depo-Medrol), betamethasone sodium phosphate and betamethasone acetate (Celestone Soluspan), and triamcinolone acetonide (Kenalog) on human chondrocyte viability in vitro. METHODS: Single-injection doses of each of the corticosteroids were separately delivered to human chondrocytes for their respective average duration of action and compared to controls using a bioreactor containing a continuous infusion pump constructed to mimic joint fluid metabolism. A 14-day time-controlled trial was also performed. A live/dead reduced biohazard viability/cytotoxicity assay was used to quantify chondrocyte viability. RESULTS: Over their average duration of action, betamethasone sodium phosphate/acetate solution and triamcinolone acetonide caused significant decreases in chondrocyte viability compared to control media (19.8 ± 2.9% vs. 5.2 ± 2.1%, P = 0.0025 and 10.2 ± 1.3% vs. 4.8 ± 0.9%, P = 0.0049, respectively). In the 14-day trial, only betamethasone sodium phosphate/acetate solution caused a significant decrease in chondrocyte viability compared to control media (21.5% vs. 4.6%, P < 0.001). CONCLUSIONS: A single-injection dose of betamethasone sodium phosphate and betamethasone acetate solution demonstrated consistent and significant chondrotoxicity in a physiologically relevant in vitro model and should be used with caution. The chondrotoxicity observed for triamcinolone acetonide in a single trial provides preliminary evidence that this medication is also chondrotoxic. However, at 14 days, betamethasone sodium phosphate and betamethasone acetate was the only condition that caused significant cell death.


Subject(s)
Cell Survival/drug effects , Chondrocytes/drug effects , Glucocorticoids/toxicity , Betamethasone/pharmacology , Betamethasone/toxicity , Cells, Cultured , Dexamethasone/pharmacology , Dexamethasone/toxicity , Glucocorticoids/pharmacology , Humans , Injections, Intra-Articular , Methylprednisolone/pharmacology , Methylprednisolone/toxicity , Triamcinolone/pharmacology , Triamcinolone/toxicity
19.
Knee Surg Sports Traumatol Arthrosc ; 20(9): 1689-95, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22037813

ABSTRACT

PURPOSE: Local anesthetic and corticosteroid combination injections are often used in clinical practice; however, research investigating the chondrotoxic properties of these combinations is minimal. The goal of this study was to evaluate the effect of single injection doses of 1% lidocaine or 0.25% bupivacaine in combination with single injection doses of dexamethasone sodium phosphate (Decadron), methylprednisolone acetate (Depo-Medrol), betamethasone sodium phosphate and betamethasone acetate (Celestone Soluspan), or triamcinolone acetonide (Kenalog) on human chondrocyte viability. METHODS: All treatment conditions were delivered to human chondrocytes in vitro for the medication's respective average duration of action using a bioreactor containing a continuous infusion pump constructed to mimic joint fluid metabolism. A two-color fluorescence assay was used to evaluate cell viability. A mixed-effects regression model was used to evaluate the mean differences in cell viability between treatment groups. RESULTS: At 14 days, a single injection dose of 1% lidocaine or 0.25% bupivacaine in combination with betamethasone sodium phosphate and betamethasone acetate solution demonstrated significant chondrotoxicity when compared with the local anesthetics alone (P < 0.01). Methylprednisolone acetate and triamcinolone acetonide both showed significant evidence of chondrotoxicity (P = 0.013 and P = 0.016, respectively) when used in combination with 1% lidocaine compared with lidocaine alone, but showed no significant chondrotoxicity in combination with 0.25% bupivacaine (n.s.). CONCLUSIONS: Clinicians should use caution when injecting 1% lidocaine or 0.25% bupivacaine in conjunction with betamethasone sodium phosphate and betamethasone acetate solution due to its pronounced chondrotoxic effect in this study. 1% lidocaine used in combination with methylprednisolone acetate or triamcinolone acetonide also led to significant chondrotoxicity.


Subject(s)
Anesthetics, Local/pharmacology , Cell Survival/drug effects , Chondrocytes/drug effects , Glucocorticoids/pharmacology , Betamethasone/pharmacology , Bupivacaine/pharmacology , Cell Line , Dexamethasone/pharmacology , Drug Combinations , Humans , Injections, Intra-Articular , Lidocaine/pharmacology , Methylprednisolone/pharmacology , Triamcinolone/pharmacology
20.
J Matern Fetal Neonatal Med ; 35(2): 308-315, 2022 Jan.
Article in English | MEDLINE | ID: mdl-31984817

ABSTRACT

BACKGROUND/PURPOSE: The differential diagnosis for prenatal suprarenal masses (SRMs) is broad and includes neuroblastoma, adrenal hemorrhage, and subdiaphragmatic extralobar pulmonary sequestration (SEPS). We sought to elucidate the appropriate postnatal management for fetuses found to have an SRM. METHODS: We conducted a retrospective review of patients prenatally diagnosed with SRM at our institution between 1998 and 2018. Prenatal characteristics, imaging, and neonatal outcomes were collected. We also performed a PubMed literature search and pooled analysis of all patients with a prenatally diagnosed SRM previously described in the literature. RESULTS: The literature review yielded 32 studies, of which 19 were single case reports. In our case series, 12 patients were included. Seven patients were delivered vaginally, and one pregnancy was terminated. Postnatal diagnoses included: SEPS (n = 5), adrenal hemorrhage (n = 3), polycystic kidney (n = 2), splenic cyst (n = 1), and unknown for one patient. All but two of the final diagnoses had been on the initial diagnostic differential. With the exception of the terminated fetus, all remain alive today. On pooled analysis, patients who underwent operative management were diagnosed later (32 versus 24 weeks) and had a significant predominance of left-sided lesions (59.5% versus 39.2%). The published literature demonstrates a trend toward observation versus resection over the past 30 years. CONCLUSIONS: Patients prenatally diagnosed with an SRM have an excellent prognosis. Our series demonstrates a high incidence of SEPS, which were all resected, and of adrenal hemorrhages, which were observed with repeat imaging. These patients can be followed with serial postnatal ultrasounds to determine the diagnosis prior to deciding the appropriate treatment.


Subject(s)
Adrenal Gland Diseases , Bronchopulmonary Sequestration , Fetal Therapies , Bronchopulmonary Sequestration/diagnostic imaging , Bronchopulmonary Sequestration/surgery , Female , Humans , Infant, Newborn , Pregnancy , Retrospective Studies , Ultrasonography, Prenatal