Results 1 - 12 of 12
1.
Exp Clin Transplant ; 15(1): 27-33, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27448148

ABSTRACT

OBJECTIVES: Incisional hernias can occur after any abdominal operation, including after renal transplant. Several risk factors have been identified in nonimmunosuppressed surgical patients. We aimed to identify whether specific risk factors correlated with the development of incisional hernias after renal transplant. The existence of associations between these risk factors and postoperative complications was also reviewed. MATERIALS AND METHODS: We reviewed 969 kidney transplants performed between February 2000 and January 2011. Thirty-nine kidney transplant recipients who were treated with rapamycin were excluded. The following potential risk factors were evaluated: recipient age, sex, body mass index at transplant, delayed graft function, diabetes, albumin, postoperative platelet count, drain placement, donor body mass index, donor type, warm ischemic time, and cold ischemic time. We performed univariate and multivariate logistic regression tests. RESULTS: In our patient group, a total of 52 (5.4%) transplants were complicated by incisional hernia. On univariate analysis, we found that delayed graft function (P = .001) and infection (P < .001) were statistically significant predictors for development of incisional hernia. Multivariate analyses revealed that delayed graft function and length of stay remained statistically significant predictors. CONCLUSIONS: Delayed graft function and length of stay are significant predictors of incisional hernia after kidney transplant.
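The univariate screen described above (risk factor vs. incisional hernia) can be illustrated with a hand-computed 2×2 analysis. This is a minimal sketch, not the authors' code, and the counts below are hypothetical for illustration; the abstract reports only the overall rate and p-values:

```python
def odds_ratio_2x2(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return (a * d) / (b * c)

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts (NOT the study's data): 20 hernias among 150
# recipients with delayed graft function, 32 among 780 without.
or_ = odds_ratio_2x2(20, 130, 32, 748)
chi2 = chi_square_2x2(20, 130, 32, 748)
print(round(or_, 2), round(chi2, 2))   # → 3.6 20.31
```

With these made-up counts the exposed group has roughly 3.6 times the odds of hernia; the multivariate step in the study would then adjust such estimates for the other covariates via logistic regression.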


Subject(s)
Delayed Graft Function/etiology , Incisional Hernia/etiology , Kidney Transplantation/adverse effects , Adult , Aged , Chi-Square Distribution , Databases, Factual , Delayed Graft Function/diagnosis , Female , Humans , Incisional Hernia/diagnosis , Kaplan-Meier Estimate , Length of Stay , Logistic Models , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Proportional Hazards Models , Retrospective Studies , Risk Factors , Surgical Wound Infection/etiology , Time Factors , Treatment Outcome
2.
Vascular ; 24(3): 233-40, 2016 Jun.
Article in English | MEDLINE | ID: mdl-26123057

ABSTRACT

OBJECTIVE: Venous thromboembolism (VTE) is a potentially preventable complication following surgery, and practice varies with regard to the most effective mode of prophylaxis. We sought to determine whether an aggressive approach to VTE prophylaxis would reduce VTE rates on the inpatient vascular surgical service. METHODS: Vascular inpatients from a single institution from July 2010 to March 2013 were included in the analysis. A protocol for VTE prophylaxis was implemented on the inpatient vascular surgical service in November 2011. This included subcutaneous (SQ) heparin initiation within 24 h of admission unless deemed inappropriate by the attending, as well as intermittent compression devices (ICD) and compression stockings (CS). The rate of VTE was compared before and after the intervention. Patients were compared using AHRQ comorbidity categories, APR-DRG severity of illness, insurance status, and principal procedure. T-tests were used to compare continuous variables, and chi-square analysis was used to compare categorical variables. RESULTS: There were 1483 vascular patients in the pre-intervention group and 1652 patients in the post-intervention group. The rate of pharmacologic prophylaxis was 52.57% pre-intervention compared with 69.33% post-intervention (p < 0.001). The rate of pharmacologic or mechanical prophylaxis was 91.76% pre-intervention compared with 93.10% post-intervention (p = 0.54). The overall rate of VTE was 1.49% before the intervention versus 0.38% after (p = 0.033). The DVT rate was 1.09% before the intervention versus 0.189% after (p = 0.0214). The rate of pulmonary embolism trended toward a significant reduction with the intervention (0.681% vs 0.189%, p = 0.095). There were no statistically significant differences between the patient groups based on gender, comorbidity category, severity of illness, or insurance type.
CONCLUSIONS: The overall rate of VTE was reduced by 75% after the initiation of a standard protocol for pharmacologic VTE prophylaxis. These findings justify an aggressive approach to VTE prophylaxis in vascular surgery patients.
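A pre/post rate comparison like the one above can be sketched with a pooled two-proportion z-test (the study itself used chi-square analysis; for a 2×2 comparison the two are closely related). The event counts below are back-calculated approximations from the reported rates (1.49% of 1483 ≈ 22; 0.38% of 1652 ≈ 6), not the study's raw data:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided pooled two-proportion z-test; returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))    # pooled standard error
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF, computed via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Approximate VTE counts: 22/1483 pre-intervention vs 6/1652 post.
z, p = two_proportion_z(22, 1483, 6, 1652)
print(round(z, 2), p < 0.05)
```

With these approximate counts the difference is clearly significant; the abstract's p = 0.033 presumably reflects the exact counts and test the authors used.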


Subject(s)
Anticoagulants/administration & dosage , Heparin/administration & dosage , Intermittent Pneumatic Compression Devices , Pulmonary Embolism/prevention & control , Stockings, Compression , Vascular Surgical Procedures/adverse effects , Venous Thromboembolism/prevention & control , Venous Thrombosis/prevention & control , Aged , Aged, 80 and over , Anticoagulants/adverse effects , Chi-Square Distribution , Female , Heparin/adverse effects , Humans , Injections, Subcutaneous , Male , Middle Aged , Philadelphia , Pulmonary Embolism/diagnostic imaging , Pulmonary Embolism/etiology , Risk Factors , Time Factors , Treatment Outcome , Venous Thromboembolism/diagnostic imaging , Venous Thromboembolism/etiology , Venous Thrombosis/diagnostic imaging , Venous Thrombosis/etiology
3.
HPB (Oxford) ; 17(12): 1074-84, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26373873

ABSTRACT

BACKGROUND: The Model for End-stage Liver Disease (MELD) has been used as a prognostic tool since 2002 to predict pre-transplant mortality. Increasing proportions of transplant candidates with higher MELD scores, combined with improvements in transplant outcomes, mandate study of surgical outcomes in patients with MELD scores of ≥40. METHODS: A retrospective longitudinal analysis of United Network for Organ Sharing (UNOS) data on all liver transplantations performed between February 2002 and June 2011 (n = 33,398), stratified by MELD score (<30, 30-39, ≥40), was conducted. The primary outcomes of interest were short- and long-term graft and patient survival. The Kaplan-Meier product-limit method and Cox regression were used. A subanalysis of a futile population was performed to determine predictors of futility. RESULTS: Of the 33,398 transplant recipients analysed, 74% scored <30, 18% scored 30-39, and 8% scored ≥40 at transplantation. Recipients with MELD scores of ≥40 were more likely to be younger (P < 0.001) and non-White, and to have shorter waitlist times (P < 0.001). Overall patient survival correlated inversely with increasing MELD score; this trend was consistent for both short-term (30 and 90 days) and long-term (1, 3 and 5 years) graft and patient survival. In multivariate analysis, increasing age, African-American ethnicity, donor obesity and diabetes were negative predictors of survival. Predictors of futility included patient age of >60 years, obesity, peri-transplantation intensive care unit hospitalization with ventilation, and multiple comorbidities. CONCLUSIONS: Liver transplantation in recipients with MELD scores of ≥40 offers acceptable long-term survival outcomes. The identified predictors of futility indicate the need for prospective follow-up studies to define the population that would gain the greatest benefit from this precious resource.
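The Kaplan-Meier product-limit method named in the abstract can be written out in a few lines: at each observed event time, survival is multiplied by (1 − deaths/at-risk), with censored subjects contributing to the at-risk count until their censoring time. This is a generic sketch on a toy cohort, not the study's analysis:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.
    times:  follow-up time for each subject
    events: 1 if the event occurred at that time, 0 if censored
    Returns a list of (time, survival_probability) at each event time."""
    s = 1.0
    curve = []
    for t in sorted({t for t, e in zip(times, events) if e}):
        at_risk = sum(1 for ti in times if ti >= t)            # still under observation
        deaths = sum(1 for ti, e in zip(times, events) if ti == t and e)
        s *= 1 - deaths / at_risk                              # product-limit step
        curve.append((t, s))
    return curve

# Toy cohort: events at times 1, 2, 4; censored at times 3 and 5.
for t, s in kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0]):
    print(t, round(s, 2))   # → 1 0.8 / 2 0.6 / 4 0.3
```

Note how the censored subject at time 3 still counts toward the at-risk set at time 2 but not at time 4, which is what distinguishes this estimator from a naive event rate.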


Subject(s)
Decision Support Techniques , Liver Diseases/surgery , Liver Transplantation , Survivors , Transplant Recipients , Adolescent , Adult , Aged , Allografts , Chi-Square Distribution , Databases, Factual , Female , Humans , Kaplan-Meier Estimate , Liver Diseases/diagnosis , Liver Diseases/mortality , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Male , Middle Aged , Multivariate Analysis , Patient Selection , Predictive Value of Tests , Proportional Hazards Models , Retrospective Studies , Risk Factors , Survivors/statistics & numerical data , Time Factors , Tissue and Organ Procurement , Transplant Recipients/statistics & numerical data , Treatment Outcome , United States , Young Adult
4.
J Surg Educ ; 71(5): 674-9, 2014.
Article in English | MEDLINE | ID: mdl-24813340

ABSTRACT

INTRODUCTION: Oral and poster presentations at major meetings serve to rapidly present and share study results with the scientific community, whereas full-text publication of abstracts in peer-reviewed journals provides lasting dissemination of knowledge. The purpose of this study was to evaluate the publication rate of abstracts presented at the 2009 American Transplant Congress (ATC), to assess the factors influencing publication, and to determine the impact factor of the publishing journals. METHODS: All abstracts presented at the 2009 ATC were included in the study. A PubMed-MEDLINE search was performed to identify a matching journal article. Topics, country of origin, study type, study center and publication year were tabulated. Journals and impact factors of publication were noted. RESULTS: Of the 1938 oral and poster abstracts presented, 103 oral abstracts (16.6%) and 141 poster abstracts (10.9%) were published as full-text articles. Publication rates by meeting topic and country of origin demonstrated statistically significant differences (p < 0.05). Among oral abstracts, single-center studies had higher publication rates than multi-center studies (70.87%; 73/103). Among poster abstracts, multi-center studies had higher publication rates (68.09% vs. 31.91%), and the journals they were published in had higher impact factors than those of single-center studies (4.578 vs. 3.897). The median impact factor of the journals in which abstracts went on to be published as full-text manuscripts was 4.2 (4.8 for oral presentations and 3.627 for poster presentations). When comparing multi-center and single-institution studies, the differences between 12-month and 24-month publication rates were not statistically significant (p = 0.5443 and 0.1134).
However, oral and poster abstracts compared by study center (multi/single) did demonstrate a statistically significant difference (p < 0.0001); comparing by type of study, there was also a statistically significant difference between oral and poster abstracts (p < 0.0001). CONCLUSION: The publication rate for abstracts of the 2009 ATC was lower than rates reported from other fields of medicine. The factors leading to non-publication require elucidation; encouraging authors to submit their presentations for full-text publication might improve the rate of publication. Authors should be wary of accepting oral and poster abstracts as dogma and should refrain from citing them in publications, especially those from outside the United States concerning liver and kidney transplantation.


Subject(s)
Abstracting and Indexing , Congresses as Topic , Organ Transplantation , Publications , Publishing/statistics & numerical data , Journal Impact Factor , United States
5.
Int J Surg ; 12(6): 551-6, 2014.
Article in English | MEDLINE | ID: mdl-24735894

ABSTRACT

BACKGROUND: Warm ischemic time (WIT) in kidney transplantation has significant effects on graft survival, function, and postoperative morbidity. We utilized the Ice Bag Technique (IBT) to determine whether eliminating WIT would decrease the incidence and length of delayed graft function (DGF) in our cohort. METHODS: We conducted a prospective study of 150 kidney transplants, comparing the elimination of WIT with the IBT to traditional methods. Data were analyzed using non-parametric statistical tests. RESULTS: 66 of the 134 patients underwent transplantation using the IBT. 28 right kidneys, 34 left kidneys, and 4 dual kidneys were implanted successfully. Patients with a body mass index (BMI) as high as 41 were transplanted. Kidneys with up to three arteries and two veins, and kidneys up to 15.5 by 9 cm in size, were safely transplanted into either iliac fossa. Despite the complete elimination of WIT, there was no difference in DGF, length of DGF, length of stay, graft rejection, graft survival, patient survival, or wound or urologic complications between groups (p > 0.05). CONCLUSIONS: The elimination of warm ischemic time using the IBT does not appear to reduce the incidence or length of DGF in this cohort. The technique may be useful for cases with prolonged anastomosis time (AT), but further studies with larger cohorts are required to determine whether it decreases DGF.
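The abstract says only "non-parametric statistical tests" without naming them; the Mann-Whitney U test is a common choice for comparing two unselected groups on a skewed outcome such as length of DGF. As an illustration of the idea (ranks instead of raw values, so no normality assumption), here is a minimal hand-rolled version of the U statistic:

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic with average ranks for ties.
    Returns the smaller of U1 and U2 (the usual test statistic)."""
    combined = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    n = len(combined)
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j < n and combined[j][0] == combined[i][0]:
            j += 1                      # j is one past the run of tied values
        avg = (i + j + 1) / 2.0         # mean of the 1-based ranks i+1..j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    n1, n2 = len(a), len(b)
    r1 = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    u1 = r1 - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)

# Complete separation: every value in one group exceeds the other → U = 0.
print(mann_whitney_u([12, 15, 18], [21, 24, 27]))   # → 0.0
```

A U near 0 indicates the groups barely overlap; a U near n1·n2/2 (as the "no difference in DGF" finding above would imply) indicates heavy overlap.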


Subject(s)
Cold Temperature , Delayed Graft Function/prevention & control , Kidney Transplantation/methods , Warm Ischemia/adverse effects , Adult , Delayed Graft Function/etiology , Female , Graft Rejection , Graft Survival , Humans , Ice , Intraoperative Care/methods , Kidney/blood supply , Kidney Transplantation/adverse effects , Male , Middle Aged , Prospective Studies
7.
ASAIO J ; 60(2): 189-92, 2014.
Article in English | MEDLINE | ID: mdl-24399062

ABSTRACT

Preexisting organ dysfunction is a known risk factor for death after placement of implantable mechanical circulatory support (MCS). Extracorporeal membrane oxygenation (ECMO) may be able to stabilize organ function in patients with cardiogenic shock before MCS implantation. Between 2008 and 2012, 17 patients with cardiogenic shock were supported with ECMO before implantable MCS placement. Patients' end-organ functions were assessed by metabolic, cardiac, hepatic, renal, and respiratory parameters. Survival data after MCS implantation were analyzed for overall survival to discharge, complications, and a breakpoint in days on ECMO with respect to survival. Before MCS implantation, lactate levels and hepatic and renal function had improved, and pulmonary edema had resolved. The interval between ECMO initiation and MCS placement was 12.1 ± 7.9 days. The overall survival rate to discharge after left ventricular assist device/total artificial heart placement was 76%. Survival of patients transitioned from ECMO to MCS within 14 days was 92%, significantly better than that of patients supported on ECMO for longer than 14 days before MCS (25%, p < 0.05). ECMO support can rapidly stabilize organ dysfunction in patients with cardiogenic shock. After improvement of organ function, MCS implantation should proceed without delay, since patients supported for longer than 14 days with ECMO had inferior survival compared with national data.


Subject(s)
Extracorporeal Membrane Oxygenation , Heart-Assist Devices , Shock, Cardiogenic/mortality , Shock, Cardiogenic/surgery , Extracorporeal Membrane Oxygenation/mortality , Female , Humans , Male , Middle Aged
8.
J Bone Joint Surg Am ; 95(24): 2177-84, 2013 Dec 18.
Article in English | MEDLINE | ID: mdl-24352771

ABSTRACT

BACKGROUND: Periprosthetic joint infection remains a potential complication of an otherwise successful joint replacement, and its treatment often requires multiple surgical procedures associated with increased complications and morbidity. This study examined the effect of periprosthetic joint infection on mortality and sought to identify predictors of mortality in patients with periprosthetic joint infection. METHODS: Four hundred and thirty-six patients with at least one surgical intervention secondary to confirmed periprosthetic joint infection were compared with 2342 patients undergoing revision arthroplasty for aseptic failure. The incidence of mortality at thirty days, ninety days, one year, two years, and five years after surgery was assessed. Multivariate analysis was used to assess periprosthetic joint infection as an independent predictor of mortality. In the periprosthetic joint infection population, variables investigated as potential risk factors for mortality were evaluated. RESULTS: Mortality was significantly greater (p < 0.001) in patients with periprosthetic joint infection than in those undergoing aseptic revision arthroplasty at ninety days (3.7% versus 0.8%), one year (10.6% versus 2.0%), two years (13.6% versus 3.9%), and five years (25.9% versus 12.9%). After controlling for age, sex, ethnicity, number of procedures, involved joint, body mass index, and Charlson Comorbidity Index, revision arthroplasty for periprosthetic joint infection was associated with a fivefold increase in mortality compared with revision arthroplasty for aseptic failure. In the periprosthetic joint infection population, independent predictors of mortality included increasing age, higher Charlson Comorbidity Index, history of stroke, polymicrobial infection, and cardiac disease.
CONCLUSIONS: Although it is well known that periprosthetic joint infection is a devastating complication that severely limits joint function and is consistently difficult to eradicate, surgeons must also be cognizant of its systemic impact and its major influence on mortality.


Subject(s)
Arthroplasty, Replacement, Hip/mortality , Arthroplasty, Replacement, Knee/mortality , Knee Joint/surgery , Prosthesis-Related Infections/mortality , Aged , Arthroplasty, Replacement, Hip/adverse effects , Arthroplasty, Replacement, Knee/adverse effects , Female , Humans , Incidence , Knee Prosthesis/adverse effects , Longitudinal Studies , Male , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/surgery , Reoperation , Retrospective Studies , Risk , Treatment Outcome
9.
Am J Orthop (Belle Mead NJ) ; 42(10): E88-90, 2013 Oct.
Article in English | MEDLINE | ID: mdl-24278910

ABSTRACT

Diabetes mellitus is a well-established risk factor for postoperative complications of total joint arthroplasty (TJA). We conducted a study to identify a specific hemoglobin A1c (HbA1c) level above which immediate postoperative complication rates increase after TJA. HbA1c levels were measured within 90 days preoperatively, and complications were documented during the acute postoperative period. Charts were reviewed, and each patient was given a score based on how many of these postoperative complications occurred. Overall, 1118 patients treated between 2009 and 2011 were retrospectively analyzed. Patients were grouped into 5 HbA1c level ranges, and a mean postoperative complication point score was obtained for each group. We found that mean postoperative complication rates increased along with HbA1c levels; HbA1c levels higher than 7.5% correlated strongly with a higher rate of postoperative complications. These findings provide a foundation for prospective studies and further evidence of the effects of elevated HbA1c levels. If an adequate treatment plan for these patients emerges, these findings may also help lower readmission rates.
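The grouping step described above (bin patients by HbA1c range, then average the complication score per bin) can be sketched as follows. The abstract does not give the five range boundaries, so the edges and patient values below are hypothetical:

```python
def mean_scores_by_range(records, edges):
    """Group (hba1c, complication_score) pairs into half-open ranges
    [edges[i], edges[i+1]) and return the mean score per range."""
    labels = [f"{lo}-{hi}" for lo, hi in zip(edges, edges[1:])]
    sums = {lab: [0.0, 0] for lab in labels}        # label -> [total, count]
    for hba1c, score in records:
        for (lo, hi), lab in zip(zip(edges, edges[1:]), labels):
            if lo <= hba1c < hi:
                sums[lab][0] += score
                sums[lab][1] += 1
                break
    return {lab: (s[0] / s[1] if s[1] else None) for lab, s in sums.items()}

# Hypothetical patients as (HbA1c %, complication count) — illustrative only.
patients = [(5.6, 0), (6.2, 0), (6.9, 1), (7.4, 1), (7.8, 2), (8.5, 3)]
edges = [5.0, 6.5, 7.5, 10.0]
print(mean_scores_by_range(patients, edges))
```

With these made-up values the mean score rises across the bins, mirroring the trend the study reports around the 7.5% threshold.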


Subject(s)
Arthroplasty, Replacement/adverse effects , Diabetes Mellitus/blood , Glycated Hemoglobin/analysis , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Postoperative Complications/etiology , Reoperation , Retrospective Studies , Risk Factors , Treatment Outcome
10.
Foot Ankle Int ; 34(9): 1227-32, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23613329

ABSTRACT

BACKGROUND: Arthrodesis is currently the most commonly performed surgical procedure for the treatment of arthritis of the first metatarsophalangeal (MTP) joint. Hemiarthroplasty of the first MTP joint has been shown to have inferior clinical results and higher revision rates. The objective of this study was to assess the clinical outcome of salvage of failed hallux phalangeal hemiarthroplasty with conversion to arthrodesis. METHODS: A retrospective review of patients who underwent salvage of first MTP joint hemiarthroplasty with conversion to arthrodesis was performed. Preoperative assessment included the visual analog pain (VAP) scale and the AOFAS Hallux Metatarsophalangeal-Interphalangeal scoring system (AOFAS-HMI). Postoperative outcomes were graded via the AOFAS-HMI, VAP, and Foot and Ankle Ability Measure (FAAM). RESULTS: Twenty-one hemiarthroplasties were converted to arthrodesis in 21 patients, with the 18 available for follow-up (13 women and 5 men) included in the study. Local autologous bone graft was used in 12 cases, while 6 patients required tricortical iliac crest bone graft for the treatment of extensive bone loss. At final follow-up, at a mean of 4.3 years, the average VAP score had diminished from 7.8 to 0.75 out of 10, while the mean AOFAS-HMI improved from 36.2 out of 100 preoperatively to 85.3 out of 90 (modified to exclude first MTP motion). The mean FAAM ADL and sports scores were 97.3 and 91.3, respectively. All patients achieved fusion, although at a longer interval than primary fusion. CONCLUSIONS: Conversion of a failed hallux phalangeal hemiarthroplasty to arthrodesis showed success similar to that of primary arthrodesis, achieved in the majority of cases with the use of regional bone graft for small defects. However, the time to fusion was longer than that of primary arthrodesis, and structural bone graft was sometimes required for augmentation. LEVEL OF EVIDENCE: Level IV, retrospective case series.


Subject(s)
Arthrodesis , Metatarsophalangeal Joint/surgery , Bone Transplantation , Female , Hemiarthroplasty , Humans , Ilium/transplantation , Male , Pain Measurement , Retrospective Studies , Treatment Failure
11.
Clin Orthop Relat Res ; 471(10): 3230-6, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23539123

ABSTRACT

BACKGROUND: Persistent wound drainage after hip arthroplasty is a risk factor for periprosthetic infection. Negative pressure wound therapy (NPWT) has been used in other fields for wound management although it is unclear whether the technique is appropriate for total hip arthroplasty. QUESTIONS/PURPOSES: We determined (1) the rate of wound complications related to use of NPWT for persistent incisional drainage after hip arthroplasty; (2) the rate of resolution of incisional drainage using this modality; and (3) risk factors for failure of NPWT for this indication. METHODS: In a pilot study we identified 109 patients in whom NPWT was used after hip arthroplasty for treating postoperative incisional drainage between April 2006 and April 2010. On average, the NPWT was placed on postoperative Day 3 to 4 (range, 2-9 days) and applied for 2 days (range, 1-10 days). We then determined predictors of subsequent surgery. Patients were followed until failure or a minimum of 1 year (average, 29 months; range, 1-62 months). RESULTS: Eighty-three patients (76%) had no further surgery and 26 patients (24%) had subsequent surgery: 11 had superficial irrigation and débridement (I&D), 12 had deep I&D with none requiring further surgery, and three ultimately had component removal. Predictors of subsequent surgery included international normalized ratio level greater than 2, greater than one prior hip surgery, and device application greater than 48 hours. There were no wound-related complications associated with NPWT. CONCLUSIONS: The majority of our patients had cessation of wound drainage with NPWT. LEVEL OF EVIDENCE: Level IV, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.


Subject(s)
Arthroplasty, Replacement, Hip/methods , Hip Prosthesis/adverse effects , Negative-Pressure Wound Therapy/methods , Prosthesis-Related Infections/prevention & control , Surgical Wound Infection/prevention & control , Adult , Aged , Aged, 80 and over , Arthroplasty, Replacement, Hip/adverse effects , Databases, Factual , Debridement , Drainage , Female , Humans , Incidence , Male , Middle Aged , Pilot Projects , Prosthesis-Related Infections/epidemiology , Risk Factors , Surgical Wound Infection/epidemiology , Treatment Outcome , Wound Healing
12.
J Vasc Surg Venous Lymphat Disord ; 1(3): 316-9, 2013 Jul.
Article in English | MEDLINE | ID: mdl-26992596

ABSTRACT

Lymphatic leakage is an uncommon but serious complication following vascular procedures. When conservative measures fail, accurate identification and ligation of the disrupted lymphatic channels is necessary to avoid recurrence. We report the case of a 52-year-old man with a left forearm lymphocele that occurred following repair of an interosseous artery pseudoaneurysm. Lymphatic identification and ligation were performed successfully using intradermal injection of isosulfan blue dye at the time of operation.
