1.
Transplant Proc ; 35(8): 2998-3002, 2003 Dec.
Article in English | MEDLINE | ID: mdl-14697960

ABSTRACT

The organ allocation system for liver transplantation was recently changed to address criticisms that it was too subjective and relied too heavily on total waiting time. The new system, the Model for End-Stage Liver Disease and Pediatric Model for End-Stage Liver Disease (MELD/PELD), stratifies patients by their risk of 3-month pretransplant mortality and allocates livers accordingly. There is concern that such a scheme gives priority to the sickest patients, who may not enjoy good posttransplant outcomes. The aim of the present study was to compare the outcomes of liver transplant recipients who had been admitted to the intensive care unit (ICU) with those who had not, taking ICU admission as another indicator of severity of illness. Patients at least 18 years of age who underwent liver transplantation at the Cleveland Clinic between January 1, 1993 and October 31, 1998 and were coded as status 2, 2A, or 2B were included (n = 112). These patients fell into three groups: those admitted to an ICU before transplantation (group A, n = 16), those admitted to the hospital but not to an ICU (group B, n = 63), and those living at home who underwent elective transplantation (group C, n = 33). Clinical and demographic variables (age, sex, race, disease severity, disease etiology, and cold ischemia time) were analyzed for associations with patient survival, patient/graft survival, and posttransplant resource utilization (hospital length of stay and hospital charges). Age, sex, race, disease etiology, and cold ischemia time were similar among the three groups. Patient survival, patient/graft survival, and hospital charges did not differ statistically among the three groups. The median length of stay differed statistically only between groups B and C (P = .006). Our data support the idea that if severely ill patients with end-stage liver disease are selected appropriately, liver transplant outcomes are similar to those observed among less ill patients transplanted electively from home.
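For readers unfamiliar with the score, here is a minimal sketch of the original UNOS MELD calculation as commonly published. The formula is not given in the abstract, so the coefficients and clamping rules below are assumptions drawn from the standard literature rather than from this study.

```python
import math

def meld_score(creatinine_mg_dl: float, bilirubin_mg_dl: float, inr: float,
               on_dialysis: bool = False) -> int:
    """Approximate UNOS MELD score (original published form; an assumption,
    not taken from this abstract). Lab values below 1.0 are floored at 1.0;
    creatinine is capped at 4.0 and set to 4.0 for dialyzed patients."""
    if on_dialysis:
        creatinine_mg_dl = 4.0
    cr = min(max(creatinine_mg_dl, 1.0), 4.0)
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    score = (9.57 * math.log(cr)
             + 3.78 * math.log(bili)
             + 11.20 * math.log(inr)
             + 6.43)
    return min(round(score), 40)  # scores are conventionally capped at 40

# Example: a moderately ill candidate (hypothetical values)
print(meld_score(creatinine_mg_dl=1.8, bilirubin_mg_dl=3.2, inr=1.6))
```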


Subject(s)
Graft Survival/physiology; Health Care Rationing/methods; Liver Transplantation/physiology; Female; Humans; Liver Diseases/classification; Liver Diseases/surgery; Liver Transplantation/mortality; Male; Middle Aged; Models, Biological; Organ Preservation/methods; Resource Allocation/methods; Retrospective Studies; Survival Analysis; Time Factors; Treatment Outcome; Waiting Lists
2.
Bone Marrow Transplant ; 30(5): 311-4, 2002 Sep.
Article in English | MEDLINE | ID: mdl-12209353

ABSTRACT

High-dose etoposide (2 g/m²) plus G-CSF is a very effective regimen for peripheral blood progenitor cell (PBPC) mobilization, but neutropenia is common, and the infectious complications associated with high-dose etoposide have not previously been described. After noting a high incidence of hospitalizations for neutropenic fever, we began a vigorous prophylactic antibiotic regimen for patients receiving high-dose etoposide plus G-CSF in an attempt to reduce infectious complications. Ninety-eight patients underwent etoposide mobilization between December 1997 and June 2000. Three chronological patient groups received: (1) no specific antibiotic prophylaxis (n = 44); (2) vancomycin i.v., cefepime i.v., clarithromycin p.o., and ciprofloxacin p.o. (n = 27); and (3) vancomycin i.v., clarithromycin p.o., and ciprofloxacin p.o. (n = 27). Patients who received no antibiotic prophylaxis had a 68% incidence of hospitalization for neutropenic fever. Among patients receiving prophylaxis, the incidence fell to 26% and 15%, respectively, for an overall incidence of 20% (P < 0.001 for the comparison between patients who did and did not receive prophylaxis). We conclude that etoposide mobilization is associated with a significant incidence of neutropenic fever, which can be substantially reduced by a vigorous antimicrobial prophylactic program.
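The reported percentages back-calculate cleanly from the group sizes. A quick sketch; note the raw hospitalization counts (30, 7, and 4) are inferred from the rounded percentages, not stated in the abstract.

```python
# Hospitalization for neutropenic fever, by prophylaxis cohort.
# Event counts are inferred from the reported percentages (assumption).
cohorts = {
    "no prophylaxis":                   (30, 44),  # 30/44 ~ 68%
    "vanc + cefepime + clari + cipro":  (7, 27),   # 7/27  ~ 26%
    "vanc + clari + cipro":             (4, 27),   # 4/27  ~ 15%
}
for name, (events, n) in cohorts.items():
    print(f"{name}: {events}/{n} = {100 * events / n:.0f}%")

# Overall incidence among patients on prophylaxis: (7 + 4) / (27 + 27) ~ 20%
print(f"overall with prophylaxis: {100 * (7 + 4) / 54:.0f}%")
```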


Subject(s)
Antibiotic Prophylaxis/methods; Drug Therapy, Combination/therapeutic use; Etoposide/adverse effects; Fever/prevention & control; Hematopoietic Stem Cell Mobilization/adverse effects; Neutropenia/prevention & control; Ambulatory Care; Cefepime; Cephalosporins/adverse effects; Ciprofloxacin/administration & dosage; Clarithromycin/administration & dosage; Data Collection; Etoposide/administration & dosage; Female; Fever/chemically induced; Hematopoietic Stem Cell Mobilization/methods; Humans; Male; Middle Aged; Neutropenia/chemically induced; Opportunistic Infections/prevention & control; Vancomycin/administration & dosage
4.
J Urol ; 166(6): 2043-7, 2001 Dec.
Article in English | MEDLINE | ID: mdl-11696703

ABSTRACT

PURPOSE: The long-term outcome after kidney donation has been a particular concern ever since the recognition of hyperfiltration injury, yet few published reports have examined donor renal outcome at 20 years or more. Kidney transplantation has been performed at the Cleveland Clinic Foundation since 1963, and the institution has extensive experience with live donor transplantation. We assessed the impact of donor nephrectomy on renal function, urinary protein excretion, and the development of hypertension to determine whether renal deterioration occurs at 20 years or more of followup. MATERIALS AND METHODS: From 1963 to 1975, 180 live donor nephrectomies were performed at the Cleveland Clinic. We attempted to contact all patients to request participation in our study. The 70 patients who agreed to participate were mailed a package containing a 24-hour urine container (for measurement of creatinine, total protein, and albumin), a vial for blood collection (for measurement of serum creatinine) and a medical questionnaire. All specimens were returned to and processed by the Cleveland Clinic medical laboratories. Blood pressure was measured and recorded by a local physician. A 24-hour creatinine clearance and the Cockcroft-Gault formula were used to estimate renal function, and values were compared with an age adjusted glomerular filtration rate for a solitary kidney. RESULTS: Mean followup was 25 years. The 24-hour urinary creatinine clearance decreased to 72% of the predonation value. For the entire study cohort, serum creatinine and systolic blood pressure after donation were significantly increased compared with predonation values, although both remained in the normal range. The overall incidence of hypertension was comparable to that expected in the age matched general population. There was no gender or age difference (younger or older than 50 years) in 24-hour urinary creatinine clearance or in the change in serum creatinine from before to after donation. Urinary protein and albumin excretion after donation was significantly higher in males than in females. Thirteen subjects (19%) had a 24-hour urinary protein excretion greater than 0.15 gm./24 hours, 5 (7%) of whom excreted greater than 0.8 gm./24 hours. No gender difference was noted in blood pressure, and there were no significant changes in diastolic pressure based on gender or age. CONCLUSIONS: Overall, renal function is well preserved at a mean of 25 years after donor nephrectomy. Males had significantly higher protein and albumin excretion than females, but no other clinically significant differences in renal function, blood pressure, or proteinuria were noted by gender or by age at donation. Proteinuria increased with marginal significance but appears to be of no clinical consequence in most patients. Patients with mild or borderline proteinuria before donation may represent a subgroup at particular risk for the development of significant proteinuria 20 years or more after donation. The overall incidence of proteinuria in our study is in the range of previously reported values after donor nephrectomy.
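The abstract names the Cockcroft-Gault estimate of creatinine clearance. A minimal sketch of the commonly cited form follows; the abstract does not specify the exact variant used, so treat this as the textbook formula rather than the study's implementation.

```python
def cockcroft_gault_crcl(age_years: float, weight_kg: float,
                         serum_creatinine_mg_dl: float,
                         female: bool) -> float:
    """Estimated creatinine clearance (ml/min), Cockcroft-Gault formula:
    (140 - age) * weight / (72 * serum creatinine), * 0.85 for females."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Example: hypothetical 60-year-old, 70 kg female donor, creatinine 1.0 mg/dl
print(f"{cockcroft_gault_crcl(60, 70, 1.0, female=True):.0f} ml/min")
```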


Subject(s)
Kidney/physiology; Living Donors; Nephrectomy; Adult; Aged; Aged, 80 and over; Female; Follow-Up Studies; Humans; Kidney Transplantation; Male; Middle Aged; Time Factors
5.
Dis Colon Rectum ; 44(10): 1441-5, 2001 Oct.
Article in English | MEDLINE | ID: mdl-11598472

ABSTRACT

INTRODUCTION: There are no previous comparative studies of total abdominal colectomy by laparoscopic methods in patients with ulcerative colitis or Crohn's disease requiring urgent colectomy. This study aimed to determine the safety and efficacy of laparoscopic colectomy in these patients compared with conventional urgent colectomy. METHODS: Patients undergoing laparoscopic total colectomy for acute colitis were identified from a prospective registry. All patients underwent total colectomy with creation of an end ileostomy and buried mucous fistula. No patient had fulminant disease (tachycardia, fever, marked leukocytosis, peritonitis), but all were failing to respond to medical treatment. Patients undergoing conventional total colectomy were matched for age, gender, body mass index, diagnosis, disease severity, and operative period. Median values (range) are reported. RESULTS: From 1997 to 1999, there were 19 laparoscopic and 29 matched conventional patients. There were no inadvertent colotomies or conversions in the laparoscopic group. Although operative blood loss did not differ between the laparoscopic group (100 (range, 50-700) ml) and the conventional group (150 (range, 50-500) ml), operative times were significantly longer in the laparoscopic group (210 (range, 150-270) vs. 120 (range, 60-180) minutes; P < 0.001). Bowel function returned more quickly in the laparoscopic group (1 (range, 1-3) vs. 2 (range, 1-4) days; P = 0.003) and the length of stay was shorter (4 (range, 3-13) vs. 6 (range, 4-24) days; P = 0.04). Complications occurred in three (16 percent) laparoscopic patients (2 wound infections and 1 ileus) and in seven (24 percent) conventional patients (3 wound infections, 3 deep venous thromboses, and 1 upper gastrointestinal bleed). CONCLUSIONS: Laparoscopic total colectomy is feasible and safe in patients with acute nonfulminant colitis and may lead to a faster recovery than conventional resection.
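The abstract does not name the test behind its categorical comparisons. Purely as an illustration, small-sample complication rates like these (3/19 vs. 7/29) are often compared with Fisher's exact test; the counts below come from the abstract, but the choice of test is an assumption.

```python
from scipy.stats import fisher_exact

# Complications: 3 of 19 laparoscopic vs. 7 of 29 conventional patients
table = [[3, 19 - 3],
         [7, 29 - 7]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p_value:.2f}")
```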


Subject(s)
Colectomy; Colitis/surgery; Laparoscopy; Acute Disease; Adolescent; Adult; Case-Control Studies; Colectomy/methods; Emergencies; Female; Humans; Male; Middle Aged
6.
Ann Thorac Surg ; 72(3): 725-30, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11565648

ABSTRACT

BACKGROUND: Implantable left ventricular assist devices (LVAD) are used as a bridge to transplantation but are associated with a high risk of infection, including nosocomial bloodstream infections (BSI). METHODS: We retrospectively reviewed the medical records of all patients at the Cleveland Clinic with 72 hours or longer of implantable LVAD support from January 1992 through June 2000 to determine the attack rate, incidence, and impact of nosocomial BSI in patients with LVAD. A nosocomial BSI was defined using the Centers for Disease Control and Prevention definition. An LVAD-related BSI was defined as one in which the same pathogen was cultured from the device and the blood with no other obvious source. Two hundred fourteen patients were included in the study (17,831 LVAD-days). RESULTS: One hundred forty BSI were identified in 104 patients, for an attack rate of 49% and an incidence of 7.9 BSI per 1000 LVAD-days. Thirty-eight percent of the BSI were LVAD associated. The most common pathogens causing BSI were coagulase-negative staphylococci (n = 33), Staphylococcus aureus and Candida spp. (n = 19 each), and Pseudomonas aeruginosa (n = 16). A Cox proportional hazards model found BSI in patients with LVAD to be significantly associated with death (hazard ratio = 4.02, p < 0.001). Fungemia had the highest hazard ratio (10.9), followed by gram-negative bacteremia (5.1) and gram-positive bacteremia (2.2). CONCLUSIONS: Patients with implantable LVAD have a high incidence of BSI, which is associated with significantly increased mortality. Strategies for prevention of infection in LVAD recipients should focus on the drive line exit site until technical advances can achieve a totally implantable device.
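The attack rate and incidence density reproduce directly from the counts in the abstract. A short sketch; the commented Cox fit is included only to illustrate the kind of model the abstract names, using the lifelines library with hypothetical column names.

```python
# Attack rate and incidence density from the abstract's counts
patients_with_bsi, patients = 104, 214
bsi_episodes, lvad_days = 140, 17_831
print(f"attack rate: {100 * patients_with_bsi / patients:.0f}%")          # ~49%
print(f"incidence: {1000 * bsi_episodes / lvad_days:.1f} per 1000 days")  # ~7.9

# Illustrative Cox proportional hazards fit (column names are hypothetical):
# import pandas as pd
# from lifelines import CoxPHFitter
# cph = CoxPHFitter()
# cph.fit(df, duration_col="days_on_support", event_col="died")
# cph.print_summary()  # hazard ratios are reported as exp(coef)
```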


Subject(s)
Bacteremia/etiology; Cross Infection/etiology; Fungemia/etiology; Heart-Assist Devices/adverse effects; Bacteremia/microbiology; Bacteremia/mortality; Female; Fungemia/microbiology; Fungemia/mortality; Fungi/isolation & purification; Gram-Negative Bacteria/isolation & purification; Gram-Positive Bacteria/isolation & purification; Heart Transplantation; Heart-Assist Devices/microbiology; Humans; Male; Middle Aged; Proportional Hazards Models; Prostheses and Implants/adverse effects; Prostheses and Implants/microbiology; Retrospective Studies; Risk Factors; Survival Rate
7.
Bone Marrow Transplant ; 27(8): 843-6, 2001 Apr.
Article in English | MEDLINE | ID: mdl-11477442

ABSTRACT

The role of autologous peripheral blood progenitor cell (PBPC) transplantation for high-risk stage II/III breast cancer remains controversial, and new prognostic indicators defining subsets of patients who may benefit from autologous PBPC transplantation would be clinically useful. The axillary lymph node ratio, defined as the total number of axillary nodes involved with cancer divided by the number of axillary nodes surgically sampled, has been reported to be of potential prognostic importance in transplantation for high-risk, stage II/III breast cancer. We therefore retrospectively reviewed 111 women with high-risk, stage II/III breast cancer and at least four positive axillary lymph nodes who underwent autologous PBPC transplantation from 1991 to June 1999. None of the patients had received prior radiotherapy, and all had completed one, and only one, course of at least three cycles of adjuvant chemotherapy. The median number of axillary nodes sampled was 20 (range 6-40) and the median number of positive axillary nodes was 12 (range 4-35). The median node ratio was 0.68. Event-free survival was strongly influenced by node ratio: patients with a node ratio < 0.7 had a 5-year event-free survival of 68%, vs 46% for those with a node ratio ≥ 0.7 (P = 0.03). Forty percent of patients with a high node ratio relapsed, vs 20% of those with a low node ratio (P = 0.02). Multivariate analysis revealed that positive estrogen receptor status and a node ratio < 0.7 were independent factors associated with better event-free survival (P = 0.0001 and P = 0.004, respectively). We conclude that patients with a node ratio < 0.7 have a significantly better prognosis following autologous PBPC transplantation than patients with a ratio ≥ 0.7.
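The stratifying statistic is a one-line computation. In this sketch the 0.7 threshold comes from the abstract, while the example patient's counts are hypothetical.

```python
def node_ratio(positive_nodes: int, sampled_nodes: int) -> float:
    """Axillary node ratio: nodes involved with cancer / nodes surgically sampled."""
    return positive_nodes / sampled_nodes

# Hypothetical patient: 12 of 20 sampled nodes positive
ratio = node_ratio(12, 20)
group = "high risk (ratio >= 0.7)" if ratio >= 0.7 else "low risk (ratio < 0.7)"
print(f"ratio = {ratio:.2f}: {group}")
```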


Subject(s)
Breast Neoplasms/diagnosis; Breast Neoplasms/therapy; Lymph Nodes/pathology; Transplantation, Autologous; Adult; Axilla; Breast Neoplasms/pathology; Female; Hematopoietic Stem Cell Transplantation; Humans; Middle Aged; Neoplasm Staging; Prognosis; Receptors, Estrogen/metabolism; Retrospective Studies; Risk Factors; Severity of Illness Index; Survival Analysis
8.
J Heart Lung Transplant ; 20(4): 425-30, 2001 Apr.
Article in English | MEDLINE | ID: mdl-11295580

ABSTRACT

BACKGROUND: Hypogammaglobulinemia (HGG) has been reported after solid organ transplantation and is noted to confer an increased risk of opportunistic infections. OBJECTIVES: In this study, we sought to assess the relationship of severe HGG to infection and acute cellular rejection following heart transplantation. METHODS: We retrospectively analyzed the clinical outcome of 111 consecutive heart transplant recipients, transplanted between February 1997 and January 1999, who had immunoglobulin G (IgG) levels monitored at 3 and 6 months post-transplant and when clinically indicated. RESULTS: Eighty-one percent of patients were male, the mean age was 54 +/- 13 years, and the mean follow-up period was 13.8 +/- 5.7 months. Patients had normal IgG levels prior to transplant (mean 1137 +/- 353 mg/dl). Ten percent (11 of 111) of patients developed severe HGG (IgG < 350 mg/dl) post-transplant. The average time to the lowest IgG level was 196 +/- 125 days. Patients with severe HGG were at increased risk of opportunistic infection compared with patients with IgG > 350 mg/dl (55% [6 of 11] vs. 5% [5 of 100], odds ratio = 22.8, p < 0.001). Compared with patients with no rejection, patients who experienced three or more episodes of rejection had a lower mean IgG (580 +/- 309 vs. 751 +/- 325 mg/dl, p = 0.05) and a higher incidence of severe HGG (33% [7 of 21] vs. 2.8% [1 of 35], p = 0.001). The number of rejection episodes per patient at 1 year was higher in patients with severe HGG than in patients with IgG > 350 mg/dl (2.82 +/- 1.66 vs. 1.36 +/- 1.45 episodes/patient, p = 0.02). The use of parenteral steroid pulse therapy was associated with an increased risk of severe HGG (odds ratio = 15.28, p < 0.001). CONCLUSIONS: Severe HGG after cardiac transplantation may develop as a consequence of intensified immunosuppressive therapy for rejection and confers an increased risk of opportunistic infections. The IgG level may be a useful marker for identifying patients at high risk.
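The headline odds ratio reproduces exactly from the 2x2 counts given in the abstract (6 of 11 severe-HGG patients infected vs. 5 of 100 others):

```python
# 2x2 table from the abstract: opportunistic infection vs. severe HGG
infected_hgg, total_hgg = 6, 11
infected_other, total_other = 5, 100

odds_hgg = infected_hgg / (total_hgg - infected_hgg)          # 6/5
odds_other = infected_other / (total_other - infected_other)  # 5/95
print(f"OR = {odds_hgg / odds_other:.1f}")  # 22.8, matching the abstract
```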


Subject(s)
Agammaglobulinemia/complications; Graft Rejection/complications; Heart Transplantation; Immunoglobulin G/blood; Opportunistic Infections/etiology; Steroids/adverse effects; Agammaglobulinemia/blood; Agammaglobulinemia/etiology; Biomarkers/blood; Chi-Square Distribution; Female; Graft Rejection/blood; Humans; Immunosuppressive Agents/adverse effects; Infusions, Parenteral; Logistic Models; Male; Middle Aged; Odds Ratio; Opportunistic Infections/blood; Pulse Therapy, Drug; Retrospective Studies