Results 1 - 20 of 101
1.
Osteoarthritis Cartilage ; 31(6): 766-774, 2023 06.
Article in English | MEDLINE | ID: mdl-36696941

ABSTRACT

OBJECTIVE: To determine the effects of acute (≤7 days) femoral head ischemia on the proximal femoral growth plate and metaphysis in a piglet model of Legg-Calvé-Perthes disease (LCPD). We hypothesized that qualitative and quantitative histological assessment would identify effects of ischemia on endochondral ossification. DESIGN: Unilateral femoral head ischemia was surgically induced in piglets, and femurs were collected for histological assessment at 2 (n = 7) or 7 (n = 5) days post-ischemia. Samples were assessed qualitatively, and histomorphometry of the growth plate zones and primary spongiosa was performed. In a subset of samples at 7 days, hypertrophic chondrocytes were quantitatively assessed and immunohistochemistry for TGFβ1 and Indian hedgehog was performed. RESULTS: By 2 days post-ischemia, there was significant thinning of the proliferative and hypertrophic zones, by 63 µm (95% CI -103, -22) and 19 µm (95% CI -33, -5), respectively. This thinning persisted at 7 days post-ischemia. Likewise, at 7 days post-ischemia, the primary spongiosa was thinned to absent, by an average of 311 µm (95% CI -542, -82), in all ischemic samples. TGFβ1 expression was increased in the hypertrophic zone at 7 days post-ischemia. CONCLUSIONS: Alterations to the growth plate zones and metaphysis occurred by 2 days post-ischemia and persisted at 7 days post-ischemia. Our findings suggest that endochondral ossification may be disrupted at an earlier time point than previously reported and that growth disruption may occur in the piglet model as occurs in some children with LCPD.


Subject(s)
Legg-Calve-Perthes Disease , Animals , Swine , Legg-Calve-Perthes Disease/pathology , Femur Head/pathology , Growth Plate/pathology , Hedgehog Proteins , Ischemia
2.
Osteoarthritis Cartilage ; 30(9): 1244-1253, 2022 09.
Article in English | MEDLINE | ID: mdl-35644462

ABSTRACT

OBJECTIVE: To determine if the quantitative MRI techniques T2 and T1ρ mapping are sensitive to ischemic injury to epiphyseal cartilage in vivo in a piglet model of Legg-Calvé-Perthes disease using a clinical 3T MRI scanner. We hypothesized that T2 and T1ρ relaxation times would be increased in the epiphyseal cartilage of operated vs contralateral-control femoral heads 1 week following onset of ischemia. DESIGN: Unilateral femoral head ischemia was surgically induced in eight piglets. Piglets were imaged 1 week post-operatively in vivo at 3T MRI using a magnetization-prepared 3D fast spin echo sequence for T2 and T1ρ mapping and a 3D gradient echo sequence for cartilage segmentation. Ischemia was confirmed in all piglets using gadolinium contrast-enhanced MRI. Median T2 and T1ρ relaxation times were measured in the epiphyseal cartilage of the ischemic and control femoral heads and compared using paired t-tests. Histological assessment was performed on a subset of five piglets. RESULTS: T2 and T1ρ relaxation times were significantly increased in the epiphyseal cartilage of the operated vs control femoral heads (ΔT2 = 11.9 ± 3.7 ms, 95% CI = [8.8, 15.0] ms, P < 0.0001; ΔT1ρ = 12.8 ± 4.1 ms, 95% CI = [9.4, 16.2] ms, P < 0.0001). Histological assessment identified chondronecrosis in the hypertrophic and deep proliferative zones within ischemic epiphyseal cartilage. CONCLUSIONS: T2 and T1ρ mapping are sensitive to ischemic injury to the epiphyseal cartilage in vivo at clinical 3T MRI. These techniques may be clinically useful to assess injury and repair to the epiphyseal cartilage to better stage the extent of ischemic damage in Legg-Calvé-Perthes disease.
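The 95% confidence intervals reported above can be reproduced from the summary statistics. A minimal sketch, assuming the ± values are the standard deviations of the paired differences across the n = 8 piglets and using the two-sided t critical value for 7 degrees of freedom (≈2.3646):

```python
from math import sqrt

def paired_ci95(mean_diff, sd_diff, n, t_crit):
    """95% CI for a mean paired difference: mean +/- t * sd / sqrt(n)."""
    half_width = t_crit * sd_diff / sqrt(n)
    return (mean_diff - half_width, mean_diff + half_width)

# Reported: dT2 = 11.9 +/- 3.7 ms, dT1rho = 12.8 +/- 4.1 ms, n = 8; t(0.975, df=7) ~ 2.3646
t2_lo, t2_hi = paired_ci95(11.9, 3.7, 8, 2.3646)
t1r_lo, t1r_hi = paired_ci95(12.8, 4.1, 8, 2.3646)
print(round(t2_lo, 1), round(t2_hi, 1))    # matches the reported [8.8, 15.0] ms
print(round(t1r_lo, 1), round(t1r_hi, 1))  # matches the reported [9.4, 16.2] ms
```

Both intervals round to the values in the abstract, which supports reading the ± terms as standard deviations rather than standard errors.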


Subject(s)
Cartilage, Articular , Legg-Calve-Perthes Disease , Animals , Cartilage/pathology , Cartilage, Articular/diagnostic imaging , Cartilage, Articular/pathology , Femur Head/diagnostic imaging , Femur Head/pathology , Growth Plate/diagnostic imaging , Growth Plate/pathology , Ischemia/diagnostic imaging , Ischemia/etiology , Legg-Calve-Perthes Disease/diagnostic imaging , Legg-Calve-Perthes Disease/pathology , Magnetic Resonance Imaging/methods , Swine
3.
Osteoarthritis Cartilage ; 28(9): 1235-1244, 2020 09.
Article in English | MEDLINE | ID: mdl-32278071

ABSTRACT

OBJECTIVE: To evaluate articular cartilage by magnetic resonance imaging (MRI) T2* mapping within the distal femur and proximal tibia in adolescents with juvenile osteochondritis dissecans (JOCD). DESIGN: JOCD imaging studies acquired between August 2011 and February 2019 with clinical and T2* mapping MRI knee images were retrospectively collected and analyzed for 31 participants (9F/22M, 15.0 ± 3.8 years old) with JOCD lesions in the medial femoral condyle (MFC). In total, N = 32 knees with JOCD lesions and N = 14 control knees were assessed. Mean T2* values in four articular cartilage regions-of-interest (MFC, lateral femoral condyle (LFC), medial tibia (MT), and lateral tibia (LT)) and lesion volume were measured and analyzed using Wilcoxon-rank-sum tests and Spearman correlation coefficients (R). RESULTS: Mean ± standard error T2* differences observed between the lesion-sided MFC and the LFC in JOCD-affected knees (28.5 ± 0.9 [95% confidence interval: 26.8, 30.3] vs 26.3 ± 0.7 [24.8, 27.7] ms, P = 0.088) and between the affected- and control-knee MFC (28.5 ± 0.9 [26.8, 30.3] vs 28.5 ± 0.6 [27.1, 29.9] ms, P = 0.719) were nonsignificant. T2* was significantly increased in the lesion-sided MT vs the LT for the JOCD-affected knees (21.5 ± 0.7 [20.1, 22.9] vs 18.0 ± 0.7 [16.5, 19.5] ms, P = 0.002), but this same difference was also observed between the MT and LT in control knees (21.0 ± 0.6 [19.7, 22.3] vs 18.1 ± 1.1 [15.8, 20.4] ms, P = 0.037). There was no significant T2* difference between the affected- and control-knee MT (21.5 ± 0.7 [20.1, 22.9] vs 21.0 ± 0.6 [19.7, 22.3] ms, P = 0.905). T2* within the lesion-sided MFC was not correlated with patient age (R = 0.20, P = 0.28) or lesion volume (R = 0.06, P = 0.75). T2* values were slightly increased near lesions in later-stage JOCD subjects, but without statistical significance.
CONCLUSIONS: T2* relaxation times in the articular cartilage overlying JOCD lesions in the MFC, and in the adjacent MT cartilage, were not significantly different from control sites in early-stage JOCD.
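The correlations above (e.g., T2* vs. age, R = 0.20) use the Spearman rank coefficient, which is simply the Pearson correlation computed on ranks. A minimal standard-library sketch, assuming average ranks for ties (illustrative only, not the authors' analysis pipeline):

```python
def spearman_r(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    def ranks(v):
        # 1-based average ranks, with ties sharing their group's mean rank
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(v):
            j = i
            while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Any monotonic relationship, even a nonlinear one, gives R = 1
print(spearman_r([1, 2, 3, 4], [1, 4, 9, 16]))  # 1.0
```

Because it depends only on ranks, the statistic is robust to the skewed lesion-volume distributions typical of small clinical samples.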


Subject(s)
Cartilage, Articular/diagnostic imaging , Knee Joint/diagnostic imaging , Osteochondritis Dissecans/diagnostic imaging , Adolescent , Age of Onset , Child , Female , Femur/diagnostic imaging , Humans , Magnetic Resonance Imaging , Male , Retrospective Studies , Tibia/diagnostic imaging , Young Adult
4.
Am J Transplant ; 16(5): 1503-15, 2016 05.
Article in English | MEDLINE | ID: mdl-26602886

ABSTRACT

Solid phase immunoassays (SPI) are now routinely used to detect HLA antibodies. However, the flow cytometric crossmatch (FCXM) remains the established method for assessing final donor-recipient compatibility. Since 2005 we have followed a protocol whereby the final allocation decision for renal transplantation is based on SPI (not the FCXM). Here we report long-term graft outcomes for 508 consecutive kidney transplants using this protocol. All recipients were negative for donor-specific antibody by SPI. Primary outcomes are graft survival and incidence of acute rejection within 1 year (AR <1 year) for FCXM+ (n = 54) and FCXM- (n = 454) recipients. Median follow-up is 7.1 years. FCXM+ recipients were significantly different from FCXM- recipients for the following risk factors: living donor (24% vs. 39%, p = 0.03), duration of dialysis (31.0 months vs. 13.5 months, p = 0.008), retransplants (17% vs. 7.3%, p = 0.04), % sensitized (63% vs. 19%, p = 0.001), and PRA >80% (20% vs. 4.8%, p = 0.001). Despite these differences, 5-year actual graft survival rates are 87% and 84%, respectively. AR <1 year occurred in 13% FCXM+ and 12% FCXM- recipients. Crossmatch status was not associated with graft outcomes in any univariate or multivariate model. Renal transplantation can be performed successfully, using SPI as the definitive test for donor-recipient compatibility.


Subject(s)
Blood Grouping and Crossmatching , Graft Rejection/diagnosis , Health Care Rationing/methods , Histocompatibility Testing/methods , Isoantibodies/immunology , Kidney Transplantation , Tissue and Organ Procurement , B-Lymphocytes/immunology , Female , Flow Cytometry/methods , Follow-Up Studies , Graft Rejection/prevention & control , Graft Survival , Humans , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors , Tissue Donors
5.
Mol Psychiatry ; 20(2): 201-6, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25560762

ABSTRACT

Abnormal metabolism has been reported in bipolar disorder; however, these studies have been limited to specific regions of the brain. To investigate whole-brain changes potentially associated with these processes, we applied a magnetic resonance imaging technique novel to psychiatric research, quantitative mapping of T1 relaxation in the rotating frame (T1ρ). This method is sensitive to proton chemical exchange, which is affected by pH, metabolite concentrations, and cellular density, and offers high spatial resolution relative to alternative techniques such as magnetic resonance spectroscopy and positron emission tomography. Study participants included 15 patients with bipolar I disorder in the euthymic state and 25 normal controls balanced for age and gender. T1ρ maps were generated and compared between the bipolar and control groups using voxel-wise and regional analyses. T1ρ values were found to be elevated in the cerebral white matter and cerebellum in the bipolar group. However, volumes of these areas were normal as measured by high-resolution T1- and T2-weighted magnetic resonance imaging. Interestingly, the cerebellar T1ρ abnormalities were normalized in participants receiving lithium treatment. These findings are consistent with metabolic or microstructural abnormalities in bipolar disorder and draw attention to roles of the cerebral white matter and cerebellum. This study highlights the potential utility of high-resolution T1ρ mapping in psychiatric research.


Subject(s)
Bipolar Disorder/pathology , Brain Mapping , Brain/pathology , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Adult , Aged , Female , Humans , Male , Middle Aged , Psychiatric Status Rating Scales , Statistics, Nonparametric , Young Adult
7.
J Clin Invest ; 90(2): 545-53, 1992 Aug.
Article in English | MEDLINE | ID: mdl-1644923

ABSTRACT

Low-frequency ultradian and high-frequency insulin secretion pulses were studied in normal subjects and in metabolically stable pancreas transplant recipients. Insulin secretion pulsatility was evaluated by deconvoluting the pulsatile plasma C-peptide concentrations with its kinetic coefficients. In normal subjects, ultradian insulin secretion pulses with periodicities of 75-115 min were consistently observed during the 24-h secretory cycle. Pulse period and relative amplitude during the overnight rest (95 +/- 4 min and 27.6 +/- 2.4%) were similar to those during the steady state of continuous enteral feeding (93 +/- 5 min and 32.6 +/- 3.3%). Sampling at 2-min intervals revealed the presence of high-frequency insulin secretion pulses with periodicities of 14-20 min and an average amplitude of 46.6 +/- 5.4%. Pancreas transplant recipients had normal fasting and fed insulin secretion rates. Both low- and high-frequency insulin secretion pulses were present. The high-frequency pulse characteristics were identical to normal. Low-frequency ultradian pulse periodicity was normal, but pulse amplitude was increased. Thus, ultradian insulin secretory pulsatility is a consistent feature in normal subjects. The low- and high-frequency secretion pulsatilities are generated independently of extrinsic innervation. Autonomic innervation might modulate low-frequency ultradian pulse amplitude, exerting a dampening effect.


Subject(s)
Insulin/metabolism , Pancreas Transplantation , Adult , Blood Glucose/metabolism , C-Peptide/blood , Circadian Rhythm , Female , Humans , Insulin Secretion , Male , Periodicity , Secretory Rate
8.
Transplant Proc ; 38(10): 3524-6, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17175321

ABSTRACT

BACKGROUND: The occurrence of lymphocele formation following renal transplantation is variable, and the optimal approach to treatment remains undefined. Opening the peritoneum at the time of transplantation is one method of decreasing the incidence of lymphocele formation. The purpose of this study was to determine whether creating a peritoneal window at the time of transplantation decreases the incidence of lymphocele formation. METHODS: We performed a retrospective review of renal transplants conducted at our institution between 2002 and 2004. Records were reviewed to obtain details regarding opening of the peritoneum at the time of transplant and occurrence of lymphocele. Every patient underwent routine ultrasound imaging in the peri-operative period. Graft dysfunction secondary to the lymphocele was the primary indication for intervention. Data were analyzed by chi-square. RESULTS: During the initial transplant the peritoneum was opened in 35% of patients. The overall incidence of fluid collections, identified by ultrasound, was 24%. Opening the peritoneum did not decrease the incidence of lymphocele. However, more patients with a closed peritoneum required an intervention for a symptomatic lymphocele. In the 11 patients with an open peritoneum and a fluid collection, only one required an intervention. In patients whose peritoneum was left intact, 24% of fluid collections required intervention. Graft survival was equivalent. CONCLUSION: Creating a peritoneal window at the time of transplantation did not decrease the overall incidence of postoperative fluid collections. However, forming a peritoneal window at the time of transplantation did decrease the incidence of symptomatic lymphocele.
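The chi-square analysis in this study can be illustrated on a 2 × 2 table of peritoneum status versus need for intervention. The counts below are hypothetical (the abstract reports 1/11 for the open-peritoneum group but only a percentage for the intact group); the sketch computes the Pearson statistic and compares it against the df = 1 critical value of 3.84:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n  # expected count under independence
        stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: open peritoneum 1/11 intervened; intact ~24%, e.g. 4/17
stat = chi_square_2x2(1, 10, 4, 13)
print(round(stat, 2), stat > 3.84)  # statistic below 3.84 -> not significant at P < 0.05
```

With cell counts this small, an exact test (e.g., Fisher's) would ordinarily be preferred over the chi-square approximation.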


Subject(s)
Kidney Transplantation/methods , Lymphocele/prevention & control , Peritoneum/surgery , Postoperative Complications/prevention & control , Humans , Incidence , Lymphocele/epidemiology , Postoperative Complications/epidemiology , Retrospective Studies
9.
Free Radic Biol Med ; 29(9): 881-8, 2000 Nov 01.
Article in English | MEDLINE | ID: mdl-11063913

ABSTRACT

Microvascular endothelial cells play a key role in inflammation by undergoing activation and recruiting circulating immune cells into tissues and foci of inflammation, an early and rate-limiting step in the inflammatory process. We have previously [Binion et al., Gastroenterology 112:1898-1907, 1997] shown that human intestinal microvascular endothelial cells (HIMEC) isolated from surgically resected inflammatory bowel disease (IBD) patient tissue demonstrate significantly increased leukocyte binding in vitro compared to normal HIMEC. Our studies [Binion et al., Am. J. Physiol. 275 (Gastrointest. Liver Physiol. 38):G592-G603, 1998] have also demonstrated that nitric oxide (NO) production by inducible nitric oxide synthase (iNOS) normally plays a key role in downregulating HIMEC activation and leukocyte adhesion. Using primary cultures of HIMEC derived from normal and IBD patient tissues, we sought to determine whether alterations in iNOS-derived NO production underlie leukocyte hyperadhesion in IBD. Both nonselective (N(G)-monomethyl-L-arginine) and selective (N-iminoethyl-L-lysine) inhibitors of iNOS significantly increased leukocyte binding by normal HIMEC activated with cytokines and lipopolysaccharide (LPS), but had no effect on leukocyte adhesion by similarly activated IBD HIMEC. When compared to normal HIMEC, IBD endothelial cells had significantly decreased levels of iNOS mRNA, protein, and NO production following activation. Addition of exogenous NO by co-culture with normal HIMEC or by pharmacologic delivery with the long-acting NO donor detaNONOate restored a normal leukocyte binding pattern in the IBD HIMEC. These data suggest that loss of iNOS expression is a feature of chronically inflamed microvascular endothelial cells, which leads to enhanced leukocyte binding, potentially contributing to chronic, destructive inflammation in IBD.


Subject(s)
Endothelium, Vascular/enzymology , Endothelium, Vascular/pathology , Inflammatory Bowel Diseases/enzymology , Inflammatory Bowel Diseases/pathology , Intestines/blood supply , Leukocytes/pathology , Nitric Oxide Synthase/deficiency , Cell Adhesion/physiology , Cells, Cultured , Free Radicals/metabolism , Humans , Inflammatory Bowel Diseases/genetics , Nitric Oxide/biosynthesis , Nitric Oxide Synthase/genetics , Nitric Oxide Synthase Type II , RNA, Messenger/genetics , RNA, Messenger/metabolism
10.
Neurology ; 55(4): 468-79, 2000 Aug 22.
Article in English | MEDLINE | ID: mdl-10953176

ABSTRACT

Autism is a common disorder of childhood, affecting 1 in 500 children. Yet it often remains unrecognized and undiagnosed until or after late preschool age because appropriate tools for routine developmental screening and screening specifically for autism have not been available. Early identification of children with autism and intensive, early intervention during the toddler and preschool years improves outcome for most young children with autism. This practice parameter reviews the available empirical evidence and gives specific recommendations for the identification of children with autism. This approach requires a dual process: 1) routine developmental surveillance, with screening specifically for autism, performed on all children to identify those at risk for any type of atypical development and those specifically at risk for autism; and 2) diagnosis and evaluation of autism, to differentiate autism from other developmental disorders.


Subject(s)
Autistic Disorder/diagnosis , Mass Screening/methods , Mass Screening/standards , Asperger Syndrome/diagnosis , Autistic Disorder/genetics , Child, Preschool , Developmental Disabilities/diagnosis , Diagnosis, Differential , Disease Management , Electrophysiology , Humans , Infant , Lead Poisoning, Nervous System, Childhood/diagnosis , Neuropsychological Tests , Predictive Value of Tests , Risk Assessment
11.
J Med Chem ; 34(10): 3011-22, 1991 Oct.
Article in English | MEDLINE | ID: mdl-1920352

ABSTRACT

A series of 4-(diarylmethyl)-1-[3-(aryloxy)propyl]piperidines and structurally related compounds were synthesized as calcium-channel blockers and antihypertensive agents. Compounds were evaluated for calcium-channel-blocking activity by determining their ability to antagonize calcium-induced contractions of isolated rabbit aortic strips. The most potent compounds were those with fluoro substituents in the 3- and/or 4-positions of both rings of the diphenylmethyl group. Bis(4-fluorophenyl)acetonitrile analogue 79 was similar in potency to bis(4-fluorophenyl)methyl compound 1. The methylene analogue of 1 (78) and derivatives of 1 that contained a hydroxyl (76), carbamoyl (80), amino (81), or acetamido (82) substituent on the methyl group were less potent. In most cases, substituents on the phenoxy ring, changes in the distance between the aryloxy group and the piperidine nitrogen, and the substitution of S, N(CH3), or CH2 for the oxygen atom of the aryloxy group had only a small to moderate effect on the potency. The best compounds in this series were more potent than verapamil, diltiazem, flunarizine, and lidoflazine, but were less potent than nifedipine. Compounds were evaluated for antihypertensive activity in spontaneously hypertensive rats (SHR) at an oral dose of 30 mg/kg. Of the 55 compounds tested, only nine produced a statistically significant (P < 0.05) reduction in blood pressure greater than 20%; all of these compounds had fluoro substituents in both rings of the diphenylmethyl group. One of the most active compounds in the SHR at 30 mg/kg was 1-[4-[3-[4-[bis(3,4-difluorophenyl)methyl]-1-piperidinyl]propoxy]-3-methoxyphenyl]ethanone (63), which produced a 35% reduction in blood pressure and was similar in activity to nifedipine.
At lower doses, however, 4-[bis(4-fluorophenyl)methyl]-1-[3-(4-chlorophenoxy)propyl]piperidine (93) was one of the most effective antihypertensive agents, producing reductions in blood pressure of 17 and 11% at oral doses of 10 and 3 mg/kg, respectively; 63 was inactive at 10 mg/kg.


Subject(s)
Antihypertensive Agents/pharmacology , Calcium Channel Blockers/pharmacology , Piperidines/pharmacology , Animals , Antihypertensive Agents/chemical synthesis , Antihypertensive Agents/therapeutic use , Aorta , Calcium/pharmacology , Calcium Channel Blockers/chemical synthesis , Calcium Channel Blockers/therapeutic use , Hypertension/drug therapy , Male , Molecular Structure , Muscle Contraction/drug effects , Piperidines/chemical synthesis , Piperidines/therapeutic use , Rabbits , Rats , Rats, Inbred SHR , Structure-Activity Relationship
12.
Transplantation ; 67(3): 486-8, 1999 Feb 15.
Article in English | MEDLINE | ID: mdl-10030301

ABSTRACT

BACKGROUND: Surgical complications after combined kidney and pancreas transplantation are a major source of morbidity and mortality. Complications related to the pancreas occur with greater frequency as compared to renal complications. The occurrence in our practice of two cases of renal infarction resulting from torsion about the vascular pedicle led to our retrospective review of similar vascular complications after combined kidney and pancreas transplantation. METHODS: Charts were reviewed retrospectively, and two patients were identified who experienced torsion about the vascular pedicle of an intra-abdominally placed renal allograft. RESULTS: Two patients who had received combined intraperitoneal kidney and pancreas transplantation presented at 16 and 11 months after transplant, respectively, with abdominal pain and decreased urine output. One patient had radiological documentation of abnormal rotation before the graft loss; unfortunately, the significance of this finding was missed. Diagnosis was made in both patients at laparotomy, where the kidneys were infarcted secondary to torsion of the vascular pedicle. Both patients underwent transplant nephrectomy and subsequently received a successful second cadaveric renal transplant. CONCLUSIONS: The mechanism of this complication is a result of the intra-abdominal placement of the kidney, length of the vascular pedicle, excess ureteral length, and paucity of adhesions secondary to steroid administration. These factors contribute to abnormal mobility of the kidney. Technical modifications such as minimizing excess ureteral length and nephropexy may help to avoid this complication.


Subject(s)
Diabetes Mellitus, Type 1/surgery , Diabetic Nephropathies/surgery , Kidney Failure, Chronic/surgery , Kidney Transplantation/pathology , Pancreas Transplantation , Postoperative Complications/pathology , Adult , Female , Hemorrhage , Humans , Infarction/pathology , Male , Middle Aged , Reoperation , Retrospective Studies , Torsion Abnormality
13.
Transplantation ; 56(6): 1305-9, 1993 Dec.
Article in English | MEDLINE | ID: mdl-8278993

ABSTRACT

The effectors of cell death in allograft rejection are poorly understood. Oxygen-derived free radicals (ODFR) may participate in graft destruction. We examined the impact of the antioxidants ascorbic acid (AA) and alpha-tocopherol (AT) with low-dose CsA on rat cardiac allograft survival. Lewis rats that had undergone heterotopic abdominal cardiac transplantation with Wistar-Furth allografts (day 0) were divided into 6 groups. Group 1 was the control group; groups 2 and 3 received AA (1200 mg/kg), and groups 4 and 5 received AT (800 IU/kg) by gavage daily until rejection. Groups 3, 5, and 6 were given CsA (2.5 mg/kg i.m.) on days 1-15. Allograft rejection times (in days) were 7.7 +/- 1, 10.3 +/- 1.5 (P < 0.01 vs. group 1), 37.1 +/- 6.4 (P < 0.01 vs. group 1, P = 0.0004 vs. group 6), 9.0 +/- 1.4, 26.5 +/- 3.6 (P < 0.01 vs. group 1, P < 0.03 vs. group 6), and 20 +/- 4.9 (P < 0.01 vs. group 1) for groups 1, 2, 3, 4, 5, and 6, respectively. To assess the impact of AA on ODFR production, chemiluminescence was performed on zymosan-activated Lewis whole blood from control rats and rats administered AA. AA significantly decreased peak chemiluminescence (P < 0.05) as compared with nontreated rats, indicating effective ODFR scavenging. To determine whether AA and AT inhibit lymphocyte stimulation, mixed lymphocyte response testing was performed with irradiated Wistar-Furth lymphocytes as stimulator cells for Lewis responder cells from rats treated as in groups 3, 5, and 6. CsA significantly suppressed (P < 0.05) proliferation as compared with untreated controls. Neither AA nor AT enhanced CsA's immunosuppressive effect by mixed lymphocyte response testing. In summary, prolongation of allograft survival with antioxidants AA and AT does not result from abrogation of lymphocyte responsiveness or alteration in CsA bioavailability. Rather, these data suggest that ODFR are involved in allograft destruction and support a role for effective antioxidant therapy in the treatment of allograft rejection.


Subject(s)
Antioxidants/administration & dosage , Ascorbic Acid/administration & dosage , Cyclosporine/administration & dosage , Graft Rejection/prevention & control , Heart Transplantation/adverse effects , Heart Transplantation/immunology , Vitamin E/administration & dosage , Animals , Ascorbic Acid/blood , Cyclosporine/blood , Drug Therapy, Combination , Graft Rejection/drug therapy , Graft Rejection/metabolism , Heart Transplantation/physiology , Luminescent Measurements , Lymphocyte Activation , Male , Rats , Rats, Inbred Lew , Rats, Inbred WF , Reactive Oxygen Species/metabolism , Time Factors , Transplantation, Homologous
14.
Transplantation ; 38(6): 575-8, 1984 Dec.
Article in English | MEDLINE | ID: mdl-6390816

ABSTRACT

A rat heart allograft model employing donor-specific transfusions (DSTs) was used to investigate several questions relevant to their clinical usage. The effects of varying blood storage duration (stored vs. fresh), number and timing of DSTs as well as use of concomitant azathioprine and cyclosporine (CsA) were assessed in terms of allograft survival and recipient sensitization. A comparison of stored and fresh-blood DSTs revealed that blood stored for up to 5 weeks was as effective as fresh blood and that a 2-week storage was optimal. Increased storage appeared to be associated with decreased sensitization. Multiple DSTs were more effective than a single DST and the peak effect appeared after six. Transfusions given at the time of transplantation were ineffective. The addition of concomitant (preoperative) azathioprine or CsA resulted in a further decrease in sensitization but also resulted in a dose-dependent diminution of the transfusion effect.


Subject(s)
Blood Transfusion , Heart Transplantation , Animals , Azathioprine/therapeutic use , Cyclosporins/therapeutic use , Graft Survival/drug effects , Male , Organ Preservation , Rats , Rats, Inbred Strains
15.
Transplantation ; 56(4): 822-7, 1993 Oct.
Article in English | MEDLINE | ID: mdl-8212200

ABSTRACT

Weight gain following renal transplantation occurs frequently but has not been investigated quantitatively. A retrospective chart review of 115 adult renal transplant recipients was used to describe patterns of weight gain during the first 5 years after transplantation. Only 23 subjects (21%) were overweight before their transplant. Sixty-six subjects (57%) experienced a weight gain of greater than or equal to 10%, and 49 subjects (43%) were overweight according to Metropolitan relative weight criteria at 1 year after transplantation. There was an inverse correlation between advancing age and weight gain, with the youngest patients (18-29 years) having a 13.3% weight gain and the oldest patients (age greater than 50 years) having the lowest gain of 8.3% at 1 year (P = 0.047). Black recipients experienced a greater weight gain than whites during the first posttransplant year (14.6% vs. 9.0%; P = 0.043), and maintained or increased this difference over the 5-year period. Men and women experienced comparable weight gain during the first year (9.5% vs. 12.1%), but women continued to gain weight throughout the 5-year study (21.0% total weight gain). The men remained stable after the first year (10.8% total weight gain). Recipients who experienced at least a 10% weight gain also increased their serum cholesterol (mean 261 vs. 219) and triglyceride (mean 277 vs. 159) levels significantly, whereas those without weight gain did not. Weight gain did not correlate with cumulative steroid dose, donor source (living-related versus cadaver), rejection history, pre-existing obesity, the number of months on dialysis before transplantation, or posttransplant renal function. Posttransplant weight gain is related mainly to demographic factors, not to treatment factors associated with the transplant. The average weight gain during the first year after renal transplantation is approximately 10%. 
This increased weight, coupled with changes in lipid metabolism, may be significant in terms of altering risk from cardiovascular morbidity.


Subject(s)
Kidney Transplantation/physiology , Weight Gain , Adult , Black or African American , Black People , Cholesterol/blood , Female , Humans , Male , Medical Records , Retrospective Studies , Sex Factors , Treatment Outcome , Triglycerides/blood , White People , Wisconsin
16.
Transplantation ; 56(4): 827-31, 1993 Oct.
Article in English | MEDLINE | ID: mdl-8212201

ABSTRACT

In January 1988, we initiated a prospective, randomized comparison of prophylactic antilymphoblast globulin (ALG; quadruple therapy) versus no prophylactic ALG (triple therapy) in the setting of immediate graft function (defined by a brisk diuresis and a 20% decline in serum creatinine within 24 hr). Recipients were stratified according to presence of diabetes and age greater or less than 50 years. Recipients on quadruple therapy (n = 61) received 7 days of prophylactic Minnesota ALG (5 mg/kg on day 1, 10 mg/kg on day 2, 20 mg/kg on days 3-7). CsA, 10 mg/kg/day, began on day 6. AZA began at 2.5 mg/kg/day and was adjusted according to white blood cell count. Recipients on triple therapy (n = 60) began immediate CsA, 10 mg/kg/day orally and AZA, 5 mg/kg/day, tapering to 2.5 mg/kg/day by day 8. Both groups received identical prednisone tapers beginning at 1 mg/kg/day, decreasing to 0.5 mg/kg/day by 2 weeks and to 0.15 mg/kg/day by 6 months. Demographic characteristics between groups were not different with respect to diabetes, age, sex, race, per cent panel-reactive antibodies (PRA), or HLA matching. Follow-up ranged from 2 to 4.5 years. Patient survival was 93% for the quadruple therapy group and 90% for triple therapy. Actuarial graft survival was 79% in the quadruple group and 72% in the triple group (P = 0.18). Graft loss due to rejection occurred in 6/61 receiving ALG versus 7/60 in the immediate CsA group. Three of 4 high PRA recipients in the immediate CsA group lost their grafts within 30 days compared with none in the ALG group. The mean time to graft loss was significantly longer for the quadruple therapy group (17 +/- 8 months) compared with the triple therapy group (4 +/- 5 months), P = 0.006. The total number of rejection episodes was similar for both groups (29/61 vs. 31/60), as was the number who were rejection free (51% vs. 47%). The use of OKT3 was also similar between groups (28% vs. 30%). 
The quadruple therapy group had a higher incidence of CMV infection: 20% vs. 7% (P < 0.05), but no grafts or patients were lost as a result. Serum Cr was not different at 1 and 12 months (1.5 and 1.6 vs. 1.6 and 1.7, respectively), nor were Cr clearances (63 and 68 vs. 60 and 63). Conclusion. Early initiation of oral CsA in the setting of immediate graft function is not associated with significant nephrotoxicity.(ABSTRACT TRUNCATED AT 400 WORDS)


Subject(s)
Graft Survival , Immunosuppressive Agents/therapeutic use , Kidney Transplantation/physiology , Actuarial Analysis , Adult , Aged , Antilymphocyte Serum/administration & dosage , Antilymphocyte Serum/therapeutic use , Azathioprine/administration & dosage , Azathioprine/therapeutic use , Cadaver , Creatinine/metabolism , Cyclosporine/administration & dosage , Cyclosporine/therapeutic use , Drug Therapy, Combination , Female , Follow-Up Studies , Graft Rejection , Humans , Immunosuppression Therapy/methods , Kidney Transplantation/immunology , Male , Middle Aged , Prednisone/administration & dosage , Prednisone/therapeutic use , Prospective Studies , Time Factors
17.
Transplantation ; 45(2): 380-5, 1988 Feb.
Article in English | MEDLINE | ID: mdl-3278431

ABSTRACT

Between September 1980 and June 1984, 246 splenectomized, transfused renal allograft recipients were stratified according to presence of diabetes and donor source, and randomized to treatment with either cyclosporine (CsA)-prednisone (pred) or antilymphoblast globulin (ALG)-azathioprine (AZA)-prednisone. As of August 1986, mean follow-up is 47 months. Overall, actuarial patient survival at 4 years is 84% and 83%, respectively. Corresponding graft survival is 70% and 63% for CsA-treated and ALG-AZA-treated patients (NS). Within the subgroup of diabetic recipients of cadaver grafts, graft survival is 70% for CsA-treated and 53% for ALG-AZA-treated recipients (P = .035). In the CsA group, 71% required either a significant reduction in CsA dosage with the addition of azathioprine or a complete switch to azathioprine, mainly because of CsA-associated nephrotoxicity. Among CsA patients switched, at a mean of 21.3 ± 16.4 months posttransplant with mean serum creatinine of 2.40 ± .67, current serum creatinine is 1.79 ± .63. Current mean serum creatinine values are significantly greater for patients randomized to CsA-pred (1.73 ± .60) vs. ALG-AZA-pred (1.49 ± .59), P = .014, even though most CsA-treated patients were eventually switched. The causes of graft loss are not different between CsA and ALG-AZA randomized patients. In nondiabetics, rejection is the most common cause of graft loss (17/33), whereas in diabetics, loss due to complications of overimmunosuppression or death from cardiovascular events is significantly more common (27/44) than in nondiabetics (6/33, P < .05). Switching does not seem to influence the incidence or cause of graft loss. Since most patients started on CsA-prednisone are ultimately switched to triple drug therapy, the latter is now the preferred initial treatment modality.


Subject(s)
Antilymphocyte Serum/therapeutic use , Azathioprine/therapeutic use , Cyclosporins/therapeutic use , Kidney Transplantation , Adolescent , Adult , Clinical Trials as Topic , Creatinine/blood , Diabetic Nephropathies/drug therapy , Diabetic Nephropathies/therapy , Drug Therapy, Combination , Female , Follow-Up Studies , Graft Rejection/drug effects , Graft Survival/drug effects , Humans , Male , Middle Aged , Random Allocation
18.
Transplantation ; 68(5): 635-41, 1999 Sep 15.
Article in English | MEDLINE | ID: mdl-10507481

ABSTRACT

INTRODUCTION: Short-term and long-term results of renal transplantation have improved over the past 15 years. However, there has been no change in the prevalence of recurrent and de novo diseases. A retrospective study was initiated through the Renal Allograft Disease Registry to evaluate the prevalence and impact of recurrent and de novo diseases after transplantation. MATERIALS AND METHODS: From October 1987 to December 1996, a total of 4913 renal transplants were performed on adults at the Medical College of Wisconsin, University of Cincinnati, University of California at San Francisco, University of Louisville, University of Washington, Seattle, and Washington University School of Medicine. The patients were followed for a minimum of 1 year. A total of 167 (3.4%) cases of recurrent and de novo disease were diagnosed by renal biopsy. These patients were compared with the patients who did not have recurrent and de novo disease (n=4746). There were more men (67.7% vs. 59.8%, P<0.035) and more re-transplants (17% vs. 11.5%, P<0.005) in the recurrent and de novo disease group. There was no difference in the rate of recurrent and de novo disease by transplant type (living related donor vs. cadaver, P=NS). Other demographic findings were not significantly different. Common forms of glomerulonephritis seen were focal segmental glomerulosclerosis (FSGS), 57; immunoglobulin A nephritis, 22; membranoproliferative glomerulonephritis (GN), 18; and membranous nephropathy, 16. Other diagnoses included diabetic nephropathy, 19; immune complex GN, 12; crescentic GN (vasculitis), 6; hemolytic uremic syndrome-thrombotic thrombocytopenic purpura (HUS/TTP), 8; systemic lupus erythematosus, 3; anti-glomerular basement membrane disease, 2; oxalosis, 2; and miscellaneous, 2. The diagnosis of recurrent and de novo disease was made a mean of 678 days after transplant.
During the follow-up period, there were significantly more graft failures in the recurrent disease group, 55% vs. 25%, P<0.001. The actuarial 1-, 2-, 3-, 4-, and 5-year kidney survival rates for patients with recurrent and de novo disease were 86.5%, 78.5%, 65%, 47.7%, and 39.8%, respectively. The corresponding survival rates for patients without recurrent and de novo disease were 85.2%, 81.2%, 76.5%, 72%, and 67.6%, respectively (log-rank test, P<0.0001). The median kidney survival for patients with and without recurrent and de novo disease was 1360 vs. 3382 days (P<0.0001). Multivariate analysis using the Cox proportional hazards model for graft failure was performed to identify risk factors. Cadaveric transplant, prolonged cold ischemia time, elevated panel-reactive antibody, and recurrent disease were identified as risk factors for allograft failure. The relative risk (95% confidence interval) for graft failure because of recurrent and de novo disease was 1.9 (1.57-2.40), P<0.0001. The relative risk for graft failure was 2.25 (1.6-3.1), P<0.0001, for posttransplant FSGS; 2.37 (1.3-4.2), P<0.003, for membranoproliferative glomerulonephritis; and 5.36 (2.2-12.9), P<0.0002, for HUS/TTP. There was higher graft failure (64.9%) and a shorter half-life (1244 days) in patients with recurrent FSGS. CONCLUSION: Recurrent and de novo disease are associated with poorer long-term survival, and the relative risk of allograft loss is doubled. A significant impact on graft survival was seen with recurrent and de novo FSGS, membranoproliferative glomerulonephritis, and HUS/TTP.
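Relative risks reported with 95% confidence intervals, as above, are typically computed on the log scale, so the interval should be symmetric around log(RR) and the geometric midpoint of the bounds should reproduce the point estimate. A back-of-envelope consistency check, assuming a standard Wald interval (the abstract does not state how the intervals were computed):

```python
import math

def wald_check(rr, lo, hi):
    """Recover the log-scale standard error and z-statistic from a relative
    risk and its 95% CI, assuming a symmetric Wald interval on the log scale."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    z = math.log(rr) / se
    # For a log-symmetric CI, sqrt(lo * hi) should reproduce the point estimate.
    return z, math.sqrt(lo * hi)

# RR 1.9 (1.57-2.40) for graft failure due to recurrent/de novo disease.
z, midpoint = wald_check(1.9, 1.57, 2.40)
print(f"z = {z:.2f}, geometric midpoint = {midpoint:.2f}")
```

Here z comes out near 5.9, well past the roughly 3.89 needed for two-sided P < 0.0001, and the geometric midpoint of 1.57 and 2.40 is close to 1.9, so the reported figures are internally consistent.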


Subject(s)
Kidney Diseases/etiology , Kidney Glomerulus , Kidney Transplantation , Postoperative Complications , Adult , Female , Graft Rejection/etiology , Humans , Kidney Diseases/complications , Kidney Diseases/epidemiology , Male , Prevalence , Recurrence , Registries , Retrospective Studies , Risk Factors , Survival Analysis
19.
Pediatrics ; 104(4 Pt 1): 978-81, 1999 Oct.
Article in English | MEDLINE | ID: mdl-10506246

ABSTRACT

Care coordination is a process that links children with special health care needs and their families to services and resources in a coordinated effort to maximize the potential of the children and provide them with optimal health care. Care coordination often is complicated because there is no single entry point to multiple systems of care, and complex criteria determine the availability of funding and services among public and private payers. Economic and sociocultural barriers to coordination of care exist and affect families and health care professionals. In their important role of providing a medical home for all children, primary care pediatricians have a vital role in the process of care coordination, in concert with the family.


Subject(s)
Case Management/organization & administration , Child Health Services/organization & administration , Delivery of Health Care, Integrated/organization & administration , Disabled Persons , Pediatrics , Child , Humans , Professional-Family Relations , Referral and Consultation , United States
20.
Am J Kidney Dis ; 31(6): 928-31, 1998 Jun.
Article in English | MEDLINE | ID: mdl-9631835

ABSTRACT

Recurrent or de novo glomerular disease is an important cause of graft dysfunction and eventual loss. Cyclosporine A (CyA) has improved short-term renal allograft outcome but has not altered long-term graft survival. The purpose of the current study was to determine the prevalence of such disease and its impact on graft function in the CyA era. From 1984 to 1994, 1,557 renal allografts were performed at the Medical College of Wisconsin and the University of Cincinnati. Patients were followed up for an average of 7.2 years (minimum, 1 year). Recurrent disease was diagnosed by renal biopsy in 98 (6.3%) patients after an average of 36 months. Demographic characteristics of patients with and without recurrent disease were similar. Glomerulonephritis was the most common finding, occurring in 73 patients, and included focal segmental glomerulosclerosis (FSGS), 25; IgA nephropathy (IgAN), 11; membranous nephropathy (MN), 11; proliferative glomerulonephritis, 11; membranoproliferative glomerulonephritis (MPGN), 10; anti-glomerular basement membrane disease (anti-GBM), 3; and systemic lupus erythematosus (SLE), 2. Diabetic nephropathy was present in 22, hemolytic uremic syndrome (HUS) in 2, and oxalosis in 1. Graft loss occurred in 60 of 98 (61%) recipients. Half-life of the allograft was diminished in patients with recurrent disease, 2,038 ± 225 versus 3,135 ± 385 days, P = 0.002. The actuarial allograft survival at 1, 3, 5, and 8 years posttransplantation with recurrence was 88%, 74%, 57%, and 34%, respectively; the corresponding graft survival for patients without recurrent disease was 80%, 70%, 64%, and 53%, respectively (P = 0.003). The risk of recurrent disease increased with length of graft survival, from 2.8% at 2 years to 9.8% and 18.5% at 5 and 8 years, respectively. We conclude that recurrent disease is a significant problem after renal transplantation and is associated with decreased graft survival.
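The actuarial survival figures quoted throughout these abstracts come from product-limit (Kaplan-Meier) estimation, which drops to the survival curve only at event times while censored grafts (still functioning at last follow-up) simply leave the risk set. A minimal sketch of the estimator on toy data, illustrative only and not the study data:

```python
def kaplan_meier(observations):
    """Product-limit survival estimate.

    observations: (time, event) pairs, event=1 for graft loss and
    event=0 for censoring. Returns (time, S(t)) at each event time.
    """
    observations = sorted(observations)
    at_risk = len(observations)
    surv, curve = 1.0, []
    for time, event in observations:
        if event:
            surv *= (at_risk - 1) / at_risk  # conditional survival at this event time
            curve.append((time, surv))
        at_risk -= 1  # events and censorings both shrink the risk set
    return curve

# Toy data: graft losses at years 1 and 3, one graft censored at year 2.
print(kaplan_meier([(1, 1), (2, 0), (3, 1)]))
```

Note how the censored graft at year 2 does not drop the curve itself but reduces the denominator for the year-3 loss, which is why actuarial survival differs from a naive fraction of grafts still functioning.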


Subject(s)
Kidney Diseases/etiology , Kidney Transplantation , Actuarial Analysis , Adult , Diabetic Nephropathies/surgery , Female , Glomerulonephritis/surgery , Graft Survival , Humans , Male , Recurrence , Registries , Retrospective Studies