ABSTRACT
BACKGROUND: Delirium in patients with critical limb ischemia (CLI) has been associated with increased mortality. The main goal of this study was to investigate the association between delirium and mortality in patients undergoing major lower limb amputation for CLI. In addition, other risk factors associated with mortality were analyzed. METHODS: An observational cohort study was conducted including all patients aged ≥70 years with CLI undergoing a major lower limb amputation between January 2014 and July 2017. Delirium was scored using the Delirium Observation Screening Score in combination with the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition. Risk factors for mortality were analyzed by calculating hazard ratios using a Cox proportional hazards model. RESULTS: In total, 95 patients were included, of whom 29 (31%) developed delirium during admission. Delirium was not associated with an increased risk of mortality (hazard ratio [HR] = 0.84; 95% confidence interval [CI]: 0.51-1.73; P = 0.84). Variables independently associated with an increased risk of mortality were age (HR 1.1; 95% CI 1.0-1.1), cardiac history (HR 3.3; 95% CI 1.8-6.1), current smoking (HR 2.9; 95% CI 1.6-5.5), preoperative anemia (HR 2.8; 95% CI 1.1-7.2), and living in a nursing home (HR 2.2; 95% CI 1.1-4.4). CONCLUSION: Delirium was not associated with an increased mortality risk in elderly patients with CLI undergoing a major lower limb amputation. Factors related to an increased mortality risk were age, cardiac history, current smoking, preoperative anemia, and living in a nursing home.
Subject(s)
Amputation, Surgical/mortality , Delirium/mortality , Ischemia/surgery , Lower Extremity/blood supply , Peripheral Arterial Disease/surgery , Age Factors , Aged , Aged, 80 and over , Amputation, Surgical/adverse effects , Critical Illness , Delirium/diagnosis , Delirium/psychology , Female , Humans , Incidence , Ischemia/diagnostic imaging , Ischemia/mortality , Ischemia/physiopathology , Male , Peripheral Arterial Disease/diagnostic imaging , Peripheral Arterial Disease/mortality , Peripheral Arterial Disease/physiopathology , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
ABSTRACT
OBJECTIVES: The aim of the study was to test the antiviral efficacy of a triple nucleoside reverse transcriptase inhibitor (NRTI) regimen, with potential beneficial metabolic effects, as maintenance therapy after induction with dual NRTIs and a boosted protease inhibitor (PI). METHODS: An open-label, noninferiority study was carried out. Antiretroviral therapy (ART)-naïve patients with CD4 count ≤ 350 cells/µL and HIV-1 RNA >30000 copies/mL (n=207) were treated with zidovudine/lamivudine and lopinavir/ritonavir. After achieving HIV-1 RNA <50 copies/mL on two consecutive occasions between weeks 12 and 24 after baseline, 120 patients (baseline: median HIV-1 RNA 5.19 log10 copies/mL; median CD4 count 180 cells/µL) were randomized to receive abacavir/lamivudine/zidovudine (ABC/3TC/ZDV) (n=61) or to continue the PI-based ART (n=59). RESULTS: For the proportions of patients (intention-to-treat; missing=failure) with HIV-1 RNA <400 copies/mL (PI group, 66%; ABC/3TC/ZDV group, 71%) and <50 copies/mL (PI group, 63%; ABC/3TC/ZDV group, 62%) at 96 weeks, switching to ABC/3TC/ZDV was noninferior compared with continuing the PI regimen; the difference in failure rate (ABC/3TC/ZDV minus PI) was -4.4 percentage points [95% confidence interval (CI) -21.0 to +12.3 percentage points] and +0.4 percentage points (95% CI -16.9 to +17.7 percentage points), respectively. In the per protocol analysis, the difference in virological failure for HIV-1 RNA >400 copies/mL (0 of 39 patients in the PI group and two of 45 patients in the NRTI group) and for HIV-1 RNA >50 copies/mL (two of 39 and three of 45 patients, respectively) was +4.4 percentage points (95% CI -2.1 to +11.0 percentage points) and +1.5 percentage points (95% CI -8.6 to +11.7 percentage points), respectively, also showing noninferiority. Serum lipids significantly improved in the NRTI group, but not in the PI arm. CONCLUSIONS: A single-class NRTI regimen after successful induction with standard ART had similar antiviral efficacy compared to continuation of a PI-based regimen at 96 weeks after baseline, with improved serum lipids.
Subject(s)
Anti-HIV Agents/administration & dosage , Dideoxynucleosides/administration & dosage , HIV Infections/drug therapy , Lamivudine/administration & dosage , Zidovudine/administration & dosage , Adult , Aged , Belgium/epidemiology , CD4 Lymphocyte Count , Clinical Protocols , Disease Progression , Drug Administration Schedule , Drug Combinations , Drug Therapy, Combination , Female , HIV Infections/immunology , HIV Protease Inhibitors , HIV-1/immunology , Humans , Lipids , Male , Middle Aged , Netherlands/epidemiology , Prospective Studies , RNA, Viral/drug effects , Treatment Outcome , Viral Load
ABSTRACT
OBJECTIVES: To determine the prevalence, trends, and potential nosocomial transmission events of the hidden reservoir of rectal carriage of extended-spectrum beta-lactamase-producing Enterobacterales (ESBL-E). METHODS: From 2013 to 2022, yearly point prevalence surveys were conducted in a large Dutch teaching hospital. On the day of the survey, all admitted patients were screened for ESBL-E rectal carriage using peri-anal swabs and a consistent and sensitive selective culturing method. All Enterobacterales phenotypically suspected of ESBL production were analysed using whole genome sequencing for ESBL gene detection and clonal relatedness analysis. RESULTS: On average, the ESBL-E prevalence was 4.6% (188/4,119 patients), ranging from 2.1 to 6.6% per year. The ESBL-E prevalence decreased by an average of 5.5% per year. After correction for this time trend, the prevalence in 2016 and 2020 was lower than in the other years. Among the ESBL-E, Escherichia coli (80%) and CTX-M genes (85%) predominated. Potential nosocomial transmission events were found in 5.9% (11/188) of the ESBL-E carriers. CONCLUSIONS: The ESBL-E rectal carriage prevalence among hospitalized patients was 4.6%, with a downward trend from 2013 to 2022. The decrease in ESBL-E prevalence in 2020 could have been due to the COVID-19 pandemic and the subsequent countrywide measures, as no nosocomial transmission events were detected in 2020. However, the persistently low ESBL-E prevalences in 2021 and 2022 suggest that the decline goes beyond the COVID-19 pandemic, indicating that overall ESBL-E carriage rates are declining over time. Continuous monitoring of ESBL-E prevalence and transmission rates can aid infection control policy to keep antibiotic resistance rates in hospitals low.
Subject(s)
Carrier State , Cross Infection , Enterobacteriaceae Infections , Enterobacteriaceae , Hospitals, Teaching , Whole Genome Sequencing , beta-Lactamases , Humans , beta-Lactamases/genetics , Netherlands/epidemiology , Prevalence , Enterobacteriaceae Infections/epidemiology , Enterobacteriaceae Infections/microbiology , Enterobacteriaceae Infections/transmission , Carrier State/epidemiology , Carrier State/microbiology , Male , Female , Enterobacteriaceae/genetics , Enterobacteriaceae/drug effects , Enterobacteriaceae/enzymology , Aged , Cross Infection/epidemiology , Cross Infection/microbiology , Middle Aged , Adult , Rectum/microbiology , Aged, 80 and over , Young Adult
ABSTRACT
BACKGROUND: A radiographic fat pad sign after an elbow injury in children may indicate an occult fracture. Different incidences and locations of occult fractures have been reported. The primary objective of this meta-analysis was to assess the overall rate of occult fractures in children with a positive fat pad sign from the data of original studies. Secondary objectives were to assess the fracture types and to identify risk factors for sustaining an occult fracture. METHODS: A systematic literature search of the Embase, MEDLINE, and Cochrane databases was performed according to PRISMA guidelines. Studies on pediatric populations with a positive fat pad sign identified using a lateral elbow radiograph and with follow-up imaging were included in this meta-analysis. Included studies were assessed for risk of bias with use of the MINORS (Methodological Index for NOn-Randomized Studies) instrument. RESULTS: Ten studies with a total of 250 patients, of whom 104 had an occult fracture, were included. Accounting for heterogeneity between the studies, the overall occult fracture rate was 44.6% (95% confidence interval: 30.4% to 59.7%). The most common fracture locations were the supracondylar humerus (43%), proximal ulna (19%), proximal radius (17%), and lateral humeral condyle (14%). Definitions of a positive fat pad sign were not uniform among studies, and the follow-up imaging modality also varied (radiography, magnetic resonance imaging, or computed tomography). The average MINORS score was 10.1 for the 7 noncomparative studies and 18.7 for the 3 comparative studies, with both averages classified as moderate quality. We were not able to identify risk factors for an occult fracture in the presence of a positive fat pad sign. CONCLUSIONS: The occult fracture rate was 44.6% in pediatric elbow injuries with a positive fat pad sign. Supracondylar humeral fractures were the most frequently encountered type. The findings of this meta-analysis underline the potential clinical relevance of a positive fat pad sign in children and point to the opportunity for future studies to create evidence-based guidelines. LEVEL OF EVIDENCE: Level II. See Instructions for Authors for a complete description of levels of evidence.
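The pooled 44.6% rate implies that the study-level proportions were combined under a random-effects model. The abstract does not state which estimator was used, so the DerSimonian-Laird form below is only an illustrative assumption of how "accounting for heterogeneity" is commonly done, with $\theta_i$ the per-study (transformed) fracture proportion and $v_i$ its within-study variance:

$$\hat{\theta}_{RE} = \frac{\sum_i w_i^{*}\,\theta_i}{\sum_i w_i^{*}}, \qquad w_i^{*} = \frac{1}{v_i + \hat{\tau}^2}, \qquad \hat{\tau}^2 = \max\!\left(0,\; \frac{Q - (k-1)}{\sum_i w_i - \sum_i w_i^{2}/\sum_i w_i}\right),$$

where $w_i = 1/v_i$, $Q = \sum_i w_i(\theta_i - \hat{\theta}_{FE})^2$ and $k$ is the number of studies; the between-study variance $\hat{\tau}^2$ is what widens the 30.4% to 59.7% confidence interval relative to a fixed-effect pooling.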
Subject(s)
Elbow Injuries , Elbow Joint , Fractures, Closed , Humeral Fractures , Humans , Child , Fractures, Closed/diagnosis , Fractures, Closed/pathology , Elbow/diagnostic imaging , Elbow Joint/diagnostic imaging , Humeral Fractures/diagnostic imaging , Adipose Tissue
ABSTRACT
OBJECTIVE: To investigate the efficacy of A-V impulse technology (A-V) for oedema prevention and treatment following PTFE femoropopliteal surgery. DESIGN: Prospective randomized clinical trial. MATERIALS: 36 patients undergoing PTFE femoropopliteal bypass reconstructions, treated postoperatively either with a compression stocking (CS) (group 1, n = 19) or with A-V (group 2, n = 17). METHODS: Patients in group 1 used a CS postoperatively, day and night, for 1 week; patients in group 2 were treated with A-V postoperatively at night for 1 week. The lower leg circumference was measured preoperatively and at five postoperative time points. RESULTS: Limb circumference increased postoperatively on day 1 (CS 1.5%/A-V 1.4%), day 4 (5.7%/6.3%), day 7 (6.6%/6.1%), day 14 (7.9%/7.7%) and day 90 (5.8%/5.2%). Differences between treatment groups were not significant. A re-operation gave a significant 3.9% increase in circumference compared with a first operation (95% CI: 1.5-6.4%; p = 0.002). CONCLUSION: No significant differences were found between the groups in the extent of oedema that developed following PTFE femoropopliteal bypass surgery. A redo peripheral bypass operation results in significantly more postoperative oedema than a first-time bypass operation.
Subject(s)
Blood Vessel Prosthesis Implantation/adverse effects , Edema/therapy , Intermittent Pneumatic Compression Devices , Lower Extremity/blood supply , Peripheral Arterial Disease/surgery , Adult , Aged , Aged, 80 and over , Biocompatible Materials , Edema/etiology , Female , Femoral Artery/surgery , Humans , Lower Extremity/surgery , Male , Middle Aged , Polytetrafluoroethylene , Popliteal Artery/surgery , Prospective Studies , Reoperation , Stockings, Compression
ABSTRACT
BACKGROUND: Besides short stature, gonadal dysgenesis leading to a lack of oestrogen is one of the main characteristics of Turner syndrome (TS). In most TS girls, puberty is induced with exogenous oestrogens. OBJECTIVE: To describe the pubertal development and uterine dimensions achieved by low-dose 17beta-oestradiol (17beta-E2) given orally, started at an appropriate age. Additionally, to determine whether serum hormone levels aid evaluation of pubertal progression. DESIGN: In 56 TS girls, we prospectively studied pubertal stage, serum E2, LH, FSH, SHBG and oestrone (E1), starting oestrogen treatment with low-dose 17beta-E2 (5 microg/kg/day) during GH treatment at mean (SD) age 12.7 (0.7) years. Hormone levels were measured at the start, 3 months after the start and after increasing the 17beta-E2 dosage. Uterine dimensions were measured in 39 TS women at age 19.9 (2.2) years. RESULTS: Breast and pubic hair development progressed similarly to that of normal Dutch girls up to Tanner stages B5 and P5, respectively, although breast development occurred approximately 2 years later. Before oestrogen therapy, E2 levels were comparable to those in prepubertal girls. With a 17beta-E2 dose of 5 microg/kg/day, these levels increased significantly, becoming comparable to normal late pubertal or adult concentrations, whereas SHBG levels were unchanged. At the adult 17beta-E2 dose, SHBG had increased significantly. The uterus was juvenile in shape in four (10.2%), cylindrical in four (10.2%) and mature-adult shaped in 31 (79.5%) of the TS patients. CONCLUSIONS: During GH treatment in TS girls, normal breast development up to B5 can be mimicked, with just a 2-year delay. In a clinical setting, serum hormone levels provide no additional information for evaluating pubertal progression. After age-appropriate pubertal induction, uterine dimensions in women aged nearly 20 years were subnormal. It remains unclear whether this was related to E2 dosage, timing or duration, or to factors related to TS.
Subject(s)
Estrogens/blood , Estrogens/pharmacology , Puberty/drug effects , Sex Characteristics , Turner Syndrome/metabolism , Turner Syndrome/pathology , Uterus/pathology , Administration, Oral , Adolescent , Breast/drug effects , Breast/growth & development , Child , Cross-Sectional Studies , Dose-Response Relationship, Drug , Estradiol/administration & dosage , Estradiol/pharmacology , Estradiol/therapeutic use , Estrogens/administration & dosage , Estrone/blood , Female , Follicle Stimulating Hormone/blood , Follow-Up Studies , Humans , Luteinizing Hormone/blood , Prospective Studies , Sex Hormone-Binding Globulin/metabolism , Turner Syndrome/drug therapy , Uterus/drug effects , Young Adult
ABSTRACT
SUMMARY: In women older than 60 years with clinical risk factors for osteoporosis but without osteoporosis based on bone mineral density (T-score ≥ -2.5), a systematic survey with X-rays of the spine identified previously unknown vertebral deformities in 21% of women. INTRODUCTION: This study determines the prevalence of vertebral deformities in elderly women with clinical risk factors for osteoporosis but with BMD values above the threshold for osteoporosis (T-score ≥ -2.5). METHODS: Bisphosphonate-naïve women older than 60 years attending 35 general practices in the Netherlands with ≥2 clinical risk factors for osteoporosis were invited for BMD measurement (DXA). In women with a T-score ≥ -2.5 at both the spine and the hips, lateral radiographs of the thoracic and lumbar spine were performed. RESULTS: Of 631 women with a DXA measurement, 187 (30%) had osteoporosis (T-score < -2.5 at the spine or the hip). Of the remaining 444 women with a T-score ≥ -2.5 at both spine and hip, 387 had additional spine radiographs, of whom 80 (21%) had at least one vertebral deformity. CONCLUSION: In elderly women with clinical risk factors for osteoporosis but a BMD T-score ≥ -2.5, the addition of spine radiographs identified vertebral deformities in 21% (95% CI: 17-25%). Since these women are at risk of future fractures, antiosteoporotic treatment should be considered.
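For context, the densitometric T-score on which the ≥ -2.5 inclusion threshold is based is conventionally defined against a young-adult reference population; this standard definition is not given in the abstract and is added here only for clarity:

$$T = \frac{\mathrm{BMD}_{\text{patient}} - \overline{\mathrm{BMD}}_{\text{young adult}}}{\mathrm{SD}_{\text{young adult}}}$$

For example, a hypothetical patient with a measured BMD of 0.85 g/cm2, a young-adult reference mean of 1.00 g/cm2 and a reference SD of 0.10 g/cm2 has T = -1.5 and would therefore have fallen in the non-osteoporotic group studied here.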
Subject(s)
Osteoporosis, Postmenopausal/diagnostic imaging , Spinal Curvatures/diagnostic imaging , Spinal Fractures/diagnostic imaging , Absorptiometry, Photon , Aged , Aged, 80 and over , Bone Density , False Negative Reactions , Female , Humans , Middle Aged , Osteoporosis, Postmenopausal/complications , Osteoporosis, Postmenopausal/physiopathology , Patient Selection , Prospective Studies , Risk Factors , Spinal Curvatures/etiology , Spinal Curvatures/physiopathology , Spinal Fractures/etiology , Spinal Fractures/physiopathology
ABSTRACT
In recent years, the atopy patch test (APT) has been suggested as an addition to the allergological work-up of children with atopic dermatitis (AD) and suspected food allergy. We initiated a prospective clinical study in children with AD younger than 3 yr to evaluate the additional clinical value of the APT next to our own standardized allergological work-up in cases of suspected food allergy. One hundred and thirty-five children were included in the study. They were tested using the skin application food test (SAFT), the APT and measurement of specific IgE. The allergens used in the skin tests were freshly prepared foodstuffs and included commercially available cow's milk (CM), the egg white of a hard-boiled hen's egg and mashed peanuts in a saline solution. Allergy was defined using a flowchart incorporating the results from the SAFT, oral challenges (OCs) and elimination and (re)introduction periods. To determine the additional value of the APT next to the SAFT, we analyzed the SAFT-negative patients per allergen and used an exact binary logistic analysis to evaluate the simultaneous effects of the APT and measurement of specific IgE, calculating mutually adjusted odds ratios (ORs) for positive APTs and specific IgE levels above 0.70 U/l. We found clinically relevant food allergies in 23% (egg white) to 28% (CM and peanut) of our study population. Positive SAFT reactions were observed in 14% (peanut), 16% (egg white) and 21% (CM) of our patient population. Next to the SAFT, we did not observe a significant additional value of the APT for the diagnosis of CM or egg white allergy, but we did find a significant additional value for the diagnosis of peanut allergy (OR = 11.56; p < 0.005, 2-sided). In clinical practice, this statistically significant value does not exclude the need for OCs and controlled elimination and (re)introduction periods, due to the presence of false-negative as well as false-positive results in the APT. In conclusion, we could not find enough support for the current addition of the APT to our standardized allergological work-up in young children below the age of 3 yr with AD and suspected food allergy. At the moment, the additional value of the classical delayed-type APT next to the SAFT seems to be very limited at best in this study population and does not justify the time-consuming nature of the skin test.
Subject(s)
Allergens/immunology , Dermatitis, Atopic/diagnosis , Food Hypersensitivity/diagnosis , Patch Tests/standards , Animals , Arachis/immunology , Child, Preschool , Dermatitis, Atopic/immunology , Egg Proteins/immunology , False Positive Reactions , Food Hypersensitivity/immunology , Humans , Hypersensitivity, Delayed/diagnosis , Hypersensitivity, Delayed/immunology , Hypersensitivity, Immediate/diagnosis , Hypersensitivity, Immediate/immunology , Immunoglobulin E/blood , Infant , Infant, Newborn , Milk/immunology , Prospective Studies
ABSTRACT
With respect to its pharmacological characteristics, venlafaxine is comparable with tricyclic antidepressants (TCAs), and it might therefore be comparable in efficacy. We performed a systematic review investigating the relative efficacy and tolerability of venlafaxine compared with TCAs (imipramine, clomipramine, amitriptyline, nortriptyline and desipramine). Relevant double-blind randomised trials were identified from systematic searches of electronic databases. An exact analysis of the estimated odds ratios of response for the TCAs relative to venlafaxine showed no overall significant treatment effect (P = 0.38). The odds ratios were not homogeneous across studies (P = 0.0213). The average dose of venlafaxine was 103.5 mg/day and that of the TCAs 106.1 mg/day. An exact analysis of the estimated odds ratios of withdrawals and side effects in the trials with a TCA relative to venlafaxine showed no overall significant difference in withdrawals. From our review, no significant difference in treatment effect between low doses of venlafaxine and the TCAs could be found. In our opinion, because of the heterogeneity of the odds ratios, one cannot conclude that they are of equal efficacy.
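The per-trial effect measure underlying this analysis is the odds ratio of response; its standard 2x2 definition (not restated in the abstract, added here for clarity) is

$$\mathrm{OR} = \frac{a/b}{c/d} = \frac{a\,d}{b\,c},$$

where $a$ and $b$ are responders and non-responders on the TCA and $c$ and $d$ are responders and non-responders on venlafaxine. The exact analysis then combines these trial-level odds ratios across studies, and the reported heterogeneity (P = 0.0213) means the trial-level values differ by more than chance alone would explain.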
Subject(s)
Antidepressive Agents, Second-Generation/therapeutic use , Antidepressive Agents, Tricyclic/therapeutic use , Cyclohexanols/therapeutic use , Depressive Disorder/drug therapy , Antidepressive Agents, Second-Generation/adverse effects , Antidepressive Agents, Tricyclic/adverse effects , Cyclohexanols/adverse effects , Depressive Disorder/psychology , Humans , Odds Ratio , Randomized Controlled Trials as Topic , Research Design , Treatment Outcome , Venlafaxine Hydrochloride
ABSTRACT
BACKGROUND: Culture can have a considerable influence on the way in which a depression is experienced, expressed or presented. Strict Calvinists or reformed pietists form an orthodox protestant cultural minority in the Netherlands. This orthodox wing of the Dutch Reformed Churches places a strong emphasis on personal religious experience of God's work of conversion. It is possible that symptoms of depression in this group differ somewhat from such symptoms in non-affiliated depressed patients. AIM: To determine whether depressive symptoms in strict Calvinistic patients differ from those in non-affiliated patients. METHOD: Seventy depressed adult Dutch nationals receiving treatment as outpatients under the mental health service were asked to fill in a depression self-rating questionnaire, the Beck Depression Inventory II (BDI-II). A comparison was made between the total scores and symptom cluster scores of the strict Calvinists and the corresponding scores of the non-affiliated patients. RESULTS: The strict Calvinists had a lower total score on the BDI-II than the non-affiliated patients, and their scores were particularly lower for the symptom clusters of suicidality and restrictions in functioning. CONCLUSION: Strict Calvinists differed from the non-affiliated patients in the way in which they presented on a depression self-rating questionnaire during depression. Perhaps strict Calvinists have less chance of being diagnosed and treated at an early stage because they conceal their depression and struggle on for a longer time. The study shows that insight into the religious background of Dutch patients can be important for accurate psychiatric diagnostics.
Subject(s)
Depressive Disorder/diagnosis , Depressive Disorder/psychology , Religion and Psychology , Adolescent , Adult , Aged , Case-Control Studies , Depressive Disorder/epidemiology , Female , Humans , Male , Middle Aged , Psychiatric Status Rating Scales , Severity of Illness Index , Surveys and Questionnaires , Young Adult
ABSTRACT
BACKGROUND: When patients with cardiovascular disorders undergo electroconvulsive therapy (ECT) they sometimes have to be treated for tachycardia and high blood pressure. AIM: To describe the effects of beta-blockers on seizure duration and cardiovascular variables in patients undergoing ECT. METHOD: Search for studies in Medline, with the keywords 'beta-adrenergic blocking agents' and 'electroconvulsive therapy'. Only articles based on randomised placebo-controlled investigations were included. RESULTS: The search strategy produced 21 articles. These were assessed by all authors. Esmolol was the drug administered in most of the trials. Since seizure duration can influence the therapeutic effect of ECT, it is advisable to use bilateral electrode placement in patients with cardiovascular risk factors and to administer esmolol prior to seizure induction. CONCLUSION: The beta-blocker of choice for use during ECT seems to be esmolol; it can shorten seizure duration, although this effect is probably dose-dependent. Esmolol is also the drug of choice in ECT sessions for patients without cardiovascular risk factors who develop prolonged hypertension or tachycardia. A possible alternative is labetalol, but its longer half-life is a disadvantage, particularly if it is administered in a high dose. So far, experience with landiolol is limited, but its short half-life, greater cardioselectivity and higher potency mean that it could be a promising alternative.
Subject(s)
Adrenergic beta-Antagonists/administration & dosage , Electroconvulsive Therapy , Seizures/prevention & control , Blood Pressure/drug effects , Cardiovascular Diseases/complications , Dose-Response Relationship, Drug , Heart Rate/drug effects , Humans , Propanolamines/administration & dosage , Randomized Controlled Trials as Topic , Time Factors
ABSTRACT
Immunotherapy with interferon-alpha (IFN-alpha) induces neuropsychiatric side effects, most notably depression. In hepatitis patients treated with IFN-alpha, severity of depression correlates with a decrease in serum activity of dipeptidyl peptidase IV (DPP-IV, EC 3.4.14.5), a membrane-bound protease involved in the cleavage of cytokines and neuroactive peptides. Abnormal serum activity of the cytosolic peptidase prolyl endopeptidase (PEP, EC 3.4.21.26, postprolyl cleaving enzyme, prolyl oligopeptidase) has been documented in patients with a variety of psychiatric disorders, most consistently in mood disorders. The serum activity of PEP and DPP-IV was measured before and after 4 weeks of high-dose induction treatment with IFN-alpha in 18 patients with high-risk melanoma. In this exploratory study, we show a clear decrease in the serum activity of PEP after 4 weeks of treatment with IFN-alpha. This decrease was not related to changes in hematologic parameters. In contrast, serum activity of DPP-IV did not change. Further studies focusing on a possible role of PEP in the pathophysiology of IFN-alpha-induced depression are warranted.
Subject(s)
Depression/blood , Dipeptidyl Peptidase 4/blood , Interferon-alpha/administration & dosage , Melanoma/blood , Serine Endopeptidases/blood , Depression/etiology , Depression/physiopathology , Dose-Response Relationship, Drug , Female , Hepatitis/psychology , Hepatitis/therapy , Humans , Immunotherapy/adverse effects , Interferon-alpha/adverse effects , Male , Melanoma/drug therapy , Melanoma/pathology , Melanoma/psychology , Mood Disorders/blood , Prolyl Oligopeptidases
ABSTRACT
BACKGROUND: The results of renal transplantation depend on many variables. To simplify the decision process related to a kidney offer, the authors investigated which variables had the most important influence on the graft failure risk. METHODS: All patients who underwent transplantation between January 1981 and July 2000 (n=1,124) were included in the analysis (2.6% had missing values). The variables included were donor and recipient age and gender, recipient original disease, race, donor origin, current smoking, cardiovascular disease, body weight, peak and current panel reactive antibody (PRA), number of preceding transplants, type and duration of renal replacement therapy, and time since failure of native kidneys. Also included were human leukocyte antigen (HLA) identity or not, first and second warm and cold ischemia times, left or right kidney and fossa, donor kidney anatomy, donor serum creatinine and proteinuria, and transplantation year. RESULTS: In a multivariate model, cold ischemia time and its time-dependent variable significantly influenced the graft failure risk censored for death (P<0.0001), independent of any of the other risk factors. The influence primarily affected the risk in the first week after transplantation; thereafter, it gradually disappeared during the first year after transplantation. Donor serum creatinine also significantly influenced the graft failure risk in a time-dependent manner (P<0.0001). The risk associated with a high donor serum creatinine is already increased in the immediate postoperative phase and grows thereafter; the risk curve is closely related to the degree of the elevation. The other variables with a significant influence on the graft failure rate were, in order of decreasing significance, recipient age, donor gender, donor age, HLA identity, transplantation year, preceding transplantations, donor origin, and peak PRA. CONCLUSIONS: Donor serum creatinine and cold ischemia time are important time-dependent variables that independently influence the risk of graft failure censored for death. The best strategy for improving the results of cadaveric transplantations is to decrease the cold ischemia time and to allocate kidneys from donors with an elevated serum creatinine to low-risk recipients.
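A time-dependent effect of the kind reported here (strong in the first week, fading over the first year) is commonly modelled in a Cox framework by adding an interaction between the covariate and a function of follow-up time. The abstract does not specify which functional form the authors used, so the log-time interaction below is only an illustrative assumption:

$$h(t \mid \mathrm{CIT}, \mathbf{z}) = h_0(t)\,\exp\!\big(\beta_1\,\mathrm{CIT} + \beta_2\,\mathrm{CIT}\cdot\log t + \boldsymbol{\gamma}^{\top}\mathbf{z}\big),$$

where CIT is the cold ischemia time and z the remaining covariates. The hazard ratio per unit of CIT at time t is then exp(β1 + β2 log t), so a negative β2 makes an initially harmful effect shrink toward 1 as follow-up progresses, matching the pattern described.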
Subject(s)
Creatinine/blood , Ischemia , Kidney Transplantation/mortality , Organ Preservation , Adult , Cold Temperature , Female , Humans , Male , Middle Aged , Multivariate Analysis , Proportional Hazards Models , Risk Factors , Time Factors , Tissue Donors , Treatment Failure
ABSTRACT
When renal transplantation was still in its infancy, failures were more prevalent and successes could be directly derived from facts and events. Because results have improved dramatically over the last decades and many factors seem to have been involved in these continuously improving results, it is difficult to ascertain the individual contribution of each factor. Survival analysis is the appropriate method for evaluating factors that influence the results of renal transplantation. In this overview, two different methods for survival analysis are described and compared. The Kaplan-Meier analysis is the oldest and the most frequently used in renal transplantation epidemiology. Important shortcomings of this method are described and substantiated with examples. The Cox proportional hazards (PH) analysis was developed in 1972 by Sir David Cox. With this multivariable analysis it is possible to identify those variables that influence the rate of failure. With this method, the influences of all other variables in the model are taken into consideration, and adjustment for interaction with other variables or with time can be made. In this article, the Cox analysis and the statistical terms that go with it are described in words, and examples are given. In a complex, observational study concerning a multifactor-influenced population such as the renal transplant population, the use of the Cox model is mandatory to unravel the influences of the different variables on the failure rate.
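To make the comparison concrete, the sketch below shows how both methods would typically be run on a graft-survival dataset in Python with the lifelines package. This is a minimal illustration, not the analysis from the article; the file name and column names are hypothetical placeholders.

import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical dataset: one row per transplant, with follow-up time in months,
# an event indicator (1 = graft failure, 0 = censored) and candidate risk factors.
df = pd.read_csv("grafts.csv")  # columns: months, failed, donor_age, recipient_age, cold_ischemia_h, living_donor

# Kaplan-Meier: unadjusted survival curves, one stratum at a time.
km = KaplanMeierFitter()
for label, group in df.groupby("living_donor"):
    km.fit(group["months"], event_observed=group["failed"], label=f"living_donor={label}")
    print(label, km.median_survival_time_)

# Cox proportional hazards: all covariates entered simultaneously, so each
# hazard ratio is adjusted for the other variables in the model.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="failed")
cph.print_summary()        # hazard ratios with confidence intervals
cph.check_assumptions(df)  # checks the proportional hazards assumption

The Kaplan-Meier curves answer "what fraction of grafts survives to time t in this subgroup", whereas the Cox fit answers "how much does each factor shift the failure rate when the other factors are held constant", which is the distinction the overview draws between the two methods.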
Subject(s)
Proportional Hazards Models , Humans , Kidney Transplantation , Survival Analysis
ABSTRACT
BACKGROUND: The results of living-donor (LD) renal transplantations are better than those of postmortem-donor (PMD) transplantations. To investigate whether this can be explained by a more favorable patient selection procedure in the LD population, we performed a Cox proportional hazards analysis including variables with a known influence on graft survival. METHODS: All patients who underwent transplantations between January 1981 and July 2000 were included in the analysis (n=1,124, 2.6% missing values). There were 243 LD transplantations (including 30 unrelated) and 881 PMD transplantations. The other variables included were the following: donor and recipient age and gender, recipient original disease, race, current smoking habit, cardiovascular disease, body weight, peak and current panel reactive antibody, number of preceding transplants and type and duration of renal replacement therapy, and time since failure of native kidneys. In addition, the number of human leukocyte antigen identical combinations, first and second warm and cold ischemia periods, left or right kidney and fossa, donor kidney anatomy, donor serum creatinine and proteinuria, and transplantation year were included. RESULTS: In a multivariate model, donor origin (PMD vs. LD) significantly influenced the graft failure risk censored for death independently of any of the other risk factors (P=0.0303, relative risk=1.75). There was no time interaction. When the variable cold ischemia time was excluded from the same model, the significance of the influence of donor origin on the graft failure risk increased considerably, whereas the magnitude of the influence was comparable (P=0.0004, relative risk=1.92). The influence of all other variables on the graft failure risk was unaffected when the cold ischemia period was excluded. Excluding any of the other variables did not result in a comparable effect. Donor origin did not influence the death risk. CONCLUSION: The superior results of LD versus PMD transplantations can be partly explained by the dichotomy in the cold ischemia period in these populations (selection). However, after adjustment for cold ischemia periods, the influence of donor origin still remained significant, independent of any of the variables introduced. This superiority is possibly caused by factors inherent to the transplanted organ itself, for example, the absence of brain death and cardiovascular instability of the donor before nephrectomy.
Subject(s)
Cadaver , Graft Survival/physiology , Kidney Transplantation/physiology , Living Donors , Organ Preservation/methods , Tissue Donors , Adult , Female , Humans , Isoantibodies/blood , Kidney Transplantation/mortality , Male , Middle Aged , Multivariate Analysis , Patient Selection , Renal Replacement Therapy , Retrospective Studies , Treatment Outcome
ABSTRACT
The aim of this study was to compare the efficacy and safety of the salmeterol/fluticasone propionate combination product (SFC) with fluticasone propionate (FP) plus oral montelukast (M) over 12 weeks in symptomatic asthma patients. The study was a multinational, randomised, double-blind, double-dummy, parallel-group design in patients aged ≥15 years. After a 4-week run-in during which all patients received FP 100 microg twice daily, patients were randomised to inhaled SFC (50/100 microg) twice daily or inhaled FP 100 microg twice daily plus oral M 10 mg once daily. Patients kept daily records of their peak expiratory flow (PEF), symptom scores and use of rescue medication. Over the 12-week treatment period, the adjusted increase in mean morning PEF was significantly greater in the SFC group (36 l/min) than in the FP/M group (19 l/min; P < 0.001). The improvement in FEV1 was also significantly greater in the SFC group (mean treatment difference 0.11 l; P < 0.001). SFC provided significantly better control of daytime and night-time symptoms, and there were fewer exacerbations. Patients in the SFC group were also significantly more likely to have a rescue-free day. Both treatments were equally well tolerated. Combination therapy with FP plus salmeterol (SFC) produced significantly greater improvements in lung function and asthma control than the addition of montelukast to FP.
Subject(s)
Acetates/administration & dosage , Albuterol/analogs & derivatives , Albuterol/administration & dosage , Androstadienes/administration & dosage , Asthma/drug therapy , Quinolines/administration & dosage , Administration, Oral , Adolescent , Adult , Aged , Cyclopropanes , Double-Blind Method , Drug Therapy, Combination , Female , Fluticasone , Forced Expiratory Volume/drug effects , Humans , Male , Middle Aged , Peak Expiratory Flow Rate/drug effects , Salmeterol Xinafoate , Sulfides , Treatment Outcome
ABSTRACT
The purpose of the study was to investigate possible changes in the prevalence of STDs and HIV in data collected at a Dutch STD clinic in the period 1996 to 2000. Age, gender, ethnic background, sexual preference, intravenous drug use and STD or HIV infection in persons attending an STD outpatient clinic were analysed and compared. The prevalence of HIV infection among the clinic visitors remained stable. The prevalence of Neisseria gonorrhoeae and Chlamydia trachomatis infections increased significantly among heterosexual men and heterosexual women. Among homosexual and bisexual men, a significant increase was seen in chlamydial infections only. Because of the increasing prevalence of gonococcal and chlamydial infections among STD clinic visitors in Rotterdam, more attention should be paid to coordinated preventive activities, such as health education and contact tracing. Further subgroup analyses should be done in order to obtain more information on risk behaviour in the different groups.
Subject(s)
Ambulatory Care Facilities , HIV Infections/epidemiology , Sexually Transmitted Diseases, Bacterial/epidemiology , Bisexuality , Chlamydia Infections/epidemiology , Chlamydia Infections/microbiology , Chlamydia trachomatis , Female , Gonorrhea/epidemiology , Gonorrhea/microbiology , HIV Infections/virology , Homosexuality , Humans , Male , Neisseria gonorrhoeae , Netherlands/epidemiology , Prevalence , Sexually Transmitted Diseases, Bacterial/etiology , Substance Abuse, Intravenous
ABSTRACT
Mucosal inflammatory cellular infiltrates are correlated with nasal complaints in symptomatic allergic rhinitis. Some authors suggest inflammation of a neurogenic or immunogenic nature as an underlying disorder for idiopathic rhinitis (IR). We looked at the possible involvement of inflammatory cells in the pathogenesis of IR. Nasal biopsies were taken from sixty-five IR patients with significant nasal complaints and from twenty healthy controls with no nasal complaints. Inflammatory cells were quantified using monoclonal antibodies directed against lymphocytes, antigen-presenting cells, eosinophils, macrophages, monocytes, mast cells and other IgE-positive cells. No significant differences were found, for any cell, between IR patients and controls. We conclude that inflammatory cells do not seem to play an important role in this meticulously characterised group of IR patients.
Subject(s)
Antigens, CD/physiology , Nasal Mucosa/physiopathology , Rhinitis/physiopathology , Adolescent , Adult , Antibodies, Monoclonal , Antigens, CD/analysis , Cell Count , Female , Humans , Immunohistochemistry , Male , Middle Aged , Nasal Mucosa/metabolism , Rhinitis/pathology , Statistics, Nonparametric
Subject(s)
Allergens , Dermatitis, Atopic/diagnosis , Dermatitis, Atopic/etiology , Patch Tests , Child, Preschool , Dermatitis, Atopic/blood , Dust , Female , Humans , Immunoglobulin E/blood , Infant , Male
ABSTRACT
Depression and anxiety frequently occur together or as extensions of each other. According to a previous study in depressed inpatients, a high trait anxiety level correlated with a positive response to the diazepam test (DT) and a low trait anxiety level with a negative response to the test. The aim of this study was to investigate whether a positive reaction to the DT is related to a positive response to fluvoxamine and whether a negative reaction to the test is related to a positive response to imipramine. The DT was performed in 130 patients diagnosed with a depressive disorder. Following the DT, the patients were randomly assigned to double-blind treatment with either imipramine or fluvoxamine. Doses of both antidepressants were adjusted to attain predefined blood levels, and the outcome was evaluated 4 weeks after these blood levels had been attained. Twenty-two patients had a positive response to the DT, whereas 108 patients had a negative response. Although a positive DT is correlated with a high level of trait anxiety, no differences in depressive symptomatology or antidepressant response were found between patients with a positive and a negative DT.