Results 1 - 20 of 23
1.
J Cancer Epidemiol ; 2013: 490472, 2013.
Article in English | MEDLINE | ID: mdl-23509460

ABSTRACT

Background. Breast cancer survival has improved significantly in the US in the past 10-15 years. However, disparities in breast cancer survival exist between black and white women. Purpose. To investigate the effect of county-level healthcare resources and SES, as well as individual SES, on breast cancer survival disparities between black and white women. Methods. Data from 1,796 breast cancer cases were obtained from the Surveillance, Epidemiology, and End Results and the National Longitudinal Mortality Study dataset. Cox proportional hazards models were constructed accounting for clustering within counties. Three sequential Cox models were fit for each outcome: demographic variables; demographic and clinical variables; and finally demographic, clinical, and county-level variables. Results. In unadjusted analysis, black women had a 53% higher likelihood of dying of breast cancer and a 32% higher likelihood of dying of any cause (P < 0.05) compared with white women. Adjusting for demographic variables explained away the effect of race on breast cancer survival (HR, 1.40; 95% CI, 0.99-1.97), but not on all-cause mortality. The racial difference in all-cause survival disappeared only after adjusting for county-level variables (HR, 1.27; 95% CI, 0.95-1.71). Conclusions. Improving equitable access to healthcare for all women in the US may help eliminate survival disparities between racial and socioeconomic groups.

2.
Integr Environ Assess Manag ; 9(1): 31-49, 2013 Jan.
Article in English | MEDLINE | ID: mdl-22488838

ABSTRACT

The Washington State Department of Ecology annually conducts sediment quality monitoring in Puget Sound as a component of the Puget Sound Ecosystem Monitoring Program. Sediment samples are analyzed to determine the concentrations of about 170 chemical and physical variables. A Sediment Chemistry Index (SCI) was derived using the State of Washington Sediment Management Standards to account for the presence and concentrations of mixtures of toxicants. Mean Sediment Quality Standard quotients (mSQSq) were calculated as the basis for the SCI and compared to the incidence and degree of toxicity in laboratory tests and to metrics of the diversity and abundance of resident benthic assemblages in a database consisting of as many as 664 samples. These data were evaluated with co-occurrence analyses to identify "cut points" (i.e., thresholds) in the index below which the frequency and magnitude of biological effects were relatively low and above which they occurred with increasing frequency or magnitude. Iterative trials of different sets of cut points established the final cut points in mSQSq of 0.1, 0.3, and 0.5. They defined 4 ranges in chemical exposure: Minimum (<0.1), Low (0.1 to <0.3), Moderate (0.3 to <0.5), and Maximum (≥0.5). Across these 4 exposure ranges, both the incidence and magnitude of toxicity in some laboratory tests increased, the abundance of most stress-sensitive benthic taxa decreased, and the abundance of most stress-tolerant taxa increased. The mSQSq cut point of 0.1 appears to be the target value for protection of benthic resources: the value below which the probability and magnitude of adverse effects, either in the laboratory or the field, are lowest. The mSQSq values are rescaled from 0 to 100 to form the SCI, which is used by the Puget Sound Partnership and environmental managers as a Dashboard Indicator, with biologically relevant targets selected to monitor ecosystem recovery.
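The cut-point scheme described above amounts to a simple threshold classifier. The sketch below is an illustrative reconstruction, not the Department of Ecology's code; the function names and the linear 0-100 rescaling are assumptions (the published SCI uses its own transformation):

```python
def exposure_category(msqsq):
    """Classify a mean Sediment Quality Standard quotient (mSQSq)
    into the four chemical-exposure ranges defined by the cut
    points 0.1, 0.3, and 0.5 (illustrative sketch)."""
    if msqsq < 0.1:
        return "Minimum"
    elif msqsq < 0.3:
        return "Low"
    elif msqsq < 0.5:
        return "Moderate"
    return "Maximum"


def sci_linear(msqsq):
    """Hypothetical linear rescaling of mSQSq onto a 0-100 index,
    capping mSQSq at 1.0 and inverting so that a higher score
    means cleaner sediment (an assumption for illustration)."""
    return round(100 * (1 - min(msqsq, 1.0)), 1)


print(exposure_category(0.3))  # Moderate
```

A value just under the lowest cut point, e.g. `exposure_category(0.05)`, falls in the "Minimum" range that the abstract identifies as the protective target.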


Subject(s)
Bays/chemistry , Ecotoxicology/methods , Environmental Monitoring/methods , Environmental Pollutants/analysis , Geologic Sediments/chemistry , Ecotoxicology/standards , Environmental Monitoring/standards , Environmental Pollutants/chemistry , Environmental Pollutants/toxicity , Laboratories , Reference Standards , Washington
3.
Integr Environ Assess Manag ; 8(4): 638-48, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22987518

ABSTRACT

Data from 7 coastwide and regional benthic surveys were combined and used to assess the number and distribution of estuarine benthic macrofaunal assemblages of the western United States. Q-mode cluster analysis was applied to 714 samples, and site groupings were tested for differences in 4 habitat factors (latitude, salinity, sediment grain size, and depth). Eight macrofaunal assemblages, structured primarily by latitude, salinity, and sediment grain size, were identified: (A) Puget Sound fine sediment, (B) Puget Sound coarse sediment, (C) southern California marine bays, (D) polyhaline central San Francisco Bay, (E) shallow estuaries and wetlands, (F) saline very coarse sediment, (G) mesohaline San Francisco Bay, and (H) limnetic and oligohaline. The Puget Sound, southern California, and San Francisco Bay assemblages were geographically distinct, whereas Assemblages E, F, and H were distributed widely along the entire coast. A second Q-mode cluster analysis was conducted after adding replicate samples available from some of the sites and temporal replicates from sites sampled in successive years. Variability due to small-scale spatial habitat heterogeneity and to temporal change was low in Puget Sound, but temporal variability was high in the San Francisco estuary, where large year-to-year fluctuations in freshwater inputs and salinity lead to spatial relocation of the assemblages.


Subject(s)
Aquatic Organisms/classification , Biota , Environmental Monitoring/methods , Geologic Sediments/analysis , Invertebrates/physiology , Animals , Bays , California , Ecosystem , Estuaries , United States , United States Environmental Protection Agency , Washington
5.
Clin Lung Cancer ; 13(2): 81-9, 2012 Mar.
Article in English | MEDLINE | ID: mdl-22056226

ABSTRACT

BACKGROUND: Nodal staging of non-small-cell lung cancer (NSCLC) is crucial in evaluation of prognosis and determination of therapeutic strategy. This study aimed to determine the negative predictive value (NPV) of combined positron emission tomography and computed tomography (PET-CT) in patients with stage I (T1-2N0) NSCLC and to investigate possible risk factors for occult nodal disease. METHODS: Studies investigating the performance of PET in conjunction with CT in the nodal staging of stage I NSCLC were identified in the MEDLINE database. The Standards for Reporting of Diagnostic Accuracy (STARD) initiative was used to ensure study quality. Pathologic assessment through mediastinoscopy or thoracotomy was required as the reference standard for evaluation of PET-CT accuracy. Stata-based meta-analysis was applied to calculate the individual and pooled NPVs. RESULTS: Ten studies with a total of 1122 patients with stage I (T1-2N0) NSCLC were eligible for analysis. The NPVs of combined PET and CT for mediastinal metastases were 0.94 in T1 disease and 0.89 in T2 disease. Including both T1 and T2 disease, the NPVs were 0.93 for mediastinal metastases and 0.87 for overall nodal metastases. Adenocarcinoma histology (risk ratio [RR], 2.72) and high fluorine-18 (18F) fluorodeoxyglucose (FDG) uptake in the primary lesion were associated with greater risk of occult nodal metastases. CONCLUSIONS: Although occult nodal metastases in clinical stage T1-2N0 NSCLC are not infrequent overall, combined PET and CT provide a favorable NPV for mediastinal metastases in T1N0 NSCLC, suggesting a low yield from routine invasive staging procedures for this subgroup of patients.
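Negative predictive value is the proportion of test-negative patients who are truly disease-free. A minimal sketch of the quantity being pooled, with hypothetical counts (not data from the ten studies above):

```python
def npv(true_negatives, false_negatives):
    """NPV = TN / (TN + FN): of patients staged node-negative by
    PET-CT, the fraction confirmed node-negative at pathology."""
    return true_negatives / (true_negatives + false_negatives)


# Hypothetical example: 93 of 100 PET-CT-negative patients are
# confirmed node-negative at mediastinoscopy or thoracotomy.
print(npv(93, 7))  # 0.93
```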


Subject(s)
Carcinoma, Non-Small-Cell Lung/diagnostic imaging , Positron-Emission Tomography , Tomography, X-Ray Computed , Adenocarcinoma/diagnostic imaging , Adenocarcinoma/secondary , Adenocarcinoma/surgery , Aged , Carcinoma, Non-Small-Cell Lung/secondary , Carcinoma, Non-Small-Cell Lung/surgery , Female , Fluorodeoxyglucose F18 , Humans , Lung Neoplasms/diagnostic imaging , Lung Neoplasms/pathology , Lung Neoplasms/surgery , Lymphatic Metastasis , Mediastinal Neoplasms/diagnostic imaging , Mediastinal Neoplasms/secondary , Mediastinal Neoplasms/surgery , Mediastinoscopy , Neoplasm Staging , Predictive Value of Tests , Radiopharmaceuticals
6.
Dig Dis Sci ; 57(3): 806-12, 2012 Mar.
Article in English | MEDLINE | ID: mdl-21953139

ABSTRACT

BACKGROUND AND AIMS: Deceased donor liver transplantation (DDLT) rates for candidates with hepatocellular carcinoma (HCC) have significantly increased in the MELD era because of the extra priority given to these candidates. We examined the incidence and the pre-DDLT radiological and donor factors associated with post-DDLT HCC recurrence in the MELD era. METHODS: Outcomes of HCC candidates aged ≥18 years who underwent DDLT between 2/28/02 and 6/30/08 (n = 94) were reviewed. The primary outcome was biopsy-proven post-LT HCC recurrence at any site. Kaplan-Meier analysis was used to calculate the cumulative incidence, and Cox regression was used to identify predictors of post-LT HCC recurrence. RESULTS: The median age of the 94 candidates who met the study criteria was 54 years; 64% had hepatitis C, the median laboratory MELD score was 13, and the median pre-LT AFP was 47 ng/dl. Based upon pre-DDLT imaging, 94% of candidates met the Milan criteria. The median waiting time to transplant was 47 days, and 27% received pre-DDLT loco-regional therapy. Seventeen patients (18%) developed HCC recurrence after a median of 2.1 years, with a cumulative incidence of 6.8%, 12%, and 19% at 1, 2, and 3 years post-DDLT. The pre-DDLT number of lesions (p = 0.015), largest lesion diameter (p = 0.008), and higher donor age (p = 0.002) were significant predictors of HCC recurrence after adjusting for pre-LT loco-regional therapy and waiting time. Post-LT HCC recurrence (p < 0.0001) and higher donor age (p = 0.029) were associated with lower post-LT survival. CONCLUSIONS: Post-LT HCC recurrence was higher in our MELD-era cohort than the 8% rate at 4 years reported in Mazzaferro et al.'s study. The risk of HCC recurrence was significantly associated with the number of lesions and the size of the largest lesion at the time of DDLT, as well as with older donor age. Risk stratification using a predictive model for post-LT HCC recurrence based on pre-LT imaging and donor factors may help guide candidate selection and tailor HCC surveillance strategies after LT.


Subject(s)
Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/surgery , Liver Neoplasms/mortality , Liver Neoplasms/surgery , Liver Transplantation/statistics & numerical data , Neoplasm Recurrence, Local/mortality , Aged , Cadaver , Female , Graft Rejection/drug therapy , Graft Rejection/mortality , Humans , Immunosuppressive Agents/therapeutic use , Incidence , Kaplan-Meier Estimate , Liver Transplantation/mortality , Male , Middle Aged , Predictive Value of Tests , Proportional Hazards Models , Risk Factors , Tissue Donors/statistics & numerical data , Young Adult
7.
Infect Control Hosp Epidemiol ; 32(3): 201-6, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21460503

ABSTRACT

BACKGROUND AND OBJECTIVE: Clostridium difficile spores persist in hospital environments for extended periods. We evaluated whether admission to a room previously occupied by a patient with C. difficile infection (CDI) increased the risk of acquiring CDI. DESIGN: Retrospective cohort study. SETTING: Medical intensive care unit (ICU) at a tertiary care hospital. METHODS: Patients admitted from January 1, 2005, through June 30, 2006, were evaluated for a diagnosis of CDI 48 hours after ICU admission and within 30 days after ICU discharge. Medical, ICU, and pharmacy records were reviewed for other CDI risk factors. Patients who developed CDI after admission were compared with those who did not. RESULTS: Among 1,844 patients admitted to the ICU, 134 CDI cases were identified. After exclusions, 1,770 patients remained for analysis. CDI was acquired by 4.6% of patients whose prior room occupant did not have CDI, compared with 11.0% of those whose prior occupant had CDI (P = .002). The room effect on CDI acquisition remained significant (P = .008) in Kaplan-Meier analysis. The prior occupant's CDI status remained significant (P = .01; hazard ratio, 2.35) when controlling for the current patient's age, Acute Physiology and Chronic Health Evaluation III score, exposure to proton pump inhibitors, and antibiotic use. CONCLUSIONS: A prior room occupant with CDI is a significant risk factor for CDI acquisition, independent of established CDI risk factors. These findings have implications for room placement and hospital design.


Subject(s)
Clostridioides difficile , Clostridium Infections/epidemiology , Cross Infection/epidemiology , Patients' Rooms , Adolescent , Adult , Aged , Aged, 80 and over , Clostridium Infections/transmission , Cohort Studies , Cross Infection/transmission , Female , Humans , Intensive Care Units , Kaplan-Meier Estimate , Male , Middle Aged , Retrospective Studies , Risk Factors , Young Adult
8.
Support Care Cancer ; 19(12): 1969-74, 2011 Dec.
Article in English | MEDLINE | ID: mdl-21110047

ABSTRACT

PURPOSE: The purpose of this study was to evaluate the risk factors associated with treatment failure and 30-day mortality in hematology and bone marrow transplant patients treated with daptomycin or linezolid for vancomycin-resistant enterococci (VRE) bacteremia. The safety and tolerability of therapy were also assessed. METHODS: This single-center, retrospective study included adult patients admitted to the hematology or bone marrow transplant service with documented vancomycin-resistant Enterococcus faecium or Enterococcus faecalis bacteremia who received at least 48 h of either linezolid or daptomycin as primary treatment. Clinical and microbiologic outcomes were assessed at days 7, 14, and 30 of the hospital stay. RESULTS: A total of 72 patients were included in the analysis. Forty-three patients received daptomycin as primary treatment and 29 received linezolid. Overall success rates were 81.9% at day 7, 79.2% at day 14, and 76.4% at day 30. Forty-one patients (57.0%) had high-grade bacteremia, defined as more than one positive blood culture for VRE. The mortality rate was significantly higher when high-grade bacteremia was present (34.1% vs. 7.0%; p = 0.009). CONCLUSIONS: This study suggests that linezolid and daptomycin are both reasonable options for treating VRE bacteremia in hematology and bone marrow transplant patients; however, patients with high-grade VRE bacteremia may be at increased risk for treatment failure.


Subject(s)
Bone Marrow Transplantation , Enterococcus/drug effects , Hematologic Diseases , Outcome Assessment, Health Care , Vancomycin Resistance , Vancomycin/therapeutic use , Aged , Female , Humans , Male , Michigan , Middle Aged , Retrospective Studies
9.
Fertil Steril ; 94(1): 221-9, 2010 Jun.
Article in English | MEDLINE | ID: mdl-19394610

ABSTRACT

OBJECTIVE: To study the link between fatness and gonadotropin secretion. Overweight status is linked to polycystic ovary syndrome (PCOS) in adolescents. We postulated that heavier adolescents without symptoms would secrete LH with [1] increased pulse frequency (LHPF) and [2] exaggerated integrated concentrations (LHAUC). DESIGN: Cross-sectional. SETTING: General clinical research center. PATIENT(S): Eighty-seven postmenarcheal cyclic adolescents, from lean to overweight, recruited during the follicular phase. INTERVENTION(S): Luteinizing hormone sampling [1] every 10 minutes for 24 hours and [2] at 20-minute intervals after a GnRH challenge. MAIN OUTCOME MEASURE(S): LHPF and LHAUC (calculated by the CLUSTER algorithm). Hormonal and metabolic covariates included percent body fat (PercentBF), insulin-like growth factor-I (IGF-I), fasting insulin, and the insulin resistance index HOMA-IR. SAS software was used for analyses. RESULT(S): PercentBF and younger gynecological age predicted faster LHPF. Fatness was negatively linked to LHAUC, which was best predicted by PercentBF and IGF-I in multivariate modeling (R² = 0.25). PercentBF and insulin predicted a lower 20-minute LH response to GnRH. CONCLUSION(S): [1] Higher adiposity and younger gynecological age predict rapid LHPF. [2] The early years after menarche represent a vulnerable window for exaggerated LHPF with weight gain. [3] In healthy adolescents, higher adiposity is linked to lower LHAUC, thereby preserving pituitary stores.


Subject(s)
Adipose Tissue/metabolism , Insulin/metabolism , Luteinizing Hormone/metabolism , Adipose Tissue/blood supply , Adolescent , Age Factors , Body Mass Index , Cross-Sectional Studies , Female , Humans , Insulin/blood , Insulin Secretion , Luteinizing Hormone/blood , Overweight/blood , Predictive Value of Tests , Prospective Studies , Thinness/blood , Young Adult
10.
Liver Transpl ; 15(9): 1142-8, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19718633

ABSTRACT

The proportion of patients undergoing liver transplantation (LT) with renal insufficiency has significantly increased in the Model for End-Stage Liver Disease (MELD) era. This study was designed to determine the incidence and predictors of post-LT chronic renal failure (CRF) and its effect on patient survival in the MELD era. Outcomes of 221 adult LT recipients who had LT between February 2002 and February 2007 were reviewed retrospectively. Patients who were listed as status 1, were granted a MELD exception, or had living-donor, multiorgan LT were excluded. Renal insufficiency at LT was defined as none to mild [estimated glomerular filtration rate (GFR) ≥60 mL/minute], moderate (30-59 mL/minute), or severe (<30 mL/minute). Post-LT CRF was defined as an estimated GFR < 30 mL/minute persisting for 3 months, initiation of renal replacement therapy, or listing for renal transplantation. The median age was 54 years, 66% were male, 89% were Caucasian, and 43% had hepatitis C. At LT, the median MELD score was 20, and 6.3% were on renal replacement therapy. After a median follow-up of 2.6 years (range, 0.01-5.99), 31 patients developed CRF with a 5-year cumulative incidence of 22%. GFR at LT was the only independent predictor of post-LT CRF (hazard ratio = 1.33, P < 0.001). The overall post-LT patient survival was 74% at 5 years. Patients with MELD ≥20 at LT had a higher cumulative incidence of post-LT CRF in comparison with patients with MELD < 20 (P = 0.03). A decrease in post-LT GFR over time was the only independent predictor of survival. In conclusion, post-LT CRF is common in the MELD era with a 5-year cumulative incidence of 22%. Low GFR at LT was predictive of post-LT CRF, and a decrease in post-LT GFR over time was associated with decreased post-LT survival. Further studies of modifiable preoperative, perioperative, and postoperative factors influencing renal function are needed to improve outcomes following LT.
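The renal-insufficiency strata above map directly onto estimated-GFR bands. A minimal sketch of that mapping (the function name is illustrative, not from the study):

```python
def renal_insufficiency_grade(egfr_ml_min):
    """Grade renal insufficiency at liver transplant by estimated
    GFR (mL/minute), using the study's bands: >=60 none to mild,
    30-59 moderate, <30 severe."""
    if egfr_ml_min >= 60:
        return "none to mild"
    elif egfr_ml_min >= 30:
        return "moderate"
    return "severe"


print(renal_insufficiency_grade(45))  # moderate
```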


Subject(s)
Kidney Failure, Chronic/etiology , Liver Diseases/surgery , Liver Transplantation/adverse effects , Models, Biological , Renal Insufficiency/complications , Adolescent , Adult , Aged , Disease Progression , Female , Glomerular Filtration Rate , Humans , Immunosuppressive Agents/adverse effects , Incidence , Kaplan-Meier Estimate , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/physiopathology , Kidney Failure, Chronic/therapy , Liver Diseases/complications , Liver Diseases/diagnosis , Liver Diseases/mortality , Liver Transplantation/mortality , Male , Middle Aged , Predictive Value of Tests , Proportional Hazards Models , Renal Insufficiency/mortality , Renal Insufficiency/physiopathology , Renal Insufficiency/therapy , Renal Replacement Therapy , Risk Assessment , Risk Factors , Severity of Illness Index , Time Factors , Treatment Outcome , Young Adult
11.
JACC Cardiovasc Interv ; 2(7): 645-54, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19628188

ABSTRACT

OBJECTIVES: We sought to compare the nephrotoxicity of the iso-osmolar contrast medium iodixanol with that of low-osmolar contrast media (LOCM). BACKGROUND: Contrast-induced acute kidney injury (CI-AKI) is a common cause of in-hospital renal failure. A prior meta-analysis suggested that iodixanol (Visipaque, GE Healthcare, Princeton, New Jersey) was associated with less CI-AKI than LOCM, but that analysis was limited by ascertainment bias and did not include the most recent randomized controlled trials. METHODS: We searched the Medline, Embase, ISI Web of Knowledge, Google Scholar, Current Contents, and International Pharmaceutical Abstracts databases, and the Cochrane Central Register of Controlled Trials, from 1980 to November 30, 2008, for randomized controlled trials that compared the incidence of CI-AKI between iodixanol and LOCM. Random-effects models were used to calculate summary risk ratios (RR) for CI-AKI, need for hemodialysis, and death. RESULTS: A total of 16 trials including 2,763 subjects were pooled. There was no significant difference in the incidence of CI-AKI between the iodixanol group and the LOCM group overall (summary RR: 0.79, 95% confidence interval [CI]: 0.56 to 1.12, p = 0.19), and no significant difference in the rates of post-procedure hemodialysis or death. There was a reduction in CI-AKI when iodixanol was compared with ioxaglate (RR: 0.58, 95% CI: 0.37 to 0.92; p = 0.022) and iohexol (RR: 0.19, 95% CI: 0.07 to 0.56; p = 0.002), but no difference when compared with iopamidol (RR: 1.20, 95% CI: 0.66 to 2.18; p = 0.55), iopromide (RR: 0.93, 95% CI: 0.47 to 1.85; p = 0.84), or ioversol (RR: 0.92, 95% CI: 0.60 to 1.39; p = 0.68). CONCLUSIONS: This meta-analysis of 2,763 subjects suggests that iodixanol, compared with LOCM overall, is not associated with less CI-AKI. The relative renal safety of LOCM compared with iodixanol may vary by the specific type of LOCM.
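The random-effects pooling named above is conventionally done with the DerSimonian-Laird estimator on log risk ratios. The sketch below is a standard textbook implementation under that assumption, not the authors' code, and the per-trial inputs are hypothetical:

```python
import math


def pooled_rr_dersimonian_laird(log_rrs, variances):
    """DerSimonian-Laird random-effects pooling of per-study log
    risk ratios. Returns (summary RR, 95% CI lower, 95% CI upper)."""
    k = len(log_rrs)
    w = [1.0 / v for v in variances]  # fixed-effect (inverse-variance) weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, log_rrs)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, log_rrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # random-effects weights incorporate tau^2
    w_star = [1.0 / (v + tau2) for v in variances]
    y = sum(wi * yi for wi, yi in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(y), math.exp(y - 1.96 * se), math.exp(y + 1.96 * se)


# Hypothetical trial-level log RRs and variances (not the 16 trials above):
rr, lo, hi = pooled_rr_dersimonian_laird(
    [math.log(0.6), math.log(1.1), math.log(0.8)],
    [0.09, 0.04, 0.12],
)
```

When every study reports the same effect, heterogeneity (tau²) is zero and the pooled RR equals that common effect, which is a quick sanity check on the arithmetic.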


Subject(s)
Contrast Media/adverse effects , Kidney Diseases/chemically induced , Triiodobenzoic Acids/adverse effects , Aged , Aged, 80 and over , Consumer Product Safety , Evidence-Based Medicine , Female , Humans , Iohexol/adverse effects , Iopamidol/adverse effects , Ioxaglic Acid/adverse effects , Kidney Diseases/mortality , Kidney Diseases/therapy , Male , Middle Aged , Odds Ratio , Osmolar Concentration , Randomized Controlled Trials as Topic , Renal Dialysis , Risk Assessment , Risk Factors
12.
Am Heart J ; 155(4): 630-9, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18371469

ABSTRACT

BACKGROUND: Drug-eluting stents have emerged as the favored device for percutaneous coronary intervention, but it is not clear whether the currently available drug-eluting stents differ. We performed a meta-analysis to systematically evaluate currently available data comparing sirolimus-eluting stents (SESs) with paclitaxel-eluting stents (PESs) in patients with coronary artery disease. METHODS: We searched the MEDLINE, Embase, ISI Web of Knowledge, Current Contents, and International Pharmaceutical Abstracts databases, the Cochrane Central Register of Controlled Trials, and scientific meeting abstracts up to November 30, 2006. All randomized controlled trials comparing SES with PES and providing ≥6 months of follow-up data were eligible for inclusion in our analysis. RESULTS: Data from 12 trials (7,455 patients) were pooled. There was no difference in death (summary odds ratio [OR] 0.88, 95% CI 0.61-1.25, P = .46), myocardial infarction (summary OR 0.92, 95% CI 0.71-1.19, P = .51), or stent thrombosis (summary OR 0.75, 95% CI 0.40-1.40, P = .37) between SES and PES. The use of SES was associated with significant reductions in angiographic restenosis (summary OR 0.64, 95% CI 0.52-0.78, P < .001), target vessel revascularization (5.66% vs 7.70%; summary OR 0.72, 95% CI 0.59-0.88, P = .002), and target lesion revascularization (summary OR 0.67, 95% CI 0.53-0.84, P = .001). CONCLUSIONS: Patients treated with SES appear to have a significantly lower risk of restenosis and need for target vessel revascularization than those treated with PES. There is no significant difference between the 2 stents with respect to mortality, myocardial infarction, or early stent thrombosis.


Subject(s)
Coronary Disease/therapy , Drug-Eluting Stents , Stents , Coronary Disease/mortality , Coronary Restenosis/epidemiology , Coronary Thrombosis/epidemiology , Humans , Myocardial Infarction/epidemiology , Odds Ratio , Randomized Controlled Trials as Topic
13.
J Immunol ; 179(1): 623-30, 2007 Jul 01.
Article in English | MEDLINE | ID: mdl-17579084

ABSTRACT

Late mortality in septic patients often exceeds the lethality of acute sepsis, yet the immunoinflammatory alterations preceding chronic sepsis mortality are not well defined. We studied plasma cytokine concentrations preceding late septic deaths (days 6-28) in a murine model of sepsis induced by polymicrobial peritonitis. The late prelethal inflammatory response varied from virtually no response (3 of 14 mice), to a mixed response (8 of 14), to the concurrent presence of nearly all measured cytokines, both proinflammatory and anti-inflammatory (3 of 14). In responding mice, a consistent prelethal surge of plasma MIP-2 (1.6 vs 0.12 ng/ml in survivors; mean values), MCP-1 (2.0 vs 1.3 ng/ml), soluble TNF receptor type I (2.5 vs 0.66 ng/ml), and the IL-1 receptor antagonist (74.5 vs 3.3 ng/ml) was present, although increases in IL-6 (1.9 vs 0.03 ng/ml) and IL-10 (0.12 vs 0.04 ng/ml) were infrequent. For high mobility group box 1, late mortality was instead signaled by a decrease in plasma levels (591 vs 864 ng/ml). These results demonstrate that impending mortality in the chronic phase of sepsis may be accurately predicted by plasma biomarkers, providing a mechanistic basis for individualized therapy. The pattern of late prelethal responses suggests that the systemic inflammatory response syndrome to compensatory anti-inflammatory response syndrome transition paradigm fails to follow a simple linear pattern.


Subject(s)
Inflammation Mediators/metabolism , Sepsis/mortality , Sepsis/pathology , Acute Disease , Animals , Body Weight/immunology , Chronic Disease , Cytokines/biosynthesis , Cytokines/physiology , Disease Models, Animal , Female , Inflammation/immunology , Inflammation/mortality , Inflammation/prevention & control , Inflammation Mediators/physiology , Mice , Mice, Inbred ICR , Oligonucleotide Array Sequence Analysis , Sepsis/immunology
14.
Paediatr Anaesth ; 17(5): 426-30, 2007 May.
Article in English | MEDLINE | ID: mdl-17474948

ABSTRACT

BACKGROUND: Our aim was to describe the incidence of quality assurance events among overweight/obese versus normal weight children. METHODS: This is a retrospective review of the quality assurance database of Mott Children's Hospital, University of Michigan, for the period January 2000 to December 2004. Using directly measured height and weight, we computed the body mass index (BMI) in 6094 children. Overweight and obesity were defined using age- and gender-specific cutoffs according to the National Center for Health Statistics (NCHS)/Centers for Disease Control and Prevention (CDC) (2000) growth charts. Frequencies of quality assurance events were compared among normal weight, overweight, and obese children. RESULTS: There were 3359 males (55.1%) and 2735 females (44.9%). The mean age for the entire population was 11.9 ± 5.2 years, and the mean BMI was 21.6 ± 6.7 kg/m². The overall prevalence of overweight and obesity was 31.6%. Obesity was more prevalent in boys than girls (P = 0.016). Preoperative diagnoses of hypertension, type II diabetes, and bronchial asthma were more common in overweight and obese than in normal weight children (P = 0.0001 for hypertension, P = 0.001 for diabetes, and P = 0.014 for bronchial asthma). Difficult airway, upper airway obstruction in the postanesthesia care unit (PACU), PACU stay longer than 3 h, and the need for two or more antiemetics were more common in overweight and obese than in normal weight children (P = 0.001). There was no significant difference in the incidence of unplanned hospital admission following an outpatient surgical procedure between normal weight and overweight/obese children. DISCUSSION: Studies on perioperative aspects of childhood overweight and obesity are rare. Our report shows a high prevalence of overweight and obesity in this cohort of pediatric surgical patients, and certain perioperative morbidities are more common in overweight and obese than in normal weight children. There is a need for prospective studies of the impact of childhood overweight and obesity on anesthesia and surgical outcomes.
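BMI as used above is weight divided by height squared. A minimal sketch of the computation from directly measured values (the age- and gender-specific CDC percentile lookup is omitted; the inputs are hypothetical):

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2 from directly measured weight
    (kg) and height (m)."""
    return weight_kg / height_m ** 2


# Hypothetical child: 50 kg, 1.52 m.
print(round(bmi(50.0, 1.52), 1))  # 21.6
```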


Subject(s)
Body Mass Index , Intraoperative Complications/epidemiology , Obesity/epidemiology , Postoperative Complications/epidemiology , Quality Assurance, Health Care , Body Weight , Child , Cohort Studies , Female , Humans , Incidence , Male , Michigan/epidemiology , Overweight , Perioperative Care , Prevalence , Retrospective Studies , Risk Factors , Sex Distribution , Surgical Procedures, Operative
15.
J Immunol ; 177(3): 1967-74, 2006 Aug 01.
Article in English | MEDLINE | ID: mdl-16849510

ABSTRACT

Mortality in sepsis remains unacceptably high, and attempts to modulate the inflammatory response have failed to improve survival. Previous reports postulated that the sepsis-triggered immunological cascade is multimodal: an initial systemic inflammatory response syndrome (SIRS; excessive pro- but no/low anti-inflammatory plasma mediators), an intermediate homeostasis with a mixed anti-inflammatory response syndrome (MARS; both pro- and anti-inflammatory mediators), and a final compensatory anti-inflammatory response syndrome (CARS; excessive anti- but no/low proinflammatory mediators). To verify this, we examined the evolution of the inflammatory response during the early phase of murine sepsis by repetitive blood sampling of septic animals. Increased plasma concentrations of proinflammatory (IL-6, TNF, IL-1beta, KC, MIP-2, MCP-1, and eotaxin) and anti-inflammatory (TNF soluble receptors, IL-10, IL-1 receptor antagonist) cytokines were observed in early deaths (days 1-5), and these elevations occurred simultaneously for both the pro- and anti-inflammatory mediators. Plasma levels of IL-6 (26 ng/ml), TNF-alpha (12 ng/ml), KC (33 ng/ml), MIP-2 (14 ng/ml), IL-1 receptor antagonist (65 ng/ml), TNF soluble receptor I (3 ng/ml), and TNF soluble receptor II (14 ng/ml) accurately predicted mortality within 24 h. In contrast, these parameters were not elevated in either the late deaths (days 6-28) or the survivors. Surprisingly, either pro- or anti-inflammatory cytokines alone reliably predicted mortality up to 48 h before outcome. These data demonstrate that the initial inflammatory response correlates directly with early but not late sepsis mortality. This multifaceted response calls into question the use of a simple proinflammatory cytokine measurement for classifying the inflammatory status during sepsis.


Subject(s)
Cytokines/antagonists & inhibitors , Cytokines/blood , Disease Models, Animal , Inflammation Mediators/antagonists & inhibitors , Inflammation Mediators/blood , Systemic Inflammatory Response Syndrome/blood , Systemic Inflammatory Response Syndrome/mortality , Animals , Biomarkers/blood , Cecum/surgery , Cytokines/physiology , Disease Progression , Female , Gene Expression Regulation/immunology , Inflammation Mediators/physiology , Interleukin 1 Receptor Antagonist Protein , Interleukin-6/antagonists & inhibitors , Interleukin-6/biosynthesis , Interleukin-6/blood , Ligation , Mice , Mice, Inbred ICR , Predictive Value of Tests , Prognosis , Punctures , Receptors, Interleukin-1/antagonists & inhibitors , Receptors, Interleukin-1/blood , Sialoglycoproteins/biosynthesis , Sialoglycoproteins/blood , Survival Analysis , Systemic Inflammatory Response Syndrome/immunology , Systemic Inflammatory Response Syndrome/therapy , Time Factors
16.
Environ Monit Assess ; 111(1-3): 173-222, 2005 Dec.
Article in English | MEDLINE | ID: mdl-16311828

ABSTRACT

A survey was designed and conducted to determine the severity, spatial patterns, and spatial extent of degraded sediment quality in Puget Sound (Washington State, USA). A weight of evidence compiled from results of chemical analyses, toxicity tests, and benthic infaunal analyses was used to classify the quality of sediments. Sediment samples were collected from 300 locations within a 2,363 km² area extending from the US/Canada border to the inlets of southern Puget Sound and Hood Canal. Degraded conditions, as indicated by a combination of high chemical concentrations, significant toxicity, and adversely altered benthos, occurred in samples that represented about 1% of the total area. These conditions invariably occurred in samples collected within urbanized bays and industrial waterways, especially near the urban centers of Everett, Seattle, Tacoma, and Bremerton. Sediments with high quality (as indicated by no toxicity, no contamination, and the presence of a relatively abundant and diverse infauna) occurred in samples that represented a majority (68%) of the total study area. Sediments in which the results of the three kinds of analyses were not in agreement were classified as intermediate in quality and represented about 31% of the total area. Relative to many other estuaries and marine bays of the USA, Puget Sound sediments ranked among those with minimal evidence of toxicant-induced degradation.
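The weight-of-evidence classification described above reduces to a simple triad rule. A minimal sketch, assuming each line of evidence (chemistry, toxicity, benthos) has already been reduced to a boolean "indicates degradation" flag:

```python
def classify_sediment(chem_hit: bool, toxic: bool, benthos_altered: bool) -> str:
    """Sediment quality triad rule as described in the survey:
    all three lines of evidence degraded -> 'degraded';
    none degraded -> 'high quality'; any disagreement -> 'intermediate'."""
    flags = (chem_hit, toxic, benthos_altered)
    if all(flags):
        return "degraded"
    if not any(flags):
        return "high quality"
    return "intermediate"

print(classify_sediment(True, True, True))     # -> degraded
print(classify_sediment(False, False, False))  # -> high quality
print(classify_sediment(True, False, False))   # -> intermediate
```

Under this rule the survey's 1% / 68% / 31% area fractions correspond to the three return values.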


Subject(s)
Geologic Sediments/analysis , Water Pollutants, Chemical/toxicity , Aliivibrio fischeri/metabolism , Amphipoda/drug effects , Animals , Cell Line , Cytochrome P-450 Enzyme System/biosynthesis , Environmental Monitoring , Enzyme Induction , Fertilization/drug effects , Genes, Reporter/genetics , Humans , Luciferases/metabolism , Luminescence , Metals/analysis , Metals/toxicity , Organic Chemicals/analysis , Organic Chemicals/toxicity , Sea Urchins/drug effects , Sea Urchins/physiology , Seawater , Silicon/analysis , Silicon/toxicity , Washington , Water Pollutants, Chemical/analysis
17.
Ann Emerg Med ; 46(2): 123-31, 2005 Aug.
Article in English | MEDLINE | ID: mdl-16046941

ABSTRACT

STUDY OBJECTIVE: We determine whether the use of an emergency medical services (EMS) protocol for selective spine immobilization would result in appropriate immobilization without spinal cord injury associated with nonimmobilization. METHODS: A 4-year prospective study examined EMS and hospital records for patients after the implementation of an EMS protocol for selective spine immobilization. EMS personnel were trained to perform and document a spine injury assessment for out-of-hospital trauma patients with a mechanism of injury judged sufficient to cause a spine injury. The assessment included these clinical criteria: altered mental status, evidence of intoxication, neurologic deficit, suspected extremity fracture, and spine pain or tenderness. The protocol required immobilization for patients with a positive assessment on any of those criteria. Outcome characteristics included the presence or absence of spine injury and spine injury management. RESULTS: The study collected data on 13,483 patients; 126 of these patients were subsequently excluded because of incomplete data, leaving a study sample of 13,357 patients with complete data. Spine injuries were confirmed in the hospital records for 3% (n=415) of patients, including 50 patients with cord injuries and 128 patients with cervical injuries. Sensitivity of the EMS protocol was 92% (95% confidence interval [CI] 89.4% to 94.6%), resulting in nonimmobilization of 8% of the patients with spine injuries (33 of 415). None of the nonimmobilized patients sustained cord injuries. The specificity was 40% (95% CI 38.9% to 40.5%). CONCLUSION: The use of our selective immobilization protocol resulted in spine immobilization for most patients with spine injury without causing harm in cases in which spine immobilization was withheld.
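The reported sensitivity and its confidence interval can be checked from the counts in the abstract (415 spine injuries, 33 not immobilized). A quick normal-approximation sketch; note the abstract's 94.6% upper bound follows if the point estimate is first rounded to 92%:

```python
import math

injuries = 415
missed = 33                    # injured patients not immobilized
detected = injuries - missed   # 382 correctly immobilized

sens = detected / injuries     # point estimate of sensitivity
se = math.sqrt(sens * (1 - sens) / injuries)
lo, hi = sens - 1.96 * se, sens + 1.96 * se

print(f"sensitivity = {sens:.1%} (95% CI {lo:.1%} to {hi:.1%})")
# -> sensitivity = 92.0% (95% CI 89.4% to 94.7%), consistent with
#    the reported 89.4% to 94.6% up to rounding of the point estimate
```

The specificity CI cannot be reproduced the same way because the abstract does not give the exact count of protocol-negative patients without injury.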


Subject(s)
Clinical Protocols , Emergency Medical Services , Restraint, Physical/statistics & numerical data , Spinal Injuries/therapy , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Female , Humans , Infant , Male , Middle Aged , Prospective Studies , Sensitivity and Specificity , Spinal Cord Injuries/etiology , Spinal Cord Injuries/prevention & control , Spinal Fractures/therapy , Spinal Injuries/complications , Trauma Severity Indices , Treatment Outcome
18.
BMC Pregnancy Childbirth ; 5(1): 9, 2005 May 05.
Article in English | MEDLINE | ID: mdl-15876345

ABSTRACT

BACKGROUND: Death of an infant in utero or at birth has always been a devastating experience for the mother and a concern in clinical practice. Infant mortality remains a challenge in the care of pregnant women worldwide, particularly in developing countries, and understanding contributory factors is crucial for delivering appropriate perinatal care. METHODS: Using information available in obstetric records for all deliveries (17,072 births) at Harare Maternity Hospital, Zimbabwe, we conducted a cross-sectional retrospective analysis of one year of data (1997-1998) to assess demographic and obstetric risk factors for stillbirth and early neonatal death. We estimated the risk of stillbirth and early neonatal death for each potential risk factor. RESULTS: The annual frequency of stillbirth was 56 per 1,000 total births. Women delivering stillbirths and early neonatal deaths were less likely to have received prenatal care (adjusted relative risk [RR] = 2.54; 95% confidence interval [CI] 2.19-2.94 and RR = 2.52; 95% CI 1.63-3.91, respectively); for combined stillbirths and early neonatal deaths, this risk increased with increasing gestational age (hazard ratio [HR] = 3.98 and HR = 7.49 at 28 and 40 weeks of gestation, respectively). Rural residence was associated with the risk of the infant dying in utero (RR = 1.33; 95% CI 1.12-1.59), and the risk of death increased with increasing gestational age (HR = 1.04 and HR = 1.69 at 28 and 40 weeks of gestation, respectively). Older maternal age was associated with risk of death (HR = 1.50; 95% CI 1.21-1.84). Stillbirths were less likely to be delivered by Cesarean section (RR = 0.64; 95% CI 0.51-0.79) but more likely to be delivered as breech (RR = 4.65; 95% CI 3.88-5.57), as were early neonatal deaths (RR = 3.38; 95% CI 1.64-6.96). CONCLUSION: The frequency of stillbirth, especially macerated stillbirth (27 per 1,000 total births), is high.
Early prenatal care could help reduce perinatal death by linking the woman to the health care system, increasing the probability that she would seek timely emergency care and thereby reducing the likelihood of her infant dying in utero. Improved quality of obstetric care during labor and delivery may help reduce the number of fresh stillbirths and early neonatal deaths.

19.
J Am Coll Cardiol ; 45(3): 381-7, 2005 Feb 01.
Article in English | MEDLINE | ID: mdl-15680716

ABSTRACT

OBJECTIVES: This study was designed to assess effects of mitral valve annuloplasty (MVA) on mortality in patients with mitral regurgitation (MR) and left ventricular (LV) systolic dysfunction. BACKGROUND: Mitral valve annuloplasty improves hemodynamics and symptoms in these patients, but effects on long-term mortality are not well established. METHODS: We retrospectively analyzed consecutive patients with significant MR and LV systolic dysfunction on echocardiography between 1995 and 2002. Cox regression analysis, including MVA as a time-dependent covariate and propensity scoring to adjust for differing probabilities of undergoing MVA, was used to identify predictors of death, LV assist device implantation, or United Network for Organ Sharing-1 heart transplantation. RESULTS: Of 682 patients identified, 419 were deemed surgical candidates; 126 underwent MVA. Propensity score derivation identified age, ejection fraction, and LV dimension to be associated with undergoing MVA. End points were reached in 120 (41%) non-MVA and 62 (49%) MVA patients. Increased risk of end point was associated with coronary artery disease (hazard ratio [HR] 1.80, 95% confidence interval [CI] 1.30 to 2.49), blood urea nitrogen (HR 1.01, 95% CI 1.005 to 1.02), cancer (HR 2.77, 95% CI 1.45 to 5.30), and digoxin (HR 1.66, 95% CI 1.15 to 2.39). Reduced risk was associated with angiotensin-converting enzyme inhibitors (HR 0.65, 95% CI 0.44 to 0.95), beta-blockers (HR 0.59, 95% CI 0.42 to 0.83), mean arterial pressure (HR 0.98, 95% CI 0.97 to 0.99), and serum sodium (HR 0.93, 95% CI 0.90 to 0.96). Mitral valve annuloplasty did not predict clinical outcome. CONCLUSIONS: In this analysis, there is no clearly demonstrable mortality benefit conferred by MVA for significant MR with severe LV dysfunction. A prospective randomized control trial is warranted for further study of mortality with MVA in this population.
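Hazard ratios such as those above are estimated on the log scale, so a reported CI should be symmetric around log(HR). A quick sketch recovering the implied standard error of the Cox coefficient from the coronary artery disease estimate (HR 1.80, 95% CI 1.30 to 2.49):

```python
import math

hr, lo, hi = 1.80, 1.30, 2.49   # reported point estimate and 95% CI

beta = math.log(hr)                              # Cox coefficient
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # implied SE of beta

# Rebuild the interval from beta and se as a consistency check.
lo_check = math.exp(beta - 1.96 * se)
hi_check = math.exp(beta + 1.96 * se)

print(f"beta = {beta:.3f}, SE = {se:.3f}")
# -> beta = 0.588, SE = 0.166
print(f"reconstructed CI: {lo_check:.2f} to {hi_check:.2f}")
# -> reconstructed CI: 1.30 to 2.49
```

The same back-calculation works for any of the HRs or RRs in these abstracts, which is a handy sanity check that an asymmetric-looking interval was in fact computed on the log scale.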


Subject(s)
Mitral Valve Insufficiency/mortality , Mitral Valve Insufficiency/surgery , Mitral Valve/surgery , Ventricular Dysfunction, Left/complications , Ventricular Dysfunction, Left/mortality , Aged , Aged, 80 and over , Disease-Free Survival , Female , Follow-Up Studies , Humans , Male , Middle Aged , Mitral Valve Insufficiency/etiology , Proportional Hazards Models , Retrospective Studies , Severity of Illness Index
20.
Am J Hum Biol ; 16(5): 523-32, 2004.
Article in English | MEDLINE | ID: mdl-15368600

ABSTRACT

We report here on a longitudinal study of stress and women's reproduction in a small Kaqchikel Mayan community in rural Guatemala. Current understanding of the effects of stress on the reproductive axis in women is mostly derived from clinical studies of individual stressors. Little is known, however, about the cumulative effects of "real life" stress. Cortisol increases in response to a broad variety of individual stressors (Tilbrook et al., 2002). In this article, we evaluate the association between daily fluctuations in women's urinary cortisol and reproductive hormones: estrone conjugates (E1C), pregnanediol glucuronide (PdG), luteinizing hormone (LH), and follicle stimulating hormone (FSH). To assess the association between daily changes in cortisol levels and changes in the profiles of the reproductive hormones, we used a random coefficients model based on polynomial regression. The sample includes 92 menstrual cycles provided by 24 participants over a year-long prospective study. Increases in urinary cortisol levels were associated with significant increases in gonadotrophin and progestin levels during the follicular phase. Also, in a time window between days 4 and 10 after ovulation, increased cortisol levels were associated with significantly lower progestin levels. These results are significant because untimely increases in gonadotrophins and low midluteal progesterone levels have previously been reported to impinge on the ovulatory and luteinization processes and to reduce the chances of successful implantation (Ferin, 1999; Baird et al., 1999). Future research should consider the possibility that stress may affect fecundability and implantation without necessarily causing amenorrhoea or oligomenorrhoea.


Subject(s)
Gonadotropins/metabolism , Hydrocortisone/metabolism , Menstrual Cycle/physiology , Reproductive History , Stress, Psychological , Adolescent , Adult , Circadian Rhythm , Developing Countries , Female , Gonadotropins/analysis , Humans , Hydrocortisone/analysis , Longitudinal Studies , Malaysia , Ovulation Prediction , Probability , Prospective Studies , Risk Assessment , Rural Population