Results 1 - 16 of 16
1.
Environ Toxicol Chem ; 39(9): 1746-1754, 2020 09.
Article in English | MEDLINE | ID: mdl-32539159

ABSTRACT

Human-dominated waterways contain thousands of chemicals. Determining which chemicals are the most important stressors is critical, yet very challenging. The Toxicity Identification Evaluation (TIE) procedure from the US Environmental Protection Agency uses a series of chemical and physical manipulations to fractionate compounds within a matrix and systematically identify potential toxicants through laboratory bioassay testing. Although this may provide useful information, it lacks ecological realism because it is subject to laboratory-related artifacts and is resource intensive. The in situ Toxicity Identification Evaluation (iTIE) technology was developed to improve this approach and has undergone a number of modifications over the past several years. The novel prototype 3 consists of an array of iTIE ambient water fractionation units. Each unit is connected to a peristaltic pumping system with an organism exposure chamber that receives water from a resin chamber to chemically fractionate test site water. Test organisms included freshwater and marine standard toxicity test species. Postfractionation waters are collected for subsequent chemical analyses. Currently, the resins allow for separation of ammonia, metals, and nonpolar organics; the subsequent toxicity responses are compared between treatments and unfractionated, ambient exposures. The iTIE system was deployed to a depth of 3 m and evaluated in streams and marine harbors. Chemical analyses of water and iTIE chemical sorptive resins confirmed the chemical groups causing lethal and sublethal responses. The system proved to be as sensitive as, or more sensitive than, the traditional phase 1 TIE test and required roughly half the resources to complete. This iTIE prototype provides a robust technology that improves stressor-causality linkages and thereby supports strong evidence for ecological risk weight-of-evidence assessments. Environ Toxicol Chem 2020;39:1746-1754. © 2020 SETAC.


Subject(s)
Environmental Monitoring/methods , Environmental Pollution/analysis , Toxicity Tests , Ammonia/analysis , Animals , Bivalvia/drug effects , Bivalvia/embryology , Cost-Benefit Analysis , Embryo, Nonmammalian/drug effects , Endpoint Determination , Fresh Water/chemistry , Geologic Sediments/chemistry , Humans , Larva/drug effects , Rivers , Sea Urchins/drug effects , Sea Urchins/embryology , Water Pollutants, Chemical/toxicity
2.
J Great Lakes Res ; 44(4): 725-734, 2018 Aug.
Article in English | MEDLINE | ID: mdl-30319172

ABSTRACT

Incorporation of fish age into the assessment of status and trends for persistent, bioaccumulative and toxic chemicals in the Great Lakes has become an important step for the U.S. EPA's Great Lakes Fish Monitoring and Surveillance Program (GLFMSP). A slowing in the rate of decline for total PCBs in Lake Huron beginning in 2000 led the Program to complete a retrospective analysis to assess how chemical contamination may be influenced by fish age. Analytical results suggest that fish age is an important variable when assessing contaminant trends and that the Program needed to revise its compositing scheme to group fish according to age, rather than by length, prior to homogenization and chemical analysis. An interlaboratory comparison study of multiple age structures was performed to identify the most appropriate age estimation structure for the Program. The lake trout (Salvelinus namaycush) maxilla was selected, over the otolith, as the most precise, accurate, and rapidly assessed structure for the Program when compared between laboratories and against the known age from the coded wire tag (CWT). Age-normalization practices can now be implemented when assessing contaminant concentrations and trends for the GLFMSP.

3.
J Health Care Poor Underserved ; 28(2): 694-706, 2017.
Article in English | MEDLINE | ID: mdl-28529218

ABSTRACT

PURPOSE: To characterize the quality of health care at student-run free clinics (SRFCs) by analyzing hypertension management and outcomes at the Indiana University Student Outreach Clinic (IUSOC). METHODS: A retrospective review of medical records was conducted for hypertensive patients managed at IUSOC over 15 months (N = 64). Indiana University Student Outreach Clinic's hypertension control rate was compared with National Health and Nutrition Examination Survey (NHANES) data. RESULTS: Blood pressure control rates increased significantly over the study period. Indiana University Student Outreach Clinic's control rate did not differ significantly from the NHANES national average, but was significantly greater than that of the NHANES group with no usual source of care. Similarly, IUSOC patients without insurance or with unknown insurance status had greater control rates than an uninsured NHANES group, but did not differ significantly from an insured NHANES group. CONCLUSIONS: Despite unfavorable demographic characteristics, outcomes for patients with hypertension who used IUSOC as a regular provider of primary care compared favorably with national data.


Subject(s)
Antihypertensive Agents/therapeutic use , Hypertension/drug therapy , Primary Health Care/organization & administration , Quality of Health Care , Student Run Clinic/organization & administration , Adult , Aged , Antihypertensive Agents/administration & dosage , Blood Pressure , Body Mass Index , Diabetes Mellitus/epidemiology , Female , Health Risk Behaviors , Humans , Hypertension/epidemiology , Hypertension/therapy , Indiana , Insurance Coverage/statistics & numerical data , Insurance, Health/statistics & numerical data , Male , Middle Aged , Nutrition Surveys , Primary Health Care/standards , Retrospective Studies , Sex Factors , Socioeconomic Factors , Student Run Clinic/standards , Students, Medical
4.
J Occup Environ Hyg ; 13(12): 980-992, 2016 12.
Article in English | MEDLINE | ID: mdl-27362274

ABSTRACT

Sample collection procedures and primary receptacle (sample container and bag) decontamination methods should prevent contaminant transfer between contaminated and non-contaminated surfaces and areas during bio-incident operations. Cross-contamination of personnel, equipment, or sample containers may result in the exfiltration of biological agent from the exclusion (hot) zone and have unintended negative consequences on response resources, activities and outcomes. The current study was designed to: (1) evaluate currently recommended sample collection and packaging procedures to identify procedural steps that may increase the likelihood of spore exfiltration or contaminant transfer; (2) evaluate the efficacy of currently recommended primary receptacle decontamination procedures; and (3) evaluate the efficacy of outer packaging decontamination methods. Wet- and dry-deposited fluorescent tracer powder was used in contaminant transfer tests to qualitatively evaluate the currently recommended sample collection procedures. Bacillus atrophaeus spores, a surrogate for Bacillus anthracis, were used to evaluate the efficacy of spray- and wipe-based decontamination procedures. Both decontamination procedures were quantitatively evaluated on three types of sample packaging materials (corrugated fiberboard, polystyrene foam, and polyethylene plastic) and two contamination mechanisms (wet or dry inocula). Contaminant transfer results suggested that size-appropriate gloves should be worn by personnel, templates should not be taped to or removed from surfaces, and primary receptacles should be selected carefully. The decontamination tests indicated that wipe-based decontamination procedures may be more effective than spray-based procedures; efficacy was not influenced by material type but was affected by the inoculation method. Incomplete surface decontamination was observed in all tests with dry inocula. This study provides a foundation for optimizing current B. anthracis response procedures to minimize contaminant exfiltration.


Subject(s)
Bacillus anthracis , Containment of Biohazards/instrumentation , Containment of Biohazards/methods , Decontamination/methods , Specimen Handling/methods , Spores, Bacterial , Gloves, Protective , Materials Testing
5.
Metabolomics ; 11(5): 1302-1315, 2015.
Article in English | MEDLINE | ID: mdl-26366138

ABSTRACT

Zebra mussel, Dreissena polymorpha, in the Great Lakes is being monitored as a bio-indicator organism for environmental health effects by the National Oceanic and Atmospheric Administration's Mussel Watch program. To monitor the environmental effects of industrial pollution on the ecosystem, invasive zebra mussels were collected from four stations: three inner harbor sites (LMMB4, LMMB1, and LMMB) in the Milwaukee Estuary, and one reference site (LMMB5) in Lake Michigan, Wisconsin. Nuclear magnetic resonance (NMR)-based metabolomics was used to evaluate the metabolic profiles of the mussels from these four sites. The objective was to determine whether metabolite profiles differed between the impacted sites and the reference site, and among the impacted sites themselves. Principal component analyses indicated there was no significant difference between the two impacted sites in the north Milwaukee harbor (LMMB and LMMB4) and the LMMB5 reference site. However, significant metabolic differences were observed between the impacted site in the south Milwaukee harbor (LMMB1) and the LMMB5 reference site, a finding that correlates with preliminary sediment toxicity results. A total of 26 altered metabolites (including two unidentified peaks) were identified in a comparison of zebra mussels from the LMMB1 site and the LMMB5 reference site. The application of both uni- and multivariate analysis not only confirmed the variability of altered metabolites but also ensured that these metabolites were identified via unbiased analysis. This study has demonstrated the feasibility of the NMR-based metabolomics approach to assess whole-body metabolomics of zebra mussels to study the physiological impact of toxicant exposure at field sites.

6.
PLoS One ; 10(9): e0138083, 2015.
Article in English | MEDLINE | ID: mdl-26372011

ABSTRACT

Few data exist on how the viability of biological agents may degrade over time in different environments. In this study, experiments were conducted to determine the persistence of Bacillus anthracis and Bacillus subtilis spores on outdoor materials with and without exposure to simulated sunlight, using ultraviolet (UV)-A/B radiation. Spores were inoculated onto glass, wood, concrete, and topsoil and recovered after periods of 2, 14, 28, and 56 days. Recovery and inactivation kinetics for the two species were assessed for each surface material and UV exposure condition. Results suggest that with exposure to UV, decay of spore viability for both Bacillus species occurs in two phases: an initial rapid decline followed by a slower inactivation period. The exception was topsoil, on which there was minimal loss of spore viability over 56 days, with or without UV exposure. The greatest loss in viable spore recovery occurred on glass with UV exposure, with nearly a four log10 reduction after just two days. In most cases, B. subtilis decayed more slowly than B. anthracis, although fewer B. subtilis spores were recovered initially.
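The log10-reduction metric reported above is simply the base-10 logarithm of the ratio of initial to recovered viable counts. A minimal sketch with illustrative numbers (not the study's data or code):

```python
import math

def log10_reduction(initial_cfu, recovered_cfu):
    """Log10 reduction in viable counts (CFU) between two time points."""
    return math.log10(initial_cfu / recovered_cfu)

# Illustrative numbers only: a ~4-log10 reduction means roughly
# 1 in 10,000 spores remained recoverable.
print(log10_reduction(1e6, 1e2))  # 4.0
```

By this convention, each whole-number increase corresponds to a further tenfold loss of recoverable spores.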


Subject(s)
Bacillus anthracis/physiology , Bacillus anthracis/radiation effects , Bacillus subtilis/physiology , Bacillus subtilis/radiation effects , Ultraviolet Rays , Dose-Response Relationship, Radiation , Kinetics , Microbial Viability/radiation effects , Porosity , Species Specificity , Spores, Bacterial/physiology , Spores, Bacterial/radiation effects
7.
PLoS One ; 9(12): e114082, 2014.
Article in English | MEDLINE | ID: mdl-25470365

ABSTRACT

A series of experiments was conducted to explore the utility of composite-based collection of surface samples for the detection of a Bacillus anthracis surrogate using cellulose sponge samplers on a nonporous stainless steel surface. Two composite-based collection approaches were evaluated over a surface area of 3716 cm2 (four separate 929 cm2 areas), larger than the 645 cm2 prescribed by the standard Centers for Disease Control and Prevention (CDC) cellulose sponge sampling protocol for use on nonporous surfaces. The CDC method was also compared to a modified protocol in which only one surface of the sponge sampler was used for each of the four areas composited. Differences in collection efficiency compared to positive controls and the potential for contaminant transfer for each protocol were assessed. The impact of the loss of wetting buffer from the sponge sampler onto additional surface areas sampled was evaluated. Statistical tests of the results using ANOVA indicate that the collection of composite samples using the modified sampling protocol is comparable to the collection of composite samples using the standard CDC protocol (p = 0.261). Most of the surface-bound spores are collected on the first sampling pass, suggesting that multiple passes with the sponge sampler over the same surface may be unnecessary. The effect of moisture loss from the sponge sampler on collection efficiency was not significant (p = 0.720) for either method. Contaminant transfer occurs with both sampling protocols, but the magnitude of transfer is significantly greater when using the standard protocol than when the modified protocol is used (p < 0.001). The results of this study suggest that composite surface sampling, by either method presented here, could successfully be used to increase the surface area sampled per sponge sampler, resulting in reduced sampling times in the field and decreased laboratory processing cost and turnaround times.


Subject(s)
Bacillus anthracis/isolation & purification , Cellulose/chemistry , Specimen Handling/methods , Bacillus anthracis/physiology , Environmental Monitoring , Equipment Contamination , Humidity , Specimen Handling/standards , Spores, Bacterial/isolation & purification , Surface Properties
8.
Behav Med ; 35(4): 112-25, 2010.
Article in English | MEDLINE | ID: mdl-19933058

ABSTRACT

African Americans have greater misperceptions about heart failure (HF) than Caucasians. We examined socioeconomic and medical history factors to determine if they explain differences in accuracy of HF illness beliefs by race. A total of 519 patients completed an illness beliefs and socioeconomic status survey. After establishing univariate associations by race, linear regression with backward selection was used to identify factors associated with the accuracy of HF illness beliefs. HF illness beliefs were less accurate among African Americans (p < .01). In multivariate models, race remained a predictor of HF illness beliefs accuracy, as did education level and living status (all p ≤ .01). Illness beliefs of African Americans were inaccurate and independently associated with social support and education level. Health care providers must consider patient education processes as a possible cause of these differences and focus on what and how they teach, literacy level, materials used, and family engagement and education.


Subject(s)
Attitude to Health/ethnology , Black or African American , Heart Failure/ethnology , Heart Failure/psychology , Racial Groups , Age Factors , Analysis of Variance , Cohort Studies , Databases, Factual , Educational Status , Female , Humans , Interviews as Topic , Linear Models , Male , Social Support , Socioeconomic Factors , Surveys and Questionnaires , United States
9.
Clin J Am Soc Nephrol ; 4(10): 1575-83, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19808242

ABSTRACT

BACKGROUND AND OBJECTIVES: GFR is scaled to body surface area (S), whereas hemodialysis dosage is scaled to total body water (V). Scaling to metabolic rate (M) or liver size (L) has also been proposed. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: In 1551 potential kidney donors (662 men and 889 women) for whom GFR had been estimated from (125)I-iothalamate clearance (iGFR) between the years 1973 and 2005, iGFR scaling was examined. Scaling was to estimates of S, V, M, or L. The study examined the variation of iGFR by gender, age, S, V, M, and L within the study population. RESULTS: In multiple regression analysis, neither gender nor race was significantly associated with iGFR after controlling for height, weight, and age. Raw iGFR averaged 122 +/- 23 ml/min in men and 106 +/- 21 ml/min in women (P < 0.001). In an adjusted analysis, iGFR scaled to S or L was similar for men and women (NS), whereas iGFR scaled to either V or M was substantially different between the genders (P < 0.001). When the patients of each gender were divided into quintiles of V or S, the iGFR-V ratio varied more with body size than iGFR scaled to the other measures. CONCLUSIONS: iGFR scaled to S or L was similar in men and women. Scaling to either M or V resulted in a sizeable gender difference, whereas scaling to V led to markedly different values of iGFR across body size.


Subject(s)
Body Surface Area , Body Water/metabolism , Glomerular Filtration Rate , Liver/anatomy & histology , Tissue Donors , Adult , Age Factors , Body Height , Body Weight , Female , Humans , Male , Middle Aged , Organ Size , Regression Analysis
10.
Kidney Int ; 75(10): 1079-87, 2009 May.
Article in English | MEDLINE | ID: mdl-19212414

ABSTRACT

Due to the shortage of organs, living donor acceptance criteria are becoming less stringent. An accurate determination of the glomerular filtration rate (GFR) is critical in the evaluation of living kidney donors and a value exceeding 80 ml/min per 1.73 m(2) is usually considered suitable. To improve strategies for kidney donor screening, an understanding of factors that affect GFR is needed. Here we studied the relationships between donor GFR measured by (125)I-iothalamate clearances (mGFR) and age, gender, race, and decade of care in living kidney donors evaluated at the Cleveland Clinic from 1972 to 2005. We report the normal reference ranges for 1057 prospective donors (56% female, 11% African American). Females had slightly higher mGFR than males after adjustment for body surface area, but there were no differences due to race. The lower limit of normal for donors (5th percentile) was less than 80 ml/min per 1.73 m(2) for females over age 45 and for males over age 40. We found a significant doubling in the rate of GFR decline in donors over age 45 as compared to younger donors. The age of the donors and body mass index increased over time, but their mGFR, adjusted for body surface area, significantly declined by 1.49+/-0.61 ml/min per 1.73 m(2) per decade of testing. Our study shows that age and gender are important factors determining normal GFR in living kidney donors.


Subject(s)
Glomerular Filtration Rate , Kidney Transplantation/standards , Living Donors , Adult , Black or African American , Age Factors , Female , Humans , Living Donors/supply & distribution , Male , Middle Aged , Reference Values , Sex Factors , White People
11.
Appl Nurs Res ; 21(4): 181-90, 2008 Nov.
Article in English | MEDLINE | ID: mdl-18995159

ABSTRACT

BACKGROUND: Patients and visitors may perceive nurses as professional based on uniform color and style. Nurse image may affect patient and visitor trust and satisfaction with nursing care. Fitted white dresses have been replaced by loose-fitting or scrub white, colored, or patterned pant sets. OBJECTIVES: This study examines nurse professionalism by assessing the nurse image traits of eight pant uniforms as perceived by pediatric patients, adult patients, and adult visitors. We also examined whether uniform preference is congruent with nurse image traits. METHOD: A convenience sample of 499 patients and visitors was surveyed at a large Midwestern tertiary health care center. Subjects viewed photographs of the same registered nurse identically posed in eight uniforms and rated each by image traits. Kruskal-Wallis, Steel-Dwass multiple comparison method, and Wilcoxon signed-rank sum tests were used to test for differences in the Nurse Image Scale (NIS) score by uniform style and color and subject demographics. RESULTS: Subjects were 390 adult patients and visitors (78%) and 109 pediatric patients (21.4%); 66% were female, and 78% were Caucasian. In adults, NIS scores for white uniforms (two styles) were higher than NIS scores for uniforms with small print, bold print, or solid color (all p < .001). White uniform NIS score increased with subject age (all p ≤ .007). In pediatric patients (7-17 years) and young adults (18-44 years), the highest uniform NIS scores did not differ significantly from the others. Uniform preference differed from NIS score in pediatric and adult subjects, reflecting noncongruence between the perception of nurse professionalism by uniform and uniform preference. DISCUSSION: With aging, adults form perceptions of nurse professionalism based on uniform color and style. Traits of nurse professionalism were highest in white uniforms. Future research is needed to determine if transition to white nurse uniforms improves patient and family satisfaction with nursing care.


Subject(s)
Clothing , Family Nursing , Nurse-Patient Relations , Pediatric Nursing , Social Perception , Adolescent , Adult , Aged , Aged, 80 and over , Attitude to Health , Child , Female , Humans , Male , Middle Aged , Professional Practice , Young Adult
12.
Transplantation ; 86(2): 223-30, 2008 Jul 27.
Article in English | MEDLINE | ID: mdl-18645483

ABSTRACT

BACKGROUND: Accurate determination of kidney function is critical in the evaluation of living kidney donors, and higher donor glomerular filtration rate (GFR) is associated with better allograft outcomes. However, donor kidney function evaluation varies widely among transplant centers. METHODS: The performance of creatinine clearance (CrCl), Modification of Diet in Renal Disease (MDRD), the re-expressed MDRD equations with standardized creatinine, and the Cockcroft-Gault (CG) formula as compared with (125)I-iothalamate GFR (iGFR) was analyzed in 423 donors. All methods of GFR measurement were then evaluated for their association with graft function at 1 year. RESULTS: The MDRD and re-expressed MDRD equations underestimated iGFR, whereas CG showed minimal bias (median difference = -11.0, -16.3, and -0.5 mL/min/1.73 m(2), respectively). CrCl overestimated iGFR (10 mL/min/1.73 m(2)). The MDRD, re-expressed MDRD, and CG formulas were more accurate (88%, 86%, and 88% of estimates within 30% of iGFR, respectively) than CrCl (80% within 30% of iGFR). Interestingly, low bias and high accuracy were achieved by averaging the MDRD estimate with the CrCl result; both methods are available to the clinician in most transplant centers. We also showed that predonation GFR as measured by isotopic renal clearance or any of the creatinine-based estimation formulas may be associated with allograft function at 1 year, whereas the widely used CrCl was not. CONCLUSIONS: Variable performance was seen among the different GFR estimations, with CrCl being the poorest. Recent recommendations to use the MDRD equation with standardized serum creatinine did not improve its performance. However, recognizing the limited availability of GFR laboratories, these methods are still clinically useful if applied with caution and with an understanding of their limitations.
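For readers unfamiliar with the estimators compared above, the Cockcroft-Gault formula and body-surface-area (BSA) normalization to 1.73 m2 can be sketched as follows. This is a generic illustration with hypothetical inputs, not the study's code or data:

```python
def cockcroft_gault(age_years, weight_kg, scr_mg_dl, female):
    """Cockcroft-Gault creatinine clearance estimate, mL/min."""
    crcl = (140 - age_years) * weight_kg / (72 * scr_mg_dl)
    return 0.85 * crcl if female else crcl

def bsa_dubois(weight_kg, height_cm):
    """DuBois body surface area, m^2."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def per_1_73_m2(clearance_ml_min, bsa_m2):
    """Normalize a clearance to the conventional 1.73 m^2 BSA."""
    return clearance_ml_min * 1.73 / bsa_m2

# Hypothetical donor: 40-year-old male, 72 kg, 175 cm, creatinine 1.0 mg/dL.
cg = cockcroft_gault(40, 72, 1.0, female=False)
print(round(cg, 1))  # 100.0
print(round(per_1_73_m2(cg, bsa_dubois(72, 175)), 1))
```

Note that Cockcroft-Gault natively returns mL/min (not indexed to BSA), which is one reason its bias relative to iGFR differs from that of the MDRD equations.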


Subject(s)
Creatinine/blood , Creatinine/urine , Glomerular Filtration Rate , Kidney Transplantation/methods , Adult , Cohort Studies , Female , Humans , Kidney Function Tests , Living Donors , Male , Middle Aged , ROC Curve , Retrospective Studies , Sensitivity and Specificity , Treatment Outcome
13.
J Am Acad Dermatol ; 57(4): 581-7, 2007 Oct.
Article in English | MEDLINE | ID: mdl-17610990

ABSTRACT

BACKGROUND: Complications associated with psoriatic arthritis (PsA) may be prevented with early diagnosis and initiation of therapy. Up to one third of psoriasis patients may have PsA. There is a need to screen psoriasis patients early for symptoms of PsA. OBJECTIVE: To develop and validate a patient self-administered tool to screen psoriasis patients for signs and symptoms of inflammatory arthritis. METHODS: The questionnaire (PASE; Psoriatic Arthritis Screening and Evaluation) was developed using standardized methodology for the development of both functional and health-related instruments geared toward musculoskeletal diseases. A multidisciplinary team of dermatologists, rheumatologists, and patient focus groups was involved in the design of the questionnaire. RESULTS: A total of 69 participants with known psoriasis and PsA before the initiation of systemic therapy were screened with PASE after institutional review board approval. The average age was 51 years, and 51% of the participants were female. A total of 25% (17/69) were diagnosed with PsA in this study, and 37% (24/69) were diagnosed with osteoarthritis. Patients with concomitant PsA and osteoarthritis were excluded. PASE total scores ranged from 23 to 68 (possible range, 15-75). In patients with PsA, the median total score was 53 (25th and 75th percentiles, 49 and 63); in non-PsA patients it was 39 (25th and 75th percentiles, 28 and 47) (P < .001). The median PASE total score for osteoarthritis patients was 43 (25th and 75th percentiles, 37 and 51), significantly different from the PsA patients' median total score (P = .002). Using receiver operator curves, we determined that a PASE total score ≥47 was able to distinguish PsA from non-PsA patients with 82% sensitivity and 73% specificity. LIMITATIONS: PASE is a screening tool for PsA and does not replace a comprehensive musculoskeletal evaluation by a rheumatologist. CONCLUSION: The PASE questionnaire is a self-administered tool that can be used to screen for PsA among patients with psoriasis. PASE can distinguish between symptoms of PsA and osteoarthritis. A larger study is needed to validate PASE in dermatology clinics in the community.
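The sensitivity and specificity reported for the cutoff of 47 follow from the standard confusion-matrix definitions for a score-threshold rule. A minimal sketch with made-up scores and labels (not the study's data):

```python
def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of a 'score >= cutoff' classifier.

    labels: True for condition-positive cases (illustrative only).
    """
    tp = sum(1 for s, y in zip(scores, labels) if y and s >= cutoff)
    fn = sum(1 for s, y in zip(scores, labels) if y and s < cutoff)
    tn = sum(1 for s, y in zip(scores, labels) if not y and s < cutoff)
    fp = sum(1 for s, y in zip(scores, labels) if not y and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical PASE-like scores and disease labels, cutoff of 47.
scores = [55, 50, 60, 40, 30, 48]
labels = [True, True, True, False, False, False]
sens, spec = sens_spec(scores, labels, 47)
print(round(sens, 2), round(spec, 2))  # 1.0 0.67
```

Sweeping the cutoff over all observed scores and plotting sensitivity against 1 - specificity yields the receiver operating characteristic curve the authors used to select the threshold.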


Subject(s)
Arthritis, Psoriatic/diagnosis , Mass Screening/methods , Surveys and Questionnaires , Female , Humans , Male , Middle Aged , Pilot Projects
14.
Nephrol Dial Transplant ; 22(8): 2304-15, 2007 Aug.
Article in English | MEDLINE | ID: mdl-17510100

ABSTRACT

BACKGROUND: In 1995, we described the technique of adapting a haemodialysis (HD) machine to produce a composition-adjustable, bicarbonate-based fluid (as our primary source for dialysate) for continuous HD in intensive care unit (ICU) patients with acute renal failure (ARF). Here we describe the clinical effects, biochemical changes and economic costs of this practice in a large cohort of patients at a single centre over the last 10 years. METHODS: The CCF-ARF Support Registry (1995-2001) was used to identify 405 patients initially supported with bicarbonate continuous HD. The registry is a prospective, observational cohort database that captures demographic, dialysis therapy, laboratory and outcome data. All supported ARF patients were recorded from 1995-98, and then one in five patients from 1999 to 2001. We also reviewed records of the individual dialysis procedures, dialysate disposal, dialysate monitoring tests and specific costs. RESULTS: Continuous HD was performed for 1292 +/- 587 days from 1994 to 2004. Demographics [age 59.57 +/- 14.41 years, weight 84.2 +/- 24 kg, male 65%, chronic kidney disease (CKD) 34%] and ICU mortality (60.5%) were comparable to other reported series. Day 4 solute [BUN 52.3 mg/dl (95% CI 49.6-54.9), creatinine 2.79 mg/dl (95% CI 2.64-2.95)], electrolyte and acid-base balance [bicarbonate 24.12 mmol/l (95% CI 23.7-24.6)] were well controlled. Dialysate monitoring revealed no positive cultures or elevated endotoxin levels. Variable-composition dialysate was achieved and delivered to all patients without adverse consequences. The cost of dialysate actually declined over time (1995 = $0.91/l, 2005 = $0.67/l). CONCLUSION: We have demonstrated that ICU ARF patients can be safely, effectively and economically supported with continuous HD using this dialysate source.


Subject(s)
Acute Kidney Injury/therapy , Dialysis Solutions/chemistry , Aged , Bicarbonates/pharmacology , Cohort Studies , Electrolytes , Female , Glucose/metabolism , Humans , Intensive Care Units , Male , Middle Aged , Prospective Studies , Time Factors , Treatment Outcome
15.
J Heart Lung Transplant ; 26(5): 466-71, 2007 May.
Article in English | MEDLINE | ID: mdl-17449415

ABSTRACT

BACKGROUND: Hepatitis C virus (HCV) infects 4 million people in the USA, with a prevalence of 1.4%. The seropositivity rate among potential lung transplant candidates is 1.9%, yet little information is available regarding outcomes of lung transplantation in HCV-positive lung transplant recipients. Our study reports outcomes of lung transplantation in HCV-positive recipients and compares them to HCV-negative controls. METHODS: A retrospective analysis of the Cleveland Clinic Foundation's lung transplant database (465 patients) identified six HCV-positive patients. Demographic data, etiology of HCV infection, HCV viral load pre- and post-transplant, pre-transplant hepatic pathology, serial transaminases, incidence of acute hepatitis, graft function data and patient survival data were obtained by chart extraction. RESULTS: Five HCV-positive recipients had a pre-transplant liver biopsy, none of whom had evidence of cirrhosis pre-transplant. Although HCV RNA levels markedly increased post-transplant, no concomitant increase in transaminases was noted. There was no significant difference in the incidence of acute rejection at 1 year in our HCV-positive cohort compared with the HCV-negative lung transplant recipients from our institution. One patient developed bronchiolitis obliterans syndrome (BOS) during the follow-up period. Two patient deaths occurred, one at 8 months and the other at 2 years post-transplant. No evidence of hepatic dysfunction was noted in either deceased patient. The four surviving patients are alive at a median 3.2 years (range 1 to 6 years). CONCLUSIONS: No significant difference in patient or graft survival was noted between the HCV-positive lung transplant recipients and the HCV-negative recipients.


Subject(s)
Cause of Death , Hepatitis C, Chronic/diagnosis , Hepatitis C, Chronic/mortality , Lung Transplantation/mortality , Adult , Cohort Studies , Female , Graft Rejection , Graft Survival , Humans , Liver Function Tests , Lung Transplantation/adverse effects , Lung Transplantation/methods , Male , Middle Aged , Probability , Reference Values , Retrospective Studies , Risk Assessment , Severity of Illness Index , Survival Rate , Time Factors
16.
Nurs Econ ; 25(6): 339-44, 2007.
Article in English | MEDLINE | ID: mdl-18240835

ABSTRACT

To create new opportunities for nurses to reenter the workforce, an RN p.r.n. program must meet the needs of nurses who wish to make dual commitments to home and work. The Parent Shift program provides an innovative model for attracting and retaining nurses in a hospital workforce. Many nurses who joined the program had been away from the field of nursing for many years and were drawn to the program by the promise of flexible shifts and minimal requirements for participation. In this study, flexible shifts not only encouraged program entry but were also a powerful motivator for continued program participation over time. Parent Shift nurse presence was perceived by nurse managers to decrease stressors and improve the time efficiency of full-shift staff.


Subject(s)
Employee Incentive Plans , Nursing Staff, Hospital/organization & administration , Parents , Personnel Staffing and Scheduling , Female , Humans , Job Satisfaction , Middle Aged , Midwestern United States , Models, Nursing , Nursing Staff, Hospital/supply & distribution , Personnel Staffing and Scheduling/organization & administration , Program Evaluation , Prospective Studies , Task Performance and Analysis