Results 1 - 18 of 18
1.
Clin Infect Dis ; 78(3): 535-543, 2024 03 20.
Article in English | MEDLINE | ID: mdl-37823421

ABSTRACT

BACKGROUND: Nontyphoidal Salmonella causes an estimated 1.35 million US infections annually. Antimicrobial-resistant strains are a serious public health threat. We examined the association between resistance and the clinical outcomes of hospitalization, length-of-stay ≥3 days, and death. METHODS: We linked epidemiologic data from the Foodborne Diseases Active Surveillance Network with antimicrobial resistance data from the National Antimicrobial Resistance Monitoring System (NARMS) for nontyphoidal Salmonella infections from 2004 to 2018. We defined any resistance as resistance to ≥1 antimicrobial and clinical resistance as resistance to ampicillin, azithromycin, ceftriaxone, ciprofloxacin, or trimethoprim-sulfamethoxazole (for the subset of isolates tested for all 5 agents). We compared outcomes before and after adjusting for age, state, race/ethnicity, international travel, outbreak association, and isolate serotype and source. RESULTS: Twenty percent of isolates (1105/5549) had any resistance, and 16% (469/2969) had clinical resistance. Persons whose isolates had any resistance were more likely to be hospitalized (31% vs 28%, P = .01) or have length-of-stay ≥3 days (20% vs 16%, P = .01). Deaths were rare but more common among those with any than no resistance (1.0% vs 0.4%, P = .01). Outcomes for patients whose isolates had clinical resistance did not differ significantly from those with no resistance. After adjustment, any resistance (adjusted odds ratio 1.23, 95% confidence interval 1.04-1.46) remained significantly associated with hospitalization. CONCLUSIONS: We observed a significant association between nontyphoidal Salmonella infections caused by resistant pathogens and likelihood of hospitalization. Clinical resistance was not associated with poorer outcomes, suggesting that factors other than treatment failure (eg, strain virulence, strain source, host factors) may be important.
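The adjusted odds ratio above implies a multivariable model of hospitalization on resistance status plus the listed covariates. The abstract does not state the exact model, but a minimal sketch of one common approach (logistic regression; the file and column names here are hypothetical illustrations, not the authors' data or code) is:

```python
# Sketch: adjusted odds ratio of hospitalization for "any resistance",
# controlling for the covariates listed in the abstract.
# File and column names are hypothetical illustrations only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("linked_foodnet_narms.csv")  # hypothetical linked dataset

model = smf.logit(
    "hospitalized ~ any_resistance + C(age_group) + C(state) + C(race_ethnicity)"
    " + intl_travel + outbreak_assoc + C(serotype) + C(isolate_source)",
    data=df,
).fit()

adj_or = np.exp(model.params["any_resistance"])
ci_low, ci_high = np.exp(model.conf_int().loc["any_resistance"])
print(f"adjusted OR = {adj_or:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```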


Subject(s)
Anti-Infective Agents , Foodborne Diseases , Salmonella Infections , Humans , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Watchful Waiting , Microbial Sensitivity Tests , Salmonella Infections/drug therapy , Salmonella Infections/epidemiology , Foodborne Diseases/epidemiology
2.
Br J Nutr ; 127(7): 1018-1025, 2022 04 14.
Article in English | MEDLINE | ID: mdl-34078482

ABSTRACT

Fe deficiency has negative effects on voluntary physical activity (PA); however, the impact of consuming Fe-biofortified staple foods on voluntary PA remains unclear. This study compared the effects of consuming Fe-biofortified pearl millet or a conventional pearl millet on measures of voluntary PA in Indian schoolchildren (ages 12-16 years) during a 6-month randomised controlled feeding trial. PA data were collected from 130 children using Actigraph GT3X accelerometers for 6 d at baseline and endline. Minutes spent in light and in moderate-to-vigorous PA were calculated from accelerometer counts using Crouter's refined two-regression model for children. Mixed regression models adjusting for covariates were used to assess relationships between intervention treatment or change in Fe status and PA. Children who consumed Fe-biofortified pearl millet performed 22·3 (95 % CI 1·8, 42·8, P = 0·034) more minutes of light PA each day compared with those who consumed conventional pearl millet. There was no effect of treatment on moderate-to-vigorous PA. The amount of Fe consumed from pearl millet was related to minutes spent in light PA (estimate 3·4 min/mg Fe (95 % CI 0·3, 6·5, P = 0·031)) and inversely related to daily sedentary minutes (estimate -5·4 min/mg Fe (95 % CI -9·9, -0·9, P = 0·020)). Consuming Fe-biofortified pearl millet increased light PA and decreased sedentary time in Indian schoolchildren in a dose-dependent manner.
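The "mixed regression models adjusting for covariates" are not specified in detail in the abstract; a hedged sketch of one plausible specification (a random intercept per child via statsmodels MixedLM; all variable names are illustrative, not the trial's) follows.

```python
# Sketch: mixed model relating daily light-PA minutes to treatment group,
# with a random intercept for each child. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("accelerometer_child_days.csv")  # hypothetical long format: one row per child-day

model = smf.mixedlm(
    "light_pa_minutes ~ treatment + timepoint + age + sex + wear_time_minutes",
    data=df,
    groups=df["child_id"],  # random intercept per child
)
result = model.fit()
print(result.summary())
```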


Subject(s)
Iron , Pennisetum , Adolescent , Child , Exercise , Food, Fortified , Humans
3.
Clin Infect Dis ; 70(9): 2005-2007, 2020 04 15.
Article in English | MEDLINE | ID: mdl-31504307

ABSTRACT

Most persons with chronic hepatitis C virus (HCV) infection in the United States are undiagnosed or not linked to care. We describe a program for the management of HCV infection in Alaska Native patients that utilizes a computerized registry and statewide liver clinics, resulting in higher linkage to care (86%) than national estimates (~25%).


Subject(s)
Hepatitis C, Chronic , Hepatitis C , Alaska/epidemiology , Hepatitis C, Chronic/drug therapy , Hepatitis C, Chronic/epidemiology , Humans , Registries , United States
4.
J Nutr ; 150(5): 1093-1099, 2020 05 01.
Article in English | MEDLINE | ID: mdl-32006009

ABSTRACT

BACKGROUND: Iron-biofortified staple foods can improve iron status and resolve iron deficiency. However, whether improved iron status from iron biofortification can improve physical performance remains unclear. OBJECTIVE: This study aimed to examine whether changes in iron status from an iron-biofortified bean intervention affect work efficiency. METHODS: A total of 125 iron-depleted (ferritin <20 µg/L) female Rwandan university students (18-26 y) were selected from a larger sample randomly assigned to consume iron-biofortified beans (Fe-Bean; 86.1 mg Fe/kg) or conventional beans (control: 50.6 mg Fe/kg) twice daily for 18 wk (average of 314 g beans consumed/d). Blood biomarkers of iron status (primary outcome) and physical work efficiency (secondary outcome) were measured before and after the intervention. Work performed was assessed during 5-min steady-state periods at 0-, 25-, and 40-W workloads using a mechanically braked cycle ergometer. Work efficiency was calculated at 25 W and 40 W as the work accomplished divided by the energy expended at that workload above that expended at 0 W. General linear models were used to evaluate the relation between changes in iron status biomarkers and work efficiency. RESULTS: The Fe-Bean intervention had significant positive effects on hemoglobin, serum ferritin, and body iron stores but did not affect work efficiency. However, 18-wk change in hemoglobin was positively related to work efficiency at 40 W in the full sample (n = 119; estimate: 0.24 g/L; 95% CI: 0.01, 0.48 g/L; P = 0.044) and among women who were anemic (hemoglobin <120 g/L) at baseline (n = 43; estimate: 0.64 g/L; 95% CI: 0.05, 1.23 g/L; P = 0.036). Among women who were nonanemic at baseline, change in serum ferritin was positively related to change in work efficiency at 40 W (n = 60; estimate: 0.50 µg/L; 95% CI: 0.06, 0.95 µg/L; P = 0.027). CONCLUSIONS: Increasing iron status during an iron-biofortified bean feeding trial improves work efficiency in iron-depleted, sedentary women. This trial was registered at clinicaltrials.gov as NCT01594359.
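As a worked equation, the work-efficiency definition given in the methods (work accomplished divided by the energy expended at that workload above the 0-W baseline) can be written, for a workload W of 25 or 40 W, as:

```latex
% Work efficiency at workload W (25 or 40 W), per the definition in the abstract.
\mathrm{WE}_{W} \;=\; \frac{\text{work accomplished at } W}{\mathrm{EE}_{W} - \mathrm{EE}_{0\,\mathrm{W}}}
```

where EE_W is the energy expended during the steady-state period at workload W and EE_0W is the energy expended at the unloaded (0-W) condition; such efficiency ratios are often reported as percentages, though the abstract does not state the units used.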


Subject(s)
Anemia, Iron-Deficiency/prevention & control , Biofortification , Fabaceae , Iron/administration & dosage , Biological Availability , Female , Food, Fortified , Humans , Rwanda , Young Adult
5.
MMWR Morb Mortal Wkly Rep ; 67(39): 1098-1100, 2018 Oct 05.
Article in English | MEDLINE | ID: mdl-30286052

ABSTRACT

Foodborne salmonellosis causes an estimated 1 million illnesses and 400 deaths annually in the United States (1). In recent years, salmonellosis outbreaks have been caused by foods not typically associated with Salmonella. On May 2, 2017, PulseNet, CDC's national molecular subtyping network for foodborne disease surveillance, identified a cluster of 14 Salmonella Chailey isolates with a rare pulsed-field gel electrophoresis (PFGE) pattern. On May 29, Canadian health officials informed CDC that they were also investigating a cluster of five Salmonella Chailey infections in British Columbia with the same PFGE pattern. Nineteen cases were identified and investigated by CDC, U.S. state health departments, the Public Health Agency of Canada, and the British Columbia Centre for Disease Control. Isolates from all cases were highly related by whole genome sequencing (WGS). Illness onset dates ranged from March 10 to May 7, 2017. Initial interviews revealed that infected persons consumed various fresh foods and shopped at grocery chain A; focused questionnaires identified precut coconut pieces from grocery chain A as a common vehicle. The Canadian Food Inspection Agency (CFIA) and the U.S. Food and Drug Administration (FDA) conducted a traceback investigation that implicated a single lot of frozen, precut coconut as the outbreak source. Grocery chain A voluntarily removed precut coconut pieces from their stores. This action likely limited the size and scope of this outbreak.


Subject(s)
Cocos/microbiology , Disease Outbreaks , Food Microbiology , Salmonella Food Poisoning/epidemiology , Salmonella/isolation & purification , Adolescent , Adult , Aged , Aged, 80 and over , Canada/epidemiology , Child , Child, Preschool , Electrophoresis, Gel, Pulsed-Field , Female , Humans , Infant , Male , Middle Aged , United States/epidemiology , Young Adult
6.
MMWR Morb Mortal Wkly Rep ; 67(23): 659-662, 2018 Jun 15.
Article in English | MEDLINE | ID: mdl-29902164

ABSTRACT

On June 26, 2017, a hospital in southern Utah notified the Utah Department of Health of Shiga toxin-producing Escherichia coli (STEC) O157:H7 infections in two children from a small community on the Arizona-Utah border. Both children developed hemolytic uremic syndrome, characterized by hemolytic anemia, acute kidney failure, and thrombocytopenia and died within a few days of illness onset. Over the next few days, several more STEC-associated illnesses were reported in residents of the community. A joint investigation by local and state health agencies from Arizona and Utah and CDC was initiated to identify the outbreak source and prevent additional cases; a total of 12 cases were identified, including the two children who died. Investigators initially explored multiple potential sources of illness; epidemiologic and environmental information revealed cow manure contact as the likely initial cause of the outbreak, which was followed by subsequent person-to-person transmission. One of the outbreak strains was isolated from bull and horse manure collected from a yard near a community household with two ill children. Local health agencies made recommendations to the public related to both animal contact and hand hygiene to reduce the risk for STEC transmission. Animal or animal manure contact should be considered a potential source of STEC O157:H7 during outbreaks in communities where ruminants are kept near the home.


Subject(s)
Disease Outbreaks , Environmental Exposure/adverse effects , Escherichia coli Infections/epidemiology , Escherichia coli O157/isolation & purification , Manure/microbiology , Rural Population , Shiga-Toxigenic Escherichia coli/isolation & purification , Adolescent , Adult , Animals , Arizona/epidemiology , Cattle , Child , Child, Preschool , Female , Horses , Humans , Infant , Male , Rural Population/statistics & numerical data , Utah/epidemiology , Young Adult
7.
J Nutr ; 146(8): 1586-92, 2016 08.
Article in English | MEDLINE | ID: mdl-27358417

ABSTRACT

BACKGROUND: Food-based strategies to reduce nutritional iron deficiency have not been universally successful. Biofortification has the potential to become a sustainable, inexpensive, and effective solution. OBJECTIVE: This randomized controlled trial was conducted to determine the efficacy of iron-biofortified beans (Fe-Beans) to improve iron status in Rwandan women. METHODS: A total of 195 women (aged 18-27 y) with serum ferritin <20 µg/L were randomly assigned to receive either Fe-Beans, with 86 mg Fe/kg, or standard unfortified beans (Control-Beans), with 50 mg Fe/kg, 2 times/d for 128 d in Huye, Rwanda. Iron status was assessed by hemoglobin, serum ferritin, soluble transferrin receptor (sTfR), and body iron (BI); inflammation was assessed by serum C-reactive protein (CRP) and serum α1-acid glycoprotein (AGP). Anthropometric measurements were performed at baseline and at end line. Random weekly serial sampling was used to collect blood during the middle 8 wk of the feeding trial. Mixed-effects regression analysis with repeated measurements was used to evaluate the effect of Fe-Beans compared with Control-Beans on iron biomarkers throughout the course of the study. RESULTS: At baseline, 86% of subjects were iron-deficient (serum ferritin <15 µg/L) and 37% were anemic (hemoglobin <120 g/L). Both groups consumed an average of 336 g wet beans/d. The Fe-Beans group consumed 14.5 ± 1.6 mg Fe/d from biofortified beans, whereas the Control-Beans group consumed 8.6 ± 0.8 mg Fe/d from standard beans (P < 0.05). Repeated-measures analyses showed significant time-by-treatment interactions for hemoglobin, log serum ferritin, and BI (P < 0.05). The Fe-Beans group had significantly greater increases in hemoglobin (3.8 g/L), log serum ferritin (0.1 log µg/L), and BI (0.5 mg/kg) than did controls after 128 d. For every 1 g Fe consumed from beans over the 128 study days, there was a significant 4.2-g/L increase in hemoglobin (P < 0.05). CONCLUSION: The consumption of iron-biofortified beans significantly improved iron status in Rwandan women. This trial was registered at clinicaltrials.gov as NCT01594359.


Subject(s)
Anemia, Iron-Deficiency/diet therapy , Diet , Fabaceae , Food, Fortified , Iron, Dietary/therapeutic use , Iron/therapeutic use , Nutritional Status , Adult , Anemia/epidemiology , Anemia, Iron-Deficiency/blood , Anemia, Iron-Deficiency/epidemiology , C-Reactive Protein/metabolism , Feeding Behavior , Female , Ferritins/blood , Hemoglobins/metabolism , Humans , Iron/blood , Iron/pharmacology , Iron Deficiencies , Iron, Dietary/blood , Iron, Dietary/pharmacology , Receptors, Transferrin/blood , Rwanda/epidemiology , Young Adult
8.
J Nutr ; 145(7): 1576-81, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25948782

ABSTRACT

BACKGROUND: Iron deficiency is the most widespread nutritional deficiency in the world. OBJECTIVE: The objective of this randomized efficacy trial was to determine the effects of iron-biofortified pearl millet (Fe-PM) on iron status compared with control pearl millet (Control-PM). METHODS: A randomized trial of biofortified pearl millet (Pennisetum glaucum), bred to enhance iron content, was conducted in 246 children (12-16 y) for 6 mo in Maharashtra, India. Iron status [hemoglobin, serum ferritin (SF), soluble transferrin receptor (sTfR), and total body iron (TBI)], inflammation (C-reactive protein and α-1 acid glycoprotein), and anthropometric indices were evaluated at enrollment and after 4 and 6 mo. Hodges-Lehmann-Sen 95% CIs were used to examine the effect of the Fe-PM on iron status compared with commercially available Control-PM. Linear and binomial regression models were used to evaluate the effects of Fe-PM on iron status and incidence of anemia and iron deficiency, compared with Control-PM. RESULTS: At baseline, 41% of children were iron deficient (SF <15 µg/L) and 28% were anemic (hemoglobin <12.0 g/dL). Fe-PM significantly increased SF concentrations and TBI after 4 mo compared with Control-PM. Among children who were iron deficient at baseline, those who received Fe-PM were 1.64 times more likely to become iron replete by 6 mo than were those receiving Control-PM (RR: 1.64, 95% CI: 1.07, 2.49, P = 0.02). The effects of Fe-PM on iron status were greater among children who were iron deficient at baseline than among children who were not iron deficient at baseline. CONCLUSIONS: Fe-PM significantly improved iron status in children by 4 mo compared with Control-PM. This study demonstrated that feeding Fe-PM is an efficacious approach to improve iron status in school-age children and it should be further evaluated for effectiveness in a broader population context. This trial was registered at clinicaltrials.gov as NCT02152150.
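The reported relative risk (RR: 1.64; 95% CI: 1.07, 2.49) is the kind of quantity that can be computed from a 2 × 2 table of replete/non-replete counts by group; the sketch below shows the standard unadjusted calculation with placeholder counts (not the study's data), using a log-normal Wald confidence interval.

```python
# Sketch: relative risk of becoming iron replete by 6 mo (Fe-PM vs Control-PM)
# with a 95% CI on the log scale. Counts are placeholders for illustration only.
import math

a, b = 30, 20   # Fe-PM group: replete, not replete (hypothetical)
c, d = 18, 30   # Control-PM group: replete, not replete (hypothetical)

risk_fe = a / (a + b)
risk_ctrl = c / (c + d)
rr = risk_fe / risk_ctrl

se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

Note that the study itself used binomial regression models, which can adjust for covariates; the 2 × 2 calculation above is only a simplified illustration of the relative-risk arithmetic.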


Subject(s)
Food, Fortified , Iron, Dietary/administration & dosage , Pennisetum/chemistry , Adolescent , Anemia, Iron-Deficiency/blood , Anemia, Iron-Deficiency/diet therapy , C-Reactive Protein/metabolism , Child , Double-Blind Method , Female , Ferritins/blood , Follow-Up Studies , Hemoglobins/metabolism , Humans , India , Iron, Dietary/blood , Linear Models , Male , Nutritional Status , Orosomucoid/metabolism , Prospective Studies , Receptors, Transferrin/blood , Treatment Outcome
9.
Ann Surg ; 260(2): 214-7, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24670856

ABSTRACT

OBJECTIVE: To reduce the incidence of surgical fires. BACKGROUND: Operating room fires represent a potentially life-threatening hazard and are triggered by the electrosurgical unit (ESU) pencil. Carbon dioxide is a fire suppressant and is a routinely used medical gas. We hypothesize that a shroud of protective carbon dioxide covering the tip of the ESU pencil displaces oxygen, thereby preventing fire ignition. METHODS: Using 3-dimensional modeling techniques, a polymer sleeve was created and attached to an ESU pencil. This sleeve was connected to a carbon dioxide source and directed the gas through multiple precisely angled ports, generating a cone of fire-suppressive carbon dioxide surrounding the active pencil tip. This device was evaluated in a flammability test chamber containing 21%, 50%, and 100% oxygen with sustained ESU activation. The sleeve was tested with and without carbon dioxide (control) until a fuel was ignited or 30 seconds elapsed. Time to ignition was measured by high-speed videography. RESULTS: Fires were ignited with each control trial (15/15 trials). The control group median ± SD ignition time in 21% oxygen was 3.0 ± 2.4 seconds, in 50% oxygen was 0.1 ± 1.8 seconds, and in 100% oxygen was 0.03 ± 0.1 seconds. No fire was observed when the fire safety device was used in all concentrations of oxygen (0/15 trials; P < 0.0001). The exact 95% confidence interval for absolute risk reduction of fire ignition was 76% to 100%. CONCLUSIONS: A sleeve creating a cone of protective carbon dioxide gas enshrouding the sparks from an ESU pencil effectively prevents fire in a high-flammability model. Clinical application of this device may reduce the incidence of operating room fires.


Subject(s)
Electrosurgery/instrumentation , Fires/prevention & control , Operating Rooms , Protective Devices , Carbon Dioxide , Equipment Design , Equipment Safety , Humans
10.
Anesth Analg ; 118(4): 772-5, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24651231

ABSTRACT

Operating room fires are sentinel events that present a real danger to surgical patients and occur at least as frequently as wrong-sided surgery. For fire to occur, the 3 points of the fire triad must be present: an oxidizer, an ignition source, and fuel source. The electrosurgical unit (ESU) pencil triggers most operating room fires. Carbon dioxide (CO2) is a gas that prevents ignition and suppresses fire by displacing oxygen. We hypothesize that a device can be created to reduce operating room fires by generating a cone of CO2 around the ESU pencil tip. One such device was created by fabricating a divergent nozzle and connecting it to a CO2 source. This device was then placed over the ESU pencil, allowing the tip to be encased in a cone of CO2 gas. The device was then tested in 21%, 50%, and 100% oxygen environments. The ESU was activated at 50 W cut mode while placing the ESU pencil tip on a laparotomy sponge resting on an aluminum test plate for up to 30 seconds or until the sponge ignited. High-speed videography was used to identify time of ignition. Each test was performed in each oxygen environment 5 times with the device activated (CO2 flow 8 L/min) and with the device deactivated (no CO2 flow-control). In addition, 3-dimensional spatial mapping of CO2 concentrations was performed with a CO2 sampling device. The median ± SD [range] ignition time of the control group in 21% oxygen was 2.9 s ± 0.44 [2.3-3.0], in 50% oxygen 0.58 s ± 0.12 [0.47-0.73], and in 100% oxygen 0.48 s ± 0.50 [0.03-1.27]. Fires were ignited with each control trial (15/15); no fires ignited when the device was used (0/15, P < 0.0001). The CO2 concentration at the end of the ESU pencil tip was 95%, while the average CO2 concentration 1 to 1.4 cm away from the pencil tip on the bottom plane was 64%. In conclusion, an operating room fire prevention device can be created by using a divergent nozzle design through which CO2 passes, creating a cone of fire suppressant. This device as demonstrated in a flammability model effectively reduced the risk of fire. CO2 3-dimensional spatial mapping suggests effective fire reduction at least 1 cm away from the tip of the ESU pencil at 8 L/min CO2 flow. Future testing should determine optimum CO2 flow rates and ideal nozzle shapes. Use of this device may substantially reduce the risk of patient injury due to operating room fires.


Subject(s)
Carbon Dioxide , Electrosurgery/instrumentation , Fires/prevention & control , Operating Rooms , Carbon Dioxide/analysis , Environment , Oxygen/analysis , Risk Reduction Behavior , Surgical Instruments
11.
J Allied Health ; 53(3): e137-e145, 2024.
Article in English | MEDLINE | ID: mdl-39293013

ABSTRACT

INTRODUCTION: Doctor of physical therapy (DPT) students who experienced burnout during the COVID-19 pandemic are now entering the workforce. This study compared burnout and grit scores of DPT graduates who completed their education prior to the pandemic (Group A) with those who completed all DPT education during the pandemic (Group B). METHODS: This is a cross-sectional comparison of burnout and grit between two cohorts of graduates of an entry-level DPT program. Burnout was measured using the Maslach Burnout Inventory-Human Services Survey (MBI-HSS); grit was measured using the 12-Item Grit Scale. RESULTS: Burnout was significantly higher in Group B, as indicated by the MBI-HSS subscales for Emotional Exhaustion (H(1)=14.130, p<0.001) and Personal Accomplishment (H(1)=6.781, p=0.009). There were no significant differences in grit scores between the two groups (H(1)=3.286, p=0.07). CONCLUSION: Pandemic-trained physical therapists in this study were no less gritty than those who graduated prior to the pandemic but were significantly more burned out. IMPACT: Pandemic-trained clinicians and their hiring managers/mentors should screen for and support employee mental health.
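The H(1) statistics above are consistent with Kruskal-Wallis tests comparing the two cohorts (the abstract does not name the test explicitly); a minimal sketch of the computation with made-up scores, not the study data, is shown below.

```python
# Sketch: Kruskal-Wallis comparison of MBI-HSS Emotional Exhaustion scores
# between Group A (pre-pandemic) and Group B (pandemic). Scores are placeholders.
from scipy.stats import kruskal

group_a_ee = [12, 18, 22, 15, 27, 19, 24, 16]   # hypothetical
group_b_ee = [28, 33, 25, 36, 30, 29, 38, 27]   # hypothetical

h_stat, p_value = kruskal(group_a_ee, group_b_ee)
print(f"H(1) = {h_stat:.3f}, p = {p_value:.4f}")
```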


Subject(s)
Burnout, Professional , COVID-19 , Physical Therapists , Humans , COVID-19/epidemiology , Physical Therapists/psychology , Burnout, Professional/epidemiology , Cross-Sectional Studies , Male , Female , Adult , Pandemics , SARS-CoV-2
12.
J Immunother Cancer ; 12(5)2024 May 03.
Article in English | MEDLINE | ID: mdl-38702144

ABSTRACT

BACKGROUND: Natural killer (NK) cells are key effector cells of antitumor immunity. However, tumors can acquire resistance programs to escape NK cell-mediated immunosurveillance. Identifying mechanisms that mediate this resistance enables us to define approaches to improve immune-mediated antitumor activity. In previous studies from our group, a genome-wide CRISPR-Cas9 screen identified Charged Multivesicular Body Protein 2A (CHMP2A) as a novel mechanism that mediates tumor-intrinsic resistance to NK cell activity. METHODS: Here, we use an immunocompetent mouse model to demonstrate that CHMP2A serves as a targetable regulator of not only NK cell-mediated immunity but also other immune cell populations. Using the recently characterized murine 4MOSC model system, a syngeneic, tobacco-signature murine head and neck squamous cell carcinoma model, we deleted mCHMP2A using CRISPR/Cas9-mediated knockout (KO), followed by orthotopic transplantation into immunocompetent hosts. RESULTS: We found that mCHMP2A KO in 4MOSC1 cells leads to more potent NK cell-mediated killing of these tumor cells in vitro. Moreover, following orthotopic transplantation, KO of mCHMP2A in 4MOSC1 cells, but not in the more immune-resistant 4MOSC2 cells, enables both T cells and NK cells to better mediate antitumor activity compared with wild-type (WT) tumors. However, there was no difference in tumor development between WT and mCHMP2A KO 4MOSC1 or 4MOSC2 tumors when implanted in immunodeficient mice. Mechanistically, we find that mCHMP2A KO 4MOSC1 tumors transplanted into immunocompetent mice had significantly increased CD4+ T cells, CD8+ T cells, and NK cells, as well as fewer myeloid-derived suppressor cells (MDSCs). CONCLUSIONS: Together, these studies demonstrate that CHMP2A is a targetable inhibitor of cellular antitumor immunity.


Subject(s)
Disease Models, Animal , Head and Neck Neoplasms , Killer Cells, Natural , Squamous Cell Carcinoma of Head and Neck , Animals , Humans , Mice , Cell Line, Tumor , Head and Neck Neoplasms/immunology , Head and Neck Neoplasms/genetics , Immunocompetence , Killer Cells, Natural/immunology , Squamous Cell Carcinoma of Head and Neck/immunology , Squamous Cell Carcinoma of Head and Neck/genetics
13.
Anesthesiology ; 119(4): 770-6, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23872933

ABSTRACT

BACKGROUND: Over 600 operating room fires occur annually although many cases go unreported. Over 81% of operating room fires involve surgical drapes, yet limited data exist on the differing degrees of flammability of drapes and other surgical fuel sources in varying oxygen concentrations. The purpose of this study is to assess the flammability characteristics of fuels in the operating room under varying oxygen concentrations. METHODS: Five fuel sources were analyzed in three levels of oxygen: 21%, 50%, and 100%. Three test samples of each material were burned in a manner similar to that established by the Consumer Product Safety Commission. Time to sample ignition and time to complete burn were measured with video analysis. RESULTS: The median [minimum, maximum] ignition time in 21% oxygen was 0.9 s [0.3, 1.9], in 50% oxygen 0.4 s [0.1, 1.2], and in 100% oxygen 0.2 s [0.0, 0.4]. The median burn time in 21% oxygen was 20.4 s [7.8, 33.5], in 50% oxygen 3.1 s [1.4, 8.1], and in 100% oxygen 1.7 s [0.6, 2.7]. Time to ignite and total burn times decreased as oxygen concentration increased (P < 0.001). Flammability characteristics differed by material and oxygen concentration. Utility drapes and surgical gowns did not support combustion in room air, whereas other materials quickly ignited. Flash fires were detected on woven cotton materials in oxygen-enriched environments. CONCLUSIONS: Operating room personnel should be aware that common materials in the operating room support rapid combustion in oxygen-enriched environments. The risk of ignition and speed of fire propagation increase as oxygen exposure increases. Advances in material science may reduce perioperative fire risk.


Subject(s)
Burns/prevention & control , Fires/statistics & numerical data , Materials Testing/methods , Operating Rooms/statistics & numerical data , Oxygen/analysis , Surgical Drapes/statistics & numerical data , Equipment Safety/statistics & numerical data , Fires/prevention & control , Humans , Materials Testing/statistics & numerical data , Patient Safety , Risk Factors , Time Factors
14.
J Cardiothorac Vasc Anesth ; 27(6): 1128-32, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23992653

ABSTRACT

OBJECTIVE: To compare the noninvasive estimated continuous cardiac output (esCCO), device-derived cardiac output (CO) to simultaneous pulmonary artery catheter (PAC) thermodilution (TD) CO. DESIGN: A prospective study comparing pulse wave transit time (estimated continuous cardiac output, esCCO; Nihon Kohden, Tokyo, Japan) to intermittent TD CO. SETTING: One academic hospital. PARTICIPANTS: Patients presenting for cardiac surgery. INTERVENTIONS: Intraoperative CO measurements at 4 distinct time points (after induction, after sternotomy, after cardiopulmonary bypass, and after chest closure). MEASUREMENTS AND MAIN RESULTS: The study population consisted of American Society of Anesthesiologists (ASA) IV subjects, 27 (77%) males and 8 (23%) females, with a mean age of 64.6 ± 12.2 years. Data points from esCCO and TD were collected simultaneously and means per time point compared using Bland-Altman, Pearson R coefficient, and percent error. Mean TD CO for the study was 5.4 L/min. The Pearson R coefficient, percent error, and bias in L/min were: 0.57, 44%, 0.66 (after induction); 0.54, 51%, 0.88 (after sternotomy); 0.60, 60%, 0.95 (after cardiopulmonary bypass); and 0.57, 60%, 0.75 (after chest closure) respectively. CONCLUSIONS: esCCO is easy to use and provides continuous CO measurements, but has wide limits of agreement and large percentage errors with a consistently positive bias in comparison to TD.
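The agreement statistics reported above (bias, percent error, Pearson r) can be computed from paired esCCO/TD readings; a hedged sketch with placeholder values follows. The percent error here uses the widely cited Critchley-style definition (1.96 × SD of the differences divided by mean reference CO); the abstract does not state the exact formula the authors used.

```python
# Sketch: Bland-Altman bias, limits of agreement, Pearson r, and percent error
# for paired esCCO vs. thermodilution CO readings (L/min). Values are placeholders.
import numpy as np
from scipy.stats import pearsonr

escco = np.array([4.8, 5.6, 6.1, 5.0, 4.4, 6.8, 5.9, 5.2])  # hypothetical
td    = np.array([4.2, 5.1, 5.0, 4.6, 4.0, 5.8, 5.3, 4.7])  # hypothetical

diff = escco - td
bias = diff.mean()                                 # mean difference (esCCO - TD)
sd_diff = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd_diff, bias + 1.96 * sd_diff
r, _ = pearsonr(escco, td)
pct_error = 100 * 1.96 * sd_diff / td.mean()       # Critchley-style percent error

print(f"bias = {bias:.2f} L/min, LoA = ({loa_low:.2f}, {loa_high:.2f}), "
      f"r = {r:.2f}, percent error = {pct_error:.0f}%")
```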


Subject(s)
Cardiac Output/physiology , Heart Diseases/physiopathology , Thermodilution/methods , Adult , Aged , Anesthesia, General , Cardiac Surgical Procedures , Cardiopulmonary Bypass , Catheterization, Swan-Ganz , Female , Humans , Male , Middle Aged , Monitoring, Intraoperative , Pilot Projects , Prospective Studies
15.
Nutrients ; 11(2)2019 Feb 12.
Article in English | MEDLINE | ID: mdl-30759887

ABSTRACT

Iron deficiency is a major public health problem worldwide, with the highest burden among children. The objective of this randomized efficacy feeding trial was to determine the effects of consuming iron-biofortified beans (Fe-Beans) on iron status in children, compared to control beans (Control-Beans). A cluster-randomized trial of biofortified beans (Phaseolus vulgaris L.), bred to enhance iron content, was conducted over 6 months. The participants were school-aged children (n = 574; 5-12 years) attending 20 rural public boarding schools in the Mexican state of Oaxaca. Double-blind randomization was conducted at the school level; 20 schools were randomized to receive either Fe-Beans (n = 10 schools, n = 304 students) or Control-Beans (n = 10 schools, n = 366 students). School administrators, children, and research and laboratory staff were blinded to the intervention group. Iron status (hemoglobin (Hb), serum ferritin (SF), soluble transferrin receptor (sTfR), and total body iron (TBI)), inflammatory biomarkers (C-reactive protein (CRP) and α-1-acid glycoprotein (AGP)), and anthropometric indices were evaluated at enrollment and at the end of the trial. Hemoglobin concentrations were adjusted for altitude, and anemia was defined in accordance with age-specific World Health Organization (WHO) criteria (i.e., Hb <115 g/L for <12 years and Hb <120 g/L for ≥12 years). Serum ferritin concentrations were adjusted for inflammation using BRINDA methods, and iron deficiency was defined as serum ferritin less than 15.0 µg/L. Total body iron was calculated using Cook's equation. Mixed models were used to examine the effects of Fe-Beans on hematological outcomes, compared to Control-Beans, adjusting for the baseline indicator, with school as a random effect. The analysis included 10 schools (n = 269 students) in the Fe-Beans group and 10 schools (n = 305 students) in the Control-Beans group that completed the follow-up. At baseline, 17.8% of the children were anemic and 11.3% were iron deficient (15.9%, BRINDA-adjusted). A total of 6.3% of children had elevated CRP (>5.0 mg/L), and 11.6% had elevated AGP (>1.0 g/L) concentrations at baseline. During the 104 days when feeding was monitored, total individual iron intake from the study beans was 504 mg (IQR: 352, 616) over a mean of 68 feeding days in the Fe-Beans group and 295 mg (IQR: 197, 341) over a mean of 67 feeding days in the Control-Beans group (p < 0.01). During the cluster-randomized efficacy trial, indicators of iron status, including hemoglobin, serum ferritin, soluble transferrin receptor, and total body iron concentrations, improved from baseline to endline (6 months) in both the intervention and control groups. However, Fe-Beans did not significantly improve iron status indicators compared to Control-Beans. Similarly, there were no significant effects of Fe-Beans on dichotomous outcomes, including anemia and iron deficiency, compared to Control-Beans. In this 6-month cluster-randomized efficacy trial of iron-biofortified beans in schoolchildren in Mexico, indicators of iron status improved in both the intervention and control groups; however, there were no significant effects of Fe-Beans on iron biomarkers compared to Control-Beans. This trial was registered at clinicaltrials.gov as NCT03835377.
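The abstract states that total body iron was calculated using Cook's equation; the commonly cited form of that equation (Cook, Flowers, and Skikne, 2003) is reproduced below for reference, with the caveat that readers should confirm the exact units and version used in the trial.

```latex
% Cook's body-iron equation, as commonly cited:
% sTfR in mg/L, serum ferritin (SF) in micrograms/L; body iron in mg/kg.
\text{Body iron (mg/kg)} \;=\;
  -\,\frac{\log_{10}\!\left(\dfrac{\mathrm{sTfR}\times 1000}{\mathrm{SF}}\right) - 2.8229}{0.1207}
```

Positive values indicate storage iron, and negative values indicate a tissue iron deficit.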


Subject(s)
Anemia, Iron-Deficiency/epidemiology , Anemia, Iron-Deficiency/prevention & control , Food, Fortified , Iron/administration & dosage , Phaseolus , Biomarkers/blood , Child , Child, Preschool , Diet , Female , Ferritins/blood , Humans , Male , Mexico/epidemiology , Rural Population
16.
Open Forum Infect Dis ; 6(6): ofz223, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31249845

ABSTRACT

BACKGROUND: Chronic hepatitis C virus (HCV) infection diminishes immune function through cell exhaustion and repertoire alteration. Direct-acting antiviral (DAA)-based therapy can restore immune cell subset function and reduce exhaustion states. However, the extent of immune modulation following DAA-based therapy and the role that clinical and demographic factors play remain unknown. METHODS: We examined natural killer (NK) cell, CD4+ T cell, and CD8+ T cell subsets, along with activation and exhaustion phenotypes, across an observational study of sofosbuvir-based treatment for chronic HCV infection. Additionally, we examined the ability of clinical variables and duration of infection to predict immune marker outcomes at 12 weeks of sustained virologic response (SVR12). RESULTS: We show that sofosbuvir-based therapy restores NK cell subset distributions and reduces chronic activation by SVR12. Likewise, T cell subsets, including HCV-specific CD8+ T cells, show reductions in chronic exhaustion markers by SVR12. Immunosuppressive CD4+ regulatory T cells decrease at 4 weeks of treatment and at SVR12. We observe that the magnitude and direction of change in immune marker values from pretreatment to SVR12 vary greatly among participants. Although we observed associations between the estimated date of infection, HCV diagnosis date, and extent of immune marker outcome at SVR12, our regression analyses did not indicate any factors as strong predictors of SVR12 outcomes. CONCLUSION: Our study lends further evidence of immune changes following sofosbuvir-based therapy. Further investigation beyond SVR12 and into factors that may predict posttreatment outcomes is warranted.

17.
J Geriatr Phys Ther ; 41(2): 85-101, 2018.
Article in English | MEDLINE | ID: mdl-27824657

ABSTRACT

BACKGROUND AND PURPOSE: Lower extremity osteoarthritis (OA) is a common condition among older adults; given the risks of surgical and pharmaceutical interventions, conservative, lower-cost management options such as footwear warrant further investigation. This systematic review investigated the effects of footwear, including shoe inserts, in reducing lower extremity joint pain and improving gait, mobility, and quality of life in older adults with OA. METHODS: The CINAHL, SPORTDiscus, PubMed, RECAL, and Web of Knowledge databases were searched for publications from January 1990 to September 2014, using the terms "footwear," "shoes," "gait," "pain," and "older adult." Results were narrowed to participants aged 50 years or older with OA in at least one lower extremity joint. Outcomes of interest included measures of pain, comfort, function, gait, or quality of life. Exclusion criteria were rheumatoid arthritis, amputation, diabetes, multiple sclerosis, use of modified footwear or custom orthotics, purely biomechanical studies, and studies reporting only balance or falls outcomes. Single-case studies, qualitative narrative descriptions, and expert opinions were also excluded. RESULTS: The initial search resulted in a total of 417 citations. Eleven articles met inclusion criteria. Two randomized controlled trials and 3 quasiexperimental studies reported that lateral wedge insoles may have at least some pain-relieving effects and improved functional mobility in older adults at 4 weeks to 2 years' follow-up, particularly when used with subtalar and ankle strapping. Three randomized controlled trials with large sample sizes reported that lateral wedges provided no knee pain relief compared with flat insoles. Hardness of shoe soles did not significantly affect joint comfort in the foot in a quasiexperimental study. A quasiexperimental study investigating shock-absorbing insoles showed a reduction in knee joint pain after 1 month of wear. Finally, a cross-sectional prognostic study indicated that poor footwear in early life is associated with hindfoot pain later in life. DISCUSSION AND CONCLUSION: Because of the limited number of randomized controlled trials, it is not possible to make a definitive conclusion about the long-term effects of footwear on lower extremity joint pain caused by OA. There is mounting evidence that shock-absorbing insoles, subtalar strapping, and avoidance of high heels and sandals early in life may prevent lower extremity joint pain in older adults, but no conclusive evidence exists to show that lateral wedge insoles will provide long-term relief from knee joint pain and improved mobility in older adults with OA. More high-quality randomized controlled trials are needed to study the effectiveness of footwear and shoe inserts on joint pain and function in older adults with OA.


Subject(s)
Arthralgia/prevention & control , Foot Orthoses , Osteoarthritis, Knee/complications , Osteoarthritis, Knee/physiopathology , Shoes , Aged , Arthralgia/etiology , Cross-Sectional Studies , Female , Gait , Humans , Knee Joint , Male , Osteoarthritis, Knee/therapy , Quality of Life
18.
J Med Eng Technol ; 40(2): 29-34, 2016.
Article in English | MEDLINE | ID: mdl-26745650

ABSTRACT

The electrosurgical unit (ESU) utilizes an electrical discharge to cut and coagulate tissue and is often held above the surgical site, causing a spark to form. The voltage at which the spark is created, termed the breakdown voltage, is governed by the surrounding gaseous environment. Surgeons are now utilizing the ESU laparoscopically with carbon dioxide insufflation, potentially altering ESU operating characteristics. This study examines the clinical implications of altering gas composition by measuring the spark gap distance as a marker of breakdown voltage and by using the ESU on a biologic model, both in room air and in carbon dioxide. Paschen's Law predicted a 35% decrease in gap distance in carbon dioxide, while testing revealed an average drop of 37-47% compared with air. However, surgical model testing revealed no perceivable clinical difference. Electrosurgery can be performed in carbon dioxide environments, although surgeons should be aware of potentially altered ESU performance.
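The breakdown-voltage prediction mentioned above comes from Paschen's law; its standard form is shown here for reference. The constants A and B and the secondary-electron-emission coefficient γ are gas-dependent, which is what drives the predicted difference in spark gap distance between air and carbon dioxide (specific coefficient values are not given in the abstract and are not reproduced here).

```latex
% Paschen's law: breakdown voltage V_B as a function of pressure p and gap distance d.
% A, B, and the secondary-electron-emission coefficient gamma are gas-dependent constants.
V_B \;=\; \frac{B\,p\,d}{\ln(A\,p\,d) \;-\; \ln\!\left[\ln\!\left(1 + \frac{1}{\gamma}\right)\right]}
```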


Subject(s)
Carbon Dioxide , Electrosurgery , Air , Animals , Cattle , Copper , Environment , Foot/surgery , Operating Rooms , Red Meat