1.
Pain Pract ; 23(2): 185-203, 2023 02.
Article in English | MEDLINE | ID: mdl-36251412

ABSTRACT

OBJECTIVES: Specialized pain rehabilitation is recognized as the treatment of choice for youth with pain-related disability. Appropriate outcomes for program evaluation are critical. This study aimed to summarize the effect domains and methods used to evaluate pediatric-specialized outpatient pain rehabilitation programs, map them to the PedIMMPACT statement, and highlight future directions. METHODS: An integrated review framework, incorporating stakeholders, was used. Academic Search Complete, CINAHL, ERIC, MEDLINE, PsycINFO, and Google Scholar were searched for studies published in 1999-2021 featuring the treatment effects of specialized outpatient pain rehabilitation on youth with pain-related disability and their parents. Selected studies were critically appraised using the Quality Assessment Tool for Studies of Diverse Design, organized by study characteristics, and analyzed using constant comparison. RESULTS: Of the 1951 potentially relevant titles, 37 studies were selected. Twenty-five effects targeted youth and 24 focused on parents, with a maximum of 15 youth and 11 parent effect domains (median = 5 domains per study). Although most studies measured a combination of effect domains and included some of those recommended in the PedIMMPACT statement, no effect was measured consistently across studies. Youth physical functioning and parent emotional functioning were measured most often. Eighty-five instruments were used to assess youth outcomes and 59 for parents, with self-report questionnaires dominating. DISCUSSION: There is a lack of standardization in the domains and methods used to evaluate the effects of pediatric-specialized outpatient pain rehabilitation programs, hindering comparisons between programs. Future program evaluations should be founded on each program's theory, aims, and anticipated outcomes.


Subject(s)
Emotions , Outpatients , Adolescent , Child , Humans , Pain Management , Pain
2.
Environ Int ; 127: 495-502, 2019 06.
Article in English | MEDLINE | ID: mdl-30981020

ABSTRACT

INTRODUCTION: Few studies have comprehensively characterized toxic chemicals related to waterpipe use and secondhand waterpipe exposure. This cross-sectional study investigated biomarkers of toxicants associated with waterpipe use and passive waterpipe exposure among employees at waterpipe venues. METHODS: We collected urine specimens from employees at waterpipe venues in Istanbul, Turkey, and Moscow, Russia, and identified waterpipe and cigarette smoking status based on self-report. The final sample included 110 employees. Biomarkers of exposure to sixty chemicals (metals, volatile organic compounds (VOCs), polycyclic aromatic hydrocarbons (PAHs), nicotine, and heterocyclic aromatic amines (HCAAs)) were quantified in the participants' urine. RESULTS: Participants who reported using waterpipe had higher urinary manganese (geometric mean ratio (GMR): 2.42, 95% confidence interval (CI): 1.16, 5.07) than never/former waterpipe or cigarette smokers. Being exposed to more hours of secondhand smoke from waterpipes was associated with higher concentrations of cobalt (GMR: 1.38, 95% CI: 1.10, 1.75). Participants involved in lighting waterpipes had higher urinary cobalt (GMR: 1.43, 95% CI: 1.10, 1.86), cesium (GMR: 1.21, 95% CI: 1.00, 1.48), molybdenum (GMR: 1.45, 95% CI: 1.08, 1.93), 1-hydroxypyrene (GMR: 1.36, 95% CI: 1.03, 1.80), and several VOC metabolites. CONCLUSION: Waterpipe tobacco users and nonsmoking employees of waterpipe venues had higher urinary concentrations of several toxic metals, including manganese and cobalt, as well as of VOCs, in a signature distinct from that of cigarette smoke. Employees involved in lighting waterpipes may have higher exposure to multiple toxic chemicals compared with other employees.


Subject(s)
Occupational Exposure , Tobacco Smoke Pollution/analysis , Tobacco, Waterpipe , Water Pipe Smoking , Adult , Biomarkers/analysis , Cross-Sectional Studies , Female , Hazardous Substances/analysis , Humans , Male , Nicotine/analysis , Polycyclic Aromatic Hydrocarbons/analysis , Volatile Organic Compounds/analysis , Young Adult
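The geometric mean ratios (GMRs) reported above compare geometric means between exposure groups; on the log scale, a GMR is simply an exponentiated difference of means. The study's actual estimates come from regression models with covariate adjustment, which this minimal sketch omits, and the urinary manganese values below are hypothetical:

```python
import math
from statistics import mean

def geometric_mean_ratio(exposed, unexposed):
    """GMR = ratio of geometric means, i.e. exp of the difference of
    log-scale means (no covariate adjustment in this sketch)."""
    log_diff = mean(map(math.log, exposed)) - mean(map(math.log, unexposed))
    return math.exp(log_diff)

# hypothetical urinary Mn concentrations (ug/L), not the study's data
waterpipe_users = [1.8, 2.6, 3.1, 2.2]
never_smokers   = [0.9, 1.3, 1.55, 1.1]
print(round(geometric_mean_ratio(waterpipe_users, never_smokers), 2))  # 2.0
```

A GMR above 1 indicates higher typical concentrations in the exposed group; the published CIs quantify how precisely that ratio is estimated.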
4.
J Public Health Manag Pract ; 25 Suppl 1, Lead Poisoning Prevention: S23-S30, 2019.
Article in English | MEDLINE | ID: mdl-30507766

ABSTRACT

CONTEXT: The Lead and Multielement Proficiency (LAMP) program is an external quality assurance program promoting high-quality blood-lead measurements. OBJECTIVES: To investigate the ability of US laboratories participating in the Centers for Disease Control and Prevention (CDC) LAMP program to accurately measure blood-lead levels (BLLs) of 0.70 to 47.5 µg/dL using evaluation criteria of ±2 µg/dL or 10%, whichever is greater. METHODS: The CDC distributes bovine blood specimens to participating laboratories 4 times per year. We evaluated participant performance over 5 challenges on samples with BLLs between 0.70 and 47.5 µg/dL. The CDC sent 15 pooled samples (3 samples shipped in 5 rounds) to US laboratories. The LAMP laboratories used 3 primary technologies to analyze lead in blood: inductively coupled plasma mass spectrometry, graphite furnace atomic absorption spectroscopy, and LeadCare technologies based on anodic stripping voltammetry. Laboratories reported their BLL analytical results to the CDC, and LAMP used these results to provide performance feedback to the laboratories. SETTING: The CDC sent blood samples to approximately 50 US laboratories for lead analysis. PARTICIPANTS: Of the approximately 200 laboratories enrolled in LAMP, 38 to 46 US laboratories provided data used in this report (January 2017 to March 2018). RESULTS: Laboratory precision ranged from 0.26 µg/dL for inductively coupled plasma mass spectrometry to 1.50 µg/dL for LeadCare instruments. Participating US LAMP laboratories reported accurate BLLs for 89% of challenge samples using the ±2 µg/dL or 10% evaluation criteria. CONCLUSIONS: Laboratories participating in the CDC's LAMP program can accurately measure blood lead under the current Clinical Laboratory Improvement Amendments of 1988 (CLIA) guidance of ±4 µg/dL or ±10%, with a success rate of 96%. However, when we apply limits of ±2 µg/dL or ±10%, the success rate drops to 89%.
When challenged with samples that have target values between 3 and 5 µg/dL, nearly 100% of reported results fall within ±4 µg/dL, while 5% of the results fall outside the acceptability criteria used by the CDC's LAMP program. As public health focuses on lower blood lead levels, laboratories must evaluate their ability to meet the analytical challenges of measuring lower blood lead concentrations. In addition, the proposed CLIA limits (±2 µg/dL or 10%) would be achievable for a majority of US laboratories participating in the LAMP program.


Subject(s)
Clinical Laboratory Techniques/standards , Lead/analysis , Quality Assurance, Health Care/methods , Centers for Disease Control and Prevention, U.S./organization & administration , Centers for Disease Control and Prevention, U.S./statistics & numerical data , Clinical Laboratory Techniques/methods , Clinical Laboratory Techniques/statistics & numerical data , Humans , Lead/blood , Program Development/methods , Quality Assurance, Health Care/statistics & numerical data , United States
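The "±2 µg/dL or 10%, whichever is greater" acceptability rule quoted in this abstract can be expressed directly. This is a sketch of the criterion as stated, not CDC's actual grading code:

```python
def within_criterion(reported, target, abs_limit=2.0, rel_limit=0.10):
    """Sketch of the stated rule: a result passes if it falls within
    +/- abs_limit (ug/dL) or +/- rel_limit * target, whichever is greater."""
    allowed = max(abs_limit, rel_limit * target)
    return abs(reported - target) <= allowed

print(within_criterion(4.9, 3.0))    # True: |1.9| <= max(2.0, 0.3)
print(within_criterion(40.0, 47.5))  # False: |7.5| > max(2.0, 4.75)
```

Note how the absolute limit dominates at low target values (the range most relevant to the new reference value), while the 10% relative limit dominates above 20 µg/dL.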
5.
Environ Int ; 122: 310-315, 2019 01.
Article in English | MEDLINE | ID: mdl-30503317

ABSTRACT

INTRODUCTION: Cross-sectional studies suggest that postnatal blood lead (PbB) concentrations are negatively associated with child growth. Few studies have prospectively examined this association in populations with lower PbB concentrations. We investigated longitudinal associations of childhood PbB concentrations and subsequent anthropometric measurements in a multi-ethnic cohort of girls. METHODS: Data were from the Breast Cancer and the Environment Research Program at three sites in the United States (U.S.): New York City, Cincinnati, and the San Francisco Bay Area. Girls were enrolled at ages 6-8 years in 2004-2007. Girls with PbB concentrations collected at ≤10 years old (mean 7.8 years, standard deviation (SD) 0.82) and anthropometry collected at ≥3 follow-up visits were included (n = 683). The median PbB concentration was 0.99 µg/dL (10th percentile = 0.59 µg/dL and 90th percentile = 2.00 µg/dL) and the geometric mean was 1.03 µg/dL (95% Confidence Interval (CI): 0.99, 1.06). For analyses, PbB concentrations were dichotomized as <1 µg/dL (n = 342) and ≥1 µg/dL (n = 341). Anthropometric measurements of height, body mass index (BMI), waist circumference (WC), and percent body fat (%BF) were collected at enrollment and follow-up visits through 2015. Linear mixed effects regression estimated how PbB concentrations related to changes in girls' measurements from ages 7-14 years. RESULTS: At 7 years, the mean difference in height was -2.0 cm (95% CI: -3.0, -1.0) for girls with ≥1 µg/dL versus <1 µg/dL PbB concentrations; differences persisted, but were attenuated, with age to -1.5 cm (95% CI: -2.5, -0.4) at 14 years. Mean differences for BMI, WC, and %BF at 7 years between girls with ≥1 µg/dL versus <1 µg/dL PbB concentrations were -0.7 kg/m2 (95% CI: -1.2, -0.2), -2.2 cm (95% CI: -3.8, -0.6), and -1.8% (95% CI: -3.2, -0.4), respectively.
Overall, these differences generally persisted with advancing age; at 14 years, differences were -0.8 kg/m2 (95% CI: -1.5, -0.02), -2.9 cm (95% CI: -4.8, -0.9), and -1.7% (95% CI: -3.1, -0.4) for BMI, WC, and %BF, respectively. CONCLUSIONS: These findings suggest that higher concentrations of PbB during childhood, even though relatively low by screening standards, may be inversely associated with anthropometric measurements in girls.


Subject(s)
Body Mass Index , Environmental Exposure , Lead/blood , Waist Circumference , Adolescent , Child , Cross-Sectional Studies , Environmental Exposure/analysis , Environmental Exposure/statistics & numerical data , Female , Humans , New York City/epidemiology
6.
Nutrients ; 10(7)2018 Jul 06.
Article in English | MEDLINE | ID: mdl-29986412

ABSTRACT

We estimated iodine status (median urinary iodine concentration (mUIC), µg/L) for the US population (6 years and over; n = 4613) and for women of reproductive age (WRA) (15-44 years; n = 901). We estimated mean intake of key iodine sources by race and Hispanic origin. We present the first national estimates of mUIC for non-Hispanic Asian persons and examine the intake of soy products, a potential source of goitrogens. One-third of National Health and Nutrition Examination Survey (NHANES) participants in 2011-2014 provided casual urine samples; UIC was measured in these samples. We assessed dietary intake with one 24-h recall and created food groups using the USDA's food/beverage coding scheme. For WRA, mUIC was 110 µg/L. For both non-Hispanic white (106 µg/L) and non-Hispanic Asian (81 µg/L) WRA, mUIC was significantly lower than mUIC among Hispanic WRA (133 µg/L). Non-Hispanic black WRA had a mUIC of 124 µg/L. Dairy consumption was significantly higher among non-Hispanic white WRA (162 g) compared to non-Hispanic black WRA (113 g). Soy consumption was also higher among non-Hispanic Asian WRA (18 g) compared to non-Hispanic black WRA (1 g). Differences in the consumption patterns of key sources of iodine and goitrogens may put subgroups of individuals at risk of mild iodine deficiency. Continued monitoring of iodine status and variations in consumption patterns is needed.


Subject(s)
Deficiency Diseases/prevention & control , Diet , Iodine/administration & dosage , Nutritional Status , Reproductive Health , Sodium Chloride, Dietary/administration & dosage , Women's Health , Adolescent , Adult , Age Factors , Biomarkers/urine , Deficiency Diseases/diagnosis , Deficiency Diseases/ethnology , Diet/adverse effects , Female , Humans , Iodine/deficiency , Iodine/urine , Nutrition Surveys , Nutritive Value , Recommended Dietary Allowances , Reproductive Health/ethnology , Sex Factors , Sodium Chloride, Dietary/urine , United States/epidemiology , Women's Health/ethnology , Young Adult
7.
Clin Chim Acta ; 485: 1-6, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29894782

ABSTRACT

BACKGROUND: Comprehensive information on the effect of storage time and temperature on the measurement of elements in human whole blood (WB) by inductively coupled plasma-dynamic reaction cell-mass spectrometry (ICP-DRC-MS) is lacking, particularly for Mn and Se. METHODS: Human WB was spiked at 3 concentration levels, dispensed, and then stored at 5 different temperatures: -70 °C, -20 °C, 4 °C, 23 °C, and 37 °C. At 3 and 5 weeks, and at 2, 4, 6, 8, 10, 12, and 36 months, samples were analyzed for Pb, Cd, Mn, Se, and total Hg using ICP-DRC-MS. We fit the data with a multiple linear regression model including time and temperature as covariates, with the measurement value as the outcome. We used a ratio-based equivalence test to determine whether results from the test storage conditions (warmer temperatures and longer times) were comparable to the reference storage condition of 3 weeks at -70 °C. RESULTS: Model estimates for all elements in human WB samples stored in polypropylene cryovials at -70 °C were equivalent to estimates from samples stored at 37 °C for up to 2 months, at 23 °C for up to 10 months, and at -20 °C and 4 °C for up to 36 months. Model estimates for samples stored for 3 weeks at -70 °C were equivalent to estimates from samples stored for 2 months at -20 °C, 4 °C, 23 °C, and 37 °C; for 10 months at -20 °C, 4 °C, and 23 °C; and for 36 months at -20 °C and 4 °C. This equivalence held for all elements and pools except the low-concentration blood pool for Cd. CONCLUSIONS: Storage temperatures of -20 °C and 4 °C are equivalent to -70 °C for the stability of Cd, Mn, Pb, Se, and Hg in human whole blood for at least 36 months when blood is stored in sealed polypropylene vials. Increasing the sample storage temperature from -70 °C to -20 °C or 4 °C can lead to large energy savings. The best analytical results are obtained when storage time at higher temperatures (e.g., 23 °C and 37 °C) is minimized, because recovery of Se and Hg is reduced.
Blood samples stored in polypropylene cryovials also lose volume over time and develop clots at higher temperature conditions (e.g., 23 °C and 37 °C), making them unacceptable for elemental testing after 10 months and 2 months, respectively.


Subject(s)
Cadmium/blood , Lead/blood , Manganese/blood , Mercury/blood , Selenium/blood , Temperature , Humans , Mass Spectrometry , Time Factors
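The study's ratio-based equivalence logic can be sketched as a simple decision rule: a test storage condition is equivalent to the reference only if the whole confidence interval for the test/reference ratio lies inside pre-set limits. The ±10% limits and the data below are illustrative assumptions, not the paper's values:

```python
from statistics import mean

def mean_ratio(test, reference):
    """Point estimate: ratio of mean measured concentrations
    (test storage condition / reference condition)."""
    return mean(test) / mean(reference)

def equivalent(ci_low, ci_high, lo=0.90, hi=1.10):
    """Equivalence is declared only if the entire confidence interval
    for the ratio sits inside the limits (+/-10% here is an assumption)."""
    return lo <= ci_low and ci_high <= hi

# hypothetical Pb results (ug/dL): 36 months at 4 C vs. 3 weeks at -70 C
print(round(mean_ratio([3.1, 2.9, 3.0], [3.0, 3.0, 3.0]), 3))
print(equivalent(0.97, 1.04))   # True
print(equivalent(0.85, 1.02))   # False
```

The key design point is that equivalence testing reverses the usual burden of proof: a wide, uninformative interval fails, rather than passing for lack of a detected difference.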
8.
JAMA Netw Open ; 1(8): e185937, 2018 12 07.
Article in English | MEDLINE | ID: mdl-30646298

ABSTRACT

Importance: Use of electronic cigarettes (e-cigarettes) is increasing. Measures of exposure to known tobacco-related toxicants among e-cigarette users will inform potential health risks to individual product users. Objectives: To estimate concentrations of tobacco-related toxicants among e-cigarette users and compare these biomarker concentrations with those observed in combustible cigarette users, dual users, and never tobacco users. Design, Setting, and Participants: A population-based, longitudinal cohort study was conducted in the United States in 2013-2014. Cross-sectional analysis was performed between November 4, 2016, and October 5, 2017, of biomarkers of exposure to tobacco-related toxicants collected by the Population Assessment of Tobacco and Health Study. Participants included adults who provided a urine sample and data on tobacco use (N = 5105). Exposures: The primary exposure was tobacco use, including current exclusive e-cigarette users (n = 247), current exclusive cigarette smokers (n = 2411), and users of both products (dual users) (n = 792) compared with never tobacco users (n = 1655). Main Outcomes and Measures: Geometric mean concentrations of 50 individual biomarkers from 5 major classes of tobacco product constituents were measured: nicotine, tobacco-specific nitrosamines (TSNAs), metals, polycyclic aromatic hydrocarbons (PAHs), and volatile organic compounds (VOCs). Results: Of the 5105 participants, most were aged 35 to 54 years (weighted percentage, 38%; 95% CI, 35%-40%), women (60%; 95% CI, 59%-62%), and non-Hispanic white (61%; 95% CI, 58%-64%). Compared with exclusive e-cigarette users, never users had 19% to 81% significantly lower concentrations of biomarkers of exposure to nicotine, TSNAs, some metals (eg, cadmium and lead), and some VOCs (including acrylonitrile). 
Exclusive e-cigarette users showed 10% to 98% significantly lower concentrations of biomarkers of exposure, including TSNAs, PAHs, most VOCs, and nicotine, compared with exclusive cigarette smokers; concentrations were comparable for metals and 3 VOCs. Exclusive cigarette users showed 10% to 36% lower concentrations of several biomarkers than dual users. Frequency of cigarette use among dual users was positively correlated with nicotine and toxicant exposure. Conclusions and Relevance: Exclusive use of e-cigarettes appears to result in measurable exposure to known tobacco-related toxicants, generally at lower levels than cigarette smoking. Toxicant exposure is greatest among dual users, and frequency of combustible cigarette use is positively correlated with tobacco toxicant concentration. These findings provide evidence that using combusted tobacco cigarettes alone or in combination with e-cigarettes is associated with higher concentrations of potentially harmful tobacco constituents in comparison with using e-cigarettes alone.


Subject(s)
Inhalation Exposure/analysis , Nicotine/urine , Nitrosamines/urine , Smoking , Vaping , Adult , Biomarkers/urine , Cross-Sectional Studies , Female , Humans , Male , Metals/urine , Middle Aged , Polycyclic Aromatic Hydrocarbons/urine , Smoking/epidemiology , Smoking/urine , United States/epidemiology , Vaping/epidemiology , Vaping/urine
9.
At Spectrosc ; 39(3): 95-99, 2018 May 01.
Article in English | MEDLINE | ID: mdl-32336845

ABSTRACT

A probing study to establish a reliable and robust method for determining iodine concentration using the ELAN® DRC™ II ICP-MS, in combination with a sample digestion and filtration step, was performed. Dairy products from locally available sources were evaluated to help determine the possibility of, and need for, further evaluations in relation to the U.S. population's iodine intake. Prior to analysis, the samples were aliquoted and digested for 3 hours at 90 ± 3 °C. Dilution and filtration were performed following digestion. The sample extract was analyzed, and the results were confirmed with NIST SRM 1549a Whole Milk Powder. Further experimentation will be needed to optimize the method for the projected sample concentrations and throughput.

10.
At Spectrosc ; 39(6): 219-228, 2018 Dec.
Article in English | MEDLINE | ID: mdl-32336846

ABSTRACT

The Centers for Disease Control and Prevention's (CDC) Environmental Health Laboratory uses modified versions of inductively coupled plasma mass spectrometry (ICP-MS) analytical methods to quantify metal contamination present in items that will come into contact with patient samples during the pre-analytical, analytical, and post-analytical stages. This lot screening process reduces the likelihood of introducing contamination, which can lead to falsely elevated results. This is particularly important for biomonitoring measurements in humans, which tend to be near the limits of detection of many methods. The fundamental requirements for a lot screening program in terms of facilities and processes are presented, along with a discussion of sample preparation techniques used for lot screening. The criteria used to evaluate the lot screening data and determine the acceptability of a particular manufacturing lot are presented as well. As a result of lot testing, unsuitable manufactured lots are identified and excluded from use.

11.
Anal Methods ; 9(23): 3464-3476, 2017.
Article in English | MEDLINE | ID: mdl-29201158

ABSTRACT

The Centers for Disease Control and Prevention developed a biomonitoring method to rapidly and accurately quantify chromium and cobalt in human whole blood by ICP-MS. Many metal-on-metal hip implants, which contain significant amounts of chromium and cobalt, are susceptible to metal degradation. This method is used to gather data on the chromium and cobalt exposure of the portion of the U.S. population that does not have metal-on-metal hip implants, so that a baseline reference value in blood can be established. We evaluated parameters such as helium gas flow rate, the choice and composition of the diluent solution for sample preparation, and sample rinse time to determine the optimal conditions for analysis. The limits of detection for chromium and cobalt in blood were determined to be 0.41 and 0.06 µg/L, respectively. Method precision, accuracy, and recovery were determined using quality control material created in-house and historical proficiency testing samples. We conducted experiments to determine whether quantitative changes in the method parameters affect the results, changing four parameters while analyzing human whole blood spiked with National Institute of Standards and Technology (NIST)-traceable materials: the dilution factor used during sample preparation, sample rinse time, diluent composition, and kinetic energy discrimination gas flow rate. The results at the increased and decreased levels for each parameter were statistically compared to the results obtained at the optimized parameters. We assessed the degree of reproducibility obtained under a variety of conditions and evaluated the method's robustness by analyzing the same set of proficiency testing samples by different analysts, on different instruments, with different reagents, and on different days.
The short-term stability of chromium and cobalt in human blood samples stored at room temperature was monitored over a time period of 64 hours by diluting and analyzing samples at different time intervals. The stability of chromium and cobalt post-dilution was also evaluated over a period of 48 hours and at two storage temperatures (room temperature and refrigerated at 4°C). The results obtained during the stability studies showed that chromium and cobalt are stable in human blood for a period of 64 hours.

12.
Pediatrics ; 140(2)2017 Aug.
Article in English | MEDLINE | ID: mdl-28771411

ABSTRACT

In 2012, the Centers for Disease Control and Prevention (CDC) adopted its Advisory Committee on Childhood Lead Poisoning Prevention recommendation to use a population-based reference value to identify children and environments associated with lead hazards. The current reference value of 5 µg/dL is calculated as the 97.5th percentile of the distribution of blood lead levels (BLLs) in children 1 to 5 years old from 2007 to 2010 NHANES data. We calculated and updated selected percentiles, including the 97.5th percentile, by using NHANES 2011 to 2014 blood lead data, and we examined demographic characteristics of children whose blood lead was at or above the 90th-percentile value. The 97.5th-percentile BLL of 3.48 µg/dL highlighted analytical laboratory and clinical interpretation challenges of blood lead measurements ≤5 µg/dL. A review of 5 years of results for target blood lead values <11 µg/dL for US clinical laboratories participating in the CDC's voluntary Lead and Multi-Element Proficiency quality assurance program showed that 40% were unable to quantify lead and reported a nondetectable result at a target blood lead value of 1.48 µg/dL, compared with 5.5% at a target BLL of 4.60 µg/dL. We describe actions taken at the CDC's Environmental Health Laboratory in the National Center for Environmental Health, which measures blood lead for NHANES, to improve analytical accuracy and precision and to reduce external lead contamination during blood collection and analysis.


Subject(s)
Lead Poisoning/blood , Lead Poisoning/prevention & control , Lead/blood , Child, Preschool , Female , Humans , Infant , Laboratory Proficiency Testing , Male , Mass Screening , Nutrition Surveys , Quality Assurance, Health Care , Reference Values , United States
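The reference value described above is simply a high percentile of the measured BLL distribution. A linear-interpolation percentile can be sketched as follows; the real NHANES-based calculation additionally applies survey weights, which this sketch omits, and the data here are hypothetical:

```python
def percentile(values, p):
    """Percentile by linear interpolation (p in 0-100). NHANES-based
    reference values also apply survey weights, omitted in this sketch."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(xs) - 1)
    return xs[f] + (xs[c] - xs[f]) * (k - f)

# hypothetical blood lead values (ug/dL), not NHANES data
blls = [0.05 * i for i in range(1, 101)]
print(round(percentile(blls, 97.5), 3))  # 97.5th percentile of this sample
```

Because the reference value sits in the upper tail, its stability depends on both sample size and the laboratory's ability to quantify low concentrations, which is exactly the measurement challenge the abstract raises.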
14.
Lancet Glob Health ; 5(4): e458-e466, 2017 04.
Article in English | MEDLINE | ID: mdl-28153514

ABSTRACT

BACKGROUND: Outbreaks of unexplained illness frequently remain under-investigated. In India, outbreaks of an acute neurological illness with high mortality among children occur annually in Muzaffarpur, the country's largest litchi cultivation region. In 2014, we aimed to investigate the cause and risk factors for this illness. METHODS: In this hospital-based surveillance and nested age-matched case-control study, we did laboratory investigations to assess potential infectious and non-infectious causes of this acute neurological illness. Cases were children aged 15 years or younger who were admitted to two hospitals in Muzaffarpur with new-onset seizures or altered sensorium. Age-matched controls were residents of Muzaffarpur who were admitted to the same two hospitals for a non-neurologic illness within seven days of the date of admission of the case. Clinical specimens (blood, cerebrospinal fluid, and urine) and environmental specimens (litchis) were tested for evidence of infectious pathogens, pesticides, toxic metals, and other non-infectious causes, including presence of hypoglycin A or methylenecyclopropylglycine (MCPG), naturally-occurring fruit-based toxins that cause hypoglycaemia and metabolic derangement. Matched and unmatched (controlling for age) bivariate analyses were done and risk factors for illness were expressed as matched odds ratios and odds ratios (unmatched analyses). FINDINGS: Between May 26, and July 17, 2014, 390 patients meeting the case definition were admitted to the two referral hospitals in Muzaffarpur, of whom 122 (31%) died. On admission, 204 (62%) of 327 had blood glucose concentration of 70 mg/dL or less. 104 cases were compared with 104 age-matched hospital controls. Litchi consumption (matched odds ratio [mOR] 9·6 [95% CI 3·6 - 24]) and absence of an evening meal (2·2 [1·2-4·3]) in the 24 h preceding illness onset were associated with illness. 
The absence of an evening meal significantly modified the effect of eating litchis on illness (odds ratio [OR] 7·8 [95% CI 3·3-18·8] without an evening meal; OR 3·6 [1·1-11·1] with an evening meal). Tests for infectious agents and pesticides were negative. Metabolites of hypoglycin A, MCPG, or both were detected in 48 (66%) of 73 urine specimens from case-patients and none from 15 controls; 72 (90%) of 80 case-patient specimens had abnormal plasma acylcarnitine profiles, consistent with severe disruption of fatty acid metabolism. In 36 litchi arils tested from Muzaffarpur, hypoglycin A concentrations ranged from 12·4 µg/g to 152·0 µg/g and MCPG ranged from 44·9 µg/g to 220·0 µg/g. INTERPRETATION: Our investigation suggests an outbreak of acute encephalopathy in Muzaffarpur associated with both hypoglycin A and MCPG toxicity. To prevent illness and reduce mortality in the region, we recommended minimising litchi consumption, ensuring receipt of an evening meal, and implementing rapid glucose correction for suspected illness. A comprehensive investigative approach in Muzaffarpur led to timely public health recommendations, underscoring the importance of using systematic methods in other unexplained illness outbreaks. FUNDING: US Centers for Disease Control and Prevention.


Subject(s)
Acute Febrile Encephalopathy/diagnosis , Disease Outbreaks/statistics & numerical data , Fruit/toxicity , Litchi/toxicity , Neurotoxicity Syndromes/diagnosis , Acute Febrile Encephalopathy/epidemiology , Acute Febrile Encephalopathy/etiology , Adolescent , Case-Control Studies , Child , Cyclopropanes/analysis , Female , Glycine/analogs & derivatives , Glycine/analysis , Humans , Hypoglycins/analysis , India , Male , Neurotoxicity Syndromes/epidemiology , Neurotoxicity Syndromes/etiology , Odds Ratio
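The matched odds ratios above come from an age-matched case-control design. For a single binary exposure in 1:1 matched data, the classic estimator is the ratio of discordant pairs (the study itself reports conditional estimates with confidence intervals, which this sketch does not reproduce). The pair data below are hypothetical:

```python
def matched_odds_ratio(pairs):
    """mOR for 1:1 matched case-control data: pairs where only the case
    was exposed, divided by pairs where only the control was exposed.
    Concordant pairs carry no information and are ignored."""
    b = sum(1 for case_exp, ctrl_exp in pairs if case_exp and not ctrl_exp)
    c = sum(1 for case_exp, ctrl_exp in pairs if ctrl_exp and not case_exp)
    return b / c

# hypothetical (case exposed, matched control exposed) indicators
pairs = [(True, False)] * 9 + [(False, True)] * 3 + [(True, True)] * 5
print(matched_odds_ratio(pairs))  # 3.0
```

Note that the five concordant pairs do not enter the estimate at all; matching is only informative where case and control differ on exposure.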
15.
Talanta ; 162: 114-122, 2017 Jan 01.
Article in English | MEDLINE | ID: mdl-27837806

ABSTRACT

We improved our inductively coupled plasma mass spectrometry (ICP-MS) whole blood method [1] for determination of lead (Pb), cadmium (Cd), and mercury (Hg) by including manganese (Mn) and selenium (Se), and expanding the calibration range of all analytes. The method is validated on a PerkinElmer (PE) ELAN® DRC II ICP-MS (ICP-DRC-MS) and uses the Dynamic Reaction Cell (DRC) technology to attenuate interfering background ion signals via ion-molecule reactions. Methane gas (CH4) eliminates background signal from 40Ar2+ to permit determination of 80Se+, and oxygen gas (O2) eliminates several polyatomic interferences (e.g. 40Ar15N+, 54Fe1H+) on 55Mn+. Hg sensitivity in DRC mode is a factor of two higher than in vented mode when measured under the same DRC conditions as Mn, due to collisional focusing of the ion beam. To compensate for the expanded method's longer analysis time (due to DRC mode pause delays), we implemented an SC4-FAST autosampler (ESI Scientific, Omaha, NE), which vacuum loads the sample onto a loop, to keep the sample-to-sample measurement time to less than 5 min, allowing for preparation and analysis of 60 samples in an 8-h work shift. The longer analysis time also resulted in faster breakdown of the hydrocarbon oil in the interface roughing pump. Replacing the standard roughing pump with a pump using a fluorinated lubricant, Fomblin®, extended the time between pump maintenance. We optimized the diluent and rinse solution components to reduce carryover from high-concentration samples and prevent the formation of precipitates. We performed a robust calculation to determine the following limits of detection (LOD) in whole blood: 0.07 µg/dL for Pb, 0.10 µg/L for Cd, 0.28 µg/L for Hg, 0.99 µg/L for Mn, and 24.5 µg/L for Se.


Subject(s)
Dietary Exposure/analysis , Environmental Monitoring/methods , Inhalation Exposure/analysis , Mass Spectrometry/methods , Trace Elements/blood , Cadmium/blood , Calibration , Environmental Monitoring/instrumentation , Humans , Lead/blood , Manganese/blood , Mercury/blood , Quality Control , Reference Standards , Reproducibility of Results , Selenium/blood , Trace Elements/standards
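One common LOD convention, sketched below, takes the mean plus three standard deviations of replicate low-level (blank-like) measurements; the "robust calculation" the abstract mentions may differ in detail, and the readings here are hypothetical:

```python
from statistics import mean, stdev

def lod_3sd(low_level_replicates):
    """A common LOD convention: mean + 3 * SD of replicate measurements
    of a low-concentration sample. This is an illustrative sketch, not
    necessarily the paper's robust procedure."""
    return mean(low_level_replicates) + 3 * stdev(low_level_replicates)

# hypothetical replicate Cd readings near blank (ug/L)
print(round(lod_3sd([0.00, 0.02, 0.04]), 4))  # 0.08
```

Any result below the LOD is reported as nondetectable, which is why detection limits matter so much as target concentrations of interest fall.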
16.
Am J Clin Nutr ; 104 Suppl 3: 898S-901S, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27534636

ABSTRACT

The accurate assessment of population iodine status is necessary to inform public health policies and clinical research on iodine nutrition, particularly the role of iodine adequacy in normal neurodevelopment. Urinary iodine concentration (UIC) directly reflects dietary iodine intake and is the most common indicator used worldwide to assess population iodine status. The CDC established the Ensuring the Quality of Iodine Procedures program in 2001 to provide laboratories that measure urinary iodine with an independent assessment of their analytic performance; this program fosters improvement in the assessment of UIC. Clinical laboratory tests of thyroid function (including serum concentrations of the pituitary hormone thyrotropin and the thyroid hormones thyroxine and triiodothyronine) are sometimes used as indicators of iodine status, although such use is often problematic. Even in severely iodine-deficient regions, there is a great deal of intraindividual variation in the ability of the thyroid to adapt. In most settings and in most population subgroups other than newborns, thyroid function tests are not considered sensitive indicators of population iodine status. However, the thyroid-derived protein thyroglobulin is increasingly being used for this purpose. Thyroglobulin can be measured in either serum or dried blood spot (DBS) samples. The use of DBS samples is advantageous in resource-poor regions. Improved methodologies for ascertaining maternal iodine status are needed to facilitate research on developmental correlates of iodine status. Thyroglobulin may prove to be a useful biomarker for both maternal and neonatal iodine status, but validated assay-specific reference ranges are needed for the determination of iodine sufficiency in both pregnant women and neonates, and trimester-specific ranges are possibly needed for pregnant women. 
UIC is currently a well-validated population biomarker, but individual biomarkers that could be used for research, patient care, and public health are lacking.
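Because UIC is a population biomarker, status is judged from the median of spot-urine values rather than from any individual result. A minimal sketch of that classification, using the published WHO epidemiologic cutoffs for school-age children (the cutoffs come from WHO guidance, not from this abstract):

```python
from statistics import median

# WHO epidemiologic criteria for school-age children, applied to the
# MEDIAN urinary iodine concentration (UIC, ug/L) of a population.
# This is a population indicator, not an individual diagnostic.
WHO_SAC_CUTOFFS = [
    (20,  "severe deficiency"),
    (50,  "moderate deficiency"),
    (100, "mild deficiency"),
    (200, "adequate"),
    (300, "above requirements"),
]

def classify_population_iodine(uic_ug_per_l):
    """Classify population iodine status from a list of spot UIC values."""
    med = median(uic_ug_per_l)
    for upper, label in WHO_SAC_CUTOFFS:
        if med < upper:
            return med, label
    return med, "excessive"
```

For example, a survey with a median UIC of 120 ug/L would be classified as iodine-adequate even if many individual values fall below 100 ug/L, which is precisely why the abstract notes the lack of validated individual-level biomarkers.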


Subject(s)
Iodine/blood , Nutrition Assessment , Nutritional Status , Thyroglobulin/blood , Thyroid Gland/metabolism , Adult , Biomarkers/blood , Developing Countries , Female , Humans , Infant, Newborn , Iodine/deficiency , Pregnancy , Public Health , Thyroid Function Tests , Thyrotropin , Thyroxine
17.
J Radioanal Nucl Chem ; 301(1): 285-291, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27375308

ABSTRACT

Quantification of 241Am in urine at low levels is important for assessing accidental, environmental, or terrorism-related internal contamination of individuals or populations, but no convenient, precise method has been established to rapidly determine these low levels. Here we report a new analytical method to measure 241Am, developed and validated at the Centers for Disease Control and Prevention (CDC), based on the selective retention of Am from urine directly on DGA resin, followed by SF-ICP-MS detection. The method provides rapid results with a limit of detection (LOD) of 0.22 pg/L (0.028 Bq/L), which is lower than 1/3 of the C/P CDG for 241Am at 5 days post-exposure. The results obtained by this method closely agree with CDC values as measured by liquid scintillation counting, and with National Institute of Standards and Technology (NIST) Certified Reference Material (CRM) target values.
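The abstract's equivalence between the mass-based LOD (0.22 pg/L) and the activity-based LOD (0.028 Bq/L) follows from the specific activity of 241Am, computed from its half-life (~432.6 y per standard nuclear data). A sketch of that conversion:

```python
import math

AVOGADRO = 6.02214076e23      # atoms/mol
T_HALF_AM241_Y = 432.6        # 241Am half-life in years (standard nuclear data)
SECONDS_PER_YEAR = 3.1557e7   # Julian year in seconds
M_AM241 = 241.06              # g/mol

def specific_activity_bq_per_g(molar_mass_g, half_life_s):
    """A = lambda * N for 1 g of a pure radionuclide."""
    lam = math.log(2) / half_life_s            # decay constant, 1/s
    n_atoms = AVOGADRO / molar_mass_g          # atoms in 1 g
    return lam * n_atoms                       # ~1.27e11 Bq/g for 241Am

def pg_per_l_to_bq_per_l(conc_pg_l):
    """Convert a mass concentration of 241Am (pg/L) to activity (Bq/L)."""
    sa = specific_activity_bq_per_g(M_AM241, T_HALF_AM241_Y * SECONDS_PER_YEAR)
    return conc_pg_l * 1e-12 * sa
```

Running this on the reported mass LOD of 0.22 pg/L reproduces the stated 0.028 Bq/L, confirming the two figures are the same limit expressed in different units.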

18.
Environ Res ; 149: 179-188, 2016 08.
Article in English | MEDLINE | ID: mdl-27208469

ABSTRACT

The sodium/iodide symporter (NIS) mediates uptake of iodide into thyroid follicular cells. This key step in thyroid hormone synthesis is inhibited by perchlorate, thiocyanate (SCN), and nitrate (NO3) anions. When these exposures occur during pregnancy, the resulting decreases in thyroid hormones may adversely affect neurodevelopment of the human fetus. Our objectives were to describe and examine the relationship of these anions to the serum thyroid indicators, thyroid stimulating hormone (TSH) and free thyroxine (FT4), in third-trimester women from the initial Vanguard Study of the National Children's Study (NCS); and to compare urine perchlorate results with those in pregnant women from the National Health and Nutrition Examination Survey (NHANES). Urinary perchlorate, SCN, NO3, and iodine, serum TSH, FT4, and cotinine were measured, and a food frequency questionnaire (FFQ) was administered to pregnant women enrolled in the initial Vanguard Study. We used multiple regression models of FT4 and TSH that included perchlorate equivalent concentration (PEC, which estimates the combined inhibitory effects of the anions perchlorate, SCN, and NO3 on the NIS). We used multiple regression to model predictors of each urinary anion, using FFQ results, drinking water source, season of year, smoking status, and demographic characteristics. Descriptive statistics were calculated for pregnant women in NHANES 2001-2012. The geometric mean (GM) for urinary perchlorate was 4.04 µg/L, for TSH 1.46 mIU/L, and the arithmetic mean for FT4 was 1.11 ng/dL in 359 NCS women. In 330 women with completed FFQs, consumption of leafy greens, winter season, and Hispanic ethnicity were significant predictors of higher urinary perchlorate, which differed significantly by study site and primary drinking water source; bottled water was associated with higher urinary perchlorate compared to filtered tap water. Leafy greens consumption was associated with higher urinary NO3 and higher urinary SCN. There was no association between urinary perchlorate or PEC and TSH or FT4, even for women with urinary iodine <100 µg/L. GM urinary perchlorate concentrations in the full sample (n=494) of third-trimester NCS women (4.03 µg/L) were similar to those of pregnant women in NHANES (3.58 µg/L).
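A PEC combines the three anions weighted by their relative inhibitory potency at the NIS. The abstract does not state the weighting it used; the sketch below assumes one published molar-potency scheme (thiocyanate ~1/15 and nitrate ~1/240 the potency of perchlorate, per Tonacchera et al.), so the weights are an assumption, not the study's formula:

```python
# Molar masses (g/mol) of the three NIS-inhibiting anions.
MOLAR_MASS = {"perchlorate": 99.45, "thiocyanate": 58.08, "nitrate": 62.00}

# Relative molar potencies at the NIS -- ASSUMED weighting (Tonacchera et al.);
# the abstract does not specify the scheme it used.
POTENCY = {"perchlorate": 1.0, "thiocyanate": 1 / 15, "nitrate": 1 / 240}

def pec_ug_per_l(conc_ug_per_l):
    """Perchlorate equivalent concentration, expressed as ug/L of perchlorate.

    conc_ug_per_l: dict of urinary concentrations in ug/L for the three anions.
    Each anion is converted to umol/L, potency-weighted, summed, then
    re-expressed as an equivalent mass of perchlorate.
    """
    pec_umol = sum(
        (conc_ug_per_l[anion] / MOLAR_MASS[anion]) * POTENCY[anion]
        for anion in MOLAR_MASS
    )
    return pec_umol * MOLAR_MASS["perchlorate"]
```

With this weighting, a large thiocyanate burden (e.g., from smoking or cruciferous vegetables) can dominate the PEC even when urinary perchlorate itself is low, which is why the study modeled PEC alongside perchlorate alone.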


Subject(s)
Antithyroid Agents/pharmacology , Environmental Exposure , Nitrates/urine , Perchlorates/urine , Symporters/antagonists & inhibitors , Thiocyanates/urine , Thyrotropin/blood , Thyroxine/blood , Adult , Female , Humans , Nutrition Surveys , Pregnancy , Pregnancy Trimester, Third , Thyroid Function Tests , United States , Young Adult
19.
J Anal Toxicol ; 40(3): 222-8, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26912563

ABSTRACT

In this study, we evaluated the effect of temperature on the long-term stability of three mercury species in bovine blood. We used inductively coupled plasma mass spectrometry (ICP-MS) analysis to determine the concentrations of inorganic (iHg), methyl (MeHg), and ethyl (EtHg) mercury species in two blood pools stored at temperatures of -70, -20, 4, 23°C (room temperature), and 37°C. Over the course of a year, we analyzed aliquots of pooled specimens at time intervals of 1, 2, 4, and 6 weeks and 2, 4, 6, 8, 10, and 12 months. We applied a fixed-effects linear model, step-down pairwise comparisons, and coefficient-of-variation analysis to examine the effects of temperature and time on changes in mercury species concentrations. We observed several instances of statistically significant differences in mercury species concentrations between temperatures and time points; however, after accounting for experimental factors (such as instrumental drift and sample-preparation procedures), not all of these differences were scientifically meaningful. We concluded that iHg, MeHg, and EtHg species in bovine whole blood were stable at -70, -20, 4, and 23°C for 1 year, but blood samples stored at 37°C were stable for no more than 2 weeks.
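One of the screens the authors mention, the coefficient of variation across repeated measurements, can be sketched as a simple stability flag. This is a simplified stand-in (the CV threshold of 10% is an illustrative assumption, and the fixed-effects model the authors also used is not reproduced here):

```python
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation (%) of repeated concentration measurements."""
    return 100 * stdev(values) / mean(values)

def flag_unstable(series_by_temp, cv_limit=10.0):
    """Return storage temperatures whose measurement series exceed the CV limit.

    series_by_temp: dict mapping a temperature label to the list of
    concentrations measured over the storage time course at that temperature.
    cv_limit: illustrative threshold; a real study would justify this choice.
    """
    return [temp for temp, vals in series_by_temp.items()
            if cv_percent(vals) > cv_limit]
```

A series that drifts downward as a species degrades (as at 37°C here) inflates the CV far beyond ordinary analytical scatter, which is what makes this a useful first-pass screen before formal modeling.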


Subject(s)
Ethylmercury Compounds/blood , Mercury/blood , Methylmercury Compounds/blood , Drug Stability , Drug Storage , Ethylmercury Compounds/chemistry , Humans , Mass Spectrometry , Mercury/chemistry , Methylmercury Compounds/chemistry , Spectrophotometry, Atomic , Temperature , Time Factors
20.
Sci Total Environ ; 544: 701-10, 2016 Feb 15.
Article in English | MEDLINE | ID: mdl-26674699

ABSTRACT

There is little published literature on the efficacy of strategies to reduce exposure to residential well water arsenic. The objectives of our study were to: 1) determine if water arsenic remained a significant exposure source in households using bottled water or point-of-use treatment systems; and 2) evaluate the major sources and routes of any remaining arsenic exposure. We conducted a cross-sectional study of 167 households in Maine using one of these two strategies to prevent exposure to arsenic. Most households included one adult and at least one child. Untreated well water arsenic concentrations ranged from <10 µg/L to 640 µg/L. Urine samples, water samples, daily diet and bathing diaries, and household dietary and water use habit surveys were collected. Generalized estimating equations were used to model the relationship between urinary arsenic and untreated well water arsenic concentration, while accounting for documented consumption of untreated water and dietary sources. If mitigation strategies were fully effective, there should be no relationship between urinary arsenic and well water arsenic. To the contrary, we found that untreated arsenic water concentration remained a significant (p ≤ 0.001) predictor of urinary arsenic levels. When untreated water arsenic concentrations were <40 µg/L, untreated water arsenic was no longer a significant predictor of urinary arsenic. Time spent bathing (alone or in combination with water arsenic concentration) was not associated with urinary arsenic. A predictive analysis of the average study participant suggested that when untreated water arsenic ranged from 100 to 500 µg/L, elimination of any untreated water use would result in an 8%-32% reduction in urinary arsenic for young children, and a 14%-59% reduction for adults. These results demonstrate the importance of complying with a point-of-use or bottled water exposure reduction strategy. However, there remained unexplained, water-related routes of exposure.


Subject(s)
Arsenic/analysis , Drinking Water/chemistry , Environmental Exposure/statistics & numerical data , Water Pollutants, Chemical/analysis , Water Wells , Environmental Monitoring , Family Characteristics , Humans , Maine , Water Purification