Results 1 - 7 of 7
1.
Int J Drug Policy ; 104: 103695, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35472727

ABSTRACT

BACKGROUND: Recent reports of lead poisoning suggest that people who use opium may be exposed to high amounts of lead. Here, we investigate the association between opium use and blood lead levels (BLL) in a population-based cohort study.

METHODS: In 2017, we studied a random sample of 410 people who currently used opium (within both the past year and the past month) and 104 who did not, drawn from participants of the Golestan Cohort Study in northeast Iran. Participants were stratified by sex and tobacco use history, completed a comprehensive opiate and tobacco use questionnaire, and provided blood samples. BLL was measured with the LeadCare® II Blood Lead Test Kit and validated by inductively coupled plasma triple quadrupole mass spectrometry. BLL was categorized as "<5 µg/dL", "elevated" (5-10 µg/dL), "high" (10-50 µg/dL), and "very high" (above 50 µg/dL). To assess the association between BLL category and opiate use, route of consumption, and weekly dose, we used ordered logistic regression models and report odds ratios (OR) and 95% confidence intervals (CI) adjusted for age, sex, place of residence, education, occupation, household fuel type, and tobacco use.

RESULTS: Participants in the cohort used only raw (teriak) or refined (shireh) opium, which was smoked (45%, n = 184), taken orally (46%, n = 189), or both (9%, n = 37), for a mean duration of 24.2 (standard deviation: 11.6) years. The median BLL was significantly higher in people who currently used opium (11.4 µg/dL; IQR: 5.2-23.4) than in those who did not (2.3 µg/dL; IQR: 2.3-4.2), and the highest median BLL was seen with oral use (21.7 µg/dL; IQR: 12.1-34.1). BLL was <5 µg/dL in 79.8% of people with no opiate use, compared with only 22.7% of those using opium; among people using opium, BLL was elevated in 21.7%, high in 50.5%, and very high in 5.1%. About 95% of those with oral (180/189) or dual use (35/37), and 55% (102/184) of those who smoked opium, had blood lead levels above 5 µg/dL. The OR for the association between any opium use and each one-category increase in BLL was 10.5 (95% CI: 5.8-19.1); oral use was a very strong predictor of increasing BLL category (OR = 74.1; 95% CI: 35.1-156.3), with ORs of 38.8 (95% CI: 15.9-95.1) for dual use and 4.9 (95% CI: 2.6-9.1) for opium smoking. There was an independent dose-response association between average weekly dose and BLL among people using opium, both overall and when stratified by route of use.

CONCLUSION: Our results indicate that regular use of lead-adulterated opium can expose individuals to high levels of lead, which may contribute to the mortality and cancer risks associated with long-term opium use.
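As a rough illustration of the analysis described above, the sketch below fits an ordered logistic regression of BLL category on opium use and covariates. It is not the study's code: the file name, column names, and encodings are hypothetical; only the model form and the OR/CI reporting follow the abstract.

```python
# Hypothetical sketch of the abstract's ordered logistic regression; the
# data file and column names are invented, and covariates are assumed to
# be numerically encoded already.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("golestan_bll.csv")  # hypothetical input
# Ordered outcome: <5, 5-10 (elevated), 10-50 (high), >50 (very high) ug/dL
df["bll_cat"] = pd.Categorical(
    df["bll_cat"],
    categories=["<5", "elevated", "high", "very_high"],
    ordered=True,
)
covariates = ["opium_use", "age", "sex", "residence", "education",
              "occupation", "fuel_type", "tobacco_use"]
model = OrderedModel(df["bll_cat"], df[covariates], distr="logit")
res = model.fit(method="bfgs", disp=False)

# Odds ratios and 95% CIs per one-category increase in BLL
or_table = np.exp(pd.concat([res.params, res.conf_int()], axis=1))
print(or_table.loc[covariates])
```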


Subject(s)
Lead Poisoning , Opiate Alkaloids , Opium Dependence , Analgesics, Opioid , Cohort Studies , Humans , Lead , Opium , Opium Dependence/epidemiology
2.
Clin Chim Acta ; 485: 1-6, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29894782

ABSTRACT

BACKGROUND: Comprehensive information on the effect of storage time and temperature on the measurement of elements in human whole blood (WB) by inductively coupled plasma-dynamic reaction cell-mass spectrometry (ICP-DRC-MS) is lacking, particularly for Mn and Se.

METHODS: Human WB was spiked at 3 concentration levels, dispensed, and then stored at 5 different temperatures: -70 °C, -20 °C, 4 °C, 23 °C, and 37 °C. At 3 and 5 weeks, and at 2, 4, 6, 8, 10, 12, and 36 months, samples were analyzed for Pb, Cd, Mn, Se, and total Hg using ICP-DRC-MS. We fit the data with a multiple linear regression model including time and temperature as covariates and the measurement value as the outcome. We then used an equivalence test based on ratios to determine whether results from the test storage conditions (warmer temperatures and longer times) were comparable to the reference condition of 3 weeks at -70 °C.

RESULTS: Model estimates for all elements in human WB samples stored in polypropylene cryovials at -70 °C were equivalent to estimates from samples stored at 37 °C for up to 2 months, at 23 °C for up to 10 months, and at -20 °C and 4 °C for up to 36 months. Model estimates for samples stored for 3 weeks at -70 °C were equivalent to estimates from samples stored for 2 months at -20 °C, 4 °C, 23 °C, and 37 °C; for 10 months at -20 °C, 4 °C, and 23 °C; and for 36 months at -20 °C and 4 °C. This equivalence held for all elements and pools except the low-concentration blood pool for Cd.

CONCLUSIONS: Storage temperatures of -20 °C and 4 °C are equivalent to -70 °C for the stability of Cd, Mn, Pb, Se, and Hg in human whole blood for at least 36 months when blood is stored in sealed polypropylene vials. Increasing the sample storage temperature from -70 °C to -20 °C or 4 °C can therefore yield large energy savings. The best analytical results are obtained when storage time at higher temperatures (e.g., 23 °C and 37 °C) is minimized, because recovery of Se and Hg is reduced. Blood samples stored in polypropylene cryovials also lose volume over time and develop clots at higher temperatures (e.g., 23 °C and 37 °C), making them unacceptable for elemental testing after 10 months and 2 months, respectively.
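A minimal sketch of the modeling strategy described above, under stated assumptions: the data layout, variable names, and the 80-125% equivalence bounds are invented here; the abstract specifies the regression covariates and the ratio-based equivalence comparison but not these details.

```python
# Hypothetical sketch: linear model with storage time and temperature as
# covariates, then a ratio-based equivalence check against the reference
# condition (3 weeks at -70 C). Data layout and bounds are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wb_stability.csv")  # hypothetical: element, months, temp_c, value
fit = smf.ols("value ~ months + C(temp_c)", data=df[df["element"] == "Pb"]).fit()

ref = fit.predict(pd.DataFrame({"months": [0.75], "temp_c": [-70]})).iloc[0]
test = fit.predict(pd.DataFrame({"months": [36.0], "temp_c": [4]})).iloc[0]
ratio = test / ref
print(f"test/reference ratio = {ratio:.3f}; "
      f"equivalent = {0.80 <= ratio <= 1.25}")  # assumed bounds
```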


Subject(s)
Cadmium/blood , Lead/blood , Manganese/blood , Mercury/blood , Selenium/blood , Temperature , Humans , Mass Spectrometry , Time Factors
3.
Talanta ; 162: 114-122, 2017 Jan 01.
Article in English | MEDLINE | ID: mdl-27837806

ABSTRACT

We improved our inductively coupled plasma mass spectrometry (ICP-MS) whole blood method [1] for determination of lead (Pb), cadmium (Cd), and mercury (Hg) by including manganese (Mn) and selenium (Se) and expanding the calibration range of all analytes. The method is validated on a PerkinElmer (PE) ELAN® DRC II ICP-MS (ICP-DRC-MS) and uses Dynamic Reaction Cell (DRC) technology to attenuate interfering background ion signals via ion-molecule reactions. Methane gas (CH4) eliminates background signal from (40)Ar2+ to permit determination of (80)Se+, and oxygen gas (O2) eliminates several polyatomic interferences (e.g., (40)Ar(15)N+ and (54)Fe(1)H+) on (55)Mn+. Hg sensitivity in DRC mode is a factor of two higher than in vented mode when measured under the same DRC conditions as Mn, due to collisional focusing of the ion beam. To compensate for the expanded method's longer analysis time (due to DRC-mode pause delays), we implemented an SC4-FAST autosampler (ESI Scientific, Omaha, NE), which vacuum-loads the sample onto a loop, keeping the sample-to-sample measurement time to less than 5 min and allowing for preparation and analysis of 60 samples in an 8-h work shift. The longer analysis time also resulted in faster breakdown of the hydrocarbon oil in the interface roughing pump. Replacing the standard roughing pump with a pump using a fluorinated lubricant, Fomblin®, extended the time between pump maintenance. We optimized the diluent and rinse solution components to reduce carryover from high-concentration samples and to prevent the formation of precipitates. We performed a robust calculation to determine the following limits of detection (LOD) in whole blood: 0.07 µg/dL for Pb, 0.10 µg/L for Cd, 0.28 µg/L for Hg, 0.99 µg/L for Mn, and 24.5 µg/L for Se.
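The abstract mentions a "robust calculation" for the limits of detection without giving details; purely as a generic illustration, the sketch below uses the common 3 × SD-of-blanks estimate with made-up replicate values.

```python
# Generic LOD illustration (3 x standard deviation of blank replicates);
# this is not the paper's actual "robust calculation", and the values
# below are invented.
import numpy as np

blank_pb = np.array([0.021, 0.025, 0.019, 0.030, 0.024,
                     0.022, 0.027, 0.020, 0.026, 0.023])  # ug/dL, made up
lod_pb = 3 * blank_pb.std(ddof=1)
print(f"estimated Pb LOD ~ {lod_pb:.3f} ug/dL")
```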


Subject(s)
Dietary Exposure/analysis , Environmental Monitoring/methods , Inhalation Exposure/analysis , Mass Spectrometry/methods , Trace Elements/blood , Cadmium/blood , Calibration , Environmental Monitoring/instrumentation , Humans , Lead/blood , Manganese/blood , Mercury/blood , Quality Control , Reference Standards , Reproducibility of Results , Selenium/blood , Trace Elements/standards
4.
Radiat Prot Dosimetry ; 162(4): 618-24, 2014 Dec.
Article in English | MEDLINE | ID: mdl-24563523

ABSTRACT

Quantification of the isotopic composition of uranium in urine at low concentrations is important for assessing both military and civilian populations' exposure to uranium. However, no convenient, precise method has previously been established for the rapid determination of multiple uranium isotope ratios. Here, the authors report a new method to measure (234)U/(238)U, (235)U/(238)U, and (236)U/(238)U. It uses solid-phase chelation extraction (via TRU columns) of actinides from the urine matrix, followed by measurement on a magnetic sector field inductively coupled plasma mass spectrometer (SF-ICP-MS; Thermo Element XR) equipped with a high-efficiency nebulizer (Apex PFA microflow) coupled with a membrane desolvating nebulizer system (Aridus II™). This method provides rapid and reliable results and has been used successfully to analyse Certified Reference Materials.
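As a simple illustration of the ratio computation (not the published method's software), the sketch below forms the three isotope ratios from hypothetical integrated counts and propagates Poisson counting uncertainty.

```python
# Hypothetical sketch: uranium isotope ratios from SF-ICP-MS integrated
# counts, with Poisson counting-statistics uncertainty. Counts are invented.
import math

counts = {"U234": 1.2e3, "U235": 9.8e3, "U236": 4.1e2, "U238": 1.35e6}
for iso in ("U234", "U235", "U236"):
    ratio = counts[iso] / counts["U238"]
    rel_u = math.sqrt(1 / counts[iso] + 1 / counts["U238"])  # relative 1-sigma
    print(f"{iso}/U238 = {ratio:.3e} +/- {ratio * rel_u:.1e}")
```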


Subject(s)
Mass Spectrometry/methods , Uranium/urine , Environmental Exposure/adverse effects , Humans , Limit of Detection , Mass Spectrometry/instrumentation , Mass Spectrometry/statistics & numerical data , Radiation Monitoring , Radioactive Pollutants/adverse effects , Radioactive Pollutants/urine , Radioisotopes/adverse effects , Radioisotopes/urine , Solid Phase Extraction , Uranium/adverse effects
6.
Pediatrics ; 114(1): 19-26, 2004 Jul.
Article in English | MEDLINE | ID: mdl-15231903

ABSTRACT

OBJECTIVE: Some children in the United States continue to be exposed to levels of lead that increase their risk for lowered intellectual functioning and behavior problems. It is unclear whether chelation therapy can prevent or reverse the neurodevelopmental sequelae of lead toxicity. The objective of this study was to determine whether chelation therapy with succimer (dimercaptosuccinic acid) in children with referral blood lead levels between 20 and 44 microg/dL (0.96-2.12 micromol/L) at 12 to 33 months of age has neurodevelopmental benefits at age 7 years.

METHODS: The Treatment of Lead-Exposed Children (TLC) study is a randomized, double-blind, placebo-controlled trial conducted between September 1994 and June 2003 in Philadelphia, PA; Newark, NJ; Cincinnati, OH; and Baltimore, MD. Of 1854 referred children between 12 and 33 months of age who were screened for eligibility, 780 were randomly assigned to receive oral succimer or placebo, stratified by clinical center, body surface area, blood lead level, and language spoken at home. At 7 years of age, 647 subjects remained in the study. Up to three 26-day courses of succimer or placebo were administered, depending on response to treatment in those given the active drug. Eighty-nine percent had finished treatment by 6 months, and all children had finished by 13 months after randomization. All participants received residential lead hazard control measures before treatment, and TLC subjects also received a daily multivitamin supplement before and after treatment with succimer or placebo. Scores on standardized neuropsychological measures tapping cognition, behavior, learning and memory, attention, and neuromotor skills were obtained.

RESULTS: Chelation therapy with succimer lowered average blood lead levels for approximately 6 months but resulted in no benefit in cognitive, behavioral, or neuromotor endpoints.

CONCLUSION: These follow-up data confirm our previous finding that the TLC regimen of chelation therapy is not associated with neurodevelopmental benefits in children with blood lead levels between 20 and 44 microg/dL (0.96-2.12 micromol/L). These results emphasize the importance of environmental measures to prevent lead exposure. Chelation therapy with succimer cannot be recommended for children with blood lead levels between 20 and 44 microg/dL (0.96-2.12 micromol/L).
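The sketch below illustrates stratified treatment allocation in the spirit of the design described above; the strata, block size, and labels are assumptions, not the TLC trial's actual randomization procedure.

```python
# Hypothetical sketch of stratified permuted-block randomization; all
# details (arms, block size, stratum labels) are illustrative assumptions.
import random

class StratifiedBlockRandomizer:
    def __init__(self, arms=("succimer", "placebo"), block_size=4):
        self.arms = arms
        self.block_size = block_size
        self.blocks = {}  # stratum -> remaining assignments in current block

    def assign(self, stratum):
        block = self.blocks.setdefault(stratum, [])
        if not block:  # start a fresh shuffled block for this stratum
            block.extend(self.arms * (self.block_size // len(self.arms)))
            random.shuffle(block)
        return block.pop()

rand = StratifiedBlockRandomizer()
print(rand.assign(("Philadelphia", "BSA<0.6m2", "BLL 20-29", "English")))
```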


Subject(s)
Chelating Agents/pharmacology , Chelation Therapy , Child Behavior/drug effects , Child Development/drug effects , Lead Poisoning/drug therapy , Succimer/pharmacology , Chelating Agents/therapeutic use , Child , Child, Preschool , Double-Blind Method , Environmental Exposure , Humans , Infant , Intelligence/drug effects , Lead/blood , Lead Poisoning/psychology , Neuropsychological Tests , Succimer/therapeutic use
7.
Environ Res ; 94(3): 319-26, 2004 Mar.
Article in English | MEDLINE | ID: mdl-15016600

ABSTRACT

High concentrations of uranium (mean=620 microg/L) were detected in water samples collected from private wells in a residential community. Based on isotopic analyses, the source of the uranium contamination appeared to be naturally occurring geological deposits. In homes where well-water uranium concentrations exceeded the drinking water standard, the residents were advised to use an alternate water source for potable purposes. Several months after the residents had stopped drinking the water, urine samples were collected and tested for uranium. Elevated concentrations of uranium (mean=0.40 microg/g creatinine) were detected, and 85 percent of the urine uranium concentrations exceeded the 95th percentile concentration of a national reference population. Urine uranium concentrations were positively correlated with water uranium concentrations, but not with the participants' ages or how long they had been drinking the water. Six months later, a second urine sample was collected and tested for uranium. Urine uranium concentrations decreased in most people (63 percent), and in those with the highest initial urine uranium concentrations, the levels decreased by an average of 78 percent. However, urine uranium concentrations remained elevated (mean=0.27 microg/g creatinine), and 87 percent exceeded the 95th percentile concentration of the reference population. The results of this investigation demonstrate that, after long-term ingestion of uranium in drinking water, elevated concentrations of uranium in urine can be detected up to 10 months after exposure has stopped.
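The abstract reports a positive water-urine correlation without naming the statistic; purely as an illustration, the sketch below computes a Spearman rank correlation on made-up paired measurements.

```python
# Illustrative only: Spearman correlation between well-water and urine
# uranium for hypothetical paired samples (values invented).
from scipy import stats

water_ug_per_l = [620, 310, 150, 890, 75, 410]
urine_ug_per_g_creat = [0.61, 0.33, 0.12, 0.88, 0.05, 0.40]
rho, p = stats.spearmanr(water_ug_per_l, urine_ug_per_g_creat)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```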


Subject(s)
Environmental Exposure/analysis , Fresh Water/analysis , Uranium/analysis , Uranium/urine , Female , Humans , Male , South Carolina , Spectrum Analysis , Time Factors