Results 1 - 7 of 7
1.
Malar J ; 19(1): 108, 2020 Mar 04.
Article in English | MEDLINE | ID: mdl-32131841

ABSTRACT

BACKGROUND: Ethiopia has set a goal of malaria elimination by 2030. Low-parasite-density infections may go undetected by conventional diagnostic methods (microscopy and rapid diagnostic tests), and their contribution to malaria transmission varies by transmission setting. This study quantified the burden of subpatent infections in samples collected from three regions of northwest Ethiopia. METHODS: Sub-samples of dried blood spots from the Ethiopian Malaria Indicator Survey 2015 (EMIS-2015) were tested and compared using microscopy, rapid diagnostic tests (RDTs), and nested polymerase chain reaction (nPCR) to determine the prevalence of subpatent infection. Previously reported paired seroprevalence results, along with gender, age, and elevation of residence, were explored as risk factors for Plasmodium infection. RESULTS: Of the 2608 samples collected, the highest positivity rate for Plasmodium infection was found with nPCR, 3.3% (95% CI 2.7-4.1), compared with RDT, 2.8% (95% CI 2.2-3.5), and microscopy, 1.2% (95% CI 0.8-1.7). Of the nPCR-positive cases, Plasmodium falciparum accounted for 3.1% (95% CI 2.5-3.8), Plasmodium vivax 0.4% (95% CI 0.2-0.7), mixed P. falciparum and P. vivax 0.1% (95% CI 0.0-0.4), and mixed P. falciparum and Plasmodium malariae 0.1% (95% CI 0.0-0.3). nPCR detected an additional 30 samples that had not been detected by conventional methods. The majority of nPCR-positive cases (61%; 53/87) were from the Benishangul-Gumuz Region. Malaria seropositivity was significantly associated with nPCR positivity [adjusted OR 10.0 (95% CI 3.2-29.4), P < 0.001]. CONCLUSION: Using nPCR, the detection rate of malaria parasites in samples collected during a national cross-sectional survey in Ethiopia in 2015 increased nearly threefold over microscopy-based rates. Such subpatent infections might contribute to malaria transmission. In addition to strengthening routine surveillance systems, malaria programmes may need to consider low-density, subpatent infections to accelerate malaria elimination efforts.
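The prevalence CIs above are consistent with a standard score interval for a binomial proportion. As a sketch (the abstract does not name its CI method; a Wilson score interval is assumed here), the nPCR count of 87 positives out of 2608 samples (implied by the "53/87" figure) reproduces the reported 2.7-4.1% range:

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a proportion of k successes in n trials (z=1.96 for 95%)."""
    p = k / n
    denom = 1 + z**2 / n
    centre = p + z**2 / (2 * n)
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - half) / denom, (centre + half) / denom

# 87 nPCR-positive samples out of 2608 tested
lo, hi = wilson_ci(87, 2608)
print(f"{100*lo:.1f}-{100*hi:.1f}")  # 2.7-4.1
```

The same function applied to the RDT and microscopy counts would check those intervals as well.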


Subject(s)
Disease Eradication/methods , Malaria, Falciparum/epidemiology , Malaria, Vivax/epidemiology , Adolescent , Adult , Child , Child, Preschool , Cross-Sectional Studies , Dried Blood Spot Testing , Ethiopia/epidemiology , Female , Humans , Malaria, Falciparum/diagnosis , Malaria, Falciparum/prevention & control , Malaria, Vivax/diagnosis , Malaria, Vivax/prevention & control , Male , Middle Aged , Plasmodium falciparum , Plasmodium vivax , Prevalence , Seroepidemiologic Studies , Young Adult
2.
J Med Entomol ; 51(3): 694-701, 2014 May.
Article in English | MEDLINE | ID: mdl-24897864

ABSTRACT

Changes in the structure of managed red pine forests in Wisconsin caused by interacting root- and stem-colonizing insects are associated with increased abundance of the blacklegged tick, Ixodes scapularis Say, in comparison with nonimpacted stands. However, the frequency and variability of the occurrence of tick-borne pathogens in this coniferous forest type across Wisconsin is unknown. Red pine forests were surveyed from 2009 to 2013 to determine the prevalence of Borrelia burgdorferi and Anaplasma phagocytophilum in questing I. scapularis nymphs. Polymerase chain reaction analysis revealed geographical differences in the nymphal infection prevalence (NIP) of these pathogens in red pine forests. In the Kettle Moraine State Forest (KMSF) in southeastern Wisconsin, NIP of B. burgdorferi across all years was 35% (range of 14.5-53.0%). At the Black River State Forest (BRSF) in western Wisconsin, NIP of B. burgdorferi across all years was 26% (range of 10.9-35.5%). Differences in NIP of B. burgdorferi between KMSF and BRSF were statistically significant for 2010 and 2011 and for all years combined (P < 0.05). NIP of A. phagocytophilum (human agent) averaged 9% (range of 4.6-15.8%) at KMSF and 3% (range of 0-6.4%) at BRSF, and was significantly different between the sites for all years combined (P < 0.05). Differences in coinfection of B. burgdorferi and A. phagocytophilum were not statistically significant between KMSF and BRSF, with averages of 3.4% (range of 1.7-10.5%) and 2.5% (range of 0-5.5%), respectively. In 2013, the density of infected nymphs in KMSF and BRSF was 14 and 30 per 1000 m2, respectively, among the highest ever recorded for the state. Differences in the density of nymphs and NIP among sites were correlated with neither environmental factors nor time since tick colonization. These results document significant unexplained variation in tick-borne pathogens between coniferous forests in Wisconsin that warrants further study.
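The "density of infected nymphs" metric reported above is conventionally the product of questing-nymph density and NIP. A minimal sketch, using hypothetical nymph densities (the abstract does not report them), shows how 2013-scale figures of 14 and 30 infected nymphs per 1000 m2 could arise:

```python
def density_infected_nymphs(nymphs_per_1000m2, nip):
    """Density of infected nymphs (DIN) = questing-nymph density x nymphal infection prevalence."""
    return nymphs_per_1000m2 * nip

# Hypothetical questing-nymph densities and per-year NIPs (not given in the abstract),
# chosen only so the products match the reported 2013 DIN values.
kmsf_din = density_infected_nymphs(40, 0.35)   # e.g., 40 nymphs/1000 m2 at NIP 0.35
brsf_din = density_infected_nymphs(120, 0.25)  # e.g., 120 nymphs/1000 m2 at NIP 0.25
print(kmsf_din, brsf_din)  # 14.0 30.0
```

Because DIN is a product, a site with lower NIP (like BRSF) can still carry the higher acarological risk if its nymph density is high enough, which is what the 2013 figures suggest.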


Subject(s)
Anaplasma phagocytophilum/isolation & purification , Borrelia burgdorferi/isolation & purification , Ehrlichiosis/epidemiology , Ixodes/microbiology , Lyme Disease/epidemiology , Anaplasma phagocytophilum/genetics , Anaplasma phagocytophilum/metabolism , Animals , Borrelia burgdorferi/genetics , Borrelia burgdorferi/metabolism , Ecosystem , Ehrlichiosis/microbiology , Ixodes/growth & development , Lyme Disease/microbiology , Nymph/growth & development , Nymph/microbiology , Pinus , Polymerase Chain Reaction , Seasons , Wisconsin
3.
J Community Health ; 39(1): 90-8, 2014 Feb.
Article in English | MEDLINE | ID: mdl-23934476

ABSTRACT

Heat-related illnesses (HRI) are the most frequent cause of environmental exposure-related injury treated in US emergency departments (ED). While most individuals with HRI evaluated in EDs are discharged to home, understanding predictors of hospitalization for HRI may help public health practitioners and medical providers identify high-risk groups who would benefit from educational outreach. We analyzed data collected by the Georgia Department of Public Health, Office of Health Indicators for Planning, regarding ED and hospital discharges for HRI, as identified by ICD-9 codes, between 2002 and 2008 to determine characteristics of individuals receiving care in EDs. Temperature data from CDC's Environmental Public Health Tracking Network were linked to the dataset to determine if ED visits occurred during an extreme heat event (EHE). A multivariable logistic regression model was developed to determine characteristics predicting hospitalization versus ED discharge using demographic characteristics, comorbid conditions, socioeconomic status, the public health district of residence, and the presence of an EHE. Men represented the majority of ED visits (75%) and hospitalizations (78%). In the multivariable model, the odds of admission versus ED discharge with an associated HRI increased with age among both men and women, and odds were higher among residents of specific public health districts, particularly in the southern part of the state. Educational efforts targeting the specific risk groups identified by this study may help reduce the burden of hospitalization due to HRI in the state of Georgia.


Subject(s)
Emergency Service, Hospital/statistics & numerical data , Heat Stress Disorders/epidemiology , Hot Temperature/adverse effects , Patient Admission/statistics & numerical data , Seasons , Adolescent , Adult , Age Factors , Aged , Aged, 80 and over , Child , Child, Preschool , Comorbidity , Female , Georgia/epidemiology , Humans , Infant , Infant, Newborn , Logistic Models , Male , Middle Aged , Residence Characteristics , Risk Factors , Sex Factors , Socioeconomic Factors , Young Adult
4.
Article in English | MEDLINE | ID: mdl-24967556

ABSTRACT

Surface water contaminants in Kentucky during and after 2011 flooding were characterized. Surface water samples were collected during flood stage (May 2-4, 2011; n = 15) and after (July 25-26, 2011; n = 8) from four different cities along the Ohio River and were analyzed for the presence of microbial indicators, pathogens, metals, and chemical contaminants. Contaminant concentrations during and after flooding were compared using linear and logistic regression. Surface water samples collected during flooding had higher levels of E. coli, enterococci, Salmonella, Campylobacter, E. coli O157:H7, adenovirus, arsenic, copper, iron, lead, and zinc compared to surface water samples collected 3 months post-flood (P < 0.05). These results suggest that flooding increases microbial and chemical loads in surface water. These findings reinforce commonly recommended guidelines to limit exposure to flood water and to appropriately sanitize contaminated surfaces and drinking wells after contamination by flood water.


Subject(s)
Bacteria/isolation & purification , Rivers/chemistry , Rivers/microbiology , Water Pollutants, Chemical/analysis , Water Pollution/analysis , Bacteria/classification , Bacteria/genetics , Environmental Monitoring , Floods , Kentucky
5.
Biosecur Bioterror ; 12(1): 42-8, 2014.
Article in English | MEDLINE | ID: mdl-24552361

ABSTRACT

During routine screening in 2011, US Customs and Border Protection (CBP) identified 2 persons with elevated radioactivity. CBP, in collaboration with Los Alamos National Laboratory, informed the Food and Drug Administration (FDA) that these people could have increased radiation exposure as a result of undergoing cardiac positron emission tomography (PET) scans several months earlier with rubidium Rb 82 chloride injection from CardioGen-82. We conducted a multistate investigation to assess the potential extent and magnitude of radioactive strontium overexposure among patients who had undergone Rb 82 PET scans. We selected a convenience sample of clinical sites in 4 states and reviewed records to identify eligible study participants, defined as people who had had an Rb 82 PET scan between February and July 2011. All participants received direct radiation screening using a radioisotope identifier able to detect the gamma energy specific for strontium-85 (514 keV) and urine bioassay for excreted radioactive strontium. We referred a subset of participants with direct radiation screening counts above background readings for whole body counting (WBC), using a rank ordering of direct radiation screening results. The rank-order list, from highest to lowest, was used to contact participants and offer voluntary enrollment for WBC. Of 308 participants, 292 (95%) had direct radiation screening results indistinguishable from background radiation measurements; 261 of 265 (98%) participants with sufficient urine for analysis had radioactive strontium results below minimum detectable activity. None of the 23 participants who underwent WBC demonstrated strontium activity above levels associated with routine use of the rubidium Rb 82 generator. Among investigation participants, we did not identify evidence of strontium internal contamination above permissible levels. This investigation might serve as a model for future investigations of radioactive internal contamination incidents.


Subject(s)
Positron-Emission Tomography , Rubidium Radioisotopes , Strontium/isolation & purification , Adult , Aged , Aged, 80 and over , Female , Heart/diagnostic imaging , Humans , Male , Middle Aged , Rubidium Radioisotopes/analysis , Tomography, X-Ray Computed , United States
6.
Am J Prev Med ; 44(3): 199-206, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23415115

ABSTRACT

BACKGROUND: Migrant farmworkers are at risk for heat-related illness (HRI) at work. PURPOSE: The purpose of this study was to determine which risk factors could potentially reduce the prevalence of HRI symptoms among migrant farmworkers in Georgia. METHODS: Trained interviewers conducted in-person interviews of adults who attended the South Georgia Farmworker Health Project clinics in June 2011. The analysis was conducted in 2011-2012. Population intervention models were used to assess where the greatest potential impact could be made to reduce the prevalence of HRI symptoms. RESULTS: In total, 405 farmworkers participated. One third of participants had experienced three or more HRI symptoms in the preceding week. Migrant farmworkers faced barriers to preventing HRI at work, including lack of prevention training (77%) and no access to regular breaks (34%); shade (27%); or medical attention (26%). The models showed that the prevalence of three or more HRI symptoms (n=361, 34.3%) potentially could be reduced by increasing breaks in the shade (-9.2%); increasing access to medical attention (-7.3%); reducing soda intake (-6.7%); or increasing access to regular breaks (-6.0%). CONCLUSIONS: Migrant farmworkers experienced high levels of HRI symptoms and faced substantial barriers to preventing these symptoms. Although data are cross-sectional, results suggest that heat-related illness may be reduced through appropriate training of workers on HRI prevention, as well as regular breaks in shaded areas.


Subject(s)
Agricultural Workers' Diseases/epidemiology , Agriculture/statistics & numerical data , Heat Stress Disorders/epidemiology , Transients and Migrants/statistics & numerical data , Adolescent , Adult , Aged , Aged, 80 and over , Agricultural Workers' Diseases/ethnology , Body Mass Index , Cross-Sectional Studies , Female , Georgia/epidemiology , Health Behavior , Health Services Accessibility , Heat Stress Disorders/ethnology , Humans , Interviews as Topic , Male , Middle Aged , Risk Factors , Young Adult
7.
Disaster Med Public Health Prep ; 4(2): 129-34, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20526135

ABSTRACT

BACKGROUND: During June 2008, heavy precipitation and 500-year flood events resulted in the displacement of thousands of families throughout eastern Iowa. The objectives of this study were to assess the effectiveness and preferred sources of health messages communicated to the public following the disaster. METHODS: Three hundred twenty-seven households were surveyed in 4 counties hit hardest by the flooding. A 48-item questionnaire containing items on demographics, housing, health information sources, and 8 specific health issues was administered. RESULTS: Almost all of the participants (99.0%) received information on at least 1 of the health topics covered by the survey. Most participants received information regarding vaccination (84.1%), mold (79.5%), safe use of well water (62.7%), respirator use (58.7%), or stress (53.8%). Television was the primary (54.7%) and preferred (60.2%) source of health information for most people, followed by the Internet (11.0% and 30.3% as source and preference, respectively). CONCLUSIONS: Public health messages were received by a wide audience in the flood-affected communities. Along with more traditional health communication channels such as television, radio, or newspapers, continued emphasis on the development of health information Web sites and other technological alternatives may result in useful and effective health communication in similar situations.


Subject(s)
Consumer Health Information/methods , Disasters , Environmental Health/methods , Floods , Information Dissemination/methods , Access to Information , Adolescent , Adult , Aged , Environmental Health/organization & administration , Humans , Information Seeking Behavior , Internet , Iowa , Male , Mass Media , Middle Aged , Multivariate Analysis , Public Health Administration/methods , Sampling Studies , Young Adult