Results 1 - 20 of 20
1.
One Health ; 8: 100112, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31788532

ABSTRACT

The emergence, spread, and persistence of antimicrobial resistance (AMR) remain a pressing global concern. Increased promotion of commercial small-scale agriculture in low-resource settings has driven greater use of antimicrobials as growth promoters, creating antimicrobial-resistant animal reservoirs. We conducted a longitudinal field study in rural Ecuador to monitor AMR in Escherichia coli populations from backyard chickens and children across three sample periods at approximately 2-month intervals (February, April, and June 2017). We assessed resistance to 12 antibiotics using generalized linear mixed effects models (GLMMs). We also sampled one-day-old broiler chickens purchased from local vendors and assessed their resistance to the same 12 antibiotics. One-day-old broilers showed lower AMR at sample period 1 than at sample period 2 (for 9 of the 12 antibiotics tested); increases in AMR between sample periods 2 and 3 were minimal. Two months before the first sample period (December 2016) there was no broiler farming activity because of a regional production collapse, which was followed by the annual peak in farming in February 2017. Between sample periods 1 and 2, we observed significant increases in AMR to 6 of the 12 antibiotics in children and to 4 of the 12 antibiotics in backyard chickens. These findings suggest that the recent increase in farming, together with the observed increase of AMR in the one-day-old broilers, may have driven the increase in AMR in backyard chickens and children. Small-scale farming dynamics could play an important role in the spread of AMR in low- and middle-income countries.
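
The GLMM analysis described above can be illustrated with a small sketch: a mixed-effects logistic model with a fixed effect for sample period and a random intercept for household, fit to simulated isolates. The column names, effect sizes, and the use of statsmodels' variational Bayes fitter are assumptions for illustration, not the study's data or code.

```python
# Illustrative only: simulated isolates, hypothetical column names.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n_households, isolates_per_hh = 40, 6
household = np.repeat(np.arange(n_households), isolates_per_hh)
period = rng.integers(1, 4, size=household.size)            # sample periods 1-3
hh_effect = rng.normal(0, 0.8, n_households)[household]     # random intercepts
logit_p = -1.0 + 0.6 * (period >= 2) + hh_effect            # more resistance after period 1
resistant = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

df = pd.DataFrame({"resistant": resistant,
                   "period": pd.Categorical(period),
                   "household": household})

# Mixed-effects logistic model: fixed effect of sample period,
# random intercept for household, fit by variational Bayes.
model = BinomialBayesMixedGLM.from_formula(
    "resistant ~ period", {"household": "0 + C(household)"}, df)
result = model.fit_vb()
print(result.summary())
```

In practice one such model would be fit per antibiotic, which is why the abstract reports results antibiotic by antibiotic.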

2.
BMC Public Health ; 19(1): 90, 2019 Jan 19.
Article in English | MEDLINE | ID: mdl-30660198

ABSTRACT

BACKGROUND: Globally, diarrhea is a leading cause of child morbidity and mortality. Although latrines are integral for reducing enteric pathogen transmission, several studies have shown no evidence that latrine ownership improved child health. There are a number of explanations for these results. One explanation is that latrine access does not equate to latrine use. Latrine use, however, is difficult to ascertain accurately, as defecation behavior is often stigmatized. To address this measurement issue, we measured latrine use as a latent variable indicated by a suite of psychosocial variables. METHODS: We administered a survey of 16 defecation-related psychosocial questions to 251 individuals living in rural Ecuador. We applied latent class analysis (LCA) to these data to model the probability of latrine use as a latent variable. To account for uncertainty in predicted latent class membership, we used a pseudo-class approach to impute five different probabilities of latrine use for each respondent. Via regression modeling, we tested the association between household sanitation and each imputed latrine use variable. RESULTS: The optimal model presented strong evidence of two latent classes (entropy = 0.86): consistent users (78%) and inconsistent users (22%), predicted by 5 of our 16 psychosocial variables. There was no evidence of an association between the probability of latrine use predicted from the LCA and household access to basic sanitation (OR = 1.1, 95% CI 0.6-2.1). This suggests that home access to a sanitation facility may not ensure that every family member uses the facility at all times. CONCLUSION: Effective implementation and evaluation of sanitation programs requires accurate measurement of latrine use. Psychosocial variables such as norms, perceptions, and attitudes may provide robust proxy measures. Future longitudinal studies will help to strengthen the use of these surrogate measures, as many of these factors may be subject to secular trends. Additionally, subgroup analyses will elucidate how our proxy indicators of latrine use vary by individual-level characteristics.
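
A minimal sketch of the latent-class-plus-pseudo-class idea described above: a two-class latent class model for binary items is fit by EM, and class membership is then drawn repeatedly from each respondent's posterior to propagate classification uncertainty into later regressions. The data are simulated and the item probabilities are invented; this is not the authors' analysis code.

```python
# Illustrative only: simulated binary survey items.
import numpy as np

rng = np.random.default_rng(1)
n, J, K = 251, 16, 2                                  # respondents, items, classes
true_pi = np.array([0.78, 0.22])                      # class prevalences
true_theta = np.vstack([rng.uniform(0.7, 0.95, J),    # "consistent users"
                        rng.uniform(0.1, 0.4, J)])    # "inconsistent users"
z = rng.choice(K, size=n, p=true_pi)
X = rng.binomial(1, true_theta[z])

# EM algorithm for a latent class model with binary indicators
pi = np.full(K, 1.0 / K)
theta = rng.uniform(0.3, 0.7, (K, J))
for _ in range(200):
    # E-step: posterior class membership (responsibilities)
    log_post = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T + np.log(pi)
    log_post -= log_post.max(axis=1, keepdims=True)
    resp = np.exp(log_post)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update prevalences and item-response probabilities
    pi = resp.mean(axis=0)
    theta = np.clip((resp.T @ X) / resp.sum(axis=0)[:, None], 1e-4, 1 - 1e-4)

# Pseudo-class step: draw class membership from each posterior five times
# so downstream regressions can propagate classification uncertainty.
pseudo_draws = np.array([[rng.choice(K, p=resp[i]) for i in range(n)]
                         for _ in range(5)])
print("estimated class prevalences:", pi.round(2))
```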


Subjects
Latent Class Analysis , Ownership/statistics & numerical data , Sanitation/statistics & numerical data , Toilet Facilities/statistics & numerical data , Adult , Child , Child Health/statistics & numerical data , Defecation , Ecuador , Family Characteristics , Female , Humans , Male , Probability , Rural Population/statistics & numerical data , Stereotyping , Surveys and Questionnaires , Young Adult
3.
Epidemics ; 20: 21-36, 2017 09.
Article in English | MEDLINE | ID: mdl-28283373

ABSTRACT

Waning immunity could allow transmission of polioviruses without causing poliomyelitis by promoting silent circulation (SC). Undetected SC after oral polio vaccine (OPV) use is stopped could cause difficult-to-control epidemics. Little is known about waning. To develop theory about what generates SC, we modeled a range of waning patterns. We varied OPV and wild poliovirus (WPV) transmissibility, the time from the start of vaccination to reaching low polio levels, and the infection-to-paralysis ratio (IPR). SC lasted longer when waning continued over time rather than stopping after a few years, when WPV transmissibility was higher or OPV transmissibility was lower, and when the IPR was higher. These factors interacted in a way that makes recent emergence of prolonged SC a possibility. As the time to reach low infection levels increased, the vaccination rates needed to eliminate polio increased, and a threshold was passed beyond which prolonged low-level SC emerged. These phenomena were caused by increased contributions to the force of infection from reinfections. The resulting SC occurs at low levels that would be difficult to detect using environmental surveillance. For all waning patterns, modest levels of adult vaccination shortened SC. Previous modeling studies may have missed these phenomena because (1) they used models with no or very short duration waning and (2) they fit models to paralytic polio case counts. Our analyses show that polio case counts cannot predict SC because nearly identical case count patterns can be generated by a range of waning patterns that produce different patterns of SC. We conclude that the possibility of prolonged SC is real but unquantified, that vaccinating modest fractions of adults could reduce SC risk, and that joint analysis of acute flaccid paralysis and environmental surveillance data can help assess SC risks and ensure they are low before stopping OPV.
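
To make the mechanism concrete, here is a minimal compartmental sketch in which recovered individuals wane into a partially susceptible state and their silent reinfections feed back into the force of infection. The compartment structure, rates, and parameter values are illustrative assumptions, not the published model.

```python
# Illustrative only: compartments, rates, and values are assumed.
import numpy as np
from scipy.integrate import solve_ivp

beta  = 0.4             # transmission rate (per day)
gamma = 1 / 28          # recovery rate
omega = 1 / (5 * 365)   # waning rate of full immunity
kappa = 0.5             # relative susceptibility/infectiousness after waning
mu    = 1 / (60 * 365)  # birth and death rate

def rhs(t, y):
    S, I1, R, W, I2 = y
    lam = beta * (I1 + kappa * I2)                 # force of infection
    return [mu - lam * S - mu * S,                 # fully susceptible
            lam * S - (gamma + mu) * I1,           # first infections (can paralyse)
            gamma * (I1 + I2) - (omega + mu) * R,  # fully immune
            omega * R - kappa * lam * W - mu * W,  # waned, partially susceptible
            kappa * lam * W - (gamma + mu) * I2]   # reinfections: silent circulation

y0 = [0.1, 1e-4, 0.9 - 1e-4, 0.0, 0.0]
sol = solve_ivp(rhs, (0, 20 * 365), y0, max_step=1.0)
print("infection prevalence after 20 years:", float(sol.y[1, -1] + sol.y[4, -1]))
```

In this kind of model the reinfection term (kappa * lam * W) is what can keep circulation going at levels too low to produce detectable paralytic cases.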


Subjects
Models, Theoretical , Poliomyelitis/immunology , Poliomyelitis/transmission , Poliovirus Vaccine, Oral/immunology , Poliovirus/immunology , Adult , Humans , Risk , Time Factors
4.
Epidemiol Infect ; 141(8): 1572-84, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23507473

ABSTRACT

Norovirus is a common cause of gastroenteritis at all ages. Typical infections cause viral shedding periods of days to weeks, but some individuals can shed for months or years. Most norovirus risk models do not include these long-shedding individuals and may therefore underestimate risk. We reviewed the literature for norovirus shedding-duration data and stratified these data into two distributions: regular shedding (mean 14-16 days) and long shedding (mean 105-136 days). These distributions were used to inform a norovirus transmission model that predicts the impact of long shedders. Our transmission model predicts that this subpopulation increases the outbreak potential (measured by the reproductive number) by 50-80%, the probability of an outbreak by 33%, the severity of transmission (measured by the attack rate) by 20%, and the duration of transmission by 100%. Characterizing and understanding shedding-duration heterogeneity can provide insights into community transmission that are useful in mitigating norovirus risk.
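
A back-of-the-envelope sketch of why long shedders matter: if the reproductive number scales with shedding duration, even a small long-shedding fraction inflates it noticeably. The transmission rate and the fraction of long shedders below are assumed values; only the shedding durations come from the abstract.

```python
# Illustrative only: beta and the long-shedder fraction are assumed.
beta = 0.05                        # transmissions per shedding-day (assumed)
d_regular, d_long = 15.0, 120.0    # mean shedding durations of the two groups (days)
p_long = 0.10                      # assumed fraction of long shedders

r0_without = beta * d_regular
r0_with = beta * ((1 - p_long) * d_regular + p_long * d_long)
print(f"R0 without long shedders: {r0_without:.2f}")
print(f"R0 with long shedders:    {r0_with:.2f} "
      f"({100 * (r0_with / r0_without - 1):.0f}% increase)")
```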


Subjects
Caliciviridae Infections/epidemiology , Caliciviridae Infections/transmission , Disease Outbreaks , Gastroenteritis/epidemiology , Gastroenteritis/virology , Norovirus/physiology , Caliciviridae Infections/virology , Humans , Models, Biological , Risk Factors , Virus Shedding
5.
Epidemiol Infect ; 141(8): 1563-71, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23433247

ABSTRACT

The causal mechanisms of norovirus outbreaks are often not revealed. Understanding the transmission route (e.g. foodborne, waterborne, or environmental) and vehicle (e.g. shellfish or recreational water) of a norovirus outbreak, however, is of great public health importance; this information can facilitate interventions for an ongoing outbreak and regulatory action to limit future outbreaks. Towards this goal, we conducted a systematic review to examine whether published outbreak information was associated with the implicated transmission route or vehicle. Genogroup distribution was associated with transmission route and food vehicle, but attack rate and the presence of the GII.4 strain were not associated with transmission route, food vehicle, or water vehicle. Attack rate, genogroup distribution, and GII.4 strain distribution also varied by other outbreak characteristics (e.g. setting, season, hemisphere). These relationships suggest that different genogroups exploit different environmental conditions and can therefore be used to predict the likelihood of various transmission routes or vehicles.
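
The kind of association test underlying a statement such as "genogroup distribution was associated with transmission route" can be sketched as a contingency-table chi-square test. The counts below are invented for illustration; the review's actual outbreak tallies are not reproduced here.

```python
# Illustrative only: counts are invented.
import numpy as np
from scipy.stats import chi2_contingency

# rows: transmission route (foodborne, waterborne, environmental/person-to-person)
# cols: genogroup of implicated strains (GI only, GII only, GI + GII)
counts = np.array([[30, 80, 25],
                   [25, 15, 20],
                   [10, 120, 15]])
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```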


Subjects
Caliciviridae Infections/epidemiology , Caliciviridae Infections/transmission , Disease Outbreaks , Food Microbiology , Gastroenteritis/epidemiology , Gastroenteritis/virology , Norovirus/physiology , Caliciviridae Infections/virology , Humans , Incidence , Multivariate Analysis , Norovirus/genetics
6.
Epidemiol Infect ; 140(7): 1161-72, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22444943

ABSTRACT

The purpose of this study was to examine global epidemiological trends in human norovirus (NoV) outbreaks by transmission route and setting, and to describe relationships between these characteristics, viral attack rates, and the occurrence of genogroup I (GI) or genogroup II (GII) strains in outbreaks. We analysed data from 902 reverse transcriptase-polymerase chain reaction-confirmed human NoV outbreaks abstracted from a systematic review of articles published from 1993 to 2011 and indexed under the terms 'norovirus' and 'outbreak'. Multivariate regression analyses demonstrated that foodservice and winter outbreaks were significantly associated with higher attack rates. Foodborne and waterborne outbreaks were associated with multiple strains (GI+GII). Waterborne outbreaks were significantly associated with GI strains, while healthcare-related and winter outbreaks were associated with GII strains. These results identify important trends for epidemic NoV detection, prevention, and control.
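
A sketch of the multivariate regression approach described above, fitting a logistic model for the odds of a GII outbreak given outbreak characteristics. The dataset is simulated and the variable names (winter, waterborne, healthcare) are hypothetical stand-ins for the abstracted covariates.

```python
# Illustrative only: simulated outbreaks, hypothetical covariate names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 902
df = pd.DataFrame({"winter": rng.integers(0, 2, n),
                   "waterborne": rng.integers(0, 2, n),
                   "healthcare": rng.integers(0, 2, n)})
logit_p = -0.5 + 0.8 * df["winter"] - 1.2 * df["waterborne"] + 1.0 * df["healthcare"]
df["gii"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

fit = smf.logit("gii ~ winter + waterborne + healthcare", data=df).fit(disp=0)
print(np.exp(fit.params).round(2))   # odds ratios for a GII outbreak
```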


Subjects
Caliciviridae Infections/epidemiology , Cross Infection/epidemiology , Disease Outbreaks , Norovirus/classification , Basic Reproduction Number , Caliciviridae Infections/virology , Cross Infection/virology , Food/virology , Gastroenteritis/epidemiology , Gastroenteritis/virology , Genotype , Global Health , Humans , Norovirus/genetics , Norovirus/isolation & purification , Risk Factors , Seasons , Water Microbiology
8.
Epidemiol Infect ; 129(2): 315-23, 2002 Oct.
Article in English | MEDLINE | ID: mdl-12405100

ABSTRACT

This manuscript extends our previously published work (based on data from one clinic) on the association between three drinking-water treatment modalities (boiling, filtering, and bottling) and diarrhoeal disease in HIV-positive persons by incorporating data from two additional clinics collected in the following year. We conducted a cross-sectional survey of drinking water patterns, medication usage, and episodes of diarrhoea among HIV-positive persons attending clinics associated with the San Francisco Community Consortium. We present combined results from our previously published work in one clinic (n = 226) and data from the two additional clinics (n = 458). In this combined analysis we employed logistic regression and marginal structural modelling of the data. The relative risk of diarrhoea for 'always' vs. 'never' drinking boiled water was 0.68 (95% CI 0.45-1.04) and for 'always' vs. 'never' drinking bottled water was 1.22 (95% CI 0.82-1.82). Drinking filtered water was unrelated to diarrhoea (1.03, 95% CI 0.78-1.35, for 'always' vs. 'never' drinking filtered water). Adjustment for confounding did not have any notable effect on the point estimates (0.61, 1.35, and 0.98 for boiled, bottled, and filtered water, respectively, as defined above). The risk of diarrhoea was lower among those consuming boiled water, but this finding was not statistically significant. Because of these findings, the importance of diarrhoea in immunocompromised individuals, and the limitations of cross-sectional data, further prospective investigations of water consumption and diarrhoea among HIV-positive individuals are needed.
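
A sketch of the inverse-probability-weighting logic behind a marginal structural model: estimate the propensity of the exposure given confounders, form stabilized weights, and compare outcomes in the weighted pseudo-population. The data, confounders, and effect sizes are simulated placeholders, not the clinic data, and the weighted 2 x 2 table below gives only the point estimate (proper standard errors would need robust or bootstrap methods).

```python
# Illustrative only: simulated data; confounders and effects are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 684
df = pd.DataFrame({"cd4_low": rng.integers(0, 2, n),
                   "on_meds": rng.integers(0, 2, n)})
# exposure depends on a confounder; outcome depends on exposure and confounders
p_boil = 1.0 / (1.0 + np.exp(-(-0.5 + 0.8 * df["cd4_low"])))
df["boils"] = rng.binomial(1, p_boil)
p_diar = 1.0 / (1.0 + np.exp(-(-0.8 - 0.4 * df["boils"]
                               + 0.6 * df["on_meds"] + 0.5 * df["cd4_low"])))
df["diarrhoea"] = rng.binomial(1, p_diar)

# 1) propensity of exposure given confounders, 2) stabilized weights
ps = smf.logit("boils ~ cd4_low + on_meds", data=df).fit(disp=0).predict(df)
p_exposed = df["boils"].mean()
w = np.where(df["boils"] == 1, p_exposed / ps, (1 - p_exposed) / (1 - ps))

# 3) weighted 2 x 2 table gives the marginal (confounding-adjusted) odds ratio
tab = np.zeros((2, 2))
for e in (0, 1):
    for o in (0, 1):
        mask = ((df["boils"] == e) & (df["diarrhoea"] == o)).to_numpy()
        tab[e, o] = w[mask].sum()
or_msm = (tab[1, 1] * tab[0, 0]) / (tab[1, 0] * tab[0, 1])
print(f"IPW-weighted odds ratio for boiling: {or_msm:.2f}")
```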


Subjects
Diarrhea/epidemiology , Diarrhea/etiology , HIV Infections , Water Purification/methods , Adolescent , Adult , Aged , Aged, 80 and over , CD4 Lymphocyte Count , California/epidemiology , Child , Confounding Factors, Epidemiologic , Cross-Sectional Studies , Female , Humans , Logistic Models , Male , Medical Records , Middle Aged , Risk Factors , San Francisco/epidemiology , Water Supply
9.
Epidemiol Infect ; 128(1): 73-81, 2002 Feb.
Article in English | MEDLINE | ID: mdl-11895094

ABSTRACT

In a cross-sectional survey of 226 HIV-infected men, we examined the occurrence of diarrhoea and its relationship to drinking water consumption patterns, risk behaviours, immune status, and medication use. Diarrhoea was reported by 47% of the respondents. Neither drinking boiled water nor drinking filtered water was significantly associated with diarrhoea (OR = 0.5 [0.2, 1.6] and 1.2 [0.6, 2.5], respectively), whereas those who drank bottled water were at increased risk for diarrhoea (OR = 3.0 [1.1, 7.8]). Overall, 47% always or often used at least one form of water treatment. Of the 37% who were very concerned about drinking water, 62% had diarrhoea and 70% always or often used at least one form of water treatment. An increase in CD4 count was protective only for those with a low medication-associated risk of diarrhoea (OR = 0.6 [0.5, 0.9]). A 30% attributable risk of diarrhoea was estimated for those with high medication risk compared with those with low medication risk. The significant associations between concern about drinking water and diarrhoea, and between concern about drinking water and water treatment, suggest an awareness that drinking water is a potential transmission pathway for diarrhoeal disease. At the same time, we found that a significant portion of diarrhoea was associated with sources not related to drinking water, such as medication usage.
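
For reference, an odds ratio with a Woolf (log-scale) confidence interval of the kind reported above can be computed from a 2 x 2 table as follows; the cell counts here are invented, not the survey data.

```python
# Illustrative only: cell counts are invented.
import numpy as np

a, b = 60, 40    # exposed:   with diarrhoea / without
c, d = 45, 81    # unexposed: with diarrhoea / without

or_hat = (a * d) / (b * c)
se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = np.exp(np.log(or_hat) + np.array([-1.96, 1.96]) * se_log)
print(f"OR = {or_hat:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```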


Subjects
Diarrhea/etiology , HIV Infections/complications , Immunocompromised Host , Risk-Taking , Water Supply , Adult , Cross-Sectional Studies , Diarrhea/epidemiology , Drinking Behavior , Drug-Related Side Effects and Adverse Reactions , Humans , Male , Middle Aged , Odds Ratio , Risk Factors
10.
Emerg Infect Dis ; 7(6): 1004-9, 2001.
Article in English | MEDLINE | ID: mdl-11747729

ABSTRACT

Advances in serologic assays for Cryptosporidium parvum have made serology an attractive surveillance tool. The sensitivity, specificity, and predictive value of these new assays for surveillance of immunocompromised populations, however, have not been reported. Using stored serum specimens collected for the San Francisco Men's Health Study, we conducted a case-control study with 11 clinically confirmed cases of cryptosporidiosis. Based on assays using a 27-kDa antigen (CP23), the serum specimens from cases had a median immunoglobulin G (IgG) response level following clinical diagnosis (1,334) and a net response (433, the change in IgG level from baseline) that were significantly higher than the respective control values (329 and -32; Wilcoxon p value = 0.01). Receiver operating characteristic curves identified a cutoff of 625 U as giving the optimal sensitivity (0.86 [0.37, 1.0]) and specificity (0.86 [0.37, 1.0]) for predicting Cryptosporidium infection. These data suggest that the enzyme-linked immunosorbent assay technique can be an effective epidemiologic tool to monitor Cryptosporidium infection in immunocompromised populations.
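
A sketch of cutoff selection from a receiver operating characteristic analysis, maximizing the Youden index (sensitivity + specificity - 1). The IgG values are simulated, not the study's serology data, so the resulting cutoff is purely illustrative.

```python
# Illustrative only: simulated IgG responses.
import numpy as np

rng = np.random.default_rng(4)
cases = rng.lognormal(mean=7.0, sigma=0.6, size=11)       # infected (IgG units)
controls = rng.lognormal(mean=5.8, sigma=0.6, size=22)    # uninfected (IgG units)
igg = np.concatenate([cases, controls])
truth = np.concatenate([np.ones(11, bool), np.zeros(22, bool)])

cutoffs = np.unique(igg)
sens = np.array([(igg[truth] >= c).mean() for c in cutoffs])
spec = np.array([(igg[~truth] < c).mean() for c in cutoffs])
best = (sens + spec - 1).argmax()        # Youden index
print(f"cutoff = {cutoffs[best]:.0f} U, "
      f"sensitivity = {sens[best]:.2f}, specificity = {spec[best]:.2f}")
```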


Subjects
AIDS-Related Opportunistic Infections/immunology , Antibodies, Protozoan/blood , Cryptosporidiosis/immunology , Cryptosporidium parvum/immunology , AIDS-Related Opportunistic Infections/blood , AIDS-Related Opportunistic Infections/epidemiology , AIDS-Related Opportunistic Infections/parasitology , Adult , Animals , Case-Control Studies , Cryptosporidiosis/blood , Cryptosporidiosis/epidemiology , Cryptosporidiosis/parasitology , Humans , Male , Middle Aged , Prospective Studies , San Francisco/epidemiology
12.
Epidemiology ; 9(3): 255-63, 1998 May.
Article in English | MEDLINE | ID: mdl-9583416

ABSTRACT

We combined information on the temporal pattern of disease incidence for the 1993 cryptosporidiosis outbreak in Milwaukee with information on oocyst levels to obtain insight into the epidemic process. We constructed a dynamic process model of the epidemic with continuous population compartments, using reasonable ranges for the possible distributions of the model parameters. We then explored which combinations of parameters were consistent with the observations. A poor fit of the March 1-22 portion of the time series suggested that a smaller outbreak occurred before the March 23 treatment failure, beginning sometime on or before March 1. This finding suggests that had surveillance systems detected the earlier outbreak, up to 85% of the cases might have been prevented. The same conclusion was obtained independently of the model by transforming the incidence time series data of Mac Kenzie et al.; this transformation is based on a background monthly incidence rate for watery diarrhea in the Milwaukee area of 0.5%. Further analysis using the incidence data from the onset of the major outbreak, March 23, through the end of April resulted in three inferred properties of the infection process: (1) the mean incubation period was likely to have been between 3 and 7 days; (2) there was a necessary concurrent increase in Cryptosporidium oocyst influent concentration and decrease in treatment efficiency of the water; and (3) the variability of the dose-response function in the model did not appreciably affect the simulated outbreaks.
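
A minimal sketch of a continuous-compartment outbreak model driven by drinking-water oocyst concentration, in the spirit of the process model described above. The dose-response slope, water consumption, incubation period, and the timing and size of the concentration spike are assumptions, not fitted values from the study.

```python
# Illustrative only: exposure series, dose-response slope, and rates are assumed.
import numpy as np
from scipy.integrate import solve_ivp

def oocysts(t):
    # assumed concentration (oocysts/L): low background plus a spike after the
    # March 23 treatment failure (t in days from March 1)
    return 0.05 + (4.0 if 22 <= t <= 32 else 0.0)

r_dose = 0.004    # per-oocyst infection probability (assumed)
volume = 1.5      # litres of tap water consumed per day
sigma  = 1 / 5    # 1 / mean incubation period (days)
gamma  = 1 / 10   # recovery rate

def rhs(t, y):
    S, E, I, R = y
    lam = r_dose * volume * oocysts(t)    # waterborne force of infection
    return [-lam * S, lam * S - sigma * E, sigma * E - gamma * I, gamma * I]

sol = solve_ivp(rhs, (0, 60), [1.0, 0.0, 0.0, 0.0], max_step=0.25)
print(f"simulated attack rate over 60 days: {1 - sol.y[0, -1]:.0%}")
```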


Subjects
Computer Simulation , Cryptosporidiosis/epidemiology , Disease Outbreaks , Models, Biological , Water Microbiology , Water Supply , Animals , Cryptosporidium/pathogenicity , Diarrhea/microbiology , Disease Transmission, Infectious , Humans , Incidence , Time Factors , Wisconsin
13.
Risk Anal ; 16(4): 549-63, 1996 Aug.
Article in English | MEDLINE | ID: mdl-8819345

ABSTRACT

Traditionally, microbial risk assessors have used point estimates to evaluate the probability that an individual will become infected. We developed a quantitative approach that shifts the risk characterization perspective from point estimate to distributional estimate, and from individual to population. To this end, we first designed and implemented a dynamic model that tracks traditional epidemiological variables, such as the numbers of susceptible, infected, diseased, and immune individuals, and environmental variables, such as pathogen density. Second, we used a simulation methodology that explicitly acknowledges the uncertainty and variability associated with the data. Specifically, the approach consists of assigning probability distributions to each parameter, sampling from these distributions for Monte Carlo simulations, and using a binary classification to assess the output of each simulation. A case study is presented that explores the uncertainties in assessing the risk of giardiasis when swimming in a recreational impoundment using reclaimed water. Using literature-based information to assign parameter ranges, our analysis demonstrated that the parameter describing the shedding of pathogens by infected swimmers contributed most to the uncertainty in risk. The importance of the other parameters depended on the a priori range of this shedding parameter. When the shedding parameter was constrained to its lower subrange, treatment efficiency was the most important parameter in predicting whether a simulation resulted in prevalences above or below non-outbreak levels, whereas parameters associated with human exposure were most important when the shedding parameter was constrained to its higher subrange. This Monte Carlo simulation technique identified conditions under which outbreaks and/or non-outbreaks are likely and identified the parameters that contributed most to the uncertainty associated with a risk prediction.
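
A sketch of the Monte Carlo strategy described above: draw each parameter from an assigned distribution, run the model, classify each run against a threshold, and screen which inputs drive the classification. The toy risk calculation, parameter ranges, and outbreak threshold below are placeholders, not the giardiasis case-study values.

```python
# Illustrative only: toy model, placeholder ranges and threshold.
import numpy as np

rng = np.random.default_rng(5)
n_sim = 5_000

shedding   = rng.uniform(1e4, 1e7, n_sim)    # cysts shed per infected swimmer-day
treatment  = rng.uniform(1.0, 4.0, n_sim)    # log10 removal by treatment
ingestion  = rng.uniform(0.01, 0.05, n_sim)  # litres ingested per swim
dose_slope = rng.uniform(0.01, 0.03, n_sim)  # per-cyst infection probability

concentration = shedding / 1e6 * 10.0 ** (-treatment)   # cysts per litre (toy scaling)
risk_per_swim = 1 - np.exp(-dose_slope * ingestion * concentration)
outbreak = risk_per_swim > 1e-4               # binary classification of each run

# crude importance screen: correlation of each input with the outbreak flag
for name, x in [("shedding", shedding), ("treatment", treatment),
                ("ingestion", ingestion), ("dose_slope", dose_slope)]:
    r = np.corrcoef(x, outbreak.astype(float))[0, 1]
    print(f"{name:10s} r = {r: .2f}")
```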


Subjects
Risk Assessment , Water/parasitology , Animals , Decision Trees , Epidemiologic Methods , Giardia/pathogenicity , Giardiasis/epidemiology , Giardiasis/etiology , Humans , Models, Theoretical , Monte Carlo Method , Swimming
15.
J Theor Biol ; 176(4): 501-10, 1995 Oct 21.
Article in English | MEDLINE | ID: mdl-8551745

ABSTRACT

A three-species food-chain model that was previously shown to exhibit chaotic dynamics was revisited. By exploring the sensitivity of that result, this study found that complex behavior depended on the functional form chosen to model the interaction between the two highest species in the food chain. Two separate scenarios were explored: the gradual addition of refugia, modeling escape from predation at low prey densities; and the gradual addition of predator interference, modeling territorial behavior. The addition of even a small amount of refugia provided a stabilizing influence, as the chaotic dynamics collapsed to stable limit cycles. The results of adding interference to the model were more complex. Although the numerical simulations indicated that a low level of interference provided a stabilizing influence, the analytical results suggest that complex dynamics are possible for a range of biologically relevant parameter values. The sensitivity of the stability profile to functional changes in the model suggests two important ecological motivations for structural stability analysis. First, in ecological systems, environmental fluctuations cause continuous changes in the functional relationships between and within species, resulting in potential changes in the complexity of the dynamics over time. Second, slight changes in ecological structure may cause significant bifurcations; however, most ecological data are inadequate to distinguish such phenomena.
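
A sketch of the setting: the classic three-species food-chain model in the Hastings-Powell form, with a constant prey refuge subtracted from the prey available to the middle predator. The refuge formulation and parameter values are illustrative assumptions and may differ from those used in the study.

```python
# Illustrative only: standard Hastings-Powell parameters plus an assumed refuge.
from scipy.integrate import solve_ivp

a1, b1, a2, b2, d1, d2 = 5.0, 3.0, 0.1, 2.0, 0.4, 0.01
refuge = 0.05   # fraction of prey permanently protected from predation (assumed)

def food_chain(t, u):
    x, y, z = u
    x_avail = max(x - refuge, 0.0)              # prey exposed to the middle predator
    f1 = a1 * x_avail / (1 + b1 * x_avail)      # type-II functional responses
    f2 = a2 * y / (1 + b2 * y)
    return [x * (1 - x) - f1 * y,               # logistic prey minus predation
            f1 * y - f2 * z - d1 * y,           # middle predator
            f2 * z - d2 * z]                    # top predator

sol = solve_ivp(food_chain, (0, 2000), [0.75, 0.15, 9.0], rtol=1e-8, atol=1e-10)
print("state at t = 2000:", sol.y[:, -1].round(3))
```

Setting refuge = 0 recovers the standard chaotic regime for these parameters; sweeping refuge upward is one way to see the collapse to simpler attractors that the abstract describes.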


Subjects
Ecology , Food , Predatory Behavior , Animals , Models, Biological , Nonlinear Dynamics
16.
Life Support Biosph Sci ; 1(3-4): 141-57, 1995.
Article in English | MEDLINE | ID: mdl-11538586

ABSTRACT

Several characteristics of a Controlled Ecological Life Support System distinguish it from commonly engineered systems: 1) uncertainty due to limited data availability and variability due to the heterogeneity of biological subsystems; 2) the closed, ecological nature of the system; and 3) the primary criterion of maximizing the probability of survival. Consequences of these features include complex dynamics characterized by time scales ranging from milliseconds to months, which pose difficult problems for mathematical modeling and predictability, and the need for a unique controller design that can translate the high-level requirement of survivability into low-level actuator tasks. Future research in the systems and control area should include an ecological perspective focusing on the unique dynamical characteristics of a Controlled Ecological Life Support System.


Subjects
Ecological Systems, Closed , Life Support Systems , Models, Biological , Models, Theoretical , Systems Theory , Equipment Design , Facility Design and Construction , Systems Integration
17.
J Med Entomol ; 32(2): 83-97, 1995 Mar.
Article in English | MEDLINE | ID: mdl-7608931

ABSTRACT

The population dynamics of Culex tarsalis in the Coachella and southern San Joaquin valleys of California were studied using Monte Carlo simulations. Multiple years of abundance data were averaged to extract a generalized seasonal pattern for each site. These patterns were used to establish qualitative goodness-of-fit criteria to assess model output and to evaluate the importance of model parameters in simulating mosquito population trends. The parameters governing the degree of temperature and density dependence of larval mortality were important components in determining whether the output passed or failed our criteria, whereas autogeny and the heterogeneity of developmental time were not important.
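
A sketch of the pass/fail sensitivity screen described above: sample parameters, classify each simulation against qualitative criteria, and compare parameter distributions between passing and failing runs (here with a two-sample Kolmogorov-Smirnov statistic). The surrogate "model", parameter names, and criteria are stand-ins, not the Culex simulation itself.

```python
# Illustrative only: surrogate model, invented parameter names and criteria.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(6)
n_runs = 2_000
params = {"larval_temp_coeff":  rng.uniform(0.0, 1.0, n_runs),
          "density_dependence": rng.uniform(0.0, 1.0, n_runs),
          "autogeny_fraction":  rng.uniform(0.0, 1.0, n_runs)}

# toy surrogate for "does simulated abundance match the seasonal pattern?"
score = (1 - np.abs(params["larval_temp_coeff"] - 0.6)
         - 0.8 * np.abs(params["density_dependence"] - 0.4)
         + 0.05 * rng.normal(size=n_runs))
passed = score > np.quantile(score, 0.7)

# parameters whose pass/fail distributions differ most are the influential ones
for name, x in params.items():
    stat, p = ks_2samp(x[passed], x[~passed])
    print(f"{name:18s} KS = {stat:.2f}  p = {p:.2g}")
```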


Subjects
Culex , Models, Biological , Animals , Computer Simulation , Culex/growth & development , Culex/physiology , Ecology , Monte Carlo Method , Population Dynamics
18.
J Med Entomol ; 32(2): 98-106, 1995 Mar.
Article in English | MEDLINE | ID: mdl-7608932

ABSTRACT

A previously described simulation model (Eisenberg et al. 1994a) was used to compare the population dynamics of Culex tarsalis Coquillett in the Coachella Valley and the southern portion of the San Joaquin Valley in California. Model outputs were classified as a pass if they met criteria defining the typical seasonal abundance patterns established by CO2 and New Jersey light-trap data. The sensitivity of this classification to the model parameters was assessed by running multiple simulations for each valley site. Parameter sets associated with a pass were first analyzed separately for each valley and then compared. The two study sites were distinguished by the distributional characteristics of two parameters associated with temperature dependency: one described the temperature dependence of larval mortality and the other the temperature dependence of adult egg development. We hypothesize that these isolated Cx. tarsalis populations evolved separately to maximize survival in their respective temperature regimes by adapting to different optimal larval survival temperatures and egg-development rates.


Subjects
Culex , Models, Biological , Analysis of Variance , Animals , Culex/physiology , Ecology , Population Dynamics
20.
Postgrad Med J ; 60(708): 705-6, 1984 Oct.
Article in English | MEDLINE | ID: mdl-6494097

ABSTRACT

A combination of an oral beta-adrenergic blocking agent and verapamil has been advocated as a safe treatment for angina. A case of Wenckebach-type atrioventricular block occurring in a patient taking metoprolol and verapamil is reported. It is suggested that this combination be used with caution.


Subjects
Metoprolol/adverse effects , Verapamil/adverse effects , Angina Pectoris/drug therapy , Drug Interactions , Drug Therapy, Combination , Heart Block/chemically induced , Humans , Male , Middle Aged