Results 1 - 20 of 23
1.
Curr Hypertens Rep ; 26(5): 183-199, 2024 May.
Article in English | MEDLINE | ID: mdl-38363454

ABSTRACT

PURPOSE OF REVIEW: To define resistant hypertension (RHT), review its pathophysiology and disease burden, identify barriers to effective hypertension management, and to highlight emerging treatment options. RECENT FINDINGS: RHT is defined as uncontrolled blood pressure (BP) ≥ 130/80 mm Hg despite concurrent prescription of ≥ 3 antihypertensive drugs of different classes, or as BP controlled only with prescription of ≥ 4 drugs, at maximally tolerated doses, including a diuretic. BP is regulated by a complex interplay between the renin-angiotensin-aldosterone system, the sympathetic nervous system, the endothelin system, natriuretic peptides, the arterial vasculature, and the immune system; disruption of any of these can increase BP. RHT is disproportionately manifest in African Americans, older patients, and those with diabetes and/or chronic kidney disease (CKD). Amongst drug-treated hypertensives, only one-quarter have been treated intensively enough (prescribed > 2 drugs) to be considered for this diagnosis. New treatment strategies aimed at novel therapeutic targets include inhibition of sodium-glucose cotransporter 2, aminopeptidase A, aldosterone synthesis, phosphodiesterase 5, xanthine oxidase, and dopamine beta-hydroxylase, as well as soluble guanylate cyclase stimulation, nonsteroidal mineralocorticoid receptor antagonism, and dual endothelin receptor antagonism. The burden of RHT remains high. Better use of currently approved therapies and integration of emerging therapies are welcome additions to the therapeutic armamentarium for addressing needs in high-risk patients with apparent treatment-resistant hypertension (aTRH).


Subjects
Antihypertensive Agents, Hypertension, Humans, Antihypertensive Agents/therapeutic use, Hypertension/drug therapy, Hypertension/physiopathology, Drug Resistance, Blood Pressure/drug effects, Cost of Illness
2.
Ecol Lett ; 20(3): 275-292, 2017 Mar.
Article in English | MEDLINE | ID: mdl-28090753

ABSTRACT

Our ability to infer unobservable disease-dynamic processes such as force of infection (FOI; the infection hazard for susceptible hosts) has transformed our understanding of disease transmission mechanisms and capacity to predict disease dynamics. Conventional methods for inferring FOI estimate a time-averaged value and are based on population-level processes. Because many pathogens exhibit epidemic cycling and FOI is the result of processes acting across the scales of individuals and populations, a flexible framework that extends to epidemic dynamics and links within-host processes to FOI is needed. Specifically, within-host antibody kinetics in wildlife hosts can be short-lived and produce patterns that are repeatable across individuals, suggesting individual-level antibody concentrations could be used to infer time since infection and hence FOI. Using simulations and case studies (influenza A in lesser snow geese and Yersinia pestis in coyotes), we argue that with careful experimental and surveillance design, the population-level FOI signal can be recovered from individual-level antibody kinetics, despite substantial individual-level variation. In addition to improving inference, the cross-scale quantitative antibody approach we describe can reveal insights into drivers of individual-based variation in disease response, and the role of poorly understood processes, such as secondary infections, in population-level dynamics of disease.
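
To make the inference concrete, here is a minimal simulation sketch of the core idea: if antibody titers decay predictably after infection, an individual's titer can be inverted into an estimated time since infection, and the distribution of those estimates recovers the epidemic (FOI) signal. All kinetic parameters below are hypothetical, not values from the study.

```python
# Minimal sketch (hypothetical parameters): back-calculating time since
# infection from antibody titers under an assumed exponential decay model,
# then recovering the epidemic force-of-infection signal from those times.
import numpy as np

rng = np.random.default_rng(42)

# Assumed within-host model: titer(t) = peak * exp(-decay * t) + noise
PEAK, DECAY = 1024.0, 0.02  # peak titer and per-day decay rate (assumptions)

# Simulate a pulsed epidemic: most infections occur around day 120
true_infection_days = rng.normal(loc=120, scale=15, size=500)
sampling_day = 240.0
time_since_infection = sampling_day - true_infection_days

# Observed titers for sampled hosts, with multiplicative noise
titers = PEAK * np.exp(-DECAY * time_since_infection) * rng.lognormal(0, 0.2, 500)

# Invert the decay curve to estimate each host's time since infection
est_tsi = -np.log(titers / PEAK) / DECAY
est_infection_days = sampling_day - est_tsi

# The histogram of estimated infection days approximates the FOI signal
hist, edges = np.histogram(est_infection_days, bins=np.arange(0, 241, 10))
print(f"estimated epidemic peak near day {edges[np.argmax(hist)]:.0f} (truth: ~120)")
```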


Subjects
Coyotes, Ducks, Epidemiologic Methods/veterinary, Geese, Influenza in Birds/epidemiology, Plague/veterinary, Poultry Diseases/epidemiology, Age Factors, Animals, Antibodies, Viral/analysis, Computer Simulation, Cross-Sectional Studies, Influenza A virus/physiology, Influenza in Birds/virology, Longitudinal Studies, Northwest Territories/epidemiology, Plague/epidemiology, Plague/microbiology, Poultry Diseases/virology, Prevalence, Risk Assessment/methods, Seroepidemiologic Studies, Yersinia pestis/physiology
3.
J Anim Ecol ; 86(3): 460-472, 2017 May.
Article in English | MEDLINE | ID: mdl-28207932

ABSTRACT

Identifying mechanisms driving pathogen persistence is a vital component of wildlife disease ecology and control. Asymptomatic, chronically infected individuals are an oft-cited potential reservoir of infection, but demonstrations of the importance of chronic shedding to pathogen persistence at the population level remain scarce. Studying chronic shedding using commonly collected disease data is hampered by numerous challenges, including short-term surveillance that focuses on single epidemics and acutely ill individuals, the subtle dynamical influence of chronic shedding relative to more obvious epidemic drivers, and poor ability to differentiate between the effects of population prevalence of chronic shedding vs. intensity and duration of chronic shedding in individuals. We use chronic shedding of Leptospira interrogans serovar Pomona in California sea lions (Zalophus californianus) as a case study to illustrate how these challenges can be addressed. Using leptospirosis-induced strandings as a measure of disease incidence, we fit models with and without chronic shedding, and with different seasonal drivers, to determine the time-scale over which chronic shedding is detectable and the interactions between chronic shedding and seasonal drivers needed to explain persistence and outbreak patterns. Chronic shedding can enable persistence of L. interrogans within the sea lion population. However, the importance of chronic shedding was only apparent when surveillance data included at least two outbreaks and the intervening inter-epidemic trough during which fadeout of transmission was most likely. Seasonal transmission, as opposed to seasonal recruitment of susceptibles, was the dominant driver of seasonality in this system, and both seasonal factors had limited impact on long-term pathogen persistence. We show that the temporal extent of surveillance data can have a dramatic impact on inferences about population processes, and that failure to identify both short- and long-term ecological drivers can have cascading impacts on understanding of higher-order ecological phenomena, such as pathogen persistence.
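
The modeling comparison described above can be illustrated with a toy stochastic model: adding a small chronically shedding class changes how often transmission fades out during inter-epidemic troughs. This is a sketch under invented rates and population sizes, not the fitted sea lion model.

```python
# Hedged sketch: a discrete-time stochastic SIR-with-chronic-carriers model
# illustrating how a chronically shedding class (C) can bridge
# inter-epidemic troughs. All rates are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def simulate(chronic_fraction, weeks=520, n=5000, beta=1.2, rho=0.3,
             seasonal_amp=0.8, recruit=12):
    """Return True if the pathogen persists to the end of the run."""
    s, i, c = n - 10, 10, 0
    for t in range(weeks):
        season = 1 + seasonal_amp * np.sin(2 * np.pi * t / 52)
        lam = beta * season * (i + rho * c) / n  # carriers shed at reduced rate rho
        new_inf = rng.binomial(s, 1 - np.exp(-lam))
        recovered = rng.binomial(i, 0.5)          # ~2-week acute infection
        new_chronic = rng.binomial(recovered, chronic_fraction)
        cleared = rng.binomial(c, 0.02)           # slow clearance of carriers
        s += recruit - new_inf                    # susceptible recruitment
        i += new_inf - recovered
        c += new_chronic - cleared
        if i + c == 0:
            return False  # fadeout during a trough
    return True

for frac in (0.0, 0.1):
    persisted = np.mean([simulate(frac) for _ in range(200)])
    print(f"chronic fraction {frac}: persistence in {persisted:.0%} of runs")
```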


Subjects
Disease Outbreaks/veterinary, Leptospira interrogans/physiology, Leptospirosis/veterinary, Sea Lions, Virus Shedding, Animals, California/epidemiology, Female, Incidence, Leptospirosis/epidemiology, Leptospirosis/microbiology, Leptospirosis/transmission, Male, Models, Theoretical, Prevalence, Seasons
4.
Proc Biol Sci ; 283(1844), 2016 Dec 14.
Article in English | MEDLINE | ID: mdl-27974523

ABSTRACT

Socially transmitted wildlife behaviours that create human-wildlife conflict are an emerging problem for conservation efforts, but also provide a unique opportunity to apply principles of infectious disease control to wildlife management. As an example, California sea lions (Zalophus californianus) have learned to exploit concentrations of migratory adult salmonids below the fish ladders at Bonneville Dam, impeding endangered salmonid recovery. Proliferation of this foraging behaviour in the sea lion population has resulted in a controversial culling programme of individual sea lions at the dam, but the impact of such culling remains unclear. To evaluate the effectiveness of current and alternative culling strategies, we used network-based diffusion analysis on a long-term dataset to demonstrate that social transmission is implicated in the increase in dam-foraging behaviour and then studied different culling strategies within an epidemiological model of the behavioural transmission data. We show that current levels of lethal control have substantially reduced the rate of social transmission, but failed to effectively reduce overall sea lion recruitment. Earlier implementation of culling could have substantially reduced the extent of behavioural transmission and, ultimately, resulted in fewer animals being culled. Epidemiological analyses offer a promising tool to understand and control socially transmissible behaviours.


Subjects
Feeding Behavior, Learning, Sea Lions/physiology, Animal Communication, Animals, Animals, Wild, Conservation of Natural Resources
5.
Ecol Appl ; 26(3): 740-51, 2016 Apr.
Article in English | MEDLINE | ID: mdl-27411247

ABSTRACT

Migratory behavior of waterfowl populations in North America has traditionally been broadly characterized by four north-south flyways, and these flyways have been central to the management of waterfowl populations for more than 80 yr. However, previous flyway characterizations are not easily updated with current bird movement data and fail to provide assessments of the importance of specific geographical regions to the identification of flyways. Here, we developed a network model of migratory movement for four waterfowl species, Mallard (Anas platyrhynchos), Northern Pintail (A. acuta), American Green-winged Teal (A. carolinensis), and Canada Goose (Branta canadensis), in North America, using bird band and recovery data. We then identified migratory flyways using a community detection algorithm and characterized the importance of smaller geographic regions in identifying flyways using a novel metric, the consolidation factor. We identified four main flyways for Mallards, Northern Pintails, and American Green-winged Teal, whereas flyway identification for Canada Geese exhibited greater complexity. For Mallards, flyways were relatively consistent through time. However, consolidation factors revealed that for Mallards and Green-winged Teal, the presumptive Mississippi flyway was potentially a zone of high mixing between other flyways. Our results demonstrate that the network approach provides a robust method for flyway identification that is widely applicable given its relatively minimal data requirements and is easily updated with future movement data to reflect changes in flyway definitions and management goals.
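
As a sketch of the network approach (not the authors' exact pipeline, and without their consolidation factor metric), band-recovery movements can be aggregated into a weighted graph and candidate flyways recovered as modularity communities, e.g. with networkx. Regions and counts below are invented.

```python
# Illustrative sketch (toy data): building a banding-to-recovery movement
# network and detecting "flyways" as network communities.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical band-recovery pairs: (banding region, recovery region, count)
recoveries = [
    ("Alberta", "Texas", 40), ("Alberta", "Louisiana", 8),
    ("Saskatchewan", "Louisiana", 35), ("Saskatchewan", "Texas", 12),
    ("Ontario", "Tennessee", 30), ("Ontario", "Louisiana", 5),
    ("Quebec", "Tennessee", 28), ("Quebec", "Maryland", 22),
]

G = nx.Graph()
for origin, destination, count in recoveries:
    # Aggregate repeated movements into weighted edges
    w = G[origin][destination]["weight"] + count if G.has_edge(origin, destination) else count
    G.add_edge(origin, destination, weight=w)

# Communities of regions that exchange most of their birds = candidate flyways
for i, flyway in enumerate(greedy_modularity_communities(G, weight="weight"), 1):
    print(f"flyway {i}: {sorted(flyway)}")
```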


Subjects
Animal Migration, Ducks/physiology, Models, Biological, Animals, Ducks/classification, Environmental Monitoring, North America, Species Specificity, Time Factors
6.
Hypertension ; 77(1): 72-81, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33161774

ABSTRACT

Refractory hypertension (RfH) is a severe phenotype of antihypertensive treatment failure. Treatment-resistant hypertension (TRH), a less severe form of difficult-to-treat hypertension, has been associated with significantly worse health outcomes. However, no studies currently show how health outcomes may worsen upon progression to RfH. RfH and TRH were studied in 3147 hypertensive participants in the CRIC (Chronic Renal Insufficiency Cohort) study. The hypertensive phenotype (i.e., no TRH or RfH, TRH, or RfH) was identified at the baseline visit, and health outcomes were monitored at subsequent visits. Outcome risk was compared using Cox proportional hazards models with time-varying covariates. A total of 136 (4.3%) individuals were identified with RfH at baseline. After adjusting for participant characteristics, individuals with RfH had increased risk for the composite renal outcome across all study years (50% decline in estimated glomerular filtration rate or end-stage renal disease; hazard ratio for study years 0-10 = 1.73 [95% CI, 1.42-2.11]) and the composite cardiovascular disease outcome during later study years (stroke, myocardial infarction, or congestive heart failure; hazard ratios for study years 0-3 = 1.25 [0.91-1.73], years 3-6 = 1.50 [0.97-2.32], and years 6-10 = 2.72 [1.47-5.01]) when compared with individuals with TRH. There was no significant difference in all-cause mortality between those with RfH and those with TRH. We provide the first evidence that RfH is associated with worse long-term health outcomes compared with TRH.
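
A time-varying Cox analysis of this kind can be sketched with the lifelines library: the phenotype indicator is allowed to change between visits, as in the CRIC design. Column names and values below are hypothetical placeholders, not CRIC data.

```python
# Hedged sketch: Cox regression with time-varying covariates in lifelines.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long format: one row per participant per interval between visits, with the
# hypertensive phenotype re-assessed at each visit (values are fabricated).
df = pd.DataFrame({
    "id":    [1, 1, 2, 2, 3, 3],
    "start": [0, 3, 0, 3, 0, 3],   # interval start (years)
    "stop":  [3, 7, 3, 6, 3, 9],   # interval end (years)
    "rfh":   [0, 1, 0, 0, 1, 1],   # 1 = refractory hypertension this interval
    "trh":   [1, 0, 1, 1, 0, 0],   # 1 = treatment-resistant (non-refractory)
    "event": [0, 1, 0, 0, 0, 1],   # composite outcome at interval end
})

# A small penalty stabilizes the fit on this tiny toy dataset
ctv = CoxTimeVaryingFitter(penalizer=0.1)
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratios for rfh and trh vs. the reference group
```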


Subjects
Antihypertensive Agents/therapeutic use, Hypertension/drug therapy, Renal Insufficiency, Chronic/complications, Adult, Aged, Cohort Studies, Female, Humans, Hypertension/complications, Hypertension/epidemiology, Male, Middle Aged, Patient Outcome Assessment, Proportional Hazards Models
7.
Ecol Evol ; 10(14): 7221-7232, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32760523

ABSTRACT

Obtaining accurate estimates of disease prevalence is crucial for the monitoring and management of wildlife populations but can be difficult if different diagnostic tests yield conflicting results and if the accuracy of each diagnostic test is unknown. Bayesian latent class analysis (BLCA) modeling offers a potential solution, providing estimates of prevalence levels and diagnostic test accuracy under the realistic assumption that no diagnostic test is perfect. In typical applications of this approach, the specificity of one test is fixed at or close to 100%, allowing the model to simultaneously estimate the sensitivity and specificity of all other tests, in addition to infection prevalence. In wildlife systems, a test with near-perfect specificity is not always available, so we simulated data to investigate how decreasing this fixed specificity value affects the accuracy of model estimates. We used simulations to explore how the trade-off between diagnostic test specificity and sensitivity impacts prevalence estimates and found that directional biases depend on pathogen prevalence. Both the precision and accuracy of results depend on the sample size, the diagnostic tests used, and the true infection prevalence, so these factors should be considered when applying BLCA to estimate disease prevalence and diagnostic test accuracy in wildlife systems. A wildlife disease case study, focusing on leptospirosis in California sea lions, demonstrated the potential for Bayesian latent class methods to provide reliable estimates under real-world conditions. We delineate conditions under which BLCA improves upon the results from a single diagnostic across a range of prevalence levels and sample sizes, demonstrating when this method is preferable for disease ecologists working in a wide variety of pathogen systems.
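
The fixed-specificity device described above can be sketched for two tests in PyMC: test A's specificity is pinned near 100% while prevalence and the remaining accuracies are estimated. Counts and priors below are hypothetical, not from the study.

```python
# Hedged sketch of a two-test Bayesian latent class analysis (BLCA),
# with one test's specificity fixed near 100%.
import numpy as np
import pymc as pm

# Cross-classified counts for tests A and B: [+/+, +/-, -/+, -/-]
counts = np.array([40, 12, 9, 439])
SP_A = 0.99  # fixed, assumed near-perfect specificity of test A

with pm.Model() as blca:
    prev = pm.Beta("prev", 1, 1)    # infection prevalence
    se_a = pm.Beta("se_a", 1, 1)    # sensitivity of test A
    se_b = pm.Beta("se_b", 1, 1)    # sensitivity of test B
    sp_b = pm.Beta("sp_b", 1, 1)    # specificity of test B

    def cell(a, b):
        # P(result a on A, result b on B), tests conditionally independent
        pos = se_a**a * (1 - se_a)**(1 - a) * se_b**b * (1 - se_b)**(1 - b)
        neg = (1 - SP_A)**a * SP_A**(1 - a) * (1 - sp_b)**b * sp_b**(1 - b)
        return prev * pos + (1 - prev) * neg

    p = pm.math.stack([cell(1, 1), cell(1, 0), cell(0, 1), cell(0, 0)])
    pm.Multinomial("obs", n=counts.sum(), p=p, observed=counts)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print("posterior mean prevalence:", idata.posterior["prev"].mean().item())
```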

8.
Am J Hypertens ; 33(6): 528-533, 2020 May 21.
Article in English | MEDLINE | ID: mdl-31930338

ABSTRACT

BACKGROUND: Intensively treated participants in the SPRINT study experienced fewer primary cardiovascular composite study endpoints (CVD events) and lower mortality, although 38% of participants experienced a serious adverse event (SAE). The relationship of SAEs with CVD events is unknown. METHODS: CVD events were defined as either myocardial infarction, acute coronary syndrome, decompensated heart failure, stroke, or death from cardiovascular causes. Cox models were utilized to understand the occurrence of SAEs with CVD events according to baseline atherosclerotic cardiovascular disease (ASCVD) risk. RESULTS: SAEs occurred in 96% of those experiencing a CVD event but in only 34% (P < 0.001) of those not experiencing a CVD event. Occurrence of SAEs monotonically increased across the range of baseline ASCVD risk, being approximately twice as great in the highest compared with the lowest risk category. SAE occurrence was strongly associated with ASCVD risk but was similar within risk groups across treatment arms. In adjusted Cox models, experiencing a CVD event was the strongest predictor of SAEs in all risk groups. By the end of year 1, the hazard ratios for the low, middle, and high ASCVD risk tertiles and the baseline clinical CVD group were 2.56 (95% CI = 1.39-4.71), 2.52 (1.63-3.89), 3.61 (2.79-4.68), and 1.86 (1.37-2.54), respectively, a trend observed in subsequent years until study end. Intensive treatment independently predicted SAEs only in the second ASCVD risk tertile. CONCLUSIONS: The occurrence of SAEs is multifactorial and mostly related to prerandomization patient characteristics, most prominently ASCVD risk, which, in turn, relates to in-study CVD events.


Subjects
Antihypertensive Agents/therapeutic use, Blood Pressure/drug effects, Cardiovascular Diseases/prevention & control, Hypertension/drug therapy, Aged, Antihypertensive Agents/adverse effects, Cardiovascular Diseases/mortality, Cardiovascular Diseases/physiopathology, Cluster Analysis, Female, Heart Disease Risk Factors, Humans, Hypertension/mortality, Hypertension/physiopathology, Male, Middle Aged, Randomized Controlled Trials as Topic, Risk Assessment, Time Factors, Treatment Outcome
9.
PLoS Negl Trop Dis ; 14(6): e0008407, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32598393

ABSTRACT

Confronted with the challenge of understanding population-level processes, disease ecologists and epidemiologists often simplify quantitative data into distinct physiological states (e.g. susceptible, exposed, infected, recovered). However, data defining these states often fall along a spectrum rather than into clear categories. Hence, the host-pathogen relationship is more accurately defined using quantitative data, often integrating multiple diagnostic measures, just as clinicians do to assess their patients. We use quantitative data on the agent of a major neglected tropical disease (Leptospira interrogans) in California sea lions (Zalophus californianus) to improve individual-level and population-level understanding of this Leptospira reservoir system. We create a "host-pathogen space" by mapping multiple biomarkers of infection (e.g. serum antibodies, pathogen DNA) and disease state (e.g. serum chemistry values) from 13 longitudinally sampled, severely ill individuals to characterize changes in these values through time. Data from these individuals describe a clear, unidirectional trajectory of disease and recovery within this host-pathogen space. Remarkably, this trajectory also captures the broad patterns in larger cross-sectional datasets of 1456 wild sea lions in all states of health but sampled only once. Our framework enables us to determine an individual's location in their time-course since initial infection, and to visualize the full range of clinical states and antibody responses induced by pathogen exposure. We identify predictive relationships between biomarkers and outcomes such as survival and pathogen shedding, and use these to impute values for missing data, thus increasing the size of the useable dataset. Mapping the host-pathogen space using quantitative biomarker data enables more nuanced understanding of an individual's time course of infection, duration of immunity, and probability of being infectious. Such maps also make efficient use of limited data for rare or poorly understood diseases, by providing a means to rapidly assess the range and extent of potential clinical and immunological profiles. These approaches yield benefits for clinicians needing to triage patients, prevent transmission, and assess immunity, and for disease ecologists or epidemiologists working to develop appropriate risk management strategies to reduce transmission risk on a population scale (e.g. model parameterization using more accurate estimates of duration of immunity and infectiousness) and to assess health impacts on a population scale.
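
One simple way to picture such a "host-pathogen space" is to project standardized biomarkers into two dimensions and check that position tracks time since infection. The sketch below uses simulated biomarkers (hypothetical names and dynamics) and PCA as a stand-in for the authors' mapping.

```python
# Illustrative sketch: mapping multiple biomarkers into a low-dimensional
# "host-pathogen space" and checking that position tracks infection time.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200

# Latent variable: days since infection for each sampled animal
time_since_infection = rng.uniform(0, 120, n)

# Rows = samples; columns = hypothetical biomarkers that wane with recovery
X = np.column_stack([
    8 - 0.05 * time_since_infection + rng.normal(0, 0.5, n),           # log2 antibody titer
    6 * np.exp(-0.04 * time_since_infection) + rng.normal(0, 0.3, n),  # creatinine
    90 * np.exp(-0.05 * time_since_infection) + rng.normal(0, 5, n),   # BUN
])

# Standardize, then project to 2-D; disease-and-recovery should trace a
# roughly unidirectional trajectory along the first component
Z = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
r = np.corrcoef(Z[:, 0], time_since_infection)[0, 1]
print(f"correlation of PC1 with time since infection: {r:+.2f}")
```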


Subjects
Biomarkers/blood, Host-Pathogen Interactions/physiology, Leptospira/pathogenicity, Leptospirosis/diagnosis, Leptospirosis/veterinary, Sea Lions/microbiology, Animal Diseases/diagnosis, Animal Diseases/immunology, Animal Diseases/microbiology, Animals, Antibodies, Bacterial/blood, Bacterial Shedding, California, Cross-Sectional Studies, Host-Pathogen Interactions/immunology, Immunity, Kinetics, Leptospira interrogans, Leptospirosis/immunology, Survival Rate
11.
Int J Pharm Pract ; 27(4): 380-385, 2019 Aug.
Article in English | MEDLINE | ID: mdl-30847977

ABSTRACT

OBJECTIVE: To assess whether hypoglycaemia incidence during management of adult diabetic ketoacidosis (DKA) differed following transition from a fixed-rate insulin protocol to a protocol using an empiric insulin rate reduction after normoglycaemia. METHODS: We retrospectively reviewed charts from adult patients managed with a DKA order set before and after order set revision. In cohort 1 (n = 77), insulin rate was 0.1 unit/kg/h with no adjustments and dextrose was infused at 12.5 g/h after glucose reached 250 mg/dl. In cohort 2 (n = 78), insulin was reduced to 0.05 unit/kg/h concurrent with dextrose initiation at 12.5 g/h after glucose reached 200 mg/dl. The primary outcome was hypoglycaemia (glucose < 70 mg/dl) within 24 h of the first order for insulin. KEY FINDINGS: The 24-h incidence of hypoglycaemia was 19.2% in cohort 2 versus 32.5% in cohort 1; the adjusted odds ratio was 0.46 (95% confidence interval (CI) [0.21, 0.98]; P = 0.047). The 24-h use of dextrose 50% in water (D50W) was also reduced in cohort 2. No differences were seen in anion gap or bicarbonate normalization, rebound hyperglycaemia or ICU length of stay. In most patients who became hypoglycaemic, the preceding glucose value was below 100 mg/dl. CONCLUSIONS: The insulin rate-reduction protocol was associated with less hypoglycaemia and no obvious disadvantage. Robust intervention for low-normal glucose values could plausibly achieve low hypoglycaemia rates with either approach.
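
As a rough plausibility check (not the study's adjusted analysis), the crude odds ratio can be reconstructed from the reported proportions and cohort sizes:

```python
# Back-of-envelope check: the abstract's OR of 0.46 is adjusted; the crude
# odds ratio from the raw proportions can be approximated as follows.
import math

# 19.2% of 78 in cohort 2 and 32.5% of 77 in cohort 1 had hypoglycaemia
a, n2 = round(0.192 * 78), 78   # events / total, cohort 2 (rate-reduction protocol)
c, n1 = round(0.325 * 77), 77   # events / total, cohort 1 (fixed-rate protocol)
b, d = n2 - a, n1 - c

crude_or = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf standard error
lo = math.exp(math.log(crude_or) - 1.96 * se_log_or)
hi = math.exp(math.log(crude_or) + 1.96 * se_log_or)
print(f"crude OR = {crude_or:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR ~0.49
```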


Subjects
Blood Glucose/analysis, Diabetic Ketoacidosis/drug therapy, Hypoglycemia/epidemiology, Hypoglycemic Agents/adverse effects, Insulin/adverse effects, Adult, Diabetic Ketoacidosis/blood, Dose-Response Relationship, Drug, Female, Humans, Hypoglycemia/blood, Hypoglycemia/chemically induced, Hypoglycemic Agents/administration & dosage, Incidence, Insulin/administration & dosage, Length of Stay, Male, Middle Aged, Retrospective Studies, Young Adult
12.
J Hypertens ; 37(9): 1797-1804, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31058798

ABSTRACT

OBJECTIVES: Refractory hypertension has been defined as uncontrolled blood pressure (at or above 140/90 mmHg) when on five or more classes of antihypertensive medication, inclusive of a diuretic. Because unbiased estimates of the prevalence of refractory hypertension in the United States are lacking, we aim to provide such estimates using data from the National Health and Nutrition Examination Surveys (NHANES). METHODS: Refractory hypertension was assessed across multiple NHANES cycles using the aforementioned definition. Eight cycles of NHANES surveys (1999-2014) representing 41 552 patients are the subject of this study. Prevalence of refractory hypertension across these surveys was estimated in the drug-treated hypertensive population after adjusting for the complex survey design and standardizing for age. RESULTS: Across all surveys, refractory hypertension prevalence was 0.6% [95% confidence interval (CI) (0.5, 0.7)] amongst drug-treated hypertensive adults; 6.2% [95% CI (5.1, 7.6)] of individuals with treatment-resistant hypertension actually had refractory hypertension. Although the prevalence of refractory hypertension ranged from 0.3% [95% CI (0.1, 1.0)] to 0.9% [95% CI (0.6, 1.2)] over the eight cycles considered, there was no significant trend in prevalence over time. Refractory hypertension prevalence amongst those prescribed five or more drugs was 34.5% [95% CI (27.9, 41.9)]. Refractory hypertension was associated with advancing age, lower household income, black race, and also chronic kidney disease, albuminuria, diabetes, prior stroke, and coronary heart disease. CONCLUSIONS: We provided the first nationally representative estimate of refractory hypertension prevalence in US adults.
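
The estimation idea, stripped of NHANES's full design machinery, is a weighted prevalence among drug-treated hypertensive adults. The sketch below uses fabricated data and ignores the strata/PSU variance adjustment and age standardization that a real analysis requires.

```python
# Minimal sketch: survey-weighted prevalence of refractory hypertension among
# drug-treated hypertensive adults. All data are fabricated placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 5000
df = pd.DataFrame({
    "treated_htn": rng.random(n) < 0.35,      # on BP medication
    "n_bp_drugs":  rng.integers(1, 7, n),     # drug classes prescribed
    "sbp":         rng.normal(138, 18, n),    # systolic BP, mmHg
    "dbp":         rng.normal(82, 11, n),     # diastolic BP, mmHg
    "weight":      rng.uniform(0.5, 3.0, n),  # exam sample weight
})

treated = df[df["treated_htn"]]
# Refractory: uncontrolled (>= 140/90 mmHg) on >= 5 drug classes
refractory = (treated["n_bp_drugs"] >= 5) & (
    (treated["sbp"] >= 140) | (treated["dbp"] >= 90)
)

prev = np.average(refractory, weights=treated["weight"])
print(f"weighted refractory-hypertension prevalence: {prev:.1%}")
```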


Subjects
Antihypertensive Agents/therapeutic use, Diuretics/therapeutic use, Hypertension/epidemiology, Aged, Aged, 80 and over, Albuminuria/etiology, Antihypertensive Agents/pharmacology, Blood Pressure/drug effects, Blood Pressure Determination, Female, Humans, Hypertension/complications, Hypertension/drug therapy, Male, Middle Aged, Nutrition Surveys, Prevalence, Renal Insufficiency, Chronic/etiology, Stroke/etiology, United States/epidemiology
13.
J Am Soc Hypertens ; 12(11): 809-817, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30392848

ABSTRACT

Apparent treatment-resistant hypertension (aTRH) is associated with higher prevalence of secondary hypertension and greater risk for adverse pressure-related clinical outcomes, and it influences diagnostic and therapeutic decision-making. We previously showed that cross-sectional prevalence estimates of aTRH are lower than its true prevalence, as patients with uncontrolled hypertension undergoing intensification/optimization of therapy will, over time, increasingly satisfy diagnostic criteria for aTRH. aTRH was assessed in an urban referral hypertension clinic using a 140/90 mm Hg goal blood pressure target in 745 patients with uncontrolled blood pressure, who were predominately African-American (86%) and female (65%). Analyses were stratified according to existing prescription of a diuretic at the initial visit. Risk for aTRH was estimated using logistic regression with patient characteristics at the index visit as predictors. Among those prescribed diuretics, 84/363 developed aTRH; the risk score discriminated well (area under the receiver operating curve = 0.77, bootstrapped 95% CI [0.71, 0.81]). In patients not prescribed a diuretic, 44/382 developed aTRH, and the risk score showed significantly better discriminative ability (area under the receiver operating curve = 0.82 [0.76, 0.87]; P < .001). In the diuretic and nondiuretic cohorts, 145/363 and 290/382 patients had estimated risks for development of aTRH < 15%. Of these low-risk patients, 139/145 and 278/290 did not develop aTRH (negative predictive value: diuretics 0.94 [0.91, 0.98]; no diuretics 0.95 [0.93, 0.97]). We created a novel clinical score that discriminates well between those who will and will not develop aTRH, especially among those without existing diuretic prescriptions. Irrespective of baseline diuretic treatment status, a low-risk score had very high negative predictive value.
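
A risk score of this type is typically a logistic regression whose discrimination is summarized by the area under the receiver operating curve (AUC) with a bootstrap CI. A sketch with hypothetical predictors and simulated outcomes:

```python
# Hedged sketch: logistic-regression risk score plus bootstrapped AUC CI.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 363
X = np.column_stack([
    rng.normal(150, 15, n),   # index-visit systolic BP (hypothetical predictor)
    rng.integers(1, 5, n),    # number of BP drug classes
    rng.normal(55, 12, n),    # age
])
# Simulated outcome with a known signal, calibrated to ~1 in 4 developing aTRH
logit = -14 + 0.07 * X[:, 0] + 0.4 * X[:, 1] + 0.02 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
score = model.predict_proba(X)[:, 1]
print(f"apparent AUC = {roc_auc_score(y, score):.2f}")

# Bootstrap CI for the AUC
aucs = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    if len(set(y[idx])) < 2:
        continue  # a resample must contain both classes
    aucs.append(roc_auc_score(y[idx], score[idx]))
print("bootstrapped 95% CI:", np.percentile(aucs, [2.5, 97.5]).round(2))
```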

14.
Sci Rep ; 7(1): 18062, 2017 Dec 22.
Article in English | MEDLINE | ID: mdl-29273783

ABSTRACT

Environmental reservoirs are important to infectious disease transmission and persistence, but empirical analyses are relatively few. The natural environment is a reservoir for prions that cause chronic wasting disease (CWD) and influences the risk of transmission to susceptible cervids. Soil is one environmental component demonstrated to affect prion infectivity and persistence. Here we provide the first landscape predictive model for CWD based solely on soil characteristics. We built a boosted regression tree model to predict the probability of the persistent presence of CWD in a region of northern Illinois using CWD surveillance in deer and soils data. We evaluated the outcome for possible pathways by which soil characteristics may increase the probability of CWD transmission via environmental contamination. Soil clay content and pH were the most important predictive soil characteristics of the persistent presence of CWD. The results suggest that exposure to prions in the environment is greater where percent clay is less than 18% and soil pH is greater than 6.6. These characteristics could alter availability of prions immobilized in soil and contribute to the environmental risk factors involved in the epidemiological complexity of CWD infection in natural populations of white-tailed deer.
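
A boosted-tree model of this kind can be sketched with scikit-learn's gradient boosting; the data below are simulated around the reported clay and pH cutoffs rather than drawn from the actual surveillance records.

```python
# Illustrative sketch: boosted trees predicting persistent CWD presence from
# soil covariates. Data are simulated around the published cutoffs.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(5)
n = 1000
clay = rng.uniform(5, 40, n)   # percent clay
ph = rng.uniform(5.0, 8.0, n)  # soil pH

# Simulated signal: higher risk where clay < 18% and pH > 6.6
risk = 0.15 + 0.5 * ((clay < 18) & (ph > 6.6))
y = rng.random(n) < risk

X = np.column_stack([clay, ph])
brt = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                 max_depth=3).fit(X, y)
print("relative influence (clay, pH):", brt.feature_importances_.round(2))
print("P(CWD) at clay=12%, pH=7.2:", brt.predict_proba([[12, 7.2]])[0, 1].round(2))
print("P(CWD) at clay=30%, pH=6.0:", brt.predict_proba([[30, 6.0]])[0, 1].round(2))
```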


Subjects
Clay/chemistry, Models, Theoretical, Prions/metabolism, Soil/chemistry, Wasting Disease, Chronic/metabolism, Animals, Animals, Wild, Deer, Environment, Hydrogen-Ion Concentration, Illinois
15.
Prev Vet Med ; 134: 82-91, 2016 Nov 01.
Article in English | MEDLINE | ID: mdl-27836049

ABSTRACT

The application of network analysis to cattle shipments broadens our understanding of shipment patterns beyond pairwise interactions to the network as a whole. Such a quantitative description of cattle shipments in the U.S. can identify trade communities, describe temporal shipment patterns, and inform the design of disease surveillance and control strategies. Here, we analyze a longitudinal dataset of beef and dairy cattle shipments from 2009 to 2011 in the United States to characterize communities within the broader cattle shipment network, which are groups of counties that ship mostly to each other. Because shipments occur over time, we aggregate the data at various temporal scales to examine the consistency of network and community structure over time. Our results identified nine large (>50 counties) communities based on shipments of beef cattle in 2009 aggregated into an annual network and nine large communities based on shipments of dairy cattle. The size and connectance of the shipment network was highly dynamic; monthly networks were smaller than yearly networks and revealed seasonal shipment patterns consistent across years. Comparison of the shipment network over time showed largely consistent shipping patterns, such that communities identified on annual networks of beef and dairy shipments from 2009 still represented 41-95% of shipments in monthly networks from 2009 and 41-66% of shipments from networks in 2010 and 2011. The temporal aspects of cattle shipments suggest that future applications of the U.S. cattle shipment network should consider seasonal shipment patterns. However, the consistent within-community shipping patterns indicate that yearly communities could provide a reasonable way to group regions for management.
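
The temporal-aggregation step is mechanical: the same shipment records are rolled up into yearly versus monthly edge lists before any community detection. A toy sketch with fabricated shipment records:

```python
# Sketch: aggregating shipment records into yearly vs. monthly networks to
# compare size and structure. County names and dates are fabricated.
import pandas as pd
import networkx as nx

shipments = pd.DataFrame({
    "date":   pd.to_datetime(["2009-01-10", "2009-01-22", "2009-06-05",
                              "2009-06-18", "2009-11-02", "2009-11-20"]),
    "origin": ["Weld, CO", "Weld, CO", "Finney, KS",
               "Finney, KS", "Weld, CO", "Deaf Smith, TX"],
    "dest":   ["Finney, KS", "Deaf Smith, TX", "Deaf Smith, TX",
               "Weld, CO", "Finney, KS", "Weld, CO"],
})

def build_network(df):
    g = nx.DiGraph()
    for _, row in df.iterrows():
        g.add_edge(row["origin"], row["dest"])
    return g

yearly = build_network(shipments)
print("yearly network:", yearly.number_of_nodes(), "counties,",
      yearly.number_of_edges(), "links")

for month, chunk in shipments.groupby(shipments["date"].dt.month):
    g = build_network(chunk)
    print(f"month {month}: {g.number_of_nodes()} counties, "
          f"{g.number_of_edges()} links")
```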


Subjects
Animal Husbandry/methods, Commerce, Transportation, Animal Husbandry/economics, Animals, Cattle, Female, Longitudinal Studies, Male, Models, Theoretical, Seasons, Spatial Analysis, United States
16.
Elife ; 4, 2015 Sep 02.
Article in English | MEDLINE | ID: mdl-26329460

ABSTRACT

The controversy surrounding 'gain-of-function' experiments on high-consequence avian influenza viruses has highlighted the role of ferret transmission experiments in studying the transmission potential of novel influenza strains. However, the mapping between influenza transmission in ferrets and in humans is unsubstantiated. We address this gap by compiling and analyzing 240 estimates of influenza transmission in ferrets and humans. We demonstrate that estimates of ferret secondary attack rate (SAR) explain 66% of the variation in human SAR estimates at the subtype level. Further analysis shows that ferret transmission experiments have potential to identify influenza viruses of concern for epidemic spread in humans, though small sample sizes and biological uncertainties prevent definitive classification of human transmissibility. Thus, ferret transmission experiments provide valid predictions of pandemic potential of novel influenza strains, though results should continue to be corroborated by targeted virological and epidemiological research.
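
The headline statistic is an ordinary regression of human SAR on ferret SAR across subtypes. A worked sketch with made-up values (chosen so the printed R² lands near the reported 0.66):

```python
# Worked sketch: human secondary attack rate (SAR) regressed on ferret SAR.
# The SAR values below are toy numbers, not the compiled estimates.
import numpy as np
from scipy.stats import linregress

ferret_sar = np.array([0.10, 0.25, 0.45, 0.60, 0.80])  # per subtype
human_sar = np.array([0.25, 0.10, 0.55, 0.35, 0.65])   # matched estimates

fit = linregress(ferret_sar, human_sar)
print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue**2:.2f}")  # R^2 ~0.6 here
```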


Subjects
Influenza, Human/transmission, Orthomyxoviridae Infections/transmission, Animals, Disease Models, Animal, Ferrets, Humans, Models, Theoretical
17.
Aquat Mamm ; 41(2): 203-212, 2015.
Article in English | MEDLINE | ID: mdl-30792564

ABSTRACT

Stranded California sea lions (Zalophus californianus) along the California coast have been diagnosed with leptospirosis every year since at least the 1980s. Between September 2010 and November 2011, we followed 14 stranded California sea lions that survived to release and evaluated antibiotic efficacy in eliminating leptospiruria (urinary shedding of leptospires). Leptospiruria was assessed by real-time PCR of urine and urine culture, with persistence assessed using longitudinally collected samples. Serum chemistry was used to assess recovery of normal renal function. Microscopic agglutination testing (MAT) was performed to assess serum anti-Leptospira antibody titers, and the MAT reactivity patterns were consistent with L. interrogans serovar Pomona infection frequently observed in this population. Animals were initially treated for 6 to 16 d (median = 10.5; mean = 10.8) with antibiotics from the penicillin family, with some receiving additional antibiotics to treat other medical conditions. All urine cultures were negative; therefore, the presence of leptospiruria was assessed using PCR. Leptospiruria continued beyond the initial course of penicillin family antibiotics in 13 of the 14 sea lions, beyond the last antibiotic dose in 11 of the 14 sea lions, beyond recovery of renal function in 13 of the 14 sea lions, and persisted for at least 8 to 86 d (median = 45; mean = 46.8). Five animals were released with no negative urine PCR results detected; thus, their total shedding duration may have been longer. Cessation of leptospiruria was more likely in animals that received antibiotics for a greater duration, especially if coverage was uninterrupted. Real-time PCR results indicate that an antibiotic protocol commonly used to treat leptospirosis in rehabilitating California sea lions does not eliminate leptospiruria. It is possible that antibiotic protocols given for a longer duration and/or including other antibiotics may be effective in eliminating leptospiruria. These results may have important human and animal health implications, especially in rehabilitation facilities, as Leptospira transmission may occur through contact with animals with persistent leptospiruria.

18.
Epidemics ; 10: 26-30, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25843378

ABSTRACT

Many disease systems exhibit complexities not captured by current theoretical and empirical work. In particular, systems with multiple host species and multiple infectious agents (i.e., multi-host, multi-agent systems) require novel methods to extend the wealth of knowledge acquired studying primarily single-host, single-agent systems. We outline eight challenges in multi-host, multi-agent systems that could substantively increase our knowledge of the drivers and broader ecosystem effects of infectious disease dynamics.


Subjects
Communicable Diseases/epidemiology, Communicable Diseases/transmission, Ecology, Food Chain, Host-Pathogen Interactions, Humans, Life Cycle Stages, Models, Statistical, Population Dynamics
19.
Parasit Vectors ; 8: 98, 2015 Feb 15.
Article in English | MEDLINE | ID: mdl-25889533

ABSTRACT

BACKGROUND: Vector control remains the primary defense against dengue fever. Its success relies on the assumption that vector density is related to disease transmission. Two operational issues include the amount by which mosquito density should be reduced to minimize transmission and the spatio-temporal allotment of resources needed to reduce mosquito density in a cost-effective manner. Recently, a novel technology, MI-Dengue, was implemented city-wide in several Brazilian cities to provide real-time mosquito surveillance data for spatial prioritization of vector control resources. We sought to understand the role of city-wide mosquito density data in predicting disease incidence in order to provide guidance for prioritization of vector control work. METHODS: We used hierarchical Bayesian regression modeling to examine the role of city-wide vector surveillance data in predicting human cases of dengue fever in space and time. We used four years of weekly surveillance data from Vitoria city, Brazil, to identify the best model structure. We tested effects of vector density, lagged case data and spatial connectivity. We investigated the generality of the best model using an additional year of data from Vitoria and two years of data from other Brazilian cities: Governador Valadares and Sete Lagoas. RESULTS: We found that city-wide, neighborhood-level averages of household vector density were a poor predictor of dengue-fever cases in the absence of accounting for interactions with human cases. Effects of city-wide spatial patterns were stronger than within-neighborhood or nearest-neighborhood effects. Readily available proxies of spatial relationships between human cases, such as economic status, population density or between-neighborhood roadway distance, did not explain spatial patterns in cases better than unweighted global effects. CONCLUSIONS: For spatial prioritization of vector controls, city-wide spatial effects should be given more weight than within-neighborhood or nearest-neighborhood connections, in order to minimize city-wide cases of dengue fever. More research is needed to determine which data could best inform city-wide connectivity. Once these data become available, MI-Dengue may be even more effective if vector control is spatially prioritized by considering city-wide connectivity between cases together with information on the location of mosquito density and infected mosquitoes.
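
One ingredient of such a model, lagged cases and lagged vector density predicting this week's cases for a single neighborhood, can be sketched as a simple Poisson regression; the full study used hierarchical Bayesian models with spatial connectivity terms. Data below are simulated.

```python
# Hedged sketch: Poisson regression of weekly dengue cases on 1-week-lagged
# cases and vector density for one neighborhood. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
weeks = 208  # four years of weekly data

# Seasonal vector density and case counts driven by it
vector_density = 2 + np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 0.2, weeks)
cases = rng.poisson(np.exp(0.5 + 0.4 * vector_density))

# Design matrix: intercept plus 1-week lags of cases and vector density
y = cases[1:]
X = sm.add_constant(np.column_stack([cases[:-1], vector_density[:-1]]))

model = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(model.params.round(3))  # [intercept, lagged cases, lagged density]
```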


Subjects
Dengue/epidemiology, Dengue/prevention & control, Disease Transmission, Infectious/prevention & control, Epidemiological Monitoring, Health Care Rationing/methods, Mosquito Control/methods, Animals, Brazil/epidemiology, Cities/epidemiology, Dengue/transmission, Humans, Models, Statistical, Spatio-Temporal Analysis
20.
PLoS One ; 9(3): e91724, 2014.
Article in English | MEDLINE | ID: mdl-24670977

ABSTRACT

Globalization has increased the potential for the introduction and spread of novel pathogens over large spatial scales, necessitating continental-scale disease models to guide emergency preparedness. Livestock disease spread models, such as those for the 2001 foot-and-mouth disease (FMD) epidemic in the United Kingdom, represent some of the best case studies of large-scale disease spread. However, generalization of these models to explore disease outcomes in other systems, such as the US cattle industry, has been hampered by differences in system size and complexity and the absence of suitable livestock movement data. Here, a unique database of US cattle shipments allows estimation of synthetic movement networks that inform a near-continental scale disease model of a potential FMD-like (i.e., rapidly spreading) epidemic in US cattle. The largest epidemics may affect over one-third of the US and 120,000 cattle premises, but cattle movement restrictions from infected counties, as opposed to national movement moratoriums, are found to effectively contain outbreaks. Slow detection or weak compliance may necessitate more severe state-level bans for similar control. Such results highlight the role of large-scale disease models in emergency preparedness, particularly for systems lacking comprehensive movement and outbreak data, and the need to rapidly implement multi-scale contingency plans during a potential US outbreak.
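
The role of detection speed and movement restriction can be illustrated with a toy epidemic on a network: infected nodes ship (and so transmit) until detected, after which their outbound movement stops. The network, rates, and detection probabilities below are fabricated, not the US shipment data or the study's model.

```python
# Toy sketch: stochastic epidemic on a network where detection triggers a
# movement restriction on the infected node. Everything here is fabricated.
import networkx as nx
import numpy as np

rng = np.random.default_rng(2)
G = nx.barabasi_albert_graph(300, 3, seed=2)  # heavy-tailed stand-in network

def outbreak_size(detect_prob, beta=0.3, steps=60):
    # S = susceptible, I = infectious (undetected), Q = detected/restricted
    status = {n: "S" for n in G}
    status[0] = "I"
    for _ in range(steps):
        new_inf = []
        for n, s in status.items():
            if s != "I":
                continue  # only undetected infected counties still ship
            for nbr in G.neighbors(n):
                if status[nbr] == "S" and rng.random() < beta:
                    new_inf.append(nbr)
        for n in new_inf:
            status[n] = "I"
        for n, s in list(status.items()):
            if s == "I" and rng.random() < detect_prob:
                status[n] = "Q"  # movement restriction on detection
    return sum(s != "S" for s in status.values())

# Faster detection (i.e., quicker movement restriction) contains the outbreak
for p in (0.1, 0.5):
    sizes = [outbreak_size(p) for _ in range(20)]
    print(f"per-step detection prob {p}: mean affected counties = {np.mean(sizes):.0f}")
```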


Subjects
Cattle Diseases/epidemiology, Disease Outbreaks/veterinary, Movement, Animals, Cattle, Cattle Diseases/prevention & control, Cattle Diseases/transmission, Disease Outbreaks/prevention & control, Geography, Models, Biological, Population Density, Principal Component Analysis, Risk Factors, United States/epidemiology