Results 1 - 20 of 31
1.
Epidemiol Infect; 147: e152, 2019 Jan.
Article in English | MEDLINE | ID: mdl-31063089

ABSTRACT

Clostridium difficile infections (CDIs) affect patients in hospitals and in the community, but the relative importance of transmission in each setting is unknown. We developed a mathematical model of C. difficile transmission in a hospital and surrounding community that included infants, adults and transmission from animal reservoirs. We assessed the role of these transmission routes in maintaining disease and evaluated the recommended classification system for hospital- and community-acquired CDIs. The reproduction number in the hospital was <1 for nearly all scenarios without transmission from animal reservoirs. However, the reproduction number for the human population as a whole was ≥1 (range: 1.0-1.34) when 3.5-26.0% of human exposures originated from animal reservoirs. Symptomatic adults accounted for <10% of transmission in the community. Under conservative assumptions, infants accounted for 17% of community transmission. An estimated 33-40% of community-acquired cases were reported, but 28-39% of these reported cases were misclassified as hospital-acquired by recommended definitions. Transmission could be plausibly sustained by asymptomatically colonised adults and infants in the community or exposure to animal reservoirs, but not by hospital transmission alone. Under-reporting of community-onset cases and systematic misclassification underplay the role of community transmission.
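
The sustaining role of reservoir exposure can be illustrated with a deliberately minimal, stdlib-only colonisation model. Everything here is invented (one well-mixed population, no hospital/community structure, toy rates); it is not the paper's model, only a sketch of the qualitative point that sub-critical person-to-person transmission dies out unless a constant reservoir exposure is added.

```python
# Toy colonisation model (invented parameters, not the paper's model):
# transmission with R = beta/recovery < 1 dies out on its own, but a
# constant per-capita exposure from an animal reservoir sustains it.

def simulate(beta, animal_rate, recovery=0.1, steps=2000):
    """Discrete-time fraction colonised in a single well-mixed population."""
    prev = 0.05  # initial colonised fraction
    for _ in range(steps):
        force = beta * prev + animal_rate      # person-to-person + reservoir
        prev += force * (1 - prev) - recovery * prev
    return prev

no_reservoir = simulate(beta=0.08, animal_rate=0.0)    # R = 0.8 < 1
with_reservoir = simulate(beta=0.08, animal_rate=0.01)
print(f"without reservoir: {no_reservoir:.4f}, with reservoir: {with_reservoir:.4f}")
```

With these invented rates, colonisation persists at a stable equilibrium only when the reservoir term is present, mirroring the qualitative conclusion that hospital transmission alone cannot sustain the pathogen.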


Subject(s)
Carrier State/epidemiology , Carrier State/veterinary , Clostridium Infections/transmission , Community-Acquired Infections/transmission , Disease Reservoirs , Disease Transmission, Infectious , Animals , Carrier State/microbiology , Clostridium Infections/epidemiology , Community-Acquired Infections/epidemiology , Humans , Infant , Models, Theoretical
2.
Epidemiol Infect; 143(9): 1816-25, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25366865

ABSTRACT

There were multiple waves of influenza-like illness in 1918, the last of which resulted in a highly lethal pandemic killing 50 million people. It is difficult to study the initial waves of influenza-like illness in early 1918 because few deaths resulted and few morbidity records exist. Using extant military mortality records, we constructed mortality maps based on location of burial in France and Belgium for the British Army, and on home town in Vermont and New York for the US Army. Differences between the early waves and the more lethal later wave in late 1918 were consistent with historical descriptions in France. The maps of Vermont and New York support the hypothesis that previous exposure may have conferred a degree of protection against subsequent infections; soldiers from rural areas, which were likely to have experienced less mixing than soldiers from urban areas, were at higher risk of mortality. Differences between combat and disease mortality in 1918 were consistent with limited influenza virus circulation during the early 1918 wave. We suggest that it is likely that more than one influenza virus was circulating in 1918, which might help explain the higher mortality rates in those unlikely to have been infected in early 1918.


Subject(s)
Influenza A Virus, H1N1 Subtype/physiology , Influenza, Human/history , Pandemics , France/epidemiology , History, 20th Century , Humans , Influenza A Virus, H1N1 Subtype/genetics , Influenza, Human/epidemiology , Influenza, Human/mortality , Military Personnel , New York/epidemiology , Retrospective Studies , Risk Factors , United Kingdom/epidemiology , Vermont/epidemiology , War
4.
Sci Rep; 13(1): 1444, 2023 Jan 25.
Article in English | MEDLINE | ID: mdl-36697451

ABSTRACT

The rate of soil-transmitted helminth (STH) infection is estimated to be around 20% in Indonesia. Health promotion and health education are cost-effective strategies to supplement STH prevention and control programs. Existing studies suggest that quantitative tools for knowledge, attitudes and practices (KAP) are important to monitor effective community-based STH interventions. However, evidence is limited regarding the applicability of such tools. This study aims to identify the socio-demographic predictors of STH-related knowledge and practices and to validate the quantitative tools for use in this population. A cross-sectional study design was conducted among residents of 16 villages in Central Java, Indonesia. Adult and child respondents were interviewed to assess general knowledge and practices in relation to STH. Two mixed-effects models identified the significant factors in predicting knowledge and practice scores. The model-predicted knowledge and practice scores were compared with the observed scores to validate the quantitative measurements developed in this study. Participants' socio-demographic variables were significant in predicting an individual's STH-related knowledge level and their hand washing and hygiene practices, taking into account household-level variability. Model validation results confirmed that the quantitative measurement tools were suitable for assessing STH-associated knowledge and behaviour. The questionnaire developed in this study can be used to support school- and community-based health education interventions to maximize the effect of STH prevention and control programs.
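
The validation step (comparing model-predicted with observed scores) can be sketched as follows. Everything here is invented: a single hypothetical predictor (years of schooling), synthetic scores, and ordinary least squares in place of the study's mixed-effects models, which also carried a household random effect.

```python
import random, statistics

random.seed(1)
# Invented data: knowledge score driven by years of schooling plus noise.
education = [random.randint(0, 12) for _ in range(200)]
observed = [2.0 + 0.5 * e + random.gauss(0, 1.0) for e in education]

# Ordinary least squares for score ~ education (no household random effect).
mean_e = statistics.fmean(education)
mean_o = statistics.fmean(observed)
slope = sum((e - mean_e) * (o - mean_o) for e, o in zip(education, observed)) \
        / sum((e - mean_e) ** 2 for e in education)
intercept = mean_o - slope * mean_e
predicted = [intercept + slope * e for e in education]

# Predicted-vs-observed agreement: RMSE and Pearson correlation.
rmse = statistics.fmean([(p - o) ** 2 for p, o in zip(predicted, observed)]) ** 0.5
r = slope * statistics.pstdev(education) / statistics.pstdev(observed)
print(f"slope={slope:.2f}, RMSE={rmse:.2f}, r={r:.2f}")
```

High predicted-observed agreement (low RMSE, high r) is the kind of evidence the study used to conclude the tool was suitable for population use.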


Subject(s)
Helminthiasis , Helminths , Child , Adult , Humans , Animals , Soil , Indonesia/epidemiology , Cross-Sectional Studies , Helminthiasis/epidemiology , Helminthiasis/prevention & control , Surveys and Questionnaires , Prevalence , Feces
5.
Environ Res; 110(6): 604-11, 2010 Aug.
Article in English | MEDLINE | ID: mdl-20519131

ABSTRACT

Hot and cold temperatures significantly increase mortality rates around the world, but which measure of temperature is the best predictor of mortality is not known. We used mortality data from 107 US cities for the years 1987-2000 and examined the association between temperature and mortality using Poisson regression and modelled a non-linear temperature effect and a non-linear lag structure. We examined mean, minimum and maximum temperature with and without humidity, and apparent temperature and the Humidex. The best measure was defined as that with the minimum cross-validated residual. We found large differences in the best temperature measure between age groups, seasons and cities, and there was no one temperature measure that was superior to the others. The strong correlation between different measures of temperature means that, on average, they have the same predictive ability. The best temperature measure for new studies can be chosen based on practical concerns, such as choosing the measure with the least amount of missing data.
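
The model-selection logic (pick the temperature measure with the minimum cross-validated residual) can be sketched with invented daily data. For brevity this uses a log-linear least-squares fit and leave-one-out cross-validation instead of the paper's Poisson regression with non-linear temperature and lag terms.

```python
import math, random

random.seed(2)
days = 365
# Invented daily series: deaths driven by mean temperature only.
mean_t = [15 + 10 * math.sin(2 * math.pi * d / 365) + random.gauss(0, 2)
          for d in range(days)]
max_t = [t + random.gauss(5, 3) for t in mean_t]        # correlated measure
noise_t = [random.gauss(15, 5) for _ in range(days)]    # uninformative measure
deaths = [math.exp(3.0 - 0.02 * t) * random.uniform(0.9, 1.1) for t in mean_t]
log_d = [math.log(y) for y in deaths]

def loo_cv_error(x, y):
    """Leave-one-out CV sum of squared errors for the fit y ~ a + b*x."""
    err = 0.0
    for i in range(len(x)):
        xs, ys = x[:i] + x[i + 1:], y[:i] + y[i + 1:]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        b = (sum((u - mx) * (v - my) for u, v in zip(xs, ys))
             / sum((u - mx) ** 2 for u in xs))
        a = my - b * mx
        err += (y[i] - (a + b * x[i])) ** 2
    return err

scores = {name: loo_cv_error(x, log_d)
          for name, x in [("mean", mean_t), ("max", max_t), ("noise", noise_t)]}
best = min(scores, key=scores.get)
print("best measure:", best)
```

In this synthetic setup the correct driver wins, but a strongly correlated measure scores nearly as well, which is the paper's point: highly correlated temperature measures have, on average, similar predictive ability.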


Subject(s)
Cold Temperature , Hot Temperature , Mortality , Aged , Aged, 80 and over , Forecasting , Humans , Middle Aged , Poisson Distribution , United States/epidemiology
6.
Ann Trop Med Parasitol; 104(4): 303-18, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20659391

ABSTRACT

In terms of their applicability to the field of tropical medicine, geographical information systems (GIS) have developed enormously in the last two decades. This article reviews some of the pertinent and representative applications of GIS, including the use of such systems and remote sensing for the mapping of Chagas disease and human helminthiases, the use of GIS in vaccine trials, and the global applications of GIS for health-information management, disease epidemiology, and pandemic planning. The future use of GIS as a decision-making tool and some barriers to the widespread implementation of such systems in developing settings are also discussed.


Subject(s)
Geographic Information Systems/organization & administration , Public Health Informatics/organization & administration , Satellite Communications/organization & administration , Tropical Medicine , Forecasting , Humans
7.
BMC Infect Dis; 9: 145, 2009 Sep 01.
Article in English | MEDLINE | ID: mdl-19719852

ABSTRACT

BACKGROUND: To allow direct comparison of bloodstream infection (BSI) rates between hospitals for performance measurement, observed rates need to be risk adjusted according to the types of patients cared for by the hospital. However, attribute data on all individual patients are often unavailable, and hospital-level risk adjustment needs to be done using indirect, hospital-level indicator variables of patient case mix. We aimed to identify medical services associated with high or low BSI rates, and to evaluate the services provided by the hospital as indicators that can be used for more objective hospital-level risk adjustment. METHODS: From February 2001 to December 2007, 1719 monthly BSI counts were available from 18 hospitals in Queensland, Australia. BSI outcomes were stratified into four groups: overall BSI (OBSI), Staphylococcus aureus BSI (STAPH), intravascular device-related S. aureus BSI (IVD-STAPH) and methicillin-resistant S. aureus BSI (MRSA). Twelve services were considered as candidate risk-adjustment variables. For OBSI, STAPH and IVD-STAPH, we developed generalized estimating equation Poisson regression models that accounted for autocorrelation in longitudinal counts. Due to a lack of autocorrelation, a standard logistic regression model was specified for MRSA. RESULTS: Four risk services were identified for OBSI: AIDS (IRR 2.14, 95% CI 1.20 to 3.82), infectious diseases (IRR 2.72, 95% CI 1.97 to 3.76), oncology (IRR 1.60, 95% CI 1.29 to 1.98) and bone marrow transplants (IRR 1.52, 95% CI 1.14 to 2.03). Four protective services were also found. A similar but smaller group of risk and protective services was found for the other outcomes. Acceptable agreement between observed and fitted values was found for the OBSI and STAPH models but not for the IVD-STAPH and MRSA models. However, the IVD-STAPH and MRSA models successfully discriminated between hospitals with higher and lower BSI rates.
CONCLUSION: The high model goodness-of-fit and the higher frequency of OBSI and STAPH outcomes indicated that hospital-specific risk adjustment based on medical services provided would be useful for these outcomes in Queensland. The low frequency of IVD-STAPH and MRSA outcomes indicated that development of a hospital-level risk score was a more valid method of risk adjustment for these outcomes.
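
The incidence-rate ratios quoted above can be illustrated with a plain two-group IRR and a Wald confidence interval on the log scale. Counts and person-time below are invented; the study's IRRs came from GEE Poisson models that also adjusted for autocorrelation and other services.

```python
import math

def irr_ci(cases_exposed, time_exposed, cases_ref, time_ref, z=1.96):
    """Incidence rate ratio with a Wald 95% CI on the log scale."""
    irr = (cases_exposed / time_exposed) / (cases_ref / time_ref)
    se = math.sqrt(1 / cases_exposed + 1 / cases_ref)  # SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Invented counts: BSIs per 1000 occupied-bed-months, one service vs the rest.
irr, lo, hi = irr_ci(cases_exposed=48, time_exposed=300.0,
                     cases_ref=120, time_ref=1200.0)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI excluding 1, as here, is what identifies a service as a "risk service" in the sense used above.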


Subject(s)
Cross Infection/epidemiology , Hospitals, Public/statistics & numerical data , Outcome Assessment, Health Care , Sepsis/epidemiology , Cohort Studies , Humans , Models, Theoretical , Queensland/epidemiology , Regression Analysis , Retrospective Studies , Risk Adjustment
8.
Parasitology; 136(13): 1719-30, 2009 Nov.
Article in English | MEDLINE | ID: mdl-19631008

ABSTRACT

Schistosomiasis remains one of the most prevalent parasitic diseases in developing countries. After malaria, schistosomiasis is the most important tropical disease in terms of human morbidity with significant economic and public health consequences. Although schistosomiasis has recently attracted increased focus and funding for control, it has been estimated that less than 20% of the funding needed to control the disease in Africa is currently available. In this article the following issues are discussed: the rationale, development and objectives of the Schistosomiasis Control Initiative (SCI)-supported programmes; the management approaches followed to achieve implementation by each country; mapping, monitoring and evaluation activities with quantifiable impact of control programmes; monitoring for any potential drug resistance; and finally exit strategies within each country. The results have demonstrated that morbidity due to schistosomiasis has been reduced by the control programmes. While challenges remain, the case for the control of schistosomiasis has been strengthened by research by SCI teams and the principle that a national programme using 'preventive chemotherapy' can be successfully implemented in sub-Saharan Africa, whenever the resources are available. SCI and partners are now actively striving to raise further funds to expand the coverage of integrated control of neglected tropical diseases (NTDs) in sub-Saharan Africa.


Subject(s)
Communicable Disease Control/organization & administration , National Health Programs/organization & administration , Schistosomiasis/epidemiology , Schistosomiasis/prevention & control , Adolescent , Africa South of the Sahara/epidemiology , Child , Communicable Disease Control/methods , Health Education , Humans , International Cooperation , National Health Programs/economics , Public Health/methods , Time Factors
9.
J Hosp Infect; 102(2): 157-164, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30880267

ABSTRACT

BACKGROUND: Clostridium difficile infection (CDI) is the leading cause of antibiotic-associated diarrhoea with peak incidence in late winter or early autumn. Although CDI is commonly associated with hospitals, community transmission is important. AIM: To explore potential drivers of CDI seasonality and the effect of community-based interventions to reduce transmission. METHODS: A mechanistic compartmental model of C. difficile transmission in a hospital and surrounding community was used to determine the effect of reducing transmission or antibiotic prescriptions in these settings. The model was extended to allow for seasonal antibiotic prescriptions and seasonal transmission. FINDINGS: Modelling antibiotic seasonality reproduced the seasonality of CDI, including approximate magnitude (13.9-15.1% above annual mean) and timing of peaks (0.7-1.0 months after peak antibiotics). Halving seasonal excess prescriptions reduced the incidence of CDI by 6-18%. Seasonal transmission produced larger seasonal peaks in the prevalence of community colonization (14.8-22.1% above mean) than seasonal antibiotic prescriptions (0.2-1.7% above mean). Reducing transmission from symptomatic or hospitalized patients had little effect on community-acquired CDI, but reducing transmission in the community by ≥7% or transmission from infants by ≥30% eliminated the pathogen. Reducing antibiotic prescription rates led to approximately proportional reductions in infections, but limited reductions in the prevalence of colonization. CONCLUSION: Seasonal variation in antibiotic prescription rates can account for the observed magnitude and timing of C. difficile seasonality. Even complete prevention of transmission from hospitalized patients or symptomatic patients cannot eliminate the pathogen, but interventions to reduce transmission from community residents or infants could have a large impact on both hospital- and community-acquired infections.
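
The timing result (infection peaks lag prescription peaks) can be reproduced qualitatively with a toy seasonally forced colonisation model. All parameters are invented, and the lag this toy produces (about two months) is longer than the 0.7-1.0 months reported above; it only illustrates why a forced system peaks after its driver.

```python
import math

beta, recovery = 0.05, 0.2   # invented monthly rates
months = 240                 # 20 simulated years, monthly steps
prev, series = 0.05, []
for m in range(months):
    rx = 1.0 + 0.3 * math.cos(2 * math.pi * m / 12)  # prescriptions peak at m % 12 == 0
    prev += beta * rx * (1 - prev) - recovery * prev
    series.append(prev)

# In the last simulated year, find the month (0-11) of peak prevalence;
# since months is a multiple of 12, index k corresponds to calendar month k.
lag = max(range(12), key=lambda k: series[months - 12 + k])
print("peak colonisation lags peak prescriptions by", lag, "months")
```

The lag shrinks as the recovery rate grows (faster relaxation tracks the driver more closely), which is one way seasonal forcing and system dynamics jointly set peak timing.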


Subject(s)
Anti-Bacterial Agents/therapeutic use , Clostridium Infections/prevention & control , Clostridium Infections/transmission , Disease Transmission, Infectious/prevention & control , Drug Utilization , Infection Control/methods , Models, Theoretical , Adult , Aged , Humans , Infant , Prescriptions/statistics & numerical data , Prevalence , Seasons
10.
Int J Parasitol; 38(3-4): 401-15, 2008 Mar.
Article in English | MEDLINE | ID: mdl-17920605

ABSTRACT

Spatial modelling was applied to self-reported schistosomiasis data from over 2.5 million school students from 12,399 schools in all regions of mainland Tanzania. The aims were to derive statistically robust prevalence estimates in small geographical units (wards), to identify spatial clusters of high and low prevalence and to quantify uncertainty surrounding prevalence estimates. The objective was to permit informed decision-making for targeting of resources by the Tanzanian national schistosomiasis control programme. Bayesian logistic regression models were constructed to investigate the risk of schistosomiasis in each ward, based on the prevalence of self-reported schistosomiasis and blood in urine. Models contained covariates representing climatic and demographic effects and random effects for spatial clustering. Degree of urbanisation, median elevation of the ward and median normalised difference vegetation index (NDVI) were significantly and negatively associated with schistosomiasis prevalence. Most regions contained wards that had >95% certainty of schistosomiasis prevalence being >10%, the selected threshold for bi-annual mass chemotherapy of school-age children. Wards with >95% certainty of schistosomiasis prevalence being >30%, the selected threshold for annual mass chemotherapy of school-age children, were clustered in north-western, south-western and south-eastern regions. Large sample sizes in most wards meant raw prevalence estimates were robust. However, when uncertainties were investigated, intervention status was equivocal in 6.7-13.0% of wards depending on the criterion used. The resulting maps are being used to plan the distribution of praziquantel to participating districts; they will be applied to prioritising control in those wards where prevalence was unequivocally above thresholds for intervention and might direct decision-makers to obtain more information in wards where intervention status was uncertain.
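
The decision rule (treat a ward when the posterior probability that prevalence exceeds a threshold is above 95%) can be sketched with a conjugate Beta-Binomial model for a single ward. Ward counts below are invented, and the paper's model was a spatial Bayesian logistic regression with covariates, not this independent-ward shortcut.

```python
import random

random.seed(3)

def prob_prevalence_exceeds(cases, n, threshold, draws=20000):
    """Monte Carlo P(prevalence > threshold) under a flat Beta(1, 1) prior,
    i.e. draws from the posterior Beta(cases + 1, n - cases + 1)."""
    a, b = cases + 1, n - cases + 1
    return sum(random.betavariate(a, b) > threshold for _ in range(draws)) / draws

results = {}
for cases, n in [(45, 300), (20, 300), (120, 300)]:   # invented ward surveys
    results[(cases, n)] = (prob_prevalence_exceeds(cases, n, 0.10),
                           prob_prevalence_exceeds(cases, n, 0.30))
    p10, p30 = results[(cases, n)]
    print(f"{cases}/{n}: P(prev>10%)={p10:.3f}, P(prev>30%)={p30:.3f}")
```

Wards whose exceedance probability sits between the two certainty bands are the "equivocal" wards the abstract flags for further data collection.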


Subject(s)
Bayes Theorem , Schistosoma mansoni , Schistosomiasis haematobia/epidemiology , Schistosomiasis mansoni/epidemiology , Animals , Child , Humans , Prevalence , Schistosomiasis haematobia/prevention & control , Schistosomiasis mansoni/prevention & control , Surveys and Questionnaires , Tanzania/epidemiology
11.
J Hosp Infect; 99(4): 453-460, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29258917

ABSTRACT

BACKGROUND: Clostridium difficile infections occur frequently among hospitalized patients, with some infections acquired in hospital and others in the community. International guidelines classify cases as hospital-acquired if symptom onset occurs more than two days after admission. This classification informs surveillance and infection control, but has not been verified by empirical or modelling studies. AIM: To assess current classification of C. difficile acquisition using a simulation model as a reference standard. METHODS: C. difficile transmission was simulated in a range of hospital scenarios. The sensitivity, specificity and precision of classifications that use cut-offs ranging from 0.25 h to 40 days were calculated. The optimal cut-off that correctly estimated the proportion of cases that were hospital acquired and the balanced cut-off that had equal sensitivity and specificity were identified. FINDINGS: The recommended two-day cut-off overestimated the incidence of hospital-acquired cases in all scenarios and by >100% in the base scenario. The two-day cut-off had good sensitivity (96%) but poor specificity (48%) and precision (52%) to identify cases acquired during the current hospitalization. A five-day cut-off was balanced, and a six-day cut-off was optimal in the base scenario. The optimal and balanced cut-offs were more than two days for nearly all scenarios considered (ranges: four to nine days and two to eight days, respectively). CONCLUSION: Current guidelines for classifying C. difficile infections overestimate the proportion of cases acquired in hospital in all model scenarios. To reduce misclassification bias, an infection should be classified as being acquired prior to admission if symptoms begin within five days of admission.
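
The cut-off evaluation can be sketched by simulating onset times and scanning candidate cut-offs for sensitivity, specificity, and the balanced point. The onset-time distributions below are invented exponentials, so the numeric cut-offs differ from the model-based ones above; only the evaluation loop is the point.

```python
import random

random.seed(4)
# Invented days from admission to symptom onset for each true class:
hospital_acquired = [random.expovariate(1 / 8.0) for _ in range(5000)]   # mean 8 days
community_acquired = [random.expovariate(1 / 2.0) for _ in range(5000)]  # mean 2 days

def sens_spec(cutoff):
    """Classify as hospital-acquired when onset is > cutoff days after admission."""
    sens = sum(t > cutoff for t in hospital_acquired) / len(hospital_acquired)
    spec = sum(t <= cutoff for t in community_acquired) / len(community_acquired)
    return sens, spec

balanced = min(range(1, 21), key=lambda c: abs(sens_spec(c)[0] - sens_spec(c)[1]))
s2, p2 = sens_spec(2)
print(f"2-day cut-off: sens={s2:.2f}, spec={p2:.2f}; balanced cut-off = {balanced} days")
```

Even in this toy version, a short cut-off shows the same pattern the model found: good sensitivity but poor specificity, with the balanced cut-off falling later than two days.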


Subject(s)
Clostridioides difficile/isolation & purification , Clostridium Infections/epidemiology , Community-Acquired Infections/diagnosis , Community-Acquired Infections/epidemiology , Cross Infection/diagnosis , Cross Infection/epidemiology , Epidemiologic Methods , Clostridioides difficile/classification , Clostridioides difficile/genetics , Clostridium Infections/microbiology , Community-Acquired Infections/microbiology , Cross Infection/microbiology , Humans , Incidence , Models, Theoretical , Sensitivity and Specificity
12.
J Hosp Infect; 66(2): 148-55, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17493705

ABSTRACT

This study evaluated the US National Nosocomial Infection Surveillance (NNIS) risk index (RI) in Australia for different surgical site infection (SSI) outcomes (overall, in-hospital, post-discharge, deep-incisional and superficial-incisional infection) and investigated local risk factors for SSI. A SSI surveillance dataset containing 43 611 records for 13 common surgical procedures, conducted in 23 hospitals between February 2001 and June 2005, was used for the analysis. The NNIS RI was evaluated against the observed SSI data using diagnostic test evaluation statistics (sensitivity, specificity, positive predictive value, negative predictive value). Sensitivity was low for all SSI outcomes (ranging from 0.47 to 0.69 and from 0.09 to 0.20 using RI thresholds of 1 and 2 respectively), while specificity varied depending on the RI threshold (0.55 and 0.93 with thresholds of 1 and 2 respectively). Mixed-effects logistic regression models were developed for the five SSI outcomes using a range of available potential risk factors. American Society of Anaesthesiologists (ASA) physical status score >2, duration of surgery, absence of antibiotic prophylaxis and type of surgical procedure were significant risk factors for one or more SSI outcomes, and risk factors varied for different SSI outcomes. The discriminatory ability of the NNIS RI was insufficient for its use as an accurate risk stratification tool for SSI surveillance in Australia and its sensitivity was too low for it to be appropriately used as a prognostic indicator.
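
The evaluation statistics reported above come from a 2x2 table of predicted-high-risk against observed infection. A sketch with invented cell counts shows why sensitivity can look moderate while precision (PPV) collapses when infections are rare:

```python
def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 classification table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Invented counts: a risk-index threshold flags many patients, few infected.
print(diagnostics(tp=60, fp=18000, fn=40, tn=25000))
```

With rare outcomes, NPV is almost automatically high while PPV is tiny, which is why a moderately sensitive index can still be useless as a prognostic indicator.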


Subject(s)
Cross Infection/epidemiology , Surgical Wound Infection/epidemiology , Adult , Aged , Aged, 80 and over , Australia/epidemiology , Female , Humans , Male , Middle Aged , Risk Assessment/methods , Risk Factors
13.
Prev Vet Med; 140: 78-86, 2017 May 01.
Article in English | MEDLINE | ID: mdl-28460753

ABSTRACT

Results obtained from a nationwide longitudinal study were extended to estimate the population-level effects of selected risk factors on the incidence of bovine respiratory disease (BRD) during the first 50 days at risk in medium-sized to large Australian feedlots. Population attributable fractions (PAF) and population attributable risks (PAR) were used to rank selected risk factors in order of importance from the perspective of the Australian feedlot industry within two mutually exclusive categories: 'intervention' risk factors had practical strategies that feedlot managers could implement to avoid exposure of cattle to adverse levels of the risk factor and a precise estimate of the population-level effect, while 'others' did not. An alternative method was also used to quantify the expected effects of simultaneously preventing exposure to multiple management-related factors whilst not changing exposure to factors that were more difficult to modify. The most important 'intervention' risk factors were shared pen water (PAF: 0.70, 95% credible interval: 0.45-0.83), breed (PAF: 0.67, 95% credible interval: 0.54-0.77), the animal's prior lifetime history of mixing with cattle from other herds (PAF: 0.53, 95% credible interval: 0.30-0.69), timing of the animal's move to the vicinity of the feedlot (PAF: 0.45, 95% credible interval: 0.17-0.68), the presence of Bovine viral diarrhoea virus 1 (BVDV-1) in the animal's cohort (PAF: 0.30, 95% credible interval: 0.04-0.50), the number of study animals in the animal's group 13 days before induction (PAF: 0.30, 95% credible interval: 0.10-0.44) and induction weight (PAF: 0.16, 95% credible interval: 0.09-0.23). Other important risk factors identified and prioritised for further research were feedlot region, season of induction and cohort formation patterns.
An estimated 82% of BRD incidence was attributable to management-related risk factors, whereby the lowest risk category of a composite management-related variable comprised animals in the lowest risk category of at least four of the five component variables (shared pen water, mixing, move timing, BVDV-1 in the cohort and the number of animals in the group 13 days before induction). This indicated that widespread adoption of appropriate interventions, including ensuring pen water is not shared between pens, optimising animal mixing before induction, the timing of the animal's move to the vicinity of the feedlot and group size prior to placing animals in feedlot pens, and avoiding BVDV-1 in cohorts, could markedly reduce the incidence of BRD in medium-sized to large Australian feedlots.
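
For a single binary exposure, a population attributable fraction of the kind ranked above can be computed with Levin's formula, PAF = p(RR - 1) / (1 + p(RR - 1)). Exposure prevalences and risk ratios below are invented; the study derived its PAFs from multivariable Bayesian models rather than this one-factor formula.

```python
def levin_paf(prevalence, risk_ratio):
    """Levin's population attributable fraction for one binary exposure."""
    excess = prevalence * (risk_ratio - 1)
    return excess / (1 + excess)

# (exposure prevalence, risk ratio) per factor - all values invented
factors = {
    "shared pen water": (0.8, 4.0),
    "prior mixing":     (0.5, 2.5),
    "BVDV-1 in cohort": (0.3, 1.8),
}
for name, (p, rr) in sorted(factors.items(), key=lambda kv: -levin_paf(*kv[1])):
    print(f"{name}: PAF = {levin_paf(p, rr):.2f}")
```

Note that a common, strong exposure can dominate the ranking even when a rarer exposure has a comparable risk ratio, which is why PAFs rather than risk ratios were used to prioritise interventions.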


Subject(s)
Animal Husbandry/methods , Bovine Respiratory Disease Complex/epidemiology , Bovine Respiratory Disease Complex/etiology , Animals , Australia/epidemiology , Cattle , Female , Incidence , Logistic Models , Longitudinal Studies , Male , Risk Factors
14.
J Hosp Infect; 97(2): 115-121, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28576454

ABSTRACT

BACKGROUND: Hospital volume is known to have a direct impact on the outcomes of major surgical procedures. However, it is unclear if the evidence applies specifically to surgical site infections. AIMS: To determine if there are procedure-specific hospital outliers [with higher surgical site infection rates (SSIRs)] for four major surgical procedures, and to examine if hospital volume is associated with SSIRs in the context of outlier performance in New South Wales (NSW), Australia. METHODS: Adults who underwent one of four surgical procedures (colorectal, joint replacement, spinal and cardiac procedures) at a NSW healthcare facility between 2002 and 2013 were included. The hospital volume for each of the four surgical procedures was categorized into tertiles (low, medium and high). Multi-variable logistic regression models were built to estimate the expected SSIR for each procedure. The expected SSIRs were used to compute indirect standardized SSIRs which were then plotted in funnel plots to identify hospital outliers. FINDINGS: One hospital was identified to be an overall outlier (higher SSIRs for three of the four procedures performed in its facilities), whereas two hospitals were outliers for one specific procedure throughout the entire study period. Low-volume facilities performed the best for colorectal surgery and worst for joint replacement and cardiac surgery. One high-volume facility was an outlier for spinal surgery. CONCLUSIONS: Surgical site infections seem to be mainly a procedure-specific, as opposed to a hospital-specific, phenomenon in NSW. The association between hospital volume and SSIRs differs for different surgical procedures.
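
Outlier detection of this kind compares each hospital's indirectly standardised ratio (observed/expected) against funnel-plot control limits. A sketch with invented counts and an approximate Poisson limit (the study's expected counts came from multivariable logistic regression):

```python
import math

# name: (observed SSIs, expected SSIs from the risk model); counts invented
hospitals = {"A": (12, 10.0), "B": (30, 14.0), "C": (7, 9.5)}

flags = {}
for name, (observed, expected) in hospitals.items():
    sir = observed / expected                    # standardised infection ratio
    upper = 1 + 1.96 * math.sqrt(1 / expected)   # approximate 95% funnel limit
    flags[name] = "OUTLIER" if sir > upper else "in control"
    print(f"{name}: SIR={sir:.2f}, upper limit={upper:.2f} -> {flags[name]}")
```

Because the limit narrows as expected counts grow, the same SIR can be unremarkable at a low-volume hospital and an outlier at a high-volume one; that is the funnel in a funnel plot.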


Subject(s)
Arthroplasty, Replacement/statistics & numerical data , Cardiac Surgical Procedures/statistics & numerical data , Colorectal Surgery/statistics & numerical data , Hospitals/statistics & numerical data , Spine/surgery , Surgical Wound Infection/epidemiology , Aged , Cross Infection/epidemiology , Databases, Factual/statistics & numerical data , Female , Health Services Research , Humans , Logistic Models , Male , Middle Aged , New South Wales/epidemiology
15.
Clin Microbiol Infect; 23(1): 48.e1-48.e7, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27615716

ABSTRACT

OBJECTIVES: To investigate the prevalence and risk factors for asymptomatic toxigenic (TCD) and nontoxigenic Clostridium difficile (NTCD) colonization in a broad cross section of the general hospital population over a 3-year period. METHODS: Patients without diarrhoea admitted to two Australian tertiary hospitals were randomly selected through six repeated cross-sectional surveys conducted between 2012 and 2014. Stool specimens were cultured under anaerobic conditions, and C. difficile isolates were tested for the presence of toxin genes and ribotyped. Patients were then grouped into noncolonized, TCD colonized or NTCD colonized for identifying risk factors using multinomial logistic regression models. RESULTS: A total of 1380 asymptomatic patients were enrolled; 76 patients (5.5%) were TCD colonized and 28 (2.0%) were NTCD colonized. There was a decreasing annual trend in TCD colonization, and asymptomatic colonization was more prevalent during the summer than winter months. TCD colonization was associated with gastro-oesophageal reflux disease (relative risk ratio (RRR) = 2.20; 95% confidence interval (CI) 1.17-4.14), higher number of admissions in the previous year (RRR = 1.24; 95% CI 1.10-1.39) and antimicrobial exposure during the current admission (RRR = 2.78; 95% CI 1.23-6.28). NTCD colonization was associated with chronic obstructive pulmonary disease (RRR = 3.88; 95% CI 1.66-9.07) and chronic kidney failure (RRR = 5.78; 95% CI 2.29-14.59). Forty-eight different ribotypes were identified, with 014/020 (n = 23), 018 (n = 10) and 056 (n = 6) being the most commonly isolated. CONCLUSIONS: Risk factors differ between patients with asymptomatic colonization by toxigenic and nontoxigenic strains. Given that morbidity is largely driven by toxigenic strains, this novel finding has important implications for disease control and prevention.
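
The relative risk ratios (RRRs) above are multinomial-logit coefficients comparing each colonisation category against the noncolonised baseline. An unadjusted version can be read straight off a contingency table (counts below are invented; the study's RRRs were covariate-adjusted):

```python
# exposure -> (noncolonised, TCD colonised, NTCD colonised); counts invented
counts = {
    "antibiotics": (400, 40, 10),
    "no antibiotics": (800, 30, 15),
}

def rrr(outcome_idx):
    """Unadjusted relative risk ratio of an outcome vs the noncolonised baseline."""
    e, u = counts["antibiotics"], counts["no antibiotics"]
    return (e[outcome_idx] / e[0]) / (u[outcome_idx] / u[0])

print(f"RRR, TCD vs noncolonised: {rrr(1):.2f}")
print(f"RRR, NTCD vs noncolonised: {rrr(2):.2f}")
```

Separate RRRs per outcome category are what let the study show that toxigenic and nontoxigenic colonisation have different risk-factor profiles.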


Subject(s)
Carrier State , Clostridioides difficile/isolation & purification , Hospitals , Adult , Aged , Aged, 80 and over , Australia/epidemiology , Clostridium Infections/epidemiology , Clostridium Infections/microbiology , Cross-Sectional Studies , Female , Humans , Male , Middle Aged , Risk Factors , Seasons
16.
Prev Vet Med; 128: 23-32, 2016 Jun 01.
Article in English | MEDLINE | ID: mdl-27237387

ABSTRACT

Bovine respiratory disease (BRD) is the major cause of clinical disease and death in feedlot cattle. A prospective longitudinal study was conducted in a population of Australian feedlot cattle to assess associations between factors related to feedlot management and risk of BRD. In total, 35,131 animals in 170 pens (cohorts) inducted into 14 feedlots were included in statistical analyses. Causal diagrams were used to inform model building to allow separate estimation of total and direct effects. Multilevel mixed effects logistic regression models were fitted within the Bayesian framework. The placement of pen water troughs such that they could be accessed by animals in adjoining pens was associated with markedly increased risk of BRD (OR 4.3, 95% credible interval: 1.4-10.3). Adding animals to pens over multiple days was associated with increased risk of BRD across all animals in those pens compared to placing all animals in the pen on a single day (total effect: OR 1.9, 95% credible interval: 1.2-2.8). The much attenuated direct effect indicated that this was primarily mediated via factors on indirect pathways so it may be possible to ameliorate the adverse effects of adding animals to pens over multiple days by altering exposure to these intervening factors (e.g. mixing history). In pens in which animals were added to the pen over multiple days, animals added ≥7 days (OR: 0.7, credible interval: 0.5-0.9) or 1-6 days (OR: 0.8, credible interval: 0.7-1.0) before the last animal was added were at modestly reduced risk of BRD compared to the animals that were added to the pen on the latest day. Further research is required to disentangle effects of cohort formation patterns at animal-level and higher levels on animal-level risk of BRD. Vaccination against Bovine herpesvirus 1 at feedlot entry was investigated but results were inconclusive and further research is required to evaluate vaccine efficacy. 
We conclude that there are practical interventions available to feedlot managers to reduce the risk of cattle developing BRD at the feedlot. We recommend placement of water troughs in feedlot pens so that they cannot be accessed by animals in adjoining pens. Further research is required to identify practical and cost-effective management strategies that allow longer adaption times for cattle identified prior to induction as being at higher risk of developing BRD.


Subject(s)
Animal Husbandry/methods , Bovine Respiratory Disease Complex/epidemiology , Animals , Australia/epidemiology , Bayes Theorem , Bovine Respiratory Disease Complex/virology , Cattle , Logistic Models , Longitudinal Studies , Prospective Studies , Risk Factors
17.
Sci Rep ; 6: 30299, 2016 07 25.
Article in English | MEDLINE | ID: mdl-27452598

ABSTRACT

To prevent diseases associated with inadequate sanitation and poor hygiene, people needing latrines and behavioural interventions must be identified. We compared two indicators that could be used to identify those people. Indicator 1 of household latrine coverage was a simple Yes/No response to the question "Does your household have a latrine?" Indicator 2 was more comprehensive, combining questions about defecation behaviour with observations of latrine conditions. Using a standardized procedure and questionnaire, trained research assistants collected data from 6,599 residents of 16 rural villages in Indonesia. Indicator 1 identified 30.3% as not having a household latrine, while Indicator 2 identified 56.0% as using unimproved sanitation. Indicator 2 thus identified an additional 1,710 people who were missed by Indicator 1. Those 1,710 people were of lower socioeconomic status (p < 0.001), and a smaller percentage practiced appropriate hand-washing (p < 0.02). These results show how a good indicator of need for sanitation and hygiene interventions can combine evidence of both access and use, drawn from self-reports and objective observation. Such an indicator can inform decisions about sanitation-related interventions and about scaling deworming programmes up or down. Further, a comprehensive and locally relevant indicator allows improved targeting of those most in need of a hygiene-behaviour intervention.
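The contrast between the two indicators can be sketched in code. The field names and records below are hypothetical, not the study's actual instrument; the point is that the comprehensive indicator flags people the simple self-report misses:

```python
def indicator1(record):
    """Simple self-report: does the household have a latrine?"""
    return record["has_latrine"]

def indicator2(record):
    """Comprehensive: counted as improved sanitation only if the household
    has a latrine, members actually use it, and the observed latrine meets
    basic 'improved' criteria on inspection."""
    return (record["has_latrine"]
            and record["members_use_latrine"]
            and record["latrine_improved_on_inspection"])

residents = [
    {"has_latrine": True,  "members_use_latrine": True,  "latrine_improved_on_inspection": True},
    {"has_latrine": True,  "members_use_latrine": False, "latrine_improved_on_inspection": True},
    {"has_latrine": False, "members_use_latrine": False, "latrine_improved_on_inspection": False},
]

need1 = [r for r in residents if not indicator1(r)]   # flagged by simple indicator
need2 = [r for r in residents if not indicator2(r)]   # flagged by comprehensive indicator
missed = [r for r in need2 if indicator1(r)]          # missed by the simple indicator
print(len(need1), len(need2), len(missed))  # → 1 2 1
```

The middle record is the interesting case: the household owns a latrine (so Indicator 1 passes it) but does not use it, so Indicator 2 correctly flags it as needing intervention.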


Subject(s)
Hand Disinfection , Hygiene , Sanitation , Adolescent , Adult , Aged , Aged, 80 and over , Clinical Trials as Topic , Environment , Family Characteristics , Female , Humans , Indonesia/epidemiology , Male , Middle Aged , Rural Population , Social Class , Surveys and Questionnaires , Toilet Facilities , Young Adult
18.
Prev Vet Med ; 125: 66-74, 2016 Mar 01.
Article in English | MEDLINE | ID: mdl-26830058

ABSTRACT

A prospective longitudinal study was conducted in a population of Australian feedlot cattle to assess associations between animal characteristics, environmental risk factors and risk of bovine respiratory disease (BRD). Animal characteristics were recorded at induction, when animals were individually identified and enrolled into study cohorts (comprising animals in a feedlot pen). Environmental risk factors included the year and season of induction, source region and feedlot region, and summary variables describing weather during the first week of follow-up. In total, 35,131 animals inducted into 170 cohorts within 14 feedlots were included in statistical analyses. Causal diagrams were used to inform model building, and multilevel mixed effects logistic regression models were fitted within the Bayesian framework. Breed, induction weight and season of induction were significantly and strongly associated with risk of BRD. Compared to Angus cattle, Herefords were at markedly increased risk (OR: 2.0, 95% credible interval: 1.5-2.6) and tropically adapted breeds and their crosses were at markedly reduced risk (OR: 0.5, 95% credible interval: 0.3-0.7) of developing BRD. Risk of BRD declined with increased induction weight, with cattle in the heaviest weight category (≥480 kg) at moderately reduced risk compared to cattle weighing <400 kg at induction (OR: 0.6, 95% credible interval: 0.5-0.7). Animals inducted into feedlots during summer (OR: 2.4, 95% credible interval: 1.4-3.8) and autumn (OR: 2.1, 95% credible interval: 1.2-3.2) were at markedly increased risk compared to animals inducted during spring. Knowledge of these risk factors may be useful in predicting BRD risk for incoming groups of cattle in Australian feedlots. This would then provide the opportunity for feedlot managers to tailor management strategies for specific subsets of animals according to predicted BRD risk.
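The closing suggestion, predicting BRD risk for incoming cattle from known risk factors, could be sketched as simple odds-ratio arithmetic: convert a baseline risk to odds, scale by each factor's OR, and convert back. This is an illustrative approximation (the baseline risk and the assumption of independent effects on the odds scale are hypothetical), not the authors' model:

```python
def predicted_risk(baseline_risk, odds_ratios):
    """Scale baseline odds by each factor's odds ratio and convert back to a
    probability. A rough approximation that assumes the factors act
    independently on the odds scale."""
    odds = baseline_risk / (1.0 - baseline_risk)
    for or_ in odds_ratios:
        odds *= or_
    return odds / (1.0 + odds)

# Hypothetical animal: 10% baseline risk; Hereford (OR 2.0), inducted in
# summer (OR 2.4), heaviest induction-weight category (OR 0.6)
risk = predicted_risk(0.10, [2.0, 2.4, 0.6])
print(round(risk, 3))  # → 0.242
```

A feedlot manager could use such a score only to rank incoming groups; absolute risk would need calibration against local attack rates.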


Subject(s)
Bovine Respiratory Disease Complex/epidemiology , Cattle Diseases/epidemiology , Environment , Animals , Australia/epidemiology , Body Weight , Bovine Respiratory Disease Complex/etiology , Bovine Respiratory Disease Complex/genetics , Cattle , Cattle Diseases/etiology , Cattle Diseases/genetics , Longitudinal Studies , Prospective Studies , Risk Factors , Seasons
19.
Prev Vet Med ; 127: 37-43, 2016 May 01.
Article in English | MEDLINE | ID: mdl-27094138

ABSTRACT

Bovine respiratory disease (BRD) is the major cause of clinical disease and death in feedlot populations worldwide. A longitudinal study was conducted to assess associations between risk factors related to on-farm management prior to transport to the feedlot and risk of BRD in a population of feedlot beef cattle sourced from throughout the cattle producing regions of Australia. Exposure variables were derived from questionnaire data provided by farmers supplying the study cattle (N=10,721 animals), a subset of the population included in a nationwide prospective study investigating numerous putative risk factors for BRD. Causal diagrams were used to inform model building to allow estimation of effects of interest. Multilevel mixed effects logistic regression models were fitted within the Bayesian framework. Animals that were yard weaned were at reduced risk (OR: 0.7, 95% credible interval: 0.5-1.0) of BRD at the feedlot compared to animals immediately returned to pasture after weaning. Animals that had previously been fed grain (OR: 0.6, 95% credible interval: 0.3-1.1) were probably at reduced risk of BRD at the feedlot compared to animals not previously fed grain. Animals that received prior vaccinations against Bovine viral diarrhoea virus 1 (OR: 0.8, 95% credible interval: 0.5-1.1) or Mannheimia haemolytica (OR: 0.8, 95% credible interval: 0.6-1.0) were also probably at reduced risk compared to non-vaccinated animals. The results of this study confirm that on-farm management before feedlot entry can alter risk of BRD after beef cattle enter feedlots.


Subject(s)
Animal Husbandry/methods , Bovine Respiratory Disease Complex/epidemiology , Animals , Australia/epidemiology , Bovine Respiratory Disease Complex/etiology , Cattle , Logistic Models , Longitudinal Studies , Prospective Studies , Risk Factors
20.
Prev Vet Med ; 126: 159-69, 2016 Apr 01.
Article in English | MEDLINE | ID: mdl-26907209

ABSTRACT

Viruses play a key role in the complex aetiology of bovine respiratory disease (BRD). Bovine viral diarrhoea virus 1 (BVDV-1) is widespread in Australia and has been shown to contribute to BRD occurrence. As part of a prospective longitudinal study on BRD, effects of exposure to BVDV-1 on risk of BRD in Australian feedlot cattle were investigated. A total of 35,160 animals were enrolled at induction (when animals were identified and characteristics recorded), held in feedlot pens with other cattle (cohorts) and monitored for occurrence of BRD over the first 50 days following induction. Biological samples collected from all animals were tested to determine which animals were persistently infected (PI) with BVDV-1. Data obtained from the Australian National Livestock Identification System database were used to determine which groups of enrolled animals that had been together at the farm of origin, and at 28 days prior to induction, contained a PI animal, and hence to identify animals that had probably been exposed to a PI animal before induction. Multi-level Bayesian logistic regression models were fitted to estimate the effects of exposure to BVDV-1 on the risk of occurrence of BRD. Although only 85 study animals (0.24%) were identified as being PI with BVDV-1, BVDV-1 was detected on quantitative polymerase chain reaction in 59% of cohorts. The PI animals were at moderately increased risk of BRD (OR 1.9; 95% credible interval 1.0-3.2). Exposure to BVDV-1 in the cohort was also associated with a moderately increased risk of BRD (OR 1.7; 95% credible interval 1.1-2.5) regardless of whether or not a PI animal was identified within the cohort. Additional analyses indicated that a single quantitative real-time PCR test is useful for distinguishing PI animals from transiently infected animals.
The results of the study suggest that removal of PI animals and/or vaccination, both before feedlot entry, would reduce the impact of BVDV-1 on BRD risk in cattle in Australian feedlots. Economic assessment of these strategies under Australian conditions is required.
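The idea of distinguishing persistently from transiently infected animals with a single quantitative PCR result could be sketched as a threshold rule on the cycle-threshold (Ct) value: PI animals carry persistently high viral loads, which show up as strong signals (low Ct). The cut-off below is purely hypothetical, not a value from the study:

```python
def classify_bvdv(ct_value, ct_cutoff=30.0):
    """Classify a single BVDV-1 qPCR result by signal strength.

    PI animals typically give strong amplification (low Ct); transient
    infections typically give weaker (higher-Ct) signals. The cut-off of
    30 cycles is illustrative only and would need laboratory validation.
    Returns 'PI-suspect', 'transient-suspect', or 'negative'.
    """
    if ct_value is None:           # no amplification at all
        return "negative"
    if ct_value <= ct_cutoff:      # strong signal: high viral load
        return "PI-suspect"
    return "transient-suspect"     # weak signal: likely transient infection

print(classify_bvdv(22.5))  # strong signal → PI-suspect
print(classify_bvdv(35.1))  # weak signal → transient-suspect
print(classify_bvdv(None))  # no amplification → negative
```

In practice a 'PI-suspect' call would be confirmed by retesting after a few weeks, since a PI animal remains positive while a transient infection clears.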


Subject(s)
Bovine Viral Diarrhea-Mucosal Disease/epidemiology , Bovine Viral Diarrhea Virus Type 1 , Animal Feed/virology , Animal Husbandry , Animals , Antibodies, Viral/blood , Bovine Viral Diarrhea-Mucosal Disease/diagnosis , Bovine Viral Diarrhea-Mucosal Disease/prevention & control , Bovine Viral Diarrhea-Mucosal Disease/transmission , Cattle , Cohort Studies , Bovine Viral Diarrhea Virus Type 1/genetics , Bovine Viral Diarrhea Virus Type 1/isolation & purification , Prevalence , Real-Time Polymerase Chain Reaction/veterinary , Risk Factors , Viral Vaccines/administration & dosage