Results 1 - 20 of 229
1.
Emerg Infect Dis ; 30(7): 1447-1449, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38916636

ABSTRACT

We report the effect of a rodent control program on the incidence of zoonotic cutaneous leishmaniasis in an endemic region of Iran. A 1-year interruption in rodent control led to 2 years of increased incidence of zoonotic cutaneous leishmaniasis. Restarting rodent control led to a decline of zoonotic cutaneous leishmaniasis.


Subject(s)
Cutaneous Leishmaniasis; Zoonoses; Iran/epidemiology; Cutaneous Leishmaniasis/epidemiology; Cutaneous Leishmaniasis/prevention & control; Animals; Zoonoses/epidemiology; Zoonoses/prevention & control; Humans; Incidence; Rodent Control/methods; Rodentia/parasitology; Disease Reservoirs/parasitology; Disease Reservoirs/veterinary
2.
New Microbiol ; 47(2): 172-179, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39023527

ABSTRACT

Italy is recognized as having the highest hepatitis C virus (HCV) prevalence in Europe. The Tuscany region, where the prevalence of HCV infection is approximately 0.8%, implemented two programs for the control of chronic hepatitis C from 2018 to 2022. This retrospective study investigated the incidence of HCV in a population screened in southeastern Tuscany from 2013 to 2022. The study population included 246,137 patients from the provincial area of Arezzo and Grosseto, Tuscany, spanning January 2013 to October 2022. Among the subjects included in the study, 3,190 (1.29%) tested positive for anti-HCV antibodies. Of these, 2,119 patients (66.43%) also tested positive on HCV-RNA quantification and were enrolled for subsequent viral genotyping: 1,106 patients had genotype (GT) 1 (52.2%), 484 had GT 3 (22.8%), 371 had GT 2 (17.5%), and 158 had GT 4 (7.5%). Our study confirms GTs 1 and 3 as the predominant HCV genotypes in the southeastern Tuscany region. We also observed an association between age, sex, and HCV genotype distribution.


Subject(s)
Genotype; Hepacivirus; Hepatitis C; Humans; Italy/epidemiology; Hepacivirus/genetics; Hepacivirus/isolation & purification; Hepacivirus/classification; Retrospective Studies; Male; Female; Middle Aged; Adult; Hepatitis C/epidemiology; Hepatitis C/virology; Aged; Young Adult; Prevalence; Adolescent; Aged, 80 and over; Child
3.
Rev Panam Salud Publica ; 47: e70, 2023.
Article in English | MEDLINE | ID: mdl-37089786

ABSTRACT

Objective: This study aimed to determine the performance of infection prevention and control (IPC) programs in eight core components in level 2 and level 3 hospitals across all provinces in Colombia. Methods: This cross-sectional study used self-assessed IPC performance data voluntarily reported by hospitals to the Ministry of Health and Social Protection during 2021. Each of the eight core components of the World Health Organization's checklist in the Infection Prevention and Control Assessment Framework contributes a maximum score of 100, and the overall IPC performance score is the sum of these component scores. IPC performance is graded according to the overall score as inadequate (0-200), basic (201-400), intermediate (401-600) or advanced (601-800). Results: Of the 441 level 2 and level 3 hospitals, 267 (61%) reported their IPC performance. The median overall IPC score was 672 (interquartile range [IQR]: 578-715). Of the 267 hospitals reporting, 187 (70%) achieved an advanced level of IPC. The median overall IPC score was significantly higher in private hospitals (690, IQR: 598-725) than in public hospitals (629, IQR: 538-683) (P < 0.001). Among the core components, scores were highest for the category assessing IPC guidelines (median score: 97.5) and lowest for the category assessing workload, staffing and bed occupancy (median score: 70). Median overall IPC scores varied across the provinces (P < 0.001). Conclusions: This countrywide assessment showed that 70% of surveyed hospitals achieved a self-reported advanced level of IPC performance, which reflects progress in building health system resilience. Since only 61% of eligible hospitals participated, an important next step is to ensure the participation of all hospitals in future assessments.
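
For readers unfamiliar with the IPCAF scoring described above, the sketch below shows how an overall score and performance grade could be computed from eight core-component scores, using the band cut-offs quoted in the abstract (assumed to have inclusive upper bounds); the component values in the example are illustrative, not data from the study.

```python
def ipc_grade(component_scores):
    """Sum eight IPCAF core-component scores (each 0-100) and grade the total.

    Bands follow the abstract: inadequate 0-200, basic 201-400,
    intermediate 401-600, advanced 601-800 (upper bounds assumed inclusive).
    """
    if len(component_scores) != 8:
        raise ValueError("expected eight core-component scores")
    total = sum(component_scores)
    if total <= 200:
        band = "inadequate"
    elif total <= 400:
        band = "basic"
    elif total <= 600:
        band = "intermediate"
    else:
        band = "advanced"
    return total, band

# Example: a hospital scoring 97.5 on guidelines and 70 on workload/staffing,
# with illustrative values for the other six components.
print(ipc_grade([97.5, 70, 85, 90, 80, 88, 92, 75]))
```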

4.
Indian J Public Health ; 67(1): 84-91, 2023.
Article in English | MEDLINE | ID: mdl-37039211

ABSTRACT

Background: Improved longevity of people living with HIV (PLH) on highly active antiretroviral therapy and accelerated aging processes are considered contributors to metabolic syndrome (MetS). Objectives: The current study investigated MetS in PLH receiving antiretroviral therapy (ART) under the ongoing National AIDS Control Program. Methods: Clinic attendees (n = 3088) who had been on ART for more than 6 months constituted the sampling frame, from which 378 study participants were randomly drawn and included in the analysis after an eligibility check. A further 159 clinic attendees who had initiated ART within the previous 6 months provided an opportunity to estimate the prevalence of MetS in that group; 62 PLH from this smaller group were enrolled. Results: Based on the harmonization criteria for the Asian population, MetS was found in 19% (73/378; 95% confidence interval [CI] 15.5%-23.7%) of PLH on ART for >6 months compared with 24% (15/62; 95% CI 14.2%-36.7%) of those on ART for ≤6 months; the confidence intervals overlapped and the apparent difference was not statistically significant. After adjustment for age, body mass index (BMI), protease inhibitor (PI)-based ART regimen, duration of ART, insulin resistance (IR), reported family history of hypertension, and residential setting, the factors independently associated with MetS were a PI-containing ART regimen, IR, duration of ART intake, and BMI. In the adjusted model, the odds of MetS were three times higher among PLH on a PI-containing ART regimen (95% CI of adjusted odds ratio [aOR] 1.27-8.51) and among those with IR (95% CI of aOR 1.48-5.07). The odds of MetS among PLH with BMI ≥23 kg/m2 were four times higher (95% CI of aOR 2.08-6.81) than among those with lower BMI. Conclusions: MetS in PLH requires the attention of health-care workers in India. Appropriate screening would help initiate early management.
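
As a rough illustration of how adjusted odds ratios such as those above are obtained, the sketch below fits a multivariable logistic regression with statsmodels on simulated data; all variable names and values are hypothetical and do not reproduce the study's dataset or its exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 378  # same sample size as the analysed group; the data are simulated
df = pd.DataFrame({
    "mets": rng.integers(0, 2, n),                # metabolic syndrome yes/no
    "pi_regimen": rng.integers(0, 2, n),          # PI-containing ART regimen
    "insulin_resistance": rng.integers(0, 2, n),
    "bmi_ge_23": rng.integers(0, 2, n),           # BMI >= 23 kg/m2
    "art_years": rng.uniform(0.5, 15, n),
    "age": rng.integers(25, 65, n),
})

model = smf.logit(
    "mets ~ pi_regimen + insulin_resistance + bmi_ge_23 + art_years + age",
    data=df,
).fit(disp=False)

# Adjusted odds ratios with 95% confidence intervals (exponentiated coefficients)
aor = np.exp(model.params).rename("aOR")
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([aor, ci], axis=1))
```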


Subject(s)
HIV Infections; Metabolic Syndrome; Humans; Metabolic Syndrome/epidemiology; Cross-Sectional Studies; India/epidemiology; HIV Infections/drug therapy; HIV Infections/epidemiology; HIV Infections/complications; Antiretroviral Therapy, Highly Active
5.
Int J Cancer ; 151(12): 2128-2135, 2022 Dec 15.
Article in English | MEDLINE | ID: mdl-35869869

ABSTRACT

Cancer survival is a key indicator for national cancer control programs. However, survival data for the Eastern Mediterranean region (EMR) are limited. We designed a national cancer survival study based on population-based cancer registries (PBCRs) from nine provinces in Iran. The current study reports 5-year net survival for 15 cancers in Iranian adults (15-99 years) during 2014 to 2015 in these nine provinces. We ascertained the vital status of patients through data linkages between the cancer registries, the causes-of-death registry, and vital statistics, complemented by active follow-up. Five-year net survival was estimated through relative survival analysis, and we applied the international cancer survival standard weights for age standardization. Five-year survival was highest for prostate cancer (74.9%, 95% CI 73.0, 76.8), followed by breast (74.4%, 95% CI 72.5, 76.3), bladder (70.4%, 95% CI 69.0, 71.8) and cervix (65.2%, 95% CI 60.5, 69.6). Survival was below 25% for cancers of the pancreas, lung, liver, stomach and esophagus. Iranian cancer patients experience a relatively poor prognosis compared with those in high-income countries. Implementation of early detection programs and improvements in the quality of care are required to improve cancer survival among Iranian patients. Further studies are needed to monitor the outcomes of cancer patients in Iran and other EMR countries.
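
The age standardization mentioned above amounts to a weighted average of age-specific net survival estimates. A minimal sketch follows, using assumed age-specific survival values and illustrative weights; the actual International Cancer Survival Standard weight sets differ by cancer group.

```python
# Hypothetical age-specific 5-year net survival estimates (proportions) and
# illustrative standard weights summing to 1; real ICSS weights vary by cancer site.
age_groups   = ["15-44", "45-54", "55-64", "65-74", "75-99"]
net_survival = [0.85, 0.80, 0.76, 0.72, 0.60]   # assumed values, not study data
weights      = [0.07, 0.12, 0.23, 0.29, 0.29]   # illustrative weights

standardised = sum(w * s for w, s in zip(weights, net_survival))
print(f"Age-standardised 5-year net survival: {standardised:.1%}")
```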


Subject(s)
Neoplasms; Adult; Male; Female; Humans; Iran/epidemiology; Incidence; Registries; Survival Analysis
6.
Saudi Pharm J ; 30(4): 462-469, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35527826

ABSTRACT

Background: Extensively drug-resistant tuberculosis (XDR-TB) is considered a major threat to global health. This study aimed to analyse the treatment outcomes and identify the factors significantly associated with unfavourable treatment outcomes among XDR-TB patients. Methods: We conducted a retrospective observational study at 10 Programmatic Management Units of the National Tuberculosis Control Program of Pakistan. The Electronic Nominal Recording Reporting System records were used to collect data on all eligible XDR-TB patients registered at the study sites between March 2012 and August 2018. Treatment outcomes were analysed as per the standard criteria. Factors associated with unfavourable treatment outcomes were analysed using multivariate binary logistic regression. Results: Of the 184 patients, 59 (32.1%) completed their treatment successfully, 83 (45.1%) died, 24 (13%) had treatment failure, and 11 (6%) were lost to follow-up. Treatment outcomes were not evaluated in 7 (3.8%) patients. Factors significantly associated with unfavourable treatment outcomes included conventional therapy with bedaquiline, unfavourable interim treatment outcomes, and occurrence of adverse drug events (negative association). Conclusion: The treatment success rate in the study cohort was sub-optimal (i.e., <75%). The poor success rate and high mortality are concerning and require the immediate attention of program managers and clinicians.

7.
Indian J Public Health ; 66(3): 358-361, 2022.
Article in English | MEDLINE | ID: mdl-36149123

ABSTRACT

The prevalence of iron-deficiency anemia remains high in India, possibly because control relies only on iron-folic acid (IFA) supplementation through the Anemia Control Program (ACP), namely the National Iron Plus Initiative (NIPI). Based on the WHO's recommendations, we studied different interventions that could increase the effectiveness of NIPI, such as vitamin C supplementation with IFA, low-dose iron (LDI) with intensified health education (IHE), LDI with vitamin C, and iron-rich food items, to increase hemoglobin (Hb) among adolescent girls through a public-private partnership under the Rashtriya Kishor Swasthya Karyakram. Increments in Hb after 12 weeks of intervention were compared with those of two control groups, one receiving NIPI and the other receiving no intervention. The highest increment in Hb was observed in the NIPI (IFA) plus vitamin C group, followed by the LDI plus IHE group, whose increment was comparable to that of the NIPI-only group. These findings emphasize the need to make the existing NIPI more stringent and comprehensive by integrating effective measures based on up-to-date scientific knowledge.


Subject(s)
Anemia, Iron-Deficiency; Anemia; Adolescent; Anemia/epidemiology; Anemia, Iron-Deficiency/epidemiology; Anemia, Iron-Deficiency/prevention & control; Ascorbic Acid; Dietary Supplements; Female; Folic Acid/therapeutic use; Hemoglobins/analysis; Humans; Imidazoles; India/epidemiology; Iron/therapeutic use; Nitriles
8.
Indian J Public Health ; 66(Supplement): S60-S65, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36412476

ABSTRACT

Background: Delay in diagnosis and treatment enhances tuberculosis (TB) transmission and mortality. Understanding the causes of delay can support India's stated goal of TB elimination by 2025. Objectives: To estimate diagnostic and treatment delay among new pulmonary TB patients in Ernakulam district of Kerala, identify associated factors, and determine health-seeking behavior and knowledge regarding TB. Materials and Methods: Community-based cross-sectional study among new pulmonary TB patients registered under the Revised National TB Control Program. Patients were interviewed in person, and data were collected using a pretested semi-structured questionnaire. Descriptive statistics were expressed as frequency, percent, interquartile range, median, and mean. The Chi-square test was used to assess the statistical significance (P < 0.05) of associations. Backward conditional logistic regression was performed using variables with P < 0.2 in univariate analysis, adjusting for possible confounders. Results: Two hundred and twenty-nine patients were interviewed; the median patient, health-care system, and treatment delays were 25 days, 22 days, and 1 day, respectively. While patient delay (>30 days) and treatment delay (>2 days) were seen in 47.6% and 41% of patients, respectively, health-care system delay was seen in 79.9% of patients. Choosing a pharmacy for initial treatment (adjusted odds ratio [aOR] = 5.217), unskilled occupation (aOR = 3.717), female gender (aOR = 3.467), not having previously heard about TB (aOR = 3.410), and lower education level (aOR = 2.774) were the independent predictors of patient delay. Visiting two or more doctors (aOR = 5.855) and initially visiting a doctor with undergraduate qualification (aOR = 3.650) were the independent predictors of health-care system delay. Diagnosis in the private sector (aOR = 8.989), not being admitted (aOR = 3.441), and age above 60 years (aOR = 0.394) were the independent predictors of treatment delay. Conclusion: Initial treatment from a pharmacy, consulting multiple physicians, and diagnosis in the private sector cause significant delay in the diagnosis and treatment of pulmonary TB.


Subject(s)
Tuberculosis, Pulmonary; Tuberculosis; Humans; Female; Middle Aged; Time-to-Treatment; Cross-Sectional Studies; Delayed Diagnosis; India/epidemiology; Tuberculosis, Pulmonary/diagnosis; Tuberculosis, Pulmonary/drug therapy; Tuberculosis, Pulmonary/epidemiology
9.
Med J Islam Repub Iran ; 36: 169, 2022.
Article in English | MEDLINE | ID: mdl-37159758

ABSTRACT

Background: Over 131,000 new cases of cancer are identified in Iran annually, and this number is predicted to grow by 40% by 2025. The most important factors contributing to this increase are improvement of the health service delivery system, increased life expectancy, and population aging. The aim of this study was to develop Iran's National Cancer Control Program (IrNCCP). Methods: This cross-sectional study was conducted in 2013 using a review of studies and documents, focus group discussions, and an expert panel. The available evidence on cancer status and cancer care in Iran and other countries, as well as national and international upstream documents, was reviewed and analyzed. Then, by analyzing the current situation in Iran and other countries and conducting stakeholder analysis with a strategic planning approach, the IrNCCP was developed with a 12-year horizon consisting of goals, strategies, programs, and performance indicators. Results: The program has 4 main components (Prevention; Early Detection; Diagnosis and Treatment; and Supportive and Palliative Care) and 7 supporting components (Governance and policy-making; Cancer research; Development of facilities, equipment, and the service delivery network; Provision and management of human resources; Provision and management of financial resources; Cancer information system management and registry; and Participation of NGOs, charities, and the private sector). Conclusion: Iran's National Cancer Control Program has been developed comprehensively, with cross-sectoral cooperation and stakeholder participation. However, like any long-term health intervention, strengthening its governance structure, both for implementation and achievement of expected goals and for evaluation and modification during implementation, is essential.

10.
Article in Russian | MEDLINE | ID: mdl-35157380

ABSTRACT

Since the 1960s, fundamental normative planning studies of health care needs, differentiated by age and by care profile (including phthisiology) together with the corresponding resource support, were carried out regularly. After the budget-insurance model of medical care was introduced, attention to updating the normative base decreased, resulting in significant regional disproportions between planned (normative) and actual volumes of medical care, including inpatient care for children under the phthisiology profile. THE PURPOSE OF THE STUDY: To compare normative, actual, and estimated hospitalization rates for inpatient phthisiology care of children in the constituent subjects of the Russian Federation. MATERIALS AND METHODS: Analysis of statistical information, normative and analytic techniques, the method of ratios and proportions, and correlation analysis were applied. RESULTS: Data on the number of patients with active tuberculosis in 14 patient groups were used to estimate the need for this care, which amounted to 0.2 cases per 1,000 children, three times less than the actual rate (0.6) and four times less than the normative rate (0.8). In the comparison groups, the deficit of actual versus normative volumes of care increases as group morbidity increases. However, there are no signs of unmet need for care: bed occupancy is below the approved level in all study groups, and there is no correlation between the bed occupancy rate and the ratio of actual to normative admission rates (Kendall's tau_b = 0.178, p = 0.101). CONCLUSION: The mismatch between actual and normative admission rates for the phthisiology profile reflects both uneven provision of care across the subjects of the Russian Federation and overestimation of the approved (normative) volume of care, which is four times higher than the estimated rate. To validate these results, a dedicated study of the care in question is needed, focused on primary data combined with expert assessment of the validity of hospitalization.
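
The Kendall rank correlation reported above can be computed with scipy; the sketch below uses simulated regional data, so the inputs and the resulting tau are illustrative only and do not reproduce the study's figures.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical regional data: bed-occupancy rate and the ratio of actual to
# normative admission rates (the abstract reports tau_b = 0.178, p = 0.101).
rng = np.random.default_rng(1)
bed_occupancy = rng.uniform(0.4, 0.9, 85)      # 85 regions, simulated
admission_ratio = rng.uniform(0.3, 1.2, 85)    # actual / normative, simulated

tau, p_value = kendalltau(bed_occupancy, admission_ratio)
print(f"Kendall's tau_b = {tau:.3f}, p = {p_value:.3f}")
```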


Subject(s)
Hospitalization; Hospitals; Child; Delivery of Health Care; Humans; Morbidity; Russian Federation/epidemiology
11.
Emerg Infect Dis ; 27(3): 949-952, 2021 03.
Article in English | MEDLINE | ID: mdl-33622480

ABSTRACT

We report the implementation of an animal sporotrichosis surveillance and control program that evaluates strategies to identify suspected and infected cats in a municipality in southeastern Brazil. All adopted measures reinforced the program, although strategies had different abilities to detect the presence of infection.


Subject(s)
Cat Diseases; Sporothrix; Sporotrichosis; Animals; Brazil; Cats; Zoonoses
12.
J Dairy Sci ; 104(5): 6194-6199, 2021 May.
Article in English | MEDLINE | ID: mdl-33685689

ABSTRACT

Paratuberculosis is a chronic enteric disease affecting virtually all ruminants, but only anecdotal information is currently available about its occurrence in water buffaloes (Bubalus bubalis). We carried out a survey aimed at determining the prevalence of paratuberculosis in 2 provinces of the Campania region, Italy, where about half of all Italian buffaloes are reared. From May 2017 to December 2018, we collected 201,175 individual serum samples from 995 buffalo herds. The sera were collected from animals over 24 mo old and were tested using a commercial ELISA. The apparent herd-level prevalence was 54.7%, and the apparent animal-level prevalence was 1.8%. The herd-level true prevalence was estimated using a Bayesian approach, demonstrating a high herd-level prevalence of paratuberculosis in water buffaloes in the Campania area. These findings suggest that the urgent adoption of paratuberculosis herd-control programs for water buffaloes in this area would be beneficial.
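
The abstract estimates true prevalence from apparent prevalence with a Bayesian model; a simpler frequentist stand-in is the Rogan-Gladen correction sketched below, with assumed ELISA sensitivity and specificity (neither is reported in the abstract).

```python
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Frequentist true-prevalence estimate from apparent prevalence and
    assumed test characteristics (a simple stand-in for the Bayesian
    approach used in the study)."""
    tp = (apparent_prev + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(tp, 0.0), 1.0)  # truncate to the [0, 1] interval

# Animal-level apparent prevalence from the abstract, with assumed ELISA
# sensitivity and specificity for illustration only.
print(rogan_gladen(0.018, sensitivity=0.45, specificity=0.99))
```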


Subject(s)
Cattle Diseases; Paratuberculosis; Animals; Bayes Theorem; Buffaloes; Cattle; Enzyme-Linked Immunosorbent Assay/veterinary; Italy/epidemiology; Paratuberculosis/epidemiology; Prevalence; Seroepidemiologic Studies
13.
J Dairy Sci ; 104(4): 4549-4560, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33663863

ABSTRACT

We developed a custom bovine leukemia virus (BLV) control program for the Alberta dairy industry, consisting of a risk assessment and a comprehensive list of best management practices (BMP) aimed at preventing BLV transmission between cattle. The control program was implemented on 11 farms for approximately 1 yr. Blood samples were collected from all cattle ≥12 mo old, and serum was tested with a commercial ELISA. Risk assessments were performed on each farm, risk-associated on-farm management was identified, and management changes expected to prevent transmission of BLV between cattle were suggested by the first author and agreed upon with each farmer. Throughout the following year, all participating farmers were visited multiple times to identify and overcome barriers to implementation and to monitor progress. After approximately 1 yr of implementing BLV control, all cattle ≥12 mo old with a negative or no previous test result were sampled, and the within-herd prevalence was determined. The median number of cattle ≥12 mo old per farm was 195 (range 110-524). The initial prevalence averaged 39% (13-66%). On average, 5 BMP (3-7) were suggested to each farmer and 4 BMP (1-7) were implemented. At the second sampling, the average within-herd prevalence of all animals that tested positive (including the previous sampling) was 36% (12-62%). Eight farms reduced their within-herd BLV prevalence, within-herd prevalence stayed constant on 1 farm, and it increased on 1 farm. The remaining farm terminated its participation before the second sampling. The number of seroconversions per farm ranged from 3 to 109, highlighting the success of some producers in minimizing new infections. The risk assessment proved to be a valuable tool for identifying flaws in on-farm management, although the risk assessment score was unrelated to within-herd BLV prevalence. Finally, implementation of BMP aimed at preventing BLV transmission between cattle appeared to reduce within-herd BLV prevalence when farmers committed to their implementation.


Subject(s)
Cattle Diseases; Enzootic Bovine Leukosis; Leukemia Virus, Bovine; Alberta/epidemiology; Animals; Cattle; Cattle Diseases/epidemiology; Cattle Diseases/prevention & control; Dairying; Enzootic Bovine Leukosis/epidemiology; Enzootic Bovine Leukosis/prevention & control; Farms
14.
J Dairy Sci ; 104(9): 10217-10231, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34147217

ABSTRACT

Bovine viral diarrhea virus (BVDV) infection has a major effect on the health of cows and consequently on herd performance. Many countries have implemented control or eradication programs to mitigate BVDV infection and its negative effects. These negative effects of BVDV infection on dairy herds are well documented, but there is much less information about the effects of a new introduction of BVDV into dairy herds already participating in a BVDV control program. The objective of our study was to investigate the effect of a new BVDV introduction on herd performance in BVDV-free herds participating in the Dutch BVDV-free program. Longitudinal herd-level surveillance data were combined with herd information data to create 4 unique data sets: a monthly test-day somatic cell count (SCC) data set, annual calving interval (CIV) and culling risk (CR) data sets, and a quarterly calf mortality rate (CMR) data set. Each database contained 2 types of herds: herds that remained BVDV free during the whole study period (defined as free herds), and herds that lost their BVDV-free status during the study period (defined as breakdown herds). The date of losing the BVDV-free status was defined as the breakdown date. To compare breakdown herds with free herds, a random breakdown date was artificially generated for free herds by simple random sampling from the distribution of the breakdown month of the breakdown herds. The SCC and CIV before and after a new introduction of BVDV were compared through linear mixed-effects models with a Gaussian distribution, and the CR and CMR were modeled using a negative binomial distribution in generalized linear mixed-effects models. All models included herd type, BVDV status, and year as explanatory variables, plus a random herd effect. Herd size was included as an explanatory variable in the SCC, CIV, and CMR models, and season was included in the SCC and CMR models. Results showed that free herds had lower SCC, CR, and CMR, and a shorter CIV, than breakdown herds. Within the breakdown herds, the new BVDV introduction affected the SCC and CMR. In the year after BVDV introduction, the SCC was higher than in the year before introduction, by a factor of 1.011 [2.5th to 97.5th percentile (95% PCTL): 1.002, 1.020]. Compared with the year before BVDV breakdown, the CMR in the year of breakdown and the year after breakdown was higher, by factors of 1.170 (95% PCTL: 1.120, 1.218) and 1.096 (95% PCTL: 1.048, 1.153), respectively. This study reveals that a new introduction of BVDV had a negative but, on average, relatively small effect on herd performance in herds participating in a BVDV control program.
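
As an illustration of the Gaussian mixed-effects part of the analysis described above (the negative binomial generalized linear mixed models are not shown), the sketch below fits a linear mixed model with a random herd effect on simulated test-day data using statsmodels; the column names, values, and model details are hypothetical and do not reproduce the study's data or exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated monthly test-day data: log SCC modelled with fixed effects for
# herd type, BVDV status, year, season, and herd size, plus a random herd effect.
rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "log_scc": rng.normal(4.7, 0.4, n),
    "herd_type": rng.choice(["free", "breakdown"], n),
    "bvdv_status": rng.choice(["before", "after"], n),
    "year": rng.integers(2015, 2018, n).astype(str),
    "season": rng.choice(["winter", "spring", "summer", "autumn"], n),
    "herd_size": rng.integers(110, 525, n),
    "herd_id": rng.integers(1, 101, n),
})

model = smf.mixedlm(
    "log_scc ~ herd_type + bvdv_status + year + season + herd_size",
    data=df,
    groups=df["herd_id"],   # random intercept per herd
).fit()
print(model.summary())
```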


Subject(s)
Bovine Virus Diarrhea-Mucosal Disease; Cattle Diseases; Bovine Viral Diarrhea Virus Type 1; Bovine Viral Diarrhea Viruses; Animals; Bovine Virus Diarrhea-Mucosal Disease/prevention & control; Cattle; Cell Count/veterinary; Diarrhea/veterinary; Female
15.
J Dairy Sci ; 104(2): 2074-2086, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33309379

ABSTRACT

Dairy cows are negatively affected by the introduction of bovine viral diarrhea virus (BVDV) and consequently produce less milk. Existing literature on potential milk production losses is based on relatively outdated data and rarely evaluates milk production loss in relation to a new BVDV infection within a surveillance system. This study determined the annual and quarterly losses in milk production associated with BVDV introduction in 3,126 dairy herds participating in the Dutch BVDV-free program between 2007 and 2017. Among these herds, 640 were "breakdown-herds" that obtained and subsequently lost their BVDV-free status during the study period, and 2,486 herds obtained and retained their BVDV-free status during the study period. Milk yields before and after BVDV introduction were compared through annual and quarterly linear mixed models. Both models included fixed effects for herd type (breakdown-herd or free-herd), bovine viral diarrhea status (on an annual and quarterly basis), year, and season, plus a random herd effect. The dependent variable was the average daily milk yield on the test day. To define possible BVDV-introduction dates, 4 scenarios were developed. In the default scenario, the date of breakdown (i.e., loss of the BVDV-free status) was assumed to be the BVDV-introduction date. For the other 3 scenarios, the BVDV-introduction dates were set at 4, 6, and 9 mo before the date of breakdown, based on the estimated birth date of a persistently infected calf. In the default scenario, the loss in milk yield due to BVDV introduction occurred mainly in the first year after breakdown, with a reduction in yield of 0.08 kg/cow per day compared with the last year before breakdown. For the other 3 scenarios, the greatest yield reduction occurred in the second year after BVDV introduction, with losses of 0.09, 0.09, and 0.1 kg/cow per day, respectively. For the first 4 quarters after BVDV introduction in the default scenario, milk yield losses were 0.14, 0.09, 0.02, and 0.08 kg/cow per day, respectively, indicating that the loss was greatest in the first quarter after introduction. Overall, BVDV introduction had a negative but, on average, relatively small effect on milk yield for herds participating in the BVDV-free program. This study will enable dairy farmers and policymakers to better understand the quantitative effect of BVDV on milk production in dairy farms within a control program.


Subject(s)
Bovine Virus Diarrhea-Mucosal Disease/physiopathology; Bovine Viral Diarrhea Viruses; Milk; Animals; Antibodies, Viral; Bovine Virus Diarrhea-Mucosal Disease/epidemiology; Cattle; Dairying; Female
16.
Rev Argent Microbiol ; 53(2): 104-109, 2021.
Article in English | MEDLINE | ID: mdl-33010958

ABSTRACT

The National Quality Control Program in Mycology (PNCCM) of Argentina was established in 1996 to improve the quality of mycological diagnosis, to help establish standardized procedures, and to support the continuous training of laboratory staff. The aim of this study was to assess the effectiveness of the PNCCM over the 1996-2018 period. Data from the National Mycology Laboratory Network (NMLN) and the PNCCM database were used to estimate the increase in the number of controlled laboratories and jurisdictions, the percentage of participation, the improvement in the quality of results, and adherence to the program. Satisfaction surveys were performed to assess user satisfaction. The number of controlled laboratories increased from 29 to 146, participation increased from 49% to 93%, and overall adherence was 72% over the evaluated period (1996-2018). Improvement in the quality of results was 15% for low-complexity samples, 7% for intermediate-complexity samples, and 14% for the identification of high-complexity strains. Of the users, 84% considered the PNCCM "very good" and 16% "satisfactory". These results show the importance of the PNCCM, which is widely accepted by mycological diagnostic laboratories in Argentina.


Subject(s)
Laboratories; Mycology; Argentina; Diagnostic Tests, Routine; Humans; Quality Control
17.
Emerg Infect Dis ; 26(7): 1374-1381, 2020 07.
Article in English | MEDLINE | ID: mdl-32568038

ABSTRACT

During 2016-2018, San Diego County, California, USA, experienced one of the largest hepatitis A outbreaks in the United States in 2 decades. In close partnership with local healthcare systems, San Diego County Public Health led a public health response to the outbreak that focused on a 3-pronged strategy to vaccinate, sanitize, and educate. Healthcare systems administered nearly half of the vaccinations delivered in San Diego County. At University of California San Diego Health, the use of informatics tools assisted with the identification of at-risk populations and with vaccine delivery across outpatient and inpatient settings. In addition, acute care facilities helped prevent further disease transmission by delaying the discharge of patients with hepatitis A who were experiencing homelessness. We assessed the public health roles that acute care hospitals can play during a large community outbreak and the critical nature of ongoing collaboration between hospitals and public health systems in controlling such outbreaks.


Subject(s)
Hepatitis A; Academic Medical Centers; California/epidemiology; Disease Outbreaks; Hepatitis A/epidemiology; Hepatitis A/prevention & control; Humans; Public Health
18.
BMC Public Health ; 20(1): 1200, 2020 Aug 04.
Article in English | MEDLINE | ID: mdl-32753044

ABSTRACT

BACKGROUND: Most countries in sub-Saharan Africa have well-established National Tuberculosis Control Programs with relatively stable routine performance. However, major epidemiological events may result in significant disruptions. In March 2014, the World Health Organization announced the outbreak of Ebola virus disease in Guinea, a country with a high incidence of TB and HIV. Our study aimed to assess the impact of the Ebola virus disease outbreak on TB notification, treatment, and surveillance, using the main program indicators. METHODS: This is a retrospective cohort study that compared TB trends using surveillance data from the periods before (2011-2013), during (2014-2016), and after (2017-2018) the Ebola virus disease outbreak. A time-series analysis was conducted to investigate the linkage between the decline in TB notification and the Ebola virus disease outbreak through cross-correlation. The lag in the cross-correlation test was evaluated using an ANCOVA type II lagged dependent variable model. The surveillance system was assessed using the TB surveillance standards and benchmarks and vital registration systems recommended by WHO, and the results were compared with those of the 2015 assessment conducted during the Ebola virus disease outbreak. RESULTS: The TB notification rate declined from 120 cases per 100,000 in 2011 to 100 cases per 100,000 in 2014, at the peak of the Ebola virus disease outbreak. The time-series cross-correlation test of all notified TB cases and Ebola cases showed a significant lag of -0.4 (40%), reflecting a drop in the notification rate (F-value = 5.7 [95% CI: 0.2-21.3]). The Ebola virus disease had no negative impact on patient treatment outcomes (F-value = 1.3 [95% CI: 0.0-8.8]). Regarding the surveillance system, five out of 13 WHO standards and benchmarks were met in the 2019 evaluation, after the Ebola virus disease outbreak, compared with three in 2015. CONCLUSION: Major epidemics such as the Ebola virus disease outbreak may have a significant impact on well-established TB control programs, as shown in the example of Guinea. Sudden disruptions of routine performance may lead programs to improve their surveillance systems. The experience acquired in the fight against EVD and the investments made should make it possible to prepare the health system in a coherent manner for future episodes.
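
The time-series cross-correlation step can be sketched as follows on simulated quarterly series; the series, lags, and correlation values are illustrative and do not reproduce the study's data or its ANCOVA-based lag evaluation.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

# Simulated quarterly series: TB notifications and Ebola case counts.
rng = np.random.default_rng(3)
tb = rng.normal(1000, 50, 32)            # 8 years of quarterly TB notifications
ebola = np.zeros(32)
ebola[12:24] = rng.normal(300, 80, 12)   # outbreak in the middle of the series

# Standardise both series, then cross-correlate to find the lag with the
# strongest negative association, in the spirit of the study's analysis.
tb_z = (tb - tb.mean()) / tb.std()
ebola_z = (ebola - ebola.mean()) / ebola.std()
xcorr = correlate(tb_z, ebola_z, mode="full") / len(tb_z)
lags = correlation_lags(len(tb_z), len(ebola_z), mode="full")
best_lag = lags[np.argmin(xcorr)]        # lag of the most negative correlation
print(f"strongest negative cross-correlation {xcorr.min():.2f} at lag {best_lag}")
```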


Subject(s)
Hemorrhagic Fever, Ebola; Population Surveillance; Tuberculosis; Delivery of Health Care; Disease Outbreaks; Epidemics; Guinea/epidemiology; Hemorrhagic Fever, Ebola/epidemiology; Humans; Retrospective Studies; Tuberculosis/diagnosis; Tuberculosis/epidemiology; World Health Organization
19.
J Dairy Sci ; 103(5): 4654-4671, 2020 May.
Article in English | MEDLINE | ID: mdl-32147269

ABSTRACT

For endemic infections in cattle that are not regulated at the European Union level, such as bovine viral diarrhea virus (BVDV), European Member States have implemented control or eradication programs (CEP) tailored to their specific situations. Different methods are used to assign infection-free status in CEP; therefore, the confidence of freedom associated with the "free" status generated by different CEP are difficult to compare, creating problems for the safe trade of cattle between territories. Safe trade would be facilitated with an output-based framework that enables a transparent and standardized comparison of confidence of freedom for CEP across herds, regions, or countries. The current paper represents the first step toward development of such a framework by seeking to describe and qualitatively compare elements of CEP that contribute to confidence of freedom. For this work, BVDV was used as a case study. We qualitatively compared heterogeneous BVDV CEP in 6 European countries: Germany, France, Ireland, the Netherlands, Sweden, and Scotland. Information about BVDV CEP that were in place in 2017 and factors influencing the risk of introduction and transmission of BVDV (the context) were collected using an existing tool, with modifications to collect information about aspects of control and context. For the 6 participating countries, we ranked all individual elements of the CEP and their contexts that could influence the probability that cattle from a herd categorized as BVDV-free are truly free from infection. Many differences in the context and design of BVDV CEP were found. As examples, CEP were either mandatory or voluntary, resulting in variation in risks from neighboring herds, and risk factors such as cattle density and the number of imported cattle varied greatly between territories. Differences were also found in both testing protocols and definitions of freedom from disease. The observed heterogeneity in both the context and CEP design will create difficulties when comparing different CEP in terms of confidence of freedom from infection. These results highlight the need for a standardized practical methodology to objectively and quantitatively determine confidence of freedom resulting from different CEP around the world.


Subject(s)
Bovine Virus Diarrhea-Mucosal Disease/prevention & control; Bovine Viral Diarrhea Viruses/physiology; Diarrhea/virology; Animals; Bovine Virus Diarrhea-Mucosal Disease/epidemiology; Bovine Virus Diarrhea-Mucosal Disease/virology; Cattle; Diarrhea/epidemiology; Diarrhea/prevention & control; Disease Eradication; Epidemiological Monitoring; Europe/epidemiology; Female; Risk Factors
20.
BMC Vet Res ; 15(1): 266, 2019 Jul 29.
Article in English | MEDLINE | ID: mdl-31358004

ABSTRACT

BACKGROUND: Johne's disease is a major production-limiting disease of dairy cows caused by infection with Mycobacterium avium subsp. paratuberculosis (MAP) in calf-hood. The disease is chronic, progressive, contagious and widespread, with no treatment and no cure. Economic losses arise from decreased productivity through reduced growth, milk yield and fertility, and from capital losses due to premature culling or death. Control chiefly centers on removing animals that actively shed bacteria and protecting calves from infection. A prolonged pre-clinical shedding phase, lack of test sensitivity, organism persistence and abundance in the environment, and management systems that expose susceptible calves to infection make control challenging, particularly in pastoral, seasonal dairy systems. Combining a novel testing strategy to remove infected cows with limited measures to protect vulnerable calves at pasture, this study reports the successful reduction over a four-year period of the seroprevalence of cows testing positive for MAP infection in a New Zealand pastoral dairy herd. RESULTS: For all age groups considered, the apparent seroprevalence of cows testing positive decreased from 297 / 1,122 (26%) in 2013-2014 to 24 / 1,030 (2.3%) in 2016-2017. Over the same period, the apparent seroprevalence in primiparous cows decreased from 39 / 260 (15%) to 7 / 275 (2.5%) and in multiparous cows from 258 / 862 (29.9%) to 17 / 755 (2.3%). The reported proportion of calved cows culled annually for suspected clinical Johne's disease fell from 55 / 1,201 (5%) in the year preceding the control program to 5 / 1,283 (0.4%) in the final year of the study. CONCLUSIONS: On this farm, the reduction in the prevalence of infection was achieved by reducing the infectious pressure through targeted culling of heavily shedding animals together with limited measures to protect calves at pasture from exposure to MAP. Although greater protection of young animals through separation from infected cows and their colostrum and milk would have reduced the risk of neonatal infection, this study demonstrates that, in this case, these management measures, while prudent, were not essential for an effective reduction in the prevalence of MAP infection.
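
The seroprevalence figures above are simple proportions; the sketch below recomputes them together with Wilson 95% confidence intervals, an assumed choice since the paper's own interval method is not stated in the abstract.

```python
from statsmodels.stats.proportion import proportion_confint

# Apparent seroprevalence with Wilson 95% confidence intervals for the
# counts quoted in the abstract.
for label, positives, tested in [
    ("all cows 2013-2014", 297, 1122),
    ("all cows 2016-2017", 24, 1030),
    ("primiparous 2013-2014", 39, 260),
    ("primiparous 2016-2017", 7, 275),
]:
    prevalence = positives / tested
    low, high = proportion_confint(positives, tested, alpha=0.05, method="wilson")
    print(f"{label}: {prevalence:.1%} (95% CI {low:.1%}-{high:.1%})")
```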


Subject(s)
Cattle Diseases/epidemiology; Mycobacterium avium subsp. paratuberculosis/physiology; Paratuberculosis/epidemiology; Animals; Cattle; Cattle Diseases/prevention & control; Dairying; New Zealand/epidemiology; Paratuberculosis/prevention & control; Seroepidemiologic Studies