ABSTRACT
OBJECTIVE: To determine the sensitivity and specificity of biliopancreatic endosonography (ESBP) for the diagnosis of choledocholithiasis in patients at intermediate risk, referred to the specialized surgical gastroenterology center of Unión de Cirujanos SAS - Oncólogos de Occidente (Zentria group), Manizales, Colombia, between March 1, 2020 and January 31, 2022. MATERIALS AND METHODS: Retrospective cross-sectional study in patients at intermediate risk for choledocholithiasis. The diagnostic performance of ESBP was calculated and confirmed with ERCP. Negative ESBP cases were followed up by telephone. RESULTS: 752 cases with ESBP were analyzed, of which 43.2% (n=325) were positive and 56.8% (n=427) were negative. ERCP was performed in the positive cases who accepted the procedure (n=317); 73.5% (n=233) were positive for choledocholithiasis, 25.8% (n=82) were tumors, and 0.6% (n=2) were biliary roundworms. Sensitivity was 98.3% (95% CI: 95.7-99.5), specificity 88.1% (95% CI: 79.2-94.1), PPV 95.8% (95% CI: 92.4-98.0), and NPV 94.9% (95% CI: 87.4-98.7). The AUC of ESBP was 0.9319 (95% CI: 0.8961-0.967). CONCLUSION: In patients at intermediate risk for choledocholithiasis, ESBP is a useful diagnostic option for studying pancreatic and extrahepatic biliary tract pathology and for identifying biliary microlithiasis; it also allows the examination to be combined with a therapeutic intervention such as ERCP in a single session.
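As a worked illustration of how the indices above are derived from a two-by-two table of ESBP results against the ERCP/follow-up reference, the short Python sketch below computes sensitivity, specificity, PPV and NPV with Wilson 95% intervals. The cell counts are hypothetical placeholders, not the study's actual confusion matrix, and the authors may have used a different interval method.

# Illustrative sketch only: the 2x2 counts below are hypothetical, not the study's data.
from statsmodels.stats.proportion import proportion_confint

tp, fp, fn, tn = 233, 10, 4, 74  # placeholder true/false positives and negatives

def rate_with_ci(successes, total):
    """Proportion with its 95% Wilson score interval."""
    lo, hi = proportion_confint(successes, total, alpha=0.05, method="wilson")
    return successes / total, lo, hi

for name, num, den in [("Sensitivity", tp, tp + fn), ("Specificity", tn, tn + fp),
                       ("PPV", tp, tp + fp), ("NPV", tn, tn + fn)]:
    est, lo, hi = rate_with_ci(num, den)
    print(f"{name}: {est:.1%} (95% CI {lo:.1%}-{hi:.1%})")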
Subject(s)
Endoscopic Retrograde Cholangiopancreatography, Choledocholithiasis, Endosonography, Sensitivity and Specificity, Humans, Choledocholithiasis/diagnostic imaging, Choledocholithiasis/diagnosis, Cross-Sectional Studies, Retrospective Studies, Male, Female, Middle Aged, Aged, Endosonography/methods, Adult, Aged 80 and over
ABSTRACT
Despite the importance of dairy farming in Uruguay, little information on dairy systems in this country is available in the scientific literature, and management practices that influence calf welfare at the herd level have not been explored. The aims of this study were to (1) describe the prepartum and calf-rearing systems, as well as the management practices that may influence calf welfare in pastured dairy herds in Uruguay, (2) estimate the annual calf mortality risk from birth to weaning, and (3) identify the primary clinical disease syndromes shown by the calves before death. A survey comprising a farm visit and a questionnaire was conducted on 225 randomly selected dairies with >30 milking cows, in 3 strata (31-99, 100-299, and ≥300 milking cows) of 6 Uruguayan departments where dairies are concentrated. Retrospective information from July 2013 to June 2014 was collected. A descriptive analysis was performed and results were extrapolated to the national dairy cattle population. Several management practices that could contribute to poor calf welfare were identified in a large proportion of farms. The annual calf mortality risk (calves that died between birth and weaning/calves born dead or alive × 100, n = 149 farms) was 15.2%. Age at weaning averaged 75 d. Farmers reported that the most common clinical syndromes were diarrhea and respiratory disease, in 85.2% and 47.5% of the farms, respectively. There was no continuous veterinary advice in 61.3% of the farms, 20.0% lacked data records, 38.5% had poor drainage in the prepartum area with waterlogging after rainfall, 52.1% monitored the prepartum area ≤2 times per day during the calving season, 65.1% did not perform navel antisepsis on newborns, 62.3% separated the calves from their dams at >24 h postpartum, 95.2% did not have a colostrum management program, 72.4% did not rotate the calf-rearing areas, 59.0% did not disinfect the calf feeders, 85.7% did not have staff dedicated exclusively to calf rearing, and 39.8% did not separate sick from healthy calves. The average volume of milk or milk replacer offered per calf was 4.5 L/day. Several of the identified management practices that affect calf welfare in the prepartum and calf-rearing periods could explain the high mortality risk. An effort should be made to conduct extension work focusing on the dissemination of good management practices to improve calf welfare and reduce calf mortality in Uruguayan dairy farms.
Subject(s)
Animal Welfare, Cattle Diseases/mortality, Dairying, Animals, Cattle, Dairying/methods, Female, Male, Milk, Pregnancy, Retrospective Studies, Risk Factors, Surveys and Questionnaires, Uruguay, Weaning
ABSTRACT
Lipidomics is a rapidly developing field in modern biomedical research. While LC-MS systems are able to detect most of the known lipid classes in a biological matrix, there is no single technique able to extract all of them simultaneously. In comparison with two-phase extractions, one-phase extraction systems are of particular interest, since they decrease the complexity of the experimental procedure. By using an untargeted lipidomics approach, we explored the differences/similarities between the most commonly used two-phase extraction systems (Folch, Bligh and Dyer, and MTBE) and one of the more recently introduced one-phase extraction systems for lipid analysis based on the MMC solvent mixture (MeOH/MTBE/CHCl3). The four extraction methods were evaluated and thoroughly compared against a pooled extract that qualitatively and quantitatively represents the average of the combined extractions. Our results show that the lipid profile obtained with the MMC system displayed the highest similarity to the pooled extract, indicating that it was most representative of the lipidome in the original sample. Furthermore, it showed better extraction efficiencies for moderate and highly apolar lipid species in comparison with the Folch, Bligh and Dyer, and MTBE extraction systems. Finally, the technical simplicity of the MMC procedure makes this solvent system highly suitable for automated, untargeted lipidomics analysis.
Subject(s)
Chemical Fractionation/methods, Lipids/blood, Lipids/isolation & purification, Phase Transition, High-Pressure Liquid Chromatography/methods, Humans, Lipids/analysis, Mass Spectrometry/methods, Metabolomics/methods, Multivariate Analysis
ABSTRACT
Recent advances in analytical chemistry have set the stage for metabolite profiling to help understand complex molecular processes in physiology. Despite ongoing efforts, there are concerns regarding metabolomics workflows, since it has been shown that internal factors (enzyme activity, blood contamination, and the dynamic nature of metabolite concentrations) as well as external factors (storage, handling, and analysis method) may affect the metabolome profile. Many metabolites are intrinsically unstable, particularly some of those associated with central carbon metabolism. While enzymatic conversions have been studied in great detail, nonenzymatic, chemical conversions have received comparatively little attention. This review aims to give an in-depth overview of nonenzymatic energy metabolite degradation/interconversion chemistry, focusing on a selected range of metabolites. Special attention will be given to qualitative (degradation pathways) as well as quantitative aspects that may affect the acquisition of accurate data in the context of metabolomics studies. Problems related to the use of isotopically labeled internal standards that hinder the quantitative analysis of common metabolites will be illustrated with an experimental example. Finally, general conclusions and perspectives are given.
ABSTRACT
Malnutrition during cancer has a negative impact on prognosis and quality of life. Therefore, it is important to identify those patients at higher nutritional risk to prevent its development. There are nutritional screening tools, such as MUST and NRS-2002, that focus on the patient on admission to hospital. However, most patients will develop malnutrition in the outpatient or ambulatory setting. This study aims to determine which nutritional screening tool is most effective in assessing nutritional risk in the outpatient oncology patient, highlighting the parameters analysed by these tools. Seventeen articles were reviewed, with the most important variables being tumour location, tumour stage, age, and gender, as well as recent weight loss, dietary intake, and digestive disorders. The Nutriscore, NRS-2002, and MUST tools are considered suitable, but the choice varies depending on these parameters. MNA is suitable for elderly patients, while SNAQ was not considered reliable in this population. In conclusion, MUST, NRS-2002, and Nutriscore are suitable tools, but their choice depends on specific characteristics. There is currently no universal tool for nutritional risk assessment in outpatients.
Subject(s)
Malnutrition, Neoplasms, Humans, Aged, Nutrition Assessment, Nutritional Status, Outpatients, Quality of Life, Early Detection of Cancer, Malnutrition/epidemiology
ABSTRACT
Leptospirosis, caused by pathogenic spirochetes of the genus Leptospira, is a globally significant zoonotic disease that affects humans and animals. In cattle, leptospirosis is associated not only with overt clinical manifestations but also with reproductive diseases, including infertility. This study assesses the potential association between leptospirosis and infertility in Uruguayan beef cattle. A case-control study was conducted in 31 beef herds with no prior history of Leptospira vaccination. In each herd, veterinarians identified 10 non-pregnant cows (cases) and 25 pregnant cows (controls) using ultrasound, and blood and urine samples were collected from each cow. Serological diagnosis was performed using the microscopic agglutination test (MAT), and quantitative PCR (qPCR) was used to assess Leptospira excretion. Additionally, antibodies against bovine viral diarrhea virus (BVDV) and infectious bovine rhinotracheitis (IBR) were tested. The results showed an association between seropositivity to the Sejroe serogroup (cut-off 1:200) and infertility in cattle (OR=1.31; p-value=0.06). Furthermore, the level of Leptospira excretion (qPCR) in urine was associated with increased infertility risk, with cows excreting over 100 copies per mL of urine having the highest odds of infertility (OR=2.34; p-value<0.01). This study suggests a potential association between leptospirosis and infertility in Uruguayan beef cattle, emphasizing the importance of both serological and molecular diagnostics for assessing reproductive health in cattle herds. Future research should explore the impact of Leptospira serogroups on other reproductive disorders in cattle.
Subject(s)
Cattle Diseases, Leptospira, Leptospirosis, Animals, Leptospirosis/veterinary, Leptospirosis/epidemiology, Cattle, Cattle Diseases/microbiology, Cattle Diseases/epidemiology, Cattle Diseases/virology, Female, Case-Control Studies, Uruguay/epidemiology, Leptospira/isolation & purification, Pregnancy, Infertility/veterinary, Infertility/etiology
ABSTRACT
Leptospirosis is a zoonotic disease of worldwide importance. In Uruguay, it is endemic in cattle and primarily affects people with occupational exposure to livestock. The aim of this study was to determine the national seroprevalence of pathogenic Leptospira in dairy cattle and its associated factors. A cross-sectional study was carried out. Herds were stratified by size (1-50, 51-250, and >250 cattle), and up to 60 dairy cows per herd were randomly selected. A total of 4269 serum samples from 101 dairy herds were analyzed by microscopic agglutination test (MAT). A two-stage sampling design was used to estimate the population seroprevalence of Leptospira spp. To determine the factors associated with the disease, herds with at least 1 seropositive animal were considered case herds. Seroprevalence of Leptospira was 27.80% (95% CI [21.06, 34.54]) at the animal level and 86.92% (95% CI [80.00, 93.75]) at the herd level. Serology confirmed the predominance of the Sejroe and Pomona serogroups in our herds, with incidental Leptospira infections present in a smaller proportion but widely distributed at the farm level. Herd size and the purchase of replacement cows were associated with infection at the farm level. The serologic results confirmed that exposure to Leptospira spp. is endemic in our herds and is spreading across dairy herds. Although the movement of purchased females and herd size were associated with the disease, more studies should be conducted to better understand the epidemiology of the disease and to highlight the possible risks to public health, especially for rural workers, farmers, and veterinarians.
Subject(s)
Cattle Diseases, Leptospira, Leptospirosis, Humans, Female, Cattle, Animals, Seroepidemiologic Studies, Cross-Sectional Studies, Uruguay/epidemiology, Leptospirosis/epidemiology, Leptospirosis/veterinary, Risk Factors
ABSTRACT
Large epidemics provide the opportunity to understand the epidemiology of diseases under the specific conditions of the affected population. Whilst foot-and-mouth disease (FMD) epidemics have been extensively studied in developed countries, epidemics in developing countries have been sparsely studied. Here we address this limitation by systematically studying the 2001 epidemic in Uruguay, in which a total of 2,057 farms were affected. The objective of this study was to identify the risk factors (RF) associated with infection and spread of the virus within the country. The epidemic was divided into four periods: (1) the high-risk period (HRP), the period between FMD virus introduction and detection of the index case; (2) the local control measures period (LCM), which encompassed the first control measures implemented before mass vaccination was adopted; (3) the first mass vaccination round; and (4) the second mass vaccination round. A stochastic model was developed to estimate the time of initial infection for each of the affected farms. Our analyses indicated that during the HRP around 242 farms were probably already infected. In this period, a higher probability of infection was associated with: (1) animal movements [OR: 1.57 (95% CI: 1.19-2.06)]; (2) farms that combined livestock with crop production [OR: 1.93 (95% CI: 1.43-2.60)]; (3) large and medium farms compared to small farms (this difference was dependent on regional herd density); and (4) geographical location. Keeping cattle only (vs. farms that also kept sheep) was a significant RF during the subsequent epidemic period (LCM) and, together with large farm size, remained an RF for the entire epidemic. We further explored the RFs associated with FMDV infection on farms that raised cattle by fitting another model to a data subset. We found that dairy farms had a higher probability of FMDV infection than beef farms during the HRP [OR: 1.81 (95% CI: 1.12-2.83)], and this remained an RF until the end of the first round of vaccination. The delay in the detection of the index case, associated with unrestricted animal movements during the HRP, may have contributed to this large epidemic. This study contributes to the knowledge of FMD epidemiology in extensive production systems.
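For readers unfamiliar with how farm-level odds ratios such as those above are typically obtained, the sketch below fits a logistic regression and exponentiates the coefficients. It is a simplified illustration with hypothetical variable names (infected, animal_moves, mixed_production, herd_size_class) and a placeholder input file; the study's actual analysis also involved a stochastic back-calculation of infection times, which is not reproduced here.

# Simplified sketch: farm-level logistic regression giving odds ratios with 95% CIs.
# Variable names and the input file are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

farms = pd.read_csv("farms.csv")  # one row per farm, with a binary 'infected' column

fit = smf.logit("infected ~ animal_moves + mixed_production + C(herd_size_class)",
                data=farms).fit()
or_table = pd.concat([np.exp(fit.params).rename("OR"),
                      np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})], axis=1)
print(or_table)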
ABSTRACT
Petroleum-based oils are widely used as processing aids in rubber composites to improve processability but can adversely affect rubber composite performance and increase carbon footprint. In this research, liquid guayule natural rubber (LGNR), produced from guayule natural rubber, was used as a renewable processing aid to replace naphthenic oil (NO) in Hevea natural rubber, styrene-butadiene rubber (SBR) and guayule natural rubber (GNR) composites. The rheological properties, thermal stability, glass transition temperature, dynamic mechanical properties, aging, and ozone resistance of rubber composites with and without NO or LGNR were compared. Natural and synthetic rubber composites made with LGNR had similar processability to those made with NO, but had improved thermal stability, mechanical properties after aging, and ozone resistance. This was due to the strong LGNR-filler interaction and additional crosslinks formed between LGNR and the rubber matrices. The glass transition temperature of SBR composites was reduced by LGNR because of its increased molecular mobility. Thus, unlike NO, LGNR processing aid can simultaneously improve rubber composite durability, dynamic performance and renewability. The commercialization of LGNR has the potential to open a new sustainable processing-aid market.
ABSTRACT
Fused deposition modeling (FDM) uses lattice arrangements, known as infill, within the fabricated part. The mechanical properties of parts fabricated via FDM depend on these infill patterns, which makes their study highly relevant. One of the advantages of FDM is the wide range of materials that can be employed with this technology. Among these, polylactic acid (PLA)-wood has recently been gaining attention as it has become commercially available. In this work, the stiffness of two different lattice structures fabricated from PLA-wood material using FDM is studied: hexagonal and star. Rectangular samples with four different infill densities made of PLA-wood material were fabricated via FDM. Samples were subjected to 3-point bending to characterize the effective stiffness and their sensitivity to shear deformation. Lattice beams proved to be more sensitive to shear deformations, as including the contribution of shear in the apparent stiffness of these arrangements leads to more accurate results. This was evaluated by comparing the effective Young's modulus characterized from 3-point bending using equations with and without shear inclusion. A longer separation between supports yielded closer results between the two models (~41% for the longest separation tested). The effective stiffness as a function of infill density showed similar trends for both topologies. However, the maximum difference was obtained at low densities, where the hexagonal topology was ~60% stiffer, while the lowest difference was obtained at higher densities, where the star topology was stiffer by ~20%. Results for the stiffness of the PLA-wood samples were scattered. This was attributed to defects at the lattice element level inherent to the material employed in this study, confirmed via micro-characterization.
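For reference, the "equations with and without shear inclusion" mentioned above correspond to the standard three-point bending relations for a simply supported beam under a central load F over a span L; these are textbook forms, not necessarily the exact expressions used in the paper:

\[
\delta_{\mathrm{no\ shear}} = \frac{F L^{3}}{48\,E I},
\qquad
\delta_{\mathrm{with\ shear}} = \frac{F L^{3}}{48\,E I} + \frac{F L}{4\,\kappa G A},
\]

where \(E\) is the effective Young's modulus, \(I\) the second moment of area of the section, \(G\) the shear modulus, \(A\) the cross-sectional area, and \(\kappa\) the shear correction factor. Because the bending term scales with \(L^{3}\) while the shear term scales only with \(L\), the relative shear contribution decreases as the support separation increases, which is consistent with the two characterizations converging at the longest span tested.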
ABSTRACT
INTRODUCTION: Bovine mastitis is the most common disease affecting the dairy industry, with staphylococci considered one of the most significant and prevalent causes. This study aimed to assess the presence of staphylococcal subclinical mastitis (SCM) in Uruguayan dairy farms and to identify Staphylococcus aureus (SA) and non-aureus staphylococci (NAS) in milking cows. In addition, the antibiotic susceptibility of the isolated staphylococci was evaluated. METHODOLOGY: We tested 546 apparently healthy milking cows from 11 farms to detect SCM using the California Mastitis Test (CMT). The cows had not been treated with antibiotics. CMT-positive samples were cultured, and colonies compatible with Staphylococcus spp. were further identified through molecular techniques. The susceptibility of the Staphylococcus spp. isolates to thirteen antibiotics was determined using the disk diffusion method. RESULTS: Subclinical staphylococcal mastitis was present in most (82%) of the farms. SA (n = 39) was more common than NAS (n = 9) in the 48 samples tested. Isolates exhibited resistance to one, two, and even three different antibiotics. Resistance to penicillin was the most frequent among SA (23/39) and NAS (4/9). No staphylococcal isolates exhibited resistance to cefoxitin, vancomycin, trimethoprim-sulfamethoxazole, erythromycin, or clindamycin. CONCLUSIONS: Staphylococcal SCM is one of the most common diseases in Uruguayan dairy farms. SA was the predominant pathogen; however, SA and NAS mastitis coexisted on many farms. The NAS identified had a distribution similar to that reported in other countries. Resistance to penicillin was the highest and most frequent.
Subject(s)
Bovine Mastitis, Staphylococcal Infections, Animals, Anti-Bacterial Agents/pharmacology, Cattle, Farms, Female, Humans, Bovine Mastitis/epidemiology, Milk, Penicillins, Staphylococcal Infections/epidemiology, Staphylococcal Infections/veterinary, Staphylococcus, Staphylococcus aureus
ABSTRACT
Bovine tuberculosis (bTB) prevalence has increased substantially over the past two decades, with a relatively high impact on large dairy herds, raising the concern of regulatory authorities and industry stakeholders and threatening animal and public health. Lack of resources, together with the economic and social consequences of whole-herd stamping-out, makes depopulation an impractical disease control alternative in these herds. The increase in bTB prevalence was associated with demographic and management changes in the dairy industry in Uruguay, reducing the efficacy of the current control programme (i.e. the status quo), based on serial intradermal testing with the caudal fold and comparative cervical tuberculin tests and slaughter of reactors (CFT-CCT). Here, we aimed to assess the epidemiological effectiveness of six alternative control scenarios based on test-and-slaughter of positive animals, using mathematical modelling to infer within-herd bTB dynamics. We simulated six alternative control strategies consisting of testing adult cattle (>1 year) in the herd every 3 months using one test (in vivo or in vitro) or a parallel combination of two tests (CFT, interferon-gamma release assay [IGRA], or enzyme-linked immunosorbent assay [ELISA]). Results showed no significant differences overall in the time needed to reach bTB eradication (median ranging between 61 and 82 months) or official bovine tuberculosis-free status (two consecutive negative herd tests) between any of the alternative strategies and the status quo (median ranging between 50 and 59 months). However, we demonstrate how alternative strategies can significantly reduce bTB prevalence when applied for restricted periods (6, 12 or 24 months) and, in the case of IGRAc (IGRA using peptide-cocktail antigens), without incurring more unnecessary slaughter of animals (false positives) than the status quo in the first 6 months of the programme (p-value < .05). An enhanced understanding of within-herd bTB dynamics under different control strategies helps to identify optimal strategies to ultimately improve bTB control and achieve bTB eradication from dairies in Uruguay and similar endemic settings.
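To illustrate the kind of within-herd dynamics the modelling describes, a minimal discrete-time simulation of test-and-slaughter in a single herd is sketched below. All parameter values (herd size, transmission rate, test sensitivity/specificity, testing interval) are placeholders chosen for illustration, not the fitted or assumed values of the study.

# Minimal sketch of within-herd test-and-slaughter dynamics in monthly time steps.
# All parameters are illustrative placeholders, not the study's fitted values.
import numpy as np

rng = np.random.default_rng(42)
herd_size, infected = 300, 30   # assumed starting herd and number of infected animals
beta = 0.05                     # assumed monthly transmission coefficient
se, sp = 0.80, 0.95             # assumed test sensitivity and specificity
test_interval = 3               # whole-herd test every 3 months

month = 0
while infected > 0 and month < 240:
    month += 1
    susceptible = herd_size - infected
    p_infection = 1 - np.exp(-beta * infected / herd_size)   # frequency-dependent transmission
    infected += rng.binomial(susceptible, p_infection)
    if month % test_interval == 0:
        removed_true = rng.binomial(infected, se)                   # detected and slaughtered
        removed_false = rng.binomial(herd_size - infected, 1 - sp)  # unnecessary slaughter
        infected -= removed_true
        herd_size -= removed_true + removed_false
print(f"Months until no infected animals remained: {month}")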
Subject(s)
Animal Culling, Dairying, Endemic Diseases/veterinary, Bovine Tuberculosis/diagnosis, Bovine Tuberculosis/prevention & control, Animals, Cattle, Interferon-gamma Release Tests/veterinary, Theoretical Models, Prevalence, Tuberculin Test/veterinary, Bovine Tuberculosis/epidemiology, Uruguay
ABSTRACT
High-grade gliomas (HGG) are the most frequent malignant primary brain tumors in adults. The gold standard of clinical care recommends beginning chemoradiation within 6 weeks of surgery. Disparities in access to healthcare in Argentina are notorious, often leading to treatment delays. We conducted this retrospective study to evaluate whether time to chemoradiation after surgery is correlated with progression-free survival (PFS). Our study included clinical cases with a histological diagnosis of glioblastoma (GBM), anaplastic astrocytoma (AA) or high-grade glioma (HGG) in patients over 18 years of age from 2014 to 2020. We collected data on clinical presentation, type of resection, time to surgery, time to chemoradiation, location within the Buenos Aires Metropolitan Area (BAMA), and type of health insurance. We found 63 patients who met our inclusion criteria, including 26 (41.3%) females and 37 (58.7%) males. Their median age was 54 years (range 19-86). Maximal safe resection was achieved in 49.2% (n = 31) of the patients, incomplete resection in 34.9% (n = 22), and the remaining 15.9% (n = 10) received a biopsy but no resection. The type of health insurance was almost evenly divided, with 55.6% (n = 35) of the patients having public vs. 44.4% (n = 28) having private health insurance. Median time to chemoradiation after surgery was 8 weeks (CI 6.68-9.9) for the overall population. When we stratified patients by time to chemoradiation, we found a statistically significant effect of time to chemoradiation on PFS. Patients who received chemoradiation <5 weeks after surgery had a PFS of 10 months (CI 6.89-13.10; p = 0.014), vs. a PFS of 7 months (CI 4.93-9.06) for those treated between 5 and 8 weeks and a PFS of 4 months (CI 3.76-4.26; HR 2.18; p = 0.006) for those treated >8 weeks after surgery. In addition, our univariate and multivariate analyses found that temporal lobe location (p = 0.03), GBM histology (p = 0.02), and biopsy as the surgical intervention (p = 0.02) all had a statistically significant effect on PFS. Thus, time to chemoradiation is an important factor in patient PFS. Our data show that although an increase in HGG severity contributes to a decrease in patient PFS, there is also a large effect of time to chemoradiation. Our results suggest that we can improve patient PFS by making access to healthcare in Buenos Aires more equitable, thereby reducing the average time to chemoradiation following tumor resection.
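As a sketch of how progression-free survival can be compared across time-to-chemoradiation groups, the code below uses the lifelines package for Kaplan-Meier estimates and a Cox model. The file and column names (hgg_cohort.csv, pfs_months, progressed, weeks_to_crt) are hypothetical, and the grouping and covariates only loosely mirror what the abstract reports.

# Sketch of a PFS comparison by time to chemoradiation (hypothetical column names).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

cohort = pd.read_csv("hgg_cohort.csv")  # placeholder cohort file
cohort["crt_group"] = pd.cut(cohort["weeks_to_crt"], bins=[0, 5, 8, float("inf")],
                             labels=["<5 weeks", "5-8 weeks", ">8 weeks"])

# Kaplan-Meier estimate of median PFS per group
kmf = KaplanMeierFitter()
for label, grp in cohort.groupby("crt_group", observed=True):
    kmf.fit(grp["pfs_months"], event_observed=grp["progressed"], label=str(label))
    print(label, "median PFS (months):", kmf.median_survival_time_)

# Cox model with time to chemoradiation as a continuous covariate
cph = CoxPHFitter()
cph.fit(cohort[["pfs_months", "progressed", "weeks_to_crt"]],
        duration_col="pfs_months", event_col="progressed")
cph.print_summary()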
Subject(s)
Brain Neoplasms/pathology, Brain Neoplasms/therapy, Chemoradiotherapy, Glioma/pathology, Glioma/therapy, Adolescent, Adult, Argentina, Female, Humans, Male, Middle Aged, Neoplasm Grading, Progression-Free Survival, Retrospective Studies, Time Factors
ABSTRACT
Bovine brucellosis has been under eradication in Uruguay since 1998. The eradication program includes, among other interventions, individual serum sampling of beef animals at slaughter and annual serum testing of all dairy cows, accounting for two million samples annually. At a herd prevalence of 0.8%, a pooled-sera approach could reduce the economic burden of the surveillance system by reducing testing and operational costs. Our objective was to evaluate the analytic sensitivity of an indirect ELISA test for Brucella abortus in serum pools. Sixty-two Brucella abortus-positive bovine serum samples (based upon the rose bengal and fluorescence polarization assays) were used as positive controls. Rose bengal-negative sera from negative farms were used to dilute the positive samples to the desired concentrations. Positive samples were serially diluted, starting with 1 ml of positive serum and 1 ml of negative serum (1/2 dilution), up to 1/1,024. Data were analyzed using generalized linear mixed models with a binary outcome (positive or negative), dilution number as a fixed effect, and a random effect for sample ID. Analytic sensitivity was 99.0% [95% confidence interval (CI): 96.3-99.7], 98.3% (95% CI: 93.1-99.6), and 97.3% (95% CI: 87.4-99.4) for dilutions 1/2, 1/4, and 1/8, respectively. The analytic sensitivity, however, decreased at higher dilutions. Given the current herd prevalence in Uruguay, it seems plausible that a pooled-sample approach could be adopted by policymakers to reduce the cost of the surveillance program and increase the number of samples being tested.
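A simplified version of the dilution analysis can be sketched as follows: the empirical sensitivity per dilution with a Wilson interval, plus a plain logistic regression of positivity on dilution step. The input file and column names are hypothetical, and the sketch omits the random effect for sample ID that the study's generalized linear mixed model included.

# Simplified sketch of analytic sensitivity by dilution (no random effect for sample ID).
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.proportion import proportion_confint

pools = pd.read_csv("pooled_elisa.csv")  # one row per sample x dilution, with 0/1 'positive'

# empirical sensitivity per dilution step (step k corresponds to a 1/2**k dilution)
for step, grp in pools.groupby("dilution_step"):
    k, n = grp["positive"].sum(), len(grp)
    lo, hi = proportion_confint(k, n, method="wilson")
    print(f"dilution 1/{2 ** step}: {k}/{n} positive ({k / n:.1%}, 95% CI {lo:.1%}-{hi:.1%})")

# fixed-effects logistic regression of positivity on dilution step
print(smf.logit("positive ~ dilution_step", data=pools).fit().summary())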
ABSTRACT
Neosporosis is one of the leading causes of abortion in cattle worldwide, posing a great economic burden on cattle producers. The aim of this study was to determine the national seroprevalence and putative risk factors of Neospora caninum (N. caninum) in dairy cattle in Uruguay. A cross-sectional study was carried out. Herds were stratified by size (1-50, 51-250, and >250 cattle) and up to 60 dairy cows per herd were randomly selected. Four thousand two hundred twenty-three serum samples from 102 dairy herds were analyzed by an indirect ELISA test, following the manufacturer's recommendations. In addition, the herdsman was surveyed and a population study was carried out. The in-degree (number of incoming animal movements), geographical coordinates, and seroprevalence of bovine viral diarrhea, enzootic bovine leukosis, and infectious bovine rhinotracheitis were available for each herd. A sampling design was used to estimate the population seroprevalence of N. caninum. To determine the factors associated with the disease, herds with an intra-herd seroprevalence over 20% were considered case herds. Seroprevalence of N. caninum was 22.3% (95% CI: 18.7-25.9%) and 96.0% (95% CI: 92.1-99.8%) at the animal and herd level, respectively. The number of dogs on the dairy farms was associated with infection levels (OR: 1.43, 95% CI: 1.02 to 2.03). It was concluded that N. caninum is endemic in the country and is spreading across dairy herds. Although this study showed evidence that the number of dogs was associated with high levels of infection, more studies should be conducted to better understand the epidemiology of the disease and thus develop efficient control measures.
Subject(s)
Cattle Diseases/epidemiology, Coccidiosis/veterinary, Dairying, Neospora/isolation & purification, Animals, Cattle, Cattle Diseases/parasitology, Coccidiosis/epidemiology, Coccidiosis/parasitology, Cross-Sectional Studies, Dogs, Female, Prevalence, Risk Factors, Seroepidemiologic Studies, Uruguay/epidemiology
ABSTRACT
Oil prices have shown sharp fluctuations in recent years, which has revived interest in their effect on inflation. In this paper, we discuss the relationship between oil prices and inflation in Spain, at the national and regional levels, distinguishing between energy and non-energy inflation. To this end, we fit econometric models to measure the effect of oil price shocks on inflation and to predict inflation under different scenarios. Our results show that almost half of the volatility of changes in total inflation is explained by changes in the oil price. As could be expected, the energy component of inflation drives this effect. We also find that, under the most likely scenarios, one-year-ahead total inflation will be moderate, with relevant differences across regions.
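The abstract does not spell out the econometric specification, so the sketch below only illustrates the general idea: regress changes in headline inflation on oil-price changes and use the fitted model for a simple scenario projection. File and column names are hypothetical, and the paper's actual models (national vs. regional, energy vs. non-energy components) are richer than this.

# Minimal sketch: relate monthly inflation to oil-price changes and project a scenario.
# File and column names are hypothetical; the paper's specification is richer.
import pandas as pd
import statsmodels.formula.api as smf

series = pd.read_csv("spain_cpi_oil.csv", parse_dates=["date"])
series["d_oil"] = series["brent_eur"].pct_change() * 100     # % change in oil price
series["d_total"] = series["total_cpi"].pct_change() * 100   # headline inflation, % per month

fit = smf.ols("d_total ~ d_oil", data=series.dropna()).fit()
print(fit.summary())
print("Share of inflation variance explained by oil-price changes:", round(fit.rsquared, 2))

# scenario: oil price 10% above the baseline in the next month
print("Predicted monthly change in total inflation:",
      float(fit.predict(pd.DataFrame({"d_oil": [10.0]})).iloc[0]))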
ABSTRACT
The accuracy of new or alternative diagnostic tests is typically estimated in relation to a well-standardized reference test referred to as a gold standard. However, for bovine tuberculosis (bTB), a chronic disease of cattle affecting animal and public health, no reliable gold standard is available. In this context, latent-class models implemented using a Bayesian approach can help to assess the accuracy of diagnostic tests, incorporating previous knowledge on test performance and disease prevalence. In Uruguay, bTB prevalence has increased in the past decades, partially because of the limited accuracy of the diagnostic strategy in place, based on intradermal testing (caudal fold test, CFT, for screening and comparative cervical test, CCT, for confirmation) and slaughter of reactors. Here, in the absence of a gold standard, we evaluated the performance of two alternative bTB diagnostic tools that had never been used in Uruguay: the interferon-gamma release assay (IGRA) and the enzyme-linked immunosorbent assay (ELISA). To do so, animals from two heavily infected dairy herds that had been tested with CFT-CCT were also analyzed with the IGRA using two antigens (study 1) and with the ELISA (study 2). The accuracy of the IGRA and ELISA was assessed by fitting two latent-class models: a two-test, one-population model (LCA-a) based on the analysis of the CFT/CFT-CCT results and one in-vitro test (IGRA/ELISA), and a one-test, one-population model (LCA-b) using the IGRA or ELISA information, in which the prevalence was modeled using information from the skin tests. Posterior estimates for model LCA-a suggested that the IGRA was as sensitive (75-78%) as the CFT and more sensitive than the serial use of CFT-CCT. Its specificity (90-96%) was superior to that of the CFT and equivalent to that of CFT-CCT. Estimates from the LCA-b models consistently yielded lower posterior Se estimates for the IGRA but similar results for its Sp. Estimates for the Se (52%, 95% PPI: 44.41-71.28) and Sp (92%, 95% PPI: 78.63-98.76) of the ELISA were, however, similar regardless of the model used. These results suggest that the incorporation of the IGRA for detection of bTB in highly infected herds could be a useful tool to improve the sensitivity of bTB control in Uruguay.
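For readers unfamiliar with the two-test, one-population latent-class setup, the usual Hui-Walter-type parameterisation (under conditional independence of the two tests given true infection status) is shown below; this is the generic formulation, not necessarily the exact priors or dependence structure used in the paper:

\[
\begin{aligned}
p_{++} &= \pi\,Se_1 Se_2 + (1-\pi)(1-Sp_1)(1-Sp_2),\\
p_{+-} &= \pi\,Se_1 (1-Se_2) + (1-\pi)(1-Sp_1)\,Sp_2,\\
p_{-+} &= \pi\,(1-Se_1)\,Se_2 + (1-\pi)\,Sp_1 (1-Sp_2),\\
p_{--} &= \pi\,(1-Se_1)(1-Se_2) + (1-\pi)\,Sp_1\,Sp_2,
\end{aligned}
\]

with the observed cross-classified counts \((n_{++}, n_{+-}, n_{-+}, n_{--}) \sim \mathrm{Multinomial}(n, \mathbf{p})\), prevalence \(\pi\), and Beta priors on \(\pi\), \(Se_i\), and \(Sp_i\) encoding prior knowledge of test performance and disease frequency.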
ABSTRACT
The anterior approach to cervical spine surgery can cause esophageal injuries; however, this is an infrequent complication, with a prevalence of 0.02-0.25%. Injuries usually appear in two high-risk areas: Killian's dehiscence and the thyrohyoid membrane. Delayed esophageal perforations typically occur due to chronic friction and usually have a benign course. Most cases of late migration occur within the first 18 months after the surgical procedure, and the clinical presentation ranges from asymptomatic patients in the case of delayed perforations to dysphagia, subcutaneous emphysema, and sepsis in the case of acute perforations.
ABSTRACT
Oxidative stress is suggested to play an important role in several pathophysiological conditions. A recent study showed that decreasing the concentration of 5-oxoproline (pyroglutamate), an important mediator of oxidative stress, by over-expressing 5-oxoprolinase improves cardiac function post-myocardial infarction in mice. The aim of the current study is to gain a better understanding of the role of the glutathione cycle in a mouse model of myocardial infarction by establishing quantitative relationships between key components of this cycle. We developed and validated an LC-MS method to quantify 5-oxoproline, L-glutamate, reduced glutathione (GSH), and oxidized glutathione (GSSG) in different biological samples (heart, kidney, liver, plasma, and urine) of mice with and without myocardial infarction. The 5-oxoproline concentration was elevated in all biological samples from mice with myocardial infarction. The GSH/GSSG ratio was significantly decreased in cardiac tissue, but not in the other tissues/body fluids. This emphasizes the role of 5-oxoproline as an inducer of oxidative stress related to myocardial infarction and as a possible biomarker. An increase in the level of 5-oxoproline is associated with a decrease in the GSH/GSSG ratio, a well-established marker of oxidative stress, in cardiac tissue post-myocardial infarction. This suggests that 5-oxoproline may serve as an easily measurable marker of oxidative stress resulting from cardiac injury. Our findings further show that the liver and kidneys have more capacity to cope with oxidative stress conditions than the heart, since the GSH/GSSG ratio is not affected in these organs despite a significant increase in 5-oxoproline.