Results 1 - 20 of 149
1.
Plant Biol (Stuttg) ; 26(2): 245-256, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38196283

ABSTRACT

This study was designed to elucidate the physiological responses of three Lotus forage accessions to alkaline stress, and the influence of inoculation with an endophytic Pantoea eucalypti strain on alkaline stress mitigation. A diploid L. corniculatus (Lc) accession, L. tenuis (Lt), and the interspecific hybrid Lt × Lc obtained from these two parental lines were exposed to alkaline stress (pH 8.2). Both Lt and the Lt × Lc hybrid are alkaline-tolerant compared to Lc, based on the observations that their dry mass was not reduced under stress and their leaf blades showed no chlorosis symptoms. In all three Lotus accessions, the Fe2+ concentration under stress decreased in aerial parts and simultaneously increased in roots. Inoculation with P. eucalypti considerably increased Fe2+ content in shoots of all three Lotus forage species under alkaline treatment. The photochemical efficiency of PSII was affected only in the Lc accession when exposed to alkaline treatment; however, when cultivated under alkalinity with inoculation, these plants recovered and had photosynthetic parameters equivalent to those in the control treatment. Together, the results highlight the importance of inoculation with P. eucalypti, which contributes significantly to mitigating alkaline stress. All results provide useful information for improving alkaline tolerance traits of Lotus forage species and their interspecific hybrids.


Subject(s)
Lotus; Pantoea; Lotus/physiology; Hybridization, Genetic; Photosynthesis
2.
Arch. Soc. Esp. Oftalmol ; 98(2): 83-97, feb. 2023. tab
Article in Spanish | IBECS | ID: ibc-215176

ABSTRACT



Objective To identify the ocular pathologies that are reported as causes of low vision in children. Material and methods The systematic search was carried out in Medline (PubMed), Embase and Lilacs. Observational studies with populations between 0-18 years of age, reporting visual acuity data between 20/60-20/400 and reporting the frequency of ocular pathologies were selected. Studies in which the diagnosis of the condition had not been verified by a professional, or which covered only cases of blindness, uncorrected refractive errors, or amblyopia, were excluded. The methodological quality of the articles was evaluated using the Joanna Briggs Institute instrument for prevalence studies. Results 27 studies conducted in Asia (13 publications), Africa (6 studies), Oceania (4 studies), and Europe and South America (2 studies each) were included. The most reported causes of low vision were: cataract, with prevalence between 0.8% and 27.2%; albinism, from 1.1% to 47%; nystagmus, with prevalence between 1.3% and 22%; retinal dystrophies, between 3.5% and 50%; retinopathy of prematurity (ROP), with prevalence between 1.1% and 65.8%; optic atrophy, between 0.2% and 17.6%; and glaucoma, from 2.4% to 18.1%. Conclusions Cataract, albinism and nystagmus are the ocular pathologies most often mentioned by studies as causes of low vision in children, along with retinal diseases such as ROP and optic nerve diseases such as atrophy. However, numerous eye conditions can result in low vision in the pediatric population. (AU)


Subject(s)
Infant, Newborn; Infant; Child, Preschool; Child; Adolescent; Eye Diseases/complications; Vision, Low/etiology; Prevalence
3.
Arch Soc Esp Oftalmol (Engl Ed) ; 98(2): 83-97, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36068132

ABSTRACT

OBJECTIVE: To identify the ocular pathologies that are reported as causes of low vision in children. MATERIAL AND METHODS: The systematic search was carried out in Medline (PubMed), Embase and Lilacs. Observational studies with populations between 0-18 years of age, reporting visual acuity data between 20/60-20/400 and reporting the frequency of ocular pathologies were selected. Studies in which the diagnosis of the condition had not been verified by a professional, or which covered only cases of blindness, uncorrected refractive errors, or amblyopia, were excluded. The methodological quality of the articles was evaluated using the Joanna Briggs Institute instrument for prevalence studies. RESULTS: 27 studies conducted in Asia (13 publications), Africa (6 studies), Oceania (4 studies), and Europe and South America (2 studies each) were included. The most reported causes of low vision were: cataract, with prevalence between 0.8% and 27.2%; albinism, from 1.1% to 47%; nystagmus, with prevalence between 1.3% and 22%; retinal dystrophies, between 3.5% and 50%; retinopathy of prematurity (ROP), with prevalence between 1.1% and 65.8%; optic atrophy, between 0.2% and 17.6%; and glaucoma, from 2.4% to 18.1%. CONCLUSIONS: Cataract, albinism and nystagmus are the ocular pathologies most often mentioned by studies as causes of low vision in children, along with retinal diseases such as ROP and optic nerve diseases such as atrophy. However, numerous eye conditions can result in low vision in the pediatric population.


Subject(s)
Cataract; Glaucoma; Nystagmus, Pathologic; Retinopathy of Prematurity; Vision, Low; Infant, Newborn; Humans; Child; Vision, Low/etiology; Vision, Low/complications; Blindness/etiology; Glaucoma/complications; Cataract/complications; Retinopathy of Prematurity/complications
5.
Plant Biol (Stuttg) ; 23(2): 363-374, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33190297

ABSTRACT

Waterlogging and salinity impair crop growth and productivity worldwide, with their combined effects being larger than the additive effects of the two stresses separately. Here, a common forage tetraploid Lotus corniculatus (cv. San Gabriel) and a diploid L. corniculatus accession, collected from a coastal area with a high frequency of waterlogging-saline stress events, were evaluated for tolerance to waterlogging, salinity and the two stresses combined. We hypothesized that, due to its environmental niche, the diploid accession would show better adaptation to combined waterlogging-saline stress than the tetraploid L. corniculatus. Plants were evaluated under control conditions, waterlogging, salinity and a combined waterlogging-saline treatment for 33 days. Shoot and root growth were assessed, together with chlorophyll fluorescence and gas exchange measurements. Results showed that salinity and waterlogging effects were more severe for the tetraploid accession, with a larger effect observed under the combined stress condition. Concentrations of Na+, Cl- and K+ were measured in apical and basal leaves, and in roots. A larger accumulation of Na+ and Cl- was observed under both the saline and the combined stress treatments for the tetraploid L. corniculatus, for which ion toxicity effects were evident. The expression of the CLC gene, coding for a Cl- transporter, was increased only in diploid L. corniculatus plants in response to the combined stress condition, suggesting that ion compartmentalization mechanisms were induced in this accession. Thus, this recently characterized diploid L. corniculatus accession could be used for the introduction of new tolerance traits into other Lotus species used as forage.


Subject(s)
Lotus; Sodium Chloride; Stress, Physiological; Lotus/drug effects; Lotus/genetics; Plant Leaves/drug effects; Plant Roots/drug effects; Salinity; Sodium Chloride/toxicity; Stress, Physiological/genetics; Water/pharmacology
7.
Public Health ; 149: 49-56, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28551470

ABSTRACT

OBJECTIVE: Despite the harmful effects of cigarette smoking, this habit among asthmatic adolescents continues to be a health problem worldwide. Our objectives were to determine the epidemiological profile of smoking and the degree of nicotine dependence among asthmatic adolescents. STUDY DESIGN: Through a cross-sectional investigation, 3383 adolescents (13-19 years of age) were studied. METHODS: Information was collected using a previously validated questionnaire. Two study groups of adolescent smokers were formed: one composed of asthmatic adolescents and the other of healthy youths. RESULTS: Asthmatic adolescents were found to be more likely to smoke (21.6% vs 11.8%) and to have some degree of nicotine dependence compared with healthy adolescents (51.6% vs 48.8%). The most important characteristic of smoking in asthmatic adolescents was an onset before 11 years of age due to curiosity about cigarettes. Asthmatic youths continue smoking because this habit decreases their anxiety and stress. These adolescents know that smoking is addictive and often smoke on waking up in the morning or when they are sick, yet they do not consider smoking to be a problem. CONCLUSION: In this study, curiosity about cigarettes was the primary reason why asthmatic adolescents smoked for the first time and developed a greater dependence on nicotine compared with healthy adolescents. Moreover, the findings show that many of the factors that favour the development of smoking are preventable, given that they are present in the family and social environment.


Subject(s)
Asthma/epidemiology; Behavior, Addictive; Smoking/epidemiology; Tobacco Use Disorder/epidemiology; Adolescent; Cross-Sectional Studies; Female; Humans; Male; Smoking/psychology; Surveys and Questionnaires; Tobacco Use Disorder/psychology; Young Adult
8.
Asian-Australas J Anim Sci ; 29(5): 666-73, 2016 May.
Article in English | MEDLINE | ID: mdl-26954168

ABSTRACT

Two experiments were conducted to evaluate the effects of the level of corn dry distillers grains with solubles (CDDGS) supplementation on growing performance, blood metabolites, digestion characteristics and ruminal fermentation patterns in steers grazing dormant forage. In Exp. 1 (growth performance), 120 steers (204±5 kg initial body weight [BW]) were distributed randomly into 3 groups (each of 40 steers), which were provided with the following levels of CDDGS supplement: 0%, 0.25%, or 0.50% of BW. All groups of steers were grazed for 30 days in each of 3 grazing periods (March, April, and May). Approximately 1,000 ha of land was divided with electric fencing into 3 equally sized pastures (333 ha each). Blood samples were collected monthly from 20 steers in each grazing group for analysis of glucose (G), urea-nitrogen (UN) and non-esterified fatty acids. Final BW, average daily gain (ADG) and supplement conversion (CDDGS-C) increased with increasing levels of CDDGS supplementation (p<0.05). The CDDGS supplementation also increased the plasma G and UN concentrations (p<0.05). In Exp. 2 (digestive metabolism), 9 ruminally cannulated steers (BW = 350±3 kg) were distributed, following a completely randomized design, into groups of three in each pasture. The ruminally cannulated steers were provided the same levels of CDDGS supplementation as in the growing performance study (0%, 0.25%, and 0.50% of BW), and they grazed along with the other 40 steers throughout the grazing periods. Dry matter intake, crude protein intake, neutral detergent fiber intake (NDFI), and the apparent digestibility of dry matter (ADDM), crude protein (ADCP) and neutral detergent fiber (ADNDF) increased with increasing levels of CDDGS supplementation (p<0.05). The ruminal degradation rates of CP (kdCP) and NDF (kdNDF) and the passage rate (kp) also increased with increasing levels of CDDGS supplementation (p<0.05). Ruminal ammonia nitrogen (NH3-N) and propionate concentrations also increased with increasing levels of CDDGS supplementation (p<0.05). However, acetate concentrations decreased with increasing levels of CDDGS supplementation (p<0.05). The liquid dilution rate increased with increasing levels of CDDGS supplementation, but ruminal liquid volume decreased (p<0.05). On the basis of these findings, we conclude that CDDGS supplementation enhanced the productive performance of cattle grazing native rangeland without negatively affecting forage intake, blood glucose and urea-nitrogen concentrations, ruminal degradation or ruminal fermentation patterns.

9.
Plant Biol (Stuttg) ; 18(4): 703-9, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27007305

ABSTRACT

A common stress on plants is NaCl-derived soil salinity. The genus Lotus comprises model and economically important species that have been studied regarding physiological responses to salinity. Leaf area ratio (LAR), root length ratio (RLR) and their components (specific leaf area, SLA; leaf mass fraction, LMF; specific root length, SRL; and root mass fraction, RMF) might be affected by high soil salinity. We characterised L. tenuis, L. corniculatus, L. filicaulis, L. creticus, L. burtii and L. japonicus grown under different salt concentrations (0, 50, 100 and 150 mM NaCl) on the basis of SLA, LMF, SRL and RMF using PCA. We also assessed the effects of the different salt concentrations on LAR and RLR in each species, and explored whether changes in these traits provide a fitness benefit. Salinity (150 mM NaCl) increased LAR in L. burtii and L. corniculatus, but not in the remaining species. The highest salt concentration caused a decrease in RLR in L. japonicus Gifu, but not in the remaining species. According to the adaptiveness analysis, changes in LAR and RLR would not be adaptive, with the exception of SLA changes in L. corniculatus. PCA revealed that under favourable conditions plants optimise surfaces for light and nutrient acquisition (SLA and SRL), whereas at higher salt concentrations they favour carbon allocation to leaves and roots (LMF and RMF) to the detriment of their surfaces. PCA also showed that L. creticus subjected to the saline treatment was distinguished from the remaining Lotus species. We suggest that augmented carbon partitioning to leaves and roots could constitute a salt-alleviating mechanism through toxic ion dilution.


Subject(s)
Lotus/drug effects; Sodium Chloride/pharmacology; Biomass; Carbon/metabolism; Light; Lotus/physiology; Lotus/radiation effects; Phenotype; Plant Leaves/drug effects; Plant Leaves/physiology; Plant Leaves/radiation effects; Plant Roots/drug effects; Plant Roots/physiology; Plant Roots/radiation effects; Plant Stems/drug effects; Plant Stems/physiology; Plant Stems/radiation effects; Salinity; Salt Tolerance; Soil/chemistry; Stress, Physiological
10.
Rev. calid. asist ; 30(4): 166-174, jul.-ago. 2015. tab, ilus
Article in Spanish | IBECS | ID: ibc-137603

ABSTRACT



Objectives. To identify and characterize adverse events (AE) in an Internal Medicine Department of a district hospital using an extension of the Global Trigger Tool (GTT), analyzing the diagnostic validity of the tool. Methods. An observational, analytical, descriptive and retrospective study was conducted on 2013 clinical charts from an Internal Medicine Department in order to detect AE through the identification of 'triggers' (an event often related to an AE). The 'triggers' and AE were located by systematic review of clinical documentation. The AE were characterized after they were identified. Results. A total of 149 AE were detected in 291 clinical charts during 2013, of which 75.3% were detected directly by the tool, while the rest were not associated with a trigger. The percentage of charts that had at least one AE was 35.4%. The most frequent AE found was pressure ulcer (12%), followed by delirium, constipation, nosocomial respiratory infection and altered level of consciousness caused by drugs. Almost half (47.6%) of the AE were related to drug use, and 32.2% of all AE were considered preventable. The tool demonstrated a sensitivity of 91.3% (95% CI: 88.9-93.2) and a specificity of 32.5% (95% CI: 29.9-35.1). It had a positive predictive value of 42.5% (95% CI: 40.1-45.1) and a negative predictive value of 87.1% (95% CI: 83.8-89.9). Conclusions. The tool used in this study is valid, useful and reproducible for the detection of AE. It also serves to determine rates of injury and to observe their progression over time. A high frequency of both AE and preventable events was observed in this study (AU)


Subject(s)
Adult; Female; Humans; Male; Internal Medicine/ethics; Internal Medicine/organization & administration; Internal Medicine/standards; Hospitalization/legislation & jurisprudence; Hospitalization/statistics & numerical data; Patient Safety/legislation & jurisprudence; Patient Safety/standards; Patient Discharge/standards; Patient Discharge/trends; Cross Infection/epidemiology; Cross Infection/prevention & control; Constipation/complications; Constipation/epidemiology; Predictive Value of Tests; Retrospective Studies
11.
Rev Calid Asist ; 30(4): 166-74, 2015.
Article in Spanish | MEDLINE | ID: mdl-26025386

ABSTRACT

OBJECTIVES: To identify and characterize adverse events (AE) in an Internal Medicine Department of a district hospital using an extension of the Global Trigger Tool (GTT), analyzing the diagnostic validity of the tool. METHODS: An observational, analytical, descriptive and retrospective study was conducted on 2013 clinical charts from an Internal Medicine Department in order to detect AE through the identification of 'triggers' (an event often related to an AE). The 'triggers' and AE were located by systematic review of clinical documentation. The AE were characterized after they were identified. RESULTS: A total of 149 AE were detected in 291 clinical charts during 2013, of which 75.3% were detected directly by the tool, while the rest were not associated with a trigger. The percentage of charts that had at least one AE was 35.4%. The most frequent AE found was pressure ulcer (12%), followed by delirium, constipation, nosocomial respiratory infection and altered level of consciousness caused by drugs. Almost half (47.6%) of the AE were related to drug use, and 32.2% of all AE were considered preventable. The tool demonstrated a sensitivity of 91.3% (95%CI: 88.9-93.2) and a specificity of 32.5% (95%CI: 29.9-35.1). It had a positive predictive value of 42.5% (95%CI: 40.1-45.1) and a negative predictive value of 87.1% (95%CI: 83.8-89.9). CONCLUSIONS: The tool used in this study is valid, useful and reproducible for the detection of AE. It also serves to determine rates of injury and to observe their progression over time. A high frequency of both AE and preventable events was observed in this study.
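As a numerical cross-check (not part of the original study), the reported predictive values follow from the sensitivity, specificity, and per-chart AE prevalence quoted in the abstract via Bayes' rule. A minimal sketch, with the function name chosen for illustration:

```python
# Cross-check of the GTT abstract's figures: PPV and NPV implied by the
# reported sensitivity (91.3%), specificity (32.5%) and per-chart AE
# prevalence (35.4%), computed with Bayes' rule.
def predictive_values(sens, spec, prev):
    """Return (PPV, NPV) given sensitivity, specificity and prevalence."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

ppv, npv = predictive_values(sens=0.913, spec=0.325, prev=0.354)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # close to the reported 42.5% and 87.1%
```

The computed values land within rounding of the abstract's reported PPV (42.5%) and NPV (87.1%), confirming that the four metrics are internally consistent.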


Subject(s)
Risk Management/organization & administration; Adult; Consciousness Disorders/chemically induced; Consciousness Disorders/epidemiology; Constipation/epidemiology; Cross Infection/epidemiology; Delirium/epidemiology; Hospital Departments/organization & administration; Hospitals, District/organization & administration; Humans; Internal Medicine/organization & administration; Medical Errors; Medication Errors; Predictive Value of Tests; Pressure Ulcer/epidemiology; Retrospective Studies; Sampling Studies; Sensitivity and Specificity
12.
Rev. MVZ Córdoba ; 20(1): 4461-4471, ene.-abr. 2015. ilus, tab
Article in English | LILACS | ID: biblio-957302

ABSTRACT

Objective. To develop a Quality Index Method (QIM) for gutted and ungutted red tilapia from aquaculture ponds. Materials and methods. 40 specimens of gutted red tilapia and 40 ungutted ones were placed in expanded polyethylene boxes between layers of ice and stored at 4°C. Three fish were randomly sampled on days 0, 3, 5, 8, 11, 14 and 17 for gutted tilapia, and on days 0, 3, 6, 9, 11, 14 and 16 for ungutted tilapia. A sensory panel of 8 experts was formed to evaluate the product. Using the average scores of the sensory attributes proposed in the method for the three samples from each day, the quality index for gutted and ungutted red tilapia was obtained as a function of storage time on ice. Results. The Quality Index Method schemes obtained for gutted and ungutted red tilapia showed maximum values of 21 and 29 points, respectively, and fitted an increasing linear model with a high correlation between the Quality Index and storage time on ice. Conclusions. The developed schemes are useful to determine deterioration levels and to define storage and consumption times. The panel rejected gutted red tilapia after 8-11 days of storage, whereas ungutted red tilapia was rejected after 6-9 days.



13.
J Food Prot ; 77(9): 1588-92, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25198852

ABSTRACT

The minimal effective dose of sodium chlorate as an intervention to reduce the carriage of pathogenic bacteria in food-producing animals has not been clearly established. The effect of low-level oral chlorate administration to ewes was assessed by comparing the diversity of prominent bacterial populations in their gastrointestinal tract. Twelve lactating crossed Pelibuey and Blackbelly-Dorper ewes (average body weight, 65 kg) were randomly assigned (four per treatment) to receive a control treatment (TC; consisting of 3 g of NaCl per animal per day) or one of two chlorate treatments (T3 or T9; consisting of 1.8 or 5.4 g of NaClO3 per animal per day, respectively). Treatments were administered twice daily via oral gavage for 5 days. Ruminal and fecal samples were collected daily, starting 3 days before and ending 6 days after treatment, and were subjected to denaturing gradient gel electrophoresis of the 16S rRNA gene sequence amplified from total population DNA. For ruminal microbes, percent similarity coefficients (SCs) between groups varied from 23.0 to 67.5% and from 39.4 to 43.3% during pretreatment and treatment periods, respectively. During the treatment period, SCs within groups ranged from 39.4 to 90.3%, 43.3 to 86.7%, and 67.5 to 92.4% for TC, T3, and T9, respectively. For fecal microbes, SCs between groups varied from 38.0 to 85.2% and 38.0 to 94.2% during pretreatment and treatment periods, respectively. SCs for fecal populations during treatment were most varied for TC (38.0 to 67.9%), intermediate for T9 (75.6 to 92.0%), and least varied for T3 (80.6 to 90.6%). Heterogeneity within and between groups provided no evidence of an effect of chlorate treatment on ruminal or fecal microbial populations.


Subject(s)
Bacteria/isolation & purification; Biodiversity; Chlorates/pharmacology; Feces/microbiology; Rumen/microbiology; Animals; Bacteria/classification; Bacteria/drug effects; Bacteria/genetics; Female; Humans; Lactation; Rumen/drug effects; Rumen/physiology; Sheep
14.
Plant Biol (Stuttg) ; 16(6): 1042-9, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24597843

ABSTRACT

Saline, alkaline and mixed saline-alkaline conditions frequently co-occur in soil. In this work, we compared these plant stress sources on the legume Lotus tenuis, regarding their effects on shoot growth and leaf and stem anatomy. In addition, we aimed to gain insight on the physiological status of stressed plants. We performed pot experiments with four treatments: control without salt (pH = 5.8; EC = 1.2 dS·m(-1)) and three stress conditions, saline (100 mM NaCl, pH = 5.8; EC = 11.0 dS·m(-1)), alkaline (10 mM NaHCO3, pH = 8.0, EC = 1.9 dS·m(-1)) and mixed salt-alkaline (10 mM NaHCO3 + 100 mM NaCl, pH = 8.0, EC = 11.0 dS·m(-1)). Neutral and alkaline salts produced a similar level of growth inhibition on L. tenuis shoots, whereas their mixture exacerbated their detrimental effects. Our results showed that none of the analysed morpho-anatomical parameters categorically differentiated one stress from the other. However, NaCl- and NaHCO3-derived stress could be discriminated to different extents and/or directions of changes in some of the anatomical traits. For example, alkalinity led to increased stomatal opening, unlike NaCl-treated plants, where a reduction in stomatal aperture was observed. Similarly, plants from the mixed saline-alkaline treatment characteristically lacked palisade mesophyll in their leaves. The stem cross-section and vessel areas, as well as the number of vascular bundles in the sectioned stem, were reduced in all treatments. A rise in the number of vessel elements in the xylem was recorded in NaCl-treated plants, but not in those treated exclusively with NaHCO3.


Subject(s)
Lotus/drug effects; Lotus/physiology; Salinity; Sodium Chloride/toxicity; Stress, Physiological/drug effects; Lotus/anatomy & histology; Osmotic Pressure; Plant Epidermis/anatomy & histology; Plant Epidermis/drug effects; Plant Leaves/chemistry; Plant Leaves/metabolism; Plant Stems/anatomy & histology; Plant Stems/drug effects; Plant Transpiration; Proline/metabolism
16.
Plant Dis ; 98(7): 1018, 2014 Jul.
Article in English | MEDLINE | ID: mdl-30708902

ABSTRACT

Seashore paspalum (Paspalum vaginatum Swartz) is a warm-season perennial turfgrass commonly used for golf courses grown in saline environments or irrigated with saline water. However, seashore paspalum is also grown in non-saline conditions due to its low fertilizer and water requirements (2). In Barbados, on a newly constructed golf course, seashore paspalum 'Sea Isle Supreme' sprigs were imported from Georgia (United States) and were planted over 2006 and 2007 on greens, tees, fairways, and rough. Golf greens were constructed following the United States Golf Association Green Section (Far Hills, NJ) putting green guidelines. Tees and fairways were constructed using native soil. Two years after the grow-in, the putting greens began to exhibit irregular chlorotic patches, followed by gradual thinning and decline of turfgrass stand density in those areas. Additionally, turfgrass roots sampled from those symptomatic patches appeared to be abbreviated compared to non-symptomatic areas of the greens. A survey was conducted in May 2013 to determine whether plant-parasitic nematodes were present coinciding with the observed symptoms, which were similar to those described in a previous report (3). Two samples were collected from each green, with a total of four greens sampled. Each sample consisted of 20 soil cores (15 cm depth × 1.2 cm diameter) from either areas of the greens showing symptoms or from non-symptomatic areas. Nematodes were extracted from 100 cm3 soil samples using a modified centrifugal-sugar flotation technique (4). No plant-parasitic nematodes were present in any of the samples from the non-symptomatic areas. Three genera of plant-parasitic nematodes were found in all the samples from the symptomatic areas: Helicotylenchus, Mesocriconema, and Pratylenchus. Nematode populations of these genera averaged 30, 60, and 200 nematodes per 100 cm3, respectively. 
Populations of the genera Helicotylenchus and Mesocriconema were below the action threshold levels for seashore paspalum used by the University of Florida Nematode Assay Laboratory (1). Currently, no threshold exists for Pratylenchus on seashore paspalum. Nevertheless, the genera Helicotylenchus, Mesocriconema, and Pratylenchus were found associated with the irregular chlorotic patches but not with the non-symptomatic areas. To our knowledge, this is the first report of plant-parasitic nematodes associated with seashore paspalum maintained as putting greens in Barbados. References: (1) W. T. Crow. Nematode management for golf courses in Florida. EDIS. Accessed 31 July 2013 from: http://edis.ifas.ufl.edu/in124 , 2001. (2) R. R. Duncan and R. N. Carrow. Seashore Paspalum: The Environmental Turfgrass. John Wiley & Sons, Inc., Hoboken, New Jersey, 2000. (3) A. C. Hixson and W. T. Crow. Plant Dis. 88:680, 2004. (4) W. R. Jenkins. Plant Dis. Rep. 48:692, 1964.

17.
Rev Esp Anestesiol Reanim ; 60(4): 215-25, 2013 Apr.
Article in Spanish | MEDLINE | ID: mdl-23141206

ABSTRACT

Central venous catheter-related infections can lead to a substantial increase in patient morbidity and mortality. Nowadays, the increase in multi-resistant bacteria, the recent appearance of new antibiotics, and the development of new treatment guidelines mean that this subject has to be constantly reviewed. The objective of this review is to briefly define the epidemiological and pathogenic concepts and to examine in detail the preventive and therapeutic measures for this type of infection. Practical aspects of different clinical situations are presented, such as antibiotic lock therapy of the central venous catheter and the withdrawal or maintenance of the catheter.


Subject(s)
Catheter-Related Infections , Anti-Bacterial Agents/therapeutic use , Catheter-Related Infections/diagnosis , Catheter-Related Infections/therapy , Humans
18.
Rev. MVZ Córdoba ; 17(2): 3053-3058, mayo-ago. 2012. ilus, graf, tab
Article in Spanish | LILACS, COLNAL | ID: lil-657102

ABSTRACT



Objective. The measurement of cholinesterase (ChE) activity is a rapid and inexpensive test used in the diagnosis of intoxications by organophosphorus and carbamate insecticides. Because laboratory interpretation requires reference values for each species, the present study aimed to establish normal ChE activities in the blood, brain, and retina of several species of domestic animals using the Ellman method. Materials and methods. Brains and eyeballs were obtained from the central slaughterhouse of Medellín, while blood samples came from animals referred to the clinical diagnostic laboratory of the University of Antioquia. Results. The means (±SD) of blood ChE activity, expressed as µmoles of acetylthiocholine iodide hydrolyzed/min/mL, were 2.4±0.2, 1.5±0.3, 1.9±0.3, and 2.5±0.2 for canines, felines, equines, and bovines, respectively. In the brain, ChE activity (µmol/min/g wet weight) was 4.0±0.4, 5.4±0.3, and 4.9±0.3 in bovines, porcines, and canines, respectively. The bovine retina showed an activity of 21.7±2.45 µmol/min/g. Conclusions. The values obtained agree closely with those reported by laboratories accredited by the American Association of Veterinary Laboratory Diagnosticians (AAVLD), confirming the good reproducibility of the technique and validating its use to support the diagnosis of intoxications by cholinesterase-inhibiting insecticides.
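In the Ellman method referenced above, ChE activity is derived from the rate of absorbance increase at 412 nm via the Beer-Lambert law. The sketch below is not the authors' exact protocol: it assumes the classic TNB extinction coefficient of 13,600 M^-1 cm^-1, a 1 cm light path, and illustrative reaction and sample volumes.

```python
# Sketch (not the study's exact protocol): converting an Ellman-assay
# absorbance slope into ChE activity in umol of substrate hydrolyzed/min/mL.
# Assumes the classic TNB extinction coefficient (Ellman, 1961) and a
# 1 cm cuvette path; the volumes in the example are illustrative.

EPSILON_TNB = 13_600.0   # M^-1 cm^-1 at 412 nm (classic value)
PATH_CM = 1.0            # cuvette light path, cm

def che_activity(delta_a_per_min: float,
                 reaction_volume_ml: float,
                 sample_volume_ml: float) -> float:
    """Return activity in umol/min per mL of sample.

    delta_a_per_min : rate of A412 increase (absorbance units/min)
    """
    # Beer-Lambert: rate of TNB formation in mol/L/min in the cuvette
    molar_rate = delta_a_per_min / (EPSILON_TNB * PATH_CM)
    # Convert to umol/min in the reaction volume, then per mL of sample
    umol_per_min = molar_rate * 1e6 * (reaction_volume_ml / 1000.0)
    return umol_per_min / sample_volume_ml

# Illustrative example: dA/min = 0.25 in a 3 mL reaction with 0.02 mL blood
print(f"{che_activity(0.25, 3.0, 0.02):.2f} umol/min/mL")
```

With these illustrative inputs the result (about 2.8 µmol/min/mL) falls in the same range as the blood values reported for the species studied.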


Subject(s)
Blood , Animals, Domestic , Brain , Cholinesterases , Retina
19.
Arch. Soc. Esp. Oftalmol ; 87(8): 237-246, ago. 2012. ilus, tab, graf
Article in Spanish | IBECS | ID: ibc-103808

ABSTRACT



Objective: To analyse the long-term visual acuity (VA) in patients with age-related macular degeneration (ARMD) treated with ranibizumab who had persistent subretinal fluid after the induction therapy and/or in the successive controls. Materials and methods: We reviewed the medical records, optical coherence tomography (OCT) scans and fluorescein angiograms of 216 patients treated with ranibizumab between January 2008 and April 2010, selecting those who had persistent or recurrent subretinal fluid over at least one year of follow-up. Results: A total of 36 eyes from 34 patients were included, with 19 eyes (52.7%) having persistent, and 17 (47.2%) recurrent subretinal fluid throughout the follow-up (mean 29.06±9.28 months). The average number of injections was 7.89±3.2. The central macular thickness (CMT) was 330±84μm at the start of follow-up, 265.2±62μm at 3 months, and 294.5±37μm at the end of the follow-up. The initial mean VA was 0.3±0.2, 0.43±0.2 at 3 months (P<.05), and 0.41±0.22 at the final review (P<.05). Haemorrhages in recurrences were associated with a worse final VA (P=.004). At the end of follow-up, 18 eyes (50%) continued with ranibizumab treatment, 16 eyes (44%) were kept under observation, and 2 patients had died. There were no differences in VA or CMT between the two groups. Conclusions: The persistence or recurrence of macular subretinal fluid in patients treated with ranibizumab does not significantly reduce the visual gain obtained after induction therapy, despite discontinuation of treatment during follow-up. Haemorrhages in the recurrences were associated with a worse final VA(AU)


Subject(s)
Humans , Male , Female , Aged , Aged, 80 and over , Macular Degeneration/complications , Macular Degeneration/prevention & control , Macular Degeneration/therapy , Angiogenesis Inhibitors , Retinal Neovascularization , Multicenter Studies as Topic , Randomized Controlled Trials as Topic , Retrospective Studies , Observational Studies as Topic
20.
Arch Soc Esp Oftalmol ; 87(8): 237-46, 2012 Aug.
Article in Spanish | MEDLINE | ID: mdl-22794170

ABSTRACT

OBJECTIVE: To analyse the long-term visual acuity (VA) in patients with age-related macular degeneration (ARMD) treated with ranibizumab who had persistent subretinal fluid after the induction therapy and/or in the successive controls. MATERIALS AND METHODS: We reviewed the medical records, optical coherence tomography (OCT) scans and fluorescein angiograms of 216 patients treated with ranibizumab between January 2008 and April 2010, selecting those who had persistent or recurrent subretinal fluid over at least one year of follow-up. RESULTS: A total of 36 eyes from 34 patients were included, with 19 eyes (52.7%) having persistent, and 17 (47.2%) recurrent subretinal fluid throughout the follow-up (mean 29.06±9.28 months). The average number of injections was 7.89±3.2. The central macular thickness (CMT) was 330±84µm at the start of follow-up, 265.2±62µm at 3 months, and 294.5±37µm at the end of the follow-up. The initial mean VA was 0.3±0.2, 0.43±0.2 at 3 months (P<.05), and 0.41±0.22 at the final review (P<.05). Haemorrhages in recurrences were associated with a worse final VA (P=.004). At the end of follow-up, 18 eyes (50%) continued with ranibizumab treatment, 16 eyes (44%) were kept under observation, and 2 patients had died. There were no differences in VA or CMT between the two groups. CONCLUSIONS: The persistence or recurrence of macular subretinal fluid in patients treated with ranibizumab does not significantly reduce the visual gain obtained after induction therapy, despite discontinuation of treatment during follow-up. Haemorrhages in the recurrences were associated with a worse final VA.


Subject(s)
Macular Degeneration , Subretinal Fluid , Angiogenesis Inhibitors/therapeutic use , Antibodies, Monoclonal, Humanized/therapeutic use , Follow-Up Studies , Humans , Macular Degeneration/drug therapy , Ranibizumab , Visual Acuity