ABSTRACT
As a result of climate change, which brings higher temperatures, erratic precipitation, and extreme meteorological events, Maize productivity in Italy has recently become less reliable. The effects of climate change are accompanied by an increased presence of mycotoxins and various pathogens, which further reduce the chances of successfully producing Maize. In this framework, Proso Millet (Panicum miliaceum L.) may be an interesting alternative, as it is a relatively low-demanding, highly drought-resistant crop that can be used, similarly to Sorghum, in rotation, maintaining a certain amount of biodiversity and contributing to farmers' revenue. Moreover, Proso Millet has a very short cycle and may be used as a catch crop when other crops have failed or after their harvest. Millet used to be cultivated in ancient times in Italy, but it was then abandoned in favor of Maize, so it is now necessary to re-define proper agricultural practices and management, as well as to remedy the lack of an exact description of its phenological development. Within the framework of a LIFE-CCA EU project, Growing REsilience AgriculTure-Life (GREAT LIFE), the aim of this work was to encode the phenology of Proso Millet using the BBCH scale. The lack of an exact definition of Proso Millet phenology is a major obstacle to progress in research on this crop, which could be a very valuable tool for improving the resilience of agro-ecosystems to climate change in the Mediterranean basin. For this purpose, Proso Millet was cultivated in two experimental sites in the Emilia-Romagna region (Northern Italy). The crop was closely monitored throughout its life cycle in order to document, also photographically, the achievement of the successive phenological phases, including the time necessary to reach each phenological stage, expressed as Days After Sowing (DAS). Thanks to weather data collected from agrometeorological stations close to the experimental fields, it was possible to relate phenological development to temperature-driven heat-unit accumulation (Cumulated Growing Degree Days, CGDD), computed with the single triangle method, a useful tool for forecasting purposes. Ancillary agronomic data were also collected for completeness. This study describes the primary and secondary phenological stages of Proso Millet, encoding them in the BBCH scale and contextually providing the DAS and CGDD values necessary to achieve the different phenophases. The differences observed between the two experimental sites in reaching each BBCH stage, in terms of both CGDD and DAS, were mostly limited, suggesting that this work may represent a valid first tool for defining the phenological development of Proso Millet in Northern Italy. The effort made to encode Proso Millet phenology in the BBCH scale may give researchers comprehensive indications for future agronomic surveys on the crop. The agronomic data collected show that the crop performed well despite the adverse weather pattern during the season, highlighting for farmers the opportunity offered by Millet in Italy as a resilient crop.
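For illustration, the sketch below shows how daily growing degree days can be accumulated with the single triangle method mentioned in the abstract; the base temperature (10 °C) and the daily minimum/maximum temperatures are placeholder assumptions, not values taken from the study.

# Minimal sketch of cumulative growing degree day (CGDD) accumulation with the
# single triangle method. T_BASE and the daily data are placeholders, not study values.

T_BASE = 10.0  # assumed base temperature in degrees C (placeholder)

def gdd_single_triangle(t_min: float, t_max: float, t_base: float = T_BASE) -> float:
    """Daily growing degree days from min/max air temperature,
    modelling the diurnal temperature course as a single triangle."""
    if t_max <= t_base:
        return 0.0                                # whole day below the base temperature
    if t_min >= t_base:
        return (t_max + t_min) / 2.0 - t_base     # whole day above the base temperature
    # base temperature crossed during the day: area of the triangle tip above it
    return (t_max - t_base) ** 2 / (2.0 * (t_max - t_min))

def cumulate_gdd(daily_min_max: list[tuple[float, float]]) -> list[float]:
    """Running CGDD from sowing, one value per Day After Sowing (DAS)."""
    cgdd, total = [], 0.0
    for t_min, t_max in daily_min_max:
        total += gdd_single_triangle(t_min, t_max)
        cgdd.append(total)
    return cgdd

# Hypothetical agrometeorological records (degrees C): (daily min, daily max)
weather = [(12.0, 24.0), (14.5, 28.0), (9.0, 18.0), (16.0, 31.0)]
print(cumulate_gdd(weather))  # running CGDD totals, one value per DAS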
Subject(s)
Panicum, Animals, Climate Change, Ecosystem, Italy, Seasons
ABSTRACT
Knowing the incidence of invasive meningococcal disease (IMD) is essential for planning appropriate vaccination policies. However, IMD may be underestimated because of misdiagnosis or insufficiently sensitive laboratory methods. Using a national molecular surveillance register, we assessed the number of cases misdiagnosed and diagnoses obtained postmortem with real-time PCR (rPCR), and we compared sensitivity of rPCR versus culture-based testing. A total of 222 IMD cases were identified: 11 (42%) of 26 fatal cases had been misdiagnosed or undiagnosed and were reclassified as IMD after rPCR showed meningococcal DNA in all available specimens taken postmortem. Of the samples tested with both rPCR and culture, 58% were diagnosed by using rPCR alone. The underestimation factor associated with the use of culture alone was 3.28. In countries such as Italy, where rPCR is in limited use, IMD incidence may be largely underestimated; thus, assessments of benefits of meningococcal vaccination may be prone to error.
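The abstract does not spell out how the underestimation factor was derived; the minimal sketch below assumes it is the ratio of all laboratory-confirmed cases to the subset that culture alone would have confirmed, and uses a hypothetical placeholder for the culture-confirmed count.

# Hedged sketch of one plausible definition of the underestimation factor quoted above.
# The culture-confirmed count is a hypothetical placeholder; the abstract reports only
# the total number of cases (222) and the factor itself (3.28).
total_confirmed_cases = 222        # from the abstract
culture_confirmed_cases = 68       # hypothetical placeholder, not a study figure

underestimation_factor = total_confirmed_cases / culture_confirmed_cases
print(f"Underestimation factor: {underestimation_factor:.2f}")  # ~3.26 with these inputs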
Subject(s)
Meningococcal Infections/epidemiology, Neisseria meningitidis, Adolescent, Adult, Child, Child, Preschool, Diagnostic Errors, Female, Humans, Incidence, Infant, Italy/epidemiology, Male, Meningococcal Infections/diagnosis, Meningococcal Vaccines, Real-Time Polymerase Chain Reaction, Retrospective Studies, Young Adult
ABSTRACT
Drought stress poses significant productivity challenges to wheat. Several studies suggest that lower malondialdehyde (MDA) content may be a promising trait for identifying drought-tolerant wheat genotypes. However, the optimal polyethylene glycol (PEG-6000) concentration for screening seedlings for drought tolerance based on MDA quantification is not clear. The aim of this study was to verify whether water stress induced by a 10% (w/v) PEG-6000 concentration was reliable for discriminating, on the basis of MDA quantification, between twenty-two drought-susceptible and drought-tolerant tetraploid wheat (Triticum turgidum ssp. durum, turanicum, and carthlicum) accessions. To do so, the correlation of MDA with morpho-physiological traits known to be related to seedling drought tolerance, i.e., Seedling Vigour Index and Seedling Water Content, was evaluated. Results showed that MDA content was not a reliable biomarker for drought tolerance, as it did not correlate significantly with the aforementioned morpho-physiological traits, which, on the contrary, showed a high positive correlation with each other. Combining our study with the cited literature, it clearly emerges that different wheat genotypes have different "water stress thresholds", indicating that a 10% PEG-6000 concentration is not a reliable basis for screening wheat seedlings for drought tolerance through MDA quantification. Given the conflicting results in the literature, this study provides important insights for selecting appropriate methods for evaluating wheat seedling drought tolerance.
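As an illustration of the correlation screen described above, the sketch below computes Pearson correlations between MDA content and the two morpho-physiological traits; all accession values are hypothetical placeholders, not data from the study.

# Minimal sketch of the correlation screen: testing whether MDA content tracks
# morpho-physiological indicators of seedling drought tolerance. Values are hypothetical.
from scipy.stats import pearsonr

# One value per accession (hypothetical): MDA content, Seedling Vigour Index (SVI),
# and Seedling Water Content (SWC) measured after PEG-6000 treatment.
mda = [4.2, 3.8, 5.1, 4.9, 3.5, 4.4, 5.6, 3.9]
svi = [310, 355, 290, 300, 370, 325, 280, 350]
swc = [78.0, 81.5, 74.2, 75.0, 83.1, 79.4, 72.8, 80.9]

for name, trait in (("SVI", svi), ("SWC", swc)):
    r, p = pearsonr(mda, trait)
    print(f"MDA vs {name}: r = {r:.2f}, p = {p:.3f}")

# A trait useful as a drought-tolerance biomarker would be expected to correlate
# significantly and consistently with SVI and SWC across accessions.
r_traits, p_traits = pearsonr(svi, swc)
print(f"SVI vs SWC: r = {r_traits:.2f}, p = {p_traits:.3f}")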
ABSTRACT
Durum wheat (Triticum turgidum L. ssp. durum) landraces, traditional local varieties representing an intermediate stage in domestication, are gaining attention due to their high genetic variability and performance in challenging environments. While major kernel metabolites have been examined, limited research has been conducted on minor bioactive components such as lipids, despite their nutritional benefits. To address this, we analyzed twenty-two tetraploid accessions, comprising modern elite cultivars and landraces, to (i) verify whether the selection process for yield-related traits carried out during the Green Revolution has influenced lipid amount and composition; (ii) uncover the extent of lipid compositional variability, giving evidence that lipid fingerprinting effectively identifies evolutionary signatures; and (iii) identify genotypes of interest for breeding programs to improve yield and nutrition. Interestingly, total fat did not correlate with kernel weight, indicating lipid composition as a promising trait for selection. Tri- and diacylglycerols were the major lipid components, along with free fatty acids, and their relative content varied significantly among genotypes. In particular, landraces belonging to the turanicum and carthlicum ecotypes differed significantly in total lipid and fatty acid profiles. Our findings provide evidence that landraces can be a genetically relevant source of lipid variability, with the potential to be exploited for improving wheat nutritional quality.
ABSTRACT
Wheat is one of the most important cereal crops, representing a fundamental source of calories and protein for the global human population. Drought stress (DS) is a widespread phenomenon, already affecting large wheat-growing areas worldwide, and a major threat to cereal productivity, resulting in consistent losses in average grain yield (GY). Climate change is projected to exacerbate DS incidence and severity by increasing temperatures and changing rainfall patterns. Given that wheat production must increase substantially to guarantee food security for a demographically expanding human population, the need for breeding programs focused on improving wheat drought resistance is evident. Drought occurrence along the plant's life cycle, in terms of timing, duration, frequency, and severity, varies significantly among environments and agricultural years, making it difficult to identify reliable phenological, morphological, and functional traits to be used as effective breeding tools. The situation is further complicated by the presence of confounding factors, e.g., other concomitant abiotic stresses, in an open-field context. Consequently, the relationship between morpho-functional traits and GY under water deficit is often contradictory; moreover, controversies have emerged not only on which traits are to be preferred, but also on how a specific trait should be expressed to be desirable. In this review, we attempt to identify the possible causes of these disputes and to propose the most suitable selection criteria in different target environments and, thus, the best trait combinations for breeders in different drought contexts. Indeed, an environment-oriented approach could be a valuable solution for overcoming controversies in identifying the proper selection criteria for improving wheat drought resistance.
Subject(s)
Drought Resistance, Triticum, Humans, Triticum/genetics, Patient Selection, Plant Breeding/methods, Edible Grain/genetics, Droughts
ABSTRACT
Increasing temperatures, heat waves, and the reduction of annual precipitation are all expressions of climate change (CC) that strongly affect bread wheat (Triticum aestivum L.) grain yield in Southern Europe. As temperature is the major driving force of plant phenological development, these variations also affect wheat phenology, with possible consequences on grain quality and gluten protein accumulation. Here, through a case study in the Bolognese Plain (Northern Italy), we assessed the effects of CC in the area, the impacts on bread wheat phenological development, and the consequences on grain gluten quality. The increasing trend in mean annual air temperature in the area since 1952 was significant, with a breakpoint identified in 1989, when it rose from 12.7 to 14.1°C, accompanied by signals of increasing aridity, i.e., an increase in water table depth. Bread wheat phenological development was compared in two 15-year periods before and after the breakpoint, i.e., 1952-1966 (past period) and 2006-2020 (present period), the latter characterized by aridity and increased temperatures. A significant shortening of the chronological time necessary to reach the main phenological phases was observed for the present period compared with the past period, ultimately shortening the whole life cycle. This reduction, as well as the higher temperature regime, affected gluten accumulation during the grain-filling process, as emerged from the analysis of gluten composition in grain samples of the same variety harvested in the area both before and after the breakpoint in temperature. In particular, the proportions of gluten polymers (i.e., gliadins, high and low molecular weight glutenins, and their ratio) showed a strong and significant correlation with the cumulative growing degree days (CGDDs) accumulated during grain filling. Higher CGDD values during this period, typical of CC in Southern Europe and accounting for higher temperatures and faster grain filling, correlated with gliadins, high molecular weight glutenins, and their proportion relative to low molecular weight glutenins. In summary, the data reported herein might contribute to assessing the effects of CC on wheat phenology and quality, representing a tool for both predictive purposes and decision support systems for farmers, and can guide future breeding choices for varietal innovation.
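The sketch below illustrates one simple way a breakpoint in a mean annual temperature series can be located, by choosing the split year that minimizes the within-segment sum of squared errors; the statistical test actually used in the study is not specified here, and the temperature series is synthetic, not the Bolognese Plain record.

# Minimal sketch of a mean-shift breakpoint search on an annual mean temperature series.
# Illustrative only: the split year minimizing within-segment SSE is reported.

def find_breakpoint(years, temps):
    """Return (first year of the second segment, SSE) for the best two-segment split."""
    best = None
    for k in range(2, len(temps) - 1):          # keep at least 2 points per segment
        left, right = temps[:k], temps[k:]
        sse = sum((t - sum(left) / len(left)) ** 2 for t in left) + \
              sum((t - sum(right) / len(right)) ** 2 for t in right)
        if best is None or sse < best[1]:
            best = (years[k], sse)
    return best

# Synthetic series: about 12.7 degrees C before 1989, about 14.1 afterwards, plus small
# deterministic pseudo-noise (not real data).
years = list(range(1952, 2021))
temps = [12.7 + (1.4 if y >= 1989 else 0.0) + 0.1 * ((y * 7) % 5 - 2) for y in years]

print(find_breakpoint(years, temps))  # expected to report a split close to 1989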
ABSTRACT
BACKGROUND: Invasive meningococcal disease (IMD) is a highly lethal disease. Diagnosis is commonly performed by culture or real-time PCR (qPCR). AIMS: Our aim was to evaluate, retrospectively, whether culture positivity correlates with higher bacterial load and fatal outcome. Our secondary aim was to compare culture and qPCR sensitivity. METHODS: The National Register for Molecular Surveillance was used as the data source. Cycle threshold (CT), known to be inversely correlated with bacterial load, was used to compare bacterial load in different samples. RESULTS: Three hundred and thirteen patients were found positive for Neisseria meningitidis by qPCR, culture, or both; 41 died (case fatality rate 13.1%). 128/143 (89.5%) blood samples and 138/144 (95.8%) CSF samples were positive by qPCR, whereas 37/143 (25.9%) blood samples and 45/144 (31.2%) CSF samples were also positive by culture. qPCR was 3.5 times (blood) or 3.1 times (CSF) more sensitive than culture in achieving a laboratory diagnosis of IMD (OR 24.4; 95% CI 12.2-49.8; p < 10⁻⁴; Cohen's κ 0.08 for blood, and OR 49.0; 95% CI 19.1-133.4; p < 10⁻⁴; Cohen's κ 0.02 for CSF). Culture positivity did not correlate with higher bacterial loads in blood (mean CT 27.7 ± 5.71 and 28.1 ± 6.03 in culture-positive and culture-negative samples, respectively; p = 0.739) or in CSF (mean CT 23.1 ± 4.9 and 24.7 ± 5.4 in culture-positive and culture-negative samples, respectively; p = 0.11). CT values in blood from patients who died were significantly lower than in patients who survived (mean 18.0, range 14-23 vs. mean 29.6, range 16-39, respectively; p < 10⁻¹⁷). No deaths occurred in patients with a blood CT above 23. Positive blood cultures were found in 10/25 (40%) patients who died and in 32/163 (19.6%) patients who survived (p = 0.036; OR 2.73; 95% CI 1.025-7.215); however, 60% of deaths would have remained undiagnosed with the use of culture only. CONCLUSIONS: In conclusion, our study demonstrated that qPCR is significantly (at least 3 times) more sensitive than culture in the laboratory confirmation of IMD. The study also demonstrated that culture negativity is not associated with lower bacterial loads or with less severe cases. On the other hand, in patients with sepsis, qPCR can predict fatal outcome, since higher bacterial load, evaluated by qPCR, appears strictly associated with the most severe cases and fatal outcome. The study also showed that molecular techniques such as qPCR can provide a valuable addition to the proportion of diagnosed and serotyped cases of IMD.
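For clarity, the sketch below shows how the two statistics used to compare the methods can be computed: an odds ratio from the blood positivity counts reported above, and Cohen's kappa from a paired concordance table whose cell counts are hypothetical placeholders, since the abstract reports only the marginal totals.

# Minimal sketch of the two comparison statistics. The blood counts (128/143
# qPCR-positive, 37/143 culture-positive) come from the abstract; the paired cell
# counts used for kappa are hypothetical placeholders.

def odds_ratio(pos_a, neg_a, pos_b, neg_b):
    """OR comparing the odds of a positive result between two methods."""
    return (pos_a * neg_b) / (neg_a * pos_b)

def cohens_kappa(both_pos, only_a, only_b, both_neg):
    """Cohen's kappa for agreement between two methods on paired samples."""
    n = both_pos + only_a + only_b + both_neg
    p_obs = (both_pos + both_neg) / n
    p_exp = ((both_pos + only_a) * (both_pos + only_b) +
             (only_b + both_neg) * (only_a + both_neg)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Blood samples: 128/143 positive by qPCR, 37/143 by culture (from the abstract).
# This reproduces an OR consistent with the 24.4 reported above.
print(f"OR (qPCR vs culture, blood): {odds_ratio(128, 143 - 128, 37, 143 - 37):.1f}")

# Hypothetical paired table for illustration only: 35 positive by both methods,
# 93 by qPCR only, 2 by culture only, 13 negative by both (total 143 samples).
print(f"Cohen's kappa (hypothetical counts): {cohens_kappa(35, 93, 2, 13):.2f}")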
Subject(s)
Bacterial Load/methods, Meningitis, Meningococcal/diagnosis, Neisseria meningitidis/isolation & purification, Sepsis/diagnosis, Adolescent, Cell Culture Techniques/statistics & numerical data, Child, Child, Preschool, DNA, Bacterial/isolation & purification, False Negative Reactions, Female, Humans, Infant, Infant, Newborn, Male, Meningitis, Meningococcal/microbiology, Meningitis, Meningococcal/mortality, Neisseria meningitidis/genetics, Real-Time Polymerase Chain Reaction/statistics & numerical data, Retrospective Studies, Sensitivity and Specificity, Sepsis/microbiology, Sepsis/mortality, Severity of Illness Index, Young Adult
ABSTRACT
The aims of this study were to identify demographic, clinical, and laboratory characteristics associated with reactive thrombocytosis useful for clinical management, and to evaluate the potential complications of this condition in a cohort of children selected, because of their young age, as being at high risk of reactive thrombocytosis. A retrospective analysis was performed of the medical records of 239 children, out of 902 aged 1-24 months, hospitalized during a 12-month period and discharged with a diagnosis of infectious disease. One hundred and nineteen of the 239 children (49.8%) presented with thrombocytosis (>500 × 10⁹ platelets/L; normal range 150-499 × 10⁹/L), 81/119 (68%) on admission. The incidences of thrombocytosis and extreme thrombocytosis (>1,000 × 10⁹/L) were 13.2% (119/902) and 0.8% (7/902), respectively. Thrombocytotic children had higher white blood cell counts and had been treated more frequently with steroids (36/82, 43.9% vs. 5/53, 9.4%; p = 5 × 10⁻⁵; relative risk 7.51, 95% confidence interval 2.71-20.82). No significant difference was found in relation to sex, age, fever, C-reactive protein level, diagnoses, or antibiotic therapy. Two of the 239 (0.8%) enrolled children, both thrombocytotic and with other acquired risk factors, developed thrombosis. In conclusion, reactive thrombocytosis in children aged 1 to 24 months is frequent and unrelated to markers of disease activity or degree of inflammation.
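As a worked illustration of the steroid-exposure comparison reported above (36/82 vs. 5/53), the sketch below computes both a risk ratio and an odds ratio with log-scale Wald 95% confidence intervals; the use of the normal-approximation interval is an assumption, not a statement about the study's exact method.

# Minimal sketch of effect estimates with log-scale Wald 95% confidence intervals,
# applied to the steroid-exposure counts cited above (36/82 vs. 5/53). Both the risk
# ratio and the odds ratio are shown for completeness.
from math import exp, log, sqrt

def ratio_with_ci(a, n1, c, n2, kind="rr", z=1.96):
    """Risk ratio (kind='rr') or odds ratio (kind='or') with a Wald 95% CI."""
    b, d = n1 - a, n2 - c
    if kind == "rr":
        est = (a / n1) / (c / n2)
        se = sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    else:
        est = (a * d) / (b * c)
        se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = exp(log(est) - z * se), exp(log(est) + z * se)
    return est, lo, hi

for kind in ("rr", "or"):
    est, lo, hi = ratio_with_ci(36, 82, 5, 53, kind=kind)
    print(f"{kind.upper()}: {est:.2f} (95% CI {lo:.2f}-{hi:.2f})")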
Subject(s)
Community-Acquired Infections/epidemiology, Thrombocytosis/epidemiology, Blood Platelets/metabolism, Child, Preschool, Cohort Studies, Community-Acquired Infections/blood, Community-Acquired Infections/drug therapy, Female, Humans, Incidence, Infant, Italy/epidemiology, Male, Retrospective Studies, Thrombocytosis/blood, Thrombocytosis/therapy, Thrombosis/blood
ABSTRACT
Neuroblastoma is the most common extracranial solid tumor in childhood. Its presenting signs and symptoms may be highly variable, depending on the location of the primary tumor and on its local or metastatic spread; rarely, it presents with paraneoplastic syndromes such as opsoclonus-myoclonus-ataxia syndrome and gastrointestinal disturbances, due to autoantibodies or to aberrant secretion of vasoactive intestinal peptide. Herein we describe a 10-month-old child with neuroblastoma presenting with a complex clinical picture characterized by acute kidney injury, manifested by renal insufficiency and by signs and symptoms of tubulointerstitial damage, with polyuria, polydipsia, glucosuria, aminoaciduria, and hypochloremic metabolic alkalosis, and of glomerular damage, with heavy proteinuria. Imaging documented a suprarenal mass enveloping the aorta, its abdominal and renal branches, and both renal veins. This clinical picture shows some analogies with the hyponatremic-hypertensive syndrome of renovascular disease; however, in the absence of systemic arterial hypertension, the heavy proteinuria and polyuria could be explained by a localized increase in intraglomerular pressure due to constriction of local renal blood vessels. Hypochloremic metabolic alkalosis probably developed because of local renin production, responsible for renin-angiotensin-aldosterone system activation, but above all because of chloride loss through sweating. The long-lasting dehydration due to vomiting, sweating, and polyuria caused prolonged prerenal failure, evolving into manifestations of proximal tubular damage.
ABSTRACT
Infants born to mothers with multiple blood-borne viral infections are at risk of multiple transmissions. Whether the risk of transmission of multiple infections increases with the number of viruses infecting the mother is still unknown. The aim of this study was to describe the risk of mother-to-infant transmission of multiple infections from multi-infected mothers. Sixty-four pregnant women infected by at least two viruses among human immunodeficiency virus type 1 (HIV-1), hepatitis C virus, TT virus, and GB virus type C, together with their 64 infants, were studied. Maternal blood samples were collected in the third trimester of pregnancy, and all infants were prospectively followed for evaluation of transmission within 3 months after birth and twice in the subsequent 24 months. Transmission of single and dual infection was, respectively, 10/40 (25%) and 5/40 (12.5%) from mothers infected by two viruses, and 9/20 (45%) and 2/20 (10%) from mothers infected by three viruses. One (25%) infant infected by one virus was born to the four mothers infected by four viruses. In linear regression analysis, transmission of single or dual infection was not significantly associated with the number of viruses infecting the mother (P = 0.9). The present study suggests the absence of a synergistic effect of viral interactions on mother-to-infant transmission of multiple infections and supports the hypothesis that transmission from multi-infected mothers is the result of the specific interaction between each virus and the host. These observations may be of clinical relevance in perinatal counseling.
Subject(s)
Blood-Borne Pathogens/isolation & purification, Hepatitis Viruses/isolation & purification, Infectious Disease Transmission, Vertical, Torque teno virus/isolation & purification, Virus Diseases/transmission, Blood/virology, Female, Humans, Infant, Infant, Newborn, Mothers, Pregnancy, Pregnancy Complications, Infectious, Virus Diseases/virology
ABSTRACT
The POU1F1 gene encodes a transcription factor that is important for the development and differentiation of the cells producing GH, prolactin, and TSH in the anterior pituitary gland. Patients with POU1F1 mutations show a combined pituitary hormone deficiency with low or absent levels of GH, prolactin, and TSH. Fourteen mutations in the POU1F1 gene have been reported to date. These genetic lesions can be inherited in either an autosomal dominant or an autosomal recessive mode. We report on the first Italian patient, a girl, affected by combined pituitary hormone deficiency. The patient was found to be positive for congenital hypothyroidism (with low TSH levels) at neonatal screening. Replacement therapy was started, but subsequent growth was very poor, although psychomotor development was substantially normal. Hospitalized at 10 months of age, she showed hypotonic crises, growth retardation, delayed bone age, and facial dysmorphism. In addition to congenital hypothyroidism, GH and prolactin deficiencies were found. DNA mutation analysis of the patient's POU1F1 gene identified the novel Q167K amino acid change in the heterozygous state. The highly conserved Q167 residue is located in the POU-specific domain. No mutation was detected in the other allele. DNA analysis of the proband's parents did not identify this amino acid substitution, suggesting a de novo genetic lesion. From these data, it can be hypothesized that the Q167K mutation has a dominant-negative effect.