Results 1 - 20 of 1,236
1.
J Infect Dis ; 2024 Jul 16.
Article in English | MEDLINE | ID: mdl-39013016

ABSTRACT

BACKGROUND: Pneumococcal carriage in children has been extensively studied, but carriage in healthy adults and its relationship to invasive pneumococcal disease (IPD) is less well understood. METHODS: Nasal wash samples collected from adults without close contact with young children (Liverpool, UK) during 2011-2019 were cultured, and culture-negative samples were tested by PCR. Pneumococcal carriage in adults aged 18-44 years was compared with carriage among PCV-vaccinated children aged 13-48 months (nasopharyngeal swabs, Thames Valley, UK) and with IPD data for England for the same ages for 2014-2019. Age-group-specific serotype invasiveness was calculated and used with national IPD data to estimate carriage serotype distributions for adults aged 65+ years. RESULTS: In total, 98 isolates (97 carriers) were identified from 1,631 adults aged 18+ years (age- and sex-standardized carriage prevalence 6.4%), with only three identified solely by PCR. Despite different carriage and IPD serotype distributions between adults and children, serotype invasiveness was highly correlated (R = 0.9). Serotypes 3, 37, and 8 represented a higher proportion of adult carriage than expected from direct low-level transmission from children to adults. The predicted carriage serotype distributions for adults aged 65+ years aligned more closely with the carriage serotype distribution of young adults than with that of young children. CONCLUSIONS: The nasal wash technique is highly sensitive, and the additional benefit of PCR is limited. Comparison of carriage serotype distributions suggests that some serotypes may circulate preferentially among these young adults. Our data suggest that for some serotypes carried by adults 65+ years, other adults may be an important reservoir for transmission. Age groups such as older children should also be considered.
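The estimation step this abstract describes, deriving a carriage serotype distribution for older adults from IPD counts and serotype invasiveness, can be sketched as follows. This is a minimal illustration under the simplifying assumption that invasiveness acts as a serotype-specific case-to-carrier ratio; the serotype labels, counts, and function names are hypothetical and not drawn from the study.

```python
def invasiveness(ipd_counts, carriage_counts):
    """Serotype invasiveness as a simple case-to-carrier ratio."""
    return {s: ipd_counts[s] / carriage_counts[s] for s in ipd_counts}

def estimate_carriage_distribution(ipd_counts, nu):
    """Estimated carriage serotype distribution for an age group:
    deflate that group's IPD counts by serotype invasiveness,
    then normalise to proportions."""
    raw = {s: ipd_counts[s] / nu[s] for s in ipd_counts}
    total = sum(raw.values())
    return {s: v / total for s, v in raw.items()}

# Hypothetical counts for three serotypes.
child_ipd = {"3": 10, "8": 40, "19A": 50}
child_carriage = {"3": 100, "8": 80, "19A": 20}
nu = invasiveness(child_ipd, child_carriage)

# Predicted carriage mix for a different age group from its IPD counts.
adult_ipd = {"3": 30, "8": 50, "19A": 20}
est = estimate_carriage_distribution(adult_ipd, nu)
```

Under this toy input, serotype 3 (low invasiveness, high IPD count) dominates the estimated carriage distribution, mirroring how weakly invasive serotypes can be common in carriage yet rare in IPD.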

2.
Am J Transplant ; 24(1): 30-36, 2024 Jan.

Article in English | MEDLINE | ID: mdl-37633449

ABSTRACT

De novo membranous nephropathy (dnMN) is an uncommon immune complex-mediated late complication of human kidney allografts that causes proteinuria. We report here the first case of dnMN in a pig-to-baboon kidney xenograft. The donor was a double-knockout (GGTA1 and β4GalNT1) genetically engineered pig with a knockout of the growth hormone receptor and the addition of 6 human transgenes (hCD46, hCD55, hTBM, hEPCR, hHO1, and hCD47). The recipient developed proteinuria at 42 days posttransplant, which progressively rose to the nephrotic range at 106 days, associated with an increase in serum antidonor IgG. Kidney biopsies showed antibody-mediated rejection (AMR) with C4d and thrombotic microangiopathy that eventually led to graft failure at 120 days. In addition to AMR, the xenograft had diffuse, global granular deposition of C4d and IgG along the glomerular basement membrane on days 111 and 120. Electron microscopy showed extensive amorphous subepithelial electron-dense deposits with intervening spikes along the glomerular basement membrane. These findings, by analogy to human renal allografts, are interpreted as dnMN in the xenograft superimposed on AMR. The target antigen was not identified but is hypothesized to be a pig xenoantigen expressed on podocytes. Whether dnMN will be a significant problem in other longer-term xenokidneys remains to be determined.


Subjects
Glomerulonephritis, Membranous; Kidney Diseases; Kidney Transplantation; Humans; Swine; Animals; Glomerulonephritis, Membranous/etiology; Kidney Transplantation/adverse effects; Heterografts; Kidney/pathology; Kidney Diseases/pathology; Proteinuria/etiology; Immunoglobulin G; Graft Rejection/pathology
3.
Am J Transplant ; 24(6): 1016-1026, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38341027

ABSTRACT

Membranous nephropathy (MN) is a leading cause of kidney failure worldwide and frequently recurs after transplant. Available data originate from small retrospective cohort studies or registry analyses; therefore, uncertainties remain regarding risk factors for MN recurrence and response to therapy. Within the Post-Transplant Glomerular Disease Consortium, we conducted a retrospective multicenter cohort study examining the MN recurrence rate, risk factors, and response to treatment. This study screened 22,921 patients across 3 continents and included 194 patients who underwent a kidney transplant due to biopsy-proven MN. The cumulative incidence of MN recurrence was 31% at 10 years posttransplant. Patients with a faster progression toward end-stage kidney disease were at higher risk of developing recurrent MN (hazard ratio [HR], 0.55 per decade; 95% confidence interval [CI], 0.35-0.88). Moreover, elevated pretransplant levels of anti-phospholipase A2 receptor (PLA2R) antibodies were strongly associated with recurrence (HR, 18.58; 95% CI, 5.37-64.27). Patients receiving rituximab for MN recurrence had a higher likelihood of achieving remission than patients receiving renin-angiotensin-aldosterone system inhibition alone. In sum, MN recurs in one-third of patients posttransplant, and measurement of serum anti-PLA2R antibody levels shortly before transplant could aid in risk-stratifying patients for MN recurrence. Moreover, patients receiving rituximab had a higher rate of treatment response.


Subjects
Glomerulonephritis, Membranous; Kidney Transplantation; Recurrence; Humans; Glomerulonephritis, Membranous/etiology; Glomerulonephritis, Membranous/pathology; Glomerulonephritis, Membranous/drug therapy; Kidney Transplantation/adverse effects; Male; Retrospective Studies; Female; Middle Aged; Risk Factors; Follow-Up Studies; Prognosis; Adult; Glomerular Filtration Rate; Kidney Failure, Chronic/surgery; Postoperative Complications; Graft Survival; Kidney Function Tests; Incidence; Graft Rejection/etiology; Graft Rejection/pathology; Survival Rate
4.
Support Care Cancer ; 32(5): 273, 2024 Apr 08.
Article in English | MEDLINE | ID: mdl-38587665

ABSTRACT

PURPOSE: Health service use is most intensive in the final year of a person's life, with 80% of this expenditure occurring in hospital. Close involvement of primary care services has been promoted to enhance quality end-of-life care that is appropriate to the needs of patients. However, the relationship between primary care involvement and patients' use of hospital care is not well described. This study aims to examine primary care use in the last year of life for cancer patients and its relationship to hospital usage. METHODS: Retrospective cohort study in Victoria, Australia, using linked routine care data from primary care, hospitals, and death certificates. Patients who died of cancer-related causes between 2008 and 2017 were included. RESULTS: A total of 758 patients were included, of whom 88% (n = 667) visited primary care during the last 6 months of life (median 9.1 consultations). In the last month of life, 45% of patients were prescribed opioids and 3% had imaging requested. Patients who received home visits (13%) or anticipatory medications (15%) had less than half the median bed days in the last 3 months of life (home visits: 4 vs 9 days, p < 0.001; anticipatory medications: 5 vs 10 days, p = 0.001) and in the last month (home visits: 0 vs 2 days, p = 0.002; anticipatory medications: 0 vs 3 days, p < 0.001), and fewer emergency department presentations in the final month (home visits: 32% vs 46%, p = 0.006; anticipatory medications: 31% vs 47%, p < 0.001). CONCLUSION: This study identifies two important primary care processes, home visits and anticipatory medication, associated with reduced hospital usage and intervention at the end of life.


Subjects
Death; Neoplasms; Humans; Retrospective Studies; Hospitals; Neoplasms/therapy; Victoria; Primary Health Care
5.
Plant Dis ; 108(6): 1645-1658, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38127634

ABSTRACT

Knowledge of a pathogen's genetic variability and population structure is of great importance for effective disease management. In this study, 193 isolates of Phytophthora infestans collected from three Estonian islands over 3 years were characterized using simple sequence repeat (SSR) marker data complemented by information on their mating type and resistance to metalaxyl. In combination with SSR marker data from samples in the neighboring Pskov region of Northwest Russia, the impact of regional and landscape structure on the level of genetic exchange was also examined. Among the 111 P. infestans isolates from the Estonian islands, 49 alleles were detected across 12 SSR loci, and 59 SSR multilocus genotypes were found, of which 64% were unique. Analysis of molecular variance revealed that genetic variation was higher among years than among islands. The frequency of metalaxyl-resistant isolates increased from 9% in 2012 to 30% in 2014, and metalaxyl resistance was most frequent among A1 isolates. The test for isolation by distance among the studied regions was not significant; coupled with the absence of genetic differentiation, this result indicates gene flow and an absence of local adaptation. The data are consistent with a sexual population in which diversity is driven by the annual germination of soilborne oospores. The absence of shared genotypes across years has important implications for disease management: such population diversity can make it difficult to predict the nature of an outbreak in the coming year, as the genetic makeup differs from year to year.


Subjects
Genetic Variation; Genotype; Microsatellite Repeats; Phytophthora infestans; Plant Diseases; Phytophthora infestans/genetics; Phytophthora infestans/isolation & purification; Microsatellite Repeats/genetics; Plant Diseases/microbiology; Estonia; Alanine/analogs & derivatives; Alanine/pharmacology; Islands; Alleles
6.
J Environ Manage ; 351: 119810, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38100866

ABSTRACT

Robust understanding of the fine-grained sediment cascades of temperate agricultural catchments is essential for supporting the targeted management needed to address the widely reported sediment problem. Within the UK, many independent field-based measurements of soil erosion, sediment sources, and catchment suspended sediment yields have been published, but attempts to review and assess the compatibility of these measurements are limited. The data available suggest that landscape-scale net soil erosion rates (∼38 t km-2 yr-1 for arable land and ∼26 t km-2 yr-1 for grassland) are comparable to the typical suspended sediment yield of a UK catchment (∼44 t km-2 yr-1). This finding cannot, however, be reconciled easily with the prevailing view that agricultural topsoils dominate sediment contributions to watercourses and that catchment sediment delivery ratios are typically low. Channel bank erosion rates can be high at the landscape scale (27 t km-2 yr-1) and could account for these discrepancies, but bank erosion would then need to be the dominant sediment source in most catchments, which does not agree with a recent review of sediment sources for the UK. A simple and robust colour-based sediment source tracing method, using hydrogen peroxide sample treatment, was therefore applied in fifteen catchments to investigate their key sediment sources. In only two of the catchments are eroding arable fields likely to be important sediment sources, supporting the alternative hypothesis that bank erosion is likely to be the dominant source of sediment in many UK catchments. It is concluded that the existing lines of evidence on the individual components of the fine sediment cascade in temperate agricultural catchments in the UK are difficult to reconcile, with the risk that best-management interventions are targeted inappropriately. Recommendations are therefore made for future research to address the paucity of measured erosion rates, sediment delivery ratios, and suspended sediment yields; validate sediment source fingerprinting results; consider the sources of sediment-associated organic matter; and revisit soil erosion and sediment cascade model parameterisation.


Subjects
Soil Erosion; Soil; Geologic Sediments; Agriculture; United Kingdom; Environmental Monitoring/methods
7.
J Environ Manage ; 351: 119732, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38064984

ABSTRACT

The intensification of farming and increased nitrogen fertiliser use to satisfy growing population demand have contributed to the current climate change crisis. The use of synthetic fertilisers in agriculture is a significant source of anthropogenic greenhouse gas (GHG) emissions, especially of the potent nitrous oxide (N2O). To achieve the ambitious policy target of net zero by 2050 in the UK, it is crucial to understand the impacts of potential reductions in fertiliser use on multiple ecosystem services, including crop production, GHG emissions, and soil organic carbon (SOC) storage. A novel integrated modelling approach using three established agroecosystem models (SPACSYS, CSM, and RothC) was implemented to evaluate the associated impacts of fertiliser reduction (10%, 30%, and 50%) under current and projected climate scenarios (RCP2.6, RCP4.5, and RCP8.5) in a study catchment in Southwest England. Forty-eight unique combinations of soil types, climate conditions, and fertiliser inputs were evaluated for five major arable crops plus improved grassland. With a 30% reduction in fertiliser inputs, the estimated yield loss under the current climate ranged between 11% and 30% for arable crops, compared with 20-24% and 6-22% reductions in N2O and methane emissions, respectively. Biomass was reduced by 10-25% aboveground and by <12% for the root system. Relative to the baseline scenario, soil-type-dependent reductions in SOC sequestration rates are predicted under future climates with reduced fertiliser inputs. Losses in SOC more than doubled under the RCP4.5 scenario. Emissions from energy use, including embedded emissions from fertiliser manufacture, were a significant source (14-48%) for all arable crops and the associated GWP20.


Subjects
Greenhouse Gases; Soil; Fertilizers/analysis; Ecosystem; Carbon; Rivers; Agriculture; England; Nitrous Oxide/analysis
8.
Environ Res ; 228: 115826, 2023 Jul 1.

Article in English | MEDLINE | ID: mdl-37011801

ABSTRACT

Diffuse pollutant transfers from agricultural land often constitute the bulk of annual loads in catchments, and storm events dominate these fluxes. There remains a lack of understanding of how pollutants move through catchments at different scales. Addressing this is critical if the mismatch between the scales at which on-farm management strategies are implemented and the scales at which environmental quality is assessed is to be resolved. The aim of this study was to understand how the mechanisms of pollutant export may change when assessed at different scales, and the corresponding implications for on-farm management strategies. A study was conducted within a 41 km2 catchment containing 3 nested sub-catchments, instrumented to monitor discharge and various water quality parameters. Storm data over a 24-month period were analysed, and hysteresis (HI) and flushing (FI) indices were calculated for two water quality variables of typical environmental significance: NO3-N and suspended sediment concentration (SSC). For SSC, increasing spatial scale had little effect on the mechanistic interpretation of mobilisation and the associated on-farm management strategies. At the three smallest scales, NO3-N was chemodynamic, with the interpretation of dominant mechanisms changing seasonally; at these scales, the same on-farm management strategies would be recommended. At the largest scale, however, NO3-N appeared chemostatic and unaffected by season, which would lead to a potentially very different interpretation and a different set of on-farm measures. These results underscore the benefits of nested monitoring for extracting mechanistic understanding of agricultural impacts on water quality. The application of HI and FI indicates that monitoring at smaller scales is crucial: at large scales, the complexity of the catchment hydrochemical response means that mechanisms become obscured. Smaller catchments more likely represent critical areas within larger catchments where mechanistic understanding can be extracted from water quality monitoring and used to underpin the selection of on-farm mitigation measures.
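The hysteresis (HI) and flushing (FI) indices mentioned above can be computed in several published ways; the sketch below uses one common normalised formulation (rising-limb minus falling-limb normalised concentration at a mid-range discharge for HI; normalised concentration at peak discharge minus at event start for FI). The function names and storm data are illustrative and may differ in detail from the indices used in the study.

```python
def normalise(x):
    """Min-max normalise a sequence to [0, 1]."""
    lo, hi = min(x), max(x)
    return [(v - lo) / (hi - lo) for v in x]

def hysteresis_index(q, c, k=0.5):
    """HI at normalised discharge k: rising-limb minus falling-limb
    normalised concentration. Positive HI indicates clockwise hysteresis
    (rapidly mobilised, proximal sources)."""
    qn, cn = normalise(q), normalise(c)
    peak = qn.index(max(qn))

    def interp(points):
        # Linearly interpolate normalised concentration at discharge k.
        pts = sorted(points)
        for (q0, c0), (q1, c1) in zip(pts, pts[1:]):
            if q0 <= k <= q1:
                return c0 if q1 == q0 else c0 + (c1 - c0) * (k - q0) / (q1 - q0)
        return pts[0][1]

    rising = list(zip(qn[:peak + 1], cn[:peak + 1]))
    falling = list(zip(qn[peak:], cn[peak:]))
    return interp(rising) - interp(falling)

def flushing_index(q, c):
    """FI: normalised concentration at peak discharge minus at event start.
    Positive FI indicates flushing on the rising limb."""
    cn = normalise(c)
    return cn[q.index(max(q))] - cn[0]

# Synthetic clockwise storm event: concentration peaks before discharge.
q = [1, 3, 6, 10, 7, 4, 2]
c = [5, 20, 28, 22, 12, 8, 6]
hi_val, fi_val = hysteresis_index(q, c), flushing_index(q, c)
```

For this synthetic event both indices come out positive, consistent with clockwise hysteresis and rising-limb flushing; applied seasonally at nested scales, the signs and magnitudes of such indices underpin the mechanistic interpretations discussed above.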


Subjects
Environmental Pollutants; Water Pollutants, Chemical; Environmental Monitoring/methods; Water Pollutants, Chemical/analysis; Agriculture; Environmental Pollutants/analysis; United Kingdom; Rivers
9.
J Am Soc Nephrol ; 33(1): 238-252, 2022 Jan.

Article in English | MEDLINE | ID: mdl-34732507

ABSTRACT

BACKGROUND: Failure of the glomerular filtration barrier, primarily by loss of slit diaphragm architecture, underlies nephrotic syndrome in minimal change disease. The etiology remains unknown. The efficacy of B cell-targeted therapies in some patients, together with the known proteinuric effect of anti-nephrin antibodies in rodent models, prompted us to hypothesize that nephrin autoantibodies may be present in patients with minimal change disease. METHODS: We evaluated sera from patients with minimal change disease, enrolled in the Nephrotic Syndrome Study Network (NEPTUNE) cohort and from our own institutions, for circulating nephrin autoantibodies by indirect ELISA and by immunoprecipitation of full-length nephrin from human glomerular extract or of a recombinant purified extracellular domain of human nephrin. We also evaluated renal biopsies from our institutions for podocyte-associated punctate IgG colocalizing with nephrin by immunofluorescence. RESULTS: In two independent patient cohorts, we identified circulating nephrin autoantibodies during active disease that were significantly reduced or absent during treatment response in a subset of patients with minimal change disease. We correlated the presence of these autoantibodies with podocyte-associated punctate IgG in renal biopsies from our institutions. We also identified a patient with steroid-dependent childhood minimal change disease who progressed to end-stage kidney disease; she developed a massive post-transplant recurrence of proteinuria that was associated with high pretransplant levels of circulating nephrin autoantibodies. CONCLUSIONS: Our discovery of nephrin autoantibodies in a subset of adults and children with minimal change disease aligns with published animal studies and provides further support for an autoimmune etiology. We propose a new molecular classification of nephrin autoantibody minimal change disease to serve as a framework for the development of precision therapeutics for these patients.


Subjects
Autoantibodies/blood; Membrane Proteins/immunology; Nephrosis, Lipoid/blood; Nephrosis, Lipoid/etiology; Adult; Child; Child, Preschool; Cohort Studies; Female; Humans; Male; Nephrosis, Lipoid/pathology; Podocytes/pathology
10.
J Environ Manage ; 336: 117657, 2023 Jun 15.
Article in English | MEDLINE | ID: mdl-36878061

ABSTRACT

The effective management of sediment losses in large river systems is essential for maintaining the water resources and ecosystem services they provide. However, budgetary and logistical constraints often mean that the understanding of catchment sediment dynamics necessary to deliver targeted management is unavailable. This study trials the collection of accessible, recently deposited overbank sediment and the measurement of its colour using an office document scanner to identify the evolution of sediment sources rapidly and inexpensively in two large river catchments in the UK. The River Wye catchment has experienced significant clean-up costs associated with post-flood fine sediment deposits in both rural and urban areas. In the River South Tyne, fine sand is fouling potable water extraction and fine silts degrade salmonid spawning habitats. In both catchments, samples of recently deposited overbank sediment were collected, fractionated to either <25 µm or 63-250 µm, and treated with hydrogen peroxide to remove organic matter before colour measurement. In the River Wye catchment, an increasing downstream contribution from sources on the geological units present was identified and attributed to an increasing proportion of arable land; the numerous tributaries draining different geologies allowed overbank sediment to characterise source material on this basis. In the River South Tyne catchment, a downstream change in sediment source was initially found, and the River East Allen was identified as a representative and practical tributary sub-catchment for further investigation. The collection of samples of channel bank material and topsoils therein allowed channel banks to be identified as the dominant sediment source, with a small but downstream-increasing contribution from topsoils. In both study catchments, the colour of overbank sediments could quickly and inexpensively inform the improved targeting of catchment management measures.


Subjects
Ecosystem; Floods; Color; Geologic Sediments
11.
Int J Life Cycle Assess ; 28(2): 146-155, 2023.
Article in English | MEDLINE | ID: mdl-36685326

ABSTRACT

Goal and theoretical commentary: A number of recent life cycle assessment (LCA) studies have concluded that animal-sourced foods should be restricted, or even avoided, within the human diet due to their relatively high environmental impacts (particularly those from ruminants) compared with other protein-rich foods (mainly protein-rich plant foods). From a nutritional point of view, however, issues such as broad nutrient bioavailability, amino acid balance, digestibility, and even non-protein nutrient density (e.g., micronutrients) need to be accounted for before making such recommendations to the global population. This is especially important given the contribution of animal-sourced foods to nutrient adequacy in the global South and in vulnerable populations of high-income countries (e.g., children, women of reproductive age, and the elderly). Often, however, LCAs simplify this reality by using 'protein' as a functional unit in their models and basing their analyses on generic nutritional requirements. Even if a 'nutritional functional unit' (nFU) is utilised, it is unlikely to consider the complexities of amino acid composition and subsequent protein accretion. The discussion herein focuses on nutritional LCA (nLCA), particularly on the usefulness of nFUs such as 'protein', and whether protein quality should be considered when adopting the nutrient as an (n)FU. Further, a novel and informative case study is provided to demonstrate the strengths and weaknesses of protein-quality adjustment. Case study methods: To complement current discussions, we present an exploratory virtual experiment to determine how Digestible Indispensable Amino Acid Scores (DIAAS) might play a role in nLCA development by correcting for amino acid quality and digestibility. DIAAS is a scoring mechanism that considers the limiting indispensable amino acids (IAAs) within the IAA balance of a given food (or meal) and provides a percentage contribution relative to recommended daily IAA intakes and subsequent protein anabolism; for clarity, we focus only on single food items (4 animal-based and 4 plant-based products) in the current case exemplar. Further, we take beef as a sensitivity analysis example (which we particularly recommend when considering IAA complementarity at the meal level) to elucidate how various cuts of the same intermediary product could affect the interpretation of nLCA results for the end-product(s). Recommendations: First, we provide a list of suggestions intended to (a) assist with deciding whether protein-quality correction is necessary for a specific research question and (b) acknowledge additional uncertainties by providing mitigating opportunities to avoid misinterpretation (or worse, dis-interpretation) of protein-focused nLCA studies. We conclude that as relevant (primary) data availability from supply chain 'gatekeepers' (e.g., international agri-food distributors and processors) becomes more prevalent, detailed consideration of the IAA provision of contrasting protein sources needs to be acknowledged in nLCA studies utilising protein as an nFU, ideally quantitatively, with DIAAS being one example. We also contend that future nLCA studies should discuss the complementarity of amino acid balances at the meal level, as a minimum, rather than at the product level when assessing the protein metabolic responses of consumers. Additionally, a broader set of nutrients should ideally be included when evaluating 'protein-rich foods' that provide nutrients extending beyond amino acids, which is of particular importance when exploring dietary-level nLCA.
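The DIAAS mechanism described above, scoring a food by its most limiting digestible indispensable amino acid relative to a reference pattern, can be sketched as follows. The reference values only approximate the FAO older-child amino acid pattern, and the food composition and digestibility figures are hypothetical, so treat this purely as an illustration of the calculation.

```python
# Reference pattern (mg IAA per g protein), approximating the FAO
# older-child pattern; treat these numbers as illustrative.
REFERENCE_PATTERN = {
    "His": 20, "Ile": 32, "Leu": 66, "Lys": 57,
    "SAA": 27, "AAA": 52, "Thr": 31, "Trp": 8.5, "Val": 43,
}

def diaas(aa_content, digestibility):
    """DIAAS (%) = 100 * min over IAAs of
    (digestible IAA per g food protein) / (reference IAA per g protein).
    Returns the score and the limiting amino acid."""
    ratios = {
        aa: 100 * aa_content[aa] * digestibility[aa] / REFERENCE_PATTERN[aa]
        for aa in REFERENCE_PATTERN
    }
    limiting = min(ratios, key=ratios.get)
    return round(ratios[limiting], 1), limiting

# Hypothetical food: IAA content (mg/g protein) and a flat 90% true
# ileal digestibility across all IAAs.
food = {"His": 25, "Ile": 40, "Leu": 70, "Lys": 50,
        "SAA": 30, "AAA": 60, "Thr": 35, "Trp": 10, "Val": 45}
score, limiting = diaas(food, {aa: 0.9 for aa in food})
```

For this made-up composition, lysine is limiting and the score falls just below 80%; this is the kind of per-food result that would then feed into a protein-quality-corrected nFU.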

12.
J Soils Sediments ; 23(10): 3589-3601, 2023.
Article in English | MEDLINE | ID: mdl-37791374

ABSTRACT

Purpose: Multimodal effective particle size distributions (EPSDs) develop as flocculation and particle breakage occur dynamically under fluid shear, and such distributions have previously been reported in coastal and estuarine waters to understand flocculation processes. Here, we use time-varying multimodal EPSDs and hydraulic parameters (discharge and bed shear stress) to assess freshwater flocculation in a gravel-bed river in southern Alberta, Canada. Methods: Instantaneous discharge, volume concentration (VC), and the EPSD of suspended solids were measured during three high-discharge events at four study sites in a 10 km reach of the Crowsnest River. The EPSD and VC of suspended solids (<500 µm) were measured in the centroid of flow with a LISST-200X. Bed shear stress for the measured discharge was obtained using a flow model, MOBED. Results: Multimodal EPSDs consisted of primary particles, flocculi, microflocs, and macroflocs. Shear-dependent flocculation was consistently observed for all sites and events, reflecting low- and high-shear-stress flocculation, particle breakage, and the mobilization of particles derived from tributary sub-catchments. Higher shear stress limited flocculation to smaller floc sizes, while lower bed shear stress conditions created higher volumes of macroflocs. Conclusion: The flocculation and particle breakage processes, based on the relationships between particle size and hydraulic properties presented herein, have implications for advancing fine sediment transport models through a variable cohesion factor expressed as a function of floc size class.

13.
Clin Infect Dis ; 75(1): e491-e498, 2022 Aug 24.

Article in English | MEDLINE | ID: mdl-34467402

ABSTRACT

BACKGROUND: Coronavirus disease 2019 (COVID-19) requiring hospitalization is characterized by robust antibody production, a dysregulated immune response, and immunothrombosis. Fostamatinib is a novel spleen tyrosine kinase inhibitor that we hypothesized would ameliorate Fc activation and attenuate harmful effects of the anti-COVID-19 immune response. METHODS: We conducted a double-blind, randomized, placebo-controlled trial in hospitalized adults with COVID-19 requiring oxygen, in which patients receiving standard of care were randomized to receive fostamatinib or placebo. The primary outcome was serious adverse events by day 29. RESULTS: A total of 59 patients underwent randomization (30 to fostamatinib and 29 to placebo). Serious adverse events occurred in 10.5% of patients in the fostamatinib group compared with 22% in the placebo group (P = .2). Three deaths occurred by day 29, all in patients receiving placebo. The mean change in ordinal score at day 15 was greater in the fostamatinib group (-3.6 ± 0.3 vs -2.6 ± 0.4, P = .035), and the median length of stay in the intensive care unit was 3 days in the fostamatinib group vs 7 days with placebo (P = .07). Differences in clinical improvement were most evident in patients with severe or critical disease (median days on oxygen, 10 vs 28, P = .027). There were trends toward more rapid reductions in C-reactive protein, D-dimer, fibrinogen, and ferritin levels in the fostamatinib group. CONCLUSION: For COVID-19 requiring hospitalization, the addition of fostamatinib to standard of care was safe, and patients were observed to have improved clinical outcomes compared with placebo. These results warrant further validation in larger confirmatory trials. CLINICAL TRIALS REGISTRATION: ClinicalTrials.gov, NCT04579393.


Subjects
COVID-19 Drug Treatment; Adult; Aminopyridines; Double-Blind Method; Hospitalization; Humans; Morpholines; Oxazines/therapeutic use; Oxygen; Pyridines/therapeutic use; Pyrimidines; SARS-CoV-2; Treatment Outcome
14.
J Environ Manage ; 311: 114780, 2022 Mar 10.
Article in English | MEDLINE | ID: mdl-35278921

ABSTRACT

Accessible sediment provenance information is highly desirable for guiding targeted interventions to reduce excess diffuse agricultural sediment losses to water. Conventional sediment source fingerprinting methods can provide this information, but at high cost, thereby limiting their widespread application in catchment management. The use of sediment colour measured with an office document scanner represents an easy, fast, and inexpensive alternative method for tracing sediment sources. However, potentially poor source discrimination, and non-conservatism due to enrichment in sediment organic matter content during transport, represent possible limitations to its use. Accordingly, two refinements were trialled in a new colour-based tracing framework: treating samples with hydrogen peroxide to remove organic matter, which can potentially improve source discrimination based upon geology or soil type, and mapping differences in colour between source and sediment samples, which removes the need for a priori source groups. The River Avon in southwest England and Holbeck/Wath Beck in northeast England were studied as they have been identified as high priorities for the targeting of on-farm advice delivered through a long-running agri-environment initiative. In both catchments, colour was effective at identifying that a small proportion of each catchment, which would be considered low erosion risk, was the dominant source of the sampled sediment; this was due to poor connectivity between fields deemed to be at high risk of erosion and stream channels. The hydrogen peroxide sample treatment confirmed that sediment colour was not significantly altered by enrichment in organic matter content. This treatment, and the mapped comparison between source and suspended sediment colour, improved source discrimination, allowing more spatially refined identification of critical sediment source areas. It is argued that this new inexpensive procedure can potentially deliver more precise and reliable information to catchment managers than costly quantitative sediment source fingerprinting methods, and that it can greatly increase the availability of catchment-specific sediment source data and therefore the robust targeting of management efforts at a national scale.

15.
J Clean Prod ; 338: 130633, 2022 Mar 01.
Article in English | MEDLINE | ID: mdl-35241877

ABSTRACT

Periods of extreme wet weather elevate agricultural diffuse water pollutant loads, and climate projections for the UK suggest wetter winters. Within this context, we monitored nitrate and suspended sediment loss using a field- and landscape-scale platform in SW England during the extreme wet weather of 2019-2020. We compared this period to both the climatic baseline (1981-2010) and projected near-future (2041-2060) and far-future (2071-2090) climates, using the 95th percentiles of conventional rainfall indices generated for climate scenarios downscaled by the LARS-WG weather generator from the 19 global climate models in the CMIP5 ensemble for the RCP8.5 emission scenario. Finally, we explored relationships between pollutant loss and the rainfall indices. Field-scale monthly average nitrate losses from grassland increased from 0.39-1.07 kg ha-1 (2016-2019) to 0.70-1.35 kg ha-1 (2019-2020), whereas losses from grassland ploughed up for cereals increased from 0.63-0.83 kg ha-1 to 2.34-4.09 kg ha-1. Nitrate losses at the landscape scale increased during the 2019-2020 extreme wet-weather period to 2.04-4.54 kg ha-1. Field-scale monthly average sediment losses from grassland increased from 92-116 kg ha-1 (2016-2019) to 281-333 kg ha-1 (2019-2020), whereas corresponding losses from grassland converted to cereal production increased from 63-80 kg ha-1 to 2124-2146 kg ha-1. Landscape-scale monthly sediment losses increased from 8-37 kg ha-1 in 2018 to 15-173 kg ha-1 during the 2019-2020 wet-weather period. The 2019-2020 period was most representative of the forecast 95th percentiles of >1 mm rainfall for near- and far-future climates, and this rainfall index was related to monitored sediment, but not nitrate, loss. The elevated suspended sediment loads generated by the extreme wet weather of 2019-2020 therefore potentially provide some insight into responses to the projected >1 mm rainfall extremes under future climates at the study location.

16.
Agrofor Syst ; 96(7): 983-995, 2022.
Article in English | MEDLINE | ID: mdl-36164326

ABSTRACT

Vegetated land areas play a significant role in determining the fate of carbon (C) in the global C cycle. Riparian buffer vegetation is primarily implemented for water quality purposes, as it attenuates pollutants from immediately adjacent croplands before they reach freshwater systems. However, the prevailing conditions within buffers may sometimes promote the production and subsequent emission of soil carbon dioxide (CO2). Despite this, the understanding of soil CO2 emissions from riparian buffer vegetation, and a direct comparison with the adjacent croplands they serve, remains elusive. To quantify the extent of CO2 emissions in such an agroecosystem, we measured CO2 emissions simultaneously with soil and environmental variables for six months in a replicated plot-scale facility comprising maize cropping served by three vegetated riparian buffers: (i) a novel grass riparian buffer; (ii) a willow riparian buffer; and (iii) a woodland riparian buffer. These buffered treatments were compared with a no-buffer control. The woodland (322.9 ± 3.1 kg ha-1) and grass (285 ± 2.7 kg ha-1) riparian buffer treatments, which did not differ significantly from each other, generated significantly (p < 0.0001) larger CO2 emissions than the remaining treatments. Our results suggest that woodland and grass riparian buffers serving a maize crop pose a CO2 emission risk during maize production. These findings point to the need to consider the gaseous-emission consequences of mitigation measures conventionally implemented to improve the sustainability of water resources.

17.
Am J Kidney Dis ; 78(6): 793-803, 2021 12.
Article in English | MEDLINE | ID: mdl-34174365

ABSTRACT

RATIONALE & OBJECTIVE: B-cell depletion with rituximab has emerged as a first-line therapy for primary membranous nephropathy (MN). However, most patients do not achieve complete remission with rituximab monotherapy. In this case series, we report longer-term remission and relapse rates, anti-phospholipase A2 receptor (PLA2R) antibody levels, B-cell levels, and serious adverse events in patients with primary MN who received rituximab combined with an initial short course of low-dose oral cyclophosphamide and a course of rapidly tapered prednisone. STUDY DESIGN: Single-center retrospective case series. SETTING & PARTICIPANTS: 60 consecutive patients with primary MN treated with the combination of rituximab, low-dose cyclophosphamide, and prednisone at the Vasculitis and Glomerulonephritis Center at the Massachusetts General Hospital. FINDINGS: After treatment initiation, median follow-up was 38 (interquartile range [IQR], 25-62) months; 100% of patients achieved partial remission, defined as a urinary protein-creatinine ratio (UPCR) < 3 g/g and a 50% reduction from baseline, at a median of 3.4 months. By 2 years after treatment initiation, 83% achieved complete remission, defined as a UPCR < 0.3 g/g. The median time to complete remission was 12.4 months. Immunologic remission (defined by an anti-PLA2R titer < 14 RU/mL) was achieved by 86% and 100% of anti-PLA2R seropositive patients (n = 29) at 3 and 6 months, respectively, after treatment initiation. After 1 year, the median UPCR fell from 8.4 (IQR, 5.0-10.7) to 0.3 (IQR, 0.2-0.8) g/g (P < 0.001). No patient relapsed throughout the duration of B-cell depletion. Relapse occurred in 10% of patients at 2 years after the onset of B-cell reconstitution following the last rituximab dose. Over a combined follow-up time of 228 patient-years, 18 serious adverse events occurred. One death occurred unrelated to treatment or primary MN, and 1 patient progressed to kidney failure requiring kidney replacement therapy. 
LIMITATIONS: Absence of a comparison group. CONCLUSIONS: All patients with primary MN treated with combination therapy achieved partial remission and most achieved a durable complete remission with an acceptable safety profile.
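The remission endpoints quoted above are simple thresholds on the urinary protein-creatinine ratio (UPCR, in g/g); a minimal sketch of those definitions (the function names are ours, not the study's):

```python
# Remission criteria as stated in the abstract (UPCR in g/g).

def partial_remission(upcr: float, baseline_upcr: float) -> bool:
    """Partial remission: UPCR < 3 g/g AND a >=50% reduction from baseline."""
    return upcr < 3.0 and upcr <= 0.5 * baseline_upcr

def complete_remission(upcr: float) -> bool:
    """Complete remission: UPCR < 0.3 g/g."""
    return upcr < 0.3

# Example: the abstract's median baseline of 8.4 g/g falling to 0.3 g/g
# at 1 year satisfies the partial-remission criterion
ok = partial_remission(0.3, 8.4)
```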


Subjects
Glomerulonephritis, Membranous; Cyclophosphamide/adverse effects; Follow-Up Studies; Glomerulonephritis, Membranous/drug therapy; Humans; Immunosuppressive Agents; Prednisone; Receptors, Phospholipase A2; Retrospective Studies; Rituximab; Treatment Outcome
18.
Public Health ; 195: 158-160, 2021 Jun.
Article in English | MEDLINE | ID: mdl-34130002

ABSTRACT

OBJECTIVES: Schools in the Republic of Ireland reopened to students and staff in late August 2020. We sought to determine the test positivity rate of close contacts of cases of coronavirus disease 2019 (COVID-19) in schools during the first half-term of the 2020/2021 academic year. METHODS: National-level data from the schools' testing pathway were interrogated to determine the positivity rate of close contacts of cases of COVID-19 in Irish primary, postprimary and special schools during the first half-term of the 2020/2021 academic year. The positivity rates among adult and child close contacts were compared, and the proportion of national cases of COVID-19 who were aged 4-18 years during the observation period was calculated to assess whether this proportion increased after schools reopened. RESULTS: In total, 15,533 adult and child close contacts were tested for COVID-19 through the schools' testing pathway during the first half-term of the 2020/2021 academic year. Three hundred and ninety-nine close contacts tested positive, indicating a positivity rate of 2.6% (95% confidence interval: 2.3-2.8%). The positivity rates of child and adult close contacts were similarly low (2.6% vs 2.7%, P = 0.7). The proportion of all national cases of COVID-19 who were aged 4-18 years did not increase during the first half-term of the 2020/2021 school year. CONCLUSIONS: The low positivity rate of close contacts of cases of COVID-19 in schools indicates that transmission of COVID-19 in Irish schools during the first half-term of the 2020/2021 academic year was low. These findings support policies to keep schools open during the pandemic.
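The reported positivity rate and confidence interval can be reproduced from the counts in the abstract; a minimal sketch assuming a normal-approximation (Wald) interval, which may differ from the method the authors actually used:

```python
# 399 positives among 15,533 tested close contacts -> ~2.6% positivity,
# with an approximate 95% CI of 2.3-2.8% (Wald interval assumed).
import math

positives, tested = 399, 15_533
p = positives / tested                 # point estimate of the proportion
se = math.sqrt(p * (1 - p) / tested)   # standard error of a proportion
lo, hi = p - 1.96 * se, p + 1.96 * se  # normal-approximation 95% CI
```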


Subjects
COVID-19/epidemiology; COVID-19/transmission; Contact Tracing; Disease Outbreaks/prevention & control; Pandemics; Students/statistics & numerical data; Adult; COVID-19/prevention & control; Child; Disease Transmission, Infectious/prevention & control; Family; Humans; Ireland/epidemiology; Male; SARS-CoV-2; Schools
19.
Environ Sci Policy ; 116: 114-127, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33613120

ABSTRACT

Water quality impairment by elevated sediment loss is a pervasive problem for global water resources. Sediment management targets identify exceedance, or the sediment loss 'gap' requiring mitigation. In the UK, palaeo-limnological reconstruction of sediment loss during the 100-150 years pre-dating the post-World War II intensification of agriculture has identified management targets (0.20-0.35 t ha-1 yr-1) representing 'modern background sediment delivery to rivers'. To assess exceedance on land used for grazing ruminant farming, an integrated approach combined new mechanistic evidence from a heavily instrumented experimental farm platform with a scaling-out framework of modelled commercial grazing ruminant farms in similar environmental settings. Monitoring (2012-2016) on the instrumented farm platform returned sediment loss ranges of 0.11-0.14 t ha-1 yr-1 and 0.21-0.25 t ha-1 yr-1 on permanent pasture, compared with 0.19-0.23 t ha-1 yr-1 and 0.43-0.50 t ha-1 yr-1, and 0.10-0.13 t ha-1 yr-1 and 0.25-0.30 t ha-1 yr-1, on pasture with scheduled plough and reseeds. Excess sediment loss existed on all three farm platform treatments but was more extensive on the two treatments with scheduled plough and reseeds. Excess sediment loss from land used by grazing ruminant farming more widely across England was estimated at up to >0.2 t ha-1 yr-1. Modelled scenarios of alternative farming futures, based on either increased uptake of interventions typically recommended by visual farm audits or interventions selected using new mechanistic understanding of sediment loss from the instrumented farm platform, returned only minimal sediment loss reductions. On the farm platform these were 2.1 % (up to 0.007 t ha-1 yr-1) and 5.1 % (up to 0.018 t ha-1 yr-1); more widely across England, they were up to 2.8 % (0.014 t ha-1 yr-1) and 4.1 % (0.023 t ha-1 yr-1). Conventional on-farm measures will therefore not fully close the sediment loss gap, meaning that more severe land cover change is required.
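The sediment loss 'gap' described above is simply the excess of monitored loss over the management target; a minimal sketch of that arithmetic (the helper function is our illustration, not the study's published method):

```python
# Management target range from the abstract: 0.20-0.35 t ha-1 yr-1
# ('modern background sediment delivery to rivers').
TARGET_LOW, TARGET_HIGH = 0.20, 0.35

def loss_gap(monitored: float, target: float = TARGET_HIGH) -> float:
    """Excess sediment loss above the target, in t ha-1 yr-1 (0 if within target)."""
    return max(0.0, monitored - target)

# e.g. the upper plough-and-reseed range of 0.50 t ha-1 yr-1 against the
# upper target of 0.35 leaves a gap of 0.15 t ha-1 yr-1
gap = loss_gap(0.50)
```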

20.
J Soils Sediments ; 21(4): 1875-1889, 2021.
Article in English | MEDLINE | ID: mdl-34720744

ABSTRACT

PURPOSE: Intensive livestock grazing has been associated with an increased risk of soil erosion and concomitant negative impacts on the ecological status of watercourses. Whilst various mitigation options are promoted for reducing livestock impacts, there is a paucity of data on the relationship between stocking rates and quantified sediment losses. This evidence gap means there is uncertainty regarding the cost-benefit of policy-preferred best management. METHODS: Sediment yields from 15 hydrologically isolated field-scale catchments on a heavily instrumented ruminant livestock farm in south-west UK were investigated over ~26 months spread across 6 years. Sediment yields were compared to cattle and sheep stocking rates on long-term, winter (November-April), and monthly timescales. The impacts of livestock on soil vegetation cover and bulk density were also examined. Cattle were tracked using GPS collars to determine how grazing related to soil damage. RESULTS: No impact of livestock stocking rates of 0.15-1.00 UK livestock units (LU) ha-1 for sheep, and 0-0.77 LU ha-1 for cattle, on sediment yields was observed at any of the three timescales. Cattle preferentially spent time close to specific fences where soils were visually damaged. However, there was no indication that livestock had a significant effect on soil bulk density at the field scale. Livestock were housed indoors during winters, when most rainfall occurs, and best management practices were used, which, combined with low-erodibility clayey soils, likely limited sediment losses. CONCLUSION: A combination of clayey soils and soil trampling in only a small proportion of the field areas led to little impact from grazing livestock. Within similar landscapes with best-practice livestock grazing management, additional targeted measures to reduce erosion are unlikely to yield a significant cost-benefit.
SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s11368-021-02909-y.
