ABSTRACT
Although kidney transplantation from living donors (LD) offers better long-term results than transplantation from deceased donors (DD), elderly recipients are less likely to receive LD transplants than younger ones. We analyzed renal transplant outcomes from LD versus DD in elderly recipients using a propensity score-matched analysis. This retrospective, observational study included the first single kidney transplants in recipients aged ≥65 years from two European registry cohorts (2013-2020, n = 4,257). Recipients of LD (n = 408), brain death donor (BDD, n = 3,072), and controlled cardiocirculatory death donor (cDCD, n = 777) kidneys were matched for donor and recipient age, sex, dialysis time and recipient diabetes. Major graft and patient outcomes were investigated. Unmatched analyses showed that LD recipients were more likely to be transplanted preemptively and had shorter dialysis times than recipients of either DD type. Propensity score-matched Cox regression between LD and BDD (387 pairs) and between LD and cDCD (259 pairs) revealed a higher hazard ratio for graft failure with BDD (2.19 [95% CI: 1.16-4.15], p = 0.016) and cDCD (3.38 [95% CI: 1.79-6.39], p < 0.001). One-year eGFR was higher in LD recipients than in BDD and cDCD recipients. In elderly recipients, LD transplantation offers superior graft survival and renal function compared to BDD or cDCD transplantation. This strategy should be further promoted to improve transplant outcomes.
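As an illustration of the matching step described in this abstract, the sketch below pairs each living-donor recipient with the nearest deceased-donor recipient on a logistic-regression propensity score and then fits a Cox model on the matched cohort. It is a minimal sketch under stated assumptions: the column names (donor_age, recipient_age, recipient_sex, dialysis_months, diabetes, living_donor, time_to_graft_failure, graft_failure) are hypothetical, not the registry's actual fields, and matching is done with replacement for simplicity.

```python
# Minimal 1:1 propensity-score matching sketch (assumed, numerically coded columns).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

covariates = ["donor_age", "recipient_age", "recipient_sex", "dialysis_months", "diabetes"]

def match_ld_vs_dd(df: pd.DataFrame) -> pd.DataFrame:
    """Match each living-donor (LD) recipient to the closest deceased-donor recipient."""
    ps_model = LogisticRegression(max_iter=1000)
    ps_model.fit(df[covariates], df["living_donor"])          # 1 = LD, 0 = DD
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])

    ld, dd = df[df["living_donor"] == 1], df[df["living_donor"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(dd[["ps"]])
    _, idx = nn.kneighbors(ld[["ps"]])
    return pd.concat([ld, dd.iloc[idx.ravel()]])              # matched pairs (with replacement)

# Cox regression on the matched cohort, with graft failure as the event:
# matched = match_ld_vs_dd(registry_df)
# CoxPHFitter().fit(matched[["time_to_graft_failure", "graft_failure", "living_donor"]],
#                   duration_col="time_to_graft_failure",
#                   event_col="graft_failure").print_summary()
```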
Subject(s)
Graft Survival, Kidney Transplantation, Living Donors, Propensity Score, Registries, Humans, Kidney Transplantation/statistics & numerical data, Male, Female, Aged, Retrospective Studies, Europe, Tissue Donors, Age Factors, Graft Rejection, Treatment Outcome, Aged, 80 and over
ABSTRACT
BACKGROUND: The criteria for vascular access (VA) selection in the pediatric hemodialysis (HD) population have changed over time, culminating in the current patient-centered approach based on the individualized Life-Plan. We analyzed the type of VA used by incident and prevalent end-stage kidney disease (ESKD) pediatric patients (pts) treated with HD in Catalonia. METHOD: Data from the Catalan Renal Registry on ESKD pts under 18 years of age undergoing kidney replacement therapy (KRT) were examined over a 22-year period (1997-2018). RESULTS: The proportion of ESKD children starting KRT through HD decreased progressively from 55.6% (1997-2001) to 38.2% (2012-2018) and, conversely, the proportion starting KRT by preemptive kidney transplantation (KT) increased from 28.9% to 42.6% between the same periods (for both comparisons, p = 0.007). Most ESKD pts started HD through an arteriovenous fistula (AVF) from 1997 to 2001 (56.5%), but this percentage decreased over time and no AVFs were used to start HD in children from 2012 to 2018. Likewise, the percentage of children starting HD through a tunneled catheter increased progressively from 8.7% to 72.2% between the same periods (for both comparisons, p < 0.001). Regarding prevalent ESKD pts, the proportion of children on HD decreased from 34.9% in 1997 to 4.7% in 2018 and, conversely, the proportion with a functioning kidney graft increased from 62.8% to 92.4% during the same period (for both comparisons, p < 0.001). There was a progressive decrease in the percentage of children dialyzed through an AVF, from 100% in 1997 to 0% in 2018 (p < 0.001). The KT rate increased from 5.4 per million population (pmp) in 1997 to 17.1 pmp in 2018 (p = 0.007). The median time on HD before the first KT progressively decreased to 6.6 months (2014-2018). CONCLUSION: The high KT rate was a determining factor in the choice of VA type in the incident and prevalent pediatric population treated with HD in Catalonia.
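The period-to-period comparisons of proportions reported above (e.g. the shift from AVF to tunneled catheters) correspond to standard chi-square tests on the underlying counts. A minimal sketch follows; the counts are placeholders for illustration only, not the registry's actual numbers.

```python
from scipy.stats import chi2_contingency

# Rows: study periods; columns: children starting HD by AVF vs. by catheter.
# Placeholder counts, not registry data.
counts = [[13, 10],   # 1997-2001
          [ 2, 16]]   # 2012-2018

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```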
ABSTRACT
BACKGROUND: Studies have focused on the incidence of, and risk factors (RFs) associated with, reaching the final stage of chronic kidney disease (CKD-G5) and receiving kidney replacement therapy (KRT). Analysis of the factors related to reaching CKD-G5 while receiving conservative kidney management (CKM) has been neglected. METHODS: Retrospective cohort study analysing electronic health records of individuals aged ≥50 years with eGFR <60 mL/min/m2. Cumulative incidence rates of CKD-G5, with and without KRT, were calculated. Multinomial regression models estimated odds ratios (ORs) for progression to CKD-G5 with KRT, progression to CKD-G5 with CKM, or death. RESULTS: Among 332,164 patients, the cumulative incidence of CKD-G5 was 2.79 cases per 100 person-years: 1.92 for CKD-G5 with KRT and 0.87 for CKD-G5 with CKM. Low eGFR and albuminuria were the primary RFs. Male gender and uncontrolled blood pressure had a greater impact on KRT (OR = 2.63 CI, 1.63) than on CKD-G5 with CKM (OR = 1.45 CI, 1.31). Increasing age and rurality reduced the probability of KRT but increased the probability of CKD-G5 with CKM. Higher income decreased the likelihood of developing CKD-G5 both with and without KRT (OR = 0.49 CI). CONCLUSION: One-third of CKD-G5 cases receive CKM. These patients are typically older, female, rural residents with lower incomes and with less proteinuria or fewer cardiovascular RFs. The likelihood of receiving KRT is influenced by location and socioeconomic disparities.
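The multinomial model used above to contrast the competing end points (CKD-G5 with KRT, CKD-G5 with CKM, or death) can be sketched with statsmodels; odds ratios are obtained by exponentiating the coefficients. The column names (egfr, albuminuria, age, male, uncontrolled_bp, rural, income, outcome) are assumptions for illustration, not the study's variable names.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# outcome coded (assumed) as 0 = stable CKD, 1 = CKD-G5 with KRT, 2 = CKD-G5 with CKM, 3 = death
predictors = ["egfr", "albuminuria", "age", "male", "uncontrolled_bp", "rural", "income"]

def fit_multinomial(df: pd.DataFrame):
    """Multinomial logistic regression; returns the fitted model and the odds ratios."""
    X = sm.add_constant(df[predictors])
    model = sm.MNLogit(df["outcome"], X).fit(disp=False)
    odds_ratios = np.exp(model.params)     # one column per non-reference outcome
    return model, odds_ratios

# model, ors = fit_multinomial(cohort_df)
# print(model.summary())
```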
ABSTRACT
BACKGROUND: The number of patients over 80 years old with chronic kidney disease who start a hemodialysis (HD) program has been increasing in the last decade. METHODS: We aimed to identify risk factors for morbidity and mortality in patients older than 80 years with end-stage renal disease who started HD. We conducted a retrospective observational study of the Catalan Renal Registry (RMRC). RESULTS: A total of 2833 patients aged 80 years or older (of 15,137) who started HD between 2002 and 2019 in the RMRC were included in the study. In this group, the first dialysis was performed through an arteriovenous fistula in 44%, a percutaneous catheter in 28.2%, and a tunneled catheter in 26.6%. Conventional dialysis was used in 65.7% and online HD in 34.3%. The most frequent cause of death was cardiac disease (21.8%), followed by social problems (20.4%) and infections (15.9%). First-year survival was 84% in older HD patients versus 91% in those younger than 80 years (p < 0.001). Cox regression analysis identified starting HD in the period 2002-2010, chronic obstructive pulmonary disease (COPD), and starting HD through a vascular graft as risk factors for first-year mortality after dialysis initiation in patients older than 80 years. CONCLUSIONS: Patients older than 80 years who started an HD program had higher mortality, especially those who presented an exacerbation of kidney disease, those with COPD, and those who started HD with a vascular graft.
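The Cox model for first-year mortality described in this abstract can be sketched with the lifelines package. This is a sketch under stated assumptions: the column names (months_followup, death_first_year, period_2002_2010, copd, start_with_graft) are hypothetical, and follow-up is assumed to be censored at 12 months.

```python
from lifelines import CoxPHFitter

# df columns (assumed): months_followup (capped at 12), death_first_year (0/1),
# period_2002_2010, copd, start_with_graft (all 0/1 indicators)
risk_factors = ["period_2002_2010", "copd", "start_with_graft"]

def first_year_cox(df):
    """Fit a Cox model for death within the first year on HD and return the hazard ratios."""
    cph = CoxPHFitter()
    cph.fit(df[["months_followup", "death_first_year"] + risk_factors],
            duration_col="months_followup",
            event_col="death_first_year")
    return cph.hazard_ratios_     # exp(coef) for each risk factor

# print(first_year_cox(rmrc_df))
```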
ABSTRACT
BACKGROUND: Kidney transplantation (KT) is considered the best kidney replacement therapy (KRT) option for most end-stage kidney disease (ESKD) patients, and the arteriovenous fistula (AVF) is considered the best vascular access (VA) for most haemodialysis (HD) patients. In this study, we investigated the effect of KT activity on AVF use in prevalent HD patients. The probability of receiving a kidney graft (KTx) over time, depending on the first VA used to start the HD program, was also evaluated. METHODS: Data from the Catalan Registry on prevalent patients on KRT by either KT or HD were examined over a 20-year period (1997-2017). RESULTS: The percentage of prevalent ESKD patients with a functioning KTx increased from 40.5% in 1997 to 57.0% in 2017 and, conversely, the percentage of AVF use in HD patients decreased from 86.0% to 63.2% during the same period (for both comparisons, p < 0.001). This inverse relationship was also demonstrated in other countries and regions worldwide by simple linear regression analysis (R2 = 0.4974, p = 0.002). The probability of prevalent patients being dialysed through an AVF in Catalonia was independently associated with the percentage of functioning KTx in the KRT population, after adjusting for age, gender, primary kidney disease, time on KRT, cardiovascular disease and type of HD unit. Incident patients starting HD through an AVF had a significantly higher probability of receiving a KTx over time than patients who initiated HD through a catheter (hazard ratio 1.68 [95% confidence interval: 1.41-2.00], p < 0.001). CONCLUSIONS: In addition to demographic and clinical characteristics of patients and the type of HD unit, KT activity can be a determining factor in AVF use in prevalent HD patients. Starting an HD programme through an AVF is independently associated with a greater probability of receiving a KTx compared with starting HD through a catheter.
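The inverse relationship between KT activity and AVF use reported above (R2 = 0.4974, p = 0.002) corresponds to an ordinary least-squares fit across countries and regions. A minimal sketch with scipy follows; the arrays are placeholder values for illustration, not the registry figures.

```python
from scipy.stats import linregress

# Placeholder values: % of KRT patients with a functioning graft (x) and
# % of prevalent HD patients dialysed through an AVF (y), one point per country/region.
pct_functioning_ktx = [40.5, 45.0, 50.2, 53.1, 57.0, 60.3]
pct_avf_use         = [86.0, 80.1, 75.4, 71.9, 63.2, 60.5]

fit = linregress(pct_functioning_ktx, pct_avf_use)
print(f"R^2 = {fit.rvalue**2:.4f}, slope = {fit.slope:.2f}, p = {fit.pvalue:.3f}")
```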
Subject(s)
Arteriovenous Shunt, Surgical, Kidney Failure, Chronic, Kidney Transplantation, Humans, Kidney Transplantation/adverse effects, Renal Dialysis, Kidney Failure, Chronic/diagnosis, Kidney Failure, Chronic/therapy, Kidney Failure, Chronic/epidemiology, Renal Replacement Therapy, Registries, Arteriovenous Shunt, Surgical/adverse effects, Retrospective Studies
ABSTRACT
Background: There is a lack of information regarding the best dialysis technique after kidney transplant (KT) failure. The aim of this study was to compare the effect of kidney replacement therapy modality after graft failure, namely peritoneal dialysis (TX-PD-TX), haemodialysis (TX-HD-TX) and preemptive deceased donor retransplantation (TX-TX), on patient survival and second KT outcomes. Methods: A retrospective observational study based on the Catalan Renal Registry was carried out. We included adult patients whose first KT failed between 2000 and 2018. Results: Among 2045 patients, 1829 started on HD (89.4%), 168 on PD (8.2%) and 48 (2.4%) received a preemptive KT. Non-inclusion on the KT waiting list and HD were associated with worse patient survival. For patients included on the waiting list, the probability of human leucocyte antigen (HLA) sensitization and of receiving a second KT was similar in HD and PD. A total of 776 patients (38%) received a second KT: 656 in the TX-HD-TX, 72 in the TX-PD-TX and 48 in the TX-TX groups. Adjusted mortality after the second KT was higher in TX-HD-TX patients than in the TX-TX and TX-PD-TX groups, with no difference between the TX-TX and TX-PD-TX groups. Death-censored second graft survival was similar in all three groups. Conclusions: Our results suggest that after first KT failure, PD is superior to HD in reducing mortality in candidates for a second KT who have no option for preemptive retransplantation.
ABSTRACT
BACKGROUND: Data on vascular access (VA) use in failed kidney transplant (KT) patients returning to haemodialysis (HD) are limited. We analysed the VA profile of these patients, the factors associated with the likelihood of HD re-initiation through an arteriovenous fistula (AVF) and the effect of the VA in use at the time of KT on kidney graft (KTx) outcome. METHOD: Data from the Catalan Registry on failed KT patients restarting HD and on incident HD patients with native kidney failure were examined over an 18-year period. RESULTS: The VA profile of 675 failed KT patients at HD re-initiation, compared with that before KT and with 16,731 incident patients starting HD, was (%): AVF 79.3 versus 88.6 and 46.2 (p = 0.001 and p < 0.001), arteriovenous graft (AVG) 4.4 versus 2.6 and 1.1 (p = 0.08 and p < 0.001), tunnelled catheter (TCC) 12.4 versus 5.5 and 18.0 (p = 0.001 and p < 0.001), and non-tunnelled catheter 3.9 versus 3.3 and 34.7 (p = 0.56 and p < 0.001). The likelihood of HD re-initiation by AVF was significantly lower in patients with cardiovascular disease, KT duration >5 years, dialysis through an AVG or TCC before KT, and female sex. Kaplan-Meier analysis showed greater KTx survival in patients dialysed through arteriovenous access than in patients using a catheter just before KT (χ2 = 5.59, p = 0.0181, log-rank test). Cox regression analysis showed that patients on HD through arteriovenous access at the time of KT had a lower probability of KTx loss than those with a catheter (hazard ratio 0.71, 95% CI 0.55-0.90, p = 0.005). CONCLUSIONS: The VA profiles of failed KT patients returning to HD and of incident patients starting HD were different. Compared with before KT, the proportion of failed KT patients restarting HD with an AVF decreased significantly at the expense of TCC use. Patients on HD through arteriovenous access at the time of KT showed greater KTx survival than those using a catheter.
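The Kaplan-Meier and log-rank comparison of graft survival by access type at the time of KT can be sketched with lifelines. The column names used below (years_to_graft_loss, graft_loss, av_access) are assumptions for illustration only.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_graft_survival(df):
    """df columns (assumed): years_to_graft_loss, graft_loss (0/1),
    av_access (1 = AVF/AVG at time of KT, 0 = catheter)."""
    av = df[df["av_access"] == 1]
    cath = df[df["av_access"] == 0]

    # Log-rank test between the two access groups.
    result = logrank_test(av["years_to_graft_loss"], cath["years_to_graft_loss"],
                          event_observed_A=av["graft_loss"],
                          event_observed_B=cath["graft_loss"])

    # Kaplan-Meier estimate for the arteriovenous access group.
    km = KaplanMeierFitter().fit(av["years_to_graft_loss"], av["graft_loss"],
                                 label="arteriovenous access")
    return result.test_statistic, result.p_value, km.median_survival_time_
```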
ABSTRACT
Outcomes of kidney transplantation (KT) after controlled circulatory death (cDCD) with highly expanded criteria donors (ECD) and recipients have not been thoroughly evaluated. In a multicenter cohort of 1161 consecutive KTs, we analyzed granular baseline donor and recipient factors predicting transplant outcomes, selected by bootstrapping and Cox proportional hazards modelling, and validated them in a contemporaneous European KT cohort (n = 1585). Overall, 74.3% were DBD-KT and 25.7% cDCD-KT. ECD-KT showed the poorest graft survival rates, irrespective of cDCD or DBD donation (log-rank < 0.001). Beyond the standard ECD classification, dialysis vintage, older recipient age and previous recipient cardiovascular events, together with low class II HLA match, long cold ischemia time and the combination of a diabetic donor with cDCD donation, predicted graft loss (C-index 0.715, 95% CI 0.675-0.755). External validation showed good prediction accuracy (C-index 0.697, 95% CI 0.643-0.741). Older recipient age, male gender, dialysis vintage, previous cardiovascular events and receiving a cDCD kidney independently predicted patient death. The benefit/risk of undergoing KT was assessed against concurrent waitlisted candidates; although undergoing KT outperformed remaining waitlisted, remarkably high mortality rates were predicted when KT was undertaken under the worst risk-prediction scenario. Strategies to increase the donor pool, including cDCD transplants with highly expanded donor and recipient candidates, should be pursued with caution.
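The external validation reported above relies on the concordance index (C-index). A minimal sketch of how the discrimination of a fitted Cox model could be checked on an independent cohort with lifelines is shown below; the column names (years_to_graft_loss, graft_loss) and the validation_cohort variable are assumptions, not the study's actual objects.

```python
from lifelines.utils import concordance_index

def validate_discrimination(cox_model, external_df):
    """C-index of a fitted lifelines CoxPHFitter on an external cohort (assumed columns)."""
    risk_scores = cox_model.predict_partial_hazard(external_df)
    # Higher risk should correspond to shorter graft survival, hence the minus sign.
    return concordance_index(external_df["years_to_graft_loss"],
                             -risk_scores,
                             external_df["graft_loss"])

# c_index = validate_discrimination(cph, validation_cohort)
```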
Subject(s)
Graft Survival, Kidney Transplantation, Aged, Allografts, Humans, Kidney, Male, Tissue Donors
ABSTRACT
The number of kidney transplant (KT) procedures with controlled donation after circulatory death (cDCD) donors has increased exponentially in Spain in recent years, with a parallel expansion of donor and recipient acceptance criteria. The outcomes of cDCD-KT have been reported to be comparable to those of KT with donation after brain death (DBD) donors. However, studies in elderly recipients have yielded contradictory results. We performed a registry analysis of 852 KT recipients aged ≥65 years (575 in the DBD-KT group, 277 in the cDCD-KT group) in Catalonia, Spain. Clinical outcomes and survival were compared between DBD-KT and cDCD-KT recipients. Donor and recipient ages were similar between the two groups (71.5 ± 8.7 years for donors, 70.8 ± 4.1 years for recipients). Delayed graft function (DGF) was more frequent among cDCD-KT recipients, without a difference in the rate of primary nonfunction. The 3-year patient and death-censored graft survival rates were similar between DBD-KT and cDCD-KT recipients (78.8% vs. 76.4% and 90.3% vs. 86.6%, respectively). In multivariable analysis, previous cardiovascular disease and DGF were independent risk factors for patient death. The type of donation (cDCD vs. DBD) was not an independent risk factor for patient death or graft loss. cDCD-KT and DBD-KT provide comparable patient and graft survival in elderly recipients.
Subject(s)
Kidney Transplantation, Tissue and Organ Procurement, Aged, Brain Death, Child, Preschool, Cohort Studies, Death, Graft Survival, Humans, Registries, Retrospective Studies, Tissue Donors
ABSTRACT
In aquaculture, biofouling management is a difficult and expensive issue. Cuprous oxide has commonly been used to prevent fouling formation. To reduce the cost of net management and the use of copper, the industry has proposed several alternatives. Currently, polyurethane coatings are being explored and commercially implemented. With this alternative, net cleaning is done in situ, reducing the number of nets necessary to raise a batch and thus, ideally, operational costs. This pilot study compared this new strategy to the use of cuprous oxide. The results show that nets treated with antifouling perform better and that the bioaccumulation of copper in fish tissues does not pose health risks to the fish. Alternatives involving on-site cleaning need to improve their efficiency. Although the conditions of this work are not fully comparable to commercial aquaculture conditions, the results may indicate the strengths and constraints of the solutions tested under real-life conditions.
ABSTRACT
Normothermic regional perfusion (NRP) allows the in situ perfusion of organs with oxygenated blood in donation after the circulatory determination of death (DCDD). We aimed to evaluate the impact of NRP on the short-term outcomes of kidney transplants in controlled DCDD (cDCDD). This is a multicenter, nationwide, retrospective study comparing cDCDD kidneys obtained with NRP versus the standard rapid recovery (RR) technique. During 2012-2018, 2302 cDCDD adult kidney transplants were performed in Spain using NRP (n = 865) or RR (n = 1437). The study groups differed in donor and recipient age, warm and cold ischemia times, and use of ex situ machine perfusion. Transplants in the NRP group were more frequently performed in high-volume centers (≥90 transplants/year). Through propensity score matching, two cohorts with a total of 770 patients were obtained. After matching, no statistically significant differences were observed between the groups in terms of primary nonfunction (p = .261) or mortality at 1 year (p = .111). However, RR of kidneys was associated with significantly increased odds of delayed graft function (OR 1.97 [95% CI 1.43-2.72]; p < .001) and 1-year graft loss (OR 1.77 [95% CI 1.01-3.17]; p = .034). In conclusion, compared with RR, NRP appears to improve the short-term outcomes of cDCDD kidney transplants.
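The odds ratios for delayed graft function and 1-year graft loss in the matched cohort correspond to logistic regressions on the recovery technique. A minimal sketch with statsmodels is shown below; the column names (dgf, recovery_rr) are assumptions, with recovery_rr coding rapid recovery (1) versus NRP (0).

```python
import numpy as np
import statsmodels.api as sm

def dgf_odds_ratio(df):
    """df columns (assumed): dgf (0/1), recovery_rr (1 = rapid recovery, 0 = NRP)."""
    X = sm.add_constant(df[["recovery_rr"]])
    fit = sm.Logit(df["dgf"], X).fit(disp=False)
    odds_ratio = np.exp(fit.params["recovery_rr"])
    ci_low, ci_high = np.exp(fit.conf_int().loc["recovery_rr"])
    return odds_ratio, (ci_low, ci_high), fit.pvalues["recovery_rr"]

# or_, ci, p = dgf_odds_ratio(matched_cohort)
```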
Subject(s)
Kidney Transplantation, Tissue and Organ Procurement, Adult, Death, Graft Survival, Humans, Organ Preservation, Perfusion, Retrospective Studies, Tissue Donors
ABSTRACT
We redefine the genus Troglocharinus Reitter, 1908 based on a phylogenetic analysis combining mitochondrial and nuclear molecular data. We recovered the current Speonomites mengeli (Jeannel, 1910) and S. mercedesi (Zariquiey, 1922) as valid, separate species within the Troglocharinus clade, not directly related to Speonomites Jeannel, 1910, a finding corroborated by a detailed study of the male and female genitalia. Consequently, we reinstate Speonomus mercedesi Zariquiey, 1922 stat. nov. as a valid species, transfer both species to the genus Troglocharinus, as T. mengeli (Jeannel, 1910) comb. nov. and T. mercedesi (Zariquiey, 1922) comb. nov., and redescribe the genus. The study of new material from the distribution area of the former S. mengeli revealed the presence of two undescribed species, T. sendrai sp. nov. and T. fadriquei sp. nov., which we describe herein. We designate the lectotype of Speonomus vinyasi Escolà, 1971 to fix its identity, as its syntypes include two different species. In agreement with the results of the phylogenetic analyses, we establish the synonymy between the genus Speonomites and Pallaresiella Fresneda, 1998 syn. nov.
Subject(s)
Coleoptera, Animals, Coleoptera/genetics, Female, Male, Phylogeny
ABSTRACT
BACKGROUND: Some studies have shown that obesity is associated with lower mortality in haemodialysis (HD) patients. However, few studies have addressed the association between body mass index (BMI) and outcomes in peritoneal dialysis (PD) patients. METHODS: We performed this longitudinal, retrospective study to evaluate the impact of obesity on PD patients, using data from the Catalan Registry of Renal Patients from 2002 to 2015 (n = 1573). Obesity was defined as BMI ≥30 kg/m2, low weight as BMI <18.5 kg/m2, normal range as BMI 18.5-24.99 kg/m2 and pre-obesity as BMI 25-29.99 kg/m2. Variations in BMI were calculated during follow-up. The main outcomes evaluated were technique and patient survival. RESULTS: Obesity was observed in 20% of patients starting PD. We did not find differences in sex or PD modality; the obese group was older (65.9% aged ≥55 years versus 59% of non-obese patients, P = 0.003) and presented more diabetes mellitus and cardiovascular disease (CVD) (47.9% obese versus 25.1% non-obese, and 41.7% versus 31.5%, respectively). We did not observe differences in haemoglobin, albumin or Kt/V in obese patients. We found no difference in peritonitis rate between groups, although peritonitis was more frequent in patients on continuous ambulatory peritoneal dialysis and in those aged ≥65 years [sub-hazard ratio (SHR) = 1.75, P < 0.001 and SHR = 1.56, P = 0.009]. Regarding technique survival, transfer to HD was higher in the obese group in the univariate analysis, but this was not confirmed in the multivariate analysis (SHR = 1.12, P = 0.4), and we found no differences in mortality rate. Underweight patients, the elderly and patients with CVD or diabetic nephropathy were less likely to undergo kidney transplantation (SHR = 0.65, 0.24, 0.5 and 0.54, P < 0.05). In obese patients, weight changes were not associated with differences in survival, but in normal-weight patients a gain of 7% of basal weight during the first year had a protective effect on death risk (hazard ratio 0.6, P = 0.034). CONCLUSIONS: Obese and non-obese patients starting PD had similar outcomes.
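Outcomes such as transfer to HD, death and transplantation are analysed here with competing-risks methods (sub-hazard ratios). The cumulative incidence side of such an analysis can be sketched with lifelines' Aalen-Johansen estimator; the Fine-Gray regression producing the SHRs is typically run in R's cmprsk package and is not shown. The column names and event coding below are assumptions for illustration.

```python
from lifelines import AalenJohansenFitter

# Event coding (assumed): 0 = censored, 1 = transfer to HD, 2 = death, 3 = transplantation.
def cumulative_incidence_transfer(df):
    """Cumulative incidence of transfer to HD, treating death and transplantation as competing events."""
    ajf = AalenJohansenFitter()
    ajf.fit(df["months_on_pd"], df["event_code"], event_of_interest=1)
    return ajf.cumulative_density_

# curve = cumulative_incidence_transfer(pd_cohort)
```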
ABSTRACT
BACKGROUND: Patients with end-stage renal disease survive longer after kidney transplantation (KT) than those remaining on dialysis. Nevertheless, this benefit remains uncertain when the kidney comes from a donor ≥80 years old. METHODS: In a longitudinal mortality study of the Catalan Renal Registry including 2585 patients ≥60 years old on dialysis and placed on the KT waiting list, 1084 received a first KT from a deceased donor aged 60 to 79 years and 128 from a deceased donor ≥80 years. We calculated the adjusted risk of graft loss by competing-risks regression, considering patient death with a functioning graft as a competing event. To assess the patient survival benefit of KT, we calculated the adjusted risk of death by nonproportional hazards analysis, modelling transplantation as a time-dependent effect. Considering all KTs from donors ≥60 years (n = 1212), we assessed whether the benefit of KT varied across recipient characteristics by testing the interaction between each potential mortality risk factor and the treatment group. RESULTS: Compared with kidneys from donors aged 60 to 79 years, graft survival was significantly lower for kidneys from donors aged ≥80 years (subhazard ratio = 1.55; 95% confidence interval, 1.00-2.38; P = 0.048). Compared with those who remained on dialysis, the adjusted risk of death 12 months after transplantation in recipients of a kidney from donors ≥80 years was 0.54 (95% confidence interval, 0.38-0.77; P < 0.0001). CONCLUSIONS: Although KT from octogenarian deceased donors is associated with reduced graft survival, recipients had lower mortality than those remaining on dialysis, even when the kidney came from an extremely aged donor.
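Treating transplantation as a time-dependent effect, as described above, avoids immortal-time bias: each candidate contributes waiting-list time until KT and post-KT time afterwards. A minimal sketch in long (start-stop) format with lifelines is shown below; the column names (id, start, stop, death, transplanted) and the data layout are assumptions for illustration.

```python
import numpy as np
from lifelines import CoxTimeVaryingFitter

# Long-format rows (assumed columns): one row per patient per period, with
#   id, start, stop, death (0/1 at the end of the interval),
#   transplanted (0 while on the waiting list, 1 after KT).
def fit_time_dependent_kt(long_df):
    """Hazard ratio of death after KT versus remaining waitlisted."""
    ctv = CoxTimeVaryingFitter()
    ctv.fit(long_df[["id", "start", "stop", "death", "transplanted"]],
            id_col="id", start_col="start", stop_col="stop", event_col="death")
    return np.exp(ctv.params_["transplanted"])

# hr = fit_time_dependent_kt(waitlist_long_df)
```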
Subject(s)
Donor Selection/standards, Graft Survival, Kidney Failure, Chronic/therapy, Kidney Transplantation/standards, Waiting Lists/mortality, Age Factors, Aged, Aged, 80 and over, Female, Follow-Up Studies, Humans, Kidney Failure, Chronic/mortality, Kidney Transplantation/adverse effects, Kidney Transplantation/statistics & numerical data, Longitudinal Studies, Male, Middle Aged, Prognosis, Registries/statistics & numerical data, Renal Dialysis/statistics & numerical data, Retrospective Studies, Risk Factors, Survival Analysis, Time Factors, Treatment Outcome
ABSTRACT
MOTIVATION: Detailed patient data are crucial for medical research. Yet, these healthcare data can only be released for secondary use if they have undergone anonymization. RESULTS: We present and describe µ-ANT, a practical and easily configurable anonymization tool for (healthcare) data. It implements several state-of-the-art methods to offer robust privacy guarantees while preserving the utility of the anonymized data as much as possible. µ-ANT also supports the heterogeneous attribute types commonly found in electronic healthcare records and targets both practitioners and software developers interested in data anonymization. AVAILABILITY AND IMPLEMENTATION: Source code, documentation, executable, sample datasets and use case examples are available at https://github.com/CrisesUrv/microaggregation-based_anonymization_tool.
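µ-ANT is built around microaggregation-based k-anonymity. The snippet below is not µ-ANT's API; it is a generic, deliberately simplified illustration of univariate numerical microaggregation (groups of at least k records replaced by their mean), intended only to convey the idea behind the tool. For real use, refer to the tool's repository linked above.

```python
import pandas as pd

def microaggregate_numeric(values: pd.Series, k: int = 3) -> pd.Series:
    """Simplified univariate microaggregation: sort the values, split them into groups
    of at least k records, and replace every value by its group mean, so each released
    value is shared by at least k records."""
    order = values.sort_values().index
    aggregated = values.copy()
    n = len(values)
    for start in range(0, n, k):
        group = order[start:start + k]
        if len(group) < k:                       # merge a short tail into the previous group
            group = order[start - k:]
        aggregated.loc[group] = values.loc[group].mean()
    return aggregated

# Example: ages released so that at least k = 3 records share each value.
ages = pd.Series([34, 35, 36, 52, 53, 70, 71, 72, 90, 91])
print(microaggregate_numeric(ages, k=3))
```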
Subject(s)
Biomedical Research, Data Anonymization, Humans, Privacy, Semantics, Software
ABSTRACT
BACKGROUND: Obese kidney allograft recipients have worse outcomes after kidney transplantation (KT). However, there is a lack of information regarding the effect of body mass index (BMI) variation after KT. The objective of this study was to evaluate the effects of body weight changes in obese kidney transplant recipients. METHODS: We used data from the Catalan Renal Registry that included KT recipients from 1990 to 2011 (n = 5607). The annual change in post-transplantation BMI was calculated. The main outcome variables were delayed graft function (DGF), estimated glomerular filtration rate (eGFR) and patient and graft survival. RESULTS: Obesity was observed in 609 patients (10.9%) at the time of transplantation. The incidence of DGF was significantly higher in obese patients (40.4% versus 28.3%; P < 0.001). Baseline obesity was significantly associated with worse short- and long-term graft survival (P < 0.05) and worse graft function during follow-up (P < 0.005). BMI variations in obese patients did not improve eGFR or graft or patient survival. CONCLUSIONS: In obese patients, decreasing body weight after KT improves neither short-term graft outcomes nor long-term renal function.
ABSTRACT
Accurate assessments of species vulnerability to climate change need to consider the physiological capacity of organisms to deal with temperature changes and to identify early signs of thermally induced stress. Oxidative stress biomarkers and acetylcholinesterase activity are useful proxies of stress at the cellular and nervous system levels. Such responses are especially relevant for poorly dispersing organisms with limited capacity for behavioural thermoregulation, such as deep subterranean species. We combined experimental measurements of upper lethal thermal limits, acclimation capacity and biomarkers of oxidative stress and neurotoxicity to assess the impact of heat stress (20°C) at different exposure times (2 and 7 days) on the Iberian endemic subterranean beetle Parvospeonomus canyellesi. The survival response at 7 days of exposure was similar to that reported for other subterranean specialist beetles (high survival up to 20°C but not above 23°C). However, low physiological plasticity (i.e. an incapacity to increase heat tolerance via acclimation) and signs of impairment at the cellular and nervous system levels were observed after 7 days of exposure at 20°C. Such sublethal effects were identified by significant differences in total antioxidant capacity, glutathione S-transferase activity, the ratio of reduced to oxidized glutathione and acetylcholinesterase activity between the control (cave temperature) and the 20°C treatment. At 2 days of exposure, most biomarker values indicated some degree of oxidative stress in both the control and the high-temperature treatment, likely reflecting an initial altered physiological status associated with factors other than temperature. Considering these integrated responses and the predicted increase in temperature at its single known locality, P. canyellesi would have a narrower thermal safety margin to face climate change than that obtained from survival experiments alone. Our results highlight the importance of exploring thermally sensitive processes at different levels of biological organization to obtain more accurate estimates of species' capacities to face climate change.
ABSTRACT
Recent studies have shown that vegetables cultivated in peri-urban areas are exposed to greater concentrations of organic microcontaminants (OMCs) and trace elements (TEs) than those grown in rural areas. In this study, the occurrence and human health risk of chemical contaminants (16 TEs and 33 OMCs) in the edible parts of lettuce, tomato, cauliflower and broad beans from two farm fields in the peri-urban area of the city of Barcelona and one rural site outside the peri-urban area were assessed. The concentrations of TEs and OMCs (on a fresh weight basis) ranged from non-detectable to 17.4 mg/kg and from non-detectable to 256 µg/kg, respectively. Tomato fruits showed the highest concentrations of TEs and OMCs. Principal component analysis indicated that the occurrence of chemical contaminants in vegetables depended on the commodity rather than the location (peri-urban vs rural). Risk assessment using hazard quotient (HQ) and threshold of toxicological concern (TTC) approaches showed that the risk from consumption of the target vegetables in the peri-urban area was low and similar to that observed for the rural site. Total HQ values for TEs were always below 1, and a minimum consumption of 150 g/day for children and 380 g/day for adults would be required to reach the TTC due to the presence of pesticides. Further studies are needed to estimate the combined effect of TEs and OMCs on human health.
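The hazard quotient screening used above divides the estimated daily intake of a contaminant by its toxicological reference dose; values below 1 indicate low concern. A minimal sketch with illustrative, placeholder numbers (not the study's measurements) is shown below.

```python
def hazard_quotient(conc_mg_per_kg: float, intake_kg_per_day: float,
                    body_weight_kg: float, reference_dose_mg_per_kg_day: float) -> float:
    """HQ = estimated daily intake / reference dose; values below 1 suggest low risk."""
    estimated_daily_intake = conc_mg_per_kg * intake_kg_per_day / body_weight_kg
    return estimated_daily_intake / reference_dose_mg_per_kg_day

# Illustrative example (placeholder numbers): an adult (70 kg) eating 200 g/day of tomato
# containing 0.05 mg/kg of a trace element with an oral reference dose of 0.003 mg/kg/day.
print(hazard_quotient(0.05, 0.2, 70, 0.003))   # ~0.048, well below 1
```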
Subject(s)
Agriculture, Food Contamination, Pesticides/analysis, Risk Assessment, Trace Elements/analysis, Vegetables/chemistry, Air Pollutants/chemistry, Child, Cities, Humans, Soil Pollutants/analysis
ABSTRACT
Peri-urban horticulture performs environmental and socio-economic functions and provides ecological services to nearby urban areas. Nevertheless, industrialization and water pollution have increased the exposure of peri-urban vegetables to contaminants such as trace elements (TEs) and organic microcontaminants (OMCs). In this study, the occurrence of chemical contaminants (16 TEs and 33 OMCs) in soil and lettuce leaves from four farm fields in the peri-urban area of the city of Barcelona was assessed. A rural site, outside the peri-urban area of influence, was selected for comparison. The concentrations of TEs and OMCs ranged from non-detectable to 803 mg/kg dw and from non-detectable to 397 µg/kg dw, respectively, in the peri-urban soil, and from 6 × 10-5 to 4.91 mg/kg fw and from non-detectable to 193 µg/kg fw, respectively, in lettuce leaves. Although the concentrations of Mo, Ni, Pb and As in the soil of the peri-urban area exceeded the environmental quality guidelines, their occurrence in lettuce complied with human food standards (except for Pb). Fungicides (carbendazim, dimethomorph and methylparaben) and chemicals released by plastic pipelines (tris(1-chloro-2-propyl)phosphate, bisphenol F and 2-mercaptobenzothiazole) used in agriculture were prevalent in both the soil and the edible parts of the lettuce. The occurrence of these chemical pollutants in the peri-urban area did not affect the chlorophyll, lipid or carbohydrate content of the lettuce leaves. Principal component analysis (PCA) showed that soil pollution, fungicide application and irrigation water quality are the most relevant factors determining the presence of contaminants in crops.
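The principal component analysis used above to relate soil pollution, fungicide application and irrigation water quality to crop contamination can be sketched with scikit-learn. The feature layout (one row per sample, one column per contaminant) is an assumption; concentrations are standardized first because they span several orders of magnitude.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def pca_loadings(samples: pd.DataFrame, n_components: int = 2) -> pd.DataFrame:
    """Standardize contaminant concentrations, run PCA and return the component loadings."""
    scaled = StandardScaler().fit_transform(samples)
    pca = PCA(n_components=n_components)
    pca.fit(scaled)
    print("explained variance ratio:", pca.explained_variance_ratio_)
    return pd.DataFrame(pca.components_.T, index=samples.columns,
                        columns=[f"PC{i + 1}" for i in range(n_components)])

# loadings = pca_loadings(soil_and_lettuce_concentrations)  # rows: contaminants, cols: PCs
```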
Subject(s)
Agriculture, Environmental Monitoring, Lactuca/chemistry, Soil Pollutants/analysis, Agricultural Irrigation, Cities, Waste Water/chemistry, Water Pollutants, Chemical/analysis
ABSTRACT
Water scarcity and water pollution have increased the pressure on water resources worldwide. This pressure is particularly important in highly populated areas, where water demand exceeds the available natural resources. In this regard, water reuse has emerged as an excellent alternative water source for peri-urban agriculture. Nevertheless, it must cope with the occurrence of chemical contaminants, ranging from trace elements (TEs) to organic microcontaminants. In this study, chemical contaminants (15 TEs and 34 contaminants of emerging concern (CECs)), bulk parameters and nutrients in irrigation waters, together with crop productivity (Lycopersicon esculentum Mill. cv. Bodar and Lactuca sativa L. cv. Batavia), were surveyed seasonally in four farm plots in the peri-urban area of the city of Barcelona. A pristine site, where rain-groundwater is used for irrigation, was selected to provide background concentrations. The average concentrations of TEs and CECs in irrigation water impacted by treated wastewater (TWW) were 3 times (35 ± 75 µg L-1) and 13 times (553 ± 1050 ng L-1) higher, respectively, than at the pristine site. Principal component analysis was used to classify the irrigation waters by chemical composition. To assess the impact of these contaminants on agriculture, a seed germination assay (Lactuca sativa L.) and a real field-scale study of crop productivity (lettuce and tomato) were used. Although irrigation waters from the peri-urban area exhibited a higher frequency of detection and higher concentrations of the assessed chemical contaminants than those of the pristine site (P1), no significant differences were found in seed phytotoxicity or crop productivity. In fact, the crops impacted by TWW showed higher productivity than those in the other farm plots studied, which was associated with higher nutrient availability for plants.