Results 1 - 20 of 66
1.
Clin Kidney J ; 17(8): sfae137, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39131078

ABSTRACT

Background: Electrolyte abnormalities are common in chronic kidney disease (CKD), but previous studies have mainly focused on serum potassium and sodium levels. Chloride is an important biomarker for the prognosis of various diseases. However, the relationship between serum chloride levels and atrial fibrillation (AF) in CKD patients is unclear. Objective: In this study, we sought to determine the association between serum chloride homeostasis and AF in CKD patients. Methods: In this retrospective cohort study, we included patients who met the diagnostic criteria for CKD in China between 2000 and 2021. Competing risk regression for AF was performed. The associations of the baseline serum chloride concentration with heart failure (HF) and stroke incidence were also calculated by competing risk regression. The association of baseline serum chloride levels with all-cause death was determined by a Cox regression model. Results: The study cohort comprised 20 550 participants. During a median follow-up of 350 days (interquartile range, 123-730 days), 211 of the 20 550 CKD patients developed AF. After multivariable adjustment, serum chloride was inversely associated with AF risk: each standard-deviation (5.02 mmol/l) increase in serum chloride was associated with a lower risk of AF [sub-hazard ratio (sHR) 0.78, 95% confidence interval (CI) 0.65-0.94, P = .008]; equivalently, each standard-deviation decrease carried a higher risk. These results were consistent across the stratified and sensitivity analyses. In the fully adjusted models, higher serum chloride concentrations were likewise associated with a lower risk of incident HF (sHR 0.85, 95% CI 0.80-0.91, P < .001), incident stroke (sHR 0.87, 95% CI 0.81-0.94, P < .001), and all-cause death [hazard ratio (HR) 0.82, 95% CI 0.73-0.91, P < .001]. Conclusion: In this CKD population, serum chloride levels were independently and inversely associated with the incidence of AF.
Lower serum chloride levels were also associated with an increased risk of incident HF, stroke, and all-cause death.
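The per-SD effect sizes above can be converted between directions and scales with simple arithmetic. A minimal sketch (function names are illustrative, not from the study):

```python
# Converting a per-SD (sub-)hazard ratio between directions of effect.
# The abstract reports sHR 0.78 per one-SD (5.02 mmol/l) increase in serum
# chloride; the equivalent ratio per one-SD decrease is its reciprocal,
# and exponent rescaling gives a per-unit (per mmol/l) ratio.

def invert_hr(hr_per_sd_increase: float) -> float:
    """Hazard ratio per one-SD decrease, given the per-SD-increase ratio."""
    return 1.0 / hr_per_sd_increase

def hr_per_unit(hr_per_sd: float, sd: float) -> float:
    """Rescale a per-SD hazard ratio to a per-unit ratio."""
    return hr_per_sd ** (1.0 / sd)

print(round(invert_hr(0.78), 2))          # 1.28: higher risk per SD decrease
print(round(hr_per_unit(0.78, 5.02), 3))  # 0.952 per mmol/l increase
```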

2.
Sci Total Environ ; 951: 175435, 2024 Aug 10.
Article in English | MEDLINE | ID: mdl-39134269

ABSTRACT

Microbial carbon utilization efficiency (CUE) is a crucial indicator of the efficiency of soil carbon sequestration and transformation, quantifying the proportions of soil carbon that microbes allocate to anabolism (growth) versus catabolism (respiration). Previous studies have shown that the degradation of Moso bamboo forests (Phyllostachys edulis) destroys the aboveground bamboo structure, reduces vegetation carbon storage, and weakens ecosystem carbon sequestration capacity. Interestingly, soil organic carbon stocks nevertheless increase gradually. However, the mechanism by which degradation-induced changes in soil and vegetation characteristics affect microbial CUE and drive soil carbon sequestration remains unclear. Here we selected four stands with the same origin but different degradation durations (intensive management, CK; 2 years of degradation, DM1; 6 years of degradation, DM2; and 10 years of degradation, DM3) based on local management records. The principle of space-for-time substitution was used to investigate changes in microbial CUE along the degradation sequence and to identify the controlling biotic and abiotic factors. Our findings showed that microbial CUE in DM1, DM2, and DM3 increased by 12.27 %, 31.01 %, and 55.95 %, respectively, compared with CK, whereas microbial biomass turnover time decreased from 23.99 ± 1.11 to 17.16 ± 1.20 days. Promoting microbial growth was the main pathway for enhancing microbial CUE. Massive inputs of vegetation carbon replenished soil carbon substrates and altered microbial communities and life-history strategies, which in turn promoted microbial growth and increased microbial CUE. These findings provide theoretical support for the interactions between carbon dynamics and microbial physiology in degraded bamboo forests, and reinforce the importance of vegetation and microbial properties and soil carbon substrates in predicting microbial CUE.
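The CUE definition in the opening sentence, and the biomass turnover time reported alongside it, can be written directly as formulas. A sketch with placeholder numbers, not the study's measurements:

```python
# Microbial carbon utilization efficiency (CUE): the share of carbon taken
# up that goes to growth (anabolism) rather than respiration (catabolism).
# Input values below are illustrative only.

def cue(growth: float, respiration: float) -> float:
    """CUE = growth / (growth + respiration), both in carbon units per time."""
    return growth / (growth + respiration)

def turnover_days(biomass_c: float, growth_per_day: float) -> float:
    """Biomass turnover time = standing microbial biomass C / growth rate."""
    return biomass_c / growth_per_day

print(cue(2.0, 6.0))             # 0.25: a quarter of uptake becomes biomass
print(turnover_days(120.0, 5.0)) # 24.0 days
```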

4.
Plants (Basel) ; 13(14)2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39065468

ABSTRACT

Agroforestry management has immense potential for enhancing forest carbon sequestration and mitigating climate change. Yet the impact and response mechanisms of compound fertilization rates on carbon sinks in agroforestry systems remain ambiguous. This study aims to elucidate the impact of different compound fertilizer rates on soil greenhouse gas (GHG) emissions and on vegetation and soil organic carbon (SOC) sinks, and to illustrate the differences in agroforestry systems' carbon sinks through a one-year positioning test across 12 plots with different compound fertilizer application rates (0 (CK), 400 (A1), 800 (A2), and 1600 (A3) kg ha⁻¹). The study demonstrated that, after fertilization, total GHG emissions of A1 decreased by 4.41%, whereas those of A2 and A3 increased by 17.13% and 72.23%, respectively. Vegetation carbon sequestration of A1, A2, and A3 increased by 18.04%, 26.75%, and 28.65%, respectively, and soil organic carbon sequestration rose by 32.57%, 42.27%, and 43.29%, respectively. Overall, compared with CK, ecosystem carbon sequestration increased by 54.41%, 51.67%, and 0.90%, respectively. Our study suggests that rational fertilization can improve the ecosystem carbon sink and effectively mitigate climate change.
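Comparisons of "total GHG emissions" across treatments are normally made after converting each gas to CO2 equivalents. A hedged sketch, assuming IPCC AR5 GWP100 factors (CH4 = 28, N2O = 265) and placeholder fluxes, not the study's data:

```python
# Aggregate soil greenhouse-gas fluxes into a CO2-equivalent total and
# express a treatment relative to the control, the usual basis for the
# percentage changes quoted above. GWP100 factors: IPCC AR5 values.

GWP100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def co2_equivalent(fluxes_kg_ha: dict) -> float:
    """Sum gas fluxes (kg ha^-1) weighted by their 100-year GWP."""
    return sum(GWP100[gas] * f for gas, f in fluxes_kg_ha.items())

def pct_change(treated: float, control: float) -> float:
    """Percentage change of a treatment total relative to the control."""
    return 100.0 * (treated - control) / control

ck = co2_equivalent({"CO2": 5000.0, "CH4": 2.0, "N2O": 1.5})
a1 = co2_equivalent({"CO2": 4800.0, "CH4": 1.9, "N2O": 1.4})
print(round(pct_change(a1, ck), 2))  # negative: A1 emitted less than CK
```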

5.
Plants (Basel) ; 13(11)2024 May 31.
Article in English | MEDLINE | ID: mdl-38891335

ABSTRACT

Moso bamboo (Phyllostachys heterocycla cv. Pubescens) is known for its high capacity to sequester atmospheric carbon (C), which gives it a unique role in the fight against global warming. However, due to rising labor costs and falling bamboo timber prices, many Moso bamboo forests are shifting to an extensive management model without fertilization, resulting in their gradual degradation. To delineate the effect of degradation on soil microbial carbon sequestration, we conducted a rigorous analysis of Moso bamboo forests subjected to different degradation durations, namely continuous management (CK), 5 years of degradation (D-5), and 10 years of degradation (D-10). Our inquiry encompassed soil strata at 0-20 cm and 20-40 cm, scrutinizing alterations in soil organic carbon (SOC), water-soluble organic carbon (WSOC), microbial biomass carbon (MBC), and microbial residues. We discerned a positive correlation between degradation and augmented levels of SOC, WSOC, and MBC across both strata. Furthermore, degradation escalated concentrations of specific soil amino sugars and microbial residues. Intriguingly, extended degradation diminished the proportional contribution of microbial residues to SOC, implying a possible decline in microbial activity over time. These findings offer detailed insight into microbial C processes within degraded bamboo ecosystems.

6.
Signal Transduct Target Ther ; 9(1): 154, 2024 Jun 06.
Article in English | MEDLINE | ID: mdl-38844816

ABSTRACT

Early insulin therapy can achieve glycemic control and restore β-cell function in newly diagnosed type 2 diabetes (T2D), but its effect on cardiovascular outcomes in these patients remains unclear. In this nationwide real-world study, we analyzed electronic health record data from 19 medical centers across China between 1 January 2000 and 26 May 2022. We included 5424 eligible patients (mean age 56 years; 2176 women, 3248 men) who had been diagnosed with T2D within the preceding six months and had no prior cardiovascular disease. Multivariable Cox regression models were used to estimate the associations of early insulin therapy (defined as first-line therapy for at least two weeks in newly diagnosed T2D patients) with the incidence of major cardiovascular events, including coronary heart disease (CHD), stroke, and hospitalization for heart failure (HF). During 17,158 person-years of observation, we documented 834 incident CHD cases, 719 stroke cases, and 230 hospitalizations for HF. Newly diagnosed T2D patients who received early insulin therapy, compared with those who did not, had a 31% lower risk of incident stroke and a 28% lower risk of hospitalization for HF. No significant difference in the risk of CHD was observed. We found similar results when repeating these analyses in a propensity-score-matched population of 4578 patients and with inverse probability of treatment weighting models. These findings suggest that early insulin therapy in newly diagnosed T2D may have cardiovascular benefits by reducing the risk of incident stroke and hospitalization for HF.


Subject(s)
Diabetes Mellitus, Type 2 , Insulin , Humans , Diabetes Mellitus, Type 2/drug therapy , Diabetes Mellitus, Type 2/epidemiology , Female , Male , Middle Aged , Insulin/therapeutic use , Incidence , Aged , China/epidemiology , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/drug therapy , Hypoglycemic Agents/therapeutic use , Adult , Stroke/epidemiology , Stroke/drug therapy
7.
Article in English | MEDLINE | ID: mdl-38652239

ABSTRACT

BACKGROUND: Whether hypoglycemic pharmacotherapy can alleviate the risk of dementia remains controversial, particularly for dipeptidyl peptidase 4 (DPP4) inhibitors versus metformin. Our objective was to investigate whether initiation of DPP4 inhibitors, as opposed to metformin, was linked to a reduced risk of dementia. METHODS: We included individuals with type 2 diabetes over 40 years of age who were new users of DPP4 inhibitors or metformin in the Chinese Renal Disease Data System (CRDS) database between 2009 and 2020. The study employed Kaplan-Meier and Cox regression for survival analysis and the Fine and Gray model for the competing risk of death. RESULTS: Following 1:1 propensity score matching, the analysis included 3626 new users of DPP4 inhibitors and an equal number of new users of metformin. After adjusting for potential confounders, use of DPP4 inhibitors was associated with a decreased risk of all-cause dementia compared with metformin (hazard ratio (HR) 0.63, 95% confidence interval (CI) 0.45-0.89). Subgroup analysis revealed that use of DPP4 inhibitors was associated with a reduced incidence of dementia in individuals who initiated drug therapy at the age of 60 years or older (HR 0.69, 95% CI 0.48-0.98), those without baseline macrovascular complications (HR 0.62, 95% CI 0.41-0.96), and those without baseline microvascular complications (HR 0.67, 95% CI 0.47-0.98). CONCLUSION: In this real-world study, DPP4 inhibitors were associated with a lower risk of dementia in individuals with type 2 diabetes than metformin, particularly in older people and those without diabetes-related comorbidities.
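The Kaplan-Meier estimator used in the survival analysis above can be sketched in a few lines. The (time, event) pairs here are toy values, with event = 1 an incident dementia case and event = 0 a censored follow-up:

```python
# Minimal Kaplan-Meier estimator: at each distinct event time t, the
# survival curve is multiplied by (1 - deaths_at_t / number_at_risk).
# Censored subjects leave the risk set without contributing an event.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = censored = 0
        while i < len(order) and times[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 0, 1]))
```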

8.
Indian J Ophthalmol ; 72(Suppl 3): S381-S392, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38454841

ABSTRACT

This study aimed to investigate the efficacy and safety of trigeminal parasympathetic pathway (TPP) stimulation in the treatment of dry eye. A comprehensive search for randomized clinical trials was performed in seven databases (MEDLINE, Embase, CENTRAL, etc.) up to 28 February 2023. After screening of suitable studies, the data were extracted and transformed as necessary. Data synthesis and analysis were performed using Review Manager 5.4, and risk of bias and quality of evidence were evaluated with the recommended tools. Fourteen studies enrolling 1714 patients, covering two methods of TPP stimulation (electrical and chemical), were included. Overall, TPP stimulation was effective in reducing subjective symptom score (standardized mean difference [SMD], -0.45; 95% confidence interval [CI], -0.63 to -0.28), corneal fluorescein staining (mean difference [MD], -0.78; 95% CI, -1.39 to -0.18), goblet cell area (MD, -32.10; 95% CI, -54.58 to -9.62) and perimeter (MD, -5.90; 95% CI, -10.27 to -1.53), and in increasing Schirmer's test score (SMD, 0.98; 95% CI, 0.65 to 1.31) and tear film break-up time (SMD, 0.57; 95% CI, 0.19 to 0.95). Compared with inactive or low-activity stimulation controls, TPP stimulation had a higher incidence of adverse events, although these were relatively mild and tolerable. Therefore, TPP stimulation, whether electrical or chemical, may be an effective treatment for dry eye. Given the high heterogeneity and low level of evidence, these conclusions require further verification.
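The pooled effect sizes above are standardized mean differences; the per-study ingredient is Cohen's d with a pooled standard deviation. A sketch with illustrative numbers, not values taken from the included trials:

```python
# Standardized mean difference (Cohen's d with pooled SD): the effect size
# behind the SMD values reported in the meta-analysis above.
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """SMD = (treatment mean - control mean) / pooled SD."""
    pooled_var = ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / math.sqrt(pooled_var)

# e.g. a Schirmer-like score: treated 12 +/- 4 (n=50) vs control 9 +/- 4 (n=50)
print(round(cohens_d(12, 4, 50, 9, 4, 50), 2))  # 0.75
```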


Subject(s)
Dry Eye Syndromes , Humans , Dry Eye Syndromes/physiopathology , Dry Eye Syndromes/therapy , Trigeminal Nerve/physiology , Parasympathetic Nervous System/physiology , Parasympathetic Nervous System/physiopathology , Electric Stimulation Therapy/methods , Tears/physiology , Tears/metabolism , Treatment Outcome
9.
Nanoscale Adv ; 6(3): 876-891, 2024 Jan 30.
Article in English | MEDLINE | ID: mdl-38298577

ABSTRACT

In this research, a molecular dynamics (MD) model was adopted to investigate the essence of the strain-rate effect on the mechanical behavior of the Fe14.6Ni (at%) elastocaloric refrigeration alloy. The study showed that the mechanical behavior of the Fe14.6Ni (at%) alloy depended on the strain rate. Moreover, an investigation of temperature demonstrated that the strain rate influenced mechanical behavior by changing the transient temperatures. Furthermore, the adiabatic temperature change (ΔTadi) was found to reach up to 51 K, a 1.57-fold improvement. Finally, it was concluded that the strain rate influenced the mechanical behavior by changing the transient total kinetic energy and the phase-content evolution processes, which constitutes the essence of the strain-rate effect. This work clarifies and enriches the theory of the effect of strain rate on the mechanical behavior of elastocaloric refrigeration alloys.

10.
Nephrol Dial Transplant ; 39(6): 967-977, 2024 May 31.
Article in English | MEDLINE | ID: mdl-38262746

ABSTRACT

BACKGROUND: Postoperative acute kidney injury (AKI) is a common condition after surgery; however, nationwide epidemiologic data on postoperative AKI in China from large, high-quality studies are limited. This study aimed to determine the incidence, risk factors and outcomes of postoperative AKI among patients undergoing surgery in China. METHODS: This was a large, multicentre, retrospective study performed in 16 tertiary medical centres in China. Adult patients (≥18 years of age) who underwent surgical procedures from 1 January 2013 to 31 December 2019 were included. Postoperative AKI was defined by the Kidney Disease: Improving Global Outcomes creatinine criteria. The associations between AKI and in-hospital outcomes were investigated using logistic regression models adjusted for potential confounders. RESULTS: Among 520 707 patients included in our study, 25 830 (5.0%) developed postoperative AKI. The incidence of postoperative AKI varied by surgery type and was highest in cardiac (34.6%), urologic (8.7%) and general (4.2%) surgeries. A total of 89.2% of postoperative AKI cases were detected in the first 2 postoperative days, yet only 584 (2.3%) patients with postoperative AKI had an AKI diagnosis recorded at discharge. Risk factors for postoperative AKI included older age, male sex, lower baseline kidney function, pre-surgery hospital stay ≤3 days or >7 days, hypertension, diabetes mellitus and use of proton pump inhibitors or diuretics. The risk of in-hospital death increased with the stage of AKI. In addition, patients with postoperative AKI had longer hospital stays (19 versus 12 days) and were more likely to require intensive care unit care (45.0% versus 13.1%) and renal replacement therapy (7.7% versus 0.4%). CONCLUSIONS: Postoperative AKI was common across surgery types in China, particularly among patients undergoing cardiac surgery.
Implementation and evaluation of an alarm system are important for the battle against postoperative AKI.
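The KDIGO creatinine criteria used to define postoperative AKI can be written as a small staging function. This is a simplified sketch: the urine-output criteria and the renal-replacement-therapy trigger for stage 3 are omitted:

```python
# KDIGO creatinine staging (simplified): stage 1 is a rise >= 0.3 mg/dL
# within 48 h or a peak 1.5-1.9x baseline; stage 2 is 2.0-2.9x baseline;
# stage 3 is >= 3.0x baseline or an absolute SCr >= 4.0 mg/dL.

def kdigo_stage(baseline: float, peak: float, rise_48h: float = 0.0) -> int:
    """Return AKI stage 0-3 from serum creatinine values in mg/dL."""
    ratio = peak / baseline
    if ratio >= 3.0 or peak >= 4.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or rise_48h >= 0.3:
        return 1
    return 0

print(kdigo_stage(1.0, 1.2, rise_48h=0.3))  # 1: absolute-rise criterion
print(kdigo_stage(1.0, 2.4))                # 2
print(kdigo_stage(1.0, 4.1))                # 3
```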


Subject(s)
Acute Kidney Injury , Postoperative Complications , Humans , Acute Kidney Injury/etiology , Acute Kidney Injury/epidemiology , Male , Female , China/epidemiology , Incidence , Retrospective Studies , Risk Factors , Middle Aged , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Aged , Adult , Hospital Mortality
11.
Kidney Dis (Basel) ; 9(6): 517-528, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38089444

ABSTRACT

Introduction: Comprehensive data on the risk of hospital-acquired (HA) acute kidney injury (AKI) among adult users of opioid analgesics are lacking. This study aimed to systematically compare the risk of HA-AKI among users of various opioid analgesics. Methods: This multicenter, retrospective real-world study analyzed 255,265 adult hospitalized patients who received at least one prescription of an opioid analgesic during the first 30 days of hospitalization. The primary outcome was the time from the first opioid analgesic prescription to HA-AKI occurrence. Twelve subtypes of opioid analgesics were analyzed, including 9 for treating moderate-to-severe pain and 3 for mild-to-moderate pain. We examined the association between exposure to each subtype of opioid analgesic and the risk of HA-AKI using Cox proportional hazards models, with the most commonly used opioid analgesic as the reference group. Results: Compared with dezocine, the most commonly used opioid analgesic for treating moderate-to-severe pain, exposure to morphine, but not to the other 7 types of opioid analgesics, was associated with a significantly increased risk of HA-AKI (adjusted hazard ratio: 1.56, 95% confidence interval: 1.40-1.78). The association was consistent in stratified analyses and in a propensity-matched cohort. There were no significant differences in the risk of HA-AKI among opioid analgesic users with mild-to-moderate pain after adjusting for confounders. Conclusion: The use of morphine was associated with an increased risk of HA-AKI in adult patients with moderate-to-severe pain. Opioid analgesics other than morphine should be chosen preferentially for adult patients at high risk of HA-AKI when treating moderate-to-severe pain.

12.
Clin Kidney J ; 16(11): 2262-2270, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37915920

ABSTRACT

Background: Acute kidney injury (AKI) has been associated with increased risks of new-onset and worsening proteinuria. However, epidemiologic data on post-AKI proteinuria were lacking. This study aimed to determine the incidence, risk factors and clinical correlates of post-AKI proteinuria among hospitalized patients. Methods: This study was conducted in a multicenter cohort including patients aged 18-100 years with hospital-acquired AKI (HA-AKI) hospitalized at 19 medical centers throughout China. The primary outcome was the incidence of post-AKI proteinuria. Secondary outcomes included AKI recovery and kidney disease progression. The results of both quantitative and qualitative urinary protein tests were used to define post-AKI proteinuria. A Cox proportional hazards model with stepwise regression was used to determine the risk factors for post-AKI proteinuria. Results: Of 6206 HA-AKI patients without proteinuria at baseline, 2102 (33.9%) had new-onset proteinuria, whereas of 5137 HA-AKI patients with baseline proteinuria, 894 (17.4%) had worsening proteinuria after AKI. Higher AKI stage and a preexisting CKD diagnosis were risk factors for both new-onset and worsening proteinuria, whereas treatment with renin-angiotensin system inhibitors was associated with an 11% lower risk of incident proteinuria. About 60% and 75% of patients with post-AKI new-onset and worsening proteinuria, respectively, recovered within 3 months. Worsening proteinuria was associated with a lower incidence of AKI recovery and a higher risk of kidney disease progression. Conclusions: Post-AKI proteinuria is common and usually transient among hospitalized patients. The risk profiles for new-onset and worsening post-AKI proteinuria differed markedly. Worsening proteinuria after AKI was associated with adverse kidney outcomes, which emphasizes the need for close monitoring of proteinuria after AKI.

14.
Nat Commun ; 14(1): 3739, 2023 06 22.
Article in English | MEDLINE | ID: mdl-37349292

ABSTRACT

Acute kidney injury (AKI) is prevalent and a leading cause of in-hospital death worldwide. Early prediction of AKI-related clinical events and timely intervention for high-risk patients could improve outcomes. We develop a deep learning model, based on a nationwide multicenter cooperative network across China comprising 7,084,339 hospitalized patients, to dynamically predict the risk of in-hospital death (primary outcome) and dialysis (secondary outcome) for patients who developed AKI during hospitalization. A total of 137,084 eligible patients with AKI constitute the analysis set. In the derivation cohort, the areas under the receiver operating characteristic curve (AUROCs) for 24-h, 48-h, 72-h, and 7-day death are 95.05%, 94.23%, 93.53%, and 93.09%, respectively. For the dialysis outcome, the AUROCs for the same time spans are 88.32%, 83.31%, 83.20%, and 77.99%, respectively. The predictive performance is consistent in both internal and external validation cohorts. The model can predict important outcomes of patients with AKI, which could be helpful for the early management of AKI.
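AUROC, the metric reported for the mortality and dialysis predictions, is the probability that a randomly chosen positive case is ranked above a randomly chosen negative one. A dependency-free sketch with toy scores:

```python
# AUROC computed directly from its probabilistic definition: the fraction
# of (positive, negative) pairs where the positive case scores higher;
# ties count as one half.

def auroc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auroc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0]))  # 1.0: perfect ranking
print(auroc([0.5, 0.5], [1, 0]))                  # 0.5: uninformative
```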


Subject(s)
Acute Kidney Injury , Renal Dialysis , Humans , Hospital Mortality , Risk Factors , Renal Dialysis/adverse effects , Acute Kidney Injury/diagnosis , Acute Kidney Injury/therapy , Acute Kidney Injury/etiology , Hospitals , Retrospective Studies
15.
Clin J Am Soc Nephrol ; 18(9): 1186-1194, 2023 09 01.
Article in English | MEDLINE | ID: mdl-37314777

ABSTRACT

BACKGROUND: The efficacy of immunosuppression in the management of immunoglobulin A (IgA) nephropathy remains highly controversial. This study was conducted to assess the effect of immunosuppression, compared with supportive care, in the real-world setting of IgA nephropathy. METHODS: A cohort of 3946 patients with IgA nephropathy, including 1973 new users of immunosuppressive agents and 1973 propensity score-matched recipients of supportive care, drawn from nationwide registry data from January 2019 to May 2022 in China, was analyzed. The primary outcome was a composite of a 40% decrease in eGFR from baseline, kidney failure, and all-cause mortality. A Cox proportional hazards model was used to estimate the effects of immunosuppression on the composite outcome and its components in the propensity score-matched cohort. RESULTS: Among 3946 individuals (mean [SD] age 36 [10] years, mean [SD] eGFR 85 [28] ml/min per 1.73 m2, and mean [SD] proteinuria 1.4 [1.7] g/24 hours), 396 primary composite outcome events were observed, of which 156 (8%) were in the immunosuppression group and 240 (12%) in the supportive care group. Compared with supportive care, immunosuppression treatment was associated with a 40% lower risk of the primary outcome events (adjusted hazard ratio, 0.60; 95% confidence interval, 0.48 to 0.75). Comparable effect sizes were observed for glucocorticoid monotherapy and mycophenolate mofetil alone. In the prespecified subgroup analysis, the treatment effects of immunosuppression were consistent across ages, sexes, levels of proteinuria, and values of eGFR at baseline. Serious adverse events were more frequent in the immunosuppression group than in the supportive care group. CONCLUSIONS: Immunosuppressive therapy, compared with supportive care, was associated with a 40% lower risk of clinically important kidney outcomes in patients with IgA nephropathy.


Subject(s)
Glomerulonephritis, IGA , Humans , Adult , Glomerulonephritis, IGA/complications , Glomerulonephritis, IGA/drug therapy , Glomerular Filtration Rate , Kidney , Immunosuppression Therapy/adverse effects , Immunosuppressive Agents/adverse effects , Proteinuria/drug therapy , Proteinuria/etiology
16.
Front Plant Sci ; 14: 1154232, 2023.
Article in English | MEDLINE | ID: mdl-37152132

ABSTRACT

Stem respiration (Rs) plays a vital role in ecosystem carbon cycling. However, the efflux measured at the stem surface (Es) is not always the in situ Rs but only part of it. A previously proposed mass balance framework (MBF) attempted to explore the multiple partitioning pathways of Rs beyond Es, including sap-flow transport and internal storage. This study proposed stem photosynthesis as an additional partitioning pathway in the MBF. Correspondingly, a double-chamber apparatus was designed and applied to newly sprouted Moso bamboo (Phyllostachys edulis) in the leafless and leaved stages. Rs of newly sprouted bamboo was twice as high in the leafless stage (7.41 ± 2.66 µmol m⁻² s⁻¹) as in the leaved stage (3.47 ± 2.43 µmol m⁻² s⁻¹). Es accounted for ~80% of Rs, while sap flow may take away ~2% of Rs in both stages. Culm photosynthesis accounted for ~9% and ~13% of Rs in the leafless and leaved stages, respectively. Carbon sequestration from culm photosynthesis accounted for approximately 2% of aboveground bamboo biomass in the leafless stage. High culm photosynthesis but low sap flow during the leafless stage, and vice versa during the leaved stage, make bamboo an outstanding choice for exploring the MBF.

17.
CMAJ ; 195(21): E729-E738, 2023 05 29.
Article in English | MEDLINE | ID: mdl-37247880

ABSTRACT

BACKGROUND: The role of statin therapy in the development of kidney disease in patients with type 2 diabetes mellitus (DM) remains uncertain. We aimed to determine the relationships between statin initiation and kidney outcomes in patients with type 2 DM. METHODS: Through a new-user design, we conducted a multicentre retrospective cohort study using the China Renal Data System database (which includes inpatient and outpatient data from 19 urban academic centres across China). We included patients with type 2 DM who were aged 40 years or older and admitted to hospital between Jan. 1, 2000, and May 26, 2021, and excluded those with pre-existing chronic kidney disease and those who were already on statins or without follow-up at an affiliated outpatient clinic within 90 days after discharge. The primary exposure was initiation of a statin. The primary outcome was the development of diabetic kidney disease (DKD), defined as a composite of the occurrence of kidney dysfunction (estimated glomerular filtration rate [eGFR] < 60 mL/min/1.73 m2 and > 25% decline from baseline) and proteinuria (a urinary albumin-to-creatinine ratio ≥ 30 mg/g and > 50% increase from baseline), sustained for at least 90 days; secondary outcomes included development of kidney function decline (a sustained > 40% decline in eGFR). We used Cox proportional hazards regression to evaluate the relationships between statin initiation and kidney outcomes, as well as to conduct subgroup analyses according to patient characteristics, presence or absence of dyslipidemia, and pattern of dyslipidemia. For statin initiators, we explored the association between different levels of lipid control and outcomes. We conducted analyses using propensity overlap weighting to balance the participant characteristics. 
RESULTS: Among 7272 statin initiators and 12 586 noninitiators in the weighted cohort, statin initiation was associated with lower risks of incident DKD (hazard ratio [HR] 0.72, 95% confidence interval [CI] 0.62-0.83) and kidney function decline (HR 0.60, 95% CI 0.44-0.81). We obtained results similar to the primary analyses for participants with differing patterns of dyslipidemia, those prescribed different statins, and after stratification according to participant characteristics. Among statin initiators, those with intensive control of low-density lipoprotein cholesterol (LDL-C) (< 1.8 mmol/L) had a lower risk of incident DKD (HR 0.51, 95% CI 0.32-0.81) than those with inadequate lipid control (LDL-C ≥ 3.4 mmol/L). INTERPRETATION: For patients with type 2 DM admitted to and followed up in academic centres, statin initiation was associated with a lower risk of kidney disease development, particularly in those with intensive control of LDL-C. These findings suggest that statin initiation may be an effective and reasonable approach for preventing kidney disease in patients with type 2 DM.
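The composite DKD outcome can be encoded as a predicate. This is a sketch of the definition quoted in METHODS; the 90-day persistence requirement is assumed to be checked upstream and is omitted here:

```python
# Incident DKD per the study's definition: kidney dysfunction
# (eGFR < 60 mL/min/1.73 m2 AND > 25% decline from baseline) or
# proteinuria (UACR >= 30 mg/g AND > 50% increase from baseline).

def incident_dkd(egfr_base, egfr_now, uacr_base, uacr_now):
    dysfunction = egfr_now < 60 and egfr_now < 0.75 * egfr_base
    proteinuria = uacr_now >= 30 and uacr_now > 1.5 * uacr_base
    return dysfunction or proteinuria

print(incident_dkd(90, 55, 10, 12))  # True: eGFR < 60 and fell > 25%
print(incident_dkd(90, 70, 10, 40))  # True: UACR >= 30 and rose > 50%
print(incident_dkd(90, 80, 10, 12))  # False: neither component met
```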


Subject(s)
Diabetes Mellitus, Type 2 , Dyslipidemias , Hydroxymethylglutaryl-CoA Reductase Inhibitors , Renal Insufficiency, Chronic , Humans , Hydroxymethylglutaryl-CoA Reductase Inhibitors/adverse effects , Diabetes Mellitus, Type 2/drug therapy , Diabetes Mellitus, Type 2/epidemiology , Cholesterol, LDL , Retrospective Studies , Renal Insufficiency, Chronic/epidemiology , Dyslipidemias/drug therapy , Dyslipidemias/epidemiology
18.
Scand J Gastroenterol ; 58(10): 1173-1179, 2023.
Article in English | MEDLINE | ID: mdl-37128690

ABSTRACT

BACKGROUND AND STUDY AIMS: The optimal treatment for gastric varices (GVs) remains a matter of debate. This study compared the clinical outcomes of clip-assisted endoscopic cyanoacrylate injection (clip-ECI) with conventional endoscopic cyanoacrylate injection (con-ECI) for the treatment of GVs with a gastrorenal shunt. PATIENTS AND METHODS: Data were collected retrospectively in five medical centers from 2015 to 2020. The patients were treated with con-ECI (n = 126) or clip-ECI (n = 148). Clinical characteristics and procedural outcomes were compared. Patients were followed until death, liver transplantation or 6 months after treatment. The primary outcome was rebleeding, and the secondary outcome was survival. RESULTS: There were no significant differences in age, sex, etiology, shunt diameter or Child-Pugh classification between the two groups. Fewer GV obliteration sessions were required in the clip-ECI group than in the con-ECI group (p = 0.015). The cumulative 6-month rebleeding-free rates were 88.6% in the clip-ECI group and 73.7% in the con-ECI group (p = 0.002). The cumulative 6-month survival rates were 97.1% in the clip-ECI group and 94.8% in the con-ECI group (p = 0.378). CONCLUSIONS: Compared with con-ECI, clip-ECI appears more effective for the treatment of GVs with a gastrorenal shunt, requiring fewer sessions and achieving a higher 6-month rebleeding-free rate.


Subject(s)
Cyanoacrylates , Esophageal and Gastric Varices , Humans , Cyanoacrylates/adverse effects , Esophageal and Gastric Varices/complications , Retrospective Studies , Treatment Outcome , Gastrointestinal Hemorrhage/etiology , Gastrointestinal Hemorrhage/therapy , Neoplasm Recurrence, Local , Surgical Instruments/adverse effects , Recurrence
19.
Sci Total Environ ; 877: 162915, 2023 Jun 15.
Article in English | MEDLINE | ID: mdl-36933713

ABSTRACT

Moso bamboo (Phyllostachys heterocycla cv. Pubescens) is well known for its high capacity to sequester atmospheric carbon, which has a unique role to play in combating global warming. Many Moso bamboo forests are gradually degrading due to rising labor costs and falling prices for bamboo timber. However, the mechanisms by which the carbon sequestration of Moso bamboo forest ecosystems responds to degradation are unclear. In this study, a space-for-time substitution approach was used to select Moso bamboo forest plots with the same origin and similar stand types but different years of degradation, comprising four degradation sequences: continuous management (CK), 2 years of degradation (D-I), 6 years of degradation (D-II) and 10 years of degradation (D-III). A total of 16 survey sample plots were established based on local management history files. After 12 months of monitoring, the response characteristics of soil greenhouse gas (GHG) emissions, vegetation, and soil organic carbon sequestration in the different degradation sequences were evaluated to reveal the differences in ecosystem carbon sequestration. The results indicated that under D-I, D-II, and D-III, the global warming potential (GWP) of soil GHG emissions decreased by 10.84 %, 17.75 %, and 31.02 %, while soil organic carbon (SOC) sequestration increased by 2.82 %, 18.11 %, and 4.68 %, and vegetation carbon sequestration decreased by 17.30 %, 33.49 %, and 44.76 %, respectively. In conclusion, compared with CK, ecosystem carbon sequestration was reduced by 13.79 %, 22.42 %, and 30.31 %, respectively. This suggests that degradation reduces soil GHG emissions but weakens the ecosystem's carbon sequestration capability. Therefore, against the background of global warming and the strategic goal of carbon neutrality, restorative management of degraded Moso bamboo forests is critically needed to improve the carbon sequestration potential of the ecosystem.


Subject(s)
Ecosystem , Greenhouse Gases , Carbon Sequestration , Greenhouse Gases/metabolism , Carbon/analysis , Soil , Poaceae/metabolism , China
20.
J Am Soc Nephrol ; 34(7): 1253-1263, 2023 07 01.
Article in English | MEDLINE | ID: mdl-36977125

ABSTRACT

SIGNIFICANCE STATEMENT: Serum creatinine is not a sensitive biomarker for neonatal AKI because it is confounded by maternal creatinine level, gestational age, and neonatal muscle mass. In this multicenter cohort study of 52,333 hospitalized Chinese neonates, the authors proposed serum cystatin C-based criteria (CyNA) for neonatal AKI. They found that cystatin C (Cys-C) is a robust and sensitive biomarker for identifying AKI in neonates at elevated risk of in-hospital mortality, and that CyNA detects 6.5 times as many such cases as the modified Kidney Disease Improving Global Outcomes (KDIGO) creatinine criteria. They also show that AKI can be detected using a single Cys-C test. These findings suggest that CyNA is a powerful and easily applicable tool for detecting AKI in neonates. BACKGROUND: Serum creatinine is not a sensitive biomarker for AKI in neonates. A better biomarker-based criterion for neonatal AKI is needed. METHODS: In this large multicenter cohort study, we estimated the upper normal limit (UNL) and reference change value (RCV) of serum cystatin C (Cys-C) in neonates and proposed cystatin C-based criteria (CyNA) for detecting neonatal AKI using these values as the cutoffs. We assessed the association of CyNA-detected AKI with the risk of in-hospital death and compared the performance of CyNA versus the modified Kidney Disease Improving Global Outcomes (KDIGO) creatinine criteria. RESULTS: In this study of 52,333 hospitalized neonates in China, Cys-C level did not vary with gestational age or birth weight and remained relatively stable during the neonatal period. The CyNA criteria define AKI as a serum Cys-C of ≥2.2 mg/L (UNL) or an increase in Cys-C of ≥25% (RCV) during the neonatal period. Among 45,839 neonates with measurements of both Cys-C and creatinine, 4513 (9.8%) had AKI detected by CyNA only, 373 (0.8%) by KDIGO only, and 381 (0.8%) by both criteria. Compared with neonates without AKI by either criterion, neonates with AKI detected by CyNA alone had an increased risk of in-hospital mortality (hazard ratio [HR], 2.86; 95% confidence interval [95% CI], 2.02 to 4.04). Neonates with AKI detected by both criteria had an even higher risk of in-hospital mortality (HR, 4.86; 95% CI, 2.84 to 8.29). CONCLUSIONS: Serum Cys-C is a robust and sensitive biomarker for detecting neonatal AKI. Compared with the modified KDIGO creatinine criteria, CyNA is 6.5 times more sensitive in identifying neonates at elevated risk of in-hospital mortality.
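The CyNA rule as stated in the abstract reduces to two simple checks over a neonate's serial Cys-C measurements. A minimal sketch, assuming the abstract's cutoffs (UNL 2.2 mg/L, RCV 25%) and with the function and variable names chosen for illustration:

```python
# Sketch of the CyNA criteria from the abstract: flag AKI when any serum
# Cys-C value reaches 2.2 mg/L (upper normal limit, UNL), or when a later
# value exceeds an earlier one by >= 25% (reference change value, RCV).
UNL_MG_PER_L = 2.2
RCV_FRACTION = 0.25

def cyna_aki(cysc_values: list) -> bool:
    """Return True if serial Cys-C values (mg/L) meet either CyNA criterion."""
    # UNL criterion: a single measurement suffices.
    if any(v >= UNL_MG_PER_L for v in cysc_values):
        return True
    # RCV criterion: any later value >= 25% above an earlier one.
    for i, baseline in enumerate(cysc_values):
        for later in cysc_values[i + 1:]:
            if later >= baseline * (1 + RCV_FRACTION):
                return True
    return False

print(cyna_aki([1.4, 1.5, 2.3]))  # True: 2.3 meets the UNL cutoff
print(cyna_aki([1.2, 1.6]))       # True: a 33% rise meets the RCV cutoff
print(cyna_aki([1.5, 1.4]))       # False: neither criterion met
```

Note how the UNL branch captures the abstract's point that AKI can be detected from a single Cys-C test, while the RCV branch requires at least two measurements.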


Subject(s)
Acute Kidney Injury , Cystatin C , Infant, Newborn , Humans , Cohort Studies , Creatinine , Prospective Studies , Hospital Mortality , Biomarkers