Results 1 - 7 of 7
1.
IEEE J Biomed Health Inform ; 26(3): 1239-1250, 2022 03.
Article in English | MEDLINE | ID: mdl-34347615

ABSTRACT

Knee osteoarthritis (OA) is a chronic disease that considerably reduces patients' quality of life. Preventive therapies require early detection and lifelong monitoring of OA progression. In the clinical environment, the severity of OA is classified by the Kellgren and Lawrence (KL) grading system, ranging from KL-0 to KL-4. Recently, deep learning methods have been applied to OA severity assessment to improve accuracy and efficiency. However, this task remains challenging due to the ambiguity between adjacent grades, especially in early-stage OA. Low-confidence samples, which are less representative than typical ones, undermine the training process. Targeting the uncertainty in the OA dataset, we propose a novel learning scheme that dynamically separates the data into two sets according to their reliability. In addition, we design a hybrid loss function to help the CNN learn from the two sets accordingly. With the proposed approach, we emphasize the typical samples and control the impact of low-confidence cases. Experiments are conducted with five-fold cross-validation on the five-class task and the early-stage OA task. Our method achieves a mean accuracy of 70.13% on the five-class OA assessment task, outperforming all other state-of-the-art methods. Although early-stage OA detection still benefits from manual lesion-region selection, our approach achieves superior performance on the KL-0 vs. KL-2 task. Moreover, we design an experiment to validate large-scale automatic data refining during training. The result verifies the model's ability to characterize low-confidence samples. The dataset used in this paper was obtained from the Osteoarthritis Initiative.
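The idea of splitting training data by reliability and down-weighting low-confidence cases in a hybrid loss can be illustrated with a minimal NumPy sketch. The confidence threshold, the weighting scheme, and all names below are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def hybrid_loss(logits, labels, threshold=0.7, low_conf_weight=0.3):
    """Cross-entropy that down-weights low-confidence samples.

    Samples whose predicted probability for the true class falls below
    `threshold` are treated as unreliable and contribute with a reduced
    weight, so atypical cases cannot dominate training. Both hyper-
    parameters here are placeholder values.
    """
    probs = softmax(logits)
    p_true = probs[np.arange(len(labels)), labels]
    ce = -np.log(np.clip(p_true, 1e-12, None))
    weights = np.where(p_true >= threshold, 1.0, low_conf_weight)
    return float(np.mean(weights * ce)), weights
```

In a real training loop the two sets would be re-partitioned dynamically each epoch as the network's confidence estimates evolve.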


Subject(s)
Osteoarthritis, Knee , Early Diagnosis , Humans , Osteoarthritis, Knee/diagnosis , Quality of Life , Reproducibility of Results , Severity of Illness Index
2.
Ren Fail ; 43(1): 1577-1587, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34861810

ABSTRACT

OBJECTIVE: To investigate whether high-phosphorus diets alter the gut microbiota in healthy rats and chronic kidney disease (CKD) rats. METHODS: In this 4-week randomized controlled trial, healthy rats and CKD rats were fed either a regular-phosphorus (Pi: 0.8%) or a high-phosphorus (Pi: 1.2%) diet. The rats were divided into four groups: sham-operated rats with regular-phosphorus diet intervention (CTL group), sham-operated rats with high-phosphorus diet intervention (CTLP group), CKD model rats with regular-phosphorus diet intervention (CKD group), and CKD model rats with high-phosphorus diet intervention (CKDP group). The V3-V4 region of the 16S rRNA gene was sequenced to study the effect of a high-phosphorus diet on the gut microbiota. RESULTS: The high-phosphorus intervention increased systolic blood pressure (SBP) and parathyroid hormone (PTH) in CTL and CKD rats but did not change serum creatinine and 25(OH)D levels. After the high-phosphorus diet, serum phosphate and fibroblast growth factor 23 (FGF23) increased in the CKDP group compared with the CKD group. The gut microbiota was significantly altered after intervention with a high-phosphorus diet in CTL and CKD group rats. A high-phosphorus diet reduced the Shannon index values of the gut microbiota in all rats. The Chao1 and ACE indices were decreased in the CTL group after high-phosphorus diet intervention. Some microbial genera, such as Blautia and Allobaculum, were elevated significantly after high-phosphorus dietary intervention. The main bacteria linked to SBP and FGF23 also correlated directly with creatinine. After high-phosphorus diet intervention, the genus Prevotella was positively related to SBP in the CTLP and CKDP groups. CONCLUSIONS: High-phosphorus diets were associated with adverse changes in the gut microbiota and elevated SBP, which may have adverse consequences for long-term health outcomes.
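The Shannon index used here to summarize gut-microbiota diversity is computed from taxon relative abundances; a minimal sketch of the standard formula (natural-log base assumed):

```python
import numpy as np

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over observed taxa.

    `counts` is a vector of per-taxon read counts; zero-count taxa are
    dropped, since 0 * ln 0 is defined as 0.
    """
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())
```

A perfectly even community of k taxa gives H' = ln k, and diversity falls as a few genera (such as Blautia or Allobaculum here) come to dominate.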


Subject(s)
Blood Pressure/drug effects , Diet , Gastrointestinal Microbiome/drug effects , Kidney Failure, Chronic , Phosphorus/administration & dosage , Animals , Biomarkers/blood , High-Throughput Nucleotide Sequencing , Male , Parathyroid Hormone/blood , RNA, Ribosomal, 16S/analysis , Rats , Rats, Sprague-Dawley
3.
BMC Nephrol ; 22(1): 398, 2021 12 01.
Article in English | MEDLINE | ID: mdl-34852774

ABSTRACT

BACKGROUND: Estimation of the phosphate load in hemodialysis patients is always controversial in clinical practice. The aim of this study was to evaluate the individual achievement rate of serum phosphate as a measure of phosphate load by investigating its impact on cardiovascular mortality in hemodialysis patients. METHODS: This was a single-center, retrospective cohort study. A total of 251 maintenance hemodialysis patients were enrolled. The individual achievement rate of serum phosphate was defined as the number of tests within the target range divided by the total number of tests over a period of time. A Cox regression model was used to examine the relationship between the individual achievement rate of serum phosphate and cardiovascular mortality. RESULTS: The mean age of the study population was 61 ± 13 years. A total of 44 (17.5%) patients died from cardiovascular disease (CVD) during a median follow-up of 65 months. Multivariable Cox analysis showed that a one-year serum phosphate achievement rate of 0% (HR = 4.117, P = 0.016) or 25% (HR = 3.343, P = 0.023) increased the risk of cardiovascular mortality, while achievement rates of 50% (HR = 2.129, P = 0.162) and 75% (HR = 1.080, P = 0.902) did not, compared to a rate of 100%. The urea reduction ratio (URR) was positively associated, while serum intact parathyroid hormone (iPTH), alkaline phosphatase (ALP), normalized protein catabolic rate (nPCR), and total phosphate-binding capacity of drugs were negatively associated, with achievement of the serum phosphate target. CONCLUSIONS: Keeping the one-year achievement rate of serum phosphate higher than 50% provides significant clinical benefits in reducing cardiovascular mortality.
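The achievement-rate definition (in-target tests divided by total tests over a period) translates directly into code. The 1.13-1.78 mmol/L target range below is an assumed illustrative value, as the abstract does not state which range was used:

```python
def achievement_rate(phosphate_tests, low=1.13, high=1.78):
    """Fraction of serum phosphate tests (mmol/L) within the target range.

    `phosphate_tests` is the sequence of results over the period of
    interest (e.g. one year). The default range is an assumption for
    illustration, not taken from the paper.
    """
    in_target = sum(low <= p <= high for p in phosphate_tests)
    return in_target / len(phosphate_tests)
```

Per the study's Cox analysis, a patient whose rate over one year stays above 0.5 falls in the groups with no significant excess cardiovascular mortality.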


Subject(s)
Cardiovascular Diseases/blood , Cardiovascular Diseases/mortality , Phosphates/blood , Renal Dialysis , Aged , Cohort Studies , Female , Humans , Male , Middle Aged , Retrospective Studies , Time Factors
4.
Ren Fail ; 43(1): 1076-1086, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34193019

ABSTRACT

BACKGROUND: The purpose of this study was to explore the contribution of each factor of the phosphorus metabolism network following phosphorus diet intervention via Granger causality analysis. METHODS: In this study, a total of six healthy male volunteers were enrolled. All participants sequentially received regular, low-, and high-phosphorus diets. Consumption of each diet lasted for five days, with a 5-day washout period between different diets. Blood and urinary samples were collected on the fifth day of consumption of each diet at 9 time points (00:00, 04:00, 08:00, 10:00, 12:00, 14:00, 16:00, 20:00, 24:00) for measurements of serum levels of phosphate, calcium, PTH, FGF23, BALP, α-Klotho, and 1,25 D and urinary phosphorus excretion. Granger causality and the centrality of the above variables in the phosphorus network were analyzed by pairwise panel Granger causality analysis using the time-series data. RESULTS: The mean age of the participants was 28.5 ± 2.1 years. By using Granger causality analysis, we found that the α-Klotho level had the strongest connection with and played a key role in influencing the other variables. In addition, urinary phosphorus excretion was frequently regulated by other variables in the network of phosphorus metabolism following a regular phosphorus diet. After low-phosphorus diet intervention, serum phosphate affected the other factors the most, and the 1,25 D level was the main outcome factor, while urinary phosphorus excretion was the most strongly associated variable in the network of phosphorus metabolism. After high-phosphorus diet intervention, FGF23 and 1,25 D played a more critical role in active regulation and passive regulation in the Granger causality analysis. CONCLUSIONS: Variations in dietary phosphorus intake led to changes in the central factors involved in phosphorus metabolism.
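The pairwise Granger tests behind this analysis ask whether past values of one variable improve prediction of another beyond its own history. A minimal bivariate lag-1 sketch using ordinary least squares (the study used pairwise panel Granger analysis over its nine time points; this single-series F-statistic is a simplified illustration):

```python
import numpy as np

def granger_f_stat(x, y, lag=1):
    """F-statistic testing whether past x helps predict y (lag-1 case).

    Restricted model:   y_t ~ 1 + y_{t-1}
    Unrestricted model: y_t ~ 1 + y_{t-1} + x_{t-1}
    A large F means past x adds predictive power, i.e. x Granger-causes y.
    """
    y_t, y_lag, x_lag = y[lag:], y[:-lag], x[:-lag]
    n = len(y_t)
    # Restricted regression: intercept + lagged y only
    Xr = np.column_stack([np.ones(n), y_lag])
    rss_r = np.sum((y_t - Xr @ np.linalg.lstsq(Xr, y_t, rcond=None)[0]) ** 2)
    # Unrestricted regression: add lagged x
    Xu = np.column_stack([Xr, x_lag])
    rss_u = np.sum((y_t - Xu @ np.linalg.lstsq(Xu, y_t, rcond=None)[0]) ** 2)
    q = 1  # one restriction (the x_{t-1} coefficient)
    return ((rss_r - rss_u) / q) / (rss_u / (n - Xu.shape[1]))
```

In the study, running such tests over every ordered pair of markers (phosphate, calcium, PTH, FGF23, BALP, α-Klotho, 1,25 D, urinary excretion) yields the directed network from which central "driver" and "outcome" factors are read off.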


Subject(s)
Phosphorus, Dietary/administration & dosage , Phosphorus/metabolism , Adult , Calcium/blood , Fibroblast Growth Factors/blood , Healthy Volunteers , Humans , Klotho Proteins/blood , Male , Phosphorus/blood , Phosphorus/urine
5.
Kidney Blood Press Res ; 46(1): 53-62, 2021.
Article in English | MEDLINE | ID: mdl-33477164

ABSTRACT

BACKGROUND: Our research group has previously reported a noninvasive model that estimates phosphate removal within a 4-h hemodialysis (HD) treatment. The aim of this study was to modify the original model and validate the accuracy of the new model of phosphate removal for HD and hemodiafiltration (HDF) treatment. METHODS: A total of 109 HD patients from 3 HD centers were enrolled. The actual phosphate removal amount was calculated using the area under the dialysate phosphate concentration-time curve. Model modification was executed using second-order multivariable polynomial regression analysis to obtain a new parameter for dialyzer phosphate clearance. Bias, precision, and accuracy were measured in internal and external validation to determine the performance of the modified model. RESULTS: The mean age of the enrolled patients was 63 ± 12 years, and 67 (61.5%) were male. Phosphate removal was 19.06 ± 8.12 mmol and 17.38 ± 6.75 mmol in 4-h HD and HDF treatments, respectively, with no significant difference. The modified phosphate removal model was expressed as Tpo4 = 80.3 × C45 − 0.024 × age + 0.07 × weight + β × clearance − 8.14 (β = 6.231 × 10⁻³ × clearance − 1.886 × 10⁻⁵ × clearance² − 0.467), where C45 was the phosphate concentration in the spent dialysate measured at the 45th minute of HD and clearance was the phosphate clearance of the dialyzer. Internal validation indicated that the new model was superior to the original model, with a significantly smaller bias and higher accuracy. External validation showed that R², bias, and accuracy were not significantly different from those of internal validation. CONCLUSIONS: A new model was generated to quantify phosphate removal by 4-h HD and HDF with a dialyzer surface area of 1.3-1.8 m². This modified model would contribute to the evaluation of phosphate balance and individualized therapy of hyperphosphatemia.


Subject(s)
Hemodiafiltration/methods , Hyperphosphatemia/therapy , Phosphates/isolation & purification , Renal Dialysis/methods , Aged , Cross-Sectional Studies , Female , Humans , Linear Models , Male , Middle Aged
6.
BMC Geriatr ; 19(1): 214, 2019 08 07.
Article in English | MEDLINE | ID: mdl-31390985

ABSTRACT

BACKGROUND: Hearing loss is one of the most common modifiable factors associated with cognitive and functional decline in geriatric populations. An accurate, easy-to-apply, and inexpensive hearing screening method is needed to detect hearing loss in community-dwelling elderly people, intervene early and reduce the negative consequences and burden of untreated hearing loss on individuals, families and society. However, available hearing screening tools do not adequately meet the need for large-scale geriatric hearing detection due to several barriers, including time, personnel training and equipment costs. This study aimed to propose an efficient method that could potentially satisfy this need. METHODS: In total, 1793 participants (≥60 years) were recruited to undertake a standard audiometric air conduction pure tone test at 4 frequencies (0.5-4 kHz). Audiometric data from one community were used to train the decision tree model and generate a pure tone screening rule to classify people with or without moderate or more serious hearing impairment. Audiometric data from another community were used to validate the tree model. RESULTS: In the decision tree analysis, 2 kHz and 0.5 kHz were found to be the most important frequencies for hearing severity classification. The tree model suggested a simple two-step screening procedure in which a 42 dB HL tone at 2 kHz is presented first, followed by a 47 dB HL tone at 0.5 kHz, depending on the individual's response to the first tone. This approach achieved an accuracy of 91.20% (91.92%), a sensitivity of 95.35% (93.50%) and a specificity of 86.85% (90.56%) in the training dataset (testing dataset). CONCLUSIONS: A simple two-step screening procedure using the two tones (2 kHz and 0.5 kHz) selected by the decision tree analysis can be applied to screen moderate-to-profound hearing loss in a community-based geriatric population in Shanghai. 
The decision tree analysis is useful in determining the optimal hearing screening criteria for local elderly populations. Embedding the pair of tones in a well-calibrated sound generator may create a simple, practical and time-efficient screening tool with high accuracy that is readily available at healthcare centers of all levels, thereby facilitating the initiation of extensive nationwide hearing screening in older adults.
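The two-step rule can be sketched as a tiny decision function. The abstract gives the two tones (42 dB HL at 2 kHz, then 47 dB HL at 0.5 kHz) but not which response triggers step 2 or the final labels, so the branch direction below is an assumption for illustration:

```python
def screen_hearing(hears_2khz_42db, hears_500hz_47db=None):
    """Two-step screening sketch from the decision-tree rule.

    Assumed branching (not specified in the abstract): hearing the
    42 dB HL / 2 kHz tone passes immediately; otherwise the
    47 dB HL / 0.5 kHz tone decides (heard -> pass, not heard ->
    refer for suspected moderate-or-worse hearing loss).
    """
    if hears_2khz_42db:
        return "pass"
    if hears_500hz_47db is None:
        raise ValueError("step 2 required: present 47 dB HL tone at 0.5 kHz")
    return "pass" if hears_500hz_47db else "refer"
```

Whatever the true branch order, the appeal of the tree is that most subjects are classified after a single tone, which is what makes large-scale community screening time-efficient.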


Subject(s)
Decision Trees , Geriatric Assessment/methods , Hearing Loss/diagnosis , Independent Living , Mass Screening/methods , Population Surveillance/methods , Aged , Aged, 80 and over , Audiometry, Pure-Tone/methods , Audiometry, Pure-Tone/trends , China/epidemiology , Female , Hearing Loss/epidemiology , Humans , Independent Living/trends , Male , Mass Screening/trends , Middle Aged
7.
IEEE J Transl Eng Health Med ; 7: 4200109, 2019.
Article in English | MEDLINE | ID: mdl-32309061

ABSTRACT

Objective: Dry Weight (DW) is a typical hemodialysis (HD) prescription parameter for End-Stage Renal Disease (ESRD) patients. However, accurate DW assessment is difficult due to the complexity of body composition and individual variation. Our objective is to model a clinically practicable DW estimator. Method: We proposed a time series-based regression method to evaluate the weight fluctuation of HD patients according to the Electronic Health Record (EHR). A total of 34 patients with data from 5100 HD sessions were selected and partitioned into three groups: HD-stabilized, HD-intolerant, and near-death. The most recent 150 HD sessions from each group were used to evaluate the proposed model. Results: Within a 0.5 kg absolute error margin, our model achieved 95.44%, 91.95%, and 83.12% post-dialysis weight prediction accuracies for the HD-stabilized, HD-intolerant, and near-death groups, respectively. Within a 1% relative error margin, the proposed method achieved 97.99%, 95.36%, and 66.38% accuracies. For HD-stabilized patients, the Mean Absolute Error (MAE) of the proposed method was 0.17 kg ± 0.04 kg. In the model comparison experiment, the performance test showed that the proposed model was superior to the state-of-the-art models. Conclusion: The outcome of this research indicates that the proposed model could potentially automate clinical weight management for HD patients. Clinical Impact: This work can aid physicians in monitoring and estimating DW. It can also serve as a health risk indicator for HD patients.
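The evaluation metric used here — prediction accuracy within an absolute or relative error margin — is simple to reproduce. A sketch of how the 0.5 kg and 1% figures would be computed from predicted and actual post-dialysis weights (function and argument names are illustrative):

```python
def accuracy_within_margin(predicted, actual, abs_margin=None, rel_margin=None):
    """Share of predictions within a tolerance of the actual value.

    Pass abs_margin in kg (e.g. 0.5) OR rel_margin as a fraction of the
    actual weight (e.g. 0.01 for the 1% margin in the abstract).
    """
    hits = 0
    for p, a in zip(predicted, actual):
        if abs_margin is not None:
            ok = abs(p - a) <= abs_margin
        else:
            ok = abs(p - a) <= rel_margin * a
        hits += ok
    return hits / len(actual)
```

Reporting both margins is informative because a fixed 0.5 kg band is stricter for lighter patients, while the 1% band scales with body weight.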
