Results 1 - 20 of 315
1.
BMC Public Health; 24(1): 2101, 2024 Aug 03.
Article in English | MEDLINE | ID: mdl-39097727

ABSTRACT

With childhood hypertension emerging as a global public health concern, understanding its associated factors is crucial. This study investigated the prevalence and associated factors of hypertension among Chinese children. This cross-sectional investigation was conducted in Pinghu, Zhejiang province, involving 2,373 children aged 8-14 years from 12 schools. Anthropometric measurements were taken by trained staff. Blood pressure (BP) was measured on three separate occasions, with an interval of at least two weeks. Childhood hypertension was defined as systolic blood pressure (SBP) and/or diastolic blood pressure (DBP) ≥ the age-, sex-, and height-specific 95th percentile across all three visits. A self-administered questionnaire was used to collect demographic, socioeconomic, health-behavioral, and parental information at the first BP measurement visit. A random forest (RF) and a multivariable logistic regression model were used jointly to identify associated factors. Additionally, population attributable fractions (PAFs) were calculated. The prevalence of childhood hypertension was 5.0% (95% confidence interval [CI]: 4.1-5.9%). Children with body mass index (BMI) ≥ the 85th percentile were classified as abnormal weight, and those with waist circumference (WC) > the 90th percentile as centrally obese. Normal weight with central obesity (NWCO, adjusted odds ratio [aOR] = 5.04, 95% CI: 1.96-12.98), abnormal weight with no central obesity (AWNCO, aOR = 4.60, 95% CI: 2.57-8.21), and abnormal weight with central obesity (AWCO, aOR = 9.94, 95% CI: 6.06-16.32) were associated with an increased risk of childhood hypertension. Childhood hypertension was attributable mostly to AWCO (PAF: 0.64, 95% CI: 0.50-0.75), followed by AWNCO (PAF: 0.34, 95% CI: 0.19-0.51) and NWCO (PAF: 0.13, 95% CI: 0.03-0.30).
Our results indicate that obesity phenotype is associated with childhood hypertension and that weight management could serve as a potential target for intervention.
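As a sketch of how reported PAFs can be computed, the snippet below applies Miettinen's case-based formula; the abstract does not state which PAF formula the authors used, and the 70% exposure proportion among cases is a hypothetical figure chosen only to show that an aOR of 9.94 is consistent with a PAF near the reported 0.64.

```python
def paf_miettinen(p_cases_exposed, odds_ratio):
    """Miettinen's population attributable fraction: the share of cases
    avoidable if the exposure were removed, using the adjusted OR as an
    approximation of the relative risk."""
    return p_cases_exposed * (odds_ratio - 1.0) / odds_ratio

# Hypothetical: assume 70% of hypertensive children fell in the AWCO group.
print(round(paf_miettinen(0.70, 9.94), 2))  # → 0.63, close to the reported 0.64
```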


Subjects
Hypertension, Humans, Cross-Sectional Studies, Male, Female, Hypertension/epidemiology, China/epidemiology, Child, Prevalence, Adolescent, Risk Factors, Logistic Models, Random Forest Algorithm
2.
Article in English | MEDLINE | ID: mdl-39045798

ABSTRACT

When evaluating the effect of psychological treatments on a dichotomous outcome variable in a randomized controlled trial (RCT), covariate adjustment using logistic regression models is often applied. In the presence of covariates, average marginal effects (AMEs) are often preferred over odds ratios, as AMEs yield a clearer substantive and causal interpretation. However, standard error computation for AMEs neglects sampling-based uncertainty (i.e., covariate values are assumed to be fixed over repeated sampling), which leads to underestimation of AME standard errors in other generalized linear models (e.g., Poisson regression). In this paper, we present and compare approaches allowing for stochastic (i.e., randomly sampled) covariates in models for binary outcomes. In a simulation study, we investigated the quality of the AME and stochastic-covariate approaches, focusing on statistical inference in finite samples. Our results indicate that the fixed-covariate approach provides reliable results only if there is no heterogeneity in interindividual treatment effects (i.e., no treatment-covariate interactions), while the stochastic-covariate approaches are preferable in all other simulated conditions. We provide an illustrative example from clinical psychology: an RCT investigating the effect of cognitive bias modification training on post-traumatic stress disorder while accounting for patients' anxiety.

3.
Eur J Obstet Gynecol Reprod Biol; 300: 142-149, 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-39002400

ABSTRACT

OBJECTIVE: To predict fetal growth restriction (FGR) and small for gestational age (SGA) infants using various ultrasound cardiac parameters in a logistic regression model. METHODS: In this retrospective study, we obtained standardized ultrasound images of 357 fetuses between the 20th and 39th weeks of gestation: 99 of these fetuses were between the 3rd and 10th growth percentiles, 61 were smaller than the 3rd percentile, and 197 were appropriate for gestational age, above the 10th percentile (control group). Several cardiac parameters were studied. The cardiothoracic ratio and the sphericity of the ventricles were calculated. A binary logistic regression model was developed to predict growth restriction using the cardiac and biometric parameters. RESULTS: There were noticeable differences between the control and study groups in the sphericity of the right ventricle (p < 0.001), left and right longitudinal ventricle length (p < 0.001 for both), left ventricle transverse length (p < 0.001), heart diameter (p = 0.002), heart circumference (p < 0.001), heart area (p < 0.001), and thoracic diameter limited by the ribs (p = 0.002). There was no difference in the cardiothoracic ratio between groups. The logistic regression model achieved a prediction rate of 79.4% with a sensitivity of 74.5% and a specificity of 83.2%. CONCLUSION: The heart of growth-restricted infants is characterized by a more globular right ventricle, shorter ventricle length, and smaller thorax diameter. These parameters could improve prediction of FGR and SGA.
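The prediction rate, sensitivity, and specificity quoted above come from comparing a fitted logistic model's classifications against the true labels. A minimal sketch on simulated data (the predictor names and effect sizes are hypothetical, not the study's parameters):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(6)
n = 357
# Hypothetical standardized cardiac parameters (illustrative only)
rv_sphericity = rng.normal(0, 1, n)
thorax_diameter = rng.normal(0, 1, n)
logit = -0.6 + 1.2 * rv_sphericity - 0.9 * thorax_diameter
fgr = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([rv_sphericity, thorax_diameter])
pred = LogisticRegression().fit(X, fgr).predict(X)
tn, fp, fn, tp = confusion_matrix(fgr, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
prediction_rate = (tp + tn) / n   # overall accuracy
print(prediction_rate, sensitivity, specificity)
```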

4.
Public Health; 234: 126-131, 2024 Jul 08.
Article in English | MEDLINE | ID: mdl-38981376

ABSTRACT

OBJECTIVES: The quality of care for patients may be partly determined by the time they are admitted to the hospital. This study was conducted to explore the effect of admission time and to describe the pattern and magnitude of weekly variation in the quality of patient care. STUDY DESIGN: A retrospective observational study. METHODS: Data were collected from the Medical Care Quality Management and Control System for Specific (Single) Diseases in China. A total of 238,122 patients treated for acute ischemic stroke between January 2015 and December 2017 were included. The primary outcomes were completion of the ten process indicators and in-hospital death. RESULTS: The quality of in-hospital care varied according to hospital arrival time. We identified several patterns of variation across the days of the week. In the first pattern, the quality of four indicators, such as assessment by a stroke physician within 15 min, was lowest for arrivals between 08:00 and 11:59, increased throughout the day, and peaked for arrivals between 20:00 and 23:59 or 00:00 and 03:59. In the second pattern, the quality of four indicators, such as the application of antiplatelet therapy within 48 h, did not differ significantly across days of the week. There was no difference in in-hospital mortality between the different admission times. CONCLUSIONS: The effect of admission time on the quality of in-hospital care of patients with acute ischemic stroke showed several diurnal patterns. Detecting the times when quality is relatively low may lead to quality improvements in health care. Quality improvement should also focus on reducing diurnal temporal variation.

5.
Ultrasound Med Biol; 50(9): 1299-1307, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38969525

ABSTRACT

OBJECTIVE: To develop and validate a predictive model for sarcopenia. METHODS: A total of 240 subjects who visited our hospital between August 2021 and May 2023 were randomly divided by time of entry into a training set containing 2/3 of the patients and a validation set containing the remaining 1/3. The muscle thickness (MT), echo intensity (EI), and shear wave velocity (SWV) of the medial gastrocnemius muscle were measured. Indicators that were significant in the univariate analysis of the training set were entered into a binary logistic regression to derive a regression model, and the model was evaluated using a concordance index (C-index), calibration plot, and clinical validity curve. Diagnostic efficacy and clinical applicability were compared between the model and the single-factor indicators. RESULTS: Four meaningful variables, age, body mass index (BMI), MT, and SWV, were selected into the predictive model: Logit Y = 21.292 + 0.065 × Age - 0.411 × BMI - 0.524 × MT - 3.072 × SWV. The model discriminated well, with an internal-validation C-index of 0.924 and an external-validation C-index of 0.914. The calibration plot of predicted against actual probabilities showed excellent agreement. The specificity, sensitivity, and Youden's index of the model were 73.80%, 97.40%, and 71.20%, respectively, at the diagnostic cut-off value of > 0.279 for sarcopenia. The logistic model had higher diagnostic efficacy (p < 0.001) and higher net clinical benefit (p < 0.001) over the same threshold range compared with the individual indicators. CONCLUSION: The logistic model of sarcopenia showed good discrimination, calibration, and clinical validity, and has higher diagnostic value than the individual indicators.
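The fitted equation can be applied directly. The sketch below implements the published coefficients and the > 0.279 cut-off; the input units (e.g., MT in mm, SWV in m/s) are assumptions not stated in the abstract, so the example only exercises the model's qualitative behavior.

```python
import math

def sarcopenia_probability(age, bmi, mt, swv):
    """Probability of sarcopenia from the published logistic model.
    Units are assumed (age in years, BMI in kg/m2, MT and SWV in the
    units used by the original study)."""
    logit_y = 21.292 + 0.065 * age - 0.411 * bmi - 0.524 * mt - 3.072 * swv
    return 1.0 / (1.0 + math.exp(-logit_y))

def is_sarcopenia(prob, cutoff=0.279):
    """Apply the diagnostic cut-off reported in the abstract."""
    return prob > cutoff

# Per the coefficients, stiffer, thicker muscle (higher SWV and MT)
# lowers the predicted probability.
print(sarcopenia_probability(70, 22, 15, 1.5))
print(sarcopenia_probability(70, 22, 15, 3.0))
```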


Subjects
Elasticity Imaging Techniques, Muscle, Skeletal, Sarcopenia, Humans, Sarcopenia/diagnostic imaging, Male, Female, Muscle, Skeletal/diagnostic imaging, Middle Aged, Aged, Elasticity Imaging Techniques/methods, Ultrasonography/methods, Reproducibility of Results, Predictive Value of Tests, Adult
6.
Article in English | MEDLINE | ID: mdl-39037154

ABSTRACT

Few studies have included objective blood pressure (BP) measurements when constructing predictive models of severe obstructive sleep apnea (OSA). This study used a binary logistic regression model (BLRM) and the decision tree method (DTM) to construct predictive models for identifying severe OSA and to compare the prediction capability of the two methods. In total, 499 adult patients with severe OSA and 1421 non-severe OSA controls examined at the Sleep Medicine Center of a tertiary hospital in southern Taiwan between October 2016 and April 2019 were enrolled. OSA was diagnosed through polysomnography. Data on BP, demographic characteristics, anthropometric measurements, comorbidity histories, and sleep questionnaires were collected. The BLRM and DTM were separately applied to identify predictors of severe OSA. The performance of the risk scores was assessed by the area under the receiver operating characteristic curve (AUC). In the BLRM, body mass index (BMI) ≥ 27 kg/m2 and Snore Outcomes Survey score ≤ 55 were significant predictors of severe OSA (AUC 0.623). In the DTM, mean SpO2 < 96%, average systolic BP ≥ 135 mmHg, and BMI ≥ 39 kg/m2 effectively differentiated cases of severe OSA (AUC 0.718). The AUC of the predictive model produced by the DTM was higher in older adults than in younger adults (0.807 vs. 0.723), mainly due to differences in clinical predictive features. In conclusion, the DTM, using a different set of predictors, seems more effective in identifying severe OSA than the BLRM. The differences in the predictors ascertained demonstrate the need to construct separate predictive models for younger and older adults.
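The comparison described above can be sketched as follows on simulated OSA-like data; the cut-points loosely echo the abstract's, but all effect sizes and the data are invented for illustration. A depth-limited tree can recover threshold effects directly, which is one plausible reason a tree may outperform a plain logistic model when the true signal is step-like.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
spo2 = rng.normal(96, 2, n)    # mean oxygen saturation (%)
sbp = rng.normal(130, 15, n)   # average systolic BP (mmHg)
bmi = rng.normal(28, 5, n)     # body mass index (kg/m2)
# Hypothetical threshold effects driving severe OSA status
logit = -2 + 1.0 * (spo2 < 96) + 0.8 * (sbp >= 135) + 1.2 * (bmi >= 30)
severe_osa = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([spo2, sbp, bmi])
Xtr, Xte, ytr, yte = train_test_split(X, severe_osa, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Xtr, ytr)
lr = LogisticRegression().fit(Xtr, ytr)
auc_tree = roc_auc_score(yte, tree.predict_proba(Xte)[:, 1])
auc_lr = roc_auc_score(yte, lr.predict_proba(Xte)[:, 1])
print(auc_tree, auc_lr)
```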

7.
BMC Med; 22(1): 308, 2024 Jul 29.
Article in English | MEDLINE | ID: mdl-39075527

ABSTRACT

BACKGROUND: A prediction model can be a useful tool to quantify a patient's risk of developing dementia in the coming years and to enable risk-factor-targeted intervention. Numerous dementia prediction models have been developed, but few have been externally validated, likely limiting their clinical uptake. In our previous work, we had limited success in externally validating some of these existing models due to inadequate reporting. We therefore developed and externally validated novel models to predict dementia in the general population across a network of observational databases. We assessed regularization methods to obtain parsimonious models that are of lower complexity and easier to implement. METHODS: Logistic regression models were developed across a network of five observational databases with electronic health records (EHRs) and claims data to predict 5-year dementia risk in persons aged 55-84. The regularization methods L1 and Broken Adaptive Ridge (BAR), as well as three candidate predictor sets, were assessed to optimize prediction performance. The predictor sets included a baseline set using only age and sex, a full set including all available candidate predictors, and a phenotype set comprising a limited number of clinically relevant predictors. RESULTS: BAR can be used for variable selection, outperforming L1 when a parsimonious model is desired. Adding candidate predictors for disease diagnosis and drug exposure generally improved the performance of baseline models using only age and sex. While a model trained on German EHR data saw an increase in AUROC from 0.74 to 0.83 with additional predictors, a model trained on US EHR data showed only minimal improvement, from 0.79 to 0.81 AUROC. Nevertheless, the latter model, developed using BAR regularization on the clinically relevant predictor set, was ultimately chosen as the best-performing model, as it demonstrated more consistent external validation performance and improved calibration.
CONCLUSIONS: We developed and externally validated patient-level models to predict dementia. Our results show that although dementia prediction is strongly driven by age, adding predictors based on condition diagnoses and drug exposures further improves prediction performance. BAR regularization outperforms L1 regularization in yielding the most parsimonious yet still well-performing prediction model for dementia.
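BAR is not available in mainstream Python libraries, but the parsimony idea the paper compares it against can be illustrated with L1-regularized logistic regression, where a strong penalty sets most coefficients exactly to zero. The data and effect sizes below are simulated for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, p = 2000, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [1.0, -0.8, 0.6]            # only three truly predictive features
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta))))

# A strong L1 penalty (small C) drives most coefficients exactly to
# zero, yielding a parsimonious model.
l1 = LogisticRegression(penalty="l1", solver="liblinear", C=0.02).fit(X, y)
selected = np.flatnonzero(l1.coef_[0])
print(selected)
```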


Subjects
Databases, Factual, Dementia, Humans, Dementia/diagnosis, Dementia/epidemiology, Aged, Female, Male, Aged, 80 and over, Middle Aged, Electronic Health Records, Risk Assessment/methods, Risk Factors
8.
Article in English | MEDLINE | ID: mdl-39080196

ABSTRACT

OBJECTIVES: This study aimed to evaluate the association between atopic dermatitis in pregnant women and preterm birth, accounting for maternal ritodrine hydrochloride administration status. METHODS: Data from 83,796 women with singleton pregnancies at and after 22 weeks of gestation (enrolled between 2011 and 2014) were analyzed. These data were obtained from the Japan Environment and Children's Study. Atopic dermatitis was defined based on self-reported questionnaire responses obtained during the first trimester. The primary outcome measures were preterm births before 37, 32, and 28 weeks of gestation. Using a multivariable logistic regression model, odds ratios for preterm birth in pregnant women with atopic dermatitis were calculated, with women without atopic dermatitis serving as the reference group. The analysis accounted for confounding factors and maternal ritodrine hydrochloride administration. RESULTS: Among pregnant women with atopic dermatitis, the adjusted odds ratios (95% confidence intervals) for preterm births before 37, 32, and 28 weeks of gestation were 0.89 (0.81-0.98), 0.98 (0.74-1.30), and 0.88 (0.50-1.55), respectively. This trend remained consistent after excluding participants who received ritodrine hydrochloride. CONCLUSIONS FOR PRACTICE: Atopic dermatitis in pregnant women was significantly associated with a decreased incidence of preterm birth before 37 weeks of gestation, even after accounting for the effects of maternal ritodrine hydrochloride administration.

9.
J Biopharm Stat; 1-22, 2024 Jul 19.
Article in English | MEDLINE | ID: mdl-39028254

ABSTRACT

Dose selection and optimization in the early phase of oncology drug development serve as the foundation for success in late-phase drug development. The bivariate Bayesian logistic regression model (BLRM) is a widely utilized model-based algorithm that has been shown to improve accuracy in identifying the recommended phase 2 dose (RP2D) based on dose-limiting toxicity (DLT) over traditional methods such as 3 + 3. However, it remains a challenge to optimize dose selection so as to strike a proper balance between safety and efficacy in the escalation and expansion phases of phase I trials. In this paper, we first use a phase I clinical trial to demonstrate how variability in drug exposure related to pharmacokinetic (PK) parameters among trial participants may add to the difficulty of identifying the optimal dose. We use simulation to show that concurrently or retrospectively fitting a BLRM to dose/toxicity data from the escalation phase, with dose-independent PK parameters as covariates, leads to improved accuracy in identifying the dose level at which the DLT rate falls within a prespecified toxicity interval. Furthermore, we propose both model- and rule-based methods to modify the dose at the patient level in expansion cohorts based on patients' PK/exposure parameters. Simulation studies show that this approach leads to a higher likelihood that a dose level with manageable toxicity and a desirable efficacy margin will be advanced to the late-phase pipeline after being screened in the expansion phase of a phase I trial.
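A minimal numerical sketch of a two-parameter BLRM follows, with invented escalation data, weakly informative normal priors, and a grid approximation to the posterior; production implementations typically use MCMC and formal escalation-with-overdose-control rules, and the target/overdose intervals here are common conventions rather than values from this paper.

```python
import numpy as np

d_ref = 8.0
# Hypothetical escalation data: (dose, patients treated, DLTs observed)
data = [(1.0, 3, 0), (2.0, 3, 0), (4.0, 3, 1), (8.0, 6, 2)]

# Two-parameter BLRM: logit P(DLT at dose d) = log(alpha) + beta*log(d/d_ref).
# Posterior approximated on a grid over (log alpha, log beta).
la = np.linspace(-4, 2, 121)   # grid for log(alpha)
lb = np.linspace(-2, 2, 81)    # grid for log(beta)
LA, LB = np.meshgrid(la, lb, indexing="ij")
# Normal priors: log(alpha) ~ N(-1, 2^2), log(beta) ~ N(0, 1)
logpost = -0.5 * ((LA + 1.0) / 2.0) ** 2 - 0.5 * (LB / 1.0) ** 2
for d, n, x in data:
    p = 1 / (1 + np.exp(-(LA + np.exp(LB) * np.log(d / d_ref))))
    logpost += x * np.log(p) + (n - x) * np.log(1 - p)
post = np.exp(logpost - logpost.max())
post /= post.sum()

# Posterior probability that each dose's DLT rate lies in the target
# interval [0.16, 0.33) or the overdose region [0.33, 1].
for d in [1.0, 2.0, 4.0, 8.0, 16.0]:
    p = 1 / (1 + np.exp(-(LA + np.exp(LB) * np.log(d / d_ref))))
    target = (((p >= 0.16) & (p < 0.33)) * post).sum()
    overdose = ((p >= 0.33) * post).sum()
    print(d, round(target, 3), round(overdose, 3))
```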

10.
Article in English | MEDLINE | ID: mdl-39073887

ABSTRACT

BACKGROUND: Vulnerable populations across the United States are frequently exposed to extreme heat, which is becoming more intense due to a combination of climate change and urban-induced warming. Extreme heat can be particularly detrimental to the health and well-being of older citizens when it is combined with ozone. Although population-based studies have demonstrated associations between ozone, extreme heat, and human health, few have focused on the role of social and behavioral factors that increase indoor risk and exposure among older adults. METHODS: We conducted a household survey that aimed to understand how older adults are affected by extreme heat and ozone pollution inside and outside of their homes across Houston, Phoenix, and Los Angeles. We examined contributing factors to the risk of self-reported health effects using a generalized linear mixed-effects regression model of telephone survey data from 909 older adults in 2017. RESULTS: We found an increased occurrence of self-reported symptoms for extreme heat with preexisting respiratory health conditions and a lack of air conditioning access; self-reported ozone symptoms were more likely with preexisting respiratory health conditions. The risk of heat-related symptoms was slightly higher in Los Angeles than in Houston and Phoenix. We found several demographic, housing, and behavioral characteristics that influenced the risk of heat- and ozone-related symptoms. CONCLUSIONS: The increased risk among older adults based on specific social and behavioral factors identified in this study can inform public health policy and help cities tailor their heat and ozone response plans to the specific needs of this vulnerable population.


Subjects
Extreme Heat, Ozone, Humans, Ozone/analysis, Aged, Male, Female, Extreme Heat/adverse effects, Risk Factors, Environmental Exposure/adverse effects, Cities, Sociodemographic Factors, Self Report, Aged, 80 and over, Climate Change, Los Angeles/epidemiology, United States/epidemiology, Air Pollution, Indoor/adverse effects
11.
J Allergy Clin Immunol Glob; 3(3): 100278, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38873244

ABSTRACT

Background: Chronic histaminergic angioedema (CHA) may be classified as a separate acquired angioedema (AE) or as an endotype of chronic spontaneous urticaria (CSU). A recent study suggested that they are independent pathologies. Objective: We carried out an exhaustive comparison of CHA and AE-CSU to explore the possible differentiation between them on the basis of a series of predictors. Methods: An observational, retrospective, cross-sectional, and exploratory study was designed. Fifty-six CHA and 40 AE-CSU patients were included. Data were extracted from the year before and the year after the time of diagnosis. A predictive model was generated by logistic regression, and its discriminatory power was assessed using the area under the receiver operating characteristic curve. Results: The average frequency of AE attacks per year was higher in the AE-CSU group than in the CHA group, both before (median [interquartile range] 12 [43] vs 8 [16]) and after (24.3 [51.2] vs 2 [4.25]) diagnosis. The uvula was more frequently affected in CHA. No other differences were found. However, using 7 clinical characteristics of the patients, a multiple logistic regression model was able to predict, with a specificity of 86.4%, a sensitivity of 92.3%, and an area under the curve of 95.1% (P = .024), that CHA and AE-CSU behave differently. Conclusion: CHA has characteristics similar to those of AE-CSU, although the two slightly differ in the frequency of attacks and their location. Despite these similarities, a multiple logistic regression model using clinical and evolutionary characteristics allowed the differentiation of the two pathologies and supports the idea that these 2 entities are independent.

12.
Stat Methods Med Res; 9622802241259174, 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38865137

ABSTRACT

Estimation of the 100p percent lethal dose (LD100p) is of great interest to pharmacologists for assessing the toxicity of certain compounds. However, most existing literature focuses on interval estimation of the LD100p, and little attention has been paid to its point estimation. Currently, the most commonly used method for estimating the LD100p is the maximum likelihood estimator (MLE), which can be represented as a ratio estimator whose denominator is the slope estimated from the logistic regression model. However, the MLE can be seriously biased when the sample size is small, as is common in such studies, or when the dose-response curve is relatively flat (i.e., the slope approaches zero). In this study, we address these issues by developing a novel penalised maximum likelihood estimator (PMLE) that prevents the denominator of the ratio from being close to zero. Like the MLE, the PMLE is computationally simple and can therefore be conveniently used in practice. Moreover, with a suitable penalty parameter, we show that the PMLE can (a) reduce the bias to the second order with respect to the sample size and (b) avoid extreme estimates. Through simulation studies and real data applications, we show that the PMLE generally outperforms existing methods in terms of bias and root mean square error.

13.
Front Immunol; 15: 1400046, 2024.
Article in English | MEDLINE | ID: mdl-38887295

ABSTRACT

Background: Kawasaki disease shock syndrome (KDSS) is a critical manifestation of Kawasaki disease (KD). In recent years, logistic regression prediction models have been widely used to predict the occurrence probability of various diseases. This study aimed to investigate the clinical characteristics of children with KD and to develop and validate an individualized logistic regression model for predicting KDSS among children with KD. Methods: The clinical data of children diagnosed with KDSS and hospitalized between January 2021 and December 2023 were retrospectively analyzed. The best predictors were selected by logistic regression and lasso regression analyses. A logistic regression model was built on the training set (n = 162) to predict the occurrence of KDSS. A receiver operating characteristic curve was used to evaluate the performance of the model. We built a nomogram from the model and assessed calibration using 1000 bootstrap resamples. The model was validated using an independent validation set (n = 68). Results: In the univariate analysis, 24 variables differed significantly between the KDSS and KD groups; further logistic and lasso regression analyses found that five variables were independently related to KDSS: rash, brain natriuretic peptide, serum Na, serum P, and aspartate aminotransferase. A logistic regression model was established on the training set (area under the receiver operating characteristic curve, 0.979; sensitivity = 96.2%; specificity = 97.2%). The calibration curve showed good consistency between the predicted values of the model and the actual observed values in both the training and validation sets. Conclusion: We established a feasible and highly accurate logistic regression model to predict the occurrence of KDSS, enabling its early identification.


Subjects
Mucocutaneous Lymph Node Syndrome, Humans, Mucocutaneous Lymph Node Syndrome/diagnosis, Mucocutaneous Lymph Node Syndrome/blood, Male, Female, Child, Preschool, Infant, Retrospective Studies, Logistic Models, Child, Shock/etiology, Shock/diagnosis, ROC Curve, Nomograms, Prognosis, Biomarkers/blood
14.
Microorganisms; 12(6), 2024 May 25.
Article in English | MEDLINE | ID: mdl-38930455

ABSTRACT

Extensive research has been conducted to identify key proteins governing stress responses, virulence, and antimicrobial resistance, as well as to elucidate their interactions within Listeria monocytogenes. While these proteins hold promise as potential targets for novel strategies to control L. monocytogenes, given their critical roles in regulating the pathogen's metabolism, additional analysis is needed to assess their druggability, that is, the chance of their being effectively bound by small-molecule inhibitors. In this work, 535 binding pockets of 46 protein targets for known drugs (mainly antimicrobials) were first analyzed to extract 13 structural features (e.g., hydrophobicity) in a ligand-protein docking platform called Molsoft ICM Pro. The extracted features were used as inputs to develop a logistic regression model that assesses the druggability of protein binding pockets, with an output value of one if ligands can bind to the pocket. The developed druggability model was then used to evaluate 23 key proteins from L. monocytogenes that have been identified in the literature. The following proteins are predicted to be high-potential druggable targets: GroEL, the FliH/FliI complex, FliG, FlhB, FlgL, FlgK, InlA, MogR, and PrfA. These findings serve as a starting point for future research to identify specific compounds that inhibit the druggable target proteins and to design experiments confirming their effectiveness as drug targets.

15.
Front Cardiovasc Med; 11: 1358066, 2024.
Article in English | MEDLINE | ID: mdl-38720918

ABSTRACT

Background: The prevalence of Type 2 Diabetes Mellitus (T2D) and its significant role in increasing Coronary Heart Disease (CHD) risk highlight the urgent need for effective CHD screening in this population. Despite current advancements in T2D management, the complexity of cardiovascular complications persists. Our study aims to develop a comprehensive CHD screening model for T2D patients, employing multimodal data to improve early detection and management and addressing a critical gap in clinical practice. Methods: We analyzed data from 699 patients, including 471 with CHD (221 of whom also had T2D) and a control group of 228 without CHD. Employing strict diagnostic criteria, we conducted significance testing and multivariate analysis to identify key indicators for T2D-CHD diagnosis. This led to the creation of a neural network model using 21 indicators and a logistic regression model based on an 8-indicator subset. External validation was performed with an independent dataset from an additional 212 patients to confirm the models' generalizability. Results: The neural network model achieved an accuracy of 90.7%, recall of 90.78%, precision of 90.83%, and an F1 score of 0.908. The logistic regression model demonstrated an accuracy of 90.13%, recall of 90.1%, precision of 90.22%, and an F1 score of 0.9016. External validation reinforced the models' reliability and effectiveness in broader clinical settings. Conclusion: Our AI-driven diagnostic models significantly enhance early CHD detection and management in T2D patients, offering a novel, efficient approach to addressing the complex interplay between these conditions. By leveraging advanced analytics and comprehensive patient data, we present a scalable solution for improving clinical outcomes in this high-risk population, potentially setting a new standard in personalized care and preventive medicine.

16.
Int J Med Inform; 187: 105468, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38703744

ABSTRACT

PURPOSE: Our research aims to compare the predictive performance of decision tree algorithms (DT) and logistic regression analysis (LR) in constructing models and to develop a Post-Thrombotic Syndrome (PTS) risk stratification tool. METHODS: We retrospectively collected and analyzed the case information of 618 patients diagnosed with DVT from January 2012 to December 2021 in three tertiary hospitals in Jiangxi Province as the modeling group. Additionally, we used the case information of 212 patients diagnosed with DVT from January 2022 to January 2023 in two tertiary hospitals in Hubei Province and Guangdong Province as the validation group. We extracted electronic medical record information including general patient data, medical history, laboratory test indicators, and treatment data for analysis. We established DT and LR models and compared their predictive performance using receiver operating characteristic (ROC) curves and confusion matrices. Internal and external validations were conducted. Additionally, we used LR to generate nomogram charts, calibration curves, and decision curve analysis (DCA) to assess its predictive accuracy. RESULTS: Both the DT and LR models indicate that Year, Residence, Cancer, Varicose Vein Operation History, DM, and Chronic VTE are risk factors for PTS occurrence. In internal validation, DT outperforms LR (0.962 vs 0.925, z = 3.379, P < 0.001). However, in external validation, there is no significant difference in the area under the ROC curve between the two models (0.963 vs 0.949, z = 0.412, P = 0.680). The validation results of the calibration curves and DCA demonstrate that LR exhibits good predictive accuracy and clinical effectiveness. A web-based nomogram calculator (https://sunxiaoxuan.shinyapps.io/dynnomapp/) was used to visualize the logistic regression model.
CONCLUSIONS: The combination of decision tree and logistic regression models, along with the web-based calculator software of nomogram, can assist healthcare professionals in accurately assessing the risk of PTS occurrence in individual patients with lower limb DVT.


Subjects
Postthrombotic Syndrome, Venous Thrombosis, Humans, Venous Thrombosis/diagnosis, Postthrombotic Syndrome/diagnosis, Postthrombotic Syndrome/etiology, Female, Male, Middle Aged, Risk Assessment/methods, Retrospective Studies, Lower Extremity/blood supply, Risk Factors, Logistic Models, Adult, Decision Trees, Aged, ROC Curve, Algorithms, Nomograms
17.
BMC Nutr; 10(1): 67, 2024 May 02.
Article in English | MEDLINE | ID: mdl-38698456

ABSTRACT

BACKGROUND: Child marriage remains an important problem around the world, with young mothers and their under-five children often experiencing under-nutrition. The problem is rarely studied in the Bangladeshi population. This paper was designed to identify the association between child marriage and the nutritional status of mothers and their under-five children in Bangladesh. METHODS: Nationally representative secondary data were used for this study, extracted from the Bangladesh Demographic and Health Survey (BDHS) 2017-18. The sample consisted of 7235 mothers aged 18-49 years and their under-five children. The mothers were classified into two groups according to their age at first marriage: (i) child marriage (marriage at < 18 years) and (ii) no child marriage (marriage at ≥ 18 years). The nutritional status of mothers was measured by body mass index (BMI), and under-five children's nutritional status was measured by (i) height-for-age z-score (stunting), (ii) weight-for-age z-score (underweight), and (iii) weight-for-height z-score (wasting). The chi-square test and a two-level logistic regression model were used for data analysis in SPSS software (IBM, version 20). RESULTS: The prevalence of child marriage among Bangladeshi women was 69.0%, with the mean and median age at first marriage being 16.57 ± 2.83 years and 16 years, respectively. Of the mothers, 15.2% suffered from chronic energy deficiency (underweight), and 72.8% were married at < 18 years. The prevalence of stunting, underweight, and wasting among under-five children in Bangladesh was 31.0%, 22.0%, and 8.5%, respectively. Compared to women married at ≥ 18 years, women who married at < 18 years had a significantly higher likelihood of chronic energy deficiency [adjusted OR = 1.27, CI: 1.05-1.82; p < 0.05].
Under-five children of mothers married before the age of 18 were more likely to be stunted [adjusted OR = 1.201, CI: 1.11-1.72; p < 0.05], wasted [adjusted OR = 1.519, CI: 1.15-2.00; p < 0.01], and underweight [adjusted OR = 1.150, CI: 1.09-1.82; p < 0.05] than children of mothers who married at ≥ 18 years. CONCLUSION: The rate of child marriage among Bangladeshi women is high, and it is significantly associated with malnutrition among mothers and their under-five children. The Bangladesh government can use the findings of this study to prevent and reduce child marriage and malnutrition among mothers and their under-five children, in pursuit of the Sustainable Development Goals by 2030.
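Adjusted odds ratios like those reported above are obtained by exponentiating logistic-regression coefficients, with Wald confidence limits from the coefficient's standard error. A minimal sketch of that transformation (the standard error value here is hypothetical, chosen only to illustrate the calculation; it is not from the paper):

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    or_ = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return or_, lo, hi

# Illustrative only: beta = ln(1.27) recovers the reported adjusted OR for
# maternal chronic energy deficiency; se = 0.14 is a made-up standard error.
or_, lo, hi = odds_ratio_ci(math.log(1.27), se=0.14)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```

The same exp(beta ± 1.96·SE) recipe underlies every adjusted OR and CI quoted in these abstracts.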

18.
BMC Health Serv Res ; 24(1): 664, 2024 May 26.
Article in English | MEDLINE | ID: mdl-38797840

ABSTRACT

INTRODUCTION: Reproductive health services (RHS) help people to have a satisfying and safe sex life throughout their lives. They enable women in particular to go safely through pregnancy and childbirth and give couples the best chance of having a healthy infant. Therefore, this study aimed to identify the significant determinants of RHS utilization among undergraduate regular-class students at Assosa University using an advanced methodology. METHODS: We used a cross-sectional study design to collect RHS data from 362 students at Assosa University from 5 to 16 May 2021. The students were selected using a stratified random sampling technique. We used cross-tabulation to summarize the extent of RHS utilization across all predictors in terms of percentages, and three variants of the multilevel binary logistic regression model to model the determinants of RHS utilization. RESULTS: Of the undergraduate regular-class students at Assosa University, 42.27% utilized at least one type of RHS during their time at the university, whereas 57.73% did not. Among the three variants of the multilevel binary logistic regression model, the random-slopes two-level model was selected as the best-fitting model for the data. At the 5% level of significance, awareness of RHS, gender, preference for service fees, and students' average monthly income were significant predictors in this model. In addition, the covariates age, gender, and preference for service fees had significant random effects on RHS utilization across colleges/schools. CONCLUSION: Students who preferred service fees at the usual rate, were aware of RHS, were female, and had a high average monthly income were more likely to utilize RHS. RHS utilization among undergraduate regular students at Assosa University is likely to increase more effectively with interventions that address these factors.
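In a random-slopes two-level logistic model of the kind selected here, both the intercept and a covariate's slope vary by cluster (college/school): logit(p_ij) = (β0 + u0_j) + (β1 + u1_j)·x_ij. A minimal sketch of the predicted probability under that structure, with entirely made-up fixed and random effects for two hypothetical colleges:

```python
import numpy as np

def two_level_logit_prob(x, beta0, beta1, u0, u1):
    """Predicted probability under a random-slopes two-level logistic model:
    logit(p_ij) = (beta0 + u0_j) + (beta1 + u1_j) * x_ij, where j indexes
    the college/school and u0_j, u1_j are its random intercept and slope
    deviations from the fixed effects."""
    eta = (beta0 + u0) + (beta1 + u1) * x
    return 1.0 / (1.0 + np.exp(-eta))

# Illustrative (made-up) values: fixed effects plus deviations for 2 colleges.
beta0, beta1 = -0.8, 0.6           # fixed intercept and slope (e.g. RHS awareness)
u0 = np.array([0.3, -0.3])         # college-level intercept deviations
u1 = np.array([0.2, -0.2])         # college-level slope deviations
p_aware = two_level_logit_prob(1.0, beta0, beta1, u0, u1)
print(p_aware)  # probability of RHS use for an aware student in each college
```

Because the slope itself carries a per-college deviation, the effect of awareness on utilization differs across colleges, which is exactly what the significant random effects reported above indicate.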


Subjects
Reproductive Health Services, Students, Humans, Female, Cross-Sectional Studies, Male, Universities, Reproductive Health Services/statistics & numerical data, Logistic Models, Students/statistics & numerical data, Students/psychology, Young Adult, Adult, Adolescent
19.
Int J Food Microbiol ; 419: 110738, 2024 Jul 16.
Article in English | MEDLINE | ID: mdl-38772219

ABSTRACT

This study investigates the possibility of utilizing drip as a non-destructive method for assessing the freshness and spoilage of chicken meat. The quality parameters [pH, volatile base nitrogen (VBN), and total aerobic bacterial counts (TAB)] of chicken meat were evaluated over a 13-day storage period in vacuum packaging at 4 °C. Simultaneously, the metabolites in the chicken meat and its drip were measured by nuclear magnetic resonance. Correlation (Pearson's and Spearman's rank) and pathway analyses were conducted to select the metabolites for model training. Binary logistic regression models (models 1 and 2) and multiple linear regression models (models 3-1 and 3-2) were trained using the selected metabolites, and their performance was evaluated using receiver operating characteristic (ROC) curves. The chicken meat was spoiled after 7 days of storage, exceeding 20 mg/100 g VBN and 5.7 log CFU/g TAB. The correlation analysis identified one organic acid, eight free amino acids, and five nucleic acids as highly correlated between chicken meat and its drip during storage. Pathway analysis revealed tyrosine and purine metabolism as the metabolic pathways most highly correlated with spoilage. Based on these findings, specific metabolites were selected for model training: ATP, glutamine, hypoxanthine, IMP, tyrosine, and tyramine. To predict the freshness and spoilage of chicken meat, model 1, trained using tyramine, ATP, tyrosine, and IMP from chicken meat, achieved 99.9% accuracy and an ROC value of 0.884 when validated using drip metabolites. Model 1 was improved by training with tyramine and IMP from both chicken meat and its drip (model 2), which increased the ROC value for drip metabolites from 0.884 to 0.997. Finally, the two selected metabolites (tyramine and IMP) can predict TAB and VBN quantitatively through models 3-1 and 3-2, respectively.
Therefore, the model developed using metabolic changes in drip demonstrated the capability to non-destructively predict the freshness and spoilage of chicken meat at 4 °C. To make generic predictions, it is necessary to expand the model's applicability to various conditions, such as different temperatures, and validate its performance across multiple chicken batches.
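The ROC values used to compare models 1 and 2 are areas under the ROC curve, which can be computed directly as a Mann-Whitney rank statistic: the probability that a randomly chosen spoiled sample receives a higher model score than a randomly chosen fresh one. A minimal sketch with made-up scores (the paper's actual model outputs are not reproduced here):

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs in which the positive (spoiled) sample
    outscores the negative (fresh) one, counting ties as 1/2."""
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Illustrative (made-up) classifier scores for spoiled vs. fresh drip samples.
spoiled = [0.9, 0.8, 0.95, 0.7]
fresh = [0.2, 0.4, 0.3, 0.75]
print(roc_auc(spoiled, fresh))
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, so the jump from 0.884 to 0.997 reported for model 2 reflects near-perfect ranking of spoiled over fresh samples.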


Subjects
Chickens, Food Packaging, Meat, Animals, Meat/microbiology, Meat/analysis, Food Packaging/methods, Food Microbiology, Food Storage, Colony Count, Microbial, Vacuum, Food Contamination/analysis
20.
Cureus ; 16(4): e59036, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38800155

ABSTRACT

BACKGROUND: Uncontrolled hypertension is a major public health concern that contributes significantly to cardiovascular morbidity and mortality. Treatment of hypertension prevents and reduces cardiovascular morbidity, notably reducing the risk of stroke by 40% and the risk of myocardial infarction by 15%. Understanding the prevalence and predictors of uncontrolled hypertension is crucial for developing targeted interventions. OBJECTIVE: This study aimed to determine the prevalence of uncontrolled hypertension and identify potential predictors among patients attending the Non-Communicable Disease (NCD) clinic of a tertiary care center in Gujarat, India. METHODS: A cross-sectional study involving 732 adult patients with hypertension was conducted. Sociodemographic data, lifestyle factors, anthropometric measurements, and comorbidities were assessed. Blood pressure was measured using standardized protocols, and uncontrolled hypertension was defined as a systolic blood pressure ≥140 mmHg or diastolic blood pressure ≥90 mmHg. Univariate and multivariate logistic regression analyses were performed to identify predictors of uncontrolled hypertension. RESULTS: The prevalence of uncontrolled hypertension was 60.2% (95% CI: 56.7%-63.7%). In the multivariate analysis, increasing age (adjusted OR: 1.21, 95% CI: 1.05-1.39), increased body mass index (adjusted OR: 1.49, 95% CI: 1.27-1.75), diabetes (adjusted OR: 1.68, 95% CI: 1.20-2.35), chronic kidney disease (adjusted OR: 2.11, 95% CI: 1.22-3.65), and current smoking status (adjusted OR: 1.83, 95% CI: 1.14-2.93) were identified as independent predictors of uncontrolled hypertension. CONCLUSION: This study revealed a high prevalence of uncontrolled hypertension in this tertiary care setting. Age, obesity, diabetes, chronic kidney disease, and smoking were identified as significant predictors.
Targeted interventions addressing these modifiable risk factors and comorbidities are crucial for improving blood pressure control and reducing the burden of hypertension-related complications.
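The confidence interval reported for the 60.2% prevalence follows from the standard Wald interval for a proportion, p ± 1.96·√(p(1−p)/n) with n = 732. A minimal sketch that reproduces the abstract's reported limits:

```python
import math

def prevalence_wald_ci(p_hat: float, n: int, z: float = 1.96):
    """Wald 95% confidence interval for a prevalence estimate p_hat
    based on a sample of size n."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

lo, hi = prevalence_wald_ci(0.602, 732)
print(f"{lo:.1%}-{hi:.1%}")  # → 56.7%-63.7%, matching the reported interval
```

With 732 patients the interval is fairly tight, which is why the study can state the prevalence to within about ±3.5 percentage points.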
