ABSTRACT
PURPOSE: To investigate the combinations of variables comprising the biopsychosocial model domains in order to identify clinical profiles of risk and protection for a second anterior cruciate ligament injury. METHODS: One hundred forty-five patients referred for return-to-sport testing after anterior cruciate ligament (ACL) reconstruction (ACLR) were contacted, and 97 were deemed eligible. All were evaluated between 6 and 24 months after surgery and followed up for 2 years. Participants completed the International Knee Documentation Committee (IKDC) questionnaire and the Anterior Cruciate Ligament-Return to Sport after Injury (ACL-RSI) scale, underwent postural stability assessment on the Biodex Balance System, and had muscle strength assessed at 60°/s and 300°/s on an isokinetic dynamometer. Personal factors (age, gender, body mass index), body structures (graft type and concomitant injuries), and environmental factors (time between surgery and evaluation) were also collected. Participants were asked about the occurrence of a second ACL injury and return to sport after 2 years of follow-up. Classification and regression tree (CART) analysis was used to determine predictors of a second ACL injury. A receiver operating characteristic (ROC) curve was used to verify the accuracy of the CART analysis, together with the sensitivity, specificity, and relative risk (RR) of the model. RESULTS: Of the initial 97 participants, 88 (89.8%) responded to follow-up and 14 (15.9%) had a second ACL injury (11 graft ruptures and three contralateral ACL injuries). CART analysis identified the following variables as predictors of a second ACL injury: return to sport, hamstring strength symmetry at 300°/s, ACL-RSI score, hamstrings/quadriceps ratio at 60°/s, and body mass index (BMI). CART correctly identified 9 (64.3%) of the 14 participants who were reinjured and 71 (95.9%) of the 74 participants who were not. The total correct classification was 90.9%. 
The area under the ROC curve was 0.88 (95% CI 0.72-0.99; p < 0.001), and the model showed a sensitivity of 75% (95% CI 42.8-94.5), specificity of 93.4% (95% CI 85.3-97.8), and RR of 15.9 (95% CI 4.9-51.4; p < 0.0001). CONCLUSION: The combination of hamstring strength symmetry, hamstring/quadriceps ratio (body functions); return to sport (activity and participation); psychological readiness; and BMI (personal factors) could identify three clinical risk profiles for a second ACL injury with good accuracy. LEVEL OF EVIDENCE: IV.
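The classification metrics reported for this CART follow directly from its confusion counts (9 of 14 reinjured participants flagged, 71 of 74 uninjured participants cleared); a minimal sketch of that arithmetic:

```python
# Recovering the classification metrics reported above from the CART
# confusion counts (illustrative arithmetic, not the study's analysis).
tp, fn = 9, 5    # reinjured participants: correctly / incorrectly classified
tn, fp = 71, 3   # uninjured participants: correctly / incorrectly classified

sensitivity = tp / (tp + fn)                 # proportion of reinjuries caught
specificity = tn / (tn + fp)                 # proportion of non-reinjuries cleared
accuracy = (tp + tn) / (tp + fn + tn + fp)   # total correct classification

print(round(sensitivity * 100, 1))  # 64.3
print(round(specificity * 100, 1))  # 95.9
print(round(accuracy * 100, 1))     # 90.9
```

Note that the ROC-derived sensitivity (75%) and specificity (93.4%) reported for the model come from the curve analysis and differ from these raw classification-table proportions.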
ABSTRACT
Purpose: We aimed to assess the effect of hemoglobin (Hb) concentration and oxygenation index on the mortality risk of COVID-19 patients. Patients and Methods: We retrospectively reviewed sociodemographic and clinical characteristics, laboratory findings, and clinical outcomes of patients admitted to a tertiary care hospital in Bogotá, Colombia, from March to July 2020. We assessed exploratory associations between the oxygenation index and Hb concentration at admission and clinical outcomes. We used a generalized additive model (GAM) to evaluate the observed nonlinear relations and the classification and regression trees (CART) algorithm to assess interaction effects. Results: We included 550 patients, of whom 52% were male. The median age was 57 years, and the most frequent comorbidity was hypertension (29%). The median SpO2/FiO2 was 424, and the median Hb concentration was 15 g/dL. Mortality was 15.1% (83 patients). Age, sex, and SpO2/FiO2 were independently associated with mortality. We described a nonlinear relationship of Hb concentration and neutrophil-to-lymphocyte ratio (NLR) with mortality, and an interaction effect between SpO2/FiO2 and Hb concentration. Patients with a similar oxygenation index had different mortality likelihoods depending on their Hb at admission. CART showed that patients with SpO2/FiO2 < 324 who were younger than 81 years, had an NLR > 9.9, and had Hb > 15 g/dL had the highest mortality risk (91%). Additionally, patients with SpO2/FiO2 > 324 but Hb < 12 g/dL and a history of hypertension had a higher mortality likelihood (59%). In contrast, patients with SpO2/FiO2 > 324 and Hb > 12 g/dL had the lowest mortality risk (9%). Conclusion: We found that a decreased SpO2/FiO2 increased mortality risk. Extreme Hb values, either low or high, were associated with an increased likelihood of mortality. 
However, Hb concentration modified the SpO2/FiO2 effect on mortality; the probability of death in patients with low SpO2/FiO2 increased as Hb increased.
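The three risk profiles described above can be re-expressed as a simple rule set. The thresholds are those reported for the CART, and the returned values are the published group mortality likelihoods, not individual predictions; this is a sketch, not a validated clinical tool.

```python
def mortality_risk_profile(spo2_fio2, age, nlr, hb, hypertension):
    """Re-encode the three CART risk profiles reported above.

    Thresholds come from the published splits; returns the reported
    group mortality likelihood, or None for paths the abstract does
    not describe. Illustrative only, not a clinical tool.
    """
    if spo2_fio2 < 324:
        # Highest-risk profile: younger than 81, NLR > 9.9, Hb > 15 g/dL
        if age < 81 and nlr > 9.9 and hb > 15:
            return 0.91
        return None  # remaining low-SpO2/FiO2 paths not reported
    if hb < 12 and hypertension:
        return 0.59  # higher mortality likelihood despite better oxygenation
    if hb > 12:
        return 0.09  # lowest-risk profile
    return None

print(mortality_risk_profile(300, 65, 12.0, 16.0, False))  # 0.91
print(mortality_risk_profile(400, 65, 5.0, 11.0, True))    # 0.59
print(mortality_risk_profile(400, 65, 5.0, 14.0, False))   # 0.09
```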
ABSTRACT
Many studies have shown that children with reading difficulties present deficits in rapid automatized naming (RAN) and phonological awareness skills. The aim of this study was to examine RAN and explicit phonological processing in Brazilian Portuguese-speaking children with developmental dyslexia and to explore the ability of RAN to discriminate between children with and without dyslexia. Participants were 30 children with a clinical diagnosis of dyslexia established by the Brazilian Dyslexia Association and 30 children with typical development. Children were aged between 7 and 12 years, and the groups were matched for chronological age and sex. They completed a battery of tests commonly used in Brazil for diagnosing dyslexia, consisting of the Wechsler Intelligence Scale for Children (WISC-IV) as well as tests of single word and non-word reading, RAN, and the profile of phonological abilities test. Results indicate that the cognitive profile of this group of children with a clinical diagnosis of dyslexia showed preserved skills on the four subscales of the WISC-IV (verbal comprehension, perceptual reasoning, working memory, and processing speed) and on the profile of phonological abilities test. The groups differed significantly on the reading tests (word and non-word) and RAN measures, with medium to large effect sizes for RAN. Classification and regression tree analysis revealed that RAN was a good predictor of a dyslexia diagnosis, with an overall classification accuracy of 88.33%.
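As an illustration of the analysis style, a classification tree restricted to a single split can discriminate two groups on one RAN measure. The sketch below uses synthetic data with assumed group means and spreads, not the study's sample, and scikit-learn's `DecisionTreeClassifier`:

```python
# Illustrative sketch (synthetic data, not the study's sample): how a
# classification tree can discriminate groups from a single RAN measure.
# The group means and spreads below are assumptions for demonstration.
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
ran_typical = rng.normal(35.0, 4.0, 30)    # naming times (s), typical readers
ran_dyslexia = rng.normal(50.0, 5.0, 30)   # assumed slower in dyslexia

X = np.concatenate([ran_typical, ran_dyslexia]).reshape(-1, 1)
y = np.array([0] * 30 + [1] * 30)  # 1 = dyslexia

# A single split (max_depth=1) mirrors a one-predictor CART rule.
tree = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X, y)
acc = accuracy_score(y, tree.predict(X))
print(f"training accuracy: {acc:.2%}")
```

With well-separated synthetic groups the single split classifies nearly all cases; the study's real-data accuracy (88.33%) reflects the genuine overlap between groups.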
ABSTRACT
CONTEXT: Understanding the factors that predict return to sport (RTS) after anterior cruciate ligament reconstruction facilitates clinical decision making. OBJECTIVE: To develop a clinical decision algorithm that could predict RTS and non-RTS based on differences in variables after anterior cruciate ligament reconstruction. DESIGN: Cross-sectional study. SETTING: University laboratory. PATIENTS OR OTHER PARTICIPANTS: A total of 150 athletes in sports involving deceleration, jumping, cutting, or turning were enrolled in the study. All participants answered the International Knee Documentation Committee and Anterior Cruciate Ligament Return to Sport After Injury (ACL-RSI) questionnaires and performed balance and isokinetic tests. MAIN OUTCOME MEASURE(S): Classification and regression tree (CART) analysis was used to determine the clinical decision algorithm associated with RTS at any level and RTS at the preinjury level. The diagnostic accuracy of the CART was verified. RESULTS: Of the 150 participants, 57.3% (n = 86) returned to sport at any level and 12% (n = 18) returned to sport at the preinjury level. The interaction among peak torque extension at 300°/s >93.55 Nm, ACL-RSI score >27.05 (P = .06), and postoperative time >7.50 months was identified by CART as associated with RTS at any level. An ACL-RSI score >72.85% was the main variable associated with RTS at the preinjury level. The interaction among an ACL-RSI score of 50.40% to 72.85%, agonist:antagonist ratio at 300°/s ≤63.6%, and anteroposterior stability index ≤2.4 was the second factor associated with RTS at the preinjury level. CONCLUSIONS: Athletes who had more quadriceps strength tended to RTS at any level more quickly, even with less-than-expected psychological readiness. 
Regarding return to the preinjury level, psychological readiness was the most important factor in not returning, followed by a better agonist:antagonist ratio and better balance.
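The interaction paths above can be written out as explicit rules. The thresholds are those reported for the CART; the functions are an illustrative re-encoding, not a validated screening tool.

```python
def rts_any_level(peak_torque_ext_300, acl_rsi, months_postop):
    """CART path reported for return to sport at any level (illustrative)."""
    return (peak_torque_ext_300 > 93.55   # knee-extension peak torque, Nm, at 300 deg/s
            and acl_rsi > 27.05           # psychological readiness score
            and months_postop > 7.50)     # time since surgery

def rts_preinjury_level(acl_rsi, ratio_300, ap_stability_index):
    """CART paths reported for return at the preinjury level (illustrative)."""
    if acl_rsi > 72.85:                   # main split: psychological readiness
        return True
    # Second path: intermediate readiness plus strength ratio and balance
    return (50.40 <= acl_rsi <= 72.85
            and ratio_300 <= 63.6         # agonist:antagonist ratio at 300 deg/s, %
            and ap_stability_index <= 2.4)

print(rts_any_level(100.0, 30.0, 8.0))        # True
print(rts_preinjury_level(60.0, 60.0, 2.0))   # True
```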
ABSTRACT
OBJECTIVES: Recent guidelines recommend that all cirrhotic patients undergo endoscopic screening for esophageal varices. Identifying cirrhotic patients with esophageal varices by noninvasive predictors would allow endoscopy to be restricted to patients at high risk of having varices. This study aimed to develop a decision model based on classification and regression tree analysis for the prediction of large esophageal varices in cirrhotic patients. METHODS: A total of 309 cirrhotic patients (training sample, 187 patients; test sample, 122 patients) were included. Within the training sample, classification and regression tree analysis was used to identify predictors and build a prediction model for large esophageal varices. The prediction model was then further evaluated in the test sample and across Child-Pugh classes. RESULTS: The prevalence of large esophageal varices in cirrhotic patients was 50.8 percent. A tree model consisting of spleen width, portal vein diameter, and prothrombin time, developed by classification and regression tree analysis, achieved a diagnostic accuracy of 84 percent for the prediction of large esophageal varices. When patients were regrouped into two categories, the rate of varices was 83.2 percent in the high-risk group and 15.2 percent in the low-risk group. The accuracy of the tree model was maintained in the test sample and across Child-Pugh classes. CONCLUSIONS: A decision tree model consisting of spleen width, portal vein diameter, and prothrombin time may be useful for the prediction of large esophageal varices in cirrhotic patients.
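As a rough measure of how strongly the two-group split separates risk, the ratio of the reported varices rates can be computed directly from the figures above (a sketch using only the published group rates):

```python
# Risk gradient implied by the reported two-group CART split
# (83.2% varices in the high-risk group vs 15.2% in the low-risk group).
high_risk_rate = 0.832
low_risk_rate = 0.152

risk_ratio = high_risk_rate / low_risk_rate
print(round(risk_ratio, 2))  # 5.47: high-risk patients are over five
                             # times as likely to have large varices
```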