ABSTRACT
BACKGROUND: Insufficient and excessive sleep durations are associated with an increased risk of individual adverse outcomes. However, it remains largely unknown whether sleep duration trajectories are associated with overall health among older adults. This study aimed to examine the association between sleep duration trajectories and successful aging. METHODS: In the China Health and Retirement Longitudinal Study (CHARLS), 3,306 participants who were free of major chronic diseases at baseline and had survived to age 60 years or older by the end of follow-up were potentially eligible. Total sleep duration was assessed in 2011, 2013, and 2015. Successful aging was evaluated in 2020 and defined as being free of major chronic diseases, having no physical impairment, high cognitive function, good mental health, and active engagement with life. A latent class mixed model (LCMM) was used to identify sleep duration trajectories, and logistic regression was performed to explore the association between these trajectories and successful aging. RESULTS: During the 9-year follow-up, 455 individuals (13.8%) met the criteria for successful aging. Five sleep duration trajectories were identified: normal stable, long stable, decreasing, increasing, and short stable. Compared with the normal stable trajectory, the adjusted ORs (95% CIs) for achieving successful aging in the long stable, decreasing, increasing, and short stable trajectories were 1.00 (0.77, 1.30), 0.64 (0.40, 1.03), 0.64 (0.45, 0.92), and 0.48 (0.35, 0.66), respectively. Stratified and sensitivity analyses were generally consistent with the main results. CONCLUSIONS: Increasing and short stable trajectories of sleep duration are associated with lower odds of successful aging relative to the normal stable trajectory. These findings underscore the importance of monitoring dynamic changes in sleep duration in middle-aged and older Chinese adults.
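As an illustration of the final modelling step described above, the sketch below (not the authors' code) fits a covariate-adjusted logistic regression with the normal stable trajectory as the reference and exponentiates the coefficients to obtain adjusted ORs with 95% CIs. The file and column names (successful_aging, trajectory, age, sex, education) are hypothetical; the LCMM trajectory assignment itself would typically be done beforehand with dedicated software (e.g., the R lcmm package).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset: one row per participant, with the LCMM-assigned
# trajectory label, the successful-aging indicator, and example covariates.
df = pd.read_csv("charls_trajectories.csv")  # assumed file and column names

# Logistic regression with "normal_stable" as the reference trajectory.
model = smf.logit(
    "successful_aging ~ C(trajectory, Treatment(reference='normal_stable'))"
    " + age + C(sex) + C(education)",
    data=df,
).fit()

# Adjusted odds ratios and 95% confidence intervals on the exponentiated scale.
ors = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([ors.rename("OR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```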
Subjects
Sleep, Humans, Male, Female, Aged, Middle Aged, China/epidemiology, Longitudinal Studies, Sleep/physiology, Time Factors, Healthy Aging/physiology, Aged 80 and over, Aging/physiology, Cohort Studies, Sleep Duration
ABSTRACT
BACKGROUND: This case study examines the application of Integrated Enhanced Cognitive Behavioural Therapy (I-CBTE) for a patient with severe, longstanding anorexia nervosa and multiple comorbidities, including organic hallucinosis, complex post-traumatic stress disorder (CPTSD), and severe self-harm. Such complex presentations often result in patients falling between services, which can lead to high chronicity and increased mortality risk. Commentaries from two additional patients who have recovered from severe and longstanding anorexia nervosa are included. CASE STUDY: The patient developed severe anorexia nervosa and hallucinosis after a traumatic brain injury in 2000. Despite numerous hospitalisations and various psychotropic medications in the UK and France, standard treatments were ineffective for 17 years. However, I-CBTE, using a whole-team approach and intensive, personalised psychological treatment alongside nutritional rehabilitation, proved effective. METHODS: In this paper, we describe the application of the I-CBTE model for individuals with severe, longstanding, and complex anorexia nervosa, using lived experience perspectives from three patients to inform clinicians. We also outline the methodology for adapting the model to different presentations of the disorder. OUTCOMES: The patient achieved and maintained full remission from her eating disorder over the last 6 years, highlighting the benefit of the I-CBTE approach in patients with complex, longstanding eating disorder histories. Successful treatment also saved in excess of £360,000 by preventing further hospitalisations alone, without accounting for the improvement in her quality of life. This suggests that this method can improve outcomes and reduce healthcare costs. CONCLUSION: This case study, with commentaries from two patients with histories of severe and longstanding anorexia nervosa, provides a detailed description of the practical application of I-CBTE for patients with severe and longstanding eating disorders, complex comorbidities, and extensive treatment histories. It offers hope for patients and a framework for clinicians to enhance existing treatment frameworks, potentially transforming the trajectory of those traditionally deemed treatment resistant. RECOMMENDATIONS: We advocate the broader integration of CBT for eating disorders into specialist services across the care pathway to help improve outcomes for patients with complex eating disorders. Systematic training and supervision for multidisciplinary teams in this specialised therapeutic approach are recommended. Future research should investigate the long-term effectiveness of I-CBTE through longitudinal studies. Patient feedback on experiences of integrated models of care such as I-CBTE is also needed. In addition, systematic health economics studies should be conducted.
This case study, with commentaries from two other patients who have recovered from severe and longstanding anorexia nervosa, examines the use of Integrated Enhanced Cognitive Behavioural Therapy (I-CBTE) for a patient with severe and longstanding anorexia nervosa and multiple comorbidities. The patient had a history of multiple hospitalisations and had been treated with various psychotropic medications without success for 17 years. However, she responded to I-CBTE. The model integrates multidisciplinary treatment to address the eating disorder and co-occurring conditions effectively. The patient achieved and maintained full remission from her eating disorder over the last 6 years, highlighting the effectiveness of the I-CBTE approach in patients with complex, longstanding and severe eating disorders. The intervention is cost-effective and has significant financial advantages for healthcare systems. The authors recommend further research into the long-term effectiveness of I-CBTE and broader integration of CBT for eating disorders into clinical services and existing treatment frameworks to enhance care for patients with severe and longstanding eating disorders. Systematic training and supervision for multidisciplinary teams are needed, as is patient feedback on experiences of integrated models of care such as I-CBTE. Finally, systematic health economics studies should be conducted.
ABSTRACT
PURPOSE: This study aimed to investigate the association between vitamin D levels and periodontitis according to sleep duration in a representative sample of Korean adults. MATERIALS AND METHODS: A total of 3,535 subjects who participated in the sixth (2013-2014) Korea National Health and Nutrition Examination Survey were examined. Vitamin D deficiency was defined as a serum 25-hydroxyvitamin D concentration below 20 ng/ml. Periodontal status was assessed with the community periodontal index (CPI), and a high CPI was defined as a score ≥ 3. Multivariable logistic regression analyses were adjusted for sociodemographic variables, oral and general health behaviors, and systemic health status. All analyses used a complex sampling design, and a subgroup analysis was performed to obtain estimates after stratification by sleep duration (≤ 5, 6, 7-8, and ≥ 9 h per day). RESULTS: Multivariable regression analysis indicated that among participants who slept for ≥ 9 h per day, those with vitamin D deficiency were 5.51 times (95% confidence interval = 2.04-14.89) more likely to have periodontitis than those with sufficient vitamin D levels. This association was not statistically significant in the other sleep duration groups. CONCLUSION: The findings indicate that people with vitamin D deficiency who sleep 9 h or more per day may be significantly more likely to have periodontitis.
Subjects
Nutrition Surveys, Sleep, Vitamin D Deficiency, Vitamin D, Humans, Republic of Korea/epidemiology, Male, Female, Sleep/physiology, Vitamin D/blood, Vitamin D/analogs & derivatives, Adult, Middle Aged, Vitamin D Deficiency/epidemiology, Vitamin D Deficiency/blood, Periodontitis/blood, Periodontitis/epidemiology, Periodontal Index, Time Factors, Aged, Sleep Duration
ABSTRACT
BACKGROUND: Advanced glycation end products (AGEs), a group of food processing byproducts, have been implicated in the development of various diseases. However, the relationship between circulating AGEs and sleep disorders remains uncertain. METHODS: This cross-sectional study elucidated the association of plasma AGEs with sleep disorders among 1,732 Chinese adults who participated in the initial visit (2019-2020) of the Tongji-Shenzhen Cohort (TJSZC). Sleep behavior was assessed using self-reported questionnaires and precise accelerometers. Plasma levels of AGEs, including Nε-(carboxymethyl)lysine (CML), Nε-(carboxyethyl)lysine (CEL), and Nδ-(5-hydro-5-methyl-4-imidazolone-2-yl)-ornithine (MG-H1), were quantified by ultra-high performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS). RESULTS: In logistic regression, each IQR increment in individual AGEs was associated with increased odds of short sleep duration (CML: 1.11 [1.00, 1.23]; CEL: 1.16 [1.04, 1.30]), poor sleep quality (CML: 1.33 [1.10, 1.60]; CEL: 1.53 [1.17, 2.00]; MG-H1: 1.61 [1.25, 2.07]), excessive daytime sleepiness (CML: 1.33 [1.11, 1.60]; MG-H1: 1.39 [1.09, 1.77]), and insomnia (CML: 1.29 [1.05, 1.59]). Furthermore, in weighted quantile sum regression and Bayesian kernel machine regression analyses, elevated overall exposure levels of plasma AGEs were associated with an increased risk of sleep disorders, including short sleep duration, poor sleep quality, excessive daytime sleepiness, and insomnia, with CML identified as the leading contributor. Insufficient vegetable intake and higher dietary fat intake were associated with an increase in plasma CEL. CONCLUSIONS: These findings support a significant association between plasma AGEs and sleep disorders, indicating that AGEs may adversely influence sleep health and that reducing AGE intake may help prevent and ameliorate sleep disorders.
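A minimal sketch of the "per IQR increment" logistic models described above, assuming a dataframe with hypothetical columns cml, cel, mgh1, short_sleep, age, sex, and bmi; the mixture analyses (WQS, BKMR) mentioned in the abstract require dedicated packages and are not shown.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tjszc_ages.csv")  # hypothetical file with plasma AGE and sleep columns

# Express each AGE on the per-IQR scale so that one unit = one interquartile range,
# matching the "per IQR increment" odds ratios reported in the abstract.
for agent in ["cml", "cel", "mgh1"]:
    iqr = df[agent].quantile(0.75) - df[agent].quantile(0.25)
    df[f"{agent}_iqr"] = df[agent] / iqr

# Covariate-adjusted logistic model for one outcome (short sleep duration).
fit = smf.logit("short_sleep ~ cml_iqr + age + C(sex) + bmi", data=df).fit()
print(np.exp(fit.params["cml_iqr"]), np.exp(fit.conf_int().loc["cml_iqr"]).values)
```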
Subjects
Advanced Glycation End Products, Sleep Wake Disorders, Adult, Aged, Female, Humans, Male, Middle Aged, China/epidemiology, Cross-Sectional Studies, East Asian People, Advanced Glycation End Products/blood, Lysine/analogs & derivatives, Lysine/blood, Ornithine/analogs & derivatives, Sleep Wake Disorders/blood, Tandem Mass Spectrometry
ABSTRACT
PURPOSE: This study aimed to explore the safety and efficacy of laser treatment settings for micropulse transscleral cyclophotocoagulation in glaucoma patients and to evaluate the relationship between intraocular pressure (IOP) reduction and different treatment parameters. MATERIALS AND METHODS: A total of 74 eyes of 64 glaucoma patients with IOP over 21 mmHg, or under 20 mmHg with visual field progression, who underwent micropulse transscleral cyclophotocoagulation were included. Patients were divided into success and failure groups based on a criterion of a 20% IOP reduction rate. Predictive factors of IOP reduction between the success and failure groups, and the IOP reduction rates in the different treatment duration groups, were evaluated. Predictive factors for IOP reduction were analyzed using univariate and multivariate regression models. RESULTS: Patients in the success group had significantly higher baseline IOP (median: 28.0 vs. 23.0 mmHg; P = 0.016) and longer treatment times (median: 240 vs. 160 s; P = 0.001). Treatment durations between 200 and 240 s achieved significantly higher IOP reduction rates (47.8 ± 17.4%) than durations under 140 s (23.1 ± 14.2%). Univariate analysis showed that baseline IOP and treatment duration were significant contributing factors to IOP reduction. Multivariable analysis further demonstrated that a treatment duration over 200 s was a significant predictive factor for IOP reduction. CONCLUSION: Treatment duration was the most significant factor determining IOP reduction rates in micropulse transscleral cyclophotocoagulation. Therapy can be customized according to the target IOP reduction rate by selecting different treatment duration settings to achieve optimal outcomes.
ABSTRACT
Time is a fundamental dimension of our perception and mental construction of reality. It enables us to resolve changes in our environment without a direct sensory representation of elapsed time. The concept of time is therefore inferential by nature, but the units of subjective time that provide meaningful segmentation of the influx of sensory input remain to be determined. In this review, we posit that events are the construal instances of time perception, as they provide a reproducible and consistent segmentation of content. In that light, we discuss the implications of this proposal by looking at "events" and their role in subjective time experience from cultural anthropological and ontogenetic perspectives, as well as their relevance for episodic memory. Furthermore, we discuss the significance of "events" for the two critical aspects of subjective time: duration and order. Because segmentation involves parsing event streams according to causal sequences, we also consider the role of causality in developing the concept of directionality of mental timelines. We offer a fresh perspective on how past and future events are represented before age 5, proposing an egocentric bi-directional timeline model that precedes acquisition of the allocentric concept of absolute time. Finally, we illustrate how the relationship between events and durations can resolve contradictory experimental results. Although "time" warrants a comprehensive interdisciplinary approach, we focus this review on "time perception", the experience of time, without attempting to provide an all-encompassing overview of the rich philosophical, physical, psychological, cognitive, linguistic, and neurophysiological context.
ABSTRACT
Microplastics (MPs) can enter the human body through respiration and pose a hazard to human health. Wearing masks became routine during the COVID-19 pandemic, yet the level of respiratory exposure to MPs and the influence of mask wearing are currently unknown. We recruited 113 college students and divided them into natural exposure (NE), surgical mask (SM), and cotton mask (CM) groups. Nasal lavage fluid (NLF) was collected, and MP characteristics were analyzed using polarized light microscopy and a laser direct infrared system. We found a relatively high abundance of MPs in the NLF of the SM group (41.24 ± 1.73 particles/g). The particle size distribution and the percentage of fibrous MPs differed significantly among the three groups. The main components in the NE, SM, and CM groups were polypropylene (58.70%), polycarbonate (PC, 49.49%), and PC (54.29%), respectively. Components such as polyamide, polyethylene, and polyethylene terephthalate were also detected. Wearing surgical masks increased MP abundance in NLF (β = 0.36, P < 0.01), and abundance rose with longer wear time (β = 0.28, P < 0.05). However, participants who used bedding containing synthetic fibers had lower MP abundance in their NLF. This study highlights the use of NLF to evaluate MP exposure, which is associated with potential health risks.
ABSTRACT
BACKGROUND: Hemodialysis (HD) patients represent a high-risk group for hepatitis B infection, so it is crucial to administer hepatitis B vaccination and to stimulate higher and more sustained anti-HBs levels. Our aim was to enhance immunogenicity and persistence by implementing high-dose and prolonged hepatitis B vaccination regimens in HD patients. METHODS: We conducted this multicenter, randomized, parallel-controlled trial between July 2020 and February 2023 at 11 hospitals in Shanxi province, China. A total of 504 HD patients were enrolled. All participants were randomly allocated in a 1:1:1 ratio to receive recombinant HBV vaccine as 3 standard doses (20 µg) at 0-1-6 months (IM20×3 group), 4 standard doses at 0-1-2-6 months (IM20×4 group), or 4 triple doses (60 µg) at 0-1-2-6 months (IM60×4 group). RESULTS: The vaccine-elicited antibody response peaked at month 7. Follow-up from month 7 to month 30 showed that anti-HBs response rates decreased from 85.9% (134/156) to 33.0% (33/100) in the IM20×3 group, from 92.5% (135/146) to 53.9% (56/104) in the IM20×4 group, and from 95.4% (145/152) to 57.3% (55/96) in the IM60×4 group. The duration for which 75% of patients maintained a protective antibody response was 21.0 months in the IM20×3 group, 25.7 months in the IM20×4 group (vs. IM20×3, P=0.056), and 29.2 months in the IM60×4 group (vs. IM20×3, P=0.034). All adverse reactions were mild. CONCLUSIONS: The regimen of four triple doses (60 µg) enhanced immunogenicity and the 2-year duration of response in HD patients. The trial was registered with ClinicalTrials.gov, number NCT03962881. https://classic.clinicaltrials.gov/ct2/show/NCT03962881?term=NCT03962881&draw=2&rank=1.
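The time by which a given proportion of patients still maintain protective antibody can be read off a Kaplan-Meier curve of time to loss of protection. The sketch below is one way to do this with lifelines, using hypothetical per-patient columns (months, lost_protection, arm); it is not the trial's actual analysis code.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical per-patient data: months of follow-up until anti-HBs dropped below
# the protective threshold (lost_protection = 1) or until censoring (= 0), plus study arm.
df = pd.read_csv("hbv_vaccine_followup.csv")

for arm, grp in df.groupby("arm"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["months"], event_observed=grp["lost_protection"], label=arm)
    sf = kmf.survival_function_[arm]
    # First time point at which the estimated proportion still protected falls to 75%.
    t75 = sf[sf <= 0.75].index.min()
    print(arm, "time with >=75% protected:", t75, "months")
```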
ABSTRACT
BACKGROUND: Interdisciplinary multimodal pain therapy (IMPT) is an established treatment for patients with severe chronic pain. Little evidence is available on the role of treatment dosage and, in particular, on the association between the duration of IMPT and treatment outcome. AIM: The aim of this retrospective study was to compare the medium-term treatment success of a short inpatient (SIT, 1 week) and a long outpatient (LOT, 4 weeks) IMPT with a comparable treatment concept and comparable therapy intensity (20 h/week) in patients with severe chronic pain. METHODS: Patients in both groups completed the German Pain Questionnaire at the beginning and end of IMPT as well as after 3 months. Primary outcome measures included pain-related impairment and average pain intensity at follow-up in patients of comparable sex, age, pain intensity, and impairment at the beginning of therapy. RESULTS: While both groups initially showed significant treatment effects on pain-related impairment and average pain intensity, LOT patients (n = 32) reported significantly better values for both variables at the 3-month follow-up than SIT patients (n = 32). This was due to sustained positive effects in LOT patients and worsening in the SIT group. CONCLUSION: The results indicate that initial treatment effects can be observed in both treatment settings, but a longer duration of therapy seems to favour the long-term stability of treatment effects.
ABSTRACT
Background: This study aims to explore the association between sleep duration and the prevalence of chronic musculoskeletal pain (CMP). Methods: A cross-sectional study was conducted using data from the National Health and Nutrition Examination Survey (NHANES) 2009-2010, which involved multiple centers across the United States. The study included 3,904 adults selected based on age and complete data availability. Demographic variables such as gender, age, race, and socioeconomic status (represented by the poverty-to-income ratio) were considered. Results: Of the participants, 1,595 reported less than 7 h of sleep, 2,046 reported 7-8 h, and 263 reported more than 9 h of sleep. Short sleep duration was associated with higher odds of CMP (OR 1.611, 95% CI 1.224-2.120, p = 0.005). Long sleep duration also showed a higher prevalence of CMP (OR 1.751, 95% CI 0.923-3.321, p = 0.059), although this result was not statistically significant. A U-shaped relationship emerged (effective degrees of freedom [EDF] = 3.32, p < 0.001), with 7 h of sleep associated with the lowest odds of CMP. Among individuals sleeping less than 7 h, each additional hour was associated with 22.8% lower odds of CMP (OR 0.772, 95% CI 0.717-0.833, p = 0.002); beyond 7 h, each additional hour was associated with 38.9% higher odds of CMP (OR 1.389, 95% CI 1.103-1.749, p = 0.049). Conclusion: The findings suggest that both insufficient and excessive sleep durations are linked to a higher prevalence of CMP, highlighting the importance of optimal sleep duration for musculoskeletal health.
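A sketch of how a U-shaped dose-response like this can be modelled, using a B-spline term for sleep duration inside a logistic regression (statsmodels/patsy). The file name, column names, and covariate set are hypothetical, and the NHANES survey weights are omitted for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nhanes_0910_cmp.csv")  # hypothetical extract: cmp (0/1), sleep_hours, covariates

# Model sleep duration flexibly with a B-spline so a U-shaped (non-linear)
# association with chronic musculoskeletal pain can emerge.
fit = smf.logit("cmp ~ bs(sleep_hours, df=4) + age + C(sex) + pir", data=df).fit()

# Predicted probability of CMP across the sleep range; the minimum indicates
# the sleep duration associated with the lowest odds.
grid = pd.DataFrame({"sleep_hours": np.linspace(4, 11, 50),
                     "age": df["age"].mean(), "sex": df["sex"].mode()[0],
                     "pir": df["pir"].mean()})
pred = fit.predict(grid)
print(grid["sleep_hours"][pred.idxmin()], "h has the lowest predicted CMP probability")
```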
ABSTRACT
BACKGROUND: The prospective association between sleep duration and the development of late-life depressive symptomology is unclear. AIMS: To investigate sleep duration from midlife to late life in relation to the risk of depressive symptoms in late life. METHOD: A total of 14 361 participants from the Singapore Chinese Health Study were included in the present study. Daily sleep duration was self-reported at the baseline (mean age of 52.4 years; 1993-98), follow-up 2 (mean age of 65.2 years; 2006-10) and follow-up 3 (mean age of 72.5 years; 2014-16) interviews. Depressive symptoms were evaluated using the Geriatric Depression Scale at the follow-up 3 interviews. Modified Poisson regression models were used to estimate relative risks and 95% confidence intervals of late-life depressive symptoms in relation to sleep duration at baseline and the two follow-up interviews. RESULTS: Compared with sleeping 7 h per day, a short sleep duration of ≤5 h per day at baseline (i.e. midlife) was related to a higher risk of depressive symptoms (relative risk 1.10, 95% CI 1.06-1.15), and this risk was not affected by subsequent prolongation of sleep. Conversely, a long sleep duration of ≥9 h per day at baseline was not related to the risk of depressive symptoms. At follow-up 3 (i.e. late life), both short sleep (relative risk 1.20, 95% CI 1.16-1.25) and long sleep (relative risk 1.12, 95% CI 1.07-1.18) durations were cross-sectionally associated with depressive symptoms. CONCLUSION: Short sleep duration in midlife, regardless of subsequent prolongation, is associated with an increased risk of depression in late life. In contrast, both short and long sleep durations in late life co-occur with depressive symptoms.
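Modified Poisson regression is a Poisson GLM applied to a binary outcome with a robust (sandwich) variance, which yields relative risks directly rather than odds ratios. A minimal sketch under assumed column names (depressive_symptoms, sleep_cat, age, sex), not the study's actual code:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("schs_sleep_depression.csv")  # hypothetical file and columns

# Modified Poisson regression: Poisson family + robust covariance for a binary outcome,
# with 7 h/day as the reference sleep category.
fit = smf.glm(
    "depressive_symptoms ~ C(sleep_cat, Treatment(reference='7h')) + age + C(sex)",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")

rr = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.concat([rr.rename("RR"), ci], axis=1))
```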
ABSTRACT
Several indices have been suggested for determining the follow-up duration in oncology trials from either a maturity or a stability perspective, by maximizing the time t such that the index is either greater or less than a pre-defined cutoff value. However, the selection of the cutoff value is subjective, a commonly agreed cutoff rarely exists, and one sometimes has to resort to simulations. To solve this problem, a new balance index was proposed, which integrates both data stability and data maturity. Its theoretical properties and relationships with other indices were investigated, and its performance was demonstrated through a case study. The highlights of the index are: (1) it is easy to calculate; (2) it is free of cutoff value selection; and (3) it is generally consistent with the other indices while sometimes able to shorten the follow-up duration, making it more flexible. For cases where the new balance index cannot be calculated, a modified balance index was also proposed and discussed. For either a single-arm trial or a randomized clinical trial, the two new balance indices can be applied in a wide range of situations, such as designing a new trial from scratch or using aggregated trial information to inform decision-making during trial conduct.
ABSTRACT
Background: In this study, we aim to examine the impact of dietary total antioxidant capacity (TAC) on sleep problems and depressive symptoms (DS) and to elucidate the potential mediating effect of dietary TAC on the relationship between sleep problems and DS. Methods: Weighted Kruskal-Wallis tests for continuous variables and chi-square tests for categorical variables were employed to compare participants with and without DS. Multivariable logistic regression and restricted cubic spline analyses were applied to evaluate the associations of TAC with DS and sleep problems. Results: Among the 21,805 participants, 1,947 had DS. Weighted multivariable logistic regression indicated that shorter sleep duration was associated with an increased risk of DS even after full adjustment. Restricted cubic spline curves showed that TAC was nearly non-linearly associated with DS and sleep problems. Mediation analysis indicated that sleep duration slightly mediated the association between TAC and DS (proportion mediated: 3.12%, p < 0.001). Conclusion: This study illustrated the inverse association of TAC with sleep problems and DS. Furthermore, TAC slightly mediated the effect of sleep duration on DS, and the relationships of TAC with DS and with sleep problems were nearly non-linear.
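For orientation, a simplified product-of-coefficients mediation sketch is shown below: it estimates the total effect of TAC on DS, the indirect path through sleep duration, and the proportion mediated. It uses linear probability models for the binary outcome purely for illustration and assumed column names; the study's actual mediation analysis (with survey weights and bootstrap inference) would differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nhanes_tac.csv")  # hypothetical columns: ds (0/1), tac, sleep_hours, age, sex

# Simplified product-of-coefficients mediation:
#   total effect of TAC on DS, effect of TAC on the mediator (sleep duration),
#   and the direct model including both exposure and mediator.
total  = smf.ols("ds ~ tac + age + C(sex)", data=df).fit()
med    = smf.ols("sleep_hours ~ tac + age + C(sex)", data=df).fit()
direct = smf.ols("ds ~ tac + sleep_hours + age + C(sex)", data=df).fit()

# Indirect effect = (TAC -> sleep) * (sleep -> DS); proportion mediated = indirect / total.
indirect = med.params["tac"] * direct.params["sleep_hours"]
proportion_mediated = indirect / total.params["tac"]
print(f"proportion mediated: {proportion_mediated:.2%}")
```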
ABSTRACT
BACKGROUND: Hearing efficiency is known to influence and interact with communication and mental health. Hearing impairment may be hidden when it co-occurs with neurological disorders. PURPOSE: We performed a systematic review and meta-analysis to address the following questions: (1) which specific auditory processing measures show clear deficits that separate temporal lobe epilepsy (TLE) patients from normal controls, and (2) how well is TLE evaluated in terms of hearing and auditory processing? METHODS: The study inclusion criteria were: (1) patients diagnosed with temporal lobe epilepsy, (2) presence of a normal control group, (3) auditory processing assessment using auditory stimuli with behavioral tests and/or P300 or mismatch negativity (MMN) latency and/or amplitude, (4) publication in English, and (5) publication date after 2000. A total of 132 articles were retrieved, and 23 were analyzed based on PRISMA and PICO criteria. RESULTS: Temporal resolution and processing, as measured by the behavioral Gaps-In-Noise (GIN) and Duration Pattern Test (DPT), are deficient in TLE patients and separate them from normal controls. Electrophysiological measures (MMN and P300) show statistically significant differences between TLE patients and controls, with patients showing deficient auditory processing. A clear difference between studies using psychoacoustic assessment and those using electrophysiology may be due to missing or incomplete evaluation of peripheral hearing with gold-standard tools (76.9% of electrophysiology studies). CONCLUSION: Auditory processing is deficient in patients with TLE. There is a clear need to evaluate hearing efficiency before proceeding to auditory processing evaluation with behavioral or electrophysiological tests.
ABSTRACT
OBJECTIVE: A decline in abdominal aortic aneurysm (AAA) prevalence calls into question the credibility of general population screening of 65-year-old men. Selectively targeting high-risk individuals in this group could be more effective in preventing death from AAA rupture. This cross-sectional study analysed risk factor data from a cohort of 65-year-old men screened in the Swedish general population-based AAA screening program, with the aim of exploring the effectiveness of hypothetical targeted screening strategies. METHODS: All men attending AAA screening in four neighbouring counties in Sweden between 2006 and 2010 completed a health questionnaire on smoking habits and medical history. An AAA was defined as an aortic diameter ≥ 30 mm. The sensitivity and specificity of different targeted screening strategies, with targeted subpopulations defined by duration of smoking with and without additional risk factors, were explored using receiver operating characteristic (ROC) curves. RESULTS: A total of 16 232 men were screened, with 236 (1.5%) screening-detected AAAs. A strategy combining smoking, presence of coronary artery disease (CAD), or both was associated with the mathematically optimal balance between sensitivity and specificity (optimal threshold) in the ROC analysis. The optimal threshold corresponded to targeting men who had smoked for thirty years or more, had a history of CAD, or both, whereby 74.0% of all AAAs could be detected by screening 33.0% of the population, compared with general screening. Targeting men who had smoked for ten years or more indicated that 84.0% of all AAAs could be detected by screening 55.0% of the population. A simplified strategy of targeting ever smokers resulted in detecting 85.0% of all AAAs by screening 61.0% of the population. CONCLUSION: Targeted screening of men at high risk of AAA, with smoking history as the basis for inclusion, may be a safe and effective alternative to general population screening.
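Rule-based targeted strategies like these can be evaluated directly as sensitivity, specificity, and the fraction of the population invited to screening. A small sketch under hypothetical column names (aaa, smoke_years, cad, ever_smoker), not the study's dataset:

```python
import pandas as pd

df = pd.read_csv("aaa_screening_questionnaires.csv")  # hypothetical columns described above

def evaluate(rule, name):
    """Sensitivity, specificity and screened fraction for a boolean targeting rule."""
    targeted = rule(df)
    sens = df.loc[targeted, "aaa"].sum() / df["aaa"].sum()
    spec = ((~targeted) & (df["aaa"] == 0)).sum() / (df["aaa"] == 0).sum()
    print(f"{name}: sensitivity={sens:.1%}, specificity={spec:.1%}, "
          f"screened={targeted.mean():.1%} of population")

evaluate(lambda d: (d["smoke_years"] >= 30) | (d["cad"] == 1), ">=30 smoking years or CAD")
evaluate(lambda d: d["smoke_years"] >= 10, ">=10 smoking years")
evaluate(lambda d: d["ever_smoker"] == 1, "ever smoker")
```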
ABSTRACT
PURPOSE: To evaluate 18F-FDG myocardial metabolism imaging (MMI) using a total-body PET/CT scanner and to explore a feasible scan duration to guide clinical practice. METHODS: A retrospective analysis was conducted on 41 patients who underwent myocardial perfusion-metabolism imaging to assess myocardial viability. The patients underwent 18F-FDG MMI on a total-body PET/CT scanner in list mode for 600 s. PET data were trimmed and reconstructed to simulate images with acquisition times of 600 s, 300 s, 120 s, 60 s, and 30 s (G600-G30). Images from the different groups were subjectively evaluated using a 5-point Likert scale. Semi-quantitative evaluation was performed using the standardized uptake value (SUV), myocardial-to-background activity ratio (M/B), signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), contrast ratio (CR), and coefficient of variation (CV). Myocardial viability analysis included the Mismatch and Scar indexes. G600 served as the reference. RESULTS: Subjective visual evaluation indicated a decline in image quality scores with shortening scan duration. The G600, G300, and G120 images were all clinically acceptable (score ≥ 3), with image quality scores of 4.9 ± 0.3, 4.8 ± 0.4, and 4.5 ± 0.8, respectively (P > 0.05). Moreover, as the scan duration decreased, the semi-quantitative parameters M/B, SNR, CNR, and CR decreased, while SUV and CV increased; significant differences were observed in the G300-G30 groups compared with the G600 group (P < 0.05). For myocardial viability analysis of the left ventricle and coronary segments, the Mismatch and Scar values of the G300-G30 groups were almost identical to those of the G600 group (ICC: 0.968-1.0, P < 0.001). CONCLUSION: Image quality sufficient for clinical diagnosis could be achieved at G120 for MMI with a total-body PET/CT scanner, while the image quality at G30 was acceptable for myocardial viability analysis.
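For reference, the semi-quantitative metrics named above are commonly computed from myocardial and background ROI statistics roughly as sketched below; the exact ROI definitions and formulas used in this study may differ, and the arrays here are placeholders.

```python
import numpy as np

# Placeholder voxel-value arrays standing in for myocardial and background ROIs
# of a reconstructed 18F-FDG PET image (values in SUV units).
myocardium = np.random.default_rng(0).normal(6.0, 0.8, 500)
background = np.random.default_rng(1).normal(1.2, 0.3, 500)

m_mean, m_sd = myocardium.mean(), myocardium.std(ddof=1)
b_mean, b_sd = background.mean(), background.std(ddof=1)

metrics = {
    "SUVmean (myocardium)": m_mean,
    "M/B": m_mean / b_mean,                       # myocardium-to-background ratio
    "SNR": m_mean / b_sd,                         # signal relative to background noise
    "CNR": (m_mean - b_mean) / b_sd,              # contrast relative to background noise
    "CR":  (m_mean - b_mean) / (m_mean + b_mean), # contrast ratio
    "CV":  m_sd / m_mean,                         # coefficient of variation in the myocardium
}
print({k: round(v, 2) for k, v in metrics.items()})
```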
ABSTRACT
This paper investigates the overtaking behavior of motorcyclists in a suburban environment. The goals are to model overtaking duration, identify the factors influencing it, and determine the likelihood of a rider overtaking a vehicle while maintaining critical lateral clearance. Riding data were collected using a passenger car equipped with cameras and a GPS device, which recorded videos of motorcyclists performing maneuvers to overtake it. This setup captured natural motorcyclist behavior and avoided the potential limitations of instrumented motorcycle studies, such as bias caused by participants being aware of their involvement in the experiment. A total of 119 overtaking maneuvers were recorded. A methodology combining digital image processing algorithms and GPS analysis was employed to characterize the recorded maneuvers. Survival and logistic analyses were then conducted to model overtaking duration and lateral clearance, respectively. The hazard-based duration model indicated that the duration of a motorcyclist's overtaking maneuver is influenced by the final longitudinal distance between the motorcycle and the passed vehicle at the end of the maneuver, the speed difference between the motorcycle and the front vehicle at the same instant, and the initial Time-To-Collision (TTC) between the motorcycle and the front vehicle at the beginning of the overtaking. The logistic regression analysis revealed that the probability of overtaking a vehicle with a lateral clearance below the critical threshold increases when the rider does not invade the opposite lane during the overtaking maneuver, when a vehicle in the opposite lane induces the motorcyclist to return to the right lane, and as the duration of the overtaking maneuver increases. This research contributes to understanding motorcyclist behavior during overtaking maneuvers, aiding the development of more realistic microsimulation models that account for actual rider behavior. Additionally, the study contributes to the development of Advanced Rider Assistance Systems aimed at guiding motorcyclists to make safer overtaking decisions and reducing risk exposure from complex overtaking maneuvers.
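A hazard-based duration model of this kind can be sketched as an accelerated failure time (Weibull AFT) fit, with every overtaking maneuver treated as a fully observed duration. The file and column names below are hypothetical, not the authors' dataset or code.

```python
import pandas as pd
from lifelines import WeibullAFTFitter

# Hypothetical per-maneuver data: overtaking duration (s) plus candidate covariates
# from the video/GPS processing; all maneuvers are fully observed (event = 1).
df = pd.read_csv("overtaking_maneuvers.csv")
df["event"] = 1

# Hazard-based (accelerated failure time) duration model: coefficients indicate how
# each covariate stretches or shortens the expected overtaking duration.
aft = WeibullAFTFitter()
aft.fit(df[["duration_s", "event", "final_long_distance_m",
            "speed_diff_kmh", "initial_ttc_s"]],
        duration_col="duration_s", event_col="event")
aft.print_summary()
```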
ABSTRACT
The effect of the duration of red blood cell (RBC) storage on the outcomes of transfused patients remains controversial, and studies on patients in the emergency department (ED) are limited. This study aimed to determine the association between RBC storage duration and the outcomes of patients receiving transfusions in the ED. For RBC units issued to ED patients between 2017 and 2022, the storage period of each unit and data on the transfused patients were obtained. Patients were divided into fresh (≤ 7 days) and old (> 7 days) RBC groups, and the associations between storage duration, outcomes, and laboratory changes were evaluated. There was no significant difference between the two groups in 28-day mortality (adjusted odds ratio [OR] 0.91, 95% confidence interval [CI] 0.75-1.10, P = 0.320) or length of stay (fresh 13.5 ± 18.1 vs. old 13.3 ± 19.8, P = 0.814). Regarding changes in laboratory test results, the increase in hemoglobin and hematocrit levels was not affected by storage duration. The study revealed that transfusion of older RBCs is not associated with inferior outcomes or adverse clinical consequences compared with fresh RBCs in ED patients.
Subjects
Blood Preservation, Hospital Emergency Service, Erythrocyte Transfusion, Erythrocytes, Humans, Male, Female, Blood Preservation/methods, Middle Aged, Aged, Time Factors, Length of Stay, Retrospective Studies, Hematocrit, Hemoglobins/metabolism, Hemoglobins/analysis, Adult
ABSTRACT
PURPOSE: Inflammation may play a role in the mechanism of postoperative delirium (POD), a severe complication among older postoperative patients. The purpose of this study was to investigate the risk factors for POD in postoperative patients with hip fracture, especially the inflammation marker neutrophil-to-lymphocyte ratio (NLR). METHODS: This retrospective investigation utilized data from the Seventh Medical Center of the People's Liberation Army. A total of 1,242 eligible patients with hip fracture (829 females; median age 81 years; mean NLR 5.28) were enrolled. Receiver operating characteristic (ROC) curve analysis was performed to identify the optimal NLR cut point for POD. The relationships of NLR with POD occurrence and with POD duration were analyzed by multivariable analysis. RESULTS: The ROC curve showed that the optimal NLR cut point for POD was ≥ 7.6. Multivariate logistic regression analysis showed that NLR ≥ 7.6 (odds ratio [OR] 2.75, 95% confidence interval [CI] 1.51 to 5.02, p = 0.001), stroke (OR 1.05, 95% CI 1.02 to 1.09, p = 0.005), complications, general anesthesia, and long length of stay were risk factors for POD, with the largest effect for NLR ≥ 7.6. NLR ≥ 7.6 (β 0.59, 95% CI 0.209 to 0.886, p = 0.038), older age (β 0.054, 95% CI 0.009 to 0.099, p = 0.019), previous stroke (β 0.908, 95% CI 0.085 to 1.731, p = 0.031), and previous heart failure (β 1.679, 95% CI 0.448 to 2.910, p = 0.008) were associated with longer POD duration. CONCLUSIONS: This study demonstrates an association between NLR and postoperative delirium in geriatric hip fracture patients and contributes new evidence supporting NLR as a potential marker for predicting POD and POD duration.
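An optimal cut point like the one reported above is typically chosen by maximizing the Youden index on the ROC curve. A minimal sketch with hypothetical columns (pod, nlr), not the study's actual code:

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve

df = pd.read_csv("hip_fracture_pod.csv")  # hypothetical columns: pod (0/1), nlr

# ROC analysis of NLR for predicting postoperative delirium; the Youden index
# (sensitivity + specificity - 1) picks the optimal cut point.
fpr, tpr, thresholds = roc_curve(df["pod"], df["nlr"])
youden = tpr - fpr
best_idx = np.argmax(youden)
print(f"optimal NLR cut point: {thresholds[best_idx]:.1f} "
      f"(sensitivity {tpr[best_idx]:.2f}, specificity {1 - fpr[best_idx]:.2f})")
```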
Subjects
Delirium, Hip Fractures, Neutrophils, Postoperative Complications, Humans, Hip Fractures/surgery, Female, Retrospective Studies, Male, Aged 80 and over, Aged, Postoperative Complications/diagnosis, Postoperative Complications/etiology, Postoperative Complications/epidemiology, Delirium/etiology, Delirium/diagnosis, Delirium/epidemiology, Risk Factors, Lymphocytes, ROC Curve
ABSTRACT
BACKGROUND: Previous studies have adequately demonstrated that physical activity or healthy sleep duration can reduce the risk of hypertension. However, the combined effects of physical activity and healthy sleep on hypertension have not been well explored in studies using nationally representative samples. METHODS: The data were obtained from the National Health and Nutrition Examination Survey (2007-2018). Sleep duration and physical activity were obtained from self-reported questionnaires. Survey logistic regression and restricted cubic spline curves were used to evaluate the joint effects of physical activity and healthy sleep duration on hypertension. RESULTS: A total of 18,007 participants were enrolled in the main study. Physical activity was categorized as insufficient (<600 MET-min/week) or sufficient (≥600 MET-min/week). A sleep duration of ≤6 or ≥9 hours was defined as unhealthy, and 7-8 hours as healthy. Compared with individuals with unhealthy sleep duration and insufficient physical activity, only participants with both healthy sleep duration and sufficient physical activity had lower odds of hypertension (adjusted odds ratio 0.76, 95% CI 0.66-0.88), whereas participants with healthy sleep duration but insufficient physical activity, or sufficient physical activity but unhealthy sleep duration, showed no association with hypertension. Physical activity was non-linearly associated with hypertension in the healthy sleep duration group, whereas in the unhealthy sleep duration group physical activity was not associated with hypertension. CONCLUSION: Our findings indicate that the combination of sufficient physical activity and healthy sleep duration was negatively associated with hypertension. This underscores the importance of integrating both sufficient physical activity and healthy sleep duration in strategies aimed at reducing hypertension risk.
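As an illustration of the joint-exposure analysis described above, the sketch below builds the four physical activity / sleep duration categories and fits a logistic model with the doubly unfavourable group as the reference. Column names are hypothetical, and the NHANES survey design (weights, strata, PSUs) is omitted for simplicity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nhanes_pa_sleep.csv")  # hypothetical columns: htn (0/1), met_min_week, sleep_hours, age, sex, bmi

# Build the four joint physical activity / sleep duration categories used in the abstract.
df["pa"] = np.where(df["met_min_week"] >= 600, "sufficient", "insufficient")
df["sleep"] = np.where(df["sleep_hours"].between(7, 8), "healthy", "unhealthy")
df["joint"] = df["pa"] + "_" + df["sleep"]

# Logistic model with the doubly unfavourable group as the reference category.
# (The published analysis used survey weights and design variables, omitted here.)
fit = smf.logit(
    "htn ~ C(joint, Treatment(reference='insufficient_unhealthy')) + age + C(sex) + bmi",
    data=df,
).fit()
print(np.exp(fit.params), np.exp(fit.conf_int()))
```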