ABSTRACT
BACKGROUND: Polypharmacy is increasingly common, yet its longitudinal association with social isolation has not previously been reported. The aim of this study was to explore longitudinal associations of polypharmacy with loneliness and social isolation among older adults. METHODS: Adults aged 60 years and above in southern Sweden were invited to participate. A total of 1526 and 2556 participants were included in the separate analyses for loneliness and social isolation, respectively. Polypharmacy was defined as taking five or more medications. Associations of polypharmacy with the occurrence of loneliness and social isolation were estimated using logistic regression models. RESULTS: During follow-up, 409 and 414 participants developed loneliness and social isolation, respectively. The odds of developing loneliness were higher for participants with polypharmacy than for those without (OR, 1.37; 95% CI, 1.05-1.78; P = 0.020). For participants without polypharmacy, the probability of developing loneliness was 0.28 (95% CI, 0.25-0.31), while for those with polypharmacy this probability was 25% higher (0.35; 95% CI, 0.30-0.39). The odds of developing social isolation were likewise higher for participants with polypharmacy (OR, 1.29; 95% CI, 1.02-1.64; P = 0.036). For participants without polypharmacy, the probability of developing social isolation was 0.16 (95% CI, 0.14-0.18), while for those with polypharmacy this probability was 18% higher (0.19; 95% CI, 0.17-0.22). CONCLUSIONS: Polypharmacy was associated with the occurrence of both loneliness and social isolation among older adults. Consideration of loneliness and social isolation is warranted when caring for older adults taking multiple medications.
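The relation between the reported odds ratios and the fitted probabilities can be illustrated with a short sketch. This is not the study's code; it simply applies the standard odds-to-probability conversion to the figures quoted above (baseline loneliness probability 0.28, OR 1.37):

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    """Convert a baseline probability to odds, scale by an odds ratio,
    and convert back to a probability."""
    odds = p_baseline / (1.0 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1.0 + new_odds)

# Figures reported in the abstract: baseline loneliness probability 0.28, OR 1.37
p_poly = apply_odds_ratio(0.28, 1.37)
print(round(p_poly, 2))  # ≈ 0.35, matching the reported probability under polypharmacy
```

Applying the same conversion to the social-isolation figures (baseline 0.16, OR 1.29) gives about 0.197, close to the reported model-based estimate of 0.19.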
Subjects
Loneliness, Polypharmacy, Humans, Aged, Social Isolation, Logistic Models, Probability
ABSTRACT
Fisher's geometric model provides a powerful tool for making predictions about key properties of Darwinian adaptation. Here, I apply the geometric model to predict differences between the evolution of altruistic versus nonsocial phenotypes. I recover Kimura's prediction that probability of fixation is greater for mutations of intermediate size, but I find that the effect size that maximises probability of fixation is relatively small in the context of altruism and relatively large in the context of nonsocial phenotypes, and that the overall probability of fixation is lower for altruism and is higher for nonsocial phenotypes. Accordingly, the first selective substitution is expected to be smaller, and to take longer, in the context of the evolution of altruism. These results strengthen the justification for employing streamlined social evolutionary methodologies that assume adaptations are underpinned by many genes of small effect.
Subjects
Altruism, Genetic Selection, Biological Evolution, Mathematics, Probability
ABSTRACT
Coastal areas are of paramount importance due to their pivotal role in facilitating a wide range of socio-economic activities and providing vital environmental services. These areas, as the meeting points of land and sea, face significant risks of flooding due to the ongoing rise in sea levels caused by climate change. Additionally, they are susceptible to extreme events such as king tides and large waves in the future. This paper introduces a framework for estimating the extreme total water level (TWL) by considering the effects of regional sea level rise (RSLR) resulting from a warming climate under RCP 8.5. It also incorporates the contributions of high tides, 100-year storm surge, and 100-year wave setup and run-up. The proposed framework is used to evaluate the occurrence of extreme coastal flooding along the Persian Gulf coast of Iran, an area that is home to significant industries in the country. The results indicate an estimated RSLR of 0.23 m from 2020 to 2050 based on an ensemble of climate model projections. Extreme wave setup values are estimated to range between 0.19 and 0.66 m, while storm surge is projected to vary from 0.4 to 1.44 m across the studied coastline. Together, these yield a projected extreme TWL along the coastline of between 3.18 and 3.90 m above the current sea level. This significant increase in sea level could lead to the inundation of approximately 513 km2 of low-lying coastal land, which accounts for about 16% of the studied domain, and could pose a serious flooding threat to people and assets in this region. Finally, a relative ranking of the six flooded zones helps identify the areas with a higher chance of flood exposure, where investment in flood mitigation measures should be prioritized.
Subjects
Floods, Sea Level Rise, Humans, Indian Ocean, Climate Change, Probability
ABSTRACT
Studies designed to estimate the effect of an action in a randomized or observational setting often do not represent a random sample of the desired target population. Instead, estimates from that study can be transported to the target population. However, transportability methods generally rely on a positivity assumption, such that all relevant covariate patterns in the target population are also observed in the study sample. Strict eligibility criteria, particularly in the context of randomized trials, may lead to violations of this assumption. Two common approaches to address positivity violations are restricting the target population and restricting the relevant covariate set. As neither of these restrictions is ideal, we instead propose a synthesis of statistical and simulation models to address positivity violations. We propose corresponding g-computation and inverse probability weighting estimators. The restriction and synthesis approaches to addressing positivity violations are contrasted with a simulation experiment and an illustrative example in the context of sexually transmitted infection testing uptake. In both cases, the proposed synthesis approach accurately addressed the original research question when paired with a thoughtfully selected simulation model. Neither of the restriction approaches was able to accurately address the motivating question. As public health decisions must often be made with imperfect target population information, model synthesis is a viable approach given a combination of empirical data and external information based on the best available knowledge.
Subjects
Sexually Transmitted Infections, Humans, Computer Simulation, Probability
ABSTRACT
Introduction and Objectives: [18F]FDG PET with the Deauville score (DS) is a unique semiquantitative method for evaluating lymphoma. However, the type of reconstruction algorithm used to compute standardized uptake values (SUVmax, SUVmean, and SUVpeak) could affect the DS. We compared the Bayesian penalized likelihood (BPL) reconstruction algorithm with ordered subsets expectation maximization (OSEM) with respect to quantitative parameters and the DS in lymphoma, and investigated the effect of lymph node size on quantitative variation. Patients and Methods: Raw PET data from 255 lymphoma patients were reconstructed separately with Q.Clear (GE Healthcare), a BPL algorithm, and SharpIR (GE Healthcare), an OSEM algorithm. On both reconstructions, each patient's liver, mediastinal blood pool, and the SUVs (SUVmax, SUVmean, and SUVpeak) of a total of 487 selected lesions were measured, and DSmax, DSmean, and DSpeak were compared. Results: The DS increased significantly with BPL (p<0.001), rising to 4-5 in 30 patients scored 1-3 with OSEM. Quantitative values of the lymph nodes increased significantly with BPL (p<0.001), whereas values in the liver, among the reference regions, decreased significantly (p<0.001). In addition, the difference in lymph node values was independently associated with lesion size and was significantly more pronounced in small lesions (p<0.001). The effects of the BPL algorithm were more pronounced on SUVmax than on SUVmean and SUVpeak, and DSmean and DSpeak were less affected by BPL than DSmax. Conclusion: Different reconstruction algorithms in FDG PET/CT affect quantitative evaluation. This variation may change the DS in lymphoma patients and thus affect patient management (AU)
Subjects
Humans, Male, Female, Young Adult, Adult, Middle Aged, Computer-Assisted Image Processing, Lymphoma/diagnostic imaging, Positron Emission Tomography Computed Tomography, Bayes Theorem, Probability, Algorithms
ABSTRACT
Risk assessment of properties and associated populations was conducted for the state of Nebraska, leveraging only open-source datasets. The flood risk framework consisted of interactions among four drivers (hazard, exposure, vulnerability, and response) to assess the risks to properties and associated populations. To quantify hazard at the county scale, we considered properties at risk of flooding based on a flood score (a higher score represents a greater chance of flooding). Exposure was quantified by considering population density at the county level. We quantified vulnerability under four categories: social, ecological, economic, and health. Response, a relatively newer component of flood risk assessment, was quantified under three distinct categories: structural, non-structural, and emergency. Overall, we found that counties in eastern Nebraska (Sarpy, Dakota, Wayne, and Adams) face a higher risk of flooding consequences because more vulnerable assets, such as population and property, are exposed. We also observed that counties in eastern Nebraska are in the process of improving their flood control measures with dams, levees, and higher insurance coverage, which can reduce the risks associated with flooding. The results from this study are anticipated to guide water managers and policymakers in making more effective and locally relevant policies and measures to mitigate flood risks and consequences.
Subjects
Floods, Insurance Coverage, Nebraska, Risk Assessment, Probability
ABSTRACT
A subject's own name (SON) is widely used in both daily life and the clinic. Event-related potential (ERP)-based studies have previously detected several ERP components related to SON processing; however, because most of these studies used the SON as a deviant stimulus, it was not possible to determine whether these components were SON-specific. To identify SON-specific ERP components, we adopted a passive listening task with EEG recording in 25 subjects. The auditory stimuli were the SON, a friend's name (FN), an unfamiliar name (UN) selected from other subjects' names, and seven different unfamiliar names (DUNs). The experimental settings comprised Equal-probabilistic, Frequent-SON, Frequent-FN and Frequent-UN conditions. The results showed that the SON consistently evoked a frontocentral SON-related negativity (SRN) within 210-350 ms under all conditions, which was not detected for the other names. Meanwhile, a late positive potential evoked by the SON was affected by stimulus probability, showing no significant difference between the SON and the other names in the Frequent-SON condition, or between the SON and the FN in the Frequent-UN condition. Taken together, our findings indicate that the SRN is a SON-specific ERP component, suggesting that distinct neural mechanisms underlie the processing of one's own name.
Subjects
Electroencephalography, Names, Humans, Electroencephalography/methods, Acoustic Stimulation/methods, Evoked Potentials/physiology, Probability
ABSTRACT
Background: Complete resection of the tumor and the ipsilateral thyroid lobe at the primary surgery is the "gold standard" for the treatment of parathyroid carcinoma (PC). However, differences in the overall survival (OS) of patients with PC who underwent partial and total surgical resection remain to be determined. Methods: Data on patients with PC who underwent partial and total surgical resection were extracted from the Surveillance, Epidemiology and End Results (SEER) database (2000-2018). The X-tile software (https://medicine.yale.edu/lab/rimm/research/software/) was used to define the optimal cut-off values for continuous variables. The inverse probability of treatment weighting (IPTW) method was used to reduce the selection bias. IPTW-adjusted Kaplan-Meier curves and Cox proportional hazards models were used to compare the OS of patients with PC in the partial and total surgical resection groups. Results: A total of 334 patients with PC were included in this study (183 and 151 in the partial and total surgical resection groups, respectively). The optimal cut-off values for age at diagnosis were 53 and 73 years, respectively, while that for tumor size was 34 mm. In both the Kaplan-Meier analysis and univariable Cox proportional hazards regression analysis before IPTW, the difference in OS between the partial and total surgical resection groups was not statistically significant (p>0.05). These findings were confirmed in the IPTW-adjusted Kaplan-Meier analysis and multivariate Cox proportional hazards regression analysis (p>0.05). Subgroup analysis revealed that total surgical resection was beneficial for OS only in the subgroup with unknown tumor size. Conclusion: There was no significant difference in the prognosis of patients who underwent partial and total surgical resection. This finding may provide a useful reference for the treatment of PC.
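The inverse probability of treatment weighting (IPTW) step mentioned in the methods can be illustrated with a minimal sketch. The propensity scores below are hypothetical; the study estimated them from SEER covariates:

```python
def iptw_weights(treated, propensity):
    """Inverse probability of treatment weighting: each subject is weighted
    by the inverse of the probability of the treatment actually received,
    w_i = Z_i / e(X_i) + (1 - Z_i) / (1 - e(X_i))."""
    return [1.0 / p if z == 1 else 1.0 / (1.0 - p)
            for z, p in zip(treated, propensity)]

# Hypothetical example: two total-resection (1) and two partial-resection (0)
# patients with illustrative propensity scores for total resection
weights = iptw_weights([1, 1, 0, 0], [0.8, 0.4, 0.4, 0.2])
```

The weighted sample behaves like a pseudo-population in which treatment assignment is independent of the measured covariates, which is what allows the IPTW-adjusted Kaplan-Meier and Cox comparisons described above.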
Subjects
Parathyroid Neoplasms, Humans, Kaplan-Meier Estimate, Parathyroid Neoplasms/epidemiology, Parathyroid Neoplasms/surgery, Probability, Prognosis, Proportional Hazards Models, Male, Female, Middle Aged, Aged
ABSTRACT
In the current era, quantum resources are extremely limited, which makes it difficult to use quantum machine learning (QML) models. For supervised tasks, a viable approach is the introduction of a quantum locality technique, which allows a model to focus only on the neighborhood of the element under consideration. A well-known locality technique is the k-nearest neighbors (k-NN) algorithm, of which several quantum variants have been proposed; nevertheless, these have not yet been employed as a preliminary step of other QML models, although for the classical counterpart a performance enhancement over the base models has already been demonstrated. In this paper, we propose and evaluate the idea of exploiting a quantum locality technique to reduce the size and improve the performance of QML models. In detail, we provide (i) a Python implementation of a QML pipeline for local classification and (ii) its extensive empirical evaluation. The quantum pipeline was developed using Qiskit and consists of a quantum k-NN and a quantum binary classifier, both already available in the literature. The results show that the quantum pipeline is equivalent (in terms of accuracy) to its classical counterpart in the ideal case and that locality is a valid technique in the QML realm, but also that the chosen quantum k-NN is strongly sensitive to probability fluctuations and that classical baseline methods such as the random forest perform better.
Subjects
Algorithms, Machine Learning, Probability
ABSTRACT
Background: Today, with the development of industry, accidents caused by the release and explosion of chemical and toxic substances in industrial units have increased, and these accidents sometimes cause irreparable damage to human life and the environment. According to a study by the American Petroleum Institute, of the major accidents in the last 30 years, 44% are related to machinery failure and 12% are caused by unknown factors and lack of information. Equipment risk control is therefore aimed at preventing large and dangerous accidents. In the present study, the performance of the LOPA and fuzzy-LOPA methods was compared for the risk assessment of Imam Khomeini Petrochemical Company under certainty and uncertainty of data, in order to identify a method with high certainty for assessing high-level hazards leading to health and safety risks and environmental pollution. Methods: First, the health and safety hazards and environmental aspects were identified via the HAZOP method. Then, a risk assessment was performed using the LOPA method. After fuzzification, the severity and likelihood of each risk were treated as input variables and risk probability as the output variable. Finally, the two methods were compared, and the Bow-tie software was used to draw a Bow-tie diagram for controlling and reducing the risks. Results: A total of 50 safety and health hazards and 37 environmental aspects were identified in the aromatic outlet of the studied company using the HAZOP method. The most critical risks identified were operational activities in feed and product tanks; flammable-material pumping; blocking of the flare path; and release of H2S gas. The results showed that the production of air pollutants in the power supply unit, disposal of waste from reactor tanks, disposal of waste from condensate tanks, and fire and explosion of the reactor are high-level environmental risks.
Conclusion: Under uncertainty, or in the absence of information on the probability and severity of a risk scenario, errors in the risk assessment were reduced to an acceptable extent by using the fuzzy-LOPA method.
Subjects
Environmental Pollutants, Humans, United States, Clergy, Risk Assessment/methods, Environmental Pollution, Probability
ABSTRACT
This study presents a novel approach for obtaining reliable models and coefficients to estimate the probability of infection caused by common human enteric viruses. The aim is to provide guidance for public health policies in disease prevention and control, by reducing uncertainty and management costs in health risk assessments. Conventional dose-response (DR) models, based on the theory elaborated by Furumoto and Mickey [1], exhibit limitations stemming from the heterogeneity of individual host susceptibilities to infection resulting from ingesting aggregate viruses. Moreover, the scarcity of well-designed viral challenge experiments contributes to significant uncertainty in these DR models. To address these issues, we conducted a review of infection models used in health risk analysis, focusing on Norovirus (NoV) GI.1, pooled Enterovirus group (EV), Poliovirus 1/SM, and Echo-12 virus via contaminated water or food. Using a mechanistic approach, we reevaluated the known DR models and coefficients for the probability of individual host infection in the mentioned viruses based on dose-infection challenge experiments. Specifically, we sought to establish a relationship between the minimum infectious dose (ID) and the ID having a 50% probability of initiating host infection in the same challenge experiment. Furthermore, we developed a new formula to estimate the degree of aggregation of GI.1 NoV at the mean infectious dose. The proposed models, based on "exact" beta-Poisson DR models, effectively predicted infection probabilities from ingestion of both disaggregated and aggregate NoV GI.1. Through a numerical evaluation, we compared the results with the maximum likelihood estimation (MLE) probability obtained from a controlled challenge trial with the NoV GI.1 virus described in the literature, demonstrating the accuracy of our approach. 
By addressing the indetermination of the unmeasured degree of NoV aggregation in each single infectious dose, our models reduce overestimations and uncertainties in microbial risk assessments. This improvement enhances the management of health risks associated with enteric virus infections.
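The dose-response machinery discussed above can be sketched with the widely used closed-form approximation of the beta-Poisson model. The study itself works with the "exact" hypergeometric form and fits virus-specific coefficients; the parameters below are hypothetical and purely illustrative:

```python
def beta_poisson_approx(dose, alpha, beta):
    """Approximate beta-Poisson dose-response model:
    P(infection) = 1 - (1 + dose/beta) ** (-alpha).
    The 'exact' beta-Poisson replaces this closed form with a
    confluent hypergeometric expression."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

def median_infectious_dose(alpha, beta):
    """Dose with a 50% probability of initiating host infection (N50)."""
    return beta * (2.0 ** (1.0 / alpha) - 1.0)

# Hypothetical illustrative parameters, not the coefficients fitted in the study
alpha, beta = 0.5, 100.0
n50 = median_infectious_dose(alpha, beta)        # 300 for these values
p_at_n50 = beta_poisson_approx(n50, alpha, beta)  # 0.5 by construction
```

The N50 relation is exactly the link the authors exploit between the minimum infectious dose and the dose giving a 50% probability of infection in the same challenge experiment.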
Subjects
Enterovirus, Norovirus, Viruses, Humans, Water Pollution, Probability
ABSTRACT
Recently, contrastive learning has gained popularity in the field of unsupervised image-to-image (I2I) translation. In a previous study, a query-selected attention (QS-Attn) module, which employed an attention matrix with a probability distribution, was used to maximize the mutual information between the source and translated images. This module selected significant queries using an entropy metric computed from the attention matrix. However, it often selected many queries with equal significance measures, leading to an excessive focus on the background. In this study, we proposed a dual-learning framework with QS-Attn and convolutional block attention module (CBAM) called object-stable dual contrastive learning generative adversarial network (OS-DCLGAN). In this paper, we utilize a CBAM, which learns what and where to emphasize or suppress, thereby refining intermediate features effectively. This CBAM was integrated before the QS-Attn module to capture significant domain information for I2I translation tasks. The proposed framework outperformed recently introduced approaches in various I2I translation tasks, showing its effectiveness and versatility. The code is available at https://github.com/RedPotatoChip/OSUDL.
Subjects
Learning, Entropy, Probability
ABSTRACT
Symmetry arguments are frequently used-often implicitly-in mathematical modelling of natural selection. Symmetry simplifies the analysis of models and reduces the number of distinct population states to be considered. Here, I introduce a formal definition of symmetry in mathematical models of natural selection. This definition applies to a broad class of models that satisfy a minimal set of assumptions, using a framework developed in previous works. In this framework, population structure is represented by a set of sites at which alleles can live, and transitions occur via replacement of some alleles by copies of others. A symmetry is defined as a permutation of sites that preserves probabilities of replacement and mutation. The symmetries of a given selection process form a group, which acts on population states in a way that preserves the Markov chain representing selection. Applying classical results on group actions, I formally characterize the use of symmetry to reduce the states of this Markov chain, and obtain bounds on the number of states in the reduced chain.
Subjects
Genetic Models, Genetic Selection, Markov Chains, Probability, Mutation
ABSTRACT
BACKGROUND: Idiopathic membranous nephropathy (IMN) is a type of nephrotic syndrome and a leading cause of chronic kidney disease. To our knowledge, no predictive model for assessing the prognosis of IMN is currently available. This study aims to establish a nomogram to predict the probability of remission in patients with IMN and assist clinicians in making treatment decisions. METHODS: A total of 266 patients with histopathology-proven IMN were included in this study. Least absolute shrinkage and selection operator regression was utilized to identify the most important variables. Subsequently, multivariate Cox regression analysis was conducted to construct a nomogram, and bootstrap resampling was employed for internal validation. Receiver operating characteristic and calibration curves and decision curve analysis (DCA) were used to assess the performance and clinical utility of the developed model. RESULTS: A prognostic nomogram was established, incorporating creatinine, glomerular basement membrane thickening, gender, IgG deposition, low-density lipoprotein cholesterol, and fibrinogen. The areas under the curves at 3, 12, and 24 months were 0.751, 0.725, and 0.830 in the training set, and 0.729, 0.730, and 0.948 in the validation set, respectively. These results and the calibration curves demonstrated the good discrimination and calibration of the nomogram in the training and validation sets. Additionally, DCA indicated that the nomogram was useful for remission prediction in clinical settings. CONCLUSION: The nomogram is useful for clinicians to evaluate the prognosis of patients with IMN at an early stage.
Subjects
Membranous Glomerulonephritis, Humans, Nomograms, Kidney Glomerulus, Machine Learning, Probability
ABSTRACT
Traditionally, datasets with multiple censored time-to-events have not been utilized in multivariate analysis because of their high level of complexity. In this paper, we propose the Censored Time Interval Analysis (CTIVA) method to address this issue. It estimates the joint probability distribution of actual event times in the censored dataset by applying a statistical probability density estimation technique to the dataset. Based on the acquired event times, CTIVA investigates variables correlated with the interval time of events via statistical tests. The proposed method handles both categorical and continuous variables simultaneously and is thus suitable for real-world censored time-to-event datasets, which include both. CTIVA outperforms traditional censored time-to-event data handling methods by 5% on simulation data, and its average area under the curve (AUC) on the simulation dataset exceeds 0.9 under various conditions. Further, CTIVA yields novel results on the National Sample Cohort Demo (NSCD) and the proteasome inhibitor bortezomib dataset, real-world censored time-to-event datasets of the medical history of beneficiaries provided by the National Health Insurance Sharing Service (NHISS) and the National Center for Biotechnology Information (NCBI). We believe that the development of CTIVA is a milestone in the investigation of variables correlated with the interval time of events in the presence of censoring.
Subjects
Survival Analysis, Humans, Computer Simulation, Probability, Multivariate Analysis, Time Factors
ABSTRACT
The occurrence of spontaneous bursts of uncontrolled electrical activity between neurons can disrupt normal brain function and lead to epileptic seizures. Despite extensive research, the mechanisms underlying seizure onset remain unclear. This study investigates the onset of seizures from the perspective of nonequilibrium statistical physics. By analyzing the probability flux within the framework of the nonequilibrium potential-flux landscape, we establish a connection between seizure dynamics and nonequilibrium. Our findings demonstrate that the degree of nonequilibrium is sensitive to the onset of epileptic seizures. This result offers an alternative perspective on assessing seizure onset in epilepsy.
Subjects
Epilepsy, Humans, Seizures, Brain, Neurons/physiology, Probability
ABSTRACT
A cerebrospinal fluid (CSF) sample containing no red blood cells (RBCs), colloquially known as a champagne tap, is an ideal outcome of a lumbar puncture (LP). In this pseudoprospective study of 2573 patients aged 0 days to 95 years, we examined, in four age categories (neonates and infants, children and adolescents, adults, and older adults), whether a champagne tap in the patient's first LP procedure and an interval shorter than 1 week between two successive procedures were independently associated with fewer blood-contaminated CSF samples (traumatic LP) in the following procedure. On average, one in five CSF samples from patients' first LP procedures was RBC-free, varying from about 9% in neonates and infants to about 36% in children and adolescents. The mean incidence of champagne taps was 19.5%. According to binary logistic regression, a champagne tap in the previous LP procedure significantly determined whether the following procedure was free of blood contamination. The odds of traumatic LP were halved, or even reduced tenfold, after a champagne tap. Less than a week between two successive procedures, in turn, multiplied the odds of traumatic LP in the latter by more than tenfold. A champagne tap was not significantly associated with traumatic LP in the following procedure among pediatric patients. If the patient's condition or therapy plan permits, and blood contamination could compromise the reliability of the CSF-based analysis and consequent diagnosis, postponing the LP procedure by several days is advisable to improve the odds of obtaining a high-quality CSF sample.
Subjects
Spinal Puncture, Infant, Newborn, Adolescent, Child, Humans, Aged, Spinal Puncture/methods, Reproducibility of Results, Incidence, Probability
ABSTRACT
BACKGROUND: Widely adopted criteria suggest using either low handgrip strength or poor chair stand performance to identify probable sarcopenia. However, there are limited direct comparisons of these measures in relation to important clinical endpoints. We aimed to compare associations between these two measures of probable sarcopenia and all-cause mortality. METHODS: Analyses included 7838 community-dwelling participants (55% women) aged 40-84 years from the seventh survey of the Tromsø Study (2015-2016), with handgrip strength assessed using a Jamar + Digital Dynamometer and a five-repetition chair stand test (5-CST) also undertaken. We generated sex-specific T-scores and categorised these as "not low", "low", and "very low" handgrip strength or 5-CST performance. Cox proportional hazards regression models were used to investigate associations between these two categorised performance scores and time to death (up to November 2020, ascertained from the Norwegian Cause of Death registry), adjusted for potential confounders including lifestyle factors and specific diseases. RESULTS: A total of 233 deaths occurred (median follow-up 4.7 years), with 1- and 5-year mortality rates of 3.1 (95% confidence interval [CI] 2.1, 4.6) and 6.3 (95% CI 5.5, 7.2) per 1000 person-years, respectively. There was poor agreement between the handgrip strength and 5-CST categories for both men (Cohen's kappa [κ] = 0.19) and women (κ = 0.20). Fully adjusted models including handgrip strength and 5-CST performance mutually adjusted for each other showed higher mortality rates among participants with low (hazard ratio [HR] 1.22, 95% CI 0.87, 1.71) and very low (HR 1.68, 95% CI 1.02, 2.75) handgrip strength compared with the not low category. Similar, although stronger, associations were seen for low (HR 1.88, 95% CI 1.38, 2.56) and very low (HR 2.64, 95% CI 1.73, 4.03) 5-CST performance compared with the not low category.
CONCLUSIONS: We found poor agreement between T-score categories for handgrip strength and 5-CST performance and independent associations with mortality. Our findings suggest that these tests identify different people at risk when case-finding probable sarcopenia. As discussions on an international consensus for sarcopenia definitions proceed, testing both handgrip strength and chair stand performance should be recommended rather than viewing these as interchangeable assessments.
Subjects
Hand Strength, Sarcopenia, Male, Female, Humans, Sarcopenia/diagnosis, Sarcopenia/epidemiology, Probability, Consensus, Independent Living
ABSTRACT
Civil aviation transport is an important source of global respiratory disease spread due to its closely spaced environment. To reduce the probability of infection among passengers, an improved Wells-Riley model for cabin passenger risk assessment is presented in this work, taking into account cabin ventilation and passenger nose and mouth orientation. The model's effectiveness was verified against published data. Finally, we assessed how the load factor and the use of an empty-seat scheme are associated with the number of infected people. The results demonstrated that the number of infected people positively correlates with the passenger load factor, and the most suitable load factor can be determined by controlling the final number of infected people according to the epidemic situation in the departure city. Additionally, infection risk was found to be lower among passengers in window seats than among those in aisle or middle seats, and keeping empty seats in the middle or aisle could reduce the cabin-average probability of infection by up to 37.47%. Using the model developed here, airlines can determine the optimal load-factor threshold and seating-arrangement strategy to improve economic benefits and reduce the probability of passenger infection.
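The classic Wells-Riley equation, which the improved model in this abstract extends with ventilation and orientation effects, can be sketched as follows. All input values are hypothetical and for illustration only:

```python
import math

def wells_riley_infection_probability(infectors, quanta_rate, breathing_rate,
                                      exposure_time, ventilation_rate):
    """Classic Wells-Riley model: P = 1 - exp(-I*q*p*t/Q), where
    I is the number of infectors, q the quanta generation rate (quanta/h),
    p the breathing rate (m^3/h), t the exposure time (h), and
    Q the clean-air ventilation rate (m^3/h)."""
    exposure = infectors * quanta_rate * breathing_rate * exposure_time / ventilation_rate
    return 1.0 - math.exp(-exposure)

# Hypothetical cabin values for illustration only
p_inf = wells_riley_infection_probability(
    infectors=1, quanta_rate=14.0, breathing_rate=0.3,
    exposure_time=3.0, ventilation_rate=600.0)  # ≈ 0.021 for these inputs
```

Because P falls as Q rises and climbs with t and I, the equation captures the trade-off the study quantifies: higher load factors place more people near any infector, while empty middle or aisle seats effectively lower each passenger's received dose.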
Subjects
Aviation, Humans, Risk Assessment, Ventilation, Probability
ABSTRACT
BACKGROUND: The past few decades have seen remarkable developments in dose-finding designs for phase I cancer clinical trials. While many of these designs rely on a binary toxicity response, there is an increasing focus on leveraging continuous toxicity responses. A continuous toxicity response pertains to a quantitative measure represented by real numbers. A higher value corresponds not only to an elevated likelihood of side effects for patients but also to an increased probability of treatment efficacy. This relationship between toxicity and dose is often nonlinear, necessitating flexibility in the quest to find an optimal dose. METHODS: A flexible, fully Bayesian dose-finding design is proposed to capitalize on continuous toxicity information, operating under the assumption that the true shape of the dose-toxicity curve is nonlinear. RESULTS: We conduct simulations of clinical trials across varying scenarios of non-linearity to evaluate the operational characteristics of the proposed design. Additionally, we apply the proposed design to a real-world problem to determine an optimal dose for a molecularly targeted agent. CONCLUSIONS: Phase I cancer clinical trials, designed within a fully Bayesian framework with the utilization of continuous toxicity outcomes, offer an alternative approach to finding an optimal dose, providing unique benefits compared to trials designed based on binary toxicity outcomes.