ABSTRACT
In this study, we developed a microfluidic device able to monitor cell biology under continuous PM2.5 treatment. We investigated the effects of PM2.5 on human alveolar basal epithelial cells (A549 cells) and uncovered several significant findings. The results showed that PM2.5 exposure did not lead to a notable decrease in cell viability, indicating that PM2.5 did not cause cellular injury or death. However, PM2.5 exposure increased the invasion and migration abilities of A549 cells, suggesting that PM2.5 might promote cell invasiveness. RNA sequencing revealed 423 genes with significant differential expression in response to PM2.5 exposure, particularly in pathways associated with the generation of reactive oxygen species (ROS) and mitochondrial dysfunction. Real-time detection demonstrated an increase in ROS production in A549 cells after exposure to PM2.5, and the JC-1 assay indicated a loss of mitochondrial membrane potential (ΔΨm) in A549 cells exposed to PM2.5. The disruption of mitochondrial membrane potential further supports the detrimental effects of PM2.5 on A549 cells. Together, these findings highlight several adverse effects of PM2.5 on A549 cells, including enhanced invasion and migration capabilities, altered gene expression related to ROS pathways, increased ROS production, and disruption of mitochondrial membrane potential, and they contribute to our understanding of the potential mechanisms through which PM2.5 can impact cellular function and health.
Subject(s)
Cell Movement, Cell Survival, Lung Neoplasms, Mitochondrial Membrane Potential, Particulate Matter, Reactive Oxygen Species, Humans, Particulate Matter/adverse effects, Reactive Oxygen Species/metabolism, A549 Cells, Lung Neoplasms/pathology, Lung Neoplasms/genetics, Cell Movement/drug effects, Mitochondrial Membrane Potential/drug effects, Cell Survival/drug effects, Lab-on-a-Chip Devices, Mitochondria/metabolism, Mitochondria/drug effects, Neoplasm Invasiveness/genetics, Neoplastic Gene Expression Regulation/drug effects, Microfluidics/methods
ABSTRACT
BACKGROUND: Survival estimation for patients with symptomatic skeletal metastases ideally should be made before a type of local treatment has already been determined. Currently available survival prediction tools, however, were generated using data from patients treated either operatively or with local radiation alone, raising concerns about whether they would generalize well to all patients presenting for assessment. The Skeletal Oncology Research Group machine-learning algorithm (SORG-MLA), trained with institution-based data of surgically treated patients, and the Metastases location, Elderly, Tumor primary, Sex, Sickness/comorbidity, and Site of radiotherapy model (METSSS), trained with registry-based data of patients treated with radiotherapy alone, are two of the most recently developed survival prediction models, but they have not been tested on patients whose local treatment strategy is not yet decided. QUESTIONS/PURPOSES: (1) Which of these two survival prediction models performed better in a mixed cohort made up both of patients who received local treatment with surgery followed by radiotherapy and who had radiation alone for symptomatic bone metastases? (2) Which model performed better among patients whose local treatment consisted of only palliative radiotherapy? (3) Are laboratory values used by SORG-MLA, which are not included in METSSS, independently associated with survival after controlling for predictions made by METSSS? METHODS: Between 2010 and 2018, we provided local treatment for 2113 adult patients with skeletal metastases in the extremities at an urban tertiary referral academic medical center using one of two strategies: (1) surgery followed by postoperative radiotherapy or (2) palliative radiotherapy alone. Every patient's survivorship status was ascertained either by their medical records or the national death registry from the Taiwanese National Health Insurance Administration. After applying a priori designated exclusion criteria, 91% (1920) were analyzed here. Among them, 48% (920) of the patients were female, and the median (IQR) age was 62 years (53 to 70 years). Lung was the most common primary tumor site (41% [782]), and 59% (1128) of patients had other skeletal metastases in addition to the treated lesion(s). In general, the indications for surgery were the presence of a complete pathologic fracture or an impending pathologic fracture, defined as having a Mirels score of ≥ 9, in patients with an American Society of Anesthesiologists (ASA) classification of less than or equal to IV and who were considered fit for surgery. The indications for radiotherapy were relief of pain, local tumor control, prevention of skeletal-related events, and any combination of the above. In all, 84% (1610) of the patients received palliative radiotherapy alone as local treatment for the target lesion(s), and 16% (310) underwent surgery followed by postoperative radiotherapy. Neither METSSS nor SORG-MLA was used at the point of care to aid clinical decision-making during the treatment period. Survival was retrospectively estimated by these two models to test their potential for providing survival probabilities. We first compared SORG to METSSS in the entire population. Then, we repeated the comparison in patients who received local treatment with palliative radiation alone. We assessed model performance by area under the receiver operating characteristic curve (AUROC), calibration analysis, Brier score, and decision curve analysis (DCA). 
The AUROC measures discrimination, which is the ability to distinguish patients with the event of interest (such as death at a particular time point) from those without. AUROC typically ranges from 0.5 to 1.0, with 0.5 indicating random guessing and 1.0 a perfect prediction, and in general, an AUROC of ≥ 0.7 indicates adequate discrimination for clinical use. Calibration refers to the agreement between the predicted outcomes (in this case, survival probabilities) and the actual outcomes, with a perfect calibration curve having an intercept of 0 and a slope of 1. A positive intercept indicates that the actual survival is generally underestimated by the prediction model, and a negative intercept suggests the opposite (overestimation). When comparing models, an intercept closer to 0 typically indicates better calibration. Calibration can also be summarized as log(O:E), the logarithm scale of the ratio of observed (O) to expected (E) survivors. A log(O:E) > 0 signals an underestimation (the observed survival is greater than the predicted survival); and a log(O:E) < 0 indicates the opposite (the observed survival is lower than the predicted survival). A model with a log(O:E) closer to 0 is generally considered better calibrated. The Brier score is the mean squared difference between the model predictions and the observed outcomes, and it ranges from 0 (best prediction) to 1 (worst prediction). The Brier score captures both discrimination and calibration, and it is considered a measure of overall model performance. In Brier score analysis, the "null model" assigns a predicted probability equal to the prevalence of the outcome and represents a model that adds no new information. A prediction model should achieve a Brier score at least lower than the null-model Brier score to be considered as useful. The DCA was developed as a method to determine whether using a model to inform treatment decisions would do more good than harm. It plots the net benefit of making decisions based on the model's predictions across all possible risk thresholds (or cost-to-benefit ratios) in relation to the two default strategies of treating all or no patients. The care provider can decide on an acceptable risk threshold for the proposed treatment in an individual and assess the corresponding net benefit to determine whether consulting with the model is superior to adopting the default strategies. Finally, we examined whether laboratory data, which were not included in the METSSS model, would have been independently associated with survival after controlling for the METSSS model's predictions by using the multivariable logistic and Cox proportional hazards regression analyses. RESULTS: Between the two models, only SORG-MLA achieved adequate discrimination (an AUROC of > 0.7) in the entire cohort (of patients treated operatively or with radiation alone) and in the subgroup of patients treated with palliative radiotherapy alone. SORG-MLA outperformed METSSS by a wide margin on discrimination, calibration, and Brier score analyses in not only the entire cohort but also the subgroup of patients whose local treatment consisted of radiotherapy alone. 
In both the entire cohort and the subgroup, DCA demonstrated that SORG-MLA provided more net benefit compared with the two default strategies (of treating all or no patients) and compared with METSSS when risk thresholds ranged from 0.2 to 0.9 at both 90 days and 1 year, indicating that using SORG-MLA as a decision-making aid was beneficial when a patient's individualized risk threshold for opting for treatment was 0.2 to 0.9. Higher albumin, lower alkaline phosphatase, lower calcium, higher hemoglobin, lower international normalized ratio, higher lymphocytes, lower neutrophils, lower neutrophil-to-lymphocyte ratio, lower platelet-to-lymphocyte ratio, higher sodium, and lower white blood cells were independently associated with better 1-year and overall survival after adjusting for the predictions made by METSSS. CONCLUSION: Based on these discoveries, clinicians might choose to consult SORG-MLA instead of METSSS for survival estimation in patients with long-bone metastases presenting for evaluation of local treatment. Basing a treatment decision on the predictions of SORG-MLA could be beneficial when a patient's individualized risk threshold for opting to undergo a particular treatment strategy ranged from 0.2 to 0.9. Future studies might investigate relevant laboratory items when constructing or refining a survival estimation model because these data demonstrated prognostic value independent of the predictions of the METSSS model, and future studies might also seek to keep these models up to date using data from diverse, contemporary patients undergoing both modern operative and nonoperative treatments. LEVEL OF EVIDENCE: Level III, diagnostic study.
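For readers who want to see how the discrimination and calibration measures described above are computed in practice, the following minimal Python sketch calculates the AUROC, Brier score, null-model Brier score, and log(O:E) for a set of predicted survival probabilities. It is illustrative only; the arrays are toy data, not the study's dataset or the SORG-MLA/METSSS output.

```python
# Illustrative sketch (not the authors' code): discrimination and calibration
# metrics for a binary survival prediction task. y_true = 1 means the patient
# survived to the time point; p_pred holds hypothetical predicted probabilities.
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1])
p_pred = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.8, 0.3, 0.55])

auroc = roc_auc_score(y_true, p_pred)        # discrimination: 0.5 = chance, 1.0 = perfect
brier = brier_score_loss(y_true, p_pred)     # overall performance: lower is better

# Null model: assign every patient the outcome prevalence.
prevalence = y_true.mean()
brier_null = brier_score_loss(y_true, np.full_like(p_pred, prevalence))

# Calibration-in-the-large as log(O:E): observed vs expected survivors.
log_oe = np.log(y_true.sum() / p_pred.sum())  # > 0 means survival was underestimated

print(f"AUROC={auroc:.2f}  Brier={brier:.2f} (null {brier_null:.2f})  log(O:E)={log_oe:+.2f}")
```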
ABSTRACT
BACKGROUND: Bone metastasis in advanced cancer is challenging because of pain, functional issues, and reduced life expectancy. Treatment planning is complex, with consideration of factors such as location, symptoms, and prognosis. Prognostic models help guide treatment choices, with Skeletal Oncology Research Group machine-learning algorithms (SORG-MLAs) showing promise in predicting survival for initial spinal metastases and extremity metastases treated with surgery or radiotherapy. Improved therapies extend patient lifespans, increasing the risk of subsequent skeletal-related events (SREs). Patients experiencing subsequent SREs often suffer from disease progression, indicating a deteriorating condition. For these patients, a thorough evaluation, including accurate survival prediction, is essential to determine the most appropriate treatment and avoid aggressive surgical treatment for patients with a poor survival likelihood. However, some variables in the SORG prediction model, such as tumor histology, visceral metastasis, and previous systemic therapies, might remain consistent between initial and subsequent SREs. Given the prognostic difference between patients with and without a subsequent SRE, the efficacy of established prognostic models, which were originally designed for individuals with an initial SRE, in addressing a subsequent SRE remains uncertain. Therefore, it is crucial to verify the model's utility for subsequent SREs. QUESTION/PURPOSE: We aimed to evaluate the reliability of the SORG-MLAs for survival prediction in patients undergoing surgery or radiotherapy for a subsequent SRE for whom both the initial and subsequent SREs occurred in the spine or extremities. METHODS: We retrospectively included 738 patients who were 20 years or older who received surgery or radiotherapy for initial and subsequent SREs at a tertiary referral center and local hospital in Taiwan between 2010 and 2019. We excluded 74 patients whose initial SRE was in the spine and in whom the subsequent SRE occurred in the extremities and 37 patients whose initial SRE was in the extremities and the subsequent SRE was in the spine. The rationale was that different SORG-MLAs were exclusively designed for patients who had an initial spine metastasis and those who had an initial extremity metastasis, irrespective of whether they experienced metastatic events in other areas (for example, a patient experiencing an extremity SRE before his or her spinal SRE would also be regarded as a candidate for an initial spinal SRE). Because these patients were already validated in previous studies, we excluded them to avoid overestimating our results. Five patients with malignant primary bone tumors and 38 patients in whom the metastasis's origin could not be identified were excluded, leaving 584 patients for analysis. The 584 included patients were categorized into two subgroups based on the location of initial and subsequent SREs: the spine group (68% [399]) and extremity group (32% [185]). No patients were lost to follow-up. Patient data at the time they presented with a subsequent SRE were collected, and survival predictions at this timepoint were calculated using the SORG-MLAs. Multiple imputation with the missForest technique was conducted five times to impute the missing proportions of each predictor.
The effectiveness of SORG-MLAs was gauged through several statistical measures, including discrimination (measured by the area under the receiver operating characteristic curve [AUC]), calibration, overall performance (Brier score), and decision curve analysis. Discrimination refers to the model's ability to differentiate between those with the event and those without the event. An AUC ranges from 0.5 to 1.0, with 0.5 indicating the worst discrimination and 1.0 indicating perfect discrimination. An AUC of 0.7 is considered clinically acceptable discrimination. Calibration is the comparison between the frequency of observed events and the predicted probabilities. In an ideal calibration, the observed and predicted survival rates should be congruent. The logarithm of the observed-to-expected survival ratio [log(O:E)] offers insight into the model's overall calibration by considering the total number of observed (O) and expected (E) events. The Brier score measures the mean squared difference between the predicted probability of possible outcomes for each individual and the observed outcomes, ranging from 0 to 1, with 0 indicating perfect overall performance and 1 indicating the worst performance. Moreover, the prevalence of the outcome should be considered, so a null-model Brier score was also calculated by assigning a probability equal to the prevalence of the outcome (in this case, the actual survival rate) to each patient. The benefit of the prediction model is determined by comparing its Brier score with that of the null model. If a prediction model's Brier score is lower than the null model's Brier score, the prediction model is deemed to have good performance. A decision curve analysis was performed for the models to evaluate the "net benefit," which weighs the true positive rate against the false positive rate across a range of "threshold probabilities," that is, the risk-to-benefit ratio of an intervention derived from a comprehensive clinical evaluation and a well-discussed shared decision-making process. A good predictive model should yield a higher net benefit than the default strategies (treating all patients and treating no patients) across a range of threshold probabilities. RESULTS: For the spine group, the algorithms displayed acceptable AUC results (median AUCs of 0.69 to 0.72) for 42-day, 90-day, and 1-year survival predictions after treatment for a subsequent SRE. In contrast, the extremity group showed median AUCs ranging from 0.65 to 0.73 for the corresponding survival periods. All Brier scores were lower than those of their null model, indicating the SORG-MLAs' good overall performances for both cohorts. The SORG-MLAs yielded a net benefit for both cohorts; however, they overestimated 1-year survival probabilities in patients with a subsequent SRE in the spine, with a median log(O:E) of -0.60 (95% confidence interval -0.77 to -0.42). CONCLUSION: The SORG-MLAs maintain satisfactory discriminatory capacity and offer considerable net benefits through decision curve analysis, indicating their continued viability as prediction tools in this clinical context. However, the algorithms overestimate 1-year survival rates for patients with a subsequent SRE of the spine, warranting consideration of specific patient groups. Clinicians and surgeons should exercise caution when using the SORG-MLAs for survival prediction in these patients and remain aware of potential mispredictions when tailoring treatment plans, with a preference for less invasive treatments.
Ultimately, this study emphasizes the importance of enhancing prognostic algorithms and developing innovative tools for patients with subsequent SREs as the life expectancy in patients with bone metastases continues to improve and healthcare providers will encounter these patients more often in daily practice. LEVEL OF EVIDENCE: Level III, prognostic study.
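As a rough illustration of the missForest-style multiple imputation mentioned in the Methods, the sketch below uses scikit-learn's IterativeImputer with a random-forest estimator, which approximates the missForest idea; the study itself used the missForest algorithm, and the column names and values here are invented for demonstration only.

```python
# Hedged sketch of missForest-style imputation: an iterative, random-forest-based
# imputer. Columns and values are hypothetical laboratory items, not study data.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (enables the API)
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

df = pd.DataFrame({
    "albumin":    [3.8, np.nan, 2.9, 4.1, 3.5],
    "alp":        [88, 210, np.nan, 95, 300],
    "lymphocyte": [1.6, 0.9, 1.1, np.nan, 2.0],
})

imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=100, random_state=0),
    max_iter=10,
    random_state=0,
)
df_imputed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
print(df_imputed.round(2))
```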
Subject(s)
Bone Neoplasms, Humans, Bone Neoplasms/secondary, Bone Neoplasms/mortality, Male, Female, Middle Aged, Aged, Retrospective Studies, Reproducibility of Results, Machine Learning, Adult, Prognosis, Predictive Value of Tests, Disease Progression, Risk Assessment, Decision Support Techniques, Risk Factors
ABSTRACT
Wolbachia are among the most prevalent and widespread endosymbiotic bacteria on Earth. Wolbachia's success in infecting an enormous number of arthropod species is attributed to two features: the range of phenotypes they induce in their hosts, and their ability to switch between host species. Whilst much progress has been made in elucidating their induced phenotypes, our understanding of Wolbachia host-shifting is still very limited: we lack answers to even fundamental questions concerning Wolbachia's routes of transfer and the importance of factors influencing host shifts. Here, we investigate the diversity and host-shift patterns of Wolbachia in scale insects, a group of arthropods with intimate associations with other insects that make them well suited to studying host shifts. Using Illumina multitarget amplicon sequencing of Wolbachia-infected scale insects and their direct associates we determined the identity of all Wolbachia strains. We then fitted a generalized additive mixed model to our data to estimate the influence of host phylogeny and the geographical distribution on Wolbachia strain sharing among scale insect species. The model predicts no significant contribution of host geography but strong effects of host phylogeny, with high rates of Wolbachia sharing among closely related species and a sudden drop-off in sharing with increasing phylogenetic distance. We also detected the same Wolbachia strain in scale insects and several intimately associated species (ants, wasps and flies). This indicates putative host shifts and potential routes of transfers via these associates and highlights the importance of ecological connectivity in Wolbachia host-shifting.
Subject(s)
Hemiptera, Wolbachia, Animals, Hemiptera/microbiology, Insects/genetics, Phylogeny, Symbiosis/genetics, Wasps/genetics, Wolbachia/genetics
ABSTRACT
OBJECTIVE: Both electrocardiographic and echocardiographic left ventricular hypertrophy (LVH) have been reported to be associated with greater carotid intima-media thickness (cIMT), a marker of subclinical atherosclerosis, in patients with hypertension, while the associations are unclear in physically fit young adults. METHODS: A total of 1822 Taiwanese military personnel, aged 18-40 years, received an annual health examination including electrocardiography (ECG) and echocardiography in 2018-2020. Left carotid bulb cIMT was measured by high-resolution ultrasonography. Multiple logistic regression analysis with adjustments for age, sex, smoking, alcohol consumption, body mass index, mean blood pressure, and physical fitness was used to determine the associations between echocardiographic and ECG parameters and the highest quintile of cIMT (≥0.8 mm). RESULTS: Cornell-based LVH, Myers et al.-based RVH and heart rate ≥75/min were associated with cIMT ≥0.8 mm [odds ratios (ORs) and 95% confidence intervals: 1.54 (1.01, 2.35), 1.66 (1.18, 2.33), and 1.39 (1.06, 1.83), respectively], while echocardiographic LVH defined as ≥46.0 g/m2.7 for men and ≥38.0 g/m2.7 for women was inversely associated with cIMT ≥0.8 mm [OR: 0.45 (0.24, 0.86)]. CONCLUSION: In military tactical athletes, the associations of ECG-based and echocardiographic LVH with cIMT were in opposite directions. Higher physical fitness may cause cardiac muscle hypertrophy and reduce atherosclerosis severity, possibly leading to the paradoxical echocardiographic finding. This study suggests that ECG-based LVH remains a good marker of subclinical atherosclerosis in our military population.
Subject(s)
Atherosclerosis, Electrocardiography, Male, Young Adult, Humans, Female, Carotid Intima-Media Thickness, Echocardiography, Left Ventricular Hypertrophy, Atherosclerosis/complications, Atherosclerosis/diagnostic imaging, Risk Factors
ABSTRACT
BACKGROUND: The Skeletal Oncology Research Group machine-learning algorithm (SORG-MLA) was developed to predict the survival of patients with spinal metastasis. The algorithm was successfully tested in five international institutions using 1101 patients from different continents. The incorporation of 18 prognostic factors strengthens its predictive ability but limits its clinical utility because some prognostic factors might not be clinically available when a clinician wishes to make a prediction. QUESTIONS/PURPOSES: We performed this study to (1) evaluate the SORG-MLA's performance in the presence of missing data and (2) develop an internet-based application to impute the missing data. METHODS: A total of 2768 patients were included in this study. The data of 617 patients who were treated surgically were intentionally erased, and the data of the other 2151 patients who were treated with radiotherapy and medical treatment were used to impute the artificially missing data. Compared with those who were treated nonsurgically, patients undergoing surgery were younger (median 59 years [IQR 51 to 67 years] versus median 62 years [IQR 53 to 71 years]) and had a higher proportion of patients with at least three spinal metastatic levels (77% [474 of 617] versus 72% [1547 of 2151]), more neurologic deficit (normal American Spinal Injury Association [E] 68% [301 of 443] versus 79% [1227 of 1561]), higher BMI (23 kg/m² [IQR 20 to 25 kg/m²] versus 22 kg/m² [IQR 20 to 25 kg/m²]), higher platelet count (240 × 10³/µL [IQR 173 to 327 × 10³/µL] versus 227 × 10³/µL [IQR 165 to 302 × 10³/µL]), higher lymphocyte count (15 × 10³/µL [IQR 9 to 21 × 10³/µL] versus 14 × 10³/µL [IQR 8 to 21 × 10³/µL]), lower serum creatinine level (0.7 mg/dL [IQR 0.6 to 0.9 mg/dL] versus 0.8 mg/dL [IQR 0.6 to 1.0 mg/dL]), less previous systemic therapy (19% [115 of 617] versus 24% [526 of 2151]), fewer Charlson comorbidities other than cancer (28% [170 of 617] versus 36% [770 of 2151]), and longer median survival. The two patient groups did not differ in other regards. These findings aligned with our institutional philosophy of selecting patients for surgical intervention based on their higher levels of favorable prognostic factors such as BMI or lymphocyte counts and lower levels of unfavorable prognostic factors such as white blood cell counts or serum creatinine level, as well as the degree of spinal instability and severity of neurologic deficits. This approach aims to identify patients with better survival outcomes and prioritize their surgical intervention accordingly. Seven factors (serum albumin and alkaline phosphatase levels, international normalized ratio, lymphocyte and neutrophil counts, and the presence of visceral or brain metastases) were considered possible missing items based on five previous validation studies and clinical experience. Artificially missing data were imputed using the missForest imputation technique, which was previously applied and successfully tested to fit the SORG-MLA in validation studies. Discrimination, calibration, overall performance, and decision curve analysis were applied to evaluate the SORG-MLA's performance. The discrimination ability was measured with an area under the receiver operating characteristic curve. It ranges from 0.5 to 1.0, with 0.5 indicating the worst discrimination and 1.0 indicating perfect discrimination. An area under the curve of 0.7 is considered clinically acceptable discrimination. Calibration refers to the agreement between the predicted outcomes and actual outcomes.
An ideal calibration model will yield predicted survival rates that are congruent with the observed survival rates. The Brier score measures the mean squared difference between the actual outcome and the predicted probability, which captures calibration and discrimination ability simultaneously. A Brier score of 0 indicates perfect prediction, whereas a Brier score of 1 indicates the poorest prediction. A decision curve analysis was performed for the 6-week, 90-day, and 1-year prediction models to evaluate their net benefit across different threshold probabilities. Using the results from our analysis, we developed an internet-based application that facilitates real-time data imputation for clinical decision-making at the point of care. This tool allows healthcare professionals to efficiently and effectively address missing data, ensuring that patient care remains optimal at all times. RESULTS: Generally, the SORG-MLA demonstrated good discriminatory ability, with areas under the curve greater than 0.7 in most cases, and good overall performance, with up to 25% improvement in Brier scores in the presence of one to three missing items. The only exceptions were albumin level and lymphocyte count, because the SORG-MLA's performance was reduced when these two items were missing, indicating that the SORG-MLA might be unreliable without these values. The model tended to underestimate the patient survival rate. As the number of missing items increased, the model's discriminatory ability was progressively impaired, and a marked underestimation of patient survival rates was observed. Specifically, when three items were missing, the number of actual survivors was up to 1.3 times greater than the number of expected survivors, whereas only a 10% discrepancy was observed when only one item was missing. When either two or three items were omitted, the decision curves exhibited substantial overlap, indicating a lack of consistent disparities in performance. This finding suggests that the SORG-MLA consistently generates accurate predictions, regardless of which two or three items are omitted. We developed an internet application (https://sorg-spine-mets-missing-data-imputation.azurewebsites.net/) that allows the use of the SORG-MLA with up to three missing items. CONCLUSION: The SORG-MLA generally performed well in the presence of one to three missing items, except for serum albumin level and lymphocyte count (which are essential for adequate predictions, even using our modified version of the SORG-MLA). We recommend that future studies develop prediction models that allow for their use when there are missing data, or provide a means to impute those missing data, because some data are not available at the time a clinical decision must be made. CLINICAL RELEVANCE: The results suggest the algorithm could be helpful when a radiologic evaluation cannot be performed in time owing to a lengthy waiting period, especially in situations when an early operation could be beneficial. It could help orthopaedic surgeons decide whether to intervene palliatively or extensively, even when the surgical indication is clear.
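The decision curve analysis referred to in this and the preceding abstracts can be summarized by the net-benefit formula NB(pt) = TP/N - FP/N × pt/(1 - pt), evaluated across risk thresholds pt. The sketch below (toy data, not the SORG-MLA pipeline) computes the net benefit of acting on a model's predictions and of the treat-all default at several thresholds.

```python
# Minimal sketch of a decision curve analysis: net benefit of a prediction model
# versus the "treat all" default across risk thresholds. Data are illustrative.
import numpy as np

def net_benefit(y_true, p_pred, threshold):
    """Net benefit = TP/N - FP/N * (pt / (1 - pt)) at risk threshold pt."""
    n = len(y_true)
    treat = p_pred >= threshold
    tp = np.sum(treat & (y_true == 1))
    fp = np.sum(treat & (y_true == 0))
    return tp / n - fp / n * threshold / (1.0 - threshold)

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])                   # hypothetical events
p_pred = np.array([0.85, 0.30, 0.65, 0.70, 0.20, 0.45, 0.90, 0.10])

for pt in np.arange(0.2, 0.91, 0.1):
    nb_model = net_benefit(y_true, p_pred, pt)
    nb_treat_all = net_benefit(y_true, np.ones_like(p_pred), pt)  # default: treat everyone
    print(f"threshold {pt:.1f}: model {nb_model:+.3f}, treat-all {nb_treat_all:+.3f}")
```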
ABSTRACT
The association between intravascular photobiomodulation (iPBM) and crossed cerebellar diaschisis (CCD) and cognitive dysfunction in patients with traumatic brain injury (TBI) remains unknown. We postulate that iPBM might enable greater neurologic improvements. The objective of this study was to evaluate the clinical impact of iPBM on the prognosis of patients with TBI. In this longitudinal study, patients who were diagnosed with TBI were recruited. CCD was identified from brain perfusion images when the uptake difference between the two cerebellar hemispheres was > 20%. Thus, two groups were identified: CCD(+) and CCD(-). All patients received general traditional physical therapy and three courses of iPBM (helium-neon laser illuminator, 632.8 nm). Treatment sessions were conducted on weekdays for 2 consecutive weeks, constituting a single treatment course. Three courses of iPBM were performed over 2-3 months, with 1-3 weeks of rest between each course. The outcomes were measured using the Rancho Los Amigos Levels of Cognitive Functioning (LCF) tool. The chi-square test was used to compare categorical variables. Generalized estimating equations were used to assess the associations of various effects between the two groups, with p < 0.05 indicating a statistically significant difference. Thirty patients were included and classified into the CCD(+) and CCD(-) groups (n = 15 per group). Before iPBM, the CCD of the CCD(+) group was 2.74 (exp(1.0081)) times that of the CCD(-) group (p = 0.1632). After iPBM, the CCD of the CCD(+) group was 0.64 (exp(-0.4436)) times that of the CCD(-) group (p < 0.0001). Cognitive assessment revealed that, before iPBM, the CCD(+) group had a non-significantly lower LCF score (by 0.1030) than the CCD(-) group (p = 0.1632). Similarly, the CCD(+) group had a non-significantly higher score (by 0.0013) than the CCD(-) group after iPBM treatment (p = 0.7041), indicating no significant difference between the CCD(+) and CCD(-) groups following iPBM and general physical therapy. CCD was less likely to appear in iPBM-treated patients. Additionally, iPBM was not associated with the LCF score. iPBM could be applied in patients with TBI to reduce the occurrence of CCD. Although the study failed to show differences in cognitive function after iPBM, the treatment may still serve as an alternative non-pharmacological intervention.
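As a hedged illustration of the generalized estimating equation approach mentioned above, the sketch below fits a GEE with an exchangeable working correlation to hypothetical repeated LCF measurements clustered within patients using statsmodels. The data frame, group labels, and scores are invented for demonstration and do not correspond to the study cohort.

```python
# Illustrative GEE sketch: repeated LCF scores clustered within patients, with
# group (CCD+/CCD-) and time (pre/post iPBM) effects. Hypothetical toy data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "patient": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "group":   ["CCDpos"] * 6 + ["CCDneg"] * 6,
    "time":    ["pre", "post"] * 6,
    "lcf":     [4, 6, 3, 5, 5, 6, 5, 7, 6, 7, 4, 6],
})

model = smf.gee(
    "lcf ~ C(group) * C(time)",          # group, time, and their interaction
    groups="patient",                     # repeated measures clustered by patient
    data=df,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())
```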
Subject(s)
Traumatic Brain Injuries, Cognitive Dysfunction, Diaschisis, Endovascular Procedures, Low-Level Light Therapy, Humans, Traumatic Brain Injuries/physiopathology, Traumatic Brain Injuries/radiotherapy, Cognitive Dysfunction/physiopathology, Cognitive Dysfunction/therapy, Diaschisis/physiopathology, Diaschisis/radiotherapy, Longitudinal Studies, Low-Level Light Therapy/methods, Treatment Outcome, Male, Female, Adult, Middle Aged
ABSTRACT
Subjects with coronary artery disease (CAD) have myocardial ischemia and associated abnormal left ventricular ejection fraction (EF). Heart failure with mildly reduced EF (41-49%) (HFmrEF) is a new subgroup of EF for heart failure. Although prognostic factors for CAD and HF with reduced EF are well known, fewer studies have been conducted on factors related to the survival of patients with CAD and HFmrEF. We recruited study subjects with significant CAD and HFmrEF from our cardiac catheterization data bank. Data were recorded from traceable chart records from our hospital. All-cause and cardiovascular mortality were recorded until December 2019 and served as the follow-up outcomes. A total of 348 subjects with CAD and HFmrEF were analyzed. The median duration of follow-up was 37 months. Seventy-eight subjects died during the follow-up period, and 30 of them died of cardiovascular causes. In univariate analyses, those who died were older, had a lower estimated glomerular filtration rate (eGFR) (47 ± 30 versus 71 ± 30 mL/minute/1.73 m², P < 0.001), and had lower usage of percutaneous coronary intervention (PCI) and beta blockers. In the Cox survival regression analysis, a higher eGFR (hazard ratio 0.980, P < 0.001) was protective, while older age and a higher serum total cholesterol (hazard ratio 1.006, P = 0.048) were related to all-cause mortality for CAD with HFmrEF. Furthermore, a higher eGFR was also associated with lower cardiovascular mortality. In conclusion, for subjects with CAD and HFmrEF, a higher eGFR was protective and associated with lower all-cause and cardiovascular mortality.
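For illustration, a Cox proportional hazards analysis of the kind reported above can be set up with the lifelines library as follows; the follow-up times, events, and covariate values are hypothetical and do not reproduce the study data.

```python
# Hedged sketch of a Cox proportional hazards model with lifelines. The columns
# (followup_months, died, egfr, age, chol) and their values are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_months": [37, 12, 60, 24, 48, 6, 55, 30, 18, 42],
    "died":            [0, 1, 0, 1, 0, 1, 0, 1, 1, 0],
    "egfr":            [75, 40, 88, 65, 70, 25, 55, 50, 45, 80],
    "age":             [62, 75, 58, 60, 65, 78, 55, 70, 72, 68],
    "chol":            [180, 220, 170, 200, 195, 230, 160, 210, 190, 215],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_months", event_col="died")
cph.print_summary()   # hazard ratios (exp(coef)) for each covariate
```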
Subject(s)
Coronary Artery Disease, Heart Failure, Percutaneous Coronary Intervention, Humans, Coronary Artery Disease/complications, Stroke Volume, Left Ventricular Function, Glomerular Filtration Rate, Prognosis, Death
ABSTRACT
BACKGROUND: Although cognitive-behavioral therapy is the first-line treatment for insomnia, pharmacotherapy is often prescribed to treat insomnia and related symptoms. In addition, muscle relaxants are commonly prescribed to alleviate muscle soreness when the pain is unbearable. However, pharmacotherapy can lead to numerous side effects. The non-drug strategy of intravascular laser irradiation of blood (iPBM) has been advocated to improve pain, wound healing, blood circulation, and blood cell function and thereby relieve insomnia and muscle soreness symptoms. Therefore, we assessed whether iPBM improves blood parameters and compared drug use before and after iPBM therapy. METHODS: Consecutive patients who received iPBM therapy between January 2013 and August 2021 were reviewed. The associations between laboratory data, pharmacotherapies, and iPBM therapy were retrospectively analyzed. We compared patient characteristics, blood parameters, and drug use within the three months before the first treatment and the three months after the last treatment. We also compared the changes before and after treatment in patients who received ≥10 or 1-9 iPBM treatments. RESULTS: We assessed 183 eligible patients who received iPBM treatment. Of them, 18 patients reported insomnia disturbance, and 128 patients reported pain in any part of their body. HGB and HCT significantly increased after treatment in both the ≥10 and 1-9 iPBM treatment groups (HGB p < 0.001 and p = 0.046; HCT p < 0.001 and p = 0.029, respectively). Pharmacotherapy analysis revealed no significant differences in drug use before and after treatment, though drug use tended to decrease after iPBM. CONCLUSIONS: iPBM therapy is an efficient, beneficial, and feasible treatment that increases HGB and HCT. While the results of this study do not support the suggestion that iPBM reduces drug use, further larger studies using symptom scales are needed to confirm the changes in insomnia and muscle soreness after iPBM treatment.
Subject(s)
Low-Level Light Therapy, Myalgia, Sleep Initiation and Maintenance Disorders, Retrospective Studies, Sleep Initiation and Maintenance Disorders/radiotherapy, Myalgia/radiotherapy, Humans, Taiwan, Male, Female, Adult, Middle Aged, Aged, Hematologic Tests, Neuromuscular Agents, Hypnotics and Sedatives, Analgesics
ABSTRACT
Wolbachia is one of the most successful endosymbiotic bacteria of arthropods. Known as the 'master of manipulation', Wolbachia can induce a wide range of phenotypes in its host that can have far-reaching ecological and evolutionary consequences and may be exploited for disease and pest control. However, our knowledge of Wolbachia's distribution and infection rates is uneven across arthropod groups such as scale insects. We fitted a distribution of within-species prevalence of Wolbachia to our data and compared it to distributions fitted to an up-to-date dataset compiled from surveys across all arthropods. The estimated distribution parameters indicate a Wolbachia infection frequency of 43.6% (at a 10% prevalence threshold) in scale insects. Prevalence of Wolbachia in scale insects follows a distribution similar to exponential decline (most species are predicted to have low-prevalence infections), in contrast to the U-shaped distribution estimated for other taxa (most species have a very low or very high prevalence). We observed no significant associations between Wolbachia infection and scale insect traits. Finally, we screened for Wolbachia in the scale insects' ecological associates. We found a positive correlation between Wolbachia infection in scale insects and their ant associates, pointing to a possible route of horizontal transfer of Wolbachia.
Subject(s)
Ants, Arthropods, Hemiptera, Wolbachia, Animals, Biological Evolution, Symbiosis, Wolbachia/genetics
ABSTRACT
BACKGROUND: Stroke is a burdensome cerebral event that affects many aspects of daily activities such as motion, speech, memory, vision, and cognition. Intravascular laser irradiation of blood (ILIB) is a novel therapy, going beyond conventional rehabilitation modalities, that is effective in stroke recovery. Homocysteine is an important risk factor associated with stroke. However, there are few studies that examine the relationship between ILIB treatment and the level of homocysteine. In recent years, researchers have used single-photon emission computed tomography (SPECT) scans of the brain to evaluate stroke patients and patients with a neurologic deficit. The present report investigates the clinical effect of ILIB treatment on the level of serum homocysteine, the perfusion change of the impaired brain region on SPECT, and the patient's neurologic presentation. CASE PRESENTATION: We present the case of a 62-year-old man with subacute stroke accompanied by left hemiparesis and hyperhomocysteinemia, who showed dramatic improvement in muscle power, a decreasing level of homocysteine, and increased blood flow in the right cerebral hemisphere after three courses of ILIB treatment. CONCLUSION: We found that ILIB was effective in lowering the serum level of homocysteine and facilitating cerebral circulation in this patient with subacute stroke.
Subject(s)
Homocysteine, Stroke, Brain/blood supply, Cerebrovascular Circulation, Humans, Ischemia, Lasers, Male, Middle Aged, Paresis/complications, Perfusion/adverse effects, Stroke/complications, Stroke/diagnostic imaging, Single-Photon Emission Computed Tomography/methods
ABSTRACT
AIM: To clarify the association between systemic and hepatic inflammation and localized periodontitis, which has been reported to vary among races. MATERIALS AND METHODS: The study included 1112 military males, aged 18-40 years, in Taiwan. Participants were classified as periodontally healthy/stage I (n = 796) or stage II/III periodontitis (n = 316), according to the 2017 world workshop criteria. Systemic and hepatic inflammation were defined by the highest tertiles of blood leukocyte counts (7.51 × 10³/µl) and alanine aminotransferase (30 U/L), respectively. Multiple logistic regression analysis with adjustments for age, metabolic syndrome, betel nut consumption and smoking was carried out. RESULTS: There was a significant association between high systemic inflammation, irrespective of hepatic inflammation severity, and localized stage II/III periodontitis (odds ratio [OR], 1.62 [1.09-2.42] and 1.47 [1.00-2.15], respectively, in the presence of high or no hepatic inflammation). However, no significant association was found among participants with low systemic inflammation, irrespective of the severity of hepatic inflammation (OR, 1.31 [0.91-1.91]). CONCLUSIONS: An association between hepatic inflammation and localized periodontitis in Taiwanese adults was observed only if systemic inflammation coexisted, possibly accounting for the reported differences in the association between Japanese and non-Asian populations in prior studies.
Subject(s)
Metabolic Syndrome, Periodontitis, Adolescent, Adult, Alanine Transaminase, Humans, Inflammation, Male, Oral Health, Periodontitis/epidemiology, Young Adult
ABSTRACT
BACKGROUND: Prior studies have shown an association between generalized periodontitis and anemia in older or undernourished adults. The aim of this study was to examine the associations of erythrocyte indices with localized periodontitis in robust young adults, which has never been reported before. METHODS: The study included 1286 military participants, aged 19-40 years, with regular exercise training in Hualien, Taiwan. Localized periodontitis was grouped into healthy/stage I and stage II/III (n = 803 and 325 in men; n = 130 and 28 in women) according to the 2017 criteria of the world workshop. Systemic inflammation was evaluated by leukocyte counts. Multiple logistic regression analysis with adjustment for age, tobacco smoking status, betel nut chewing status, body mass index and leucocyte counts was used to determine the associations. RESULTS: Greater mean corpuscular volume in young men [odds ratio (OR) and 95% confidence intervals 1.03 (1.01-1.06)], and greater hematocrit and hemoglobin levels in young women were associated with a higher risk of localized stage II/III periodontitis [OR: 1.17 (1.02-1.34) and 1.60 (1.06-2.41), respectively]. However, there were no associations for erythrocyte counts. CONCLUSIONS: The risk of localized stage II/III periodontitis increased with greater erythrocyte indices in robust young adults. This finding could be explained in part by the possibility that localized periodontitis promotes physical stress, resulting in an increase in erythrocyte indices. On the other hand, greater physical fitness associated with a lower risk of periodontitis may consume iron stores in the body, leading to exercise-induced anemia or smaller erythrocyte volume.
Subject(s)
Anemia, Erythrocyte Indices, Military Personnel, Periodontitis, Anemia/blood, Cross-Sectional Studies, Female, Hemoglobins, Humans, Iron, Male, Oral Health, Periodontitis/blood, Periodontitis/classification, Young Adult
ABSTRACT
AIM: Oral health and ocular diseases may be associated with collagen defects and inflammation status. However, the results from prior studies are conflicting. The aim of this study was to explore the association of dental caries and periodontitis with myopia in young adults. MATERIALS AND METHODS: A total of 938 military personnel aged 19-39 years receiving both oral and eye examinations from 2018 through 2020 were included in this study in Taiwan. The severity of myopia was graded as no myopia (diopters > -0.5, N = 459), low myopia (diopters: -0.5 to -5.9, N = 225) and high myopia (diopters ≤ -6.0, N = 254). A multiple logistic regression analysis with adjustments for age, body mass index, systolic blood pressure, smoking, alcohol consumption, missing teeth numbers, blood leucocyte counts, triglycerides, high-density lipoprotein, and uric acid was used to determine the associations of active dental caries, filled teeth and stage II/III periodontitis with myopia. RESULTS: The presence of any active dental caries was significantly associated with a higher risk of any myopia (low or high) (odds ratio [OR] and 95% confidence intervals [95% CI] 1.42 [1.04-1.94]), whereas there was no association for filled teeth. Moreover, the association for stage II/III periodontitis was observed only with high myopia (OR: 1.52 [1.07-2.15]) and not with low myopia. CONCLUSIONS: Our findings suggest that only active dental caries and a higher severity of periodontitis were associated with myopia among young adults, highlighting the dental inflammation status of the oral cavity as a potential link to ocular diseases.
Subject(s)
Dental Caries, Periodontitis, Cross-Sectional Studies, Dental Caries/complications, Dental Caries/etiology, Humans, Inflammation, Oral Health, Periodontitis/complications, Periodontitis/epidemiology, Young Adult
ABSTRACT
Background and Objectives: The impact of direct-acting antiviral (DAA)-based regimens on the recurrence of hepatocellular carcinoma (HCC) after successful curative hepatectomy is controversial. Aims: This study aimed to assess the association between DAA treatment and recurrence risk in HCC after resection. Materials and Methods: We retrospectively assessed 152 cases of early stage (BCLC stage 0/A) hepatitis C virus (HCV)-related HCC (HCV-HCC) that underwent resection with curative intent between 2001 and 2019 at Kaohsiung Chang Gung Memorial Hospital; 48 cases achieved a sustained virological response (SVR) with DAA therapy, and 104 cases were not treated with any antiviral therapy (non-treatment group). Recurrence-free survival (RFS) following curative resection was analyzed by using the log-rank test and Kaplan-Meier method. A Cox proportional hazards model was used to analyze the factors that impacted RFS and overall survival (OS). Results: Five patients (10.4%) experienced HCC recurrence after DAA therapy. The cumulative HCC recurrence rate was significantly lower in the DAA group than in the non-treatment group (p < 0.001). Multivariate analysis revealed a significant difference in RFS between the non-treatment group and the DAA group (p = 0.001; hazard ratio (HR), 4.978; 95% CI, 1.976-12.542); liver cirrhosis (p = 0.005; HR, 2.062; 95% CI, 1.247-3.410), microvascular invasion (p = 0.001; HR, 2.331; 95% CI, 1.408-3.860) and AFP > 15 ng/mL (p = 0.022; HR, 1.799; 95% CI, 1.089-2.970) were also independent factors for HCC recurrence. ALBI stage II/III (p = 0.005; HR, 3.249; 95% CI, 1.418-7.443) and microvascular invasion (p < 0.001; HR, 4.037; 95% CI, 2.071-7.869) were independent factors for OS; no significant difference in OS was observed between the DAA and no-DAA treatment groups. Conclusions: DAA treatment could reduce the risk of recurrence after curative treatment for early stage HCC.
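A minimal sketch of the Kaplan-Meier and log-rank comparison of recurrence-free survival described above, using the lifelines library; the group labels, follow-up times, and event indicators are invented for demonstration and are not the study data.

```python
# Hedged sketch: Kaplan-Meier curves by treatment group and a log-rank test.
# "months" = follow-up, "recurred" = 1 if recurrence observed. Toy data only.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months":   [10, 24, 36, 8, 15, 40, 5, 12, 30, 20],
    "recurred": [0, 0, 0, 1, 1, 0, 1, 1, 0, 1],
    "group":    ["DAA", "DAA", "DAA", "none", "none", "DAA", "none", "none", "DAA", "none"],
})

kmf = KaplanMeierFitter()
for name, sub in df.groupby("group"):
    kmf.fit(sub["months"], event_observed=sub["recurred"], label=name)
    print(kmf.survival_function_)

daa, none = df[df.group == "DAA"], df[df.group == "none"]
result = logrank_test(daa["months"], none["months"],
                      event_observed_A=daa["recurred"], event_observed_B=none["recurred"])
print("log-rank p =", round(result.p_value, 4))
```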
Subject(s)
Hepatocellular Carcinoma, Chronic Hepatitis C, Hepatitis C, Liver Neoplasms, Antiviral Agents/therapeutic use, Hepatocellular Carcinoma/drug therapy, Hepacivirus, Hepatitis C/complications, Hepatitis C/drug therapy, Hepatitis C/epidemiology, Chronic Hepatitis C/drug therapy, Humans, Liver Neoplasms/drug therapy, Local Neoplasm Recurrence/drug therapy, Local Neoplasm Recurrence/epidemiology, Retrospective Studies
ABSTRACT
AIM: To investigate the associations between metabolic risk factors and periodontitis in young adults. MATERIALS AND METHODS: The study included 1123 participants, aged 19-40 years, in Taiwan. Metabolic syndrome components were defined by the International Diabetes Federation criteria. Localized periodontitis was graded as healthy (n = 828) or stage II/III (n = 295) according to the 2017 criteria of the World Workshop. Multiple logistic regression analysis with adjustment for sex, age, betel nut consumption, and smoking was used to determine the associations. RESULTS: Greater waist circumference, serum triglycerides, and serum uric acid were associated with a higher risk of localized stage II/III periodontitis [odds ratio (OR) and 95% confidence interval (CI): 1.04 (1.02-1.05), 1.004 (1.002-1.006), and 1.10 (1.00-1.21), respectively]. There were no associations for total cholesterol, high-density lipoprotein, and blood pressure. There was a non-linear association between fasting glucose and localized stage II/III periodontitis, with a turning point at 105 mg/dl [OR: 0.97 (0.95-0.99) and 1.06 (1.00-1.13) when the levels were <105 and ≥105 mg/dl, respectively]. CONCLUSIONS: The risk of localized stage II/III periodontitis varies with metabolic components: waist circumference, serum triglycerides, and serum uric acid are risk factors, whereas plasma glucose shows a non-linear relationship in young adults.
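The non-linear (turning point) association reported above is commonly modeled with a piecewise (linear spline) logistic regression. The sketch below, on simulated data with an assumed knot at 105 mg/dL, shows one way to obtain separate odds ratios below and above the knot; it is an illustration, not the study's analysis code.

```python
# Illustrative piecewise ("turning point") logistic model: fasting glucose enters
# as a linear spline with a knot at 105 mg/dL. Simulated data, for demonstration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
glucose = rng.uniform(70, 140, 300)
# Simulate a risk that falls below 105 mg/dL and rises above it.
logit = -1.0 - 0.03 * np.minimum(glucose - 105, 0) + 0.06 * np.maximum(glucose - 105, 0)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({
    "periodontitis": y,
    "glu_below": np.minimum(glucose, 105),        # slope applies below 105 mg/dL
    "glu_above": np.maximum(glucose - 105, 0),    # slope applies at or above 105 mg/dL
})

fit = smf.logit("periodontitis ~ glu_below + glu_above", data=df).fit(disp=0)
print(np.exp(fit.params))   # exponentiated coefficients: per-mg/dL ORs below/above the knot
```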
Subject(s)
Metabolic Syndrome, Periodontitis, Biomarkers, Blood Glucose, Humans, Metabolic Syndrome/epidemiology, Oral Health, Periodontitis/epidemiology, Risk Factors, Triglycerides, Uric Acid, Waist Circumference, Young Adult
ABSTRACT
Anemia, manifested as reduced red blood cell (RBC) counts or hemoglobin levels, has been associated with lower cardiorespiratory fitness. However, the relationship between smaller RBC volume and physical fitness remains unknown. We included 2933 non-anemic military males (hemoglobin levels: 11.1-15.9 g/dL and mean corpuscular volume (MCV) < 100 fL) in Taiwan during 2014. Aerobic fitness was assessed by the time for a 3000-meter run, and anaerobic fitness was evaluated by the numbers of sit-ups and push-ups, each performed within 2 minutes. Multiple linear and logistic regression models adjusting for age, service specialty, lipid profiles, and hemoglobin levels were used to determine the associations. Microcytosis and normocytosis were defined as MCV ≤ 70 fL (n = 190) and MCV > 70 fL (n = 2743), respectively. The linear regression shows that, compared with microcytosis, normocytosis was associated with a greater number of sit-ups performed within 2 minutes (β = 1.51, P = 0.02). The logistic regression also reveals that males with microcytosis had a higher probability of being among the worst 10% of performers in the 2-minute push-up test (odds ratio: 1.91, 95% confidence intervals: 1.18-3.12). By contrast, there was no association of microcytosis with 3000-meter running time. Our study suggests that non-anemic microcytosis was associated with lower anaerobic fitness but not with aerobic fitness. Whether causative factors for microcytosis, such as iron deficiency and thalassemia trait (which were unavailable in this study), might account for this relationship needs further investigation.
Subject(s)
Cell Size, Erythrocyte Indices/physiology, Erythrocytes/cytology, Military Personnel, Physical Fitness/physiology, Adult, Age Factors, Anaerobic Threshold/physiology, Analysis of Variance, Cardiorespiratory Fitness/physiology, Erythrocyte Count, Exercise/physiology, Hemoglobin A/analysis, Humans, Linear Models, Lipids/blood, Logistic Models, Male, Odds Ratio, Retrospective Studies, Running/physiology, Taiwan, beta-Thalassemia/blood
ABSTRACT
BACKGROUND: Proteinuria, a marker of kidney injury, may be related to skeletal muscle loss. Whether the severity of proteinuria is associated with physical performance is unclear. METHODS: We examined the association of proteinuria severity with physical performance cross-sectionally in 3357 young military males, free of chronic kidney disease, from the Cardiorespiratory Fitness and Hospitalization Events in Armed Forces (CHIEF) study in Taiwan. The grade of proteinuria was classified from a single dipstick urinalysis, collected in the morning after an 8-h fast, as unremarkable (0, +/-, and 1+), moderate (2+), or severe (3+ and 4+). Aerobic physical performance was evaluated by the time for a 3000-m run, and anaerobic physical performance was evaluated by the numbers of 2-min sit-ups and 2-min push-ups, separately. Multiple linear regressions were used to determine the relationships. RESULTS: Compared with unremarkable proteinuria, moderate and severe proteinuria were dose-dependently correlated with 3000-m running time (β: 4.74 (95% confidence intervals (CI): -0.55, 10.02) and 7.63 (95% CI: 3.21, 12.05), respectively), and inversely with the numbers of 2-min push-ups (β = -1.13 (-1.97, -0.29) and -1.00 (-1.71, -0.28), respectively), with adjustments for age, service specialty, body mass index, blood pressure, alcohol intake, smoking, fasting plasma glucose, blood urea nitrogen, serum creatinine and physical activity. However, there was no association between proteinuria severity and 2-min sit-ups. CONCLUSIONS: Our findings show a relationship of dipstick proteinuria with aerobic physical performance and some aspects of anaerobic physical performance in healthy military males. The underlying mechanism is not fully understood and requires further investigation.
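As an illustration of the covariate-adjusted multiple linear regression used above, the following sketch regresses 3000-m run time on a proteinuria grade with a few adjustment covariates using statsmodels; the variable names, simulated values, and effect sizes are assumptions for demonstration only.

```python
# Hedged sketch of a covariate-adjusted linear regression. Simulated data; the
# proteinuria grades (0 = unremarkable, 1 = moderate, 2 = severe) are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "proteinuria_grade": rng.integers(0, 3, n),
    "age": rng.uniform(20, 40, n),
    "bmi": rng.normal(24, 3, n),
})
df["run_3000m_sec"] = (800 + 5 * df["proteinuria_grade"] + 2 * df["age"]
                       + 3 * df["bmi"] + rng.normal(0, 30, n))

fit = smf.ols("run_3000m_sec ~ C(proteinuria_grade) + age + bmi", data=df).fit()
print(fit.params)   # betas for each proteinuria grade versus the unremarkable reference
```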
Subject(s)
Military Personnel, Physical Functional Performance, Proteinuria/urine, Adult, Humans, Male, Taiwan, Young Adult
ABSTRACT
PURPOSE: No standard strategy exists for managing cervical spondylotic myelopathy (CSM). The efficacy of spinous process-splitting laminoplasty, its impact on cervical alignment change, and the incidence of postoperative neck pain remain unclear. We analyzed the parameters of cervical alignment and cord morphology in CSM. METHODS: The radiographic parameters investigated were pre- and postoperative C2-C7 lordosis (CL), C2-C7 sagittal vertical axis (CSVA), T1 slope (TS), TS minus CL (TS - CL) and cervical spinal cord morphology. Myelopathy severity was measured using two different functional scores. Statistical analysis was performed to determine significant differences between preoperative and follow-up radiological findings and changes in functional scores. RESULTS: This retrospective study comprised 85 CSM patients from a single institute, with a minimum follow-up of 24 months. Overall, 63.5% (n = 54) of patients had improvement in their postoperative cervical lordotic alignment; 36.5% (n = 31) developed progressive aggravation of the cervical kyphotic alignment. Pearson correlation analysis showed that CSVA, TS and TS - CL were independent predictors of CL curve change. Based on the receiver operating characteristic curve, the cutoff value for CSVA was 2.89 cm for a postoperative visual analog scale (VAS) score > 4, and the cutoff value for TS - CL was 20 degrees for a postoperative VAS score > 4. CSVA, TS and TS - CL had a significant association with variation in CL. CSVA and TS - CL had a significant association with postoperative neck pain. CONCLUSIONS: CSVA, TS and TS - CL are good predictors of postoperative degenerative kyphotic change and neck pain. Careful consideration of their preoperative cutoff values can improve postoperative outcomes. LEVEL OF EVIDENCE: IV. These slides can be retrieved under Electronic Supplementary Material.
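For readers interested in how an ROC-derived cutoff such as the CSVA value reported above is typically obtained, the sketch below finds the threshold that maximizes Youden's J (sensitivity + specificity - 1) along the ROC curve; the measurements and outcome labels are hypothetical and not the authors' data.

```python
# Hedged sketch: deriving an optimal cutoff from an ROC curve via Youden's J.
# csva_cm and neck_pain (postoperative VAS > 4) are invented illustrative values.
import numpy as np
from sklearn.metrics import roc_curve

csva_cm = np.array([1.5, 2.1, 3.4, 2.9, 4.0, 1.8, 3.1, 2.5, 3.8, 2.0])
neck_pain = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 0])

fpr, tpr, thresholds = roc_curve(neck_pain, csva_cm)
youden_j = tpr - fpr
best = np.argmax(youden_j)
print(f"optimal CSVA cutoff ≈ {thresholds[best]:.2f} cm (J = {youden_j[best]:.2f})")
```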