Results 1-20 of 36
1.
Am J Hypertens ; 2024 May 10.
Article in English | MEDLINE | ID: mdl-38727326

ABSTRACT

BACKGROUND: Medicare supplement insurance, or Medigap, covers 21% of Medicare beneficiaries. Although it offsets some out-of-pocket (OOP) expenses, the remaining OOP costs may pose a barrier to medication adherence. This study evaluated how OOP costs and insurance plan type influence medication adherence among beneficiaries covered by Medicare Supplement plans. METHODS: We conducted a retrospective analysis of the Merative™ MarketScan® Medicare Supplement Database (2017-2019) in Medigap enrollees (≥65 years) with hypertension. Proportion of days covered (PDC) served as a continuous measure of medication adherence and was also dichotomized (PDC ≥0.8) to define adequate adherence. Beta-binomial and logistic regression models were used to estimate associations between these outcomes and insurance plan type and log-transformed OOP costs, adjusting for patient characteristics. RESULTS: Among 27,407 patients with hypertension, the average PDC was 0.68 ± 0.31; 47.5% achieved adequate adherence. A $1 increase in mean 30-day OOP costs was associated with a 0.06 (95% confidence interval [CI]: -0.09 to -0.03) lower probability of adequate adherence, or a 5% (95% CI: 4%-7%) decrease in PDC. Compared with comprehensive plan enrollees, the odds of adequate adherence were lower among those with point-of-service plans (OR: 0.69, 95% CI: 0.62-0.77) but higher among those with preferred provider organization (PPO) plans (OR: 1.08, 95% CI: 1.01-1.15). Moreover, the association between OOP costs and PDC was significantly stronger for PPO enrollees. CONCLUSIONS: Although Medicare supplement insurance alleviates some OOP costs, insurance plan type and remaining OOP costs influence medication adherence. Reducing patient cost-sharing may improve medication adherence.
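To make the adherence measure concrete, below is a minimal sketch, using toy data and hypothetical column names rather than the study's claims processing, of computing a proportion of days covered over a fixed window and dichotomizing it at 0.8.

```python
# Minimal PDC sketch: fraction of days in a window covered by >=1 fill.
# Toy claims data; "fill_date" and "days_supply" are illustrative names.
import pandas as pd

def pdc(claims: pd.DataFrame, start: str, end: str) -> float:
    """Fraction of days in [start, end] covered by at least one fill."""
    days = pd.Series(False, index=pd.date_range(start, end))
    for _, row in claims.iterrows():
        fill_end = row["fill_date"] + pd.Timedelta(days=row["days_supply"] - 1)
        days.loc[row["fill_date"]:fill_end] = True  # mark covered days
    return float(days.mean())

claims = pd.DataFrame({
    "fill_date": pd.to_datetime(["2019-01-01", "2019-02-15", "2019-04-01"]),
    "days_supply": [30, 30, 30],
})
score = pdc(claims, "2019-01-01", "2019-06-30")
print(f"PDC = {score:.2f}, adequate adherence: {score >= 0.8}")
```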

2.
medRxiv ; 2024 Mar 22.
Article in English | MEDLINE | ID: mdl-38562806

ABSTRACT

INTRODUCTION: Intravenous (IV) medications are a fundamental cause of fluid overload (FO) in the intensive care unit (ICU); however, the associations between IV medication use (including volume), administration timing, and FO occurrence remain unclear. METHODS: This retrospective cohort study included consecutive adults admitted to an ICU for ≥72 hours with available fluid balance data. FO was defined as a positive fluid balance ≥7% of admission body weight within 72 hours of ICU admission. After reviewing medication administration record (MAR) data in three-hour periods, IV medication exposure was categorized into clusters using principal component analysis (PCA) and a Restricted Boltzmann Machine (RBM). Medication regimens of patients with and without FO were compared within clusters using the Wilcoxon rank sum test to identify temporal clusters associated with FO. Exploratory analyses of the cluster most associated with FO were conducted, focusing on frequently appearing medications and those used in the first 24 hours. RESULTS: FO occurred in 127/927 (13.7%) of enrolled patients. Patients received a median (IQR) of 31 (13-65) discrete IV medication administrations over the 72-hour period. Across all 47,803 IV medication administrations, ten unique IV medication clusters were identified, with 121-130 medications in each cluster. Among the ten clusters, cluster 7 had the greatest association with FO: the mean number of cluster 7 medications received was significantly greater in patients with FO than in those without (25.6 vs. 10.9, p < 0.0001). Of the 127 medications in cluster 7, 51 (40.2%) appeared in >5 separate 3-hour periods during the 72-hour study window. The most common cluster 7 medications included continuous infusions, antibiotics, and sedatives/analgesics. Adding cluster 7 medications to a prediction model with APACHE II score and receipt of diuretics improved the model's ability to predict fluid overload (AUROC 5.65, p = 0.0004). CONCLUSIONS: Using machine learning approaches, a unique IV medication cluster was strongly associated with FO. Incorporating this cluster improved prediction of fluid overload in ICU patients compared with traditional prediction models. This method may be further developed into real-time clinical applications to improve early detection of adverse outcomes.
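As a rough illustration of the exposure representation described above, the sketch below, using simulated MAR rows and hypothetical column names, bins IV administrations into three-hour periods and builds a patient-by-period medication count matrix of the kind PCA or an RBM could then cluster.

```python
# Bin MAR administrations into 3-hour periods over the first 72 h of ICU stay.
# Simulated rows; column names are illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
mar = pd.DataFrame({
    "patient_id": rng.integers(0, 5, 200),
    "hours_from_icu_admit": rng.uniform(0, 72, 200),
    "medication": rng.choice(["fentanyl", "cefepime", "norepinephrine"], 200),
})
mar["period"] = (mar["hours_from_icu_admit"] // 3).astype(int)  # 24 periods

# Counts of each medication per patient per 3-hour period
exposure = (mar.groupby(["patient_id", "period", "medication"])
               .size().unstack(fill_value=0))
print(exposure.head())
```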

3.
Ther Adv Infect Dis ; 11: 20499361241244967, 2024.
Article in English | MEDLINE | ID: mdl-38645299

ABSTRACT

Background: Emerging risk factors highlight the need for an updated understanding of cryptococcosis in the United States. Objective: To describe the epidemiological trends and clinical outcomes of cryptococcosis in three patient groups: people with HIV (PWH), non-HIV-infected and non-transplant (NHNT) patients, and patients with a history of solid organ transplantation. Methods: We utilized data from the Merative Medicaid Database to identify individuals aged 18 and above with cryptococcosis based on International Classification of Diseases, Tenth Revision diagnosis codes from January 2017 to December 2019. Patients were stratified into PWH, NHNT patients, and transplant recipients according to Infectious Diseases Society of America guidelines. Baseline characteristics, types of cryptococcosis, hospitalization details, and in-hospital mortality rates were compared across groups. Results: Among 703 patients, 59.7% were PWH, 35.6% were NHNT, and 4.7% were transplant recipients. PWH were more likely to be younger, male, identify as Black, and have fewer comorbidities than patients in the NHNT and transplant groups. Notably, 24% of NHNT patients lacked comorbidities. Central nervous system, pulmonary, and disseminated cryptococcosis were most common overall (60%, 14%, and 11%, respectively). The incidence of cryptococcosis fluctuated throughout the study period. PWH accounted for over 50% of cases from June 2017 to June 2019, but this proportion decreased to 47% from July to December 2019. Of the 52% of patients requiring hospitalization, 61% were PWH and 35% were NHNT patients; PWH had longer hospital stays. In-hospital mortality at 90 days was significantly higher in NHNT patients (22%) than in PWH (7%) and transplant recipients (0%). One-year mortality remained lowest among PWH (8%) compared with NHNT patients (22%) and transplant recipients (13%). Conclusion: Most cases of cryptococcosis in this study occurred in PWH. Interestingly, while incidence remained relatively stable in PWH, it increased slightly in those without HIV by the end of the study period. Mortality was highest in NHNT patients.


Epidemiological trends of cryptococcosis in the US: The epidemiology and outcomes of cryptococcosis across the United States have not been recently examined. This study analyzed an insured population from 2017 to 2019 and found a relatively stable incidence of cryptococcosis among people with HIV alongside a slightly increased incidence among individuals without HIV. Notably, mortality was highest among non-HIV-infected, non-transplant patients.

4.
Sci Rep ; 13(1): 22041, 2023 Dec 12.
Article in English | MEDLINE | ID: mdl-38086877

ABSTRACT

The traditional A* algorithm suffers from sharp turning points in the path, weak directional guidance during the search, and a large number of computed nodes. To address these problems, a Directional Search A* algorithm with a path smoothing technique is proposed. First, the algorithm introduces an angle constraint through the evaluation function: by converting sharp turns into obtuse angles, turning points become smoother and sharp turns occur less often, improving path smoothness. Second, the algorithm enhances the distance function to strengthen directional guidance during the search; the optimized distance function prefers directions leading toward the target, which reduces the search space and shortens overall path planning time. The algorithm also removes redundant nodes along the path, yielding a more concise path representation. Finally, an improved step-size adjustment method optimizes the number of path nodes: appropriately adjusting the step size further reduces the node count and improves planning efficiency. Together, these methods address the limitations of the traditional A* algorithm and produce smoother, more efficient paths. Simulation experiments in MATLAB comparing the modified and traditional A* algorithms demonstrate that the improved algorithm generates shorter paths with reduced planning time and smoother trajectories, indicating that the Directional Search A* algorithm, with its angle-constrained evaluation function and direction-guided strategy, outperforms the traditional A* algorithm in path planning.
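As an illustration of the core idea, here is a minimal sketch, assuming an 8-connected occupancy grid, of an A*-style search whose cost adds a penalty for direction changes sharper than 90°, so surviving corners are obtuse. The grid, penalty form, and weight are illustrative assumptions, not the paper's MATLAB implementation.

```python
# A*-style grid search with an angle-constraint penalty in the cost function.
import heapq
import itertools
import math

def directional_astar(grid, start, goal, turn_weight=2.0):
    """grid: 2D list, 0 = free, 1 = obstacle; start/goal: (row, col) tuples."""
    rows, cols = len(grid), len(grid[0])
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]
    def h(p):  # Euclidean heuristic supplies the goal-directed guidance
        return math.hypot(p[0] - goal[0], p[1] - goal[1])
    tie = itertools.count()  # tie-breaker so the heap never compares tuples/None
    frontier = [(h(start), next(tie), 0.0, start, None)]
    came_from, best_g = {start: None}, {start: 0.0}
    while frontier:
        _, _, g, cur, prev_dir = heapq.heappop(frontier)
        if cur == goal:  # reconstruct the path by walking parents back to start
            path = [cur]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        for d in moves:
            nxt = (cur[0] + d[0], cur[1] + d[1])
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols) or grid[nxt[0]][nxt[1]]:
                continue
            step = math.hypot(*d)
            # Angle-constraint term: penalize direction changes sharper than
            # 90 degrees, so corners that survive are obtuse (smoother paths).
            turn_cost = 0.0
            if prev_dir is not None:
                cos_a = (d[0] * prev_dir[0] + d[1] * prev_dir[1]) / (step * math.hypot(*prev_dir))
                if cos_a < 0:
                    turn_cost = turn_weight * -cos_a
            ng = g + step + turn_cost
            if ng < best_g.get(nxt, math.inf):
                best_g[nxt] = ng
                came_from[nxt] = cur
                heapq.heappush(frontier, (ng + h(nxt), next(tie), ng, nxt, d))
    return None

grid = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
print(directional_astar(grid, (0, 0), (2, 3)))
```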

5.
J Bone Miner Res ; 38(12): 1809-1821, 2023 12.
Article in English | MEDLINE | ID: mdl-37950643

ABSTRACT

Dietary interventions designed to examine the role of nutrition in childhood bone accrual have often focused on individual micronutrients (eg, calcium, vitamin D, and zinc) and macronutrients (eg, protein). The osteogenic benefits of whole foods, such as eggs, are not well understood despite eggs being a source of high-quality nutrients and bioactive compounds known to positively influence bone. A significant positive cross-sectional association between whole egg consumption and tibia cortical bone mass has recently been shown in young children; however, randomized controlled trials (RCTs) have not been conducted. This study was a double-blind RCT in male and female children ages 9-13 years of different ancestries, designed to determine whether consuming food products with whole eggs (equivalent to 8-10 eggs/wk), versus foods with milk or gelatin (placebo), over a 9-month period would improve measures of bone strength. Total body less head (TBLH) and lumbar spine bone mineral content (BMC) and areal bone mineral density (aBMD) were assessed using dual-energy X-ray absorptiometry (DXA). DXA Z-scores were computed using published pediatric growth charts and were adjusted for height-for-age Z-score (HAZ). Mid-tibia cortical volumetric BMD, BMC, cortical area, total bone area, cortical thickness, and strength strain index were measured using peripheral quantitative computed tomography. Overall, there were no significant intervention effects for any bone outcome. The increase in HAZ-adjusted spine BMC Z-score in the egg group versus the gelatin group approached significance (p = 0.07). Significant time effects in HAZ-adjusted TBLH aBMD Z-score occurred, as all groups decreased over 9 months (p < 0.03). Most tibia cortical bone outcomes increased over time (all p < 0.02), but changes did not differ across intervention groups. Whole eggs provide important nutritional benefits for children, but the bone responses to consumption of 8-10 eggs/wk over 9 months in children entering the early stages of puberty were small. © 2023 The Authors. Journal of Bone and Mineral Research published by Wiley Periodicals LLC on behalf of American Society for Bone and Mineral Research (ASBMR).


Subjects
Bone and Bones; Gelatin; Male; Female; Humans; Child; Child, Preschool; Bone Density/physiology; Absorptiometry, Photon/methods; Lumbar Vertebrae; Minerals; Randomized Controlled Trials as Topic
6.
Sci Rep ; 13(1): 19654, 2023 11 10.
Article in English | MEDLINE | ID: mdl-37949982

ABSTRACT

Fluid overload, while common in the ICU and associated with serious sequelae, is hard to predict and may be influenced by ICU medication use. Machine learning (ML) approaches may offer advantages over traditional regression techniques for predicting it. We compared the ability of traditional regression techniques and several ML-based modeling approaches to identify clinically meaningful predictors of fluid overload. This was a retrospective, observational cohort study of adult patients admitted to an ICU for ≥72 h between 10/1/2015 and 10/31/2020 with available fluid balance data. Models were created to predict fluid overload (a positive fluid balance ≥10% of admission body weight) in the 48-72 h after ICU admission. Potential patient and medication predictor variables (n = 28) were collected at baseline or 24 h after ICU admission. The optimal traditional logistic regression model was created using backward selection. Supervised, classification-based ML models were trained and optimized, including a meta-modeling approach. Area under the receiver operating characteristic curve (AUROC), positive predictive value (PPV), and negative predictive value (NPV) were compared between the traditional and ML prediction models. Overall, 49 of 391 (12.5%) patients developed fluid overload. Among the ML models, XGBoost had the highest performance (AUROC 0.78, PPV 0.27, NPV 0.94) and performed similarly to the final traditional logistic regression model (AUROC 0.70, PPV 0.20, NPV 0.94). Feature importance analysis revealed that severity of illness scores and medication-related data were the most important predictors of fluid overload. In the context of our study, ML and traditional models appear to perform similarly in predicting fluid overload in the ICU. Baseline severity of illness and ICU medication regimen complexity are important predictors of fluid overload.
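A minimal sketch of the model comparison described above, using a simulated cohort of matching size and prevalence rather than the study's data, and omitting its backward-selection step for brevity, trains logistic regression and XGBoost and reports AUROC, PPV, and NPV.

```python
# Compare logistic regression vs. XGBoost on a simulated rare-outcome cohort.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Simulated cohort: 391 patients, 28 candidate predictors, ~12.5% positives
X, y = make_classification(n_samples=391, n_features=28, n_informative=8,
                           weights=[0.875], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=0)
models = {
    "logistic": LogisticRegression(max_iter=1000),
    "xgboost": XGBClassifier(n_estimators=200, max_depth=3, eval_metric="logloss"),
}
for name, model in models.items():
    proba = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    pred = proba >= y_tr.mean()  # threshold at outcome prevalence for rare events
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    print(f"{name}: AUROC={roc_auc_score(y_te, proba):.2f} "
          f"PPV={tp / (tp + fp):.2f} NPV={tn / (tn + fn):.2f}")
```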


Subjects
Intensive Care Units; Machine Learning; Adult; Humans; Cohort Studies; ROC Curve; Retrospective Studies; Logistic Models
7.
Ann Pharmacother ; : 10600280231210275, 2023 Nov 09.
Article in English | MEDLINE | ID: mdl-37946374

ABSTRACT

BACKGROUND: Fluoroquinolones (FQs) are associated with an increased risk of tendon injury, but their risk relative to other antibiotic options for the same indication has not been evaluated. OBJECTIVE: To describe the incidence (relative risk) of any tendon injury in patients receiving FQs compared with other (non-FQ) antibiotics for treatment of community-acquired pneumonia (CAP). METHODS: A retrospective propensity score-weighted cohort study evaluated the association between FQ antibiotics and tendon injury risk at two time points (within 1 month and within 6 months of use) compared with non-FQ regimens for treatment of CAP, using the MarketScan Commercial Claims and Encounters (CCAE) and Medicare Supplemental and Coordination of Benefits (COB) databases from 2014 to 2020. Patients >18 years with ICD-9/10 (International Classification of Diseases) coding for outpatient pneumonia and no history of tendon injury were included. Patients with a history of tendon injury, those who received multiple antibiotic therapies for recurrent pneumonia, and those who received both FQ and non-FQ regimens during the study period were excluded. Propensity score weighting adjusted for selection bias from contributing risk factors, including demographics (age, sex), comorbidities (diabetes mellitus, chronic kidney disease), and concurrent medications (corticosteroids). RESULTS: At 1 month, the odds of tendon injury were an estimated 41.9% higher in patients receiving FQs than in those receiving a non-FQ-based regimen (odds ratio [OR] = 1.419, 95% confidence interval [CI] = [1.188-1.698]). The odds of tendon injury were also estimated to be higher within 6 months (OR = 1.067, 95% CI = [0.975-1.173]), but this effect was not statistically significant. The most frequent sites of tendon injury were the rotator cuff, shoulder, and patellar tendon. CONCLUSIONS AND RELEVANCE: Prescribers should recognize the risk of tendon injury within 1 month of FQ use when considering treatment regimens for CAP and use lower-risk alternatives whenever possible.
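The sketch below illustrates propensity score weighting of this general shape on simulated claims-like data; the variable names, confounder set, and stabilized-weight form are assumptions for illustration, not the study's specification.

```python
# Stabilized inverse-probability-of-treatment weighting (IPTW) sketch.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(55, 12, n),
    "diabetes": rng.integers(0, 2, n),
    "steroid_use": rng.integers(0, 2, n),
})
# Simulated exposure and outcome, both depending on the confounders
p_fq = 1 / (1 + np.exp(-(-3 + 0.03 * df["age"] + 0.4 * df["steroid_use"])))
df["fq"] = (rng.random(n) < p_fq).astype(int)
p_inj = 1 / (1 + np.exp(-(-4 + 0.35 * df["fq"] + 0.02 * df["age"]
                          + 0.5 * df["steroid_use"])))
df["tendon_injury_30d"] = (rng.random(n) < p_inj).astype(int)

# 1) Propensity model: probability of FQ exposure given confounders
confounders = ["age", "diabetes", "steroid_use"]
ps = sm.Logit(df["fq"], sm.add_constant(df[confounders])).fit(disp=0).predict()

# 2) Stabilized IPTW weights
p_t = df["fq"].mean()
w = np.where(df["fq"] == 1, p_t / ps, (1 - p_t) / (1 - ps))

# 3) Weighted logistic model of 30-day tendon injury on exposure
fit = sm.GLM(df["tendon_injury_30d"], sm.add_constant(df[["fq"]].astype(float)),
             family=sm.families.Binomial(), freq_weights=w).fit()
print("weighted OR for FQ exposure:", float(np.exp(fit.params["fq"])))
```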

8.
Front Oncol ; 13: 1274924, 2023.
Article in English | MEDLINE | ID: mdl-37886166

ABSTRACT

Purpose: To compare involved-field irradiation (IFI) and elective nodal irradiation (ENI) for selecting the optimal target area in neoadjuvant chemoradiotherapy (nCRT) for patients with locally advanced esophageal squamous cell carcinoma (LA-ESCC). Materials and methods: We retrospectively analyzed 267 patients with LA-ESCC, of whom 165 underwent ENI and 102 underwent IFI. Dosimetry, treatment-related complications, pathological responses, recurrence/metastasis patterns, and survival were compared between the two groups. Results: The median follow-up duration was 27.9 months. R0 resection rates in the IFI and ENI groups were 95.1% and 92.7%, respectively (p=0.441), and pathological complete response (pCR) rates were 42.2% and 34.5%, respectively (p=0.12). The ENI group received higher radiation doses to the heart (HV30: 23.9% vs. 18%, p=0.033) and lungs (LV30: 7.7% vs. 4.9%, p<0.001) than the IFI group. Consequently, the ENI group had a higher incidence of grade 2 or higher radiation pneumonitis (30.3% vs. 17.6%, p=0.004) and pericardial effusion (26.7% vs. 11.8%, p=0.021). Postoperative fistulas were observed in 3 cases (2.9%) in the IFI group and 17 cases (10.3%) in the ENI group (p=0.026). In multivariate analysis, smoking, positive lymph node involvement (pN+), and anastomotic fistula were independent predictors of overall survival (OS). pN+ patients were more prone to recurrence than pN- patients, especially in the first year of follow-up (6.67% vs. 0.56%, p=0.003). Conclusion: The ENI group had a higher incidence of radiation-induced adverse events than the IFI group, likely due to higher radiation doses to normal tissues. Given the similar disease-free survival (DFS) and OS in the two groups, IFI may be suitable for nCRT in patients with LA-ESCC, although further prospective studies are warranted.

9.
Ment Health Clin ; 13(4): 183-189, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37860586

ABSTRACT

Introduction: In general, racial and ethnic differences exist in antipsychotic prescription practices; however, little is known about such differences between individual long-acting injectable (LAI) antipsychotic formulations. This study's primary objective was to determine racial and ethnic differences in LAI antipsychotic use. Secondary objectives were to identify whether discontinuation rates differed between agents and by race or ethnicity. Methods: International Classification of Diseases, 10th edition (ICD-10) codes were used to identify patients with schizophrenia and related disorders (18-64 years) who received an LAI antipsychotic between 2016 and 2020 using Merative Multi-State Medicaid databases. Pharmacy claims were identified using National Drug Code numbers for LAI antipsychotics, and the data were analyzed. Cochran-Mantel-Haenszel tests and odds ratio estimators were used to investigate the conditional association between race or ethnicity and medication, controlling for age, sex, health plan, and prescription year. Kaplan-Meier survival curves were examined, and stratified log-rank tests were conducted to compare time-until-discontinuation distributions by race or ethnicity. Results: The analysis included 37,712 patients. Blacks received an LAI first-generation antipsychotic more often than Whites (OR: 1.64, 95% CI: [1.56, 1.73]), Hispanics (OR: 1.46, 95% CI: [1.21, 1.75]), and others (OR: 1.44, 95% CI: [1.20, 1.73]). Aside from fluphenazine decanoate showing earlier discontinuation for Whites than Blacks (P = .02), no significant differences in discontinuation across race or ethnicity were identified. Discussion: Despite no significant differences in second-generation antipsychotic LAI discontinuation rates between Blacks and other racial or ethnic groups, Blacks received second-generation antipsychotic LAIs significantly less often than other groups. Further studies are needed to determine why these differences may be occurring.

10.
Sci Rep ; 13(1): 15562, 2023 09 20.
Article in English | MEDLINE | ID: mdl-37730817

ABSTRACT

Unsupervised clustering of intensive care unit (ICU) medications may identify unique medication clusters (i.e., pharmacophenotypes) in critically ill adults. We performed an unsupervised analysis with a Restricted Boltzmann Machine of 991 medication profiles of patients managed in the ICU to explore pharmacophenotypes that correlated with ICU complications (e.g., mechanical ventilation) and patient-centered outcomes (e.g., length of stay, mortality). Six unique pharmacophenotypes were observed, each with a distinct medication profile and clinically relevant differences in ICU complications and patient-centered outcomes. While pharmacophenotypes 2 and 4 showed no statistically significant differences in ICU length of stay, duration of mechanical ventilation, or duration of vasopressor use, their mortality differed significantly (9.0% vs. 21.9%, p < 0.0001). Pharmacophenotype 4 had a mortality rate of 21.9%, compared with rates ranging from 2.5% to 9% in the other pharmacophenotypes. Phenotyping approaches have shown promise in classifying the heterogeneous syndromes of critical illness to predict treatment response and guide clinical decision support systems, but they have never included comprehensive medication information. This first-ever machine learning approach revealed differences among empirically derived subgroups of ICU patients that are not typically revealed by traditional classifiers. Identifying pharmacophenotypes may enable enhanced decision making to optimize treatment decisions.
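A minimal sketch of RBM-based pharmacophenotyping in the spirit described above: learn a latent representation of binary medication-exposure profiles with scikit-learn's BernoulliRBM, then cluster patients in that space. The data are simulated, and the hidden-layer size and k-means clustering step are illustrative assumptions.

```python
# RBM latent representation + clustering of binary medication profiles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(0)
X = (rng.random((991, 150)) < 0.15).astype(float)  # patients x medications (0/1)

# Learn a low-dimensional latent representation of medication co-occurrence
rbm = BernoulliRBM(n_components=16, learning_rate=0.05, n_iter=30, random_state=0)
H = rbm.fit_transform(X)  # hidden-unit activation probabilities

# Cluster patients in the latent space into candidate pharmacophenotypes
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(H)
for k in range(6):
    print(f"pharmacophenotype {k}: n = {(labels == k).sum()}")
```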


Subjects
Critical Illness; Decision Support Systems, Clinical; Adult; Humans; Cluster Analysis; Vasoconstrictor Agents; Critical Care
11.
Pestic Biochem Physiol ; 195: 105576, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37666602

ABSTRACT

Herbicide resistance is rapidly emerging in Cyperus difformis in rice fields across China. The response of a C. difformis population, GX-35, was tested against five acetolactate synthase (ALS)-inhibiting herbicides, the auxin herbicide MCPA, and the photosystem II (PSII) inhibitor bentazone. Population GX-35 evolved multiple resistance to the ALS-inhibiting herbicides (penoxsulam, bispyribac-sodium, pyrazosulfuron-ethyl, halosulfuron-methyl, and imazapic) and the auxin herbicide MCPA, with resistance levels of 140-, 1253-, 578-, 18-, 13-, and 21-fold, respectively, compared with the susceptible population. In this population, ALS gene expression was similar to that of the susceptible population; however, an Asp376Glu mutation in the ALS gene was observed, leading to reduced inhibition of in-vitro ALS activity by the five ALS-inhibiting herbicides. Furthermore, CYP71D8, CYP77A3, CYP78A5, and three ABC transporter genes (cluster-14412.23067, cluster-14412.25321, and cluster-14412.24716) were over-expressed in the absence of penoxsulam, whereas UGT73C1 and an ABC transporter (cluster-14412.25038) were induced by penoxsulam. Both over-expression and induction were observed for CYP74, CYP71A1, UGT88A1, and an ABC transporter (cluster-14412.21723). The GX-35 population has thus evolved multiple herbicide resistance, and a diverse range of weed control tactics should be implemented in rice fields.


Subjects
2-Methyl-4-chlorophenoxyacetic Acid; Acetolactate Synthase; Cyperus; Herbicides; Oryza; Oryza/genetics; Herbicide Resistance/genetics; China; ATP-Binding Cassette Transporters; Acetolactate Synthase/genetics; Herbicides/pharmacology; Indoleacetic Acids
12.
Crit Care Explor ; 5(9): e0956, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37644971

ABSTRACT

BACKGROUND: The workload of ICU healthcare professionals such as physicians and nurses has an established relationship with patient outcomes, including mortality, length of stay, and other quality indicators; however, the relationship of critical care pharmacist workload to outcomes has not been rigorously evaluated. The objective of our study is to characterize how critical care pharmacist workload in the ICU relates to patient-centered outcomes of critically ill patients. METHODS: Optimizing Pharmacist Team-Integration for ICU patient Management is a multicenter, observational cohort study with a target enrollment of 20,000 critically ill patients. Participating critical care pharmacists will enroll patients managed in the ICU. Data collection will consist of two observational phases: prospective and retrospective. During the prospective phase, critical care pharmacists will record daily workload data (e.g., census, number of rounding teams). During the retrospective phase, patient demographics, severity of illness, medication regimen complexity, and outcomes will be recorded. The primary outcome is mortality. Multiple methods will be used to explore the primary outcome, including multilevel multiple logistic regression with stepwise variable selection to exclude nonsignificant covariates from the final model, supervised and unsupervised machine learning techniques, and Bayesian analysis. RESULTS: Our protocol describes the processes and methods for an observational study in the ICU. CONCLUSIONS: This study seeks to determine the relationship between pharmacist workload, as measured by the pharmacist-to-patient ratio and the pharmacist clinical burden index, and patient-centered outcomes, including mortality and length of stay.

13.
Sci Total Environ ; 903: 166182, 2023 Dec 10.
Article in English | MEDLINE | ID: mdl-37562614

ABSTRACT

Because of the nonlinear impacts of meteorology and precursors, the response of ozone (O3) trends to emission changes is complex across different regions of megacity Beijing. Based on long-term in-situ observations at 35 air quality sites (four categories: urban, traffic, northern suburban, and southern suburban) and satellite data, the spatiotemporal variability of O3, its gaseous precursors, and O3-VOCs-NOx sensitivity were explored through multiple metrics during the warm seasons of 2013-2020. Additionally, the contributions of meteorology and emissions to O3 were separated using a machine-learning-based de-weathering method. Annual average MDA8 O3 and O3 increased by 3.7 and 2.9 µg/m3/yr, respectively, highest at traffic sites and lowest in the northern suburbs, and Ox (O3 + NO2) increased by 0.2 µg/m3/yr, highest in the southern suburbs, even though NO2 declined strongly and HCHO decreased slightly. However, the daytime increments of O3 and Ox showed some decreasing trends. NOx abatement weakened O3 loss through reduced NO titration, narrowing urban-suburban differences in O3 and Ox. Owing to the larger decreases of NO2 in the urban region and of HCHO in the northern suburbs, the extent of the VOC-limited regime fluctuated over Beijing, and the northern suburbs gradually shifted toward a transition or NOx-limited regime. Compared with the directly observed trends, de-weathered O3 increased more slowly, which was attributed to meteorological conditions favorable for O3 formation after 2017, especially in June (the most polluted month), whereas de-weathered Ox declined everywhere except the southern suburbs. Overall, clean air actions were effective in reducing atmospheric oxidation capacity in the urban and northern suburban regions, weakening local photochemical production over Beijing and suppressing O3 deterioration in the northern suburbs. Strengthening VOC controls while sustaining NOx abatement, especially in June, will be vital to reversing the O3 increase in Beijing.
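As a sketch of the de-weathering idea, assuming simulated data and a random forest rather than whatever model the authors used, one can fit pollutant concentrations on time plus meteorology and then average predictions over resampled weather, so the remaining trend reflects emissions rather than meteorology.

```python
# Machine-learning "de-weathering" sketch on simulated daily O3 data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000
day = np.arange(n)                      # slow emissions-like trend carrier
temperature = rng.normal(20, 8, n)
wind_speed = rng.gamma(2.0, 1.5, n)
o3 = 60 + 0.01 * day + 1.5 * temperature - 2.0 * wind_speed + rng.normal(0, 5, n)

X = np.column_stack([day, temperature, wind_speed])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, o3)

# De-weathered value for a given day: average predictions over many random
# draws of the observed meteorology, holding the time term fixed.
met = np.column_stack([temperature, wind_speed])
for d in range(0, n, 500):
    sample = met[rng.integers(0, n, 300)]
    X_rep = np.column_stack([np.full(300, d), sample])
    print(f"day {d}: de-weathered O3 = {model.predict(X_rep).mean():.1f}")
```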

14.
Sci Rep ; 13(1): 10784, 2023 07 04.
Article in English | MEDLINE | ID: mdl-37402869

ABSTRACT

While medication regimen complexity, as measured by the novel medication regimen complexity-intensive care unit (MRC-ICU) score, correlates with baseline severity of illness and mortality, whether the MRC-ICU improves hospital mortality prediction is not known. After characterizing the association between MRC-ICU, severity of illness, and hospital mortality, we evaluated the incremental benefit of adding MRC-ICU to illness severity-based hospital mortality prediction models. This was a single-center, observational cohort study of adult intensive care units (ICUs). A random sample of 991 adults admitted to the ICU for ≥24 h from 10/2015 to 10/2020 was included. Logistic regression models for the primary outcome of mortality were assessed via the area under the receiver operating characteristic curve (AUROC). Medication regimen complexity was evaluated daily using the MRC-ICU. This previously validated index is a weighted summation of medications prescribed in the first 24 h of ICU stay (e.g., a patient prescribed insulin (1 point) and vancomycin (3 points) has an MRC-ICU of 4 points). Baseline demographic features (e.g., age, sex, ICU type) were collected, and severity of illness (based on worst values within the first 24 h of ICU admission) was characterized using both the Acute Physiology and Chronic Health Evaluation (APACHE II) and the Sequential Organ Failure Assessment (SOFA) score. Univariate analysis of the 991 patients revealed that every one-point increase in the average 24-h MRC-ICU score was associated with a 5% increase in hospital mortality (odds ratio [OR] 1.05, 95% confidence interval 1.02-1.08, p = 0.002). The model including MRC-ICU, APACHE II, and SOFA had an AUROC for mortality of 0.81, whereas the model including only APACHE II and SOFA had an AUROC of 0.76. Medication regimen complexity is associated with increased hospital mortality. A prediction model including medication regimen complexity only modestly improves hospital mortality prediction.
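A minimal sketch of the weighted-summation scoring and the odds-ratio interpretation above; the weight table below is a hypothetical subset for illustration, not the validated MRC-ICU instrument.

```python
# Hypothetical weight subset -- the validated MRC-ICU defines many more entries.
MRC_WEIGHTS = {"insulin": 1, "vancomycin": 3, "norepinephrine": 1, "furosemide": 1}

def mrc_icu_score(meds_first_24h):
    """Weighted summation of medications prescribed in the first 24 h of ICU stay."""
    return sum(MRC_WEIGHTS.get(med, 0) for med in meds_first_24h)

print(mrc_icu_score(["insulin", "vancomycin"]))  # -> 4, the abstract's example

# Abstract: each 1-point increase -> OR 1.05 for hospital mortality, so the
# odds multiply by 1.05**delta for a delta-point difference between patients.
delta = 10
print(f"odds multiplier for +{delta} points: {1.05 ** delta:.2f}")
```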


Subjects
Intensive Care Units; Organ Dysfunction Scores; Adult; Humans; Severity of Illness Index; APACHE; Hospital Mortality; ROC Curve; Retrospective Studies; Prognosis
15.
Thorac Cancer ; 14(27): 2735-2744, 2023 09.
Article in English | MEDLINE | ID: mdl-37519061

ABSTRACT

BACKGROUND: The immune system may influence prognosis, and lymphopenia is a frequent side effect of concurrent chemoradiotherapy (CCRT). Radical irradiation for locally advanced esophageal cancer (LA-EC) exposes significant vascular and cardiac volumes. We hypothesized that lymphopenia is linked to cardiac and pericardial doses and affects patient prognosis. METHODS AND MATERIALS: We identified 190 LA-EC patients who received radical CCRT. Multivariate analysis (MVA) was performed to correlate clinical factors and dosimetric parameters with overall survival (OS). We collected lymphocyte-related variables and ratios before and during CCRT, and MVA was performed to correlate hematologic toxicity with OS. The relationship between dosimetric parameters and grade 4 (G4) lymphopenia was determined using logistic stepwise regression. Finally, a nomogram for G4 lymphopenia was developed and validated externally. RESULTS: Median follow-up for all patients was 27.5 months. On MVA for OS, higher pericardial V30 (PV30) was linked to worse survival (HR: 1.013, 95% CI: 1.001-1.026, p = 0.039). Median OS stratified by PV30 >55.3% versus PV30 ≤55.3% was 24.0 versus 54.0 months (p = 0.004). In the MVA of hematologic toxicity with OS, G4 lymphopenia was linked with worse OS (HR: 2.042, 95% CI: 1.335-3.126, p = 0.001). Thirty of the 100 patients in the training set had G4 lymphopenia. Logistic stepwise regression identified variables associated with G4 lymphopenia; the final model consisted of stage IVA (p = 0.017), platelet-to-lymphocyte ratio during CCRT (p = 0.008), heart V50 (p = 0.046), and PV30 (p = 0.048). Finally, a nomogram predicting G4 lymphopenia was constructed and externally validated; the ROC curve showed an AUC of 0.775 for internal validation and 0.843 for external validation. CONCLUSION: Higher pericardial radiation doses might affect the prognosis of LA-EC patients by inducing G4 lymphopenia during CCRT. Further prospective studies are warranted to confirm these findings, especially in the era of immune checkpoint inhibitor treatment.


Subjects
Esophageal Neoplasms; Lymphopenia; Humans; Prognosis; Lymphopenia/chemically induced; Chemoradiotherapy/adverse effects; Chemoradiotherapy/methods; Pericardium
16.
Oncology (Williston Park) ; 37(1): 26-33, 2023 01 26.
Article in English | MEDLINE | ID: mdl-36724139

ABSTRACT

BACKGROUND AND PURPOSE: There is currently no standard treatment for patients with lung cancer and deteriorated pulmonary function. In this study, we assessed the efficacy of thoracic radiotherapy for unresectable non-small cell lung cancer (NSCLC) with baseline severe pulmonary dysfunction and evaluated severe acute radiation pneumonitis (SARP). METHODS: Patients were categorized into a radiotherapy group and a nonradiotherapy group, and clinical variables were analyzed. Cox regression was used to evaluate the impact of various factors on overall survival (OS). The predictive value of each SARP factor was assessed using logistic regression, receiver operating characteristic curve, and Kaplan-Meier analyses. RESULTS: Median OS was 21.6 months in the radiotherapy group versus 8.9 months in the nonradiotherapy group. Cox analysis revealed that chemotherapy (HR, 0.221; 95% CI, 0.149-0.329; P < .001) and radiotherapy (HR, 0.589; 95% CI, 0.399-0.869; P = .008) were independent prognostic factors in this cohort. The data suggested that the ipsilateral lung V10 (ilV10, the percentage of lung volume receiving more than 10 Gy) was an independent predictor of SARP. CONCLUSIONS: Thoracic radiotherapy may be associated with clinical benefit in patients with inoperable NSCLC and severe pulmonary dysfunction, and ilV10 may help predict the risk of SARP in these patients.


Subjects
Carcinoma, Non-Small-Cell Lung; Lung Neoplasms; Radiation Pneumonitis; Humans; Carcinoma, Non-Small-Cell Lung/complications; Carcinoma, Non-Small-Cell Lung/radiotherapy; Carcinoma, Non-Small-Cell Lung/drug therapy; Lung Neoplasms/complications; Lung Neoplasms/radiotherapy; Lung Neoplasms/drug therapy; Radiation Pneumonitis/etiology; Lung; Radiotherapy Dosage
17.
Strahlenther Onkol ; 199(7): 645-657, 2023 07.
Article in English | MEDLINE | ID: mdl-36484821

ABSTRACT

PURPOSE: This study evaluated whether antibiotic treatment before chemoradiotherapy influenced outcomes in patients with locally advanced non-small cell lung cancer (LA-NSCLC). METHODS: The records of LA-NSCLC patients treated with chemoradiotherapy between 2010 and 2017 at West China Hospital of Sichuan University were retrospectively examined together with their antibiotic use (antibiotic type, duration of treatment, and time between discontinuation and chemoradiotherapy). The influence of antibiotics on progression-free survival (PFS) and overall survival (OS) was evaluated with Kaplan-Meier curves and univariate and multivariate Cox regression. RESULTS: Of 522 patients, 176 had received intravenous broad-spectrum antibiotics in the month before chemoradiotherapy. Antibiotic use was linked to both reduced PFS (7.9 vs. 13.4 months, p < 0.001) and reduced OS (20.4 vs. 25.3 months, p = 0.049). Multivariate regression demonstrated that antibiotic treatment was an unfavorable independent prognostic factor for LA-NSCLC patients who received chemoradiotherapy (HR 1.234; 95% CI 1.019-1.494; p = 0.031). Prognosis was also influenced by antibiotic type, length of treatment, and the interval between discontinuation and chemoradiotherapy initiation. β-lactamase inhibitors were the most harmful (median OS for β-lactamase inhibitors/fluoroquinolones/cephalosporins: 16.5/19.9/25.9 months, p = 0.045). Cutoff values for interval and duration calculated by the X-tile procedure showed that intervals of 7-16 days or durations ≤6 days did not significantly affect OS relative to untreated patients (interval: p = 0.9; duration: p = 0.93). CONCLUSION: Antibiotic treatment for longer than 6 days, especially with β-lactamase inhibitors, was associated with poor prognosis. Furthermore, delaying chemoradiotherapy for 7-16 days after antibiotic discontinuation may reduce these negative effects.


Subjects
Carcinoma, Non-Small-Cell Lung; Lung Neoplasms; Humans; Carcinoma, Non-Small-Cell Lung/radiotherapy; Retrospective Studies; Anti-Bacterial Agents/therapeutic use; beta-Lactamase Inhibitors/therapeutic use; Prognosis; Chemoradiotherapy/methods
18.
Front Microbiol ; 13: 1032001, 2022.
Article in English | MEDLINE | ID: mdl-36353460

ABSTRACT

Bensulfuron-methyl (BSM) is a widely used sulfonylurea herbicide in agriculture; however, large-scale BSM application causes severe environmental problems, and biodegradation is an important way to remove BSM residues. In this study, an endophytic bacterial strain, CD3, newly isolated from barnyard grass (Echinochloa crus-galli), effectively degraded BSM in mineral salt medium. Strain CD3 was identified as Proteus sp. based on its phenotypic features, physiological and biochemical characteristics, and 16S rRNA gene sequence. Suitable conditions for BSM degradation by this strain were 20-40°C, pH 6-8, and an initial concentration of 12.5-200 mg/L, with 10 g/L glucose as an additional carbon source. The endophyte degraded over 98% of BSM within 7 days under optimal conditions. Strain CD3 also effectively degraded other sulfonylurea herbicides, including nicosulfuron, halosulfuron-methyl, pyrazosulfuron, and ethoxysulfuron. Extracellular enzymes played a critical role in BSM degradation by strain CD3. Two degradation metabolites were detected and identified using liquid chromatography-mass spectrometry (LC-MS), and biochemical degradation pathways of BSM by this endophyte were proposed. Genomic analysis of strain CD3 revealed putative hydrolase and esterase genes involved in BSM degradation, suggesting that a novel BSM-degrading enzyme is present in this Proteus sp. These results suggest that strain CD3 may have potential for use in the bioremediation of BSM-contaminated environments.

19.
Biosens Bioelectron ; 217: 114721, 2022 Dec 01.
Article in English | MEDLINE | ID: mdl-36152394

ABSTRACT

Rapid and sensitive pathogen detection is important for disease prevention and control. Here, we report a label-free diagnostic platform that combines surface-enhanced Raman scattering (SERS) and machine learning for the rapid and accurate detection of thirteen respiratory virus species, including SARS-CoV-2, common human coronaviruses, influenza viruses, and others. Virus detection and measurement were performed using highly sensitive SiO2-coated silver nanorod array substrates, allowing detection and identification of characteristic viral SERS peaks. Using appropriate spectral processing procedures and machine learning algorithms (MLAs), including support vector machine (SVM), k-nearest neighbors, and random forest, virus species as well as strains and variants were differentiated and classified with an accuracy of >99%. Utilizing SVM-based regression, quantitative calibration curves were constructed to accurately estimate unknown virus concentrations in buffer and saliva. This study shows that a combination of SERS, MLAs, and regression can classify and quantify virus in saliva, which could aid medical diagnosis and therapeutic intervention.
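The sketch below shows SVM classification of spectra in the general spirit of the pipeline above, using simulated spectra with one synthetic characteristic peak per class; the preprocessing and model settings are illustrative assumptions, not the paper's pipeline.

```python
# SVM classification of simulated "SERS spectra" with per-class peaks.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_per_class, n_wavenumbers, n_classes = 60, 500, 13
X = rng.normal(size=(n_per_class * n_classes, n_wavenumbers))
y = np.repeat(np.arange(n_classes), n_per_class)
for k in range(n_classes):  # give each "virus" one characteristic peak
    X[y == k, 30 * k + 10] += 4.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=1)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10))
print("accuracy:", clf.fit(X_tr, y_tr).score(X_te, y_te))
```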


Subjects
Biosensing Techniques; COVID-19; COVID-19/diagnosis; Humans; Machine Learning; SARS-CoV-2; Silicon Dioxide; Silver/chemistry; Spectrum Analysis, Raman/methods
20.
Am J Clin Nutr ; 116(6): 1663-1671, 2022 12 19.
Article in English | MEDLINE | ID: mdl-36173384

ABSTRACT

BACKGROUND: Elevated brain choline is associated with better executive functions in preadolescents. Prospectively manipulating dietary choline in preadolescents through egg supplementation could improve executive functions via effects on brain cellular and neurotransmitter functions. OBJECTIVES: We tested the 9-month impact of egg supplementation on executive functions. We hypothesized that preadolescents who consumed meal or snack replacement products containing whole-egg powder would show the largest improvements in executive functions after 9 months compared with those consuming similar products with added milk powder or gelatin (placebo). METHODS: A randomized, parallel-group, double-blinded, placebo-controlled trial design was used. The executive functions of 122 preadolescents (58 females) aged 9-13 years were analyzed before and after the 9-month intervention. The primary outcomes were three NIH Toolbox Cognition Battery measures of executive function: mental flexibility, working memory, and selective attention and inhibitory control. Participants were randomized to consume food products containing 1) whole-egg powder, 2) milk powder, or 3) gelatin as a placebo, all matched on macronutrient content and used as replacements for commonly consumed foods (i.e., waffles, pancakes, macaroni and cheese, ice cream, and brownies). Hypothesis testing used mixed-effects models that included physical activity and sleep scores as covariates. RESULTS: A statistically significant group × time interaction for selective attention and inhibitory control was found for the milk group (P = 0.049): the placebo group did not change, whereas the milk group improved in selective attention and inhibitory control by a T-score of 5.8 (effect size d = 0.44 SD units). Other comparisons were not statistically significant. CONCLUSIONS: Consumption of foods with added milk powder as a replacement for snacks or meals for 9 months improved selective attention and inhibitory control in preadolescents, whereas replacement foods with added whole-egg powder did not affect 9-month changes in preadolescent executive functions. This trial was registered at clinicaltrials.gov as NCT03739424.
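For illustration, here is a minimal sketch of a group × time mixed-effects analysis on simulated scores; the study's covariates (physical activity and sleep) are omitted for brevity, and the simulated effect is seeded to mimic the reported milk-group change.

```python
# Mixed-effects model with a group x time interaction and random intercepts.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120
base = pd.DataFrame({
    "id": np.arange(n),
    "group": np.repeat(["egg", "milk", "placebo"], n // 3),
    "subj_effect": rng.normal(0, 5, n),  # per-participant random intercept
})
long = pd.concat([base.assign(time=t) for t in (0, 1)], ignore_index=True)
long["score"] = (50 + long["subj_effect"]
                 + 5.8 * ((long["group"] == "milk") & (long["time"] == 1))
                 + rng.normal(0, 5, len(long)))

model = smf.mixedlm("score ~ C(group) * time", long, groups=long["id"]).fit()
print(model.summary())  # the C(group)[T.milk]:time term is the interaction
```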


Subjects
Executive Function; Snacks; Female; Humans; Animals; Milk; Powders; Gelatin; Meals; Choline