ABSTRACT
The misuse and overtreatment of antibiotics in hospitalized patients with community-acquired pneumonia (CAP) can cause multi-drug resistance and worsen clinical outcomes. We aimed to analyze the trends and appropriateness of antibiotic changes in hospitalized patients with CAP and their impact on clinical outcomes. This retrospective study enrolled patients with CAP, aged > 18 years, admitted from January 2017 to December 2021 at Seoul National University Bundang Hospital, South Korea. We examined the pathogens identified, antibiotics prescribed, and the appropriateness of antibiotic changes as reviewed by infectious disease specialists. Antibiotic appropriateness was assessed based on adherence to the 2019 ATS/IDSA guidelines and the 2018 Korean national guidelines for CAP, targeting appropriate pathogens, proper route, dosage, and duration of therapy. Outcomes measured included time to clinical stability (TCS), length of hospital stay, duration of antibiotic treatment, and in-hospital mortality. The study included 436 patients with a mean age of 72.11 years, of whom 35.1% were male. The average duration of antibiotic treatment was 13.5 days. More than 55% of patients experienced at least one antibiotic change, and 21.7% had consecutive changes. Throughout their hospital stay, 273 patients (62.6%) received appropriate antibiotic treatment, while 163 patients (37.4%) received at least one inappropriate antibiotic prescription. Those who received at least one inappropriate prescription experienced longer antibiotic treatment durations and extended hospital stays, despite having similar TCS. In conclusion, inappropriate antibiotic prescribing in hospitalized patients with CAP is associated with prolonged antibiotic treatment and increased length of stay. Emphasizing the appropriate initial antibiotic selection may help mitigate these negative effects.
Subjects
Anti-Bacterial Agents, Community-Acquired Infections, Length of Stay, Pneumonia, Humans, Community-Acquired Infections/drug therapy, Male, Female, Anti-Bacterial Agents/therapeutic use, Aged, Retrospective Studies, Pneumonia/drug therapy, Middle Aged, Republic of Korea, Aged 80 and over, Hospital Mortality, Hospitalization
ABSTRACT
OBJECTIVE: To explain the relationship between cartilage erosion and medial patellar luxation (MPL) and to identify risk factors in dogs. METHODS: A retrospective review was conducted on 90 dogs (103 stifles) surgically treated for MPL between January 2006 and March 2024. Data collected included signalment, side of operated stifle, patellar luxation grade, symptom duration, and lameness score. Cartilage erosion was evaluated for extent and location on the patella and femoral trochlea. Statistical analyses were conducted to identify risk factors. RESULTS: The prevalence of cartilage erosion of the patella and femoral trochlea was 47.6% (49/103) and 54.4% (56/103), respectively, increasing with a higher grade of patellar luxation. Lesions were most prevalent in the distolateral patella and proximomedial trochlea, with generalized lesions more prevalent in grade IV. The extent of both lesions was significantly associated with age, patellar luxation grade, and symptom duration, while body weight significantly correlated only with the cartilage erosion of the patella. No significant correlation was observed with sex, side of operated stifle, or lameness score. CONCLUSIONS: Many patients with MPL exhibited cartilage erosion in the patellofemoral joint, likely due to biomechanical mechanisms. Surgery can be indicated for patients with MPL, as it may prevent cartilage erosion while improving patellofemoral alignment and gait. When selecting surgical candidates, it is important to consider risk factors, such as patellar luxation grade, body weight, age, and symptom duration. CLINICAL RELEVANCE: Early surgical treatment is recommended, especially for dogs with higher body weight and higher grade of MPL, to prevent cartilage erosion and secondary osteoarthritis.
ABSTRACT
Purpose: Older patients have a higher risk of aspiration pneumonia and mortality when hospitalized. We aimed to assess the effectiveness of an aspiration prevention quality improvement (QI) program that uses the Gugging Swallowing Screen (GUSS) in older patients. Patients and Methods: This retrospective cohort study was conducted in an acute medical care unit of a tertiary hospital in South Korea. The study used one-to-one propensity matching and included 96 patients who received the QI program and 96 who did not. All patients were aged 65 years or older and had risk factors for aspiration, including neurological and non-neurological disorders, neuromuscular disorders, impaired airway defenses, and dysphagia due to esophageal or gastrointestinal disorders. The primary outcomes included the duration of the fasting period during hospitalization, change in nutritional status from before admission to discharge, in-hospital mortality, and readmission due to pneumonia within 90 days. Results: The fasting period, changes in weight and albumin levels at discharge, and length of stay did not differ significantly between the GUSS and non-GUSS groups. However, the risk of readmission within 90 days was significantly lower in patients who underwent the GUSS than in those who did not (hazard ratio, 0.085; 95% confidence interval, 0.025-0.290; p = 0.001). Conclusion: The GUSS-based aspiration prevention program effectively prevented readmission due to pneumonia within 90 days in older patients with acute illnesses. This implies that adopting efficient aspiration prevention methods in older patients with acute illnesses could enhance patient outcomes and potentially reduce the healthcare costs linked to readmissions.
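To make this type of analysis concrete, here is a minimal, hypothetical sketch of one-to-one propensity-score matching followed by a Cox model for 90-day readmission, written in Python. It is illustrative only: the file, covariates, and column names (e.g. guss, readmit_90d) are placeholders rather than the study's variables, and no caliper or balance diagnostics are shown.

```python
# Illustrative sketch only: 1:1 propensity-score matching and a Cox model for
# 90-day readmission. All file and column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

df = pd.read_csv("patients.csv")                       # hypothetical dataset
covariates = ["age", "sex", "neuro_disorder", "dysphagia"]

# 1) Propensity score: probability of receiving the GUSS-based QI program
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["guss"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2) Nearest-neighbour 1:1 matching on the propensity score (no caliper)
treated = df[df["guss"] == 1]
control = df[df["guss"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3) Cox proportional-hazards model for readmission within 90 days
cph = CoxPHFitter()
cph.fit(matched[["days_to_readmit", "readmit_90d", "guss"]],
        duration_col="days_to_readmit", event_col="readmit_90d")
cph.print_summary()   # hazard ratio and 95% CI for the GUSS group
```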
Subjects
Deglutition Disorders, Patient Readmission, Aspiration Pneumonia, Quality Improvement, Humans, Male, Female, Aged, Retrospective Studies, Aspiration Pneumonia/prevention & control, Republic of Korea, Patient Readmission/statistics & numerical data, Aged 80 and over, Deglutition Disorders/prevention & control, Risk Factors, Hospital Mortality, Deglutition, Hospitalization, Nutritional Status, Length of Stay, Propensity Score, Fasting
ABSTRACT
Obesity often leads to inadequate angiogenesis in expanding adipose tissue, resulting in inflammation and insulin resistance. We explored the role of placental growth factor (PlGF) in metabolic syndrome (MS) using mouse models of type 2 diabetes, high-fat diet feeding, and aging. Reduced serum PlGF levels were associated with decreased insulin sensitivity and the development of MS features. PlGF was localized within endothelial cells and pericytes of adipose tissue. In vitro, low PlGF levels under hypoxic conditions worsened oxidative stress and apoptosis and reduced autophagy. This was associated with reduced expression of vascular endothelial growth factor (VEGF)-A and VEGF-R1/-R2, influenced by a decrease in the PlGF/pAMPK/PI3K-pAkt/PLCγ1-iCa++/eNOS axis and an increase in the PTEN/GSK3β axis. PlGF-knockout mice exhibited MS traits through alterations in the same signaling pathways, and these changes were mitigated by recombinant PlGF and metformin, which enhanced angiogenesis and lipid metabolism, underscoring PlGF's role in age-related MS and its potential as a therapeutic target.
ABSTRACT
Background: Transitional medication safety is crucial, as miscommunication about medication changes can lead to significant risks. Unclear or incomplete documentation during care transitions can result in outdated or incorrect medication lists at discharge, potentially causing medication errors, adverse drug events, and inadequate patient education. These issues are exacerbated by extended hospital stays and multiple care events, which make accurate medication recall challenging at discharge. Objective: We aimed to investigate how real-time documentation of in-hospital medication changes prevents undocumented medication changes at discharge and improves physician-pharmacist communication. Methods: We conducted a retrospective cohort study in a tertiary hospital. Two pharmacists reviewed the medical records of patients admitted to the acute medical unit from April to June 2020. In-hospital medication discrepancies were identified by comparing preadmission and inpatient medication lists, and the records were checked to verify whether the physician's intent behind each medication change was clarified by documentation. Patients were classified into fully documented (FD) and partially documented (PD) groups according to whether the documentation rate of medication changes was 100% or below 100%, respectively. Any undocumented medication change at discharge was considered a "documentation error at discharge". A survey of pharmacists was conducted to assess the impact of appropriate documentation on their work. Results: After reviewing 400 medication records, patients were categorized into FD (61.3%) and PD (38.8%) groups. Documentation errors at discharge were significantly more frequent in the PD group than in the FD group. Factors associated with documentation errors at discharge included belonging to the PD group, discharge from a non-hospitalist-managed ward, and having three or more intentional discrepancies. Pharmacists showed favorable attitudes towards physicians' documentation. Conclusion: Appropriate documentation of in-hospital medication changes, facilitated by free-text communication, significantly decreased documentation errors at discharge. This analysis underlines the importance of communication between pharmacists and hospitalists in improving patient safety during transitions of care.
During transitions of care, communication failures among healthcare professionals can lead to medication errors, so effective information sharing is essential, especially when intentional changes to prescription orders are made. Documenting medication changes facilitates real-time communication, potentially improving medication reconciliation and reducing discrepancies, yet inadequate documentation of medication changes remains common in clinical practice. This retrospective cohort study underlines the importance of real-time documentation of in-hospital medication changes. Documentation errors at discharge were significantly reduced in the fully documented group, where real-time documentation of medication changes was more prevalent. Pharmacists showed favorable attitudes toward physicians' real-time documentation of medication changes because it offered valuable insight into the physician's intent, improved communication, and saved pharmacists time. This study concludes that physicians' documentation of medication changes may reduce documentation errors at discharge, and that proper documentation could therefore enhance patient safety through effective communication.
ABSTRACT
OBJECTIVES: To determine the risk factors for mortality in Korean patients with rheumatoid arthritis (RA)-associated interstitial lung disease (ILD) in comparison with patients with RA but without ILD (RA-nonILD). METHODS: Data were extracted from a single-centre prospective cohort of RA patients with a chest computed tomography scan at an academic referral hospital in Korea. Patients with RA-ILD enrolled between May 2017 and August 2022 were selected, and those without ILD were selected as comparators. The mortality rate was calculated, and the cause of each death was investigated. We used Cox proportional hazards regression with Firth's penalised likelihood method to identify the risk factors for mortality in patients with RA-ILD. RESULTS: A total of 615 RA patients were included: 200 with ILD and 415 without ILD. In the RA-ILD group, there were 15 deaths over 540.1 person-years (PYs), yielding a mortality rate of 2.78/100 PYs. No deaths were reported in the RA-nonILD group during 1669.9 PYs. The primary causes of death were infection (nine cases) and lung cancer (five cases), with only one death attributed to ILD aggravation. High RA activity (adjusted HR 1.87, 95% CI 1.16-3.10), baseline diffusing capacity for carbon monoxide (DLCO) < 60% (adjusted HR 4.88, 95% CI 1.11-45.94), and a usual interstitial pneumonia (UIP) pattern (adjusted HR 5.13, 95% CI 1.00-57.36) were identified as risk factors for mortality in RA-ILD patients. CONCLUSION: Patients with RA-ILD have an elevated risk of mortality compared with those without ILD. Infection-related deaths are the main cause of mortality in this population. High RA activity, low DLCO, and the UIP pattern are significantly associated with mortality in patients with RA-ILD.
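The mortality model can be sketched as follows. Note that the study used Cox regression with Firth's penalised likelihood (implemented, for example, in R's coxphf package); the Python lifelines library has no Firth correction, so the small L2 penalizer below is only a stand-in, and all column names are hypothetical.

```python
# Hedged sketch of a Cox model for mortality in RA-ILD. The paper's Firth
# penalisation is NOT available in lifelines; an L2 penalizer is used instead.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ra_ild.csv")                 # hypothetical dataset
cols = ["followup_years", "death",             # time-to-event and event flag
        "high_ra_activity", "dlco_lt_60", "uip_pattern"]

cph = CoxPHFitter(penalizer=0.1)               # ridge penalty, not Firth
cph.fit(df[cols], duration_col="followup_years", event_col="death")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```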
Subjects
Rheumatoid Arthritis, Interstitial Lung Diseases, Humans, Interstitial Lung Diseases/mortality, Rheumatoid Arthritis/mortality, Rheumatoid Arthritis/complications, Male, Female, Middle Aged, Risk Factors, Prospective Studies, Aged, Republic of Korea/epidemiology, Cohort Studies, Adult
ABSTRACT
BACKGROUND: Naturalistic Developmental Behavioral Interventions (NDBIs) for young children with autism spectrum disorder commonly involve caregiver-mediated approaches. However, to date, there is limited research on how caregivers' skills change and, in turn, affect child outcomes. METHODS: We evaluated the NDBI strategy use of 191 caregivers prior to participation in NDBIs (or control groups) across multiple randomized controlled trials, using the Measure of NDBI Strategy Implementation, Caregiver Change (MONSI-CC). Clustering analyses were used to examine caregiver variability in NDBI strategy use at intervention entry. Generalized Linear Mixed Models were used to examine changes in caregiver strategy use over the course of the intervention and their impact on changes in children's social communication. RESULTS: Using clustering analysis, we found that caregivers' baseline skills fit four profiles: limited, emerging, variable, and consistent/high, with few demographic factors distinguishing these groups. Caregivers starting with limited or emerging skills improved in their strategy use with intervention. Caregivers starting with more skills (consistent/high or variable) maintained higher skills over the intervention. Children whose caregivers were in these groups and received the target NDBIs improved in their social communication skills. CONCLUSIONS: The results suggest that caregiver skills improve through participation in NDBIs and may directly contribute to children's outcomes, although more research on mediating factors is needed. Individualized approaches for caregivers and children who start with differing skill profiles at intervention entry may be warranted.
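As a hedged illustration of this analytic flow (not the authors' code), the sketch below clusters baseline caregiver strategy scores into four profiles and then fits a mixed model of change over time. The paper used Generalized Linear Mixed Models; a linear mixed model with random caregiver intercepts is shown as a simplification, and all variable names are hypothetical.

```python
# Illustrative sketch: k-means profiles at baseline, then a mixed model of
# change in caregiver strategy use. Variable names are hypothetical.
import pandas as pd
from sklearn.cluster import KMeans
import statsmodels.formula.api as smf

df = pd.read_csv("monsi_cc_long.csv")    # one row per caregiver per time point
baseline = df[df["time"] == 0].copy()

# 1) Four baseline profiles (e.g. limited / emerging / variable / consistent-high)
km = KMeans(n_clusters=4, n_init=10, random_state=0)
baseline["profile"] = km.fit_predict(baseline[["monsi_cc_score"]])
df = df.merge(baseline[["caregiver_id", "profile"]], on="caregiver_id")

# 2) Change in strategy use over the intervention, by baseline profile
model = smf.mixedlm("monsi_cc_score ~ time * C(profile)", df,
                    groups=df["caregiver_id"]).fit()
print(model.summary())
```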
ABSTRACT
BACKGROUND: To evaluate the effectiveness of Korean Red Ginseng (KRG) in managing fatigue in Korean patients with rheumatic diseases. METHODS: Patients were randomly assigned to KRG (2 g/day, n = 60) or placebo (n = 60) groups for a 12-week blinded phase, followed by open-label KRG from weeks 12 to 24 (placebo-KRG and continuous-KRG groups). The primary outcome was the improvement rate in fatigue, defined as an increase in Functional Assessment of Chronic Illness Therapy (FACIT)-Fatigue score at 12 weeks. Secondary outcomes included changes in FACIT-Fatigue and fatigue visual analog scale (VAS) scores between weeks 0 and 12 and changes in both indices at 24 weeks. RESULTS: The study enrolled 120 patients (Sjogren syndrome [n = 53], rheumatoid arthritis [n = 43], or both diseases [n = 24]). The mean age was 50.9 ± 11.6 years, and 97.5% were female. Baseline characteristics were similar between the two groups. The improvement rate in FACIT-Fatigue after 12 weeks was higher in the KRG group than in the placebo group, but the difference was not statistically significant (38.3% vs. 26.7%, p = 0.242). Improvement in fatigue was observed in both groups, with increases in FACIT-Fatigue scores (4.6 vs. 4.0) and reductions in fatigue VAS scores (-16.0 vs. -12.2) at 12 weeks. The most frequently reported adverse events during KRG use were pruritus and urticaria, with no significant difference between the two groups. CONCLUSION: Both the KRG and placebo groups showed significant reductions in fatigue. KRG treatment for 24 weeks did not reduce fatigue symptoms more than placebo in patients with rheumatic diseases.
Subjects
Fatigue, Panax, Humans, Female, Male, Fatigue/drug therapy, Fatigue/etiology, Fatigue/diagnosis, Fatigue/physiopathology, Middle Aged, Double-Blind Method, Adult, Treatment Outcome, Republic of Korea, Time Factors, Phytotherapy, Plant Extracts/therapeutic use, Plant Extracts/adverse effects, Rheumatoid Arthritis/drug therapy, Rheumatoid Arthritis/complications, Rheumatoid Arthritis/physiopathology, Sjogren's Syndrome/complications, Sjogren's Syndrome/drug therapy, Sjogren's Syndrome/diagnosis, Sjogren's Syndrome/physiopathology, Rheumatic Diseases/drug therapy, Rheumatic Diseases/complications, Aged
ABSTRACT
One of the most challenging aspects of modeling the behaviour of reinforced concrete (RC) walls is combining realistic material models that can capture the observed behaviour of the physical system. Experiments with realistic loading rates and pressures reveal that steel and concrete display complicated nonlinear behaviour that is difficult to represent in a single constitutive model. To investigate the response of reinforced concrete structures subjected to dynamic loads, this study evaluates several material models and assesses their advantages and disadvantages for 2D and 3D RC walls using the LS-DYNA program. The models comprised the KCC and CDP models, which combine plasticity with distinct tensile/compressive damage formulations, and the Winfrith model, which combines plasticity with a smeared-crack formulation. The models' performance was then assessed against experimental data from reinforced concrete structures to validate the accuracy of the predicted overall behaviour. The Winfrith model gave satisfactory results in predicting the behaviour of the 2D and 3D walls, including maximum strength, stiffness deterioration, and energy dissipation. It predicted the maximum strength of the 2D wall with an error of 9.24% and that of the 3D wall with errors of 3.28% in the X direction and 5.02% in the Y direction. The Winfrith model also showed higher precision in predicting dissipated energy for the 3D wall in both the X and Y directions, with errors of 6.84% and 6.62%, respectively. Additional parametric analyses were carried out to investigate structural behaviour, taking into account variables such as concrete strength, strain rate, mesh size, and element type.
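The reported prediction errors are relative errors between simulated and experimental peak values; a minimal illustration of that calculation is given below (the numbers are hypothetical and not taken from the paper).

```python
# Relative error between a simulated and a measured peak value, in percent.
def relative_error(predicted: float, measured: float) -> float:
    return abs(predicted - measured) / abs(measured) * 100.0

# Hypothetical example: simulated peak strength 1093 kN vs. measured 1000 kN
print(round(relative_error(predicted=1093.0, measured=1000.0), 2))  # 9.3
```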
ABSTRACT
BACKGROUND/AIMS: This cross-sectional study aimed to investigate disparities in biologics treatment among rheumatoid arthritis (RA) patients based on socioeconomic status (SES). METHODS: Data from the KOrean Observational Study Network for Arthritis (KORONA) database were analyzed to assess various factors associated with SES, health behaviors, and biologics use. Logistic regression and structural equation modeling (SEM) were used for data analysis. RESULTS: Among the 5,077 RA patients included, 393 (7.7%) were biologics users. Within the entire cohort, 31.8% of participants were in the low-income and low-education groups, and 39.3% were in the high-income and high-education groups. Although patients with low income or low education had higher disease activity at diagnosis, more comorbidities, higher medication compliance, more check-ups, and more hospital admissions than their counterparts, the odds of low-income patients receiving biologics were 24% lower (adjusted odds ratio = 0.76, 95% confidence interval: 0.60-0.96, p = 0.021) after adjustment for demographics and comorbidities. SEM and pathway analyses confirmed the negative impact of low SES on biologics use. CONCLUSION: The findings suggest that SES plays a significant role in biologics use among RA patients, indicating potential healthcare inefficiencies for low-SES patients. Moreover, adverse healthcare habits negatively affect biologics use in RA patients. The study highlights the importance of considering socioeconomic factors when discussing biologics use and of promoting equitable access to biologics for optimal RA management.
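A minimal sketch of the adjusted-odds-ratio part of this analysis is shown below: a multivariable logistic regression of biologics use on income, adjusted for demographics and comorbidities. The dataset and variable names are hypothetical placeholders, and the paper's structural equation model is not reproduced.

```python
# Illustrative sketch: adjusted odds ratios for biologics use by income group.
# The file and column names are hypothetical, not the KORONA variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ra_cohort.csv")
res = smf.logit("biologic_use ~ low_income + age + sex + comorbidity_count",
                data=df).fit()

summary = pd.DataFrame({
    "OR": np.exp(res.params),                 # adjusted odds ratios
    "CI_low": np.exp(res.conf_int()[0]),      # lower 95% bound
    "CI_high": np.exp(res.conf_int()[1]),     # upper 95% bound
})
print(summary)
```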
Subjects
Antirheumatic Agents, Rheumatoid Arthritis, Biological Products, Healthcare Disparities, Humans, Rheumatoid Arthritis/drug therapy, Rheumatoid Arthritis/diagnosis, Male, Female, Middle Aged, Biological Products/therapeutic use, Cross-Sectional Studies, Republic of Korea/epidemiology, Aged, Antirheumatic Agents/therapeutic use, Adult, Factual Databases, Social Class
ABSTRACT
OBJECTIVE: To assess the effectiveness of tofacitinib versus tumour necrosis factor inhibitors (TNFi) in Korean patients with rheumatoid arthritis (RA). METHODS: The study used data from a single academic referral hospital's registries of biologic disease-modifying anti-rheumatic drugs (bDMARDs) and tofacitinib and examined remission rates based on the disease activity score (DAS)28-erythrocyte sedimentation rate (ESR) after 12 months. Multivariable logistic regression analysis was used to estimate the odds ratio (OR) for achieving remission with tofacitinib compared with TNFi, adjusting for potential confounders. RESULTS: This analysis included 665 patients (200 on tofacitinib and 455 on TNFi) who were followed up for at least 12 months. Of these, 96 patients in the tofacitinib group (48.0%) and 409 patients in the TNFi group (89.9%) were treatment-naïve to bDMARDs. Intention-to-treat analysis revealed no significant difference in remission rates between the two groups (18.0% vs 19.6%, p = 0.640). Multivariable analysis demonstrated comparable remission rates with tofacitinib and TNFi (OR 1.204, 95% confidence interval [CI] 0.720-2.013). In the subpopulation naïve to both Janus kinase inhibitors (JAKi) and bDMARDs, tofacitinib showed better remission rates than TNFi (OR 1.867, 95% CI 1.033-3.377). Tofacitinib was associated with more adverse events (AEs) than TNFi but with similar rates of serious AEs (SAEs). CONCLUSION: In a real-world setting, there was no significant difference in remission rates at 12 months between the tofacitinib and TNFi groups. In terms of safety, tofacitinib exhibited a higher incidence of AEs than TNFi, while the occurrence of SAEs was comparable between the groups. CLINICAL TRIAL REGISTRATION: ClinicalTrials.gov, NCT02602704.
ABSTRACT
BACKGROUND: Measurement of sodium intake in hospitalized patients is critical for their care. In this study, artificial intelligence (AI)-based imaging was used to determine sodium intake in these patients. OBJECTIVE: The applicability of a diet management system was evaluated using AI-based imaging to assess the sodium content of diets prescribed for hospitalized patients. METHODS: Using previously compiled information on the nutrients and quantity of each served food item, sodium consumption was estimated from photographs taken before and after each meal. We used a hybrid model that first leveraged the You Only Look Once, version 4 (YOLOv4) architecture to detect food and dish areas in the images. Following this initial detection, 2 distinct approaches were adopted for further analysis: a custom ResNet-101 model and a hyperspectral imaging-based technique, which focused on accurate classification and on estimation of the food quantity and sodium amount, respectively. The 24-hour urine sodium (UNa) value was measured as a reference for evaluating sodium intake. RESULTS: Complete data from 25 of the 54 enrolled participants were analyzed. The median sodium intake calculated by the AI algorithm (AI-Na) was 2022.7 mg per person per day (adjusted for administered fluids). A significant correlation was observed between AI-Na and 24-hour UNa, although there was a notable disparity between the two. A regression analysis considering patient characteristics (eg, gender, age, renal function, diuretic use, and administered fluids) yielded a formula accounting for the relationship between AI-Na and 24-hour UNa. It was therefore concluded that AI-Na holds clinical significance for estimating salt intake in hospitalized patients from images, without the need for 24-hour UNa measurements. The degree of correlation between AI-Na and 24-hour UNa varied depending on the use of diuretics. CONCLUSIONS: This study highlights the potential of AI-based imaging for determining sodium intake in hospitalized patients.
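The detect-then-classify pattern described above can be sketched generically as follows. This is not the authors' pipeline: the detection step (YOLOv4 in the paper) is abstracted as precomputed bounding boxes, the classifier is an off-the-shelf torchvision ResNet-101 rather than the paper's custom-trained model, and the sodium lookup table and portion estimate are hypothetical.

```python
# Generic detect-then-classify sketch. Detection is assumed to have produced
# bounding boxes already; the per-serving sodium table is hypothetical.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet101_Weights.DEFAULT
resnet = models.resnet101(weights=weights).eval()     # generic classifier
preprocess = weights.transforms()

SODIUM_MG_PER_SERVING = {"soup": 900.0, "rice": 5.0}  # hypothetical table

def classify_crop(image: Image.Image, box: tuple) -> int:
    """Classify one detected food region (box = left, top, right, bottom)."""
    crop = image.crop(box)
    with torch.no_grad():
        logits = resnet(preprocess(crop).unsqueeze(0))
    return int(logits.argmax(dim=1))                   # predicted class index

def estimate_sodium(food_name: str, eaten_fraction: float) -> float:
    """Sodium (mg) = per-serving sodium x fraction eaten (before vs. after photo)."""
    return SODIUM_MG_PER_SERVING.get(food_name, 0.0) * eaten_fraction
```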
ABSTRACT
[This corrects the article on p. 151 in vol. 30, PMID: 37476674.].
ABSTRACT
Eruca sativa is an edible plant commonly used in Italian cuisine. A 70% ethanol extract of E. sativa (ES) was fractionated into five fractions using n-hexane (EHex), chloroform (ECHCl3), ethyl acetate (EEA), n-butyl alcohol (EBuOH), and water (EDW). The ethyl acetate fraction (EEA) had the highest antioxidant activity, which correlated with its total polyphenol and flavonoid content. ES and EEA acted as PPAR-α ligands in a PPAR-α competitive binding assay. EEA significantly increased cornified envelope formation, a marker of keratinocyte terminal differentiation, in HaCaT cells. Furthermore, it significantly reduced nitric oxide and pro-inflammatory cytokines (IL-6 and TNF-α) in lipopolysaccharide-stimulated RAW 264.7 cells. The main flavonol forms detected in high amounts in EEA were the mono- and di-glycosides of each aglycone, with mono-glycosides predominating; the most abundant flavonol mono-glycoside was kaempferol 3-glucoside (7.4%), followed by quercetin 3-glucoside (2.3%) and isorhamnetin 3-glucoside (1.4%). Molecular docking simulations indicated that the flavonol mono-glycosides are potent PPAR-α ligands, and they also inhibited nitric oxide production. These results suggest that the flavonol composition of E. sativa is suitable for improving skin barrier function and reducing inflammation in skin disorders such as atopic dermatitis.
ABSTRACT
Recently, the azepino[4,3-b]indole-1-one derivative 1 showed in vitro nanomolar inhibition of butyrylcholinesterase (BChE), the ChE isoform that plays a role in the progression and pathophysiology of Alzheimer's disease (AD), and protected against N-methyl-D-aspartate-induced neuronal toxicity. Three 9-R-substituted (R = F, Br, OMe) congeners were investigated. The 9-F derivative (2a) was a more potent BChE inhibitor (half-maximal inhibitory concentration = 21 nM) than 2b (9-Br) and 2c (9-OMe), achieving a residence time (38 s, assessed by surface plasmon resonance) threefold longer than that of 1. To advance the in vivo pharmacological characterization of 2a, the 18F-labeled congener [18F]2a was synthesized by aromatic 18F-fluorination, and its whole-body distribution in healthy mice, including brain penetration, was evaluated by positron emission tomography imaging. [18F]2a exhibited rapid and high brain uptake (3.35 ± 0.26 %ID/g at 0.95 ± 0.15 min after injection), followed by rapid clearance (t1/2 = 6.50 ± 0.93 min), indicating good blood-brain barrier crossing. After transient liver accumulation of [18F]2a, intestinal and urinary excretion was quantified. Finally, ex vivo pharmacological experiments in mice showed that unlabeled 2a affects neurotransmitter neurochemistry, which might be favorable for reversing cognitive impairment in mild-to-moderate AD-related dementias.
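As a small worked example of what the reported clearance half-life implies under simple first-order kinetics (an assumption for illustration, not the paper's kinetic model), the fraction of peak brain activity remaining after a given time can be computed as follows.

```python
# First-order clearance illustration using the reported t1/2 of 6.50 min.
import math

t_half_min = 6.50
k = math.log(2) / t_half_min                 # elimination rate constant, 1/min

def fraction_remaining(t_min: float) -> float:
    """Fraction of peak brain activity left after t_min minutes."""
    return math.exp(-k * t_min)

print(round(fraction_remaining(13.0), 3))    # two half-lives -> 0.25
```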
Subjects
Alzheimer Disease, Animals, Mice, Alzheimer Disease/drug therapy, Butyrylcholinesterase, Structure-Activity Relationship, Biological Transport, Indoles
ABSTRACT
BACKGROUND: The hospitalist system has been introduced to improve the quality and safety of inpatient care. As its effectiveness has been confirmed in previous studies, the hospitalist system is spreading to various fields. However, few studies have investigated the feasibility and value of hospitalist-led care of patients with cancer in terms of quality and safety measures. This study aimed to evaluate the efficacy of the Hospitalist-Oncologist co-ManagemEnt (HOME) system. METHODS: Between January 1, 2019, and January 31, 2021, we analyzed 591 admissions before and 1068 admissions after the introduction of the HOME system on January 1, 2020. We retrospectively compared the length of stay and the types and frequencies of safety events between the conventional system and the HOME system. We also investigated rapid response system activation, cardiopulmonary resuscitation, unplanned intensive care unit transfer, all-cause in-hospital mortality, and 30-day re-admission or emergency department visits. RESULTS: The average length of stay (15.9 days vs. 12.9 days, P < 0.001), frequency of safety events (5.6% vs. 2.8%, P = 0.006), and rapid response system activation (7.3% vs. 2.2%, P < 0.001) were significantly reduced after introduction of the HOME system. However, there were no statistically significant differences in the frequencies of cardiopulmonary resuscitation, intensive care unit transfer, all-cause in-hospital mortality, or 30-day unplanned re-admission or emergency department visits. CONCLUSIONS: The study suggests that the HOME system provides higher-quality care and a safer environment than conventional oncologist-led team-based care, and that the efficiency of the medical delivery system can be increased by reducing the hospitalization period without an increase in 30-day unplanned re-admissions.
Subjects
Hospitalists, Neoplasms, Humans, Length of Stay, Patient Readmission, Retrospective Studies, Hospitalization, Neoplasms/therapy
ABSTRACT
Introduction: Alicyclobacillus has been isolated from extreme environments such as hot springs and volcanoes, as well as from pasteurized acidic beverages, because it can tolerate extreme temperatures and acidity. In our previous study, Alicyclobacillus was isolated during the enrichment of methane-oxidizing bacteria from Yellowstone Hot Spring samples. Methods: This study focuses on the physiological characterization and genomic exploration of two new Alicyclobacillus isolates, AL01A and AL05G, to identify their potential relationships with a thermoacidophilic methanotroph (Methylacidiphilum) isolated from the same hot spring sediments. Results and discussion: Both Alicyclobacillus isolates showed optimal growth at pH 3.5 and 55°C and contain ω-alicyclic fatty acids as a major membrane lipid (ca. 60%). Genomic analysis of these strains revealed genes and pathways of intermediary carbon metabolism that the methanotroph genome lacks, such as serC (phosphoserine aminotransferase), comA (phosphosulfolactate synthase), and DAK (glycerone kinase). Both Alicyclobacillus strains also contain transporter systems for extracellular sulfate (ABC transporters), suggesting that they could play an important role in sulfur metabolism in this extreme environment. Genomic analysis of vitamin metabolism revealed that Alicyclobacillus and Methylacidiphilum are able to complement each other's nutritional deficiencies, resulting in a mutually beneficial relationship, especially in vitamin B1 (thiamine), B3 (niacin), and B7 (biotin) metabolism. These findings provide insights into the role of Alicyclobacillus isolates in geothermal environments and their unique metabolic adaptations to these environments.
ABSTRACT
BACKGROUND: This study aimed to determine whether serum uric acid (SUA) levels are associated with various indices of liver damage in the adult Korean population. METHODS: We used data from the Seventh Korean National Health and Nutrition Examination Survey. Our study population comprised 6,007 men and 8,488 women. SUA levels were divided into four groups (≤ 5.3, 5.3-6.0, 6.0-7.0, and > 7.0 mg/dL for men; ≤ 4.0, 4.0-4.8, 4.8-6.0, and > 6.0 mg/dL for women). Elevated liver enzyme levels were defined as > 35 IU/L (men) and > 31 IU/L (women) for aspartate aminotransferase (AST), and > 45 IU/L (men) and > 34 IU/L (women) for alanine aminotransferase (ALT). The hepatic steatosis index and the fibrosis (FIB)-4 index were used to determine nonalcoholic fatty liver disease (NAFLD) and liver FIB, respectively. Adjusted odds ratios (aORs) for elevated liver enzymes, NAFLD, and liver FIB according to SUA level were calculated by logistic regression analysis. RESULTS: Among women, the 4.8-6.0 and > 6.0 mg/dL SUA groups showed higher odds of elevated AST (aOR, 1.78 and 2.03; 95% confidence interval [CI], 1.37-2.32 and 1.40-2.96, respectively; P < 0.001), and the 4.0-4.8, 4.8-6.0, and > 6.0 mg/dL SUA groups showed higher odds of elevated ALT (aOR, 1.35, 2.26, and 2.37; 95% CI, 1.02-1.79, 1.72-2.97, and 1.60-3.50, respectively; P < 0.001) compared with the lowest SUA group. Among women with normal ALT, the > 6.0 mg/dL SUA group showed higher odds of NAFLD (aOR, 1.52; 95% CI, 1.06-2.19). Among men and women with NAFLD, hyperuricemia was associated with higher odds of liver FIB (aOR, 2.25 and 1.89; 95% CI, 1.21-4.19 and 1.09-3.27, respectively) compared with the lowest SUA group. CONCLUSION: High SUA levels may be associated with elevated liver enzymes and NAFLD, mainly in women. Even in women with normal ALT levels, SUA levels may predict NAFLD status. Hyperuricemia may predict advanced liver FIB in both men and women with NAFLD. Further studies investigating the causal effects of SUA on liver damage are required.
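The abstract names the hepatic steatosis index (HSI) and the FIB-4 index without giving formulas; the commonly used definitions are sketched below for convenience and should be checked against the paper's methods before reuse.

```python
# Commonly used definitions of FIB-4 and the hepatic steatosis index (HSI).
# These are standard published formulas, not code from the study.
import math

def fib4(age_years: float, ast_iu_l: float, alt_iu_l: float,
         platelets_10e9_l: float) -> float:
    """FIB-4 = (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_iu_l) / (platelets_10e9_l * math.sqrt(alt_iu_l))

def hepatic_steatosis_index(alt_iu_l: float, ast_iu_l: float, bmi: float,
                            diabetic: bool, female: bool) -> float:
    """HSI = 8 x (ALT/AST) + BMI, +2 if diabetic, +2 if female."""
    return 8.0 * (alt_iu_l / ast_iu_l) + bmi + 2.0 * diabetic + 2.0 * female

# Example: 55-year-old woman, AST 30, ALT 25, platelets 220 x 10^9/L, BMI 24
print(round(fib4(55, 30, 25, 220), 2))                                # 1.5
print(round(hepatic_steatosis_index(25, 30, 24.0, False, True), 1))   # 32.7
```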
Subjects
Hyperuricemia, Non-alcoholic Fatty Liver Disease, Adult, Male, Humans, Female, Non-alcoholic Fatty Liver Disease/diagnosis, Uric Acid, Cross-Sectional Studies, Hyperuricemia/diagnosis, Hyperuricemia/epidemiology, Liver Cirrhosis/diagnosis, Liver Cirrhosis/etiology, Republic of Korea/epidemiology
ABSTRACT
PURPOSE: This study aimed to evaluate the use of active surgical co-management (SCM) by medical hospitalists for urology inpatient care. MATERIALS AND METHODS: In March 2019, a hospitalist-SCM program was implemented at a tertiary-care medical center, and a retrospective cohort study was conducted among co-managed urology inpatients. We assessed the clinical outcomes of urology inpatients who received SCM and compared passive SCM (co-management of patients by hospitalists only on request; March 2019 to June 2020) with active SCM (co-management of patients based on active screening by hospitalists; July 2020 to October 2021). We also evaluated the perceptions of patients who received SCM regarding inpatient care quality and safety, as well as their subjective satisfaction with inpatient care, at discharge or on transfer to another ward. RESULTS: We assessed 525 patients. Compared with the passive SCM group (n=205), patients in the active SCM group (n=320) required co-management for a significantly shorter duration (p=0.012), tended to have a shorter length of stay in the urology ward (p=0.062) and fewer unplanned readmissions within 30 days of discharge (p=0.095), and triggered significantly fewer rapid response team activations (p=0.002). No differences were found in the proportion of patients transferred to the intensive care unit, in-hospital mortality rates, or inpatient care questionnaire scores. CONCLUSION: Active surveillance and co-management of urology inpatients by medical hospitalists can improve the quality and efficacy of inpatient care without compromising subjective inpatient satisfaction.