ABSTRACT
Background: Low serum prealbumin levels have been identified as a predictor of infectious complications in critically ill patients. However, the association in patients with community-acquired bacterial meningitis (CABM) remains unclear. The aim of this study was to investigate the relationship between prealbumin and poor outcome of CABM through a retrospective cohort study. Methods: A total of 77 patients with CABM were enrolled. They were divided into a good outcome group (GOS 5) and a poor outcome group (GOS 1-4). Serum prealbumin and other clinical variables were measured within 24 h of admission. Results: Among the included patients, 38 (65.52%) had a poor outcome (GOS score between 1 and 4). The mean age of the overall cohort was 45.3 ± 15.9 years, and 58.6% of patients were male. The mean prealbumin level in the poor outcome group was 115.4 ± 49.4 mg/L, while the mean level in the good outcome group was 199.1 ± 49.3 mg/L (p < 0.001). Individuals with a plasma prealbumin level ≤180 mg/L had higher odds of a poor outcome than those with a normal plasma prealbumin level [OR = 4.32 (95% CI: 1.02-18.24), p < 0.05]. Conclusion: A reduced plasma prealbumin level is independently associated with poor outcome of CABM. Plasma prealbumin level might help to identify patients at high risk of a poor outcome.
ABSTRACT
BACKGROUND: In the context of an outbreak of HIV among people who inject drugs in Glasgow, Scotland, identified in 2015, our objectives were to: (1) develop epidemiological methods to estimate HIV incidence using data linkage, and (2) examine temporal changes in HIV incidence to inform public health responses. METHODS: This was a retrospective cohort study involving data linkage of laboratory HIV testing data to identify individuals with a history of drug use. Person-years (PY) and Poisson regression were used to estimate incidence by time period (pre-outbreak: 2000-2010 and 2011-2013; early outbreak: 2014-2016; ongoing outbreak: 2017-2019). RESULTS: Among 13 484 individuals tested for HIV, 144 incident HIV infections were observed from 2000 to 2019. Incidence rates increased from the pre-outbreak periods (1.00/1000 PY (95% confidence interval, CI: 0.60-1.65) in 2000-2010 and 1.70/1000 PY (95% CI: 1.14-2.54) in 2011-2013) to 3.02/1000 PY (95% CI: 2.36-3.86) during the early outbreak (2014-2016) and 2.35/1000 PY (95% CI: 1.74-3.18) during the ongoing outbreak period (2017-2019). Compared with the pre-outbreak period (2000-2010), the incidence rates were significantly elevated during both the early outbreak (2014-2016) (adjusted incidence rate ratio (aIRR) = 2.87, 95% CI: 1.62-5.09, p < 0.001) and the ongoing outbreak period (2017-2019) (aIRR = 2.12, 95% CI: 1.16-3.90, p = 0.015). CONCLUSIONS: Public health responses helped to curb the rising incidence of HIV infection among people with a history of drug use in Glasgow, but further efforts are needed to reduce it to levels observed prior to the outbreak. Data linkage of routine diagnostic test data to assess and monitor incidence of HIV infection provided enhanced surveillance, which is important to inform outbreak investigations and guide national strategies on elimination of HIV transmission.
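The rates above follow the standard person-years approach: events divided by follow-up time, scaled per 1000 PY. A minimal sketch of that computation with a log-normal approximate confidence interval is shown below; the counts are illustrative, not taken from the study:

```python
import math

def incidence_per_1000_py(events: int, person_years: float, z: float = 1.96):
    """Crude incidence rate per 1000 person-years with an
    approximate CI computed on the log scale (SE = 1/sqrt(events))."""
    rate = events / person_years * 1000
    se = 1 / math.sqrt(events)
    lower = rate * math.exp(-z * se)
    upper = rate * math.exp(z * se)
    return rate, lower, upper

# Illustrative example: 30 incident infections over 10,000 person-years
rate, lo, hi = incidence_per_1000_py(30, 10_000)
print(f"{rate:.2f}/1000 PY (95% CI: {lo:.2f}-{hi:.2f})")
```

The study's adjusted rate ratios would come from a Poisson regression with period as a covariate; the sketch covers only the crude per-period rates.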
ABSTRACT
BACKGROUND: Most studies that explore the long-term effects of COVID-19 are based on subjectively reported symptoms, while laboratory-measured biomarkers are mainly examined in studies of relatively small cohorts. This study investigates the long-term effects of SARS-CoV-2 infection on common laboratory biomarkers. METHODS: We utilized a retrospective cohort of SARS-CoV-2-infected individuals and rigorously matched controls based on demographic and clinical characteristics, examining 63 common laboratory biomarkers. Additional lab-specific cohorts were matched with an additional criterion of baseline biomarker values. Differences in biomarkers over a 12-month follow-up were analyzed using standardized mean difference-in-differences. RESULTS: The general cohort included 361,061 matched pairs, with 26 million laboratory results. The effects on most biomarkers lasted 1-4 months and were consistent with anticipated changes after acute viral infections. Some biomarkers presented prolonged effects, consistent across the general and lab-specific cohorts. One group of such findings included a 7-8 month decrease in white blood cell (WBC) counts, mainly driven by decreased counts of neutrophils, monocytes, and basophils. Potassium levels were decreased for 3-5 months. Vaccinated individuals' data suggested potentially smaller effects on WBCs, but cohort sizes limited this analysis. CONCLUSIONS: This study explores the effects of SARS-CoV-2 infection on common laboratory biomarkers, characterizing the direction and duration of these effects in the largest infected cohort to date. The effects on most biomarkers resolve in the first months following infection. The most notable longer-lasting effects involved the immune system. Further research is required to characterize the magnitude of these effects among specific individuals.
ABSTRACT
Injury recurrence in young children is a significant public health concern, as it may indicate an unfavorable home environment. This study evaluates whether infantile injuries increase recurrence during preschool years, contributing to more effective prevention strategies for vulnerable families. The study included 20,191 children from "The Longitudinal Survey of Babies in the 21st Century," a representative sample of infants born in Japan between May 10 and 24, 2010. We conducted a logistic regression analysis to compare injury recurrence risk between children aged 18 months to seven years with and without infantile injury histories. The study revealed that infants with a history of injuries had a higher risk of subsequent hospital visits for injuries during preschool years (crude Odds Ratio (cOR) 1.52, 95% CI, 1.41-1.64, adjusted OR (aOR) 1.48, 95% CI 1.37-1.60). Specific injuries, such as falls (aOR 1.34, 95% CI, 1.26-1.43), pinches (aOR 1.22, 95% CI, 1.15-1.29), drowning (aOR 1.29, 95% CI, 1.19-1.40), ingestion (aOR 1.35, 95% CI, 1.17-1.55), and burns (aOR 1.47, 95% CI, 1.31-1.65), independently increased the risk of future injuries. Our findings highlight the necessity of universal safety measures in the home environment and targeted interventions for families with a history of high-risk injuries.
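The crude odds ratio reported above is the standard 2x2-table computation. A minimal sketch with a Wald confidence interval follows; the counts are made up for illustration and are not the study's data:

```python
import math

def odds_ratio_wald(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio for a 2x2 table with a Wald CI on the log scale.
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: 120/880 reinjured among children with infantile
# injury history, 90/910 among those without
or_, lo, hi = odds_ratio_wald(120, 880, 90, 910)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study's adjusted ORs would additionally require a logistic regression with covariates; the sketch covers only the crude estimate.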
Subjects
Wounds and Injuries , Humans , Japan/epidemiology , Child, Preschool , Infant , Male , Female , Longitudinal Studies , Wounds and Injuries/epidemiology , Wounds and Injuries/etiology , Child , Recurrence , Risk Factors , Odds Ratio
ABSTRACT
INTRODUCTION: For 35 years, recombinant human growth hormone (rhGH) has been successfully used worldwide to treat children with short stature related to growth hormone deficiency (GHD). Growth hormone therapy requires an individual approach to the patient due to varying responses to the treatment. Excessive body weight is one of the factors influencing the response. AIM OF THE STUDY: To evaluate the impact of excessive body mass on rhGH therapy effectiveness in GHD children. MATERIAL AND METHODS: 165 short-statured children with isolated GHD (mean age 10.72 ±3.33 years), treated with rhGH for at least one year (mean follow-up 4.32 ±1.80 years), were separated into 3 groups based on their BMI standard deviation score (SDS). Bone age, height, weight, insulin-like growth factor 1 level, and rhGH dose were obtained up to 10 years with one-year intervals. RESULTS: The mean change in height SDS in the first year was 0.52 ±0.41 SD and 0.60 ±0.32 SD for normal and excessive body weight children, respectively. The mean height velocity, based on the height SDS measured over the consecutive 5 years, was 0.44±0.25 SD/year for the normal-weight group and 0.32 ±0.24 SD/year for the excessive body weight group (p < 0.1). CONCLUSIONS: Excess body weight has a significant impact on rhGH therapy outcomes. This correlates with the height increase in the first year of observation; however, long-term observation has shown that children diagnosed with overweight or obesity achieve significantly worse results compared to their normal-weight peers.
Subjects
Body Height , Human Growth Hormone , Humans , Child , Female , Male , Human Growth Hormone/therapeutic use , Human Growth Hormone/deficiency , Adolescent , Treatment Outcome , Body Height/drug effects , Recombinant Proteins/therapeutic use , Body Mass Index , Child, Preschool , Body Weight/drug effects , Growth Disorders/drug therapy , Follow-Up Studies , Overweight
ABSTRACT
BACKGROUND: Diabetic macular edema (DME), a leading cause of blindness, requires treatment with costly drugs, such as anti-vascular endothelial growth factor (VEGF) agents. The prolonged use of these effective but expensive drugs results in an incremental economic burden for patients with DME compared with those with diabetes mellitus (DM) without DME. However, there are no studies on the long-term patient-centered economic burden of DME after reimbursement for anti-VEGFs. OBJECTIVE: This retrospective cohort study aims to estimate the 3-year patient-centered economic burden of DME compared with DM without DME, using the Common Data Model. METHODS: We used medical data from 1,903,603 patients (2003-2020), transformed and validated using the Observational Medical Outcomes Partnership Common Data Model from Seoul National University Bundang Hospital. We defined the group with DME as patients aged >18 years with nonproliferative diabetic retinopathy and intravitreal anti-VEGF or steroid prescriptions. As controls, we defined the group with DM without DME as patients aged >18 years with DM or diabetic retinopathy without intravitreal anti-VEGF or steroid prescriptions. Propensity score matching, performed using a regularized logistic regression with a Laplace prior, addressed selection bias. We estimated direct medical costs over 3 years categorized into total costs, reimbursement costs, nonreimbursement costs, out-of-pocket costs, and costs covered by insurance, as well as healthcare resource utilization. The unbiased incremental patient-centered economic burden was estimated using an exponential conditional model (a generalized linear model) and a zero-inflated count model. RESULTS: In a cohort of 454 patients with DME matched with 1640 patients with DM, the economic burden of DME was significantly higher than that of DM, with total costs over 3 years being 2.09 (95% CI 1.78-2.47) times higher.
Reimbursement costs were 1.89 (95% CI 1.57-2.28) times higher in the group with DME than with the group with DM, while nonreimbursement costs were 2.54 (95% CI 2.12-3.06) times higher. Out-of-pocket costs and costs covered by insurance were also higher by a factor of 2.11 (95% CI 1.58-2.59) and a factor of 2.01 (95% CI 1.85-2.42), respectively. Patients with DME had a significantly higher number of outpatient (1.87-fold) and inpatient (1.99-fold) visits compared with those with DM (P<.001 in all cases). CONCLUSIONS: Patients with DME experience a heightened economic burden compared with diabetic patients without DME. The substantial and enduring economic impact observed in real-world settings underscores the need to alleviate patients' burden through preventive measures, effective management, appropriate reimbursement policies, and the development of innovative treatments. Strategies to mitigate the economic impact of DME should include proactive approaches such as expanding anti-VEGF reimbursement criteria, approving and reimbursing cost-effective drugs such as bevacizumab, advocating for proactive eye examinations, and embracing early diagnosis by ophthalmologists facilitated by cutting-edge methodologies such as artificial intelligence for patients with DM.
Subjects
Cost of Illness , Diabetic Retinopathy , Macular Edema , Humans , Retrospective Studies , Macular Edema/economics , Macular Edema/drug therapy , Macular Edema/etiology , Macular Edema/epidemiology , Male , Female , Middle Aged , Diabetic Retinopathy/economics , Diabetic Retinopathy/epidemiology , Aged , Cohort Studies , Republic of Korea/epidemiology , Adult , Patient-Centered Care/economics , Patient-Centered Care/statistics & numerical data , Vascular Endothelial Growth Factor A/antagonists & inhibitors , Health Care Costs/statistics & numerical data
ABSTRACT
PURPOSE: Adolescents and young adults (AYA) with cancer require highly individualized, age-specific end-of-life care. This study identified the characteristics of AYA patients with cancer receiving home-based palliative care and explored their unique needs and challenges compared with other age groups. METHODS: This retrospective cohort study analyzed the medical records of AYA patients with cancer to compare home-based palliative care characteristics with those of older age groups. The study included 81 AYA patients with cancer aged 16-39 years who had received home palliative care at a single institution from 2013 to 2023. They were compared with 5,017 patients with cancer aged 40 years and older. Patient background, duration of home end-of-life care, sedation, and association with social factors were examined retrospectively from medical records. RESULTS: The median age of AYA patients with cancer was 34 years; 65.4% were female. The primary cancer sites were gastrointestinal (34.6%) and thoracic (23.5%), with 80.0% at Stage IV. The median home care duration was 28.5 days, shorter than that for older patients (40 days) (p < 0.0001). Home care rates were similar between AYA and older patients (82.4% vs. 79.9%, p = 0.16). Sedation was more common in the 30-39 age group than in the 16-29 group (33.3% vs. 6.3%, p = 0.049). CONCLUSION: AYA patients with cancer achieved a high rate of end-of-life care at home, although their duration of home care was shorter. The characteristics of home care varied depending on the primary site and age, highlighting the importance of creating highly individualized care plans.
Subjects
Home Care Services , Neoplasms , Palliative Care , Humans , Retrospective Studies , Female , Palliative Care/methods , Male , Adolescent , Adult , Young Adult , Neoplasms/therapy , Terminal Care/methods , Middle Aged , Age Factors , Cohort Studies , Aged
ABSTRACT
AIM: Pressure injuries (PIs) are common in older adults and, despite advancements in PI care, are associated with increased complications in patients with dementia, for whom standardized treatments are unavailable. Herein, we aimed to examine the influence of dementia status on the care practice pattern and healing outcomes for PI on discharge. METHODS: This retrospective cohort study used data from the Diagnosis Procedure Combination database for 2014-2015 and the Annual Report for Functions of Medical Institutions and included patients aged ≥65 years. The DESIGN-R (depth, exudates, size, inflammation/infection, granulation, necrosis and rating) classification system was used to determine PI severity, and the Dementia Scale was used to assess dementia status. The measured outcomes included advanced PI care and healing upon discharge. Multivariable logistic regression, accounting for hospital-related clustering, was used to examine the association of dementia with these outcomes. RESULTS: Among 20 386 patients from 1198 hospitals included in the analysis, 32.5%, 20.1% and 47.3% had no, mild and severe dementia, respectively. After adjustment for patient and hospital characteristics, compared with those without dementia, patients with severe dementia were significantly less likely to undergo advanced treatments, such as skin grafts (adjusted odds ratio [aOR], 0.62 [95% confidence interval [CI], 0.40-0.97]; P = 0.034) and flap surgeries (aOR, 0.57 [95% CI, 0.42-0.77]; P < 0.001), and had a reduced likelihood of PI healing (aOR, 0.80 [95% CI, 0.72-0.90]; P < 0.001). CONCLUSIONS: Severe dementia was associated with poor PI healing outcomes, indicating potential treatment disparities. Thus, PI treatment should be provided according to the severity of dementia. Geriatr Gerontol Int 2024; ••: ••-••.
ABSTRACT
The purpose of this study is to explore the use of damage control techniques in the emergency surgical management of polytrauma patients - those with traumatic injuries affecting at least two anatomical regions - at a District General Hospital in Greece. We conducted a retrospective review of medical records from patients who visited the orthopedic emergency department between 2021 and 2024. From approximately 10,000 injured patients treated annually in our emergency department, we selected a sample of 29 polytrauma patients who required surgical intervention. We utilized the Injury Severity Score (ISS) to evaluate these patients. For 16 patients, the initial surgical intervention was also the definitive treatment, utilizing intramedullary nailing or internal osteosynthesis techniques. In the remaining 13 patients, damage control techniques, including external osteosynthesis (ExFix), were employed. The ISS was the primary criterion for deciding between definitive management and damage control procedures. Data on the 13 patients managed with damage control techniques were further analyzed and are presented in this study. External osteosynthesis was used to stabilize fractures and control bleeding, particularly in patients with multiple orthopedic injuries such as femoral or diaphyseal tibial fractures. This approach facilitated resuscitation and recovery. Our findings suggest that stabilizing long bone fractures with external fixation in patients with an ISS greater than 9 is both safe and likely contributes to overall recovery. This study demonstrates that a damage control approach for polytrauma patients with significant orthopedic trauma is effective for fracture stabilization and bleeding control. Additionally, in three cases, this approach also served as the definitive treatment.
ABSTRACT
PURPOSE: Low-dose computed tomography lung cancer screening is effective for reducing lung cancer mortality. It is critical to understand the lung cancer screening practices for screen-eligible individuals living in Alabama and Georgia, where lung cancer is the leading cause of cancer death. High lung cancer incidence and mortality rates are attributed to high smoking rates among underserved, low-income, and rural populations. Therefore, the purpose of this study is to define the sociodemographic and clinical characteristics of patients who were screened for lung cancer at an Academic Medical Center (AMC) in Alabama and a Safety Net Hospital (SNH) in Georgia. METHODS: A retrospective cohort study of screen-eligible patients seen at the AMC and the SNH was constructed separately for each site using electronic health records from 2015 to 2020. Chi-square tests and Student t tests were used to compare screening uptake across patient demographic and clinical variables. Bivariate and multivariate logistic regressions determined significant predictors of lung cancer screening (LCS) uptake. RESULTS: At the AMC, 67,355 patients were identified as eligible for LCS and 1,129 were screened. In bivariate analyses, there were several differences between those who were screened and those who were not. At the Alabama site, those with active tobacco use were significantly more likely to be screened than former smokers (OR: 3.208, p < 0.01). For every 10-unit increase in distance, the odds of screening decreased by about 15% (OR: 0.848, p < 0.01). For every 10-year increase in age, the odds of screening decreased by about 30% (OR: 0.704, p < 0.01). Each additional comorbidity increased the odds of screening by about 7.5% (OR: 1.075, p < 0.01). Those with both private and public insurance had much higher odds of screening compared with those with only private insurance (OR: 5.403, p < 0.01).
However, those with only public insurance had lower odds of screening compared with those with private insurance (OR: 0.393, p < 0.01). At the SNH, each additional comorbidity increased the odds of screening by about 11.9% (OR: 1.119, p = 0.01). Notably, those with public insurance had significantly higher odds of being screened compared with those with private insurance (OR: 2.566, p < 0.01). CONCLUSION: The study provides evidence that LCS has not reached all subgroups and that additional targeted efforts are needed to increase lung cancer screening uptake. Furthermore, a disparity was observed between adults living closer to screening institutions and those living farther away.
ABSTRACT
Purpose: Periprosthetic femoral fractures (PPFFs) and peri-implant femoral fractures (PIFFs) are troublesome complications of prosthetic and implant surgery, the former having been described as subject to a greater delay to surgery compared with standard hip fractures. The implications of delay in PPFF are disputed in the current literature, and those of delay in PIFF have not been investigated. The aim of this study was to determine whether the time from radiological examination to surgery differs between hip fractures and PPFF/PIFF, and the possible consequences of delay and group affiliation on morbidity, mortality, and readmissions. Methods: One hundred and thirty-six participants admitted to Danderyd hospital during 2020 were included: cases with PPFF or PIFF (n = 35) and hip fracture controls (n = 101), matched 1:3 with respect to age and sex. Timestamps for radiology, surgery, and death were retrieved from the Swedish fracture registry; data on adverse events (AEs) and readmissions were collected through retrospective medical record review for 90 days postsurgery. Results: Linear regression showed that time to surgery differed between the case and control cohorts by a mean of 24.8 h (p < 0.001), and AEs were significantly more common in cases (p = 0.046). Unadjusted binary logistic regression indicated a possible relationship in which each hour of delay to surgery increased the rate of AEs by 1.3% (95% confidence interval [CI]: 1.00-1.03). Conclusion: This study reveals a significant delay in surgery for PPFFs and PIFFs compared with standard hip fractures, leading to higher adverse event rates. While mortality and readmissions did not differ significantly, the delay underscores the need for timely intervention in these complex cases. Further research is needed to address these challenges and improve patient outcomes. Level of Evidence: III.
ABSTRACT
Introduction RTOG 1205 is the only randomized study to evaluate the safety and efficacy of reirradiation (reRT) in recurrent glioblastoma (GBM). While this study showed that reRT was safe and improves progression-free survival (PFS), an improved approach to reRT is still needed. In this study, we report on patterns of failure and outcomes in a cohort of patients with recurrent GBM who underwent reRT. We hypothesize that patients at high risk of leptomeningeal spread (LMS) are not good candidates for reRT due to the risk of treatment-related toxicity without clinical benefit. Methods In this retrospective study, patients with recurrent GBM who underwent reRT at a single institution from 2015 to 2023 were included. Sociodemographic, treatment, and outcomes data were collected via chart review. Time to progression was defined as the time from the start of reRT to progression per the Response Assessment in Neuro-Oncology (RANO) criteria. Overall survival (OS) was defined as the time from the start of reRT to death. PFS and OS were estimated using the Kaplan-Meier method. Results Thirteen patients with recurrent GBM who underwent reRT were identified. The median age at diagnosis was 58 years. Six patients (46.2%) had tumors that were O6-methylguanine-DNA methyltransferase (MGMT) methylated, four (30.8%) were MGMT unmethylated, and three (23.1%) had unknown MGMT status. Eight patients underwent repeat resection after recurrence and before reRT. Most patients (n=7) received 35 Gy in 10 fractions with concurrent bevacizumab, while the other patients were treated with 25-40 Gy in 5-15 fractions, with grade 1 or less acute toxicity. Three patients were treated with tumor-treating fields. The median follow-up was five months. Median PFS was three months [95% confidence interval (CI): 1-4 months] and median OS was five months (95% CI: 1-8 months), as compared to 7.1 months and 10.1 months, respectively, on RTOG 1205.
Five patients developed LMS after reRT, one patient died before progression, and the remaining seven patients all developed progression within one centimeter of the recurrent tumor. Of the patients who developed LMS, all had tumors abutting the ventricles, and three underwent resection 2-17 months before reRT. Conclusion Patterns of failure suggest a potential treatment selection approach for patients with recurrent GBM, in which patients at high risk of LMS (tumor abutting ventricles with or without recent surgery) should not undergo reRT, while patients at low risk of LMS are good candidates for reRT. Furthermore, reRT could be administered with reduced margins given that all non-LMS recurrences were within 1 cm of the original tumor. Additional studies are needed to validate this approach.
ABSTRACT
OBJECTIVES: To determine the proportion of individuals with detectable antigen in plasma or serum after SARS-CoV-2 infection and the association of antigen detection with postacute sequelae of COVID-19 (PASC) symptoms. METHODS: Plasma and serum samples were collected from adults participating in four independent studies at different time points, ranging from several days up to 14 months post-SARS-CoV-2 infection. The primary outcome was the quantification of SARS-CoV-2 antigens, including the S1 subunit of spike, full-length spike, and nucleocapsid, in participant samples. The presence of 34 commonly reported PASC symptoms during the postacute period was determined from participant surveys or chart reviews of electronic health records. RESULTS: Of the 1569 samples analysed from 706 individuals infected with SARS-CoV-2, 21% (95% CI, 18-24%) were positive for either S1, spike, or nucleocapsid. Spike was predominantly detected, and the highest proportion of samples was spike positive (20%; 95% CI, 18-22%) between 4 and 7 months postinfection. In total, 578 participants (82%) reported at least one of the 34 PASC symptoms included in our analysis ≥1 month postinfection. Cardiopulmonary, musculoskeletal, and neurologic symptoms had the highest reported prevalence in over half of all participants, and among those participants, 43% (95% CI, 40-45%) on average were antigen-positive. Among the participants who reported no ongoing symptoms (128, 18%), antigen was detected in 28 participants (21%). The presence of antigen was associated with the presence of one or more PASC symptoms, adjusting for sex, age, time postinfection, and cohort (OR, 1.8; 95% CI, 1.4-2.2). DISCUSSION: The findings of this multicohort study indicate that SARS-CoV-2 antigens can be detected in the blood of a substantial proportion of individuals up to 14 months after infection.
While approximately one in five asymptomatic individuals was antigen-positive, roughly half of all individuals reporting ongoing cardiopulmonary, musculoskeletal, and neurologic symptoms were antigen-positive.
ABSTRACT
Some studies have identified factors influencing COVID-19 illness in the elderly, such as underlying diseases, but research on the effect of nutritional status is still lacking. This study retrospectively examined the influence of nutritional status on the outcome of elderly COVID-19 inpatients. A retrospective analysis was conducted of the clinical data of 4241 COVID-19 patients admitted to a tertiary hospital in Nanchang between November 1, 2022 and January 31, 2023. Nutritional status was assessed using the prognostic nutritional index (PNI) and the controlling nutritional status (CONUT) score. The influence of nutritional status on the outcome of COVID-19 patients was determined through multivariate adjustment analysis, restricted cubic splines, and receiver operating characteristic (ROC) curves. Compared with mild/no malnutrition, severe malnutrition substantially increased the risk of a critical outcome of COVID-19. A linear relationship was observed between the odds ratio (OR) and both PNI and CONUT (P for non-linearity > 0.05). The area under the ROC curve indicated that PNI was the better predictor. The optimal cutoff value of PNI was 38.04 (AUC = 0.817, 95% CI: 0.797-0.836), with a sensitivity of 70.7% and a specificity of 79.6%. Critical illness in elderly COVID-19 patients shows a linear relationship with malnutrition at admission. The use of PNI to assess the prognosis of elderly COVID-19 patients is reliable, highlighting the importance for doctors of paying close attention to the nutritional status of COVID-19 patients. Focusing on nutritional status in clinical practice may effectively reduce critical illness in elderly COVID-19 patients.
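An optimal ROC cutoff like the PNI value reported above is commonly chosen by maximizing the Youden index (sensitivity + specificity - 1) across candidate thresholds. A minimal sketch on invented data follows (the scores and labels are hypothetical, not the study's data; here a lower PNI-like score is taken to predict the critical outcome):

```python
def best_cutoff_youden(scores, labels):
    """Return (cutoff, J, sensitivity, specificity) maximizing Youden's J.
    Positive prediction is 'score <= cutoff' (lower score = higher risk);
    labels: 1 = critical outcome, 0 = non-critical."""
    best = (None, -1.0, 0.0, 0.0)
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s <= cut and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s > cut and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s > cut and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s <= cut and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best[1]:
            best = (cut, j, sens, spec)
    return best

# Invented PNI-like scores; label 1 = critical outcome
scores = [30, 32, 35, 36, 40, 42, 45, 50, 52, 55]
labels = [1,  1,  1,  0,  1,  0,  0,  0,  0,  0]
cutoff, j, sens, spec = best_cutoff_youden(scores, labels)
print(cutoff, round(j, 2), sens, round(spec, 2))
```

In practice an AUC with a confidence interval, as reported in the abstract, would be computed alongside the cutoff; the sketch shows only the threshold selection step.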
Subjects
COVID-19 , Hospitalization , Malnutrition , Nutrition Assessment , Nutritional Status , Humans , COVID-19/complications , COVID-19/mortality , Malnutrition/complications , Aged , Female , Male , Retrospective Studies , Aged, 80 and over , Prognosis , ROC Curve , SARS-CoV-2/isolation & purification , Severity of Illness Index
ABSTRACT
Increased fracture risk due to oral glucocorticoids (GCs) is thought to decrease rapidly after GC discontinuation; however, evidence for this is limited. We found that fracture risk decreased rapidly in the first year after GC discontinuation, while hip fracture risk remained higher than reference levels for about two years after GC discontinuation. PURPOSE: We investigated changes in fracture risk following discontinuation of long-term oral glucocorticoids (GCs) using Japan's nationwide health insurance claims database (NDBJ). METHODS: We identified patients aged ≥ 50 years who initiated GC therapy in 2012-2019. Those receiving ≥ 5 mg (prednisolone or equivalent, PSL)/day for ≥ 72 days in the initial 90 days of GC therapy were classified as the GC-exposure group, and those receiving < 5 mg PSL/day for < 30 days were classified as the reference group. Patients discontinuing GC after 90 days of GC therapy were classified as the GC-discontinuation group; all others were classified as the GC-continuation group. We tracked the incidence rates of hip and clinical vertebral fractures for up to 990 days, and assessed fracture risk after GC discontinuation by hazard ratios (HRs) adjusted by inverse probability weighting using propensity scores for GC discontinuation. RESULTS: There were a total of 52,179 GC-discontinuation, 91,969 GC-continuation, and 43,138 reference group women, and 57,560, 93,736, and 33,696 men in the corresponding groups, respectively. According to adjusted HRs, incidence rates of fractures were significantly lower in the GC-discontinuation group than in the GC-continuation group in the initial 90 days after GC discontinuation, and the difference remained significant for 360 days, except for hip fracture in men. HRs for hip fractures remained significantly higher in the GC-discontinuation group compared with the reference group for 720 days post-discontinuation.
CONCLUSION: Fracture risk declines rapidly in the first year after GC discontinuation, but vigilance is necessary as an increased hip fracture risk persists for up to two years post-discontinuation.
ABSTRACT
BACKGROUND: Acute pancreatitis (AP) is a significant health concern with potential for recurrent episodes and serious complications. The risk of recurrence in type 2 diabetes (T2D) or obesity can be influenced by various factors and treatments, including GLP-1 receptor agonists (GLP-1RAs). This study evaluates the risk of recurrent AP among patients with a history of the condition, focusing on the effects of different GLP-1RA treatments. OBJECTIVES: Our objective was to compare the risk of AP recurrence between patients treated with different GLP-1RAs. METHODS: We conducted a retrospective cohort study using the TriNetX platform, encompassing 258,238 individuals with T2D or obesity and a history of AP. We assessed the recurrence of AP over a five-year period, analyzing data on treatment regimens, with a focus on the use of semaglutide, tirzepatide, and other GLP-1RAs. RESULTS: GLP-1RA users experienced significantly lower recurrence rates of AP; among patients without risk factors, GLP-1RA users had a recurrence rate of 13.8% compared with 40.9% for non-users. Semaglutide and tirzepatide showed the most favorable outcomes: semaglutide users had lower recurrence rates than exenatide users (10.1% vs. 27%) and slightly lower rates than dulaglutide users (13.6% vs. 15.4%), although the latter difference was not statistically significant. Tirzepatide users displayed the lowest recurrence risk at 6.2%, significantly lower than that of semaglutide users (11.7%). CONCLUSIONS: GLP-1RAs, particularly semaglutide and tirzepatide, are associated with a reduced risk of recurrent AP in people with T2D or obesity. The differential risk profile between these drugs highlights the need for further studies and personalized treatment plans.
ABSTRACT
Sepsis and hypertension pose significant health risks, yet the optimal mean arterial pressure (MAP) target for resuscitation remains uncertain. This study investigates the association between average MAP (a-MAP) within the initial 24 h of intensive care unit admission and clinical outcomes in patients with sepsis and primary hypertension, using the Medical Information Mart for Intensive Care (MIMIC) IV database. Multivariable Cox regression assessed the association between a-MAP and 30-day mortality. Kaplan-Meier and log-rank analyses constructed survival curves, while restricted cubic splines (RCS) illustrated the nonlinear relationship between a-MAP and 30-day mortality. Subgroup analyses tested the robustness of the findings. The study involved 8,810 patients. Adjusted hazard ratios for 30-day mortality in the T1 group (< 73 mmHg) and T3 group (≥ 80 mmHg) compared to the T2 group (73-80 mmHg) were 1.25 (95% CI 1.09-1.43, P = 0.001) and 1.44 (95% CI 1.25-1.66, P < 0.001), respectively. RCS revealed a U-shaped relationship (non-linearity: P < 0.001). Kaplan-Meier curves demonstrated significant differences (P < 0.0001). Subgroup analysis showed no significant interactions. Maintaining an a-MAP of 73 to 80 mmHg may be associated with reduced 30-day mortality. Further validation through prospective randomized controlled trials is warranted.
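The survival curves mentioned above come from the Kaplan-Meier product-limit estimator, which multiplies the conditional survival probabilities at each observed death time. A minimal sketch with toy data (not MIMIC-IV data):

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    `times` are follow-up times; `events` are 1 for death,
    0 for censoring. Returns (time, S(t)) pairs at death times."""
    times = np.asarray(times)
    events = np.asarray(events)
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk
        surv.append((float(t), s))
    return surv

# toy cohort of 5 patients: deaths at t=2, 3, 7; censored at 3 and 5
curve = kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 0, 1])
# S(2)=0.8, S(3)=0.6, S(7)=0.0
```

A log-rank test, as used in the study, would then compare such curves across the a-MAP tertiles.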
Subjects
Arterial Pressure, Critical Illness, Hypertension, Sepsis, Humans, Male, Female, Hypertension/mortality, Hypertension/physiopathology, Hypertension/complications, Critical Illness/mortality, Sepsis/mortality, Sepsis/physiopathology, Middle Aged, Aged, Retrospective Studies, Intensive Care Units, Kaplan-Meier Estimate
ABSTRACT
BACKGROUND: Lung cancer is one of the most common cancers and causes of cancer death in Canada. Some previous literature suggests that socioeconomic inequalities in lung cancer screening, treatment and survival may exist. The objective of this study was to compare overall survival for immigrants versus long-term residents of Ontario, Canada among patients diagnosed with lung cancer. METHODS: This population-based retrospective cohort study utilized linked health administrative databases and identified all individuals (immigrants and long-term residents) aged 40+ years diagnosed with incident lung cancer between April 1, 2012 and March 31, 2017. The primary outcome was 5-year overall survival, with December 31, 2019 as the end of the follow-up period. We implemented adjusted Cox proportional hazards models stratified by age at diagnosis, sex, and cancer stage at diagnosis to examine survival. RESULTS: A total of 38,788 individuals diagnosed with lung cancer were included in our cohort, 7% of whom were immigrants. Immigrants were younger at diagnosis and more likely to reside in the lowest neighbourhood income quintile (30.6% versus 24.5%) than long-term residents. After adjusting for age at diagnosis, neighbourhood income quintile, comorbidities, visits to primary care in the 6 to 30 months before diagnosis, continuity of care, cancer type and cancer stage at diagnosis, immigrant status was associated with a lower hazard of death within 5 years of diagnosis for both females (HR 0.7; 95% CI 0.6-0.8) and males (HR 0.7; 95% CI 0.6-0.7) in comparison to long-term residents. This trend held in adjusted models stratified by cancer stage at diagnosis. For example, female immigrants diagnosed with early-stage lung cancer had a hazard ratio of 0.5 (95% CI 0.4-0.7) in comparison to long-term residents. CONCLUSION: Overall survival post-diagnosis with lung cancer was better among Ontario immigrants versus long-term residents.
Additional research, potentially on the protective effects of immigrant enclaves and the intersection of immigrant status with racial/ethnic identity, is needed to further explore why immigrants' overall survival advantage persisted.
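Under the proportional-hazards assumption of the Cox models above, a hazard ratio maps baseline survival to comparison-group survival via S1(t) = S0(t)^HR. A small illustration; the 20% baseline 5-year survival is hypothetical, and only the HR of 0.7 comes from the abstract:

```python
def survival_under_ph(s0, hr):
    """Comparison-group survival implied by baseline survival s0 and
    hazard ratio hr under proportional hazards: S1(t) = S0(t)**hr."""
    return s0 ** hr

# if 5-year survival for long-term residents were 20% (hypothetical),
# an HR of 0.7 for immigrants would imply about 32% survival
s1 = survival_under_ph(0.20, 0.7)
print(round(s1, 3))  # 0.324
```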
Subjects
Emigrants and Immigrants, Lung Neoplasms, Humans, Female, Lung Neoplasms/mortality, Lung Neoplasms/epidemiology, Lung Neoplasms/pathology, Lung Neoplasms/ethnology, Emigrants and Immigrants/statistics & numerical data, Male, Ontario/epidemiology, Retrospective Studies, Middle Aged, Aged, Adult, Aged, 80 and over, Socioeconomic Factors, Proportional Hazards Models
ABSTRACT
Acute heat illness (AHI) from extreme environmental heat exposure can lead to emergency department (ED) visits, hospitalization, and even death. While the ICD ninth revision codes for AHI have been validated in the U.S., there have been no studies on the validity of ICD-10 codes for AHI in Canada. The objective of this study was to assess the validity of an ICD-10 coding algorithm for ED encounters for AHI. We conducted a retrospective cohort study of children and adults who had ED encounters at two large academic, tertiary hospitals in London, Canada, between May and September 2014-2018. We developed an algorithm of ICD-10 codes for AHI based upon a literature review and clinical expertise. Our "gold-standard" definition of AHI was patient-reported heat exposure and documentation of at least one heat-related complaint. To establish positive predictive value (PPV), we reviewed 62 algorithm-positive records and noted which met our "gold-standard" definition. To calculate negative predictive value (NPV), sensitivity (Sn), and specificity (Sp), we randomly reviewed 964 ED records for associated ICD-10 codes and diagnoses. Two independent reviewers completed blinded data abstraction, with duplicate abstraction in 20% of the sample. Of the 62 algorithm-positive records, mean (SD) age was 38.8 (23.8) years; 37% were female. PPV was 61.3 ± 12.1% (95% CI). Of the 964 randomly selected records, mean (SD) age was 41.7 (26.5) years; 51.1% were female. The NPV was 99.7 ± 0.4%, sensitivity 25.0 ± 42.4%, and specificity 100.0 ± 0.0%. An ICD-10 coding algorithm for AHI had high specificity but was limited in sensitivity. This algorithm can be used to assemble and study cohorts of patients who have had an AHI, but may underestimate the true incidence of AHI presentations in the ED.
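The "± half-width" figures above are proportions with normal-approximation 95% confidence intervals. A sketch that back-calculates them; the 2x2 counts below are reconstructed to be consistent with the reported values and are not stated in the study:

```python
import math

def prop_ci(x, n, z=1.96):
    """Proportion x/n with a normal-approximation 95% CI half-width,
    matching the 'p ± half-width' style used above."""
    p = x / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, half

# PPV from the 62 algorithm-positive records: 38 true positives
# back-calculates to the reported 61.3 ± 12.1% (count reconstructed)
ppv, ppv_half = prop_ci(38, 62)

# from the 964 random records, reconstructed counts tp=1, fn=3,
# fp=0, tn=960 reproduce the reported NPV, sensitivity, specificity
npv = prop_ci(960, 963)   # ~99.7 ± 0.4%
sens = prop_ci(1, 4)      # 25.0 ± ~42.4%
spec = prop_ci(960, 960)  # 100.0 ± 0.0%
print(round(100 * ppv, 1), round(100 * ppv_half, 1))  # 61.3 12.1
```

The very wide sensitivity interval reflects how few gold-standard positives appear in a random sample, which is why the abstract cautions that the algorithm may underestimate true AHI incidence.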
Subjects
Algorithms, Emergency Service, Hospital, International Classification of Diseases, Humans, Retrospective Studies, Emergency Service, Hospital/statistics & numerical data, Male, Female, Adult, Child, Adolescent, Middle Aged, Young Adult, Heat Stress Disorders/diagnosis, Heat Stress Disorders/epidemiology, Child, Preschool, London, Aged
ABSTRACT
AIMS: The utilization of sulfonylureas (SUs) for the management of Type 2 Diabetes Mellitus (T2DM) has witnessed a decline, attributed to the rising popularity of alternative medications and uncertainties surrounding the cardiovascular risk profile of SUs. This study aimed to investigate the potential association between SU intake and the incidence of cardiovascular events in patients with T2DM. METHODS: A retrospective cohort study, based on a general practice (GP) registry, was designed, encompassing patients diagnosed with T2DM between 2005 and 2014. Follow-up persisted until the occurrence of a cardiovascular event, loss to follow-up, or December 31, 2022. Comparative analyses were conducted between patients receiving SU treatment and those without. RESULTS: Data from a cohort comprising 5589 patients revealed that 13% and 13.1% of individuals in the comparator group and the SU group, respectively, experienced a cardiovascular event. However, no statistically significant elevation in the risk of cardiovascular events was observed after SU usage. Furthermore, glycated haemoglobin (HbA1c) levels were significantly higher in the SU group (7.0% vs. 6.4%, p < 0.001). CONCLUSIONS: The findings from this study indicate that the use of SUs is not associated with a statistically significant increase in the risk of cardiovascular events among patients with T2DM. These results contribute to the ongoing discourse on the safety and efficacy of SU therapy in diabetes management.