Results 1 - 20 of 191
1.
Nutr Clin Pract ; 2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38877983

ABSTRACT

BACKGROUND: Body mass index (BMI) is criticized for being unjust and biased in relatively healthy racial and ethnic groups. Therefore, the current analysis examines whether BMI predicts body composition, specifically adiposity, in a racially and ethnically diverse acutely ill patient population. METHODS: Patients admitted with SARS-CoV-2 having an evaluable diagnostic chest, abdomen, and/or pelvic computed tomography (CT) study (within 5 days of admission) were included in this retrospective cohort. Cross-sectional areas (centimeters squared) of the subcutaneous adipose tissue (SAT), visceral adipose tissue (VAT), and intramuscular adipose tissue (IMAT) were quantified. Total adipose tissue (TAT) was calculated as the sum of these areas. Admission height and weight were used to calculate BMI, and self-reported race and ethnicity were used for classification. General linear regression models were fit to estimate correlations and assess differences between groups. RESULTS: On average, patients (n = 134) were aged 58.2 (SD = 19.1) years, 60% male, and racially and ethnically diverse (33% non-Hispanic White [NHW], 33% non-Hispanic Black [NHB], 34% Hispanic). Correlations between BMI and SAT and between BMI and TAT were strongest, with estimates of 0.707 (0.585, 0.829) and 0.633 (0.534, 0.792), respectively. When examining the various adiposity compartments across race and ethnicity, correlations were similar and significant differences were not detected for TAT with SAT, VAT, or IMAT (all P ≥ 0.05). CONCLUSIONS: These findings support the routine use of BMI as a proxy measure of total adiposity for acutely ill patients identifying as NHW, NHB, and Hispanic. Our results inform the validity and utility of this tool in clinical nutrition practice.
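
For readers who want to reproduce this type of analysis on their own data, the following is a minimal Python sketch of estimating a BMI-adiposity correlation with a 95% confidence interval via the Fisher z-transformation. The column names and simulated values are hypothetical illustrations, not the study's data or code.

```python
# Illustrative sketch: correlation between BMI and a CT-derived adiposity
# compartment (e.g., total adipose tissue, TAT) with a Fisher z-based 95% CI.
import numpy as np
import pandas as pd
from scipy import stats

def corr_with_ci(x, y, alpha=0.05):
    """Pearson correlation with a Fisher z-transform confidence interval."""
    r, p = stats.pearsonr(x, y)
    z = np.arctanh(r)                       # Fisher z-transform
    se = 1.0 / np.sqrt(len(x) - 3)
    zcrit = stats.norm.ppf(1 - alpha / 2)
    return r, (np.tanh(z - zcrit * se), np.tanh(z + zcrit * se)), p

# Hypothetical patient-level data: BMI and TAT cross-sectional area (cm^2)
rng = np.random.default_rng(0)
df = pd.DataFrame({"bmi": rng.normal(29, 6, 134)})
df["tat_cm2"] = 12 * df["bmi"] + rng.normal(0, 60, len(df))

r, (lo, hi), p = corr_with_ci(df["bmi"], df["tat_cm2"])
print(f"BMI-TAT correlation: {r:.3f} (95% CI {lo:.3f}, {hi:.3f}), p = {p:.3g}")
```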

2.
Nutrients ; 16(11)2024 May 31.
Article in English | MEDLINE | ID: mdl-38892657

ABSTRACT

Despite evidence suggesting the importance of psychological resilience for successful aging, little is known about the relationship between diet quality and resilience at different ages. Our study aims to examine the association between diet quality and resilience across the stages of adulthood. Using Stanford's WELL for Life (WELL) survey data, we conducted a cross-sectional study of diet quality, resilience, sociodemographic, perceived stress, lifestyle, and mental health factors among 6171 Bay Area adults. Diet quality was measured by the WELL Diet Score, which ranges from 0 to 120; a higher score indicates better diet quality. Linear regression analysis was used to evaluate the association between the WELL Diet Score and overall resilience and within the following age groups: early young (18-24), late young (25-34), middle (35-49), and late adulthood (≥50). To test whether these associations varied by age group, an age group by resilience interaction term was also examined. In the fully adjusted model, the WELL Diet Score was positively and significantly associated with overall resilience (all ages: β = 1.2, SD = 0.2, p < 0.001) and within each age group (early young: β = 1.1, SD = 0.3, p < 0.001; late young: β = 1.2, SD = 0.3, p < 0.001; middle: β = 0.9, SD = 0.3, p < 0.001; and late adulthood: β = 1.0, SD = 0.3, p < 0.001). Young adults demonstrated the strongest associations between diet quality and resilience. However, there were no significant age-by-resilience interactions. Diet quality may be positively associated with resilience at all stages of adulthood. Further research is needed to determine whether assessing and addressing resilience could inform the development of more effective dietary interventions, particularly in young adults.
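
As a rough sketch of the regression strategy described here, the Python snippet below fits a per-age-group linear model of resilience on diet score and then tests a group-by-diet interaction with a partial F-test. Variable names, covariates, and data are hypothetical and only loosely mirror the study design.

```python
# Illustrative sketch: diet-resilience association within age groups plus an
# interaction test, using statsmodels formulas. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    "well_diet_score": rng.uniform(30, 110, n),            # 0-120 scale
    "age_group": rng.choice(["18-24", "25-34", "35-49", "50+"], n),
    "perceived_stress": rng.normal(20, 5, n),
})
df["resilience"] = (40 + 0.12 * df["well_diet_score"]
                    - 0.3 * df["perceived_stress"] + rng.normal(0, 5, n))

# Stratified association (adjusted here only for perceived stress)
for grp, sub in df.groupby("age_group"):
    fit = smf.ols("resilience ~ well_diet_score + perceived_stress", data=sub).fit()
    print(f"{grp}: beta = {fit.params['well_diet_score']:.2f} "
          f"(SE {fit.bse['well_diet_score']:.2f}), p = {fit.pvalues['well_diet_score']:.3g}")

# Does the diet-resilience slope differ by age group? Compare nested models.
reduced = smf.ols("resilience ~ well_diet_score + C(age_group) + perceived_stress", data=df).fit()
full = smf.ols("resilience ~ well_diet_score * C(age_group) + perceived_stress", data=df).fit()
f_val, p_val, df_diff = full.compare_f_test(reduced)
print(f"Interaction F-test: F = {f_val:.2f}, p = {p_val:.3g}")
```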


Subjects
Diet, Psychological Resilience, Humans, Cross-Sectional Studies, Adult, Male, Female, Young Adult, Middle Aged, Adolescent, Diet/psychology, Healthy Diet/psychology, Mental Health, Life Style, Age Factors, Quality of Life
3.
Transpl Infect Dis ; 26(3): e14295, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38761060

ABSTRACT

BACKGROUND: Though the use of hepatitis B viremic (HBV) donor kidneys may be a safe alternative to improve access to transplantation, there has not been wide acceptance of this practice. In this study, we determined the safety and effectiveness of HBV NAT(+) donor kidneys used in a protocolized manner in an older adult population. METHODS: Over a 3-year period, 16 deceased donor kidney transplants were performed with HBV NAT(+) kidneys. Recipients of HBV NAT(+) kidneys were treated with entecavir, started preoperatively and continued for 52 weeks. RESULTS: HBV NAT(+) kidneys were preferentially used in older (68 ± 5 vs. 64 ± 9 years; p = .01) recipients with less dialysis time (93.8% vs. 67% with <5 years on dialysis; p = .03). In this cohort, 3/16 had detectable HBV by PCR 1 week post-transplant, but all were negative at 9 and 12 months. Calculated estimated glomerular filtration rate (eGFR) was slightly decreased 12 months post-transplant. Post-transplant outcomes in an age-matched cohort showed no differences in rates of delayed graft function, readmission within 30 days, or graft loss or death within 6 months of transplant (p > .05). CONCLUSION: Transplants with HBV NAT(+) donor kidneys under a pre-emptive treatment protocol allow for increased safe access to transplantation in older adult recipients with little or no dialysis time.


Subjects
Antiviral Agents, Glomerular Filtration Rate, Hepatitis B, Kidney Transplantation, Tissue Donors, Viremia, Humans, Kidney Transplantation/adverse effects, Male, Female, Aged, Middle Aged, Antiviral Agents/therapeutic use, Hepatitis B virus/drug effects, Guanine/analogs & derivatives, Guanine/therapeutic use, Graft Survival, Delayed Graft Function
4.
PLoS One ; 19(4): e0295293, 2024.
Article in English | MEDLINE | ID: mdl-38598554

ABSTRACT

The RiSE study aims to evaluate a race-based stress-reduction intervention as an effective strategy to improve coping, decrease stress-related symptoms and inflammatory burden, and modify DNA methylation (DNAm) of stress response-related genes in older African American (AA) women. This article describes the genomic analytic methods to be used in this longitudinal, randomized clinical trial of older adult AA women in Chicago and New York City, which examines the effect of the RiSE intervention on DNAm pre- and post-intervention and its overall influence on inflammatory burden. Salivary DNAm will be measured at baseline and 6 months following the intervention, using the Oragene-DNA kit. Measures of perceived stress, depressive symptoms, fatigue, sleep, inflammatory burden, and coping strategies will be assessed at 4 time points: baseline, 4 weeks, 8 weeks, and 6 months. Genomic data analysis will include the use of pre-processed and quality-controlled methylation data expressed as beta (β) values. Association analyses will be performed to detect differentially methylated sites on the targeted candidate genes between the intervention and non-intervention groups using Δβ (change in methylation), with adjustment for age, health behaviors, early life adversity, hybridization batch, and top principal components of the probes as covariates. To account for multiple testing, we will use FDR adjustment, with a corrected p-value of <0.05 regarded as statistically significant. To assess the relationship between inflammatory burden and Δβ among the study samples, we will repeat the association analyses with the inclusion of individual inflammation protein measures. ANCOVA will be used because it provides greater statistical power to detect differences.
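
To make the analytic plan concrete, here is a hedged Python sketch of per-site association testing of Δβ against intervention group with covariate adjustment, followed by Benjamini-Hochberg FDR correction. The covariates, column names, and simulated values are placeholders rather than the trial's actual pipeline.

```python
# Illustrative sketch: per-CpG-site regression of delta-beta on intervention
# group with covariate adjustment, then Benjamini-Hochberg FDR correction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
n, n_sites = 80, 50
covars = pd.DataFrame({
    "group": rng.integers(0, 2, n),      # 1 = RiSE intervention, 0 = control
    "age": rng.normal(68, 6, n),
    "batch": rng.integers(0, 3, n),      # hybridization batch
})

pvals = []
for site in range(n_sites):
    d = covars.copy()
    # Simulated change in methylation; a few sites carry a small group effect
    d["delta_beta"] = rng.normal(0, 0.05, n) + 0.03 * d["group"] * (site < 5)
    fit = smf.ols("delta_beta ~ group + age + C(batch)", data=d).fit()
    pvals.append(fit.pvalues["group"])

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"Sites significant after FDR correction: {int(reject.sum())} of {n_sites}")
```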


Subjects
Black or African American, DNA Methylation, Aged, Female, Humans, Black or African American/genetics, Chicago, Genomics, Inflammation/genetics, New York City, Randomized Controlled Trials as Topic
5.
J Cardiovasc Med (Hagerstown) ; 25(4): 318-326, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38488066

ABSTRACT

BACKGROUND: Diastolic dysfunction is a predictor of poor outcomes in many cardiovascular conditions. At present, it is unclear whether diastolic dysfunction predicts adverse outcomes in patients with atypical aortic stenosis who undergo aortic valve replacement (AVR). METHODS: Five hundred and twenty-three patients who underwent transcatheter AVR (TAVR) (n = 303) and surgical AVR (SAVR) (n = 220) at a single institution were included in our analysis. Baseline left and right heart invasive hemodynamics were assessed. Baseline transthoracic echocardiograms were reviewed to determine aortic stenosis subtype and parameters of diastolic dysfunction. Aortic stenosis subtype was categorized as typical (normal-flow, high-gradient) aortic stenosis, classical low-flow, low-gradient (cLFLG) aortic stenosis, or paradoxical low-flow, low-gradient (pLFLG) aortic stenosis. Cox proportional hazards models were used to examine the relation between invasive hemodynamic or echocardiographic variables of diastolic dysfunction, aortic stenosis subtype, and all-cause mortality. Propensity-score analysis was performed to study the relation between aortic stenosis subtype and the composite outcome [death/cerebrovascular accident (CVA)]. RESULTS: The median STS risk score was 5.3% and 2.5% for TAVR and SAVR patients, respectively. Relative to patients with typical aortic stenosis, patients with atypical (cLFLG and pLFLG) aortic stenosis displayed a significantly higher prevalence of diastolic dysfunction (LVEDP ≥ 20 mm Hg, PCWP ≥ 20 mm Hg, echo grade II or III diastolic dysfunction, and echo-PCWP ≥ 20 mm Hg) and, independently of AVR treatment modality, had a significantly increased risk of death. In propensity-score analysis, patients with atypical aortic stenosis had higher rates of death/CVA than typical aortic stenosis patients, independently of diastolic dysfunction and AVR treatment modality. CONCLUSION: We demonstrate the novel observation that, compared with patients with typical aortic stenosis, patients with atypical aortic stenosis have a higher burden of diastolic dysfunction. We corroborate the worse outcomes previously reported in atypical versus typical aortic stenosis and demonstrate, for the first time, that this observation is independent of AVR treatment modality. Furthermore, the presence of diastolic dysfunction does not independently predict outcome in atypical aortic stenosis regardless of treatment type, suggesting that other factors are responsible for adverse clinical outcomes in this higher-risk cohort.
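
The following is a minimal sketch of a Cox proportional hazards model of the kind described, relating aortic stenosis subtype, a diastolic dysfunction marker, and treatment modality to all-cause mortality. It uses the lifelines package with entirely hypothetical column names and simulated data, and is not the study's code.

```python
# Illustrative sketch: Cox proportional hazards model for all-cause mortality
# with AS subtype, a diastolic dysfunction marker, and treatment modality.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 523
df = pd.DataFrame({
    "followup_years": rng.exponential(4, n),
    "death": rng.integers(0, 2, n),
    "atypical_as": rng.integers(0, 2, n),     # cLFLG or pLFLG vs. typical AS
    "lvedp_ge_20": rng.integers(0, 2, n),     # diastolic dysfunction marker
    "tavr": rng.integers(0, 2, n),            # TAVR vs. SAVR
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="death")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs and p-values
```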


Subjects
Aortic Valve Stenosis, Heart Valve Prosthesis Implantation, Heart Valve Prosthesis, Transcatheter Aortic Valve Replacement, Humans, Aortic Valve/diagnostic imaging, Aortic Valve/surgery, Heart Valve Prosthesis Implantation/adverse effects, Treatment Outcome, Aortic Valve Stenosis/diagnostic imaging, Aortic Valve Stenosis/surgery, Transcatheter Aortic Valve Replacement/adverse effects, Risk Factors, Severity of Illness Index
6.
Crit Care Explor ; 6(3): e1066, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38505174

ABSTRACT

OBJECTIVES: Alcohol withdrawal syndrome (AWS) may progress to require high-intensity care. Approaches to identify hospitalized patients with AWS who received a higher level of care have not been previously examined. This study aimed to examine the utility of Clinical Institute Withdrawal Assessment for Alcohol, Revised (CIWA-Ar) scale scores and of medication doses used for alcohol withdrawal management in identifying patients who received high-intensity care. DESIGN: A multicenter observational cohort study of hospitalized adults with alcohol withdrawal. SETTING: University of Chicago Medical Center and University of Wisconsin Hospital. PATIENTS: Inpatient encounters between November 2008 and February 2022 with a CIWA-Ar score greater than 0 and a benzodiazepine or barbiturate administered within the first 24 hours. The primary composite outcome was progression to high-intensity care (intermediate care or ICU). INTERVENTIONS: None. MAIN RESULTS: Among the 8742 patients included in the study, 37.5% (n = 3280) progressed to high-intensity care. The odds ratio for the composite outcome increased above 1.0 at a CIWA-Ar score of 24. The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) at this threshold were 0.12 (95% CI, 0.11-0.13), 0.95 (95% CI, 0.94-0.95), 0.58 (95% CI, 0.54-0.61), and 0.64 (95% CI, 0.63-0.65), respectively. The odds ratio increased above 1.0 at a 24-hour lorazepam milligram-equivalent dose cutoff of 15 mg. The sensitivity, specificity, PPV, and NPV at this threshold were 0.16 (95% CI, 0.14-0.17), 0.96 (95% CI, 0.95-0.96), 0.68 (95% CI, 0.65-0.72), and 0.65 (95% CI, 0.64-0.66), respectively. CONCLUSIONS: Neither CIWA-Ar scores nor medication dose cutoff points were effective measures for identifying patients with alcohol withdrawal who received high-intensity care. Research studies examining outcomes in patients who deteriorate with AWS will require better methods for cohort identification.
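
For clarity on how the reported test characteristics are derived, below is a small Python sketch computing sensitivity, specificity, PPV, and NPV at a score cutoff from a 2x2 classification. The cutoff and data are hypothetical examples, not the study data.

```python
# Illustrative sketch: test characteristics of a CIWA-Ar cutoff for predicting
# progression to high-intensity care, computed from a confusion matrix.
import numpy as np

def test_characteristics(score, outcome, cutoff):
    pred = score >= cutoff                  # "positive" at or above the cutoff
    tp = np.sum(pred & outcome)
    fp = np.sum(pred & ~outcome)
    fn = np.sum(~pred & outcome)
    tn = np.sum(~pred & ~outcome)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn)}

rng = np.random.default_rng(4)
ciwa_max_24h = rng.integers(1, 40, 1000)                       # max score in first 24 h
high_intensity = rng.random(1000) < (0.1 + 0.2 * (ciwa_max_24h > 20))
print(test_characteristics(ciwa_max_24h, high_intensity, cutoff=24))
```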

7.
Int J Heart Fail ; 6(1): 36-43, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38303916

ABSTRACT

Background and Objectives: Atrial fibrillation is common in patients with cardiac amyloidosis. However, the optimal anticoagulation strategy to prevent thromboembolic events in patients with cardiac amyloidosis and atrial fibrillation is unknown. This systematic review and meta-analysis compares direct oral anticoagulants (DOACs) vs. vitamin K antagonists (VKAs) in patients with cardiac amyloidosis and atrial fibrillation. Methods: We performed a systematic literature review to identify clinical studies of anticoagulation therapies for patients with cardiac amyloidosis and atrial fibrillation. The primary outcomes of major bleeding and thrombotic events were reported using random-effects risk ratios (RRs) with 95% confidence intervals (CIs). Results: Our search yielded 97 potential studies, of which 14 full-text articles were evaluated after title and abstract screening. We excluded 10 studies that were review articles or did not compare anticoagulation, and included 4 studies reporting on 1,579 patients. The pooled estimates are likely underpowered due to small sample sizes. There was no difference in bleeding events for patients with cardiac amyloidosis and atrial fibrillation treated with DOACs compared to VKAs, with an RR of 0.64 (95% CI, 0.38-1.10; p=0.10). There were fewer thrombotic events for patients with cardiac amyloidosis and atrial fibrillation treated with DOACs compared to VKAs, with an RR of 0.50 (95% CI, 0.32-0.79; p=0.003). Conclusions: This systematic review and meta-analysis suggests that DOACs are as safe and effective as VKAs in patients with cardiac amyloidosis and atrial fibrillation. However, more data are needed to investigate clinical differences in anticoagulation therapy in this patient population.
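
As a sketch of the general technique behind the pooled estimates, the snippet below implements DerSimonian-Laird random-effects pooling of study-level risk ratios on the log scale. The four (events, total) pairs are invented placeholders, not the data from the included studies.

```python
# Illustrative sketch: DerSimonian-Laird random-effects pooling of log risk ratios.
import numpy as np
from scipy import stats

# Each row: events/total in DOAC arm, events/total in VKA arm (hypothetical)
studies = [(12, 150, 20, 160), (8, 120, 15, 130), (20, 400, 35, 420), (5, 90, 9, 95)]

log_rr, var = [], []
for a, n1, c, n2 in studies:
    log_rr.append(np.log((a / n1) / (c / n2)))
    var.append(1/a - 1/n1 + 1/c - 1/n2)        # variance of the log RR
log_rr, var = np.array(log_rr), np.array(var)

w = 1 / var                                    # fixed-effect weights
fixed = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - fixed) ** 2)          # Cochran's Q
tau2 = max(0.0, (q - (len(studies) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (var + tau2)                        # random-effects weights
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
p = 2 * (1 - stats.norm.cdf(abs(pooled / se)))
print(f"Pooled RR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), p = {p:.3f}")
```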

8.
J Neurol Surg B Skull Base ; 85(1): 67-74, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38274479

ABSTRACT

Objective The evolution of acoustic neuroma (AN) care continues to shift its focus toward balancing optimized tumor resection and control with preservation of neurological function. Prior learning curve analyses of AN resection have demonstrated a plateau between 20 and 100 surgeries. In this study of 860 consecutive AN surgeries, we investigate the presence of an extended learning curve tail for AN resection. Methods A retrospective cohort study of AN resections by a single interdisciplinary team between 1988 and 2018 was performed. Proportional odds models and restricted cubic splines were used to determine the association between the timing of surgery and the odds of improved postoperative outcomes. Results The likelihood of improved postoperative House-Brackmann (HB) scores increased over the first 400 procedures, with HB 1 at 36% in 1988 compared with 79% in 2004. While the probability of a better HB score increased over time, there was a temporary decrease in the slope of the cubic spline between 2005 and 2009. The last 400 cases continued to show improvement in optimal HB outcomes: adjusted odds of an HB 1 score were twofold higher in both 2005 to 2009 (adjusted odds ratio [aOR]: 2.11, 95% confidence interval [CI]: 1.38-3.22, p < 0.001) and 2010 to 2018 (aOR: 2.18, 95% CI: 1.49-3.19, p < 0.001). Conclusion In contrast to prior studies, our study demonstrates that the steepest learning, as measured by rates of facial function preservation (HB 1), occurs in the first 400 AN resections. Additionally, improvements in patient outcomes continued even 30 years into practice, underlining the importance of lifelong learning.
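
As a simplified stand-in for the proportional odds / restricted cubic spline analysis described, the sketch below models the probability of an optimal HB grade 1 outcome as a smooth function of case number using a natural cubic spline in a logistic model. The data are simulated to echo the reported 36% to 79% trajectory and do not reproduce the study's actual model.

```python
# Illustrative sketch: learning-curve modeling with a natural cubic spline
# (patsy's cr() transform) inside a logistic regression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
cases = np.arange(1, 861)
# Simulated learning curve: probability of HB 1 rises steeply over ~400 cases
p_hb1 = 0.36 + 0.43 * (1 - np.exp(-cases / 250))
data = pd.DataFrame({"case_no": cases,
                     "hb1": (rng.random(len(cases)) < p_hb1).astype(int)})

fit = smf.logit("hb1 ~ cr(case_no, df=4)", data=data).fit(disp=0)

grid = pd.DataFrame({"case_no": [50, 200, 400, 600, 860]})
print(grid.assign(pred_prob_hb1=fit.predict(grid).round(3)))
```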

9.
Biomedicines ; 11(11)2023 Nov 15.
Article in English | MEDLINE | ID: mdl-38002060

ABSTRACT

Cytomegalovirus (CMV) and BK polyomavirus (BKPyV) are the most common opportunistic pathogens following kidney transplantation. We evaluated 102 kidney transplant recipients with a median age of 63 years at Edward Hines VA Hospital from November 2020 to December 2022. Our primary interest was the incidence of CMV and BKPyV infections, as well as CMV and BKPyV coinfection. Secondary interests included time to infection, rejection, and graft and patient survival. There were no statistically significant differences in patient age, donor age, race, transplant type, incidence of delayed graft function, or induction between cohorts (any infection [n = 46] vs. no infection [n = 56]). There was a 36% (37/102) incidence of CMV, a 17.6% (18/102) incidence of BKPyV, and an 8.8% (9/102) incidence of coinfection. There was a lower incidence of CMV infection with basiliximab induction versus antithymocyte globulin (21% vs. 43%). CMV risk status had no effect on the incidence of CMV infection following transplant. African American recipients had a lower incidence of BKPyV infection (12% vs. 39%), yet a higher incidence was observed in those with high cPRA (50% vs. 14%). Most CMV and/or BKPyV infections occurred within the first six months post-transplant (54%). Immunosuppression management in elderly recipients should be continually evaluated to reduce opportunistic infections post-transplant.

10.
J Am Heart Assoc ; 12(19): e028342, 2023 10 03.
Article in English | MEDLINE | ID: mdl-37750587

ABSTRACT

Background Isolated cardiac sarcoid (iCS) is reported to have more severe clinical presentation and greater risk of adverse events compared with cardiac sarcoid (CS) with extracardiac involvement (nonisolated CS). Delays in diagnosing specific organ involvement may play a role in these described differences. Methods and Results A retrospective observational study of patients with CS over a 20-year period was conducted. Objective evidence of organ involvement and time of onset based on consensus criteria were identified. CS was confirmed by histology in all patients from myocardium only (iCS) or extracardiac tissue (nonisolated CS). The primary end point was a composite of mortality, orthotopic heart transplant, and durable left ventricular assist device implantation. CS was isolated in 9 of 50 patients (18%). Among baseline characteristics, iCS and nonisolated CS differed significantly only in the frequency of sustained ventricular tachycardia at presentation (78% versus 37%; P=0.03) and delay in CS diagnosis >6 months (67% versus 5%; P<0.01). A nonsignificant trend toward lower left ventricular ejection fraction and more frequent heart failure in iCS was observed. Over a median follow-up of 9.7 years (95% CI, 6.8-10.8), 18 patients reached the primary end point (13 deaths, 2 orthotopic heart transplants, and 3 durable left ventricular assist device implantations). The 1-, 5-, and 10-year event-free survival rates were 96% (95% CI, 85%-99%), 79% (95% CI, 64%-88%), and 58% (95% CI, 40%-73%), respectively, without differences between groups. There were no significant predictors of the primary end point, including delayed CS diagnosis. Conclusions Long-term outcomes were similar between iCS and nonisolated CS in patients with histologically documented sarcoid. Diagnostic delays may contribute to differences in the dominant clinical presentation, despite similar outcomes.


Subjects
Cardiomyopathies, Sarcoidosis, Humans, Prognosis, Delayed Diagnosis, Cardiomyopathies/diagnosis, Cardiomyopathies/therapy, Stroke Volume, Left Ventricular Function, Sarcoidosis/complications, Sarcoidosis/diagnosis, Sarcoidosis/therapy, Retrospective Studies
11.
Alcohol Clin Exp Res (Hoboken) ; 47(5): 908-918, 2023 May.
Article in English | MEDLINE | ID: mdl-37526580

ABSTRACT

BACKGROUND: Nurses and other first responders are at high risk of exposure to the SARS-CoV-2 virus, and many have developed severe COVID-19 infection. A better understanding of the factors that increase the risk of infection after exposure to the virus could help to address this. Although several risk factors such as obesity, diabetes, and hypertension have been associated with an increased risk of infection, many first responders develop severe COVID-19 without established risk factors. As inflammation and cytokine storm are the primary mechanisms in severe COVID-19, other factors that promote an inflammatory state could increase the risk of COVID-19 in exposed individuals. Alcohol misuse and shift work with subsequent misaligned circadian rhythms are known to promote a pro-inflammatory state and thus could increase susceptibility to COVID-19. To test this hypothesis, we conducted a prospective, cross-sectional, observational survey-based study in nurses using the American Nursing Association network. METHOD: We used validated structured questionnaires to assess alcohol consumption (the Alcohol Use Disorders Identification Test) and circadian typology or chronotype (the Munich Chronotype Questionnaire for shift workers, MCTQ-Shift). RESULTS: By latent class analysis (LCA), high-risk features of alcohol misuse were associated with a later chronotype, and binge drinking was greater in night-shift workers. The night shift was associated with more than double the odds of COVID-19 infection compared with the standard shift (OR: 2.67, 95% CI: 1.18 to 6.07). Binge drinkers had twice the odds of COVID-19 infection compared with those with low-risk features by LCA (OR: 2.08, 95% CI: 0.75 to 5.79). CONCLUSION: Working night shifts or binge drinking may be risk factors for COVID-19 infection among nurses. Understanding the mechanisms underlying these risk factors could help to mitigate the impact of COVID-19 on our at-risk healthcare workforce.
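
The odds ratios quoted above come from logistic-regression-style comparisons; the sketch below shows how such ORs with 95% CIs can be obtained in Python. The predictors, column names, and data are hypothetical and do not reproduce the latent class analysis itself.

```python
# Illustrative sketch: odds ratios with 95% CIs from a logistic regression of
# COVID-19 infection on night-shift work and binge drinking.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 800
df = pd.DataFrame({
    "night_shift": rng.integers(0, 2, n),
    "binge_drinking": rng.integers(0, 2, n),
})
logit_p = -2.0 + 0.9 * df["night_shift"] + 0.7 * df["binge_drinking"]
df["covid"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("covid ~ night_shift + binge_drinking", data=df).fit(disp=0)
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.drop(index="Intercept"))
```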

12.
Clin Neuroradiol ; 33(4): 1123-1131, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37410170

ABSTRACT

PURPOSE: Acute traumatic osseous and cartilaginous injuries to the larynx are rare presentations to the emergency department. Despite the low reported incidence, laryngeal trauma carries a high morbidity and mortality. The purpose of this study is to identify fracture and soft tissue injury patterns in laryngeal trauma and to explore associations with patient demographics, mechanism of injury, and urgent airway and surgical intervention. METHODS: A retrospective review of patients with laryngeal injury who underwent multidetector computed tomography (MDCT) imaging was performed. The CT findings of laryngeal and hyoid fracture location, fracture displacement, and soft tissue injuries were recorded. Clinical data, including patient demographics, mechanism of injury, and frequency of airway and surgical intervention, were also recorded. Correlations of imaging characteristics with patient demographics, mechanism of injury, and interventions were assessed for statistical significance using χ² and Fisher's exact tests. RESULTS: The median patient age was 40 years, with a strong male predominance. The most common mechanisms of injury included motor vehicle collisions and penetrating gunshot wounds. Thyroid cartilage fractures were the most common fracture type. Fracture displacement and airway hematoma were more strongly associated with the need for urgent airway management. CONCLUSION: Radiologists' early recognition and prompt communication of laryngeal trauma to the clinical service are important to reduce associated morbidity and mortality. Displaced fractures and laryngeal hematomas should be promptly conveyed to the clinical service, as they are associated with more complex injuries and higher rates of urgent airway management and surgical intervention.


Subjects
Larynx, Spinal Fractures, Gunshot Wounds, Penetrating Wounds, Humans, Male, Adult, Female, Larynx/diagnostic imaging, Larynx/injuries, Multidetector Computed Tomography, Retrospective Studies
13.
Urogynecology (Phila) ; 29(8): 678-686, 2023 08 01.
Article in English | MEDLINE | ID: mdl-37490707

ABSTRACT

IMPORTANCE: A greater understanding of the relationship between toileting behaviors and lower urinary tract symptoms (LUTS) has the potential to generate awareness and improvement of overall bladder health in specific populations. OBJECTIVE: The aim of the study was to investigate the prevalence and correlation between maladaptive toileting behaviors and LUTS among female medical trainees and attending physicians. STUDY DESIGN: We surveyed female medical students, residents, fellows, and attending physicians at an academic hospital, capturing demographics, voiding behaviors, LUTS, and fluid intake using the Bristol Female Lower Urinary Tract Symptoms Short Form, the Toileting Behavior-Women's Elimination Behaviors, and the Beverage Intake Questionnaire. RESULTS: A total of 146 medical students and physicians participated in the study. Eighty-three percent reported at least 1 LUTS, most commonly storage symptoms, particularly incontinence (30%, stress urinary incontinence > urgency urinary incontinence). Altered toileting behaviors included "worrying about public toilet cleanliness" (82%), "emptying the bladder before leaving home" (81%), "delaying emptying their bladder when busy" (87%), and "waiting until they could not hold urine any longer" (57%). Total Toileting Behavior-Women's Elimination Behaviors scores were significantly associated with total Bristol Female Lower Urinary Tract Symptoms scores (β = 0.27; 95% CI, 0.12-0.42; P < 0.01). This remained true after adjusting for total fluid intake in medical students (β = 0.41, P < 0.01) and resident physicians (β = 0.28, P = 0.03) but was not correlated among attending physicians (β = -0.07, P = 0.77). CONCLUSIONS: Female physicians and medical students experience a high prevalence of LUTS. Many engage in maladaptive toileting behaviors, which highly correlate with LUTS (especially among medical students and residents) and may lead to impaired bladder health.


Subjects
Lower Urinary Tract Symptoms, Physicians, Medical Students, Urinary Incontinence, Female, Humans, Lower Urinary Tract Symptoms/epidemiology, Urination, Urinary Bladder, Urinary Incontinence/epidemiology
14.
Clin Shoulder Elb ; 26(2): 169-174, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37316178

ABSTRACT

BACKGROUND: Sleep quality, quantity, and efficiency have all been demonstrated to be adversely affected by rotator cuff pathology. Previous measures of assessing the impact of rotator cuff pathology on sleep have been largely subjective in nature. This study was undertaken to objectively analyze this relationship through the use of activity monitors. METHODS: Patients with full-thickness rotator cuff tears at a single institution were prospectively enrolled between 2018 and 2020. Waist-worn accelerometers were provided for the patients to use each night for 14 days. Sleep efficiency was calculated as the ratio of time spent asleep to total time spent in bed. Retraction of the rotator cuff tear was classified using the Patte staging system. RESULTS: This study included 36 patients: 18 with Patte stage 1 disease, 14 with Patte stage 2 disease, and 4 with Patte stage 3 disease. During the study, 25 participants wore the monitor on multiple nights, and ultimately their data were used for the analysis. No difference in median sleep efficiency was appreciated among these groups (P > 0.1), with each cohort demonstrating generally high sleep efficiency. CONCLUSIONS: The severity of retraction of the rotator cuff tear did not appear to correlate with changes in sleep efficiency for patients (P > 0.1). These findings can better inform providers on how to counsel their patients who present with complaints of poor sleep in the setting of full-thickness rotator cuff tears. Level of evidence: Level II.
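
A short sketch of the sleep-efficiency calculation described above follows; the field names and values are hypothetical placeholders for accelerometer-derived nightly data.

```python
# Illustrative sketch: nightly sleep efficiency = time asleep / time in bed,
# summarized per patient as the median across nights.
import pandas as pd

nights = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "minutes_asleep": [380, 410, 365, 300, 330],
    "minutes_in_bed": [450, 460, 440, 420, 415],
})
nights["sleep_efficiency"] = nights["minutes_asleep"] / nights["minutes_in_bed"]

# Median nightly efficiency per patient (patients with multiple nights analyzed)
print(nights.groupby("patient_id")["sleep_efficiency"].median().round(3))
```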

15.
J Gastrointest Surg ; 27(9): 1794-1803, 2023 09.
Article in English | MEDLINE | ID: mdl-37316761

ABSTRACT

BACKGROUND: Neoadjuvant tyrosine kinase inhibitor (TKI) therapy has reduced tumor burden and improved survival in both primary and recurrent gastrointestinal stromal tumors (GISTs). However, no clear guidelines exist on optimal patient selection for neoadjuvant therapy (NAT). Our aim was to analyze factors and outcomes associated with the therapeutic sequence of TKI therapy before and/or after surgery for gastric GISTs. METHODS: We performed a retrospective study of patients surgically treated for a gastric GIST utilizing the 2006-2018 National Cancer Database. We examined demographic, clinical, and pathological characteristics associated with NAT versus adjuvant therapy (AT) using logistic regression. RESULTS: Of the 3732 patients, 20.4% received NAT and 79.6% had AT. Among patients receiving therapy, NAT significantly increased over our study period (12% to 30.7%). A majority of the AT group received a partial gastrectomy (77.9%) compared with the NAT group who received more near-total/total gastrectomy or gastrectomy with en bloc resection (p < 0.001). In a multivariable model, patients were more likely to receive NAT when insured (private, aOR: 2.37, 95% CI: 1.31-4.29), treated at an academic/research program (aOR: 1.83, 95% CI: 1.49-2.56), had tumors located in the proximal stomach (aOR: 1.40, 95% CI: 1.06-1.86), tumor size > 10 cm (aOR: 1.88, 95% CI: 1.41-2.51), and received near-total/total gastrectomy (aOR: 1.81, 95% CI: 1.42-2.29). There were no differences in outcomes. CONCLUSION: NAT for gastric GIST has increased in utilization. NAT was used in patients with larger tumors and who underwent more extensive resection. Despite these factors, outcomes were similar to patients receiving only AT. More studies are required to determine the therapeutic sequence for gastric GISTs.


Subjects
Gastrointestinal Stromal Tumors, Stomach Neoplasms, Humans, Gastrointestinal Stromal Tumors/drug therapy, Gastrointestinal Stromal Tumors/surgery, Neoadjuvant Therapy, Retrospective Studies, Local Neoplasm Recurrence/surgery, Stomach Neoplasms/surgery, Stomach Neoplasms/pathology, Gastrectomy
16.
Nutr Clin Pract ; 38(5): 1009-1020, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37312258

ABSTRACT

BACKGROUND: Patients with low muscle mass and acute SARS-CoV-2 infection meet the Global Leadership Initiative on Malnutrition (GLIM) phenotypic and etiologic criteria for diagnosing malnutrition, respectively. However, available cut-points to classify individuals with low muscle mass are not straightforward. Using computed tomography (CT) to determine low muscularity, we assessed the prevalence of malnutrition using the GLIM framework and associations with clinical outcomes. METHODS: A retrospective cohort study was conducted, gathering patient data from various clinical resources. Patients admitted to the COVID-19 unit (March 2020 to June 2020) with appropriate/evaluable CT studies (chest or abdomen/pelvis) within the first 5 days of admission were considered eligible. Sex- and vertebral-level-specific skeletal muscle indices (SMI; cm²/m²) from healthy controls were used to determine low muscle mass. Injury-adjusted SMI cut-points, extrapolated from cancer-derived cut-points, were also derived and explored. Descriptive statistics and mediation analyses were completed. RESULTS: Patients (n = 141) had a mean age of 58.2 years and were racially diverse. Obesity (46%), diabetes (40%), and cardiovascular disease (68%) were prevalent. Using healthy-control and injury-adjusted SMI cut-points, malnutrition prevalence was 26% (36/141) and 50% (71/141), respectively. Mediation analyses demonstrated a significant reduction in the effect of malnutrition on outcomes in the presence of the Acute Physiology and Chronic Health Evaluation II score, supporting a mediating effect of severity of illness on intensive care unit (ICU) admission, ICU length of stay, mechanical ventilation, complex respiratory support, discharge status (all P values = 0.03), and 28-day mortality (P = 0.04). CONCLUSIONS: Future studies involving the GLIM criteria should consider these collective findings in their design, analyses, and implementation.
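
To illustrate the classification step in the GLIM framework described above, here is a minimal sketch that flags low muscularity from a CT-derived skeletal muscle index using sex-specific cut-points and combines it with an etiologic criterion. The cut-point values and column names are hypothetical placeholders, not the study's thresholds.

```python
# Illustrative sketch: GLIM-style malnutrition flag from CT-derived SMI
# (phenotypic criterion) plus acute disease burden (etiologic criterion).
import pandas as pd

# Hypothetical sex-specific SMI cut-points (cm^2/m^2) at one vertebral level
SMI_CUTPOINT = {"F": 34.0, "M": 45.0}

patients = pd.DataFrame({
    "sex": ["F", "M", "M", "F"],
    "smi_cm2_m2": [30.5, 52.1, 41.0, 36.2],
    "acute_covid": [True, True, True, True],   # etiologic criterion: acute disease/inflammation
})

patients["low_muscle_mass"] = patients.apply(
    lambda r: r["smi_cm2_m2"] < SMI_CUTPOINT[r["sex"]], axis=1)
# GLIM diagnosis requires at least one phenotypic + one etiologic criterion
patients["glim_malnutrition"] = patients["low_muscle_mass"] & patients["acute_covid"]
print(patients)
```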


Subjects
COVID-19, Malnutrition, Humans, Leadership, Retrospective Studies, COVID-19/epidemiology, SARS-CoV-2, X-Ray Computed Tomography, Malnutrition/diagnosis, Malnutrition/epidemiology, Nutrition Assessment, Nutritional Status
17.
J Burn Care Res ; 2023 Jun 21.
Article in English | MEDLINE | ID: mdl-37339870

ABSTRACT

The Burn Care Quality Platform (BCQP) consolidates data previously collected from the National Burn Repository and the Burn Quality Improvement Program into a single registry. Its data elements and their associated definitions are tailored to create consistency with other national trauma registries, namely the National Trauma Data Bank implemented through the American College of Surgeons Trauma Quality Improvement Program (ACS TQIP). The BCQP now includes 103 participating burn centers and has captured data from 375,000 total patients as of 2021. With 12,000 patients entered under the current data dictionary, the BCQP represents the largest registry of its kind. Written on behalf of the American Burn Association Research Committee, this whitepaper aims to provide a succinct overview of the BCQP, showcasing its unique features, strengths, limitations, and relevant statistical considerations. It highlights the resources available to the burn research community and offers insight into proper study design when preparing to conduct a large-dataset investigation in burn care. All recommendations herein were formulated through the consensus of a multidisciplinary committee and based on the available scientific evidence.

18.
AIDS Care ; 35(8): 1251-1258, 2023 08.
Article in English | MEDLINE | ID: mdl-37128634

ABSTRACT

People living with HIV/AIDS (PLWHA) have long experienced structural, community, and personal stigma. We explored differences in experienced HIV-related stigma according to race/ethnicity using quantitative and qualitative measures. Sixty-four patients were enrolled in this study (22 White and 42 people of color [POC]). POC scored higher than White PLWHA on all 12 survey statements, with statistically significant differences in disclosure concerns and with one of the statements on public attitudes towards PLWHA. Common themes in the qualitative interview were HIV disclosure concerns and fear of rejection. These data demonstrate that stigma continues to be a significant concern for PLWHA, particularly POC, meaningfully impacting their lives. By acknowledging and working to reduce negative perceptions about PLWHA, physicians may improve care for their patients by developing more trusting relationships.


Subjects
HIV Infections, Humans, Ethnicity, Social Stigma, Disclosure, Surveys and Questionnaires
19.
Arch Gynecol Obstet ; 308(3): 919-926, 2023 09.
Article in English | MEDLINE | ID: mdl-37170033

ABSTRACT

INTRODUCTION AND HYPOTHESIS: Limited health literacy (HL) is a risk factor for poor patient outcomes, including pain. Chronic pelvic pain (CPP) is a prevalent disorder affecting up to 25% of women and coexists with multiple overlapping conditions. This study aimed to describe health literacy in women with CPP, to primarily correlate HL with pain intensity and pain duration, and to secondarily correlate HL with mood symptoms and pain catastrophizing. We hypothesized that women with CPP with higher HL would report lower pain intensity and shorter pain duration. METHODS: This was a prospective, cross-sectional study. Forty-five women with CPP were recruited from outpatient Physical Medicine & Rehabilitation and Female Pelvic Medicine & Reconstructive Surgery clinics. Validated questionnaires were administered to evaluate pain intensity and duration, pain disability, psychological symptoms, pain catastrophizing, and health literacy. Statistical analyses included descriptive statistics of patient characteristics and summary scores, as well as Spearman's rank correlation coefficients (rho) to assess the strength of associations between summary scores and health literacy. RESULTS: Forty-five women with CPP were enrolled, with a mean age of 49 years, a majority non-Hispanic White, and a median chronic pelvic pain duration of 7 years. A possible or high likelihood of limited health literacy was identified in 20% of women with CPP (11.1% and 8.9%, respectively). Limited health literacy was moderately correlated with pain intensity, depressive symptoms, and pain catastrophizing. Pain duration was not significantly correlated with health literacy. The remaining 80% of women with CPP were likely to have adequate health literacy. CONCLUSIONS: A majority of women with CPP in this single-center study were likely to have adequate health literacy. Limited health literacy was seen in a minority of women with CPP but was moderately correlated with greater pain intensity, more depressive symptoms, and higher pain catastrophizing. This study found that most women with CPP were likely to have adequate HL but underscores the importance of considering HL screening and interventions in those with higher pain intensity, depression, and pain catastrophizing.
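
The Spearman rank correlations used here can be computed as in the brief sketch below; the scores and column names are hypothetical examples, not the study data.

```python
# Illustrative sketch: Spearman rank correlations between a health literacy
# score and pain-related summary scores.
import pandas as pd
from scipy.stats import spearmanr

df = pd.DataFrame({
    "health_literacy": [14, 10, 22, 18, 9, 25, 16, 12, 20, 11],
    "pain_intensity":  [8, 9, 4, 6, 9, 3, 5, 7, 4, 8],
    "catastrophizing": [38, 42, 15, 22, 45, 10, 20, 33, 18, 40],
})

for col in ["pain_intensity", "catastrophizing"]:
    rho, p = spearmanr(df["health_literacy"], df[col])
    print(f"HL vs {col}: rho = {rho:.2f}, p = {p:.3f}")
```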


Subjects
Chronic Pain, Health Literacy, Female, Humans, Middle Aged, Male, Pelvic Pain/etiology, Cross-Sectional Studies, Prospective Studies
20.
Liver Transpl ; 29(11): 1192-1198, 2023 11 01.
Article in English | MEDLINE | ID: mdl-37076131

ABSTRACT

The donor operation and the hemodynamics during declaration, which determine donor warm ischemia time, have been linked to outcomes in donation after circulatory death (DCD) liver transplantation (LT). Scrutiny of donor hemodynamics at the time of withdrawal of life support has suggested that functional donor warm ischemia time may be associated with LT graft failure. Unfortunately, the definition of functional donor warm ischemia time has not reached consensus, but it has almost always incorporated time spent in a hypoxic state. Herein, we reviewed 1114 DCD LT cases performed at the 20 highest-volume centers between 2014 and 2018. Donor hypoxia began within 3 minutes of withdrawal of life support for 60% of cases and within 10 minutes for 95% of cases. Graft survival was 88.3% at 1 year and 80.3% at 3 years. Scrutinizing the time spent under hypoxic conditions (oxygen saturation ≤ 80%) during the withdrawal of life support, we found an increasing risk of graft failure as hypoxic time increased from 0 to 16 minutes. After 16 minutes and up to 50 minutes, we did not find any increased risk of graft failure. In conclusion, after 16 minutes of hypoxia time, the risk of graft failure in DCD LT did not increase. The current evidence suggests that an over-reliance on hypoxia time may lead to an unnecessary increase in DCD liver discard and may not be as useful for predicting graft loss after LT.
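
As a rough illustration of how graft survival can be compared across hypoxia-time groups, the sketch below fits Kaplan-Meier curves stratified at the 16-minute threshold discussed above. It uses the lifelines package with simulated, hypothetical data and is not the registry analysis itself.

```python
# Illustrative sketch: Kaplan-Meier graft survival stratified by time spent
# with oxygen saturation <= 80% during withdrawal of life support.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(7)
n = 300
df = pd.DataFrame({
    "hypoxia_min": rng.uniform(0, 50, n),                 # minutes with SpO2 <= 80%
    "graft_years": rng.exponential(8, n).clip(max=5),     # follow-up, censored at 5 years
})
df["graft_failed"] = (rng.random(n) < 0.2).astype(int)
df["hypoxia_group"] = np.where(df["hypoxia_min"] <= 16, "<=16 min", ">16 min")

kmf = KaplanMeierFitter()
for grp, sub in df.groupby("hypoxia_group"):
    kmf.fit(sub["graft_years"], event_observed=sub["graft_failed"], label=grp)
    # Estimated graft survival at 1 and 3 years for this group
    print(grp, kmf.predict([1, 3]).round(3).to_dict())
```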


Subjects
Liver Transplantation, Tissue and Organ Procurement, Humans, Warm Ischemia/adverse effects, Liver Transplantation/adverse effects, Liver Transplantation/methods, Oxygen Saturation, Donor Selection, Risk Factors, Tissue Donors, Hypoxia/etiology, Graft Survival, Retrospective Studies, Death