Results 1 - 13 of 13
1.
J Am Geriatr Soc ; 2024 Aug 08.
Article in English | MEDLINE | ID: mdl-39115437

ABSTRACT

BACKGROUND: Alzheimer's disease, the most common type of dementia, accounts for up to 80% of dementia diagnoses and is the sixth leading cause of death in the United States. An estimated 38,000 American Indian/Alaska Native (AI/AN) people aged ≥65 years were living with Alzheimer's disease and related dementias (ADRD) in 2020, a number expected to double by 2030 and quadruple by 2050. Administrative healthcare data from the Indian Health Service (IHS) were used to estimate ADRD among AI/AN populations. METHODS: Administrative IHS healthcare data from federal fiscal years 2016 to 2020 from the IHS National Data Warehouse were used to calculate the count and rate per 100,000 AI/AN adults aged ≥45 years with at least one ADRD diagnosis code on their medical record. RESULTS: This study identified 12,877 AI/AN adults aged ≥45 years with an ADRD diagnosis code, an overall rate of 514 per 100,000. Of those, 1856 people were aged 45-64 years. Females were 1.2 times (95% confidence interval: 1.1-1.2) more likely than males to have a medical visit with an ADRD diagnosis code. CONCLUSIONS: Many AI/AN people with ADRD rely on IHS, tribal, and urban Indian health programs. The high burden of ADRD among AI/AN people aged 45-64 years who use IHS health services highlights the need to implement ADRD risk-reduction strategies and to assess and diagnose ADRD in younger AI/AN populations. This study provides a baseline for assessing future progress in efforts to address ADRD in AI/AN communities.
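As a rough arithmetic illustration of the rate reported above (a hypothetical sketch: the population denominator is back-calculated from the abstract's count and rate, not taken from the source):

```python
# Back-calculate the implied population denominator from the abstract's
# reported count (12,877 cases) and rate (514 per 100,000). Illustrative only.
cases = 12_877                 # AI/AN adults aged >=45 with an ADRD diagnosis code
rate_per_100k = 514            # reported overall rate per 100,000
population = round(cases / rate_per_100k * 100_000)
print(f"Implied denominator: ~{population:,} adults aged >=45")  # ~2,505,253
```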

3.
PLOS Digit Health ; 3(6): e0000527, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38935590

ABSTRACT

Study-specific data quality testing is an essential part of minimizing analytic errors, particularly for studies making secondary use of clinical data. We applied a systematic and reproducible approach for study-specific data quality testing to the analysis plan for PRESERVE, a 15-site, EHR-based observational study of chronic kidney disease in children. This approach integrated widely adopted data quality concepts with healthcare-specific evaluation methods. We implemented two rounds of data quality assessment. The first produced high-level evaluation using aggregate results from a distributed query, focused on cohort identification and main analytic requirements. The second focused on extended testing of row-level data centralized for analysis. We systematized reporting and cataloguing of data quality issues, providing institutional teams with prioritized issues for resolution. We tracked improvements and documented anomalous data for consideration during analyses. The checks we developed identified 115 and 157 data quality issues in the two rounds, involving completeness, data model conformance, cross-variable concordance, consistency, and plausibility, extending traditional data quality approaches to address more complex stratification and temporal patterns. Resolution efforts focused on higher priority issues, given finite study resources. In many cases, institutional teams were able to correct data extraction errors or obtain additional data, avoiding exclusion of 2 institutions entirely and resolving 123 other gaps. Other results identified complexities in measures of kidney function, bearing on the study's outcome definition. Where limitations such as these are intrinsic to clinical data, the study team must account for them in conducting analyses. This study rigorously evaluated fitness of data for intended use. The framework is reusable and built on a strong theoretical underpinning. Significant data quality issues that would have otherwise delayed analyses or made data unusable were addressed. This study highlights the need for teams combining subject-matter and informatics expertise to address data quality when working with real world data.
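As a minimal illustration of the style of study-specific checks described above (a hypothetical sketch: the column name, thresholds, and data are invented, not drawn from the PRESERVE protocol):

```python
import pandas as pd

def check_completeness(df: pd.DataFrame, col: str, max_missing: float = 0.05) -> bool:
    """Completeness check: pass only if missingness stays within tolerance."""
    return df[col].isna().mean() <= max_missing

def plausibility_violations(df: pd.DataFrame, col: str, lo: float, hi: float) -> pd.DataFrame:
    """Plausibility check: return rows outside a clinically plausible range."""
    return df[(df[col] < lo) | (df[col] > hi)]

# Invented example data: serum creatinine in mg/dL.
labs = pd.DataFrame({"serum_creatinine": [0.6, 0.9, 14.2, None]})
print(check_completeness(labs, "serum_creatinine"))                  # False: 25% missing
print(plausibility_violations(labs, "serum_creatinine", 0.1, 10.0))  # flags the 14.2 row
```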

4.
Transfusion ; 64(8): 1533-1542, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38783709

ABSTRACT

BACKGROUND: Whole blood transfusion has been found to increase the likelihood of patient survival in both military and civilian medicine. However, no whole blood transfusion training curriculum currently exists within undergraduate or graduate medical education in the United States. The purpose of our study was to: (1) determine the impact of simulation-based training on medical students' ability to conduct whole blood transfusions; and (2) determine the impact of simulation-based training on medical students' confidence in conducting whole blood transfusions. STUDY DESIGN AND METHODS: We assessed 157 third-year military medical students' ability to conduct whole blood transfusions before and after Operation Gunpowder, a 2-day high-fidelity prolonged casualty care simulation. We conducted a paired samples t-test to compare the students' pre- and post-simulation performance scores as well as their self-reported confidence and stress ratings. RESULTS: There was a significant difference between students' scores at the beginning of the course (M = 20.469, SD = 6.40675) and their scores at the end of the course (M = 30.361, SD = 2.10053); t(155) = -18.833, p < .001. The effect size for this analysis (d = 6.56) was large. There was a significant difference (p < .001) between the pre- and post-ratings for all self-reported confidence and stress survey items. DISCUSSION: Our results suggest that simulation-based training is an effective means of preparing medical students to conduct whole blood transfusions in a resource-limited simulated environment where blood inventories may be constrained.
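A minimal sketch of the paired analysis described above, with synthetic scores standing in for the study data (means and SDs loosely mirror those reported; nothing here reproduces the actual results):

```python
import numpy as np
from scipy import stats

# Synthetic pre/post performance scores; 156 pairs gives df = 155 as reported.
rng = np.random.default_rng(0)
pre = rng.normal(20.5, 6.4, size=156)    # pre-course performance scores
post = rng.normal(30.4, 2.1, size=156)   # post-course performance scores

t, p = stats.ttest_rel(pre, post)        # paired-samples t-test
diff = post - pre
d = diff.mean() / diff.std(ddof=1)       # Cohen's d for paired data
print(f"t({len(diff) - 1}) = {t:.2f}, p = {p:.3g}, d = {d:.2f}")
```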


Subjects
Blood Transfusion, Medical Students, Humans, Female, Male, Clinical Competence, Simulation Training/methods, Adult, Military Medicine/education, Curriculum
5.
Mil Med ; 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38687599

ABSTRACT

INTRODUCTION: Providing resilient Damage Control Resuscitation capabilities as close to the point of injury as possible is paramount to reducing mortality and improving patient outcomes for our nation's warfighters. Emergency Fresh Whole Blood Transfusions (EFWBT) play a critical role in supporting this capability, especially in future large-scale combat operations against peer adversaries, with their expected large patient volumes, restrictive operating environments, and unreliable logistical supply lines. Although there are service-specific training programs for whole blood transfusion, there is currently no dedicated EFWBT training for future military medical officers. To address this gap, we developed, implemented, and evaluated a training program to enhance EFWBT proficiency in third-year military medical students at the F. Edward Hébert School of Medicine at the Uniformed Services University (USU). MATERIALS AND METHODS: After reviewing the 75th Ranger Regiment Ranger O-Low Titer program, the Marine Corps' Valkyrie program, and the relevant Joint Trauma System Clinical Practice Guidelines, we created a streamlined, abbreviated training curriculum. The training consisted of online preparatory materials and a 2-hour in-person session that included didactic and experiential learning components. Participants were 165 active-duty third-year medical students at USU. Participants were assessed with a pre- and post-training self-reported questionnaire on their confidence in the practical application and administrative oversight requirements of an EFWBT program. Their knowledge was also assessed with a pre/post test of 10 multiple-choice questions identified as critical to understanding the academic principles of EFWBT, administered alongside the baseline questionnaire. RESULTS: The increase in mean self-reported confidence scores from pre- to post-assessment (2.32 to 3.95) was statistically significant (P < .001). Similarly, there was a statistically significant improvement in student test scores, with the mean score increasing by approximately 3 points, or 30%. There was no significant difference in confidence or test scores by branch of service. Students who had previously deployed did not show a statistically significant difference in scores compared to students who had not. CONCLUSIONS: Our results suggest that incorporating streamlined EFWBT training into the undergraduate medical education of future military medical officers offers an efficient way to improve their baseline proficiency in EFWBTs. Future research is needed to assess the impact of this training on real-world performance in forward-deployed environments.

6.
J Cardiopulm Rehabil Prev ; 44(4): 231-238, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38669319

ABSTRACT

PURPOSE: Cardiac rehabilitation (CR) improves patient outcomes and quality of life and can be provided virtually through hybrid CR. However, little is known about CR availability in conjunction with broadband access, a requirement for hybrid CR. This study examined the intersection of CR and broadband availability at the county level, nationwide. METHODS: Data were gathered and analyzed in 2022 from the 2019 American Community Survey, the Centers for Medicare & Medicaid Services, and the Federal Communications Commission. Spatially adaptive floating catchments were used to calculate county-level percent CR availability among Medicare fee-for-service beneficiaries. Counties were categorized by CR availability as lowest (ie, CR deserts), medium, or highest, and CR desert counties were further classified by broadband availability as CR deserts with majority-available broadband or as dual deserts. Results were stratified by state. County-level characteristics were examined for statistically significant differences by CR availability category. RESULTS: Almost half of US adults (n = 116 325 976, 47.2%) lived in CR desert counties (1691 counties). Among adults in CR desert counties, 96.8% (112 626 906) were in CR deserts with majority-available broadband. By state, the percentage of the adult population living in CR desert counties ranged from 3.2% (New Hampshire) to 100% (Hawaii and Washington, DC). Statistically significant differences in county CR availability existed by race/ethnicity, education, and income. CONCLUSIONS: Almost half of US adults live in CR deserts. Given that up to 97% of adults living in CR deserts may have broadband access, implementation of hybrid CR programs that include a telehealth component could expand CR availability to as many as 113 million US adults.


Subjects
Cardiac Rehabilitation, Health Services Accessibility, Humans, United States, Cardiac Rehabilitation/statistics & numerical data, Cardiac Rehabilitation/methods, Health Services Accessibility/statistics & numerical data, Male, Female, Aged, Middle Aged, Adult, Medicare/statistics & numerical data
7.
MMWR Surveill Summ ; 73(2): 1-11, 2024 May 02.
Article in English | MEDLINE | ID: mdl-38687830

ABSTRACT

Problem/Condition: A 2019 report quantified the higher percentage of potentially excess (preventable) deaths in U.S. nonmetropolitan areas compared with metropolitan areas during 2010-2017. In that report, CDC compared national, regional, and state estimates of preventable premature deaths from the five leading causes of death in nonmetropolitan and metropolitan counties during 2010-2017. This report provides estimates of preventable premature deaths for additional years (2010-2022). Period Covered: 2010-2022. Description of System: Mortality data for U.S. residents from the National Vital Statistics System were used to calculate preventable premature deaths from the five leading causes of death among persons aged <80 years. CDC's National Center for Health Statistics urban-rural classification scheme for counties was used to categorize the deaths according to the urban-rural classification level of the decedent's county of residence (1: large central metropolitan [most urban], 2: large fringe metropolitan, 3: medium metropolitan, 4: small metropolitan, 5: micropolitan, and 6: noncore [most rural]). Preventable premature deaths were defined as deaths among persons aged <80 years that exceeded the number expected if the death rates for each cause in all states were equivalent to those in the benchmark states (i.e., the three states with the lowest rates). Preventable premature deaths were calculated separately for the six urban-rural county categories nationally, the 10 U.S. Department of Health and Human Services public health regions, and the 50 states and the District of Columbia. Results: During 2010-2022, the percentage of preventable premature deaths among persons aged <80 years in the United States increased for unintentional injury (e.g., unintentional poisoning including drug overdose, unintentional motor vehicle traffic crash, unintentional drowning, and unintentional fall) and stroke, decreased for cancer and chronic lower respiratory disease (CLRD), and remained stable for heart disease. The percentages of preventable premature deaths from the five leading causes of death were higher in rural counties in all years during 2010-2022. When assessed by the six urban-rural county classifications, percentages of preventable premature deaths in the most rural counties (noncore) were consistently higher than in the most urban counties (large central metropolitan and large fringe metropolitan) for the five leading causes of death during the study period. During 2010-2022, preventable premature deaths from heart disease increased most in noncore (+9.5%) and micropolitan counties (+9.1%) and decreased most in large central metropolitan counties (-10.2%). Preventable premature deaths from cancer decreased in all county categories, with the largest decreases in large central metropolitan and large fringe metropolitan counties (-100.0%; the benchmark was achieved in both county categories in 2019). In all county categories, preventable premature deaths from unintentional injury increased, with the largest increases occurring in large central metropolitan (+147.5%) and large fringe metropolitan (+97.5%) counties. Preventable premature deaths from CLRD decreased most in large central metropolitan counties, where the benchmark was achieved in 2019, and increased slightly in noncore counties (+0.8%). In all county categories, preventable premature deaths from stroke decreased from 2010 to 2013, remained constant from 2013 to 2019, and then increased in 2020 at the start of the COVID-19 pandemic. Percentages of preventable premature deaths varied across states by urban-rural county classification during 2010-2022. Interpretation: During 2010-2022, nonmetropolitan counties had higher percentages of preventable premature deaths from the five leading causes of death than did metropolitan counties nationwide, across public health regions, and in most states. The gap between the most rural and most urban counties in preventable premature deaths increased during 2010-2022 for four causes of death (cancer, heart disease, CLRD, and stroke) and decreased for unintentional injury. Urban and suburban counties (large central metropolitan, large fringe metropolitan, medium metropolitan, and small metropolitan) experienced increases in preventable premature deaths from unintentional injury during 2010-2022, narrowing the gap with the already high percentage (approximately 69% in 2022) of preventable premature deaths in noncore and micropolitan counties. Sharp increases in preventable premature deaths from unintentional injury, heart disease, and stroke were observed in 2020, whereas preventable premature deaths from CLRD and cancer continued to decline. CLRD deaths decreased during 2017-2020 but increased in 2022. The increase in preventable premature deaths for multiple leading causes in 2020 was likely associated with COVID-19-related conditions that contributed to increased mortality from heart disease and stroke. Public Health Action: Routine tracking of preventable premature deaths by urban-rural county classification might enable public health departments to identify and monitor geographic disparities in health outcomes. These disparities might be related to differing levels of access to health care, social determinants of health, and other risk factors. Identifying areas with a high prevalence of potentially preventable mortality might inform interventions.
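A small worked sketch of the excess-death definition used above (all numbers are invented; the actual analysis applies cause-, age-, and state-specific benchmark rates):

```python
# Preventable premature deaths = observed deaths among persons aged <80
# in excess of the number expected under benchmark-state rates.
population = 250_000           # county population aged <80 years (invented)
observed_deaths = 95           # observed deaths from one cause (invented)
benchmark_rate = 28.0          # deaths per 100,000 in the three lowest-rate states (invented)

expected_deaths = benchmark_rate * population / 100_000   # 70.0
preventable = max(observed_deaths - expected_deaths, 0)   # 25.0
print(f"Preventable premature deaths: {preventable:.0f} "
      f"({preventable / observed_deaths:.0%} of observed)")
```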


Subjects
Cause of Death, Premature Mortality, Rural Population, Urban Population, Humans, United States/epidemiology, Aged, Middle Aged, Adult, Adolescent, Urban Population/statistics & numerical data, Rural Population/statistics & numerical data, Young Adult, Infant, Preschool Child, Child, Female, Male, Aged 80 and over, Newborn Infant, Neoplasms/mortality
8.
ACS Appl Mater Interfaces ; 16(17): 22326-22333, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38635965

ABSTRACT

Low-temperature, large-area growth of two-dimensional (2D) transition-metal dichalcogenides (TMDs) is critical for their integration with silicon chips. In particular, if the growth temperature can be lowered below back-end-of-line (BEOL) processing temperatures, Si transistors can interface with 2D devices (in the back end) to enable high-density heterogeneous circuits. Such configurations are particularly useful for neuromorphic computing applications, where a dense network of neurons interacts to compute the output. In this work, we present low-temperature (400 °C) synthesis of 2D tungsten diselenide (WSe2) via selenization of a W film under ultrahigh vacuum (UHV) conditions. This simple yet effective process yields large-area, homogeneous 2D TMD films, as confirmed by several characterization techniques, including reflection high-energy electron diffraction, atomic force microscopy, transmission electron microscopy, and various spectroscopy methods. Memristors fabricated from the grown WSe2 film are leveraged to realize a novel compact neuron circuit that can be reconfigured to enable homeostasis.

9.
J Spec Oper Med ; 2024 Mar 13.
Article in English | MEDLINE | ID: mdl-38446068

ABSTRACT

BACKGROUND: Fast and reliable blood collection is critical to emergency walking blood banks (WBB) because mortality declines significantly when blood is quickly administered to a warfighter with hemorrhagic shock. Phlebotomy for WBB is accomplished via either the "straight stick" (SS) or "ruggedized lock" (RL) method. SS comprises a 16-gauge phlebotomy needle connected to a blood collection bag via tubing. The RL device collects blood through the same apparatus but adds a capped intravenous (IV) catheter between the needle and the donor's arm. This is the first study to compare these two methods on battlefield-relevant metrics. METHODS: Military first responders and licensed medical providers (N=86) were trained in SS and RL as part of fresh whole blood training exercises. Outcomes included venipuncture success rates, time to IV access, blood collection times, total time, and user preferences, using a within-subjects crossover design. Data were analyzed using ANOVA and nonparametric statistics at p<0.05. RESULTS: SS outperformed RL in first venipuncture success rates (76% vs. 64%, p=0.07), IV access times (448 [standard error of the mean; SE 23] vs. 558 [SE 31] s, p<0.01), and blood collection bag fill times (573 [SE 48] vs. 703 [SE 44] s, p<0.05), making SS approximately 3.5 minutes faster overall. Survey data were mixed: users perceived SS as simpler and faster but RL as more reliable and secure. CONCLUSION: SS is optimal when timely collection is imperative, while RL may be preferable when device stability or replacing the collection bag is a consideration.

10.
Nat Commun ; 15(1): 2334, 2024 Mar 14.
Article in English | MEDLINE | ID: mdl-38485722

ABSTRACT

The ability to scale two-dimensional (2D) material thickness down to a single monolayer presents a promising opportunity to realize high-speed, energy-efficient memristors. Here, we report an ultra-fast memristor fabricated using atomically thin sheets of 2D hexagonal boron nitride, exhibiting the shortest observed switching speed (120 ps) among 2D memristors and low switching energy (2 pJ). Furthermore, we study the switching dynamics of these memristors using ultra-short (120 ps to 3 ns) voltage pulses, a frequency range that is highly relevant in the context of modern complementary metal oxide semiconductor (CMOS) circuits. We employ statistical analysis of transient characteristics to gain insights into the memristor switching mechanism. Cycling endurance data confirm the ultra-fast switching capability of these memristors, making them attractive for next-generation computing, storage, and radio-frequency (RF) circuit applications.

11.
J Spec Oper Med ; 2024 Mar 13.
Article in English | MEDLINE | ID: mdl-38408046

ABSTRACT

BACKGROUND: Blood is a highly valuable medical resource that necessitates strict guidelines to ensure the safety and well-being of the recipient. Since the onset of the war in Ukraine, there has been increased demand for training in emergency fresh whole blood transfusion (EFWBT) to improve damage control resuscitation capabilities. To meet this demand, we developed, implemented, and evaluated a training program aimed at enhancing Ukrainian EFWBT proficiency. METHODS: Eight Ukrainian healthcare professionals (UHPs), including six physicians and two medics, completed our training, derived from the Joint Trauma System Clinical Practice Guidelines, Tactical Combat Casualty Care (TCCC) Guidelines, 75th Ranger Regiment Ranger O-Low Titer (ROLO) program, and Marine Corps Valkyrie program. Participants were assessed on their confidence in the practical application and administrative oversight requirements of an EFWBT program. A cross-comparison with a larger data set of third-year medical students from the Uniformed Services University was conducted to gauge the magnitude of the training effect. RESULTS: The difference in mean scores of UHPs between pre- and post-training was statistically significant (p<0.001). Additionally, the average rate of improvement was greater for the UHPs than for the third-year medical students (p=0.000065). CONCLUSION: Our study revealed that an EFWBT training program for UHPs can significantly increase their confidence in conducting EFWBTs on the battlefield. Further larger-scale research is needed to determine the impact of this training on performance outcomes.

12.
Adv Mater ; 36(23): e2308711, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38381601

ABSTRACT

Batteries utilizing a sodium (Na) metal anode with a liquid electrolyte are promising for affordable large-scale energy storage. However, a deep understanding of the intrinsic degradation mechanisms is limited by challenges in accessing the buried interfaces. Here, cryogenic electron microscopy of intact electrode:separator:electrode stacks is performed, revealing that degradation and failure of symmetric Na||Na coin cells occur through infiltration of Na metal through the pores of the separator rather than through mechanical puncturing by dendrites. It is shown that the interior structure of the cell (electrode:separator:electrode) must be preserved, and that deconstructing the cell into separate layers for characterization introduces artifacts. In intact cell stacks, minimal liquid is found between the electrodes and separator, leading to intimate electrode:separator interfaces. After electrochemical cycling, Na infiltrates the pore free-volume, growing through the separator to create electrical shorts and degradation. This infiltration occurs at interfacial regions devoid of solid-electrolyte interphase (SEI), revealing that the SEI plays an important role in preventing Na from growing into the separator by acting as a physical barrier that the plated Na cannot penetrate. These results shed new light on the fundamental failure mechanisms of Na batteries and demonstrate the importance of preserving the cell structure and buried interfaces.

13.
Article in English | MEDLINE | ID: mdl-38873403

ABSTRACT

Introduction: From 1999 to 2020, the suicide rate in Virginia increased from 13.1 to 15.9 per 100,000 persons aged 10 years and older. Few studies have examined spatial patterns of suicide at geographic scales smaller than the county level. Methods: We analyzed data on suicide decedents aged ≥10 years from 2010 through 2015 in the Virginia Violent Death Reporting System. We identified spatial clusters of high suicide rates using spatially adaptive filtering, defining clusters as areas with a standardized mortality ratio (SMR) significantly higher than the state SMR (p < 0.001). We compared demographic characteristics, method of injury, and suicide circumstances of decedents within each cluster to those of decedents outside any cluster. Results: We identified 13 high-risk suicide clusters (SMR between 1.7 and 2.0). Suicide decedents in the clusters were more likely to be older (40+ years), non-Hispanic white, and widowed, divorced, or separated, and less likely to have certain precipitating suicide circumstances than decedents outside the clusters. Suicide by firearm was more common in four clusters, and suicide by poisoning more common in two clusters, compared with the rest of the state. Conclusions: There are important differences between geographic clusters of suicide in Virginia. These results suggest that place-specific risk factors for suicide may be relevant for targeted suicide prevention.
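A minimal sketch of the SMR that defines the clusters above (observed over expected deaths; the figures are invented, chosen only to land in the reported 1.7-2.0 range):

```python
# SMR = observed deaths / expected deaths, where expected deaths apply
# statewide rates to the local population. All numbers are invented.
observed_deaths = 34
expected_deaths = 18.5   # e.g., sum over strata of local population x state rate
smr = observed_deaths / expected_deaths
print(f"SMR = {smr:.2f}")  # 1.84, within the 1.7-2.0 range reported above
```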
