Results 1 - 20 of 512
1.
Prehosp Emerg Care ; : 1-10, 2024 Oct 03.
Article in English | MEDLINE | ID: mdl-39361267

ABSTRACT

OBJECTIVES: The delta shock index (ΔSI), defined as the change in shock index (SI) over time, is associated with hospital morbidity and mortality, but prehospital studies of ΔSI are limited. We investigated the association of prehospital ΔSI with mortality and resource utilization, hypothesizing that increases in SI among field trauma patients are associated with increased mortality and blood product transfusion. METHODS: We performed a multicenter, retrospective, observational study from the Linking Investigators in Trauma and Emergency Services (LITES) network. We obtained data from January 2017 to June 2021. We fit logistic regression models to evaluate the associations between a ΔSI > 0.1 and both 28-day mortality and blood product transfusion within 4 hours of emergency department (ED) arrival. We used negative binomial models to evaluate the association between ΔSI > 0.1 and days in hospital, in the intensive care unit (ICU), and on ventilator (up to 28 days). RESULTS: We identified 33,219 prehospital patients. We excluded burn patients and those without documented prehospital or ED heart rate or blood pressure, leaving 30,511 cases for analysis. In adjusted analysis for the primary outcome of 28-day mortality, patients with a ΔSI > 0.1 based on initial vital signs were 31% more likely to die (adjusted odds ratio (AOR) 1.31, 95% CI 1.21-1.41) than patients with a ΔSI ≤ 0.1. These patients also spent 16% more days in hospital (adjusted incidence rate ratio (AIRR) 1.16, 95% CI 1.14-1.19), 34% more days in the ICU (AIRR 1.34, 95% CI 1.28-1.41), and 61% more days on ventilator (AIRR 1.61, 95% CI 1.47-1.75). Additionally, patients with a ΔSI > 0.1 had higher odds of receiving blood products (AOR 2.00, 95% CI 1.88-2.12) within 4 hours of ED arrival. Models fit excluding hypotensive patients performed similarly. CONCLUSIONS: An increase of greater than 0.1 in the ΔSI was associated with increased 28-day mortality; increased days in hospital, in the ICU, and on ventilator; and increased need for blood product transfusion within 4 hours of ED arrival. The association held for initially normotensive patients. Validation and implementation studies are needed to incorporate ΔSI into prehospital and ED triage.
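
For readers unfamiliar with the metric: the shock index is heart rate divided by systolic blood pressure, and ΔSI is the change between the prehospital and ED values. The sketch below is illustrative only — it is not the LITES study code, and all column names, covariates, and data are simulated assumptions — showing how a ΔSI > 0.1 flag and a simple adjusted logistic model for 28-day mortality might be set up.

```python
# Illustrative sketch only -- not the study code; all names and data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "hr_prehospital": rng.normal(100, 20, n),
    "sbp_prehospital": np.clip(rng.normal(120, 20, n), 50, None),
    "hr_ed": rng.normal(100, 20, n),
    "sbp_ed": np.clip(rng.normal(115, 25, n), 50, None),
    "age": rng.integers(18, 90, n),
})

# Shock index = heart rate / systolic blood pressure; delta SI = ED SI - prehospital SI.
df["si_prehospital"] = df["hr_prehospital"] / df["sbp_prehospital"]
df["si_ed"] = df["hr_ed"] / df["sbp_ed"]
df["delta_si_gt_0_1"] = ((df["si_ed"] - df["si_prehospital"]) > 0.1).astype(int)

# Simulated 28-day mortality with higher risk when delta SI rises (for demonstration only).
logit_p = -3.0 + 0.3 * df["delta_si_gt_0_1"] + 0.02 * (df["age"] - 50)
df["died_28d"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Adjusted logistic regression; exponentiated coefficients approximate adjusted odds ratios.
fit = smf.logit("died_28d ~ delta_si_gt_0_1 + age", data=df).fit(disp=False)
print(np.exp(fit.params))
```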

2.
Cureus ; 16(8): e68310, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39350859

ABSTRACT

Introduction Acute cholecystitis is a common complication of gallstone disease. Likewise, gallbladder necrosis is a complication of cholecystitis associated with higher risks of morbidity and mortality. Identifying risk factors that portend gallbladder necrosis is key to prioritizing the management of higher-risk patients. This study aimed to identify factors that predict the development of gallbladder necrosis. Method A retrospective review of all patients undergoing emergency cholecystectomy in a tertiary hospital over a two-year period was performed. Gallbladder necrosis was diagnosed on histopathological examination of operative specimens. Multivariable logistic regression was performed to determine risk factors for gallbladder necrosis. Results A total of 163 patients underwent acute cholecystectomy, and 43 (26%) had proven gallbladder necrosis. Multivariable analysis demonstrated that elevated white cell count (WCC) (OR 1.122, 95% CI 1.031-1.221, p=0.007), elevated C-reactive protein (CRP) (OR 1.004, 95% CI 1.001-1.008, p=0.022), and positive smoking status (OR 5.724, 95% CI 1.323-24.754, p=0.020) were independently predictive of gallbladder necrosis. Notably, advancing age, elevated BMI, diabetes mellitus, and American Society of Anesthesiologists (ASA) grade were not associated with developing necrosis. Conclusion Patients at risk of gallbladder necrosis include those with higher WCC or CRP and active smokers. Given the increased potential for complications, these risk factors should be identified early in the management of patients admitted with gallstone disease to ensure they receive aggressive medical therapy alongside timely and guided surgical intervention.

4.
Curr Med Res Opin ; : 1-14, 2024 Sep 28.
Article in English | MEDLINE | ID: mdl-39340768

ABSTRACT

OBJECTIVE: To assess tablet utilization patterns and describe pre-treatment characteristics among new users of rimegepant. BACKGROUND: Rimegepant is the only oral calcitonin gene-related peptide antagonist approved in the United States for both the acute and preventive treatment of migraine. METHODS: We conducted a retrospective cohort study of people with migraine who initiated treatment with rimegepant using two US commercial claims databases (MarketScan® and Optum®). Patients (≥18 years old) with migraine who newly initiated rimegepant were included. Patients were stratified into two groups representing acute (quantity = 8) and prevention (quantity = 15 or 16) use cohorts. Baseline characteristics and medication use history were assessed on the index date and during the 365-day pre-index period. Rimegepant utilization periods were calculated based on days supplied and varying approaches to defining use periods. Tablet quantity per 30 days was reported separately for the acute and prevention cohorts. RESULTS: In MarketScan, a total of 14,037 rimegepant users were identified: 11,195 (79.8%) in the acute group and 1,880 (13.4%) in the prevention group. Rimegepant utilization was 4.9 ± 2.1 tablets per 30 days for acute use and 13.1 ± 7.7 tablets per 30 days for preventive use. There was a high baseline prevalence of triptan contraindications, warnings, and high cardiovascular risk, with a combined 46.2% of patients meeting one or more of these criteria. Acute medication overuse was also common (25.1%) prior to rimegepant initiation. Results were consistent in the Optum database. CONCLUSION: Our analysis provides the first real-world data on tablet utilization and characteristics of new users of rimegepant.


Plain language summary: There is little information available on the characteristics of people with migraine who start to use rimegepant, which is the only medicine approved for both the prevention of migraine attacks and the acute treatment of migraine attacks after they have started. Information on new users of rimegepant at least 18 years of age was obtained from two commercial databases of US healthcare claims (MarketScan® and Optum®). The researchers used this information to evaluate people's age, sex, pre-existing illnesses, and prior use of migraine medications at the time they started using rimegepant, and they also used several different methods to estimate how often people used rimegepant after treatment was started. The MarketScan database contained information on 14,037 people with migraine who started using rimegepant; this group had an average age of 43 years and was composed mostly of females (88%). Before starting rimegepant, almost half (46%) of the people were considered to have high cardiovascular risk and 25% were considered at risk of overusing acute migraine medications. Most of the 14,037 people (80%) used rimegepant to treat migraine attacks after they had started, and this group used approximately 5 tablets every month. The smaller number of people who used rimegepant to prevent migraine attacks used approximately 13 tablets every month. The information obtained from the Optum database was similar to that obtained from the MarketScan database. The researchers' analysis is the first to describe the characteristics of people with migraine who start to use rimegepant outside the setting of a controlled clinical trial. Their results show that new users of rimegepant represent a complex population with a significant profile of pre-existing illness and a diverse treatment history.

5.
Front Robot AI ; 11: 1404543, 2024.
Article in English | MEDLINE | ID: mdl-39228689

ABSTRACT

Physical interaction with patients, for example as part of a diagnostic examination or surgical procedure, provides clinicians with a wealth of information about their condition. Simulating this interaction is of great interest to researchers in both haptics and medical education, and the development of softness-changing tactile interfaces is important for recreating the feel of different soft tissues. This paper presents designs for a variety of novel electromechanical and electromagnetic mechanisms for controlling particle jamming-based, hardness-changing tactile displays, intended to allow medical trainees to experience these physical interactions in a range of simulation settings such as clinical skills teaching laboratories. Each design is then subjected to a battery of mechanical tests to evaluate its effectiveness compared with the state of the art, as well as its suitability for simulating the physical hardness of different types of soft tissues, previously characterised in established literature. These results demonstrate that all of the technologies presented are able to exhibit a measurable hardness change, with Shore hardness values between 3A and 57A achieved by the most effective constriction-based device. The electromechanical devices based on constriction and compression, and the state-of-the-art pneumatic device, were able to achieve hardness changes within a range that is useful for replicating the softness of organic tissue. The electromechanical and electromagnetic devices were also found to effect their full range of hardness change in less than a second, compared with several seconds for the state of the art. These results show that the performance of softness-changing tactile displays can be improved with the electromechanical actuation techniques proposed in this paper, and that such displays are able to replicate the physical characteristics of soft tissues and may therefore be of benefit in medical training and simulation scenarios.

6.
Schizophr Bull ; 2024 Sep 06.
Article in English | MEDLINE | ID: mdl-39241701

ABSTRACT

BACKGROUND AND HYPOTHESIS: Schizophrenia is associated with a decreased pursuit of risky rewards during uncertain-risk decision-making. However, putative mechanisms subserving this disadvantageous risky reward pursuit, such as contributions of cognition and relevant traits, remain poorly understood. STUDY DESIGN: Participants (30 schizophrenia/schizoaffective disorder [SZ]; 30 comparison participants [CP]) completed the Balloon Analogue Risk Task (BART). Computational modeling captured subprocesses of uncertain-risk decision-making: Risk Propensity, Prior Belief of Success, Learning Rate, and Behavioral Consistency. IQ, self-reported risk-specific processes (ie, Perceived Risks and Expected Benefit of Risks), and non-risk-specific traits (ie, defeatist beliefs; hedonic tone) were examined for relationships with Risk Propensity to determine what contributed to differences in risky reward pursuit. STUDY RESULTS: On the BART, the SZ group exhibited lower Risk Propensity, higher Prior Beliefs of Success, and comparable Learning Rates. Furthermore, Risk Propensity was positively associated with IQ across groups. Linear models predicting Risk Propensity revealed 2 interactions: 1 between group and Perceived Risk, and 1 between IQ and Perceived Risk. Specifically, in both the SZ group and individuals with below-median IQ, lower Perceived Risk was related to lower Risk Propensity. Thus, lower perception of financial risks was associated with a less advantageous pursuit of uncertain-risk rewards. CONCLUSIONS: Findings suggest consistently decreased risk-taking on the BART in SZ may reflect risk imperception, the failure to accurately perceive and leverage relevant information to guide the advantageous pursuit of risky rewards. Additionally, our results highlight the importance of cognition in uncertain-risk decision-making.
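
The four parameters named above resemble a commonly used Bayesian/softmax account of BART behavior (in the spirit of Wallsten and colleagues). The sketch below illustrates one such parameterization as a simulator; it is not the authors' model code, and the parameter values, burst probability, and update rule are assumptions made only for demonstration.

```python
# Illustrative sketch of one common BART computational model -- NOT the authors' code.
import numpy as np

def simulate_bart(n_balloons=30, burst_prob=1 / 64,
                  prior_success=0.95, learning_rate=0.1,
                  risk_propensity=1.2, consistency=1.0, seed=0):
    """Simulate pump counts on the Balloon Analogue Risk Task.

    prior_success   -- prior belief that a single pump will NOT burst the balloon
    learning_rate   -- weight given to each new outcome when updating that belief
    risk_propensity -- scales how many pumps the agent is willing to target
    consistency     -- softmax slope; higher = more deterministic behavior
    """
    rng = np.random.default_rng(seed)
    belief = prior_success
    pumps_per_balloon = []
    for _ in range(n_balloons):
        # Target pumps grows with risk propensity and with optimism about success.
        target = -risk_propensity / np.log(belief)
        pumps, burst = 0, False
        while not burst:
            # Probability of pumping again declines as pumps approach the target.
            p_pump = 1.0 / (1.0 + np.exp(consistency * (pumps + 1 - target)))
            if rng.random() > p_pump:
                break  # cash out
            pumps += 1
            burst = rng.random() < burst_prob
            # Update the success belief from the observed outcome, kept inside (0, 1).
            belief += learning_rate * ((0.0 if burst else 1.0) - belief)
            belief = float(np.clip(belief, 1e-6, 1 - 1e-6))
        pumps_per_balloon.append(pumps)
    return pumps_per_balloon

print(np.mean(simulate_bart()))  # mean pumps per balloon under these toy parameters
```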

7.
Article in English | MEDLINE | ID: mdl-39269259

ABSTRACT

BACKGROUND: Whole blood (WB) resuscitation is increasingly common in adult trauma centers and some pediatric trauma centers, as studies have noted its safety and potential superiority to component therapy (CT). Previous analyses have evaluated WB as a binary variable (any versus none), and little is known regarding the "dose response" of WB in relation to total transfusion volume (TTV) (WB/TTV ratio). METHODS: Injured children younger than 18 years who received any blood transfusion within 4 hours of hospital arrival across 456 US trauma centers were included from the American College of Surgeons Trauma Quality Improvement Program database. The primary outcome was 24-hour mortality, and the secondary outcome was 4-hour mortality. Multivariable analysis was used to evaluate the associations of WB administration with mortality and of WB/TTV ratio with mortality. RESULTS: Of 4,323 pediatric patients included in the final analysis, 88% (3,786) received CT only, and 12% (537) received WB with or without CT. Compared with the CT group, WB recipients were more likely to be in shock according to pediatric age-adjusted shock index (71% vs. 60%) and had a higher median (interquartile range) Injury Severity Score (26 [17-35] vs. 25 [16-24], p = 0.007). Any WB transfusion was associated with 42% decreased odds of mortality at 4 hours (adjusted odds ratio [aOR], 0.58 [95% confidence interval, 0.35-0.97]; p = 0.038) and 54% decreased odds of mortality at 24 hours (aOR, 0.46 [0.33-0.66]; p < 0.001). Each 10% increase in WB/TTV ratio was associated with a 9% decrease in 24-hour mortality (aOR, 0.91 [0.85-0.97]; p = 0.006). Subgroup analyses for age younger than 14 years and receipt of massive transfusion (>40 mL/kg) also showed a statistically significant survival benefit for 24-hour mortality. CONCLUSION: In this retrospective American College of Surgeons Trauma Quality Improvement Program analysis, use of WB was independently associated with reduced 24-hour mortality in children; further, higher proportions of WB used over the total resuscitation (WB/TTV ratio) were associated with a stepwise increase in survival. LEVEL OF EVIDENCE: Prognostic and Epidemiological; Level III.
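
To make the "dose response" framing concrete: the WB/TTV ratio is whole-blood volume divided by total transfusion volume, and a per-10% adjusted odds ratio compounds multiplicatively across 10-point increments. The snippet below is a toy illustration with assumed volumes, not the study analysis.

```python
# Toy illustration of the WB/TTV ratio and per-10% odds-ratio scaling; values are assumed.
wb_volume_ml = 600.0           # hypothetical whole blood transfused
total_transfusion_ml = 1500.0  # hypothetical total transfusion volume (WB + components)

wb_ttv_ratio = wb_volume_ml / total_transfusion_ml  # 0.40, i.e. 40%
aor_per_10pct = 0.91                                 # reported adjusted OR per +10% ratio

# Odds ratios compound multiplicatively: 0% -> 40% WB/TTV spans four 10% steps.
steps = wb_ttv_ratio / 0.10
combined_aor = aor_per_10pct ** steps
print(f"WB/TTV = {wb_ttv_ratio:.0%}; implied aOR vs 0% WB = {combined_aor:.2f}")
```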

8.
Chem Commun (Camb) ; 2024 Sep 04.
Article in English | MEDLINE | ID: mdl-39228333

ABSTRACT

Cyclic nucleoside phosphates have previously been shown to be adequately activated to oligomerise under dry conditions. Herein, it is demonstrated that 3',5'-cyclic adenosine monophosphate (3',5'-cAMP) and glycine, when subjected to dehydration under alkaline conditions, form phosphoramidate-linked conjugates. Solid-state reaction mechanisms investigated by DFT suggest why the reaction does not occur efficiently in the aqueous phase.

9.
Ann Surg ; 2024 Sep 18.
Article in English | MEDLINE | ID: mdl-39291384

ABSTRACT

OBJECTIVE: To determine the proportion and characteristics of injured rural residents treated at urban trauma centers (TCs), urban non-trauma centers (NTCs), rural TCs, and rural NTCs. SUMMARY BACKGROUND DATA: Timely treatment at a designated trauma center improves outcomes for patients with serious injuries, but rural residents have limited access to designated trauma centers. Rural non-trauma centers may constitute an underrecognized source of trauma care. METHODS: We used the National Emergency Department Sample to conduct a retrospective, pooled cross-sectional study of ED visits among rural residents with injury severity score (ISS) ≥ 9 (indicating at least moderate injury). Hospitals were designated as a trauma (TC) or non-trauma center (NTC) and as rural or urban. We compared management, disposition, and outcomes among hospital types. RESULTS: Of 748,587 injured rural residents from 2016-2020, 384,113 (51.3%) were treated in rural NTCs, 232,845 (31.1%) in urban TCs, 116,493 (15.6%) in urban NTCs, and 15,137 (2.0%) in rural TCs. Injuries treated at rural NTCs were moderate in severity (ISS 9-15) in 76.6% of visits, severe (ISS 16-25) in 15.7%, and very severe (ISS > 25) in 1.1%. Urban TCs saw the highest proportion of very severe injuries (17.3%). Rural NTCs managed 77.5% of visits definitively, discharging 72.8%. They transferred 21.9% of patients. Length of stay was longest and hospital charges highest for patients treated in urban TCs, which also performed the most procedures. Rural NTCs had the shortest length of stay and lowest mean charges. CONCLUSIONS: Rural non-trauma centers provided initial care for more than half of injured rural residents, including 2 in 5 of those with the most severe injuries, and managed more than 3 in 4 definitively. These hospitals may be an under-recognized component of the US trauma system.

10.
Pharmacoepidemiol Drug Saf ; 33(8): e5861, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39090796

ABSTRACT

PURPOSE: Concomitant use of hormonal contraceptive agents (HCAs) and enzyme-inducing antiepileptic drugs (EIAEDs) may lead to contraceptive failure and unintended pregnancy. This review identified and evaluated the concordance and quality of clinical treatment guidelines related to the use of HCAs in women with epilepsy (WWE) receiving EIAEDs. METHODS: Relevant clinical guidelines were identified across four databases and were independently evaluated for quality using the AGREE II instrument. Quality in this context is defined as the rigor and transparency of the methodologies used to develop the guideline. Guidelines were further assessed in terms of concordance and discordance with the latest body of knowledge concerning the use of hormonal contraception in the presence of EIAEDs. RESULTS: A total of 5 guidelines were retrieved and evaluated. Overall guideline scores ranged from 17% to 92%, while individual domain scores ranged from 0% to 100%. Contraceptive guidelines consistently recommended the use of intrauterine systems and long-acting injectables in the presence of EIAEDs, recommended against the use of oral, transdermal, and vaginal ring contraceptives, and differed regarding recommendations related to implants. Guidelines agreed that women treated with EIAEDs should receive intrauterine systems and long-acting injectables; however, the suggested frequency of administration of injectable contraceptives differed. The use of intrauterine systems in this population is supported by evidence, but there is uncertainty surrounding the use of long-acting injectables and contraceptive implants. CONCLUSIONS: To mitigate the risk of unintended pregnancy and its consequences, recommendations related to implants and long-acting injectable contraceptives should be evidence-based.
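
For context on the 0%-100% domain scores: AGREE II domain scores are conventionally scaled as (obtained score − minimum possible) / (maximum possible − minimum possible) × 100%, with each item rated 1-7 by each appraiser. The sketch below shows that calculation with made-up ratings; it is not taken from the review itself.

```python
# Illustrative AGREE II scaled domain score calculation with hypothetical appraiser ratings.
def agree_ii_domain_score(ratings):
    """ratings: list of per-appraiser lists of 1-7 item scores for one domain."""
    n_appraisers = len(ratings)
    n_items = len(ratings[0])
    obtained = sum(sum(r) for r in ratings)
    min_possible = 1 * n_items * n_appraisers
    max_possible = 7 * n_items * n_appraisers
    return 100.0 * (obtained - min_possible) / (max_possible - min_possible)

# Two hypothetical appraisers rating a 3-item domain (real AGREE II domains have 2-8 items;
# 3 is used here only to keep the example short).
print(agree_ii_domain_score([[5, 6, 4], [6, 7, 5]]))  # 75.0
```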


Subjects
Anticonvulsants, Hormonal Contraceptives, Drug Interactions, Epilepsy, Practice Guidelines as Topic, Humans, Epilepsy/drug therapy, Female, Anticonvulsants/administration & dosage, Anticonvulsants/adverse effects, Hormonal Contraceptives/administration & dosage, Hormonal Contraceptives/adverse effects, Pregnancy, Unplanned Pregnancy
11.
Article in English | MEDLINE | ID: mdl-39093636

ABSTRACT

BACKGROUND: Trauma systems save lives by coordinating timely and effective responses to injury. However, trauma system effectiveness varies geographically, with worse outcomes observed in rural settings. Prior data suggest that undertriage may play a role in this disparity. Our aim was to explore the factors driving clinicians' transport decisions for undertriaged trauma patients. METHODS: We performed a retrospective analysis of the National Emergency Medical Services Information System database among patients who met physiologic or anatomic national field triage guideline criteria for transport to the highest level of trauma center. Undertriage was defined as transport to a non-level I/II trauma center. Multivariable logistic regression was used to determine demographic, injury, and system characteristics associated with undertriage. Undertriaged patients were then categorized into "recognized" and "unrecognized" groups using the documented reason for transport destination to identify underlying factors associated with undertriage. RESULTS: A total of 36,094 patients were analyzed. Patients in urban areas were more likely to be transported to a destination based on protocol rather than the closest available facility. As expected, patients injured in urban regions were less likely to be undertriaged than their suburban (adjusted odds ratio [aOR], 2.69; 95% confidence interval [95% CI], 2.21-3.31), rural (aOR, 2.71; 95% CI, 2.28-3.21), and wilderness counterparts (aOR, 3.99; 95% CI, 2.93-5.45). The strongest predictor of undertriage was patient/family choice (aOR, 6.29; 95% CI, 5.28-7.50), followed by closest facility (aOR, 5.49; 95% CI, 4.91-6.13) as the reason for hospital selection. Nonurban settings had over twice the odds of recognizing the presence of triage criteria among undertriaged patients (p < 0.05). CONCLUSION: Patients injured in nonurban settings and those with less apparent causes of severe injury are more likely to experience undertriage. By analyzing how prehospital clinicians choose transport destinations, we identified patient and system factors associated with undertriage. Targeting these at-risk demographics and contributing factors may help alleviate regional disparities in undertriage. LEVEL OF EVIDENCE: Diagnostic; Level IV.

12.
bioRxiv ; 2024 Jul 26.
Article in English | MEDLINE | ID: mdl-39211172

ABSTRACT

Drugs that modulate N-methyl-D-aspartate (NMDA) or γ-aminobutyric acid type A (GABAA) receptors can shed light on their role in the synaptic plasticity mechanisms underlying the effects of non-invasive brain stimulation. However, research on the combined effects of these drugs and exogenous stimulation on motor learning is limited. This study aimed to investigate the effects of pharmacological interventions combined with intermittent theta-burst stimulation (iTBS) on human motor learning. Nine right-handed healthy subjects (mean age ± SD: 31.56 ± 12.96 years; 6 females) participated in this double-blind crossover study. All participants were assigned to four drug conditions in randomized order: (1) D-cycloserine (partial NMDA receptor agonist), (2) D-cycloserine + dextromethorphan (NMDA receptor agonist + antagonist), (3) lorazepam (GABAA receptor agonist), and (4) placebo (identical microcrystalline cellulose capsule). After drug intake, participants practiced the 12-item keyboard sequential task as a baseline measure. Two hours after drug intake, iTBS was administered over the primary motor cortex. Following iTBS, the retention test was performed in the same manner as the baseline measure. Our findings revealed that lorazepam combined with iTBS impaired motor learning on the retention test. Future studies are needed for a better understanding of the mechanisms through which TMS may influence human motor learning.

13.
Proc Natl Acad Sci U S A ; 121(33): e2407400121, 2024 Aug 13.
Article in English | MEDLINE | ID: mdl-39110735

ABSTRACT

HIV-1 transcript function is controlled in part by twinned transcriptional start site usage, where 5' capped RNAs beginning with a single guanosine (1G) are preferentially packaged into progeny virions as genomic RNA (gRNA), whereas those beginning with three sequential guanosines (3G) are retained in cells as mRNAs. In 3G transcripts, one of the additional guanosines base pairs with a cytosine located within a conserved 5' polyA element, resulting in formation of an extended 5' polyA structure as opposed to the hairpin structure formed in 1G RNAs. To understand how this remodeling influences overall transcript function, we combined in vitro biophysical studies with in-cell genome packaging and competitive translation assays, using native and 5' polyA mutant transcripts generated from promoters that differentially produce 1G or 3G RNAs. We identified mutations that stabilize the 5' polyA hairpin structure in 3G RNAs, which promote RNA dimerization and Gag binding without sequestering the 5' cap. None of these 3G transcripts were competitively packaged, confirming that cap exposure is a dominant negative determinant of viral genome packaging. For all RNAs examined, conformations that favored 5' cap exposure were both poorly packaged and more efficiently translated than those that favored 5' cap sequestration. We propose that structural plasticity of the 5' polyA and other conserved RNA elements places the 5' leader at a thermodynamic tipping point for low-energy (~3 kcal/mol) control of global transcript structure and function.
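
To give a sense of scale for the ~3 kcal/mol figure: by the Boltzmann relation K = exp(−ΔG/RT), a 3 kcal/mol free-energy difference at 37 °C corresponds to roughly a hundred-fold shift in the relative population of two conformers. The arithmetic below is a back-of-the-envelope check, not a calculation from the paper.

```python
# Back-of-the-envelope check: population ratio implied by a ~3 kcal/mol free-energy difference.
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 310.15     # 37 degrees Celsius in kelvin
delta_G = 3.0  # kcal/mol, the approximate tipping-point energy cited in the abstract

ratio = math.exp(delta_G / (R * T))  # K = exp(-dG/RT); favored-to-disfavored population ratio
print(f"~{ratio:.0f}-fold population difference between conformers")  # ~130-fold
```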


Subjects
Viral Genome, HIV-1, Nucleic Acid Conformation, Protein Biosynthesis, Viral RNA, HIV-1/genetics, Viral RNA/genetics, Viral RNA/metabolism, Viral RNA/chemistry, Humans, Viral Genome Packaging, Mutation, Virus Assembly/genetics, RNA Caps/metabolism, RNA Caps/genetics, Messenger RNA/genetics, Messenger RNA/metabolism
14.
J Alzheimers Dis ; 101(1): 133-145, 2024.
Article in English | MEDLINE | ID: mdl-39121116

ABSTRACT

Background: Lewy body dementia (LBD) is the second most common neurodegenerative dementia in the US, presenting unique end-of-life challenges. Objective: This study examined healthcare utilization and care continuity in the last year of life in LBD. Methods: Medicare claims for enrollees with LBD who were continuously enrolled in the year preceding death were examined for 2011-2018. We assessed hospital stays, emergency department (ED) visits, intensive care unit (ICU) admissions, life-extending procedures, medications, and care continuity. Results: We identified 45,762 LBD decedents, predominantly female (51.8%) and White (85.9%), with an average age of 84.1 years (SD 7.5). There was a median of 2 ED visits (IQR 1-5) and 1 inpatient stay (IQR 0-2). Older age was associated with lower odds of ICU stays (odds ratio [OR] 0.96; 95% confidence interval [CI] 0.96-0.97) and life-extending procedures (OR 0.96; 95% CI 0.95-0.96). Black and Hispanic patients experienced higher rates of ED visits, inpatient hospitalizations, ICU admissions, life-extending procedures, and in-hospital deaths relative to White patients. On average, 15 (SD 7.5) medications were prescribed in the last year. Greater care continuity was associated with reduced hospital admissions (OR 0.72; 95% CI 0.70-0.74), reduced ED visits (OR 0.71; 95% CI 0.69-0.87), and fewer life-extending procedures (OR 0.71; 95% CI 0.64-0.79). Conclusions: This study underscored the complex healthcare needs of people with LBD during their final year, which were influenced by age and race. Care continuity may reduce hospital and ED visits and life-extending procedures.
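
The abstract does not state which continuity-of-care measure was used; the Bice-Boxerman continuity-of-care (COC) index is one commonly used claims-based option and is sketched here purely for illustration, with hypothetical visit data.

```python
# Illustrative Bice-Boxerman COC index -- one common claims-based continuity measure; the
# study's actual measure is not specified in the abstract. Visit data below are hypothetical.
from collections import Counter

def bice_boxerman_coc(provider_ids):
    """COC = (sum(n_j^2) - N) / (N * (N - 1)), n_j = visits to provider j, N = total visits."""
    n_total = len(provider_ids)
    if n_total < 2:
        return None  # undefined with fewer than two visits
    counts = Counter(provider_ids)
    return (sum(c * c for c in counts.values()) - n_total) / (n_total * (n_total - 1))

# A year of visits concentrated with one clinician yields an index closer to 1 (high continuity).
print(bice_boxerman_coc(["dr_a", "dr_a", "dr_a", "dr_b", "dr_a"]))  # 0.6
```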


Subjects
Lewy Body Disease, Medicare, Patient Acceptance of Health Care, Terminal Care, Humans, Lewy Body Disease/therapy, Lewy Body Disease/epidemiology, Female, Male, Terminal Care/statistics & numerical data, Aged 80 and over, Aged, United States/epidemiology, Medicare/statistics & numerical data, Patient Acceptance of Health Care/statistics & numerical data, Hospitalization/statistics & numerical data, Intensive Care Units/statistics & numerical data, Hospital Emergency Service/statistics & numerical data, Continuity of Patient Care/statistics & numerical data
17.
Pharmacoepidemiol Drug Saf ; 33(8): e5852, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39099262

ABSTRACT

PURPOSE: To estimate incidence rates of suicidal ideation and behavior following treatment initiation with gabapentinoids or dopamine agonists (DAs) in patients with newly diagnosed early-onset idiopathic restless legs syndrome (RLS), and to examine suicidal behavior risk comparing those receiving gabapentinoids and DAs. METHODS: A new-user retrospective cohort study using MarketScan claims data from 2012 to 2019 was conducted. Exposures were monotherapy gabapentinoids or DAs initiated within 60 days of new RLS diagnosis. Three varying outcome measures of suicidality were examined and incidence rates were calculated for each. A log-binomial regression model estimated the relative risk (RR) of the outcomes with gabapentinoids. Propensity score weighting adjusted for baseline covariates, including age, substance use disorders, hyperlipidemia, antipsychotic use, hypnotic/sedative use, and mood stabilizer use, which were most imbalanced before weighting. RESULTS: The cohort included 6672 patients, with 4986 (74.7%) initiating a DA and 1686 (25.3%) initiating a gabapentinoid. Incidence rates for all outcome measures were higher in the gabapentinoid group (suicidality: 21.6 vs. 10.7 per 1000 person-years; suicidality with self-harm: 23.0 vs. 11.1 per 1000 person-years; overdose- and suicide-related events: 30.0 vs. 15.5 per 1000 person-years). After adjustment, the association with gabapentinoids was not significant for suicidality (adjusted RR, 1.27 [95% CI, 0.86-1.88]), suicidality with self-harm (adjusted RR, 1.30 [95% CI, 0.89-1.90]), or overdose- and suicide-related events (adjusted RR, 1.30 [95% CI, 0.93-1.80]). CONCLUSIONS: Incidence rates for suicidal ideation and behavior were higher among the gabapentinoid group, although increased risk was not detected after adjustment. A possible signal cannot be ruled out given the limitations of the data and the rarity of the outcome.
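
The quoted rates are events per 1000 person-years, and the adjustment relies on propensity-score weighting. The sketch below shows both calculations on simulated data; it is not the study code, and all column names and covariates are assumptions.

```python
# Illustrative sketch (not the study code): incidence per 1,000 person-years and a basic
# inverse-probability-of-treatment weighting (IPTW) setup. All data and names are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "gabapentinoid": rng.binomial(1, 0.25, n),  # 1 = gabapentinoid, 0 = dopamine agonist
    "age": rng.integers(18, 80, n),
    "substance_use_disorder": rng.binomial(1, 0.1, n),
    "followup_years": rng.uniform(0.2, 3.0, n),
})
df["suicidality_event"] = rng.binomial(1, 0.02, n)

# Incidence rate per 1,000 person-years in each exposure group.
grp = df.groupby("gabapentinoid")
rates = 1000 * grp["suicidality_event"].sum() / grp["followup_years"].sum()
print(rates)

# Propensity score for receiving a gabapentinoid, then stabilized IPTW weights.
ps_model = smf.logit("gabapentinoid ~ age + substance_use_disorder", data=df).fit(disp=False)
ps = ps_model.predict(df)
p_treated = df["gabapentinoid"].mean()
df["iptw"] = np.where(df["gabapentinoid"] == 1, p_treated / ps, (1 - p_treated) / (1 - ps))
# The weighted cohort can then feed a weighted outcome model (e.g., log-binomial for the RR).
```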


Subjects
Gabapentin, Restless Legs Syndrome, Suicidal Ideation, Humans, Female, Male, Retrospective Studies, Restless Legs Syndrome/epidemiology, Restless Legs Syndrome/drug therapy, Adult, Middle Aged, Gabapentin/adverse effects, Incidence, Dopamine Agonists/adverse effects, Dopamine Agonists/therapeutic use, Young Adult, Cohort Studies, Aged, Adolescent, Risk Factors
18.
J Neurosurg ; : 1-10, 2024 May 10.
Article in English | MEDLINE | ID: mdl-39076152

ABSTRACT

OBJECTIVE: Traumatic brain injury (TBI) and hemorrhage are responsible for the largest proportion of all trauma-related deaths. In polytrauma patients at risk of hemorrhage and TBI, the diagnosis, prognosis, and management of TBI remain poorly characterized. The authors sought to characterize the predictive capabilities of glial fibrillary acidic protein (GFAP) and ubiquitin C-terminal hydrolase L1 (UCH-L1) measurements in patients with hemorrhagic shock with and without concomitant TBI. METHODS: The authors performed a secondary analysis of serial blood samples derived from a prospective observational cohort study that focused on comparing early whole-blood and component resuscitation. A convenience sample of patients was used, with samples collected at three time points and the presence or absence of TBI documented via CT imaging. GFAP and UCH-L1 measurements were performed on plasma samples using the i-STAT Alinity point-of-care platform. Using classification tree recursive partitioning, the authors determined measurement cut points for each biomarker that maximized predictive ability for the diagnosis of TBI, Rotterdam CT imaging scores, and 6-month Glasgow Outcome Scale-Extended (GOSE) scores. RESULTS: Biomarker comparisons demonstrated that GFAP and UCH-L1 measurements were associated with the presence of TBI at all time points. Classification tree analyses demonstrated that a GFAP level > 286 pg/ml for the sample taken upon the patient's arrival had an area under the receiver operating characteristic curve of 0.77 for predicting the presence of TBI. The classification tree results demonstrated that a cut point of 3094 pg/ml for the arrival GFAP measurement was the most predictive for an elevated Rotterdam score on the initial and second CT scans and for TBI progression between scans. No significant associations between any of the most predictive cut points for UCH-L1 and Rotterdam CT scores or TBI progression were found. The predictive capabilities of UCH-L1 were limited by the range allowed by the point-of-care platform. Arrival GFAP cut points remained strong independent predictors after controlling for all potential polytrauma confounders, including injury characteristics, shock severity, and resuscitation. CONCLUSIONS: Early measurements of GFAP and UCH-L1 on a point-of-care device are significantly associated with CT-diagnosed TBI in patients with polytrauma and shock. Early elevated GFAP measurements are associated with worse head CT Rotterdam scores, TBI progression, and worse GOSE scores, and these associations are independent of other injury attributes, shock severity, and early resuscitation characteristics.
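
Cut-point selection via recursive partitioning and evaluation by area under the ROC curve can be illustrated with standard tooling. The sketch below uses simulated biomarker values and scikit-learn; it is not the authors' analysis, and the numbers it produces carry no clinical meaning.

```python
# Illustrative only: choosing a single biomarker cut point with a depth-1 decision tree and
# scoring discrimination with ROC AUC. Data are simulated; thresholds here are meaningless.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 400
tbi = rng.binomial(1, 0.4, n)
# Simulated arrival GFAP (pg/ml): higher on average when TBI is present.
gfap = np.where(tbi == 1, rng.lognormal(7.0, 1.0, n), rng.lognormal(5.0, 1.0, n))

# A depth-1 tree ("stump") is one way to pick a single cut point, akin to recursive partitioning.
stump = DecisionTreeClassifier(max_depth=1).fit(gfap.reshape(-1, 1), tbi)
cut_point = stump.tree_.threshold[0]

auc = roc_auc_score(tbi, gfap)  # discrimination of the raw biomarker
print(f"selected cut point ~ {cut_point:.0f} pg/ml, AUC = {auc:.2f}")
```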

19.
Brain Stimul ; 17(4): 867-875, 2024.
Article in English | MEDLINE | ID: mdl-39059712

ABSTRACT

Temporal interference electrical neurostimulation (TI) is a relatively new method of non-invasive neurostimulation that may be able to stimulate deep brain regions without stimulating the overlying superficial regions. Although some recent studies have demonstrated the success of TI in modulating task-induced BOLD activity in humans, there is limited information on intended and off-target effects of TI during resting-state. We simultaneously performed TI stimulation with the set-up optimized for maximum focality in the left caudate and collected resting-state fMRI data to investigate the effects of TI on human BOLD signals. We found increased BOLD activation in a part of the mid-orbitofrontal cortex (OFC) and parahippocampal gyrus. Results indicate that TI can induce increased BOLD activation in the region that receives the highest magnitude of TI amplitude modulation in humans, with good safety and tolerability profiles. We also show the limits of spatial precision and explore the nature and causes of additional off-target effects. TI may be a promising approach for addressing questions about the causal role of deep brain structures in human cognition and may also afford new clinical treatments.
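
The principle behind TI is that two kilohertz-range fields at slightly different frequencies sum to a signal whose envelope is modulated at the difference frequency, which is the component neurons are thought to follow. The snippet below simply visualizes that beat envelope with assumed frequencies and amplitudes; it is not the stimulation protocol used in the study.

```python
# Minimal illustration of temporal interference: two high-frequency fields whose sum is
# amplitude-modulated at the difference frequency. Frequencies/amplitudes are assumed,
# not the study's stimulation parameters.
import numpy as np

fs = 100_000                    # sampling rate (Hz) for the simulation
t = np.arange(0, 0.5, 1 / fs)   # half a second of signal
f1, f2 = 2000.0, 2010.0         # two carrier frequencies; difference = 10 Hz
field = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# sin(a) + sin(b) = 2*sin((a+b)/2)*cos((a-b)/2), so the envelope of the summed field
# oscillates at |f1 - f2| -- the putative effective frequency where the fields overlap.
envelope = 2 * np.abs(np.cos(np.pi * (f1 - f2) * t))
print(f"envelope modulation frequency: {abs(f1 - f2):.0f} Hz")
```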


Subjects
Magnetic Resonance Imaging, Prefrontal Cortex, Humans, Male, Adult, Female, Prefrontal Cortex/physiology, Prefrontal Cortex/diagnostic imaging, Electric Stimulation/methods, Brain Mapping/methods, Deep Brain Stimulation/methods, Young Adult
20.
Trauma Surg Acute Care Open ; 9(1): e001479, 2024.
Article in English | MEDLINE | ID: mdl-39027653

ABSTRACT

Background: Emergency general surgery (EGS) often demands timely intervention, yet data to guide triage and timing are limited. This study explores the relationship between hospital arrival-to-operation time and mortality in EGS patients. Study design: We performed a retrospective cohort study using an EGS registry at four hospitals, enrolling adults who underwent operative intervention for a primary American Association for the Surgery of Trauma-defined EGS diagnosis between 2021 and 2023. We excluded patients undergoing surgery more than 72 hours after admission as non-urgent and defined our exposure of interest as the time from initial vital sign capture to the skin incision timestamp. We assessed the association between operative timing quintiles and in-hospital mortality using a mixed-effect hierarchical multivariable model, adjusting for patient demographics, comorbidities, organ dysfunction, and clustering at the hospital level. Results: A total of 1199 patients were included. The median time to operating room (OR) was 8.2 hours (IQR 4.9-20.5 hours). Prolonged time to OR was associated with an increased likelihood of in-hospital mortality. Patients undergoing an operation between 6.7 and 10.7 hours after first vitals had the highest odds of in-hospital mortality compared with operative times <4.2 hours (reference quintile) (adjusted OR (aOR) 68.994; 95% CI 4.608 to 1032.980, p=0.002). A similar trend was observed among patients with operative times between 24.4 and 70.9 hours (aOR 69.682; 95% CI 2.968 to 1636.038, p=0.008). Conclusion: Our findings suggest that prompt operative intervention is associated with lower in-hospital mortality among EGS patients. Further work to identify the most time-sensitive populations is warranted. These results may begin to inform benchmarking for triaging interventions in the EGS population to help reduce mortality rates. Level of evidence: IV.
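
The exposure here is operative timing divided into quintiles, modeled with hospital as a clustering level. A rough sketch of how the quintiles and an adjusted model might be set up is below; it uses simulated data and assumed variable names, and it substitutes a plain logistic model with a fixed hospital term for the mixed-effect hierarchical model described, purely to keep the example short.

```python
# Rough sketch (simulated data, assumed names): quintiles of time-to-OR and an adjusted
# logistic model. The study used a mixed-effect hierarchical model clustered on hospital;
# a fixed hospital term is used here only as a simplification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1200
df = pd.DataFrame({
    "hours_to_or": rng.gamma(2.0, 6.0, n),        # hours from first vital signs to incision
    "age": rng.integers(18, 95, n),
    "hospital": rng.choice(["A", "B", "C", "D"], n),
})
df["died_in_hospital"] = rng.binomial(1, 0.05, n)

# Quintiles of operative timing; the fastest quintile serves as the reference category.
df["or_time_quintile"] = pd.qcut(df["hours_to_or"], 5, labels=["q1", "q2", "q3", "q4", "q5"])

fit = smf.logit("died_in_hospital ~ C(or_time_quintile) + age + C(hospital)",
                data=df).fit(disp=False)
print(np.exp(fit.params))  # exponentiated coefficients ~ odds ratios vs the reference quintile
```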
