Results 1 - 20 of 560
1.
Am J Physiol Heart Circ Physiol ; 326(6): H1366-H1385, 2024 06 01.
Article in English | MEDLINE | ID: mdl-38578240

ABSTRACT

Deterioration of physiological systems, like the cardiovascular system, occurs progressively with age, impacting an individual's health and increasing susceptibility to injury and disease. Cellular senescence has an underlying role in age-related alterations and can be triggered by natural aging or prematurely by stressors such as the bacterial toxin lipopolysaccharide (LPS). The metabolism of polyunsaturated fatty acids by CYP450 enzymes produces numerous bioactive lipid mediators that can be further metabolized by soluble epoxide hydrolase (sEH) into diol metabolites, often with reduced biological effects. In our study, we observed age-related cardiac differences in female mice, where young mice demonstrated resistance to LPS injury, and genetic deletion or pharmacological inhibition of sEH using trans-4-[4-(3-adamantan-1-yl-ureido)-cyclohexyloxy]-benzoic acid attenuated LPS-induced cardiac dysfunction in aged female mice. Bulk RNA-sequencing analyses revealed transcriptomic differences in aged female hearts. The confirmatory analysis demonstrated that changes to inflammatory and senescence gene markers such as Il-6, Mcp1, Il-1β, Nlrp3, p21, p16, SA-β-gal, and Gdf15 were attenuated in the hearts of aged female mice where sEH was deleted or inhibited. Collectively, these findings highlight the role of sEH in modulating the aging process of the heart, whereby targeting sEH is cardioprotective. NEW & NOTEWORTHY Soluble epoxide hydrolase (sEH) is an essential enzyme for converting epoxy fatty acids to their less bioactive diols. Our study suggests deletion or inhibition of sEH impacts the aging process in the hearts of female mice, resulting in cardioprotection. Data indicate targeting sEH limits inflammation, preserves mitochondria, and alters cellular senescence in the aged female heart.


Subject(s)
Aging , Epoxide Hydrolases , Lipopolysaccharides , Animals , Female , Mice , Age Factors , Aging/metabolism , Cellular Senescence/drug effects , Epoxide Hydrolases/metabolism , Epoxide Hydrolases/genetics , Lipopolysaccharides/toxicity , Mice, Inbred C57BL , Mice, Knockout , Myocytes, Cardiac/drug effects , Myocytes, Cardiac/metabolism , Myocytes, Cardiac/pathology , Sex Factors
2.
Crit Care Med ; 52(2): 210-222, 2024 02 01.
Article in English | MEDLINE | ID: mdl-38088767

ABSTRACT

OBJECTIVES: To determine if a real-time monitoring system with automated clinician alerts improves 3-hour sepsis bundle adherence. DESIGN: Prospective, pragmatic clinical trial. Allocation alternated every 7 days. SETTING: Quaternary hospital from December 1, 2020 to November 30, 2021. PATIENTS: Adult emergency department or inpatients meeting objective sepsis criteria triggered an electronic medical record (EMR)-embedded best practice advisory. Enrollment occurred when clinicians acknowledged the advisory, indicating they felt sepsis was likely. INTERVENTION: Real-time automated EMR monitoring identified suspected sepsis patients with incomplete bundle measures within 1 hour of completion deadlines and generated reminder pages. Clinicians responsible for intervention group patients received reminder pages; no pages were sent for controls. The primary analysis cohort was the subset of enrolled patients at risk of bundle-nonadherent care who had reminder pages generated. MEASUREMENTS AND MAIN RESULTS: The primary outcome was orders for all 3-hour bundle elements within guideline time limits. Secondary outcomes included guideline-adherent delivery of all 3-hour bundle elements, 28-day mortality, antibiotic discontinuation within 48 hours, and pathogen recovery from any culture within 7 days of time zero. Among 3,269 enrolled patients, 1,377 had reminder pages generated and were included in the primary analysis. There were 670 (48.7%) at-risk patients randomized to paging alerts and 707 (51.3%) to control. Bundle-adherent orders were placed for 198 intervention patients (29.6%) versus 149 (21.1%) controls (difference: 8.5%; 95% CI, 3.9-13.1%; p = 0.0003). Bundle-adherent care was delivered for 152 (22.7%) intervention versus 121 (17.1%) control patients (difference: 5.6%; 95% CI, 1.4-9.8%; p = 0.0095). Mortality was similar between groups (8.4% vs 8.3%), as were early antibiotic discontinuation (35.1% vs 33.4%) and pan-culture negativity (69.0% vs 68.2%). CONCLUSIONS: Real-time monitoring and paging alerts significantly increased orders for and delivery of guideline-adherent care for suspected sepsis patients at risk of 3-hour bundle nonadherence. The trial was underpowered to determine whether adherence affected mortality. Despite enrolling patients with clinically suspected sepsis, early antibiotic discontinuation and pan-culture negativity were common, highlighting challenges in identifying appropriate patients for sepsis bundle application.
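
The primary-outcome comparison reported above (bundle-adherent orders for 198 of 670 intervention vs. 149 of 707 control patients) corresponds to a standard two-proportion comparison. A minimal sketch in Python using only the counts from the abstract, assuming statsmodels is available; it illustrates the calculation and is not the trial's actual analysis code:

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest, confint_proportions_2indep

# Counts reported in the abstract: bundle-adherent orders in each arm
adherent = np.array([198, 149])   # intervention, control
enrolled = np.array([670, 707])

# Two-sided z-test for the difference in proportions
stat, pvalue = proportions_ztest(adherent, enrolled)

# Wald confidence interval for the risk difference (intervention - control)
low, high = confint_proportions_2indep(
    adherent[0], enrolled[0], adherent[1], enrolled[1],
    method="wald", compare="diff",
)

print(f"difference = {adherent[0]/enrolled[0] - adherent[1]/enrolled[1]:.3f}")
print(f"95% CI = ({low:.3f}, {high:.3f}), p = {pvalue:.4f}")
# This reproduces values close to the reported 8.5% (3.9-13.1%), p = 0.0003.
```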


Subject(s)
Sepsis , Shock, Septic , Adult , Humans , Prospective Studies , Feedback , Hospital Mortality , Anti-Bacterial Agents/therapeutic use , Guideline Adherence
3.
Mol Ecol ; : e17511, 2024 Aug 31.
Article in English | MEDLINE | ID: mdl-39215560

ABSTRACT

Signals of natural selection can be quickly eroded in high gene flow systems, curtailing efforts to understand how and when genetic adaptation occurs in the ocean. This long-standing, unresolved topic in ecology and evolution has renewed importance because changing environmental conditions are driving range expansions that may necessitate rapid evolutionary responses. One example occurs in Kellet's whelk (Kelletia kelletii), a common subtidal gastropod with an ~40- to 60-day pelagic larval duration that expanded their biogeographic range northwards in the 1970s by over 300 km. To test for genetic adaptation, we performed a series of experimental crosses with Kellet's whelk adults collected from their historical (HxH) and recently expanded range (ExE), and conducted RNA-Seq on offspring that we reared in a common garden environment. We identified 2770 differentially expressed genes (DEGs) between 54 offspring samples with either only historical range (HxH offspring) or expanded range (ExE offspring) ancestry. Using SNPs called directly from the DEGs, we assigned samples of known origin back to their range of origin with unprecedented accuracy for a marine species (92.6% and 94.5% for HxH and ExE offspring, respectively). The SNP with the highest predictive importance occurred on triosephosphate isomerase (TPI), an essential metabolic enzyme involved in cold stress response. TPI was significantly upregulated and contained a non-synonymous mutation in the expanded range. Our findings pave the way for accurately identifying patterns of dispersal, gene flow and population connectivity in the ocean by demonstrating that experimental transcriptomics can reveal mechanisms for how marine organisms respond to changing environmental conditions.
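
Assigning offspring samples back to their range of origin from SNP genotypes is, at its core, a supervised classification problem. A minimal, hypothetical sketch with scikit-learn is shown below; the file `genotypes.csv`, the column names, and the random-forest choice are illustrative assumptions, not the study's actual pipeline:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical input: one row per offspring sample, SNP genotypes coded 0/1/2,
# plus a 'range' label ("historical" or "expanded"). Not the study's real data.
df = pd.read_csv("genotypes.csv")
X = df.drop(columns=["sample_id", "range"])
y = df["range"]

clf = RandomForestClassifier(n_estimators=500, random_state=0)

# Cross-validated accuracy, analogous to assigning samples of known origin
# back to their range of origin
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean assignment accuracy: {scores.mean():.3f}")

# Per-SNP predictive importance (the abstract highlights a SNP on TPI)
clf.fit(X, y)
importances = pd.Series(clf.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head(10))
```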

4.
Ophthalmology ; 131(9): 1021-1032, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38423216

ABSTRACT

PURPOSE: To evaluate the safety and intraocular pressure (IOP)-lowering efficacy of 2 models of the travoprost intraocular implant (fast-eluting [FE] and slow-eluting [SE] types) from 1 of 2 phase 3 trials (the GC-010 trial). DESIGN: Multicenter, randomized, double-masked, sham-controlled, noninferiority trial. PARTICIPANTS: Patients with open-angle glaucoma or ocular hypertension having an unmedicated baseline mean diurnal IOP (average of 8 am, 10 am, and 4 pm time points) of ≥ 21 mmHg, and IOP of ≤ 36 mmHg at each of the 8 am, 10 am, and 4 pm timepoints at baseline. METHODS: Study eyes were randomized to the travoprost intraocular implant (FE implant [n = 200] or SE implant [n = 197] model) or to timolol ophthalmic solution 0.5% twice daily (n = 193). MAIN OUTCOME MEASURES: The primary outcome was mean change from baseline IOP in the study eye at 8 am and 10 am, at each of day 10, week 6, and month 3. Safety outcomes included adverse events (AEs) and ophthalmic assessments. RESULTS: Mean IOP reduction from baseline over the 6 time points ranged from 6.6 to 8.4 mmHg for the FE implant group, from 6.6 to 8.5 mmHg for the SE implant group, and from 6.5 to 7.7 mmHg for the timolol group. The primary efficacy end point was met; the upper limit of the 95% confidence interval of the difference between the implant groups and the timolol group was < 1 mmHg at all 6 time points. Study eye AEs, most of mild or moderate severity, were reported in 21.5%, 27.2%, and 10.8% of patients in the FE implant, SE implant, and timolol groups, respectively. The most common AEs included iritis (FE implant, 0.5%; SE implant, 5.1%), ocular hyperemia (FE implant, 3.0%; SE implant, 2.6%), reduced visual acuity (FE implant, 1.0%; SE implant, 4.1%; timolol, 0.5%), and IOP increased (FE implant, 3.5%; SE implant, 2.6%; timolol, 2.1%). One serious study eye AE occurred (endophthalmitis). CONCLUSIONS: The travoprost intraocular implant demonstrated robust IOP reduction over the 3-month primary efficacy evaluation period after a single administration. The IOP-lowering efficacy in both implant groups was statistically and clinically noninferior to that in the timolol group, with a favorable safety profile. FINANCIAL DISCLOSURE(S): Proprietary or commercial disclosure may be found in the Footnotes and Disclosures at the end of this article.
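
The primary efficacy criterion above is a noninferiority check: the upper limit of the 95% CI of the implant-timolol difference must fall below 1 mmHg at every time point. A minimal sketch of that check is shown below with purely illustrative means, standard deviations, and group sizes, since the per-time-point summary statistics are not given in the abstract:

```python
import numpy as np
from scipy import stats

def noninferior(mean_impl, sd_impl, n_impl, mean_tim, sd_tim, n_tim, margin=1.0):
    """Check whether the upper 95% CI limit of the difference in mean IOP
    reduction (timolol minus implant) is below the noninferiority margin."""
    diff = mean_tim - mean_impl                      # how much less the implant lowers IOP
    se = np.sqrt(sd_impl**2 / n_impl + sd_tim**2 / n_tim)
    dof = n_impl + n_tim - 2
    upper = diff + stats.t.ppf(0.975, dof) * se
    return upper < margin, upper

# Illustrative values only (not taken from the trial) for a single time point
ok, upper = noninferior(mean_impl=8.0, sd_impl=3.5, n_impl=200,
                        mean_tim=7.7, sd_tim=3.5, n_tim=193)
print(f"upper 95% CI limit = {upper:.2f} mmHg, noninferior: {ok}")
```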


Subject(s)
Antihypertensive Agents , Drug Implants , Glaucoma, Open-Angle , Intraocular Pressure , Ocular Hypertension , Tonometry, Ocular , Travoprost , Humans , Glaucoma, Open-Angle/drug therapy , Glaucoma, Open-Angle/physiopathology , Intraocular Pressure/drug effects , Intraocular Pressure/physiology , Ocular Hypertension/drug therapy , Ocular Hypertension/physiopathology , Travoprost/therapeutic use , Travoprost/administration & dosage , Antihypertensive Agents/administration & dosage , Antihypertensive Agents/therapeutic use , Antihypertensive Agents/adverse effects , Female , Male , Double-Blind Method , Aged , Middle Aged , Treatment Outcome , Visual Acuity/physiology , Timolol/administration & dosage , Timolol/therapeutic use , Timolol/adverse effects , Ophthalmic Solutions , Aged, 80 and over , Adult
5.
J Vasc Surg ; 79(1): 111-119.e2, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37717639

ABSTRACT

OBJECTIVE: Many patients with chronic limb-threatening ischemia (CLTI) have additional comorbidities requiring systemic immunosuppression. Few studies have analyzed whether these medications may inhibit graft integration and effectiveness, or conversely, whether they may prevent inflammation and/or restenosis. Therefore, our study aim was to examine the effect of systemic immunosuppression vs no immunosuppression on outcomes after any first-time lower extremity revascularization for CLTI. METHODS: We identified all patients undergoing first-time infrainguinal bypass graft (BPG) or percutaneous transluminal angioplasty with or without stenting (PTA/S) for CLTI at our institution between 2005 and 2014. Patients were stratified by procedure type and immunosuppression status, defined as ≥6 weeks of any systemic immunosuppression therapy ongoing at the time of intervention. Immunosuppression vs nonimmunosuppression were the primary comparison groups in our analyses. Primary outcomes included perioperative complications, reintervention, primary patency, and limb salvage, with Kaplan-Meier and Cox proportional hazard models used for univariate and multivariate analyses, respectively. RESULTS: Among 1312 patients, 667 (51%) underwent BPG and 651 (49%) underwent PTA/S, of whom 65 (10%) and 95 (15%) were on systemic immunosuppression therapy, respectively. Whether assessing BPG or PTA/S patients, there were no differences noted in perioperative outcomes, including perioperative mortality, myocardial infarction, stroke, hematoma, or surgical site infection (P > .05). For BPG patients, Kaplan-Meier analysis and log-rank testing demonstrated no significant difference in three-year reintervention (37% vs 33% [control]; P = .75), major amputation (27% vs 15%; P = .64), or primary patency (72% vs 66%; P = .35) rates. Multivariate analysis via Cox regression confirmed these findings (immunosuppression hazard ratio [HR] for reintervention, 0.95; 95% CI, 0.56-1.60; P = .85; for major amputation, HR, 1.44; 95% CI, 0.70-2.96; P = .32; and for primary patency, HR, 0.97; 95% CI, 0.69-1.38; P = .88). For PTA/S patients, univariate analysis revealed similar rates of reintervention (37% vs 39% [control]; P = .57) and primary patency (59% vs 63%; P = .21); however, immunosuppressed patients had higher rates of major amputation (23% vs 12%; P = .01). After using Cox regression to adjust for baseline demographics, as well as operative and anatomic characteristics, immunosuppression was not associated with any differences in reintervention (HR, 0.75; 95% CI, 0.49-1.16; P = .20), major amputation (HR, 1.46; 95% CI, 0.81-2.62; P = .20), or primary patency (HR, 0.84; 95% CI, 0.59-1.19; P = .32). Sensitivity analyses for the differences in makeup of immunosuppression regimens (steroids vs other classes) did not alter the interpretation of any findings in either BPG or PTA/S cohorts. CONCLUSIONS: Our findings demonstrate that chronic systemic immunosuppression, as compared with no immunosuppression, does not have a significant effect on late outcomes after lower extremity revascularization, as measured by primary patency, reintervention, or major amputation.
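
The reintervention, amputation, and patency comparisons above follow a standard Kaplan-Meier / Cox proportional hazards workflow. A minimal sketch using the lifelines package is shown below; the data file and column names are hypothetical stand-ins for the study's variables:

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical data: one row per limb, with follow-up time in months, an event
# indicator for reintervention, an immunosuppression flag, and covariates.
df = pd.read_csv("bpg_cohort.csv")

# Kaplan-Meier estimates by immunosuppression status (univariate comparison)
kmf = KaplanMeierFitter()
for label, grp in df.groupby("immunosuppressed"):
    kmf.fit(grp["months"], grp["reintervention"], label=f"immunosuppressed={label}")
    print(label, kmf.median_survival_time_)

# Multivariable Cox model adjusting for baseline characteristics
cph = CoxPHFitter()
cph.fit(
    df[["months", "reintervention", "immunosuppressed", "age", "diabetes", "tibial_target"]],
    duration_col="months",
    event_col="reintervention",
)
cph.print_summary()  # the exp(coef) column gives the adjusted hazard ratios
```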


Subject(s)
Angioplasty, Balloon , Peripheral Arterial Disease , Humans , Chronic Limb-Threatening Ischemia , Ischemia/diagnostic imaging , Ischemia/surgery , Vascular Surgical Procedures/adverse effects , Lower Extremity/surgery , Limb Salvage , Treatment Outcome , Immunosuppression Therapy , Retrospective Studies , Risk Factors , Peripheral Arterial Disease/diagnostic imaging , Peripheral Arterial Disease/surgery , Vascular Patency
6.
Neurochem Res ; 49(9): 2303-2318, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38856889

ABSTRACT

Brain-derived neurotrophic factor (BDNF) is vital for synaptic plasticity, cell persistence, and neuronal development in the peripheral and central nervous systems (CNS). Numerous intracellular signalling pathways involving BDNF are well recognized to affect neurogenesis, synaptic function, cell viability, and cognitive function, which in turn affect pathological and physiological aspects of neurons. Stroke has a significant psycho-socioeconomic impact globally. Central post-stroke pain (CPSP), a type of chronic neuropathic pain, is caused by injury to the CNS following a stroke, specifically damage to the somatosensory system. BDNF regulates a broad range of functions directly or via its biologically active isoforms, regulating multiple signalling pathways through interactions with different types of receptors. BDNF has been shown to play a major role in facilitating neuroplasticity during post-stroke recovery and a pro-nociceptive role in pain development in the nervous system. The BDNF-tyrosine kinase receptor B (TrkB) pathway promotes neurite outgrowth, neurogenesis, and the prevention of apoptosis, which helps in stroke recovery. Meanwhile, BDNF overexpression plays a role in CPSP via the activation of the purinergic receptors P2X4R and P2X7R. The neuronal hyperexcitability that causes CPSP is linked with BDNF-TrkB interactions, changes in ion channels, and inflammatory reactions. This review provides an overview of BDNF synthesis, interactions with certain receptors, and potential functions in regulating signalling pathways associated with stroke and CPSP. The pathophysiological mechanisms underlying CPSP, the role of BDNF in CPSP, and the challenges and current treatment strategies targeting BDNF are also discussed.


Subject(s)
Brain-Derived Neurotrophic Factor , Stroke , Humans , Brain-Derived Neurotrophic Factor/metabolism , Animals , Stroke/metabolism , Stroke/complications , Neuralgia/metabolism , Neuralgia/etiology , Neuralgia/drug therapy , Receptor, trkB/metabolism , Signal Transduction/physiology , Neuronal Plasticity/physiology
7.
Neuropsychobiology ; 83(2): 61-72, 2024.
Article in English | MEDLINE | ID: mdl-38574476

ABSTRACT

INTRODUCTION: Neurobiological dysfunction is associated with depression in children and adolescents. While research in adult depression suggests that inflammation may underlie the association between depression and brain alterations, it is unclear if altered levels of inflammatory markers provoke neurobiological dysfunction in early-onset depression. The aim of this scoping review was to provide an overview of existing literature investigating the potential interaction between neurobiological function and inflammation in depressed children and adolescents. METHODS: Systematic searches were conducted in six databases. Primary research studies that included measures of both neurobiological functioning and inflammation among children (≤18 years) with a diagnosis of depression were included. RESULTS: Four studies (240 participants; mean age 16.0 ± 0.6 years, 62% female) meeting inclusion criteria were identified. Studies primarily examined the inflammatory markers interleukin 6, tumor necrosis factor alpha, C-reactive protein, and interleukin 1 beta. Exploratory whole brain imaging and analysis as well as region of interest approaches focused on the anterior cingulate cortex, basal ganglia, and white matter tracts were conducted. Most studies found correlations between neurobiological function and inflammatory markers; however, depressive symptoms were not observed to moderate these effects. CONCLUSIONS: A small number of highly heterogeneous studies indicate that depression may not modulate the association between altered inflammation and neurobiological dysfunction in children and adolescents. Replication in larger samples using consistent methodological approaches (focus on specific inflammatory markers, examine certain brain areas) is needed to advance the knowledge of potential neuro-immune interactions early in the course of depression.


Subject(s)
Inflammation , Humans , Adolescent , Child , Inflammation/physiopathology , Brain/physiopathology , Brain/diagnostic imaging , Brain/metabolism , Depression/physiopathology , Female , Male , Neuroinflammatory Diseases/physiopathology , Neuroinflammatory Diseases/immunology , Depressive Disorder/physiopathology
8.
Cereb Cortex ; 33(12): 7797-7815, 2023 06 08.
Article in English | MEDLINE | ID: mdl-36944537

ABSTRACT

The prefrontal cortex (PFC) has long been associated with arbitrating between approach and avoidance in the face of conflicting and uncertain motivational information, but recent work has also highlighted medial temporal lobe (MTL) involvement. It remains unclear, however, how the contributions of these regions differ in their resolution of conflict information and uncertainty. We designed an fMRI paradigm in which participants approached or avoided object pairs that differed by motivational conflict and outcome uncertainty (complete certainty vs. complete uncertainty). Behavioral data and decision-making parameters estimated using the hierarchical drift diffusion model revealed that participants' responding was driven by conflict rather than uncertainty. Our neural data suggest that PFC areas contribute to cognitive control during approach-avoidance conflict by potentially adjusting response caution and the strength of evidence generated towards either choice, with differential involvement of anterior cingulate cortex and dorsolateral prefrontal cortex. The MTL, on the other hand, appears to contribute to evidence generation, with the hippocampus linked to evidence accumulation for stimuli. Although findings within perirhinal cortex were comparatively equivocal, some evidence suggests contributions to perceptual representations, particularly under conditions of threat. Our findings provide evidence that MTL and PFC regions may contribute uniquely to arbitrating approach-avoidance conflict.
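
Decision-making parameters in this study were estimated with a hierarchical drift diffusion model. The generative process underlying that model is straightforward to simulate; a minimal, non-hierarchical single-trial sketch is shown below, with illustrative parameter values that are not taken from the paper:

```python
import numpy as np

def simulate_ddm_trial(drift, boundary, start=0.0, noise=1.0, dt=0.001, max_t=5.0, rng=None):
    """Simulate one drift diffusion trial.

    Evidence accumulates from `start` with mean rate `drift` until it crosses
    +boundary (approach) or -boundary (avoid). Returns (choice, reaction_time).
    """
    rng = rng or np.random.default_rng()
    x, t = start, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    choice = "approach" if x >= boundary else "avoid"
    return choice, t

rng = np.random.default_rng(0)
# Illustrative values: a weaker drift rate and a wider boundary mimic slower,
# more cautious responding of the kind expected under motivational conflict.
trials = [simulate_ddm_trial(drift=0.5, boundary=1.5, rng=rng) for _ in range(1000)]
rts = [t for _, t in trials]
print(f"mean RT: {np.mean(rts):.3f} s, "
      f"P(approach): {np.mean([c == 'approach' for c, _ in trials]):.2f}")
```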


Subject(s)
Hippocampus , Temporal Lobe , Humans , Hippocampus/physiology , Temporal Lobe/diagnostic imaging , Temporal Lobe/physiology , Prefrontal Cortex/diagnostic imaging , Prefrontal Cortex/physiology , Magnetic Resonance Imaging , Motivation
9.
BMC Pregnancy Childbirth ; 24(1): 612, 2024 Sep 20.
Article in English | MEDLINE | ID: mdl-39304824

ABSTRACT

BACKGROUND: The prevalence of low birth weight (LBW) has stagnated at approximately 12% for the past 15 years in Nepal, significantly impacting newborn survival. While antenatal care (ANC) visits and iron-folic acid supplementation are recognised as important interventions to reduce LBW, there is a lack of evidence regarding their combined effect. This study aimed to explore the potential synergistic impact of ANC and iron-folic acid supplementation on LBW in Nepal by analyzing data from two national surveys. METHODS: The nationally representative Nepal Demographic and Health Surveys of 2016 and 2022 were used, and the pooled dataset was analysed. Birth weight and the prevalence of LBW (i.e. birthweight < 2500 g) were reported using descriptive statistics. The associations among LBW, ANC visits, and iron-folic acid supplementation were examined using logistic regression analyses. RESULTS: The mean birth weight was 3011 g, with an LBW prevalence of 11.2%. Not attending ANC (Adjusted Odds Ratio (AOR): 1.49; 95% Confidence Interval (CI): 1.14, 1.95) and not consuming iron-folic acid supplements (AOR: 1.43; 95% CI: 1.11, 1.84) were independently associated with a higher likelihood of having LBW. Furthermore, when considering both factors together, mothers who attended fewer than four ANC visits and consumed iron-folic acid for ≤ 90 days had a higher likelihood of having LBW (AOR: 1.99; 95% CI: 1.35, 2.60) compared to those who did not. CONCLUSIONS: This study highlights the individual and joint influence of ANC visits and iron-folic acid supplementation on LBW. These findings underscore the significance of ANC attendance and iron-folic acid supplementation in preventing LBW. Traditionally, these two interventions were primarily considered maternal survival strategies. However, our findings indicate that these existing interventions could be utilised further for both maternal and newborn survival. Given that these services are offered free of cost and are available near people's homes through the National Safe Motherhood Programme in Nepal, efforts to increase the uptake of these services should be strengthened while emphasising their role in preventing LBW.
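
The adjusted odds ratios above come from multivariable logistic regression on the pooled survey data. A minimal sketch with statsmodels is shown below; the file name, variable names, and covariate set are hypothetical placeholders, and a faithful reanalysis would also account for the DHS survey design and weights:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pooled DHS extract: lbw (1 = birthweight < 2500 g), no_anc
# (1 = no ANC visit), no_ifa (1 = no iron-folic acid), plus adjustment variables.
df = pd.read_csv("ndhs_pooled.csv")

model = smf.logit(
    "lbw ~ no_anc + no_ifa + maternal_age + wealth_quintile + C(survey_year)",
    data=df,
).fit()

# Adjusted odds ratios with 95% confidence intervals
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["AOR", "2.5%", "97.5%"]
print(or_table)
```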


Subject(s)
Dietary Supplements , Folic Acid , Infant, Low Birth Weight , Iron , Prenatal Care , Humans , Nepal/epidemiology , Prenatal Care/statistics & numerical data , Folic Acid/administration & dosage , Folic Acid/therapeutic use , Female , Pregnancy , Dietary Supplements/statistics & numerical data , Adult , Infant, Newborn , Iron/administration & dosage , Iron/therapeutic use , Young Adult , Prevalence , Adolescent , Health Surveys
10.
Am J Emerg Med ; 81: 111-115, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38733663

ABSTRACT

BACKGROUND AND OBJECTIVES: Patient monitoring systems provide critical information but often produce loud, frequent alarms that worsen patient agitation and stress. This may increase the use of physical and chemical restraints with implications for patient morbidity and autonomy. This study analyzes how augmenting alarm thresholds affects the proportion of alarm-free time and the frequency of medications administered to treat acute agitation. METHODS: Our emergency department's patient monitoring system was modified on June 28, 2022 to increase the tachycardia alarm threshold from 130 to 150 and to remove alarm sounds for several arrhythmias, including bigeminy and premature ventricular beats. A pre-post study was performed lasting 55 days before and 55 days after this intervention. The primary outcome was the change in the number of daily patient alarms. The secondary outcomes were alarm-free time per day and the median number of antipsychotic and benzodiazepine medications administered per day. The safety outcome was the median number of patients transferred daily to the resuscitation area. We used quantile regression to compare outcomes between the pre- and post-intervention periods and linear regression to correlate alarm-free time with the number of sedating medications administered. RESULTS: Between the pre- and post-intervention periods, the median number of alarms per day decreased from 1332 to 845 (-37%). This was primarily driven by a reduction in low-priority arrhythmia alarms from 262 to 21 (-92%), while the median daily census was unchanged (33 vs 32). Median hours per day free from alarms increased from 1.0 to 2.4 (difference 1.4, 95% CI 0.8-2.1). The median number of sedating medications administered per day decreased from 14 to 10 (difference -4, 95% CI -7 to -1), while the number of escalations in level of care to our resuscitation care area did not change significantly. Multivariable linear regression showed a 60-min increase of alarm-free time per day was associated with 0.8 (95% CI 0.1-1.4) fewer administrations of sedating medication, while an additional patient on the behavioral health census was associated with 0.5 (95% CI 0.0-1.1) more administrations of sedating medication. CONCLUSION: A reasonable change in alarm parameter settings may increase the time patients and healthcare workers spend in the emergency department without alarm noise, which in this study was associated with fewer doses of sedating medications administered.
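
The pre-post comparisons above use quantile (median) regression, and the relationship between alarm-free time and sedating-medication use is assessed with linear regression. A minimal sketch with statsmodels is shown below; the daily-summary file and its column names are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical daily summary: one row per calendar day with alarm counts,
# alarm-free hours, sedating-medication doses, a post-intervention flag,
# and the behavioral health census.
daily = pd.read_csv("daily_summary.csv")

# Median (q = 0.5) regression of daily alarms on the intervention period,
# analogous to comparing median alarms per day pre vs. post
median_model = smf.quantreg("alarms ~ post_period", daily).fit(q=0.5)
print(median_model.summary())

# Linear regression relating sedating-medication doses to alarm-free time,
# adjusting for the behavioral health census
dose_model = smf.ols(
    "sedating_doses ~ alarm_free_hours + behavioral_census", data=daily
).fit()
print(dose_model.params)  # coefficient per hour of alarm-free time
```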


Subject(s)
Clinical Alarms , Emergency Service, Hospital , Psychomotor Agitation , Humans , Male , Psychomotor Agitation/drug therapy , Female , Middle Aged , Antipsychotic Agents/therapeutic use , Antipsychotic Agents/administration & dosage , Adult , Aged , Benzodiazepines/therapeutic use , Benzodiazepines/administration & dosage , Monitoring, Physiologic/methods , Hypnotics and Sedatives/therapeutic use , Hypnotics and Sedatives/administration & dosage
11.
Am J Emerg Med ; 75: 143-147, 2024 01.
Article in English | MEDLINE | ID: mdl-37950982

ABSTRACT

BACKGROUND: Many academic medical centers (AMC) transfer patients who require admission but not tertiary care to partner community hospitals from their emergency departments (ED). These transfers alleviate ED boarding but may worsen existing healthcare disparities. We assessed whether disparities exist in the transfer of patients from one AMC ED to a community hospital General Medical Service. METHODS: We performed a retrospective cohort study on all patients screened for transfer between April 1 and December 31, 2021. During the screening process, the treating ED physician determines whether the patient meets standardized clinical criteria and a patient coordinator requests patient consent. We collected patient demographic data from the electronic health record and performed logistic regression at each stage of the transfer process to analyze how individual characteristics impact the odds of proceeding with transfer. RESULTS: 5558 patients were screened and 596 (11%) were ultimately transferred. 1999 (36%) patients were Black or Hispanic, 698 (12%) had a preferred language other than English, and 956 (17%) were on Medicaid or uninsured. A greater proportion of Black and Hispanic patients were deemed eligible for interhospital transfer compared to White patients, and a greater proportion of Hispanic patients completed transfer to the community hospital (p < 0.017 after Bonferroni correction). After accounting for other demographic variables, patients older than 50 (OR 1.21, 95% CI 1.04-1.40), with a preferred language other than English (OR 1.27, 95% CI 1.00-1.62), and from a priority neighborhood (OR 1.38, 95% CI 1.18-1.61) were more likely to be eligible for transfer, while patients who were male (OR 1.50, 95% CI 1.10-2.05) and younger than 50 (OR 1.85, 95% CI 1.20-2.78) were more likely to consent to transfer (p < 0.05). CONCLUSION: Health disparities exist in the screening process for our interfacility transfer program. Further investigation into why these disparities exist, and into mitigation strategies, should be undertaken.


Subject(s)
Hospitals, Community , Patient Transfer , United States , Humans , Male , Female , Retrospective Studies , Emergency Service, Hospital , Health Inequities
12.
Osteoarthritis Cartilage ; 31(10): 1365-1376, 2023 10.
Article in English | MEDLINE | ID: mdl-37364817

ABSTRACT

OBJECTIVE: The detrimental effects of blood exposure on articular tissues are well characterized, but the individual contributions of specific whole blood components are yet to be fully elucidated. Better understanding of mechanisms that drive cell and tissue damage in hemophilic arthropathy will inform novel therapeutic strategies. The studies here aimed to identify the specific contributions of intact and lysed red blood cells (RBCs) on cartilage and the therapeutic potential of Ferrostatin-1 in the context of lipid changes, oxidative stress, and ferroptosis. METHODS: Changes to biochemical and mechanical properties following intact RBC treatment were assessed in human chondrocyte-based tissue-engineered cartilage constructs and validated against human cartilage explants. Chondrocyte monolayers were assayed for changes to intracellular lipid profiles and the presence of oxidative and ferroptotic mechanisms. RESULTS: Markers of tissue breakdown were observed in cartilage constructs without parallel losses in DNA (control: 786.3 (102.2) ng/mg; RBCINT: 751 (126.4) ng/mg; P = 0.6279), implicating nonlethal chondrocyte responses to intact RBCs. Dose-dependent loss of viability in response to intact and lysed RBCs was observed in chondrocyte monolayers, with greater toxicity observed with lysates. Intact RBCs induced changes to chondrocyte lipid profiles, upregulating highly oxidizable fatty acids (e.g., FA 18:2) and matrix disrupting ceramides. RBC lysates induced cell death via oxidative mechanisms that resemble ferroptosis. CONCLUSIONS: Intact RBCs induce intracellular phenotypic changes to chondrocytes that increase vulnerability to tissue damage while lysed RBCs have a more direct influence on chondrocyte death by mechanisms that are representative of ferroptosis.


Subject(s)
Cartilage, Articular , Chondrocytes , Humans , Chondrocytes/metabolism , Hemarthrosis/metabolism , Cartilage, Articular/metabolism , Erythrocytes/metabolism , Oxidative Stress , Lipids
13.
Bioscience ; 73(10): 748-757, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37854891

ABSTRACT

The recovery of wild tigers in India and Nepal is a remarkable conservation achievement, but it sets the stage for increased human-wildlife conflict where parks are limited in size and where tigers reside outside reserves. We deployed an innovative technology, the TrailGuard AI camera-alert system, which runs on-the-edge artificial intelligence algorithms to detect tigers and poachers and transmit real-time images to designated authorities responsible for managing prominent tiger landscapes in India. We successfully captured and transmitted the first images of tigers using cameras with embedded AI and detected poachers. Notifications of tiger images were received in real time, approximately 30 seconds from camera trigger to appearing in a smart phone app. We review use cases of this AI-based real-time alert system for managers and local communities and suggest how the system could help monitor tigers and other endangered species, detect poaching, and provide early warnings for human-wildlife conflict.

14.
Ann Emerg Med ; 81(4): 485-491, 2023 04.
Article in English | MEDLINE | ID: mdl-36669909

ABSTRACT

STUDY OBJECTIVE: Delays in the second dose of antibiotics in the emergency department (ED) are associated with increased morbidity and mortality in patients with serious infections. We analyzed the influence of clinical decision support to prevent delays in second doses of broad-spectrum antibiotics in the ED. METHODS: We allocated adult patients who received cefepime or piperacillin/tazobactam in 9 EDs within an integrated health care system to an electronic alert that reminded ED clinicians to reorder antibiotics at the appropriate interval vs usual care. The primary outcome was a median delay in antibiotic administration. Secondary outcomes were rates of intensive care unit (ICU) admission, hospital mortality, and hospital length of stay. We included a post hoc secondary outcome of frequency of major delay (>25% of expected interval for second antibiotic dose). RESULTS: A total of 1,113 ED patients treated with cefepime or piperacillin/tazobactam were enrolled in the study, of whom 420 remained under ED care when their second dose was due and were included in the final analysis. The clinical decision support tool was associated with reduced antibiotic delays (median difference 35 minutes, 95% confidence interval [CI], 5 to 65). There were no differences in ICU transfers, inpatient mortality, or hospital length of stay. The clinical decision support tool was associated with decreased probability of major delay (absolute risk reduction 13%, 95% CI, 6 to 20). CONCLUSIONS: The implementation of a clinical decision support alert reminding clinicians to reorder second doses of antibiotics was associated with a reduction in the length and frequency of antibiotic delays in the ED. There was no effect on the rates of ICU transfers, inpatient mortality, or hospital length of stay.


Subject(s)
Anti-Bacterial Agents , Hospitalization , Adult , Humans , Anti-Bacterial Agents/therapeutic use , Cefepime , Piperacillin, Tazobactam Drug Combination , Emergency Service, Hospital , Length of Stay , Retrospective Studies
15.
BMC Pregnancy Childbirth ; 23(1): 521, 2023 Jul 17.
Article in English | MEDLINE | ID: mdl-37460948

ABSTRACT

BACKGROUND: Antenatal care (ANC) ensures continuity of care in maternal and foetal health. Understanding the quality and timing of antenatal care (ANC) is important to further progress maternal health in Nepal. This study aimed to investigate the proportion of, and factors associated with, key ANC services in western Nepal. METHODS: Data from a community-based cohort study were utilized to evaluate the major ANC service outcomes: (i) three or fewer ANC visits (underutilization), (ii) late initiation (≥ 4 months), and (iii) suboptimal ANC (< 8 quality indicators). Mothers were recruited and interviewed within 30 days of childbirth. The outcomes and the factors associated with them were reported using frequency distribution and multiple logistic regressions, respectively. RESULTS: Only 7.5% of 735 mothers reported not attending any ANC visits. While only a quarter (23.77%) of mothers reported under-utilizing ANC, more than half of the women (55.21%) initiated ANC visits late, and one-third (33.8%) received suboptimal ANC quality. A total of seven factors were associated with suboptimal ANC. Mothers with lower educational attainment, residing in rural areas, and those who received service at home were more likely to attend three or fewer ANC visits, initiate ANC late, and report receiving suboptimal ANC. Furthermore, mothers from poor family backgrounds appeared to initiate ANC late. Mothers from disadvantaged Madhesi communities tended to receive suboptimal ANC. CONCLUSIONS: Despite a high ANC attendance, a significant proportion of mothers had initiated ANC late and received suboptimal care. There is a need to tailor ANC services to better support women from the Madhesi ethnic community, as well as those from poor and less educated backgrounds, to reduce the inequalities in maternal health care.


Subject(s)
Parturition , Prenatal Care , Female , Humans , Pregnancy , Cohort Studies , Mothers , Nepal , Maternal Health , Healthcare Disparities , Socioeconomic Factors , Residence Characteristics
16.
Am J Emerg Med ; 64: 96-100, 2023 02.
Article in English | MEDLINE | ID: mdl-36502653

ABSTRACT

OBJECTIVE: Skin and soft tissue infections (SSTI) are commonly diagnosed in the emergency department (ED). While most SSTI are diagnosed with patient history and physical exam alone, ED clinicians may order CT imaging when they suspect more serious or complicated infections. Patients who inject drugs are thought to be at higher risk for complications from SSTI and may undergo CT imaging more frequently. The objective of this study is to characterize CT utilization when evaluating for SSTI in ED patients particularly in patients with intravenous drug use (IVDU), the frequency of significant and actionable findings from CT imaging, and its impact on subsequent management and ED operations. METHODS: We performed a retrospective analysis of encounters involving a diagnosis of SSTI in seven EDs across an integrated health system between October 2019 and October 2021. Descriptive statistics were used to assess overall trends, compare CT utilization frequencies, actionable imaging findings, and surgical intervention between patients who inject drugs and those who do not. Multivariable logistic regression was used to analyze patient factors associated with higher likelihood of CT imaging. RESULTS: There were 4833 ED encounters with an ICD-10 diagnosis of SSTI during the study period, of which 6% involved a documented history of IVDU and 30% resulted in admission. 7% (315/4833) of patients received CT imaging, and 22% (70/315) of CTs demonstrated evidence of possible deep space or necrotizing infections. Patients with history of IVDU were more likely than patients without IVDU to receive a CT scan (18% vs 6%), have a CT scan with findings suspicious for deep-space or necrotizing infection (4% vs 1%), and undergo surgical drainage in the operating room within 48 h of arrival (5% vs 2%). Male sex, abnormal vital signs, and history of IVDU were each associated with higher likelihood of CT utilization. Encounters involving CT scans had longer median times to ED disposition than those without CT scans, regardless of whether these encounters resulted in admission (9.0 vs 5.5 h), ED observation (5.5 vs 4.1 h), or discharge (6.8 vs 2.9 h). DISCUSSION: ED clinicians ordered CT scans in 7% of encounters when evaluating for SSTI, most frequently in patients with abnormal vital signs or a history of IV drug use. Patients with a history of IVDU had higher rates of CT findings suspicious for deep space infections or necrotizing infections and higher rates of incision and drainage procedures in the OR. While CT scans significantly extended time spent in the ED for patients, this appeared justified by the high rate of actionable findings found on imaging, particularly for patients with a history of IVDU.


Subject(s)
Soft Tissue Infections , Substance Abuse, Intravenous , Humans , Male , Soft Tissue Infections/diagnostic imaging , Soft Tissue Infections/drug therapy , Retrospective Studies , Tomography, X-Ray Computed , Emergency Service, Hospital , Vital Signs , Substance Abuse, Intravenous/complications , Substance Abuse, Intravenous/epidemiology
17.
Int Arch Occup Environ Health ; 96(3): 355-363, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36089622

ABSTRACT

PURPOSE: Occupational exposure to dust has been recognised as a significant health hazard to mine workers. This study aimed to investigate the association between exposure to inhalable (INH) and respirable (RES) dust and respiratory health among mine workers in Western Australia using an industry-wide exposure database. METHODS: The database comprised cross-sectional surveys conducted by mining companies for the period 2001-2012. The study population consisted of 12,797 workers who were monitored for exposure to INH and RES dust and undertook health assessments including a respiratory questionnaire and spirometry test. RESULTS: Despite the general trend of declining exposure to both INH and RES dust observed over the 12-year period, mine workers reported a higher prevalence of phlegm and cough when exposed to elevated concentrations of INH and RES dust. Logistic regression analysis further confirmed the positive association between INH dust exposure and the prevalence of phlegm, with an adjusted odds ratio of 1.033 (95% CI 1.012-1.052). Overall, 6.3% of miners might have potential airway obstruction, and exposure to INH dust was associated with impaired lung function parameters. CONCLUSION: Exposure levels of INH and RES dust particles among mine workers have reduced considerably and were well below currently legislated occupational exposure limits. However, given the reported higher prevalence of phlegm and cough among those with elevated dust concentrations, there is a continued need for effective dust exposure monitoring and control in the mineral mining industry.


Subject(s)
Lung Diseases , Miners , Occupational Exposure , Humans , Cough , Dust/analysis , Cross-Sectional Studies , Australia , Occupational Exposure/analysis
18.
Clin Transplant ; 36(7): e14670, 2022 07.
Article in English | MEDLINE | ID: mdl-35396887

ABSTRACT

BACKGROUND: Ex vivo lung perfusion (EVLP) is used to assess and preserve lungs prior to transplantation. However, its inherent immunomodulatory effects are not completely understood. We examine perfusate and tissue compartments to determine the change in immune cell composition in human lungs maintained on EVLP. METHODS: Six human lungs unsuitable for transplantation underwent EVLP. Tissue and perfusate samples were obtained during cold storage and at 1-, 3- and 6-h during perfusion. Flow cytometry, immunohistochemistry, and bead-based immunoassays were used to measure leukocyte composition and cytokines. Mean values between baseline and time points were compared by Student's t test. RESULTS: During the 1st hour of perfusion, perfusate neutrophils increased (+22.2 ± 13.5%, p < 0.05), monocytes decreased (-77.5 ± 8.6%, p < 0.01) and NK cells decreased (-61.5 ± 22.6%, p < 0.01) compared to cold storage. In contrast, tissue neutrophils decreased (-22.1 ± 12.2%, p < 0.05) with no change in monocytes and NK cells. By 6 h, perfusate neutrophils, NK cells, and tissue neutrophils were similar to baseline. Perfusate monocytes remained decreased, while tissue monocytes remained unchanged. There was no significant change in B cells or T cell subsets. Pro-inflammatory cytokines (IL-1b, G-CSF, IFN-gamma, CXCL2, CXCL1 granzyme A, and granzyme B) and lymphocyte activating cytokines (IL-2, IL-4, IL-6, IL-8) increased during perfusion. CONCLUSIONS: Early mobilization of innate immune cells occurs in both perfusate and tissue compartments during EVLP, with neutrophils and NK cells returning to baseline and monocytes remaining depleted after 6 h. The immunomodulatory effect of EVLP may provide a therapeutic window to decrease the immunogenicity of lungs prior to transplantation.


Subject(s)
Lung Transplantation , Cytokines/metabolism , Humans , Leukocytes/metabolism , Lung , Perfusion , Tissue Donors
19.
Cereb Cortex ; 31(5): 2701-2719, 2021 03 31.
Article in English | MEDLINE | ID: mdl-33429427

ABSTRACT

The rodent ventral and primate anterior hippocampus have been implicated in approach-avoidance (AA) conflict processing. It is unclear, however, whether this structure contributes to AA conflict detection and/or resolution, and if its involvement extends to conditions of AA conflict devoid of spatial/contextual information. To investigate this, neurologically healthy human participants first learned to approach or avoid single novel visual objects with the goal of maximizing earned points. Approaching led to point gain and loss for positive and negative objects, respectively, whereas avoidance had no impact on score. Pairs of these objects, each possessing nonconflicting (positive-positive/negative-negative) or conflicting (positive-negative) valences, were then presented during functional magnetic resonance imaging. Participants either made an AA decision to score points (Decision task), indicated whether the objects had identical or differing valences (Memory task), or followed a visual instruction to approach or avoid (Action task). Converging multivariate and univariate results revealed that within the medial temporal lobe, perirhinal cortex, rather than the anterior hippocampus, was predominantly associated with object-based AA conflict resolution. We suggest the anterior hippocampus may not contribute equally to all learned AA conflict scenarios and that stimulus information type may be a critical and overlooked determinant of the neural mechanisms underlying AA conflict behavior.


Subject(s)
Avoidance Learning , Choice Behavior , Conflict, Psychological , Hippocampus/diagnostic imaging , Memory/physiology , Motivation , Perirhinal Cortex/diagnostic imaging , Temporal Lobe/diagnostic imaging , Adolescent , Adult , Decision Making , Female , Functional Neuroimaging , Hippocampus/physiology , Humans , Learning/physiology , Magnetic Resonance Imaging , Male , Perirhinal Cortex/physiology , Temporal Lobe/physiology , Young Adult
20.
Am J Emerg Med ; 56: 205-210, 2022 06.
Article in English | MEDLINE | ID: mdl-35427856

ABSTRACT

OBJECTIVES: Caring for patients with COVID-19 has resulted in a considerable strain on hospital capacity. One strategy to mitigate crowding is the use of ED-based observation units to care for patients who may have otherwise required hospitalization. We sought to create a COVID-19 Observation Protocol for our ED Observation Unit (EDOU) for patients with mild to moderate COVID-19 to allow emergency physicians (EP) to gather more data for or against admission and intervene in a timely manner to prevent clinical deterioration. METHODS: This was a retrospective cohort study which included all patients who were positive for SARS-CoV-2 at the time of EDOU placement for the primary purpose of monitoring COVID-19 disease. Our institution updated the ED Observation protocol partway into the study period. Descriptive statistics were used to characterize demographics. We assessed for differences in demographics, clinical characteristics, and outcomes between admitted and discharged patients. Multivariate logistic regression models were used to assess whether meeting criteria for the ED observation protocols predicted disposition. RESULTS: During the time period studied, 120 patients positive for SARS-CoV-2 were placed in the EDOU for the primary purpose of monitoring COVID-19 disease. The admission rate for patients in the EDOU during the study period was 35%. When limited to patients who met criteria for version 1 or version 2 of the protocol, this dropped to 21% and 25% respectively. Adherence to the observation protocol was 62% and 60% during the time of version 1 and version 2 implementation, respectively. Using a multivariate logistic regression, meeting criteria for either version 1 (OR = 3.17, 95% CI 1.34-7.53, p < 0.01) or version 2 (OR = 3.18, 95% CI 1.39-7.30, p < 0.01) of the protocol resulted in a higher likelihood of discharge. There was no difference in EDOU LOS between admitted and discharged patients. CONCLUSION: An ED observation protocol can be successfully created and implemented for COVID-19 which allows the EP to determine which patients warrant hospitalization. Meeting protocol criteria results in an acceptable admission rate.


Subject(s)
COVID-19 , COVID-19/epidemiology , Clinical Observation Units , Emergency Service, Hospital , Humans , Observation , Retrospective Studies , SARS-CoV-2