ABSTRACT
We have recently demonstrated the role of the Fyn-PKCδ signaling pathway in status epilepticus (SE)-induced neuroinflammation and epileptogenesis in experimental models of temporal lobe epilepsy (TLE). In this study, we show a significant disease-modifying effect and the mechanisms of a Fyn/Src tyrosine kinase inhibitor, saracatinib (SAR, also known as AZD0530), in the rat kainate (KA) model of TLE. SAR treatment for a week, starting the first dose (25 mg/kg, oral) 4 h after the onset of SE, significantly reduced spontaneously recurring seizures and epileptiform spikes during four months of continuous video-EEG monitoring. Immunohistochemistry of brain sections and Western blot analyses of hippocampal lysates at 8 days (8d) and 4 months post-SE revealed a significant reduction of SE-induced astrogliosis, microgliosis, neurodegeneration, and phosphorylated Fyn/Src-419 and PKCδ-tyr311 in the SAR-treated group when compared with the vehicle control. We also found suppression of nitroxidative stress markers such as iNOS, 3-NT, 4-HNE, and gp91phox in the hippocampus, and of nitrite and ROS levels in the serum of the SAR-treated group at 8d post-SE. qRT-PCR (hippocampus) and ELISA (serum) revealed a significant reduction of the key proinflammatory cytokines TNFα and IL-1β at the mRNA level in the hippocampus and at the protein level in serum, in addition to IL-6 and IL-12, in the SAR-treated group at 8d in contrast to the vehicle-treated group. These findings suggest that SAR targets some of the key biomarkers of epileptogenesis and modulates the neuroinflammatory and nitroxidative pathways that mediate the development of epilepsy. Therefore, SAR could be developed as a potential disease-modifying agent to prevent the development and progression of TLE.
Subject(s)
Benzodioxoles/therapeutic use , Disease Models, Animal , Enzyme Inhibitors/therapeutic use , Epilepsy, Temporal Lobe/drug therapy , Kainic Acid/toxicity , Proto-Oncogene Proteins c-fyn/antagonists & inhibitors , Quinazolines/therapeutic use , Animals , Benzodioxoles/pharmacology , Electroencephalography/methods , Enzyme Inhibitors/pharmacology , Epilepsy, Temporal Lobe/chemically induced , Epilepsy, Temporal Lobe/metabolism , Inflammation Mediators/antagonists & inhibitors , Inflammation Mediators/metabolism , Male , Proto-Oncogene Proteins c-fyn/metabolism , Quinazolines/pharmacology , Rats , Rats, Sprague-Dawley , Reactive Oxygen Species/antagonists & inhibitors , Reactive Oxygen Species/metabolism , Telemetry/methods
ABSTRACT
Chemical nerve agents (CNA) are increasingly becoming a threat to both civilians and military personnel. CNA-induced acute effects on the nervous system have been known for some time, and the long-term consequences are beginning to emerge. In this study, we used diisopropylfluorophosphate (DFP), a seizurogenic CNA, to investigate the long-term impact of acute exposure on the brain and its mitigation by an inducible nitric oxide synthase (iNOS) inhibitor, 1400W, as a neuroprotectant in the rat model. Several experimental studies have demonstrated that DFP-induced seizures and/or status epilepticus (SE) cause permanent brain injury, even after countermeasure medication (atropine, oxime, and diazepam). In the present study, DFP-induced SE caused a significant increase in iNOS and 3-nitrotyrosine (3-NT) at 24 h, 48 h, and 7d that persisted long-term (12 weeks post-exposure), which led to the hypothesis that iNOS is a potential therapeutic target in DFP-induced brain injury. To test the hypothesis, we administered 1400W (20 mg/kg, i.m.) or the vehicle twice daily for the first three days post-exposure. 1400W significantly reduced DFP-induced iNOS and 3-NT upregulation in the hippocampus and piriform cortex, and serum nitrite levels, at 24 h post-exposure. 1400W also prevented DFP-induced mortality within 24 h. Brain immunohistochemistry (IHC) at 7d post-exposure revealed a significant reduction in gliosis and neurodegeneration (NeuN+ FJB-positive cells) in the 1400W-treated group. 1400W, in contrast to the vehicle, caused a significant reduction in epileptiform spiking and spontaneous recurrent seizures (SRS) during 12 weeks of continuous video-EEG study. IHC of brain sections from the same animals revealed a significant reduction in reactive gliosis (both microgliosis and astrogliosis) and neurodegeneration across various brain regions in the 1400W-treated group when compared to the vehicle-treated group.
A multiplex assay of hippocampal lysates at 6 weeks post-exposure showed a significant increase in several key pro-inflammatory cytokines/chemokines, such as IL-1α, TNFα, IL-1β, IL-2, IL-6, IL-12, IL-17a, MCP-1, LIX, and Eotaxin, and a growth factor, VEGF, in the vehicle-treated animals. 1400W significantly suppressed IL-1α, TNFα, IL-2, IL-12, and MCP-1 levels. It also suppressed DFP-induced serum nitrite levels at 6 weeks post-exposure. In the Morris water maze, the vehicle-treated animals spent significantly less time in the target quadrant in a probe trial at 9d post-exposure compared to their time spent in the same quadrant 11 days earlier (i.e., 2 days prior to DFP exposure). Such a difference was not observed in the 1400W and control groups. However, learning and short-term memory were unaffected when tested at 10-16d and 28-34d post-exposure. The accelerated rotarod, horizontal bar test, and forced swim test revealed no significant changes between groups. Overall, the findings from this study suggest that 1400W may be considered a potential follow-on therapy for CNA exposure, after controlling the acute symptoms, to prevent mortality and some of the long-term neurotoxicity parameters such as epileptiform spiking, SRS, neurodegeneration, reactive gliosis in some brain regions, and certain key proinflammatory cytokines and chemokines.
Subject(s)
Amidines/pharmacology , Benzylamines/pharmacology , Brain/drug effects , Isoflurophate/toxicity , Neuroprotective Agents/pharmacology , Neurotoxicity Syndromes/pathology , Animals , Brain/pathology , Disease Models, Animal , Enzyme Inhibitors/pharmacology , Male , Nerve Agents/toxicity , Nerve Degeneration/chemically induced , Nerve Degeneration/pathology , Nitric Oxide Synthase Type II/antagonists & inhibitors , Rats , Rats, Sprague-Dawley
ABSTRACT
PURPOSE: Intensive care unit (ICU) resources are a costly but effective commodity used in the management of critically ill patients with chronic obstructive pulmonary disease (COPD). ICU admission decisions are determined by patient diagnosis and severity of illness, but may also be affected by hospital differences in quality and performance. We investigated the variability in ICU utilization for patients with COPD and its association with hospital characteristics. METHODS: Using a 3M administrative dataset spanning 2008-2013, we conducted a retrospective cohort study of adult patients discharged with COPD at hospitals in three states to determine variability in ICU utilization. Quality metrics were calculated for each hospital using observed-to-expected (O/E) ratios for overall mortality and length of stay. Logistic and multilevel multivariate regression models were constructed, estimating the association of hospital quality metrics with ICU utilization after adjustment for available clinical factors and hospital characteristics. RESULTS: In 434 hospitals with 570,517 COPD patient visits, the overall ICU admission rate was 33.1% [range 0-89%; median (IQR) 24% (8, 54)]. The addition of patient, hospital, and quality characteristics decreased the overall variability attributable to individual hospital differences seen within our cohort from 40.9% to 33%. Odds of ICU utilization were increased for larger hospitals and those seeing lower pulmonary case volume. Hospitals with better overall O/E ratios for length of stay or mortality had lower ICU utilization. CONCLUSIONS: Hospital characteristics, including quality metrics, are associated with variability in ICU utilization for COPD patients, with higher ICU utilization seen at lower-performing hospitals.
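The observed-to-expected (O/E) quality metric used in the study above is a simple ratio; the sketch below illustrates the idea only (the helper name is hypothetical, and in practice the expected count comes from a risk-adjustment model fitted to patient-level data):

```python
def oe_ratio(observed, expected):
    """Observed-to-expected (O/E) ratio for a hospital quality metric.

    observed: the count actually seen (e.g., deaths, or total length of stay).
    expected: the count predicted by a risk-adjustment model for the same patients.
    A ratio below 1 suggests better-than-expected performance; above 1, worse.
    """
    if expected <= 0:
        raise ValueError("expected count must be positive")
    return observed / expected

# Example: a hospital with 8 observed deaths against 10 model-expected deaths
# has an O/E mortality ratio of 0.8 (better than predicted).
mortality_oe = oe_ratio(8, 10)
```

The same ratio form works for any outcome with a modeled expectation, which is why the study can compare hospitals on both mortality and length of stay with one metric.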
Subject(s)
Hospitalization/statistics & numerical data , Intensive Care Units/statistics & numerical data , Length of Stay/statistics & numerical data , Mortality , Pulmonary Disease, Chronic Obstructive/therapy , Quality of Health Care , Aged , Cross-Sectional Studies , Female , Hospitals/standards , Hospitals/statistics & numerical data , Hospitals, High-Volume/statistics & numerical data , Hospitals, Low-Volume/statistics & numerical data , Humans , Logistic Models , Male , Middle Aged , Multilevel Analysis
ABSTRACT
OBJECTIVE: To examine the prevalence of cervical spine injuries among children and adolescents referred with suspected and diagnosed sports-related concussion (SRC), and to evaluate the effect of cervical spine dysfunction (CSD) on physician-documented clinical recovery following SRC. SETTING: A multidisciplinary pediatric concussion program. PARTICIPANTS: A total of 266 patients (6-19 years) referred with suspected SRC. DESIGN: A retrospective cohort study. MAIN MEASURES: CSD, defined as neurological symptoms localized to the cervical spine or the presence of neck pain, headache, or dizziness with abnormal cervical spine examination findings; physician-documented clinical recovery. RESULTS: One patient was diagnosed with a T1 compression fracture. Of the 246 patients diagnosed with SRC, 80 (32.5%) met the clinical criteria for CSD, including 4 patients with central cord neuropraxia and 1 with a spinal cord injury without radiographic abnormality (SCIWORA). Excluding patients with central cord neuropraxia or SCIWORA, patients with SRC and CSD took longer to achieve physician-documented clinical recovery (28.5 days vs 17 days, P < .0001) and were 3.95 times more likely to experience delayed physician-documented clinical recovery (>4 weeks postinjury) compared with those without CSD. CONCLUSIONS: Patients with suspected and diagnosed SRC can present with a wide spectrum of coincident cervical spine injuries. Cervical spine dysfunction may be a risk factor for delayed clinical recovery.
Subject(s)
Athletic Injuries/epidemiology , Brain Concussion/epidemiology , Cervical Vertebrae/injuries , Spinal Fractures/diagnosis , Adolescent , Cervical Vertebrae/diagnostic imaging , Child , Cohort Studies , Female , Fractures, Compression/diagnosis , Fractures, Compression/epidemiology , Humans , Magnetic Resonance Imaging , Male , Recovery of Function , Referral and Consultation/statistics & numerical data , Retrospective Studies , Spinal Fractures/epidemiology , Time Factors , Tomography, X-Ray Computed , Young Adult
ABSTRACT
OBJECTIVES: ICU admission delays can negatively affect patient outcomes, but emergency department volume and boarding times may also affect these decisions and associated patient outcomes. We sought to investigate the effect of emergency department and ICU capacity strain on ICU admission decisions and to examine the effect of emergency department boarding time of critically ill patients on in-hospital mortality. DESIGN: A retrospective cohort study. SETTING: Single academic tertiary care hospital. PATIENTS: Adult critically ill emergency department patients for whom a consult for medical ICU admission was requested, over a 21-month period. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Patient data, including severity of illness (Mortality Probability Model III on Admission), outcomes of mortality and persistent organ dysfunction, and hourly census reports for the emergency department, all ICUs, and all adult wards were compiled. A total of 854 emergency department requests for ICU admission were logged, with 455 (53.3%) "accept" and 399 (46.7%) "deny" cases, with median emergency department boarding times of 4.2 hours (interquartile range, 2.8-6.3 hr) and 11.7 hours (3.2-20.3 hr), respectively, and similar rates of persistent organ dysfunction and/or death (41.5% and 44.6%). Those accepted were younger (mean ± SD, 61 ± 17 vs 65 ± 18 yr) and more severely ill (median Mortality Probability Model III on Admission score, 15.3% [7.0-29.5%] vs 13.4% [6.3-25.2%]) than those denied admission. In the multivariable model, a full medical ICU was the only hospital-level factor significantly associated with a lower probability of ICU acceptance (odds ratio, 0.55 [95% CI, 0.37-0.81]).
Using propensity score analysis to account for imbalances in baseline characteristics between those accepted and those denied ICU admission, longer emergency department boarding time after consult was associated with higher odds of mortality and persistent organ dysfunction (odds ratio, 1.77 [1.07-2.95] per log10-hour increase). CONCLUSIONS: ICU admission decisions for critically ill emergency department patients are affected by medical ICU bed availability, though emergency department volume and occupancy of other ICUs did not play a role. Prolonged emergency department boarding times were associated with worse patient outcomes, suggesting a need for improved throughput and targeted care for patients awaiting ICU admission.
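Because the boarding-time odds ratio above is expressed per log10-hour increase, translating it into a comparison of two concrete boarding times takes a single exponentiation. A minimal sketch (the helper is hypothetical; the 1.77 figure is the abstract's multivariable estimate):

```python
import math

def scaled_odds_ratio(or_per_log10_hour, hours_a, hours_b):
    """Odds multiplier implied by moving from hours_a to hours_b of
    emergency department boarding, given an odds ratio expressed per
    log10-hour increase of boarding time."""
    delta = math.log10(hours_b) - math.log10(hours_a)  # change in log10-hours
    return or_per_log10_hour ** delta

# Moving from the accepted patients' median boarding time (4.2 h) to the
# denied patients' median (11.7 h) spans log10(11.7/4.2) ≈ 0.44 log10-hours.
implied = scaled_odds_ratio(1.77, 4.2, 11.7)  # ≈ 1.29
```

This kind of rescaling is purely arithmetic; it does not re-estimate the model, so the confidence interval would need to be transformed the same way.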
Subject(s)
Bed Occupancy , Critical Illness/therapy , Emergency Service, Hospital/statistics & numerical data , Intensive Care Units/statistics & numerical data , Patient Admission/statistics & numerical data , Adult , Age Factors , Bed Occupancy/statistics & numerical data , Critical Illness/mortality , Female , Humans , Male , Multiple Organ Failure/epidemiology , Retrospective Studies , Tertiary Care Centers/statistics & numerical data , Time Factors , Treatment Outcome , Triage , Waiting Lists
ABSTRACT
BACKGROUND: Neuraxial anesthesia is increasingly recommended for hip/knee replacements as some studies show improved outcomes on the individual level. With hospital-level studies lacking, we assessed the relationship between hospital-level neuraxial anesthesia utilization and outcomes. METHODS: National data on 808,237 total knee and 371,607 hip replacements were included (Premier Healthcare 2006 to 2014; 550 hospitals). Multivariable associations were measured between hospital-level neuraxial anesthesia volume (subgrouped into quartiles) and outcomes (respiratory/cardiac complications, blood transfusion/intensive care unit need, opioid utilization, and length/cost of hospitalization). Odds ratios (or percent change) and 95% CI are reported. Volume-outcome relationships were additionally assessed by plotting hospital-level neuraxial anesthesia volume against predicted hospital-specific outcomes; trend tests were applied with trendlines' R statistics reported. RESULTS: Annual hospital-specific neuraxial anesthesia volume varied greatly: interquartile range, 3 to 78 for hips and 6 to 163 for knees. Increasing frequency of neuraxial anesthesia was not associated with reliable improvements in any of the study's clinical outcomes. However, significant reductions of up to -14.1% (95% CI, -20.9% to -6.6%) and -15.6% (95% CI, -22.8% to -7.7%) were seen for hospitalization cost in knee and hip replacements, respectively, both in the third quartile of neuraxial volume. This coincided with significant volume effects for hospitalization cost; test for trend P < 0.001 for both procedures, R 0.13 and 0.41 for hip and knee replacements, respectively. CONCLUSIONS: Increased hospital-level use of neuraxial anesthesia is associated with lower hospitalization cost for lower joint replacements. However, additional studies are needed to elucidate all drivers of differences found before considering hospital-level neuraxial anesthesia use as a potential marker of quality.
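Subgrouping hospitals into volume quartiles, as in the analysis above, can be sketched with a rank-based assignment (an illustration only; the study does not specify its exact cut-point or tie handling):

```python
def volume_quartiles(volumes):
    """Assign each hospital a quartile (1-4) by rank of its annual
    neuraxial-anesthesia volume; ties are broken by input order."""
    order = sorted(range(len(volumes)), key=lambda i: volumes[i])
    n = len(volumes)
    quartile = [0] * n
    for rank, idx in enumerate(order):
        quartile[idx] = rank * 4 // n + 1  # ranks map evenly onto 1..4
    return quartile

# Eight hospitals, listed from lowest to highest volume, fall two per quartile.
groups = volume_quartiles([3, 6, 20, 40, 78, 100, 150, 163])
```

With quartile labels in hand, each outcome can then be modeled against quartile membership, which is the structure the multivariable associations in the abstract describe.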
Subject(s)
Anesthesia, Local/trends , Arthroplasty, Replacement, Hip/trends , Arthroplasty, Replacement, Knee/trends , Hospitals/trends , Outcome Assessment, Health Care/trends , Aged , Anesthesia, Conduction/standards , Anesthesia, Conduction/trends , Anesthesia, Local/standards , Arthroplasty, Replacement, Hip/standards , Arthroplasty, Replacement, Knee/standards , Female , Humans , Male , Middle Aged , Outcome Assessment, Health Care/standards , Retrospective Studies , Treatment Outcome
ABSTRACT
PURPOSE: Obstructive sleep apnea (OSA) has been linked to higher rates of perioperative complications. Practice guidelines recommend minimizing opioids in this cohort to reduce complications. However, a paucity of evidence exists relating different levels of opioid prescription to perioperative complications. Our aim was to investigate if different levels of opioid prescription are related to perioperative complication risk in patients with OSA. METHODS: A total of 107,610 OSA patients undergoing total knee or hip arthroplasty between 2006 and 2013 were identified in a nationwide database and divided into subgroups according to the amount of opioids prescribed. We then compared those subgroups for odds of perioperative complications using multilevel multivariable logistic regression models. RESULTS: OSA patients with higher levels of opioid prescription had increased odds for gastrointestinal complications (OR 1.90, 95% CI 1.47-2.46), prolonged length of stay (OR 1.64, 95% CI 1.57-1.72), and increased cost of care (OR 1.48, 95% CI 1.40-1.57). However, we found lower odds for pulmonary complications (OR 0.85, 95% CI 0.74-0.96) for the high-prescription group. CONCLUSIONS: Higher levels of opioid prescription were associated with higher odds for gastrointestinal complications and adverse effects on cost and length of stay but lower odds for pulmonary complications in OSA patients undergoing joint arthroplasties. The latter finding is unlikely causal but may represent more preventive measures and early interventions among those patients. Attempts to reduce opioid prescription should be undertaken to improve quality and safety of care in this challenging cohort in the perioperative setting.
Subject(s)
Analgesics, Opioid/adverse effects , Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Postoperative Complications/chemically induced , Sleep Apnea, Obstructive/complications , Adult , Aged , Aged, 80 and over , Analgesics, Opioid/administration & dosage , Analgesics, Opioid/therapeutic use , Female , Humans , Male , Middle Aged , Retrospective Studies
ABSTRACT
BACKGROUND: Emerging evidence associating obstructive sleep apnea (OSA) with adverse perioperative outcomes has recently heightened the level of awareness among perioperative physicians. In particular, estimates projecting the high prevalence of this condition in the surgical population highlight the necessity of developing and adhering to "best clinical practices." In this context, a number of expert panels have generated recommendations in an effort to provide guidance for perioperative decision-making. However, given the paucity of insights into the status of the implementation of recommended practices on a national level, we sought to investigate the current utilization, trends, and penetration of OSA care-related interventions in the perioperative management of patients undergoing lower joint arthroplasties. METHODS: In this population-based analysis, we identified 1,107,438 cases of total hip and knee arthroplasties (Premier Perspective database; 2006-2013) and investigated utilization and temporal trends in the perioperative use of regional anesthetic techniques, blood oxygen saturation monitoring (oximetry), supplemental oxygen administration, positive airway pressure therapy, advanced monitoring environments, and opioid prescription among patients with and without OSA. RESULTS: The utilization of regional anesthetic techniques did not differ by OSA status; overall, <25% and 15% of patients received neuraxial anesthesia and peripheral nerve blocks, respectively. Trend analysis showed a significant increase in peripheral nerve block use of >50% and a concurrent decrease in opioid prescription. Interestingly, while the absolute number of patients with OSA receiving perioperative oximetry, supplemental oxygen, and positive airway pressure therapy significantly increased over time, proportional use significantly decreased by approximately 28%, 36%, and 14%, respectively. A shift from utilization of intensive care to telemetry and stepdown units was seen.
CONCLUSIONS: On a population-based level, the implementation of OSA-targeted interventions appears limited, with some current trends running contrary to practice guidelines. Reasons for these findings need to be further elucidated, but the observation of a dramatic increase in absolute utilization alongside a proportional decrease may suggest resource constraints as a contributor.
Subject(s)
Arthroplasty, Replacement, Knee/adverse effects , Continuous Positive Airway Pressure , Perioperative Care/methods , Sleep Apnea, Obstructive/complications , Aged , Anesthesia, Local , Anesthesiology , Anesthetics , Critical Care , Female , Humans , Male , Middle Aged , Outcome Assessment, Health Care , Oximetry , Oxygen Inhalation Therapy , Postoperative Complications , Retrospective Studies , Sleep Apnea, Obstructive/physiopathology , Time Factors , Treatment Outcome
ABSTRACT
BACKGROUND: Current clinical guidelines consider regimens consisting of either ritonavir-boosted atazanavir or ritonavir-boosted lopinavir and a nucleoside reverse transcriptase inhibitor (NRTI) backbone among their recommended and alternative first-line antiretroviral regimens. However, these guidelines are based on limited evidence from randomized clinical trials and clinical experience. METHODS: We compared these regimens with respect to clinical, immunologic, and virologic outcomes using data from prospective studies of human immunodeficiency virus (HIV)-infected individuals in Europe and the United States in the HIV-CAUSAL Collaboration, 2004-2013. Antiretroviral therapy-naive and AIDS-free individuals were followed from the time they started a lopinavir or an atazanavir regimen. We estimated the 'intention-to-treat' effect for atazanavir vs lopinavir regimens on each of the outcomes. RESULTS: A total of 6668 individuals started a lopinavir regimen (213 deaths, 457 AIDS-defining illnesses or deaths), and 4301 individuals started an atazanavir regimen (83 deaths, 157 AIDS-defining illnesses or deaths). The adjusted intention-to-treat hazard ratios for atazanavir vs lopinavir regimens were 0.70 (95% confidence interval [CI], .53-.91) for death, 0.67 (95% CI, .55-.82) for AIDS-defining illness or death, and 0.91 (95% CI, .84-.99) for virologic failure at 12 months. The mean 12-month increase in CD4 count was 8.15 (95% CI, -.13 to 16.43) cells/µL higher in the atazanavir group. Estimates differed by NRTI backbone. CONCLUSIONS: Our estimates are consistent with a lower mortality, a lower incidence of AIDS-defining illness, a greater 12-month increase in CD4 cell count, and a smaller risk of virologic failure at 12 months for atazanavir compared with lopinavir regimens.
Subject(s)
Anti-HIV Agents/therapeutic use , Antiretroviral Therapy, Highly Active/methods , Atazanavir Sulfate/therapeutic use , HIV Infections/drug therapy , Lopinavir/therapeutic use , Adolescent , Adult , CD4 Lymphocyte Count , Cohort Studies , Cooperative Behavior , Developed Countries , Europe , Female , HIV Infections/immunology , HIV Infections/virology , Humans , Male , Middle Aged , Prospective Studies , Treatment Outcome , United States , Viral Load , Young Adult
ABSTRACT
PURPOSE: In 2007 and 2009, the American Academy of Orthopaedic Surgeons released Clinical Practice Guidelines (CPG) for diagnosis and treatment of carpal tunnel syndrome (CTS) based upon review of the literature. The lack of consistently high-level evidence resulted in several recommendations, some strongly supported, some weakly supported, and others controversial. We postulated that a survey of American Society for Surgery of the Hand (ASSH) members would provide insight into practice patterns among hand surgeons treating CTS and demonstrate the extent to which the CPG influenced practice behavior. METHODS: A multiple-choice questionnaire including detailed, commonly observed clinical scenarios was developed, pre-tested, and approved by our institutional review board and the ASSH Web site committee chair. An anonymous electronic survey was emailed to ASSH members. RESULTS: Surveys were sent to 2,650 eligible ASSH members, and 27% responded. Seventy-two percent would advise a patient to have carpal tunnel release (CTR) if the patient had both a classic history/examination of CTS and complete relief following cortisone injection. Forty-seven percent responded that in this scenario electrodiagnostic testing (EDX) is rarely or never necessary to recommend CTR. Seventy-nine percent of respondents were at least slightly more likely to order EDX based on CPG recommendations. Of these respondents, 57% replied that this was because of potential medicolegal ramifications. CONCLUSIONS: Although the CPG recommended EDX before surgery, and although most responding ASSH members use EDX to advise CTR, a majority answered that a supporting history and physical examination alone can be sufficient to recommend surgery, that a positive response to a cortisone injection can be sufficient indication for CTR, that EDX is not necessary in all cases of CTS, and that they would perform CTR in the face of normal EDX if cortisone temporarily resolved symptoms.
Among respondents more likely to order EDX based on the CPG, 57% answered that it was in some circumstances due to potential medicolegal ramifications. TYPE OF STUDY/LEVEL OF EVIDENCE: Economic and decision analysis III.
Subject(s)
Carpal Tunnel Syndrome/diagnosis , Carpal Tunnel Syndrome/surgery , Guideline Adherence , Orthopedics , Practice Patterns, Physicians' , Anti-Inflammatory Agents/therapeutic use , Cortisone/therapeutic use , Electrodiagnosis , Health Care Surveys , Humans , Outcome Assessment, Health Care , Patient Selection , Practice Guidelines as Topic , Societies, Medical , Surveys and Questionnaires , United States
ABSTRACT
Enhlink is a computational tool for scATAC-seq data analysis, facilitating precise interrogation of enhancer function at the single-cell level. It employs an ensemble approach incorporating technical and biological covariates to infer condition-specific regulatory DNA linkages. Enhlink can integrate multi-omic data for enhanced specificity, when available. Evaluation with simulated and real data, including multi-omic datasets from the mouse striatum and novel promoter capture Hi-C data, demonstrates that Enhlink outperforms alternative methods. Coupled with eQTL analysis, it identified a putative super-enhancer in striatal neurons. Overall, Enhlink offers accuracy, power, and potential for revealing novel biological insights in gene regulation.
Subject(s)
Enhancer Elements, Genetic , Promoter Regions, Genetic , Animals , Mice , Software , Quantitative Trait Loci , Corpus Striatum/metabolism , Single-Cell Analysis
ABSTRACT
PURPOSE: Although there are many studies that have examined substance use and mental health concerns in rural areas, there is a paucity of research related to the prevalence of substance use and mental well-being in agriculturally based occupations. This study aimed to determine the prevalence of alcohol and opioid misuse and anxiety among adults in agriculturally based occupations in the rural Midwest and to determine the risk factors for alcohol misuse. METHODS: Data were collected via mailed surveys, with 1,791 surveys returned. Participants completed the Alcohol Use Disorder Identification Test, the Drug Abuse Screening Test-1, and the Generalized Anxiety Disorder Screener, and reported demographic data. Multivariable logistic regression was used to examine factors associated with alcohol misuse. RESULTS: Younger age, male sex, being unmarried, and agriculturally based work were significantly associated with alcohol misuse. For opioid use, the highest prevalence rate (10%) was found among direct agricultural workers who were not married and in the age group 19-39. The highest anxiety prevalence rate was found in participants aged 19-39 (15.5%), who also scored at the highest level of alcohol misuse, with a prevalence rate of 27.9%. CONCLUSIONS: Future research is suggested on gender identity and anxiety in agricultural populations, and on agriculturally based occupations as potential protective factors for opioid misuse.
Subject(s)
Alcoholism , Opioid-Related Disorders , Prescription Drug Misuse , Adult , Humans , Male , Female , Analgesics, Opioid/adverse effects , Cross-Sectional Studies , Alcoholism/epidemiology , Alcoholism/psychology , Gender Identity , Opioid-Related Disorders/drug therapy , Anxiety/epidemiology , Anxiety Disorders/chemically induced , Anxiety Disorders/drug therapy , Ethanol , Occupations , Prescription Drug Misuse/psychology
ABSTRACT
Enhancers play a crucial role in regulating gene expression, and their functional status can be queried with cell-type precision using single-cell (sc)ATAC-seq. To facilitate analysis of such data, we developed Enhlink, a novel computational approach that leverages single-cell signals to infer linkages between regulatory DNA sequences, such as enhancers and promoters. Enhlink uses an ensemble strategy that integrates cell-level technical covariates to control for batch effects and biological covariates to infer robust condition-specific links and their associated p-values. It can integrate simultaneous gene expression and chromatin accessibility measurements of individual cells profiled by multi-omic experiments for increased specificity. We evaluated Enhlink using simulated and real scATAC-seq data, including data paired with physical enhancer-promoter links enumerated by promoter capture Hi-C and with multi-omic scATAC-/RNA-seq data we generated from the mouse striatum. These examples demonstrated that our method outperforms popular alternative strategies. In conjunction with eQTL analysis, Enhlink revealed a putative super-enhancer regulating key cell type-specific markers of striatal neurons. Taken together, our analyses demonstrate that Enhlink is accurate, powerful, and provides features that can lead to novel biological insights in gene regulation.
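This is not Enhlink's actual algorithm, but the core idea of testing an enhancer-promoter linkage against single-cell co-accessibility can be illustrated with a simple permutation test on binary accessibility vectors (one entry per cell; all names here are hypothetical):

```python
import random

def link_p_value(enhancer, promoter, n_perm=1000, seed=0):
    """One-sided permutation p-value for co-accessibility of an enhancer
    and a promoter across cells (1 = accessible in that cell, 0 = not).
    Shuffling the promoter vector breaks any cell-level association while
    preserving its overall accessibility rate."""
    rng = random.Random(seed)
    observed = sum(e & p for e, p in zip(enhancer, promoter))
    shuffled = list(promoter)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if sum(e & p for e, p in zip(enhancer, shuffled)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction avoids p = 0

# A perfectly co-accessible pair across 20 cells yields a very small p-value.
linked = link_p_value([1] * 10 + [0] * 10, [1] * 10 + [0] * 10)
```

Enhlink's ensemble strategy additionally conditions on technical and biological covariates, which this toy test ignores; the sketch only conveys why cell-level resolution gives the linkage test its power.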
ABSTRACT
BACKGROUND: The increasing prevalence of overweight and obesity needs effective approaches for weight loss in primary care and community settings. We compared weight loss with standard treatment in primary care with that achieved after referral by the primary care team to a commercial provider in the community. METHODS: In this parallel group, non-blinded, randomised controlled trial, 772 overweight and obese adults were recruited by primary care practices in Australia, Germany, and the UK. Participants were randomly assigned with a computer-generated simple randomisation sequence to receive either 12 months of standard care as defined by national treatment guidelines, or 12 months of free membership to a commercial programme (Weight Watchers), and followed up for 12 months. The primary outcome was weight change over 12 months. Analysis was by intention to treat (last observation carried forward [LOCF] and baseline observation carried forward [BOCF]) and in the population who completed the 12-month assessment. This trial is registered, number ISRCTN85485463. FINDINGS: 377 participants were assigned to the commercial programme, of whom 230 (61%) completed the 12-month assessment; and 395 were assigned to standard care, of whom 214 (54%) completed the 12-month assessment. In all analyses, participants in the commercial programme group lost twice as much weight as did those in the standard care group. Mean weight change at 12 months was -5·06 kg (SE 0·31) for those in the commercial programme versus -2·25 kg (0·21) for those receiving standard care (adjusted difference -2·77 kg, 95% CI -3·50 to -2·03) with LOCF; -4·06 kg (0·31) versus -1·77 kg (0·19; adjusted difference -2·29 kg, -2·99 to -1·58) with BOCF; and -6·65 kg (0·43) versus -3·26 kg (0·33; adjusted difference -3·16 kg, -4·23 to -2·11) for those who completed the 12-month assessment. Participants reported no adverse events related to trial participation. 
INTERPRETATION: Referral by a primary health-care professional to a commercial weight loss programme that provides regular weighing, advice about diet and physical activity, motivation, and group support can offer a clinically useful early intervention for weight management in overweight and obese people that can be delivered at large scale. FUNDING: Weight Watchers International, through a grant to the UK Medical Research Council.
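The trial's two intention-to-treat imputations differ only in what replaces a missing follow-up weight: LOCF carries the last observed value forward, while BOCF conservatively resets every missing visit to baseline (i.e. assumes no change). A hypothetical one-participant sketch:

```python
def impute_series(weights, method="LOCF"):
    """Fill missing follow-up weights (None) for one participant.
    weights[0] is baseline; LOCF carries the last observed value
    forward, BOCF falls back to baseline for every missing visit."""
    filled, last = [], weights[0]
    for w in weights:
        if w is not None:
            last = w
        if method == "LOCF":
            filled.append(last)
        else:  # BOCF
            filled.append(w if w is not None else weights[0])
    return filled

# participant who dropped out after the 6-month visit
# visits: baseline, 6 months, 12 months (kg)
visits = [82.0, 78.5, None]
locf = impute_series(visits, "LOCF")  # 12-month value carried from 78.5
bocf = impute_series(visits, "BOCF")  # 12-month value reset to baseline
change_locf = locf[-1] - visits[0]    # -3.5 kg counted as weight lost
change_bocf = bocf[-1] - visits[0]    #  0.0 kg, the more conservative estimate
```

This is why the BOCF estimates in the findings (-4·06 vs -1·77 kg) are smaller in magnitude than the LOCF estimates: dropouts contribute zero change under BOCF.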
Subject(s)
Commerce , Obesity/therapy , Overweight/therapy , Referral and Consultation , Weight Loss , Adiposity , Blood Glucose/analysis , Blood Pressure , Body Weight , Female , Humans , Insulin/blood , Lipids/blood , Male , Middle Aged , Primary Health Care , Waist Circumference
ABSTRACT
In the UK contemporary estimates of dietary Fe intakes rely upon food Fe content data from the 1980s or before. Moreover, there has been speculation that the natural Fe content of foods has fallen over time, predominantly due to changes in agricultural practices. Therefore, we re-analysed common plant-based foods of the UK diet for their Fe content (the '2000s analyses') and compared the values with the most recent published values (the '1980s analyses') and the much older published values (the '1930s analyses'), the latter two being from different editions of the McCance and Widdowson food tables. Overall, there was remarkable consistency between analytical data for foods spanning the 70 years. There was a marginal, but significant, apparent decrease in natural food Fe content from the 1930s to 1980s/2000s. Whether this represents a true difference or is analytical error between the eras is unclear and how it could translate into differences in intake requires clarification. However, fortificant Fe levels (and fortificant Fe intake based upon linked national data) did appear to have increased between the 1980s and 2000s, and deserve further attention in light of recent potential concerns over the long-term safety and effectiveness of fortificant Fe. In conclusion, the overall Fe content of plant-based foods is largely consistent between the 1930s and 2000s, with a fall in natural dietary Fe content negated or even surpassed by a rise in fortificant Fe but for which the long-term effects are uncertain.
Subject(s)
Iron/analysis , Plants, Edible/chemistry , Adult , Agriculture/methods , Diet , Edible Grain/chemistry , Food, Fortified/analysis , Humans , Iron/administration & dosage , Middle Aged , United Kingdom
ABSTRACT
High saturated fat intake is an established risk factor for several chronic diseases. The objective of the present study is to report dietary intakes and main food sources of fat and fatty acids (FA) from the first year of the National Diet and Nutrition Survey (NDNS) rolling programme in the UK. Dietary data were collected using 4 d estimated food diaries (n 896) and compared with dietary reference values (DRV) and previous NDNS results. Total fat provided 34-36 % food energy (FE) across all age groups, which was similar to previous surveys for adults. Men (19-64 years) and older girls (11-18 years) had mean intakes just above the DRV, while all other groups had mean total fat intakes of < 35 % FE. SFA intakes were lower compared with previous surveys, ranging from 13 to 15 % FE, but still above the DRV. Mean MUFA intakes were 12·5 % FE for adults and children aged 4-18 years and all were below the DRV. Mean n-3 PUFA intake represented 0·7-1·1 % FE. Compared with previous survey data, the direction of change for n-3 PUFA was upwards for all age groups, although the differences in absolute terms were very small. Trans-FA intakes were lower than in previous NDNS and were less than 2 g/d for all age groups, representing 0·8 % FE and lower than the DRV in all age groups. In conclusion, dietary intake of fat and FA is moving towards recommended levels for the UK population. However, there remains room for considerable further improvement.
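The survey's comparisons against DRVs rest on simple arithmetic: the percentage of food energy (%FE) from fat is fat energy (9 kcal/g Atwater factor) divided by total food energy. A minimal sketch with hypothetical diary-day values:

```python
ATWATER_FAT_KCAL_PER_G = 9.0  # energy yield of fat

def percent_food_energy_from_fat(fat_g, food_energy_kcal):
    """Share of food energy supplied by fat, in % FE."""
    return 100.0 * fat_g * ATWATER_FAT_KCAL_PER_G / food_energy_kcal

# e.g. a hypothetical diary day of 2000 kcal with 78 g total fat
pct = percent_food_energy_from_fat(78, 2000)  # 35.1 % FE, just above the 35 % FE DRV
```

The same calculation applies per fatty-acid class (SFA, MUFA, trans-FA), substituting that class's grams for total fat.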
Subject(s)
Dietary Fats/administration & dosage , Fatty Acids/administration & dosage , Adolescent , Adult , Age Factors , Aged , Child , Cross-Sectional Studies , Diet Records , Energy Intake , Fatty Acids, Monounsaturated/administration & dosage , Fatty Acids, Omega-3/administration & dosage , Female , Health Promotion , Health Surveys/methods , Humans , Infant , Male , Nutrition Policy , Trans Fatty Acids/administration & dosage , United Kingdom
ABSTRACT
Tissue engineering has been at the forefront of medical research for more than 20 years. One of the most promising applications of tissue engineering is its use in cartilage repair. A successful cartilage repair model relies on the intricate interplay between cells, scaffolds, and the environment. Dedication to this field has resulted in a wide variety of materials available for use as scaffolds, each with advantages and disadvantages. In this article, we explore these materials and provide concise descriptions of the major scaffold subtypes. Included in this review are synthetic scaffolds, hydrogels, nanofibers, biologic scaffolds derived from fibrin, collagen, hyaluronic acid, alginate, and PRP, as well as intact extracellular matrix (ECM) scaffolds. Scaffolds represent a cornerstone of the tissue-engineering model for cartilage repair. This promising technology offers new hope in the continuing effort to repair cartilage defects and improve the quality of life for patients worldwide.
Subject(s)
Cartilage, Articular , Tissue Engineering/methods , Tissue Scaffolds , Cartilage, Articular/injuries , Cartilage, Articular/surgery , Humans , Tissue Scaffolds/chemistry
ABSTRACT
Cocaine use and overdose deaths attributed to cocaine have increased significantly in the United States in the last 10 years. Despite the prevalence of cocaine use disorder (CUD) and the personal and societal problems it presents, there are currently no approved pharmaceutical treatments. The absence of treatment options is due, in part, to our lack of knowledge about the etiology of CUDs. There is ample evidence that genetics plays a role in increasing CUD risk, but thus far very few risk genes have been identified in human studies. Genetic studies in mice have been extremely useful for identifying genetic loci and genes, but have been limited to very few genetic backgrounds, leaving substantial phenotypic and genetic diversity unexplored. Herein we report the measurement of cocaine-induced behavioral sensitization using a 19-day protocol that captures baseline locomotor activity, initial locomotor response to an acute exposure to cocaine, and locomotor sensitization across 5 exposures to the drug. These behaviors were measured in 51 genetically diverse Collaborative Cross (CC) strains along with their inbred founder strains. The CC was generated by crossing eight genetically diverse inbred strains such that each inbred CC strain has genetic contributions from each of the founder strains. Inbred CC mice are infinitely reproducible and provide a stable, yet diverse genetic platform on which to study the genetic architecture and genetic correlations among phenotypes. We have identified significant differences in cocaine locomotor sensitivity and behavioral sensitization across the panel of CC strains and their founders. We have established relationships among cocaine sensitization behaviors and identified extreme responding strains that can be used in future studies aimed at understanding the genetic, biological, and pharmacological mechanisms that drive addiction-related behaviors.
Finally, we have determined that these behaviors exhibit relatively robust heritability making them amenable to future genetic mapping studies to identify addiction risk genes and genetic pathways that can be studied as potential targets for the development of novel therapeutics.
ABSTRACT
The National Diet and Nutrition Survey (NDNS) is a cross-sectional survey designed to gather data representative of the UK population on food consumption, nutrient intakes and nutritional status. The objectives of the present paper were to identify and describe food consumption and nutrient intakes in the UK from the first year of the NDNS rolling programme (2008-09) and compare these with the 2000-01 NDNS of adults aged 19-64 years and the 1997 NDNS of young people aged 4-18 years. Differences in median daily food consumption and nutrient intakes between the surveys were compared by sex and age group (4-10 years, 11-18 years and 19-64 years). There were no changes in energy, total fat or carbohydrate intakes between the surveys. Children aged 4-10 years had significantly lower consumption of soft drinks (not low calorie), crisps and savoury snacks and chocolate confectionery in 2008-09 than in 1997 (all P < 0·0001). The percentage contribution of non-milk extrinsic sugars to food energy was also significantly lower than in 1997 in children aged 4-10 years (P < 0·0001), contributing 13·7-14·6 % in 2008-09 compared with 16·8 % in 1997. These changes were not as marked in older children and there were no changes in these foods and nutrients in adults. There was still a substantial proportion (46 %) of girls aged 11-18 years and women aged 19-64 years (21 %) with mean daily Fe intakes below the lower reference nutrient intake. Since previous surveys there have been some positive changes in intakes especially in younger children. However, further attention is required in other groups, in particular adolescent girls.
Subject(s)
Diet Surveys , Nutrition Surveys , Adolescent , Adult , Child , Child, Preschool , Cross-Sectional Studies , Eating , Female , Humans , Male , Middle Aged , Nutritional Status , United Kingdom , Young Adult
ABSTRACT
BACKGROUND: The scale of overweight and obesity in the UK places a considerable burden on the NHS. In some areas the NHS has formed partnerships with commercial companies to offer weight management services, but there has been little evaluation of these schemes. This study is an independent audit of the Weight Watchers NHS Referral scheme and evaluates the weight change of obese and overweight adults referred to Weight Watchers (WW) by the NHS. METHOD: Data were obtained from the WW NHS Referral Scheme database for 29,326 referral courses started after 2nd April 2007 and ending before 6th October 2009 [90% female; median age 49 years (IQR 38-61 years); median BMI 35.1 kg/m2 (IQR 31.8-39.5 kg/m2)]. Participants received vouchers (funded by the PCT following referral by a healthcare professional) to attend 12 WW meetings. Body weight was measured at WW meetings and relayed to the central database. RESULTS: Median weight change for all referrals was -2.8 kg [IQR -5.9 to -0.7 kg], representing -3.1% of initial weight. 33% of all courses resulted in loss of ≥5% initial weight. 54% of courses were completed. Median weight change for those completing a first course was -5.4 kg [IQR -7.8 to -3.1 kg], or -5.6% of initial weight; 57% lost ≥5% initial weight. CONCLUSIONS: A third of all patients who were referred to WW through the WW NHS Referral Scheme and started a 12 session course achieved ≥5% weight loss, which is usually associated with clinical benefits. This is the largest audit of NHS referral to a commercial weight loss programme in the UK and results are comparable with other options for weight loss available through primary care.