ABSTRACT
People with psychosis exhibit thalamo-cortical hyperconnectivity and cortico-cortical hypoconnectivity with sensory networks; however, it remains unclear whether this applies to all sensory networks, whether it arises from other illness factors, or whether such differences could form the basis of a viable biomarker. To address these questions, we harnessed data from the Human Connectome Early Psychosis Project and computed resting-state functional connectivity (RSFC) matrices for 54 healthy controls and 105 psychosis patients. Primary visual, secondary visual ("visual2"), auditory, and somatomotor networks were defined via a recent brain network partition. RSFC was determined for 718 regions via regularized partial correlation. Psychosis patients, both affective and non-affective, exhibited cortico-cortical hypoconnectivity and thalamo-cortical hyperconnectivity in somatomotor and visual2 networks but not in auditory or primary visual networks. When we averaged and normalized the visual2 and somatomotor network connections and subtracted the thalamo-cortical and cortico-cortical connectivity values, a robust psychosis biomarker emerged (p = 2e-10, Hedges' g = 1.05). This "somato-visual" biomarker was present in antipsychotic-naive patients and did not depend on confounds such as psychiatric comorbidities, substance/nicotine use, stress, anxiety, or demographics. It had moderate test-retest reliability (ICC = 0.62) and could be recovered in five-minute scans. The marker discriminated groups in leave-one-site-out cross-validation (AUC = 0.79) and improved group classification when added to a well-known neurocognition task. Finally, it differentiated later-stage psychosis patients from healthy or ADHD controls in two independent data sets. These results introduce a simple and robust RSFC biomarker that can distinguish psychosis patients from controls by the early illness stages.
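For readers who want the arithmetic made explicit, the sketch below shows one plausible way such a summary score could be computed from a regions-by-regions RSFC matrix. The network labels, the normalization note, the function name, and the toy data are illustrative assumptions, not the published pipeline.

```python
import numpy as np

def somato_visual_score(rsfc, labels, thalamus_idx):
    """Illustrative summary score from a (regions x regions) RSFC matrix.

    rsfc         : symmetric partial-correlation matrix (e.g., 718 x 718)
    labels       : network name per region (e.g., "Visual2", "Somatomotor", ...)
    thalamus_idx : indices of thalamic regions
    Assumed steps: average thalamo-cortical and cortico-cortical connectivity
    within the visual2 + somatomotor networks, then take their difference.
    In a group analysis each average would typically also be normalized
    (e.g., z-scored across participants) before the subtraction.
    """
    thalamus_idx = np.asarray(thalamus_idx)
    is_sens = np.isin(labels, ["Visual2", "Somatomotor"])
    is_thal = np.isin(np.arange(len(labels)), thalamus_idx)
    sens_ctx = np.where(is_sens & ~is_thal)[0]

    # mean thalamo-cortical connectivity with visual2/somatomotor cortex
    thal_ctx = rsfc[np.ix_(thalamus_idx, sens_ctx)].mean()

    # mean cortico-cortical connectivity among visual2/somatomotor regions
    block = rsfc[np.ix_(sens_ctx, sens_ctx)]
    ctx_ctx = block[np.triu_indices_from(block, k=1)].mean()

    return thal_ctx - ctx_ctx  # hyper- minus hypo-connectivity component

# toy usage with a small random symmetric matrix
rng = np.random.default_rng(0)
m = rng.normal(0, 0.1, (20, 20)); m = (m + m.T) / 2
labs = np.array(["Visual2"] * 6 + ["Somatomotor"] * 6 + ["Default"] * 8)
print(somato_visual_score(m, labs, thalamus_idx=[18, 19]))
```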
ABSTRACT
BACKGROUND: There are an estimated 1.5 million children living with human immunodeficiency virus (CLHIV), most residing in sub-Saharan Africa. A common hospital presentation of CLHIV is new-onset seizure, for which imaging is helpful but not routinely performed due to scarce resources. We present imaging findings and their association with clinical risk factors and outcomes in a cohort of Zambian CLHIV presenting with new-onset seizure. METHODS: In this prospective cohort study, participants were recruited at the University Teaching Hospital in Lusaka, Zambia. Various clinical and demographic characteristics were obtained. Computed tomography (CT), magnetic resonance imaging (MRI), or both were obtained during admission or shortly after discharge. If both studies were available, MRI data were used. Two neuroradiologists interpreted images using REDCap-based NeuroInterp, a tool that quantifies brain imaging findings. Age-dependent neuropsychologic assessments were administered. RESULTS: Nineteen of 39 (49%) children had a brain MRI, 16 of 39 (41%) had CT, and four of 39 (10%) had both. Mean age was 6.8 years (S.D. = 4.8). Children with advanced HIV disease had higher odds of atrophy (odds ratio [OR] 7.2, 95% confidence interval [CI] 1.1 to 48.3). Focal abnormalities were less likely in children receiving antiretroviral therapy (ART) (OR 0.22, 95% CI 0.05 to 1.0). Children with neurocognitive impairment were more likely to have atrophy (OR 8.4, 95% CI 1.3 to 55.4) and less likely to have focal abnormalities (OR 0.2, 95% CI 0.03 to 0.9). CONCLUSIONS: Focal brain abnormalities on MRI were less likely in CLHIV on ART. Brain atrophy was the most common imaging abnormality and was linked to severe neurocognitive impairment.
Subject(s)
HIV Infections, Magnetic Resonance Imaging, Seizures, X-Ray Computed Tomography, Humans, Zambia/epidemiology, Male, HIV Infections/diagnostic imaging, HIV Infections/complications, Female, Child, Seizures/diagnostic imaging, Seizures/etiology, Preschool Child, Prospective Studies, Brain/diagnostic imaging, Brain/pathology, Adolescent, Neuroimaging
ABSTRACT
BACKGROUND/OBJECTIVES: Stroke damage to the primary visual cortex induces large, homonymous visual field defects that impair daily living. Here, we asked whether vision-related quality of life (VR-QoL) is affected by time since stroke. SUBJECTS/METHODS: We conducted a retrospective meta-analysis of 95 occipital stroke patients (female/male = 26/69, 27-78 years old, 0.5-373.5 months poststroke) in whom VR-QoL was estimated using the National Eye Institute Visual Functioning Questionnaire (NEI-VFQ) and its 10-item neuro-ophthalmic supplement (Neuro10). Visual deficit severity was represented by the perimetric mean deviation (PMD) calculated from 24-2 Humphrey visual fields. Data were compared with published cohorts of visually intact controls. The relationship between VR-QoL and time poststroke was assessed across participants, adjusting for deficit severity and age with a multiple linear regression analysis. RESULTS: Occipital stroke patients had significantly lower NEI-VFQ and Neuro10 composite scores than controls. All subscale scores describing specific aspects of visual ability and functioning were impaired except for ocular pain and general health, which did not differ significantly from controls. Surprisingly, visual deficit severity was not correlated with either composite score, both of which increased with time poststroke, even when adjusting for PMD and age. CONCLUSIONS: VR-QoL appears to improve with time after occipital stroke, irrespective of visual deficit size or patient age at insult. This may reflect the natural development of compensatory strategies and lifestyle adjustments. Thus, future studies examining the impact of rehabilitation on daily living in this patient population should consider the possibility that VR-QoL may change gradually over time, even without therapeutic intervention.
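The adjusted analysis described in the Methods can be written compactly; the sketch below is a minimal illustration on synthetic data with hypothetical column names (vfq_composite, months_poststroke, pmd_db, age), not the study dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 95  # cohort size in the meta-analysis
df = pd.DataFrame({
    "months_poststroke": rng.uniform(0.5, 373.5, n),
    "pmd_db": rng.uniform(-25, -2, n),      # perimetric mean deviation (dB)
    "age": rng.integers(27, 79, n),
})
# synthetic composite score loosely increasing with time post-stroke
df["vfq_composite"] = 60 + 0.03 * df["months_poststroke"] + rng.normal(0, 8, n)

# VR-QoL as a function of time post-stroke, adjusting for deficit severity and age
fit = smf.ols("vfq_composite ~ months_poststroke + pmd_db + age", data=df).fit()
print(fit.params)
print(fit.pvalues)
```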
Subject(s)
Quality of Life, Stroke, Humans, Female, Middle Aged, Male, Stroke/physiopathology, Stroke/complications, Aged, Adult, Retrospective Studies, Vision Disorders/physiopathology, Vision Disorders/etiology, Occipital Lobe/physiopathology, Visual Fields/physiology
ABSTRACT
Purpose: Damage to the adult primary visual cortex (V1) causes vision loss in the contralateral hemifield, initiating a process of trans-synaptic retrograde degeneration (TRD). Here, we examined retinal correlates of TRD using a new metric to account for global changes in inner retinal thickness and asked if perceptual training in the intact or blind field impacts its progression. Methods: We performed a meta-analysis of optical coherence tomography (OCT) data in 48 participants with unilateral V1 stroke and homonymous visual defects who completed clinical trial NCT03350919. After measuring the thickness of the macular ganglion cell and inner plexiform layer (GCL-IPL) and the peripapillary retinal nerve fiber layer (RNFL), we computed individual laterality indices (LI) at baseline and after ~6 months of daily motion discrimination training in the intact or blind field. Increasingly positive LI denoted greater layer thinning in retinal regions affected versus unaffected by the cortical damage. Results: Pretraining, the affected GCL-IPL and RNFL were thinner than their unaffected counterparts, generating LI values positively correlated with time since stroke. Participants trained in their intact field exhibited increased LI(GCL-IPL). Those trained in their blind field had no significant change in LI(GCL-IPL). LI(RNFL) did not change in either group. Conclusions: Relative shrinkage of the affected versus unaffected macular GCL-IPL can be reliably measured at an individual level and increases with time post-V1 stroke. Relative thinning progressed during intact-field training but appeared to be halted by training within the blind field, suggesting a potentially neuroprotective effect of this simple behavioral intervention.
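The abstract does not give the exact formula for the laterality index, so the following is only an assumed normalized-difference form that matches the stated interpretation (more positive = relatively greater thinning on the affected side); the study's actual definition may differ.

```python
import numpy as np

def laterality_index(thickness_affected, thickness_unaffected):
    """Assumed LI form: normalized difference between mean layer thickness in
    retinal regions unaffected vs affected by the cortical lesion. Positive
    values indicate relatively greater thinning on the affected side."""
    ta = np.mean(thickness_affected)
    tu = np.mean(thickness_unaffected)
    return (tu - ta) / (tu + ta)

# Example: GCL-IPL thickness (micrometers) in affected vs unaffected macular sectors
print(laterality_index([62.0, 60.5, 58.8], [71.2, 69.9, 70.4]))
```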
Subject(s)
Retina, Stroke, Adult, Humans, Functional Laterality, Neurons, Optical Coherence Tomography, Clinical Trials as Topic
ABSTRACT
OBJECTIVE: To determine the long-term outcomes, including mortality and recurrent seizures, among children living with HIV (CLWH) who present with new-onset seizure. METHODS: Zambian CLWH with new-onset seizure were enrolled prospectively to determine the risk of and risk factors for recurrent seizures. Demographic data, clinical profiles, index seizure etiology, and 30-day mortality outcomes were previously reported. After discharge, children were followed quarterly to identify recurrent seizures and death. Given the high risk of early death, risk factors for recurrent seizure were evaluated using a model that adjusted for mortality. RESULTS: Among 73 children enrolled, 28 died (38%), 22 within 30 days of the index seizure. Median follow-up was 533 days (IQR 18-957), with 5% (4/73) lost to follow-up. Seizure recurrence was 19% among the entire cohort. Among children surviving at least 30 days after the index seizure, 27% had a recurrent seizure. Median time from index seizure to recurrent seizure was 161 days (IQR 86-269). Central nervous system opportunistic infection (CNS OI) as the cause of the index seizure was protective against recurrent seizures, and higher functional status was a risk factor for seizure recurrence. SIGNIFICANCE: Among CLWH presenting with new-onset seizure, mortality risks remain elevated beyond the acute illness period. Recurrent seizures are common and are more likely in children with a higher level of functioning, even after adjusting for the outcome of death. Newer antiseizure medications appropriate for co-administration with antiretroviral therapies are needed for the care of these children. CNS OI may represent a potentially reversible provocation for the index seizure, whereas seizures in high-functioning CLWH without a CNS OI may result from a prior brain injury or a susceptibility to seizures unrelated to HIV and thus represent an ongoing predisposition to seizures. PLAIN LANGUAGE SUMMARY: This study followed CLWH who experienced a new-onset seizure to find out how many go on to have more seizures and to identify patient characteristics associated with having more seizures. The study found that mortality rates continue to be high beyond the acute clinical presentation with new-onset seizure. Children with a CNS OI causing the new-onset seizure had a lower risk of later seizures, possibly because the trigger for the seizure can be treated. In contrast, high-functioning children without a CNS OI were at higher risk of future seizures.
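The abstract names a mortality-adjusted model without specifying it. One common way to adjust a recurrence analysis for the competing risk of death is a cause-specific hazards model in which deaths are censored at the time of death; the sketch below illustrates that approach with hypothetical data and variable names, and the published analysis may well have used a different model.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical follow-up data: time to recurrent seizure, with death as a competing event.
df = pd.DataFrame({
    "days":       [161, 30, 520, 86, 269, 12, 900, 400],
    "recurrence": [1,   0,  0,   1,  1,   0,  0,   1],   # 1 = recurrent seizure observed
    "death":      [0,   1,  0,   0,  0,   1,  0,   0],   # competing event (censors recurrence)
    "cns_oi":     [0,   1,  1,   0,  1,   1,  0,   0],   # CNS opportunistic infection at index
    "functional": [90,  40, 88,  85, 50,  35, 60,  95],  # functional status score
})

# Cause-specific hazard for recurrence: rows ending in death count as censored here.
cph = CoxPHFitter()
cph.fit(df[["days", "recurrence", "cns_oi", "functional"]],
        duration_col="days", event_col="recurrence")
cph.print_summary()
```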
Subject(s)
Generalized Epilepsy, HIV Infections, Child, Humans, Anticonvulsants/therapeutic use, Cohort Studies, Seizures/drug therapy, Generalized Epilepsy/drug therapy, HIV Infections/complications, HIV Infections/drug therapy, Chronic Brain Damage/chemically induced, Chronic Brain Damage/complications, Chronic Brain Damage/drug therapy
ABSTRACT
BACKGROUND: Seizures are relatively common among children with HIV in low- and middle-income countries and are associated with significant morbidity and mortality. Early treatment with antiretroviral therapy (ART) may reduce this risk by decreasing rates of central nervous system infections and HIV encephalopathy. METHODS: We conducted a prospective, unmatched case-control study. We enrolled children with new-onset seizure from the University Teaching Hospital in Lusaka, Zambia and 2 regional hospitals in rural Zambia. Controls were children with HIV and no history of seizures. Recruitment took place from 2016 to 2019. Early treatment was defined as initiation of ART before 12 months of age; at a CD4 percentage >15% in children aged 12-60 months; or at a CD4 count >350 cells/mm³ in children aged 60 months or older. Logistic regression models were used to evaluate the association between potential risk factors and seizures. RESULTS: We identified 73 children with new-onset seizure and compared them with 254 control children with HIV but no seizures. Early treatment with ART was associated with a significant reduction in the odds of seizures [odds ratio (OR) 0.04, 95% confidence interval: 0.02 to 0.09; P < 0.001]. Having an undetectable viral load at the time of enrollment was strongly protective against seizures (OR 0.03, P < 0.001), whereas a history of World Health Organization Stage 4 disease (OR 2.2, P = 0.05) or a CD4 count <200 cells/mm³ (OR 3.6, P < 0.001) increased the risk of seizures. CONCLUSIONS: Early initiation of ART and successful viral suppression would likely reduce much of the excess seizure burden in children with HIV.
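As a concrete reading of the Methods, the sketch below encodes the stated early-treatment definition and fits an illustrative logistic model on synthetic data; the variable names and the simulated case-control table are assumptions, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def early_art(start_age_months, cd4_percent_at_start, cd4_count_at_start):
    """Early treatment per the abstract's definition (illustrative variable names):
    ART initiated before 12 months of age; or with CD4% > 15 when initiated at
    12-60 months; or with CD4 count > 350 cells/mm3 when initiated at >= 60 months."""
    if start_age_months < 12:
        return True
    if start_age_months < 60:
        return cd4_percent_at_start > 15
    return cd4_count_at_start > 350

print(early_art(8, 30, 800))   # True: ART started before 12 months of age

# Hypothetical case-control data: 1 = new-onset seizure (case), 0 = control
rng = np.random.default_rng(1)
n = 327
df = pd.DataFrame({
    "seizure": rng.integers(0, 2, n),
    "early_art": rng.integers(0, 2, n),
    "undetectable_vl": rng.integers(0, 2, n),
    "cd4_below_200": rng.integers(0, 2, n),
})

# Logistic regression of seizure status on treatment and disease-stage covariates
fit = smf.logit("seizure ~ early_art + undetectable_vl + cd4_below_200", data=df).fit()
print(np.exp(fit.params))  # odds ratios
```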
Subject(s)
Anti-HIV Agents, HIV Infections, Child, Humans, Infant, HIV Infections/complications, HIV Infections/drug therapy, Zambia/epidemiology, Case-Control Studies, Risk Factors, Seizures/drug therapy, Seizures/prevention & control, Seizures/complications, CD4 Lymphocyte Count, Anti-HIV Agents/therapeutic use
ABSTRACT
Streamflow-duration assessment methods (SDAMs) are rapid, indicator-based tools for classifying streamflow duration (e.g., intermittent vs perennial flow) at the reach scale. Indicators are easily assessed stream properties used as surrogates of flow duration, which is too resource-intensive to measure directly for many reaches. Invertebrates are commonly used as SDAM indicators because many are not highly mobile, and different species have life stages that require flow for different durations and times of the year. The objectives of this study were to 1) identify invertebrate taxa that can be used as SDAM indicators to distinguish between stream reaches having intermittent and perennial flow, 2) compare indicator strength across different taxonomic and numeric resolutions, and 3) assess the relative importance of season and habitat type on the ability of invertebrates to predict streamflow-duration class. We used 2 methods, random forest models and indicator species analysis, to analyze aquatic and terrestrial invertebrate data (presence/absence, density, and biomass) at the family and genus levels from 370 samples collected from both erosional and depositional habitats during both wet and dry seasons. In total, 36 intermittent and 53 perennial reaches were sampled along 31 forested headwater streams in 4 level II ecoregions across the United States. Random forest models for family- and genus-level datasets had stream classification accuracy ranging from 88.9 to 93.2%, with slightly higher accuracy for density than for presence/absence and biomass datasets. Season (wet/dry) tended to be a stronger predictor of streamflow-duration class than habitat (erosional/depositional). Many taxa at the family (58.8%) and genus level (61.6%) were collected from both intermittent and perennial reaches, and most taxa that were exclusive to 1 streamflow-duration class were rarely collected. However, 23 family-level or higher taxa (20 aquatic and 3 terrestrial) and 44 aquatic genera were identified as potential indicators of streamflow-duration class for forested headwater streams. The utility of the potential indicators varied across level II ecoregions in part because of representation of intermittent and perennial reaches in the dataset but also because of variable ecological responses to drying among species. Aquatic invertebrates have been an important field indicator of perennial reaches in existing SDAMs, but our findings highlight how including aquatic and terrestrial invertebrates as indicators of intermittent reaches can further maximize the data collected for streamflow-duration classifications.
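To make the modeling step concrete, here is a minimal random forest classification of streamflow-duration class from a reach-by-taxa density matrix, in the spirit of (but not reproducing) the models described above; the data are synthetic and the taxa columns are unnamed placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical invertebrate density matrix: rows = sampled reaches, columns = taxa.
rng = np.random.default_rng(7)
n_reaches, n_taxa = 89, 40          # 36 intermittent + 53 perennial reaches in the study
X = rng.poisson(3, size=(n_reaches, n_taxa)).astype(float)
y = np.array([0] * 36 + [1] * 53)   # 0 = intermittent, 1 = perennial (synthetic labels)

# Random forest classification of streamflow-duration class from taxa densities
rf = RandomForestClassifier(n_estimators=500, random_state=0)
print("CV accuracy:", cross_val_score(rf, X, y, cv=5).mean())

# Taxa with the highest importance scores are candidate flow-duration indicators
rf.fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("Top candidate indicator taxa (column indices):", top)
```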
ABSTRACT
BACKGROUND: Home Blood Pressure Monitoring (HBPM) that includes a team with a clinical pharmacist is an evidence-based intervention that improves blood pressure (BP). Yet, strategies for promoting its adoption in primary care are lacking. We developed potentially feasible and sustainable implementation strategies to improve hypertension control and BP equity. METHODS: We assessed barriers and facilitators to HBPM and iteratively adapted implementation strategies through key informant interviews and guidance from a multistakeholder team involving investigators, clinicians, and practice administration. RESULTS: Strategies include: 1) proactive outreach to patients; 2) provision of BP devices; 3) deployment of automated bidirectional texting to support patients with educational messages and to let them transmit their readings to the clinical team; 4) a hypertension visit note template; 5) monthly audit and feedback reports on progress to the team; and 6) training for patients and teams. We will use a stepped wedge randomized trial to assess RE-AIM outcomes, defined as follows. Reach: the proportion of eligible patients who agree to participate in the BP texting; Effectiveness: the proportion of eligible patients with their last BP reading <140/90 (six months); Adoption: the proportion of patients invited to the BP texting; Implementation: the proportion of patients who text their BP readings on ≥10 days per month; and Maintenance: sustained BP control post-intervention (twelve months). We will also examine RE-AIM metrics stratified by race and ethnicity. CONCLUSIONS: Findings will inform the impact of strategies for the adoption of team-based HBPM and the impact of the intervention on hypertension control and equity. REGISTRATION DETAILS: www.clinicaltrials.gov Identifier: NCT05488795.
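A hedged sketch of how the RE-AIM proportions defined above could be tallied from per-patient records; the column names and denominators are illustrative assumptions, and the trial's analysis plan may operationalize them differently.

```python
import pandas as pd

# Hypothetical per-patient records for the RE-AIM calculations described above.
pts = pd.DataFrame({
    "eligible":          [True, True, True, True, True],
    "invited_texting":   [True, True, True, False, True],
    "enrolled_texting":  [True, False, True, False, True],
    "days_texted_mo":    [14, 0, 8, 0, 22],                 # days with a BP text per month
    "last_bp_lt_140_90": [True, False, True, False, True],  # at 6 months
    "bp_controlled_12mo":[True, False, False, False, True],
})

eligible = pts["eligible"]
reach          = (eligible & pts["enrolled_texting"]).sum() / eligible.sum()
effectiveness  = (eligible & pts["last_bp_lt_140_90"]).sum() / eligible.sum()
adoption       = (eligible & pts["invited_texting"]).sum() / eligible.sum()
implementation = (pts["days_texted_mo"] >= 10).mean()   # texted on >= 10 days per month
maintenance    = (eligible & pts["bp_controlled_12mo"]).sum() / eligible.sum()
print(dict(reach=reach, effectiveness=effectiveness, adoption=adoption,
           implementation=implementation, maintenance=maintenance))
```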
Subject(s)
Ambulatory Blood Pressure Monitoring, Hypertension, Humans, Blood Pressure/physiology, Ambulatory Blood Pressure Monitoring/methods, Hypertension/diagnosis, Hypertension/therapy, Pharmacists, Randomized Controlled Trials as Topic
ABSTRACT
BACKGROUND: Antiretroviral treatment improves the health-related quality of life (HRQoL) of people with human immunodeficiency virus (PWH). However, one-third of those initiating first-line treatment experience virological failure, and the determinants of HRQoL in this key population are unknown. Our study aims to identify determinants of HRQoL among PWH failing antiretroviral treatment in sub-Saharan Africa. METHODS: We analysed data from a cohort of PWH with virological failure (>1,000 copies/mL) on first-line ART in South Africa and Uganda. We measured HRQoL using the EuroQol EQ-5D-3L and used a two-part regression model to obtain by-country analyses for South Africa and Uganda. The first part identifies risk factors associated with the likelihood of participants reporting perfect health (utility = 1) versus non-perfect health (utility < 1). The second part identifies risk factors associated with the EQ-5D-3L utility scores for participants reporting non-perfect health. We performed sensitivity analyses comparing the two-part model results with those from tobit models and ordinary least squares regression. RESULTS: In both countries, males were more likely to report perfect health, and participants with at least one comorbidity were less likely to report perfect health. In South Africa, participants with side effects, and in Uganda those with opportunistic infections, were also less likely to report perfect health. In Uganda, participants with 100% ART adherence were more likely to report perfect health. In South Africa, high HIV viral load, experiencing ART side effects, and the presence of opportunistic infections were each associated with lower HRQoL, whereas participants with 100% ART adherence reported higher HRQoL. In Uganda, participants with lower CD4 counts had lower HRQoL. CONCLUSION: Markers of advanced disease (opportunistic infection, high viral load, low CD4), side effects, comorbidities, and lack of ART adherence negatively impacted HRQoL for PWH experiencing virological failure. TRIAL REGISTRATION: ClinicalTrials.gov: NCT02787499.
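The two-part model described in the Methods separates the probability of a utility of 1 from the level of utility below 1. A minimal sketch with synthetic data and assumed covariate names follows; it is not the study's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "comorbidity": rng.integers(0, 2, n),
    "side_effects": rng.integers(0, 2, n),
    "adherence_100": rng.integers(0, 2, n),
})
# Synthetic EQ-5D-3L utilities with a mass at 1 ("perfect health")
perfect = rng.random(n) < 0.4
df["utility"] = np.where(perfect, 1.0, rng.uniform(0.2, 0.99, n))

# Part 1: logistic model for the probability of reporting perfect health (utility == 1)
df["perfect"] = (df["utility"] == 1.0).astype(int)
part1 = smf.logit("perfect ~ male + comorbidity + side_effects + adherence_100", data=df).fit()

# Part 2: linear model for utility among those reporting non-perfect health (utility < 1)
part2 = smf.ols("utility ~ male + comorbidity + side_effects + adherence_100",
                data=df[df["utility"] < 1.0]).fit()
print(part1.params, part2.params, sep="\n\n")
```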
Subject(s)
HIV Infections, Opportunistic Infections, Male, Humans, HIV, Quality of Life, South Africa/epidemiology, Antiretroviral Agents, HIV Infections/drug therapy, HIV Infections/epidemiology
ABSTRACT
Background: Food insecurity has been linked to suboptimal antiretroviral therapy (ART) adherence in persons with HIV (PWH). This association has not been evaluated using tenofovir diphosphate (TFV-DP) in dried blood spots (DBSs), a biomarker of cumulative ART adherence and exposure. Methods: Within a prospective South African cohort of treatment-naive PWH initiating ART, a subset of participants with measured TFV-DP concentrations in DBSs was assessed for food insecurity status. Bivariate and multivariate median-based regression analyses compared TFV-DP concentrations in DBSs by food insecurity status, adjusting for age, sex, ethnicity, medication possession ratio (MPR), and estimated glomerular filtration rate. Results: Drug concentrations were available for 285 study participants. Overall, 62 (22%) PWH reported worrying about food insecurity and 44 (15%) reported not having enough food to eat in the last month. The crude median concentrations of TFV-DP in DBSs differed significantly between those who expressed food insecurity worry and those who did not (599 [interquartile range {IQR}, 417-783] fmol/punch vs 716 [IQR, 453-957] fmol/punch; P = .032). In adjusted median-based regression, those with food insecurity worry had TFV-DP concentrations that were 155 fmol/punch lower (95% confidence interval, -275 to -35; P = .012) than those who did not report food insecurity worry. Age and MPR remained significantly associated with TFV-DP. Conclusions: In this study, food insecurity worry was associated with lower TFV-DP concentrations in South African PWH. This highlights the role of food insecurity as a social determinant of HIV outcomes, including ART failure and resistance.
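Median-based (quantile) regression can be run directly in standard statistical software; the sketch below illustrates the adjusted comparison described above on synthetic data, with assumed variable names (ethnicity is omitted for brevity).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 285  # participants with drug concentrations in the analysis
df = pd.DataFrame({
    "food_insecure_worry": rng.integers(0, 2, n),
    "age": rng.integers(18, 65, n),
    "male": rng.integers(0, 2, n),
    "mpr": rng.uniform(0.5, 1.0, n),    # medication possession ratio
    "egfr": rng.uniform(60, 120, n),
})
df["tfv_dp"] = (700 - 150 * df["food_insecure_worry"]
                + 2 * df["age"] + rng.normal(0, 180, n))  # fmol/punch, synthetic

# Median (quantile) regression of TFV-DP on food insecurity worry, adjusting for covariates
fit = smf.quantreg("tfv_dp ~ food_insecure_worry + age + male + mpr + egfr", df).fit(q=0.5)
print(fit.params)
```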
ABSTRACT
BACKGROUND: The use of a left ventricular assist device (LVAD) in patients with advanced heart failure refractory to optimal medical management has progressed steadily over the past two decades. Data have demonstrated reduced LVAD efficacy, worse clinical outcomes, and higher mortality for patients who experience significant ventricular tachyarrhythmia (VTA). We hypothesize that a novel prophylactic intra-operative VTA ablation protocol at the time of LVAD implantation may reduce recurrent VTA and adverse events post-implant. METHODS: We designed a prospective, multicenter, open-label, randomized controlled clinical trial enrolling 100 patients who are LVAD candidates with a history of VTA in the previous 5 years. Enrolled patients will be randomized in a 1:1 fashion to intra-operative VTA ablation (n = 50) versus conventional medical management (n = 50) with LVAD implantation. Arrhythmia outcome data will be captured by an implantable cardioverter defibrillator (ICD) to monitor VTA events, with a uniform ICD programming protocol. Patients will be followed prospectively over a mean of 18 months (with a minimum of 9 months) after LVAD implantation to evaluate recurrent VTA, adverse events, and procedural outcomes. Secondary endpoints include right heart function/hemodynamics, healthcare utilization, and quality of life. CONCLUSION: The primary aim of this first-ever randomized trial is to assess the efficacy of intra-operative ablation during LVAD surgery in reducing VTA recurrence and improving clinical outcomes for patients with a history of VTA.
Subject(s)
Implantable Defibrillators, Heart Failure, Heart-Assist Devices, Ventricular Tachycardia, Humans, Heart-Assist Devices/adverse effects, Prospective Studies, Quality of Life, Risk Factors, Electrocardiography, Cardiac Arrhythmias, Ventricular Tachycardia/etiology, Treatment Outcome
ABSTRACT
Repairable adhesive elastomers are emerging materials employed in compelling applications such as soft robotics, biosensing, tissue regeneration, and wearable electronics. Facilitating adhesion requires strong interactions, while self-healing requires bond dynamicity. This contrast in desired bond characteristics presents a challenge in the design of healable adhesive elastomers. Furthermore, 3D printability of this novel class of materials has received limited attention, restricting the potential design space of as-built geometries. Here, we report a series of 3D-printable elastomeric materials with self-healing ability and adhesive properties. Repairability is obtained using thiol-Michael dynamic crosslinkers incorporated into the polymer backbone, while adhesion is facilitated with acrylate monomers. Elastomeric materials with excellent elongation up to 2000%, self-healing stress recovery >95%, and strong adhesion to metallic and polymeric surfaces are demonstrated. Complex functional structures are successfully 3D printed using a commercial digital light processing (DLP) printer. Shape-selective lifting of low surface energy poly(tetrafluoroethylene) objects is achieved using soft robotic actuators with interchangeable 3D-printed adhesive end effectors, wherein tailored contour matching leads to increased adhesion and successful lifting capacity. The demonstrated utility of these adhesive elastomers provides unique capabilities to easily program soft robot functionality.
ABSTRACT
Existing methods for estimation of dynamic treatment regimes are mostly limited to intention-to-treat analyses, which estimate the effect of randomization to a particular treatment regime without considering the compliance behavior of patients. In this article, we propose a novel nonparametric Bayesian Q-learning approach to construct optimal sequential treatment regimes that adjust for partial compliance. We consider the popular potential compliance framework, in which some potential compliances are latent and need to be imputed. The key challenge is learning the joint distribution of the potential compliances, which we accomplish using a Dirichlet process mixture model. Our approach provides two kinds of treatment regimes: (1) conditional regimes that depend on the potential compliance values; and (2) marginal regimes in which the potential compliances are marginalized. Extensive simulation studies highlight the usefulness of our method compared to intention-to-treat analyses. We apply our method to the Adaptive Treatment for Alcohol and Cocaine Dependence (ENGAGE) study, where the goal is to construct optimal treatment regimes to engage patients in therapy.
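To illustrate the core modeling ingredient, the sketch below fits a truncated stick-breaking (variational) approximation to a Dirichlet process mixture for the joint distribution of two compliance measures. This is a simplified stand-in for the paper's nonparametric Bayesian model, using synthetic data rather than the ENGAGE data or the authors' sampler.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic stage-1 and stage-2 compliance measures (e.g., proportion of sessions attended)
rng = np.random.default_rng(11)
group1 = rng.normal([0.2, 0.3], 0.05, size=(150, 2))   # low-compliance cluster
group2 = rng.normal([0.8, 0.7], 0.05, size=(150, 2))   # high-compliance cluster
compliance = np.clip(np.vstack([group1, group2]), 0, 1)

# Truncated stick-breaking approximation to a Dirichlet process mixture, used here
# to model the joint distribution of the two compliance measures.
dpm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(compliance)

# Latent compliances could then be imputed by sampling from the fitted mixture.
draws, _ = dpm.sample(5)
print(np.round(dpm.weights_, 3))
print(draws)
```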
Subject(s)
Bayes Theorem, Humans, Computer Simulation
ABSTRACT
Existing methods for estimating the mean outcome under a given sequential treatment rule often rely on intention-to-treat analyses, which estimate the effect of following a certain treatment rule regardless of patients' compliance behavior. There are two major concerns with intention-to-treat analyses: (1) the estimated effects are often biased toward the null effect; and (2) the results are not generalizable or reproducible due to potentially differential compliance behavior. These concerns are particularly problematic in settings with a high level of non-compliance, such as substance use disorder studies. Our work is motivated by the Adaptive Treatment for Alcohol and Cocaine Dependence study (ENGAGE), a multi-stage trial that aimed to construct optimal treatment strategies to engage patients in therapy. Due to the relatively low level of compliance in this trial, intention-to-treat analyses essentially estimate the effect of being randomized to a certain treatment instead of the actual effect of the treatment. We obviate this challenge by defining the target parameter as the mean outcome under a dynamic treatment regime conditional on a potential compliance stratum. We propose a flexible non-parametric Bayesian approach based on principal stratification, which consists of a Gaussian copula model for the joint distribution of the potential compliances and a Dirichlet process mixture model for the treatment-sequence-specific outcomes. We conduct extensive simulation studies that highlight the utility of our approach in the context of multi-stage randomized trials. We also show that our estimator is robust to non-linear and non-Gaussian settings.
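A Gaussian copula couples arbitrary marginals through a correlated multivariate normal. The sketch below shows the generative mechanism for two potential compliances under an assumed correlation and illustrative Beta marginals; it is a conceptual sketch, not the paper's fitted model.

```python
import numpy as np
from scipy import stats

# Gaussian copula for the joint distribution of two potential compliances with
# Beta-distributed marginals (the marginal and correlation choices are illustrative).
rng = np.random.default_rng(21)
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

# 1) draw correlated standard normals, 2) map to uniforms via the normal CDF,
# 3) map the uniforms to the assumed marginals via their inverse CDFs.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=1000)
u = stats.norm.cdf(z)
compliance_a = stats.beta(2, 2).ppf(u[:, 0])   # potential compliance under treatment a
compliance_b = stats.beta(5, 2).ppf(u[:, 1])   # potential compliance under treatment b

print(np.corrcoef(compliance_a, compliance_b)[0, 1])
```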
Subject(s)
Decision Making, Patient Compliance, Humans, Bayes Theorem, Computer Simulation, Treatment Outcome
ABSTRACT
Mountaintop removal coal mining (MTR) has been a major source of landscape change in the Central Appalachians of the United States (US). Changes in stream hydrology, channel geomorphology and water quality caused by MTR coal mining can lead to severe impairment of stream ecological integrity. The objective of the Clean Water Act (CWA) is to restore and maintain the ecological integrity of the Nation's waters. Sensitive, readily measured indicators of ecosystem structure and function are needed for the assessment of stream ecological integrity. Most CWA assessments rely on structural indicators; inclusion of functional indicators could make these assessments more holistic and effective. The goals of this study were to: (1) test the efficacy of selected carbon (C) and nitrogen (N) cycling and microbial structural and functional indicators for assessing MTR coal mining impacts on streams; (2) determine whether indicators respond to impacts in a predictable manner; and (3) determine whether functional indicators are less likely to change than structural indicators in response to stressors associated with MTR coal mining. The structural indicators are water quality and sediment organic matter concentrations, and the functional indicators relate to microbial activity and biofilm production. Seasonal measurements were conducted over the course of a year in streams draining small MTR-impacted and forested watersheds in the Twentymile Creek watershed of West Virginia (WV). Five of the eight structural parameters measured had significant responses, with all means greater in the MTR-impacted streams than in the forested streams. These responses resulted from changes in source or augmentation of the original source of the C and N structural parameters because of MTR coal mining. Nitrate concentration and the stable carbon isotopic ratio of dissolved inorganic carbon were the most effective indicators evaluated in this study. Only three of the fourteen functional indicators measured had significant responses to MTR coal mining, with all means greater in the forested streams than in the MTR-impacted streams. These results suggest that stressors associated with MTR coal mining caused a reduction in some aspects of microbial cycling, but resource subsidies may have counterbalanced some of the inhibition, leading to no observable change in most of the functional indicators. The detritus base, which is thought to confer functional stability, was likely sustained in the MTR-impacted streams by channel storage and/or leaf litter inputs from their largely intact riparian zones. Overall, our results largely support the hypothesis that certain functional processes are more resistant to stress-induced change than structural properties but also suggest the difficulty of identifying suitable functional indicators for ecological integrity assessment.
ABSTRACT
OBJECTIVE: This study aimed to evaluate the 9-month cost and health-related quality of life (HRQOL) outcomes of resistance testing versus viral load testing strategies to manage virological failure in low- and middle-income countries. METHODS: We analyzed secondary outcomes from the REVAMP clinical trial: a pragmatic, open-label, parallel-arm randomized trial investigating resistance versus viral load testing for individuals failing first-line treatment in South Africa and Uganda. We collected resource data, valued according to local cost data, and used the 3-level version of the EQ-5D to measure HRQOL at baseline and 9 months. We applied seemingly unrelated regression equations to account for the correlation between cost and HRQOL. We conducted intention-to-treat analyses with multiple imputation using chained equations for missing data and performed sensitivity analyses using complete cases. RESULTS: For South Africa, resistance testing and opportunistic infections were associated with statistically significantly higher total costs, and virological suppression was associated with lower total cost. Higher baseline utility, higher cluster of differentiation 4 (CD4) count, and virological suppression were associated with better HRQOL. For Uganda, resistance testing and switching to second-line treatment were associated with higher total cost, and higher CD4 count was associated with lower total cost. Higher baseline utility, higher CD4 count, and virological suppression were associated with better HRQOL. Sensitivity analyses of the complete-case analysis confirmed the overall results. CONCLUSION: Resistance testing showed no cost or HRQOL advantage over viral load testing in South Africa or Uganda during the 9-month REVAMP clinical trial.
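Seemingly unrelated regression (SUR) jointly estimates the cost and HRQOL equations while allowing their errors to be correlated within participant. The sketch below implements a basic feasible-GLS SUR on synthetic data with assumed regressors; it is illustrative only and not the trial's specification.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200
arm  = rng.integers(0, 2, n).astype(float)   # resistance-testing arm
oi   = rng.integers(0, 2, n).astype(float)   # opportunistic infection
supp = rng.integers(0, 2, n).astype(float)   # virological suppression at 9 months
cd4  = rng.uniform(50, 800, n) / 100.0

# Correlated errors across the cost and HRQOL equations for the same participant
err = rng.multivariate_normal([0, 0], [[1.0, -0.04], [-0.04, 0.01]], size=n)
cost  = 5.0 + 1.2 * arm + 0.8 * oi - 0.9 * supp + err[:, 0]
hrqol = 0.7 + 0.05 * supp + 0.02 * cd4 - 0.06 * oi + err[:, 1]

# Equation-specific design matrices (the two equations need not share regressors)
X1 = np.column_stack([np.ones(n), arm, oi, supp])   # cost equation
X2 = np.column_stack([np.ones(n), supp, cd4, oi])   # HRQOL equation

# Feasible GLS for the two-equation SUR system
y  = np.concatenate([cost, hrqol])
Xb = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
b_ols = np.linalg.lstsq(Xb, y, rcond=None)[0]           # first-stage OLS
resid = (y - Xb @ b_ols).reshape(2, n)
sigma = resid @ resid.T / n                             # 2x2 cross-equation covariance
omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
b_sur = np.linalg.solve(Xb.T @ omega_inv @ Xb, Xb.T @ omega_inv @ y)
print("cost eq coefficients :", np.round(b_sur[:4], 3))
print("HRQOL eq coefficients:", np.round(b_sur[4:], 3))
```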
Subject(s)
Anti-HIV Agents, Humans, Anti-HIV Agents/therapeutic use, Quality of Life, South Africa
ABSTRACT
Background: To facilitate advances in spinal muscular atrophy therapeutic research, it is important to determine the impact and prevalence of symptoms experienced by children with spinal muscular atrophy. Methods: We conducted qualitative interviews with caregivers of children with spinal muscular atrophy. From these interviews, we generated a survey inquiring about 260 symptoms of importance grouped into 17 symptomatic themes. Results: Sixteen caregivers of children with spinal muscular atrophy aged 4 months to 12 years participated in initial interviews, and 77 caregivers completed the survey. Higher symptom prevalence was associated with spinal muscular atrophy type, SMN2 copy number, and functional status. Hip, thigh, or knee weakness had the greatest reported impact on the lives of children with spinal muscular atrophy. Conclusions: This research provides one of the largest data sets regarding disease burden in children with spinal muscular atrophy. The most prevalent symptoms are not identical to those with the greatest impact. This unique insight into the most impactful symptoms will help focus therapeutic development in spinal muscular atrophy.
Subject(s)
Spinal Muscular Atrophy, Spinal Muscular Atrophies of Childhood, Humans, Child, Cross-Sectional Studies, Spinal Muscular Atrophy/diagnosis, Cost of Illness, Caregivers, Prevalence, Spinal Muscular Atrophies of Childhood/epidemiology, Spinal Muscular Atrophies of Childhood/therapy
ABSTRACT
Wastewaters and leachates from various inland resource extraction activities contain high ionic concentrations and differ in ionic composition, which complicates the understanding and effective management of their relative risks to stream ecosystems. To this end, we conducted a stream mesocosm dose-response experiment using two dosing recipes prepared from industrial salts. One recipe was designed to generally reflect the major ion composition of deep well brines (DWB) produced from gas wells (primarily Na⁺, Ca²⁺, and Cl⁻) and the other, the major ion composition of mountaintop mining (MTM) leachates from coal extraction operations (using salts dissociating to Ca²⁺, Mg²⁺, Na⁺, SO₄²⁻ and HCO₃⁻)—both sources being extensive in the Central Appalachians of the USA. The recipes were dosed at environmentally relevant nominal concentrations of total dissolved solids (TDS) spanning 100 to 2000 mg/L for 43 d under continuous flow-through conditions. The colonizing native algal periphyton and benthic invertebrates comprising the mesocosm ecology were assessed with response sensitivity distributions (RSDs) and hazard concentrations (HCs) at the taxa, community (as assemblages), and system (as primary and secondary production) levels. Single-species toxicity tests were run with the same recipes. Dosing the MTM recipe resulted in a significant loss of secondary production and invertebrate taxa assemblages that diverged from the control at all concentrations tested. Comparatively, intermediate doses of the DWB recipe had little consequence or increased secondary production (for emergence only) and had assemblages less different from the control. Only the highest dose of the DWB recipe had a negative impact on certain ecologies. The MTM recipe appeared more toxic, but overall, for both types of resource extraction wastewaters, the mesocosm responses suggested significant changes in stream ecology would not be expected for specific conductivity below 300 µS/cm, a published aquatic life benchmark suggested for the region.
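For readers unfamiliar with hazard concentrations, the sketch below shows the generic calculation: fit a (here log-normal) sensitivity distribution to a set of effect concentrations and read off a lower percentile such as the HC5. The values and the distributional choice are illustrative assumptions, not the study's RSD fits.

```python
import numpy as np
from scipy import stats

# Hypothetical effect concentrations (specific conductivity, uS/cm) at which individual
# taxa or endpoints responded to an MTM-style recipe; values are illustrative only.
effect_conductivity = np.array([180, 220, 260, 300, 340, 410, 520, 640, 800, 1100.0])

# Fit a log-normal sensitivity distribution and take its 5th percentile as HC5,
# i.e., the concentration expected to affect 5% of the assessed responses.
log_x = np.log(effect_conductivity)
mu, sd = log_x.mean(), log_x.std(ddof=1)
hc5 = np.exp(stats.norm.ppf(0.05, loc=mu, scale=sd))
print(f"HC5 ~ {hc5:.0f} uS/cm")
```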