ABSTRACT
People with psychosis exhibit thalamo-cortical hyperconnectivity and cortico-cortical hypoconnectivity with sensory networks; however, it remains unclear whether this applies to all sensory networks, whether it arises from other illness factors, or whether such differences could form the basis of a viable biomarker. To address these questions, we harnessed data from the Human Connectome Early Psychosis Project and computed resting-state functional connectivity (RSFC) matrices for 54 healthy controls and 105 psychosis patients. Primary visual, secondary visual ("visual2"), auditory, and somatomotor networks were defined via a recent brain network partition. RSFC was determined for 718 regions via regularized partial correlation. Psychosis patients, both affective and non-affective, exhibited cortico-cortical hypoconnectivity and thalamo-cortical hyperconnectivity in somatomotor and visual2 networks but not in auditory or primary visual networks. When we averaged and normalized the visual2 and somatomotor network connections and subtracted the cortico-cortical from the thalamo-cortical connectivity values, a robust psychosis biomarker emerged (p = 2e-10, Hedges' g = 1.05). This "somato-visual" biomarker was present in antipsychotic-naive patients and did not depend on confounds such as psychiatric comorbidities, substance/nicotine use, stress, anxiety, or demographics. It had moderate test-retest reliability (ICC = 0.62) and could be recovered from five-minute scans. The marker could discriminate groups in leave-one-site-out cross-validation (AUC = 0.79) and improve group classification when added to a well-known neurocognitive task. Finally, it could differentiate later-stage psychosis patients from healthy or ADHD controls in two independent data sets. These results introduce a simple and robust RSFC biomarker that can distinguish psychosis patients from controls by the early illness stages.
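The marker arithmetic described above (average within-network connections, normalize across the sample, subtract cortico-cortical from thalamo-cortical) can be sketched in a few lines. This is a minimal illustration with a hypothetical input format, not the study's actual pipeline:

```python
from statistics import mean, stdev

def somato_visual_marker(subjects):
    """Sketch of the 'somato-visual' marker: per subject, average RSFC over
    thalamo-cortical and over cortico-cortical connections of the visual2 +
    somatomotor networks, z-score each average across the sample, and take
    the difference. Input format is hypothetical:
    subjects = [(thal_ctx_values, ctx_ctx_values), ...]."""
    thal = [mean(t) for t, _ in subjects]          # per-subject thalamo-cortical mean
    ctx = [mean(c) for _, c in subjects]           # per-subject cortico-cortical mean
    z = lambda xs: [(x - mean(xs)) / stdev(xs) for x in xs]  # normalize across sample
    return [a - b for a, b in zip(z(thal), z(ctx))]
```

Under the reported pattern (thalamo-cortical hyperconnectivity plus cortico-cortical hypoconnectivity), a patient-like subject receives a higher score than control-like subjects.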
ABSTRACT
Existing methods for estimating the mean outcome under a given sequential treatment rule often rely on intention-to-treat analyses, which estimate the effect of following a certain treatment rule regardless of patients' compliance behavior. There are two major concerns with intention-to-treat analyses: (1) the estimated effects are often biased toward the null; (2) the results are not generalizable or reproducible because compliance behavior may differ across settings. These concerns are particularly problematic in settings with high levels of non-compliance, such as substance use disorder studies. Our work is motivated by the Adaptive Treatment for Alcohol and Cocaine Dependence (ENGAGE) study, a multi-stage trial that aimed to construct optimal treatment strategies to engage patients in therapy. Owing to the relatively low level of compliance in this trial, intention-to-treat analyses essentially estimate the effect of being randomized to a certain treatment rather than the actual effect of the treatment. We address this challenge by defining the target parameter as the mean outcome under a dynamic treatment regime conditional on a potential compliance stratum. We propose a flexible non-parametric Bayesian approach based on principal stratification, which consists of a Gaussian copula model for the joint distribution of the potential compliances and a Dirichlet process mixture model for the treatment-sequence-specific outcomes. We conduct extensive simulation studies that highlight the utility of our approach in the context of multi-stage randomized trials, and we show that our estimator is robust in non-linear and non-Gaussian settings.
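The Gaussian copula component mentioned above ties together the margins of the potential compliances through a latent normal correlation. A minimal Python sketch of the sampling mechanism for one pair of compliance variables (illustrative only; the paper's full model adds Dirichlet process mixtures and is fit in a Bayesian framework):

```python
import math
import random

def gaussian_copula_pairs(rho, n, seed=0):
    """Draw n pairs (u1, u2) with uniform margins joined by a Gaussian copula
    with correlation rho: draw correlated standard normals, then map each
    through the standard normal CDF."""
    rng = random.Random(seed)
    ncdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2)))  # standard normal CDF
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)  # corr(z1, z2) = rho
        pairs.append((ncdf(z1), ncdf(z2)))
    return pairs
```

Each margin can then be transformed to the scale of a compliance measure (e.g., via an inverse CDF), while rho controls the dependence between the potential compliances.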
Subjects
Decision Making, Patient Compliance, Humans, Bayes Theorem, Computer Simulation, Treatment Outcome
ABSTRACT
Repairable adhesive elastomers are emerging materials employed in compelling applications such as soft robotics, biosensing, tissue regeneration, and wearable electronics. Facilitating adhesion requires strong interactions, while self-healing requires bond dynamicity. This contrast in desired bond characteristics presents a challenge in the design of healable adhesive elastomers. Furthermore, 3D printability of this novel class of materials has received limited attention, restricting the potential design space of as-built geometries. Here, we report a series of 3D-printable elastomeric materials with self-healing ability and adhesive properties. Repairability is obtained using Thiol-Michael dynamic crosslinkers incorporated into the polymer backbone, while adhesion is facilitated with acrylate monomers. Elastomeric materials with excellent elongation up to 2000%, self-healing stress recovery >95%, and strong adhesion with metallic and polymeric surfaces are demonstrated. Complex functional structures are successfully 3D printed using a commercial digital light processing (DLP) printer. Shape-selective lifting of low surface energy poly(tetrafluoroethylene) objects is achieved using soft robotic actuators with interchangeable 3D-printed adhesive end effectors, wherein tailored contour matching leads to increased adhesion and successful lifting capacity. The demonstrated utility of these adhesive elastomers provides unique capabilities to easily program soft robot functionality.
ABSTRACT
BACKGROUND: Antiretroviral treatment improves the health-related quality of life (HRQoL) of people with human immunodeficiency virus (PWH). However, one-third of those initiating first-line treatment experience virological failure, and the determinants of HRQoL in this key population are unknown. Our study aims to identify determinants of HRQoL among PWH failing antiretroviral treatment in sub-Saharan Africa. METHODS: We analysed data from a cohort of PWH experiencing virological failure (> 1,000 copies/mL) on first-line ART in South Africa and Uganda. We measured HRQoL using the EuroQOL EQ-5D-3L and used a two-part regression model to obtain by-country analyses for South Africa and Uganda. The first part identifies risk factors associated with the likelihood of participants reporting perfect health (utility = 1) versus non-perfect health (utility < 1). The second part identifies risk factors associated with the EQ-5D-3L utility scores of participants reporting non-perfect health. We performed sensitivity analyses comparing the results of the two-part model with those from Tobit models and ordinary least squares regression. RESULTS: In both countries, males were more likely to report perfect health, and participants with at least one comorbidity were less likely to report perfect health. In South Africa, participants with side effects, and in Uganda those with opportunistic infections, were also less likely to report perfect health. In Uganda, participants with 100% ART adherence were more likely to report perfect health. In South Africa, high HIV viral load, ART side effects, and the presence of opportunistic infections were each associated with lower HRQoL, whereas participants with 100% ART adherence reported higher HRQoL. In Uganda, participants with lower CD4 counts had lower HRQoL.
CONCLUSION: Markers of advanced disease (opportunistic infection, high viral load, low CD4 count), side effects, comorbidities, and lack of ART adherence negatively affected HRQoL for PWH experiencing virological failure. TRIAL REGISTRATION: ClinicalTrials.gov: NCT02787499.
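The logic of the two-part model described above can be made concrete: the overall expected utility combines the probability of perfect health (Part 1) with the mean utility among those in non-perfect health (Part 2). A minimal sketch with hypothetical input values:

```python
def two_part_expected_utility(p_perfect, mean_utility_imperfect):
    """Combine the two parts of a two-part utility model:
    E[utility] = P(perfect) * 1 + P(non-perfect) * E[utility | non-perfect].
    In practice p_perfect would come from a logistic model (Part 1) and
    mean_utility_imperfect from a regression on the non-perfect-health
    subsample (Part 2); the values passed here are hypothetical."""
    return p_perfect * 1.0 + (1.0 - p_perfect) * mean_utility_imperfect
```

For example, with a 40% chance of perfect health and a mean utility of 0.75 among the rest, the expected utility is 0.40 + 0.60 * 0.75 = 0.85.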
Subjects
HIV Infections, Opportunistic Infections, Male, Humans, HIV, Quality of Life, South Africa/epidemiology, Anti-Retroviral Agents, HIV Infections/drug therapy, HIV Infections/epidemiology
ABSTRACT
BACKGROUND: The use of a left ventricular assist device (LVAD) in patients with advanced heart failure refractory to optimal medical management has progressed steadily over the past two decades. Data have demonstrated reduced LVAD efficacy, worse clinical outcomes, and higher mortality for patients who experience significant ventricular tachyarrhythmia (VTA). We hypothesize that a novel prophylactic intra-operative VTA ablation protocol at the time of LVAD implantation may reduce recurrent VTA and adverse events post-implant. METHODS: We designed a prospective, multicenter, open-label, randomized controlled clinical trial enrolling 100 LVAD candidates with a history of VTA in the previous 5 years. Enrolled patients will be randomized in a 1:1 fashion to intra-operative VTA ablation (n = 50) versus conventional medical management (n = 50) with LVAD implant. Arrhythmia outcome data will be captured by an implantable cardioverter defibrillator (ICD) to monitor VTA events, with a uniform ICD programming protocol. Patients will be followed prospectively over a mean of 18 months (with a minimum of 9 months) after LVAD implantation to evaluate recurrent VTA, adverse events, and procedural outcomes. Secondary endpoints include right heart function/hemodynamics, healthcare utilization, and quality of life. CONCLUSION: The primary aim of this first-ever randomized trial is to assess the efficacy of intra-operative ablation during LVAD surgery in reducing VTA recurrence and improving clinical outcomes for patients with a history of VTA.
Subjects
Implantable Defibrillators, Heart Failure, Heart-Assist Devices, Ventricular Tachycardia, Humans, Heart-Assist Devices/adverse effects, Prospective Studies, Quality of Life, Risk Factors, Electrocardiography, Cardiac Arrhythmias, Ventricular Tachycardia/etiology, Treatment Outcome
ABSTRACT
OBJECTIVES: Viral suppression (VS) is the hallmark of successful antiretroviral therapy (ART) programmes. We sought to compare clinic retention, virological outcomes, drug resistance and mortality between peri-urban and rural settings in South Africa after first-line ART. METHODS: Beginning in July 2014, 1000 (500 peri-urban and 500 rural) ART-naïve patients with HIV were enrolled and managed according to local standard of care. Clinic retention, virological suppression, virological failure (VF), genotypic drug resistance and mortality were assessed. VS was defined as a viral load ≤1000 copies/ml. Time-to-event analyses were stratified by site, median age and gender. Kaplan-Meier curves were calculated and compared using log-rank tests. RESULTS: Based on 2741 patient-years of follow-up, retention and mortality did not differ between sites. Among all 1000 participants, 47%, 84% and 91% had achieved VS by 6, 12 and 24 months, respectively, with suppression occurring earlier at the peri-urban site. At both sites, men aged < 32 years had the highest proportion of VF (15.5%), while women aged > 32 years had the lowest, at 7.1% (p = 0.018). Among 55 genotypes, 42 (76.4%) had one or more resistance mutations, a proportion that did not differ by site. K103N (59%) and M184V (52%) were the most common mutations, followed by V106M and K65R (31% each). Overall, death was infrequent (< 4%). CONCLUSIONS: No significant differences in treatment outcomes between peri-urban and rural clinics were observed. In both settings, young men were especially susceptible to clinic attrition and VF. More effective adherence support for this important demographic group is needed to achieve UNAIDS targets.
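The time-to-event analysis above rests on the Kaplan-Meier product-limit estimator. A self-contained sketch (toy data; real analyses would use a survival library):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimator.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns [(t, S(t))] at each distinct event time."""
    data = sorted(zip(times, events))
    surv, curve, seen = 1.0, [], set()
    for t, _ in data:
        if t in seen:
            continue
        seen.add(t)
        d = sum(e for tt, e in data if tt == t)   # events at time t
        n = sum(1 for tt, _ in data if tt >= t)   # at risk just before t
        if d:
            surv *= 1.0 - d / n
            curve.append((t, surv))
    return curve
```

For times [1, 2, 2, 3] with event indicators [1, 0, 1, 1], the curve steps to 0.75 at t=1, 0.5 at t=2 (the censored subject leaves the risk set), and 0 at t=3.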
Subjects
Anti-HIV Agents, HIV Infections, HIV-1, Anti-HIV Agents/pharmacology, Anti-HIV Agents/therapeutic use, Anti-Retroviral Agents/pharmacology, Anti-Retroviral Agents/therapeutic use, Viral Drug Resistance/genetics, Female, HIV Infections/drug therapy, Humans, Male, South Africa, Viral Load
ABSTRACT
OBJECTIVES: HIV virological failure remains a major threat to programme success in sub-Saharan Africa. While HIV drug resistance (HIVDR) and inadequate adherence are the main drivers of virological failure, the individual, clinical and health system characteristics that lead to poor outcomes are not well understood. The objective of this paper is to identify those characteristics among people failing first-line antiretroviral therapy (ART). METHODS: We enrolled a cohort of adults in HIV care experiencing virological failure on first-line ART at five sites and used standard statistical methods to characterize them across three domains (individual/demographic, clinical and health system), comparing each by country of enrolment. RESULTS: Of 840 participants, 51% were women, the median duration on ART was 3.2 years [interquartile range (IQR) 1.1, 6.4 years] and the median CD4 cell count prior to failure was 281 cells/µL (IQR 121, 457 cells/µL). More than half of participants [53%; 95% confidence interval (CI) 49-56%] stated that they had > 90% adherence, and 75% (95% CI 72-77%) took their ART on time all or most of the time. Conversely, the vast majority (90%; 95% CI 86-92%) of those with a completed genotypic drug resistance test had HIV drug resistance. This population had high health system use, reporting a median of 3 (IQR 2, 6) health care visits and a median of 1 (IQR 1, 1) hospitalization in the preceding 6 months. CONCLUSIONS: Patients failing first-line ART in sub-Saharan Africa generally report high rates of adherence to ART, have extremely high rates of HIV drug resistance and utilize significant health care resources. Health systems interventions to promptly detect and manage treatment failure will be a prerequisite to establishing control of the HIV epidemic.
Subjects
Anti-HIV Agents, HIV Infections, Adult, CD4 Lymphocyte Count, Viral Drug Resistance, Female, HIV Infections/drug therapy, HIV Infections/epidemiology, Humans, South Africa/epidemiology, Treatment Failure, Uganda/epidemiology, Viral Load
ABSTRACT
An adaptive treatment length strategy is a sequential, stage-wise treatment strategy in which a subject's treatment begins at baseline and one chooses to stop or continue treatment at each stage, provided the subject has been continuously treated. The effects of treatment are assumed to be cumulative and, therefore, the effect of treatment length on the clinical endpoint, measured at the end of the study, is of primary scientific interest. At the same time, adverse treatment-terminating events may occur during the course of treatment that require treatment to be stopped immediately. Because the presence of a treatment-terminating event may be strongly associated with the study outcome, the treatment-terminating event is informative. In observational studies, decisions to stop or continue treatment depend on covariate history that confounds the relationship between treatment length and outcome. We propose a new risk-set weighted estimator of the mean potential outcome under the condition that time-dependent covariates update at a set of common landmarks. We show that our proposed estimator is asymptotically linear given mild assumptions and correctly specified working models. Specifically, we study the theoretical properties of our estimator when the nuisance parameters are modeled using either parametric or semiparametric methods. The finite sample performance and theoretical results of the proposed estimator are evaluated through simulation studies and demonstrated by application to the Enhanced Suppression of the Platelet Receptor IIb/IIIa with Integrilin Therapy (ESPRIT) infusion trial data.
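The core idea of weighting by the inverse probability of the observed stopping/continuing decisions can be sketched in a toy form. This is a bare-bones stand-in for the paper's risk-set weighted estimator: continuation probabilities are assumed known, and terminating events and landmark updating are omitted:

```python
def ipw_mean(records, k):
    """Toy inverse-probability-weighted estimate of the mean outcome had
    everyone been treated for at least k stages.
    Each record: (stages_treated, outcome, [P(continue at stage j | history)]).
    Subjects treated through k stages are up-weighted by the inverse
    probability of having continued that long; the indexing of the
    probability list is illustrative."""
    num = den = 0.0
    for stages, y, p_cont in records:
        if stages >= k:                 # consistent with "treated for >= k stages"
            w = 1.0
            for p in p_cont[:k]:        # inverse product of continuation probabilities
                w /= p
            num += w * y
            den += w
    return num / den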
Subjects
Statistical Models, Computer Simulation, Treatment Outcome
ABSTRACT
Sequential, multiple assignment, randomized trials (SMARTs) compare sequences of treatment decision rules called dynamic treatment regimes (DTRs). In particular, the Adaptive Treatment for Alcohol and Cocaine Dependence (ENGAGE) SMART aimed to determine the best DTRs for patients with a substance use disorder. While many authors have focused on a single pairwise comparison, addressing the main goal involves comparisons of more than two DTRs. For such complex comparisons, there is a paucity of methods for binary outcomes. We fill this gap by extending the multiple comparisons with the best (MCB) methodology to the Bayesian binary outcome setting. The set of best DTRs is constructed based on simultaneous credible intervals. A substantial challenge for power analysis is the correlation between outcome estimators for distinct DTRs embedded in SMARTs, which arises because subjects can belong to more than one DTR. We address this using Robins' G-computation formula to take a weighted average of parameter draws obtained via simulation from the parameter posteriors. We use non-informative priors and work with the exact distribution of the parameters, avoiding unnecessary normality assumptions and specification of the correlation matrix of DTR outcome summary statistics. We conduct simulation studies for both the construction of a set of optimal DTRs using the Bayesian MCB procedure and the sample size calculation for two common SMART designs. We illustrate our method on the ENGAGE SMART. The R package SMARTbayesR for power calculations is freely available on the Comprehensive R Archive Network (CRAN). An RShiny app is available at https://wilart.shinyapps.io/shinysmartbayesr/.
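The G-computation step described above can be illustrated for a single embedded DTR: each posterior draw of the DTR's success probability is a response-probability-weighted average of the stage-2 success probabilities. A minimal sketch with Beta(1,1) priors and hypothetical counts (the paper's implementation is in the R package SMARTbayesR):

```python
import random

def dtr_success_draws(resp, n, succ_r, n_r, succ_nr, n_nr, ndraws=4000, seed=1):
    """Posterior draws of P(success) for one embedded DTR via G-computation:
    weight stage-2 success probabilities by the posterior response probability.
    Beta(1,1) priors on all probabilities; counts are hypothetical."""
    rng = random.Random(seed)
    draws = []
    for _ in range(ndraws):
        p_resp = rng.betavariate(1 + resp, 1 + n - resp)           # P(respond to stage 1)
        p_s_r = rng.betavariate(1 + succ_r, 1 + n_r - succ_r)      # P(success | responder)
        p_s_nr = rng.betavariate(1 + succ_nr, 1 + n_nr - succ_nr)  # P(success | non-responder)
        draws.append(p_resp * p_s_r + (1 - p_resp) * p_s_nr)
    return draws
```

Simultaneous credible intervals for the MCB procedure would then be formed from quantiles of such draws across all embedded DTRs.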
Subjects
Research Design, Bayes Theorem, Computer Simulation, Humans, Sample Size
ABSTRACT
This paper reports the results of a randomized controlled trial (RCT) assessing the efficacy of Nexus, a telehealth-delivered intervention that combines couples' HIV counseling and testing (CHTC) with home-based HIV testing, examining the impact of the intervention on couples' formation of and adherence to safer sexual agreements. Between 2016 and 2018, 424 couples were recruited online from the U.S. and randomized to the intervention arm (a telehealth-delivered CHTC session with two home HIV-testing kits) or a control arm (two home HIV-testing kits), with study assessments at baseline, 3 and 6 months. Outcomes were the formation of and adherence to safer sexual agreements, dyadic discordance in sexual agreements, breakage of sexual agreements, and perceptions of PrEP. Couples in the intervention arm had significantly greater odds of reporting a safer sexual agreement (3 months: OR 1.87, p = 0.005; 6 months: OR 1.84, p = 0.007), lower odds of reporting discordant sexual agreements at 6 months (OR 0.62, p = 0.048), and significantly lower odds of reporting breaking their sexual agreement (3 months: OR 0.51, p = 0.035; 6 months: OR 0.23, p < 0.001). By 6 months, couples in the intervention arm were less likely to say PrEP was beneficial to one (RRR 0.33, p < 0.001) or both of them (RRR 0.29, p < 0.001) than to say it was beneficial to neither partner. The high levels of acceptability and efficacy of the intervention demonstrate strong potential for the scale-up of this efficacious intervention, which is delivered through a low-cost telehealth platform.
Subjects
HIV Infections, Telemedicine, Chitinases, Counseling/methods, HIV Infections/diagnosis, HIV Infections/prevention & control, HIV Infections/psychology, Humans, Male, Plant Proteins, Sexual Partners/psychology
ABSTRACT
BACKGROUND: Household food purchases (HFP) are in the pathway between the community food environment and the foods available in households for consumption. As such, HFP data have emerged as an alternative for monitoring population dietary trends over time. In this paper, we investigate the use of loyalty card datasets as unexplored sources of continuously collected HFP data to describe temporal trends in household produce purchases. METHODS: We partnered with a grocery store chain to obtain a loyalty card database with grocery transactions by household from January 2016 to October 2018. We included households in an urban county with complete observations for head-of-household age group, household income group, and family size. Data were summarized as weighted averages (95% CI) of the percent of produce purchased out of all foods purchased by household per month. We modeled seasonal and linear trends in the proportion of produce purchases by age group and income while accounting for repeated observations per household using generalized estimating equations. RESULTS: There are 290,098 households in the database (88% of all county households). At baseline, the smallest and largest percent produce purchases are observed among the youngest, lowest-income households (12.2%, CI 11.1; 13.3) and the oldest, highest-income households (19.3%, CI 18.9; 19.6), respectively. The seasonal variations are consistent across all age and income groups, with an April-June peak gradually descending until December. However, the average linear change in percent produce purchased per household per year varies by age and income, being steepest among the youngest households at each income level (from 1.42%, CI 0.98; 1.8 to 0.69%, CI 0.42; 0.95), while the oldest households experience almost no annual change. CONCLUSIONS: We explored the potential of a collaboration with a food retailer to use continuously collected loyalty card data for public health nutrition purposes.
Our findings suggest a trend towards a healthier pattern in long-term food purchases and household food availability among the youngest households, which, if sustained, may lessen the population's chronic disease burden. Understanding the foods available for consumption within households allows public health advocates to develop and evaluate policies and programs promoting foods and nutrients along the life course.
Subjects
Consumer Behavior, Family Characteristics, Humans, Income, Diet, Food Preferences
ABSTRACT
BACKGROUND: Virologic failure in HIV predicts the development of drug resistance and mortality. Genotypic resistance testing (GRT), which is the standard of care after virologic failure in high-income settings, is rarely implemented in sub-Saharan Africa. OBJECTIVE: To estimate the effectiveness of GRT for improving virologic suppression rates among people with HIV in sub-Saharan Africa for whom first-line therapy fails. DESIGN: Pragmatic, unblinded, randomized controlled trial. (ClinicalTrials.gov: NCT02787499). SETTING: Ambulatory HIV clinics in the public sector in Uganda and South Africa. PATIENTS: Adults receiving first-line antiretroviral therapy with a recent HIV RNA viral load of 1000 copies/mL or higher. INTERVENTION: Participants were randomly assigned to receive standard of care (SOC), including adherence counseling sessions and repeated viral load testing, or immediate GRT. MEASUREMENTS: The primary outcome of interest was achievement of an HIV RNA viral load below 200 copies/mL 9 months after enrollment. RESULTS: The trial enrolled 840 persons, divided equally between countries. Approximately half (51%) were women. Most (72%) were receiving a regimen of tenofovir, emtricitabine, and efavirenz at enrollment. The rate of virologic suppression did not differ 9 months after enrollment between the GRT group (63% [263 of 417]) and SOC group (61% [256 of 423]; odds ratio [OR], 1.11 [95% CI, 0.83 to 1.49]; P = 0.46). Among participants with persistent failure (HIV RNA viral load ≥1000 copies/mL) at 9 months, the prevalence of drug resistance was higher in the SOC group (76% [78 of 103] vs. 59% [48 of 82]; OR, 2.30 [CI, 1.22 to 4.35]; P = 0.014). Other secondary outcomes, including 9-month survival and retention in care, were similar between groups. LIMITATION: Participants were receiving nonnucleoside reverse transcriptase inhibitor-based therapy at enrollment, limiting the generalizability of the findings. 
CONCLUSION: The addition of GRT to routine care after first-line virologic failure in Uganda and South Africa did not improve rates of resuppression. PRIMARY FUNDING SOURCE: The President's Emergency Plan for AIDS Relief and the National Institute of Allergy and Infectious Diseases.
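As a check on the arithmetic, the unadjusted odds ratio for the primary outcome can be recomputed from the counts reported above. A minimal sketch (the Wald confidence interval here is a simple approximation and will not exactly match the paper's model-based CI):

```python
import math

def odds_ratio(a_events, a_total, b_events, b_total):
    """Unadjusted odds ratio (group A vs group B) from 2x2 counts,
    with a Wald-type 95% confidence interval on the log scale."""
    a_no, b_no = a_total - a_events, b_total - b_events
    or_ = (a_events * b_no) / (a_no * b_events)
    se = math.sqrt(1/a_events + 1/a_no + 1/b_events + 1/b_no)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi
```

Plugging in the trial's counts, odds_ratio(263, 417, 256, 423) reproduces the reported OR of about 1.11 with a CI close to the published 0.83 to 1.49.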
Subjects
Highly Active Antiretroviral Therapy, Viral Drug Resistance, HIV Infections/drug therapy, Adult, Alkynes/therapeutic use, Benzoxazines/therapeutic use, Cyclopropanes/therapeutic use, Emtricitabine/therapeutic use, Female, Humans, Male, Middle Aged, South Africa, Tenofovir/therapeutic use, Uganda, Viral Load
ABSTRACT
Background: Studies have shown that teleophthalmology programs using a nonmydriatic camera in primary care settings can improve rates of diabetic retinopathy (DR) screening. However, such programs are not yet widespread due to common challenges in sustainability. Purpose: To comprehensively evaluate clinical and operational measures of an urban primary care clinic's 1-year pilot teleophthalmology DR evaluation program. Materials and Methods: This retrospective analysis used five metrics to evaluate the program: clinical diabetic retinal exam (DRE) rate, visual acuity and pathology, camera utilization, billing and insurance reimbursements, and outcomes of follow-up referrals. Results: Two hundred eleven patients were screened over 14 months. The DRE rate more than doubled (34% to 75%). Of the patients screened, 55.9% had vision better than 20/50 in each eye and 21% had at least one eye with vision 20/70 or worse. DR was noted in 11% of patients. Camera use was greatest in the program's first few months. Government and Medicare Advantage insurers were significantly (p < 0.001) less likely to reimburse than commercial insurers. Twenty-seven percent of patients screened had documented follow-up with an eye care provider within 16 months of their screening. Patients diagnosed with DR or recommended follow-up within 1 month were significantly (p < 0.001) more likely to schedule an appointment. Discussion: Challenges to program sustainability include efficient utilization, reimbursement from governmental insurers, and adherence to follow-up recommendations. Conclusions: Assessing teleophthalmology programs with the aforementioned five metrics allows for a comprehensive evaluation of impact and sustainability. This may be utilized to standardize the implementation and evaluation of such programs across diverse settings.
Subjects
Diabetes Mellitus, Diabetic Retinopathy, Ophthalmology, Telemedicine, Aged, Diabetic Retinopathy/diagnosis, Humans, Mass Screening, Medicare, Primary Health Care, Program Evaluation, Retrospective Studies, United States
ABSTRACT
Screening for chronic diseases, such as cancer, is an important public health priority, but traditionally only the frequency or rate of screening has received attention. In this work, we study the importance of adhering to recommended screening policies and develop new methodology to better optimize screening policies when adherence is imperfect. We consider a progressive disease model with four states (healthy, undetectable preclinical, detectable preclinical, clinical), and overlay this with a stochastic screening-behavior model using the theory of renewal processes that allows us to capture imperfect adherence to screening programs in a transparent way. We show that decreased adherence leads to reduced efficacy of screening programs, quantified here using elements of the lead time distribution (i.e., the time between screening diagnosis and when diagnosis would have occurred clinically in the absence of screening). Under the assumption of an inverse relationship between prescribed screening frequency and individual adherence, we show that the optimal screening frequency generally decreases with increasing levels of non-adherence. We apply this model to an example in breast cancer screening, demonstrating how accounting for imperfect adherence affects the recommended screening frequency.
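The key mechanism above (missed screens shrink the lead time) can be illustrated with a Monte Carlo toy version of the model: an exponential sojourn in the detectable preclinical state, screens on a fixed schedule, and each screen attended independently with some probability. This is a deliberate simplification (perfect test sensitivity, adherence independent across screens), not the paper's renewal-process model:

```python
import random

def mean_lead_time(delta, p_adhere, mean_sojourn, n=20000, seed=7):
    """Monte Carlo mean lead time: detectable-preclinical sojourn is
    Exponential(mean_sojourn); screens occur every `delta` time units and
    are each attended with probability `p_adhere`. Lead time is clinical
    onset minus detection time (0 if no attended screen falls in the
    detectable window)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        onset = rng.uniform(0, delta)                     # preclinical onset within a screening cycle
        clinical = onset + rng.expovariate(1.0 / mean_sojourn)
        t = delta                                         # first scheduled screen after onset
        lead = 0.0
        while t < clinical:
            if rng.random() < p_adhere:                   # screen attended -> detection
                lead = clinical - t
                break
            t += delta                                    # screen missed -> wait for the next one
        total += lead
    return total / n
```

Running this with decreasing adherence (e.g., 100%, 50%, 25%) shows the mean lead time shrinking, the qualitative effect the paper quantifies analytically.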
Subjects
Breast Neoplasms, Breast Neoplasms/diagnosis, Chronic Disease, Female, Humans
ABSTRACT
As this is more of a reference article, I chose not to have an abstract similar to the paper I wrote in 2016 regarding flat feet in the military.
Subjects
Military Personnel, Wounds and Injuries, Ankle/surgery, Humans, Iraq War 2003-2011, United States
ABSTRACT
OBJECTIVES: There is conflicting evidence on the impact of pre-existing HIV drug resistance mutations (DRMs) in patients infected with non-B subtype virus. METHODS: We performed a case-cohort substudy of the AIDS Drug Resistance Surveillance Study, which enrolled South African patients initiating first-line efavirenz/emtricitabine/tenofovir. Pre-ART DRMs were detected by Illumina sequencing of HIV pol and DRMs present at <20% of the viral population were labelled as minority variants (MVs). Weighted Cox proportional hazards models estimated the association between pre-ART DRMs and risk of virological failure (VF), defined as confirmed HIV-1 RNA ≥1000 copies/mL after ≥5 months of ART. RESULTS: The evaluable population included 178 participants from a randomly selected subcohort (16 with VF, 162 without VF) and 83 additional participants with VF. In the subcohort, 16% of participants harboured ≥1 majority DRM. The presence of any majority DRM was associated with a 3-fold greater risk of VF (P = 0.002), which increased to 9.2-fold (P < 0.001) in those with <2 active drugs. Thirteen percent of participants harboured MV DRMs in the absence of majority DRMs. Presence of MVs alone had no significant impact on the risk of VF. Inclusion of pre-ART MVs with majority DRMs improved the sensitivity but reduced the specificity of predicting VF. CONCLUSIONS: In a South African cohort, the presence of majority DRMs increased the risk of VF, especially for participants receiving <2 active drugs. The detection of drug-resistant MVs alone did not predict an increased risk of VF, but their inclusion with majority DRMs affected the sensitivity/specificity of predicting VF.
Subjects
Anti-HIV Agents, HIV Infections, HIV-1, Anti-HIV Agents/pharmacology, Anti-HIV Agents/therapeutic use, Viral Drug Resistance, HIV Infections/drug therapy, HIV Infections/epidemiology, HIV-1/genetics, Humans, Mutation, South Africa/epidemiology, Treatment Failure, Viral Load
ABSTRACT
PURPOSE: The ability to maintain an absolute, submaximal torque level during fatiguing contractions is controlled, in part, by the recruitment of larger motor units. These motor units are commonly identified based on greater action potential peak-to-peak amplitude values. It is unclear, however, if motor unit action potential (MUAP) amplitude values during low torque, fatiguing contractions reach similar levels as those observed during non-fatigued, high torque contractions. To establish a clearer understanding of motor unit control during fatigue, we compared MUAP amplitude during 50 and 80% maximum voluntary contraction (MVC) torque contractions and at the beginning, middle, and end of a 30% MVC fatigue protocol. METHODS: Eleven untrained men (mean age = 24 years) performed isometric contractions at 50 and 80% MVC, followed by repeated contractions at 30% MVC. Surface electromyographic (EMG) signals were detected from the vastus lateralis and decomposed to quantify the peak-to-peak amplitude of individual MUAPs. A two-level multilevel model was estimated, allowing examination of simultaneous measures of MUAP amplitude within participants and controlling for the dependence between measures within participants. RESULTS: Results from the multilevel analyses suggested that there were not statistically significant differences in MUAP amplitude between 80% MVC and end fatigue. Separate repeated-measures analyses of variance indicated that there were not statistically significant mean differences in greatest MUAP or surface EMG amplitude between 80% MVC and end fatigue. CONCLUSIONS: MUAP and surface EMG amplitude values during a 30% MVC fatiguing protocol appear to be comparable to those observed during a non-fatigued 80% MVC condition.
Subjects
Motor Evoked Potentials, Isometric Contraction, Muscle Fatigue, Skeletal Muscle/physiology, Adult, Humans, Male, Torque
ABSTRACT
Many longitudinal studies with a binary outcome measure involve a fraction of subjects with a homogeneous response profile. In our motivating data set, a study on the rate of human immunodeficiency virus (HIV) self-testing in a population of men who have sex with men (MSM), a substantial proportion of the subjects did not self-test during the follow-up study. The observed data in this context consist of a binary sequence for each subject indicating whether or not that subject experienced any events between consecutive observation time points, so subjects who never self-tested were observed to have a response vector consisting entirely of zeros. Conventional longitudinal analysis is not equipped to handle questions regarding the rate of events (as opposed to the odds, as in the classical logistic regression model). With the exception of discrete mixture models, such methods are also not equipped to handle settings in which there may exist a group of subjects for whom no events will ever occur, i.e., a so-called "never-responder" group. In this article, we model the observed data assuming that events occur according to some unobserved continuous-time stochastic process. In particular, we consider the underlying subject-specific processes to be Poisson conditional on some unobserved frailty, leading to a natural focus on modeling event rates. Specifically, we propose to use the power variance function (PVF) family of frailty distributions, which contains both the gamma and inverse Gaussian distributions as special cases and allows for the existence of a class of subjects having zero frailty. We generalize a computational algorithm developed for a log-gamma random intercept model (Conaway, 1990. A random effects model for binary data. Biometrics 46, 317-328) to compute the exact marginal likelihood, which is then maximized to obtain estimates of model parameters.
We conduct simulation studies, exploring the performance of the proposed method in comparison with competitors. Applying the PVF as well as a Gaussian random intercept model and a corresponding discrete mixture model to our motivating data set, we conclude that the group assigned to receive follow-up messages via SMS was self-testing at a significantly lower rate than the control group, but that there is no evidence to support the existence of a group of never-testers.
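A minimal sketch of the key quantity in a frailty-Poisson model like the one above, for the gamma special case of the PVF family (all parameter values here are hypothetical, chosen only for illustration): the marginal probability of an all-zero response vector equals the frailty's Laplace transform evaluated at the accumulated event intensity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Under a gamma frailty Z ~ Gamma(shape=a, rate=b), events for a subject follow a
# Poisson process with rate Z * lam.  The probability that a subject's whole binary
# response vector is zero (no events over total exposure T) is the Laplace transform
# of Z evaluated at lam * T:
#     P(no events) = E[exp(-Z * lam * T)] = (1 + lam * T / b) ** (-a)
a, b = 2.0, 3.0        # hypothetical gamma shape and rate
lam, T = 0.4, 5.0      # hypothetical baseline event rate and total follow-up time

closed_form = (1.0 + lam * T / b) ** (-a)

# Monte Carlo check: draw frailties, then total event counts, and estimate P(no events).
Z = rng.gamma(shape=a, scale=1.0 / b, size=200_000)
N = rng.poisson(Z * lam * T)
monte_carlo = np.mean(N == 0)

print(closed_form, monte_carlo)  # the two estimates should agree closely
```

The zero-frailty class that distinguishes the full PVF family would add a point mass at Z = 0, inflating this probability for "never-responders"; the gamma case shown here has no such mass.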
Subjects
Biostatistics/methods , HIV Infections/diagnosis , Mass Screening/statistics & numerical data , Models, Statistical , Patient Acceptance of Health Care/statistics & numerical data , Adult , HIV Infections/prevention & control , Homosexuality, Male , Humans , Longitudinal Studies , Male , Reminder Systems , Text Messaging
ABSTRACT
OBJECTIVE: Interactions between enzyme-inducing anti-seizure medications (EI-ASMs) and antiretroviral drugs (ARVs) can lead to decreased ARV levels and may increase the likelihood of viral resistance. We conducted a study to determine if co-usage of ARVs and EI-ASMs is associated with ARV-resistant human immunodeficiency virus (HIV) among people living with HIV in Zambia. METHODS: Eligible participants were ≥18 years of age and concurrently taking ASMs and ARVs for at least 1 month of the prior 6-month period. Data were obtained regarding medication and HIV history. CD4 counts, plasma viral loads (pVLs), and HIV genotype and resistance profile in participants with a pVL >1000 copies/mL were obtained. Pearson's test of independence was used to determine whether treatment with EI-ASM was associated with pVL >1000 copies/mL. RESULTS: Of 50 participants, 41 (82%) were taking carbamazepine (37 on monotherapy), and all had stable regimens in the prior 6 months. Among the 13 ARV regimens used, 68% had a tenofovir/lamivudine backbone. The majority (94%) were on a stable ARV regimen for >6 months. Median CD4 nadir was 205 cells/mm3 (interquartile range [IQR] 88-389), and 60% of participants had commenced ARV treatment before advanced disease occurred. Mean CD4 count at enrollment was 464 cells/mm3 (SD 226.3). Seven participants (14%) had a CD4 count <200 cells/mm3. Four (8%) had a pVL >1000 copies/mL; all were on carbamazepine. Three participants with elevated pVL had a CD4 count <200 cells/mm3. None had documented adherence concerns by providers; however, two had events concerning for clinical failure. HIV genotype testing showed mutations in three participants. Carbamazepine was not found to correlate with elevated pVL (P = .58). SIGNIFICANCE: EI-ASMs are commonly used in sub-Saharan Africa. Despite concurrent use of EI-ASMs and ARVs, the majority of participants showed CD4 counts >200 cells/mm3 and were virally suppressed.
Carbamazepine was not associated with an increased risk of virological failure or ARV-resistant HIV.
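The association test described above can be sketched as a Pearson test of independence on a 2x2 contingency table. The counts below are hypothetical (the abstract does not report the full cross-tabulation); only the shape of the analysis is illustrated.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: ASM regimen vs. virological status (pVL > 1000 copies/mL).
table = [[4, 37],   # carbamazepine users: elevated pVL, suppressed
         [0, 9]]    # other ASM regimens:  elevated pVL, suppressed

# Pearson's test of independence (Yates continuity correction is applied by
# default for 2x2 tables).
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
```

With counts this small and a zero cell, an exact test (e.g., Fisher's) would often be preferred in practice; the sketch simply mirrors the method named in the abstract.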
Subjects
Anti-HIV Agents/therapeutic use , Anticonvulsants/therapeutic use , Carbamazepine/therapeutic use , Epilepsy/drug therapy , HIV Infections/drug therapy , Adult , Ambulatory Care Facilities/statistics & numerical data , Anti-HIV Agents/adverse effects , Anticonvulsants/adverse effects , CD4 Lymphocyte Count , Carbamazepine/adverse effects , Drug Interactions , Drug Resistance, Viral , Epilepsy/complications , Female , HIV Infections/complications , Humans , Male , Treatment Outcome , Viral Load/drug effects , Zambia
ABSTRACT
Nutrient pollution from human activities remains a common problem facing stream ecosystems. Identifying ecological responses to phosphorus and nitrogen can inform decisions affecting the protection and management of streams and their watersheds. Diatoms are particularly useful because they are a highly diverse group of unicellular algae found in nearly all aquatic environments and are sensitive responders to increased nutrient concentrations. Here, we used DNA metabarcoding of stream diatoms as an approach to quantifying effects of total phosphorus (TP) and total nitrogen (TN). Threshold indicator taxa analysis (TITAN) identified operational taxonomic units (OTUs) that increased or decreased along TP and TN gradients, along with the nutrient concentrations at which assemblages had substantial changes in the occurrences and relative abundances of OTUs. Boosted regression trees showed that relative abundances of gene sequence reads for OTUs identified by TITAN as low P, high P, low N, or high N diatoms had strong relationships with nutrient concentrations, which provided support for potentially using these groups of diatoms as metrics in monitoring programs. Gradient forest analysis provided complementary information by characterizing multi-taxa assemblage change using multiple predictors and results from random forest models for each OTU. Collectively, these analyses showed that notable changes in diatom assemblage structure and OTUs began around 20 µg TP/L; low P diatoms decreased substantially and community change points occurred from 75 to 150 µg/L; and high P diatoms became increasingly dominant from 150 to 300 µg/L. Diatoms also responded to TN, with large decreases in low N diatoms occurring from 280 to 525 µg TN/L and a transition to dominance by high N diatoms from 525 to 850 µg/L. These diatom responses to TP and TN could be used to inform protection efforts (i.e., anti-degradation) and management goals (i.e., nutrient reduction) in streams and watersheds.
Our results add to the growing support for using diatom metabarcoding in monitoring programs.
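The threshold-detection idea behind analyses like TITAN can be sketched on synthetic data: scan candidate nutrient cutpoints and pick the one that best separates a taxon's relative abundance below vs. above the cut. This is not the TITAN algorithm itself (which uses indicator-value permutation tests across many taxa); it only illustrates the changepoint logic, and all data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic gradient: TP in ug/L, and a "high P" taxon whose relative abundance
# jumps once TP exceeds a true (hidden) threshold of 150 ug/L.
tp = rng.uniform(5, 400, size=300)
abundance = np.where(tp > 150, 0.30, 0.05) + rng.normal(0, 0.02, size=300)

def best_cutpoint(x, y, candidates):
    # Score each candidate threshold by the absolute difference in mean response
    # below vs. above it; return the candidate with the largest separation.
    scores = [abs(y[x > c].mean() - y[x <= c].mean()) for c in candidates]
    return candidates[int(np.argmax(scores))]

cut = best_cutpoint(tp, abundance, candidates=np.linspace(20, 380, 181))
print(cut)  # recovered change point, near the true threshold of 150 ug/L
```

Real applications would add significance testing and separate handling of increaser vs. decreaser taxa, as TITAN does.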