Results 1 - 20 of 22
1.
PLoS One; 19(4): e0301260, 2024.
Article in English | MEDLINE | ID: mdl-38557772

ABSTRACT

OBJECTIVE: We assessed equity in the uptake of remote foot temperature monitoring (RTM) for amputation prevention throughout a large, integrated US healthcare system between 2019 and 2021, including comparisons across facilities and between patients enrolled and eligible patients not enrolled in RTM, focusing on the Reach and Adoption dimensions of the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework. MATERIALS AND METHODS: To assess whether there was equitable use of RTM across facilities, we examined distributions of patient demographic, geographic, and facility characteristics across facility RTM use categories (no RTM use, and low, moderate, and high RTM use) among all eligible patients (n = 46,294). Second, to understand whether, among facilities using RTM, there was equitable enrollment of patients in RTM, we compared characteristics of patients enrolled in RTM (n = 1,066) with those of eligible patients not enrolled in RTM (n = 27,166) using logistic regression including all covariates. RESULTS: RTM use increased substantially, from an average of 11 patients per month to over 40 patients per month, between 2019 and 2021. High-use RTM facilities had higher complexity and a lower ratio of patients per podiatrist but did not show consistent evidence of better footcare process measures. Among facilities offering RTM, enrollment varied by age and was inversely associated with Black race (vs. white), low income, living far from specialty care, and being in the highest quartiles of telehealth use prior to enrollment. Enrollment was positively associated with having osteomyelitis, Charcot foot, a partial foot amputation, BMI ≥30 kg/m2, and high outpatient utilization. CONCLUSIONS: RTM growth was concentrated in a small number of higher-resourced facilities, with evidence of lower enrollment among patients who were Black and those who lived farther from specialty care. Future studies are needed to identify and address barriers to uptake of new interventions like RTM, to prevent exacerbating existing ulceration and amputation disparities.
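
The enrolled-versus-eligible comparison in the second aim (a single logistic regression entering all covariates together) has a direct translation. A minimal sketch, assuming a hypothetical analysis file and column names (enrolled, age_group, far_from_specialty, etc.) rather than the study's actual variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical input: one row per eligible patient at a facility using RTM.
df = pd.read_csv("rtm_eligible_cohort.csv")

# Model RTM enrollment on patient characteristics, entering all covariates
# simultaneously as the abstract describes.
model = smf.logit(
    "enrolled ~ C(age_group) + C(race) + low_income + far_from_specialty"
    " + C(telehealth_quartile) + osteomyelitis + charcot_foot"
    " + partial_foot_amp + bmi_ge_30 + high_outpatient_use",
    data=df,
).fit()
print(model.summary())  # exponentiated coefficients are odds ratios
```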


Subject(s)
Telemedicine; Humans; Temperature
2.
Mil Med; 2024 Mar 25.
Article in English | MEDLINE | ID: mdl-38536226

ABSTRACT

INTRODUCTION: The effects of smoking on lung function among post-9/11 Veterans deployed to environments with high levels of ambient particulate matter are incompletely understood. MATERIALS AND METHODS: We analyzed interim data (04/2018-03/2020) from the Veterans Affairs (VA) Cooperative Studies Program #595, "Service and Health Among Deployed Veterans". Veterans with ≥1 land-based deployment enrolled at 1 of 6 regional VA sites completed questionnaires and spirometry. Multivariable linear regression models assessed associations of cigarette smoking (cumulative, deployment-related, and non-deployment-related) with pulmonary function. RESULTS: Among 1,836 participants (mean age 40.7 ± 9.6 years, 88.6% male), 44.8% (n = 822) were ever-smokers (mean age 39.5 ± 9.5 years; 91.2% male). Among ever-smokers, 86% (n = 710) initiated smoking before deployment, while 11% (n = 90) initiated smoking during deployment(s). Smoking intensity was 50% greater during deployment than during other periods (0.75 versus 0.50 packs per day; P < .05), and those with multiple deployments (40.4%) were more likely to smoke during deployment than those with single deployments (82% versus 74%). Total cumulative pack-years (median [IQR] = 3.8 [1, 10]) was inversely associated with post-bronchodilator FEV1 %-predicted (-0.82 [95% CI: -1.25, -0.50] %-predicted per 4 pack-years) and FEV1/FVC %-predicted (-0.54 [95% CI: -0.78, -0.43] %-predicted per 4 pack-years). Deployment-related pack-years demonstrated similar point estimates of association with FEV1 %-predicted (-0.61 [95% CI: -2.28, 1.09]) and FEV1/FVC %-predicted (-1.09 [95% CI: -2.52, 0.50]) as non-deployment-related pack-years (-0.83 [95% CI: -1.26, -0.50] for FEV1 %-predicted; -0.52 [95% CI: -0.73, -0.36] for FEV1/FVC %-predicted). CONCLUSIONS: Although cumulative pack-years smoked were modest in this cohort, an inverse association with pulmonary function was detectable. Deployment-related pack-years had an association with pulmonary function similar to that of non-deployment-related pack-years.
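
Because the reported effects are scaled per 4 pack-years, the analysis amounts to fitting a linear model on pack-years and rescaling the coefficient. A sketch under assumed file and variable names, not the CSP #595 analysis code:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("csp595_interim.csv")  # hypothetical analysis file

# Multivariable linear model of post-bronchodilator FEV1 %-predicted on
# cumulative pack-years, with illustrative adjustment covariates.
fit = smf.ols("fev1_pct_pred ~ pack_years + age + C(sex) + C(site)",
              data=df).fit()

# Rescale the per-pack-year coefficient to the per-4-pack-year effect
# reported in the abstract.
beta, ci = fit.params["pack_years"], fit.conf_int().loc["pack_years"]
print(f"FEV1 %-predicted per 4 pack-years: {4 * beta:.2f} "
      f"(95% CI {4 * ci[0]:.2f}, {4 * ci[1]:.2f})")
```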

3.
Pain; 2024 Jan 02.
Article in English | MEDLINE | ID: mdl-38189184

ABSTRACT

Although many individuals with chronic pain use analgesics, the methods used in many randomized controlled trials (RCTs) do not sufficiently account for confounding by differential post-randomization analgesic use. This may lead to underestimation of average treatment effects and diminished power. We introduce (1) a new measure, the Numeric Rating Scale of Underlying Pain without concurrent Analgesic use (NRS-UP(A)), which can shift the estimand of interest in an RCT to target effects of a treatment on pain intensity in the hypothetical situation where analgesic use was not occurring at the time of outcome assessment; and (2) a new pain construct, an individual's perceived effect of analgesic use on pain intensity (EA). The NRS-UP(A) may be used as a secondary outcome in RCTs of point treatments or nonpharmacologic treatments. Among 662 adults with back pain in primary care, participants' mean value of the NRS-UP(A) among those using analgesics was 1.2 NRS points higher than their value on the conventional pain intensity NRS, reflecting a mean EA value of -1.2 NRS points and a perceived beneficial effect of analgesics. More negative values of EA (ie, greater perceived benefit) were associated with a greater number of analgesics used, but not with pain intensity, analgesic type, or opioid dose. The NRS-UP(A) and EA were significantly associated with future analgesic use 6 months later, but the conventional pain NRS was not. Future research is needed to determine whether the NRS-UP(A), used as a secondary outcome, may allow pain RCTs to target alternative estimands with clinical relevance.
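
The relationship among the conventional NRS, the NRS-UP(A), and EA implied by the reported means can be written out explicitly (notation ours, not the authors'):

```latex
% EA: an individual's perceived effect of analgesic use on pain intensity,
% in points on the 0-10 NRS; negative values indicate perceived benefit.
\[
  \mathrm{EA} \;=\; \mathrm{NRS}_{\text{conventional}} \;-\; \text{NRS-UP(A)}
\]
% Worked example from the reported means: if the NRS-UP(A) exceeds the
% conventional NRS by 1.2 points, EA = -1.2, a perceived analgesic benefit.
```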

4.
Occup Environ Med; 81(2): 59-65, 2024 Feb 02.
Article in English | MEDLINE | ID: mdl-37968126

ABSTRACT

OBJECTIVES: Characterise inhalational exposures during deployment to Afghanistan and Southwest Asia and associations with postdeployment respiratory symptoms. METHODS: Participants (n=1960) in this cross-sectional study of US Veterans (Veterans Affairs Cooperative Study 'Service and Health Among Deployed Veterans') completed an interviewer-administered questionnaire regarding 32 deployment exposures, grouped a priori into six categories: burn pit smoke; other combustion sources; engine exhaust; mechanical and desert dusts; toxicants; and military job-related vapours, gas, dusts or fumes (VGDF). Responses were scored ordinally (0, 1, 2) according to exposure frequency. Factor analysis supported item reduction and category consolidation, yielding 28 exposure items in 5 categories. Generalised linear models with a logit link tested associations with symptoms (by respiratory health questionnaire), adjusting for other covariates. ORs were scaled per 20-point score increment (normalised maximum=100). RESULTS: The cohort mean age was 40.7 years, with a median deployment duration of 11.7 months. Heavy exposure to multiple inhalational sources was commonly reported, including burn pit smoke (72.7%) and VGDF (72.0%). The prevalence of dyspnoea, chronic bronchitis and wheeze in the past 12 months was 7.3%, 8.2% and 15.6%, respectively. Burn pit smoke exposure was associated with dyspnoea (OR 1.22; 95% CI 1.06 to 1.47) and chronic bronchitis (OR 1.22; 95% CI 1.13 to 1.44). Exposure to VGDF was associated with dyspnoea (OR 1.29; 95% CI 1.14 to 1.58) and wheeze (OR 1.18; 95% CI 1.02 to 1.35). CONCLUSION: Exposures to burn pit smoke and military occupational VGDF during deployment were associated with increased odds of chronic respiratory symptoms among US Veterans.
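
The ordinal scoring and 20-point scaling described in the methods can be reproduced in a few lines. A sketch with placeholder items, not the study instrument:

```python
import numpy as np
import pandas as pd

def category_score(responses: pd.DataFrame) -> pd.Series:
    """Sum ordinal item responses (0, 1, 2) within an exposure category and
    normalise so the maximum possible score is 100."""
    return 100 * responses.sum(axis=1) / (2 * responses.shape[1])

# Example: a hypothetical 6-item category for three participants.
items = pd.DataFrame(np.array([[2, 2, 1, 0, 2, 1],
                               [0, 0, 0, 1, 0, 0],
                               [2, 2, 2, 2, 2, 2]]))
print(category_score(items).round(1).tolist())  # [66.7, 8.3, 100.0]
# Reported ORs correspond to a 20-point increment on this 0-100 scale.
```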


Subject(s)
Bronchitis, Chronic; Occupational Exposure; Veterans; Humans; Adult; Bronchitis, Chronic/epidemiology; Bronchitis, Chronic/etiology; Occupational Exposure/adverse effects; Cross-Sectional Studies; Environmental Exposure/adverse effects; Smoke; Dyspnea/epidemiology; Dyspnea/etiology; Gases/analysis; Dust
5.
Open Forum Infect Dis; 10(7): ofad330, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37484899

ABSTRACT

Background: Over 870 000 severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections have occurred among Veterans Health Administration users, and 24 000 have resulted in death. We examined early outcomes of SARS-CoV-2 infection in hospitalized veterans. Methods: In an ongoing, prospective cohort study, we enrolled veterans age ≥18 tested for SARS-CoV-2 and hospitalized at 15 Department of Veterans Affairs medical centers between February 2021 and June 2022. We estimated adjusted odds ratios (aORs), adjusted incidence rate ratios (aIRRs), and adjusted hazard ratios (aHRs) for maximum illness severity within 30 days of study entry (defined using the 4-category VA Severity Index for coronavirus disease 2019 [COVID-19]), as well as length of hospitalization and rehospitalization within 60 days, in relationship with demographic characteristics, Charlson comorbidity index (CCI), COVID-19 vaccination, and calendar period of enrollment. Results: The 542 participants included 329 (61%) who completed a primary vaccine series (with or without booster; "vaccinated"), 292 (54%) enrolled as SARS-CoV-2-positive, and 503 (93%) men, with a mean age of 64.4 years. High CCI scores (≥5) occurred in 61 (44%) vaccinated and 29 (19%) unvaccinated SARS-CoV-2-positive participants. Severe illness or death occurred in 29 (21%; 6% died) vaccinated and 31 (20%; 2% died) unvaccinated SARS-CoV-2-positive participants. SARS-CoV-2-positive inpatients per unit increase in CCI had greater multivariable-adjusted odds of severe illness (aOR, 1.21; 95% CI, 1.01-1.45), more hospitalization days (aIRR, 1.06; 95% CI, 1.03-1.10), and rehospitalization (aHR, 1.07; 95% CI, 1.01-1.12). Conclusions: In a cohort of hospitalized US veterans with SARS-CoV-2 infection, those with a higher CCI had more severe COVID-19 illness, more hospital days, and rehospitalization, after adjusting for vaccination status, age, sex, and calendar period.

6.
Diabetes Care; 46(8): 1464-1468, 2023 08 01.
Article in English | MEDLINE | ID: mdl-37319007

ABSTRACT

OBJECTIVE: We evaluated the effectiveness of remote foot temperature monitoring (RTM) in the Veterans Affairs health care system. RESEARCH DESIGN AND METHODS: We conducted a retrospective cohort study that included 924 eligible patients enrolled in RTM between 2019 and 2021 who were matched up to 3:1 to 2,757 nonenrolled comparison patients. We used conditional Cox regression to estimate adjusted cause-specific hazard ratios (aHRs) and corresponding 95% CIs for lower-extremity amputation (LEA) as the primary outcome and all-cause hospitalization and death as secondary outcomes. RESULTS: RTM was not associated with LEA incidence (aHR 0.92, 95% CI 0.62-1.37) or all-cause hospitalization (aHR 0.97, 95% CI 0.82-1.14) but was inversely associated (reduced risk) with death (aHR 0.63, 95% CI 0.49-0.82). CONCLUSIONS: This study does not provide support that RTM reduces the risk of LEA or all-cause hospitalization in individuals with a history of diabetic foot ulcer. Randomized controlled trials can overcome important limitations.
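
A matched cohort (up to 3:1) analyzed with conditional Cox regression corresponds to a Cox model stratified on the matched set. A minimal sketch using lifelines, with assumed column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("rtm_matched.csv")  # hypothetical matched analysis file
# time: days to LEA or censoring; event: 1 = LEA (the cause-specific
# outcome); rtm: 1 = enrolled; matched_set: identifier of the matched group.

cph = CoxPHFitter()
# Stratifying on matched_set conditions the partial likelihood on each
# matched group, approximating the conditional Cox model in the abstract.
cph.fit(df[["time", "event", "rtm", "matched_set"]],
        duration_col="time", event_col="event", strata="matched_set")
cph.print_summary()  # exp(coef) for rtm is the adjusted hazard ratio
```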


Subject(s)
Delivery of Health Care, Integrated; Diabetic Foot; Humans; Retrospective Studies; Temperature; Diabetic Foot/surgery; Diabetic Foot/epidemiology; Amputation, Surgical; Risk Factors
7.
medRxiv; 2023 Jun 05.
Article in English | MEDLINE | ID: mdl-37333215

ABSTRACT

Individual treatments for chronic low back pain (CLBP) have small-magnitude effects. Combining different types of treatments may produce larger effects. This study used a 2×2 factorial randomized controlled trial (RCT) design to combine procedural and behavioral treatments for CLBP. The study aims were to: (1) assess the feasibility of conducting a factorial RCT of these treatments; and (2) estimate the individual and combined treatment effects of (a) lumbar radiofrequency ablation (LRFA) of the dorsal ramus medial branch nerves (vs. a simulated LRFA control procedure) and (b) an Activity Tracker-Informed Video-Enabled Cognitive Behavioral Therapy program for CLBP (AcTIVE-CBT) (vs. an educational control treatment) on back-related disability at 3 months post-randomization. Participants (n=13) were randomized in a 1:1:1:1 ratio. Feasibility goals included an enrollment proportion ≥30%, a randomization proportion ≥80%, and a ≥80% proportion of randomized participants completing the 3-month Roland-Morris Disability Questionnaire (RMDQ) primary outcome endpoint. An intent-to-treat analysis was used. The enrollment proportion was 62%, the randomization proportion was 81%, and all randomized participants completed the primary outcome. Though not statistically significant, there was a beneficial, moderate-magnitude effect of LRFA vs. control on 3-month RMDQ (-3.25 RMDQ points; 95% CI: -10.18, 3.67). There was a significant, beneficial, large-magnitude effect of AcTIVE-CBT vs. control (-6.29; 95% CI: -10.97, -1.60). Though not statistically significant, there was a beneficial, large effect of LRFA+AcTIVE-CBT vs. control (-8.37; 95% CI: -21.47, 4.74). We conclude that it is feasible to conduct an RCT combining procedural and behavioral treatments for CLBP.
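
In a 2×2 factorial trial, the three contrasts reported above can be read off a single interaction model. A hedged sketch with illustrative variable names, not the trial's analysis code:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("factorial_pilot.csv")  # hypothetical trial data
# rmdq_3mo: 3-month Roland-Morris score; lrfa, cbt: 0/1 indicators for
# active LRFA and AcTIVE-CBT under the 1:1:1:1 randomization.

# With treatment (0/1) coding:
#   lrfa coefficient              -> LRFA vs. control
#   cbt coefficient               -> AcTIVE-CBT vs. control
#   lrfa + cbt + lrfa:cbt summed  -> combined treatment vs. double control
fit = smf.ols("rmdq_3mo ~ lrfa * cbt", data=df).fit()
print(fit.params)
```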

8.
J Exp Med; 220(7), 2023 07 03.
Article in English | MEDLINE | ID: mdl-37058141

ABSTRACT

Distinct CD4+ T cell epitopes have been associated with spontaneous control of HIV-1 replication, but analysis of antigen-dependent factors that influence epitope selection is lacking. To examine these factors, we used a cell-free antigen processing system that incorporates soluble HLA-DR (DR1), HLA-DM (DM), cathepsins, and full-length protein antigens for epitope identification by LC-MS/MS. HIV-1 Gag, Pol, Env, Vif, Tat, Rev, and Nef were examined using this system. We identified 35 novel epitopes, including glycopeptides. Epitopes from smaller HIV-1 proteins mapped to regions of low protein stability and higher solvent accessibility. HIV-1 antigens associated with limited CD4+ T cell responses were processed efficiently, while some protective epitopes were processed inefficiently. Of the epitopes obtained from cell-free processing, 55% induced memory CD4+ T cell responses in HIV-1+ donors, including eight of 19 novel epitopes tested. Thus, an in vitro processing system utilizing the components of Class II processing reveals factors influencing epitope selection of HIV-1 and represents an approach to understanding epitope selection from non-HIV-1 antigens.


Subject(s)
HIV Infections; Vaccines; Humans; Antigen Presentation; Chromatography, Liquid; Tandem Mass Spectrometry; Epitopes, T-Lymphocyte; Antigens, Viral
9.
J Pain; 24(2): 332-344, 2023 02.
Article in English | MEDLINE | ID: mdl-36220482

ABSTRACT

The 0 to 10 numeric rating scale of pain intensity is a standard outcome in randomized controlled trials (RCTs) of pain treatments. For individuals taking analgesics, there may be a disparity between "observed" pain intensity (pain intensity with concurrent analgesic use) and pain intensity without concurrent analgesic use (what the numeric rating scale would be had analgesics not been taken). Using a contemporary causal inference framework, we compare analytic methods that can potentially account for concurrent analgesic use, first in statistical simulations, and second in analyses of real (non-simulated) data from an RCT of lumbar epidural steroid injections. The default analytic method was ignoring analgesic use, which is the most common approach in pain RCTs. Compared to ignoring analgesic use and other analytic methods, simulations showed that a quantitative pain and analgesia composite outcome based on adding 1.5 points to pain intensity for those who were taking an analgesic (the QPAC1.5) optimized power and minimized bias. Analyses of real RCT data supported the results of the simulations, showing greater power with analysis of the QPAC1.5 as compared to ignoring analgesic use and most other methods examined. We propose alternative methods that should be considered in the analysis of pain RCTs. PERSPECTIVE: This article presents the conceptual framework behind a new quantitative pain and analgesia composite outcome, the QPAC1.5, and the results of statistical simulations and analyses of trial data supporting improvements in power and bias using the QPAC1.5. Methods of this type should be considered in the analysis of pain RCTs.
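
The QPAC1.5 itself is simple to compute as described: add 1.5 points to the observed 0-10 pain NRS for participants taking an analgesic. A sketch with assumed column names:

```python
import pandas as pd

def qpac(nrs: pd.Series, analgesic_use: pd.Series,
         penalty: float = 1.5) -> pd.Series:
    """Quantitative pain and analgesia composite outcome."""
    return nrs + penalty * analgesic_use.astype(float)

df = pd.DataFrame({"nrs": [4, 7, 2], "analgesic": [1, 0, 1]})
df["qpac_1_5"] = qpac(df["nrs"], df["analgesic"])
print(df)  # composite values 5.5, 7.0, 3.5 replace the raw NRS outcome
```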


Subject(s)
Analgesics, Opioid; Analgesics; Humans; Randomized Controlled Trials as Topic; Analgesics/therapeutic use; Pain Management/methods; Pain/drug therapy; Pain, Postoperative/drug therapy
10.
BMC Musculoskelet Disord; 23(1): 376, 2022 Apr 21.
Article in English | MEDLINE | ID: mdl-35449043

ABSTRACT

BACKGROUND: Although it is generally accepted that physical activity and flares of low back pain (LBP) are related, evidence for the directionality of this association is mixed. The Flares of Low back pain with Activity Research Study (FLAReS) takes a novel approach to distinguish the short-term effects of specific physical activities on LBP flares from the cumulative effects of such activities, by conducting a longitudinal case-crossover study nested within a cohort study. The first aim is to estimate the short-term effects (≤ 24 h) of specific physical activities on LBP flares among Veterans in primary care in the Veterans Affairs healthcare system. The second aim is to estimate the cumulative effects of specific activities on LBP-related functional limitations at 1-year follow-up. METHODS: Up to 550 adults of working age (18-65 years) seen for LBP in primary care complete up to 36 "Scheduled" surveys over 1-year follow-up, and also complete unscheduled "Flare Window" surveys after the onset of new flares. Each survey asks about current flares and other factors associated with LBP. Surveys also inquire about activity exposures over the 24 h and 2 h prior to the time of survey completion (during non-flare periods) or prior to the time of flare onset (during flares). Other questions evaluate the number, intensity, duration, and/or other characteristics of activity exposures. Other exposures include factors related to mood, lifestyle, exercise, concurrent treatments, and injuries. Some participants wear actigraphy devices for weeks 1-4 of the study. The first aim will examine associations between 10 specific activity categories and participant-reported flares over 1-year follow-up. The second aim will examine associations between the frequency of exposure to 10 activity categories over weeks 1-4 of follow-up and long-term functional limitations at 12 months. All analyses will use a biopsychosocial framework accounting for potential confounders and effect modifiers. DISCUSSION: FLAReS will provide empirically derived estimates of both the short-term and cumulative effects of specific physical activities for Veterans with LBP, helping to better understand the role of physical activities in those with LBP. TRIAL REGISTRATION: ClinicalTrials.gov NCT04828330, registered April 2, 2021.
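
For the first (case-crossover) aim, one standard analysis is conditional logistic regression comparing exposure in flare-onset windows with exposure in the same person's non-flare windows; the abstract does not name the estimator, so this is an assumption. A sketch under assumed column names, not the FLAReS analysis code:

```python
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("flares_windows.csv")  # hypothetical window-level file
# flare: 1 = flare-onset window, 0 = scheduled non-flare window;
# lifting_24h: illustrative 0/1 exposure in the prior 24 h;
# participant_id: the within-person matching factor.

model = ConditionalLogit(df["flare"], df[["lifting_24h"]],
                         groups=df["participant_id"]).fit()
print(model.summary())  # exp(coef) is the within-person odds ratio
```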


Subject(s)
Low Back Pain; Adolescent; Adult; Aged; Cohort Studies; Cross-Over Studies; Exercise; Humans; Low Back Pain/diagnosis; Low Back Pain/epidemiology; Low Back Pain/therapy; Middle Aged; Surveys and Questionnaires; Young Adult
11.
Menopause; 28(10): 1125-1129, 2021 07 26.
Article in English | MEDLINE | ID: mdl-34313612

ABSTRACT

OBJECTIVE: Hormone therapy (HT) is used by menopausal women to treat vasomotor symptoms. Venous thromboembolism (VTE) is an important risk of HT use, and more knowledge on the comparative safety of different estrogenic compounds is useful for women who use HT for these symptoms. The objective was to compare the risk of VTE among users of oral conjugated equine estrogen (CEE), oral estradiol (E2), and transdermal E2, in a cohort of women veterans. METHODS: This retrospective cohort study included all women veterans aged 40 to 89 years, using CEE or E2, without prior VTE, between 2003 and 2011. All incident VTE events were adjudicated. Time-to-event analyses using a time-varying HT exposure evaluated the relative VTE risk between estrogen subtypes, with adjustment for age, race, and body mass index, with stratification for prevalent versus incident use of HT. RESULTS: Among 51,571 users of HT (74.5% CEE, 12.6% oral, and 12.9% transdermal E2 at cohort entry), with a mean age of 54.0 years, the incidence of VTE was 1.9/1,000 person-years. Compared with CEE use, in the multivariable regression model, there was no difference in the risk of incident VTE associated with oral E2 use (hazard ratio 0.96, 95% CI 0.64-1.46) or with transdermal E2 use (hazard ratio 0.95, 95% CI 0.60-1.49). Results were unchanged when restricting to incident users of HT. CONCLUSIONS: Among women veterans, the risk of VTE was similar in users of oral CEE, oral E2, and transdermal E2. These findings do not confirm the previously observed greater safety of transdermal and oral E2 over CEE.
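
A time-varying HT exposure in a Cox model is typically handled with long-format data, one row per interval of constant exposure. A sketch using lifelines' CoxTimeVaryingFitter, with assumed column names:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.read_csv("ht_vte_long.csv")  # hypothetical long-format file
# One row per interval of constant exposure per woman: start/stop times,
# vte (1 if the interval ends in an adjudicated VTE), 0/1 indicators
# oral_e2 and transdermal_e2 (oral CEE as reference), and covariates.

ctv = CoxTimeVaryingFitter()
ctv.fit(df[["id", "start", "stop", "vte", "oral_e2", "transdermal_e2",
            "age", "black_race", "bmi"]],
        id_col="id", event_col="vte", start_col="start", stop_col="stop")
ctv.print_summary()  # exp(coef) for each E2 indicator is the HR vs. CEE
```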


Subject(s)
Estrogen Replacement Therapy; Veterans; Administration, Cutaneous; Administration, Oral; Estrogen Replacement Therapy/adverse effects; Estrogens/adverse effects; Estrogens, Conjugated (USP); Female; Humans; Middle Aged; Postmenopause; Retrospective Studies
12.
Mol Ther Methods Clin Dev; 21: 670-680, 2021 Jun 11.
Article in English | MEDLINE | ID: mdl-34141822

ABSTRACT

Despite their exceptional capacity for transgene delivery ex vivo, lentiviral (LV) vectors have been slow to demonstrate clinical utility in the context of in vivo applications. Unresolved safety concerns related to broad LV vector tropism have limited LV vectors to ex vivo applications. Here, we report on a novel LV vector-pseudotyping strategy involving envelope glycoproteins of Tupaia paramyxovirus (TPMV) engineered to specifically target human cell-surface receptors. LV vectors pseudotyped with the TPMV hemagglutinin (H) protein bearing the interleukin (IL)-13 ligand in concert with the TPMV fusion (F) protein allowed efficient transduction of cells expressing the human IL-13 receptor alpha 2 (IL-13Rα2). Immunodeficient mice bearing orthotopically implanted human IL-13Rα2 expressing NCI-H1299 non-small cell lung cancer cells were injected intravenously with a single dose of LV vector pseudotyped with the TPMV H-IL-13 glycoprotein. Vector biodistribution was monitored using bioluminescence imaging of firefly luciferase transgene expression, revealing specific transduction of tumor tissue. A quantitative droplet digital PCR (ddPCR) analysis of lung tissue samples revealed a >15-fold increase in the tumor transduction in mice treated with LV vectors displaying IL-13 relative to those without IL-13. Our results show that TPMV envelope glycoproteins can be equipped with ligands to develop targeted LV vectors for in vivo applications.

14.
Mol Ther Methods Clin Dev; 18: 631-638, 2020 Sep 11.
Article in English | MEDLINE | ID: mdl-32775497

ABSTRACT

The use of the human embryonic kidney (HEK) 293T cell line to manufacture vectors for in vivo applications raises safety concerns due to the presence of SV40 T antigen-encoding sequences. We used CRISPR-Cas9 genome editing to remove the SV40 T antigen-encoding sequences from HEK293T cells by transfecting them with a recombinant plasmid expressing Cas9 and two distinct single guide RNAs (sgRNAs) corresponding to the beginning and end of the T antigen coding region. Cell clones lacking T antigen-encoding sequences were identified using PCR. Whole-genome (WG) and targeted locus amplification (TLA) sequencing of the parental HEK293T cell line revealed multiple SV40 T antigen-encoding sequences replacing cellular sequences on chromosome 3. The putative T antigen null clones demonstrated a loss of sequence reads mapping to T antigen-encoding sequences. Western blot analysis of cell extracts prepared from the T antigen null clones confirmed that the SV40 large and small T antigen proteins were absent. Lentiviral vectors produced using the T antigen null clones exhibited titers up to 1.5 × 10^7 transducing units (TU)/mL, while the titers obtained from the parent HEK293T cell line were up to 4 × 10^7 TU/mL. The capacity of the T antigen-negative cells to produce high-titer adeno-associated virus (AAV) vectors was also evaluated. The results revealed that the lack of T antigen sequences did not impact AAV vector titers.

15.
Proc Natl Acad Sci U S A; 117(27): 15763-15771, 2020 07 07.
Article in English | MEDLINE | ID: mdl-32571938

ABSTRACT

HIV-1 latency is a major barrier to cure. Identification of small molecules that destabilize latency and allow immune clearance of infected cells could lead to treatment-free remission. In vitro models of HIV-1 latency involving cell lines or primary cells have been developed for characterization of HIV-1 latency and high-throughput screening for latency-reversing agents (LRAs). We have shown that the majority of LRAs identified to date are relatively ineffective in cells from infected individuals despite activity in model systems. We show here that, for diverse LRAs, latency reversal observed in model systems involves a heat shock factor 1 (HSF1)-mediated stress pathway. Small-molecule inhibition of HSF1 attenuated HIV-1 latency reversal by histone deacetylase inhibitors, protein kinase C agonists, and proteasome inhibitors without interfering with the known mechanism of action of these LRAs. However, latency reversal by second mitochondria-derived activator of caspase (SMAC) mimetics was not affected by inhibition of HSF1. In cells from infected individuals, inhibition of HSF1 attenuated latency reversal by phorbol ester+ionomycin but not by anti-CD3+anti-CD28. HSF1 promotes elongation of HIV-1 RNA by recruiting P-TEFb to the HIV-1 long terminal repeat (LTR), and we show that inhibition of HSF1 attenuates the formation of elongated HIV-1 transcripts. We demonstrate that in vitro models of latency have higher levels of the P-TEFb subunit cyclin T1 than primary cells, which may explain why many LRAs are functional in model systems but relatively ineffective in primary cells. Together, these studies provide insights into why particular LRA combinations are effective in reversing latency in cells from infected individuals.


Subject(s)
HIV Infections/genetics; HIV-1/genetics; Heat Shock Transcription Factors/genetics; Virus Latency/genetics; Anti-HIV Agents/pharmacology; Apoptosis Regulatory Proteins/genetics; Cyclin T/genetics; HIV Infections/virology; HIV-1/pathogenicity; Heat Shock Transcription Factors/antagonists & inhibitors; Histone Deacetylase Inhibitors/pharmacology; Humans; Mitochondrial Proteins/genetics; Positive Transcriptional Elongation Factor B/genetics; Protein Kinase C/genetics; RNA, Viral/drug effects; RNA, Viral/genetics; Small Molecule Libraries/pharmacology; Terminal Repeat Sequences/genetics; Virus Activation/genetics
16.
Diabetes Care; 43(5): 1033-1040, 2020 05.
Article in English | MEDLINE | ID: mdl-32161048

ABSTRACT

OBJECTIVE: To assess whether the risk of subsequent lower-limb amputations and death following an initial toe amputation among individuals with diabetes has changed over time and varies by demographic characteristics and geographic region. RESEARCH DESIGN AND METHODS: Using Veterans Health Administration (VHA) electronic medical records from 1 October 2004 to 30 September 2016, we determined risk of subsequent ipsilateral minor and major amputation within 1 year after an initial toe/ray amputation among veterans with diabetes. To assess changes in the annual rate of subsequent amputation over time, we estimated age-adjusted incidence of minor and major subsequent ipsilateral amputation for each year, separately for African Americans (AAs) and whites. Geographic variation was assessed across VHA markets (n = 89) using log-linear Poisson regression models adjusting for age and ethnoracial category. RESULTS: Among 17,786 individuals who had an initial toe amputation, 34% had another amputation on the same limb within 1 year, including 10% who had a major ipsilateral amputation. Median time to subsequent ipsilateral amputation (minor or major) was 36 days. One-year risk of subsequent major amputation decreased over time, but risk of subsequent minor amputation did not. Risk of subsequent major ipsilateral amputation was higher in AAs than whites. After adjusting for age and ethnoracial category, 1-year risk of major subsequent amputation varied fivefold across VHA markets. CONCLUSIONS: Nearly one-third of individuals require reamputation following an initial toe amputation, although risks of subsequent major ipsilateral amputation have decreased over time. Nevertheless, risks remain particularly high for AAs and vary substantially geographically.
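
The geographic comparison (log-linear Poisson regression across 89 VHA markets, adjusted for age and ethnoracial category) has a direct translation. A sketch with placeholder variable names:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("toe_amp_cohort.csv")  # hypothetical analysis file
# subsequent_major: 0/1 within 1 year of the initial toe/ray amputation;
# market: 1 of 89 VHA markets; age_group, ethnoracial: adjustment factors.

fit = smf.poisson(
    "subsequent_major ~ C(market) + C(age_group) + C(ethnoracial)",
    data=df,
).fit()

# Exponentiated market coefficients are adjusted rate ratios against the
# reference market; their spread quantifies the geographic variation.
print(np.exp(fit.params.filter(like="market")))
```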


Subject(s)
Amputation, Surgical/statistics & numerical data; Diabetes Mellitus/surgery; Diabetic Foot/surgery; Reoperation/statistics & numerical data; Toes/surgery; Aged; Amputation, Surgical/methods; Diabetic Foot/epidemiology; Female; History, 20th Century; History, 21st Century; Humans; Incidence; Male; Middle Aged; Military Personnel/statistics & numerical data; Reoperation/methods; Risk Factors; United States/epidemiology; Veterans/statistics & numerical data
17.
Sci Transl Med; 12(528), 2020 01 29.
Article in English | MEDLINE | ID: mdl-31996465

ABSTRACT

The latent reservoir of HIV-1 in resting CD4+ T cells is a major barrier to cure. It is unclear whether the latent reservoir resides principally in particular subsets of CD4+ T cells, a finding that would have implications for understanding its stability and developing curative therapies. Recent work has shown that proliferation of HIV-1-infected CD4+ T cells is a major factor in the generation and persistence of the latent reservoir and that latently infected T cells that have clonally expanded in vivo can proliferate in vitro without producing virions. In certain CD4+ memory T cell subsets, the provirus may be in a deeper state of latency, allowing the cell to proliferate without producing viral proteins, thus permitting escape from immune clearance. To evaluate this possibility, we used a multiple stimulation viral outgrowth assay to culture resting naïve, central memory (TCM), transitional memory (TTM), and effector memory (TEM) CD4+ T cells from 10 HIV-1-infected individuals on antiretroviral therapy. On average, only 1.7% of intact proviruses across all T cell subsets were induced to transcribe viral genes and release replication-competent virus after stimulation of the cells. We found no consistent enrichment of intact or inducible proviruses in any T cell subset. Furthermore, we observed notable plasticity among the canonical memory T cell subsets after activation in vitro and saw substantial person-to-person variability in the inducibility of infectious virus release. This finding complicates the vision for a targeted approach for HIV-1 cure based on T cell memory subsets.


Subject(s)
CD4-Positive T-Lymphocytes/immunology; HIV-1/immunology; Immunologic Memory; Proviruses/immunology; T-Lymphocyte Subsets/immunology; Adult; Cell Differentiation/genetics; Cell Proliferation/drug effects; DNA, Viral/blood; DNA, Viral/genetics; Gene Expression Regulation, Viral; HIV-1/genetics; HIV-1/growth & development; Humans; Lymphocyte Activation/immunology; Lymphocyte Count; Phenotype; Phylogeny; Transcription, Genetic; Virus Replication/genetics
18.
Ann Thorac Surg; 109(6): 1782-1788, 2020 06.
Article in English | MEDLINE | ID: mdl-31706873

ABSTRACT

BACKGROUND: The ability of handoff redesign to improve short-term outcomes is well established, yet an effective approach for achieving widespread adoption is unknown. An implementation science-based approach capable of influencing the leading indicators of widespread adoption was used to redesign handoffs from the cardiac operating room to the intensive care unit. METHODS: A transdisciplinary, unit-based team used a 12-step implementation process. The steps were divided into 4 phases: planning, engaging, executing, and evaluating. Based on unit-determined best practices, a "handoff bundle" was designed. This included team training, structured education with video illustration, and cognitive aids. Fidelity and acceptability were measured before, during, and after the handoff bundle was deployed. RESULTS: Redesign and implementation of the handoff process occurred over 12 months. Multiple rapid-cycle process improvements led to reductions in the handoff duration from 12.6 minutes to 10.7 minutes (P < .014). Fidelity to unit-determined handoff best practices was assessed based on a sample of the cardiac surgery population preimplementation and postimplementation. Twenty-three handoff best practices (information and tasks) demonstrated improvements compared with the preimplementation period. Provider satisfaction scores 2.5 years after implementation remained high compared with the redesign phase (87 vs. 84; P = .133). CONCLUSIONS: The use of an implementation-based approach for handoff redesign can be effective for improving the leading indicators of successful adoption of a structured handoff process. Future quality improvement studies addressing sustainability and widespread adoption of this approach appear to be warranted and should include the relationships to improved care coordination and reduced preventable medical errors.


Subject(s)
Cardiac Surgical Procedures; Coronary Care Units/organization & administration; Implementation Science; Patient Care Team/standards; Patient Handoff/organization & administration; Quality Improvement; Aged; Female; Follow-Up Studies; Humans; Male; Operating Rooms/standards; Patient Transfer/methods; Retrospective Studies
19.
Cell Host Microbe; 26(1): 73-85.e4, 2019 07 10.
Article in English | MEDLINE | ID: mdl-31295427

ABSTRACT

Evaluation of HIV cure strategies is complicated by defective proviruses that persist in ART-treated patients but are irrelevant to cure. Non-human primates (NHP) are essential for testing cure strategies. However, the persisting proviral landscape in ART-treated NHPs is uncharacterized. Here, we describe viral genomes persisting in ART-treated, simian immunodeficiency virus (SIV)-infected NHPs, simian-human immunodeficiency virus (SHIV)-infected NHPs, and humans infected with HIV-2, an SIV-related virus. The landscapes of persisting SIV, SHIV, and HIV-2 genomes are also dominated by defective sequences. However, there was a significantly higher fraction of intact SIV proviral genomes compared to ART-treated HIV-1 or HIV-2 infected humans. Compared to humans with HIV-1, SIV-infected NHPs had more hypermutated genomes, a relative paucity of clonal SIV sequences, and a lower frequency of deleted genomes. Finally, we report an assay for measuring intact SIV genomes which may have value in cure research.


Subject(s)
Anti-Retroviral Agents/therapeutic use; Genetic Variation; HIV Infections/drug therapy; HIV-1/drug effects; HIV-2/drug effects; Simian Acquired Immunodeficiency Syndrome/drug therapy; Simian Immunodeficiency Virus/drug effects; Animals; Defective Viruses/genetics; Genome, Viral; HIV Infections/virology; HIV-1/classification; HIV-1/genetics; HIV-2/classification; HIV-2/genetics; Humans; Macaca mulatta; Proviruses/genetics; Simian Acquired Immunodeficiency Syndrome/virology; Simian Immunodeficiency Virus/classification; Simian Immunodeficiency Virus/genetics
20.
Am J Physiol Renal Physiol; 316(6): F1114-F1123, 2019 06 01.
Article in English | MEDLINE | ID: mdl-30908934

ABSTRACT

Little is known about the population genetics of water balance. A recent meta-genome-wide association study on plasma sodium concentration identified novel loci of high biological plausibility, yet heritability of the phenotype has never been convincingly shown in European ancestry. The present study linked the Vietnam Era Twin Registry with the Department of Veterans Affairs VistA patient care clinical database. Participants (n = 2,370, 59.6% monozygotic twins and 40.4% dizygotic twins) had a median of seven (interquartile range: 3-14) plasma sodium determinations between October 1999 and March 2017. Heritability of the mean plasma sodium concentration among all twins was 0.41 (95% confidence interval: 0.35-0.46) and 0.49 (95% confidence interval: 0.43-0.54) after exclusion of 514 twins with only a single plasma sodium determination. Heritability among Caucasian (n = 1,958) and African-American (n = 268) twins was 0.41 (95% confidence interval: 0.34-0.47) and 0.36 (95% confidence interval: 0.17-0.52), respectively. Exclusion of data from twins who had been prescribed medications known to impact systemic water balance had no effect. The ability of the present study to newly detect substantial heritability across multiple racial groups was potentially a function of the cohort size and relatedness, exclusion of sodium determinations confounded by elevated plasma glucose and/or reduced glomerular filtration rate, transformation of plasma sodium for the independent osmotic effect of plasma glucose, and use of multiple laboratory determinations per individual over a period of years. Individual-level plasma sodium concentration exhibited longitudinal stability (i.e., individuality); the degree to which individual-level means differed from the population mean was substantial, irrespective of the number of determinations. In aggregate, these data establish the heritability of plasma sodium concentration in European ancestry and corroborate its individuality.
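
For intuition, the classical twin design relates heritability to the MZ/DZ intraclass correlations via Falconer's approximation; the study's actual variance-components model may differ. A sketch with hypothetical correlations chosen to reproduce the reported heritability of 0.41:

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Heritability from twin-pair correlations: h2 = 2 * (rMZ - rDZ)."""
    return 2 * (r_mz - r_dz)

# Hypothetical intraclass correlations, not values from the study.
print(round(falconer_h2(r_mz=0.48, r_dz=0.275), 2))  # 0.41
```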


Subject(s)
Genetic Heterogeneity; Heredity; Sodium/blood; Twins, Dizygotic/genetics; Twins, Monozygotic/genetics; Veterans; Water-Electrolyte Balance/genetics; Black or African American/genetics; Biological Variation, Individual; Databases, Factual; Genetics, Population; Glomerular Filtration Rate/genetics; Humans; Male; Middle Aged; Registries; United States; White People/genetics