Results 1 - 20 of 39
1.
Ann Neurol ; 90(3): 428-439, 2021 09.
Article in English | MEDLINE | ID: mdl-34216034

ABSTRACT

OBJECTIVE: Among older adults, the ability to stand or walk while performing cognitive tasks (ie, dual-tasking) requires coordinated activation of several brain networks. In this multicenter, double-blinded, randomized, and sham-controlled study, we examined the effects of modulating the excitability of the left dorsolateral prefrontal cortex (L-DLPFC) and the primary sensorimotor cortex (SM1) on dual-task performance "costs" to standing and walking. METHODS: Fifty-seven older adults without overt illness or disease completed 4 separate study visits during which they received 20 minutes of transcranial direct current stimulation (tDCS) optimized to facilitate the excitability of the L-DLPFC and SM1 simultaneously, each region separately, or neither region (sham). Before and immediately after stimulation, participants completed a dual-task paradigm in which they were asked to stand and walk with and without concurrent performance of a serial-subtraction task. RESULTS: tDCS simultaneously targeting the L-DLPFC and SM1, as well as tDCS targeting the L-DLPFC alone, mitigated dual-task costs to standing and walking to a greater extent than tDCS targeting SM1 alone or sham (p < 0.02). Blinding efficacy was excellent, and participants' subjective beliefs about the type of stimulation received (real or sham) did not contribute to the observed functional benefits of tDCS. INTERPRETATION: These results demonstrate that in older adults, dual-task decrements may be amenable to change and implicate L-DLPFC excitability as a modifiable component of the control system that enables dual-task standing and walking. tDCS may be used to improve resilience and the ability of older adults to walk and stand under challenging conditions, potentially enhancing everyday functioning and reducing fall risk. ANN NEUROL 2021;90:428-439.


Subject(s)
Aging/physiology , Gait/physiology , Postural Balance/physiology , Prefrontal Cortex/physiology , Psychomotor Performance/physiology , Transcranial Direct Current Stimulation/methods , Aged , Aged, 80 and over , Double-Blind Method , Female , Humans , Male , Pilot Projects
2.
Am J Public Health ; 112(5): 762-765, 2022 05.
Article in English | MEDLINE | ID: mdl-35324261

ABSTRACT

Objectives. To examine whether COVID-19 vaccine mandates that allow a test-out exemption for nursing home staff are associated with increased staff vaccination rates in nursing homes. Methods. Using National Healthcare Safety Network data, we tested trends over time in statewide staff vaccination rates between June 1, 2021, and August 29, 2021, in Mississippi, 4 adjacent states, and the United States overall. Results. COVID-19 staff vaccination rates increased slowly after Mississippi enacted its vaccinate-or-test-out policy, achieving small but statistically greater gains than most comparator states. Yet staff vaccination rates in Mississippi remained well below the national average and numerically similar to those of surrounding states without mandates. Conclusions. Mississippi's COVID-19 vaccinate-or-test-out policy was ineffective in meaningfully increasing staff vaccination rates. For COVID-19 nursing home mandates to be effective while still balancing staff turnover risks, facilities might consider a more stringent or hybrid approach (e.g., a test-out option not offered to new staff). Public Health Implications. Statewide COVID-19 vaccine mandates that include a test-out option do not appear to be an effective strategy for meaningfully increasing nursing home staff COVID-19 vaccination. (Am J Public Health. 2022;112(5):762-765. https://doi.org/10.2105/AJPH.2022.306800).


Subject(s)
COVID-19 Vaccines , COVID-19 , COVID-19/prevention & control , Humans , Nursing Homes , Policy , SARS-CoV-2 , United States , Vaccination
3.
J Neuroeng Rehabil ; 19(1): 123, 2022 11 11.
Article in English | MEDLINE | ID: mdl-36369027

ABSTRACT

BACKGROUND: In older adults, the extent to which performing a cognitive task when standing diminishes postural control is predictive of future falls and cognitive decline. The neurophysiology of such "dual-tasking" and its effect on postural control (i.e., dual-task cost) in older adults are poorly understood. The purpose of this study was to use electroencephalography (EEG) to examine the effects of dual-tasking when standing on brain activity in older adults. We hypothesized that compared to single-task "quiet" standing, dual-task standing would decrease alpha power, which has been linked to decreased motor inhibition, as well as increase the ratio of theta to beta power, which has been linked to increased attentional control. METHODS: Thirty older adults without overt disease completed four separate visits. Postural sway together with EEG (32-channels) were recorded during trials of standing with and without a concurrent verbalized serial subtraction dual-task. Postural control was measured by average sway area, velocity, and path length. EEG metrics included absolute alpha-, theta-, and beta-band powers as well as theta/beta power ratio, within six demarcated regions-of-interest: the left and right anterior, central, and posterior regions of the brain. RESULTS: Most EEG metrics demonstrated moderate-to-high between-day test-retest reliability (intra-class correlation coefficients > 0.70). Compared with quiet standing, dual-tasking decreased alpha-band power particularly in the central regions bilaterally (p = 0.002) and increased theta/beta power ratio in the anterior regions bilaterally (p < 0.001). A greater increase in theta/beta ratio from quiet standing to dual-tasking in numerous demarcated brain regions correlated with greater dual-task cost (i.e., absolute increase, indicative of worse performance) to postural sway metrics (r = 0.45-0.56, p < 0.01). Lastly, participants who exhibited greater alpha power during dual-tasking in the anterior-right (r = 0.52, p < 0.01) and central-right (r = 0.48, p < 0.01) regions had greater postural sway velocity during dual-tasking. CONCLUSION: In healthy older adults, alpha power and theta/beta power ratio change with dual-task standing. The change in theta/beta power ratio in particular may be related to the ability to regulate standing postural control when simultaneously performing unrelated, attention-demanding cognitive tasks. Modulation of brain oscillatory activity might therefore be a novel target to minimize dual-task cost in older adults.
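As a rough illustration of the analysis described above, the sketch below computes absolute band powers and a theta/beta power ratio for a single EEG channel using Welch's method. The sampling rate, band edges, epoch length, and the use of SciPy are illustrative assumptions, not the study's actual processing pipeline.

```python
# Sketch: absolute band power and theta/beta ratio for one EEG channel.
# Band edges and sampling rate are illustrative assumptions, not the paper's settings.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    """Absolute power in [low, high] Hz via Welch's periodogram."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])  # integrate the PSD over the band

fs = 250                                   # assumed sampling rate (Hz)
eeg = np.random.randn(60 * fs)             # placeholder for a 60-s standing trial
theta = band_power(eeg, fs, 4, 7)
alpha = band_power(eeg, fs, 8, 12)
beta = band_power(eeg, fs, 13, 30)
theta_beta_ratio = theta / beta
print(f"alpha={alpha:.3f}, theta/beta={theta_beta_ratio:.3f}")
```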


Subject(s)
Attention , Postural Balance , Humans , Aged , Reproducibility of Results , Postural Balance/physiology , Attention/physiology , Standing Position , Brain , Cognition/physiology
4.
Ann Neurol ; 87(1): 75-83, 2020 01.
Article in English | MEDLINE | ID: mdl-31693765

ABSTRACT

OBJECTIVE: Symptomatic head trauma associated with American-style football (ASF) has been linked to brain pathology, along with physical and mental distress in later life. However, the longer-term effects of such trauma on objective metrics of cognitive-motor function remain poorly understood. We hypothesized that ASF-related symptomatic head trauma would predict worse gait performance, particularly during dual task conditions (ie, walking while performing an additional cognitive task), in later life. METHODS: Sixty-six retired professional ASF players aged 29 to 75 years completed a health and wellness questionnaire. They also completed a validated smartphone-based assessment in their own homes, during which gait was monitored while they walked normally and while they performed a verbalized serial-subtraction cognitive task. RESULTS: Participants who reported more symptomatic head trauma, defined as the total number of impacts to the head or neck followed by concussion-related symptoms, exhibited greater dual task cost (ie, percentage increase) to stride time variability (ie, the coefficient of variation of mean stride time). Those who reported ≥1 hit followed by loss of consciousness, compared to those who did not, also exhibited greater dual task costs to this metric. Relationships between reported trauma and dual task costs were independent of age, body mass index, National Football League career duration, and history of musculoskeletal surgery. Symptomatic head trauma was not correlated with average stride times in either walking condition. INTERPRETATION: Remote, smartphone-based assessments of dual task walking may be utilized to capture meaningful data sensitive to the long-term impact of symptomatic head trauma in former professional ASF players and other contact sport athletes. ANN NEUROL 2020;87:75-83.
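To make the outcome concrete, here is a minimal sketch of the dual-task cost calculation described above: the coefficient of variation of stride time is computed for each walking condition, and the cost is the percentage increase from single- to dual-task walking. The stride-time values are invented placeholders.

```python
# Sketch: dual-task cost (percent increase) applied to stride-time variability,
# i.e., the coefficient of variation (CV) of stride time. Numbers are illustrative.
import numpy as np

def stride_time_cv(stride_times):
    """Coefficient of variation of stride time, in percent."""
    stride_times = np.asarray(stride_times, dtype=float)
    return 100.0 * stride_times.std(ddof=1) / stride_times.mean()

def dual_task_cost(single_task_value, dual_task_value):
    """Percent change from single- to dual-task; positive = worse performance."""
    return 100.0 * (dual_task_value - single_task_value) / single_task_value

st_walk = [1.10, 1.12, 1.09, 1.11, 1.10]   # stride times (s), normal walking
dt_walk = [1.15, 1.08, 1.20, 1.05, 1.18]   # stride times (s), walking with serial subtraction
cost = dual_task_cost(stride_time_cv(st_walk), stride_time_cv(dt_walk))
print(f"Dual-task cost to stride-time CV: {cost:.1f}%")
```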


Subject(s)
Cognition/physiology , Craniocerebral Trauma/physiopathology , Football/injuries , Gait/physiology , Adult , Aged , Brain Concussion/complications , Brain Concussion/physiopathology , Craniocerebral Trauma/complications , Humans , Male , Middle Aged , Remote Sensing Technology/methods , Retirement , Self Report , Smartphone/statistics & numerical data , Surveys and Questionnaires
5.
Mov Disord ; 36(11): 2693-2698, 2021 11.
Article in English | MEDLINE | ID: mdl-34406695

ABSTRACT

BACKGROUND: Treatments of freezing of gait (FOG) in Parkinson's disease are suboptimal. OBJECTIVE: The aim of this study was to evaluate the effects of multiple sessions of transcranial direct current stimulation (tDCS) targeting the left dorsolateral prefrontal cortex and primary motor cortex (M1) on FOG. METHODS: Seventy-seven individuals with Parkinson's disease and FOG were enrolled in a double-blinded randomized trial. tDCS and sham interventions comprised 10 sessions over 2 weeks followed by five once-weekly sessions. FOG-provoking test performance (primary outcome), functional outcomes, and self-reported FOG severity were assessed. RESULTS: Primary analyses demonstrated no advantage for tDCS in the FOG-provoking test. In secondary analyses, tDCS, compared with sham, decreased self-reported FOG severity and increased daily living step counts. Among individuals with mild-to-moderate FOG severity, tDCS improved FOG-provoking test time and self-report of FOG. CONCLUSIONS: Multisession tDCS targeting the left dorsolateral prefrontal cortex and M1 did not improve laboratory-based FOG-provoking test performance. Improvements observed in participants with mild-to-moderate FOG severity warrant further investigation. © 2021 International Parkinson and Movement Disorder Society.


Subject(s)
Gait Disorders, Neurologic , Motor Cortex , Parkinson Disease , Transcranial Direct Current Stimulation , Double-Blind Method , Gait/physiology , Gait Disorders, Neurologic/complications , Gait Disorders, Neurologic/therapy , Humans , Parkinson Disease/complications , Parkinson Disease/therapy , Prefrontal Cortex
6.
Child Dev ; 91(5): 1563-1576, 2020 09.
Article in English | MEDLINE | ID: mdl-31814133

ABSTRACT

This study used longitudinal cross-lagged modeling to examine reciprocal relations between maternal depression and child behavior problems. Data were drawn from 3,119 children (40% Hispanic, 30% African American, 20% White, and 10% other) from the Family and Child Experiences Survey of 2009 (a nationally representative sample of children served by Head Start). Results documented reciprocal relations between maternal depression and child behavior problems across early childhood (i.e., child age 3-5). Furthermore, the effect of child behavior problems on maternal depression was moderated by child race/ethnicity during children's first year in Head Start, such that the negative effect of child behavior problems on African American mothers' depression was more pronounced compared to Hispanics and other racial/ethnic groups.


Subject(s)
Child Behavior Disorders/epidemiology , Depression/epidemiology , Mother-Child Relations/psychology , Mothers/psychology , Adult , Black or African American/psychology , Black or African American/statistics & numerical data , Child , Child Behavior Disorders/ethnology , Child Behavior Disorders/etiology , Child, Preschool , Cohort Studies , Depression/complications , Depression/ethnology , Depression/psychology , Early Intervention, Educational/statistics & numerical data , Female , Hispanic or Latino/psychology , Hispanic or Latino/statistics & numerical data , Humans , Longitudinal Studies , Male , Mother-Child Relations/ethnology , Mothers/statistics & numerical data , Parenting/psychology , Problem Behavior/psychology , Risk Factors , Surveys and Questionnaires , United States/epidemiology , White People/psychology , White People/statistics & numerical data , Young Adult
7.
Sensors (Basel) ; 20(16)2020 Aug 10.
Article in English | MEDLINE | ID: mdl-32785163

ABSTRACT

Freezing of gait (FOG) is a debilitating motor phenomenon that is common among individuals with advanced Parkinson's disease. Objective and sensitive measures are needed to better quantify FOG. The present work addresses this need by leveraging wearable devices and machine-learning methods to develop and evaluate automated detection of FOG and quantification of its severity. Seventy-one subjects with FOG completed a FOG-provoking test while wearing three wearable sensors (lower back and each ankle). Subjects were videotaped before (OFF state) and after (ON state) they took their antiparkinsonian medications. Annotations of the videos provided the "ground truth" for FOG detection. A leave-one-patient-out validation process with a training set of 57 subjects resulted in 84.1% sensitivity, 83.4% specificity, and 85.0% accuracy for FOG detection. Similar results were seen in an independent test set (data from 14 other subjects). Two derived outcomes, percent time frozen and number of FOG episodes, were associated with self-report of FOG. Both derived metrics were higher in the OFF state than in the ON state and in the most challenging level of the FOG-provoking test compared to the least challenging level. These results suggest that this automated machine-learning approach can objectively assess FOG and that its outcomes are responsive to therapeutic interventions.
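A minimal sketch of a leave-one-patient-out evaluation of a window-level FOG classifier is shown below, with sensitivity, specificity, and accuracy computed from pooled out-of-sample predictions. The feature set, random-forest classifier, and data layout are illustrative assumptions; the paper's actual features and model are not specified here.

```python
# Sketch: leave-one-patient-out evaluation of a window-level FOG classifier.
# Feature extraction, classifier choice, and data layout are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import recall_score, accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))          # placeholder accelerometer features per window
y = rng.integers(0, 2, size=1000)        # 1 = FOG window, 0 = no FOG (video ground truth)
groups = rng.integers(0, 20, size=1000)  # subject ID for each window

y_pred = np.empty_like(y)
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    y_pred[test_idx] = clf.predict(X[test_idx])

sensitivity = recall_score(y, y_pred)               # true-positive rate
specificity = recall_score(y, y_pred, pos_label=0)  # true-negative rate
print(sensitivity, specificity, accuracy_score(y, y_pred))
```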


Subject(s)
Gait Analysis/instrumentation , Gait Disorders, Neurologic , Machine Learning , Parkinson Disease , Wearable Electronic Devices , Aged , Gait Disorders, Neurologic/diagnosis , Humans , Parkinson Disease/diagnosis
8.
N Engl J Med ; 374(19): 1811-21, 2016 May 12.
Article in English | MEDLINE | ID: mdl-27168432

ABSTRACT

BACKGROUND: Currently, the diagnosis of chronic obstructive pulmonary disease (COPD) requires a ratio of forced expiratory volume in 1 second (FEV1) to forced vital capacity (FVC) of less than 0.70 as assessed by spirometry after bronchodilator use. However, many smokers who do not meet this definition have respiratory symptoms. METHODS: We conducted an observational study involving 2736 current or former smokers and controls who had never smoked and measured their respiratory symptoms using the COPD Assessment Test (CAT; scores range from 0 to 40, with higher scores indicating greater severity of symptoms). We examined whether current or former smokers who had preserved pulmonary function as assessed by spirometry (FEV1:FVC ≥0.70 and an FVC above the lower limit of the normal range after bronchodilator use) and had symptoms (CAT score, ≥10) had a higher risk of respiratory exacerbations than current or former smokers with preserved pulmonary function who were asymptomatic (CAT score, <10) and whether those with symptoms had different findings from the asymptomatic group with respect to the 6-minute walk distance, lung function, or high-resolution computed tomographic (HRCT) scan of the chest. RESULTS: Respiratory symptoms were present in 50% of current or former smokers with preserved pulmonary function. The mean (±SD) rate of respiratory exacerbations among symptomatic current or former smokers was significantly higher than the rates among asymptomatic current or former smokers and among controls who never smoked (0.27±0.67 vs. 0.08±0.31 and 0.03±0.21 events, respectively, per year; P<0.001 for both comparisons). Symptomatic current or former smokers, regardless of history of asthma, also had greater limitation of activity, slightly lower FEV1, FVC, and inspiratory capacity, and greater airway-wall thickening without emphysema according to HRCT than did asymptomatic current or former smokers. Among symptomatic current or former smokers, 42% used bronchodilators and 23% used inhaled glucocorticoids. CONCLUSIONS: Although they do not meet the current criteria for COPD, symptomatic current or former smokers with preserved pulmonary function have exacerbations, activity limitation, and evidence of airway disease. They currently use a range of respiratory medications without any evidence base. (Funded by the National Heart, Lung, and Blood Institute and the Foundation for the National Institutes of Health; SPIROMICS ClinicalTrials.gov number, NCT01969344.).
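The classification criteria stated above translate into a simple rule, sketched below for a single participant; the numeric inputs are invented examples.

```python
# Sketch: classifying a current/former smoker as "symptomatic with preserved
# pulmonary function" using the criteria described in the abstract. Values are illustrative.
def preserved_pulmonary_function(fev1_l, fvc_l, fvc_lln_l):
    """Post-bronchodilator FEV1:FVC >= 0.70 and FVC above the lower limit of normal."""
    return (fev1_l / fvc_l) >= 0.70 and fvc_l > fvc_lln_l

def symptomatic(cat_score):
    """COPD Assessment Test score of 10 or more (range 0-40)."""
    return cat_score >= 10

fev1, fvc, fvc_lln, cat = 2.9, 3.9, 3.2, 14   # hypothetical post-bronchodilator values
if preserved_pulmonary_function(fev1, fvc, fvc_lln) and symptomatic(cat):
    print("Symptomatic current/former smoker with preserved pulmonary function")
```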


Subject(s)
Pulmonary Disease, Chronic Obstructive/physiopathology , Smoking/physiopathology , Adult , Aged , Aged, 80 and over , Asthma/complications , Bronchodilator Agents/therapeutic use , Confounding Factors, Epidemiologic , Female , Forced Expiratory Volume , Humans , Lung/diagnostic imaging , Lung/physiology , Male , Middle Aged , Pulmonary Disease, Chronic Obstructive/complications , Pulmonary Disease, Chronic Obstructive/diagnosis , Symptom Assessment , Tomography, X-Ray Computed , Vital Capacity
9.
PLoS Genet ; 12(8): e1006011, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27532455

ABSTRACT

Implementing precision medicine for complex diseases such as chronic obstructive pulmonary disease (COPD) will require extensive use of biomarkers and an in-depth understanding of how genetic, epigenetic, and environmental variations contribute to phenotypic diversity and disease progression. A meta-analysis from two large cohorts of current and former smokers with and without COPD [SPIROMICS (N = 750); COPDGene (N = 590)] was used to identify single nucleotide polymorphisms (SNPs) associated with measurements of 88 blood proteins (protein quantitative trait loci; pQTLs). pQTLs consistently replicated between the two cohorts. Features of pQTLs were compared to previously reported expression QTLs (eQTLs). Inference of causal relations among pQTL genotypes, biomarker measurements, and four clinical COPD phenotypes (airflow obstruction, emphysema, exacerbation history, and chronic bronchitis) was explored using conditional independence tests. We identified 527 highly significant (p < 8 × 10^-10) pQTLs in 38 (43%) of the blood proteins tested. Most pQTL SNPs were novel, with low overlap to eQTL SNPs. The pQTL SNPs explained >10% of measured variation in 13 protein biomarkers, with a single SNP (rs7041; p = 10^-392) explaining 71%-75% of the measured variation in vitamin D binding protein (gene = GC). Some of these pQTLs [e.g., pQTLs for VDBP, sRAGE (gene = AGER), surfactant protein D (gene = SFTPD), and TNFRSF10C] have been previously associated with COPD phenotypes. Most pQTLs were local (cis), but distant (trans) pQTL SNPs in the ABO blood group locus were the top pQTL SNPs for five proteins. The inclusion of pQTL SNPs improved the clinical predictive value for the established association of sRAGE and emphysema, and the explained variance (R^2) for emphysema improved from 0.3 to 0.4 when the pQTL SNP was included in the model along with clinical covariates. Causal modeling provided insight into specific pQTL-disease relationships for airflow obstruction and emphysema. In conclusion, given the frequency of highly significant local pQTLs, the large amount of variance potentially explained by pQTLs, and the differences observed between pQTL and eQTL SNPs, we recommend that protein biomarker-disease association studies take into account the potential effect of common local SNPs and that pQTLs be integrated along with eQTLs to uncover disease mechanisms. Large-scale blood biomarker studies would also benefit from close attention to the ABO blood group.
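As a toy illustration of the incremental variance explained when a pQTL genotype is added to a clinical model (analogous to the reported R^2 change from 0.3 to 0.4 for emphysema), the sketch below compares R^2 with and without a simulated SNP covariate. The data, effect sizes, and use of scikit-learn are assumptions, not the study's modeling approach.

```python
# Sketch: incremental variance explained (R^2) when a pQTL SNP genotype is added
# to a clinical covariate model of a phenotype. Data and effect sizes are simulated.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
clinical = rng.normal(size=(750, 4))            # placeholder clinical covariates
snp = rng.integers(0, 3, size=(750, 1))         # genotype coded 0/1/2 (minor-allele count)
phenotype = clinical @ [0.4, 0.2, 0.1, 0.1] + 0.5 * snp[:, 0] + rng.normal(size=750)

r2_clinical = LinearRegression().fit(clinical, phenotype).score(clinical, phenotype)
X_full = np.hstack([clinical, snp])
r2_full = LinearRegression().fit(X_full, phenotype).score(X_full, phenotype)
print(f"R^2 clinical only = {r2_clinical:.2f}, with pQTL SNP = {r2_full:.2f}")
```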


Subject(s)
Biomarkers/blood , Blood Proteins/genetics , Emphysema/genetics , Pulmonary Disease, Chronic Obstructive/genetics , ABO Blood-Group System/genetics , Emphysema/blood , Emphysema/pathology , Genetic Predisposition to Disease , Genome-Wide Association Study , Genotype , Humans , Polymorphism, Single Nucleotide , Pulmonary Disease, Chronic Obstructive/blood , Pulmonary Disease, Chronic Obstructive/pathology , Quantitative Trait Loci/genetics
10.
Biostatistics ; 18(1): 15-31, 2017 01.
Article in English | MEDLINE | ID: mdl-27335117

ABSTRACT

In the standard analysis of competing risks data, proportional hazards models are fit to the cause-specific hazard functions for all causes on the same time scale. These regression analyses are the foundation for predictions of cause-specific cumulative incidence functions based on combining the estimated cause-specific hazard functions. However, in predictions arising from disease registries, where only subjects with disease enter the database, disease-related mortality may be more naturally modeled on the time since diagnosis time scale while death from other causes may be more naturally modeled on the age time scale. The single time scale methodology may be biased if an incorrect time scale is employed for one of the causes and an alternative methodology is not available. We propose inferences for the cumulative incidence function in which regression models for the cause-specific hazard functions may be specified on different time scales. Using the disease registry data, the analysis of other cause mortality on the age scale requires left truncating the event time at the age of disease diagnosis, complicating the analysis. In addition, standard Martingale theory is not applicable when combining regression models on different time scales. We establish that the covariate conditional predictions are consistent and asymptotically normal using empirical process techniques and propose consistent variance estimators for constructing confidence intervals. Simulation studies show that the proposed two time scales method performs well, outperforming the single time-scale predictions when the time scale is misspecified. The methods are illustrated with stage III colon cancer data obtained from the Surveillance, Epidemiology, and End Results program of National Cancer Institute.
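Schematically, the construction described above places the disease-specific hazard on the time-since-diagnosis scale and the other-cause hazard on the age scale; the display below is a simplified sketch (ignoring covariates), with notation of my own rather than the authors' exact formulation, where a_0 denotes age at diagnosis.

```latex
% Overall survival and cause-1 cumulative incidence when the two cause-specific
% hazards are modeled on different time scales (t = time since diagnosis, a_0 + t = age).
\[
  S(t \mid a_0) = \exp\!\left\{-\int_0^t \big[\lambda_1(u) + \lambda_2(a_0 + u)\big]\,du\right\},
  \qquad
  F_1(t \mid a_0) = \int_0^t S(u \mid a_0)\,\lambda_1(u)\,du .
\]
```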


Subject(s)
Epidemiologic Measurements , Proportional Hazards Models , Registries/statistics & numerical data , Risk Assessment/methods , Humans
11.
Biometrics ; 73(1): 104-113, 2017 03.
Article in English | MEDLINE | ID: mdl-27276276

ABSTRACT

In the competing risks setup, the data for each subject consist of the event time, censoring indicator, and event category. However, sometimes the information about the event category can be missing, as, for example, when the date of death is known but the cause of death is not available. In such situations, treating subjects with a missing event category as censored leads to underestimation of the hazard functions. We suggest nonparametric estimators for the cumulative cause-specific hazards and the cumulative incidence functions which use the Nadaraya-Watson estimator to obtain the contribution of an event with missing category to each of the cause-specific hazards. We derive the properties of the proposed estimators. An optimal bandwidth is determined, which minimizes the mean integrated squared errors of the proposed estimators over time. The methodology is illustrated using data on lung infections in patients from the United States Cystic Fibrosis Foundation Patient Registry.
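One plausible form of the kernel weighting described above is sketched below: an event with a missing category is allocated to cause k in proportion to a Nadaraya-Watson estimate of the probability that an event occurring near that time is of cause k. The notation is illustrative and may differ from the authors' exact estimator.

```latex
% Kernel-smoothed probability that an event near time t is of cause k, estimated
% from events whose category was observed (K_h is a kernel with bandwidth h).
\[
  \hat{\pi}_k(t) \;=\;
  \frac{\sum_i K_h(t - T_i)\, I(\text{event at } T_i,\ \text{cause } k \text{ observed})}
       {\sum_i K_h(t - T_i)\, I(\text{event at } T_i,\ \text{cause observed})},
\]
% so an event at T_j with missing category contributes weight \hat{\pi}_k(T_j)
% to the increment of the cumulative cause-k hazard at T_j.
```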


Subject(s)
Models, Statistical , Risk , Statistics, Nonparametric , Computer Simulation , Data Interpretation, Statistical , Humans , Incidence , Lung Diseases/epidemiology , Registries
12.
Clin Gastroenterol Hepatol ; 13(3): 569-76, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25218670

ABSTRACT

BACKGROUND & AIMS: Nonalcoholic fatty liver disease (NAFLD) has been shown to disproportionately affect Hispanic persons. We examined the prevalence of suspected NAFLD in Hispanic/Latino persons with diverse backgrounds. METHODS: We studied the prevalence of suspected NAFLD among 12,133 persons included in the Hispanic Community Health Study/Study of Latinos. We collected data on levels of aminotransferase, metabolic syndrome (defined by National Cholesterol Education Program-Adult Treatment Panel III guidelines), demographics, and health behaviors. Suspected NAFLD was defined on the basis of increased aminotransferase levels in the absence of serologic evidence for common causes of liver disease or excessive alcohol consumption. In multivariate analyses, data were adjusted for metabolic syndrome, age, acculturation, diet, physical activity, sleep, and levels of education and income. RESULTS: In multivariate analysis, compared with persons of Mexican heritage, persons of Cuban (odds ratio [OR], 0.69; 95% confidence interval [CI], 0.57-0.85), Puerto Rican (OR, 0.67; 95% CI, 0.52-0.87), and Dominican backgrounds (OR, 0.71; 95% CI, 0.54-0.93) had lower rates of suspected NAFLD. Persons of Central American and South American heritage had a similar prevalence of suspected NAFLD compared with persons of Mexican heritage. NAFLD was less common in women than in men (OR, 0.49; 95% CI, 0.40-0.60). Suspected NAFLD was associated with metabolic syndrome and all 5 of its components. CONCLUSIONS: On the basis of an analysis of a large database of health in Latino populations, we found that the prevalence of suspected NAFLD among Hispanic/Latino individuals varies by region of heritage.


Subject(s)
Hispanic or Latino , Non-alcoholic Fatty Liver Disease/epidemiology , Adolescent , Adult , Aged , Female , Humans , Male , Middle Aged , Prevalence , Risk Factors , Young Adult
13.
J Behav Med ; 38(1): 160-70, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25107504

ABSTRACT

Little research has examined associations of social support with diabetes (or other physical health outcomes) in Hispanics, who are at elevated risk. We examined associations between social support and diabetes prevalence in the Hispanic Community Health Study/Study of Latinos Sociocultural Ancillary Study. Participants were 5,181 adults, 18-74 years old, representing diverse Hispanic backgrounds, who underwent a baseline exam with fasting blood draw, oral glucose tolerance test, medication review, and sociodemographic assessment, and a sociocultural exam with functional and structural social support measures. In adjusted analyses, a one standard deviation higher level of structural and functional social support was related to 16% and 15% lower odds, respectively, of having diabetes. Structural and functional support were related to both previously diagnosed diabetes (OR = 0.84 and 0.88, respectively) and newly recognized diabetes prevalence (OR = 0.84 and 0.83, respectively). Higher functional and structural social support are associated with lower diabetes prevalence in Hispanics/Latinos.
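The percentage statements above follow directly from the adjusted odds ratios: for an odds ratio below 1, the percent reduction in odds is one minus the odds ratio, as in the worked example below.

```latex
\[
  \text{percent lower odds} = (1 - \mathrm{OR}) \times 100\%,
  \qquad
  \mathrm{OR} = 0.84 \;\Rightarrow\; 16\% \text{ lower odds},
  \quad
  \mathrm{OR} = 0.85 \;\Rightarrow\; 15\% \text{ lower odds}.
\]
```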


Subject(s)
Diabetes Mellitus/epidemiology , Hispanic or Latino/statistics & numerical data , Social Support , Adolescent , Adult , Aged , Female , Humans , Male , Middle Aged , Prevalence , United States/epidemiology , Young Adult
14.
Biometrics ; 70(2): 441-8, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24446693

ABSTRACT

In HIV-1 clinical trials the interest is often to compare how well treatments suppress the HIV-1 RNA viral load. The current practice in statistical analysis of such trials is to define a single ad hoc composite event which combines information about both the viral load suppression and the subsequent viral rebound, and then analyze the data using standard univariate survival analysis techniques. The main weakness of this approach is that the results of the analysis can be easily influenced by minor details in the definition of the composite event. We propose a straightforward alternative endpoint based on the probability of being suppressed over time, and suggest that treatment differences be summarized using the restricted mean time a patient spends in the state of viral suppression. A nonparametric analysis is based on methods for multiple endpoint studies. We demonstrate the utility of our analytic strategy using a recent therapeutic trial, in which the protocol specified a primary analysis using a composite endpoint approach.
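The proposed summary can be written as the area under the probability-of-suppression curve up to a truncation time, with treatments compared by the difference in these restricted means; the display below is a sketch in my own notation, not the authors' exact formulation.

```latex
% Restricted mean time spent virally suppressed up to truncation time tau in arm g,
% and the between-arm treatment contrast.
\[
  \mu_g(\tau) \;=\; \int_0^{\tau} P_g\big(\text{suppressed at } t\big)\,dt,
  \qquad
  \Delta(\tau) \;=\; \mu_{\mathrm{A}}(\tau) - \mu_{\mathrm{B}}(\tau).
\]
```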


Subject(s)
Biometry/methods , HIV Infections/virology , Models, Statistical , Viral Load , Anti-HIV Agents/therapeutic use , Computer Simulation , Endpoint Determination/statistics & numerical data , HIV Infections/drug therapy , HIV-1/drug effects , Humans , Kaplan-Meier Estimate , RNA, Viral/blood , Randomized Controlled Trials as Topic/statistics & numerical data , Statistics, Nonparametric , Time Factors , Viral Load/drug effects
15.
Stat Med ; 33(2): 181-92, 2014 Jan 30.
Article in English | MEDLINE | ID: mdl-24038032

ABSTRACT

The number needed to treat is a tool often used in clinical settings to illustrate the effect of a treatment. It has been widely adopted in the communication of risks to both clinicians and non-clinicians, such as patients, who are better able to understand this measure than absolute risk or rate reductions. The concept was introduced by Laupacis, Sackett, and Roberts in 1988 for binary data, and extended to time-to-event data by Altman and Andersen in 1999. However, up to the present, there is no definition of the number needed to treat for time-to-event data with competing risks. This paper introduces such a definition using the cumulative incidence function and suggests non-parametric and semi-parametric inferential methods for right-censored time-to-event data in the presence of competing risks. The procedures are illustrated using the data from a breast cancer clinical trial.
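By analogy with Altman and Andersen's survival-based definition, the competing-risks version described above replaces survival differences with cumulative incidence differences for the event of interest; the display below is a sketch consistent with the abstract rather than the authors' exact notation.

```latex
% Number needed to treat at time t based on the cause-specific cumulative
% incidence F_1 of the event of interest in the control and treated groups.
\[
  \mathrm{NNT}(t) \;=\; \frac{1}{\hat{F}_1^{\,\mathrm{control}}(t) - \hat{F}_1^{\,\mathrm{treated}}(t)} .
\]
```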


Subject(s)
Clinical Trials as Topic/methods , Incidence , Risk , Treatment Outcome , Aged , Aged, 80 and over , Antineoplastic Agents, Hormonal/administration & dosage , Breast Neoplasms/surgery , Female , Humans , Neoplasm Recurrence, Local/prevention & control , Tamoxifen/administration & dosage
16.
PLoS Genet ; 7(6): e1002113, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21738480

ABSTRACT

White blood cell (WBC) count is a common clinical measure from complete blood count assays, and it varies widely among healthy individuals. Total WBC count and its constituent subtypes have been shown to be moderately heritable, with heritability estimates varying across cell types. We studied 19,509 subjects from seven cohorts in a discovery analysis, and 11,823 subjects from ten cohorts in replication analyses, to determine genetic factors influencing variability within the normal hematological range for total WBC count and five WBC subtype measures. Cohort-specific data were supplied by the CHARGE, HaemGen, and INGI consortia, as well as independent collaborative studies. We identified and replicated ten associations with total WBC count and five WBC subtypes at seven different genomic loci (total WBC count: 6p21 in the HLA region, 17q21 near ORMDL3, and CSF3; neutrophil count: 17q21; basophil count: 3p21 near RPN1 and C3orf27; lymphocyte count: 6p21, 19p13 at EPS15L1; monocyte count: 2q31 at ITGA4, 3q21, 8q24 in an intergenic region, 9q31 near EDG2), including three previously reported associations and seven novel associations. To investigate functional relationships among variants contributing to variability in the six WBC traits, we utilized gene expression- and pathway-based analyses. We implemented gene-clustering algorithms to evaluate functional connectivity among implicated loci and showed functional relationships across cell types. Gene expression data from whole blood were utilized to show that significant biological consequences can be extracted from our genome-wide analyses, with effect estimates for significant loci from the meta-analyses being highly correlated with proximal gene expression. In addition, collaborative efforts between the groups contributing to this study and related studies conducted by the COGENT and RIKEN groups allowed for the examination of effect homogeneity for genome-wide significant associations across populations of diverse ancestral backgrounds.


Subject(s)
Genetic Loci/genetics , Leukocyte Count , Leukocytes , Phenotype , Genome-Wide Association Study , Humans , Molecular Epidemiology , Multigene Family/genetics , Polymorphism, Single Nucleotide/genetics , Ubiquitin-Protein Ligases/genetics
17.
Am J Med ; 136(12): 1196-1202.e2, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37777143

ABSTRACT

BACKGROUND: Intensive blood pressure lowering prevents major adverse cardiovascular events, but some patients experience serious adverse events. Examining benefit-harm profiles may be more informative than analyzing major adverse cardiovascular events and serious adverse events separately. METHODS: We analyzed data from the Systolic Blood Pressure Intervention Trial (n = 9361), comparing intensive treatment (systolic blood pressure target <120 mm Hg) to standard treatment (<140 mm Hg). A 4-year hierarchical outcome profile was defined for each participant: 1) alive with neither major adverse cardiovascular events nor serious adverse events (most desirable); 2) alive with serious adverse events only; 3) alive with major adverse cardiovascular events only; 4) alive with both events; and 5) deceased (least desirable). We compared 4-year outcome profiles between the treatment groups in the entire population and by frailty subgroups defined using the physical frailty phenotype (non-frail, pre-frail, and frail). RESULTS: The proportion who died was lower with intensive treatment than with standard treatment (5% vs 6%). A higher proportion of the intensive treatment group was alive with serious adverse events and no major adverse cardiovascular events (36% vs 33%), and a lower proportion was alive with both events (6% vs 5%), compared with the standard treatment group. The outcome profiles were more favorable among participants with the physical frailty phenotype who received intensive rather than standard treatment, but outcome profiles were similar between the treatment groups among non-frail or pre-frail participants. CONCLUSIONS: This post hoc proof-of-concept analysis demonstrates the utility of an outcome profile analysis that simultaneously examines the benefit and harm of treatment.
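The five-level hierarchy defined above maps onto a simple classification rule; the sketch below assigns each participant the applicable category (variable names are illustrative).

```python
# Sketch: assigning the 4-year hierarchical outcome profile described in the abstract
# (1 = most desirable, 5 = least desirable). Inputs are illustrative booleans.
def outcome_profile(alive, had_mace, had_sae):
    if not alive:
        return 5          # deceased
    if had_mace and had_sae:
        return 4          # alive with both events
    if had_mace:
        return 3          # alive with major adverse cardiovascular events only
    if had_sae:
        return 2          # alive with serious adverse events only
    return 1              # alive with neither event

print(outcome_profile(alive=True, had_mace=False, had_sae=True))   # -> 2
```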


Subject(s)
Frailty , Hypertension , Humans , Blood Pressure , Hypertension/drug therapy , Antihypertensive Agents/adverse effects , Blood Pressure Determination
18.
J Gerontol A Biol Sci Med Sci ; 78(11): 2145-2151, 2023 10 28.
Article in English | MEDLINE | ID: mdl-37428879

ABSTRACT

BACKGROUND: Dementia severity is unavailable in administrative claims data. We examined whether a claims-based frailty index (CFI) can measure dementia severity in Medicare claims. METHODS: This cross-sectional study included the National Health and Aging Trends Study Round 5 participants with possible or probable dementia whose Medicare claims were available. We estimated the Functional Assessment Staging Test (FAST) scale (range: 3 [mild cognitive impairment] to 7 [severe dementia]) using information from the survey. We calculated CFI (range: 0-1, higher scores indicating greater frailty) using Medicare claims 12 months prior to the participants' interview date. We examined C-statistics to evaluate the ability of the CFI in identifying moderate-to-severe dementia (FAST stage 5-7) and determined the optimal CFI cut-point that maximized both sensitivity and specificity. RESULTS: Of the 814 participants with possible or probable dementia and measurable CFI, 686 (72.2%) patients were ≥75 years old, 448 (50.8%) were female, and 244 (25.9%) had FAST stage 5-7. The C-statistic of CFI to identify FAST stage 5-7 was 0.78 (95% confidence interval: 0.72-0.83), with a CFI cut-point of 0.280, achieving the maximum sensitivity of 76.9% and specificity of 62.8%. Participants with CFI ≥0.280 had a higher prevalence of disability (19.4% vs 58.3%) and dementia medication use (6.0% vs 22.8%) and higher risk of mortality (10.7% vs 26.3%) and nursing home admission (4.5% vs 10.6%) over 2 years than those with CFI <0.280. CONCLUSIONS: Our study suggests that CFI can be useful in identifying moderate-to-severe dementia from administrative claims among older adults with dementia.
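A minimal sketch of locating a cut-point that maximizes sensitivity plus specificity (Youden's J) from an ROC curve is shown below; the simulated CFI values, prevalence, and use of scikit-learn are assumptions, and the printed numbers will not match the study's results.

```python
# Sketch: choosing the CFI cut-point that maximizes sensitivity + specificity
# (Youden's J) and reporting the C-statistic. Data are simulated placeholders.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
severe = rng.integers(0, 2, size=814)                       # 1 = FAST stage 5-7
cfi = np.clip(0.2 + 0.08 * severe + rng.normal(0, 0.06, 814), 0, 1)

auc = roc_auc_score(severe, cfi)                            # C-statistic
fpr, tpr, thresholds = roc_curve(severe, cfi)
j = tpr - fpr                                               # Youden's J = sens + spec - 1
best = np.argmax(j)
print(f"C-statistic={auc:.2f}, cut-point={thresholds[best]:.3f}, "
      f"sensitivity={tpr[best]:.1%}, specificity={1 - fpr[best]:.1%}")
```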


Subject(s)
Dementia , Frailty , Humans , Female , Aged , United States/epidemiology , Male , Frailty/diagnosis , Frailty/epidemiology , Cross-Sectional Studies , Medicare , Frail Elderly , Dementia/diagnosis , Dementia/epidemiology , Dementia/drug therapy
19.
Health Aff (Millwood) ; 42(2): 217-226, 2023 02.
Article in English | MEDLINE | ID: mdl-36745839

ABSTRACT

COVID-19 vaccination and regular testing of nursing home staff have been critical interventions for mitigating COVID-19 outbreaks in US nursing homes. Although implementation of testing has largely been left to nursing home organizations to coordinate, vaccination occurred through a combination of state, federal, and organization efforts. Little research has focused on structural variation in these processes. We examined whether one structural factor, the primary shift worked by staff, was associated with differences in COVID-19 testing rates and odds of vaccination, using staff-level data from a multistate sample of 294 nursing homes. In facility fixed effects analyses, we found that night-shift staff had the lowest testing rates and lowest odds of vaccination, whereas day-shift staff had the highest testing rates and odds of vaccination. These findings highlight the need to coordinate resources and communication evenly across shifts when implementing large-scale processes in nursing homes and other organizations with shift-based workforces.


Subject(s)
COVID-19 , Humans , COVID-19/prevention & control , COVID-19/epidemiology , COVID-19 Testing , COVID-19 Vaccines , Nursing Homes , Vaccination
20.
J Am Geriatr Soc ; 71(6): 1851-1860, 2023 06.
Article in English | MEDLINE | ID: mdl-36883262

ABSTRACT

BACKGROUND: Existing models to predict fall-related injuries (FRI) in nursing homes (NH) focus on hip fractures, yet hip fractures comprise less than half of all FRIs. We developed and validated a series of models to predict the absolute risk of FRIs in NH residents. METHODS: Retrospective cohort study of long-stay US NH residents (≥100 days in the same facility) between January 1, 2016 and December 31, 2017 (n = 733,427) using Medicare claims and Minimum Data Set v3.0 clinical assessments. Predictors of FRIs were selected through LASSO logistic regression in a 2/3 random derivation sample and tested in a 1/3 validation sample. Sub-distribution hazard ratios (HR) and 95% confidence intervals (95% CI) were estimated for 6-month and 2-year follow-up. Discrimination was evaluated via C-statistic, and calibration compared the predicted rate of FRI to the observed rate. To develop a parsimonious clinical tool, we calculated a score using the five strongest predictors in the Fine-Gray model. Model performance was repeated in the validation sample. RESULTS: Mean (Q1, Q3) age was 85.0 (77.5, 90.6) years and 69.6% were women. Within 2 years of follow-up, 43,976 (6.0%) residents experienced ≥1 FRI. Seventy predictors were included in the model. The discrimination of the 2-year prediction model was good (C-index = 0.70), and the calibration was excellent. Calibration and discrimination of the 6-month model were similar (C-index = 0.71). In the clinical tool to predict 2-year risk, the five characteristics included independence in activities of daily living (ADLs) (HR 2.27; 95% CI 2.14-2.41) and a history of non-hip fracture (HR 2.02; 95% CI 1.94-2.12). Performance results were similar in the validation sample. CONCLUSIONS: We developed and validated a series of risk prediction models that can identify NH residents at greatest risk for FRI. In NH, these models should help target preventive strategies.
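The variable-selection step described above can be sketched with an L1-penalized (LASSO) logistic regression and a held-out validation C-statistic, as below. The simulated data, penalty strength, and scikit-learn implementation are assumptions, and this binary-outcome sketch omits the Fine-Gray competing-risk modeling the authors used.

```python
# Sketch: LASSO logistic regression to select FRI predictors in a derivation sample,
# then discrimination (C-statistic) in a held-out validation sample.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
X = rng.normal(size=(5000, 70))            # placeholder resident characteristics
y = rng.integers(0, 2, size=5000)          # 1 = fall-related injury within 2 years

X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=1/3, random_state=0)
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1, max_iter=1000)
lasso.fit(X_dev, y_dev)

selected = np.flatnonzero(lasso.coef_[0])  # predictors with nonzero coefficients
c_stat = roc_auc_score(y_val, lasso.predict_proba(X_val)[:, 1])
print(f"{selected.size} predictors retained; validation C-statistic = {c_stat:.2f}")
```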


Subject(s)
Hip Fractures , Humans , Female , Aged , United States/epidemiology , Male , Retrospective Studies , Accidental Falls , Activities of Daily Living , Medicare , Nursing Homes