Results 1 - 20 of 1,426
1.
Multivariate Behav Res ; : 1-24, 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38963381

ABSTRACT

Psychologists leverage longitudinal designs to examine the causal effects of a focal predictor (i.e., treatment or exposure) over time. But causal inference of naturally observed time-varying treatments is complicated by treatment-dependent confounding in which earlier treatments affect confounders of later treatments. In this tutorial article, we introduce psychologists to an established solution to this problem from the causal inference literature: the parametric g-computation formula. We explain why the g-formula is effective at handling treatment-dependent confounding. We demonstrate that the parametric g-formula is conceptually intuitive, easy to implement, and well-suited for psychological research. We first clarify that the parametric g-formula essentially utilizes a series of statistical models to estimate the joint distribution of all post-treatment variables. These statistical models can be readily specified as standard multiple linear regression functions. We leverage this insight to implement the parametric g-formula using lavaan, a widely adopted R package for structural equation modeling. Moreover, we describe how the parametric g-formula may be used to estimate a marginal structural model whose causal parameters parsimoniously encode time-varying treatment effects. We hope this accessible introduction to the parametric g-formula will equip psychologists with an analytic tool to address their causal inquiries using longitudinal data.
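The g-formula described above can be illustrated with a toy example. The sketch below is a nonparametric version with hypothetical binary variables and made-up conditional probabilities; none of these numbers come from the article, which implements the parametric version with regression models in lavaan (R). The key idea is the same: average the outcome over the confounder distribution while holding treatment fixed, letting the time-varying confounder depend on earlier treatment.

```python
# Minimal sketch of the (nonparametric) g-computation formula for a
# two-time-point binary treatment with treatment-dependent confounding.
# All conditional probabilities below are hypothetical illustration values.

# P(L0 = 1): baseline confounder
p_l0 = 0.5

# P(L1 = 1 | A0, L0): time-varying confounder affected by earlier treatment
p_l1 = {(0, 0): 0.3, (0, 1): 0.5, (1, 0): 0.6, (1, 1): 0.8}

# E[Y | A0, A1, L0, L1]: outcome mean given treatments and confounders
e_y = {(a0, a1, l0, l1): 0.2 + 0.15 * a0 + 0.15 * a1 + 0.1 * l0 + 0.1 * l1
       for a0 in (0, 1) for a1 in (0, 1) for l0 in (0, 1) for l1 in (0, 1)}

def g_formula(a0, a1):
    """E[Y^{a0,a1}]: average over the confounder distribution while holding
    treatment fixed at (a0, a1) -- the g-formula standardization step."""
    total = 0.0
    for l0 in (0, 1):
        pl0 = p_l0 if l0 == 1 else 1 - p_l0
        for l1 in (0, 1):
            pl1 = p_l1[(a0, l0)] if l1 == 1 else 1 - p_l1[(a0, l0)]
            total += pl0 * pl1 * e_y[(a0, a1, l0, l1)]
    return total

# Average effect of "always treat" vs "never treat"; note it exceeds the
# sum of the direct coefficients (0.30) because A0 also raises Y through L1.
ate = g_formula(1, 1) - g_formula(0, 0)
print(round(ate, 4))  # → 0.33
```

Conditioning on L1 in a single regression would block part of the effect of A0; the g-formula instead models L1 and averages over it, which is why it handles treatment-dependent confounding.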

2.
Int J Epidemiol ; 53(4)2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38961645

ABSTRACT

BACKGROUND: Perceived discrimination in health care settings can have adverse consequences on mental health in minority groups. However, the association between perceived discrimination and mental health is prone to unmeasured confounding. This study aims to quantitatively evaluate the influence of unmeasured confounding on this association using g-estimation. METHODS: In a predominantly African American cohort, we applied g-estimation to estimate the association between perceived discrimination and mental health, adjusted and unadjusted for measured confounders. Mental health was measured using clinical diagnoses of anxiety, depression and bipolar disorder. Perceived discrimination was measured as the number of patient-reported discrimination events in health care settings. Measured confounders included demographic, socioeconomic, residential and health characteristics. The influence of confounding was denoted as α1 from g-estimation. We compared α1 for measured and unmeasured confounding. RESULTS: Strong associations between perceived discrimination in health care settings and mental health outcomes were observed. For anxiety, the odds ratios (95% confidence intervals) unadjusted and adjusted for measured confounders were 1.30 (1.21, 1.39) and 1.26 (1.17, 1.36), respectively. The α1 for measured confounding was -0.066. Unmeasured confounding with α1 = 0.200, over three times that of measured confounding, corresponds to an odds ratio of 1.12 (1.01, 1.24). Similar results were observed for other mental health outcomes. CONCLUSION: Unmeasured confounding three times as strong as the measured confounding was not enough to explain away the association between perceived discrimination and mental health, suggesting that this association is robust to unmeasured confounding. This study provides a novel framework for quantitatively evaluating unmeasured confounding.


Subject(s)
Black or African American , Confounding Factors, Epidemiologic , Mental Health , Adult , Aged , Female , Humans , Male , Middle Aged , Anxiety/epidemiology , Anxiety/psychology , Bipolar Disorder/psychology , Bipolar Disorder/ethnology , Black or African American/psychology , Black or African American/statistics & numerical data , Cohort Studies , Depression/epidemiology , Depression/psychology , Depression/ethnology , Mental Disorders/epidemiology , Racism/psychology , Racism/statistics & numerical data , Perceived Discrimination
3.
Metabolomics ; 20(4): 71, 2024 Jul 07.
Article in English | MEDLINE | ID: mdl-38972029

ABSTRACT

BACKGROUND AND OBJECTIVE: Blood-based small molecule metabolites offer easy accessibility and hold significant potential for insights into health processes, the impact of lifestyle, and genetic variation on disease, enabling precise risk prevention. In a prospective study with records of heart failure (HF) incidence, we present metabolite profiling data from individuals without HF at baseline. METHODS: We uncovered the interconnectivity of metabolites using data-driven and causal networks augmented with polygenic factors. Exploring the networks, we identified metabolite broadcasters, receivers, mediators, and subnetworks corresponding to functional classes of metabolites, and provided insights into the link between metabolomic architecture and regulation in health. We incorporated the network structure into the identification of metabolites associated with HF to control the effect of confounding metabolites. RESULTS: We identified metabolites associated with higher and lower risk of HF incidence, such as glycine, ureidopropionic and glycocholic acids, and LPC 18:2. These associations were not confounded by the other metabolites due to uncovering the connectivity among metabolites and adjusting each association for the confounding metabolites. Examples of our findings include the direct influence of asparagine on glycine, both of which were inversely associated with HF. These two metabolites were influenced by polygenic factors and only essential amino acids, which are not synthesized in the human body and are obtained directly from the diet. CONCLUSION: Metabolites may play a critical role in linking genetic background and lifestyle factors to HF incidence. Revealing the underlying connectivity of metabolites associated with HF strengthens the findings and facilitates studying complex conditions like HF.


Subject(s)
Heart Failure , Metabolomics , Heart Failure/metabolism , Humans , Metabolomics/methods , Male , Female , Prospective Studies , Middle Aged , Metabolome , Aged , Metabolic Networks and Pathways
4.
J Am Stat Assoc ; 119(546): 1019-1031, 2024.
Article in English | MEDLINE | ID: mdl-38974187

ABSTRACT

We introduce a simple diagnostic test for assessing the overall or partial goodness of fit of a linear causal model with errors being independent of the covariates. In particular, we consider situations where hidden confounding is potentially present. We develop a method and discuss its capability to distinguish between covariates that are confounded with the response by latent variables and those that are not. Thus, we provide a test and methodology for partial goodness of fit. The test is based on comparing a novel higher-order least squares principle with ordinary least squares. In spite of its simplicity, the proposed method is extremely general and is also proven to be valid for high-dimensional settings. Supplementary materials for this article are available online.

5.
Eur J Endocrinol ; 191(1): E1-E4, 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38872400

ABSTRACT

Propensity score methods are popular to control for confounding in observational biomedical studies of risk factors or medical treatments. This paper focused on aspects of propensity score methods that often remain undiscussed, including unmeasured confounding, missing data, variable selection, statistical efficiency, estimands, the positivity assumption, and predictive performance of the propensity score model.
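As a minimal illustration of the propensity score weighting idea this paper discusses, the sketch below simulates a single binary confounder and applies inverse probability of treatment weighting (IPTW) using the true, known propensity score. All parameter values are hypothetical; real analyses must estimate the propensity score and face the issues the paper lists (unmeasured confounding, positivity, variable selection).

```python
import random

random.seed(0)

# Hypothetical simulation: a binary confounder x drives both treatment a
# and outcome y, so the naive mean difference is biased. Weighting each
# subject by the inverse of its treatment probability recovers the true
# causal effect of 1.0.
n = 100_000
data = []
for _ in range(n):
    x = random.random() < 0.5
    ps = 0.8 if x else 0.2          # propensity score P(A=1 | X=x)
    a = random.random() < ps
    y = 1.0 * a + 2.0 * x + random.gauss(0, 1)
    data.append((x, a, y, ps))

def mean(vals):
    return sum(vals) / len(vals)

# Naive comparison of treated vs untreated means (confounded)
naive = (mean([y for x, a, y, ps in data if a])
         - mean([y for x, a, y, ps in data if not a]))

# Horvitz-Thompson style IPTW estimator of the average treatment effect
ipw1 = mean([a * y / ps for x, a, y, ps in data])
ipw0 = mean([(1 - a) * y / (1 - ps) for x, a, y, ps in data])
iptw = ipw1 - ipw0

print(f"naive={naive:.2f}  iptw={iptw:.2f}")
```

The naive estimate lands near 2.2 because treated subjects are disproportionately drawn from the high-x stratum; the weighted estimate is close to the true effect of 1.0.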


Subject(s)
Propensity Score , Humans , Observational Studies as Topic/methods , Confounding Factors, Epidemiologic , Data Interpretation, Statistical , Models, Statistical
6.
Obes Res Clin Pract ; 18(3): 189-194, 2024.
Article in English | MEDLINE | ID: mdl-38866643

ABSTRACT

BACKGROUND: The relationship between body mass index (BMI) and outcomes in the acute care setting is controversial, with evidence suggesting that obesity is either protective (the so-called obesity paradox) or associated with worse outcomes. The purpose of this study was to assess whether BMI was related to frailty and biological age, and whether BMI remained predictive of mortality after adjusting for frailty and biological age. SUBJECTS: Of the 2950 patients who had a biological age estimated on admission to the intensive care unit, 877 (30%) also had BMI and frailty data available for further analysis in this retrospective cohort study. METHODS: The biological age of each patient was estimated using the Levine PhenoAge model, based on the results of nine blood tests reflective of DNA methylation. Biological age in excess of chronological age was then indexed to the local study context by linear regression to generate residuals. The associations between BMI, the clinical frailty scale, and the residuals were first analyzed using univariable analyses. Their associations with mortality were then assessed by multivariable analysis, including a 3-knot restricted cubic spline function to allow for non-linearity. RESULTS: Both frailty (p = 0.003) and the residuals of biological age (p = 0.001) were related to BMI in a U-shaped fashion. BMI was not related to hospital mortality, but both frailty (p = 0.015) and the residuals of biological age (OR per decade older than chronological age 1.50, 95% confidence interval [CI] 1.04-2.18; p = 0.031) were predictive of mortality after adjusting for chronological age, diabetes mellitus, and severity of acute illness. CONCLUSIONS: BMI was significantly associated with both frailty and biological age in a U-shaped fashion, but only the latter two were related to mortality. These results may, in part, explain why the obesity paradox could be observed in some studies.


Subject(s)
Body Mass Index , Critical Illness , Frailty , Intensive Care Units , Obesity , Humans , Male , Female , Critical Illness/mortality , Retrospective Studies , Aged , Middle Aged , Intensive Care Units/statistics & numerical data , Obesity/complications , Obesity/mortality , Obesity/physiopathology , Aged, 80 and over , Age Factors , Adult
7.
Front Vet Sci ; 11: 1402981, 2024.
Article in English | MEDLINE | ID: mdl-38835899

ABSTRACT

This study summarizes a presentation at the symposium for the Calvin Schwabe Award for Lifetime Achievement in Veterinary Epidemiology and Preventive Medicine, which was awarded to the first author. As epidemiologists, we are taught that "correlation does not imply causation." While true, identifying causes is a key objective for much of the research that we conduct. There is empirical evidence that veterinary epidemiologists are conducting observational research with the intent to identify causes; many studies include control for confounding variables, and causal language is often used when interpreting study results. Frameworks for studying causes include the articulation of specific hypotheses to be tested, approaches for the selection of variables, methods for statistical estimation of the relationship between the exposure and the outcome, and interpretation of that relationship as causal. When comparing observational studies in veterinary populations to those conducted in human populations, the application of each of these steps differs substantially. The a priori identification of exposure-outcome pairs of interest is less common in observational studies in the veterinary literature than in the human literature, and prior knowledge is used to select confounding variables in most observational studies in human populations, whereas data-driven approaches are the norm in veterinary populations. The consequences of not having a defined exposure-outcome hypothesis of interest and using data-driven analytical approaches include an increased probability of biased results and poor replicability of results. A discussion by the community of researchers on current approaches to studying causes in observational studies in veterinary populations is warranted.

8.
Cureus ; 16(5): e60193, 2024 May.
Article in English | MEDLINE | ID: mdl-38868240

ABSTRACT

Background Immunosuppressants are administered in various combinations to prevent immune-induced transplant rejection in patients with liver transplant, as each immunosuppressant acts on different cellular sites. However, the use of multiple immunosuppressants also increases the risk for adverse events. Therefore, it is desirable to reduce the types of immunosuppressants administered without increasing the incidence of transplant rejection. The effectiveness of prednisone avoidance has been suggested, although this was not based on statistical significance in many instances. To definitively establish the effectiveness of prednisone avoidance, a statistically significant difference from a prednisone-use group should be demonstrated. Additionally, the effectiveness of prednisone avoidance might vary depending on the combination of other immunosuppressants administered. It has therefore been considered necessary to investigate, for various immunosuppressant combinations, the administration patterns in which prednisone avoidance is effective. Objectives This study aimed to investigate the effectiveness of prednisone avoidance in patients with liver transplant and discuss the results based on statistically significant differences. Methods Data from the U.S. Food and Drug Administration Adverse Event Reporting System (FAERS) were obtained. In studying immunosuppressant combinations, it was essential to control for confounding. Thus, the immunosuppressant combinations, excluding prednisone, were kept the same in the two groups being compared (prednisone-use and prednisone-avoidance groups). The large sample from FAERS allowed for those various immunosuppressant combinations to be compared. Comparisons of transplant rejection in the prednisone-use and prednisone-avoidance groups used the reporting odds ratio (ROR) and the adjusted ROR (aROR), which controlled for differences in patient background. 
Results With the prednisone-use groups being set as the reference, ROR and aROR were calculated for the prednisone-avoidance groups. Various immunosuppressant combinations were evaluated, and in four patterns - (1) the combination of prednisone and tacrolimus, (2) the combination of prednisone, cyclosporine, and tacrolimus, (3) the combination of prednisone, tacrolimus, and basiliximab, and (4) the combination of prednisone and everolimus - both the ROR and the aROR for transplant rejection in the prednisone-avoidance group were significantly <1.000. Conclusions This study identified effective immunosuppressant combinations for prednisone avoidance that were not associated with increased transplant rejection. The evidence supporting the effectiveness of prednisone avoidance is strengthened when combined with results from previous studies.

9.
Eur J Epidemiol ; 2024 Jun 16.
Article in English | MEDLINE | ID: mdl-38879863

ABSTRACT

Epidemiological researchers often examine associations between risk factors and health outcomes in non-experimental designs. Observed associations may be causal or confounded by unmeasured factors. Sibling and co-twin control studies account for familial confounding by comparing exposure levels among siblings (or twins). If the exposure-outcome association is causal, the siblings should also differ regarding the outcome. However, such studies may sometimes introduce more bias than they alleviate. Measurement error in the exposure may bias results and lead to erroneous conclusions that truly causal exposure-outcome associations are confounded by familial factors. The current study used Monte Carlo simulations to examine bias due to measurement error in sibling control models when the observed exposure-outcome association is truly causal. The results showed that decreasing exposure reliability and increasing sibling-correlations in the exposure led to deflated exposure-outcome associations and inflated associations between the family mean of the exposure and the outcome. The risk of falsely concluding that causal associations were confounded was high in many situations. For example, when exposure reliability was 0.7 and the observed sibling-correlation was r = 0.4, about 30-90% of the samples (n = 2,000) provided results supporting a false conclusion of confounding, depending on how p-values were interpreted as evidence for a family effect on the outcome. The current results have practical importance for epidemiological researchers conducting or reviewing sibling and co-twin control studies and may improve our understanding of observed associations between risk factors and health outcomes. We have developed an app (SibSim) providing simulations of many situations not presented in this paper.
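The bias mechanism this abstract describes can be reproduced in a few lines. The sketch below is a simplified, hypothetical version of such a simulation (not the authors' SibSim app): a truly causal exposure effect (beta = 1.0), a shared family component in the exposure, and exposure measurement error giving reliability 0.7. The within-sibling-pair estimate is attenuated more strongly than the ordinary between-individual estimate, which can be misread as familial confounding.

```python
import random

random.seed(1)

# Hypothetical parameters: total true exposure variance 1, half of it a
# shared family component; measurement error sized to give reliability 0.7.
beta = 1.0
fam_var = 0.5
ind_var = 1.0 - fam_var
err_var = (1 - 0.7) / 0.7          # reliability = 1 / (1 + err_var) = 0.7

pairs = []
for _ in range(50_000):
    f = random.gauss(0, fam_var ** 0.5)        # shared family component
    sibs = []
    for _ in range(2):
        x = f + random.gauss(0, ind_var ** 0.5)    # true exposure
        w = x + random.gauss(0, err_var ** 0.5)    # observed exposure
        y = beta * x + random.gauss(0, 1)          # truly causal outcome
        sibs.append((w, y))
    pairs.append(sibs)

def slope(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

# Ordinary estimate, pooling all individuals
all_w = [w for sibs in pairs for w, y in sibs]
all_y = [y for sibs in pairs for w, y in sibs]
between = slope(all_w, all_y)

# Within-pair estimate, regressing sibling differences
dw = [s[0][0] - s[1][0] for s in pairs]
dy = [s[0][1] - s[1][1] for s in pairs]
within = slope(dw, dy)

print(f"between={between:.2f}  within={within:.2f}")
```

Differencing within pairs removes the shared exposure variance but not the measurement error, so the error-to-signal ratio worsens and the within estimate drops below the between estimate even though the effect is entirely causal.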

10.
Front Aging ; 5: 1389789, 2024.
Article in English | MEDLINE | ID: mdl-38873125

ABSTRACT

No clear consensus has emerged from the literature on the gene expression changes that occur in human whole blood with age. In this study we compared whole blood ageing genes from the published literature with data on gene specificity for leukocyte subtypes. Surprisingly, we found that highly ranked ageing genes were predominantly expressed by naïve T cells, with limited expression from more common cell types. Highly ranked ageing genes were also more likely to show decreased expression with age. Taken together, it is plausible that many of the observed gene expression changes in whole blood reflect the decline in abundance of naïve T cells known to occur with age, rather than changes in transcription rates in common cell types. Correct attribution of the gene expression changes that occur with age is essential for understanding the underlying mechanisms.

11.
Biometrics ; 80(2)2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38884127

ABSTRACT

The marginal structural quantile model (MSQM) provides a unique lens to understand the causal effect of a time-varying treatment on the full distribution of potential outcomes. Under the semiparametric framework, we derive the efficient influence function for the MSQM, from which a new doubly robust estimator is proposed for point estimation and inference. We show that the doubly robust estimator is consistent if either of the models associated with treatment assignment or the potential outcome distributions is correctly specified, and is semiparametric efficient if both models are correct. To implement the doubly robust MSQM estimator, we propose to solve a smoothed estimating equation to facilitate efficient computation of the point and variance estimates. In addition, we develop a confounding function approach to investigate the sensitivity of several MSQM estimators when the sequential ignorability assumption is violated. Extensive simulations are conducted to examine the finite-sample performance characteristics of the proposed methods. We apply the proposed methods to Yale New Haven Health System electronic health record data to study the effect of antihypertensive medications on patients with severe hypertension and assess the robustness of the findings to unmeasured baseline and time-varying confounding.


Subject(s)
Computer Simulation , Hypertension , Models, Statistical , Humans , Hypertension/drug therapy , Antihypertensive Agents/therapeutic use , Electronic Health Records/statistics & numerical data , Biometry/methods
12.
Front Pharmacol ; 15: 1357567, 2024.
Article in English | MEDLINE | ID: mdl-38903996

ABSTRACT

Introduction: Antihypertensive drugs are used preventatively to lower the risk of cardiovascular disease events. Comparative effectiveness studies on angiotensin-converting enzyme inhibitors (ACEIs), angiotensin II receptor blockers (ARBs), beta-blockers (BBs), calcium channel blockers (CCBs), and thiazides have yielded inconsistent results and given little consideration to patient adherence. Using a longitudinal cohort and considering time-varying adherence and confounding factors, we aimed to estimate the real-world effectiveness of five major antihypertensive drug monotherapies in the primary prevention of cardiovascular events. Methods: Eligible patients for a retrospective inception cohort study were selected using information obtained from the University of Groningen IADB.nl pharmacy prescription database. Cohort 1 comprised adherent patients with a follow-up time exceeding 1 year, and cohort 2 comprised all patients independent of adherence. The exposures were ACEIs, ARBs, BBs, CCBs, and thiazides. The primary outcome was the time to the first prescription for an acute cardiac drug therapy (CDT) measured using valid drug proxies to identify the first major cardiovascular event. A per-protocol analytical approach was adopted with inverse probability of treatment weighted (IPTW), time-varying Cox regression analysis to obtain the hazard ratios (HRs) and 95% confidence intervals (CIs). Results: In cohort 1 (n = 22,441), 1,294 patients (5.8%) were prescribed an acute CDT with an average follow-up time of 4.2 ± 2.8 years. Following IPTW, the hazard measures of ARBs and thiazides were lower than those of BBs (HRs: 0.79 and 0.80, respectively; 95% CIs: 0.64-0.97 and 0.69-0.94, respectively). Among drug-treated diabetic patients, the hazard measures were even lower, with HR point estimates of 0.43 (CI: 0.19-0.98) for ARBs and 0.32 (CI: 0.13-0.82) for thiazides. 
In cohort 2 (n = 33,427) and sensitivity analysis, the comparative effectiveness results for thiazides and BBs were similar to those for cohort 1. Conclusion: The findings of this real-world analysis suggest that the incidence of CDT associated with long-term thiazide or ARB monotherapy is lower than the incidence of CDT with BBs, notably among high-risk patients. Incidences of CDT associated with ACEIs and CCBs were comparable relative to those associated with BBs.

13.
BMC Public Health ; 24(1): 1601, 2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38879521

ABSTRACT

BACKGROUND: Cardiovascular disease (CVD) is the leading cause of death worldwide. It has been known for some considerable time that radiation is associated with excess risk of CVD. A recent systematic review of radiation and CVD highlighted substantial inter-study heterogeneity in effect, possibly a result of confounding or modification of the radiation effect by non-radiation factors, in particular by the major lifestyle/environmental/medical risk factors and latent period. METHODS: We assessed the effects of confounding by lifestyle/environmental/medical risk factors on radiation-associated CVD and investigated evidence for modifying effects of these variables on the CVD radiation dose-response, using data assembled for a recent systematic review. RESULTS: There are 43 epidemiologic studies that are informative on the effects of adjustment for confounding or risk-modifying factors on radiation-associated CVD. Of these, 22 were studies of groups exposed to substantial doses of medical radiation for therapy or diagnosis. The remaining 21 studies were of groups exposed at much lower levels of dose and/or dose rate. Only four studies suggest substantial effects of adjustment for lifestyle/environmental/medical risk factors on the radiation risk of CVD; however, there were also substantial uncertainties in the estimates in all of these studies. There are fewer suggestions of effects that modify the radiation dose response; only two studies, both at lower levels of dose, report the most serious level of modifying effect. CONCLUSIONS: There are still large uncertainties about confounding factors or lifestyle/environmental/medical variables that may influence radiation-associated CVD, although indications are that there are not many studies in which there are substantial confounding effects of these risk factors.


Subject(s)
Cardiovascular Diseases , Life Style , Humans , Cardiovascular Diseases/etiology , Cardiovascular Diseases/epidemiology , Confounding Factors, Epidemiologic , Environmental Exposure/adverse effects , Risk Factors
14.
Am J Epidemiol ; 2024 May 31.
Article in English | MEDLINE | ID: mdl-38825336

ABSTRACT

BACKGROUND: Unmeasured confounding is often raised as a source of potential bias during the design of non-randomized studies, but quantifying such concerns is challenging. METHODS: We developed a simulation-based approach to assess the potential impact of unmeasured confounding during the study design stage. The approach involved generating hypothetical individual-level cohorts using realistic parameters, including a binary treatment (prevalence 25%), a time-to-event outcome (incidence 5%), 13 measured covariates, a binary unmeasured confounder (u1, 10%), and a binary measured 'proxy' variable (p1) correlated with u1. The strength of unmeasured confounding and the correlations between u1 and p1 were varied across simulation scenarios. Treatment effects were estimated with (a) no adjustment, (b) adjustment for measured confounders (Level 1), and (c) adjustment for measured confounders and their proxy (Level 2). We computed absolute standardized mean differences in u1 and p1 and relative bias at each level of adjustment. RESULTS: Across all scenarios, Level 2 adjustment improved balance in u1, but the improvement was highly dependent on the correlation between u1 and p1. Level 2 adjustment also yielded lower relative bias than Level 1 adjustment (in strong-u1 scenarios: relative bias of 9.2%, 12.2%, and 13.5% at correlations of 0.7, 0.5, and 0.3, respectively, versus 16.4%, 15.8%, and 15.0% for Level 1). CONCLUSION: This approach using simulated individual-level data was useful for explicitly conveying the potential for bias due to unmeasured confounding while designing non-randomized studies and can help inform design choices.
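The design-stage logic described above can be illustrated with an exact toy calculation rather than a full cohort simulation. The sketch below uses hypothetical parameter values (not those of the study) for a binary unmeasured confounder u1, its measured proxy p1, a binary treatment, and a binary outcome with a true causal risk difference of 0.02, and shows that stratifying on the proxy removes part, but not all, of the confounding bias.

```python
from itertools import product

# Hypothetical parameters: u1 has 10% prevalence, p1 is an imperfect proxy,
# u1 raises both treatment probability and outcome risk.
p_u1 = 0.10
p_p1_given_u1 = {1: 0.80, 0: 0.05}      # proxy sensitivity / false positives
p_a_given_u1 = {1: 0.40, 0: 0.22}       # confounder drives treatment

def p_y(a, u1):                          # confounder drives outcome too
    return 0.03 + 0.04 * u1 + 0.02 * a   # true risk difference = 0.02

# Exact joint distribution over (u1, p1, a)
joint = {}
for u1, p1, a in product((0, 1), repeat=3):
    pu = p_u1 if u1 else 1 - p_u1
    pp = p_p1_given_u1[u1] if p1 else 1 - p_p1_given_u1[u1]
    pa = p_a_given_u1[u1] if a else 1 - p_a_given_u1[u1]
    joint[(u1, p1, a)] = pu * pp * pa

def risk(a, p1=None):
    """E[Y | A=a] or E[Y | A=a, P1=p1], marginalizing over u1."""
    cells = [(u, p) for u, p in product((0, 1), repeat=2)
             if p1 is None or p == p1]
    w = sum(joint[(u, p, a)] for u, p in cells)
    return sum(joint[(u, p, a)] * p_y(a, u) for u, p in cells) / w

# Crude (unadjusted) risk difference: biased away from 0.02
crude_rd = risk(1) - risk(0)

# Standardize the p1-stratified risk differences over the p1 distribution
p_p1_marg = {p: sum(joint[(u, p, a)] for u in (0, 1) for a in (0, 1))
             for p in (0, 1)}
adj_rd = sum(p_p1_marg[p] * (risk(1, p) - risk(0, p)) for p in (0, 1))

print(f"true=0.020  crude={crude_rd:.4f}  proxy-adjusted={adj_rd:.4f}")
```

Because p1 carries only part of the information in u1, residual confounding remains within each proxy stratum; the adjusted estimate moves toward the truth without reaching it, mirroring the study's finding that the benefit of Level 2 adjustment depends on the u1-p1 correlation.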

15.
BMC Nurs ; 23(1): 424, 2024 Jun 24.
Article in English | MEDLINE | ID: mdl-38910263

ABSTRACT

BACKGROUND: The ability of a nurse to make effective clinical decisions is the most important factor affecting treatment quality. However, several factors can affect the ability of nursing and midwifery students to make effective clinical decisions. OBJECTIVES: This study aims to identify the confounding factors that may affect the clinical decision making of nurses, and thus patient outcomes, after the COVID-19 pandemic in Jordan. METHODS: A descriptive cross-sectional design was employed. An online self-administered questionnaire was distributed to 269 nursing and midwifery students selected through purposive sampling, 224 of whom completed it. The valid and reliable nursing decision-making instrument, consisting of 24 items, was employed to gather the data, and descriptive statistics and simple linear regression were used for analysis. Data were collected from November to the end of December 2022. RESULTS: Among the respondents, 72.8% were female, and the average age was 20.79 years (SD = 1.44). The vast majority of the respondents (94.6%) were unmarried, and 74.1% were pursuing a nursing degree. Simple linear regression showed that clinical decision making was negatively and significantly associated with social media usage averaging 6 h a day (β = -0.085). Moreover, male nursing students obtained lower clinical decision-making scores (β = -0.408) than female nursing students. CONCLUSION: Social media usage and gender have a considerable effect on the clinical decision making of nursing and midwifery students. Therefore, the confounding factors that can affect the clinical decision making of nurses should be discussed further, and strategies to address such factors should be implemented.

16.
Eur J Epidemiol ; 39(4): 343-347, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38733447

ABSTRACT

Trial emulations in observational data analyses can complement findings from randomized clinical trials, inform future trial designs, or generate evidence when randomized studies are not feasible due to resource constraints and ethical or practical limitations. Importantly, trial emulation designs facilitate causal inference in observational data analyses by enhancing counterfactual thinking and comparisons of real-world observations (e.g. Mendelian Randomization) to hypothetical interventions. In order to enhance credibility, trial emulations would benefit from prospective registration, publication of statistical analysis plans, and subsequent prospective benchmarking to randomized clinical trials prior to their publication. Confounding by indication, however, is the key challenge to interpreting observed intended effects of an intervention as causal in observational data analyses. We discuss the target trial emulation of the REDUCE-AMI randomized clinical trial (ClinicalTrials.gov ID NCT03278509; beta-blocker use in patients with preserved left ventricular ejection fraction after myocardial infarction) to illustrate the challenges and uncertainties of studying intended effects of interventions without randomization to account for confounding. We furthermore directly compare the findings, statistical power, and clinical interpretation of the results of the REDUCE-AMI target trial emulation to those from the simultaneously published randomized clinical trial. The complexity and subtlety of confounding by indication when studying intended effects of interventions can generally only be addressed by randomization.


Subject(s)
Myocardial Infarction , Randomized Controlled Trials as Topic , Research Design , Humans , Uncertainty
17.
J Neural Eng ; 21(3)2024 May 17.
Article in English | MEDLINE | ID: mdl-38722304

ABSTRACT

Discrete myoelectric control-based gesture recognition has recently gained interest as a possible input modality for many emerging ubiquitous computing applications. Unlike the continuous control commonly employed in powered prostheses, discrete systems seek to recognize the dynamic sequences associated with gestures to generate event-based inputs. More akin to those used in general-purpose human-computer interaction, these could include, for example, a flick of the wrist to dismiss a phone call or a double tap of the index finger and thumb to silence an alarm. Myoelectric control systems have been shown to achieve near-perfect classification accuracy, but in highly constrained offline settings. Real-world, online systems are subject to 'confounding factors' (i.e. factors that hinder the real-world robustness of myoelectric control that are not accounted for during typical offline analyses), which inevitably degrade system performance, limiting their practical use. Although these factors have been widely studied in continuous prosthesis control, there has been little exploration of their impacts on discrete myoelectric control systems for emerging applications and use cases. Correspondingly, this work examines, for the first time, three confounding factors and their effect on the robustness of discrete myoelectric control: (1) limb position variability, (2) cross-day use, and a newly identified confound faced by discrete systems, (3) gesture elicitation speed. Results from four different discrete myoelectric control architectures ((1) Majority Vote LDA, (2) Dynamic Time Warping, (3) an LSTM network trained with cross entropy, and (4) an LSTM network trained with contrastive learning) show that classification accuracy is significantly degraded (p < 0.05) by each of these confounds.
This work establishes that confounding factors are a critical barrier that must be addressed to enable the real-world adoption of discrete myoelectric control for robust and reliable gesture recognition.


Subject(s)
Electromyography , Gestures , Pattern Recognition, Automated , Humans , Electromyography/methods , Male , Pattern Recognition, Automated/methods , Female , Adult , Young Adult , Artificial Limbs
18.
Am J Epidemiol ; 2024 May 17.
Article in English | MEDLINE | ID: mdl-38754869

ABSTRACT

We spend a great deal of time on confounding in our teaching, in our methods development and in our assessment of study results. This may give the impression that uncontrolled confounding is the biggest problem that observational epidemiology faces, when in fact other sources of bias, such as selection bias, measurement error, missing data, and misalignment of zero time, may often (especially if they are all present in a single study) lead to a stronger deviation from the truth. Compared to the amount of time we spend teaching how to address confounding in a data analysis, we spend relatively little time teaching methods for simulating confounding (and other sources of bias) to learn their impact and develop plans to mitigate or quantify the bias. We review a paper by Desai et al. that uses simulation methods to quantify the impact of an unmeasured confounder when it is completely missing or when a proxy of the confounder is measured. We use this article to discuss how we can use simulations of sources of bias to ensure we generate better and more valid study estimates, and we discuss the importance of simulating realistic datasets with plausible bias structures to guide data collection. If an advanced life form existed outside of our current universe and came to earth with the goal of scouring the published epidemiologic literature to understand the biggest problem epidemiologists face, they would quickly discover that the limitations section of publications would provide them with all the information they needed. Most likely, they would conclude that the biggest problem we face is uncontrolled confounding. It seems to be an obsession of ours.
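The kind of bias simulation this commentary advocates can be sketched in a few lines: generate data with a known treatment effect and an unmeasured confounder, then compare the crude estimate with one adjusted for a noisy proxy of the confounder. The effect sizes and noise level below are illustrative assumptions, not values from Desai et al.:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Unmeasured confounder U raises both treatment uptake and the outcome.
U = rng.normal(size=n)
A = (0.8 * U + rng.normal(size=n) > 0).astype(float)  # treatment indicator
Y = 1.0 * A + 1.5 * U + rng.normal(size=n)            # true effect of A is 1.0
proxy = U + rng.normal(size=n)                        # mismeasured proxy of U

def ols_effect(y, X):
    """Coefficient on the first column of X from OLS with an intercept."""
    X = np.column_stack([np.atleast_2d(X.T).T, np.ones(len(y))])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[0]

crude = ols_effect(Y, A)                              # confounded estimate
partial = ols_effect(Y, np.column_stack([A, proxy]))  # proxy-adjusted estimate
```

In this setup the crude estimate overstates the true effect of 1.0, and adjusting for the noisy proxy removes part of the bias but not all of it, which is precisely the residual-confounding behavior such simulations are meant to make visible before data collection.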

19.
Elife ; 132024 May 16.
Article in English | MEDLINE | ID: mdl-38752987

ABSTRACT

We discuss 12 misperceptions, misstatements, or mistakes concerning the use of covariates in observational or nonrandomized research. Additionally, we offer advice to help investigators, editors, reviewers, and readers make more informed decisions about conducting and interpreting research where the influence of covariates may be at issue. We primarily address misperceptions in the context of statistical management of the covariates through various forms of modeling, although we also emphasize design and model or variable selection. Other approaches to addressing the effects of covariates, including matching, have logical extensions from what we discuss here but are not dwelled upon heavily. The misperceptions, misstatements, or mistakes we discuss include accurate representation of covariates, effects of measurement error, overreliance on covariate categorization, underestimation of power loss when controlling for covariates, misinterpretation of significance in statistical models, and misconceptions about confounding variables, selecting on a collider, and p value interpretations in covariate-inclusive analyses. This condensed overview serves to correct common errors and improve research quality in general and in nutrition research specifically.
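The collider-selection mistake listed above is easy to demonstrate with a short simulation (hypothetical data, not from the article): two independent variables become correlated once the sample is restricted on a common effect.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

A = rng.normal(size=n)            # exposure
Y = rng.normal(size=n)            # outcome, generated independently of A
C = A + Y + rng.normal(size=n)    # collider: caused by both A and Y

# In the full sample, A and Y are uncorrelated...
r_all = np.corrcoef(A, Y)[0, 1]

# ...but selecting on the collider (keeping only high-C observations,
# e.g. a clinic-attendance or study-entry criterion) induces a spurious
# negative association between them.
sel = C > 1.0
r_sel = np.corrcoef(A[sel], Y[sel])[0, 1]
```

The same distortion arises when a collider is included as a covariate in a model rather than used as a selection criterion, which is why "control for everything available" is one of the misperceptions the authors target.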


Subject(s)
Observational Studies as Topic , Research Design , Humans , Research Design/standards , Models, Statistical , Data Interpretation, Statistical
20.
Struct Equ Modeling ; 31(1): 132-150, 2024.
Article in English | MEDLINE | ID: mdl-38706777

ABSTRACT

Parallel process latent growth curve mediation models (PP-LGCMMs) are frequently used to longitudinally investigate the mediation effects of treatment on the level and change of outcome through the level and change of mediator. An important but often violated assumption in empirical PP-LGCMM analysis is the absence of omitted confounders of the relationships among treatment, mediator, and outcome. In this study, we analytically examined how omitting pretreatment confounders impacts the inference of mediation from the PP-LGCMM. Using the analytical results, we developed three sensitivity analysis approaches for the PP-LGCMM, including the frequentist, Bayesian, and Monte Carlo approaches. The three approaches help investigate different questions regarding the robustness of mediation results from the PP-LGCMM, and handle the uncertainty in the sensitivity parameters differently. Applications of the three sensitivity analyses are illustrated using a real-data example. A user-friendly Shiny web application is developed to conduct the sensitivity analyses.
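A generic Monte Carlo sensitivity analysis of this flavor can be sketched as follows: instead of fixing the sensitivity parameter at one value, draw it from a plausible prior, bias-adjust the estimate under each draw, and summarize the resulting distribution. The observed effect, the bias formula, and the prior below are illustrative assumptions, not the PP-LGCMM expressions from the article:

```python
import numpy as np

rng = np.random.default_rng(1)

observed_indirect = 0.25  # hypothetical estimated indirect (mediation) effect

def bias(rho, sd_m=1.0, sd_y=1.0):
    # Hypothetical omitted-confounder bias term: rho scales the confounder's
    # correlated influence on the mediator and outcome residuals.
    return rho * sd_m * sd_y

# Monte Carlo step: propagate uncertainty in the sensitivity parameter by
# sampling it rather than tabulating a grid of fixed values.
rhos = rng.uniform(-0.3, 0.3, size=10_000)
adjusted = observed_indirect - bias(rhos)

lo, hi = np.percentile(adjusted, [2.5, 97.5])
```

Here the 95% interval of bias-adjusted estimates crosses zero, so under this (assumed) prior the mediation conclusion would not be robust; a frequentist variant fixes rho at chosen values, and a Bayesian variant folds the prior into the full model rather than into a post-hoc adjustment.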
