ABSTRACT
Parkinson's disease (PD) patients require frequent office visits, where their health state is assessed with the Unified Parkinson's Disease Rating Scale (UPDRS). Inertial wearable sensor devices present a unique opportunity to supplement these assessments with continuous monitoring. In this work, we analyze kinematic features from sensor devices located on the feet, wrists, lumbar region, and sternum of 35 PD subjects as they performed walk trials in two clinical visits, one for each of their self-reported ON and OFF motor states. Our results show that a few features related to a subject's whole-body turns and pronation-supination motor events can accurately infer cardinal features of PD such as bradykinesia and postural instability and gait disorder (PIGD). Moreover, these features can be measured with only two sensors, one on the affected wrist and one on the lumbar region, potentially reducing the patient's burden of wearing sensors while supporting continuous monitoring in out-of-office settings.
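As an illustration of the kind of turn-derived kinematics involved, the sketch below segments whole-body turns from a lumbar gyroscope yaw-rate trace and summarizes each turn's duration, angle, and peak rate. The thresholds and function name are illustrative assumptions, not the study's actual feature pipeline.

import numpy as np

def detect_turns(yaw_rate, fs, rate_thresh=15.0, min_angle=45.0):
    """Segment whole-body turns from a lumbar yaw-rate signal (deg/s).

    rate_thresh and min_angle are assumed thresholds: a turn is a
    contiguous run where |yaw rate| exceeds rate_thresh and whose
    integrated angle exceeds min_angle degrees.
    """
    active = np.abs(yaw_rate) > rate_thresh
    padded = np.concatenate(([False], active, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    turns = []
    for start, stop in zip(edges[::2], edges[1::2]):
        angle = np.trapz(np.abs(yaw_rate[start:stop]), dx=1.0 / fs)
        if angle >= min_angle:
            turns.append({"duration_s": (stop - start) / fs,
                          "angle_deg": angle,
                          "peak_rate_dps": np.abs(yaw_rate[start:stop]).max()})
    return turns

Per-turn duration, angle, and peak rate could then be aggregated over a walk trial into the sort of summary features the abstract refers to.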
Subjects
Biomechanical Phenomena/physiology , Hypokinesia/diagnosis , Hypokinesia/etiology , Monitoring, Physiologic/instrumentation , Parkinson Disease/rehabilitation , Posture/physiology , Wearable Electronic Devices , Aged , Aged, 80 and over , Female , Humans , Male , Parkinson Disease/diagnosis , Walking/physiology
ABSTRACT
A novel writing platform composed of a wearable fingernail sensor and classification algorithms is described. Findings from using this platform to translate fingertip writing into shapes, letters, and numbers on a range of surfaces are reported. The platform leverages an architecture with miniaturized electronic circuitry to precisely measure forces in the longitudinal and transverse directions using multiple strain gauges. We find that directional pressure patterns are transferred from the fingertip to the fingernail. Deformations of the fingernail in the longitudinal and transverse directions are detected by the sensor, which sends the data wirelessly to a portable electronic system. Fingernail pressure patterns are categorized through signal processing to recognize a range of shapes, numbers, and letters, enabling fingertip writing recognition. After a short training session, human fingertip writing on multiple surfaces was automatically transcribed to a computer using the platform.
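A minimal sketch of how such pattern categorization might look, assuming per-stroke windows of multi-gauge strain samples and standard scikit-learn tooling; the feature set and classifier here are simplifications, not the platform's published pipeline.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def strain_features(window):
    """window: (n_samples, n_gauges) strain readings for one writing stroke.
    Per-channel summary statistics; the real pipeline is likely richer."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

# Hypothetical training data: a list of stroke windows and character labels.
# X = np.array([strain_features(w) for w in stroke_windows])
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, labels)
# predicted = clf.predict(np.array([strain_features(w) for w in new_strokes]))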
Subjects
Fingers , Signal Processing, Computer-Assisted , Writing , Algorithms , Humans , Nails
ABSTRACT
Patient-specific models of ventricular mechanics require the optimization of many parameters under the uncertainties associated with imaging of cardiac function. We present a strategy to reduce the complexity of parametric searches for 3-D finite element (FE) models of left ventricular contraction. The study employs automatic image segmentation and analysis of an image database to extract geometric features for several classes of patients. Statistical distributions of geometric parameters are then used to design parametric studies investigating the effects of (1) passive material properties during ventricular filling and (2) infarct geometry on ventricular contraction in patients after a heart attack. Gaussian process regression is used in both cases to build statistical models trained on the results of biophysical FE simulations. The first statistical model estimates unloaded configurations based on either the intraventricular pressure or the end-diastolic fiber strain. The technique provides an alternative to the standard fixed-point iteration algorithm, which is more computationally expensive when used to unload more than 10 ventricles. The second statistical model captures the effects of varying infarct geometries on cardiac output. For training, we designed high-resolution models of non-transmural infarcts, including refinements of the border zone around the lesion. This study is a first effort toward a platform combining high-performance computing (HPC) models and machine learning to investigate cardiac function in heart failure patients, with the goal of assisting clinical diagnostics.
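The sketch below shows the flavor of the Gaussian process surrogate step using scikit-learn, trained on synthetic stand-ins for FE simulation results; the input parameterization and output quantity are assumptions for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
# Stand-in FE design points: (geometric feature, passive stiffness, EDP).
X = rng.uniform(size=(50, 3))
# Stand-in output: a scalar descriptor of the unloaded configuration,
# e.g., unloaded cavity volume in mL (synthetic response for illustration).
y = 40 + 25 * X[:, 0] - 10 * X[:, 1] + rng.normal(0, 0.5, size=50)

kernel = ConstantKernel(1.0) * RBF(length_scale=np.ones(3))
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
mean, std = gp.predict(rng.uniform(size=(5, 3)), return_std=True)

Once trained, such a surrogate answers parameter queries in milliseconds, which is the point of replacing repeated fixed-point unloading runs.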
ABSTRACT
Ectopic heartbeats can trigger reentrant arrhythmias, leading to ventricular fibrillation and sudden cardiac death. Such events have been attributed to perturbed Ca2+ handling in cardiac myocytes, leading to spontaneous Ca2+ release and delayed afterdepolarizations (DADs). However, the ways in which perturbations of specific molecular mechanisms alter the probability of ectopic beats are not understood. We present a multiscale model of cardiac tissue incorporating a biophysically detailed three-dimensional model of the ventricular myocyte. This model reproduces realistic Ca2+ waves and DADs driven by stochastic Ca2+ release channel (RyR) gating and is used to study mechanisms of DAD variability. In agreement with previous experimental and modeling studies, key factors influencing the distribution of DAD amplitude and timing include cytosolic and sarcoplasmic reticulum Ca2+ concentrations, inwardly rectifying potassium current (IK1) density, and gap junction conductance. The tissue model is used to investigate how random RyR gating gives rise to probabilistic triggered activity in a one-dimensional fiber of myocytes. A novel spatial-average filtering method for estimating the probability of extreme (i.e., rare, high-amplitude) stochastic events from a limited set of spontaneous Ca2+ release profiles is presented. These events occur when randomly organized clusters of cells exhibit synchronized, high-amplitude Ca2+ release flux. It is shown how reduced IK1 density and gap junction coupling, as observed in heart failure, increase the probability of extreme DADs by multiple orders of magnitude. This method enables prediction of arrhythmia likelihood and its modulation by alterations of other cellular mechanisms.
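The following sketch conveys the spirit of the spatial-average filtering idea under simplifying assumptions (a Gaussian fit to window-averaged fluxes); the paper's actual estimator for rare-event probabilities is more elaborate.

import numpy as np
from scipy.stats import norm

def extreme_event_probability(flux, window, threshold):
    """flux: (n_trials, n_cells) spontaneous Ca2+ release flux per cell in
    a 1-D fiber. Average over cluster-sized spatial windows, then estimate
    the probability that a window's mean flux exceeds `threshold` by
    fitting a normal distribution to the filtered samples (a simplifying
    assumption made for this sketch).
    """
    kernel = np.ones(window) / window
    filtered = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="valid"), 1, flux)
    mu, sigma = filtered.mean(), filtered.std()
    return norm.sf(threshold, loc=mu, scale=sigma)  # upper-tail probability

The spatial averaging multiplies the number of samples available from a limited set of trials, which is what makes tail probabilities estimable at all.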
Subjects
Arrhythmias, Cardiac/physiopathology , Computer Simulation , Heart Ventricles/physiopathology , Models, Cardiovascular , Myocytes, Cardiac/pathology , Animals , Arrhythmias, Cardiac/metabolism , Calcium/metabolism , Calcium Signaling , Cells, Cultured , Dogs , Heart Ventricles/metabolism , Membrane Potentials , Myocytes, Cardiac/metabolism , Probability , Ryanodine Receptor Calcium Release Channel/metabolism , Sarcoplasmic Reticulum/metabolism , Sarcoplasmic Reticulum/pathology
ABSTRACT
While preclinical Torsades de Pointes (TdP) risk classifiers were initially based on drug-induced block of hERG potassium channels, it is now well established that improved risk prediction can be achieved by also considering block of non-hERG ion channels. Current multi-channel TdP classifiers fall into two classes: classifiers that take as input the values of drug-induced block of ion channels (direct features), and classifiers built on features extracted from the output of drug-induced multi-channel block simulations in in-silico models (derived features). Classifiers built on derived features have thus far not consistently provided increased prediction accuracy, which casts doubt on the value of such approaches given the cost of including biophysical detail. Here, we propose a new two-step method for TdP risk classification, referred to as Multi-Channel Blockage at Early Afterdepolarization (MCB@EAD). In the first step, compounds that produce insufficient hERG block are classified as non-torsadogenic. In the second step, the role of non-hERG channels in modulating TdP risk is considered by constructing classifiers based on direct or derived features at the critical hERG block concentration that generates EADs in computational cardiac cell models. MCB@EAD provides comparable or superior TdP risk classification of drugs from direct features in tests against published methods. TdP risk for the drugs correlated strongly with the propensity to generate EADs in the model. However, derived features of the biophysical models did not improve predictive capability for TdP risk assessment.
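A minimal sketch of the two-step decision structure, with an illustrative hERG-block threshold and a generic fitted classifier standing in for the study's direct- or derived-feature models.

import numpy as np

def mcb_at_ead(herg_block, features, model, herg_threshold=0.5):
    """Two-step TdP risk sketch; the threshold and model are illustrative.

    Step 1: compounds with insufficient hERG block -> non-torsadogenic (0).
    Step 2: remaining compounds are classified from features evaluated at
    the critical hERG-block concentration that produces EADs in the cell
    model.
    """
    risk = np.zeros(len(herg_block), dtype=int)
    candidates = herg_block >= herg_threshold
    if candidates.any():
        risk[candidates] = model.predict(features[candidates])
    return risk

# `model` could be any fitted binary classifier, e.g.
# sklearn.linear_model.LogisticRegression().fit(train_features, train_labels).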
ABSTRACT
MOTIVATION: Animal models are important tools in drug discovery and for understanding human biology in general. However, many drugs that initially show promising results in rodents fail in later stages of clinical trials. Understanding the commonalities and differences between human and rat cell signaling networks can lead to better experimental designs, improved allocation of resources, and ultimately better drugs. RESULTS: The sbv IMPROVER Species-Specific Network Inference challenge was designed to use the power of the crowd to build two species-specific cell signaling networks given phosphoproteomics, transcriptomics, and cytokine data generated from normal human bronchial epithelial (NHBE) and normal rat bronchial epithelial (NRBE) cells exposed to various stimuli. A common literature-inspired reference network with 220 nodes and 501 edges was also provided as prior knowledge, from which challenge participants could add or remove edges but not nodes. Such a large network inference challenge, based not on synthetic simulations but on real data, presented unique difficulties in scoring and interpreting the results. Because prior knowledge about the networks was already provided to the participants for reference, novel ways of scoring and aggregating the results were developed. Human and rat consensus networks were obtained by combining all the inferred networks. Further analysis showed that major signaling pathways were conserved between the two species, with only isolated components diverging, as in the case of the ribosomal S6 kinase RPS6KA1. Overall, the consensus between inferred edges was relatively high, with the exception of the downstream targets of transcription factors, which seemed more difficult to predict. CONTACT: ebilal@us.ibm.com or gustavo@us.ibm.com. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
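As a sketch of one plausible aggregation step (a simple majority vote over submitted edge sets; the challenge's actual scoring and aggregation were more careful than this):

from collections import Counter

def consensus_network(submissions, min_support=0.5):
    """submissions: list of edge sets, each a set of (src, dst) pairs over
    the fixed 220-node reference network; returns edges supported by at
    least a min_support fraction of teams."""
    votes = Counter(edge for edges in submissions for edge in set(edges))
    return {edge for edge, count in votes.items()
            if count / len(submissions) >= min_support}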
Subjects
Algorithms , Crowdsourcing , Cytokines/metabolism , Gene Expression Profiling , Gene Regulatory Networks , Phosphoproteins/metabolism , Software , Systems Biology/methods , Animals , Bronchi/cytology , Bronchi/metabolism , Cell Communication , Cells, Cultured , Databases, Factual , Epithelial Cells/cytology , Epithelial Cells/metabolism , Gene Expression Regulation , Humans , Models, Animal , Oligonucleotide Array Sequence Analysis , Phosphorylation , Rats , Signal Transduction , Species Specificity
ABSTRACT
How organisms respond biologically to external cues such as drugs, chemicals, viruses, and hormones is an essential question in biomedicine and toxicology, and one that cannot be easily studied in humans. Biomedical research has therefore continuously relied on animal models to study the impact of these compounds and attempted to 'translate' the results to humans. In this context, the sbv IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowdsourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated comprising phosphoproteomics, transcriptomics, and cytokine data derived from normal human bronchial epithelial (NHBE) and normal rat bronchial epithelial (NRBE) cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing, and quality-control analysis of this multi-layer omics dataset, which is accessible in public repositories for further intra- and inter-species translation studies.
Subjects
Bronchi/metabolism , Cytokines , Epithelial Cells/metabolism , Proteomics , Transcriptome , Animals , Bronchi/cytology , Cytokines/metabolism , Humans , Models, Animal , Rats , Systems Biology/methods , Translational Research, Biomedical
ABSTRACT
Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than were used in this study, we believe that faster-than-real-time multiscale cardiac simulations can be achieved on these systems.
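A hedged Python analogue of the hybrid pattern (one MPI rank per node via mpi4py, a thread pool inside each rank for the compute phase, MPI for the halo exchange); the study itself used MPI with OpenMP/Pthreads in compiled code, so this is a structural sketch only and assumes mpi4py is installed.

import numpy as np
from concurrent.futures import ThreadPoolExecutor
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

state = np.zeros(10000)  # this rank's slab of membrane voltages
chunks = np.array_split(np.arange(state.size), 4)  # work for 4 threads

def update_cells(idx):
    # Placeholder cell-model step; real models evaluate ionic currents here.
    state[idx] += 0.01 * (1.0 - state[idx])

with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(100):                      # time loop
        list(pool.map(update_cells, chunks))  # threaded computation phase
        # Communication phase: exchange boundary cells with neighbor ranks
        # (the received halos would feed a diffusion step, omitted here).
        if rank > 0:
            left = comm.sendrecv(state[0], dest=rank - 1, source=rank - 1)
        if rank < size - 1:
            right = comm.sendrecv(state[-1], dest=rank + 1, source=rank + 1)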
Subjects
Computing Methodologies , Models, Cardiovascular , Software , Computer Simulation , Female , Humans , Myocytes, Cardiac/cytology , Visible Human Projects
ABSTRACT
We present an orthogonal recursive bisection algorithm that hierarchically segments the anatomical model structure into subvolumes distributed to cores. The anatomy is derived from the Visible Human Project, with electrophysiology based on the FitzHugh-Nagumo (FHN) and ten Tusscher (TT04) models with monodomain diffusion. Benchmark simulations of both the FHN and TT04 models, with up to 16,384 and 32,768 cores on IBM Blue Gene/P and Blue Gene/L supercomputers, show good load balancing with almost perfect speedup factors that are close to linear in the number of cores; strong scaling is thus demonstrated. With 32,768 cores, a 1000 ms simulation of a full heart beat requires about 6.5 min of wall-clock time for the FHN model. For the largest machine partitions, the simulations execute at a rate of 0.548 s (BG/P) and 0.394 s (BG/L) of wall-clock time per 1 ms of simulation time. To our knowledge, these simulations show strong scaling to substantially higher numbers of cores than previously reported for organ-level simulation of the heart, significantly reducing run times. The ability to reduce run times could play a critical role in enabling wider use of cardiac models in research and clinical applications.
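A compact sketch of orthogonal recursive bisection over weighted voxels, splitting along the longest axis so each half carries roughly half the computational load; it assumes the partition count is a power of two, and the production decomposition handles considerably more detail.

import numpy as np

def orb(points, weights, n_parts):
    """Recursively bisect a weighted voxel cloud into n_parts subvolumes."""
    if n_parts == 1:
        return [points]
    axis = np.ptp(points, axis=0).argmax()            # longest spatial extent
    order = np.argsort(points[:, axis])
    cum = np.cumsum(weights[order])
    split = max(np.searchsorted(cum, cum[-1] / 2), 1)  # balance the load
    left, right = order[:split], order[split:]
    return (orb(points[left], weights[left], n_parts // 2) +
            orb(points[right], weights[right], n_parts // 2))

# Example weighting in the spirit of the load-ratio studies below:
# tissue voxels might weigh 10, non-tissue voxels 1.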
Subjects
Algorithms , Computing Methodologies , Heart/anatomy & histology , Heart/physiology , Models, Anatomic , Models, Cardiovascular , Animals , Computer Simulation , Humans
ABSTRACT
Inherited long QT syndrome (LQTS) is caused by mutations in ion channels that delay cardiac repolarization, increasing the risk of sudden death from ventricular arrhythmias. Currently, the risk of sudden death in individuals with LQTS is estimated from clinical parameters such as age, gender, and the QT interval measured from the electrocardiogram. Even though a number of different mutations can cause LQTS, mutation-specific information is rarely used clinically. LQTS type 1 (LQT1), one of the most common forms of LQTS, is caused by mutations in the slow potassium current (I(Ks)) channel α subunit KCNQ1. We investigated whether mutation-specific changes in I(Ks) function can predict cardiac risk in LQT1. By correlating the clinical phenotype of 387 LQT1 patients with the cellular electrophysiological characteristics caused by an array of mutations in KCNQ1, we found that channels with a decreased rate of current activation are associated with increased risk of cardiac events (hazard ratio = 2.02), independent of the clinical parameters usually used for risk stratification. In patients with moderate QT prolongation (a QT interval of less than 500 ms), slower activation was an independent predictor of cardiac events (syncope, aborted cardiac arrest, and sudden death) (hazard ratio = 2.10), whereas the length of the QT interval itself was not. Our results indicate that genotype and biophysical phenotype analysis may be useful for risk stratification of LQT1 patients and suggest that slow channel activation is associated with an increased risk of cardiac events.
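To illustrate the style of risk analysis (not the registry data), the sketch below fits a Cox proportional hazards model with the lifelines library on synthetic follow-up data in which a slow-activation class carries a hazard ratio near 2; the covariates and effect sizes are assumptions.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
slow = rng.integers(0, 2, n)             # slow-activation mutation class
qtc = rng.normal(470, 20, n)             # QTc in ms
hazard = 0.05 * np.exp(0.7 * slow + 0.01 * (qtc - 470))  # log-HR 0.7 ~ HR 2
time = rng.exponential(1.0 / hazard)
observed = (time < 15).astype(int)       # administrative censoring at 15 y

df = pd.DataFrame({"years": np.minimum(time, 15), "event": observed,
                   "slow_activation": slow, "qtc_ms": qtc})
cph = CoxPHFitter().fit(df, duration_col="years", event_col="event")
print(cph.hazard_ratios_)  # slow_activation HR should come out near 2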
Subjects
Ion Channel Gating/physiology , KCNQ1 Potassium Channel/genetics , KCNQ1 Potassium Channel/metabolism , Long QT Syndrome/genetics , Long QT Syndrome/physiopathology , Mutation , Adolescent , Adult , Animals , Child , Child, Preschool , Computer Simulation , Electrophysiology , Genetic Predisposition to Disease , Genotype , Humans , Infant , Kaplan-Meier Estimate , Male , Models, Biological , Oocytes/cytology , Oocytes/physiology , Phenotype , Proportional Hazards Models , Registries , Risk Factors , Xenopus laevis , Young Adult
ABSTRACT
Future multiscale and multiphysics models must use the power of high-performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, POSIX Pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favorably when compared to an implementation using only MPI, in contrast to our results using complex physiological models. Thus, for lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase, as will HPC system size in both node count and number of cores per node, it is still foreseeable that faster-than-real-time multiscale cardiac simulations will be achieved on these systems using hybrid programming models.
Subjects
Algorithms , Computer Simulation , Heart/physiology , Models, Cardiovascular , Programming Languages , Humans
ABSTRACT
High-performance computing is required to make simulations of whole-organ heart models with biophysically detailed cellular models feasible in a clinical setting. Increasing model detail by simulating both electrophysiology and mechanics increases computational demands. We present scaling results for an electro-mechanical cardiac model of two ventricles and compare them to our previously published results using an electrophysiological model only. The anatomical data set was given by both ventricles of the Visible Female data set at 0.2 mm resolution. Fiber orientation was included. Data decomposition for distribution onto the distributed-memory system was carried out by orthogonal recursive bisection. Load weight ratios for non-tissue vs. tissue elements used in the data decomposition were 1:1, 1:2, 1:5, 1:10, 1:25, 1:38.85, 1:50, and 1:100. The ten Tusscher et al. (2004) electrophysiological cell model was used, together with the Rice et al. (1999) model for the computation of the calcium-transient-dependent force. Scaling results for 512, 1024, 2048, 4096, 8192, and 16,384 processors were obtained for 1 ms of simulation time. The simulations were carried out on an IBM Blue Gene/L supercomputer. The results show linear scaling from 512 to 16,384 processors, with speedup factors between 1.82 and 2.14 between partitions. The optimal load ratio was 1:25 on all partitions. However, a shift toward load ratios with higher weight for the tissue elements can be recognized, as expected when adding computational complexity to the model while keeping the same communication setup. This work demonstrates that it is potentially possible to run a 0.5 s simulation of the presented electro-mechanical cardiac model within 1.5 hours.
Subjects
Biomedical Engineering/methods , Computer Simulation , Electrophysiology/methods , Models, Cardiovascular , Computing Methodologies , Female , Heart/physiology , Heart Conduction System , Humans , Models, Anatomic , Models, Neurological , Myocardial Contraction , Neural Networks, Computer , Stress, Mechanical , United States , Visible Human Projects
ABSTRACT
The orthogonal recursive bisection (ORB) algorithm can be used as a data decomposition strategy to distribute the large data set of a cardiac model across a distributed-memory supercomputer. It has been shown previously that good scaling results can be achieved using the ORB algorithm for data decomposition. However, the ORB algorithm depends on the distribution of the computational load of each element in the data set. In this work we investigated the dependence of data decomposition and load balancing on different rotations of the anatomical data set, in order to optimize load balancing. The anatomical data set was given by both ventricles of the Visible Female data set at 0.2 mm resolution. Fiber orientation was included. The data set was rotated by 90 degrees around the x, y, and z axes, respectively. By either translating the resulting negative coordinates or simply taking their magnitudes, we created 14 data sets of the same anatomy with different orientations and positions in the overall volume. Computational load ratios for non-tissue vs. tissue elements used in the data decomposition were 1:1, 1:2, 1:5, 1:10, 1:25, 1:38.85, 1:50, and 1:100, to investigate the effect of different load ratios on the data decomposition. The ten Tusscher et al. (2004) electrophysiological cell model was used in monodomain simulations of 1 ms simulation time to compare performance across the different data sets and orientations. The simulations were carried out for load ratios 1:10, 1:25, and 1:38.85 on a 512-processor partition of the IBM Blue Gene/L supercomputer. The results show that the data decomposition does depend on the orientation and position of the anatomy in the global volume. The difference in total run time between the data sets is 10 s for a simulation time of 1 ms, which yields a difference of about 28 h for a simulation of 10 s of simulation time. However, given larger processor partitions, the difference in run time decreases and becomes less significant. Depending on the processor partition size, future work will have to consider the orientation of the anatomy in the global volume for longer simulation runs.
Subjects
Computing Methodologies , Heart Conduction System/physiology , Models, Cardiovascular , Algorithms , Computer Simulation , Computers , Electrophysiology/methods , Heart/physiology , Humans , Image Processing, Computer-Assisted , Models, Anatomic , Models, Theoretical , Myocardial Contraction , Software , United States , Visible Human Projects
ABSTRACT
Multiscale, multiphysics heart models have not yet been able to include a high degree of accuracy and resolution with respect to model detail and spatial resolution, owing to the computational limitations of current systems. We propose a framework to compute large-scale cardiac models. Decomposition of the anatomical data into segments to be distributed on a parallel computer is carried out by orthogonal recursive bisection (ORB). The algorithm takes into account a computational load parameter that has to be adjusted according to the cell models used. The diffusion term is realized by the monodomain equations. The anatomical data set was given by both ventricles of the Visible Female data set at 0.2 mm resolution. Heterogeneous anisotropy was included in the computation. Model weights as input for the decomposition and load balancing were set to (a) 1 for tissue and 0 for non-tissue elements, and (b) 10 for tissue and 1 for non-tissue elements. Scaling results for 512, 1024, 2048, 4096, and 8192 computational nodes were obtained for 10 ms of simulation time. The simulations were carried out on an IBM Blue Gene/L parallel computer. A 1 s simulation was then carried out on 2048 nodes for the optimal model load. Load balances did not differ significantly across computational nodes, even though the number of data elements distributed to each node differed greatly. Since the ORB algorithm did not take into account the computational load due to communication cycles, the speedup is close to optimal for the computation time but not optimal overall, owing to the communication overhead. Nevertheless, the simulation times were reduced from 87 minutes on 512 nodes to 11 minutes on 8192 nodes. This work demonstrates that it is possible to run simulations of the presented detailed cardiac model within hours for the simulation of a heart beat.