Results 1 - 20 of 39
1.
JCO Clin Cancer Inform ; 4: 299-309, 2020 03.
Article in English | MEDLINE | ID: mdl-32216636

ABSTRACT

PURPOSE: We present SlicerDMRI, an open-source software suite that enables research using diffusion magnetic resonance imaging (dMRI), the only modality that can map the white matter connections of the living human brain. SlicerDMRI enables analysis and visualization of dMRI data and is aimed at the needs of clinical research users. SlicerDMRI is built upon and deeply integrated with 3D Slicer, a National Institutes of Health-supported open-source platform for medical image informatics, image processing, and three-dimensional visualization. Integration with 3D Slicer provides many features of interest to cancer researchers, such as real-time integration with neuronavigation equipment, intraoperative imaging modalities, and multimodal data fusion. One key application of SlicerDMRI is in neurosurgery research, where brain mapping using dMRI can provide patient-specific maps of critical brain connections as well as insight into the tissue microstructure that surrounds brain tumors. PATIENTS AND METHODS: In this article, we focus on a demonstration of SlicerDMRI as an informatics tool to enable end-to-end dMRI analyses in two retrospective imaging data sets from patients with high-grade glioma. Analyses demonstrated here include conventional diffusion tensor analysis, advanced multifiber tractography, automated identification of critical fiber tracts, and integration of multimodal imagery with dMRI. RESULTS: We illustrate the ability of SlicerDMRI to perform both conventional and advanced dMRI analyses as well as to enable multimodal image analysis and visualization. We provide an overview of the clinical rationale for each analysis along with pointers to the SlicerDMRI tools used in each. CONCLUSION: SlicerDMRI provides open-source and clinician-accessible research software tools for dMRI analysis. SlicerDMRI is available for easy automated installation through the 3D Slicer Extension Manager.
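
As a rough illustration of the "conventional diffusion tensor analysis" step mentioned above, the sketch below fits a diffusion tensor to simulated dMRI signals with the standard log-linear least-squares model and derives fractional anisotropy. It is a generic numpy example under assumed inputs (b-values, gradient directions, signals), not SlicerDMRI's own code.

```python
import numpy as np

def fit_tensor(signals, S0, bvals, bvecs):
    """Log-linear least-squares fit of the diffusion tensor.

    signals : (N,) dMRI signal per gradient direction
    S0      : non-diffusion-weighted signal
    bvals   : (N,) b-values in s/mm^2
    bvecs   : (N, 3) unit gradient directions
    """
    g = bvecs
    # Design matrix for the 6 unique tensor elements Dxx, Dyy, Dzz, Dxy, Dxz, Dyz
    B = np.column_stack([
        g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
        2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2], 2 * g[:, 1] * g[:, 2],
    ]) * bvals[:, None]
    y = -np.log(signals / S0)          # Stejskal-Tanner: ln(S/S0) = -b g^T D g
    d, *_ = np.linalg.lstsq(B, y, rcond=None)
    return np.array([[d[0], d[3], d[4]],
                     [d[3], d[1], d[5]],
                     [d[4], d[5], d[2]]])

def fractional_anisotropy(D):
    ev = np.linalg.eigvalsh(D)
    md = ev.mean()
    den = np.sqrt((ev**2).sum())
    return np.sqrt(1.5) * np.sqrt(((ev - md)**2).sum()) / den if den > 0 else 0.0

if __name__ == "__main__":
    # Hypothetical demo: simulate signals from a known prolate tensor and recover FA
    rng = np.random.default_rng(0)
    bvecs = rng.normal(size=(30, 3))
    bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)
    bvals = np.full(30, 1000.0)
    D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])   # mm^2/s
    signals = np.exp(-bvals * np.einsum("ij,jk,ik->i", bvecs, D_true, bvecs))
    print(fractional_anisotropy(fit_tensor(signals, 1.0, bvals, bvecs)))
```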


Subject(s)
Brain Neoplasms/pathology , Brain Neoplasms/surgery , Diffusion Magnetic Resonance Imaging/methods , Image Interpretation, Computer-Assisted/methods , Image Processing, Computer-Assisted/methods , Software/standards , Aged , Algorithms , Brain Neoplasms/diagnostic imaging , Humans , Imaging, Three-Dimensional/methods , Male , Middle Aged , Retrospective Studies
2.
Nature ; 580(7805): 663-668, 2020 04.
Article in English | MEDLINE | ID: mdl-32152607

ABSTRACT

On average, an approved drug currently costs US$2-3 billion and takes more than 10 years to develop [1]. In part, this is due to expensive and time-consuming wet-laboratory experiments, poor initial hit compounds and the high attrition rates in the (pre-)clinical phases. Structure-based virtual screening has the potential to mitigate these problems. With structure-based virtual screening, the quality of the hits improves with the number of compounds screened [2]. However, despite the fact that large databases of compounds exist, the ability to carry out large-scale structure-based virtual screening on computer clusters in an accessible, efficient and flexible manner has remained difficult. Here we describe VirtualFlow, a highly automated and versatile open-source platform with perfect scaling behaviour that is able to prepare and efficiently screen ultra-large libraries of compounds. VirtualFlow is able to use a variety of the most powerful docking programs. Using VirtualFlow, we prepared one of the largest and freely available ready-to-dock ligand libraries, with more than 1.4 billion commercially available molecules. To demonstrate the power of VirtualFlow, we screened more than 1 billion compounds and identified a set of structurally diverse molecules that bind to KEAP1 with submicromolar affinity. One of the lead inhibitors (iKeap1) engages KEAP1 with nanomolar affinity (dissociation constant (Kd) = 114 nM) and disrupts the interaction between KEAP1 and the transcription factor NRF2. This illustrates the potential of VirtualFlow to access vast regions of the chemical space and identify molecules that bind with high affinity to target proteins.
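
The core of such a screen is embarrassingly parallel: each ligand is docked independently and only the best-scoring candidates are retained. The sketch below is a minimal, hypothetical orchestration in Python; the filename pattern and the scoring stand-in are made up, and the `dock` function is a placeholder for a call to an external docking program. It is not VirtualFlow's actual workflow engine.

```python
import heapq
import zlib
from concurrent.futures import ProcessPoolExecutor

def dock(ligand_path):
    """Stand-in for a real docking call on one ligand file. Here it returns a
    deterministic fake score so the sketch runs end to end; lower is better."""
    fake_score = -(zlib.crc32(ligand_path.encode()) % 1000) / 100.0
    return fake_score, ligand_path

def screen(ligand_paths, n_keep=10, n_workers=4):
    """Dock all ligands in parallel and keep the n_keep best-scoring ones."""
    best = []  # bounded max-heap on score (negated), so memory stays small
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        for score, path in pool.map(dock, ligand_paths, chunksize=64):
            if len(best) < n_keep:
                heapq.heappush(best, (-score, path))
            elif -score > best[0][0]:
                heapq.heapreplace(best, (-score, path))
    return sorted((-neg, path) for neg, path in best)

if __name__ == "__main__":
    hits = screen(f"ligand_{i}.pdbqt" for i in range(10_000))
    for score, path in hits:
        print(f"{score:6.2f}  {path}")
```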


Subject(s)
Drug Discovery/methods , Drug Evaluation, Preclinical/methods , Molecular Docking Simulation/methods , Software , User-Computer Interface , Access to Information , Automation/methods , Automation/standards , Cloud Computing , Computer Simulation , Databases, Chemical , Drug Discovery/standards , Drug Evaluation, Preclinical/standards , Kelch-Like ECH-Associated Protein 1/antagonists & inhibitors , Kelch-Like ECH-Associated Protein 1/chemistry , Kelch-Like ECH-Associated Protein 1/metabolism , Ligands , Molecular Docking Simulation/standards , Molecular Targeted Therapy , NF-E2-Related Factor 2/metabolism , Reproducibility of Results , Software/standards , Thermodynamics
3.
J Med Internet Res ; 22(4): e16533, 2020 04 17.
Article in English | MEDLINE | ID: mdl-32077858

ABSTRACT

BACKGROUND: Many comprehensive cancer centers incorporate tumor documentation software supplying structured information from the associated centers' oncology patients for internal and external audit purposes. However, much of the documentation data included in these systems often remain unused and unknown by most of the clinicians at the sites. OBJECTIVE: To improve access to such data for analytical purposes, a prerollout of an analysis layer based on the business intelligence software QlikView was implemented. This software allows for the real-time analysis and inspection of oncology-related data. The system is meant to increase access to the data while simultaneously providing tools for user-friendly real-time analytics. METHODS: The system combines in-memory capabilities (based on QlikView software) with innovative techniques that compress the complexity of the data, consequently improving its readability as well as its accessibility for designated end users. Aside from the technical and conceptual components, the software's implementation necessitated a complex system of permission and governance. RESULTS: A continuously running system including daily updates with a user-friendly Web interface and real-time usage was established. This paper introduces its main components and major design ideas. A commented video summarizing and presenting the work can be found within the Multimedia Appendix. CONCLUSIONS: The system has been well-received by a focus group of physicians within an initial prerollout. Aside from improving data transparency, the system's main benefits are its quality and process control capabilities, knowledge discovery, and hypothesis generation. Limitations such as run time, governance, or misinterpretation of data are considered.


Subject(s)
Medical Oncology/methods , Humans , Internet , Software/standards
4.
Nucleic Acids Res ; 47(D1): D596-D600, 2019 01 08.
Article in English | MEDLINE | ID: mdl-30272209

ABSTRACT

Rhea (http://www.rhea-db.org) is a comprehensive and non-redundant resource of over 11 000 expert-curated biochemical reactions that uses chemical entities from the ChEBI ontology to represent reaction participants. Originally designed as an annotation vocabulary for the UniProt Knowledgebase (UniProtKB), Rhea also provides reaction data for a range of other core knowledgebases and data repositories including ChEBI and MetaboLights. Here we describe recent developments in Rhea, focusing on a new resource description framework representation of Rhea reaction data and an SPARQL endpoint (https://sparql.rhea-db.org/sparql) that provides access to it. We demonstrate how federated queries that combine the Rhea SPARQL endpoint and other SPARQL endpoints such as that of UniProt can provide improved metabolite annotation and support integrative analyses that link the metabolome through the proteome to the transcriptome and genome. These developments will significantly boost the utility of Rhea as a means to link chemistry and biology for a more holistic understanding of biological systems and their function in health and disease.
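
For readers who want to try the endpoint, the snippet below sends an illustrative query with Python's SPARQLWrapper package. The endpoint URL comes from the abstract; the prefix and property names (rh:equation) are assumptions about the Rhea RDF schema and should be checked against the Rhea documentation, and the same pattern extends to federated queries against the UniProt endpoint.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Public endpoint given in the abstract
endpoint = SPARQLWrapper("https://sparql.rhea-db.org/sparql")

# Illustrative query: list a few reactions with their chemical equations.
# The rh: namespace and rh:equation property are assumptions about the schema.
endpoint.setQuery("""
PREFIX rh: <http://rdf.rhea-db.org/>
SELECT ?reaction ?equation
WHERE {
  ?reaction rh:equation ?equation .
}
LIMIT 5
""")
endpoint.setReturnFormat(JSON)

for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["reaction"]["value"], "|", row["equation"]["value"])
```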


Subject(s)
Databases, Chemical , Databases, Protein , Metabolomics/methods , Software/standards , Humans , Knowledge Bases , Systems Biology/methods
5.
Annu Int Conf IEEE Eng Med Biol Soc ; 2018: 2768-2771, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30440975

ABSTRACT

Previous research has supported the use of virtual reality (VR) to decrease stress, anxiety, and perceptions of pain, and to increase positive affect. However, the effect of VR on blood pressure (BP) and autonomic function in healthy populations has not been explored. This study quantifies the effect of instructed meditation augmented by a virtual environment (VE) on BP and heart rate variability (HRV) during rest and following physical (isometric handgrip) or mental (serial sevens subtraction) stress. Sixteen healthy participants underwent all conditions, and those who responded to the stress tests were included in the analysis of stress recovery. Results showed that under resting conditions, VE had no significant effect on BP or HRV when compared to seated rest and the VE video on a 2D screen. Following serial sevens, VE maintained the increased low-frequency (LF) power of HRV (66 ± 4 normalized units (n.u.)) compared to seated rest (55 ± 5 n.u., p = 0.0060); VE maintained the decreased high-frequency (HF) power of HRV (34 ± 4 n.u.) compared to seated rest (44 ± 5 n.u., p = 0.014); and VE maintained the increased LF/HF ratio (2.4 ± 0.5) compared to seated rest (1.6 ± 0.3, p = 0.012). Hence, after mental stress, VE sustains the increased sympathetic drive and reduced parasympathetic drive. VE may act as a stimulatory driver for autonomic activity and BP. Further studies are required to investigate the effect of different types of VE on BP and autonomic function.


Subject(s)
Heart Rate , Meditation , Software , Virtual Reality , Adult , Aged , Autonomic Nervous System/physiology , Blood Pressure , Female , Hand Strength , Humans , Male , Meditation/methods , Middle Aged , Software/standards
6.
Article in German | MEDLINE | ID: mdl-29349524

ABSTRACT

Smartphones and tablets with their nearly unlimited number of different applications have become an integral part of everyday life. Thus, mobile devices and applications have also found their way into the healthcare sector. For developers, manufacturers, or users as well, it is often difficult to decide whether a mobile health application is a medical device. In this context, it is extremely important for manufacturers to decide at an early stage of the development whether the product is to be introduced into the market as a medical device and is therefore subject to the legislation on medical devices. This article first presents the regulatory framework and subsequently introduces the reader to the Federal Institute for Drugs and Medical Devices' (BfArM) view of the criteria for differentiating between apps as non-medical products and apps as medical apps as well as the classification thereof. Various examples are presented to demonstrate how these criteria are applied practically and options that support developers and manufacturers in their decision making are shown. The article concludes with a reference to current developments and offers a perspective on the new European medical device regulations MDR/IVDR (Medical Device Regulation/In-Vitro Diagnostic Regulation) as well as on future challenges regarding medical apps.


Subject(s)
Device Approval/legislation & jurisprudence , Medical Device Legislation , Mobile Applications/legislation & jurisprudence , Software/legislation & jurisprudence , Device Approval/standards , Germany , Humans , Mobile Applications/standards , National Health Programs/legislation & jurisprudence , Software/classification , Software/standards , Software Design
7.
J Pharm Biomed Anal ; 131: 40-47, 2016 Nov 30.
Article in English | MEDLINE | ID: mdl-27521988

ABSTRACT

The characterization of herbal prescriptions serves as a foundation for quality control and regulation of herbal medicines. Previously, the characterization of herbal chemicals from natural medicines often relied on the analysis of signature fragment ions from the acquired tandem mass spectrometry (MS/MS) spectra with prior knowledge of the herbal species present in the herbal prescriptions of interest. Nevertheless, such an approach is often limited to target components, and it risks missing critical components of which we have no prior knowledge. We previously reported a "diagnostic ion-guided network bridging" strategy. It is a generally applicable and robust approach to analyze unknown substances from complex mixtures in an untargeted manner. In this study, we have developed a standalone software package named "Nontargeted Diagnostic Ion Network Analysis (NINA)" with a graphical user interface based on a strategy for post-acquisition data analysis. NINA allows one to rapidly determine the nontargeted diagnostic ions (NIs) by summarizing all of the fragment ions shared by the precursors from the acquired MS/MS spectra. A NI-guided network using bridging components that possess two or more NIs can then be established via NINA. With such a network, we could sequentially identify the structures of all the NIs once a single compound has been identified de novo. The structures of NIs can then be used as a priori knowledge to narrow the candidates containing the sub-structure of the corresponding NI from the database hits. Subsequently, we applied the NINA software to the characterization of a model herbal prescription, Re-Du-Ning injection, and rapidly identified 56 herbal chemicals from the prescription using an ultra-performance liquid chromatography quadrupole time-of-flight system in the negative mode with no knowledge of the herbal species or herbal chemicals in the mixture. Therefore, we believe the applications of NINA will greatly facilitate the characterization of complex mixtures, such as natural medicines, especially when no advance information is available. In addition to herbal medicines, the NINA-based workflow will also benefit many other fields, such as environmental analysis, nutritional science, and forensic analysis.
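
The network-building idea described above can be captured in a few lines: treat each precursor's MS/MS spectrum as a set of fragment ions, call fragments shared by several precursors nontargeted diagnostic ions (NIs), and link precursors that share two or more NIs as bridging components. The sketch below is a simplified, hypothetical rendering of that logic in Python, not the NINA code itself; the m/z values are made up.

```python
from collections import defaultdict
from itertools import combinations

def diagnostic_ions(spectra, min_precursors=2):
    """Fragment ions shared by at least `min_precursors` precursors."""
    counts = defaultdict(int)
    for fragments in spectra.values():
        for frag in set(fragments):
            counts[frag] += 1
    return {frag for frag, n in counts.items() if n >= min_precursors}

def bridging_network(spectra, min_shared_nis=2):
    """Edges between precursors that share >= min_shared_nis diagnostic ions."""
    nis = diagnostic_ions(spectra)
    edges = []
    for (p1, f1), (p2, f2) in combinations(spectra.items(), 2):
        shared = (set(f1) & set(f2)) & nis
        if len(shared) >= min_shared_nis:
            edges.append((p1, p2, sorted(shared)))
    return edges

# Hypothetical example: precursor m/z -> list of fragment m/z values
spectra = {
    463.1: [285.0, 151.0, 135.0],
    447.1: [285.0, 151.0, 119.0],
    417.1: [285.0, 135.0, 119.0],
}
print(bridging_network(spectra))
```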


Subject(s)
Biological Products/analysis , Drugs, Chinese Herbal/analysis , Software , Tandem Mass Spectrometry/methods , Software/standards , Tandem Mass Spectrometry/standards
8.
Article in German | MEDLINE | ID: mdl-26346898

ABSTRACT

BACKGROUND: Telemedicine systems are already used in a variety of areas to improve patient care. The lack of standardization in these solutions leads to a lack of interoperability between systems. Internationally accepted standards can help to solve this. With Integrating the Healthcare Enterprise (IHE), a worldwide initiative of users and vendors is working on the use of defined standards for specific use cases by describing those use cases in so-called IHE Profiles. OBJECTIVES: The aim of this work is to determine how telemedicine applications can be implemented using IHE Profiles. METHODS: Based on a literature review, exemplary telemedicine applications are described and the technical capabilities of IHE Profiles are evaluated. These IHE Profiles are examined for their usability and then evaluated in exemplary telemedicine application architectures. RESULTS: Some IHE Profiles can be identified as useful for intersectoral patient records (e.g., the PEHR at Heidelberg), as well as for point-to-point communication where no patient record is involved. In the area of patient records, the IHE Profile "Cross-Enterprise Document Sharing (XDS)" is often used. Point-to-point communication can be supported using IHE "Cross-Enterprise Document Media Interchange (XDM)". IHE-based telemedicine applications allow caregivers to stay informed about their patients using data from intersectoral patient records, and there are also potential savings from reusing the standardized interfaces in other scenarios.


Subject(s)
Delivery of Health Care, Integrated/standards , Electronic Health Records/standards , Hospital Information Systems/standards , Medical Record Linkage/standards , Models, Organizational , Telemedicine/standards , Germany , Meaningful Use/standards , Practice Guidelines as Topic , Software/standards , Systems Integration
9.
Stud Health Technol Inform ; 216: 45-9, 2015.
Article in English | MEDLINE | ID: mdl-26262007

ABSTRACT

Healthcare information systems are big business. There is currently an explosion of EHR/EMR products on the market, and the best tools are very expensive. Many developing countries and healthcare providers cannot access such tools, and for those who can, there is no clear strategy for the evolution, scaling, and cost of these electronic health products. The lack of standards-based implementations leads to the creation of isolated information silos that cannot be exploited (i.e., shared between providers to promote a holistic view of each patient's medical history). This paper presents the main elements behind a standards-based, open-source EHR platform that is future-proof and can evolve and scale at minimal cost. The proposed EHR architecture is based on openEHR specifications, adding elements that emerged from research and development experience, leading to a design that can be implemented in any modern technology. Different implementations will be interoperable by design. The platform will benefit settings with scarce resources by reusing clinical knowledge and a common set of software components and services.


Subject(s)
Confidentiality/standards , Electronic Health Records/organization & administration , Guidelines as Topic , Information Storage and Retrieval/standards , Medical Record Linkage/standards , Internationality , Organizational Objectives , Software/standards
10.
Homeopathy ; 104(3): 190-6, 2015 Jul.
Article in English | MEDLINE | ID: mdl-26143452

ABSTRACT

UNLABELLED: Collection of data on case histories is not yet common in homeopathy despite its great importance for this method. Computer program development progresses slowly and discussion about requirements is scarce. Two Dutch projects assessed the Materia Medica of some homeopathic medicines and six homeopathic symptoms. The second project in particular relied heavily on data collection. In both projects much effort was spent on reaching consensus between participating doctors. There was nevertheless much variance between doctors despite these consensus efforts. Assessing causality seems to be the most important source of bias; there is also much variance in assessing symptoms. CONCLUSION: Data collection software should be developed step by step, guided by close monitoring of and feedback from participating practitioners.


Subject(s)
Data Collection/methods , Decision Making, Computer-Assisted , Homeopathy/methods , Materia Medica/standards , Software/supply & distribution , Consensus , Data Collection/standards , Homeopathy/standards , Humans , Materia Medica/therapeutic use , Practice Patterns, Physicians' , Software/standards
11.
Eur J Cancer ; 51(9): 997-1017, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25956208

ABSTRACT

UNLABELLED: Population-based cancer registries (CRs) in Europe have played a supportive, sometimes guiding, role since the 1950s in describing geographic variation in cancer epidemics and in comparing oncological practice and preventive interventions, for all types of cancer separately and simultaneously. This paper deals with the historical and longitudinal development of the roughly 160 CRs and their programme owners (POs), which emerged from 1927 onwards and proliferated from the late 1970s, especially in southern and continental Europe. About 40 million newly diagnosed patients have been recorded since the 1950s out of a total of about 100 million, of whom almost 20 million are still alive and about 10% die from cancer each year. The perception of unity in diversity and of suboptimal comparability in the performance and governance of CRs was confirmed in the EUROCOURSE (EUROpe against cancer: Optimisation of the Use of Registries for Scientific Excellence in research) European Research Area (ERA)-net coordination FP7 project of the European Commission (EU), which explored best practices, bottlenecks and future challenges of CRs. Regional oncologic and public health changes, and also the academic embedding of CRs, varied considerably, although by 2012 optimal cancer surveillance demanded intensive collaboration with professional and institutional stakeholders in two major areas (public health and clinical research) and five minor overlapping cancer research domains: aetiologic research, mass screening evaluation, quality of care, translational prognostics and survivorship. Each of these domains addresses specific study questions, mixes of disciplines, methodologies, additional data sources and funding mechanisms. POs increasingly tended to be public health institutes and health ministries, but also comprehensive cancer centres and cancer societies, with funding granted more and more on a project or programme basis. POs were not easy to pin down because of their multiple, sometimes competing (funding) obligations and the increasing complexity of cancer surveillance. They also seemed to need guiding principles for the governance of 'their' CR(s), as well as to appreciate the value of collaborative research in Europe and to shield CRs against unreasonable data protection requirements in the case of linkages. Despite shortcomings related to access to specialised care, especially for survival cohort studies, European databases for studies of incidence and survival (such as ACCIS and EUREG on the one hand and EUROCARE and RARECARE on the other) have proved to be powerful means for comparative national or regional cancer surveillance. Pooling of comparable data will exhibit much instructive variation in time and place. If POs of CRs were to regard multinational European studies of the risk and prognosis of cancer as serving their own regional or national interests, progress in this field would accelerate and lead to more consistent funding from the EU. The current 20 million cancer survivors and their care providers are likely to appreciate more feedback. CONCLUSION: Most CRs remain uniquely able to report on progress against cancer through studies of variation in incidence (in time and place), detection and survival, and referral and treatment patterns and their (side) effects in unselected patients, the latter especially in the (very) elderly. Programming and profiling this multiple and diverse clinical and prevention research is likely to promote the involvement of public health and clinical stakeholders with a population-based research interest, and increasingly of patient groups and licensed 'buyers' of oncologic services.


Subject(s)
Clinical Protocols , Health Information Management , Neoplasms , Public Health , Registries , Software , Clinical Protocols/standards , Health Information Management/education , Health Information Management/organization & administration , Health Information Management/standards , Health Services Research/history , Health Services Research/methods , Health Services Research/standards , History, 20th Century , History, 21st Century , Humans , Learning , Neoplasms/epidemiology , Neoplasms/therapy , Ownership , Population Surveillance/methods , Public Health/education , Public Health/history , Public Health/methods , Registries/standards , Software/legislation & jurisprudence , Software/standards
12.
J Am Coll Radiol ; 12(1): 38-42, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25455196

ABSTRACT

The ACR recognizes that low-dose CT for lung cancer screening has the potential to significantly reduce mortality from lung cancer in the appropriate high-risk population. The ACR supports the recommendations of the US Preventive Services Task Force and the National Comprehensive Cancer Network for screening patients. To be effective, lung cancer screening should be performed at sites providing high-quality low-dose CT examinations overseen and interpreted by qualified physicians using a structured reporting and management system. The ACR has developed a set of tools necessary for radiologists to take the lead on the front lines of lung cancer screening. The ACR Lung Cancer Screening Center designation is built upon the ACR CT accreditation program and requires use of Lung-RADS or a similar structured reporting and management system. This designation provides patients and referring providers with the assurance that they will receive high-quality screening with appropriate follow-up care.


Subject(s)
Accreditation/standards , Early Detection of Cancer/standards , Lung Neoplasms/diagnostic imaging , Radiology Information Systems/standards , Software/standards , Tomography, X-Ray Computed/standards , Humans , Lung Neoplasms/prevention & control , United States
13.
J Comput Chem ; 34(25): 2212-21, 2013 Sep 30.
Article in English | MEDLINE | ID: mdl-23813626

ABSTRACT

The program VinaMPI has been developed to enable massive virtual drug screens on leadership-class computing resources, using a large number of cores to decrease the time-to-completion of the screen. VinaMPI is a massively parallel Message Passing Interface (MPI) program based on the multithreaded virtual docking program AutoDock Vina, and is used to distribute tasks while multithreading is used to speed up individual docking tasks. VinaMPI uses a distribution scheme in which tasks are evenly distributed to the workers based on the complexity of each task, as defined by the number of rotatable bonds in each chemical compound investigated. VinaMPI efficiently handles multiple proteins in a ligand screen, allowing for high-throughput inverse docking that presents new opportunities for improving the efficiency of the drug discovery pipeline. VinaMPI successfully ran on 84,672 cores with a continual decrease in job completion time with increasing core count. The ratio of the number of tasks in a screening to the number of workers should be at least around 100 in order to have a good load balance and an optimal job completion time. The code is freely available and downloadable. Instructions for downloading and using the code are provided in the Supporting Information.
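
The even-distribution idea (assign docking tasks to workers according to their complexity, measured by rotatable-bond count) resembles a classic static load-balancing heuristic: sort tasks by decreasing cost and always give the next task to the currently lightest-loaded worker. The sketch below illustrates that heuristic in Python; it is a simplification under that assumption, not VinaMPI's MPI implementation, and the ligand names and bond counts are invented.

```python
import heapq

def distribute(tasks, n_workers):
    """Greedy longest-processing-time assignment.

    tasks : list of (ligand_id, rotatable_bond_count) pairs
    Returns one task list per worker with roughly equal total cost.
    """
    # (current_load, worker_index) min-heap: the next task goes to the lightest worker
    heap = [(0, w) for w in range(n_workers)]
    heapq.heapify(heap)
    assignment = [[] for _ in range(n_workers)]
    for ligand, cost in sorted(tasks, key=lambda t: t[1], reverse=True):
        load, w = heapq.heappop(heap)
        assignment[w].append(ligand)
        heapq.heappush(heap, (load + cost, w))
    return assignment

# Hypothetical example: 6 ligands with different rotatable-bond counts, 2 workers
print(distribute([("L1", 12), ("L2", 3), ("L3", 8),
                  ("L4", 5), ("L5", 7), ("L6", 2)], 2))
```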


Subject(s)
Computing Methodologies , Drug Evaluation, Preclinical , Estrogen Receptor alpha/agonists , Humans , Ligands , Small Molecule Libraries/chemistry , Software/standards
14.
Nutr Clin Pract ; 28(4): 515-21, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23753649

ABSTRACT

BACKGROUND: Computerized software programs reduce errors and increase consistency when ordering parenteral nutrition (PN). The purpose of this study was to evaluate the effectiveness of our computerized neonatal PN calculator ordering program in reducing errors and optimizing nutrient intake. MATERIALS AND METHODS: This was a retrospective study of infants requiring PN during the first 2-3 weeks of life. Caloric, protein, calcium, and phosphorus intakes; days above and below amino acid (AA) goals; and PN ordering errors were recorded. Infants were divided into 3 groups by birth weight for analysis: ≤1000 g, 1001-1500 g, and >1500 g. Intakes and outcomes of infants before (2007) vs after (2009) implementation of the calculator for each group were compared. RESULTS: There were no differences in caloric, protein, or phosphorus intakes in 2007 vs 2009 in any group. Mean protein intakes were 97%-99% of goal for ≤1000-g and 1001- to 1500-g infants in 2009 vs 87% of goal for each group in 2007. In 2007, 7.6 per 100 orders were above and 11.5 per 100 were below recommended AA intakes. Calcium intakes were higher in 2009 vs 2007 in ≤1000-g (46.6 ± 6.1 vs 39.5 ± 8.0 mg/kg/d, P < .001) and >1500-g infants (50.6 ± 7.4 vs 39.9 ± 8.3 mg/kg/d, P < .001). Ordering errors were reduced from 4.6 per 100 in 2007 to 0.1 per 100 in 2009. CONCLUSION: Our study reaffirms that computerized ordering systems can increase the quality and safety of neonatal PN orders. Calcium and AA intakes were optimized and ordering errors were minimized using the computer-based ordering program.
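
The kind of automated check such an ordering program performs can be illustrated with a small rule: compare the ordered amino acid dose per kilogram against a goal range and flag deviations before the order is signed. The sketch below is purely illustrative Python; the threshold values are hypothetical placeholders, not the clinical targets used in the study.

```python
def check_amino_acid_order(weight_kg, amino_acid_g_per_day,
                           goal_g_per_kg=(3.0, 4.0)):
    """Flag parenteral-nutrition orders whose amino acid dose falls outside
    a goal range (hypothetical example values, not clinical guidance)."""
    dose = amino_acid_g_per_day / weight_kg
    low, high = goal_g_per_kg
    if dose < low:
        return f"below goal: {dose:.2f} g/kg/d (goal {low}-{high})"
    if dose > high:
        return f"above goal: {dose:.2f} g/kg/d (goal {low}-{high})"
    return f"within goal: {dose:.2f} g/kg/d"

print(check_amino_acid_order(0.9, 3.2))   # 3.56 g/kg/d -> within goal
```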


Subject(s)
Computers , Energy Intake , Nutrition Assessment , Parenteral Nutrition, Total/standards , Prescriptions/standards , Software/standards , Calcium/administration & dosage , Calcium, Dietary/administration & dosage , Dietary Proteins/administration & dosage , Humans , Infant, Newborn , Parenteral Nutrition Solutions/chemistry , Parenteral Nutrition, Total/adverse effects , Parenteral Nutrition, Total/methods , Phosphorus/administration & dosage , Phosphorus, Dietary/administration & dosage , Retrospective Studies
15.
BMC Neurosci ; 12: 100, 2011 Oct 11.
Article in English | MEDLINE | ID: mdl-21989414

ABSTRACT

BACKGROUND: To date, some of the most useful and physiologically relevant neuronal cell culture systems, such as high density co-cultures of astrocytes and primary hippocampal neurons, or differentiated stem cell-derived cultures, are characterized by high cell density and partially overlapping cellular structures. Efficient analytical strategies are required to enable rapid, reliable, quantitative analysis of neuronal morphology in these valuable model systems. RESULTS: Here we present the development and validation of a novel bioinformatics pipeline called NeuriteQuant. This tool enables fully automated morphological analysis of large-scale image data from neuronal cultures or brain sections that display a high degree of complexity and overlap of neuronal outgrowths. It also provides an efficient web-based tool to review and evaluate the analysis process. In addition to its built-in functionality, NeuriteQuant can be readily extended based on the rich toolset offered by ImageJ and its associated community of developers. As proof of concept we performed automated screens for modulators of neuronal development in cultures of primary neurons and neuronally differentiated P19 stem cells, which demonstrated specific dose-dependent effects on neuronal morphology. CONCLUSIONS: NeuriteQuant is a freely available open-source tool for the automated analysis and effective review of large-scale high-content screens. It is especially well suited to quantify the effect of experimental manipulations on physiologically relevant neuronal cultures or brain sections that display a high degree of complexity and overlap among neurites or other cellular structures.
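
NeuriteQuant itself is built on ImageJ, but the general measurement it automates (estimating neurite outgrowth from a fluorescence image by thresholding and skeletonizing the stained structures) can be sketched generically with scikit-image in Python. The snippet below is an illustration of that general approach under assumed inputs, not the NeuriteQuant pipeline.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import remove_small_objects, skeletonize

def neurite_length_px(image):
    """Rough total neurite length, in skeleton pixels, from a 2-D
    grayscale image of neurite staining."""
    mask = image > threshold_otsu(image)            # segment stained structures
    mask = remove_small_objects(mask, min_size=64)  # drop small debris
    skeleton = skeletonize(mask)                    # reduce neurites to 1-px lines
    return int(skeleton.sum())

# Hypothetical usage with a random image standing in for real data
rng = np.random.default_rng(0)
print(neurite_length_px(rng.random((256, 256))))
```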


Subject(s)
Image Cytometry/methods , Neurites/ultrastructure , Neurogenesis/physiology , Software Validation , Software/standards , Algorithms , Animals , Cell Culture Techniques/methods , Cell Line , Computational Biology/methods , Drug Evaluation, Preclinical/methods , Information Dissemination/methods , Mice , Nerve Growth Factors/physiology , Neurites/physiology
16.
Biomed Eng Online ; 9: 45, 2010 Sep 06.
Article in English | MEDLINE | ID: mdl-20819204

ABSTRACT

BACKGROUND: Interpreting and controlling bioelectromagnetic phenomena require realistic physiological models and accurate numerical solvers. A semi-realistic model often used in practice is the piecewise constant conductivity model, for which only the interfaces have to be meshed. This simplified model makes it possible to use Boundary Element Methods. Unfortunately, most Boundary Element solutions are confronted with accuracy issues when the conductivity ratio between neighboring tissues is high, as for instance the scalp/skull conductivity ratio in electro-encephalography. To overcome this difficulty, we proposed a new method called the symmetric BEM, which is implemented in the OpenMEEG software. The aim of this paper is to present OpenMEEG, both from the theoretical and the practical point of view, and to compare its performance with other competing software packages. METHODS: We have run a benchmark study in the field of electro- and magneto-encephalography, in order to compare the accuracy of OpenMEEG with other freely distributed forward solvers. We considered spherical models, for which analytical solutions exist, and we designed randomized meshes to assess the variability of the accuracy. Two measures were used to characterize the accuracy: the relative difference measure and the magnitude ratio. The comparisons were run either with a constant number of mesh nodes or with a constant number of unknowns across methods. Computing times were also compared. RESULTS: We observed more pronounced differences in accuracy in electroencephalography than in magnetoencephalography. The methods could be classified into three categories: the linear collocation methods, which run very fast but with low accuracy; the linear collocation methods with the isolated skull approach, for which the accuracy is improved; and OpenMEEG, which clearly outperforms the others. As far as speed is concerned, OpenMEEG is on par with the other methods for a constant number of unknowns, and is hence faster for a prescribed accuracy level. CONCLUSIONS: This study clearly shows that OpenMEEG represents the state of the art for forward computations. Moreover, our software development strategies have made it handy to use and to integrate with other packages. The bioelectromagnetic research community should therefore be able to benefit from OpenMEEG with a limited development effort.
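
The two accuracy measures named in the abstract are commonly defined as follows: the relative difference measure (RDM) compares the normalized topographies of the numerical and analytical solutions, and the magnitude ratio (MAG) compares their norms. The small Python sketch below encodes those customary definitions from the forward-modeling literature; the exact variants used in the paper may differ in detail.

```python
import numpy as np

def rdm(v_num, v_ana):
    """Relative difference measure between numerical and analytical solutions:
    || v_num/||v_num|| - v_ana/||v_ana|| ||_2  (0 means identical topography)."""
    v_num, v_ana = np.asarray(v_num, float), np.asarray(v_ana, float)
    return np.linalg.norm(v_num / np.linalg.norm(v_num)
                          - v_ana / np.linalg.norm(v_ana))

def mag(v_num, v_ana):
    """Magnitude ratio ||v_num|| / ||v_ana||  (1 means identical amplitude)."""
    return np.linalg.norm(v_num) / np.linalg.norm(v_ana)

# Hypothetical example: a slightly perturbed copy of an "analytical" potential
v_ana = np.sin(np.linspace(0, np.pi, 64))
v_num = 1.05 * v_ana + 0.01
print(rdm(v_num, v_ana), mag(v_num, v_ana))
```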


Subject(s)
Electromagnetic Phenomena , Software , Benchmarking , Computers , Electric Impedance , Electricity , Electroencephalography , Licensure , Magnetics , Magnetoencephalography , Models, Theoretical , Quality Control , Software/legislation & jurisprudence , Software/standards , Time Factors , Tomography
17.
Pain Physician ; 13(4): 321-35, 2010.
Article in English | MEDLINE | ID: mdl-20648201

ABSTRACT

BACKGROUND: With advances in spinal cord stimulation (SCS) technology, particularly rechargeable implantable systems, patients are now being offered a wider range of parameters to treat their pain. In particular, pulse width (PW) programming ranges of rechargeable implantable pulse generators now match those of radiofrequency systems (with programmability up to 1000 microseconds). The intent of the present study was to investigate the effects of varying PW in SCS. OBJECTIVE: To understand the effects of PW programming in spinal cord stimulation (SCS). DESIGN: Single-center, prospective, randomized, single-blind evaluation of the technical and clinical outcomes of PW programming. SETTING: Acute, outpatient follow-up. METHODS: Subjects using fully-implanted SCS for > 3 months to treat chronic intractable low back and/or leg pain. Programming of a wide range (50-1000 microseconds) of programmed PW settings using each patient's otherwise unchanged 'walk-in' program. OUTCOME MEASURES: Paresthesia thresholds (perception, maximum comfortable, discomfort), paresthesia coverage and patient choice of tested programs. RESULTS: We found strength-duration parameters of chronaxie and rheobase to be 295 (242 - 326) microseconds and 2.5 (1.3 - 3.3) mA, respectively. The median PW of all patients' 'walk-out' programs was 400 microseconds, approximately 48% higher than median chronaxie (p = 0.01), suggesting that chronaxie may not relate to patient-preferred stimulation settings. We found that 7/19 patients selected new PW programs, which significantly increased their paresthesia-pain overlap by 56% on average (p = 0.047). We estimated that 10/19 patients appeared to have greater paresthesia coverage, and 8/19 patients appeared to display a 'caudal shift' of paresthesia coverage with increased PW. LIMITATIONS: Small number of patients. CONCLUSIONS: Variable PW programming in SCS appears to have clinical value, demonstrated by some patients improving their paresthesia-pain overlap, as well as the ability to increase and even 'steer' paresthesia coverage.
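
Chronaxie and rheobase summarize the classical strength-duration relationship. Under the Weiss model (an assumption; the abstract does not state which model was fitted), the threshold current at a pulse width PW is I(PW) = rheobase * (1 + chronaxie / PW). The sketch below evaluates that relationship with the values reported in the abstract.

```python
def weiss_threshold(pw_us, rheobase_ma=2.5, chronaxie_us=295.0):
    """Threshold current (mA) at pulse width pw_us under the Weiss
    strength-duration model: I = rheobase * (1 + chronaxie / PW)."""
    return rheobase_ma * (1.0 + chronaxie_us / pw_us)

# Thresholds at a few programmable pulse widths (rheobase and chronaxie values
# taken from the abstract; the choice of the Weiss model is an assumption).
# At PW = chronaxie the threshold is twice the rheobase, by definition.
for pw in (50, 295, 400, 1000):
    print(f"PW = {pw:4d} us -> threshold ~ {weiss_threshold(pw):.1f} mA")
```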


Subject(s)
Electric Stimulation Therapy/methods , Low Back Pain/therapy , Prosthesis Implantation/methods , Sciatica/therapy , Software , Spinal Cord/surgery , Electric Stimulation Therapy/adverse effects , Electrodes, Implanted/standards , Female , Humans , Male , Middle Aged , Outcome Assessment, Health Care/methods , Pain Measurement/methods , Paresthesia/etiology , Paresthesia/prevention & control , Prospective Studies , Prosthesis Implantation/instrumentation , Single-Blind Method , Software/standards , Treatment Outcome
18.
J Neurosci Methods ; 191(1): 110-8, 2010 Aug 15.
Article in English | MEDLINE | ID: mdl-20595034

ABSTRACT

Prior studies of multichannel ECoG from animals showed that beta and gamma oscillations carried perceptual information in both local and global spatial patterns of amplitude modulation, when the subjects were trained to discriminate conditioned stimuli (CS). Here the hypothesis was tested that similar patterns could be found in the scalp EEG of human subjects trained to discriminate simultaneous visual-auditory CS. Signals were continuously recorded from 64 equispaced scalp electrodes and band-pass filtered. The Hilbert transform gave the analytic phase, which segmented the EEG into temporal frames, and the analytic amplitude, which expressed the pattern in each frame as a feature vector. Methods applied to the ECoG were adapted to the EEG for systematic search of the beta-gamma spectrum, the time period after CS onset, and the scalp surface to locate patterns that could be classified with respect to type of CS. Spatial patterns of EEG amplitude modulation were found from all subjects that could be classified with respect to stimulus combination type significantly above chance levels. The patterns were found in the beta range (15-22 Hz) but not in the gamma range. They occurred in three short bursts following CS onset. They were non-local, occupying the entire array. Our results suggest that the scalp EEG can yield information about the timing of episodically synchronized brain activity in higher cognitive function, so that future studies in brain-computer interfacing can be better focused. Our methods may be most valuable for analyzing data from dense arrays with very high spatial and temporal sampling rates.
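
The analytic-signal step described above (band-pass filter, then Hilbert transform to obtain instantaneous amplitude and phase per channel) can be reproduced generically with SciPy. The sketch below is a minimal illustration on synthetic data, not the authors' analysis code; the beta band edges follow the 15-22 Hz range reported in the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def analytic_amplitude_phase(eeg, fs, band=(15.0, 22.0), order=4):
    """Band-pass each channel and return (amplitude, phase) from the
    analytic signal. eeg has shape (n_channels, n_samples)."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=-1)
    analytic = hilbert(filtered, axis=-1)
    return np.abs(analytic), np.unwrap(np.angle(analytic), axis=-1)

# Synthetic stand-in for a 64-channel recording at 256 Hz
fs = 256
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 18 * t) + 0.1 * np.random.randn(64, t.size)
amplitude, phase = analytic_amplitude_phase(eeg, fs)
print(amplitude.shape, phase.shape)
```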


Subject(s)
Brain Mapping/methods , Cerebral Cortex/physiology , Electroencephalography/classification , Electroencephalography/methods , Perception/physiology , Sensation/physiology , Signal Processing, Computer-Assisted , Acoustic Stimulation/classification , Acoustic Stimulation/methods , Adult , Biological Clocks/physiology , Brain Mapping/classification , Cognition/classification , Cognition/physiology , Cortical Synchronization , Discrimination Learning/classification , Discrimination Learning/physiology , Evoked Potentials/physiology , Humans , Male , Pattern Recognition, Automated , Photic Stimulation/methods , Software/classification , Software/standards , Young Adult
19.
J Comput Chem ; 31(11): 2109-25, 2010 Aug.
Article in English | MEDLINE | ID: mdl-20127741

ABSTRACT

Many molecular docking programs are available nowadays, and thus it is of great practical value to evaluate and compare their performance. We have conducted an extensive evaluation of four popular commercial molecular docking programs, including Glide, GOLD, LigandFit, and Surflex. Our test set consists of 195 protein-ligand complexes with high-resolution crystal structures (resolution

Subject(s)
Computational Biology/methods , Ligands , Proteins/chemistry , Proteins/metabolism , Software , Computer Simulation , Drug Evaluation, Preclinical , Enzyme Inhibitors/chemistry , Enzyme Inhibitors/pharmacology , Protein Binding , Software/standards , Solvents/chemistry , Structure-Activity Relationship
20.
J Neurosci Methods ; 169(1): 239-48, 2008 Mar 30.
Article in English | MEDLINE | ID: mdl-18215424

ABSTRACT

The capabilities of currently commercially available auditory steady-state response (ASSR) devices are mostly limited in order to avoid unintentional misuse and to guarantee patient safety. Some setups, for example, do not allow the application of high intensities or the use of custom stimuli. Moreover, most devices generally only allow data collection using at most two EEG channels. The freedom to modify and extend the accompanying software and hardware is very restricted or nonexistent. As a result, these devices are not suited for research and several clinical diagnostic purposes. In this paper, a research platform for multi-channel ASSR measurements is presented, referred to as SOMA (setup ORL for multi-channel ASSR). The setup allows multi-channel measurements and the use of custom stimuli. It can be easily extended to facilitate new measurement protocols and real-time signal processing. The mobile setup is based on an inexpensive multi-channel RME soundcard, and the software is written in C++. Both the hardware and software of the setup are described. An evaluation study with nine normal-hearing subjects shows no significant performance differences between a reference system and the proposed platform. SOMA is a flexible, modularly extensible, mobile, high-end multi-channel ASSR test platform.


Subject(s)
Audiometry, Evoked Response/methods , Auditory Perception/physiology , Electroencephalography/methods , Electronics/methods , Evoked Potentials, Auditory/physiology , Software/standards , Acoustic Stimulation , Adult , Audiometry, Evoked Response/instrumentation , Auditory Cortex/physiology , Auditory Threshold/physiology , Electroencephalography/instrumentation , Electronics/instrumentation , Functional Laterality/physiology , Humans , Reference Values , Signal Processing, Computer-Assisted/instrumentation , Software/trends