Results 1 - 17 of 17
1.
Adv Exp Med Biol; 1361: 23-35, 2022.
Article in English | MEDLINE | ID: mdl-35230681

ABSTRACT

Precision oncology relies mainly on genetic and molecular patient profiling from high-throughput sequencing data. The need to process and analyze large volumes of data has driven the development of robust computational tools and methods. The most challenging aspect of implementing a precision oncology workflow is handling large volumes of data properly while ensuring that the results are reproducible and replicable. In this chapter, we provide a detailed description of the various tools available for the design and implementation of a precision oncology pipeline, along with the technical considerations needed to use these tools effectively. We then provide a guide to the development of a precision oncology pipeline, with specific emphasis on the software workflows and infrastructure required.


Subjects
Neoplasms, Computational Biology/methods, Genomics/methods, Humans, Neoplasms/genetics, Neoplasms/therapy, Precision Medicine, Software, Workflow
2.
BMC Bioinformatics; 22(1): 60, 2021 Feb 09.
Article in English | MEDLINE | ID: mdl-33563206

ABSTRACT

BACKGROUND: Current high-throughput technologies (e.g., whole genome sequencing, RNA-Seq, ChIP-Seq) generate huge amounts of data, and their use becomes more widespread with each passing year. Complex analysis pipelines involving several computationally intensive steps have to be applied to an increasing number of samples. Workflow management systems allow parallelization and more efficient use of computational power. Nevertheless, this mostly happens by assigning the available cores to the pipelines of one or a few samples at a time. We refer to this approach as the naive parallel strategy (NPS). Here, we discuss an alternative approach, the concurrent execution strategy (CES), which distributes the available processors equally across every sample's pipeline. RESULTS: Theoretically, we show that under loose conditions the CES yields a substantial speedup, with an ideal gain ranging from 1 to the number of samples. We also observe that the CES yields even faster executions in practice, since parallelizable tasks scale sub-linearly. We tested both strategies on a whole exome sequencing pipeline applied to three publicly available matched tumour-normal sample pairs of gastrointestinal stromal tumour. The CES achieved speedups in latency of up to 2-2.4x compared to the NPS. CONCLUSIONS: Our results suggest that if resource distribution is further tailored to specific situations, an even greater performance gain could be achieved when executing multi-sample pipelines. For this to be feasible, the tools included in the pipeline would need to be benchmarked, and in our opinion these benchmarks should be performed consistently by the tools' developers. Finally, these results suggest that concurrent strategies might also lead to energy and cost savings by making the use of low-power machine clusters feasible.
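The ideal gain range of 1 to the number of samples can be illustrated with a toy Amdahl-style cost model (my own simplification, not the paper's exact formulation): each sample's pipeline has a serial part and a perfectly parallelizable part, the NPS gives all cores to one sample at a time, and the CES splits the cores evenly across all samples.

```python
def nps_time(n_samples, cores, serial, parallel):
    """Naive parallel strategy: samples run one after another,
    each using all available cores for its parallel portion."""
    return n_samples * (serial + parallel / cores)

def ces_time(n_samples, cores, serial, parallel):
    """Concurrent execution strategy: all samples run at once,
    each receiving an equal share of the cores."""
    per_sample_cores = cores / n_samples
    return serial + parallel / per_sample_cores

def ces_speedup(n_samples, cores, serial, parallel):
    """Latency of the NPS divided by latency of the CES."""
    return (nps_time(n_samples, cores, serial, parallel)
            / ces_time(n_samples, cores, serial, parallel))

# Purely serial per-sample work: the CES overlaps the serial parts -> gain = N.
print(ces_speedup(4, 8, serial=1.0, parallel=0.0))  # 4.0
# Perfectly parallel work: both strategies saturate the cores -> gain = 1.
print(ces_speedup(4, 8, serial=0.0, parallel=8.0))  # 1.0
```

Real pipelines fall between the two extremes, which is consistent with the 2-2.4x latency speedups reported for the exome-sequencing benchmark.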


Subjects
Computational Biology, Exome Sequencing, High-Throughput Nucleotide Sequencing, Software, Chromatin Immunoprecipitation Sequencing, Computational Biology/methods, Exome Sequencing/standards, Workflow
3.
BMC Bioinformatics; 19(1): 97, 2018 Mar 13.
Article in English | MEDLINE | ID: mdl-29534677

ABSTRACT

BACKGROUND: The development of high-throughput experimental technologies, such as next-generation sequencing, has led to new challenges for handling, analyzing and integrating the resulting large and diverse datasets. Bioinformatic analysis of these data commonly requires a number of mutually dependent steps applied to numerous samples for multiple conditions and replicates. To support these analyses, a number of workflow management systems (WMSs) have been developed that allow automated execution of the corresponding analysis workflows. Major advantages of WMSs are the easy reproducibility of results as well as the reusability of workflows or their components. RESULTS: In this article, we present Watchdog, a WMS for the automated analysis of large-scale experimental data. Its main features include straightforward processing of replicate data, support for distributed computer systems, customizable error detection and manual intervention in workflow execution. Watchdog is implemented in Java and is thus platform-independent; it also allows easy sharing of workflows and the corresponding program modules. It provides a graphical user interface (GUI) for workflow construction using pre-defined modules, as well as a helper script for creating new module definitions. Workflows can be executed using either the GUI or a command-line interface, and a web interface is provided for monitoring the execution status and intervening in case of errors. To illustrate its potential on a real-life example, a comprehensive workflow and modules for the analysis of RNA-seq experiments were implemented and are provided with the software in addition to simple test examples. CONCLUSIONS: Watchdog is a powerful and flexible WMS for the analysis of large-scale high-throughput experiments. We believe it will greatly benefit users both with and without programming skills who want to develop and apply bioinformatic workflows with reasonable overhead.
The software, example workflows and comprehensive documentation are freely available at www.bio.ifi.lmu.de/watchdog.
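The core abstraction behind a WMS of this kind — tasks with explicit dependencies, executed in order with per-task error detection — can be sketched in a few lines. This is a conceptual illustration only: Watchdog itself is a Java application configured via workflow definitions, and every name below is hypothetical.

```python
# Conceptual sketch of a dependency-driven workflow executor: tasks run in
# topological order, failures are detected per task, and tasks whose
# prerequisites failed are skipped. Not Watchdog's actual API.
from graphlib import TopologicalSorter

def run_workflow(tasks, deps):
    """tasks: name -> callable; deps: name -> set of prerequisite names."""
    results, failed = {}, []
    for name in TopologicalSorter(deps).static_order():
        if any(d in failed for d in deps.get(name, ())):
            failed.append(name)      # inputs missing: skip dependent task
            continue
        try:
            results[name] = tasks[name]()
        except Exception:            # customizable error-detection hook
            failed.append(name)
    return results, failed

# Toy RNA-seq-like workflow: align two replicates, then count reads.
tasks = {
    "align_rep1": lambda: "rep1.bam",
    "align_rep2": lambda: "rep2.bam",
    "count": lambda: "counts.tsv",
}
deps = {"count": {"align_rep1", "align_rep2"}}
results, failed = run_workflow(tasks, deps)
print(results)  # all three tasks completed; failed is empty
```

The same skeleton extends naturally to replicate handling (one task per replicate) and to distributed execution (dispatching each ready task to a cluster node instead of calling it inline).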


Subjects
Computational Biology/methods, High-Throughput Nucleotide Sequencing/methods, RNA/analysis, Software, Virus Replication, Workflow, Herpes Simplex/genetics, Herpes Simplex/virology, Herpesvirus 1, Human/genetics, Humans, RNA/genetics, User-Computer Interface
4.
Entropy (Basel); 20(3), 2018 Mar 14.
Article in English | MEDLINE | ID: mdl-33265286

ABSTRACT

As the virtual mirror of the complex real-time business processes of organisations' underlying information systems, the workflow management system (WfMS) has emerged in recent decades as a new self-autonomous paradigm in open, dynamic, distributed computing environments. To construct a trustworthy workflow management system (TWfMS), designing an algorithm that measures software behaviour trustworthiness is an urgent task for researchers. Alongside the trustworthiness mechanism, a measurement algorithm that can cope with uncertain software behaviour trustworthiness information of the WfMS must be provided as infrastructure. Building on the framework presented in our prior research, we first introduce a formal model for WfMS trustworthiness measurement, with the main properties reasoned about using calculus operators. Second, we propose a novel measurement algorithm based on the software behaviour entropy of calculus operators, derived through the principle of maximum entropy (POME) and data mining methods. Third, the trustworthiness measurement algorithm for incomplete software behaviour tests and runtime information is discussed and compared by means of a detailed explanation. Finally, we provide conclusions and discuss future research areas of the TWfMS.
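The principle of maximum entropy (POME) invoked here can be illustrated with a minimal Shannon-entropy computation. This is a generic toy example, not the paper's trustworthiness model: among all distributions over a set of behaviour states, POME selects the one with maximum entropy consistent with the known constraints — the uniform distribution when nothing else is known.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# With no constraints on four behaviour states, POME yields the uniform
# distribution, which maximizes the entropy at log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]
# Any additional information (here: one state is far more likely) pins the
# distribution down and lowers the entropy.
skewed = [0.70, 0.10, 0.10, 0.10]

print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(skewed))   # ~1.36
```

In the paper's setting, lower entropy corresponds to more determined (better-measured) trustworthiness information about the WfMS's behaviour.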

5.
Entropy (Basel); 20(10), 2018 Sep 24.
Article in English | MEDLINE | ID: mdl-33265821

ABSTRACT

Under an infrastructure of three gradually deepening layers (System, Service and Software), the information entropy of the Trustworthy Workflow Management System (TWfMS) evolves from more precise to more undetermined as a series of exception events occur on certain components (ExCs) over the life cycle of the TWfMS, which passes through its original, as-is, to-be and agile-consistent stages; it then becomes more precise again as the system returns from the agile-consistent stage to the original state through self-autonomous improvement. With special emphasis on the system layer, and to assure the trustworthiness of the WfMS, this paper first introduces the preliminary knowledge of the hierarchical information entropy model together with related theories. After illustrating the fundamental principle, the transformation rule is deduced step by step, followed by a case study, which is conducive to generating discussions and conclusions in the different research areas of the TWfMS. Overall, we argue that the trustworthiness maintenance of a WfMS can be analyzed and computed by viewing all the various states of the TWfMS as transformations between the WfMS and its trustworthiness-compensating components, whose information entropy fluctuates repeatedly and complies with the law of dissipative structure systems.

6.
Hosp Pharm; 52(1): 54-59, 2017 Jan.
Article in English | MEDLINE | ID: mdl-28179741

ABSTRACT

Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. 
Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.

7.
Am J Health Syst Pharm; 81(4): 129-136, 2024 Feb 08.
Article in English | MEDLINE | ID: mdl-37879887

ABSTRACT

PURPOSE: This study is an evaluation of technology-assisted technician verification (TATV) of the compounded sterile product (CSP) preparation process as an alternative to final verification by a pharmacist. METHODS: A 2-phase, single-center noninferiority study was conducted to assess the accuracy and CSP processing time with TATV versus pharmacist verification. Phase I of the study was a validation of the internal pharmacist accuracy rate in which 2 pharmacists checked each CSP. In phase II, prepared CSPs were first checked by a technician and then checked by a pharmacist. Technicians were required to complete baseline credentialing and training requirements to participate in the study. The primary outcome was the error rate for the pharmacist check in phase I and the error rate of the technician check in phase II. Secondary outcomes included total verification time and total dose processing time in each phase. The Farrington-Manning test was used for noninferiority assessment of accuracy, and the Wilcoxon rank sum test was used to detect a difference between the processing times. RESULTS: A total of 4,000 doses were checked in each phase. Pharmacist accuracy was 99.600% in phase I, compared to TATV accuracy of 99.575% in phase II. TATV of CSPs was noninferior to pharmacist verification (absolute difference in accuracy, 0.025%; 95% CI, -0.26% to 0.31%; P = 0.0016). Total verification time and total dose processing times were significantly lower in Phase II. CONCLUSION: This study showed that TATV of CSPs is noninferior to pharmacist final verification and does not negatively impact the time to check CSPs or total CSP processing time.
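The reported confidence interval can be sanity-checked with a simple Wald (normal-approximation) interval for a difference in proportions — a sketch only, since the study used the Farrington-Manning score method, which behaves similarly at accuracy rates this high:

```python
import math

n = 4000                 # doses checked in each phase
p_pharm = 0.99600        # phase I pharmacist accuracy
p_tatv = 0.99575         # phase II technician (TATV) accuracy

diff = p_pharm - p_tatv  # absolute difference: 0.025 percentage points
se = math.sqrt(p_pharm * (1 - p_pharm) / n + p_tatv * (1 - p_tatv) / n)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

# Reproduces the reported interval of roughly -0.26% to 0.31%.
print(f"{lo * 100:.2f}% to {hi * 100:.2f}%")  # -0.26% to 0.31%
```

Noninferiority holds because the entire interval lies below the study's (unstated here) noninferiority margin; the Wald interval agreeing with the published Farrington-Manning interval to two decimals is expected with n = 4,000 per arm.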


Subjects
Pharmacists, Technology, Humans, Pharmacy Technicians
8.
Obes Surg; 33(12): 3860-3870, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37867185

ABSTRACT

PURPOSE: The introduction of innovative digital solutions in healthcare lags behind other industries but promises high potential to create value in efficiency and quality. Increasing economic pressure forces hospitals to optimize operating room (OR) processes, in which such solutions might provide additional support. MATERIALS AND METHODS: This retrospective, monocentric case-control study investigated whether digitalized and standardized intraoperative surgical workflows for laparoscopic Roux-en-Y gastric bypass (LRYGB) have a significant impact on efficiency, quality, and economics. Logistic and linear regression models were used to apply propensity score matching (PSM) for the efficiency analysis and odds ratios for the quality analysis. RESULTS: The study included 49 patients per group. The results demonstrate a significant increase in efficiency and cost-effectiveness in the treatment group. Length of stay (LoS) was 1.2 days less than in the control group (5.6 vs. 4.4). The mean total OR and skin-to-skin times increased by 3.7% (142.00 vs. 136.80) and 8.5%, respectively (93.88 vs. 85.94). The standard deviation (SD) of total OR and skin-to-skin time decreased by 7.36 min (26.86 vs. 34.22) and 8.98 min (23.20 vs. 32.18) in the treatment group. The odds ratio results did not allow any conclusions on quality. Overall, costs were reduced by 318 € per patient and total revenue improved by 10,073 €. CONCLUSION: The implementation of digital workflow management systems in obesity surgery improves economic efficiency. Hospital management and payors should evaluate further support for research on the digitization of the OR, followed by reimbursement, to increase and facilitate access to digital support systems.


Subjects
Bariatric Surgery, Gastric Bypass, Laparoscopy, Obesity, Morbid, Humans, Obesity, Morbid/surgery, Retrospective Studies, Treatment Outcome, Gastric Bypass/methods, Laparoscopy/methods, Postoperative Complications/surgery
9.
Expert Opin Drug Discov; 18(6): 579-590, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37089036

ABSTRACT

INTRODUCTION: Drug discovery in academia and industry poses contrasting challenges. While academia focuses on producing new knowledge, industry is keen on product development and success in clinical trials. Galaxy is a web-based open-source computational workbench which is used to analyze large datasets and is customized to integrate analysis and visualization tools in a single framework. Depending on the methodology, one can generate customized and suitable workflows in the Galaxy platform. AREAS COVERED: Herein, the authors appraise the suitability of the Galaxy platform for developing a disease specific web portal called the Molecular Property Diagnostic Suite (MPDS). The authors include their future perspectives in the expert opinion section. EXPERT OPINION: Galaxy is ideally suited for community-based software development as the scripts, tools, and codes developed in the different programming languages can be integrated in an extremely efficient fashion. MPDS puts forth a new approach known as a disease-specific web portal which aims to implement a range of computational methods and algorithms that can be developed and shared freely across the community of computer aided drug design (CADD) scientists.


Subjects
Computational Biology, Software, Humans, Computational Biology/methods, Algorithms, Drug Discovery, Workflow
10.
Methods Mol Biol; 2443: 197-209, 2022.
Article in English | MEDLINE | ID: mdl-35037207

ABSTRACT

SciApps is an open-source, web-based platform for processing, storing, visualizing, and distributing genomic data and analysis results. Built upon the Tapis (formerly Agave) platform, SciApps brings users TB-scale data storage via the CyVerse Data Store and over one million CPUs via the Extreme Science and Engineering Discovery Environment (XSEDE) resources at the Texas Advanced Computing Center (TACC). SciApps provides ways to chain individual jobs into automated and reproducible workflows in a distributed cloud, and provides a management system for data, associated metadata, individual analysis jobs, and multi-step workflows. This chapter provides examples of how to (1) submit and manage jobs and construct workflows, (2) use public workflows for Bulked Segregant Analysis (BSA), and (3) construct a Data Analysis Center (DAC) and a Data Coordination Center (DCC) for the plant ENCODE project.


Subjects
Genomics, Software, Computational Biology, Genome, Plant, Genomics/methods, Information Storage and Retrieval, Workflow
11.
PeerJ; 9: e11376, 2021.
Article in English | MEDLINE | ID: mdl-34055480

ABSTRACT

Whole Genome Sequence (WGS) data from bacterial species are used for a variety of applications ranging from basic microbiological research to diagnostics and epidemiological surveillance. The availability of WGS data from hundreds of thousands of individual isolates of individual microbial species presents a tremendous opportunity for discovery and hypothesis-generating research into the ecology and evolution of these microorganisms. However, the limited flexibility, scalability, and user-friendliness of existing pipelines for population-scale inquiry restrict the application of systematic, population-scale approaches. Here, we present ProkEvo, an automated, scalable, reproducible, and open-source framework for bacterial population genomics analyses using WGS data. ProkEvo was specifically developed to achieve the following goals: (1) automation and scaling of complex combinations of computational analyses for many thousands of bacterial genomes from inputs of raw Illumina paired-end sequence reads; (2) use of workflow management systems (WMS) such as Pegasus WMS to ensure reproducibility, scalability, modularity, fault-tolerance, and robust file management throughout the process; (3) use of high-performance and high-throughput computational platforms; (4) generation of hierarchical population structure analyses based on combinations of multi-locus and Bayesian statistical approaches for classification, for ecological and epidemiological inquiries; (5) association of antimicrobial resistance (AMR) genes, putative virulence factors, and plasmids from curated databases with the hierarchically related genotypic classifications; and (6) production of pan-genome annotations and data compilations that can be utilized for downstream analysis, such as identification of population-specific genomic signatures. The scalability of ProkEvo was measured with two datasets comprising significantly different numbers of input genomes (one with ~2,400 genomes, and the second with ~23,000 genomes).
Depending on the dataset and the computational platform used, the running time of ProkEvo varied from ~3 to ~26 days. ProkEvo can be used with virtually any bacterial species, and the Pegasus WMS uniquely facilitates the addition or removal of programs from the workflow and the modification of options within them. To demonstrate the versatility of the ProkEvo platform, we performed hierarchical population structure analyses on available genomes of three distinct pathogenic bacterial species as individual case studies. The specific case studies illustrate how hierarchical analyses of population structures, genotype frequencies, and distribution of specific gene functions can be integrated into an analysis. Collectively, our study shows that ProkEvo presents a practical and viable option for scalable, automated analyses of bacterial populations, with direct applications for basic microbiology research, clinical microbiological diagnostics, and epidemiological surveillance.

12.
Med Dosim; 45(4): 393-399, 2020.
Article in English | MEDLINE | ID: mdl-32807611

ABSTRACT

The purpose of this study was to develop and implement a custom-designed electronic workflow management tool created by Medlever, Inc., in order to improve efficiency, leverage interoperability and maximize overall labor resources. Administrators and clinicians from five Banner MD Anderson Cancer Center Department of Radiation Oncology clinics used the Medlever tool to track and analyze clinical workflow. Real-time data were collected over a duration of 3 months. Time and process data were compared month to month across the five Banner MD Anderson facilities. The data were quantified as efficiency scores, where an efficiency score was defined by the average measured times to complete clinical process steps. The overall average efficiency scores for the clinical process steps were as follows: simulation, 66%; define target volume, 69%; creating a treatment plan, 71%; plan review, 76%; finalizing plan, 81%; physics review, 73%; IMRT QA, 72%; approving treatment plan, 69%; and therapy chart check, 66%. The combined average efficiency scores for facilities A through E were approximately 72%, 77%, 82%, 66%, and 60%, respectively. Overall, the average of all clinical efficiency scores for the radiation oncology service line across all five facilities was approximately 73%. These results set the baseline for efficiency and can be evaluated in future studies. In conclusion, a workflow management tool is an effective system for providing real-time data tracking, opportunities for improved efficiency, and evidence-based approaches to workflow decision making.


Subjects
Radiation Oncology, Humans, Radiotherapy Planning, Computer-Assisted, Workflow
13.
Health Informatics J; 26(3): 1995-2010, 2020 Sep.
Article in English | MEDLINE | ID: mdl-31912756

ABSTRACT

A failure modes, effects and criticality analysis was supported by an observational medication error rate study to analyze the impact of Phocus Rx®, a new image-based workflow software system, on chemotherapy compounding error rates. Residual risks that should be a target for additional action were identified and prioritized and pharmacy staff satisfaction with the new system was evaluated. In total, 16 potential failure modes were recognized in the pre-implementation phase and 21 after Phocus Rx® implementation. The total reduction of the criticality index was 67 percent, with a reduction of 46 percent in material preparation, 76 percent in drug production and 48 percent in quality control subprocesses. The relative risk reduction of compounding error rate was 63 percent after the implementation of Phocus Rx®, from 0.045 to 0.017 percent. The high-priority recommendations defined were identification of the product with batch and expiration date from scanned bidimensional barcodes on drug vials and process improvements in image-based quality control. Overall satisfaction index was 8.30 (SD 1.06) for technicians and 8.56 (SD 1.42) for pharmacists (p = 0.655). The introduction of a new workflow management software system was an effective approach to increasing safety in the compounding procedures in the pharmacy department, according to the failure modes, effects and criticality analysis method.
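The relative risk reduction follows directly from the two compounding error rates. This is a quick check only: using the rounded rates quoted in the abstract gives about 62%, and the reported 63 percent was presumably computed from the unrounded underlying counts.

```python
# Relative risk reduction from the reported compounding error rates
# (0.045% before Phocus Rx, 0.017% after).
pre, post = 0.045, 0.017
rrr = (pre - post) / pre

print(f"relative risk reduction: {rrr:.0%}")  # 62%
```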


Subjects
Neoplasms, Pharmacy Service, Hospital, Drug Compounding, Humans, Medication Errors/prevention & control, Workflow
14.
Gigascience; 9(6), 2020 Jun 01.
Article in English | MEDLINE | ID: mdl-32556167

ABSTRACT

BACKGROUND: Advances in high-throughput methods have brought new challenges for biological data analysis, often requiring many interdependent steps applied to a large number of samples. To address this challenge, workflow management systems, such as Watchdog, have been developed to support scientists in the (semi-)automated execution of large analysis workflows. IMPLEMENTATION: Here, we present Watchdog 2.0, which implements new developments for module creation, reusability, and documentation and for reproducibility of analyses and workflow execution. Developments include a graphical user interface for semi-automatic module creation from software help pages, sharing repositories for modules and workflows, and a standardized module documentation format. The latter allows generation of a customized reference book of public and user-specific modules. Furthermore, extensive logging of workflow execution, module and software versions, and explicit support for package managers and container virtualization now ensures reproducibility of results. A step-by-step analysis protocol generated from the log file may, e.g., serve as a draft of a manuscript methods section. Finally, 2 new execution modes were implemented. One allows resuming workflow execution after interruption or modification without rerunning successfully executed tasks not affected by changes. The second one allows detaching and reattaching to workflow execution on a local computer while tasks continue running on computer clusters. CONCLUSIONS: Watchdog 2.0 provides several new developments that we believe to be of benefit for large-scale bioinformatics analysis and that are not completely covered by other competing workflow management systems. The software itself, module and workflow repositories, and comprehensive documentation are freely available at https://www.bio.ifi.lmu.de/watchdog.
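The resume-after-interruption mode can be illustrated with a minimal checkpointing sketch. This is a generic illustration with hypothetical names, not Watchdog 2.0's actual mechanism: completed task IDs are recorded in a persistent log, and a rerun skips any task whose record already exists.

```python
# Minimal resumable-execution sketch: a log of finished task IDs lets a
# rerun skip tasks that already succeeded. Hypothetical names; not
# Watchdog's real implementation.

def run_resumable(tasks, done_log):
    """tasks: ordered list of (task_id, callable); done_log: set of
    finished task IDs, persisted between runs (e.g., in a log file)."""
    executed = []
    for task_id, action in tasks:
        if task_id in done_log:
            continue                 # already succeeded earlier: skip it
        action()
        done_log.add(task_id)        # checkpoint only after success
        executed.append(task_id)
    return executed

tasks = [("align", lambda: None), ("count", lambda: None)]
done = {"align"}                     # e.g., restored from a previous run's log
print(run_resumable(tasks, done))   # ['count'] - only the unfinished task runs
```

Checkpointing after success rather than before is the key design choice: a task interrupted mid-execution leaves no record and is therefore rerun, while completed tasks unaffected by workflow modifications are never repeated.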


Subjects
Computational Biology/methods, Software, Algorithms, Computational Biology/standards, High-Throughput Nucleotide Sequencing, Reproducibility of Results, User-Computer Interface, Workflow
15.
Brachytherapy; 16(1): 236-244, 2017.
Article in English | MEDLINE | ID: mdl-27618420

ABSTRACT

PURPOSE: To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. METHODS AND MATERIALS: A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events within the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments over a period of 21 months, three misses that went undetected until treatment initiation were recorded, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new checkpoints to ensure this information was communicated correctly throughout the process flow. RESULTS: After implementation of the revised EWB-based Y-90 workflow, following two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. CONCLUSIONS: The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management.


Subjects
Carcinoma, Hepatocellular/therapy, Embolization, Therapeutic/methods, Liver Neoplasms/therapy, Microspheres, Patient Care Team/organization & administration, Process Assessment, Health Care, Quality Improvement, Workflow, Yttrium Radioisotopes/therapeutic use, Humans, Software
16.
Front Genet; 7: 75, 2016.
Article in English | MEDLINE | ID: mdl-27200084

ABSTRACT

Next-generation sequencing (NGS) technologies have deeply changed our understanding of cellular processes by delivering an astonishing amount of data at affordable prices; nowadays, many biology laboratories have already accumulated a large number of sequenced samples. However, managing and analyzing these data poses new challenges, which may easily be underestimated by research groups lacking IT and quantitative skills. In this perspective, we identify five issues that should be carefully addressed by research groups approaching NGS technologies. In particular, the five key issues to be considered concern: (1) adopting a laboratory information management system (LIMS) and safeguarding the resulting raw data structure in downstream analyses; (2) monitoring the flow of the data and standardizing input and output directories and file names, even when multiple analysis protocols are used on the same data; (3) ensuring complete traceability of the analyses performed; (4) enabling non-experienced users to run analyses through a graphical user interface (GUI) acting as a front-end for the pipelines; (5) relying on standard metadata to annotate the datasets and, when possible, using controlled vocabularies, ideally derived from biomedical ontologies. Finally, we discuss the currently available tools in the light of these issues, and we introduce HTS-flow, a new workflow management system conceived to address the concerns we raised. HTS-flow is able to retrieve information from a LIMS database, manages data analyses through a simple GUI, outputs data in standard locations and allows the complete traceability of datasets, accompanying metadata and analysis scripts.

17.
Per Med; 11(5): 523-544, 2014.
Article in English | MEDLINE | ID: mdl-26000024

ABSTRACT

Moving from a traditional medical model of treating pathologies to an individualized, predictive and preventive model of personalized medicine promises to reduce healthcare costs for an overburdened and overwhelmed system. Next-generation sequencing (NGS) has the potential to accelerate the early detection of disorders and the identification of pharmacogenetic markers to customize treatments. This review explains the historical developments that led to NGS, along with the strengths and weaknesses of the technology, with special emphasis on the analytical aspects used to process NGS data. Solutions exist for all the steps necessary to perform NGS in the clinical context, and most are very efficient, but some crucial steps in the process need immediate attention.
