Results 1 - 4 of 4
1.
Neuroimage; 244: 118543, 2021 Dec 01.
Article in English | MEDLINE | ID: mdl-34508893

ABSTRACT

The Human Connectome Project (HCP) was launched in 2010 as an ambitious effort to accelerate advances in human neuroimaging, particularly for measures of brain connectivity; apply these advances to study a large number of healthy young adults; and freely share the data and tools with the scientific community. NIH awarded grants to two consortia; this retrospective focuses on the "WU-Minn-Ox" HCP consortium centered at Washington University, the University of Minnesota, and the University of Oxford. In just over 6 years, the WU-Minn-Ox consortium succeeded in its core objectives by: 1) improving MR scanner hardware, pulse sequence design, and image reconstruction methods, 2) acquiring and analyzing multimodal MRI and MEG data of unprecedented quality together with behavioral measures from more than 1100 HCP participants, and 3) freely sharing the data (via the ConnectomeDB database) and associated analysis and visualization tools. To date, more than 27 Petabytes of data have been shared, and 1538 papers acknowledging HCP data use have been published. The "HCP-style" neuroimaging paradigm has emerged as a set of best-practice strategies for optimizing data acquisition and analysis. This article reviews the history of the HCP, including comments on key events and decisions associated with major project components. We discuss several scientific advances using HCP data, including improved cortical parcellations, analyses of connectivity based on functional and diffusion MRI, and analyses of brain-behavior relationships. We also touch upon our efforts to develop and share a variety of associated data processing and analysis tools along with detailed documentation, tutorials, and an educational course to train the next generation of neuroimagers. We conclude with a look forward at opportunities and challenges facing the human neuroimaging field from the perspective of the HCP consortium.


Subject(s)
Connectome/history; Brain/diagnostic imaging; Databases, Factual; Diffusion Magnetic Resonance Imaging; Female; History, 21st Century; Humans; Image Processing, Computer-Assisted; Male; Neuroimaging; Retrospective Studies
2.
Neuroimage; 124(Pt B): 1102-1107, 2016 Jan 01.
Article in English | MEDLINE | ID: mdl-25934470

ABSTRACT

ConnectomeDB is a database for housing and disseminating data about human brain structure, function, and connectivity, along with associated behavioral and demographic data. It is the main archive and dissemination platform for data collected under the WU-Minn consortium Human Connectome Project. Additional connectome-style study data are and will be made available in the database under current and future projects, including the Connectome Coordination Facility. The database currently includes multiple modalities of magnetic resonance imaging (MRI) and magnetoencephalography (MEG) data along with associated behavioral data. MRI modalities include structural, task, resting-state, and diffusion; MEG modalities include resting-state and task. Imaging data are available in unprocessed, minimally preprocessed, and analysis-level forms. Imaging data and much of the behavioral data are publicly available, subject to acceptance of data use terms, while access to some sensitive behavioral data is restricted to qualified investigators under a more stringent set of terms. ConnectomeDB is the public side of the WU-Minn HCP database platform. As such, it is geared toward public distribution, with a web-based user interface designed to guide users to the optimal set of data for their needs and a robust backend mechanism, based on the commercial Aspera fasp service, to enable high-speed downloads. HCP data are also available via direct shipment of hard drives and via Amazon S3.
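
Since the abstract notes that HCP data are also distributed via Amazon S3, a minimal sketch of programmatic access follows. This is illustrative only: the bucket name, key prefix, and file name shown are assumptions, and real access requires credentials issued through ConnectomeDB after accepting the data use terms.

```python
# Minimal sketch: listing and downloading HCP imaging files from S3 with boto3.
# Bucket name, prefix, and file name below are assumed for illustration.
import boto3

# Credentials issued via ConnectomeDB (placeholders, not real keys).
s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_HCP_ACCESS_KEY",
    aws_secret_access_key="YOUR_HCP_SECRET_KEY",
)

BUCKET = "hcp-openaccess"                 # assumed bucket name
prefix = "HCP_1200/100307/MNINonLinear/"  # assumed subject/key layout

# List a few minimally preprocessed files for one subject.
resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=prefix, MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Download a single file to the local working directory.
key = prefix + "T1w_restore_brain.nii.gz"  # hypothetical file name
s3.download_file(BUCKET, key, "T1w_restore_brain.nii.gz")
```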


Subject(s)
Brain/anatomy & histology; Brain/physiology; Connectome; Databases, Factual; Information Dissemination/methods; Access to Information; Behavior; Brain Mapping; Humans; Internet; Magnetic Resonance Imaging; Magnetoencephalography; Neuroimaging; Quality Control
3.
Crit Pathw Cardiol; 2(3): 197-206, 2003 Sep.
Article in English | MEDLINE | ID: mdl-18340122

ABSTRACT

OBJECTIVES: To determine whether hospitals are capable of delivering myocardial reperfusion therapy in a manner consistent with the American College of Cardiology/American Heart Association (ACC/AHA) guidelines. DATA SOURCE AND STUDY SETTING: Data from the National Registry of Myocardial Infarction (NRMI)-2 and NRMI-3 were used. NRMI is an observational study, sponsored by Genentech, conducted from June 1994 through June 2000 and involving 1876 hospitals and 1,310,030 patients across the United States. The protocol calls for collecting data on all patients with a diagnosis of acute myocardial infarction. The setting was community and tertiary hospitals in the United States. STUDY DESIGN: This observational study used process capability analysis. PRINCIPAL FINDINGS: Overall, no hospital was deemed capable of delivering myocardial reperfusion therapy consistent with the ACC/AHA guidelines. The highest thrombolytic and angioplasty upper capability indices (Cpu) were 0.44 and 0.52, respectively, well below the traditional value of 1.0 signifying minimum capability. In addition, among the hospitals examined, there remained a wide degree of variability in process capability, ranging from -0.69 to 0.52. CONCLUSIONS: Myocardial reperfusion therapy performance measurement systems relying solely on mean time-to-reperfusion conceal true process performance, thereby obscuring quality improvement opportunities and strategies for improvement. Health care providers, purchasers, regulators, and other organizations interested in measuring and improving health care quality are encouraged to incorporate process capability analysis into their myocardial reperfusion therapy performance measurement and quality management systems.
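
To make the capability indices above concrete, here is a minimal sketch of the one-sided upper capability index, Cpu = (USL - mean) / (3 * sigma), applied to reperfusion times. The 30-minute specification limit and the synthetic timing data are assumptions for illustration, not values from the study.

```python
# Minimal sketch: upper process capability index (Cpu) for door-to-needle times.
# The 30-minute upper specification limit and the data below are assumptions.
import numpy as np

def cpu(times, usl):
    """Upper capability index: Cpu = (USL - mean) / (3 * sample std)."""
    times = np.asarray(times, dtype=float)
    return (usl - times.mean()) / (3 * times.std(ddof=1))

# Synthetic door-to-needle times (minutes) for one hypothetical hospital.
rng = np.random.default_rng(0)
times = rng.normal(loc=35, scale=12, size=200)

usl = 30.0  # assumed guideline-style upper limit for thrombolysis, in minutes
print(f"mean = {times.mean():.1f} min, Cpu = {cpu(times, usl):.2f}")
# A mean above the limit yields a negative Cpu, as in the study's -0.69 case;
# a mean well below the limit with low variability is needed to approach the
# conventional minimum-capability threshold of 1.0. Note that the mean alone
# says nothing about the spread, which is the abstract's central point.
```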

4.
AMIA Annu Symp Proc; 205-9, 2003.
Article in English | MEDLINE | ID: mdl-14728163

ABSTRACT

Automated expert systems provide a reliable and effective way to improve patient safety in a hospital environment. Their ability to analyze large amounts of data without fatigue is a decided advantage over clinicians who perform the same tasks. As dependence on expert systems increases and the systems become more complex, it is important to closely monitor their performance. Failure to generate alerts can jeopardize the health and safety of patients, while generating excessive false positive alerts can lead to valid alerts being dismissed as noise. In this study, statistical process control charts were used to monitor an expert system, and the strengths and weaknesses of this technology are presented.
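
As one concrete instance of the approach described above, here is a minimal sketch of a Shewhart c-chart applied to daily alert counts from a hypothetical expert system. The counts are synthetic; in practice, control limits would be estimated from a baseline period judged to be stable.

```python
# Minimal sketch: c-chart for monitoring daily alert counts from an expert
# system. Synthetic data; real limits come from a stable baseline period.
import numpy as np

def c_chart_limits(counts):
    """Center line and 3-sigma control limits for a c-chart (Poisson counts)."""
    c_bar = np.mean(counts)
    sigma = np.sqrt(c_bar)  # Poisson assumption: variance equals the mean
    lcl = max(0.0, c_bar - 3 * sigma)
    ucl = c_bar + 3 * sigma
    return c_bar, lcl, ucl

# Baseline: 30 days of alert counts from a period judged to be in control.
rng = np.random.default_rng(1)
baseline = rng.poisson(lam=20, size=30)
c_bar, lcl, ucl = c_chart_limits(baseline)
print(f"center = {c_bar:.1f}, LCL = {lcl:.1f}, UCL = {ucl:.1f}")

# Flag new days outside the limits: too many alerts suggests a false-positive
# storm; too few suggests the system may have silently stopped firing.
for day, count in enumerate([22, 18, 41, 3], start=1):
    if not lcl <= count <= ucl:
        print(f"day {day}: {count} alerts -> out of control, investigate")
```

This captures both failure modes named in the abstract: a point above the upper limit signals excessive alerting, while a point below the lower limit signals possible failure to generate alerts.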


Subject(s)
Clinical Pharmacy Information Systems/standards; Drug Therapy, Computer-Assisted/standards; Expert Systems; Quality Control; Automatic Data Processing; Equipment Failure; Humans; Medical Records Systems, Computerized; Medication Errors/prevention & control; Statistics as Topic