BigNeuron: a resource to benchmark and predict performance of algorithms for automated tracing of neurons in light microscopy datasets.
Manubens-Gil, Linus; Zhou, Zhi; Chen, Hanbo; Ramanathan, Arvind; Liu, Xiaoxiao; Liu, Yufeng; Bria, Alessandro; Gillette, Todd; Ruan, Zongcai; Yang, Jian; Radojevic, Miroslav; Zhao, Ting; Cheng, Li; Qu, Lei; Liu, Siqi; Bouchard, Kristofer E; Gu, Lin; Cai, Weidong; Ji, Shuiwang; Roysam, Badrinath; Wang, Ching-Wei; Yu, Hongchuan; Sironi, Amos; Iascone, Daniel Maxim; Zhou, Jie; Bas, Erhan; Conde-Sousa, Eduardo; Aguiar, Paulo; Li, Xiang; Li, Yujie; Nanda, Sumit; Wang, Yuan; Muresan, Leila; Fua, Pascal; Ye, Bing; He, Hai-Yan; Staiger, Jochen F; Peter, Manuel; Cox, Daniel N; Simonneau, Michel; Oberlaender, Marcel; Jefferis, Gregory; Ito, Kei; Gonzalez-Bellido, Paloma; Kim, Jinhyun; Rubel, Edwin; Cline, Hollis T; Zeng, Hongkui; Nern, Aljoscha; Chiang, Ann-Shyn.
Affiliation
  • Manubens-Gil L; Institute for Brain and Intelligence, Southeast University, Nanjing, China.
  • Zhou Z; Microsoft Corporation, Redmond, WA, USA.
  • Chen H; Tencent AI Lab, Bellevue, WA, USA.
  • Ramanathan A; Computing, Environment and Life Sciences Directorate, Argonne National Laboratory, Lemont, IL, USA.
  • Liu X; Kaya Medical, Seattle, WA, USA.
  • Liu Y; Institute for Brain and Intelligence, Southeast University, Nanjing, China.
  • Bria A; University of Cassino and Southern Lazio, Cassino, Italy.
  • Gillette T; Center for Neural Informatics, Structures and Plasticity, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, USA.
  • Ruan Z; Institute for Brain and Intelligence, Southeast University, Nanjing, China.
  • Yang J; Faculty of Information Technology, Beijing University of Technology, Beijing, China.
  • Radojevic M; Beijing International Collaboration Base on Brain Informatics and Wisdom Services, Beijing, China.
  • Zhao T; Nuctech Netherlands, Rotterdam, the Netherlands.
  • Cheng L; Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA.
  • Qu L; Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Alberta, Canada.
  • Liu S; Institute for Brain and Intelligence, Southeast University, Nanjing, China.
  • Bouchard KE; Ministry of Education Key Laboratory of Intelligent Computation and Signal Processing, Anhui University, Hefei, China.
  • Gu L; Paige AI, New York, NY, USA.
  • Cai W; Scientific Data Division and Biological Systems and Engineering Division, Lawrence Berkeley National Lab, Berkeley, CA, USA.
  • Ji S; Helen Wills Neuroscience Institute and Redwood Center for Theoretical Neuroscience, UC Berkeley, Berkeley, CA, USA.
  • Roysam B; RIKEN AIP, Tokyo, Japan.
  • Wang CW; Research Center for Advanced Science and Technology (RCAST), The University of Tokyo, Tokyo, Japan.
  • Yu H; School of Computer Science, University of Sydney, Sydney, New South Wales, Australia.
  • Sironi A; Texas A&M University, College Station, TX, USA.
  • Iascone DM; Cullen College of Engineering, University of Houston, Houston, TX, USA.
  • Zhou J; Graduate Institute of Biomedical Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan.
  • Bas E; National Centre for Computer Animation, Bournemouth University, Poole, UK.
  • Conde-Sousa E; PROPHESEE, Paris, France.
  • Aguiar P; Department of Neuroscience, Columbia University, New York, NY, USA.
  • Li X; Mortimer B. Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY, USA.
  • Li Y; Department of Computer Science, Northern Illinois University, DeKalb, IL, USA.
  • Nanda S; AWS AI, Seattle, WA, USA.
  • Wang Y; i3S, Instituto de Investigação E Inovação Em Saúde, Universidade Do Porto, Porto, Portugal.
  • Muresan L; INEB, Instituto de Engenharia Biomédica, Universidade Do Porto, Porto, Portugal.
  • Fua P; i3S, Instituto de Investigação E Inovação Em Saúde, Universidade Do Porto, Porto, Portugal.
  • Ye B; Massachusetts General Hospital and Harvard Medical School, Boston, MA, USA.
  • He HY; Allen Institute for Brain Science, Seattle, WA, USA.
  • Staiger JF; Cortical Architecture Imaging and Discovery Lab, Department of Computer Science and Bioimaging Research Center, The University of Georgia, Athens, GA, USA.
  • Peter M; Center for Neural Informatics, Structures and Plasticity, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, USA.
  • Cox DN; Program in Neuroscience, Department of Biomedical Sciences, Florida State University College of Medicine, Tallahassee, FL, USA.
  • Simonneau M; Cambridge Advanced Imaging Centre, University of Cambridge, Cambridge, UK.
  • Oberlaender M; Computer Vision Laboratory, EPFL, Lausanne, Switzerland.
  • Jefferis G; Life Sciences Institute and Department of Cell and Developmental Biology, University of Michigan, Ann Arbor, MI, USA.
  • Ito K; Department of Biology, Georgetown University, Washington, DC, USA.
  • Gonzalez-Bellido P; Institute for Neuroanatomy, University Medical Center Göttingen, Georg-August-University Göttingen, Goettingen, Germany.
  • Kim J; Department of Stem Cell and Regenerative Biology and Center for Brain Science, Harvard University, Cambridge, MA, USA.
  • Rubel E; Neuroscience Institute, Georgia State University, Atlanta, GA, USA.
  • Cline HT; ENS Paris-Saclay, CNRS, CentraleSupélec, LuMIn, Université Paris-Saclay, Gif-sur-Yvette, France.
  • Zeng H; Max Planck Group: In Silico Brain Sciences, Max Planck Institute for Neurobiology of Behavior - caesar, Bonn, Germany.
  • Nern A; Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA.
  • Chiang AS; Division of Neurobiology, MRC Laboratory of Molecular Biology, Cambridge, UK.
Nat Methods; 20(6): 824-835, 2023 Jun.
Article in En | MEDLINE | ID: mdl-37069271
ABSTRACT
BigNeuron is an open community bench-testing platform whose goal is to set open standards for accurate and fast automatic neuron tracing. We gathered a diverse set of image volumes across several species, representative of the data obtained in many neuroscience laboratories interested in neuron tracing. Here, we report gold standard manual annotations generated for a subset of the available imaging datasets and quantify tracing quality for 35 automatic tracing algorithms. The goal of generating such a hand-curated, diverse dataset is to advance the development of tracing algorithms and enable generalizable benchmarking. Together with image quality features, we pooled the data in an interactive web application that enables users and developers to perform principal component analysis, t-distributed stochastic neighbor embedding, correlation and clustering, visualization of imaging and tracing data, and benchmarking of automatic tracing algorithms on user-defined data subsets. The image quality metrics explain most of the variance in the data, followed by neuromorphological features related to neuron size. We observed that diverse algorithms can provide complementary information toward accurate results, and we developed a method to iteratively combine methods and generate consensus reconstructions. The resulting consensus trees provide estimates of the ground-truth neuron structure that typically outperform single algorithms on noisy datasets, although specific algorithms may outperform the consensus-tree strategy under specific imaging conditions. Finally, to help users predict the most accurate automatic tracing results without manual annotations for comparison, we used support vector machine regression to predict reconstruction quality given an image volume and a set of automatic tracings.
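The final step of the abstract — predicting reconstruction quality from image and tracing features with support vector regression — can be illustrated with a minimal sketch. This is not BigNeuron's actual pipeline: the feature set, the synthetic quality scores, and the hyperparameters are all placeholder assumptions; the paper's model is trained on real image-quality metrics and gold-standard comparison scores.

```python
# Minimal sketch: support vector regression predicting a tracing-quality score
# per image volume. Features and targets are synthetic placeholders standing in
# for BigNeuron's image-quality metrics and gold-standard distance scores.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical per-volume features, e.g. SNR, contrast, mean tracing length.
X = rng.random((100, 3))
# Hypothetical quality target (e.g. similarity to a gold-standard annotation),
# generated here as a noisy linear function purely for demonstration.
y = 0.5 * X[:, 0] - 0.2 * X[:, 1] + 0.05 * rng.standard_normal(100)

# Standardizing features before an RBF-kernel SVR is standard practice,
# since the kernel is sensitive to feature scales.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
model.fit(X[:80], y[:80])          # train on the first 80 volumes
pred = model.predict(X[80:])       # predict quality for the held-out 20
print(pred.shape)                  # (20,)
```

In a real setting, the predicted scores could be used to rank candidate automatic tracings of the same volume and select the likely best one without manual annotation.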
Subjects

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Benchmarking / Microscopy Study type: Prognostic_studies / Risk_factors_studies Language: En Journal: Nat Methods Journal subject: LABORATORY TECHNIQUES AND PROCEDURES Publication year: 2023 Document type: Article Country of affiliation: China