Bayesian tensor network structure search and its application to tensor completion.
Zeng, Junhua; Zhou, Guoxu; Qiu, Yuning; Li, Chao; Zhao, Qibin.
Affiliation
  • Zeng J; School of Automation, Guangdong University of Technology, Guangzhou, 510006, China; Center for Advanced Intelligence Project (AIP), RIKEN, Tokyo, 103-0027, Japan; Key Laboratory of Intelligent Information Processing and System Integration of IoT, Ministry of Education, Guangzhou, 510006, China. Elec
  • Zhou G; School of Automation, Guangdong University of Technology, Guangzhou, 510006, China; Key Laboratory of Intelligent Detection and the Internet of Things in Manufacturing, Ministry of Education, Guangzhou, 510006, China. Electronic address: gx.zhou@gdut.edu.cn.
  • Qiu Y; School of Automation, Guangdong University of Technology, Guangzhou, 510006, China; Center for Advanced Intelligence Project (AIP), RIKEN, Tokyo, 103-0027, Japan. Electronic address: yuning.qiu.gd@gmail.com.
  • Li C; Center for Advanced Intelligence Project (AIP), RIKEN, Tokyo, 103-0027, Japan. Electronic address: chao.li@riken.jp.
  • Zhao Q; School of Automation, Guangdong University of Technology, Guangzhou, 510006, China; Center for Advanced Intelligence Project (AIP), RIKEN, Tokyo, 103-0027, Japan. Electronic address: qibin.zhao@riken.jp.
Neural Netw ; 175: 106290, 2024 Jul.
Article in En | MEDLINE | ID: mdl-38626616
ABSTRACT
Tensor networks (TNs) have demonstrated remarkable efficacy in the compact representation of high-order data. In contrast to TN methods with pre-determined structures, the recently introduced tensor network structure search (TNSS) methods, which automatically learn a compact TN structure from the data, have gained increasing attention. Nonetheless, TNSS requires time-consuming manual adjustment of the penalty parameters that control model complexity to achieve good performance, especially in the presence of missing or noisy data. To provide an effective solution to this problem, in this paper we propose a parameter-tuning-free TNSS algorithm based on Bayesian modeling, aiming to conduct TNSS in a fully data-driven manner. Specifically, the uncertainty arising from data corruption is incorporated into the prior of the probabilistic model. For TN structure determination, we reframe it as a rank learning problem for the fully-connected tensor network (FCTN), integrating the generalized inverse Gaussian (GIG) distribution to promote low rank. To eliminate the need for hyperparameter tuning, we adopt a fully Bayesian approach and propose an efficient Markov chain Monte Carlo (MCMC) algorithm for sampling the posterior distribution. Experimental results demonstrate that, compared with previous TNSS methods, the proposed algorithm can effectively and efficiently find the latent TN structures of the data under various missing-data and noise conditions and achieves the best recovery results. Furthermore, our method exhibits superior performance in tensor completion on real-world data compared to other state-of-the-art tensor-decomposition-based completion methods.
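To illustrate the kind of low-rank promotion the abstract describes, the following is a minimal, hypothetical sketch (not the paper's actual model): a Gaussian factor column whose precision carries a GIG hyperprior, where the GIG is conjugate to the Gaussian scale and yields a closed-form Gibbs update. All parameter names and values here are illustrative assumptions; the paper applies this idea to FCTN rank learning within a full MCMC scheme.

```python
import numpy as np
from scipy.stats import geninvgauss

rng = np.random.default_rng(0)

# Toy setup: one factor column u of length I with precision lam,
# where lam has a GIG(p, a, c) hyperprior with density
# proportional to lam^(p-1) * exp(-(a*lam + c/lam)/2).
I, p, a, c = 50, -0.5, 1.0, 1.0
u = rng.normal(scale=0.1, size=I)  # small norm -> posterior favors large lam (component pruned)

# Conjugacy: multiplying the Gaussian likelihood lam^(I/2) * exp(-lam*||u||^2/2)
# into the GIG prior gives another GIG:  lam | u ~ GIG(p + I/2, a + ||u||^2, c)
p_post = p + I / 2
a_post = a + u @ u
c_post = c

# scipy's geninvgauss(p, b) has density proportional to x^(p-1) * exp(-b*(x + 1/x)/2);
# map GIG(p, a, c) onto it via b = sqrt(a*c) and scale = sqrt(c/a).
b = np.sqrt(a_post * c_post)
scale = np.sqrt(c_post / a_post)
lam_samples = geninvgauss.rvs(p_post, b, scale=scale, size=2000, random_state=rng)

# A large posterior-mean precision shrinks this rank component toward zero,
# which is how the hierarchy prunes superfluous TN ranks.
print(lam_samples.mean())
```

In a full sampler, this update would be interleaved with Gaussian updates for the factor entries and a noise-precision update, giving the tuning-free behavior the abstract claims.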

Full text: 1 Collection: 01-internacional Database: MEDLINE Main subject: Algorithms / Bayes Theorem Limits: Humans Language: En Journal: Neural Netw Journal subject: NEUROLOGIA Year: 2024 Document type: Article