ABSTRACT
Mirror descent is an elegant optimization technique that performs gradient updates in a dual space defined by a mirror map. While originally developed for convex optimization, it has increasingly been applied in machine learning. In this study, we propose a novel approach that uses mirror descent to initialize the parameters of neural networks. Specifically, taking the Hopfield model as a prototype for neural networks, we demonstrate that mirror descent can train the model with significantly better performance than traditional gradient descent methods that rely on random parameter initialization. Our findings highlight the potential of mirror descent as a promising initialization technique for enhancing the optimization of machine learning models.
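To make the dual-space idea concrete, here is a minimal sketch of mirror descent on the probability simplex with the negative-entropy mirror map, which yields the familiar exponentiated-gradient update. This is a generic illustration of the technique, not the paper's initialization scheme; the function name and the toy linear objective are illustrative assumptions.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, lr=0.1, steps=100):
    """Mirror descent on the probability simplex.

    With the negative-entropy mirror map, the dual-space gradient step
    becomes a multiplicative (exponentiated-gradient) update followed by
    renormalization back onto the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-lr * grad(x))  # step in the dual space
        x /= x.sum()                   # map back to the simplex
    return x

# Toy example (illustrative): minimize <c, x> over the simplex.
# The minimizer concentrates all mass on the smallest-cost coordinate.
c = np.array([3.0, 1.0, 2.0])
x_opt = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, lr=0.5, steps=200)
```

Note that the iterates stay strictly inside the simplex by construction, which is precisely why a well-chosen mirror map can serve as a geometry-aware alternative to projected gradient descent.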
ABSTRACT
We revisit the slow-bond (SB) problem of the one-dimensional (1D) totally asymmetric simple exclusion process (TASEP) with modified hopping rates. In the original SB problem, a local defect turns out to be always relevant: it induces jamming, so that phase separation occurs in the 1D TASEP, while crossover scaling behaviors are also observed as finite-size effects. To check whether the SB can become irrelevant in the presence of particle interactions, we employ the condensation concept of the zero-range process. The hopping rate in the modified TASEP depends on the interaction parameter and on the distance to the nearest particle in the moving direction, in addition to the SB factor. In particular, we focus on the interplay of jamming and condensation in the current-density relation of 1D driven flow. Based on mean-field calculations, we present the fundamental diagram and the phase diagram of the modified SB problem, which are verified numerically. Finally, we discuss how the condensation of holes suppresses the jamming of particles and vice versa; the partially condensed phase is the most interesting case in comparison with the original SB problem.
ABSTRACT
The slow-bond problem is a long-standing question about the minimal strength ε_{c} of a local defect with global effects on the Kardar-Parisi-Zhang (KPZ) universality class. A consensus on the issue has been delayed by the discrepancy between various analytical predictions claiming ε_{c}=0 and numerical observations claiming ε_{c}>0. We revisit the problem via finite-size scaling analyses of the slow-bond effects, which are tested for different boundary conditions through extensive Monte Carlo simulations. Our results provide evidence that the previously reported nonzero ε_{c} is an artifact of a crossover phenomenon that converges logarithmically to zero as the system size goes to infinity.
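The basic setup behind both slow-bond studies can be sketched with a minimal random-sequential Monte Carlo simulation of the 1D TASEP on a ring with a single defect bond. This is a generic illustration under simplifying assumptions (small system, short runs, current averaged over the whole run including the transient), not the paper's production code; all parameter values are illustrative.

```python
import numpy as np

def tasep_slow_bond(L=100, density=0.5, eps=0.5, sweeps=1000, seed=0):
    """1D TASEP on a ring of L sites with one slow bond.

    Particles hop one site to the right with rate 1, except across the
    defect bond (site 0 -> site 1), where the rate is reduced to 1 - eps.
    Returns the average current (accepted hops per attempted move).
    """
    rng = np.random.default_rng(seed)
    occ = np.zeros(L, dtype=bool)
    occ[rng.choice(L, size=int(L * density), replace=False)] = True
    hops = 0
    attempts = sweeps * L
    for _ in range(attempts):
        i = rng.integers(L)                    # random departure site
        j = (i + 1) % L                        # its right neighbor
        rate = 1.0 - eps if i == 0 else 1.0    # slow bond at site 0
        if occ[i] and not occ[j] and rng.random() < rate:
            occ[i], occ[j] = False, True
            hops += 1
    return hops / attempts

# Without the defect (eps = 0) the ring current approaches the mean-field
# value rho * (1 - rho) = 0.25 at half filling; a strong slow bond
# bottlenecks the flow and pulls the global current below that value.
j_slow = tasep_slow_bond(eps=0.8)
j_free = tasep_slow_bond(eps=0.0)
```

The question both abstracts address is what happens to such a current reduction as ε → 0 and L → ∞; resolving that limit reliably requires the finite-size scaling analyses described above, far beyond this toy sketch.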