1. Article in English | MEDLINE | ID: mdl-37067969

ABSTRACT

Artificial neural networks (ANNs) are inspired by human learning. However, unlike human education, classical ANN training does not follow a curriculum. Curriculum learning (CL) refers to training an ANN on samples in a meaningful order: training either begins with a subset of the dataset and adds new samples as training progresses, or begins with the entire dataset and gradually removes samples. With such changes in training-set size, curriculum, anti-curriculum, or random-curriculum methods can outperform vanilla training. However, no CL method has proven generally effective across architectures and datasets. In this article, we propose cyclical CL (CCL), in which the amount of data used during training changes cyclically rather than monotonically increasing or decreasing. Rather than relying on only the vanilla method or only the curriculum method, alternating between the two cyclically, as CCL does, yields better results. We tested the method on 18 datasets and 15 architectures in image and text classification tasks and obtained better results than both no-CL and existing CL methods. We also show theoretically that applying CL and vanilla training cyclically incurs less error than using either alone. The code of the cyclical curriculum is available at https://github.com/CyclicalCurriculum/Cyclical-Curriculum.
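The abstract does not spell out the schedule, but the core idea, cycling the training-set size between a curriculum subset and the full dataset, can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' implementation (see the linked repository for that); the triangular schedule, the parameter names (min_frac, max_frac, num_cycles), and the use of a per-sample difficulty score for ranking are assumptions made for the example.

```python
import numpy as np

def cyclical_data_fraction(epoch, total_epochs, min_frac=0.3,
                           max_frac=1.0, num_cycles=3):
    """Fraction of the training set to use at a given epoch.

    Illustrative triangular schedule: the fraction oscillates between
    min_frac and max_frac over num_cycles cycles, so training alternates
    between curriculum-style subsets and the full (vanilla) dataset.
    """
    cycle_pos = (epoch * num_cycles / total_epochs) % 1.0  # position in [0, 1)
    tri = 1.0 - abs(2.0 * cycle_pos - 1.0)                 # ramp up, then down
    return min_frac + (max_frac - min_frac) * tri

def select_subset(difficulty, epoch, total_epochs):
    """Indices of the easiest samples for this epoch.

    `difficulty` is a per-sample score (e.g., loss under a pretrained
    scoring model); lower means easier. This ranking step is an
    assumption of the sketch, not taken from the abstract.
    """
    frac = cyclical_data_fraction(epoch, total_epochs)
    n = max(1, int(frac * len(difficulty)))
    return np.argsort(difficulty)[:n]

# Demo with stand-in difficulty scores for 1,000 samples.
difficulty = np.random.rand(1000)
for epoch in range(30):
    idx = select_subset(difficulty, epoch, total_epochs=30)
    # train_one_epoch(model, dataset[idx])  # hypothetical training call
```

A training loop would call select_subset once per epoch and fit on the returned indices; over each cycle the schedule starts from the easiest subset, expands to the full dataset, and contracts again, which is the alternation between curriculum and vanilla training that the abstract describes.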
