Publications
Publications by category, in reverse chronological order. Generated by jekyll-scholar.
2025
- Efficient Single-Step Framework for Incremental Class Learning in Neural Networks (2025)
Incremental learning continues to present a significant challenge in deep learning, particularly in environments where resources are limited. While existing methods achieve high levels of accuracy, they often require substantial computational resources and storage capacity. This work proposes CIFNet, an efficient approach to class incremental learning that achieves accuracy comparable to state-of-the-art methods while significantly reducing training time and energy consumption through a single-step optimisation process. CIFNet incorporates a novel compressed buffer mechanism that stores condensed representations of previous data samples instead of full raw data, substantially reducing memory requirements. In contrast to conventional approaches that require multiple iterations of weight optimisation, our method reaches its final weights in a single, non-iterative training step, leading to a significant decrease in computational overhead. Experimental results on standard benchmark datasets show that our approach inherently mitigates catastrophic forgetting without the need for complex regularisation schemes, while matching the accuracy of current state-of-the-art approaches. This work represents a step forward in making class incremental learning more accessible for resource-constrained environments while maintaining robust performance.
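The two core ideas of the abstract, a non-iterative (closed-form) weight update and a compressed buffer of condensed representations, can be illustrated with a toy sketch. This is an assumption-laden illustration, not the paper's actual CIFNet implementation: it uses a linear classifier solved by weighted ridge regression in one step per task, and stores only per-class feature means (with their counts) as the "compressed buffer" for previously seen classes.

```python
import numpy as np

class SingleStepIncrementalClassifier:
    """Illustrative sketch only (hypothetical, not CIFNet itself): a linear
    classifier fit in one closed-form ridge-regression step per task, with
    a compressed buffer of per-class feature means instead of raw data."""

    def __init__(self, dim, reg=1e-2):
        self.dim = dim
        self.reg = reg
        self.buffer = {}      # class label -> (mean feature vector, sample count)
        self.classes = []     # class labels in order of first appearance
        self.W = None         # weight matrix, grown as new classes arrive

    def _update_buffer(self, X, y):
        # Condense each class in the new task down to its mean and count.
        for c in np.unique(y):
            Xc = X[y == c]
            self.buffer[c] = (Xc.mean(axis=0), len(Xc))
            if c not in self.classes:
                self.classes.append(c)

    def fit_task(self, X, y):
        """Single optimisation step: solve a weighted ridge regression on
        the new task's raw data plus the condensed buffer rows of classes
        from earlier tasks (each mean weighted by its original count)."""
        self._update_buffer(X, y)
        new_labels = set(np.unique(y))
        feats, targets, weights = [X], [y], [np.ones(len(X))]
        for c, (mu, n) in self.buffer.items():
            if c not in new_labels:                  # replay old classes
                feats.append(mu[None, :])            # one condensed row
                targets.append(np.array([c]))
                weights.append(np.array([float(n)]))
        A = np.vstack(feats)
        t = np.concatenate(targets)
        w = np.concatenate(weights)
        # One-hot targets over all classes seen so far.
        idx = {c: i for i, c in enumerate(self.classes)}
        T = np.zeros((len(t), len(self.classes)))
        T[np.arange(len(t)), [idx[c] for c in t]] = 1.0
        # Closed-form weighted ridge solution: W = (A' W A + reg*I)^-1 A' W T
        Aw = A * w[:, None]
        self.W = np.linalg.solve(A.T @ Aw + self.reg * np.eye(self.dim),
                                 A.T @ (T * w[:, None]))

    def predict(self, X):
        return np.array([self.classes[i] for i in (X @ self.W).argmax(axis=1)])
```

Because each `fit_task` call is a single linear solve, training cost does not grow with epochs, and the buffer holds one vector per class rather than raw samples, mirroring (in a simplified way) the efficiency claims in the abstract.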
@mastersthesis{dopico2025efficient,
  title  = {Efficient Single-Step Framework for Incremental Class Learning in Neural Networks},
  author = {Dopico-Castro, A. and Fontenla-Romero, Ó. and Guijarro-Berdiñas, B. and Alonso-Betanzos, A.},
  year   = {2025},
  school = {Universidade da Coruña},
  type   = {Master's Thesis},
}