Publication
ISCAS 2022
Conference paper

Hyper-parameter Tuning for Progressive Learning and its Application to Network Cyber Security

Abstract

The long-term deployment of data-driven AI technology using artificial neural networks (ANNs) should be scalable and maintainable as new data becomes available. To ensure smooth adaptation, learning must be cumulative, so that the network consumes new data without compromising its inference performance on past data. Such incremental accumulation of learning experience is known as progressive learning. In this paper, we address the open problem of tuning the hyper-parameters of neural networks during progressive learning. A hyper-parameter optimization framework is proposed that selects the best hyper-parameter values on a task-by-task basis. The neural network model adapts to each progressive learning task by adjusting the hyper-parameters under which the neural architecture is incrementally grown. Several hyper-parameter search strategies are explored and compared in support of progressive learning. In contrast to the predominant practice of using image datasets in machine learning, we use cybersecurity datasets to illustrate the advantages of the proposed hyper-parameter tuning algorithms.
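The abstract describes selecting hyper-parameters on a task-by-task basis as the network grows. The following is a minimal Python sketch of that idea, assuming a random-search strategy; the search space and the helpers `evaluate_candidate` and `grow_and_train` are hypothetical placeholders, not the paper's implementation.

```python
import random

# Illustrative sketch (assumed names, not the authors' code): each new task
# triggers a fresh hyper-parameter search, and the winning configuration
# governs how the network is grown and trained on that task.

SEARCH_SPACE = {                       # assumed search space for illustration
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "new_hidden_units": [16, 32, 64],  # capacity added when the net grows
    "dropout": [0.0, 0.25, 0.5],
}

def sample_config(space):
    """Draw one random configuration (random-search strategy)."""
    return {name: random.choice(values) for name, values in space.items()}

def evaluate_candidate(model_state, task, cfg):
    """Placeholder scorer: in practice, grow a copy of the network under
    `cfg`, train briefly on `task`, and score validation accuracy on the
    new task plus retention on all previous tasks."""
    return random.random()             # stand-in for a real validation score

def grow_and_train(model_state, task, cfg):
    """Placeholder: grow the architecture per `cfg` and train on `task`
    without overwriting what was learned on earlier tasks."""
    return (model_state or []) + [cfg]

def progressive_tuning(tasks, n_trials=10):
    """Select hyper-parameters task by task, then consolidate each task."""
    model_state, history = None, []
    for task in tasks:
        best_cfg = max(
            (sample_config(SEARCH_SPACE) for _ in range(n_trials)),
            key=lambda cfg: evaluate_candidate(model_state, task, cfg),
        )
        model_state = grow_and_train(model_state, task, best_cfg)
        history.append(best_cfg)
    return model_state, history

if __name__ == "__main__":
    _, picks = progressive_tuning(["task_1", "task_2", "task_3"])
    for i, cfg in enumerate(picks, 1):
        print(f"task {i}: {cfg}")
```

Random search stands in here for the several search strategies the paper compares; any strategy that proposes configurations per task could be swapped into the inner loop.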
