VOCAL 2024
Abstract Submission

104. New interval-based training technique for parameter robustness

Invited abstract in session FB-4: Methods of optimization, stream contributed papers.

Friday, 10:15 - 11:45
Room: C105

Authors (first author is the speaker)

1. Attila Szász
Department of Computational Optimization, University of Szeged
2. Balázs Bánhelyi

Abstract

Today's artificial neural networks appear in many scientific fields and have a wide range of applications; for example, they are widely used for image and speech recognition. Over the years, the accuracy of these networks has continuously improved, but many studies have shown that they are still not error-free. Many techniques have been developed to reduce the probability of error, but most of them search for adversarial examples in the input space and use them to make neural networks more robust. Far fewer techniques search for adversarial perturbations within small distances of the network's weight matrices. However, this type of adversarial vulnerability is magnified when trained neural networks are evaluated at much lower numerical precision. In such quantized neural networks, the outputs can differ greatly from the expected results.

In our presentation, we introduce this problem and compare the robustness of a network in its input space and in its parameter space. We show the effectiveness of methods based on floating-point and interval evaluations. We have implemented a new training method for parameter robustness, and the results show that it is more effective than previous techniques.
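The interval evaluation mentioned above can be illustrated with a minimal sketch: a single dense layer whose every weight may deviate by at most some eps (e.g., a quantization error), evaluated with interval arithmetic to obtain guaranteed bounds on the outputs. The function names and the uniform perturbation bound are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def interval_dense(W, b, x, eps):
    """Bound the outputs of y = W @ x + b when every weight may be
    perturbed by at most eps, i.e. W' lies in [W - eps, W + eps].
    Returns elementwise lower and upper bounds on y (hypothetical
    illustration of interval evaluation, not the authors' code)."""
    center = W @ x + b
    # Worst-case effect of perturbing each weight by eps, for a fixed input x:
    # |sum_j (W'_ij - W_ij) x_j| <= eps * sum_j |x_j|
    radius = eps * np.sum(np.abs(x))
    return center - radius, center + radius

def interval_relu(lo, hi):
    # ReLU is monotone, so it maps interval endpoints directly.
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

# Toy example: bound a layer's outputs under a quantization-sized perturbation.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
b = np.zeros(3)
x = rng.normal(size=4)
lo, hi = interval_relu(*interval_dense(W, b, x, eps=0.05))
```

If `lo` and `hi` stay on the same side of a decision boundary, no weight perturbation of size `eps` can change the prediction; a training method for parameter robustness can then penalize wide or boundary-crossing intervals.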

Keywords

Status: accepted

