95. Metaheuristics for the automated design and configuration of Deep Neural Networks
Contributed abstract in session SB-1: Plenary El-Ghazali Talbi, stream Plenary.
Saturday, 12:00 - 13:00, Room: L226
Authors (first author is the speaker)
1. El-Ghazali Talbi (Laboratoire d'Informatique Fondamentale de Lille)
Abstract
In recent years, research on metaheuristic optimization approaches to the automated design and configuration of deep neural networks has become increasingly popular. Although various approaches have been proposed, a comprehensive survey and taxonomy of this active research topic is still lacking. In this talk, we propose a unified way to describe these metaheuristics, focusing on the common and important search components of optimization algorithms: representation, objective function, constraints, initial solution(s), and variation operators. Beyond the sheer size of the search space, the problem is characterized by a mixed, variable-size design space, very expensive evaluations, and multiple blackbox objective functions. Hence, the unified methodology is extended to advanced optimization approaches such as surrogate-based, multi-objective, and parallel optimization.
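To make the listed search components concrete, the following is a minimal, purely illustrative sketch of a (1+1) evolutionary search over a mixed (categorical, integer, continuous) hyperparameter space. All names, the design space, and the cheap analytic stand-in for the expensive blackbox objective are assumptions for illustration, not material from the talk itself.

```python
import random

# Hypothetical mixed design space for a small feed-forward network:
# integer (n_layers, units), categorical (activation), continuous (dropout).
SPACE = {
    "n_layers": ("int", 1, 6),
    "units": ("int", 16, 256),
    "activation": ("cat", ["relu", "tanh", "elu"]),
    "dropout": ("float", 0.0, 0.5),
}

def sample_var(spec, rng):
    """Sample one variable uniformly from its domain."""
    kind = spec[0]
    if kind == "int":
        return rng.randint(spec[1], spec[2])
    if kind == "float":
        return rng.uniform(spec[1], spec[2])
    return rng.choice(spec[1])

def random_solution(rng):
    """Initial solution: one uniform sample per variable (the representation
    is a plain dict mapping variable name to value)."""
    return {name: sample_var(spec, rng) for name, spec in SPACE.items()}

def mutate(sol, rng):
    """Variation operator: re-sample one randomly chosen variable."""
    child = dict(sol)
    name = rng.choice(list(SPACE))
    child[name] = sample_var(SPACE[name], rng)
    return child

def proxy_objective(sol):
    """Stand-in for the expensive blackbox objective (e.g. validation error
    after training). A real system would train the network; this cheap
    analytic proxy just rewards a particular region of the space."""
    return (abs(sol["n_layers"] - 3)
            + abs(sol["units"] - 128) / 64
            + sol["dropout"])

def one_plus_one_es(iterations=200, seed=0):
    """(1+1) evolutionary search: accept the mutant only if it does not
    worsen the objective (minimization)."""
    rng = random.Random(seed)
    best = random_solution(rng)
    best_f = proxy_objective(best)
    for _ in range(iterations):
        cand = mutate(best, rng)
        f = proxy_objective(cand)
        if f <= best_f:
            best, best_f = cand, f
    return best, best_f
```

In the surrogate-based setting the talk mentions, `proxy_objective` would be replaced by a learned model of the true training-and-validation cost, queried cheaply between occasional true evaluations.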
Keywords
- Machine learning
Status: accepted