2455. Energy Time Series Forecasting with Neural Architecture Search
Invited abstract in session WD-31: AI for Energy Finance, stream Analytics.
Wednesday, 14:30-16:00, Room: 046 (building: 208)
Authors (first author is the speaker)
1. Georg Velev, School of Business and Economics, Humboldt-Universität zu Berlin
2. Stefan Lessmann, School of Business and Economics, Humboldt-University of Berlin
Abstract
The importance of accurate planning and operation of energy resources in today’s power system has made energy forecasting a popular topic of research. Transformer-based networks have shown promising results on various tasks, including energy time series prediction. In this regard, neural architecture search (NAS), which facilitates the automated design of complex neural architectures tailored to a specific task, has been reported in the literature to outperform hand-crafted architectures. Therefore, in this research we apply NAS using reinforcement learning to the prediction of energy time series. We focus on the search for novel hybrid self-attention modules, which incorporate different functionalities present in the encoder of Transformer-based frameworks for time series forecasting. Furthermore, we explore the automated design of self-attention components with memory states, in order to examine whether recurrence at the sequence level can improve the expressive power of Transformer-based models. We report the results obtained from NAS on both real-world datasets and synthetic time series. We benchmark the performance of the NAS-derived models against Long Short-Term Memory neural cells and the Transformer-based frameworks Informer, Autoformer and Pyraformer.
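To illustrate the kind of component the abstract refers to, the sketch below shows a self-attention block that carries a memory state across sequence chunks, so that information can recur at the sequence level. This is a minimal, hypothetical sketch assuming PyTorch; the class name `MemorySelfAttention`, the GRU-based memory update, and all hyperparameters are illustrative assumptions, not the authors' searched architecture.

```python
# Illustrative sketch only (NOT the authors' architecture): a self-attention
# block whose keys/values also see a memory state that is updated recurrently
# across sequence chunks, hinting at "recurrence at the sequence level".
import torch
import torch.nn as nn


class MemorySelfAttention(nn.Module):
    """Self-attention over the concatenation of a memory state and the input chunk."""

    def __init__(self, d_model: int, n_heads: int, mem_len: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mem_update = nn.GRUCell(d_model, d_model)  # recurrent memory update
        self.mem_len = mem_len
        self.d_model = d_model

    def init_memory(self, batch_size: int) -> torch.Tensor:
        return torch.zeros(batch_size, self.mem_len, self.d_model)

    def forward(self, x: torch.Tensor, memory: torch.Tensor):
        # Queries come from the current chunk; keys/values also attend to the memory.
        kv = torch.cat([memory, x], dim=1)
        out, _ = self.attn(x, kv, kv)
        # Summarise the attended outputs and push the summary into each memory slot.
        summary = out.mean(dim=1)                     # (batch, d_model)
        b, m, d = memory.shape
        new_mem = self.mem_update(
            summary.repeat_interleave(m, dim=0),      # (batch * mem_len, d_model)
            memory.reshape(b * m, d),
        ).reshape(b, m, d)
        return out, new_mem


if __name__ == "__main__":
    block = MemorySelfAttention(d_model=32, n_heads=4, mem_len=8)
    mem = block.init_memory(batch_size=2)
    for chunk in torch.randn(5, 2, 24, 32):           # 5 chunks of 24 time steps each
        y, mem = block(chunk, mem)
    print(y.shape, mem.shape)                         # (2, 24, 32) and (2, 8, 32)
```

In an RL-based NAS setting such as the one the abstract describes, a controller would choose among candidate operations (attention variants, memory updates, etc.); the block above is only one hand-written example of the search space's general idea.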
Keywords
- Artificial Intelligence
- Forecasting
- Energy Policy and Planning
Status: accepted