• Media type: E-article
  • Title: Estimating time-delayed variables using transformer-based soft sensors
  • Contributors: Wibbeke, Jelke; Alves, Darian; Rohjans, Sebastian
  • Published: Springer Science and Business Media LLC, 2023
  • Published in: Energy Informatics
  • Language: English
  • DOI: 10.1186/s42162-023-00274-3
  • ISSN: 2520-8942
  • Keywords: Computer Networks and Communications; Energy Engineering and Power Technology; Information Systems
  • Description (abstract): In the course of digitization, there is increased interest in sensor data, including data from old systems with service lives of several decades. Since installing sensor technology can be quite expensive, soft sensors are often used to enhance monitoring capabilities. Soft sensors use easy-to-measure variables to predict hard-to-measure variables, employing arbitrary models. This is particularly challenging if the observed system is complex and exhibits dynamic behavior, e.g., transient responses after changes in the system. Data-driven models are therefore often used. As recent studies suggest using Transformer-based models for regression tasks, this paper investigates the use of Transformer-based soft sensors for modelling the dynamic behavior of systems. To this end, the performance of Multilayer Perceptron (MLP) and Long Short-Term Memory (LSTM) models is compared to that of Transformers on two data sets featuring dynamic behavior in terms of time-delayed variables. The outcomes of this paper demonstrate that while the Transformer can map time delays, it is outperformed by MLP and LSTM. This deviation from previous Transformer evaluations is noteworthy; it may be influenced by the dynamic characteristics of the input data, and the Transformer's attention-based mechanism may not be optimized for sequential data. It is important to mention that previous studies in this area did not focus on time-delayed dynamic variables. (A minimal illustrative code sketch follows this record.)
  • Access status: Open access
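
Illustrative sketch (not part of the catalog record or the paper): the abstract describes Transformer-based soft sensors that regress a hard-to-measure variable from a window of easy-to-measure signals. The minimal PyTorch example below shows one common way such a model can be assembled; the class name, layer sizes, and window shape are assumptions chosen for illustration, not the authors' implementation.

# Minimal Transformer-encoder soft-sensor regressor (illustrative sketch only;
# all hyperparameters and names are assumptions, not taken from the paper).
import torch
import torch.nn as nn


class TransformerSoftSensor(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, dropout: float = 0.1):
        super().__init__()
        # Project each time step's easy-to-measure readings into the model dimension.
        self.input_proj = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Regress the hard-to-measure target from the encoding of the last time step.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_length, n_features)
        h = self.encoder(self.input_proj(x))
        return self.head(h[:, -1, :])  # (batch, 1)


if __name__ == "__main__":
    # Toy usage: windows of 32 past time steps of 8 easy-to-measure signals.
    model = TransformerSoftSensor(n_features=8)
    window = torch.randn(16, 32, 8)
    print(model(window).shape)  # torch.Size([16, 1])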