• Media type: E-article
  • Title: Physically enhanced training for modeling rate-independent plasticity with feedforward neural networks
  • Contributors: Weber, Patrick; Wagner, Werner; Freitag, Steffen
  • Published: Springer Science and Business Media LLC, 2023
  • Published in: Computational Mechanics
  • Language: English
  • DOI: 10.1007/s00466-023-02316-9
  • ISSN: 0178-7675; 1432-0924
  • Keywords: Applied Mathematics; Computational Mathematics; Computational Theory and Mathematics; Mechanical Engineering; Ocean Engineering; Computational Mechanics
  • Description: Abstract: In recent years, considerable progress has been made in material modeling with artificial neural networks (ANNs). However, the following drawbacks persist to this day: ANNs need a large amount of training data, which is not realistic if real-world experiments are intended to serve as the data basis. Additionally, the application of ANN material models in finite element (FE) calculations is challenging because local material instabilities can lead to divergence of the solution algorithm. In this paper, we extend the approach of constrained neural network training from [28] to elasto-plastic material behavior, modeled by an incrementally defined feedforward neural network. Purely stress- and strain-dependent equality and inequality constraints are introduced, including material stability, stationarity, normalization, symmetry, and the prevention of energy production. In the Appendices, we provide a comprehensive framework for implementing these constraints in a gradient-based optimization algorithm. We show that training ANN material models with physical constraints leads to a broader capture of the material behavior underlying the given training data. This is especially the case if only a limited amount of data is available, which is important for practical applications. Furthermore, we show that these ANN models are superior to classically trained ANNs in FE computations with respect to convergence behavior, stability, and physical interpretation of the results.
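
The abstract describes adding purely stress- and strain-dependent equality and inequality constraints to gradient-based ANN training. Below is a minimal, hypothetical sketch of that general idea in PyTorch, assuming a scalar incremental model that maps a strain increment (plus history features) to a stress increment and enforcing two illustrative penalty terms: a normalization condition (zero strain increment yields zero stress increment) and a Drucker-type stability condition (non-negative incremental work). The architecture, constraint formulations, and weights are assumptions for illustration, not the authors' implementation from the paper or its Appendices.

```python
# Illustrative sketch only: penalty-augmented training of an incremental
# feedforward ANN material model. Constraint choices and weights are assumed.
import torch
import torch.nn as nn

class IncrementalANN(nn.Module):
    def __init__(self, n_hist=2, n_hidden=32):
        super().__init__()
        # Input: [strain increment, history features]; output: stress increment.
        self.net = nn.Sequential(
            nn.Linear(1 + n_hist, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, d_eps, hist):
        return self.net(torch.cat([d_eps, hist], dim=-1))

def constrained_loss(model, d_eps, hist, d_sigma_target, w_norm=1.0, w_stab=1.0):
    d_sigma = model(d_eps, hist)
    data_loss = torch.mean((d_sigma - d_sigma_target) ** 2)

    # Normalization (equality constraint): zero strain increment -> zero stress increment.
    norm_pen = torch.mean(model(torch.zeros_like(d_eps), hist) ** 2)

    # Material stability (inequality constraint): penalize negative incremental work.
    stab_pen = torch.mean(torch.relu(-(d_sigma * d_eps)))

    return data_loss + w_norm * norm_pen + w_stab * stab_pen

# Usage on synthetic data (placeholder linear response for illustration).
model = IncrementalANN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
d_eps = torch.randn(256, 1) * 1e-3
hist = torch.randn(256, 2)
d_sigma_target = 200.0 * d_eps
for _ in range(200):
    opt.zero_grad()
    loss = constrained_loss(model, d_eps, hist, d_sigma_target)
    loss.backward()
    opt.step()
```

The penalty weights w_norm and w_stab trade off data fit against constraint satisfaction; the paper itself additionally names stationarity, symmetry, and the prevention of energy production among the enforced constraints.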