Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks
Permanent link: https://hdl.handle.net/10037/24272
Date: 2021-06-05
Type: Journal article
Peer reviewed
Authors: Geist, Moritz; Petersen, Philipp; Raslan, Mones; Schneider, Reinhold; Kutyniok, Gitta Astrid Hildegard

Abstract
We perform a comprehensive numerical study of the effect of approximation-theoretical results for neural networks on practical learning problems in the context of numerical analysis. As the underlying model, we study the machine-learning-based solution of parametric partial differential equations. Here, approximation theory for fully-connected neural networks predicts that the performance of the model should depend only very mildly on the dimension of the parameter space and is instead determined by the intrinsic dimension of the solution manifold of the parametric partial differential equation. We use various methods to establish comparability between test cases by minimizing the effect of the choice of test case on the optimization and sampling aspects of the learning problem. We find strong support for the hypothesis that approximation-theoretical effects heavily influence the practical behavior of learning problems in numerical analysis. At the end of this study, we turn to more modern and practically more successful architectures and derive improved error bounds for convolutional neural networks.
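
To make the setup concrete, the following is a minimal, self-contained sketch (not the authors' code) of a learning problem of this type: a small fully-connected network is trained to approximate the parameter-to-solution map of a one-dimensional parametric diffusion equation, with training data generated by a finite-difference solver. The coefficient parametrization, network size, and training details are illustrative assumptions and do not reflect the specific test cases of the paper.

```python
# Illustrative sketch: learn the parameter-to-solution map y -> u_h(y) of a
# 1D parametric diffusion equation
#   -(a(x; y) u'(x))' = 1 on (0, 1),   u(0) = u(1) = 0,
# with an affine-parametric coefficient a(x; y). All sizes and the coefficient
# parametrization below are assumptions made for this example.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
n_grid, p_dim = 64, 4                     # interior grid points, parameter dimension
h = 1.0 / (n_grid + 1)
grid = np.linspace(0.0, 1.0, n_grid + 2)  # full grid including boundary nodes

def diffusion_coeff(y, pts):
    """Affine coefficient a(x; y) = 1 + sum_j y_j sin((j+1) pi x) / (3 (j+1)); stays positive for |y_j| <= 1."""
    return 1.0 + sum(y[j] * np.sin((j + 1) * np.pi * pts) / (3 * (j + 1)) for j in range(len(y)))

def solve_fd(y):
    """Finite-difference solution of -(a u')' = 1 with homogeneous Dirichlet conditions."""
    a_half = diffusion_coeff(y, 0.5 * (grid[:-1] + grid[1:]))   # a at cell midpoints
    main = (a_half[:-1] + a_half[1:]) / h**2                     # diagonal entries
    off = -a_half[1:-1] / h**2                                   # off-diagonal entries
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, np.ones(n_grid))

# Training data: sampled parameters and the corresponding discrete solutions.
ys = rng.uniform(-1.0, 1.0, size=(1000, p_dim))
us = np.stack([solve_fd(y) for y in ys])
Y = torch.tensor(ys, dtype=torch.float32)
U = torch.tensor(us, dtype=torch.float32)

# Fully-connected network approximating the parameter-to-solution map on the fixed grid.
model = nn.Sequential(nn.Linear(p_dim, 128), nn.ReLU(),
                      nn.Linear(128, 128), nn.ReLU(),
                      nn.Linear(128, n_grid))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(200):                  # full-batch training, kept short for illustration
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(Y), U)
    loss.backward()
    opt.step()

print(f"final training MSE: {loss.item():.3e}")
```

In this toy setting the network input dimension equals the parameter dimension p_dim, while the quantity that governs how hard the map is to approximate is the complexity of the solution set itself, which is the effect the study investigates.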
Publisher: Springer

Citation: Geist, Petersen, Raslan, Schneider, Kutyniok. Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks. Journal of Scientific Computing. 2021;88(1).
Copyright 2021 The Author(s)