Abstract
Various deep learning architectures are appearing in the field of machine learning, aiming either to handle new types of data or to solve problems inherent to the networks themselves. In this thesis, we propose creating architectures based on partial differential equations (PDEs) from physics, transferring the known properties of PDEs to the model architectures as a form of inductive bias. We test this idea by comparing the oversmoothing process in graph neural networks to heat diffusion, and by constructing new architectures based on the wave equation to reduce the effects of oversmoothing. The experiments suggest that the proposed architectures possess properties similar to wave propagation, implying that inheriting properties from physics PDEs is a viable approach.