Type
Master's thesis / supervised research
Prerequisites
- Basic knowledge of deep learning
- A background in functional analysis (for more theoretical projects) and/or knowledge of PyTorch or TensorFlow (for more applied projects)
- Willingness to learn more about functional analysis and PyTorch/TensorFlow
Description
In many applications in data science, such as social networks, chemistry, recommendation systems, knowledge graphs, traffic networks, and functional brain networks, the data is represented by graphs. Graph neural networks (GNNs) extend classical deep learning methods to graph-structured data and have achieved resounding success in the past few years. By now, GNNs are ubiquitous both in industry and in the applied sciences. Since graphs are irregular objects, graph neural networks present challenging problems: how to define convolution on graphs, how to train a network on certain graphs and apply it to other graphs, how to define a convolutional network that is stable and robust to domain perturbations, and how to determine the expressive capacity of graph neural networks. Contemporary research focuses on such questions, which span the spectrum between theoretical analysis and application.
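To make the first of these questions concrete, the following is a minimal sketch of one standard answer to "how to define convolution on graphs": the symmetrically normalized message-passing layer H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W) popularized by graph convolutional networks (Kipf and Welling, 2017). It is written in plain NumPy for illustration; a real project would use PyTorch or a GNN library, and the toy graph and weights below are made up for the example.

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One graph-convolution layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    adj:      (n, n) binary adjacency matrix of the graph
    features: (n, d) node feature matrix H
    weight:   (d, d') learnable weight matrix W
    """
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                    # add self-loops
    deg = a_hat.sum(axis=1)                    # degrees of the self-looped graph
    d_inv_sqrt = np.diag(deg ** -0.5)          # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt   # symmetric normalization
    return np.maximum(a_norm @ features @ weight, 0.0)  # ReLU nonlinearity

# Toy example: a path graph 0 - 1 - 2 with one-hot node features.
adj = np.array([[0.0, 1.0, 0.0],
                [1.0, 0.0, 1.0],
                [0.0, 1.0, 0.0]])
features = np.eye(3)
weight = np.ones((3, 2))       # arbitrary illustrative weights
out = gcn_layer(adj, features, weight)
print(out.shape)  # each node now aggregates its neighbors' features
```

Each row of the output mixes a node's own features with those of its neighbors, which is exactly the locality property that makes this a graph analogue of convolution; stacking such layers (and studying what happens when many are stacked) leads directly to the oversmoothing and expressivity questions in the references below.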
References
- General Overview of the Theory of GNNs:
Theory of Graph Neural Networks: Representation and Learning (https://arxiv.org/abs/2204.07697)
Future Directions in the Theory of Graph Machine Learning (https://arxiv.org/abs/2402.02287)
- Expressivity of GNNs
How Powerful are Graph Neural Networks? (https://arxiv.org/abs/1810.00826)
Beyond Weisfeiler-Lehman: A Quantitative Framework for GNN Expressiveness (https://arxiv.org/abs/2401.08514)
Sign and Basis Invariant Networks for Spectral Graph Representation Learning (https://arxiv.org/abs/2202.13013)
Equivariant Subgraph Aggregation Networks (https://arxiv.org/abs/2110.02910)
Homomorphism Counts for Graph Neural Networks: All About That Basis (https://arxiv.org/abs/2402.08595)
- Oversmoothing of GNNs
A Survey On Oversmoothing in Graph Neural Networks (https://arxiv.org/abs/2303.10993)
Understanding convolution on graphs via energies (https://arxiv.org/abs/2206.10991)
Graph Neural Networks Exponentially Lose Expressive Power for Node Classification (https://arxiv.org/abs/1905.10947)
- Generalization in GNNs
Generalization and Representational Limits of Graph Neural Networks (https://arxiv.org/abs/2002.06157)
- Applications of GNNs
FAENet: Frame Averaging Equivariant GNN for Materials Modeling (https://arxiv.org/abs/2305.05577)
Message Passing Neural PDE Solvers (https://arxiv.org/abs/2202.03376)
Geometric and Physical Quantities Improve E(3) Equivariant Message Passing (https://arxiv.org/abs/2110.02905)