Speaker: Emily Jin

Abstract

One of the key challenges in graph machine learning is how to effectively encode the topology of a graph into the model at hand. Standard message-passing GNNs are known to struggle with counting certain patterns (e.g., cycles), which limits their applicability to real-world tasks. Furthermore, Graph Transformers rely heavily on the quality of their positional or structural encodings to incorporate graph structure. In this talk, we will discuss how homomorphism counts can help address both of these shortcomings by providing a principled way of incorporating structural information to increase model expressivity.
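As a concrete illustration (not drawn from the talk itself), one simple family of homomorphism counts is that of rooted cycles: the number of homomorphisms from a cycle of length k rooted at a node equals the number of closed walks of length k starting at that node, which is the corresponding diagonal entry of the k-th power of the adjacency matrix. A minimal NumPy sketch of computing such counts as extra node features (the function name and toy graph are illustrative):

```python
import numpy as np

def rooted_cycle_hom_counts(A, max_k=6):
    """Per-node homomorphism counts of rooted cycles C_3, ..., C_max_k.

    The count for C_k rooted at node i is the number of closed walks of
    length k starting at i, i.e. diag(A^k)_i.
    """
    A = np.asarray(A, dtype=np.int64)
    feats = []
    Ak = A @ A  # A^2; the loop below multiplies up to A^max_k
    for k in range(3, max_k + 1):
        Ak = Ak @ A
        feats.append(np.diag(Ak))
    return np.stack(feats, axis=1)  # shape (n_nodes, max_k - 2)

# Toy graph: a triangle on nodes 0, 1, 2 with a pendant node 3 attached to 2.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
])
counts = rooted_cycle_hom_counts(A, max_k=4)
# Each triangle node starts 2 closed walks of length 3; the pendant node starts none,
# so counts[:, 0] is [2, 2, 2, 0].
```

Features like these can be concatenated to the initial node embeddings of a message-passing GNN or used as structural encodings in a Graph Transformer, giving the model access to cycle information that plain message passing cannot count.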

Speaker Bio

Emily is a PhD student in Computer Science at the University of Oxford. She is supervised by Prof. Michael Bronstein and Dr. Ismail Ceylan, and is also part of the Quantitative Biology group at AstraZeneca. Her research includes applying geometric deep learning methods to challenges in drug discovery, such as molecular property prediction, target identification, and organic crystal structure generation.