Many complex networks evolve over time, including transaction networks, traffic networks, and social networks. Temporal Graph Learning (TGL) is a fast-growing field that aims to learn from, predict, and understand these evolving networks. See our previous blog post for an introduction to temporal graph learning and a summary of last year's advancements.
In 2023, we saw significantly increased interest in the development of TGL from both academia and industry. Compared to last year, the number of submissions to the Temporal Graph Learning Workshop @ NeurIPS 2023 tripled, resulting in 35 accepted papers. In addition, the temporal graph learning reading group, started in February 2023, has now hosted 28 research talks (the recordings are available on YouTube). With nearly 200 researchers signed up for the reading group, we are glad to see strong interest in the topic and an extremely active community.
This post was co-authored with Emanuele Rossi, Michael Galkin, Andrea Cini and Ingo Scholtes.
This blog post covers a selection of exciting developments in TGL and points out research directions for 2024. We also ask leading researchers for their take on what the future holds for TGL. The post also aims to provide references and act as a starting point for those who want to learn more about temporal graph learning. Please share any other advances you are excited about in the comment section. For advances in graph learning more broadly, check out Michael Galkin’s excellent blog post.
Table of Contents:
One of the driving forces behind the rapid development of machine learning on graphs is the availability of standardized and diverse benchmarks such as the Open Graph Benchmark (OGB), the Long Range Graph Benchmark and GraphWorld. However, these benchmarks are designed for static graphs and lack the fine-grained timestamp information required for temporal graph learning. Progress in temporal graph learning has therefore been held back by the lack of large, high-quality datasets, as well as by the lack of proper evaluation, which results in over-optimistic performance estimates.
To address this gap, the Temporal Graph Benchmark (TGB) was recently introduced: a collection of challenging and diverse benchmark datasets for realistic, reproducible, and robust evaluation of machine learning on temporal graphs. TGB provides a pypi package that automatically downloads and processes nine datasets from five distinct domains, with up to 72 million edges and 30 million timestamps. TGB also provides standardized evaluation protocols motivated by real-world applications.
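For readers who want to try TGB, the snippet below is a minimal sketch of loading one of its link prediction datasets and calling the provided evaluator. It assumes the py-tgb pypi package and the tgbl-wiki dataset; the random scores stand in for a real model, and exact class names, argument names, and dictionary keys may differ slightly between TGB versions.

```python
# Minimal sketch: load a TGB link prediction dataset and use its evaluator.
# Assumes `pip install py-tgb`; API details may vary across TGB versions.
import numpy as np
from tgb.linkproppred.dataset import LinkPropPredDataset
from tgb.linkproppred.evaluator import Evaluator

# Download and preprocess the tgbl-wiki dataset (one of the nine TGB datasets).
dataset = LinkPropPredDataset(name="tgbl-wiki", root="datasets", preprocess=True)
data = dataset.full_data  # dict with sources, destinations, timestamps, edge features

# Chronological splits are exposed as boolean masks over the edge stream.
train_src = data["sources"][dataset.train_mask]
train_dst = data["destinations"][dataset.train_mask]
train_ts = data["timestamps"][dataset.train_mask]
print(f"number of training edges: {len(train_src)}")

# Standardized evaluation: rank the score of the true edge against negatives.
evaluator = Evaluator(name="tgbl-wiki")
y_pred_pos = np.random.rand(1)   # placeholder score for the true destination
y_pred_neg = np.random.rand(20)  # placeholder scores for negative candidates
result = evaluator.eval({
    "y_pred_pos": y_pred_pos,
    "y_pred_neg": y_pred_neg,
    "eval_metric": ["mrr"],  # tgbl-wiki reports Mean Reciprocal Rank
})
print(result)
```

In a real pipeline, the placeholder scores would come from a temporal graph model scoring the true destination against the negative candidates that TGB pre-samples for each query, so that all methods are ranked under the same evaluation protocol.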