Learning on Graphs Conference, 2024
Organizers: Qiang Zhang, Jiaoyan Chen, and Zaiqiao Meng
Date: November 26, 2024
US West Coast | US East Coast | London (UTC+0) | Asia (Beijing) |
---|---|---|---|
06:30 | 09:30 | 14:30 | 22:30 |
Length: 1.5 hours
Abstract: In recent years, Knowledge Graphs (KGs) and Large Language Models (LLMs) have emerged as powerful tools for scientific research and knowledge discovery. This tutorial aims to provide attendees with an understanding of how these technologies can be integrated and applied to advance research in life sciences and other scientific domains. Over the course of the tutorial, participants will explore the foundational concepts of KGs and LLMs, their applications in life sciences, and practical examples demonstrating their integration for BioNLP tasks. The tutorial will include materials demonstrating KG construction and LLM development for scientific data interpretation, emphasizing practical techniques for incorporating domain-specific knowledge into AI models. By the end of this tutorial, participants will have a comprehensive understanding of the synergy between KGs and LLMs and how to leverage these tools for innovative scientific solutions.
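One common integration pattern is to serialize KG facts as textual context for an LLM. The sketch below is a minimal illustration of that idea; the triples, entity names, and prompt format are hypothetical examples, not material from the tutorial.

```python
# Minimal sketch: serialize a toy biomedical knowledge graph into text context
# for an LLM prompt. The entities, relations, and prompt wording below are
# invented for illustration, not taken from the tutorial materials.

triples = [
    ("Metformin", "treats", "Type 2 diabetes"),
    ("Metformin", "inhibits", "Mitochondrial complex I"),
    ("Type 2 diabetes", "associated_with", "Insulin resistance"),
]

def triples_to_context(triples):
    """Turn (head, relation, tail) triples into plain-text statements."""
    return "\n".join(f"{h} {r.replace('_', ' ')} {t}." for h, r, t in triples)

question = "Which condition is the drug Metformin used to treat?"
prompt = (
    "Use only the facts below to answer the question.\n\n"
    f"Facts:\n{triples_to_context(triples)}\n\n"
    f"Question: {question}\nAnswer:"
)
print(prompt)  # this string would be passed to an LLM of choice
```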
Website: TBD
Setup requirements: None
Organizers: Sitao Luan, Chenqing Hua, Qincheng Lu
Date: November 26, 2024
US West Coast | US East Coast | London (UTC+0) | Asia (Beijing) |
---|---|---|---|
11:00 | 14:00 | 19:00 | 03:00 (Nov 27) |
Length: 1.5 hours
Abstract: The objectives of this tutorial are to: (1) raise awareness of the heterophily problem and show that not all graph structures are useful or beneficial for learning; (2) help the audience quickly review the latest heterophily-specific graph models; (3) summarize real-world applications and assist the audience with future research on related topics.
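To make the heterophily notion concrete, the following minimal sketch computes the edge homophily ratio, the fraction of edges connecting same-label nodes, on a toy graph; the graph and labels are invented for illustration. Values near 0 indicate strong heterophily, where neighboring nodes tend to have different labels.

```python
# Minimal sketch: edge homophily ratio on a toy graph (edges and labels are
# invented for illustration). A low ratio means most edges connect nodes with
# different labels, i.e., the graph is heterophilic.

edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (4, 5)]   # undirected edges
labels = {0: "A", 1: "B", 2: "A", 3: "A", 4: "B", 5: "B"}  # node -> class

def edge_homophily(edges, labels):
    same = sum(labels[u] == labels[v] for u, v in edges)
    return same / len(edges)

print(f"edge homophily = {edge_homophily(edges, labels):.2f}")
# close to 1.0 -> homophilic; close to 0.0 -> heterophilic
```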
Website: TBD
Setup requirements: None
Organizers: Avishek Joey Bose, Alexander Tong, Heli Ben-Hamu
Date: November 27, 2024
US West Coast | US East Coast | London (UTC+0) | Asia (Beijing) |
---|---|---|---|
06:00 | 09:00 | 14:00 | 22:00 |
Length: 3 hours
Abstract: Recent years have seen a surge in research at the intersection of differential geometry and generative modeling on Riemannian manifolds. Indeed, the growth of applications over highly structured data domains (e.g., graphs representing molecular data, or geospatial data as points on a sphere) demands that modern generative models treat this rich structure not as an optional inductive bias but as a first-class citizen that drives key modeling decisions. This raises the natural question of how to construct a principled playbook for imbuing generative models with geometry. Geometry-aware generative models are key drivers in this space and, through diffusion and flow matching models, have already started to make a substantial impact in important application areas such as protein generative models (e.g. FrameDiff, FoldFlow), robotics (EDGI), and modeling symmetries in dynamical systems. At present, the generative modeling and geometric deep learning communities share a relatively small overlap in comparison to their parent communities. In this tutorial, we seek to bridge this gap by providing a bottom-up view of building modern generative models, such as diffusion and flow matching, attuned to downstream applications that benefit from geometric inductive biases. Our tutorial is a first of its kind and aims to provide a geometric blueprint for audiences ranging from newcomers to generative models to seasoned geometric ML experts, which we hope will bolster both the size of the community and the potential for future advances in new theories and applications.
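As a small taste of geometry-aware generative modeling, the sketch below constructs conditional flow-matching regression targets on the unit sphere: points are interpolated along the geodesic between a base sample and a data sample, and the target vector field is the logarithm map toward the data point rescaled by 1/(1-t). Everything here (the uniform base samples, the geodesic interpolant, the target definition) is an illustrative assumption, not the tutorial's own code; a neural vector field would then be regressed onto these targets.

```python
# Minimal NumPy sketch of conditional flow matching targets on the unit sphere S^2.
# The modeling choices (geodesic interpolant, u_t = log_{x_t}(x1) / (1 - t)) are
# illustrative assumptions, not the tutorial's code.
import numpy as np

def sphere_log(p, q, eps=1e-7):
    """Logarithm map log_p(q): tangent vector at p pointing along the geodesic to q."""
    cos_theta = np.clip(np.dot(p, q), -1.0 + eps, 1.0 - eps)
    theta = np.arccos(cos_theta)
    direction = q - cos_theta * p
    return theta * direction / (np.linalg.norm(direction) + eps)

def geodesic_interpolate(x0, x1, t, eps=1e-7):
    """Point at fraction t along the geodesic from x0 to x1 (spherical interpolation)."""
    cos_theta = np.clip(np.dot(x0, x1), -1.0 + eps, 1.0 - eps)
    theta = np.arccos(cos_theta)
    return (np.sin((1 - t) * theta) * x0 + np.sin(t * theta) * x1) / np.sin(theta)

rng = np.random.default_rng(0)
x1 = rng.normal(size=3); x1 /= np.linalg.norm(x1)   # "data" point on the sphere
x0 = rng.normal(size=3); x0 /= np.linalg.norm(x0)   # base (noise) point on the sphere
t = rng.uniform(0.0, 0.95)

x_t = geodesic_interpolate(x0, x1, t)
u_t = sphere_log(x_t, x1) / (1.0 - t)   # conditional vector field target at (x_t, t)
print(x_t, u_t)
```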
Website: TBD
Setup requirements: None
Organizers: Andrea Cini, Ivan Marisca, Daniele Zambon
Date: November 28, 2024
US West Coast | US East Coast | London (UTC+0) | Asia (Beijing) |
---|---|---|---|
09:00 | 12:00 | 17:00 | 01:00 (Nov 29) |
Length: 3 hours
Abstract: Successful applications of deep learning in time series processing often involve training a single neural network on a collection of (related) time series. Pairwise relationships among time series can be modeled by considering a (possibly dynamic) graph spanning the collection. In this context, graph-based methods take the standard deep learning approach to time series processing a step forward. Indeed, graph representations enable the conditioning of the predictions w.r.t. subsets of time series (i.e., neighboring nodes) while learning a single model. The recent theoretical and practical developments in graph machine learning make adopting such an approach particularly appealing and timely. Furthermore, looking at dynamic relational data from the time series processing perspective allows for devising new modeling tools for temporal graphs. We believe that research on the topic is mature enough to warrant exposition to a wide audience. The twofold objective of this tutorial is to (1) offer a comprehensive overview of the field, emphasizing the potential of graph-based processing in time series forecasting applications, and (2) provide the necessary tools and guidelines to design and evaluate graph-based models for time series. Open challenges and directions are discussed to foster future developments.
Website: TBD
Setup requirements: The hands-on session will provide an overview of open-source PyTorch libraries for graph-based time series processing, with a demo on building and training custom spatiotemporal predictors with PyTorch Geometric and Torch Spatiotemporal.
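To give a flavor of what such a predictor looks like, here is a minimal sketch of a "time-then-space" model: a shared GRU encodes each node's history and a single GCNConv layer from PyTorch Geometric conditions each node's forecast on its neighbors. This structure is an illustrative assumption, not the demo's actual code, and the Torch Spatiotemporal API is not shown here.

```python
# Minimal sketch of a "time-then-space" spatiotemporal predictor:
# a shared GRU encodes each node's history, then one round of graph
# message passing (GCNConv) mixes information across neighboring nodes.
# Illustrative assumption of the demo's structure, not its actual code.
import torch
from torch import nn
from torch_geometric.nn import GCNConv

class SimpleSpatioTemporalPredictor(nn.Module):
    def __init__(self, in_channels, hidden_channels, horizon):
        super().__init__()
        self.gru = nn.GRU(in_channels, hidden_channels, batch_first=True)
        self.conv = GCNConv(hidden_channels, hidden_channels)
        self.readout = nn.Linear(hidden_channels, horizon)

    def forward(self, x, edge_index):
        # x: [num_nodes, window, in_channels] -- per-node input time series
        _, h = self.gru(x)                          # h: [1, num_nodes, hidden]
        h = h.squeeze(0)                            # [num_nodes, hidden]
        h = torch.relu(self.conv(h, edge_index))    # condition on neighboring nodes
        return self.readout(h)                      # [num_nodes, horizon] forecasts

# Toy usage: 5 nodes, 12 past steps, univariate series, 3-step-ahead forecast.
x = torch.randn(5, 12, 1)
edge_index = torch.tensor([[0, 1, 1, 2, 3, 4], [1, 0, 2, 1, 4, 3]])
model = SimpleSpatioTemporalPredictor(in_channels=1, hidden_channels=16, horizon=3)
print(model(x, edge_index).shape)  # torch.Size([5, 3])
```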
Organizers: Petar Veličković, Olga Kozlova, Federico Barbero, Larisa Markeeva, Alex Vitvitskyi, Wilfried Bounsi
Date: November 29, 2024
US West Coast | US East Coast | London (UTC+0) | Asia (Beijing) |
---|---|---|---|
08:00 | 11:00 | 16:00 | 00:00 (Nov 30) |
Length: 3 hours
Abstract: We want to build machine learning systems that are thoroughly able to understand, align to, and leverage algorithmic computation. Given the diametrically opposite pros and cons of neural networks and classical algorithms, we believe that bringing them closer together could yield dramatic gains towards generally intelligent systems. Bridging this gap is the essence of neural algorithmic reasoning (NAR). A lot has happened, both in the field itself and in the broader landscape of artificial intelligence, since the original NAR tutorial at LoG'22. Reasoning is now undoubtedly at the forefront of the desiderata of capabilities for next-generation intelligent systems, and it is one of the most pervasive topics of scientific discourse in AI. We believe it is high time for a new tutorial! In this tutorial, we aim to provide the foundational material needed to understand the interplay between neural networks and classical computation, to contribute to NAR research and engineering, and to ground this with hands-on coding segments that directly implement some of the core ideas and interface with relevant benchmarks. In keeping with modern trends, there will be a special focus on deploying NAR techniques in contemporary language models. Much like the original NAR tutorial, ours will be presented from the ground up, in a way that is accessible to anyone with a basic computer science background, though familiarity with the original tutorial will be helpful.
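To make the algorithmic-alignment idea concrete, the sketch below runs one Bellman-Ford relaxation step on a toy weighted graph; in NAR, a message-passing network (typically with min/max aggregation) is trained so that each of its steps imitates an algorithmic step of exactly this kind. The graph, weights, and step count are invented for illustration and are not material from the tutorial or its benchmarks.

```python
# Minimal sketch: one Bellman-Ford relaxation step on a toy weighted graph.
# In NAR, a message-passing processor is typically trained to reproduce, step by
# step, algorithmic updates like this one (illustrative example, not the
# tutorial's code or benchmark).
import numpy as np

INF = np.inf
# Weighted adjacency matrix (INF = no edge), 4 nodes, source node 0.
W = np.array([
    [0.0, 2.0, INF, 7.0],
    [2.0, 0.0, 1.0, INF],
    [INF, 1.0, 0.0, 3.0],
    [7.0, INF, 3.0, 0.0],
])

def bellman_ford_step(dist, W):
    """One relaxation: each node takes the best distance offered via any neighbor."""
    candidates = dist[:, None] + W              # candidates[u, v] = dist[u] + w(u, v)
    return np.minimum(dist, candidates.min(axis=0))

dist = np.full(4, INF)
dist[0] = 0.0
for step in range(3):                           # at most |V| - 1 relaxation rounds
    dist = bellman_ford_step(dist, W)
    print(f"after step {step + 1}: {dist}")
```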
Website: TBD
Setup requirements: TBD