
Explainable Graph Machine Learning: Challenges and Solutions

Speaker: Megha Khosla

Abstract: Graph-based machine learning (GraphML) has led to state-of-the-art improvements in various scientific tasks, yet its opaque nature hinders its full potential in sensitive domains. In this talk, I will commence by providing a brief overview of prevalent notions of post-hoc explanations for GraphML techniques, shedding light on the intricacies of finding and evaluating these explanations. I will present our proposed effective solutions and conclude by exploring intriguing problems that continue to stimulate research in this domain.
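To give a flavour of what a post-hoc explanation for a GNN prediction can look like, here is a minimal perturbation-style sketch: score each edge by how much removing it changes a toy model's output for a target node. The one-layer "GNN" (a plain neighbourhood average), the graph, and all numbers are illustrative assumptions, not the method presented in the talk.

```python
def gnn_score(features, edges, node):
    """Toy one-layer GNN: average the node's own and its neighbours' features."""
    neigh = [v for (u, v) in edges if u == node] + [u for (u, v) in edges if v == node]
    vals = [features[node]] + [features[n] for n in neigh]
    return sum(vals) / len(vals)

def edge_importance(features, edges, node):
    """Importance of each edge = |score with the edge - score without it|."""
    base = gnn_score(features, edges, node)
    return {
        e: abs(base - gnn_score(features, [f for f in edges if f != e], node))
        for e in edges
    }

# Illustrative graph: node 2 is a feature outlier connected to node 0.
features = {0: 1.0, 1: 0.0, 2: 10.0}
edges = [(0, 1), (0, 2)]
scores = edge_importance(features, edges, node=0)
# The edge to the outlier node 2 matters most for node 0's prediction.
```

A perturbation-based attribution like this is only one of the explanation notions the talk surveys; others learn soft edge masks or gradients instead of deleting edges outright.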

[ Slides ] [ Recording ]

Multi-label Node Classification On Graph-Structured Data

Speaker: Tianqi Zhao

Abstract: Graph Neural Networks (GNNs) have exhibited remarkable progress in node classification on graphs, particularly in a multi-class setting. However, their applicability to the more realistic multi-label classification scenario, where nodes can have multiple labels, has been largely overlooked. This talk will unveil the limitations of current GNNs in handling multi-label classification tasks. I will also highlight how existing GNNs, designed for either homophilic or heterophilic characteristics of node labels, fall short in capturing the nuanced complexities of multi-label datasets that don’t conform to such clear distinctions.
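The core modelling difference the abstract alludes to can be sketched independently of any particular GNN: in the multi-label setting each node carries a multi-hot target vector, so the output layer typically uses an independent sigmoid per label with binary cross-entropy, rather than one softmax over mutually exclusive classes. The logits and targets below are made-up illustrations, not from the talk.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def multilabel_bce(logits, targets):
    """Mean binary cross-entropy over independent labels (multi-hot targets)."""
    total = 0.0
    for z, y in zip(logits, targets):
        p = sigmoid(z)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(logits)

# A node tagged with labels 0 and 2 simultaneously -- a target that a
# single softmax over mutually exclusive classes cannot represent.
loss = multilabel_bce(logits=[2.0, -1.5, 3.0], targets=[1, 0, 1])
```

Because each label is scored independently, label correlations (e.g. homophily holding for some labels but not others) are exactly what the loss alone does not capture, which is where the limitations discussed in the talk arise.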

[ Slides ] [ Recording ]

Self-Attention Message Passing for Contrastive Few-Shot Image Classification

Speaker: Ojas Shirekar

Abstract: Humans have a unique ability to learn new representations from just a handful of examples with little to no supervision. Deep learning models, however, require an abundance of data and supervision to perform at a satisfactory level. Unsupervised few-shot learning (U-FSL) is the pursuit of bridging this gap between machines and humans. Inspired by the capacity of graph neural networks (GNNs) in discovering complex inter-sample relationships, we propose a novel self-attention based message passing contrastive learning approach (coined as SAMP-CLR) for U-FSL pre-training. This work also proposes an optimal transport (OT) based fine-tuning strategy (called OpT-Tune) to efficiently induce task awareness into our novel end-to-end unsupervised few-shot classification framework (SAMPTransfer).
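As a rough illustration of self-attention message passing in the generic sense (scaled dot-product attention over a node's neighbours), here is a minimal sketch; it is not the actual SAMP-CLR layer, and the feature vectors are invented for the example.

```python
import math

def attention_aggregate(query_feat, neighbour_feats):
    """Aggregate neighbour features weighted by softmax(scaled dot-product)."""
    dim = len(query_feat)
    scores = [
        sum(q * k for q, k in zip(query_feat, nf)) / math.sqrt(dim)
        for nf in neighbour_feats
    ]
    m = max(scores)                         # stabilised softmax
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return [
        sum(w * nf[d] for w, nf in zip(weights, neighbour_feats))
        for d in range(dim)
    ]

node = [1.0, 0.0]
neighbours = [[1.0, 0.0], [0.0, 1.0]]  # one similar, one orthogonal neighbour
message = attention_aggregate(node, neighbours)
# The similar neighbour receives the larger weight, so the aggregated
# message leans toward [1, 0].
```

In the framework described in the talk, such attention-weighted messages let the network emphasise inter-sample relationships during contrastive pre-training, before the OT-based fine-tuning stage adds task awareness.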

[ Slides ] [ Recording ]
