Li Gu

I am a research engineer at Noah's Ark Lab, Huawei Canada, in Toronto, working with Prof. Yang Wang. I received my M.Eng. in Computer Engineering from the University of Toronto and my B.Eng. in Electrical & Computer Engineering from Shanghai Jiao Tong University.

My research interests lie at the intersection of computer vision and machine learning. Recently, I have been focusing on enabling machine learning models to adapt to novel environments rapidly and efficiently, spanning meta-learning, few-shot learning, continual learning, and prompt learning.

Email  /  Google Scholar  /  GitHub  /  LinkedIn

Publications
Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts
Tao Zhong*, Zhixiang Chi*, Li Gu*, Yang Wang, Yuanhao Yu, Jin Tang
NeurIPS, 2022.
Paper / Slides / Code

Proposed a new framework for unsupervised test-time adaptation under domain shift; formulated the adaptation process as knowledge distillation and meta-learned the scheme for aggregating knowledge from multiple source domains.
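A minimal, illustrative sketch of the distillation-from-experts idea under stated assumptions (the module names, architectures, and dimensions below are hypothetical, not the paper's):

```python
# Sketch: test-time adaptation by distilling from a mixture of domain experts.
# Expert/aggregator/student architectures here are toy stand-ins.
import torch
import torch.nn as nn
import torch.nn.functional as F

experts = nn.ModuleList([nn.Linear(32, 16) for _ in range(4)])  # one per source domain
aggregator = nn.MultiheadAttention(embed_dim=16, num_heads=1, batch_first=True)
student = nn.Linear(32, 16)

x = torch.randn(8, 32)  # unlabeled target-domain batch

# Teacher signal: aggregate the frozen experts' features with attention
# (the aggregation scheme itself is what gets meta-learned in the paper).
with torch.no_grad():
    feats = torch.stack([e(x) for e in experts], dim=1)  # (B, n_experts, D)
    teacher, _ = aggregator(feats, feats, feats)
    teacher = teacher.mean(dim=1)                        # (B, D)

# The student adapts by matching the aggregated teacher features.
opt = torch.optim.SGD(student.parameters(), lr=1e-2)
loss = F.mse_loss(student(x), teacher)
opt.zero_grad(); loss.backward(); opt.step()
```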

Improving ProtoNet for Few-Shot Video Object Recognition: Winner of ORBIT Challenge 2022
Li Gu, Zhixiang Chi*, Huan Liu*, Yuanhao Yu, Yang Wang
CVPR VizWiz Workshop, 2022.
Paper / Slides / Code

Extended ProtoNet-based few-shot image classification approaches to the video domain; a toy sketch of the idea follows below.

Winning team of the ORBIT few-shot object recognition challenge; awarded a cash prize of 2,500 USD.
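A toy ProtoNet-style sketch extended to video, assuming a stand-in frame encoder and illustrative tensor shapes (none of these names come from the paper):

```python
# Class prototypes are the mean embedding over all support frames; a query
# video is classified by negative distance to each prototype.
import torch

def embed(frames):                     # hypothetical frame encoder
    return frames.mean(dim=(-2, -1))   # (..., C, H, W) -> (..., C)

# support: (ways, shots, T, C, H, W); query: (T, C, H, W)
support = torch.randn(5, 3, 8, 64, 4, 4)
query = torch.randn(8, 64, 4, 4)

prototypes = embed(support).mean(dim=(1, 2))   # (ways, C): avg over shots & frames
q = embed(query).mean(dim=0)                   # (C,): avg over query frames
logits = -torch.cdist(q[None], prototypes)[0]  # negative Euclidean distance
pred = logits.argmax().item()                  # predicted class index
```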

MetaFSCIL: A Meta-Learning Approach for Few-Shot Class Incremental Learning
Zhixiang Chi, Li Gu, Huan Liu, Yang Wang, Yuanhao Yu, Jin Tang
CVPR, 2022.
Paper / Talk

Proposed a bi-level optimization-based meta-learning approach to learning how to learn incrementally in the Few-Shot Class Incremental Learning (FSCIL) setting.
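A toy MAML-style sketch of the bi-level idea, under the assumption of a linear classifier and a randomly sampled pseudo-incremental session (all tensors and hyperparameters are illustrative):

```python
# Inner loop: adapt a fast copy of the meta-parameters on a sampled "new
# session". Outer loop: evaluate the adapted weights on old + new classes
# jointly and update the meta-parameters through the inner steps.
import torch
import torch.nn.functional as F

w = torch.randn(10, 32, requires_grad=True)   # meta-learned classifier weights
meta_opt = torch.optim.Adam([w], lr=1e-3)

x_new, y_new = torch.randn(5, 32), torch.arange(5) + 5    # pseudo new-class session
x_all, y_all = torch.randn(20, 32), torch.randint(0, 10, (20,))

# Inner loop: a few gradient steps on the new session.
fast = w.clone()
for _ in range(3):
    inner_loss = F.cross_entropy(x_new @ fast.t(), y_new)
    g, = torch.autograd.grad(inner_loss, fast, create_graph=True)  # keep graph for outer update
    fast = fast - 0.1 * g

# Outer loop: meta-loss measures retention of old classes plus new-class accuracy.
meta_loss = F.cross_entropy(x_all @ fast.t(), y_all)
meta_opt.zero_grad(); meta_loss.backward(); meta_opt.step()
```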

Few-Shot Class-Incremental Learning via Entropy-Regularized Data-Free Replay
Huan Liu, Li Gu, Zhixiang Chi, Yang Wang, Yuanhao Yu, Jun Chen, Jin Tang
ECCV, 2022.
Paper

Proposed a replay-based FSCIL framework that synthesizes data with a generator instead of storing real data across continual learning sessions. Imposed entropy regularization during generator training to encourage more uncertain examples to be used in knowledge distillation.
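A hedged sketch of the entropy-regularization idea: the generator is pushed toward samples with high predictive entropy under the frozen old model, i.e. samples near decision boundaries where distillation is most informative. Architectures and the loss weight are illustrative, not the paper's:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

old_model = nn.Linear(32, 10).eval()          # frozen model from the previous session
for p in old_model.parameters():
    p.requires_grad_(False)

generator = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 32))
opt = torch.optim.Adam(generator.parameters(), lr=1e-3)

z = torch.randn(16, 8)
fake = generator(z)                                        # synthesized replay data
probs = F.softmax(old_model(fake), dim=1)
entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()

# Stay consistent with the old model's pseudo-labels, while the entropy
# term (subtracted, hence maximized) keeps the samples uncertain.
ce = F.cross_entropy(old_model(fake), probs.argmax(dim=1))
loss = ce - 0.5 * entropy
opt.zero_grad(); loss.backward(); opt.step()
```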

DMM-Net: Differentiable Mask-Matching Network for Video Object Segmentation
Xiaohui Zeng*, Renjie Liao*, Li Gu, Yuwen Xiong, Sanja Fidler, Raquel Urtasun
ICCV, 2019.
Paper / Code

Proposed the differentiable mask-matching network (DMM-Net) for semi-supervised video object segmentation, where the initial object masks are provided.
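A sketch of differentiable mask matching using a Sinkhorn-style soft assignment; Sinkhorn is one common relaxation that lets gradients flow through the matching, though the paper's exact solver may differ, and the mask tensors below are made up:

```python
import torch

def soft_match(cost, n_iters=20, tau=0.1):
    """Entropic (Sinkhorn-style) soft assignment from a cost matrix."""
    log_p = -cost / tau
    for _ in range(n_iters):
        log_p = log_p - log_p.logsumexp(dim=1, keepdim=True)  # row normalize
        log_p = log_p - log_p.logsumexp(dim=0, keepdim=True)  # column normalize
    return log_p.exp()

proposals = torch.rand(6, 32 * 32)   # 6 candidate masks (flattened, soft values)
templates = torch.rand(3, 32 * 32)   # 3 tracked object templates

# Soft-IoU cost between every proposal/template pair.
inter = proposals @ templates.t()
union = proposals.sum(1, keepdim=True) + templates.sum(1) - inter
cost = 1.0 - inter / union           # (6, 3)

assign = soft_match(cost)            # (6, 3) soft assignment, fully differentiable
```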

Adversarial Distillation of Bayesian Neural Network Posteriors
Kuan-Chieh Wang, Paul Vicol, James Lucas, Li Gu, Roger Grosse, Richard Zemel
ICML, 2018.
Paper / Code / Slides / Poster

Proposed an efficient framework for using a small Generative Adversarial Network (GAN) to store MCMC samples of the posterior from a large Bayesian Neural Network.
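A minimal sketch of the compression idea, assuming the BNN weights are flattened into vectors and the MCMC samples are already collected (all architectures and dimensions are illustrative):

```python
# Train a small GAN so that generator samples mimic stored MCMC samples of
# the BNN posterior; afterwards the generator replaces the sample buffer.
import torch
import torch.nn as nn

dim = 128                                  # flattened BNN weight dimension (toy)
mcmc_samples = torch.randn(1000, dim)      # stand-in for saved posterior samples

G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, dim))
D = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 1))
g_opt = torch.optim.Adam(G.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    real = mcmc_samples[torch.randint(0, 1000, (32,))]
    fake = G(torch.randn(32, 16))
    # Discriminator: real posterior samples vs. generated weight vectors.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()
    # Generator: fool the discriminator.
    g_loss = bce(D(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# At test time, fresh "posterior" weights are cheap: w = G(torch.randn(1, 16))
```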