Welcome to Wei Huang’s Homepage
I am a Research Scientist in the Deep Learning Theory Team at RIKEN Center for Advanced Intelligence Project (AIP), working with Prof. Taiji Suzuki. Prior to this role, I was a Postdoctoral Researcher in the same team. Before that, I worked as a Research Associate advised by Dr. Xin Cao at UNSW and collaborated with A/Prof. Jie Yin on graph neural networks at USYD.
I obtained my Ph.D. degree at the Faculty of Engineering and Information Technology, University of Technology Sydney, working with Prof. Richard Xu. I obtained my Master's degree at the Department of Modern Physics, University of Science and Technology of China, supervised by Prof. Youjin Deng.
I maintain a comprehensive and popular blog covering the latest deep learning theory work on Zhihu, as well as a feature learning theory reading list on GitHub.
Please feel free to contact me (weihuang[dot]uts[at]gmail[dot]com) if you would like to collaborate or discuss deep learning theory and its applications.
My office is located at the University of Tokyo, Hongo Campus. If you would like to reach out, please don’t hesitate to do so.
Research Interest
Theoretically understanding deep learning and large foundation models from the perspectives of expressivity, trainability, and generalization.
Feature Learning; Implicit Regularization/Bias; Neural Tangent Kernel;
Applications powered by deep learning theory:
Large Foundation Models; Graph Neural Networks; Computer Vision
News
09/2024 Seven papers are accepted by NeurIPS 2024: On the Comparison between Multi-modal and Single-modal Contrastive Learning; Provable and Efficient Dataset Distillation for Kernel Ridge Regression; Federated Learning from Vision-Language Foundation Models: Theoretical Analysis and Method; Unveil Benign Overfitting for Transformer in Vision: Training Dynamics, Convergence, and Generalization; Provably Transformers Harness Multi-Concept Word Semantics for Efficient In-Context Learning; On Mesa-Optimization in Autoregressively Trained Transformers: Emergence and Capability; SLTrain: a sparse plus low rank approach for parameter and memory efficient pretraining
05/2024 One paper is accepted by KDD 2024 Research Track, The Heterophily Snowflake Hypothesis: Training and Empowering GNN for Heterophilic Graphs.
05/2024 I will be attending ICLR 2024 from 06/05/2024 to 12/05/2024 in Vienna. Feel free to have a chat if you're there.
05/2024 Two papers are accepted by ICML 2024: Diffusion Models Demand Contrastive Guidance for Adversarial Purification to Advance; Provably Neural Active Learning Succeeds via Prioritizing Perplexing Samples
02/2024 One paper is accepted by CVPR 2024, Global and Local Prompts Cooperation via Optimal Transport for Federated Learning
01/2024 Two papers are accepted by ICLR 2024, Understanding Convergence and Generalization in Federated Learning through Feature Learning Theory; Graph Lottery Ticket Automated
12/2023 I will be attending NeurIPS 2023 from 10/12/2023 to 17/12/2023 at the New Orleans Ernest N. Morial Convention Center. Feel free to have a chat if you're there.
12/2023 One paper is accepted by IEEE Transactions on Image Processing, DMMG: Dual Min-Max Games for Self-Supervised Skeleton-Based Action Recognition
10/2023 I will be attending IBIS2023 from 29/10/2023 to 01/11/2023 in Kitakyushu. Feel free to have a chat if you're there.
09/2023 Three papers are accepted by NeurIPS 2023: Understanding and Improving Feature Learning for Out-of-Distribution Generalization; Analyzing Generalization of Neural Networks through Loss Path Kernels; Fed-CO_2: Cooperation of Online and Offline Models for Severe Data Heterogeneity in Federated Learning
08/2023 One paper is accepted by Transactions on Machine Learning Research, Single-Pass Contrastive Learning Can Work for Both Homophilic and Heterophilic Graph
08/2023 I will be attending ICIAM 2023 at Waseda University, Tokyo. Feel free to have a chat if you’re there.
07/2023 I had the privilege of giving a contributed talk at the ICML 2023 Workshop on High-Dimensional Learning Dynamics in Honolulu, Hawaii. It was a great experience sharing my work with fellow researchers. Please find my slides here.
06/2023 One paper is accepted by the High-dimensional Learning Dynamics Workshop (ICML 2023), Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective
05/2023 One paper is accepted by Transactions on Machine Learning Research, Analyzing Deep PAC-Bayesian Learning with Neural Tangent Kernel: Convergence, Analytic Generalization Bound, and Efficient Hyperparameter Selection
05/2023 One paper is accepted by AutoML 2023, No Free Lunch in Neural Architectures? A Joint Analysis of Expressivity, Convergence, and Generalization
03/2023 One paper is accepted by ICLR 2023 workshop, Towards Understanding Feature Learning in Out-of-Distribution Generalization
11/2022 I arrived in Tokyo, Japan!
09/2022 Four papers are accepted by NeurIPS 2022, Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis; Deep Active Learning by Leveraging Training Dynamics; Interpreting Operation Selection in Differentiable Architecture Search: A Perspective from Influence-Directed Explanations; Weighted Mutual Learning with Diversity-Driven Model Compression
08/2022 One paper, Pruning graph neural networks by evaluating edge properties, is accepted by the Knowledge-Based Systems journal
05/2022 I am invited to serve as a reviewer for ICLR-2023
03/2022 I am invited to serve as a reviewer for NeurIPS-2022
01/2022 Two papers are accepted by ICLR 2022, Towards Deepening Graph Neural Networks: A GNTK-based Optimization Perspective; Auto-scaling Vision Transformers without Training
12/2021 I am invited to serve as a reviewer for ICML-2022
11/2021 I am invited to serve as a reviewer for CVPR-2022
09/2021 One paper is accepted by NeurIPS 2021, On the Equivalence between Neural Network and Support Vector Machine
07/2021 I am invited to serve as a Senior Program Committee member for AAAI-2022
06/2021 I am invited to serve as a reviewer for ICLR-2022
04/2021 One paper is accepted by IJCAI 2021, On the Neural Tangent Kernel of Deep Networks with Orthogonal Initialization
03/2021 I am invited to serve as a reviewer for NeurIPS-2021
12/2020 I am invited to serve as a reviewer for ICML-2021