Ji Lin

Contact:

jilin.eecs AT gmail

I am a research scientist at OpenAI.

Previously, I completed my PhD at MIT EECS advised by Prof. Song Han. Before that, I received my B.Eng. in Electronic Engineering from Tsinghua University, and M.Sc. in EECS from MIT. I've interned/worked at Adobe Research, OmniML, and NVIDIA Research.

Publications [Full List]

* indicates equal contribution


VILA: On Pre-training for Visual Language Models
Ji Lin*, Hongxu Yin*, Wei Ping, Yao Lu, Pavlo Molchanov, Andrew Tao, Huizi Mao, Jan Kautz, Mohammad Shoeybi, Song Han
AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
Ji Lin*, Jiaming Tang*, Haotian Tang, Shang Yang, Xingyu Dang, Song Han
Best Paper Award
SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models
Guangxuan Xiao*, Ji Lin*, Mickael Seznec, Julien Demouth, Song Han
PockEngine: Sparse and Efficient Fine-tuning in a Pocket
Ligeng Zhu, Lanxiang Hu, Ji Lin, Wei-Ming Chen, Wei-Chen Wang, Chuang Gan, Song Han
Efficient Spatially Sparse Inference for Conditional GANs and Diffusion Models
Muyang Li, Ji Lin, Chenlin Meng, Stefano Ermon, Song Han, Jun-Yan Zhu
On-Device Training Under 256KB Memory
Ji Lin*, Ligeng Zhu*, Wei-Ming Chen, Wei-Chen Wang, Chuang Gan, Song Han
Network Augmentation for Tiny Deep Learning
Han Cai, Chuang Gan, Ji Lin, Song Han
MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning
Ji Lin, Wei-Ming Chen, Han Cai, Chuang Gan, Song Han
Anycost GANs for Interactive Image Synthesis and Editing
Ji Lin, Richard Zhang, Frieder Ganz, Song Han, Jun-Yan Zhu
MCUNet: Tiny Deep Learning on IoT Devices
Ji Lin, Wei-Ming Chen, Yujun Lin, John Cohn, Chuang Gan, Song Han
NeurIPS 2020 / arXiv / Project Page / Code / Demo Video
Differentiable Augmentation for Data-Efficient GAN Training
Shengyu Zhao, Zhijian Liu, Ji Lin, Jun-Yan Zhu, Song Han
NeurIPS 2020 / arXiv / Project Page / Code / Slides / Colab Tutorial
Press: VentureBeat
GAN Compression: Efficient Architectures for Interactive Conditional GANs
Muyang Li, Ji Lin, Yaoyao Ding, Zhijian Liu, Jun-Yan Zhu, Song Han
APQ: Joint Search for Network Architecture, Pruning and Quantization Policy
Tianzhe Wang, Kuan Wang, Han Cai, Ji Lin, Zhijian Liu, Hanrui Wang, Yujun Lin, Song Han
AutoML for Architecting Efficient and Specialized Neural Networks
Han Cai*, Ji Lin*, Yujun Lin*, Zhijian Liu*, Kuan Wang*, Tianzhe Wang*, Ligeng Zhu*, Song Han
TSM: Temporal Shift Module for Efficient Video Understanding
Ji Lin, Chuang Gan, Song Han
ICCV 2019 / arXiv
Training Kinetics in 15 Minutes: Large-scale Distributed Training on Videos
Ji Lin, Chuang Gan, Song Han
HAQ: Hardware-Aware Automated Quantization
Kuan Wang*, Zhijian Liu*, Yujun Lin*, Ji Lin, Song Han
Hardware-Centric AutoML for Mixed-Precision Quantization
Kuan Wang*, Zhijian Liu*, Yujun Lin*, Ji Lin, Song Han
AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Runtime Neural Pruning
Runtime Network Routing for Efficient Image Classification

Academic Service

  • Conference reviewer: ICLR, ICML, NeurIPS, CVPR, ICCV, ECCV, SIGGRAPH, IJCAI, AAAI, ACMMM, etc.
  • Journal reviewer: T-PAMI, JMLR, T-MM, etc.