
ICCL’s paper accepted to CVPR 2021

Activities June 9, 2021

A paper by UNIST EE student Sangyun Oh and Prof. Jongeun Lee has been accepted to CVPR 2021.

Authors: Sangyun Oh, Hyeonuk Sim, Sugil Lee, Jongeun Lee (corresponding author)
Title: Automated Log-Scale Quantization for Low-Cost Deep Neural Networks
Conference: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

Abstract: Quantization plays an important role in deep neural network (DNN) hardware. In particular, logarithmic quantization has multiple advantages for DNN hardware implementations, and its weakness in terms of lower performance at high precision compared with linear quantization has recently been remedied by what we call selective two-word logarithmic quantization (STLQ). However, there is a lack of training methods designed for STLQ or even logarithmic quantization in general. In this paper, we propose a novel STLQ-aware training method, which significantly outperforms the previous state-of-the-art training method for STLQ. Moreover, our training results demonstrate that with our new training method, STLQ applied to the weight parameters of ResNet-18 can achieve the same level of performance as the state-of-the-art quantization method APoT at 3-bit precision. We also apply our method to various DNNs in image enhancement and semantic segmentation, showing competitive results.
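
For readers unfamiliar with the terminology, the sketch below illustrates the basic idea behind logarithmic (power-of-two) quantization and the "two-word" scheme mentioned in the abstract. It is a rough illustration only, not the authors' method: the exponent range, rounding rule, and the criterion for selecting which weights receive a second term are all assumptions made for this example.

```python
# Illustrative sketch of logarithmic quantization and a selective
# two-word variant. This is NOT the paper's implementation; the
# exponent range, rounding, and selection rule are assumptions.
import numpy as np

def log2_round(x, min_exp=-6, max_exp=0):
    """Round |x| to the nearest power of two within a small exponent range."""
    exp = np.clip(np.round(np.log2(np.maximum(np.abs(x), 2.0 ** min_exp))),
                  min_exp, max_exp)
    # sign(0) == 0, so exact zeros remain zero.
    return np.sign(x) * (2.0 ** exp)

def stlq_sketch(w, ratio=0.1, min_exp=-6, max_exp=0):
    """Every weight gets one signed power-of-two term; the `ratio` fraction
    of weights with the largest residual error gets a second term
    (an illustrative selection criterion)."""
    q1 = log2_round(w, min_exp, max_exp)
    residual = w - q1
    k = int(np.ceil(ratio * w.size))
    idx = np.argsort(np.abs(residual).ravel())[-k:]
    q2 = np.zeros_like(w).ravel()
    q2[idx] = log2_round(residual.ravel()[idx], min_exp, max_exp)
    return q1 + q2.reshape(w.shape)

w = np.random.randn(4, 4) * 0.5
print(stlq_sketch(w, ratio=0.25))
```

The appeal of such power-of-two representations for hardware is that multiplications reduce to shifts; the paper's contribution is a training method that preserves accuracy under this representation.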

CVPR is a premier conference in computer vision and deep learning, and it is ranked 1st among engineering and computer science conferences according to Google Scholar.