
CSQ: Growing Mixed-Precision Quantization Scheme with Bi-level Continuous Sparsification

  • Lirui Xiao
  • Huanrui Yang
  • Zhen Dong
  • Kurt Keutzer
  • Li Du
  • Shanghang Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Mixed-precision quantization has been applied to deep neural networks (DNNs) as it leads to significantly better efficiency-accuracy tradeoffs than uniform quantization. Meanwhile, determining the exact precision of each layer remains challenging. Previous attempts at bit-level regularization and pruning-based dynamic precision adjustment during training suffer from noisy gradients and unstable convergence. In this work, we propose Continuous Sparsification Quantization (CSQ), a bit-level training method that searches for mixed-precision quantization schemes with improved stability. CSQ stabilizes bit-level mixed-precision training with a bi-level gradual continuous sparsification: on the bit values of the quantized weights, and on the bit selection that determines the quantization precision of each layer. The continuous sparsification scheme enables fully differentiable training without gradient approximation while yielding an exactly quantized model in the end. A budget-aware regularization of total model size enables the dynamic growth and pruning of each layer's precision toward a mixed-precision quantization scheme of the desired size. Extensive experiments show that CSQ achieves a better efficiency-accuracy tradeoff than previous methods on multiple models and datasets.
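To make the bi-level idea in the abstract concrete, the sketch below shows one plausible reading of it in PyTorch: each weight is built from per-bit soft gates (inner level), and a second set of gates over bit positions controls the layer's precision (outer level). Annealing the temperature `beta` drives both gate sets toward exact 0/1 values. All names (`BitLevelWeight`, `soft_gate`, `s_bits`, `s_prec`) are hypothetical illustrations, not the authors' implementation.

```python
import torch


def soft_gate(s, beta):
    # Continuous relaxation of a binary gate: as beta -> infinity,
    # sigmoid(beta * s) approaches an exact 0/1 step function.
    return torch.sigmoid(beta * s)


class BitLevelWeight(torch.nn.Module):
    """Hypothetical sketch of a bit-level weight with bi-level gates.

    - s_bits: per-weight logits for each bit value (inner level)
    - s_prec: per-bit-position logits deciding whether that bit is
      used at all in this layer (outer level -> layer precision)
    """

    def __init__(self, shape, max_bits=8):
        super().__init__()
        self.max_bits = max_bits
        self.s_bits = torch.nn.Parameter(torch.randn(max_bits, *shape))
        self.s_prec = torch.nn.Parameter(torch.zeros(max_bits))

    def forward(self, beta):
        bits = soft_gate(self.s_bits, beta)   # soft bit values in (0, 1)
        keep = soft_gate(self.s_prec, beta)   # soft selection of bit positions
        powers = 2.0 ** torch.arange(self.max_bits)
        # Gated binary expansion: sum_i keep_i * bit_i * 2^i per weight.
        scale = (keep * powers).view(-1, *[1] * (self.s_bits.dim() - 1))
        return (bits * scale).sum(dim=0)

    def effective_bits(self, beta):
        # Expected precision of this layer; summing this over layers gives
        # a differentiable model-size term for a budget-aware penalty.
        return soft_gate(self.s_prec, beta).sum()
```

In training, one would anneal `beta` upward over epochs and add a penalty such as `lambda * sum(layer.effective_bits(beta) for layer in layers)` to the task loss, so precision can grow or shrink per layer toward the target budget. This is a sketch under stated assumptions, not the paper's method in detail.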

Original language: English (US)
Title of host publication: 2023 60th ACM/IEEE Design Automation Conference, DAC 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350323481
State: Published - 2023
Externally published: Yes
Event: 60th ACM/IEEE Design Automation Conference, DAC 2023 - San Francisco, United States
Duration: Jul 9, 2023 - Jul 13, 2023

Publication series

Name: Proceedings - Design Automation Conference
Volume: 2023-July
ISSN (Print): 0738-100X

Conference

Conference: 60th ACM/IEEE Design Automation Conference, DAC 2023
Country/Territory: United States
City: San Francisco
Period: 7/9/23 - 7/13/23

Keywords

  • continuous sparsification
  • efficient neural network
  • quantization

ASJC Scopus subject areas

  • Computer Science Applications
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Modeling and Simulation
