2021 IEEE International Conference on Acoustics, Speech and Signal Processing

6-11 June 2021 • Toronto, Ontario, Canada

Extracting Knowledge from Information

Technical Program

Paper Detail

Paper ID MLSP-18.4
Paper Title OPTIMAL SELECTION OF MATRIX SHAPE AND DECOMPOSITION SCHEME FOR NEURAL NETWORK COMPRESSION
Authors Yerlan Idelbayev, Miguel Á. Carreira-Perpiñán, University of California, Merced, United States
Session MLSP-18: Matrix Factorization and Applications
Location Gather.Town
Session Time: Wednesday, 09 June, 14:00 - 14:45
Presentation Time: Wednesday, 09 June, 14:00 - 14:45
Presentation Poster
Topic Machine Learning for Signal Processing: [MLR-MFC] Matrix factorizations/completion
Abstract When applying the low-rank decomposition to neural networks, tensor-shaped weights need to be reshaped into a matrix first. While many matrix reshapes are possible, some of them induce a low-rank decomposition scheme that can be more efficiently implemented as a sequence of layers. This poses the following problem: how should one select both the matrix reshape and the associated low-rank decomposition scheme in order to compress a neural network so that its implementation is as efficient as possible? We formulate this problem as a mixed-integer optimization over the weights, ranks, and decomposition schemes; and we provide an efficient alternating optimization algorithm involving two simple steps: a step over the weights of the neural network (solved by SGD), and a step over the ranks and decomposition schemes (solved by an SVD). Our algorithm automatically selects the most suitable ranks and decomposition schemes to efficiently reduce compression costs (e.g., FLOPs) of various networks.
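
The ranks-and-schemes step described in the abstract can be illustrated with a minimal Python sketch: for each candidate matrix reshape of a weight tensor, truncate its SVD at every rank and keep the (reshape, rank) pair with the best trade-off between approximation error and compute cost. The function names, the candidate reshapes, the penalty weight, and the FLOP cost model r·(m+n) below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def flops_cost(m, n, r):
    """Assumed FLOP model for a rank-r factorization of an m x n matrix:
    an m x r layer followed by an r x n layer."""
    return r * (m + n)

def select_scheme(weight, shapes, penalty=1e-4):
    """Pick the reshape and rank minimizing ||W - W_r||_F^2 + penalty * FLOPs.

    `shapes` is a list of candidate matrix reshapes of the weight tensor;
    both the candidates and the penalty are illustrative assumptions.
    """
    best = None
    for shape in shapes:
        W = weight.reshape(shape)
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        # err_tail[r] = sum of the squared singular values discarded when
        # truncating at rank r, i.e. the squared Frobenius error.
        err_tail = np.concatenate([np.cumsum((s ** 2)[::-1])[::-1], [0.0]])
        for r in range(1, len(s) + 1):
            obj = err_tail[r] + penalty * flops_cost(shape[0], shape[1], r)
            if best is None or obj < best[0]:
                # keep the two low-rank factors A (m x r) and B (r x n)
                best = (obj, shape, r, U[:, :r] * s[:r], Vt[:r])
    return best

# Example: a 4D convolutional weight with two hand-picked candidate reshapes.
w = np.random.randn(64, 32, 3, 3)
obj, shape, rank, A, B = select_scheme(w, shapes=[(64, 32 * 9), (64 * 9, 32)])
print("chosen reshape:", shape, "chosen rank:", rank)
```

In the full alternating algorithm of the paper, a step like this would alternate with SGD updates of the network weights; the sketch above only shows how an SVD can score each reshape/rank combination under a cost penalty.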