ExpandNet: Training Compact Networks by Linear Expansion
Shuxuan Guo, J. Álvarez, M. Salzmann
2018
2 Citations
TLDR
This paper proposes to expand each linear layer of a compact network into multiple linear layers without adding any nonlinearity, so that the expanded network can be algebraically compressed back to the compact one after training, yet consistently outperforms it.
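The key observation behind this collapse is that composing linear maps without an intervening nonlinearity is itself a linear map, so the product of the expanded weight matrices yields an equivalent single layer. A minimal NumPy sketch (dimensions and variable names are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical layer sizes for illustration only.
rng = np.random.default_rng(0)
in_dim, hidden_dim, out_dim = 8, 32, 4

# "Expanded" layer: two linear maps with no nonlinearity between them.
W1 = rng.standard_normal((hidden_dim, in_dim))
W2 = rng.standard_normal((out_dim, hidden_dim))

# The expanded pair collapses algebraically into one compact layer.
W_compact = W2 @ W1  # shape (out_dim, in_dim)

x = rng.standard_normal(in_dim)
expanded_out = W2 @ (W1 @ x)   # forward pass through the expanded layers
compact_out = W_compact @ x    # forward pass through the collapsed layer

# Both forward passes produce identical outputs (up to float rounding).
assert np.allclose(expanded_out, compact_out)
```

Because the collapse is exact, the compact network inherits whatever the expanded network learned, at no extra inference cost.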