
Building Efficient ConvNets using Redundant Feature Pruning

Authors
Babajide O. Ayinde, Jacek M. Zurada

This paper presents an efficient technique to prune deep and/or wide convolutional neural network models by eliminating redundant features (or filters). Previous studies have shown that over-sized deep neural network models tend to produce many redundant features that are either shifted versions of one another or are very similar and show little or no variation, thus resulting in filtering redundancy. We propose to prune these redundant features, along with their connecting feature maps, according to their differentiation and based on their relative cosine distances in the feature space, thus yielding a smaller network with reduced inference cost and competitive performance. We empirically show on select models and the CIFAR-10 dataset that inference costs can be reduced by 40% for VGG-16, 27% for ResNet-56, and 39% for ResNet-110.
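The redundancy criterion described in the abstract can be illustrated with a short sketch: flatten each filter of a convolutional layer, compute pairwise cosine distances between the flattened weights, and mark a filter redundant when its distance to an already-kept filter falls below a threshold. This is a minimal illustration under stated assumptions, not the authors' exact algorithm; the function name, the threshold value `tau`, and the greedy keep-first strategy are hypothetical choices for demonstration.

```python
import numpy as np

def find_redundant_filters(weights, tau=0.15):
    """Flag filters whose cosine distance to a kept filter falls below tau.

    weights: array of shape (num_filters, ...) holding one conv layer's filters.
    tau: cosine-distance threshold (hypothetical value, not taken from the paper).
    Returns the set of filter indices proposed for pruning.
    """
    # Flatten each filter into a vector and normalize to unit length,
    # so that dot products give cosine similarities.
    flat = weights.reshape(weights.shape[0], -1)
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.clip(norms, 1e-12, None)

    cos_dist = 1.0 - unit @ unit.T  # pairwise cosine distances

    to_prune = set()
    n = weights.shape[0]
    for i in range(n):
        if i in to_prune:
            continue  # filter i was already absorbed by an earlier filter
        for j in range(i + 1, n):
            # j is nearly a duplicate of the kept filter i: mark it redundant.
            if j not in to_prune and cos_dist[i, j] < tau:
                to_prune.add(j)
    return to_prune
```

In a full pipeline, the flagged filters and their connecting feature maps in the next layer would be removed, followed by fine-tuning to recover accuracy.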
