
Optimizing Recurrent Neural Networks Architectures under Time Constraints

Authors
Junqi Jin, Ziang Yan, Kun Fu, Nan Jiang, Changshui Zhang

The architecture of a recurrent neural network (RNN) is a key factor influencing its performance. We propose algorithms to optimize hidden sizes under a running-time constraint. We convert the discrete optimization into a subset selection problem. Through novel transformations, the objective function becomes submodular and the constraint becomes supermodular. A greedy algorithm with bounds is suggested to solve the transformed problem, and we show how the transformations influence the bounds. To speed up optimization, surrogate functions are proposed which balance exploration and exploitation. Experiments show that our algorithms can find models that are more accurate or faster than manually tuned state-of-the-art models and random search. We also compare popular RNN architectures using our algorithms.
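The abstract does not spell out the greedy procedure, but the general pattern it invokes, greedily maximizing a monotone submodular objective under a budget-style constraint, is standard. Below is a minimal sketch of cost-benefit greedy selection in Python; the coverage objective, per-item COSTS, and BUDGET are hypothetical stand-ins for the paper's transformed objective and running-time constraint, not its actual definitions.

# Hypothetical ground set: candidate "units" we might add to a model.
GROUND_SET = list(range(10))

# Hypothetical monotone submodular objective: set coverage.
# Each item covers a few "features"; f(S) = size of the union covered by S.
COVERS = {i: set(range(i, i + 3)) for i in GROUND_SET}

def coverage(subset):
    """f(S): number of features covered by S (monotone submodular)."""
    covered = set()
    for i in subset:
        covered |= COVERS[i]
    return len(covered)

# Hypothetical per-item costs standing in for running time.
COSTS = {i: 1.0 + 0.2 * i for i in GROUND_SET}
BUDGET = 6.0

def greedy_under_budget(f, costs, budget, ground_set):
    """Cost-benefit greedy: repeatedly add the item with the best
    marginal-gain-to-cost ratio that still fits the budget."""
    selected = []
    spent = 0.0
    remaining = set(ground_set)
    while remaining:
        best, best_ratio = None, 0.0
        base = f(selected)
        for i in remaining:
            if spent + costs[i] > budget:
                continue  # adding item i would violate the budget
            gain = f(selected + [i]) - base
            ratio = gain / costs[i]
            if ratio > best_ratio:
                best, best_ratio = i, ratio
        if best is None:
            break  # no affordable item offers positive gain
        selected.append(best)
        spent += costs[best]
        remaining.discard(best)
    return selected, spent

if __name__ == "__main__":
    chosen, cost = greedy_under_budget(coverage, COSTS, BUDGET, GROUND_SET)
    print(f"selected={chosen}, cost={cost:.1f}, f(S)={coverage(chosen)}")

Greedy methods of this kind come with classical approximation guarantees for monotone submodular maximization (e.g., 1 - 1/e under a cardinality constraint), which is the flavor of bound the abstract alludes to.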
