Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks