Comparison of ReLU and linear saturated activation functions in neural network for universal approximation | IEEE Conference Publication