tensorflow · neural-network · activation-function

What activation function is used in the nce_loss?


I am stuck on the nce_loss function in the word2vec model. I want to figure out which activation function it uses among all those listed here:

These include smooth nonlinearities (sigmoid, tanh, elu, softplus, and softsign), continuous but not everywhere differentiable functions (relu, relu6, crelu and relu_x), and random regularization (dropout).

I have searched inside this function and elsewhere but could not find an answer. My guess is that it is one of the relu* series. What can I try next?


Solution

  • None of those. NCE (noise-contrastive estimation) turns the problem into a binary classification between real data pairs and noise-sampled pairs, so tf.nn.nce_loss applies a sigmoid cross-entropy loss to its logits (internally it calls tf.nn.sigmoid_cross_entropy_with_logits) rather than any of the activation functions you listed.
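To make the idea concrete, here is a minimal NumPy sketch of that computation. It is an illustration, not TensorFlow's actual implementation: the real nce_loss also applies a correction for the sampling distribution of the negative candidates, which is omitted here for clarity.

```python
import numpy as np

def sigmoid_cross_entropy_with_logits(labels, logits):
    # Numerically stable form of -labels*log(sigmoid(x)) - (1-labels)*log(1-sigmoid(x)):
    # max(x, 0) - x*z + log(1 + exp(-|x|))
    return np.maximum(logits, 0) - logits * labels + np.log1p(np.exp(-np.abs(logits)))

def nce_loss_sketch(true_logit, sampled_logits):
    # NCE labels the true (input, target) pair as 1 and each
    # noise-sampled pair as 0, then sums the sigmoid cross-entropy.
    loss = sigmoid_cross_entropy_with_logits(1.0, true_logit)
    loss += sigmoid_cross_entropy_with_logits(
        np.zeros_like(sampled_logits), sampled_logits).sum()
    return loss

# A model that scores the true pair high and the noise pairs low
# incurs a smaller loss than one that does the opposite.
good = nce_loss_sketch(5.0, np.array([-5.0, -5.0]))
bad = nce_loss_sketch(-5.0, np.array([5.0, 5.0]))
```

So there is no hidden tanh or relu inside nce_loss; the sigmoid only appears implicitly as part of the binary cross-entropy on the logits.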