Tags: python, deep-learning

Avoid Division by Zero in Sigmoid function


Using https://nbviewer.org/github/ImadDabbura/blog-posts/blob/master/notebooks/Coding-Neural-Network-Forwad-Back-Propagation.ipynb as a reference, I am trying to build a vanilla deep learning implementation. I wonder how division by zero is avoided in the sigmoid function.

import numpy as np

def sigmoid(Z):
    """
    Computes the sigmoid of Z element-wise.

    Arguments
    ---------
    Z : array
        output of affine transformation.

    Returns
    -------
    A : array
        post activation output.
    Z : array
        output of affine transformation.
    """
    A = 1 / (1 + np.exp(-Z))

    return A, Z

Solution

  • I wonder how division by zero is avoided in the sigmoid function.

    For 1 + exp(-Z) to be zero, exp(-Z) would have to equal -1, which cannot happen: exp is strictly positive for every real argument, so the denominator is always greater than 1. In the context of neural networks, Z is a real number (or an array of real numbers).
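A quick sketch to illustrate the claim numerically (the function body is the same one-liner as in the question). Note that for very large negative Z, NumPy's exp overflows to +inf rather than producing a division by zero: the sigmoid simply saturates at 0.

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

# exp(-Z) > 0 for every real Z, so the denominator 1 + exp(-Z) is always > 1.
Z = np.array([-10.0, 0.0, 10.0])
denominator = 1 + np.exp(-Z)
print(np.all(denominator > 1))  # True

# Even for extreme inputs, exp(-Z) overflows to +inf instead of ever
# reaching -1; 1 / (1 + inf) evaluates to 0.0, not a ZeroDivisionError.
with np.errstate(over="ignore"):  # silence the overflow RuntimeWarning
    print(sigmoid(np.array([-1000.0])))  # [0.]
```

So the only numerical caveat is overflow in exp for large |Z|, which NumPy handles by saturating the output, never by dividing by zero.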