neural-network, fpga, hyperbolic-function

Which is the fastest tanh approximation to implement on an FPGA?


I want to implement a neural net on an FPGA, and I need a fast function to calculate (approximate) tanh.

Accuracy to two digits after the decimal point will be enough.


Solution

  • https://www.planetmath.org/taylorseriesofhyperbolicfunctions This looks fast enough, depending on your needs. You can keep just the first few terms of the Taylor series for speed and discard the higher-order ones, e.g. (Python syntax):

    x-1/3*x**3+2/15*x**5
    

    If you also need the derivative, it's easy to calculate for polynomials. (AFAIR you need it for back propagation.)
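
    A minimal sketch of this idea in Python (function names are my own). One caveat worth knowing: with only three terms, the series hits ~2-decimal accuracy only near zero (roughly |x| < 0.8); for larger inputs you would add terms, clamp, or use a piecewise scheme on the FPGA.

    ```python
    import math

    def tanh_taylor(x):
        # Truncated Taylor series of tanh around 0 (first three terms).
        # Accurate to about 2 decimal places only for small |x|;
        # the polynomial diverges from tanh as |x| grows.
        return x - x**3 / 3 + 2 * x**5 / 15

    def tanh_taylor_deriv(x):
        # Term-by-term derivative of the truncated polynomial,
        # the part backpropagation needs.
        return 1 - x**2 + (2.0 / 3.0) * x**4

    # Quick accuracy check against the real tanh near zero:
    print(abs(tanh_taylor(0.5) - math.tanh(0.5)))  # well under 0.01
    ```

    In hardware the same polynomial reduces to a handful of multiplies and adds (fewer still with Horner's scheme), which is what makes it attractive for an FPGA.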