Tags: haskell, automatic-differentiation, hmatrix

How to get more performance out of automatic differentiation?


I am having a hard time optimizing a program that relies on `ad`'s `conjugateGradientDescent` function for most of its work.

Basically my code is a translation of an old paper's code, written in Matlab and C. I have not measured it, but that code runs at several iterations per second. Mine is on the order of minutes per iteration ...
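For reference, a minimal, self-contained use of `ad`'s `conjugateGradientDescent` looks like the sketch below. The quadratic objective here is a stand-in of my own invention, not the paper's actual cost function:

```haskell
import Numeric.AD (conjugateGradientDescent)

-- Toy convex objective with its minimum at (3, -2); the real
-- program would plug in the paper's cost function instead.
quadratic :: Num a => [a] -> a
quadratic [x, y] = (x - 3) ^ 2 + (y + 2) ^ 2
quadratic _      = error "expects exactly two parameters"

main :: IO ()
main = do
  -- conjugateGradientDescent yields a stream of successively
  -- better parameter vectors; inspect an early iterate.
  let iterates = conjugateGradientDescent quadratic [0, 0 :: Double]
  print (last (take 20 iterates))
```

The objective must stay polymorphic in its number type (`Num a => [a] -> a`) so the library can instantiate it at its internal AD types.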

The code is available in these repositories:

The code in question can be run by following these commands:

$ cd aer-utils
$ cabal sandbox init
$ cabal sandbox add-source ../aer
$ cabal run learngabors

Using GHC's profiling facilities I have confirmed that the descent is in fact the part that takes most of the time:

Flamegraph of one iteration

(interactive version here: https://dl.dropboxusercontent.com/u/2359191/learngabors.svg)

`+RTS -s` is telling me that productivity is quite low:

Productivity  33.6% of total user, 33.6% of total elapsed

From what I have gathered, there are two things that might lead to higher performance:

This is probably a question that is too broad to be answered on SO, so if you are willing to help me out here, feel free to contact me on GitHub.


Solution

  • You are running into pretty much the worst-case scenario for the current ad library here.

FWIW, you won't be able to use the existing ad classes/types with "matrix/vector ad". It'd be a fairly large engineering effort; see https://github.com/ekmett/ad/issues/2

As for why you can't unbox: conjugateGradient requires the ability to use Kahn mode or two levels of forward mode on your functions. The former precludes it from working with unboxed vectors, as the data types carry syntax trees and can't be unboxed. For various technical reasons I haven't figured out how to make it work with a fixed-size 'tape' like the standard Reverse mode.
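To make the boxing constraint concrete, here is a small illustration of my own (not code from the questioner's repository): the AD number types carry graph structure, so they work with boxed `Data.Vector` (which is `Traversable`) but have no `Unbox` instance for `Data.Vector.Unboxed`:

```haskell
import Numeric.AD (grad)
import qualified Data.Vector as V
-- import qualified Data.Vector.Unboxed as U
-- U.Vector would NOT typecheck below: reverse-mode AD numbers carry
-- a syntax tree / tape reference, so they have no Unbox instance.

-- Works over any Traversable container of a polymorphic numeric type.
sumOfSquares :: Num a => V.Vector a -> a
sumOfSquares = V.sum . V.map (^ 2)

main :: IO ()
main = print (grad sumOfSquares (V.fromList [1, 2, 3 :: Double]))
-- the gradient of sum x_i^2 is 2 * x_i
```

Swapping in an unboxed vector makes the `grad` call fail at compile time, which is exactly the constraint described above.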

    I think the "right" answer here is for us to sit down and figure out how to get matrix/vector AD right and integrated into the package, but I confess I'm timesliced a bit too thinly right now to give it the attention it deserves.

    If you get a chance to swing by #haskell-lens on irc.freenode.net I'd be happy to talk about designs in this space and offer advice. Alex Lang has also been working on ad a lot, is often present there, and may have ideas.