I'm studying multilayer perceptrons and wrote a simple net for classifying points in 2D space. The net is trained with the back-propagation algorithm with momentum.
Unfortunately, even though the local (per-sample) error goes to zero, the global error stays very high, and I cannot understand why. The global error printed to the console ranges over [100, 150]. So my main question is: how can I reduce this error?
Of course, I've provided a link to an archive with my project. A few words about it: almost all of the net's parameters are in the file libraries.h (input, hidden, and output layer dimensions, learning rate, momentum rate, sigma and sigma-derivative definitions), so if you want to play with them, here you go. The structure of the net is in perceptron.cpp; the graphics code is in plot.cpp. To test the project, run it and left-click on the window that appears at the points where you want the class centers to be. A right-click on the window will generate random points within a circle of radius 5 around those centers and train the net with those points.
If somebody can offer a theoretical explanation or take a fresh look at my code, I will be very grateful.
I successfully resolved the problem.
First of all, I had incorrect centers for the groups of points, so the points were completely inseparable in 2D space.
Second, I had to rewrite the training process to pick random points from the set instead of iterating over it in a fixed order.
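The change amounts to shuffling the training set every epoch. A minimal sketch (the `Sample` struct and `train_epoch` name are hypothetical; the project's actual types differ):

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Hypothetical training sample: a 2D point and its class label.
struct Sample { double x, y; int label; };

// One training epoch: shuffle the set so the net sees the samples
// in a random order, then feed each one through the net.
void train_epoch(std::vector<Sample>& set, std::mt19937& rng) {
    std::shuffle(set.begin(), set.end(), rng);
    for (const Sample& s : set) {
        (void)s;  // placeholder for the forward/backward pass
    }
}
```

Shuffling only reorders the set; no sample is added or lost.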
And third, I found that casting double to int isn't the best idea ever (it loses a lot of information).
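To see why this matters here: `static_cast<int>` truncates toward zero, so any sigmoid output, which lives in (0, 1), collapses to 0. A small illustration (helper names are mine, not from the project):

```cpp
#include <cmath>

// static_cast<int> truncates toward zero: 0.99 becomes 0, 4.7 becomes 4.
// If activations are stored this way, nearly all signal is destroyed.
int lossy_cast(double v)    { return static_cast<int>(v); }

// std::lround rounds to the nearest integer, a safer choice when an
// integer value is genuinely needed.
long rounded_cast(double v) { return std::lround(v); }
```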
Link to the final version of code: CLICK