neural-network, deep-learning, perceptron, multi-layer

How to train a multiplier with an MLP?


I'm new to neural networks. I'm trying to understand what kinds of functions a multilayer perceptron can learn.

Is it possible to train an MLP to multiply by giving it only a discrete set of examples?

I can teach it to multiply certain numbers (those from the training dataset, of course), but it cannot estimate other products correctly.

I used one hidden layer (tanh, 10 units) and one output layer (identity); both layers had biases and were trained with a momentum optimizer.

Dataset

0, 5 = 0
1, 1 = 1
2, 3 = 6
3, 7 = 21
4, 3 = 12
5, 9 = 45
7, 7 = 49
13, 13 = 169

The network gives correct results on this dataset, but for an unseen input such as 5 * 5 it returns a wrong value like 32.

Am I expecting too much from an MLP? What dataset (or layer setup) should I give the network so that it can multiply any given numbers?


Solution

  • Yes, you are expecting too much. An MLP is not "smart" enough to abstract a general method from a handful of specific examples. Its output is just a weighted combination of squashed sums of the inputs, fitted to the training points; extrapolating a quadratic relationship like multiplication from those points is a deeper abstraction than such a model can express.

    In general, if your research hasn't already turned up a standard solution class for a given problem, you're stuck with a wide range of experimentation. My first thought would be to try this with an RNN, in the hope that the feedback loop catches the multiplication abstraction as a side effect.
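    To make the extrapolation limit concrete, here is a small sketch with hypothetical, hand-picked weights (not a trained network). Far from the training region every tanh unit saturates to ±1, so the network's output flattens to a constant while the true product keeps growing quadratically:

    ```python
    import numpy as np

    # Hypothetical weights, chosen only for illustration:
    # 2 inputs -> 2 tanh hidden units -> 1 identity output.
    W1 = np.array([[0.9, -0.4],
                   [0.3,  0.8]])
    b1 = np.array([0.1, -0.2])
    W2 = np.array([[1.5], [-2.0]])
    b2 = np.array([0.5])

    def mlp(x):
        return (np.tanh(x @ W1 + b1) @ W2 + b2).ravel()

    ts = np.array([[10.0, 10.0], [100.0, 100.0], [1000.0, 1000.0]])
    preds = mlp(ts)
    true = ts[:, 0] * ts[:, 1]
    print(preds)   # essentially constant once the units saturate
    print(true)    # 100, 10000, 1000000 -- quadratic growth
    ```

    No matter how the training tunes the weights, the output of a tanh MLP is bounded by the output layer's weights and bias, so it cannot follow x * y indefinitely.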