Wednesday, July 20, 2011

SCHEMENAUER AND THE XOR GATE

IMPLEMENTING ANN IN PYTHON

I was searching for Artificial Neural Network (ANN) implementations in Python. I came across the following:
  1. FANN - a C library with Python bindings
  2. PyBrain
  3. NeuroLab
  4. PyNN
  5. BPNN - not a library, but a standalone script by Neil Schemenauer
THE XOR PROBLEM

The XOR problem has some history in the evolution of ANN methods: the XOR function is not linearly separable, so it cannot be realised by a single layer of an ANN.
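To see the inseparability concretely, here is a small sketch of my own (not part of Schemenauer's script) that runs perceptron learning on the XOR truth table. However long a single linear threshold unit trains, it can never classify more than three of the four patterns correctly:

```python
# Perceptron learning on XOR: a single linear unit cannot separate the classes.

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def step(x):
    # Hard threshold activation of a single linear unit
    return 1 if x > 0 else 0

def train_perceptron(patterns, epochs=1000, lr=0.1):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in patterns:
            y = step(w0 * x0 + w1 * x1 + b)
            err = target - y
            # Standard perceptron update rule
            w0 += lr * err * x0
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

w0, w1, b = train_perceptron(XOR)
correct = sum(step(w0 * x0 + w1 * x1 + b) == t for (x0, x1), t in XOR)
print(correct)  # at most 3 of 4, since no separating line exists
```

The weights simply cycle forever; a short geometric argument shows that any line satisfying the (0,1) and (1,0) constraints must violate the (0,0) or (1,1) constraint.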

TINKERING WITH SCHEMENAUER'S CODE

Schemenauer's code ships with default training data for a two-input XOR gate.


Schemenauer recommends using a (2,2,1) network (viz. a network with two input nodes, two hidden nodes, and one output node), and the output is very close to the desired XOR truth table, within the error limits of the ANN.

XOR output for a (2,2,1) Back Propagation Neural Network:
([0, 0], '==', [0.025608579041218795])
([0, 1], '==', [0.98184578447794768])
([1, 0], '==', [0.98170742564066216])
([1, 1], '==', [-0.021030064439813451])
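Results of this kind can be reproduced without the original script. The sketch below is a minimal back-propagation network of my own, not Schemenauer's bpnn.py: tanh activations, online weight updates, and illustrative parameter choices (learning rate 0.5, 5000 epochs) that are my assumptions. Changing `NH` to 1 or 25 reproduces the (2,1,1) and (2,25,1) experiments discussed below:

```python
import math
import random

random.seed(0)

NH = 2  # number of hidden nodes: try 1 or 25 as well
PATTERNS = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]

# Weight matrices; the bias is handled as an extra constant input of 1.0
wi = [[random.uniform(-0.5, 0.5) for _ in range(NH)] for _ in range(3)]
wo = [random.uniform(-0.5, 0.5) for _ in range(NH + 1)]

def forward(x):
    a = x + [1.0]  # inputs plus bias
    h = [math.tanh(sum(a[i] * wi[i][j] for i in range(3))) for j in range(NH)]
    hb = h + [1.0]  # hidden activations plus bias
    o = math.tanh(sum(hb[k] * wo[k] for k in range(NH + 1)))
    return h, o

def sq_error():
    return sum((t[0] - forward(x)[1]) ** 2 for x, t in PATTERNS)

err_before = sq_error()

for _ in range(5000):  # online back-propagation
    for x, t in PATTERNS:
        h, o = forward(x)
        delta_o = (t[0] - o) * (1 - o * o)  # tanh derivative at the output
        delta_h = [delta_o * wo[j] * (1 - h[j] ** 2) for j in range(NH)]
        hb = h + [1.0]
        for k in range(NH + 1):
            wo[k] += 0.5 * delta_o * hb[k]
        a = x + [1.0]
        for i in range(3):
            for j in range(NH):
                wi[i][j] += 0.5 * delta_h[j] * a[i]

err_after = sq_error()
for x, _ in PATTERNS:
    print(x, '->', round(forward(x)[1], 3))
```

With `NH = 2` the outputs settle near the XOR targets, much like the figures quoted above; with `NH = 1` the network lacks the capacity to separate the patterns.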
However, playing around with the number of hidden nodes has interesting results.

XOR output for a (2,1,1) Back Propagation Neural Network:
([0, 0], '==', [0.0020536886211772179])
([0, 1], '==', [0.68437587415369783])
([1, 0], '==', [0.68413753288547252])
([1, 1], '==', [0.6856616998850974])
The output of the (2,1,1) network clearly confirms the XOR problem!

Increasing the number of hidden nodes indiscriminately leads to anomalous output.

As an example, XOR output for a (2,25,1) Back Propagation Neural Network:
([0, 0], '==', [0.99999643777993841])
([0, 1], '==', [0.99999911082329096])
([1, 0], '==', [0.99999280130316026])
([1, 1], '==', [0.99999824824488848])

Anomalous behaviour comes into play from about 12 hidden nodes.



2 comments:

Luke Dunn said...

the bigger your brain, the more likely you are to get in a muddle!

Arkapravo said...

Let's say! A small brain is bad (2,1,1) .... a medium-sized brain is good (2,2,1) ..... and large ones (2,25,1) are crappy.