Don't use XORs for Neural Networks

A little while back I had the idea to build a neural network that operated using only binary nodes, with XOR as each node's transfer function, and that learned by changing the network's wiring rather than by changing weights. Well, I tried it, and it was completely and utterly useless. I threw a genetic algorithm at it and received no improvements other than by chance, even with low mutation rates. It never achieved even a trivial incremental improvement.
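To make the setup concrete, here is a minimal sketch of the kind of network I mean. The names, sizes, and mutation operator are just illustrative, not the original code: every node is an XOR gate, the genome records which two earlier values each node reads, and the only thing a genetic algorithm can mutate is that wiring.

```python
import random

N_INPUTS = 4   # input bits
N_NODES = 8    # XOR nodes; the last one is taken as the network's output

def random_genome():
    # node i may read from any input or any earlier node (keeps it feed-forward)
    return [(random.randrange(N_INPUTS + i), random.randrange(N_INPUTS + i))
            for i in range(N_NODES)]

def evaluate(genome, inputs):
    values = list(inputs)                    # input bits first
    for a, b in genome:                      # then each XOR node in order
        values.append(values[a] ^ values[b])
    return values[-1]                        # output of the last node

def mutate(genome):
    # rewire a single connection of a single node -- with no weights to nudge,
    # this is the only kind of mutation available
    child = [list(pair) for pair in genome]
    i = random.randrange(N_NODES)
    child[i][random.randrange(2)] = random.randrange(N_INPUTS + i)
    return [tuple(pair) for pair in child]

genome = random_genome()
print(evaluate(genome, [1, 0, 1, 1]))        # one forward pass on a sample input
```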

While walking home from work today, I think I figured out the reason. Rewiring a neural network completely changes its response. In the binary network, a single rewiring is likely to invert the output of an entire branch of the network: because every node is an XOR, flipping one of its inputs flips its output, and that flip propagates through everything downstream. As a result, there is no way to "incrementally" improve the topology of the network in the fashion a genetic algorithm relies on.
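Here is a rough way to check that claim, under the same assumptions as the sketch above: generate a random wiring, rewire one connection, and count how many of the 2^4 possible inputs now produce a different output. The count is typically around half (or zero, when the mutation happens to pick the same source), never a small incremental shift, which is what makes the fitness landscape so hostile to a GA.

```python
import random
from itertools import product

N_INPUTS, N_NODES = 4, 8

def evaluate(genome, inputs):
    values = list(inputs)                    # input bits first
    for a, b in genome:                      # then each XOR node in order
        values.append(values[a] ^ values[b])
    return values[-1]                        # output of the last node

random.seed(1)
parent = [(random.randrange(N_INPUTS + i), random.randrange(N_INPUTS + i))
          for i in range(N_NODES)]

# single mutation: rewire one input of one node
child = [list(pair) for pair in parent]
i = random.randrange(N_NODES)
child[i][random.randrange(2)] = random.randrange(N_INPUTS + i)

flipped = sum(evaluate(parent, bits) != evaluate(child, bits)
              for bits in product([0, 1], repeat=N_INPUTS))
print(f"{flipped} of {2 ** N_INPUTS} outputs flipped by one rewiring")
```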

So unless I can come up with some better way for a genetic algorithm to mutate the topology of a neural network, I think I'll abandon binary neural networks.