XOR Neural Network Problem

Discussion in 'Embedded Systems and Microcontrollers' started by grantm, Jan 7, 2013.

  1. grantm (Thread Starter)
    I have written an artificial neural network for a PIC microcontroller. The idea is to expand the code later into a forecast model for time series prediction and eventually implement the same model on a PLD platform. My model is a feedforward network that uses backpropagation as its learning method and was written for the PIC24F series using the latest MPLAB X development environment and the associated 16-bit language tools: 2 input nodes, 2 hidden nodes and 1 output node, with bias nodes added. The idea is to first test the learning algorithm on the classical XOR problem. A data set is loaded:

    float DataSet[12] = {0,0,0, 0,1,1, 1,0,1, 1,1,0};  /* four patterns: in1, in2, target */

    where the first two values of each pattern are the inputs and the last value is the expected output. The inputs are propagated through the network using the randomly initialised weights:

    float w[10] = {0.0,0.2,0.5,0.1,0.6,0.3,0.4,0.7,0.2,0.8};

    The output is then determined using:

    float CalculateOut(void)
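
    To make the question concrete, here is a minimal sketch of the kind of forward pass I mean (my actual routine works on globals and takes no arguments); the sigmoid activation, explicit parameters and weight indexing below are illustrative assumptions, not my exact layout:

    #include <math.h>

    /* Sigmoid squashing function (assumed activation). */
    static float Sigmoid(float x)
    {
        return 1.0f / (1.0f + expf(-x));
    }

    /* Illustrative forward pass for a 2-2-1 network with one bias per layer.
       w[0..2]: in1, in2, bias -> hidden1
       w[3..5]: in1, in2, bias -> hidden2
       w[6..8]: hidden1, hidden2, bias -> output */
    float ForwardPass(float in1, float in2, const float w[])
    {
        float h1 = Sigmoid(in1 * w[0] + in2 * w[1] + w[2]);
        float h2 = Sigmoid(in1 * w[3] + in2 * w[4] + w[5]);
        return Sigmoid(h1 * w[6] + h2 * w[7] + w[8]);
    }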

    While the error is greater than 10%, a backpropagation learning routine adjusts the weights: void Learn(void). A simplified sketch of the structure I am aiming for follows.
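
    Again purely as a sketch (the learning rate, indexing and explicit arguments are assumptions, mirroring the ForwardPass sketch above):

    #define ETA 0.5f   /* assumed learning rate */

    /* Illustrative single backpropagation step for the same 2-2-1 layout. */
    void BackpropStep(float in1, float in2, float target, float w[])
    {
        /* Forward pass (same indexing as the ForwardPass sketch). */
        float h1  = Sigmoid(in1 * w[0] + in2 * w[1] + w[2]);
        float h2  = Sigmoid(in1 * w[3] + in2 * w[4] + w[5]);
        float out = Sigmoid(h1 * w[6] + h2 * w[7] + w[8]);

        /* Output delta: error times the sigmoid derivative out*(1-out). */
        float dOut = (target - out) * out * (1.0f - out);

        /* Hidden deltas: output delta pushed back through the hidden->output weights. */
        float dH1 = dOut * w[6] * h1 * (1.0f - h1);
        float dH2 = dOut * w[7] * h2 * (1.0f - h2);

        /* Hidden -> output weights (bias input is 1). */
        w[6] += ETA * dOut * h1;
        w[7] += ETA * dOut * h2;
        w[8] += ETA * dOut;

        /* Input -> hidden weights. */
        w[0] += ETA * dH1 * in1;  w[1] += ETA * dH1 * in2;  w[2] += ETA * dH1;
        w[3] += ETA * dH2 * in1;  w[4] += ETA * dH2 * in2;  w[5] += ETA * dH2;
    }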

    The output delta is calculated for the output node and then for the two hidden nodes. The output is exercised from the main loop, and I used a variable watch in the IDE to observe all of the outputs; a simplified sketch of the training loop is included below. I am aware that XOR is not linearly separable. The problem is that during verification only two of the four conditions are correct: the error is very high for the inputs {0,0} and {1,1}. I have been trying to get the code working for a while now without success and would be grateful for any guidance.
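
    For completeness, the rough shape of the training loop, in terms of the sketches above (the epoch limit, error threshold and helper names are illustrative, not my actual code):

    /* Illustrative training loop over the four XOR patterns in DataSet. */
    void TrainXor(float w[])
    {
        int epoch, p;
        for (epoch = 0; epoch < 10000; epoch++)        /* assumed epoch limit */
        {
            float maxErr = 0.0f;
            for (p = 0; p < 4; p++)
            {
                float in1    = DataSet[3 * p];
                float in2    = DataSet[3 * p + 1];
                float target = DataSet[3 * p + 2];
                float err    = target - ForwardPass(in1, in2, w);

                if (fabsf(err) > maxErr)
                    maxErr = fabsf(err);

                BackpropStep(in1, in2, target, w);
            }
            if (maxErr < 0.1f)                         /* stop once error is below 10% */
                break;
        }
    }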

    Regards

     