Neural network: training on new input values destroys the output behavior learned for old input values

Hello,

in my neural network, the outputs learned through backpropagation are destroyed when I train on new inputs. My network structure: 4 inputs, 2 hidden layers (8 neurons) and 2 outputs.

For example:
First input values [0.1, 0.2, 0.9, 0.4] --> Predicted [0.48, 0.52], Expected [1, 0].
After backpropagation --> Predicted [0.9, 0.1]

Then the second input values [0.2, 0.5, 0.1, 0.7] are fed in --> Predicted [0.9, 0.1], Expected [0, 1]
After backpropagation --> Predicted [0.1, 0.9]

Now when I feed the first input values ([0.1, 0.2, 0.9, 0.4]) back into the network, the output is no longer [0.9, 0.1]; instead [0.1, 0.9] comes out.

Somehow, training on the new input values destroys the output behavior learned for the old input values once backpropagation has been performed.
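
To make the procedure concrete, here is a minimal sketch of the training pattern I mean, in NumPy. This is simplified hypothetical code, not my exact implementation; the sigmoid activations, squared-error loss, learning rate, and the train_on_sample helper are just assumptions for illustration:

```python
# Hypothetical sketch (not my actual code): a 4-8-8-2 sigmoid MLP trained on
# ONE sample at a time until that sample is fit.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Weights and biases for the 4 -> 8 -> 8 -> 2 network.
sizes = [4, 8, 8, 2]
W = [rng.normal(0, 0.5, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(m) for m in sizes[1:]]

def forward(x):
    """Return the activations of every layer (input included)."""
    acts = [x]
    for Wi, bi in zip(W, b):
        acts.append(sigmoid(Wi @ acts[-1] + bi))
    return acts

def train_on_sample(x, t, lr=0.5, steps=2000):
    """Backpropagate on a single (input, target) pair until it is fit."""
    for _ in range(steps):
        acts = forward(x)
        # Output-layer delta for squared loss with sigmoid units.
        delta = (acts[-1] - t) * acts[-1] * (1 - acts[-1])
        for i in reversed(range(len(W))):
            grad_W = np.outer(delta, acts[i])
            # Propagate the delta backward before updating this layer.
            delta_prev = (W[i].T @ delta) * acts[i] * (1 - acts[i])
            W[i] -= lr * grad_W
            b[i] -= lr * delta
            delta = delta_prev

x1, t1 = np.array([0.1, 0.2, 0.9, 0.4]), np.array([1.0, 0.0])
x2, t2 = np.array([0.2, 0.5, 0.1, 0.7]), np.array([0.0, 1.0])

train_on_sample(x1, t1)
print("sample 1 after training it:", forward(x1)[-1])  # close to [1, 0]
train_on_sample(x2, t2)
print("sample 1 after training 2:", forward(x1)[-1])   # no longer [1, 0]
```

The second call to train_on_sample adjusts the same weights that made the first prediction work, and that reproduces the behavior described above.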

I would be happy if someone could explain to me why this is happening and give me tips on how to solve this problem.

Thanks a lot