Tag Archive for java, deep-learning, neural-network

Neural net flips behaviour with layer count

I wrote a simple neural net and I am confused by its behaviour. I train it on a simple training set with ReLU activation: it has to predict 0.5 for input 1 and 0 for input 0. It works with layerDimension = {1,10,10,1} but not with {1,10,1} or {1,10,10,10,1}.
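
To make the described setup concrete, here is a minimal sketch of a net like the one in the question, assuming fully connected layers defined by a layerDimension array, ReLU on the hidden layers, a linear output unit, squared error and plain gradient descent. All names here (TinyReluNet, trainStep, etc.) are illustrative; this is not the original poster's code, and the original output activation may differ.

```java
import java.util.Random;

// Illustrative sketch only: fully connected net, ReLU hidden layers,
// linear output, squared error, plain gradient descent on two samples.
public class TinyReluNet {
    final int[] dims;          // layer sizes, e.g. {1, 10, 10, 1}
    final double[][][] w;      // w[l][out][in]
    final double[][] b;        // b[l][out]
    final Random rng = new Random(42);

    TinyReluNet(int[] dims) {
        this.dims = dims;
        int L = dims.length - 1;
        w = new double[L][][];
        b = new double[L][];
        for (int l = 0; l < L; l++) {
            w[l] = new double[dims[l + 1]][dims[l]];
            b[l] = new double[dims[l + 1]];
            for (double[] row : w[l])
                for (int i = 0; i < row.length; i++)
                    row[i] = rng.nextGaussian() * 0.5;  // small random init
        }
    }

    static double relu(double x)      { return Math.max(0, x); }
    static double reluPrime(double x) { return x > 0 ? 1 : 0; }

    // Forward pass; fills pre-activations z[l] and activations a[l] for backprop.
    void forward(double[] input, double[][] z, double[][] a) {
        a[0] = input;
        for (int l = 0; l < w.length; l++) {
            z[l] = new double[dims[l + 1]];
            a[l + 1] = new double[dims[l + 1]];
            for (int j = 0; j < dims[l + 1]; j++) {
                double s = b[l][j];
                for (int i = 0; i < dims[l]; i++) s += w[l][j][i] * a[l][i];
                z[l][j] = s;
                // ReLU on hidden layers, identity on the output layer
                a[l + 1][j] = (l == w.length - 1) ? s : relu(s);
            }
        }
    }

    // One gradient-descent step on a single (input, target) pair.
    void trainStep(double[] input, double target, double lr) {
        double[][] z = new double[w.length][];
        double[][] a = new double[w.length + 1][];
        forward(input, z, a);
        // delta of the linear output layer under squared error
        double[] delta = { a[w.length][0] - target };
        for (int l = w.length - 1; l >= 0; l--) {
            double[] prevDelta = new double[dims[l]];
            for (int j = 0; j < dims[l + 1]; j++) {
                for (int i = 0; i < dims[l]; i++) {
                    prevDelta[i] += w[l][j][i] * delta[j];   // uses weight before update
                    w[l][j][i] -= lr * delta[j] * a[l][i];
                }
                b[l][j] -= lr * delta[j];
            }
            if (l > 0)  // apply ReLU derivative of the previous hidden layer
                for (int i = 0; i < dims[l]; i++)
                    prevDelta[i] *= reluPrime(z[l - 1][i]);
            delta = prevDelta;
        }
    }

    public static void main(String[] args) {
        // The training set from the question: 1 -> 0.5 and 0 -> 0.
        double[][] inputs  = { {1}, {0} };
        double[]   targets = { 0.5, 0 };
        TinyReluNet net = new TinyReluNet(new int[] {1, 10, 10, 1});
        for (int epoch = 0; epoch < 2000; epoch++)
            for (int s = 0; s < inputs.length; s++)
                net.trainStep(inputs[s], targets[s], 0.01);
        double[][] z = new double[net.w.length][];
        double[][] a = new double[net.w.length + 1][];
        net.forward(new double[] {1}, z, a);
        System.out.println("prediction for 1: " + a[net.w.length][0]);
        net.forward(new double[] {0}, z, a);
        System.out.println("prediction for 0: " + a[net.w.length][0]);
    }
}
```

Swapping the {1,10,10,1} array in main for {1,10,1} or {1,10,10,10,1} reproduces the kind of comparison the question describes.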