Regression in Custom Neural Network giving same output for all inputs
I recently started studying neural networks and decided to write my own before using libraries like TensorFlow or PyTorch, so that I deeply understand what happens inside the network. I developed the code based on the math of neural networks. Now the network shows some strange behaviour.

First, the network sometimes performs really well: with the iris dataset from scikit-learn and one small configuration, I got the network to 99.3% accuracy. But sometimes it performs very poorly. In one problem I noticed that using the ReLU activation for the first layer and sigmoid for all the remaining layers gave almost 63% accuracy, but when I made the first layer sigmoid, the second layer ReLU, and all the rest sigmoid, accuracy went up to 93%.

The main problem, though, is regression, which doesn't seem to work at all: the model seems to output the same value for all inputs.
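For comparison, here is a minimal sketch of the kind of network I mean (this is not my actual code; all names and hyperparameters here are just for illustration). A common cause of constant regression outputs is a sigmoid output layer or unscaled data, so this sketch uses a linear (identity) output layer, standardized inputs, and small random weight initialization:

```python
import numpy as np

# Minimal one-hidden-layer regression net (hypothetical sketch).
# Points that often fix "same output for every input" in regression:
#   1. linear (identity) output layer instead of sigmoid
#   2. standardized inputs
#   3. small random weight initialization
rng = np.random.default_rng(0)

# Toy regression data: y = sin(x)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X)
X_std = (X - X.mean()) / X.std()  # standardize inputs

n_hidden = 16
W1 = rng.normal(0, 0.5, (1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1))
b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    # forward pass: tanh hidden layer, LINEAR output layer
    h = np.tanh(X_std @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                      # dL/dpred for 0.5 * MSE

    # backward pass (full-batch gradient descent)
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)    # tanh derivative
    gW1 = X_std.T @ dh / len(X)
    gb1 = dh.mean(axis=0)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((pred - y) ** 2))
spread = float(pred.max() - pred.min())  # > 0 means outputs vary per input
print(mse, spread)
```

With a linear output the predictions spread over the target range instead of collapsing to one value; with a sigmoid output and targets outside (0, 1) the same net can saturate and emit a near-constant value.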