How to improve SGDRegressor model performance
I’m working on a personal project comparing the performance of an OLS model against an SGDRegressor model. The OLS models are not perfect, but they work fine. The SGDRegressor predictions, however, are way off. I checked how the cost decreases over iterations, and the results suggest that the learning rate is too high. However, lowering the learning rate seems to make the cost reduction even worse.
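Since the training code isn’t shown, here is a minimal sketch of how one might diagnose this, assuming the data already sits in NumPy arrays `X` and `y` (those names are placeholders). It tracks the training cost per epoch with `partial_fit` so the effect of `eta0` is visible, and it standardizes the features first, since SGD is very sensitive to feature scale and unscaled inputs are a common reason the loss diverges regardless of learning rate.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import SGDRegressor
from sklearn.metrics import mean_squared_error

# SGD is sensitive to feature scale; standardizing often matters as much as eta0.
X_scaled = StandardScaler().fit_transform(X)

sgd = SGDRegressor(
    loss="squared_error",
    learning_rate="constant",  # keep the step size fixed so its effect is visible
    eta0=1e-3,                 # try a few values, e.g. 1e-1, 1e-2, 1e-3
    alpha=0.0,                 # no regularization, for a fairer comparison with OLS
)

costs = []
for epoch in range(100):
    sgd.partial_fit(X_scaled, y)  # one pass over the data per call
    costs.append(mean_squared_error(y, sgd.predict(X_scaled)))

print(costs[:5], costs[-5:])  # the cost should decrease and then flatten out
```

If the cost still grows with a small `eta0` after scaling, the problem is usually in the data pipeline rather than the optimizer settings.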
Is the given code for gradient descent updating the parameters sequentially or simultaneously?
I’m new to machine learning and have been learning the gradient descent algorithm. I believe this code performs a simultaneous update, even though it looks like a sequential one: the partial derivatives are computed before either w or b is updated, i.e., from the original w and b, so each parameter’s update step is applied using the original values. Am I wrong?
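The code in question isn’t included here, so the following is a generic sketch of the distinction for single-feature linear regression with parameters w and b. In the simultaneous version, both gradients are computed from the same (w, b) before either is overwritten; in a truly sequential version, the gradient for b would be computed from the already-updated w.

```python
import numpy as np

def gradients(X, y, w, b):
    """Partial derivatives of the mean squared error cost for y_hat = w*X + b."""
    err = (w * X + b) - y
    dj_dw = np.mean(err * X)
    dj_db = np.mean(err)
    return dj_dw, dj_db

def simultaneous_step(X, y, w, b, alpha):
    # Both gradients are computed from the *same* (w, b) before either update,
    # so the order of the two assignment lines below does not matter.
    dj_dw, dj_db = gradients(X, y, w, b)
    w = w - alpha * dj_dw
    b = b - alpha * dj_db
    return w, b

def sequential_step(X, y, w, b, alpha):
    # Here the b-gradient is recomputed *after* w has changed, which is the
    # situation the question is worried about; this is not the standard update.
    dj_dw, _ = gradients(X, y, w, b)
    w = w - alpha * dj_dw
    _, dj_db = gradients(X, y, w, b)  # uses the already-updated w
    b = b - alpha * dj_db
    return w, b
```

So yes: if the derivatives are evaluated once, from the original w and b, before either assignment, the update is simultaneous even though the assignment statements appear one after the other.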