SciPy Minimization Convergence Problems for Objective Function with Small Values and Numerical Derivatives
I’m having issues minimizing an objective function with SciPy when the function returns very small values, in a setting where I’d like to use numerical derivatives with a gradient-based algorithm. It’s for a problem on optimal sample sizes from Cochran’s 1977 textbook “Sampling Techniques”. My minimum working example is as follows:
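The asker’s actual example is not reproduced here; the following is a minimal sketch of the failure mode with an invented quadratic objective (the `SCALE` factor and the target values `(3, 5)` are assumptions, standing in for the Cochran-style sampling objective). When the objective returns values around `1e-10`, the numerically estimated gradient at the starting point is already below BFGS’s default `gtol` of `1e-5`, so the optimizer terminates immediately; rescaling the objective to order 1 restores normal convergence.

```python
import numpy as np
from scipy.optimize import minimize

SCALE = 1e-10  # makes the objective values tiny, like the problem described

def objective(x):
    # Invented quadratic with minimum at (3, 5); a stand-in for the
    # Cochran-style sampling objective, which is not shown here.
    return SCALE * ((x[0] - 3.0) ** 2 + (x[1] - 5.0) ** 2)

x0 = np.zeros(2)

# With numerical derivatives (jac=None, finite differences), BFGS stops
# at x0: the gradient is ~1e-9, already below the default gtol of 1e-5.
raw = minimize(objective, x0, method="BFGS")

# Rescaling the objective to O(1) values restores convergence to (3, 5).
scaled = minimize(lambda x: objective(x) / SCALE, x0, method="BFGS")
```

Instead of rescaling, one could also tighten `options={"gtol": ...}`, but with finite-difference gradients a very small `gtol` competes with the differencing noise, so rescaling is usually the more robust fix.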
Finding unused variables after minimizing
After minimization (Python/SciPy), I would like to know how to find unused variables in the result. Here is a simple example where the third variable is left untouched. Apart from comparing the initial values with the result, is there a better way to identify such a variable?
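The asker’s example is not shown; this is a sketch under the assumption of an objective that simply ignores its third variable. Comparing `res.x` to `x0` works but can give a false positive if a coordinate of `x0` happened to be optimal already; a more direct check is to perturb each coordinate at the solution and see whether the objective moves at all (an unused variable leaves it exactly unchanged, a used one shifts it by roughly `eps**2`).

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective: x[2] is deliberately unused (an illustration, not the
# asker's original code).
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

x0 = np.array([0.0, 0.0, 5.0])
res = minimize(objective, x0, method="BFGS")

# Heuristic 1: a variable the optimizer never touched equals its start
# value. Fails if x0 was already optimal in that coordinate.
untouched = np.isclose(res.x, x0)

# Heuristic 2: perturb each coordinate at the solution; an unused
# variable leaves the objective value exactly unchanged.
eps = 1e-4
f_star = objective(res.x)
unused = []
for i in range(len(res.x)):
    xp = res.x.copy()
    xp[i] += eps
    if abs(objective(xp) - f_star) < 1e-12:
        unused.append(i)
```

Here `unused` ends up as `[2]`. Note that checking `res.jac` is not enough on its own: at a minimum, the gradient components of the *used* variables are also near zero.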