Is there a way to use SciPy’s differential evolution, or another library, to minimize one of a multi-output regressor’s outputs while bounding the others?


I am using a simple RandomForestRegressor from sklearn to predict 3 outputs, and it works well. The problem is that I now need to optimize one of the outputs to be as low as possible, while ensuring the other two predictions don’t escape a couple of bounds in the process. I know I could pass the prediction function to something like differential_evolution to minimize the output I need, but my understanding is that you can only enforce bounds on the inputs and not other outputs.

Is there a way to do this that you know of? Am I perhaps approaching this problem incorrectly?

Minimizing the output I wanted results in the other outputs escaping the boundaries I need to set. I’m hoping there is a more intelligent way to find the global minimum other than testing a huge range of inputs, and then finding the minimum among the “allowed” predictions where the other outputs are within the bounds.


Supply something like the following objective function to differential_evolution, and turn polishing off (polish=False): the infinite-penalty objective is discontinuous, so the default gradient-based L-BFGS-B polishing step at the end would not work on it.

import numpy as np

def my_objective(x):
    # RandomForestRegressorFun stands for a wrapper around the fitted
    # model's predict() that returns the three outputs for input x
    y = RandomForestRegressorFun(x)
    # reject any candidate whose other two outputs leave their bounds
    # (here 0 <= y[1], y[2] <= 2.5): returning inf makes differential
    # evolution discard it in favour of feasible candidates
    if np.logical_or(y[1:] > 2.5, y[1:] < 0).any():
        return np.inf

    return y[0]
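Wired up end to end, the approach might look like the sketch below. The training data, the two-input/three-output shapes, the forest settings, and the [0, 2.5] bound are all illustrative assumptions, not taken from the question:

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.ensemble import RandomForestRegressor

# Illustrative data: 2 inputs in [0, 1], 3 outputs (shapes are assumptions)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
Y = np.column_stack([X[:, 0] + X[:, 1],   # output to minimize
                     2.0 * X[:, 0],       # constrained output
                     2.0 * X[:, 1]])      # constrained output

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, Y)

def my_objective(x):
    # predict() wants a 2-D array; take the single row of 3 outputs
    y = model.predict(x.reshape(1, -1))[0]
    # infinite penalty when the other outputs leave [0, 2.5]
    if np.logical_or(y[1:] > 2.5, y[1:] < 0).any():
        return np.inf
    return y[0]

# polish=False: the final L-BFGS-B polish needs gradients, which the
# infinite-penalty objective does not provide
result = differential_evolution(my_objective,
                                bounds=[(0.0, 1.0), (0.0, 1.0)],
                                polish=False, seed=0)
print(result.x, result.fun)
```

Input bounds are still passed through `bounds` as usual; the output constraints live entirely inside the objective.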

