qNParEGO Bayesian Optimiser - Can generations be increased?

Hi Cadet team,

I have recently been using qNParEGO to run a parameter estimation. It is working really well for me, and I'm getting some nice fits for my models in a fraction of the time of U_NSGA3.

I did have a quick question regarding the generations. Currently the optimisation seems to be fixed at 100 generations, and there doesn't appear to be a setting to increase this. Is there a way to raise the number of generations above 100 so it carries on looking for a better solution, or does it not really work like that? I'm just looking for the best way to increase the accuracy of the estimation with this optimiser. Since it runs so quickly, I don't mind playing with some of the variables so that it takes longer, provided the estimation ends up being better.

Thank you!

I just tested this on my machine with the current dev branch: the n_max_evals and n_max_iter settings control the number of "generations" for qNParEGO. It runs until the lower of the two limits is reached, so

optimizer.n_max_iter = 160
optimizer.n_max_evals = 140

will run 140 generations.

This means that if you only set one of them, the other will stay at its default limit of 100, so

optimizer = qNParEGO()
optimizer.n_max_iter = 160
optimization_results = optimizer.optimize(
    optimization_problem,
)

will only run for 100 generations.
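In other words, the effective number of generations is the minimum of the two limits. A minimal standalone sketch of that stopping rule (plain Python for illustration only, not CADET-Process code; the function name is hypothetical):

    def effective_generations(n_max_iter=100, n_max_evals=100):
        """Number of generations actually run, assuming the optimiser
        stops at whichever limit is reached first (defaults of 100)."""
        return min(n_max_iter, n_max_evals)

    # Both limits raised: runs 140 generations (the lower of 160 and 140).
    print(effective_generations(n_max_iter=160, n_max_evals=140))

    # Only one limit raised: the other stays at its default of 100.
    print(effective_generations(n_max_iter=160))

So to actually go past 100 generations, both settings need to be raised.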


I have recently been trying out some of the different optimizers in cadet-process, but I have been having some trouble setting them up. I was wondering if I could get some help.

I installed cadet-process with the optional dependencies as described in the cadet-process docs with

pip install "cadet-process[ax]"

but when I try to import one of the Ax optimizers cadet-process gives me the error

---> 25 from CADETProcess.optimization import qNParEGO

File ~/miniconda3/envs/ax-env/lib/python3.12/site-packages/CADETProcess/optimization/__init__.py:122, in __getattr__(name)
    120         return getattr(module, name)
    121     else:
--> 122         raise ImportError(
    123             "The AxInterface class could not be imported. "
    124             "This may be because the 'ax' package, which is an optional dependency, is not installed. "
    125             "To install it, run 'pip install CADET-Process[ax]'"
    126         )
    127 raise AttributeError(f"module {__name__} has no attribute {name}")

ImportError: The AxInterface class could not be imported. This may be because the 'ax' package, which is an optional dependency, is not installed. To install it, run 'pip install CADET-Process[ax]'

That's not true (the ax package is installed), so if I don't let cadet-process catch the error, I get

File ~/miniconda3/envs/ax-env/lib/python3.12/site-packages/CADETProcess/optimization/__init__.py:106
--> 106 from .axAdapater import BotorchModular, GPEI, NEHVI, qNParEGO

File ~/miniconda3/envs/ax-env/lib/python3.12/site-packages/CADETProcess/optimization/axAdapater.py:23
---> 23 from botorch.models.gp_regression import FixedNoiseGP

ImportError: cannot import name 'FixedNoiseGP' from 'botorch.models.gp_regression' (/Users/angelamoser/miniconda3/envs/ax-env/lib/python3.12/site-packages/botorch/models/gp_regression.py)

Since it seems like it is working for others, I am wondering what I am missing.

After a bit of digging, I found out that Botorch deprecated a model (FixedNoiseGP) that we use.

With

pip install botorch==0.11.3 ax-platform==0.4.1

you can roll back to a Botorch version that works until we can get a fix in. But that won't happen until after the workshop next week :smiley:
