The code from the paper is open source (GPL v3) at GitHub - ronald-jaepel/ChromBayesOpt, but it was written by a biologist (me), so it is probably not a joy to read.

I thought this was an interesting and useful paper! Surrogate optimization has been instrumental in parameter fitting in my research projects, and it makes sense to use it for chromatography parameter fitting too. One challenge I’ve had with it, which comes up much less with a genetic algorithm, is non-smooth objectives. One example is when input parameter constraints are violated (e.g. the characteristic charge becomes negative in a pH-dependent model) and the objective must be discarded at that point in the search space. I found that this causes considerably more trouble for surrogate models than for a genetic algorithm, which brute-forces the problem in a less elegant manner. Since the assumption of objective-function smoothness is mentioned in the paper, I think this is a relevant “objective” (haha) to consider.
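To illustrate the issue, here is a minimal sketch with a toy loss (all names and the model are hypothetical, not from the paper or ChromBayesOpt): returning NaN at infeasible points punches holes in the landscape that a surrogate cannot learn from, while clamping to the feasibility boundary and adding a smooth penalty that grows with the violation keeps the objective continuous and informative.

```python
import numpy as np

def toy_loss(log_keq, nu):
    # hypothetical stand-in for the chromatogram fit residual
    return (log_keq - 1.0) ** 2 + (nu - 5.0) ** 2

def objective_hard(log_keq, nu):
    # discard infeasible points outright: the surrogate sees a hole/cliff
    if nu <= 0.0:  # e.g. characteristic charge went negative
        return np.nan
    return toy_loss(log_keq, nu)

def objective_soft(log_keq, nu):
    if nu <= 0.0:
        # clamp to the boundary and add a smooth penalty growing with the
        # violation; the objective stays continuous at nu = 0
        return toy_loss(log_keq, 0.0) + 100.0 * nu ** 2
    return toy_loss(log_keq, nu)
```

The penalty scale (here 100.0) is arbitrary; it only needs to make infeasible points consistently worse than nearby feasible ones without introducing a jump at the boundary.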

In any case, nice job! I hope to read more on this topic in the future.

Non-smooth objectives are a real problem for any kind of Gaussian-process surrogate, because a sharp transition anywhere lowers the confidence everywhere, which really hurts the surrogate.
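A small self-contained demonstration of this effect, assuming a plain RBF-kernel GP with the lengthscale picked by marginal likelihood over a grid (nothing here is from the paper’s code): a single step in the data forces a short global lengthscale, which inflates the predictive uncertainty even at test points far away from the jump.

```python
import numpy as np

def nll_and_std(x, y, lengthscale, x_star, noise=1e-4):
    """Negative log marginal likelihood of an RBF-kernel GP, plus the
    predictive standard deviation at the test points x_star."""
    def k(a, b):
        return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * lengthscale ** 2))
    K = k(x, x) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    nll = 0.5 * y @ alpha + np.log(np.diag(L)).sum()
    v = np.linalg.solve(L, k(x, x_star))
    var = 1.0 - np.sum(v ** 2, axis=0)
    return nll, np.sqrt(np.maximum(var, 0.0))

x = np.linspace(0.0, 1.0, 20)
x_star = np.array([(x[3] + x[4]) / 2])   # test point far from the jump
grid = np.linspace(0.02, 0.5, 97)        # candidate lengthscales

results = {}
for name, y in {"smooth": np.sin(2 * np.pi * x),
                "step": np.sign(x - 0.5)}.items():
    scores = [nll_and_std(x, y, l, x_star)[0] for l in grid]
    l_best = grid[int(np.argmin(scores))]
    results[name] = (l_best, nll_and_std(x, y, l_best, x_star)[1][0])

print(results)  # (fitted lengthscale, predictive std far from any jump)
```

The step data forces a much shorter fitted lengthscale than the smooth data, and the posterior standard deviation between training points goes up globally, not just near the discontinuity.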

This is one of the reasons I have been working on neural networks as surrogates: they don’t suffer from this problem. The downside is that you do have to build and train a neural network.

A genetic algorithm with a population size in the tens of thousands running on a neural-network surrogate is truly amazing to see.
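As a rough sketch of what that looks like (the `surrogate` function here is just a cheap stand-in for a trained network; any batched evaluator slots in the same way), a fully vectorised GA can score a 20,000-individual population with one call per generation:

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate(pop):
    # stand-in for a trained neural-network surrogate; returns one
    # objective value per row of the (pop_size, dim) array
    return np.sum((pop - 0.3) ** 2, axis=1)

POP, DIM, GENS = 20_000, 5, 30
pop = rng.random((POP, DIM))              # individuals in [0, 1]^DIM

for gen in range(GENS):
    fit = surrogate(pop)                  # one batched call for everyone
    # tournament selection: each parent is the better of two random picks
    a, b = rng.integers(0, POP, (2, POP))
    parents = pop[np.where(fit[a] < fit[b], a, b)]
    # uniform crossover between consecutive parents
    mask = rng.random((POP, DIM)) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Gaussian mutation with a decaying step size
    sigma = 0.1 * (1 - gen / GENS) + 0.005
    pop = np.clip(children + rng.normal(0.0, sigma, (POP, DIM)), 0.0, 1.0)

best = pop[np.argmin(surrogate(pop))]
print(best, surrogate(best[None])[0])
```

Because the surrogate is just a batched array-in, array-out function, the per-generation cost is a single forward pass, which is what makes such huge populations practical.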

That’s exactly why I tried to mention the required smoothness so many times in the paper: we also noticed the problems with non-smooth objectives. The algorithm is highly optimized for the problem at hand (single-component parameter estimation) and suffers in performance if the parameter influence is non-smooth, e.g. in a multi-component breakthrough experiment. For complex problems, CADETMatch’s GA will remain the better and more robust choice.