Tutorial: Hyper-parameter Optimisation for Machine Learning Models using MiP-EGO
When you start out with machine learning, the choices you have to make can be overwhelming. Many machine learning algorithms and models are readily available in packages like scikit-learn. Which of these algorithms should you use? Which will perform best for your specific data-set or the problem you want to solve? And once you have chosen a model, each algorithm usually has many hyper-parameters. Hyper-parameters are options of the algorithm that are often essential for its performance. For example, if we want to perform a classification task, we might choose Support Vector Machines for their outstanding performance. However, SVC has a staggering 15 hyper-parameters; not all of them need tuning, but C, kernel, degree and gamma are crucial for its performance. How do we best tune these hyper-parameters? There are a few strategies that we can use when choosing hyper-parameters. In some...
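To see what we are dealing with, a quick sketch in scikit-learn: we can instantiate an SVC and list its hyper-parameters with `get_params()`. The values below are just scikit-learn's defaults, used here for illustration; the exact number of parameters may vary between scikit-learn versions.

```python
from sklearn.svm import SVC

# Instantiate an SVC, explicitly setting the four hyper-parameters
# highlighted in the text (these are scikit-learn's default values).
clf = SVC(C=1.0, kernel="rbf", degree=3, gamma="scale")

# get_params() returns every tunable hyper-parameter of the estimator.
params = clf.get_params()
print(sorted(params))   # names of all hyper-parameters
print(len(params))      # how many there are in total
```

Running this shows that C, kernel, degree and gamma are only four of the many options SVC exposes, which is exactly why a systematic tuning strategy is worth having.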