Basic Example
Let’s say we want to create an experiment called “myExp”. The first thing to do is to create the folder exp/myExp
under the limbo root. Then add two files:
- the main.cpp file
- a python file called wscript, which will be used by waf to register the executable for building
The file structure should look like this:
limbo
|-- exp
     |-- myExp
          +-- wscript
          +-- main.cpp
|-- src
...
Next, copy the following content to the wscript file:
def options(opt):
    pass


def build(bld):
    bld(features='cxx cxxprogram',
        source='main.cpp',
        includes='. ../../src',
        target='myExp',
        uselib='BOOST EIGEN TBB LIBCMAES NLOPT',
        use='limbo')
For this example, we will optimize a simple function: \(-(5x - 2.5)^2 + 5\), using all default values and settings. If you did not compile with libcmaes and/or nlopt, remove LIBCMAES and/or NLOPT from ‘uselib’.
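As a quick sanity check for the results below: \(-(5x - 2.5)^2 + 5\) is maximal when \(5x - 2.5 = 0\), i.e. at \(x = 0.5\), where the value is 5. The optimizer should therefore report a best sample close to 0.5 and a best observation close to 5.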
To begin, main.cpp has to include the necessary headers and declare the Params struct:
#include <iostream>
// you can also include <limbo/limbo.hpp> but it will slow down the compilation
#include <limbo/bayes_opt/boptimizer.hpp>

using namespace limbo;

struct Params {
    // no noise
    struct bayes_opt_boptimizer : public defaults::bayes_opt_boptimizer {
        BO_PARAM(double, noise, 1e-10);
    };

    // depending on which internal optimizer we use, we need to import different parameters
#ifdef USE_NLOPT
    struct opt_nloptnograd : public defaults::opt_nloptnograd {
    };
#elif defined(USE_LIBCMAES)
    struct opt_cmaes : public defaults::opt_cmaes {
    };
#else
    struct opt_gridsearch : public defaults::opt_gridsearch {
    };
#endif

    // enable / disable the writing of the result files
    struct bayes_opt_bobase : public defaults::bayes_opt_bobase {
        BO_PARAM(int, stats_enabled, true);
    };

    struct kernel_exp : public defaults::kernel_exp {
    };

    // we use 10 random samples to initialize the algorithm
    struct init_randomsampling {
        BO_PARAM(int, samples, 10);
    };

    // we stop after 40 iterations
    struct stop_maxiterations {
        BO_PARAM(int, iterations, 40);
    };

    // we use the default parameters for acqui_ucb
    struct acqui_ucb : public defaults::acqui_ucb {
    };
};
Here we are stating that the samples are observed without noise (which makes sense, because we evaluate a deterministic function exactly), that we want to output the stats (by setting stats_enabled to true), that the model has to be initialized with 10 samples (selected randomly), and that the optimizer should run for 40 iterations. The rest of the values are taken from the defaults. By default limbo optimizes in \([0,1]\), but you can optimize without bounds by setting BO_PARAM(bool, bounded, false) in the bayes_opt_bobase parameters.
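For reference, a minimal sketch of what that would look like (not needed for this example, which keeps the default \([0,1]\) bounds):

// sketch: disable the bounds by overriding `bounded` in the bayes_opt_bobase parameters
struct bayes_opt_bobase : public defaults::bayes_opt_bobase {
    BO_PARAM(int, stats_enabled, true);
    BO_PARAM(bool, bounded, false); // the search space is no longer restricted to [0, 1]
};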
Then, we have to define the evaluation function for the optimizer to call:
struct Eval {
    // number of input dimensions (x.size())
    static constexpr size_t dim_in = 1;
    // number of dimensions of the result (res.size())
    static constexpr size_t dim_out = 1;

    // the function to be optimized
    Eigen::VectorXd operator()(const Eigen::VectorXd& x) const
    {
        double y = -((5 * x(0) - 2.5) * (5 * x(0) - 2.5)) + 5;
        // we return a 1-dimensional vector
        return tools::make_vector(y);
    }
};
The evaluation struct is required to have the static members dim_in and dim_out, specifying the input and output dimensions. It also has to provide an operator() that takes a const Eigen::VectorXd& of size dim_in and returns another one of size dim_out.
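For illustration only, a hypothetical 2-dimensional evaluation struct (not part of this tutorial) would follow the same pattern; the quadratic below is made up for the sake of the example:

struct Eval2D {
    // two input dimensions, one output dimension
    static constexpr size_t dim_in = 2;
    static constexpr size_t dim_out = 1;

    Eigen::VectorXd operator()(const Eigen::VectorXd& x) const
    {
        // an arbitrary function of two variables, maximal at (0.5, 0.5)
        double y = -(x(0) - 0.5) * (x(0) - 0.5) - (x(1) - 0.5) * (x(1) - 0.5);
        return tools::make_vector(y);
    }
};

In the rest of this tutorial we stick with the 1-dimensional Eval above.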
With this, we can declare the main function:
int main()
{
    // we use the default acquisition function / model / stat / etc.
    bayes_opt::BOptimizer<Params> boptimizer;
    // run the evaluation
    boptimizer.optimize(Eval());
    // the best sample found
    std::cout << "Best sample: " << boptimizer.best_sample()(0) << " - Best observation: " << boptimizer.best_observation()(0) << std::endl;
    return 0;
}
Finally, from the root of limbo, run a build command with the additional switch --exp myExp:
./waf build --exp myExp
Then, an executable named myExp should be produced under the folder build/exp/myExp.
Full main.cpp:
#include <iostream>
// you can also include <limbo/limbo.hpp> but it will slow down the compilation
#include <limbo/bayes_opt/boptimizer.hpp>

using namespace limbo;

struct Params {
    // no noise
    struct bayes_opt_boptimizer : public defaults::bayes_opt_boptimizer {
        BO_PARAM(double, noise, 1e-10);
    };

    // depending on which internal optimizer we use, we need to import different parameters
#ifdef USE_NLOPT
    struct opt_nloptnograd : public defaults::opt_nloptnograd {
    };
#elif defined(USE_LIBCMAES)
    struct opt_cmaes : public defaults::opt_cmaes {
    };
#else
    struct opt_gridsearch : public defaults::opt_gridsearch {
    };
#endif

    // enable / disable the writing of the result files
    struct bayes_opt_bobase : public defaults::bayes_opt_bobase {
        BO_PARAM(int, stats_enabled, true);
    };

    struct kernel_exp : public defaults::kernel_exp {
    };

    // we use 10 random samples to initialize the algorithm
    struct init_randomsampling {
        BO_PARAM(int, samples, 10);
    };

    // we stop after 40 iterations
    struct stop_maxiterations {
        BO_PARAM(int, iterations, 40);
    };

    // we use the default parameters for acqui_ucb
    struct acqui_ucb : public defaults::acqui_ucb {
    };
};

struct Eval {
    // number of input dimensions (x.size())
    static constexpr size_t dim_in = 1;
    // number of dimensions of the result (res.size())
    static constexpr size_t dim_out = 1;

    // the function to be optimized
    Eigen::VectorXd operator()(const Eigen::VectorXd& x) const
    {
        double y = -((5 * x(0) - 2.5) * (5 * x(0) - 2.5)) + 5;
        // we return a 1-dimensional vector
        return tools::make_vector(y);
    }
};

int main()
{
    // we use the default acquisition function / model / stat / etc.
    bayes_opt::BOptimizer<Params> boptimizer;
    // run the evaluation
    boptimizer.optimize(Eval());
    // the best sample found
    std::cout << "Best sample: " << boptimizer.best_sample()(0) << " - Best observation: " << boptimizer.best_observation()(0) << std::endl;
    return 0;
}