Commit a48ac03

Pushing the docs for revision for branch: master, commit 39d7053223b583c97df53ab6881d6f4508922435
1 parent c117116 commit a48ac03

File tree

361 files changed

+2043
-32730
lines changed


Binary files changed (content not shown):

beta/doctrees/development.doctree          7.76 KB
beta/doctrees/environment.pickle           1.02 MB
beta/doctrees/getting_started.doctree     -79 Bytes
beta/doctrees/index.doctree               -7.89 KB
beta/doctrees/install.doctree              1.36 KB
beta/doctrees/modules/classes.doctree     -1.22 KB
beta/doctrees/modules/index.doctree       -2.73 KB
beta/doctrees/modules/plots.doctree      -61.1 KB
beta/doctrees/modules/space.doctree        -140 KB
beta/doctrees/modules/utils.doctree      -58.8 KB
beta/doctrees/whats_new.doctree            3.04 KB
beta/doctrees/whats_new/v0.8.doctree       2.85 KB

beta/html/_downloads/006e45de69c37f7b58e1da3a6fd18e4e/parallel-optimization.ipynb

Lines changed: 2 additions & 2 deletions

@@ -15,7 +15,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"\n# Parallel optimization\n\n\nIaroslav Shcherbatyi, May 2017.\n\nReviewed by Manoj Kumar and Tim Head.\n\nReformatted by Holger Nahrstaedt 2020\n\n.. currentmodule:: skopt\n\nIntroduction\n============\n\nFor many practical black box optimization problems expensive objective can be\nevaluated in parallel at multiple points. This allows to get more objective\nevaluations per unit of time, which reduces the time necessary to reach good\nobjective values when appropriate optimization algorithms are used, see for\nexample results in [1] and the references therein.\n\n\nOne such example task is a selection of number and activation function of a\nneural network which results in highest accuracy for some machine learning\nproblem. For such task, multiple neural networks with different combinations\nof number of neurons and activation function type can be evaluated at the same\ntime in parallel on different cpu cores / computational nodes.\n\nThe \u201cask and tell\u201d API of scikit-optimize exposes functionality that allows to\nobtain multiple points for evaluation in parallel. Intended usage of this\ninterface is as follows:\n\n1. Initialize instance of the `Optimizer` class from skopt\n2. Obtain n points for evaluation in parallel by calling the `ask` method of\n an optimizer instance with the `n_points` argument set to n > 0\n3. Evaluate points\n4. Provide points and corresponding objectives using the `tell` method of\n an optimizer instance\n5. Continue from step 2 until eg maximum number of evaluations reached\n"
+"\n# Parallel optimization\n\n\nIaroslav Shcherbatyi, May 2017.\nReviewed by Manoj Kumar and Tim Head.\nReformatted by Holger Nahrstaedt 2020\n\n.. currentmodule:: skopt\n\nIntroduction\n============\n\nFor many practical black box optimization problems expensive objective can be\nevaluated in parallel at multiple points. This allows to get more objective\nevaluations per unit of time, which reduces the time necessary to reach good\nobjective values when appropriate optimization algorithms are used, see for\nexample results in [1]_ and the references therein.\n\n\nOne such example task is a selection of number and activation function of a\nneural network which results in highest accuracy for some machine learning\nproblem. For such task, multiple neural networks with different combinations\nof number of neurons and activation function type can be evaluated at the same\ntime in parallel on different cpu cores / computational nodes.\n\nThe \u201cask and tell\u201d API of scikit-optimize exposes functionality that allows to\nobtain multiple points for evaluation in parallel. Intended usage of this\ninterface is as follows:\n\n1. Initialize instance of the `Optimizer` class from skopt\n2. Obtain n points for evaluation in parallel by calling the `ask` method of an optimizer instance with the `n_points` argument set to n > 0\n3. Evaluate points\n4. Provide points and corresponding objectives using the `tell` method of an optimizer instance\n5. Continue from step 2 until eg maximum number of evaluations reached\n"
 ]
 },
 {
@@ -51,7 +51,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Note that if `n_points` is set to some integer > 0 for the `ask` method, the\nresult will be a list of points, even for `n_points`=1. If the argument is\nset to `None` (default value) then a single point (but not a list of points)\nwill be returned.\n\nThe default \"minimum constant liar\" [1] parallelization strategy is used in\nthe example, which allows to obtain multiple points for evaluation with a\nsingle call to the `ask` method with any surrogate or acquisition function.\nParalellization strategy can be set using the \"strategy\" argument of `ask`.\nFor supported parallelization strategies see the documentation of\nscikit-optimize.\n\n[1] [https://hal.archives-ouvertes.fr/hal-00732512/document](https://hal.archives-ouvertes.fr/hal-00732512/document) .\n"
+"Note that if `n_points` is set to some integer > 0 for the `ask` method, the\nresult will be a list of points, even for `n_points` = 1. If the argument is\nset to `None` (default value) then a single point (but not a list of points)\nwill be returned.\n\nThe default \"minimum constant liar\" [1]_ parallelization strategy is used in\nthe example, which allows to obtain multiple points for evaluation with a\nsingle call to the `ask` method with any surrogate or acquisition function.\nParallelization strategy can be set using the \"strategy\" argument of `ask`.\nFor supported parallelization strategies see the documentation of\nscikit-optimize.\n\n.. [1] `<https://hal.archives-ouvertes.fr/hal-00732512/document>`_\n"
 ]
 }
 ],
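The five-step ask-and-tell loop described in this notebook diff can be sketched with a minimal stand-in optimizer. This is plain random sampling with no surrogate model and no "constant liar" strategy; `RandomOptimizer` and `objective` are illustrative names, not part of skopt, though the `ask(n_points=...)`/`tell` shape mirrors the documented interface:

```python
import random

class RandomOptimizer:
    """Minimal stand-in for skopt's Optimizer, illustrating only the
    ask/tell protocol (uniform random sampling, no surrogate model)."""

    def __init__(self, bounds):
        self.bounds = bounds          # [(low, high), ...] per dimension
        self.Xi, self.yi = [], []     # evaluated points and their objectives

    def ask(self, n_points=None):
        def sample():
            return [random.uniform(lo, hi) for lo, hi in self.bounds]
        if n_points is None:          # mirror skopt: None -> a single point
            return sample()
        return [sample() for _ in range(n_points)]  # int -> a list, even for 1

    def tell(self, x, y):
        self.Xi.append(x)
        self.yi.append(y)

def objective(x):                     # toy quadratic with minimum at (1, 2)
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

opt = RandomOptimizer([(-5.0, 5.0), (-5.0, 5.0)])
for _ in range(20):                   # steps 2-5 of the loop described above
    points = opt.ask(n_points=4)      # 4 candidate points per round
    for x in points:                  # would be evaluated in parallel; here sequential
        opt.tell(x, objective(x))

best = min(opt.yi)
```

With skopt itself, `opt.tell(points, objectives)` accepts the whole batch at once after the parallel evaluations return.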

beta/html/_downloads/011af548156c5f3d07b382300fea643d/strategy-comparison.py

Lines changed: 4 additions & 4 deletions

@@ -30,7 +30,7 @@
 # Toy model
 # =========
 #
-# We will use the `branin` function as toy model for the expensive function.
+# We will use the :class:`benchmarks.branin` function as toy model for the expensive function.
 # In a real world application this function would be unknown and expensive
 # to evaluate.

@@ -84,7 +84,7 @@ def plot_branin():
 #
 # The objective of this example is to find one of these minima in as
 # few iterations as possible. One iteration is defined as one call
-# to the `branin` function.
+# to the :class:`benchmarks.branin` function.
 #
 # We will evaluate each model several times using a different seed for the
 # random number generator. Then compare the average performance of these

@@ -133,12 +133,12 @@ def run(minimizer, n_iter=5):
 #############################################################################
 # This plot shows the value of the minimum found (y axis) as a function
 # of the number of iterations performed so far (x axis). The dashed red line
-# indicates the true value of the minimum of the branin function.
+# indicates the true value of the minimum of the :class:`benchmarks.branin` function.
 #
 # For the first ten iterations all methods perform equally well as they all
 # start by creating ten random samples before fitting their respective model
 # for the first time. After iteration ten the next point at which
-# to evaluate `branin` is guided by the model, which is where differences
+# to evaluate :class:`benchmarks.branin` is guided by the model, which is where differences
 # start to appear.
 #
 # Each minimizer only has access to noisy observations of the objective
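For reference, the `branin` toy model this diff keeps pointing at is the standard Branin-Hoo benchmark. A plain-Python version with the commonly published coefficients (written here independently of skopt's implementation) looks like:

```python
from math import pi, cos

def branin(x):
    """Branin-Hoo benchmark: two inputs, three global minima,
    each with objective value ~0.397887."""
    x1, x2 = x
    b = 5.1 / (4 * pi ** 2)
    c = 5.0 / pi
    t = 1.0 / (8 * pi)
    return (x2 - b * x1 ** 2 + c * x1 - 6.0) ** 2 \
        + 10.0 * (1 - t) * cos(x1) + 10.0

# the three global minima usually quoted in the literature
minima = [(-pi, 12.275), (pi, 2.275), (9.42478, 2.475)]
values = [branin(m) for m in minima]
```

All three minima evaluate to the same value, 10/(8*pi) ~ 0.397887, which is the "true value of the minimum" marked by the dashed red line the diff mentions.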

beta/html/_downloads/16e24a5afc2d92e961c3c58e7d73c6e6/ask-and-tell.ipynb

Lines changed: 1 addition & 1 deletion

@@ -170,7 +170,7 @@
 },
 "outputs": [],
 "source": [
-"for i in range(50):\n next_x = opt.ask()\n f_val = objective(next_x)\n opt.tell(next_x, f_val)\n\nplot_optimizer(opt, x, fx)"
+"for i in range(10):\n next_x = opt.ask()\n f_val = objective(next_x)\n opt.tell(next_x, f_val)\n\nplot_optimizer(opt, x, fx)"
 ]
 },
 {

beta/html/_downloads/1c9bc01d15cf0a1e95b499b64cae5679/bayesian-optimization.ipynb

Lines changed: 1 addition & 1 deletion

@@ -141,7 +141,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Let us now visually examine\n\n1. The approximation of the fit gp model to the original function.\n2. The acquistion values that determine the next point to be queried.\n\n"
+"Let us now visually examine\n\n1. The approximation of the fit gp model to the original function.\n2. The acquisition values that determine the next point to be queried.\n\n"
 ]
 },
 {

beta/html/_downloads/22360dec306b5faa13a47fefbbde8c89/visualizing-results.py

Lines changed: 19 additions & 21 deletions

@@ -19,16 +19,16 @@
 will be spread over the whole parameter space and later samples should
 cluster around the minimum.

-The `plot_evaluations()` function helps with visualizing the location and
+The :class:`plots.plot_evaluations` function helps with visualizing the location and
 order in which samples are evaluated for objectives with an arbitrary
 number of dimensions.

-The `plot_objective()` function plots the partial dependence of the objective,
+The :class:`plots.plot_objective` function plots the partial dependence of the objective,
 as represented by the surrogate model, for each dimension and as pairs of the
 input dimensions.

 All of the minimizers implemented in `skopt` return an [`OptimizeResult`]()
-instance that can be inspected. Both `plot_evaluations` and `plot_objective`
+instance that can be inspected. Both :class:`plots.plot_evaluations` and :class:`plots.plot_objective`
 are helpers that do just that
 """

@@ -42,15 +42,15 @@
 # Toy models
 # ==========
 #
-# We will use two different toy models to demonstrate how `plot_evaluations()`
+# We will use two different toy models to demonstrate how :class:`plots.plot_evaluations`
 # works.
 #
-# The first model is the `branin` function which has two dimensions and three
+# The first model is the :class:`benchmarks.branin` function which has two dimensions and three
 # minima.
 #
 # The second model is the `hart6` function which has six dimension which makes
 # it hard to visualize. This will show off the utility of
-# `plot_evaluations()`.
+# :class:`plots.plot_evaluations`.

 from skopt.benchmarks import branin as branin
 from skopt.benchmarks import hart6 as hart6_

@@ -64,7 +64,7 @@ def hart6(x):
 # Starting with `branin`
 # ======================
 #
-# To start let's take advantage of the fact that `branin` is a simple
+# To start let's take advantage of the fact that :class:`benchmarks.branin` is a simple
 # function which can be visualised in two dimensions.

 from matplotlib.colors import LogNorm

@@ -105,8 +105,8 @@ def plot_branin():
 # =================================
 #
 # Next we use an extra trees based minimizer to find one of the minima of the
-# `branin` function. Then we visualize at which points the objective is being
-# evaluated using `plot_evaluations()`.
+# :class:`benchmarks.branin` function. Then we visualize at which points the objective is being
+# evaluated using :class:`plots.plot_evaluations`.

 from functools import partial
 from skopt.plots import plot_evaluations

@@ -122,7 +122,7 @@ def plot_branin():
 _ = plot_evaluations(forest_res, bins=10)

 #############################################################################
-# `plot_evaluations()` creates a grid of size `n_dims` by `n_dims`.
+# :class:`plots.plot_evaluations` creates a grid of size `n_dims` by `n_dims`.
 # The diagonal shows histograms for each of the dimensions. In the lower
 # triangle (just one plot in this case) a two dimensional scatter plot of all
 # points is shown. The order in which points were evaluated is encoded in the

@@ -134,7 +134,7 @@ def plot_branin():
 # of the true miminum. The histograms show that the objective is evaluated
 # more often at locations near to one of the three minima.
 #
-# Using `plot_objective()` we can visualise the one dimensional partial
+# Using :class:`plots.plot_objective` we can visualise the one dimensional partial
 # dependence of the surrogate model for each dimension. The contour plot in
 # the bottom left corner shows the two dimensional partial dependence. In this
 # case this is the same as simply plotting the objective as it only has two

@@ -144,8 +144,7 @@ def plot_branin():
 # ------------------------
 #
 # Partial dependence plots were proposed by
-# [Friedman (2001)](http://projecteuclid.org/euclid.aos/1013203451)
-# (doi:10.1214/aos/1013203451 section 8.2)
+# [Friedman (2001)]_
 # as a method for interpreting the importance of input features used in
 # gradient boosting machines. Given a function of :math:`k`: variables
 # :math:`y=f\left(x_1, x_2, ..., x_k\right)`: the

@@ -154,7 +153,7 @@ def plot_branin():
 # with the sum running over a set of $N$ points drawn at random from the
 # search space.
 #
-# The idea is to visulize how the value of :math:`x_j`: influences the function
+# The idea is to visualize how the value of :math:`x_j`: influences the function
 # :math:`f`: after averaging out the influence of all other variables.

 from skopt.plots import plot_objective

@@ -171,7 +170,7 @@ def plot_branin():
 # Random sampling
 # ===============
 #
-# Compare this to a mimizer which picks points at random. There is no
+# Compare this to a minimizer which picks points at random. There is no
 # structure visible in the order in which it evaluates the objective. Because
 # there is no model involved in the process of picking sample points at
 # random, we can not plot the partial dependence of the model.

@@ -185,12 +184,12 @@ def plot_branin():
 # =========================
 #
 # Visualising what happens in two dimensions is easy, where
-# `plot_evaluations()` and `plot_objective()` start to be useful is when the
+# :class:`plots.plot_evaluations` and :class:`plots.plot_objective` start to be useful is when the
 # number of dimensions grows. They take care of many of the more mundane
 # things needed to make good plots of all combinations of the dimensions.
 #
-# The next example uses `hart6` which has six dimensions and shows both
-# `plot_evaluations()` and `plot_objective()`.
+# The next example uses class:`benchmarks.hart6` which has six dimensions and shows both
+# :class:`plots.plot_evaluations` and :class:`plots.plot_objective`.

 bounds = [(0., 1.),] * 6

@@ -207,7 +206,7 @@ def plot_branin():
 # ==============================
 #
 # To make things more interesting let's add two dimension to the problem.
-# As `hart6` only depends on six dimensions we know that for this problem
+# As :class:`benchmarks.hart6` only depends on six dimensions we know that for this problem
 # the new dimensions will be "flat" or uninformative. This is clearly visible
 # in both the placement of samples and the partial dependence plots.

@@ -221,5 +220,4 @@ def plot_branin():
 _ = plot_evaluations(forest_res)
 _ = plot_objective(forest_res)

-
-
+# .. [Friedman (2001)] `doi:10.1214/aos/1013203451 section 8.2 <http://projecteuclid.org/euclid.aos/1013203451>`
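The partial-dependence definition this diff refers to (an average of the objective over N points drawn at random from the search space, with the inspected dimension held fixed) can be written down directly. `partial_dependence` below is an illustrative Monte-Carlo sketch of that formula, not skopt's `plots` implementation:

```python
import random

def partial_dependence(f, bounds, j, value, n=2000, seed=0):
    """Average f over n points drawn uniformly from the search space,
    with dimension j pinned to `value` in every sample."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        x[j] = value          # hold the inspected dimension fixed
        total += f(x)
    return total / n

# toy objective f(x) = x0^2 + x1^2 on [-1, 1]^2: averaging out x1
# leaves roughly value^2 + E[x1^2] = value^2 + 1/3
f = lambda x: x[0] ** 2 + x[1] ** 2
pd0 = partial_dependence(f, [(-1.0, 1.0), (-1.0, 1.0)], j=0, value=0.0)
```

Sweeping `value` over a grid and plotting `pd0` against it reproduces, in miniature, the one-dimensional panels that `plot_objective` draws from the surrogate model.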

beta/html/_downloads/283a4aef788dadc4cf49a7a2d11804bd/interruptible-optimization.py

Lines changed: 14 additions & 16 deletions

@@ -14,18 +14,15 @@
 If for some reason the process has to be interrupted results are irreversibly
 lost, and the routine has to start over from the beginning.

-With the help of the `CheckpointSaver` callback the optimizer's current state
+With the help of the :class:`callbacks.CheckpointSaver` callback the optimizer's current state
 can be saved after each iteration, allowing to restart from that point at any
 time.

 This is useful, for example,

-* if you don't know how long the process will take and cannot hog
-  computational resources forever
-* if there might be system failures due to shaky infrastructure
-  (or colleagues...)
-* if you want to adjust some parameters and continue with the already obtained
-  results
+* if you don't know how long the process will take and cannot hog computational resources forever
+* if there might be system failures due to shaky infrastructure (or colleagues...)
+* if you want to adjust some parameters and continue with the already obtained results

 """
 print(__doc__)

@@ -45,7 +42,7 @@
 #
 # We will use pretty much the same optimization problem as in the
 # :ref:`sphx_glr_auto_examples_bayesian-optimization.py`
-# notebook. Additionaly we will instantiate the `CheckpointSaver`
+# notebook. Additionally we will instantiate the :class:`callbacks.CheckpointSaver`
 # and pass it to the minimizer:

 from skopt import gp_minimize

@@ -82,14 +79,14 @@ def obj_fun(x, noise_level=noise_level):
 # tells you that he had had an update scheduled for Sunday noon – who
 # doesn't like updates?
 #
-# `gp_minimize` did not finish, and there is no `res` variable with the
+# :class:`gp_minimize` did not finish, and there is no `res` variable with the
 # actual results!
 #
 # Restoring the last checkpoint
 # =============================
 #
-# Luckily we employed the `CheckpointSaver` and can now restore the latest
-# result with `skopt.load`
+# Luckily we employed the :class:`callbacks.CheckpointSaver` and can now restore the latest
+# result with :class:`skopt.load`
 # (see :ref:`sphx_glr_auto_examples_store-and-load-results.py` for more
 # information on that)

@@ -123,10 +120,11 @@ def obj_fun(x, noise_level=noise_level):
 # =================
 #
 # * **changes in search space:** You can use this technique to interrupt
-#   the search, tune the search space and continue the optimization. Note
-#   that the optimizers will complain if `x0` contains parameter values not
-#   covered by the dimension definitions, so in many cases shrinking the
-#   search space will not work without deleting the offending runs from
-#   `x0` and `y0`.
+#   the search, tune the search space and continue the optimization. Note
+#   that the optimizers will complain if `x0` contains parameter values not
+#   covered by the dimension definitions, so in many cases shrinking the
+#   search space will not work without deleting the offending runs from
+#   `x0` and `y0`.
 # * see :ref:`sphx_glr_auto_examples_store-and-load-results.py`
+#
 # for more information on how the results get saved and possible caveats
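The checkpoint-and-restore pattern this diff documents for `callbacks.CheckpointSaver` can be sketched without skopt: a callback object that pickles the optimizer's state after every iteration, plus a load after the "interruption". The `CheckpointSaver` name matches skopt's class, but this body and the toy loop around it are illustrative stand-ins, not skopt's implementation:

```python
import os
import pickle
import tempfile

class CheckpointSaver:
    """Minimal stand-in for skopt's callbacks.CheckpointSaver:
    pickle whatever result object it is handed, every iteration."""
    def __init__(self, path):
        self.path = path
    def __call__(self, result):
        with open(self.path, "wb") as f:
            pickle.dump(result, f)

path = os.path.join(tempfile.mkdtemp(), "checkpoint.pkl")
saver = CheckpointSaver(path)

# a toy "optimization" loop that invokes the callback once per
# iteration, the way gp_minimize invokes the callbacks it is given
state = {"x_iters": [], "func_vals": []}
for x in [3.0, 1.5, 1.1]:
    state["x_iters"].append(x)
    state["func_vals"].append((x - 1.0) ** 2)
    saver(state)

# after an interruption, the latest checkpoint can be restored
with open(path, "rb") as f:
    restored = pickle.load(f)
```

With skopt itself the restored object is an `OptimizeResult`, and its `x_iters`/`func_vals` are passed back to `gp_minimize` as `x0`/`y0` to continue the search.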

beta/html/_downloads/4764fcd3bd386b4f7eff366aafcaaa8a/sklearn-gridsearchcv-replacement.py

Lines changed: 5 additions & 5 deletions

@@ -11,19 +11,19 @@
 ============

 This example assumes basic familiarity with
-`scikit-learn <http://scikit-learn.org/stable/index.html>`.
+`scikit-learn <http://scikit-learn.org/stable/index.html>`_.

 Search for parameters of machine learning models that result in best
 cross-validation performance is necessary in almost all practical
 cases to get a model with best generalization estimate. A standard
-approach in scikit-learn is using `GridSearchCV` class, which takes
+approach in scikit-learn is using :obj:`sklearn.model_selection.GridSearchCV` class, which takes
 a set of values for every parameter to try, and simply enumerates all
 combinations of parameter values. The complexity of such search grows
 exponentially with the addition of new parameters. A more scalable
-approach is using `RandomizedSearchCV`, which however does not take
+approach is using :obj:`sklearn.model_selection.RandomizedSearchCV`, which however does not take
 advantage of the structure of a search space.

-Scikit-optimize provides a drop-in replacement for `GridSearchCV`,
+Scikit-optimize provides a drop-in replacement for :obj:`sklearn.model_selection.GridSearchCV`,
 which utilizes Bayesian Optimization where a predictive model referred
 to as "surrogate" is used to model the search space and utilized to
 arrive at good parameter values combination as soon as possible.

@@ -126,7 +126,7 @@
 # Progress monitoring and control using `callback` argument of `fit` method
 # =========================================================================
 #
-# It is possible to monitor the progress of BayesSearchCV with an event
+# It is possible to monitor the progress of :class:`BayesSearchCV` with an event
 # handler that is called on every step of subspace exploration. For single job
 # mode, this is called on every evaluation of model configuration, and for
 # parallel mode, this is called when n_jobs model configurations are evaluated
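The exponential-growth claim in this diff's docstring is easy to make concrete by counting grid combinations: a grid search fits one model per element of the Cartesian product of the per-parameter grids. The parameter names and grids below are made up for illustration:

```python
from itertools import product

# grid sizes per hyperparameter; the number of grid-search fits is
# their product, so it grows exponentially as parameters are added
grid = {
    "C": [0.01, 0.1, 1, 10, 100],        # 5 values
    "gamma": [1e-4, 1e-3, 1e-2, 1e-1],   # 4 values
    "kernel": ["rbf", "poly"],           # 2 values
}
n_grid = len(list(product(*grid.values())))  # 5 * 4 * 2 = 40 fits
```

Adding one more parameter with k candidate values multiplies the count by k, which is why `BayesSearchCV` instead fixes a budget of `n_iter` evaluations and lets the surrogate choose where to spend it.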

beta/html/_downloads/55bac0b8fb5ddffe9b26fe03c5087713/ask-and-tell.py

Lines changed: 1 addition & 1 deletion

@@ -146,7 +146,7 @@ def plot_optimizer(opt, x, fx):
 # Let us sample a few more points and plot the optimizer again:


-for i in range(50):
+for i in range(10):
     next_x = opt.ask()
     f_val = objective(next_x)
     opt.tell(next_x, f_val)
