
Commit f1f0233

doc: Better introduction to tutorials
Write better introductions for the tutorials, so users can better understand what the expected outcome of each tutorial is.
1 parent da2b582

3 files changed: +65 −6 lines
docs/tutorials/grid_search.rst (+41 −3)
@@ -2,12 +2,50 @@
 Tutorial: Basic Grid Search
 ***************************

-In this tutorial, we learn how to set up cluster_utils to run a basic grid search on an
-arbitrary optimization function. It does not cover all available options but instead
-shows the minimal steps needed to get started.
+In this tutorial you will learn
+
+- how to write a simple script that can be executed by cluster-utils, and
+- how to configure cluster-utils to run a grid search over a few parameters on your
+  script.
+
+It does not cover all available options but instead shows the minimal steps needed to
+get started.

 --------

+
+What is grid search?
+====================
+
+For grid search, you specify a list of parameters and, for each of them, a list of
+values to check. cluster-utils will then execute your script with all possible
+combinations of parameter values and collect the resulting metrics (e.g. the reward
+achieved by a policy trained with the given parameters).
+In the end, you will get an overview of the results and a list of parameter values that
+performed best with respect to your metric.
+
+In the example below, we use the Rosenbrock function::
+
+    f(x,y) = (1 - x)² + 100 · (y - x²)²
+
+For each of the two parameters ``x`` and ``y``, we will check the values ``[0.0, 0.5,
+1.0, 1.5, 2.0]``. That is, a total of 25 jobs will be run with the following parameter
+values:
+
+.. csv-table::
+   :header-rows: 1
+
+   x,y
+   0.0,0.0
+   0.0,0.5
+   0.0,1.0
+   0.0,1.5
+   0.0,2.0
+   0.5,0.0
+   0.5,0.5
+   ...,...
+
+
 Prepare your code
 =================
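To make the 25 combinations concrete, here is a minimal standalone Python sketch (independent of cluster-utils; only the Rosenbrock function and the candidate values are taken from the tutorial text above) that enumerates the same 5 × 5 grid and evaluates the function at each point::

    import itertools

    def rosenbrock(x: float, y: float) -> float:
        # f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2
        return (1 - x) ** 2 + 100 * (y - x**2) ** 2

    # Candidate values from the tutorial; itertools.product yields all 25 (x, y) pairs.
    values = [0.0, 0.5, 1.0, 1.5, 2.0]
    results = {(x, y): rosenbrock(x, y) for x, y in itertools.product(values, values)}

    best = min(results, key=results.get)
    print(f"{len(results)} combinations evaluated; best (x, y) = {best}, f = {results[best]}")

In the tutorial itself, cluster-utils runs each of these combinations as a separate job and collects the metric for you; the loop above only illustrates which parameter values the grid contains.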

docs/tutorials/hp_optimization.rst (+19 −3)
@@ -2,14 +2,30 @@
 Tutorial: Basic Hyperparameter Optimization
 *******************************************

-In this tutorial, we learn how to set up cluster_utils to run a basic hyperparameter
-optimization on an arbitrary optimization function. It does not cover all available
-options but instead shows the minimal steps needed to get started.
+In this tutorial, we reuse the script with the Rosenbrock function from
+:doc:`grid_search`, but instead of a simple grid search, we run a more sophisticated
+hyperparameter optimization on it.
+
+Again, this tutorial does not cover all available options but instead shows the minimal
+steps needed to get started.

 .. note:: If you haven't done so, please read :doc:`grid_search` first.

 --------

+
+What is hyperparameter optimization?
+====================================
+
+In the previous tutorial, we used ``grid_search`` to do an exhaustive search over a
+discrete set of possible parameter values. In this tutorial, we will use
+``hp_optimization``, which instead samples parameter values from continuous
+distributions, based on the results of previous jobs.
+
+Given enough iterations, this ideally converges towards good values for the
+hyperparameters (with respect to the metric you specified).
+
+
 Prepare your code
 =================
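As a rough intuition for the sampling-based approach, the sketch below runs a plain random search over the same Rosenbrock function. It only illustrates the idea of drawing parameter values from continuous ranges; it is not the actual optimizer behind ``hp_optimization``, which additionally uses the results of previous jobs to guide the sampling::

    import random

    def rosenbrock(x: float, y: float) -> float:
        return (1 - x) ** 2 + 100 * (y - x**2) ** 2

    random.seed(0)
    best_params, best_value = None, float("inf")

    # Draw candidates from continuous ranges instead of a fixed grid of values.
    for _ in range(100):
        x = random.uniform(0.0, 2.0)
        y = random.uniform(0.0, 2.0)
        value = rosenbrock(x, y)
        if value < best_value:
            best_params, best_value = (x, y), value

    print(f"best (x, y) = {best_params}, f(x, y) = {best_value:.4f}")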

docs/usage.rst (+5 −0)
@@ -5,6 +5,11 @@ Usage
 Run Batch of Jobs
 =================

+.. note::
+
+   If you are new to cluster_utils, we recommend that you start with
+   :doc:`tutorials/grid_search`.
+
 cluster_utils provides two main commands to run batches of jobs on the cluster:

 - ``grid_search``: Simple grid search over specified parameter ranges.
