Optimization Run name:
Enable GPU?
Number of workers:
Mem. (MB)/Worker:
Objective program: ($x_0) stands for the zeroth command-line parameter, ($x_1) for the first, ...
  • Make sure your program path has execution rights, or run it via the interpreter it is meant for (bash, python, ...)
  • The program path can be anywhere, including directories outside OmniOpt's folder
  • It is recommended to use a wrapper script; check this tutorial for creating one for your script
  • Example:
    bash /scratch/ --layers=($x_0) --neurons=($x_1) ($x_2)
Search type:
Warning: Setting the seed does not work on the ML partition.
Max. number of evaluations:
Number of hyperparameters:

Run OmniOpt

Execute this command in your Taurus shell to run this optimization:

Autohide the config and sbatch commands when they are not needed?

Config file

Sbatch command



What your program needs to be like:

  1. It needs to be able to run on Taurus on your account
  2. It needs to accept its hyperparameters as command-line parameters (for example via Python's argparse module)
  3. A lower RESULT must mean a better outcome, i.e. regions where results are lower are searched more thoroughly
  4. The result needs to be printed to stdout in a single line like this:
    RESULT: 0.123456
  5. If you want to maximize a value, simply prepend - to the result to »negate« it, turning the maximization problem into a minimization problem, like this:
    RESULT: -0.123456
  6. Only the last RESULT line counts; all others are disregarded!
  7. Make sure your program can be run from any working directory, as the current working directory will probably not be the directory your program resides in
  8. Make sure your program runs on the architecture of the partition you chose
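
The requirements above can be sketched in a small Python program. This is an illustrative sketch only: the hyperparameter names (--layers, --neurons) and the dummy loss function are our assumptions, not something OmniOpt prescribes; the hard requirement is the final RESULT line on stdout.

```python
# Minimal sketch of a conforming objective program. The hyperparameter
# names (--layers, --neurons) and the dummy loss are illustrative only;
# the hard requirement is a "RESULT: <float>" line printed to stdout,
# where a lower value means a better outcome.
import argparse

def evaluate(layers: int, neurons: int) -> float:
    # Placeholder for real training/evaluation code; returns a dummy loss.
    return 1.0 / (layers * neurons)

def main(argv=None) -> float:
    parser = argparse.ArgumentParser(description="Toy objective program")
    parser.add_argument("--layers", type=int, required=True)
    parser.add_argument("--neurons", type=int, required=True)
    args = parser.parse_args(argv)
    loss = evaluate(args.layers, args.neurons)
    # Only the last RESULT line counts, so print it exactly once, at the end.
    print(f"RESULT: {loss}")
    return loss

# Example invocation; OmniOpt would supply real values via ($x_0), ($x_1):
main(["--layers=4", "--neurons=8"])  # prints "RESULT: 0.03125"
```

In the objective-program field this would be called as, e.g., python3 objective.py --layers=($x_0) --neurons=($x_1); OmniOpt substitutes the placeholders before each run.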

Additional information:

  1. Once the job has finished, go to the OmniOpt folder it ran in and run
    to gain easy access to the results
  2. Anything in the STRING: FLOAT format will be saved in the DB and can be exported to a CSV file via bash
  3. If your program does not seem to run properly, do
    , go to your project and run »Check this project for errors«. It checks for the most common errors in your project.
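
Point 2 above means your program may log arbitrary extra metrics next to the objective. A sketch of such output follows; the metric names (accuracy, runtime) are our own choice, not required by OmniOpt:

```python
# Sketch: besides the mandatory RESULT line, any "NAME: FLOAT" line printed
# to stdout is saved in the DB and can later be exported to CSV. The metric
# names below (accuracy, runtime) are illustrative assumptions.
def report(loss: float, accuracy: float, runtime_s: float) -> str:
    lines = [
        f"accuracy: {accuracy}",
        f"runtime: {runtime_s}",
        f"RESULT: {loss}",  # the optimizer minimizes this value
    ]
    out = "\n".join(lines)
    print(out)
    return out

report(0.123456, 0.91, 42.0)
```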

Multi-parameter optimization:

  1. Multi-parameter optimization is possible in a limited but very easy way: instead of writing RESULT: 0.5, write RESULT1: ..., RESULT2: ..., and so on.
  2. When no RESULT line is found on stdout, the following equation will be run:

    and the resulting RESULT will be used.
  3. All output parameters will still be saved.
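
The multi-result convention above can be sketched as follows (the two loss values are made up for illustration):

```python
# Sketch: output for multi-parameter optimization. Instead of a single
# RESULT line, print RESULT1, RESULT2, ...; when no plain RESULT line is
# present, OmniOpt combines them into one value itself.
def report_multi(losses) -> list:
    lines = [f"RESULT{i}: {loss}" for i, loss in enumerate(losses, start=1)]
    for line in lines:
        print(line)
    return lines

report_multi([0.25, 0.75])
# prints:
# RESULT1: 0.25
# RESULT2: 0.75
```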
Based on HyperOpt: Bergstra, J., Yamins, D., Cox, D. D. (2013). Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. Proc. of the 30th International Conference on Machine Learning (ICML 2013), June 2013, pp. I-115 to I-123.