DRAGON#

DRAGON, for DiRected Acyclic Graphs OptimizatioN, is an open-source Python package for the optimization of the hyperparameters and architecture of Deep Neural Networks. It implements the algorithmic framework proposed in [1]. In this framework, Deep Neural Networks are encoded as Directed Acyclic Graphs (DAGs), where the nodes can be any PyTorch operations parameterized by optimizable hyperparameters and the edges are the connections between them. Unlike most AutoDL or AutoML packages, DRAGON is not a no-code package: calling .fit and then .predict is not enough to get results with DRAGON. To use the package, the user must define a suitable search space, a meta-architecture, and procedures for training and validation. Although this requires more initial effort, it provides a wide range of tools that can be tailored to different problems. The search space consists of objects that fall into three main categories: search space variables, search operators, and search algorithms.
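As a rough illustration of this encoding (not the DRAGON API: the structures below are purely hypothetical), a candidate architecture can be seen as a set of PyTorch layer constructors, each with its hyperparameters, together with the connections between them. In the degenerate case of a simple chain, instantiating the DAG amounts to stacking the layers:

import torch
import torch.nn as nn

# Hypothetical encoding of a chain-shaped candidate: each node is a PyTorch layer
# constructor plus its optimizable hyperparameters; in a general DAG, the edges
# would describe how intermediate outputs flow between nodes.
candidate = [
    (nn.Linear, {"in_features": 8, "out_features": 32}),
    (nn.ReLU, {}),
    (nn.Linear, {"in_features": 32, "out_features": 1}),
]

# Instantiate the encoded architecture; a chain degenerates to a Sequential.
model = nn.Sequential(*[layer(**hp) for layer, hp in candidate])
print(model(torch.randn(4, 8)).shape)  # torch.Size([4, 1])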

You can get familiar with it quickly thanks to the Quickstart tutorial.

Structure#

  • Search Space.
    • DRAGON provides tools to create custom search spaces. Various Python objects called Variables are available to encode different elements such as integers or arrays. These elements come from the zellij package, developed for hyperparameter optimization.

    • Based on these elements, a search space built on Directed Acyclic Graphs (DAGs) is proposed to encode deep neural networks. The nodes can be any PyTorch layers (custom or not) and the edges are the connections between them.

  • Search Operators.
    • Each variable can be given a neighbor attribute, which is used to modify the current object. This function can be seen as a neighborhood or mutation operator. The DRAGON package provides default mutations for each Variable, but users are free to implement their own (a conceptual sketch of a variable with a neighbor operator is given after this list).

    • A crossover operator is also implemented, allowing both arrays and graph-like variables to be mixed.

  • Search Algorithms.
    • DRAGON provides implementations of several search algorithms: Random Search, the Evolutionary Algorithm [1], Mutant-UCB [3], and Hyperband [4].

    • Mutant-UCB and the Evolutionary Algorithm use the neighbor attributes to modify the configurations. Other search algorithms such as local search or simulated annealing could be implemented in a similar way.

    • Each search algorithm comes with a storage system to keep RAM usage low, and an optional version distributed over multiple processors. The distributed version requires an MPI library such as MPICH or Open MPI and is based on the mpi4py package.

  • Performance evaluation.
    • Evaluating a configuration from a search space built with DRAGON is done by building the model (i.e. the neural network) from the configuration elements. The model should then be trained and evaluated, returning a loss (the search algorithms minimize losses).

    • The process of building, training, and evaluating a model based on a configuration depends on the application and has to be implemented by the user (a sketch of such an evaluation function is given after this list).

    • Examples are given for image classification using the skorch package and for load forecasting [2].
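
To make these roles concrete, here is a minimal, self-contained sketch of a search space variable carrying its own neighbor (mutation) operator. It does not use the DRAGON or zellij APIs; IntVariable and its methods are illustrative stand-ins only:

import random

# Illustrative stand-in (not the DRAGON API) for a search space variable that
# carries its own neighborhood/mutation operator through a `neighbor` method.
class IntVariable:
    def __init__(self, label, lower, upper):
        self.label = label
        self.lower = lower
        self.upper = upper

    def random(self):
        # Draw an initial value uniformly from the variable's range.
        return random.randint(self.lower, self.upper)

    def neighbor(self, value, step=2):
        # Perturb the current value within a small radius, clipped to the bounds.
        candidate = value + random.randint(-step, step)
        return min(self.upper, max(self.lower, candidate))

units = IntVariable("hidden_units", 16, 256)
value = units.random()
print(value, units.neighbor(value))

The build-train-evaluate step is application specific. The sketch below assumes a plain PyTorch workflow and a hypothetical build_model helper that turns a configuration into a network; with DRAGON, this is where the DAG encoded by the configuration would be instantiated. A search algorithm then minimizes the value returned by such a function:

import torch
import torch.nn as nn

def build_model(config):
    # Hypothetical builder: instantiates a model from a configuration
    # (here reduced to a single hidden-layer size).
    return nn.Sequential(
        nn.Linear(8, config["hidden_units"]),
        nn.ReLU(),
        nn.Linear(config["hidden_units"], 1),
    )

def evaluate(config, train_data, valid_data, epochs=5):
    # Build, train, and validate a model; return the loss to be minimized.
    model = build_model(config)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.MSELoss()
    x_train, y_train = train_data
    x_valid, y_valid = valid_data
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = criterion(model(x_train), y_train)
        loss.backward()
        optimizer.step()
    with torch.no_grad():
        return criterion(model(x_valid), y_valid).item()

# Toy usage with random data.
train = (torch.randn(64, 8), torch.randn(64, 1))
valid = (torch.randn(16, 8), torch.randn(16, 1))
print(evaluate({"hidden_units": 32}, train, valid))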

Installation#

Basic version#

After cloning the git repository, install DRAGON using:

pip install dragon-autodl==1.1

Distributed version#

If you plan on using the distributed version, you have to install the mpi4py package:

pip install mpi4py
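
Assuming an MPI implementation (e.g. MPICH or Open MPI) is available, a quick sanity check that mpi4py works is to run a small script such as the following (the file name check_mpi.py is arbitrary) with mpiexec -n 2 python check_mpi.py:

from mpi4py import MPI

# Each MPI process reports its rank and the total number of processes.
comm = MPI.COMM_WORLD
print(f"Hello from rank {comm.Get_rank()} of {comm.Get_size()}")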

Dependencies#

Contributors#

References#