Platform For AI: Limits and usage notes of AutoML

Last Updated: Nov 20, 2024

This topic describes the limits and usage notes of AutoML, including the supported regions, supported search algorithms, and scenarios for which the algorithms are suitable. The algorithms include Tree-structured Parzen Estimator (TPE), GridSearch, Random, Evolution, Gaussian Process (GP) for Bayesian Optimization (BO), and Population Based Training (PBT).

Supported regions

You can use AutoML in the following regions:

China (Hangzhou), China (Shanghai), China (Beijing), China (Shenzhen), and China (Hong Kong).

Supported search algorithms

This section describes the search algorithms that AutoML supports and the scenarios for which each algorithm is suitable.

  • TPE: an algorithm that does not require additional dependencies. TPE supports all search space types and is used as the default algorithm in HPO. TPE is suitable for complex, nonlinear, high-dimensional problems that are expensive to evaluate. However, TPE cannot capture relationships between different parameters. For more information, see Algorithms for Hyper-Parameter Optimization. A minimal TPE sketch appears after this list.

  • GridSearch: a search algorithm that evenly divides the search space into a grid and traverses all possible combinations to find the optimal one. This algorithm works well when the number of possible combinations is small. See the GridSearch and Random sketch after this list.

  • Random: a search algorithm that randomly generates hyperparameter combinations. Similar to GridSearch, the algorithm divides the search space into a grid, but randomly selects a hyperparameter combination in each trial. This algorithm works well for nonlinear and high-dimensional problems. See the GridSearch and Random sketch after this list.

  • Evolution: an algorithm that initializes a population of hyperparameter combinations from the search space and, in each generation, selects the better-performing combinations to generate offspring. Evolution requires a large number of trials, but the algorithm logic is simple and easy to extend with new features. A minimal Evolution sketch appears after this list.

    This algorithm is developed based on Large-Scale Evolution of Image Classifiers.

  • GP: a BO method that uses a Gaussian Process to model the loss. As more evaluation data is collected, the posterior distribution becomes more accurate and the optimization performance improves. A minimal GP-based BO sketch appears after this list.

  • PBT: an asynchronous optimization algorithm designed for a fixed computing budget. PBT improves model performance by jointly optimizing a fixed population of models and their hyperparameters, and it continuously adjusts the hyperparameters within a single training run to obtain the optimal combination. A minimal PBT sketch appears after this list.

    This algorithm is developed based on Population Based Training of Neural Networks.
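
The following sketch illustrates a TPE-style search with the open-source hyperopt library, one publicly available implementation of TPE. It only shows how TPE consumes a search space and an objective function; the parameter names, ranges, and toy objective are assumptions made for this example, and the code is not the PAI AutoML interface.

```python
# Minimal TPE-style search with the open-source hyperopt library.
# The search space and objective below are illustrative assumptions.
import math

from hyperopt import fmin, hp, tpe


def objective(params):
    # Toy stand-in for a real training-and-evaluation run; lower is better.
    return (math.log10(params["learning_rate"]) + 2) ** 2 + (params["num_layers"] - 4) ** 2


search_space = {
    "learning_rate": hp.loguniform("learning_rate", math.log(1e-5), math.log(1e-1)),
    "num_layers": hp.choice("num_layers", [1, 2, 3, 4, 5, 6, 7, 8]),
}

# TPE proposes new candidates based on the density of good versus bad past trials.
best = fmin(fn=objective, space=search_space, algo=tpe.suggest, max_evals=50)
print(best)
```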
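
The next sketch contrasts GridSearch and Random on the same small search space, without external dependencies. The parameter names, candidate values, and toy objective are assumptions made for this example.

```python
# Minimal, self-contained illustration of GridSearch versus Random search.
# The search space and objective are toy assumptions for this example.
import itertools
import random

search_space = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
}


def objective(params):
    # Toy stand-in for a real training-and-evaluation run; lower is better.
    return abs(params["learning_rate"] - 0.01) + abs(params["batch_size"] - 64) / 64


# GridSearch: enumerate every combination in the grid.
grid = [dict(zip(search_space, values))
        for values in itertools.product(*search_space.values())]
best_grid = min(grid, key=objective)

# Random: sample a fixed number of combinations from the same grid.
random_trials = [{name: random.choice(values) for name, values in search_space.items()}
                 for _ in range(5)]
best_random = min(random_trials, key=objective)

print("GridSearch best:", best_grid)
print("Random best:", best_random)
```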
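
The Evolution sketch below shows the basic select-and-mutate loop in a generic form. The population size, mutation rule, and toy objective are assumptions for illustration and do not reflect the exact implementation that AutoML uses.

```python
# Minimal evolutionary hyperparameter search: evaluate a population,
# keep the better half, and mutate survivors to produce offspring.
# Population size, mutation rule, and objective are illustrative assumptions.
import random


def objective(params):
    # Toy stand-in for a training run; lower is better.
    return (params["learning_rate"] - 0.01) ** 2 + (params["dropout"] - 0.3) ** 2


def random_params():
    return {"learning_rate": random.uniform(1e-4, 1e-1), "dropout": random.uniform(0.0, 0.8)}


def mutate(params):
    # Perturb each hyperparameter slightly to create an offspring.
    return {name: value * random.uniform(0.8, 1.2) for name, value in params.items()}


population = [random_params() for _ in range(8)]
for generation in range(10):
    population.sort(key=objective)            # evaluate and rank the population
    survivors = population[: len(population) // 2]
    offspring = [mutate(random.choice(survivors)) for _ in survivors]
    population = survivors + offspring        # next generation

print("Best combination:", min(population, key=objective))
```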
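
The GP sketch below uses scikit-learn's GaussianProcessRegressor to model the loss and a simple lower-confidence-bound rule to choose the next trial. It is a generic BO sketch under those assumptions, not the AutoML implementation.

```python
# Minimal Gaussian-Process Bayesian Optimization loop using scikit-learn.
# The one-dimensional search space, objective, and acquisition rule are
# illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def objective(log_lr):
    # Toy stand-in for the validation loss of a training run; best at log_lr = -2.
    return (log_lr + 2.0) ** 2


candidates = np.linspace(-5, -1, 200).reshape(-1, 1)  # log10 of the learning rate
observed_x = [-5.0, -1.0]                              # two initial trials
observed_y = [objective(x) for x in observed_x]

gp = GaussianProcessRegressor()
for _ in range(10):
    # Fit the GP posterior to all trials observed so far.
    gp.fit(np.array(observed_x).reshape(-1, 1), observed_y)
    mean, std = gp.predict(candidates, return_std=True)
    # Lower confidence bound: prefer candidates that are predicted to be good
    # or are still highly uncertain.
    next_x = float(candidates[np.argmin(mean - 1.96 * std)][0])
    observed_x.append(next_x)
    observed_y.append(objective(next_x))

best_log_lr = observed_x[int(np.argmin(observed_y))]
print("Best learning rate found:", 10 ** best_log_lr)
```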
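
The PBT sketch below shows the core exploit-and-explore loop: a fixed population of workers trains in parallel, and underperformers periodically copy the state and hyperparameters of a top performer and then perturb them. The toy training step, scoring rule, and perturbation factors are illustrative assumptions.

```python
# Minimal Population Based Training loop: a fixed population of "workers"
# trains in parallel; low performers periodically copy (exploit) the state
# and hyperparameters of top performers and then perturb them (explore).
# The toy model, score, and perturbation factors are illustrative assumptions.
import copy
import random


def train_step(worker):
    # Toy stand-in for a few steps of real training: the closer the learning
    # rate is to 0.01, the faster the "score" improves.
    worker["score"] += 1.0 - min(abs(worker["learning_rate"] - 0.01) * 50, 1.0)


population = [{"learning_rate": random.uniform(1e-4, 1e-1), "score": 0.0} for _ in range(4)]

for step in range(20):
    for worker in population:
        train_step(worker)
    if step % 5 == 4:                      # exploit/explore every few steps
        population.sort(key=lambda w: w["score"], reverse=True)
        top, bottom = population[0], population[-1]
        bottom.update(copy.deepcopy(top))  # exploit: copy state and hyperparameters
        bottom["learning_rate"] *= random.choice([0.8, 1.2])  # explore: perturb

best = max(population, key=lambda w: w["score"])
print("Best hyperparameters:", {"learning_rate": best["learning_rate"]})
```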
