This paper presents MOdular FActorial Design (MOFA), a novel and lightweight hyperparameter optimization (HPO) method. MOFA runs HPO over several rounds, each of which alternates between exploration of the hyperparameter space by factorial design and exploitation of evaluation results by factorial analysis. Each round first covers the configuration space with a low-discrepancy set of hyperparameter configurations that also de-correlates the hyperparameters, and then applies factorial analysis to the evaluation results to determine which hyperparameters should be explored further and which should be fixed in the next round. We prove that the inference of MOFA achieves higher confidence than other sampling schemes. Each individual round is highly parallelizable and hence offers major efficiency gains over model-based methods. Empirical results show that MOFA is more effective and efficient than state-of-the-art methods.
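To make the explore/exploit round concrete, here is a minimal sketch of one MOFA-style round under simplifying assumptions: a box-constrained search space, SciPy's Latin hypercube sampler standing in for the paper's low-discrepancy design, and a simple marginal-mean rule with an assumed threshold standing in for the full factorial analysis. The names `evaluate`, `n_samples`, and `n_levels` are illustrative, not from the paper.

```python
# Illustrative sketch of one MOFA-style explore/exploit round (not the
# authors' reference implementation).
import numpy as np
from scipy.stats import qmc

def mofa_round(bounds, evaluate, n_samples=16, n_levels=4):
    """One round over box-constrained hyperparameters.

    bounds:   dict name -> (low, high), the current search box
    evaluate: callable mapping a config dict to a validation score (higher = better)
    """
    names = list(bounds)
    # --- Exploration: a low-discrepancy (Latin hypercube) design that
    # covers the box well; stands in for the paper's sampling scheme.
    sampler = qmc.LatinHypercube(d=len(names))
    unit = sampler.random(n_samples)                      # points in [0, 1)^d
    lows = np.array([bounds[n][0] for n in names])
    highs = np.array([bounds[n][1] for n in names])
    X = lows + unit * (highs - lows)
    scores = np.array([evaluate(dict(zip(names, x))) for x in X])

    # --- Exploitation: factorial-style analysis of marginal effects.
    # Bin each hyperparameter's axis into levels and compare mean scores
    # per level; a large spread suggests the hyperparameter matters.
    new_bounds = {}
    for j, name in enumerate(names):
        levels = np.minimum((unit[:, j] * n_levels).astype(int), n_levels - 1)
        level_means = np.array([scores[levels == k].mean()
                                for k in range(n_levels)])
        best = int(level_means.argmax())
        if level_means.max() - level_means.min() > scores.std():
            # Strong marginal effect: fix the range to the best level.
            width = (highs[j] - lows[j]) / n_levels
            new_bounds[name] = (lows[j] + best * width,
                                lows[j] + (best + 1) * width)
        else:
            # Weak effect: keep exploring the full range next round.
            new_bounds[name] = bounds[name]
    return new_bounds, X[scores.argmax()], scores.max()
```

All `n_samples` evaluations within a round are independent, which is where the parallelism claimed above comes from; in a real implementation the fix-or-explore decision would use a proper significance test rather than the ad-hoc threshold shown here.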
When developing and analyzing new hyperparameter optimization (HPO) methods, it is vital to empirically evaluate and compare them on well-curated benchmark suites. In this work, we list desirable properties and requirements for such benchmarks …
Can we reduce the search cost of Neural Architecture Search (NAS) from days down to only a few hours? NAS methods automate the design of Convolutional Networks (ConvNets) under hardware constraints, and they have emerged as key components of AutoML frameworks …
Metafeatures, or dataset characteristics, have been shown to improve the performance of hyperparameter optimization (HPO). Conventionally, metafeatures are precomputed and used to measure the similarity between datasets, leading to a better initialization …
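As an illustration of the conventional pipeline this abstract refers to, the sketch below warm-starts HPO from the best-known configuration of the most similar past dataset. The particular metafeatures and the `history` store are assumptions for illustration, not this paper's design.

```python
# Illustrative sketch: metafeature-based warm-starting of HPO.
import numpy as np

def metafeatures(X, y):
    """A few simple dataset characteristics (illustrative choices)."""
    n, d = X.shape
    return np.array([np.log(n), np.log(d), len(np.unique(y)), X.std()])

def warm_start(X_new, y_new, history):
    """history: list of (metafeature_vector, best_config) from past datasets."""
    phi = metafeatures(X_new, y_new)
    # Nearest past dataset in metafeature space (Euclidean distance).
    dists = [np.linalg.norm(phi - phi_old) for phi_old, _ in history]
    _, best_config = history[int(np.argmin(dists))]
    return best_config  # used as the first configuration the HPO run evaluates
```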
Recent work on hyperparameter optimization (HPO) has shown the possibility of training certain hyperparameters together with regular parameters. However, these online HPO algorithms still require running evaluations on a set of validation examples at …
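The sketch below shows the kind of online HPO this refers to: a learning rate is trained jointly with the model parameters by differentiating a validation loss through one training step, which is why validation examples must be evaluated during training. This is a standard hypergradient scheme, not necessarily this paper's method; the toy model and data are assumptions.

```python
# Illustrative sketch: updating a hyperparameter (the learning rate) online
# via the gradient of a validation loss, alongside the regular parameters.
import torch

torch.manual_seed(0)
w = torch.randn(5, requires_grad=True)              # regular parameters
log_lr = torch.tensor(-3.0, requires_grad=True)     # hyperparameter: log learning rate
hyper_opt = torch.optim.SGD([log_lr], lr=1e-2)

def loss(params, X, y):
    return ((X @ params - y) ** 2).mean()

X_tr, y_tr = torch.randn(64, 5), torch.randn(64)
X_val, y_val = torch.randn(64, 5), torch.randn(64)

for step in range(100):
    # One differentiable training step: w' = w - lr * grad_train.
    g = torch.autograd.grad(loss(w, X_tr, y_tr), w, create_graph=True)[0]
    w_new = w - log_lr.exp() * g
    # Hypergradient step: validation loss at w', differentiated w.r.t. the
    # learning rate -- this is the per-step validation evaluation noted above.
    hyper_opt.zero_grad()
    loss(w_new, X_val, y_val).backward()
    hyper_opt.step()
    w = w_new.detach().requires_grad_(True)
```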
Modern machine learning algorithms crucially rely on several design decisions to achieve strong performance, making the problem of Hyperparameter Optimization (HPO) more important than ever. Here, we combine the advantages of the popular bandit-based …