
Across neighbourhood search for numerical optimization

Posted by Guohua Wu
Publication date: 2014
Research field: Informatics Engineering
Paper language: English
Author: Guohua Wu





Population-based search algorithms (PBSAs), including swarm intelligence algorithms (SIAs) and evolutionary algorithms (EAs), are competitive alternatives for solving complex optimization problems, and they have been widely applied to real-world optimization problems in different fields. In this study, a novel population-based across neighbourhood search (ANS) is proposed for numerical optimization. ANS is motivated by two straightforward assumptions and three important issues raised in improving and designing efficient PBSAs. In ANS, a group of individuals collaboratively searches the solution space for an optimal solution of the optimization problem considered. A collection of the superior solutions found by the individuals so far is maintained and updated dynamically. At each generation, an individual directly searches across the neighbourhoods of multiple superior solutions under the guidance of a Gaussian distribution; this search manner is referred to as across neighbourhood search. The characteristics of ANS are discussed and conceptual comparisons with other PBSAs are given. The principle behind ANS is simple. Moreover, ANS is easy to implement and apply, with only three parameters to tune. Extensive experiments on 18 benchmark optimization functions of different types show that ANS has well-balanced exploration and exploitation capabilities and performs competitively compared with many efficient PBSAs (the Matlab code used in the experiments is available from http://guohuawunudt.gotoip2.com/publications.html).
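The core mechanism the abstract describes — each individual sampling every coordinate near randomly chosen members of a shared archive of superior solutions, with Gaussian perturbation — can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the paper's exact update rule: the parameter names (`pop_size`, `archive_size`, `sigma`), the greedy replacement, the sigma-decay schedule, and the toy objective are all choices made here for the sketch.

```python
import random


def sphere(x):
    """Toy objective: minimize the sum of squares (optimum at the origin)."""
    return sum(v * v for v in x)


def ans_sketch(f, dim=5, pop_size=20, archive_size=8, gens=200, sigma=0.5, seed=1):
    """Illustrative across-neighbourhood search loop (not the authors' exact formulation).

    Each individual builds a trial point by sampling every coordinate near a
    randomly chosen superior solution from a shared archive, using a Gaussian step.
    """
    rng = random.Random(seed)
    popn = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    # archive of the best solutions found so far, updated every generation
    archive = sorted(popn, key=f)[:archive_size]
    for _ in range(gens):
        for i, x in enumerate(popn):
            trial = []
            for d in range(dim):
                s = rng.choice(archive)  # search "across" several neighbourhoods
                trial.append(s[d] + sigma * rng.gauss(0.0, 1.0))
            if f(trial) < f(x):  # keep the trial only if it improves
                popn[i] = trial
        archive = sorted(archive + popn, key=f)[:archive_size]
        sigma *= 0.99  # shrink neighbourhoods: exploration -> exploitation (assumed schedule)
    return min(popn, key=f)


best = ans_sketch(sphere)
```

Sampling each coordinate from a possibly different archive member is what distinguishes the "across" idea from searching around a single guide solution, as in many classic PBSAs.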


Read also

The development, assessment, and comparison of randomized search algorithms heavily rely on benchmarking. In the domain of constrained optimization, the number of currently available benchmark environments bears no relation to the number of distinct problem features. The present paper proposes a scalable linearly constrained optimization problem suitable for benchmarking evolutionary algorithms. The linear benchmarking environment is demonstrated by comparing two recent EA variants.
Jian Yang, Yuhui Shi (2021)
Population-based methods are often used to solve multimodal optimization problems. By combining a niching or clustering strategy, state-of-the-art approaches generally divide the population into several subpopulations to find multiple solutions for the problem at hand. However, these methods are guided only by the fitness value during iterations and suffer from having to determine the number of subpopulations, i.e., the number of niche areas or clusters. To compensate for this drawback, this paper presents an Attention-oriented Brain Storm Optimization (ABSO) method that introduces the attention mechanism into a relatively new swarm intelligence algorithm, Brain Storm Optimization (BSO). By converting the objective space from the fitness space into an attention space, the individuals are clustered and updated iteratively according to their salient values. Rather than converging to a single global optimum, the proposed method can guide the search procedure to converge to multiple salient solutions. The preliminary results show that the proposed method can locate multiple global and local optimal solutions of several multimodal benchmark functions. The proposed method needs less prior knowledge of the problem and can automatically converge to multiple optima guided by the attention mechanism, which gives it excellent potential for further development.
We present a quantum synthesis algorithm designed to produce short circuits and to scale well in practice. The main contribution is a novel representation of circuits able to encode placement and topology using generic gates, which allows the QFAST algorithm to replace expensive searches over circuit structures with a few steps of numerical optimization. When compared against optimal-depth, search-based state-of-the-art techniques, QFAST produces comparable results: 1.19x longer circuits up to four qubits, with an increase in compilation speed of 3.6x. In addition, QFAST scales up to seven qubits. When compared with the state-of-the-art rule-based decomposition techniques in Qiskit, QFAST produces circuits shorter by up to two orders of magnitude (331x), albeit 5.6x slower. We also demonstrate the composability with other techniques and the tunability of our formulation in terms of circuit depth and running time.
Brain storm optimization (BSO) is a newly proposed population-based optimization algorithm that uses a logarithmic sigmoid transfer function to adjust its search range during the convergence process. However, this adjustment varies only with the current iteration number and lacks flexibility and variety, which results in poor search efficiency and robustness for BSO. To alleviate this problem, an adaptive step-length structure together with a success-memory selection strategy is proposed to be incorporated into BSO. The proposed method, adaptive step length based on memory selection BSO, namely ASBSO, applies multiple step lengths to modify the generation process of new solutions, thus supplying a flexible search according to the corresponding problems and convergence periods. The novel memory mechanism, which is capable of evaluating and storing the degree of improvement of solutions, is used to determine the selection probability of step lengths. A set of 57 benchmark functions is used to test ASBSO's search ability, and four real-world problems are adopted to show its application value. All these test results indicate the remarkable improvement in solution quality, scalability, and robustness of ASBSO.
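The adaptive step-length idea in this abstract — keeping several candidate step lengths and selecting among them in proportion to the improvement each has produced — can be sketched generically. This is a minimal illustration of success-memory step selection under assumptions made here (the candidate step values, the credit floor, roulette-wheel selection, and the 1-D test problem are all hypothetical), not ASBSO's exact memory mechanism:

```python
import random


def pick_step(credits, rng, floor=0.1):
    """Roulette-wheel choice of a step length, weighted by the improvement
    credit each step has earned so far. The small floor keeps so-far
    unproductive steps selectable."""
    total = sum(c + floor for c in credits.values())
    r = rng.uniform(0.0, total)
    acc = 0.0
    for step, c in credits.items():
        acc += c + floor
        if r <= acc:
            return step
    return step  # floating-point edge case: fall back to the last step


def memory_hill_climb(f, x0=4.0, iters=300, seed=3):
    """1-D hill climb that adapts among several candidate step lengths by
    crediting whichever step length produced an improvement."""
    rng = random.Random(seed)
    credits = {1.0: 0.0, 0.3: 0.0, 0.05: 0.0}  # candidate step lengths (illustrative)
    x, fx = x0, f(x0)
    for _ in range(iters):
        step = pick_step(credits, rng)
        y = x + step * rng.gauss(0.0, 1.0)
        fy = f(y)
        if fy < fx:
            credits[step] += fx - fy  # remember how useful this step length was
            x, fx = y, fy
    return x


best_x = memory_hill_climb(lambda v: v * v)
```

The intended behavior is that large steps earn credit early, when big improvements are available, and small steps dominate later, when only fine adjustments still pay off.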
In this work, we present a simple and general search space shrinking method, called Angle-Based search space Shrinking (ABS), for Neural Architecture Search (NAS). Our approach progressively simplifies the original search space by dropping unpromising candidates, and thus can reduce the difficulty for existing NAS methods of finding superior architectures. In particular, we propose an angle-based metric to guide the shrinking process. We provide comprehensive evidence showing that, in a weight-sharing supernet, the proposed metric is more stable and accurate than accuracy-based and magnitude-based metrics at predicting the capability of child models. We also show that the angle-based metric converges fast while training the supernet, enabling us to obtain promising shrunk search spaces efficiently. ABS can be easily applied to most NAS approaches (e.g., SPOS, FairNAS, ProxylessNAS, DARTS and PDARTS). Comprehensive experiments show that ABS can dramatically enhance existing NAS approaches by providing a promising shrunk search space.