
Logic Guided Genetic Algorithms

Added by Dhananjay Ashok
Publication date: 2020
Research language: English





We present a novel Auxiliary Truth enhanced Genetic Algorithm (GA) that uses logical or mathematical constraints both as a means of data augmentation and as part of the loss (in conjunction with the traditional MSE), with the aim of increasing both the data efficiency and the accuracy of symbolic regression (SR) algorithms. Our method, the Logic-Guided Genetic Algorithm (LGGA), takes as input a set of labelled data points and auxiliary truths (ATs) (mathematical facts known a priori about the unknown function the regressor aims to learn) and outputs a specially generated and curated dataset that can be used with any SR method. Three key insights underpin our method: first, SR users often know simple ATs about the function they are trying to learn. Second, whenever an SR system produces a candidate equation inconsistent with these ATs, we can compute a counterexample to prove the inconsistency, and this counterexample can be used to augment the dataset and be fed back to the SR system in a corrective feedback loop. Third, the value of these ATs is that their use in both the loss function and the data augmentation process leads to better rates of convergence, accuracy, and data efficiency. We evaluate LGGA against state-of-the-art SR tools, namely Eureqa and TuringBot, on 16 physics equations from The Feynman Lectures on Physics. We find that using these SR tools in conjunction with LGGA allows them to solve up to 30.0% more equations while needing only a fraction of the data required by the same tools without LGGA, i.e., up to a 61.9% improvement in data efficiency.
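
To make the feedback loop concrete, the following is a minimal Python sketch of the idea, assuming a symmetry-style auxiliary truth. The function names (combined_loss, find_counterexample, augment_dataset), the random counterexample search, and the stand-in labelling function are illustrative assumptions, not the paper's actual implementation or API.

import numpy as np

def mse(candidate, X, y):
    """Traditional mean-squared-error term on the labelled data."""
    return float(np.mean((candidate(X) - y) ** 2))

def at_violation(candidate, at, X):
    """Average violation of an auxiliary truth; `at` returns a non-negative
    value at points where the truth holds for the candidate."""
    return float(np.mean([max(0.0, -at(candidate, x)) for x in X]))

def combined_loss(candidate, X, y, at, lam=1.0):
    """AT-enhanced loss: traditional MSE plus a penalty for violating the AT."""
    return mse(candidate, X, y) + lam * at_violation(candidate, at, X)

def symmetry_at(candidate, x, tol=1e-6):
    """Example AT: the unknown function is symmetric in its two arguments."""
    swapped = np.array([[x[1], x[0]]])
    return tol - abs(candidate(x.reshape(1, -1))[0] - candidate(swapped)[0])

def find_counterexample(candidate, at, n_trials=1000, seed=0):
    """Random search for a point where the candidate equation violates the AT."""
    rng = np.random.default_rng(seed)
    for _ in range(n_trials):
        x = rng.uniform(-10, 10, size=2)
        if at(candidate, x) < 0:
            return x
    return None

def augment_dataset(candidate, X, y, at, label_fn):
    """Corrective feedback step: label a counterexample (via a stand-in
    labelling function here) and append it to the data for the SR tool."""
    cx = find_counterexample(candidate, at)
    if cx is not None:
        X = np.vstack([X, cx])
        y = np.append(y, label_fn(cx))
    return X, y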



Related research

Martin Pelikan, 2008
This study analyzes performance of several genetic and evolutionary algorithms on randomly generated NK fitness landscapes with various values of n and k. A large number of NK problem instances are first generated for each n and k, and the global optimum of each instance is obtained using the branch-and-bound algorithm. Next, the hierarchical Bayesian optimization algorithm (hBOA), the univariate marginal distribution algorithm (UMDA), and the simple genetic algorithm (GA) with uniform and two-point crossover operators are applied to all generated instances. Performance of all algorithms is then analyzed and compared, and the results are discussed.
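
For orientation, here is a minimal Python sketch of an NK fitness landscape of the kind benchmarked above; the circular bit neighbourhood, uniform random lookup tables, and the brute-force search (a stand-in for the branch-and-bound step) are illustrative assumptions, not the paper's instance generator.

import itertools
import numpy as np

def make_nk_landscape(n, k, seed=0):
    """Return a fitness function over length-n bit strings with epistasis k."""
    rng = np.random.default_rng(seed)
    # One lookup table per bit: a random contribution for each of the
    # 2^(k+1) settings of the bit and its k circularly adjacent neighbours.
    tables = rng.random((n, 2 ** (k + 1)))

    def fitness(bits):
        total = 0.0
        for i in range(n):
            neighbourhood = [bits[(i + j) % n] for j in range(k + 1)]
            index = int("".join(map(str, neighbourhood)), 2)
            total += tables[i, index]
        return total / n

    return fitness

# Brute-force global optimum on a tiny instance (stand-in for branch-and-bound).
f = make_nk_landscape(n=10, k=2)
best = max(itertools.product([0, 1], repeat=10), key=f)
print(best, f(best))
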
This work aims at optimizing injection networks, which consist of adding a set of long-range links (called bypass links) to mobile multi-hop ad hoc networks so as to improve connectivity and overcome network partitioning. To this end, we rely on small-world network properties, namely a high clustering coefficient and a low characteristic path length. We investigate the use of two genetic algorithms (generational and steady-state) to optimize three instances of this topology control problem and present results that show initial evidence of their capacity to solve it.
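
As a toy illustration of the two small-world metrics being traded off, the following sketch builds a ring lattice with networkx (a stand-in for a partition-prone ad hoc topology) and adds a few random bypass links; the characteristic path length drops sharply while clustering is largely preserved. The model and parameters are illustrative assumptions only.

import networkx as nx
import random

# Plain ring lattice: each node linked to its 4 nearest neighbours (p=0 means
# no rewiring), standing in for a partition-prone multi-hop topology.
G = nx.watts_strogatz_graph(n=100, k=4, p=0.0, seed=1)
before = (nx.average_clustering(G), nx.average_shortest_path_length(G))

# Inject 10 random long-range bypass links.
random.seed(1)
for _ in range(10):
    u, v = random.sample(range(100), 2)
    G.add_edge(u, v)
after = (nx.average_clustering(G), nx.average_shortest_path_length(G))

print("clustering, characteristic path length before:", before)
print("clustering, characteristic path length after: ", after)
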
Nuno Alves, 2010
Since their conception in 1975, Genetic Algorithms have been an extremely popular approach for finding exact or approximate solutions to optimization and search problems. In recent years there has been renewed interest in the field, with related techniques, such as grammatical evolution, being developed. Unfortunately, work on developing genetic optimization for low-end embedded architectures hasn't enjoyed the same enthusiasm. This short paper tackles that situation by demonstrating how genetic algorithms can be implemented on the Arduino Duemilanove, a 16 MHz open-source micro-controller with limited computation power and storage resources. As part of this short paper, the libraries used in this implementation are released into the public domain under a GPL license.
Noe Casas, 2015
In this article we provide a comprehensive review of the different evolutionary algorithm techniques used to address multimodal optimization problems, classifying them according to the nature of their approach. On the one hand, there are algorithms that address early convergence to a local optimum by separating the individuals of the population into groups and limiting their interaction, so that each group evolves with a high degree of independence. On the other hand, other approaches directly address the lack of genetic diversity in the population by introducing elements into the evolutionary dynamics that promote the exploration of new niches of the genotypic space. Finally, we study multi-objective optimization genetic algorithms, which handle situations where multiple criteria have to be satisfied without penalizing any of them. A very rich literature has arisen over the years on these topics, and we aim to offer an overview of the most important techniques in each branch of the field.
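
As a concrete example of the second family of approaches (directly promoting genetic diversity), here is a minimal fitness-sharing sketch: each individual's raw fitness is discounted by the number of near neighbours occupying the same niche. The sharing radius, distance metric, and toy population are illustrative assumptions rather than any specific algorithm from the review.

import numpy as np

def shared_fitness(population, raw_fitness, sigma=0.5, alpha=1.0):
    """Fitness sharing: divide each raw fitness (higher is better) by the
    size of its niche, i.e. the weighted count of individuals within sigma."""
    dists = np.linalg.norm(population[:, None, :] - population[None, :, :], axis=-1)
    sharing = np.where(dists < sigma, 1.0 - (dists / sigma) ** alpha, 0.0)
    niche_counts = sharing.sum(axis=1)  # includes the individual itself
    return raw_fitness / niche_counts

# Three near-duplicates around one optimum of f(x) = sin(5x) and one isolated
# individual near another: sharing penalizes the crowded niche.
pop = np.array([[0.30], [0.31], [0.32], [1.60]])
raw = np.sin(5 * pop[:, 0])
print(shared_fitness(pop, raw))
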
We introduce the method of using an annealing genetic algorithm for the numerically complex problem of finding quantum logic gates that simultaneously have the highest fidelity and the highest success probability. We first use the linear optical quantum nonlinear sign (NS) gate as an example to illustrate the efficiency of this method. We show that by appropriately choosing the annealing parameters, we can reach the theoretical maximum success probability (1/4 for NS) on each attempt. We then examine the controlled-Z (CZ) gate as the first new problem to be solved. We show results that agree with the highest known maximum success probability for a CZ gate (2/27) while maintaining a fidelity of 0.9997. Since the purpose of our algorithm is to optimize a unitary matrix for quantum transformations, it could easily be applied to other areas of interest such as quantum optics and quantum sensors.
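
To give a sense of one figure of merit such a search optimizes, the sketch below computes a normalized trace fidelity between a candidate operation and the target CZ matrix. This simple formula is an illustrative stand-in; the paper's actual objective also involves the linear-optics success probability, which is not modelled here.

import numpy as np

# Target controlled-Z gate on two qubits.
CZ = np.diag([1, 1, 1, -1]).astype(complex)

def trace_fidelity(U_candidate, U_target=CZ):
    """Normalized trace overlap |Tr(U_target^dagger U_candidate)|^2 / d^2."""
    d = U_target.shape[0]
    return abs(np.trace(U_target.conj().T @ U_candidate)) ** 2 / d ** 2

# A candidate with a small phase error on the |11> component scores just below 1.
candidate = np.diag([1, 1, 1, np.exp(1j * (np.pi - 0.05))])
print(trace_fidelity(candidate))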
