
Computing Abstract Distances in Logic Programs

Added by Pedro Lopez-Garcia
Publication date: 2019
Language: English





Abstract interpretation is a well-established technique for performing static analyses of logic programs. However, choosing the abstract domain, widening, fixpoint, etc. that provides the best precision-cost trade-off remains an open problem. This is in good part because of the challenges involved in measuring and comparing the precision of different analyses. We propose a new approach for measuring such precision, based on defining distances in abstract domains and extending them to distances between whole analyses of a given program, thus allowing precision to be compared across different analyses. We survey and extend existing proposals for distances and metrics in lattices or abstract domains, and we propose metrics for some common domains used in logic program analysis, as well as extensions of those metrics to the space of whole program analyses. We implement those metrics within the CiaoPP framework and apply them to measure the precision of different analyses over both benchmarks and a realistic program.
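As a rough illustration of the idea (not the metrics actually proposed in the paper), the sketch below defines a normalised distance on a toy four-element mode lattice and lifts it to a distance between two whole analyses, represented here as lists of program-point/abstract-value pairs. The lattice, the predicate names (depth/2, abs_distance/3, analysis_distance/3) and the averaging scheme are assumptions made for the example.

```prolog
% A toy chain lattice of modes: bot =< ground =< nonvar =< any.
depth(bot,    0).
depth(ground, 1).
depth(nonvar, 2).
depth(any,    3).

% Distance between two abstract values: difference of their depths in
% the chain, normalised to the interval [0,1].
abs_distance(A, B, D) :-
    depth(A, DA),
    depth(B, DB),
    D is abs(DA - DB) / 3.0.

% Distance between two whole analyses over the same program points,
% each given as a list of Point-AbstractValue pairs: the average of
% the point-wise distances.
analysis_distance(An1, An2, D) :-
    findall(PD,
            ( member(P-V1, An1), member(P-V2, An2),
              abs_distance(V1, V2, PD) ),
            PDs),
    length(PDs, N),
    N > 0,
    sum(PDs, S),
    D is S / N.

sum([], 0).
sum([X|Xs], S) :- sum(Xs, S0), S is S0 + X.
```

For instance, analysis_distance([p1-ground, p2-any], [p1-nonvar, p2-any], D) yields D ≈ 0.17, reflecting a small precision difference at the first program point only.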



Related Research

Vincent Nys, 2017
We extend a technique called Compiling Control. The technique transforms coroutining logic programs into logic programs that, when executed under the standard left-to-right selection rule (and not using any delay features), have the same computational behavior as the coroutining program. In recent work, we revised Compiling Control and reformulated it as an instance of Abstract Conjunctive Partial Deduction. This work was mostly focused on the program analysis performed in Compiling Control. In the current paper, we focus on the synthesis of the transformed program. Instead of synthesizing a new logic program, we synthesize a CHR(Prolog) program which mimics the coroutining program. The synthesis to CHR yields programs containing only simplification rules, which are particularly amenable to certain static analysis techniques. The programs are also more concise and readable, and they can be ported to CHR implementations embedded in languages other than Prolog.
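For readers unfamiliar with CHR, the tiny CHR(Prolog) fragment below (a made-up producer/consumer example, not output of the transformation described above) shows what a simplification rule looks like: when the rule fires, all head constraints are removed from the store, which is the kind of rule the synthesis is said to produce.

```prolog
:- use_module(library(chr)).

:- chr_constraint produce/1, consume/1.

% A simplification rule: when a produced value meets a pending consumer,
% both constraints are removed from the store and the value is handed over.
hand_over @ produce(X), consume(Y) <=> Y = X.
```

Querying ?- produce(42), consume(V). leaves an empty constraint store and binds V = 42; the two goals effectively coroutine through the constraint store.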
Yuting Wang, 2017
We argue that the implementation and verification of compilers for functional programming languages are greatly simplified by employing a higher-order representation of syntax known as Higher-Order Abstract Syntax or HOAS. The underlying idea of HOAS is to use a meta-language that provides a built-in and logical treatment of binding related notions. By embedding the meta-language within a larger programming or reasoning framework, it is possible to absorb the treatment of binding structure in the object language into the meta-theory of the system, thereby greatly simplifying the overall implementation and reasoning processes. We develop the above argument in this thesis by presenting and demonstrating the effectiveness of an approach to the verified implementation of compiler transformations for functional programs that exploits HOAS. In this approach, transformations on functional programs are first articulated in the form of rule-based relational specifications. These specifications are rendered into programs in the language lambda Prolog. On the one hand, these programs serve directly as implementations. On the other hand, they can be used as input to the Abella system which allows us to prove properties about them and thereby about the implementations. Both lambda Prolog and Abella support the use of the HOAS approach. Thus, they constitute a framework that can be used to test out the benefits of the HOAS approach in verified compilation. We use them to implement and verify a compiler for a representative functional programming language that embodies the transformations that form the core of many compilers for such languages. In both the programming and the reasoning phases, we show how the use of the HOAS approach significantly simplifies the representation, manipulation, analysis and reasoning of binding structure.
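The core HOAS idea, delegating the object language's binding machinery to the meta-language, can be loosely approximated even in plain Prolog by letting a Prolog variable play the role of the object-level binder, so that substitution reduces to copy_term/2 plus unification. This is only an analogy for illustration (lambda Prolog and Abella provide the proper logical treatment via meta-level abstractions); the term encoding and the predicate beta/2 are invented for the example.

```prolog
% Object-language terms: app(F, A) for application and lam(X, Body) for
% abstraction, where X is a Prolog variable occurring in Body.
% Beta reduction: rename the binder apart with copy_term/2, then
% instantiate the fresh binder to the argument. No explicit
% capture-avoiding substitution has to be written by hand.
beta(app(lam(X, Body), Arg), Result) :-
    copy_term(lam(X, Body), lam(Arg, Result)).
```

For example, beta(app(lam(X, app(X, X)), f), R) yields R = app(f, f).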
Approximations during program analysis are a necessary evil, as they ensure essential properties, such as soundness and termination of the analysis, but they also mean that the analysis does not always produce useful results. Automatic techniques have been studied to prevent precision loss, typically at the expense of larger resource consumption. In both cases (i.e., when the analysis produces inaccurate results and when resource consumption is too high), it is necessary to have some means for users to provide information to guide the analysis and thus improve precision and/or performance. We present techniques for supporting, within an abstract interpretation framework, a rich set of assertions that can deal with multivariance/context-sensitivity, and can handle different run-time semantics for those assertions that cannot be discharged at compile time. We show how the proposed approach can be applied to both improving precision and accelerating analysis. We also provide some formal results on the effects of such assertions on the analysis results.
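For concreteness, assertions in the Ciao assertion language (the language processed by CiaoPP) look roughly as follows; the predicates and properties in this snippet are hypothetical and only illustrate the check/trust flavours discussed above.

```prolog
:- module(_, _, [assertions]).   % load Ciao's assertions package

% A "check" assertion: the analyzer tries to verify that qsort/2 is
% called with a list and, on success, returns a list; if it cannot be
% discharged at compile time, it can be turned into a run-time check.
:- check pred qsort(Xs, Ys) : list(Xs) => list(Ys).

% A "trust" assertion: information the analyzer is allowed to assume,
% e.g. to recover precision for an external or hard-to-analyze predicate.
:- trust pred partition(Xs, Pivot, Smalls, Bigs)
         : (list(Xs), num(Pivot)) => (list(Smalls), list(Bigs)).
```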
In this note we consider the problem of introducing variables in temporal logic programs under the formalism of Temporal Equilibrium Logic (TEL), an extension of Answer Set Programming (ASP) for dealing with linear-time modal operators. To this aim, we provide a definition of a first-order version of TEL that shares the syntax of first-order Linear-time Temporal Logic (LTL) but has a different semantics, selecting some LTL models we call temporal stable models. Then, we consider a subclass of theories (called splittable temporal logic programs) that are close to usual logic programs but allowing a restricted use of temporal operators. In this setting, we provide a syntactic definition of safe variables that suffices to show the property of domain independence -- that is, addition of arbitrary elements in the universe does not vary the set of temporal stable models. Finally, we present a method for computing the derivable facts by constructing a non-temporal logic program with variables that is fed to a standard ASP grounder. The information provided by the grounder is then used to generate a subset of ground temporal rules which is equivalent to (and generally smaller than) the full program instantiation.
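As background, the (non-temporal) notion of safety that the paper lifts to splittable temporal programs is the usual ASP one: every variable of a rule must occur in a positive body atom, otherwise the grounder rejects the rule. A minimal clingo-style illustration (the predicates are made up):

```prolog
% Safe rules: X and Y are bound by the positive atoms edge/2 and
% reachable/1, so grounding ranges only over known terms.
reachable(X) :- edge(a, X).
reachable(Y) :- edge(X, Y), reachable(X).

% Unsafe rule (rejected by a standard grounder): Z occurs only under
% negation, so it is not bound by any positive body atom.
% isolated(Z) :- not reachable(Z).
```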
We present a logical calculus for reasoning about information flow in quantum programs. In particular we introduce a dynamic logic that is capable of dealing with quantum measurements, unitary evolutions and entanglements in compound quantum systems. We give a syntax and a relational semantics in which we abstract away from phases and probabilities. We present a sound proof system for this logic, and we show how to characterize by logical means various forms of entanglement (e.g. the Bell states) and various linear operators. As an example we sketch an analysis of the teleportation protocol.
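For reference, the Bell states mentioned above are the four maximally entangled two-qubit states, e.g. |Φ+⟩ = (|00⟩ + |11⟩)/√2; it is states of this kind that the logic characterizes by purely relational means, abstracting away from phases and probabilities.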