Graded modalities have been proposed in recent work on programming languages as a general framework for refining type systems with intensional properties. In particular, continuous endomaps of the discrete time scale, or time warps, can be used to quantify the growth of information in the course of program execution. Time warps form a complete residuated lattice, with the residuals playing an important role in potential programming applications. In this paper, we study the algebraic structure of time warps, and prove that their equational theory is decidable, a necessary condition for their use in real-world compilers. We also describe how our universal-algebraic proof technique lends itself to a constraint-based implementation, establishing a new link between universal algebra and verification technology.
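For orientation, a minimal sketch of the structure involved, under the standard assumption (not stated in the abstract itself) that a time warp is a sup-preserving endomap $f \colon \omega^+ \to \omega^+$ of the extended time scale $\omega^+ = \omega \cup \{\infty\}$: ordering warps pointwise and taking composition as the monoid operation yields a residuated lattice whose residuals are determined by

    f \circ h \le g \;\iff\; h \le f \backslash g,
    \qquad
    h \circ f \le g \;\iff\; h \le g / f.

These residuals are the operations the abstract singles out as relevant for programming applications.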
When a computational task tolerates a relaxation of its specification, or when an algorithm tolerates the effects of noise in its execution, hardware, programming languages, and system software can trade deviations from correct behavior for lower resource usage. We present, for the first time, a synthesis of research results on computing systems that only make as many errors as their users can tolerate, drawn from the disciplines of computer-aided design of circuits, digital system design, computer architecture, programming languages, operating systems, and information theory. Rather than over-provisioning resources at each layer to avoid errors, it can be more efficient to exploit the masking of errors at one layer, which can prevent them from propagating to higher layers. We survey tradeoffs for individual layers of computing systems, from the circuit level to the operating system level, and illustrate the potential benefits of end-to-end approaches with two examples. To tie the survey together, we present a consistent formalization of terminology across the layers that does not significantly deviate from the terminology traditionally used by the research community at each layer.
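A toy numerical illustration of the masking idea (my own, not an example from the survey; the two functions are invented stand-ins for a storage layer and an application layer): a single-bit error injected at the lower layer is frequently invisible to an application that only consumes a coarse view of the data.

    import random

    def store_with_fault(pixel: int, bit: int) -> int:
        """Storage layer: return the 8-bit pixel value with one faulty bit flipped."""
        return pixel ^ (1 << bit)

    def brightness_level(pixel: int) -> int:
        """Application layer: keep only the top 2 of 8 bits (4 brightness levels)."""
        return pixel >> 6

    random.seed(0)
    trials, masked = 10_000, 0
    for _ in range(trials):
        pixel = random.randrange(256)
        faulty = store_with_fault(pixel, bit=random.randrange(8))  # flip a random bit
        if brightness_level(faulty) == brightness_level(pixel):    # invisible one layer up?
            masked += 1
    print(f"{masked} of {trials} single-bit errors were masked by the application layer")

In this toy model roughly three quarters of the injected errors never reach the application layer, which is the kind of cross-layer masking the survey argues can be exploited instead of over-provisioning every layer.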
A new criterion of comprehension is defined, initially termed "connected" by myself and finally "acyclic" by Mr. Randall Holmes. Acyclic comprehension simply asserts that for any acyclic formula phi, the set {x : phi} exists. I first presented this criterion semi-formally to Mr. Randall Holmes, who then gave the first rigorous definition of it, a definition that I finally simplified to the one presented here. Later Mr. Holmes gave another presentation of the definition, which is also mentioned here. He pointed out to me that acyclic comprehension is implied by stratification, and posed the question of whether it is equivalent to full stratification or strictly weaker. Both he and, initially, I thought that it was strictly weaker; Mr. Randall Holmes actually conjectured that it is very weak. Surprisingly, it turned out to be equivalent to full stratification, as I prove here.
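As a rough gloss (mine, using the standard NF notion of stratification; the paper's formal definition of acyclicity may differ in detail): a formula is stratified if natural-number types can be assigned to its variables so that every atom $x = y$ gets $\mathrm{type}(x) = \mathrm{type}(y)$ and every atom $x \in y$ gets $\mathrm{type}(y) = \mathrm{type}(x) + 1$, while acyclicity asks that the graph whose vertices are the variables, with one edge per atomic subformula joining its two variables, contain no cycle. Two contrasting examples:

    \varphi_1 \;\equiv\; \exists y\,(x \in y \wedge y \in z)
    % acyclic (edges x--y and y--z) and stratified: type(x)=0, type(y)=1, type(z)=2
    \varphi_2 \;\equiv\; x \in y \wedge y \in x
    % cyclic (two edges between x and y) and unstratified: would require type(y)=type(x)+1=type(y)+2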
Applying program analyses to Software Product Lines (SPLs) has been a fundamental research problem at the intersection of Product Line Engineering and software analysis. Several attempts have been made to lift particular product-level analyses to run on the entire product line. In this paper, we tackle the class of Datalog-based analyses (e.g., pointer and taint analyses), study the theoretical aspects of lifting Datalog inference, and implement a lifted inference algorithm inside the Souffle Datalog engine. We evaluate our implementation on a set of benchmark product lines and show significant savings in processing time (billions of times faster on one of the benchmarks) and in fact database size, compared to brute-force analysis of each product individually.
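To illustrate the lifted-inference idea concretely, here is a small Python sketch of my own (not the authors' Souffle implementation; the relations, products, and features are invented): each fact is annotated with the set of product configurations in which it holds, joins intersect these sets, and alternative derivations union them, so a single fixpoint computation covers the whole product line.

    # Minimal sketch of lifted Datalog evaluation over a product line: every fact
    # carries the set of product configurations in which it holds; joining facts
    # intersects these sets, and alternative derivations union them.

    # Hypothetical three-product line (configurations named after enabled features).
    PRODUCTS = {"base", "base+A", "base+A+B"}

    # edge(X, Y) facts annotated with the products that contain them.
    edges = {
        ("a", "b"): {"base+A", "base+A+B"},   # only when feature A is enabled
        ("b", "c"): {"base+A+B"},             # only when features A and B are enabled
        ("c", "d"): set(PRODUCTS),            # present in every product
    }

    # path(X, Y) :- edge(X, Y).
    # path(X, Z) :- path(X, Y), edge(Y, Z).
    paths = {fact: set(conds) for fact, conds in edges.items()}
    changed = True
    while changed:
        changed = False
        for (x, y), c1 in list(paths.items()):
            for (y2, z), c2 in edges.items():
                if y != y2:
                    continue
                derived = c1 & c2                       # join: intersect conditions
                known = paths.setdefault((x, z), set())
                if not derived <= known:
                    known |= derived                    # new derivation: union in
                    changed = True

    for (x, z), conds in sorted(paths.items()):
        print(f"path({x}, {z}) holds in products {sorted(conds)}")

Lifted engines typically use symbolic presence conditions (propositional feature formulas) instead of explicit product sets, which is what lets the approach scale to exponentially many products.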
Higher-order constrained Horn clauses (HoCHC) are a semantically invariant system of higher-order logic modulo theories. Since unsolvability of HoCHC over a semi-decidable background theory is itself semi-decidable, HoCHC is suitable for safety verification. Less is known about its relation to larger classes of higher-order verification problems. Motivated by program equivalence, we introduce a coinductive version of HoCHC that enjoys a greatest model property. We define an encoding of higher-order recursion schemes (HoRS) into HoCHC logic programs. Correctness of this encoding reduces decidability of the open HoRS equivalence problem -- and, thus, of the LambdaY-calculus Böhm tree equivalence problem -- to semi-decidability of coinductive HoCHC over a complete and decidable theory of trees.
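To make the inductive/coinductive contrast concrete, consider a first-order toy clause (my example, not one from the paper) whose least and greatest models differ; the coinductive semantics adopts the greatest one:

    p(x) \;\leftarrow\; p(x)
    % least (inductive) model:      p denotes the empty relation
    % greatest (coinductive) model: p denotes the whole domain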
In this paper, first-order logic is interpreted in the framework of universal algebra, using the clone theory developed in three previous papers. We first define the free clone T(L, C) of terms of a first-order language L over a set C of parameters in the standard way. The free right algebra F(L, C) of formulas over T(L, C) is then generated by the atomic formulas. Structures for L over C are represented as perfect valuations of F(L, C), and theories of L are represented as filters of F(L). Finally, Gödel's completeness theorem and first incompleteness theorem are stated as expected.
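As a concrete, and purely hypothetical, instance of these objects (not drawn from the paper): if L is the language of groups and $c \in C$ is a parameter, the binary term below is an element of the free clone $T(L, C)$, and the atomic formula built from it is among the generators of the formula algebra $F(L, C)$:

    t(x_1, x_2) \;=\; x_1 \cdot (x_2 \cdot c)^{-1} \;\in\; T(L, C)
    % a corresponding atomic generator of F(L, C):  t(x_1, x_2) = e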