This paper studies useful sharing, a sophisticated optimization for lambda-calculi, in the context of call-by-need evaluation in the presence of open terms. Useful sharing turns out to be harder to manipulate in call-by-need than in call-by-name or call-by-value, because call-by-need evaluates inside environments, which makes it harder to specify when a substitution step is useful. We isolate the key concepts involved and prove the correctness of useful sharing in this setting.
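To fix intuitions about the sharing at play, here is a minimal call-by-need evaluator in OCaml: arguments are stored unevaluated in the environment and each one is forced at most once, its result being memoized in a mutable thunk. This is only a textbook-style sketch of call-by-need sharing for closed terms; it does not implement the calculus with explicit environments, open terms, and useful substitution studied in the paper.

```ocaml
(* Minimal call-by-need evaluator for closed terms: arguments are kept
   as thunks in the environment and evaluated at most once.  This is a
   standard illustration of sharing, not the paper's calculus. *)

type term =
  | Var of string
  | Lam of string * term
  | App of term * term

type value = Closure of string * term * env
and thunk = Unforced of term * env | Forced of value
and env = (string * thunk ref) list

let rec eval (env : env) (t : term) : value =
  match t with
  | Var x ->
      let cell = List.assoc x env in      (* closed terms assumed *)
      (match !cell with
       | Forced v -> v                    (* reuse the shared result *)
       | Unforced (u, env') ->
           let v = eval env' u in         (* evaluate once ...       *)
           cell := Forced v;              (* ... and memoize it      *)
           v)
  | Lam (x, body) -> Closure (x, body, env)
  | App (t1, t2) ->
      let (Closure (x, body, cenv)) = eval env t1 in
      (* the argument is delayed, not evaluated: only a thunk is shared *)
      eval ((x, ref (Unforced (t2, env))) :: cenv) body

(* Example: (\x. x x) (\y. y) forces the shared argument only once. *)
let _ =
  eval [] (App (Lam ("x", App (Var "x", Var "x")), Lam ("y", Var "y")))
```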
This paper explores two topics at once: the use of denotational semantics to bound the evaluation length of functional programs, and the semantics of strong (that is, possibly under abstractions) call-by-value evaluation. About the first, we analyze de Carvalho's seminal use of relational semantics for bounding the evaluation length of lambda-terms, starting from the presentation of the semantics as an intersection type system. We focus on the part of his work that is usually neglected in its many recent adaptations, despite probably being the conceptually deeper one: how to transfer the bounding power from the type system to the relational semantics itself. We dissect this result and re-understand it via the isolation of a simpler size representation property. About the second, we use relational semantics to develop a semantic study of strong call-by-value evaluation, a topic that is both delicate and neglected. We give a semantic characterization of terms normalizable with respect to strong evaluation, providing in particular the first adequacy result with respect to strong call-by-value. Moreover, we extract bounds on strong evaluation from both the type systems and the relational semantics. Essentially, we use strong call-by-value to revisit de Carvalho's semantic bounds, and de Carvalho's technique to provide semantic foundations for strong call-by-value.
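As a toy illustration of the quantitative flavour of this line of work, the OCaml snippet below represents non-idempotent intersection types ("multi types"), in which argument types are finite multisets whose multiplicities record how many times an argument is used, together with a crude size measure; in de Carvalho-style systems an analogous measure on typing derivations bounds evaluation length. The names and the measure are illustrative assumptions, not the paper's definitions.

```ocaml
(* Toy multi types: an arrow carries a multiset (here, a list) of
   argument types.  The size measure mimics, very roughly, the kind of
   quantity that de Carvalho-style derivations use to bound evaluation. *)

type ty =
  | Star                        (* base type of normal forms            *)
  | Arrow of ty list * ty       (* multiset of argument types -> target *)

let rec size = function
  | Star -> 1
  | Arrow (args, target) ->
      1 + List.fold_left (fun acc a -> acc + size a) (size target) args

(* Non-idempotency: using an argument twice vs. once yields different
   types, hence different sizes. *)
let used_twice = Arrow ([Star; Star], Star)
let used_once  = Arrow ([Star], Star)

let () =
  Printf.printf "size used_twice = %d, size used_once = %d\n"
    (size used_twice) (size used_once)
```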
We provide a sound and relatively complete Hoare-like proof system for reasoning about partial correctness of recursive procedures in the presence of local variables and the call-by-value parameter mechanism, and in which correctness proofs are linear in the length of the program. We argue that, although Hoare-like proof systems for recursive procedures have been intensively studied, no such proof system has been proposed in the literature.
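For concreteness, the following OCaml fragment shows the kind of program covered by such a proof system: a recursive procedure with a call-by-value parameter and a local variable, annotated in comments with the partial-correctness specification one would prove. The assertion syntax and the example itself are illustrative, not taken from the paper.

```ocaml
(* A recursive procedure with a call-by-value parameter and a local
   variable, together with the Hoare-style partial-correctness
   specification it satisfies (assertion syntax is illustrative). *)

(* {n >= 0}  fact(n)  {result = n!} *)
let rec fact (n : int) : int =
  (* n is passed by value: the callee works on its own copy, so
     recursive calls cannot disturb the caller's argument. *)
  let result = ref 1 in          (* local variable, invisible to callers *)
  if n > 0 then result := n * fact (n - 1);
  !result

let () = assert (fact 5 = 120)
```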
This paper shows equivalence of sever
We examine the relationship between the algebraic lambda-calculus, a fragment of the differential lambda-calculus, and the linear-algebraic lambda-calculus, a candidate lambda-calculus for quantum computation. Both calculi are algebraic: each is equipped with an additive and a scalar-multiplicative structure, and its set of terms is closed under linear combinations. However, the two languages were built using different approaches: the former is a call-by-name language whereas the latter is call-by-value; the former treats algebraic identities as equalities whereas the latter approaches them through rewrite rules. In this paper, we analyse how these different approaches relate to one another. To this end, we propose four canonical languages based on each of the possible choices: call-by-name versus call-by-value, algebraic equality versus algebraic rewriting. We show that the various languages simulate one another. Due to the subtle interaction between beta-reduction and algebraic rewriting, additional hypotheses such as confluence or normalisation may be required to make the languages consistent. We carefully devise the required properties for each proof, making them general enough to be valid for any sub-language satisfying the corresponding properties.
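As a small illustration of what "terms closed under linear combinations" and "algebraic rewriting" mean, here is an OCaml sketch of such a term syntax together with one typical oriented rule, distributing application over sums and scalar multiples. The constructor names and the particular rule are illustrative choices and do not reproduce the exact presentations compared in the paper.

```ocaml
(* Lambda-terms extended with a linear-combination structure, and one
   oriented algebraic rule: applications distribute over sums and
   scalars in function position, e.g. (t + u) v  ~>  t v + u v. *)

type term =
  | Var of string
  | Lam of string * term
  | App of term * term
  | Zero                          (* the empty linear combination   *)
  | Add of term * term            (* t + u                          *)
  | Scale of float * term         (* a . t, scalar multiplication   *)

let rec distribute_app (f : term) (arg : term) : term =
  match f with
  | Zero -> Zero
  | Add (t, u) -> Add (distribute_app t arg, distribute_app u arg)
  | Scale (a, t) -> Scale (a, distribute_app t arg)
  | _ -> App (f, arg)

(* (2.x + y) z  ~>  2.(x z) + y z *)
let _ = distribute_app (Add (Scale (2.0, Var "x"), Var "y")) (Var "z")
```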
In each variant of the lambda-calculus, factorization and normalization are two key properties that describe how results are computed. Instead of proving factorization and normalization separately for the call-by-name (CbN) and call-by-value (CbV) variants, we prove them only once, for the bang calculus (an extension of the lambda-calculus inspired by linear logic and subsuming CbN and CbV), and then transfer the results via translations, obtaining factorization and normalization for CbN and CbV. The approach is robust: it still works when the calculi are extended with operators and extra rules modelling additional computational features.
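To give an idea of how such translations look, the OCaml sketch below implements the usual call-by-name embedding into a bang-calculus-like syntax, in which every argument is boxed with a bang; the call-by-value embedding instead boxes values, and since its precise clauses vary between presentations it is only mentioned in a comment. The datatypes and clauses are a standard illustration, not necessarily this paper's exact definitions.

```ocaml
(* Call-by-name embedding into a bang-calculus-style syntax, following
   Girard's CbN translation: arguments are boxed with a bang.  The CbV
   embedding instead boxes values; its exact clauses differ between
   presentations and are omitted here. *)

type lam =                       (* ordinary lambda-terms *)
  | Var of string
  | Lam of string * lam
  | App of lam * lam

type bang =                      (* bang-calculus-style terms *)
  | BVar of string
  | BLam of string * bang
  | BApp of bang * bang
  | Bang of bang                 (* the box !t of linear logic *)

(* CbN translation: x -> x,  \x.t -> \x.t°,  t u -> t° (!u°) *)
let rec cbn (t : lam) : bang =
  match t with
  | Var x -> BVar x
  | Lam (x, body) -> BLam (x, cbn body)
  | App (t, u) -> BApp (cbn t, Bang (cbn u))

(* Example: (\x. x x) y is translated to (\x. x (!x)) (!y). *)
let _ = cbn (App (Lam ("x", App (Var "x", Var "x")), Var "y"))
```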