
Differentials and distances in probabilistic coherence spaces

Posted by Thomas Ehrhard
Publication date: 2019
Research field: Informatics Engineering
Language: English
Author: Thomas Ehrhard





In probabilistic coherence spaces, a denotational model of probabilistic functional languages, morphisms are analytic and therefore smooth. We explore two related applications of the corresponding derivatives. First we show how derivatives allow us to compute the expectation of execution time in the weak head reduction of probabilistic PCF (pPCF). Next we apply a general notion of local differential of morphisms to the proof of a Lipschitz property of these morphisms, which in turn allows us to relate the observational distance on pPCF terms to a distance the model is naturally equipped with. This suggests that extending probabilistic programming languages with derivatives, in the spirit of the differential lambda-calculus, could be quite meaningful.
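As a simple illustration of the link between derivatives and expected execution time (a standard generating-function identity, not the paper's actual construction): if a closed ground-type program terminates in exactly n weak head reduction steps with probability p_n, its step-count generating function is P(x) = \sum_{n \ge 0} p_n x^n, and

    P'(1) = \sum_{n \ge 0} n \, p_n

is precisely the expected number of reduction steps, whenever this series converges. Differentiating the analytic interpretation thus recovers an expectation.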


Read also

61 - Thomas Ehrhard 2020
We develop a theory of probabilistic coherence spaces equipped with an additional extensional structure and apply it to approximating the probability of convergence of ground-type programs of probabilistic PCF whose free variables are of ground types. To this end we define an adapted version of the Krivine machine which computes polynomial approximations of the semantics of these programs in the model. These polynomials provide approximations from below and from above of the probabilities of convergence; this is made possible by extending the language with an error symbol which is extensionally maximal in the model.
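As a hedged illustration of the under/over-approximation idea only (the program and the bounds below are invented for exposition and are not the paper's Krivine-machine construction): consider a recursive program that, at each unfolding, terminates with probability 1/3, recurses with probability 1/3 and diverges with probability 1/3. Truncating after n unfoldings and counting the truncated call as divergence gives a lower bound on its probability of convergence; counting it instead as the extensionally maximal error element, which always converges, gives an upper bound.

# Hypothetical example, not taken from the paper: bounds on the probability
# of convergence of a simple recursive probabilistic program.
# At each unfolding: terminate with prob. 1/3, recurse with prob. 1/3,
# diverge with prob. 1/3; the exact probability of convergence is 1/2.

from fractions import Fraction

P_STOP = Fraction(1, 3)   # terminate at this unfolding
P_REC  = Fraction(1, 3)   # recurse (unfold once more); remaining 1/3 diverges

def lower_bound(n):
    # Truncate after n unfoldings, counting the truncated call as divergence.
    return sum(P_STOP * P_REC**k for k in range(n))

def upper_bound(n):
    # Truncate after n unfoldings, counting the truncated call as the
    # error element, i.e. as certain convergence.
    return lower_bound(n) + P_REC**n

for n in range(1, 6):
    lo, hi = lower_bound(n), upper_bound(n)
    print(f"n={n}: {float(lo):.4f} <= P(convergence) <= {float(hi):.4f}")

Both bounds converge to the exact value 1/2 as n grows, mirroring how approximations from below and from above bracket the probability of convergence.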
112 - C. A. Middelburg 2019
We first present a probabilistic version of ACP that rests on the principle that probabilistic choices are always resolved before choices involved in alternative composition and parallel composition are resolved, and then extend this probabilistic version of ACP with a form of interleaving in which parallel processes are interleaved according to what is known as a process-scheduling policy in the field of operating systems. We use the term strategic interleaving for this more constrained form of interleaving. The extension covers probabilistic process-scheduling policies.
We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution environments of instruction sequences. In a paper concerned with probabilistic instruction sequences, we proposed several kinds of probabilistic instructions and gave an informal explanation for each of them. The probabilistic features added to the extension of basic thread algebra with thread-service interaction make it possible to give a formal explanation in terms of non-probabilistic instructions and probabilistic services. The probabilistic features added to the extensions of basic thread algebra with strategic interleaving make it possible to cover strategies corresponding to probabilistic scheduling algorithms.
A notion of probabilistic lambda-calculus usually comes with a prescribed reduction strategy, typically call-by-name or call-by-value, as the calculus is non-confluent and these strategies yield different results. This is a break with one of the main advantages of lambda-calculus: confluence, which means that results are independent of the choice of strategy. We present a probabilistic lambda-calculus where the probabilistic operator is decomposed into two syntactic constructs: a generator, which represents a probabilistic event; and a consumer, which acts on the term depending on a given event. The resulting calculus, the Probabilistic Event Lambda-Calculus, is confluent, and interprets the call-by-name and call-by-value strategies through different interpretations of the probabilistic operator into our generator and consumer constructs. We present two notions of reduction, one via fine-grained local rewrite steps, and one by generation and consumption of probabilistic events. Simple types for the calculus are essentially standard, and they guarantee strong normalization. We demonstrate how we can encode call-by-name and call-by-value probabilistic evaluation.
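As a hedged illustration of why an unrestricted probabilistic choice breaks confluence (a toy computation, not the calculus defined in the paper): under call-by-value the argument of (\x. x + x) applied to a fair coin is sampled once and the value is duplicated, while under call-by-name the unevaluated coin is duplicated and each copy is sampled independently, so the two strategies produce different distributions.

# Hypothetical illustration, not the paper's calculus: the term
# (\x. x + x) (coin 0 1) evaluated under call-by-value vs call-by-name,
# where coin yields 0 or 1, each with probability 1/2.

from fractions import Fraction
from collections import defaultdict

COIN = {0: Fraction(1, 2), 1: Fraction(1, 2)}

def call_by_value():
    # Sample the argument once, then duplicate the resulting value.
    dist = defaultdict(Fraction)
    for v, p in COIN.items():
        dist[v + v] += p
    return dict(dist)

def call_by_name():
    # Duplicate the unevaluated argument; each copy is sampled independently.
    dist = defaultdict(Fraction)
    for v1, p1 in COIN.items():
        for v2, p2 in COIN.items():
            dist[v1 + v2] += p1 * p2
    return dict(dist)

print("call-by-value:", call_by_value())   # {0: 1/2, 2: 1/2}
print("call-by-name: ", call_by_name())    # {0: 1/4, 1: 1/2, 2: 1/4}

Decomposing the choice into a generator and a consumer pins down when the event is sampled, which is how the calculus recovers confluence while still expressing both strategies.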