
A Few Considerations on Structural and Logical Composition in Specification Theories

Added by EPTCS
Publication date: 2011
Language: English





Over the last 20 years, a large number of automata-based specification theories have been proposed for modeling discrete, real-time, and probabilistic systems. We have observed that these formalisms share a substantial amount of algebraic structure. In this short abstract, we collect the results of our work in progress on describing and systematizing the algebraic assumptions made in specification theories.
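To illustrate the kind of shared algebraic structure meant here, below is a minimal Haskell sketch of an interface that many such theories instantiate. The operation names and laws are our summary of assumptions commonly made in this literature (e.g. independent implementability), not definitions taken from the abstract:

```haskell
-- A hedged sketch of the algebraic core shared by many specification
-- theories: a refinement preorder, a structural (parallel) composition,
-- and a logical composition (conjunction) that may fail on inconsistent
-- specifications. Names are illustrative, not from the paper.
class SpecTheory s where
  refines :: s -> s -> Bool     -- refinement preorder: s1 `refines` s2
  parComp :: s -> s -> s        -- structural composition (parallel product)
  conjoin :: s -> s -> Maybe s  -- logical composition; Nothing if inconsistent

-- Expected laws (checked per theory, not enforced by the types):
--  * refines is reflexive and transitive;
--  * parComp is monotone w.r.t. refines ("independent implementability"):
--      if a `refines` a' and b `refines` b'
--      then (a `parComp` b) `refines` (a' `parComp` b');
--  * conjoin, when defined, is a greatest lower bound w.r.t. refines.
```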



Related research

Yusuke Kawamoto (2019)
We introduce a logical approach to formalizing statistical properties of machine learning. Specifically, we propose a formal model for statistical classification based on a Kripke model, and formalize various notions of classification performance, robustness, and fairness of classifiers by using epistemic logic. We then show some relationships among properties of classifiers, including relationships between classification performance and robustness; these suggest robustness-related properties that, to our knowledge, have not previously been formalized in the literature. To formalize fairness properties, we define a notion of counterfactual knowledge and show techniques to formalize conditional indistinguishability by using counterfactual epistemic operators. To our knowledge, this is the first work that uses logical formulas to express statistical properties of machine learning, and that provides epistemic (resp. counterfactually epistemic) views on the robustness (resp. fairness) of classifiers.
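To make the epistemic reading concrete, here is a minimal sketch of evaluating a knowledge operator over a finite Kripke model (our toy construction; the paper's models additionally carry the statistical structure of classifiers):

```haskell
-- Minimal finite Kripke model: worlds, an accessibility relation, and a
-- valuation of atomic propositions. Illustrative only.
type World = Int

data Model = Model
  { worlds :: [World]
  , access :: World -> [World]         -- worlds the agent considers possible
  , holds  :: String -> World -> Bool  -- valuation of atomic propositions
  }

-- K phi holds at w iff phi holds at every accessible world:
knows :: Model -> (World -> Bool) -> World -> Bool
knows m phi w = all phi (access m w)

-- Example: the agent knows "correct" at world 0 iff "correct" holds in
-- every world it cannot distinguish from world 0.
example :: Bool
example = knows m (holds m "correct") 0
  where m = Model [0, 1] acc val
        acc _ = [0, 1]
        val "correct" w = w == 0
        val _ _         = False
```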
The idea of the Semantic Web is to annotate Web content and services with computer-interpretable descriptions, with the aim of automating many tasks currently performed by human users. In the context of Web services, one of the most interesting such tasks is their composition. In this paper we formalize this problem in the framework of a constructive description logic. In particular, we propose a declarative service specification language and a calculus for service composition. We show by means of an example how this calculus can be used to define composed Web services, and we discuss the problem of automatic service synthesis.
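Read through Curry-Howard glasses, a composed service is a proof that the goal specification follows from the atomic service specifications. A minimal Haskell sketch of that intuition, with services as possibly-failing functions (our encoding; the paper's calculus is a constructive description logic, not Haskell):

```haskell
import Control.Monad ((>=>))

-- Toy specifications as types (illustrative assumptions, not the paper's
-- specification language):
newtype City  = City String
newtype Hotel = Hotel String
newtype Price = Price Double

-- Two atomic services, each of which may fail to produce a result:
findHotel :: City -> Maybe Hotel
findHotel (City "Turin") = Just (Hotel "Example Inn")
findHotel _              = Nothing

quotePrice :: Hotel -> Maybe Price
quotePrice (Hotel _) = Just (Price 90.0)

-- A composed service is Kleisli composition of the atomic ones; it type
-- checks only because the output specification of the first service matches
-- the input specification of the second, which is the essence of what a
-- composition calculus verifies.
bookTrip :: City -> Maybe Price
bookTrip = findHotel >=> quotePrice
```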
RISC-V is a relatively new, open instruction set architecture with a mature ecosystem and an official formal machine-readable specification. It is therefore a promising playground for formal-methods research. However, we observe that different formal-methods research projects are interested in different aspects of RISC-V and want to simplify, abstract, approximate, or ignore the other aspects. Often, they also require different encoding styles, resulting in each project starting a new formalization from scratch. We set out to identify the commonalities between projects and to represent the RISC-V specification as a program with holes that can be instantiated differently by different projects. Our formalization of the RISC-V specification is written in Haskell and leverages existing tools rather than requiring new domain-specific tools, unlike other approaches. To our knowledge, it is the first RISC-V specification able to serve as the interface between a processor-correctness proof and a compiler-correctness proof, while also supporting several other projects with diverging requirements.
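The "program with holes" idea can be pictured as instruction semantics written once against an abstract machine interface that each project instantiates differently. A hedged sketch of that pattern (the class and function names here are our own illustration, not the project's actual API):

```haskell
import Data.Int (Int64)

type Reg = Int

-- The "holes": an abstract machine interface. One project can instantiate
-- m with a concrete simulator state, another with a symbolic or relational
-- semantics; the instruction semantics below is then shared by all of them.
class Monad m => RiscvMachine m where
  getReg :: Reg -> m Int64
  setReg :: Reg -> Int64 -> m ()

data Instruction = Add Reg Reg Reg | Addi Reg Reg Int64

-- Semantics written once, against the holes:
execute :: RiscvMachine m => Instruction -> m ()
execute (Add rd rs1 rs2) = do
  x <- getReg rs1
  y <- getReg rs2
  setReg rd (x + y)
execute (Addi rd rs1 imm) = do
  x <- getReg rs1
  setReg rd (x + imm)
```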
The scattering predictions of a web of theories including Yang-Mills (YM), gravity, bi-adjoint scalar, the non-linear sigma model (NLSM), Dirac-Born-Infeld-Volkov-Akulov (DBI-VA), and the special Galileon (sGal) form a class of special objects with two fascinating properties: they are related by the double-copy procedure, and they can be defined purely by on-shell constraints. We expand on both of these properties. First we show that NLSM tree-level amplitudes are fully determined by imposing color-dual structure together with cyclic invariance and locality. We then consider how hard-scaling, as opposed to the usual soft-scaling, can be used to constrain the predictions of these theories. We probe the UV by generalizing the familiar BCFW shift off-shell to a novel single hard limit. We show that UV scalings are sufficient to fully constrain: 1. bi-adjoint doubly-ordered amplitudes, assuming locality; 2. NLSM and BI, assuming locality and unitarity; 3. the special Galileon, assuming locality, unitarity, and a UV bound for the general Galileon vertex. We see how potentially distinct aspects of this UV behavior can be understood and unified via double-copy relations. Surprisingly, we find evidence that assuming unitarity for these theories may not be necessary, and can emerge from UV considerations and locality alone. These results complete the observations that, like IR considerations, UV scaling is sufficient to fully constrain a wide range of tree-level amplitudes for gauge, gravity, and effective field theories alike.
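For orientation, the familiar on-shell BCFW shift that the paper's "single hard limit" generalizes is the standard two-particle deformation in spinor-helicity variables (textbook form, not the paper's novel shift):

```latex
\[
  \tilde\lambda_i \;\to\; \tilde\lambda_i + z\,\tilde\lambda_j ,
  \qquad
  \lambda_j \;\to\; \lambda_j - z\,\lambda_i .
\]
```

Both shifted momenta remain on-shell and their sum is unchanged for all $z$, so an amplitude becomes a rational function $A(z)$ whose large-$z$ behavior probes the UV scalings discussed above.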
We develop an approach to choice principles and their contrapositive bar-induction principles as extensionality schemes connecting an intensional or effective view of respectively ill- and well-foundedness properties to an extensional or ideal view of these properties. After classifying and analysing the relations between different intensional definitions of ill-foundedness and well-foundedness, we introduce, for a domain $A$, a codomain $B$ and a filter $T$ on finite approximations of functions from $A$ to $B$, a generalised form $\mathrm{GDC}_{A,B,T}$ of the axiom of dependent choice and dually a generalised bar induction principle $\mathrm{GBI}_{A,B,T}$ such that:

$\mathrm{GDC}_{A,B,T}$ intuitionistically captures the strength of
- the general axiom of choice expressed as $\forall a\,\exists b\, R(a,b) \Rightarrow \exists\alpha\,\forall a\, R(a,\alpha(a))$ when $T$ is a filter that derives point-wise from a relation $R$ on $A \times B$ without introducing further constraints,
- the Boolean Prime Filter Theorem / Ultrafilter Theorem if $B$ is the two-element set $\mathbb{B}$ (for a constructive definition of prime filter),
- the axiom of dependent choice if $A = \mathbb{N}$,
- Weak Kőnig's Lemma if $A = \mathbb{N}$ and $B = \mathbb{B}$ (up to weak classical reasoning);

$\mathrm{GBI}_{A,B,T}$ intuitionistically captures the strength of
- Gödel's completeness theorem, in the form "validity implies provability" for entailment relations, if $B = \mathbb{B}$,
- bar induction when $A = \mathbb{N}$,
- the Weak Fan Theorem when $A = \mathbb{N}$ and $B = \mathbb{B}$.

Contrastingly, even though $\mathrm{GDC}_{A,B,T}$ and $\mathrm{GBI}_{A,B,T}$ smoothly capture several variants of choice and bar induction, some instances are inconsistent, e.g. when $A$ is $\mathbb{B}^{\mathbb{N}}$ and $B$ is $\mathbb{N}$.
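For orientation, the instance cited at $A = \mathbb{N}$ is the usual axiom of dependent choice over a set $X$ with a total binary relation $R$, which in its standard form (our rendering, not a definition from the paper) reads:

```latex
\[
  \forall x\,\exists y\, R(x,y)
  \;\Rightarrow\;
  \forall x_0\,\exists f\!:\mathbb{N}\to X\,
    \bigl(f(0)=x_0 \,\wedge\, \forall n\, R(f(n),f(n+1))\bigr) .
\]
```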