
A New Approach to Abstract Machines - Introduction to the Theory of Configuration Machines

Published by: Zhaohua Luo
Publication date: 2010
Research field: Informatics Engineering
Research language: English
Author: Zhaohua Luo





An abstract machine is a theoretical model designed to support a rigorous study of computation. Such a model usually consists of configurations, instructions, programs, inputs and outputs for the machine. In this paper we formalize these notions as a very simple algebraic system, called a configuration machine. If an abstract machine is defined as a configuration machine consisting of primitive recursive functions, then the functions computed by the machine are always recursive. The theory of configuration machines provides a useful tool to study universal machines.
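The paper's formal algebraic definition is not reproduced in the abstract; the Python sketch below only illustrates the informal ingredients it lists (configurations, a step relation playing the role of instructions, and input/output maps). All names here (ConfigurationMachine, load, step, unload) are hypothetical and not taken from the paper.

    # Minimal illustrative sketch of a configuration-machine-style model
    # (hypothetical names; the paper's algebraic definition may differ).
    from typing import Callable, Optional

    class ConfigurationMachine:
        def __init__(self,
                     load: Callable[[str], object],                # input -> initial configuration
                     step: Callable[[object], Optional[object]],   # one instruction step, None when halted
                     unload: Callable[[object], str]):             # final configuration -> output
            self.load, self.step, self.unload = load, step, unload

        def run(self, word: str, max_steps: int = 10_000) -> Optional[str]:
            """Iterate the step function; return the output if the machine halts."""
            config = self.load(word)
            for _ in range(max_steps):
                nxt = self.step(config)
                if nxt is None:              # no applicable instruction: halt
                    return self.unload(config)
                config = nxt
            return None                      # did not halt within the bound

    # Toy example: a machine that appends '!' once and halts.
    m = ConfigurationMachine(load=lambda w: (w, False),
                             step=lambda c: (c[0] + "!", True) if not c[1] else None,
                             unload=lambda c: c[0])
    print(m.run("hello"))   # hello!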




Read also

Masahiro Hamano (2012)
RNA interference (RNAi) is a mechanism whereby small RNAs (siRNAs) directly control gene expression without assistance from proteins. This mechanism consists of interactions between RNAs and small RNAs, both of which may be single or double stranded. The target of the mechanism is mRNA to be degraded or aberrated, while the initiator is double stranded RNA (dsRNA) to be cleaved into siRNAs. Observing the digital nature of RNAi, we represent RNAi as a Minsky register machine such that (i) the two registers hold single and double stranded RNAs respectively, and (ii) machine instructions are interpreted by interactions of the enzyme (Dicer), siRNA (with the RISC complex) and polymerization (RdRp) applied to the appropriate registers. Interpreting RNAi as a computational structure, we can investigate the computational meaning of RNAi, especially its complexity. Initially, the machine is configured as a Chemical Ground Form (CGF), which generates incorrect jumps. To remedy this problem, the system is remodeled as recursive RNAi, in which siRNA targets not only mRNA but also the machine-instructional analogues of Dicer and RISC. Finally, probabilistic termination is investigated in the recursive RNAi system.
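For readers unfamiliar with the underlying model, the sketch below is a plain two-register Minsky machine interpreter; it shows only the standard register-machine semantics the abstract refers to, not the paper's actual RNAi encoding, and the instruction names are illustrative.

    # A plain two-register Minsky machine interpreter (illustrative only;
    # not the RNAi encoding described in the abstract).
    def run_minsky(program, r0=0, r1=0):
        """program: list of ('inc', reg, next) or ('dec', reg, next, jump_if_zero)."""
        regs = [r0, r1]              # e.g. counts of single- and double-stranded RNAs
        pc = 0
        while pc < len(program):
            instr = program[pc]
            if instr[0] == 'inc':            # polymerization-like step: add one
                _, reg, nxt = instr
                regs[reg] += 1
                pc = nxt
            else:                            # cleavage-like step: remove one, or branch on zero
                _, reg, nxt, zero = instr
                if regs[reg] > 0:
                    regs[reg] -= 1
                    pc = nxt
                else:
                    pc = zero
        return regs

    # Transfer the contents of register 0 into register 1.
    prog = [('dec', 0, 1, 2),    # 0: if r0 > 0, decrement and go to 1; else halt
            ('inc', 1, 0)]       # 1: increment r1 and loop back to 0
    print(run_minsky(prog, r0=3, r1=0))   # [0, 3]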
Olivier Finkel (2012)
An ω-language is a set of infinite words over a finite alphabet X. We consider the class of recursive ω-languages, i.e. the class of ω-languages accepted by Turing machines with a Büchi acceptance condition, which is also the class Σ¹₁ of (effective) analytic subsets of X^ω for some finite alphabet X. We investigate here the notion of ambiguity for recursive ω-languages with regard to acceptance by Büchi Turing machines. We first present in detail the essentials of the literature on ω-languages accepted by Turing machines. Then we give a complete and broad view of the notion of ambiguity and unambiguity of Büchi Turing machines and of the ω-languages they accept. To obtain our new results, we make use of results and methods of effective descriptive set theory.
Guido Montufar (2018)
The restricted Boltzmann machine is a network of stochastic units with undirected interactions between pairs of visible and hidden units. This model was popularized as a building block of deep learning architectures and has continued to play an important role in applied and theoretical machine learning. Restricted Boltzmann machines carry a rich structure, with connections to geometry, applied algebra, probability, statistics, machine learning, and other areas. The analysis of these models is attractive in its own right and also as a platform to combine and generalize mathematical tools for graphical models with hidden variables. This article gives an introduction to the mathematical analysis of restricted Boltzmann machines, reviews recent results on the geometry of the sets of probability distributions representable by these models, and suggests a few directions for further investigation.
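As a concrete reminder of the model being analysed, the sketch below writes down the standard RBM energy for binary visible and hidden units and one block-Gibbs sampling step; the parameter names W, b, c follow the usual convention and are not notation taken from this article.

    # Standard restricted Boltzmann machine: energy function and one
    # block-Gibbs step for binary units (generic notation, not the article's).
    import numpy as np

    rng = np.random.default_rng(0)
    n_visible, n_hidden = 6, 3
    W = rng.normal(scale=0.1, size=(n_visible, n_hidden))   # visible-hidden interactions
    b = np.zeros(n_visible)                                  # visible biases
    c = np.zeros(n_hidden)                                   # hidden biases

    def energy(v, h):
        """E(v, h) = -v.W.h - b.v - c.h for binary vectors v, h."""
        return -(v @ W @ h + b @ v + c @ h)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def gibbs_step(v):
        """Sample hidden units given visible ones, then visible given hidden."""
        h = (rng.random(n_hidden) < sigmoid(c + v @ W)).astype(float)
        v_new = (rng.random(n_visible) < sigmoid(b + W @ h)).astype(float)
        return v_new, h

    v = (rng.random(n_visible) < 0.5).astype(float)
    v, h = gibbs_step(v)
    print(energy(v, h))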
Ignacio Vissani (2016)
Distributed software is becoming more and more dynamic to support applications able to respond and adapt to the changes of their execution environment. For instance, service-oriented computing (SOC) envisages applications as services running over globally available computational resources, where discovery and binding between them is transparently performed by a middleware. Asynchronous Relational Networks (ARNs) are a well-known formal orchestration model, based on hypergraphs, for the description of service-oriented software artefacts. Choreography and orchestration are the two main design principles for the development of distributed software. In this work, we propose Communicating Relational Networks (CRNs), a variant of ARNs that relies on choreographies for the characterisation of the communicational aspects of a software artefact, and for making their automated analysis more efficient.
Over recent years, deep learning-based computer vision systems have been applied to images at an ever-increasing pace, oftentimes representing the only type of consumption for those images. Given the dramatic explosion in the number of images generated per day, a question arises: how much better would an image codec targeting machine consumption perform against state-of-the-art codecs targeting human consumption? In this paper, we propose an image codec for machines which is neural network (NN) based and end-to-end learned. In particular, we propose a set of training strategies that address the delicate problem of balancing competing loss functions, such as computer vision task losses, image distortion losses, and rate loss. Our experimental results show that our NN-based codec outperforms the state-of-the-art Versatile Video Coding (VVC) standard on the object detection and instance segmentation tasks, achieving -37.87% and -32.90% BD-rate gains, respectively, while being fast thanks to its compact size. To the best of our knowledge, this is the first end-to-end learned machine-targeted image codec.
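The abstract mentions balancing a vision-task loss, a distortion loss and a rate loss during training; the snippet below only shows the generic weighted-sum form such an objective can take. The weights and the function name are illustrative assumptions, not the paper's actual training strategy or schedule.

    # Generic weighted combination of competing codec losses (illustrative
    # weights and name; not the paper's actual objective or schedule).
    def machine_codec_loss(task_loss, distortion_loss, rate_bits,
                           w_task=1.0, w_dist=0.1, w_rate=0.01):
        """Weighted sum of a vision-task loss, an image-distortion loss,
        and a rate term measured in bits."""
        return w_task * task_loss + w_dist * distortion_loss + w_rate * rate_bits

    print(machine_codec_loss(task_loss=0.8, distortion_loss=12.5, rate_bits=4096.0))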