
Stream Computing

Authors: Subhash Kak
Publication date: 2008
Language: English





Stream computing is the use of multiple autonomic and parallel modules together with integrative processors at a higher level of abstraction to embody intelligent processing. The biological basis of this computing is sketched and the matter of learning is examined.
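As a rough illustration of the architecture the abstract describes, the sketch below runs a few autonomous modules in parallel over a shared input stream and fuses their outputs in an integrative step at a higher level of abstraction. The module names, the fusion rule, and the thread-based parallelism are assumptions made for illustration and are not taken from the paper.

```python
from concurrent.futures import ThreadPoolExecutor

def edge_module(x):
    # Hypothetical low-level module: extracts a toy "edge" feature.
    return ("edge", x % 2)

def motion_module(x):
    # A second autonomous module working on the same stream item.
    return ("motion", x * 0.1)

def integrate(outputs):
    # Integrative processor: fuse module outputs into a single
    # higher-level interpretation of the current stream item.
    return dict(outputs)

stream = range(5)                      # stands in for a continuous input stream
modules = [edge_module, motion_module]

with ThreadPoolExecutor(max_workers=len(modules)) as pool:
    for item in stream:
        # Each module processes the item autonomously and in parallel.
        outputs = list(pool.map(lambda m: m(item), modules))
        print(integrate(outputs))
```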



Related research

Stream constraint programming is a recent addition to the family of constraint programming frameworks, where variable domains are sets of infinite streams over finite alphabets. Previous work showed promising results for its applicability to real-world planning and control problems. In this paper, motivated by the modelling of planning applications, we improve the expressiveness of the framework by introducing 1) the until constraint, a new construct adapted from Linear Temporal Logic, and 2) the @ operator on streams, syntactic sugar for which we provide a solving algorithm more efficient than simple desugaring. For both constructs, we propose corresponding novel solving algorithms and prove their correctness. We present competitive experimental results on the Missionaries and Cannibals logic puzzle and a standard path-planning application on the grid, comparing with Apt and Brand's method for verifying eventuality conditions using a CP approach.
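For intuition only, the snippet below checks an LTL-style "p until q" condition on a finite prefix of a stream: q must eventually hold, and p must hold at every earlier position. It is a plain bounded check, not the solver-level propagation algorithm the paper proposes.

```python
def until(stream_prefix, p, q):
    """Bounded check of 'p until q' on a finite prefix of a stream."""
    for value in stream_prefix:
        if q(value):
            return True        # q reached, and p held at every earlier point
        if not p(value):
            return False       # p violated before q ever held
    return False               # q never held on this finite prefix

# Example: "stay below 10 until a zero appears".
print(until([3, 7, 2, 0, 99], p=lambda v: v < 10, q=lambda v: v == 0))  # True
```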
The World Wide Web continues to evolve and serve as the infrastructure for carrying massive amounts of multimodal and multisensory observations. These observations capture various situations pertinent to people's needs and interests along with all their idiosyncrasies. To support human-centered computing that empowers people to make better and timely decisions, we look towards computation that is inspired by human perception and cognition. Toward this goal, we discuss the computing paradigms of semantic computing, cognitive computing, and an emerging aspect of computing, which we call perceptual computing. In our view, these offer a continuum for making the most of the vast, growing, and diverse data pertinent to human needs and interests. We propose details of perceptual computing characterized by interpretation and exploration operations comparable to the interleaving of bottom- and top-brain processing. This article consists of two parts. First we describe semantic computing, cognitive computing, and perceptual computing to lay out distinctions while acknowledging their complementary capabilities. We then provide a conceptual overview of the newest of these three paradigms, perceptual computing. For further insight, we focus on an application scenario of asthma management that converts massive, heterogeneous, and multimodal (big) data into actionable information, or smart data.
Luis A. Pineda (2019)
The Turing Machine is the paradigmatic case of computing machines, but there are others, such as Artificial Neural Networks, quantum computing, holography, and diverse forms of analogical computing, each based on a particular intuition about the phenomenon of computing. This variety can be captured in terms of system levels, re-interpreting and generalizing Newell's hierarchy, which includes the knowledge level at the top and the symbol level immediately below it. In this re-interpretation the knowledge level consists of human knowledge and the symbol level is generalized into a new level that is here called The Mode of Computing. Natural computing performed by the brains of humans and of non-human animals with a sufficiently developed neural system should be understood in terms of a hierarchy of system levels too. By analogy with standard computing machinery, there must be a system level above the neural circuitry and directly below the knowledge level, which is named here The Mode of Natural Computing. A central question for Cognition is the characterization of this mode. The Mode of Computing provides a novel perspective on the phenomena of computing and interpreting, on the representational and non-representational views of cognition, and on consciousness.
Efficient decision-making over continuously changing data is essential for many application domains such as cyber-physical systems, industry digitalization, etc. Modern stream reasoning frameworks allow one to model and solve various real-world problems using incremental and continuous evaluation of programs as new data arrives in the stream. Applied techniques use, e.g., Datalog-like materialization or truth maintenance algorithms to avoid costly re-computations, thus ensuring low latency and high throughput of a stream reasoner. However, the expressiveness of existing approaches is quite limited and, e.g., they cannot be used to encode problems with constraints, which often appear in practice. In this paper, we suggest a novel approach that uses the Conflict-Driven Constraint Learning (CDCL) to efficiently update legacy solutions by using intelligent management of learned constraints. In particular, we study the applicability of reinforcement learning to continuously assess the utility of learned constraints computed in previous invocations of the solving algorithm for the current one. Evaluations conducted on real-world reconfiguration problems show that providing a CDCL algorithm with relevant learned constraints from previous iterations results in significant performance improvements of the algorithm in stream reasoning scenarios. Under consideration for acceptance in TPLP.
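The sketch below illustrates one way such constraint management could look: learned constraints carried between solver invocations are scored with a simple moving-average utility estimate, and only the highest-scoring ones are re-asserted in the next run. The generic interface, the scoring rule, and the parameter values are assumptions for illustration, not the paper's implementation.

```python
ALPHA = 0.3        # learning rate of the utility estimate (assumed value)
KEEP_TOP = 100     # how many learned constraints to carry over (assumed value)

utilities = {}     # constraint -> estimated utility

def update_utilities(used_constraints, carried_constraints):
    # Reward constraints that were actually used in the last invocation,
    # decay the estimate of those that were carried over but stayed idle.
    for c in carried_constraints:
        reward = 1.0 if c in used_constraints else 0.0
        old = utilities.get(c, 0.0)
        utilities[c] = old + ALPHA * (reward - old)

def select_for_next_run(learned_constraints):
    # Re-assert only the constraints with the highest estimated utility.
    ranked = sorted(learned_constraints,
                    key=lambda c: utilities.get(c, 0.0), reverse=True)
    return ranked[:KEEP_TOP]

# Toy usage with constraints represented as opaque strings.
carried = ["c1", "c2", "c3"]
update_utilities(used_constraints={"c1"}, carried_constraints=carried)
print(select_for_next_run(carried))   # "c1" ranks first after being used
```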
Content-Centric Networking (CCN) research addresses the mismatch between the modern usage of the Internet and its outdated architecture. Importantly, CCN routers may locally cache frequently requested content in order to speed up delivery to end users. Thus, the issue of caching strategies arises, i.e., which content shall be stored and when it should be replaced. In this work, we employ novel techniques towards intelligent administration of CCN routers that autonomously switch between existing strategies in response to changing content request patterns. In particular, we present a router architecture for CCN networks that is controlled by rule-based stream reasoning, following the recent formal framework LARS which extends Answer Set Programming for streams. The obtained possibility for flexible router configuration at runtime allows for faster experimentation and may thus help to advance the further development of CCN. Moreover, the empirical evaluation of our feasibility study shows that the resulting caching agent may give significant performance gains.
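As a toy stand-in for the rule-based control loop described above, the sketch below watches a sliding window of content requests and switches between recency-based and frequency-based caching when a few items dominate the window. The single hand-written rule replaces the LARS/ASP reasoning used in the paper, and the window size and threshold are arbitrary.

```python
from collections import Counter, deque

WINDOW = 100     # size of the sliding window of observed requests (assumed)

class CachingAgent:
    def __init__(self):
        self.recent = deque(maxlen=WINDOW)   # recent content names
        self.strategy = "LRU"

    def observe(self, content_name):
        self.recent.append(content_name)
        self.reason()

    def reason(self):
        # Hand-written rule standing in for the LARS/ASP program: if a few
        # items dominate the window, prefer frequency-based caching (LFU);
        # otherwise fall back to recency-based caching (LRU).
        counts = Counter(self.recent)
        top_share = max(counts.values()) / len(self.recent)
        self.strategy = "LFU" if top_share > 0.2 else "LRU"

agent = CachingAgent()
for name in ["/video/a"] * 30 + ["/news/b", "/news/c"] * 10:
    agent.observe(name)
print(agent.strategy)   # "LFU": one item dominates the recent window
```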
