Motivated by the application problem of sensor fusion, the author introduces the concept of a graded set. It is argued that in classification problems arising in an information system (represented by an information table), a novel set, called a granular set, naturally arises; indeed, granular sets arise in any hierarchical classification problem. Moreover, when the target set of objects forms a graded set, its lower and upper approximations also form graded sets, which generalizes the concept of a rough set. It is hoped that a detailed theory of granular/graded sets will find several applications.
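As a point of reference for the generalization described above, the classical rough-set approximations can be computed directly from an information table. Below is a minimal Python sketch (the universe, attribute map, and target set are hypothetical, not taken from the paper): the lower approximation collects the equivalence classes contained in the target set, and the upper approximation collects the classes that intersect it.

```python
# Minimal sketch of classical rough-set approximations.
# Universe, attribute map, and target set are hypothetical examples.
from collections import defaultdict

def approximations(universe, attr, target):
    """Partition `universe` by attribute value, then return the lower
    and upper approximations of `target` under that partition."""
    classes = defaultdict(set)
    for obj in universe:
        classes[attr[obj]].add(obj)
    lower = {x for c in classes.values() if c <= target for x in c}
    upper = {x for c in classes.values() if c & target for x in c}
    return lower, upper

universe = {1, 2, 3, 4, 5, 6}
attr = {1: "a", 2: "a", 3: "b", 4: "b", 5: "c", 6: "c"}
target = {1, 2, 3}
print(approximations(universe, attr, target))
# ({1, 2}, {1, 2, 3, 4}); the difference is the boundary region
```

In a graded/granular setting, the single partition above would presumably be replaced by a hierarchy of granulations, yielding one pair of approximations per level.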
We lay the groundwork for a formal framework that studies scientific theories and can serve as a unified foundation for the different theories within physics. We define a scientific theory as a set of verifiable statements, assertions that can be shown to be true with an experimental test in finite time. By studying the algebra of such objects, we show that verifiability already provides severe constraints. In particular, it requires that a set of physically distinguishable cases is naturally equipped with the mathematical structures (i.e. second-countable Kolmogorov topologies and $\sigma$-algebras) that form the foundation of manifold theory, differential geometry, measure theory, probability theory and all the major branches of mathematics currently used in physics. This gives a clear physical meaning to those mathematical structures and provides a strong justification for their use in science. Most importantly, it provides a formal framework to incorporate additional assumptions and constrain the search space for new physical theories.
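To make the connection between verifiability and topology concrete, here is the standard observation the abstract alludes to, sketched in our own notation (a reconstruction, not a quotation from the paper): verifiable statements are closed under finite conjunction and countable disjunction, which are exactly the closure properties of open sets.

```latex
% Sketch in our notation (a reconstruction, not the paper's text).
% A statement is verifiable if a test can confirm it in finite time:
%   - run two tests one after the other  => finite conjunctions of
%     verifiable statements are verifiable;
%   - interleave countably many tests    => countable disjunctions of
%     verifiable statements are verifiable.
\begin{align*}
  s_1, s_2 \text{ verifiable} &\;\Longrightarrow\; s_1 \wedge s_2 \text{ verifiable},\\
  s_1, s_2, \ldots \text{ verifiable} &\;\Longrightarrow\; \bigvee_{i \in \mathbb{N}} s_i \text{ verifiable}.
\end{align*}
% Identifying each statement with the set of cases in which it holds,
% these closure properties mirror those of the open sets of a topology,
% and the generated collection of sets extends to a $\sigma$-algebra.
```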
Conformance checking techniques are widely adopted to pinpoint possible discrepancies between process models and the execution of the process in reality. However, state-of-the-art approaches adopt a crisp evaluation of deviations, with the result that small violations are considered on the same level as significant ones. This affects the quality of the provided diagnostics, especially when there exists some tolerance with respect to reasonably small violations, and hampers the flexibility of the process. In this work, we propose a novel approach that allows us to represent actors' tolerance with respect to violations and to account for the severity of deviations when assessing the compliance of executions. We argue that, besides improving the quality of the provided diagnostics, allowing some tolerance in the assessment of deviations also enhances the flexibility of conformance checking techniques and, indirectly, paves the way for improving the resilience of the overall process management system.
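A minimal sketch of the general idea (ours, not the paper's formalization): replace the crisp violated/not-violated flag with a severity score, so that deviations within an actor's tolerance are discounted and only large ones count as full violations. All thresholds below are hypothetical.

```python
# Illustrative severity grading for a deviation (not the paper's method).
def severity(deviation, tolerance, hard_limit):
    """Map a numeric deviation to a severity in [0, 1]: 0 within the
    actor's tolerance, 1 beyond the hard limit, linear in between."""
    if deviation <= tolerance:
        return 0.0
    if deviation >= hard_limit:
        return 1.0
    return (deviation - tolerance) / (hard_limit - tolerance)

# Example: an activity exceeded its deadline by 0.4 and by 2.0 days.
print(severity(0.4, tolerance=0.5, hard_limit=5.0))  # 0.0, tolerated
print(severity(2.0, tolerance=0.5, hard_limit=5.0))  # ~0.33, moderate
```

Aggregating such scores over a trace then yields a graded compliance measure instead of a binary conformant/deviant verdict.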
In this paper we show how, under certain restrictions, the hydrodynamic equations for the freely evolving granular fluid fit within the framework of the time-dependent Landau-Ginzburg (LG) models for critical and unstable fluids (e.g. spinodal decomposition). The granular fluid, which is usually modeled as a fluid of inelastic hard spheres (IHS), exhibits two instabilities: the spontaneous formation of vortices and of high-density clusters. We suppress the clustering instability by imposing constraints on the system size, in order to illustrate how LG equations can be derived for the order parameter, the rate-of-deformation or shear-rate tensor, which controls the formation of vortex patterns. From the shape of the energy functional we obtain the stationary patterns in the flow field. Quantitative predictions of this theory for the stationary states agree well with molecular dynamics simulations of a fluid of inelastic hard disks.
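For orientation, the generic form of a time-dependent LG model is shown below (standard textbook form; the paper's specific functional for the shear-rate order parameter may differ in its coefficients and tensorial structure):

```latex
% Generic time-dependent Landau-Ginzburg relaxation dynamics for an
% order parameter $\psi$ (standard form; the paper's functional for
% the shear-rate tensor may differ in detail).
\begin{equation*}
  \partial_t \psi = -\Gamma\,\frac{\delta F[\psi]}{\delta \psi},
  \qquad
  F[\psi] = \int d\mathbf{r}\,\Bigl[
      \tfrac{1}{2}\,a\,\psi^2 + \tfrac{1}{4}\,b\,\psi^4
      + \tfrac{1}{2}\,c\,\lvert\nabla\psi\rvert^2 \Bigr],
\end{equation*}
% where $a < 0$ signals the instability, $b > 0$ saturates its growth,
% and the gradient term sets the length scale of the stationary
% patterns obtained by minimizing $F$.
```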
Theory of Mind is commonly defined as the ability to attribute mental states (e.g., beliefs, goals) to oneself and to others. A large body of previous work, from the social sciences to artificial intelligence, has observed that Theory of Mind capabilities are central to providing an explanation to another agent or to explaining that agent's behaviour. In this paper, we build and expand upon previous work by providing an account of explanation in terms of the beliefs of agents and the mechanism by which agents revise their beliefs given possible explanations. We further identify a set of desiderata for explanations that utilize Theory of Mind. These desiderata inform our belief-based account of explanation.
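As a toy illustration of the general idea (ours, not the paper's formal account): an explainer can select, among candidate explanations, one that makes the observed behaviour follow from the explainee's revised beliefs. The beliefs, rules, and candidates below are hypothetical.

```python
# Toy sketch: pick explanations that, once added to the explainee's
# beliefs, let the observed behaviour be derived. All atoms and rules
# here are hypothetical.
def entails(beliefs, rules, goal):
    """Forward-chain over (premises, conclusion) rules to a fixpoint."""
    derived, changed = set(beliefs), True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return goal in derived

def explain(explainee_beliefs, rules, observation, candidates):
    """Keep the candidate explanations that make the observation follow."""
    return [e for e in candidates
            if entails(explainee_beliefs | {e}, rules, observation)]

rules = [({"raining", "has_umbrella"}, "opens_umbrella")]
print(explain({"has_umbrella"}, rules, "opens_umbrella",
              ["raining", "sunny"]))  # ['raining']
```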
We consider how an agent should update her uncertainty when it is represented by a set P of probability distributions and the agent observes that a random variable X takes on value x, given that the agent makes decisions using the minimax criterion, perhaps the best-studied and most commonly used criterion in the literature. We adopt a game-theoretic framework in which the agent plays against a bookie, who chooses some distribution from P. We consider two reasonable games that differ in what the bookie knows when he makes his choice. Anomalies that have been observed before, like time inconsistency, can be understood as arising because different games are being played against bookies with different information. We characterize the important special cases in which the optimal decision rules according to the minimax criterion amount to either conditioning or simply ignoring the information. Finally, we consider the relationship between conditioning and calibration when uncertainty is described by sets of probabilities.
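To illustrate the kind of update rules at stake (our toy example, not the paper's two games): with a set P of distributions, the minimax-optimal action can change once every member of P is conditioned on the observation X = x. All numbers below are hypothetical.

```python
# Toy minimax decision with a set P of distributions, before and after
# conditioning each member of P on an observed event. Hypothetical numbers.
def condition(p, event):
    """Condition distribution p (dict: world -> prob) on `event`."""
    mass = sum(q for w, q in p.items() if w in event)
    return {w: (q / mass if w in event else 0.0) for w, q in p.items()}

def minimax_action(P, utilities):
    """Choose the action with the best worst-case expected utility over P."""
    def worst_case(action):
        return min(sum(p[w] * u for w, u in utilities[action].items())
                   for p in P)
    return max(utilities, key=worst_case)

P = [{"w1": 0.2, "w2": 0.8}, {"w1": 0.6, "w2": 0.4}]
utilities = {"act":  {"w1": 1.0, "w2": 0.0},
             "pass": {"w1": 0.5, "w2": 0.5}}
print(minimax_action(P, utilities))           # 'pass' a priori
P_cond = [condition(p, {"w1"}) for p in P]    # observe X = x, here {w1}
print(minimax_action(P_cond, utilities))      # 'act' after conditioning
```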