
A two-step fusion process for multi-criteria decision applied to natural hazards in mountains

Posted by Jean Dezert
Publication date 2010
Research field Informatics Engineering
Language English
Author Jean-Marc Tacnet





Mountain river torrents and snow avalanches cause human and material damage with dramatic consequences. Knowledge about these natural phenomena is often lacking, and expertise is required for decision and risk-management purposes, using multi-disciplinary quantitative or qualitative approaches. Expertise is considered as a decision process based on imperfect information coming from more or less reliable and conflicting sources. A methodology is described that combines the Analytic Hierarchy Process (AHP), a multi-criteria decision-aid method, with information fusion based on Belief Function Theory. Fuzzy Set and Possibility theories are used to transform quantitative and qualitative criteria into a common frame of discernment for decision in the Dempster-Shafer Theory (DST) and Dezert-Smarandache Theory (DSmT) contexts. The main issues concern the elicitation of basic belief assignments, the identification and management of conflict, the choice of fusion rules, and the validation of results, as well as the specific need to distinguish between importance, reliability, and uncertainty in the fusion process.
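As a concrete illustration of the belief-function fusion step, the sketch below implements Dempster's rule of combination for two expert sources over a toy two-hypothesis frame of discernment. The frame, the expert masses, and the function names are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of Dempster's rule of combination (DST) over a toy frame.
def dempster_combine(m1, m2):
    """Combine two basic belief assignments (dicts: frozenset -> mass)."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    # Normalise by the non-conflicting mass
    return {s: m / (1.0 - conflict) for s, m in combined.items()}, conflict

# Frame of discernment (invented): a site is Exposed (E) or Not exposed (N)
E, N = frozenset({"E"}), frozenset({"N"})
EN = E | N  # total ignorance

expert1 = {E: 0.6, EN: 0.4}  # leans towards "exposed"
expert2 = {N: 0.5, EN: 0.5}  # leans the other way

fused, k = dempster_combine(expert1, expert2)
print({tuple(sorted(s)): round(m, 3) for s, m in fused.items()},
      "conflict =", round(k, 3))
```

The printed conflict value (0.3 here) is exactly the quantity the paper flags as needing identification and management before choosing a fusion rule; DSmT rules differ from Dempster's mainly in how this conflicting mass is redistributed.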




Read also

In this paper, we present a case study demonstrating how dynamic and uncertain criteria can be incorporated into a multicriteria analysis with the help of discrete event simulation. The simulation-guided multicriteria analysis can include both monetary and non-monetary criteria that are static or dynamic, whereas standard multicriteria analysis only deals with static criteria and cost-benefit analysis only deals with static monetary criteria. The dynamic and uncertain criteria are incorporated by using simulation to explore how the decision options perform. The results of the simulation are then fed into the multicriteria analysis. By enabling the incorporation of dynamic and uncertain criteria, the dynamic multiple criteria analysis was able to take a unique perspective of the problem. The highest-ranked option returned by the dynamic multicriteria analysis differed from those returned by the other decision-aid techniques.
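A minimal sketch of that pipeline, with invented options, criteria, weights, and distributions (the paper's case study is not reproduced): a Monte Carlo simulation estimates the dynamic criteria, and a weighted sum with min-max normalisation ranks the options.

```python
# Hedged sketch: simulation-fed multicriteria analysis with invented data.
import random

OPTIONS = {
    # each option: (mean, std) for annual cost and for service level
    "option_A": {"cost": (100.0, 15.0), "service": (0.80, 0.05)},
    "option_B": {"cost": (120.0, 5.0),  "service": (0.90, 0.02)},
}
WEIGHTS = {"cost": 0.4, "service": 0.6}  # assumed decision-maker weights

def simulate(option, runs=10_000):
    """Return mean simulated value of each dynamic criterion for an option."""
    cost = sum(random.gauss(*option["cost"]) for _ in range(runs)) / runs
    service = sum(random.gauss(*option["service"]) for _ in range(runs)) / runs
    return {"cost": cost, "service": service}

def score(values, all_values):
    """Weighted sum after min-max normalisation; cost is 'less is better'."""
    total = 0.0
    for crit, w in WEIGHTS.items():
        vals = [v[crit] for v in all_values]
        lo, hi = min(vals), max(vals)
        norm = (values[crit] - lo) / (hi - lo) if hi > lo else 0.5
        if crit == "cost":
            norm = 1.0 - norm  # lower cost is better
        total += w * norm
    return total

results = {name: simulate(spec) for name, spec in OPTIONS.items()}
ranking = sorted(results,
                 key=lambda n: score(results[n], list(results.values())),
                 reverse=True)
print("ranking:", ranking)
```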
Because basic uncertain information provides a simple form for decision information with a certainty degree, it has been developed to reflect the quality of observed or subjective assessments. In order to study the algebraic structure and preference relations of basic uncertain information, we develop some algebraic operations for it. The order relation of this type of information is also considered. Finally, to apply the developed algebraic operations and order relations, a generalized TODIM method for multi-attribute decision making with basic uncertain information is given. A numerical example shows that the developed decision procedure is valid.
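A hedged sketch of the underlying data structure: basic uncertain information as a (value, certainty) pair, with an assumed weighted-average aggregation that averages values and certainty degrees separately. The paper's actual algebra operations may differ.

```python
# Hedged sketch: basic uncertain information (BUI) as a (value, certainty)
# pair, with an assumed separate weighted average of the two components.
from dataclasses import dataclass

@dataclass(frozen=True)
class BUI:
    value: float      # assessed value in [0, 1]
    certainty: float  # certainty degree in [0, 1]

def weighted_average(buis, weights):
    """Aggregate BUI pairs with weights summing to 1 (assumed operation)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    v = sum(w * b.value for b, w in zip(buis, weights))
    c = sum(w * b.certainty for b, w in zip(buis, weights))
    return BUI(v, c)

# Three attribute assessments with differing certainty degrees (invented)
assessments = [BUI(0.7, 0.9), BUI(0.4, 0.6), BUI(0.8, 1.0)]
print(weighted_average(assessments, [0.5, 0.3, 0.2]))
```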
A video-grounded dialogue system is required to understand both dialogue, which contains semantic dependencies from turn to turn, and video, which contains visual cues of spatial and temporal scene variations. Building such dialogue systems is a challenging problem, involving various reasoning types on both visual and language inputs. Existing benchmarks do not have enough annotations to thoroughly analyze dialogue systems and understand their capabilities and limitations in isolation. These benchmarks are also not explicitly designed to minimise biases that models can exploit without actual reasoning. To address these limitations, in this paper we present DVD, a Diagnostic Dataset for Video-grounded Dialogues. The dataset is designed to contain minimal biases and has detailed annotations for the different types of reasoning over the spatio-temporal space of video. Dialogues are synthesized over multiple question turns, each of which is injected with a set of cross-turn semantic relationships. We use DVD to analyze existing approaches, providing interesting insights into their abilities and limitations. In total, DVD is built from $11k$ CATER synthetic videos and contains $10$ instances of $10$-round dialogues for each video, resulting in more than $100k$ dialogues and $1M$ question-answer pairs. Our code and dataset are publicly available at https://github.com/facebookresearch/DVDialogues.
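A quick sanity check of the quoted sizes, assuming one question-answer pair per dialogue round (the correspondence between rounds and QA pairs is our assumption):

```python
# Arithmetic check of the dataset sizes quoted above.
videos = 11_000
dialogues_per_video = 10
rounds_per_dialogue = 10

dialogues = videos * dialogues_per_video    # 110,000 -> "more than 100k"
qa_pairs = dialogues * rounds_per_dialogue  # 1,100,000 -> "~1M"
print(dialogues, qa_pairs)
```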
Each individual handles many tasks of finding the most profitable option from a set of options that stochastically provide rewards. Our society comprises a collection of such individuals, and the society is expected to maximise the total rewards, while the individuals compete for common rewards. Such collective decision making is formulated as the competitive multi-armed bandit problem (CBP), which requires a huge computational cost. Herein, we demonstrate a prototype of an analog computer that efficiently solves CBPs by exploiting the physical dynamics of numerous fluids in coupled cylinders. This device enables the maximisation of the total rewards for the society without paying the conventionally required computational cost; this is because the fluids estimate the reward probabilities of the options for the exploitation of past knowledge and generate random fluctuations for the exploration of new knowledge. Our results suggest that, to optimise the social rewards, the utilisation of fluid-derived natural fluctuations is more advantageous than applying artificial external fluctuations. Our analog computing scheme is expected to trigger further studies for harnessing the huge computational power of natural phenomena to resolve a wide variety of complex problems in modern information society.
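A software caricature of the setting (not the fluid device itself): two epsilon-greedy players share three arms with invented reward probabilities, collisions split the reward, and random exploration stands in for the fluid-derived fluctuations.

```python
# Hedged sketch: a digital stand-in for the competitive multi-armed bandit
# problem (CBP). All parameters below are invented for illustration.
import random

ARM_PROBS = [0.2, 0.5, 0.8]  # true (unknown) reward probabilities
EPSILON = 0.1                # exploration rate, an assumed stand-in

def play(rounds=10_000, players=2, arms=len(ARM_PROBS)):
    counts = [[0] * arms for _ in range(players)]
    wins = [[0.0] * arms for _ in range(players)]
    total_reward = 0.0
    for _ in range(rounds):
        choices = []
        for p in range(players):
            if random.random() < EPSILON:
                a = random.randrange(arms)  # explore via random fluctuation
            else:
                # exploit past knowledge: empirical estimates, optimistic init
                est = [wins[p][i] / counts[p][i] if counts[p][i] else 1.0
                       for i in range(arms)]
                a = max(range(arms), key=est.__getitem__)
            choices.append(a)
        for p, a in enumerate(choices):
            # Players choosing the same arm compete: the reward is shared,
            # so collisions reduce the social payoff.
            reward = (random.random() < ARM_PROBS[a]) / choices.count(a)
            counts[p][a] += 1
            wins[p][a] += reward
            total_reward += reward
    return total_reward

print("total social reward:", play())
```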
With the discovery of a particle that seems rather consistent with the minimal Standard Model Higgs boson, attention turns to questions of naturalness, fine-tuning, and what they imply for physics beyond the Standard Model and its discovery prospects at run II of the LHC. In this article we revisit the issue of naturalness, discussing some implicit assumptions that underly some of the most common statements, which tend to assign physical significance to certain regularization procedures. Vague arguments concerning fine-tuning can lead to conclusions that are too strong and perhaps not as generic as one would hope. Instead, we explore a more pragmatic definition of the hierarchy problem that does not rely on peeking beyond the murky boundaries of quantum field theory: we investigate the fine-tuning of the electroweak scale associated with thresholds from heavy particles, which is both calculable and dependent on the nature of the would-be ultraviolet completion of the Standard Model. We discuss different manifestations of new high-energy scales that are favored by experimental hints for new physics with an eye toward making use of fine-tuning in order to determine natural regions of the new physics parameter spaces.
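For orientation, the schematic one-loop estimate below (a standard textbook expression, not a result of this article) shows why a heavy threshold at mass $M$ destabilises the electroweak scale: the correction to the Higgs mass parameter grows quadratically with $M$.

```latex
% Schematic one-loop threshold correction to the Higgs mass parameter from a
% heavy particle of mass M coupled to the Higgs with strength g; the exact
% coefficient and sign depend on the particle's spin and quantum numbers.
\delta m_h^2 \sim \frac{g^2}{16\pi^2}\, M^2,
\qquad
\text{fine-tuning} \;\sim\; \frac{\delta m_h^2}{m_h^2}
\;\approx\; \frac{g^2}{16\pi^2}\,\frac{M^2}{(125\,\text{GeV})^2}.
```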

