
The worldwide spread of COVID-19 has prompted extensive online discussion, creating an "infodemic" on social media platforms such as WhatsApp and Twitter. However, the information shared on these platforms is prone to being unreliable and/or misleading. In this paper, we present the first analysis of COVID-19 discourse on public WhatsApp groups from Pakistan. Building on a large-scale annotation of thousands of messages containing text and images, we identify the main categories of discussion. We focus on COVID-19 messages and examine the different types of image and text messages being propagated. By exploring user behavior related to COVID-19 messages, we inspect how misinformation is spread. Finally, by quantifying the flow of information across WhatsApp and Twitter, we show how information spreads across platforms and how WhatsApp acts as a source for much of the information shared on Twitter.
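To make the cross-platform measurement concrete, the sketch below shows one way such flow could be quantified: matching WhatsApp and Twitter messages by text similarity and using timestamps to infer direction. The records, the Jaccard threshold, and the matching rule are illustrative assumptions, not the paper's actual pipeline (which also covers images).

# Hypothetical cross-platform matching sketch (Python); see hedging note above.
from datetime import datetime

def jaccard(a: str, b: str) -> float:
    # Word-level Jaccard similarity between two messages.
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

# Hypothetical records: (timestamp, message text).
whatsapp = [(datetime(2020, 4, 1, 9, 0), "garlic water cures covid says doctor")]
twitter = [(datetime(2020, 4, 1, 15, 30), "Doctor says garlic water cures COVID")]

THRESHOLD = 0.6
for wa_time, wa_text in whatsapp:
    for tw_time, tw_text in twitter:
        if jaccard(wa_text, tw_text) >= THRESHOLD:
            direction = "WhatsApp -> Twitter" if wa_time < tw_time else "Twitter -> WhatsApp"
            print(direction, "|", wa_text)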
The networking field has recently started to incorporate artificial intelligence (AI), machine learning (ML), and big data analytics, combined with advances in networking (such as software-defined networks, network functions virtualization, and programmable data planes), in a bid to construct highly optimized self-driving and self-organizing networks. It is worth remembering that the modern Internet, which interconnects millions of networks, is a "complex adaptive social system", in which interventions not only cause effects but the effects have further knock-on effects (not all of which are desirable or anticipated). We believe that self-driving networks will likely raise new unanticipated challenges (particularly in the human-facing domains of ethics, privacy, and security). In this paper, we propose the use of insights and tools from the field of systems thinking (a rich discipline that has been developing for more than half a century and encompasses qualitative and quantitative nonlinear models of complex social systems) and highlight their relevance for studying the long-term effects of network architectural interventions, particularly for self-driving networks. We show that these tools complement existing simulation and modeling tools and provide new insights and capabilities. To the best of our knowledge, this is the first study to consider the relevance of formal systems thinking tools for the analysis of self-driving networks.
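As a concrete illustration of the quantitative side of systems thinking, the sketch below simulates a toy stock-and-flow model in which a capacity-upgrade intervention feeds back into demand growth (induced demand), producing the kind of knock-on effect discussed above. The stocks, flows, and rate constants are illustrative assumptions, not a model from the paper.

# Toy system-dynamics (stock-and-flow) sketch (Python); all numbers are assumptions.
capacity, demand = 100.0, 80.0        # stocks (arbitrary traffic units)
dt, steps = 0.1, 500                  # Euler integration step and horizon

for _ in range(steps):
    utilization = demand / capacity
    performance = max(0.0, 1.0 - utilization)            # headroom as a crude performance proxy
    upgrade_flow = 5.0 if utilization > 0.9 else 0.0     # intervention: add capacity when congested
    demand_growth = demand * 0.02 * (0.5 + performance)  # knock-on effect: better performance spurs demand
    capacity += dt * upgrade_flow
    demand += dt * demand_growth

print(f"capacity={capacity:.0f} demand={demand:.0f} utilization={demand / capacity:.2f}")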
While machine learning and artificial intelligence have long been applied in networking research, the bulk of such work has focused on supervised learning. Recently, there has been a rising trend of employing unsupervised machine learning, using unstructured raw network data, to improve network performance and provide services such as traffic engineering, anomaly detection, Internet traffic classification, and quality of service optimization. The interest in applying unsupervised learning techniques in networking stems from their great success in other fields such as computer vision, natural language processing, speech recognition, and optimal control (e.g., for developing autonomous self-driving cars). Unsupervised learning is attractive because it removes the need for labeled data and manual handcrafted feature engineering, thereby facilitating flexible, general, and automated methods of machine learning. The focus of this survey paper is to provide an overview of the applications of unsupervised learning in the domain of networking. We provide a comprehensive survey highlighting recent advancements in unsupervised learning techniques and describe their applications for various learning tasks in the context of networking. We also provide a discussion of future directions and open research issues, while identifying potential pitfalls. While a few survey papers focusing on the applications of machine learning in networking have previously been published, a survey of similar scope and breadth is missing from the literature. Through this paper, we advance the state of knowledge by carefully synthesizing the insights from these survey papers while also providing contemporary coverage of recent advances.
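As a minimal, self-contained example of the kind of unsupervised technique surveyed, the sketch below clusters unlabeled flow records with k-means to obtain coarse traffic groups without labels or elaborate feature engineering. The feature choice (mean packet size, flow duration, packet rate) and the synthetic data are assumptions made only for illustration.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic flow records: [mean packet size (B), duration (s), packets/s].
bulk = rng.normal([1200, 30.0, 800], [100, 5.0, 100], size=(50, 3))       # download/video-like flows
interactive = rng.normal([150, 2.0, 40], [30, 0.5, 10], size=(50, 3))     # chat/control-like flows
probes = rng.normal([60, 0.1, 5], [10, 0.05, 2], size=(10, 3))            # scan/probe-like flows
flows = np.vstack([bulk, interactive, probes])

# Unsupervised grouping of the flows into three clusters.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(flows)
for k in range(3):
    members = flows[labels == k]
    print(f"cluster {k}: {len(members)} flows, mean packet size ~{members[:, 0].mean():.0f} B")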
Cell injection is a technique in the domain of biological cell micro-manipulation for the delivery of small volumes of samples into suspended or adherent cells. It has been widely applied in various areas, such as gene injection, in-vitro fertilization (IVF), intracytoplasmic sperm injection (ICSI), and drug development. However, existing manual and semi-automated cell injection systems require lengthy training and suffer from a high probability of contamination and a low success rate. In the recently introduced fully automated cell injection systems, the injection force plays a vital role in the success of the process, since even a tiny excessive force can destroy the membrane or tissue of the biological cell. Traditionally, force control algorithms are analyzed using simulation, which is inherently non-exhaustive and incomplete in terms of detecting system failures. Moreover, the uncertainties in the system are generally ignored in the analysis. To overcome these limitations, we present a formal analysis methodology based on probabilistic model checking to analyze a robotic cell injection system utilizing the impedance force control algorithm. The proposed methodology, developed using the PRISM model checker, allowed us to find a discrepancy in the algorithm that was not detected by any of the previous analyses using traditional methods.
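For readers unfamiliar with the class of algorithm being verified, the sketch below gives an admittance-style, discrete-time rendering of impedance force control against a spring-like cell membrane: the force error drives a virtual mass-damper so the contact force settles at the desired injection force. The membrane model, gains, and units are illustrative assumptions; the paper's PRISM model and its probabilistic treatment of uncertainties are not reproduced here.

# Discrete-time impedance force control sketch (Python); all parameters are assumptions.
k_cell = 4.0          # assumed membrane stiffness (arbitrary consistent units)
f_des = 5.0           # desired injection force
m_d, b_d = 1.0, 4.0   # virtual mass and damping of the impedance relation
dt = 1e-3

x, v = 0.0, 0.0       # injector tip position and velocity
x_cell = 1.0          # position at which the tip contacts the membrane

for _ in range(5000):
    f_contact = k_cell * (x - x_cell) if x > x_cell else 0.0
    # Impedance relation m_d*dv/dt + b_d*v = f_des - f_contact: the force error
    # shapes the tip motion through a virtual mass-damper instead of commanding
    # position directly, so the contact force approaches f_des without overshoot.
    a = (f_des - f_contact - b_d * v) / m_d
    v += a * dt
    x += v * dt

print(f"final contact force = {k_cell * max(x - x_cell, 0.0):.2f} (target {f_des})")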
The explosive increase in the number of smart devices hosting sophisticated applications is rapidly affecting the landscape of the information communication technology industry. Mobile subscriptions, expected to reach 8.9 billion by 2022, will drastically increase the demand for extra capacity, with aggregate throughput anticipated to be enhanced by a factor of 1000. In an already crowded radio spectrum, it becomes increasingly difficult to meet the ever-growing application demands for wireless bandwidth. It has been shown that the allocated spectrum is seldom utilized by the primary users and hence contains spectrum holes that may be exploited by unlicensed users for their communication. As we enter the Internet of Things (IoT) era, in which appliances of common use will become smart digital devices with rigid performance requirements (such as low latency, energy efficiency, etc.), current networks face the vexing problem of how to create sufficient capacity for such applications. The fifth generation of cellular networks (5G), envisioned to address these challenges, is thus required to incorporate cognition and intelligence to resolve the aforementioned issues.
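As an illustration of how an unlicensed user could detect the spectrum holes mentioned above, the sketch below implements simple energy detection: compare the average energy of received samples against a threshold set above the noise floor. The noise power, threshold, and synthetic primary signal are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(1)
n_samples, noise_power = 1000, 1.0

def band_is_occupied(primary_active: bool) -> bool:
    # Energy detector: average received power vs. a threshold above the noise floor.
    noise = rng.normal(0.0, np.sqrt(noise_power), n_samples)
    signal = np.sqrt(2.0) * np.sin(2 * np.pi * 0.1 * np.arange(n_samples)) if primary_active else 0.0
    received = signal + noise
    energy = np.mean(received ** 2)
    threshold = 1.5 * noise_power
    return energy > threshold

for active in (True, False):
    verdict = "occupied" if band_is_occupied(active) else "spectrum hole: usable by a secondary user"
    print(f"primary user active={active}: {verdict}")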
