For IoT to reach its full potential, the sharing and reuse of information across different applications and verticals is of paramount importance. However, there is a plethora of IoT platforms using different representations, protocols and interaction patterns. To address this issue, the Fed4IoT project has developed an IoT virtualization platform that, on the one hand, integrates information from many different source platforms and, on the other hand, makes the information required by the respective users available in the target platform of their choice. To enable this, information is translated into a common, neutral exchange format. The format of choice is NGSI-LD, which is being standardized by the ETSI Industry Specification Group on Context Information Management (ETSI ISG CIM). ThingVisors are the components that translate the source information to NGSI-LD, which is then delivered to the target platform and translated into the target format. ThingVisors can be implemented by hand, but this requires significant human effort, especially given the heterogeneity of the low-level information produced by a multitude of sensors. Thus, supporting the human developer and, ideally, fully automating the process of extracting, enriching and translating data to NGSI-LD is a crucial step. Machine learning is a promising approach, but it typically requires large amounts of hand-labelled training data, an effort that is unrealistic in many IoT scenarios. Instead, a programmatic labelling approach called knowledge infusion encodes expert knowledge and matches a schema or ontology extracted from the data against a target schema or ontology, providing the basis for annotating the data and facilitating its translation to NGSI-LD.
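To make the translation step concrete, here is a minimal sketch of what a ThingVisor-style mapping could look like. The entity layout (id, type, Property attributes, @context) follows the ETSI NGSI-LD specification; the source payload fields (sensor_id, temp_c) and the function name are invented for illustration and are not from the Fed4IoT codebase.

```python
# Minimal sketch of a ThingVisor-style translation step: a raw sensor
# reading (hypothetical source format) is mapped onto an NGSI-LD entity.
# The NGSI-LD structure follows the ETSI ISG CIM specification; the
# source field names are invented.
from datetime import datetime, timezone

def to_ngsi_ld(raw: dict) -> dict:
    """Translate a raw temperature reading into an NGSI-LD entity."""
    observed_at = datetime.now(timezone.utc).isoformat()
    return {
        "id": f"urn:ngsi-ld:TemperatureSensor:{raw['sensor_id']}",
        "type": "TemperatureSensor",
        "temperature": {
            "type": "Property",
            "value": raw["temp_c"],
            "unitCode": "CEL",          # UN/CEFACT code for Celsius
            "observedAt": observed_at,
        },
        "@context": [
            "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld"
        ],
    }

# Example: a reading as it might arrive from a source platform.
print(to_ngsi_ld({"sensor_id": "kitchen-01", "temp_c": 22.5}))
```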
FPGAs are increasingly common in modern applications, and cloud providers now support on-demand FPGA acceleration in data centers. Applications in data centers run on virtual infrastructure, where consolidation, multi-tenancy, and workload migration enable the economies of scale that are fundamental to the providers' business. However, a general strategy for virtualizing FPGAs has yet to emerge. While manufacturers struggle with hardware-based approaches, we propose a compiler/runtime-based solution called Synergy. We show a compiler transformation for Verilog programs that produces code able to yield control to software at sub-clock-tick granularity according to the semantics of the original program. Synergy uses this property to efficiently support the core virtualization primitives: suspend and resume, program migration, and spatial/temporal multiplexing, on hardware available today. We use Synergy to virtualize FPGA workloads across a cluster of Altera SoCs and Xilinx FPGAs on Amazon F1. The workloads require no modification, run within 3-4x of unvirtualized performance, and incur a modest increase in FPGA fabric utilization.
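The abstract does not give Synergy's runtime interface, so purely as illustration, the toy Python model below mimics what the suspend/resume and migration primitives amount to from the runtime's perspective: capture a consistent snapshot of a design's architectural state at a safe point, move it, and reinstate it on another device. Every name here is hypothetical.

```python
# Illustrative only: a toy model of suspend/resume and migration as a
# virtualization runtime might expose them. Synergy's actual interface
# is not given in the abstract; all names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Checkpoint:
    """Architectural state captured at a safe point in the schedule."""
    registers: dict = field(default_factory=dict)

class VirtualFpgaJob:
    def __init__(self, device: str):
        self.device = device
        self.state = Checkpoint()

    def suspend(self) -> Checkpoint:
        # Because the compiled design can yield control at sub-clock-tick
        # granularity, the runtime can read out a consistent snapshot.
        return Checkpoint(registers=dict(self.state.registers))

    def resume(self, ckpt: Checkpoint, device: str):
        # Reinstate the snapshot, possibly on a different FPGA.
        self.state = Checkpoint(registers=dict(ckpt.registers))
        self.device = device

job = VirtualFpgaJob("altera-soc-0")
snapshot = job.suspend()
job.resume(snapshot, "aws-f1-3")   # migrate the workload across devices
```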
High Energy Physics (HEP) distributed computing infrastructures require automatic tools to monitor, analyze and react to potential security incidents. These tools should collect and inspect data such as resource consumption, logs and sequences of system calls to detect anomalies that indicate the presence of a malicious agent. They should also be able to perform automated reactions to attacks without administrator intervention. We describe a novel framework that meets these requirements, with a proof-of-concept implementation for the ALICE experiment at CERN. We show how we achieve a fully virtualized environment that improves security by isolating services and Jobs without significant performance impact. We also describe a dataset collected for Machine Learning-based Intrusion Prevention and Detection Systems on Grid computing. This dataset is composed of resource consumption measurements (such as CPU, RAM and network traffic), log files from operating system services, and system call data collected from production Jobs running in an ALICE Grid test site, together with a large set of malware samples gathered from security research sites. Based on this dataset, we will proceed to develop Machine Learning algorithms able to detect malicious Jobs.
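As a minimal sketch of the kind of per-job resource signals the dataset contains (CPU, RAM, network traffic), the snippet below samples them with the psutil library. The record layout and label convention are ours for illustration, not the dataset's actual schema.

```python
# Minimal sketch of sampling the resource-consumption signals the
# dataset describes (CPU, RAM, network traffic). Uses the psutil
# library; the record layout is illustrative, not the dataset's schema.
import time
import psutil

def sample_resources(interval: float = 1.0) -> dict:
    """Take one resource-usage sample over `interval` seconds."""
    net_before = psutil.net_io_counters()
    cpu = psutil.cpu_percent(interval=interval)   # blocks for `interval`
    net_after = psutil.net_io_counters()
    return {
        "timestamp": time.time(),
        "cpu_percent": cpu,
        "ram_percent": psutil.virtual_memory().percent,
        "net_bytes_sent": net_after.bytes_sent - net_before.bytes_sent,
        "net_bytes_recv": net_after.bytes_recv - net_before.bytes_recv,
    }

# One labelled trace row for a Grid job: benign (0) or malicious (1).
row = {**sample_resources(), "label": 0}
print(row)
```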
Recently, coordinated attack campaigns have become more widespread on the Internet. In May 2017, WannaCry infected more than 300,000 machines in 150 countries within a few days and had a large impact on critical infrastructure. Existing threat sharing platforms cannot easily adapt to emerging attack patterns. At the same time, enterprises have started to adopt machine learning-based threat detection tools in their local networks. In this paper, we pose the question: What information can defenders share across multiple networks to help machine learning-based threat detection adapt to new coordinated attacks? We propose three information sharing methods across two networks, and show how the shared information can be used in a machine-learning network-traffic model to significantly improve its ability to detect evasive self-propagating malware.
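The abstract does not detail the three sharing methods, so the sketch below shows one plausible instance under our own assumptions: network B shares labelled flow-level feature rows from a detected campaign, and network A folds them into its local training set before retraining its traffic model. The feature layout and synthetic data are purely illustrative.

```python
# One plausible instance of cross-network sharing (the paper's three
# methods are not specified in the abstract): network B shares labelled
# flow features from a detected campaign, and network A retrains its
# traffic model on the combined data. Features and data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Network A's local flows: [duration, bytes, packets], label 0 = benign.
X_a = rng.normal(loc=[2.0, 5e3, 40], scale=[1.0, 2e3, 10], size=(500, 3))
y_a = np.zeros(500)

# Shared from network B: flows of a self-propagating malware campaign.
X_shared = rng.normal(loc=[0.2, 3e2, 5], scale=[0.1, 1e2, 2], size=(50, 3))
y_shared = np.ones(50)

# Fold the shared indicators into the local training set and retrain.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(np.vstack([X_a, X_shared]), np.concatenate([y_a, y_shared]))

suspicious_flow = [[0.25, 280.0, 4.0]]
print(model.predict(suspicious_flow))   # flagged as malicious (1)
```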
In the last decade, a large number of Knowledge Graph (KG) information extraction approaches have been proposed. Albeit effective, these efforts are disjoint, and their collective strengths and weaknesses in effective KG information extraction (IE) have not been studied in the literature. We propose Plumber, the first framework that brings together the research community's disjoint IE efforts. The Plumber architecture comprises 33 reusable components for various KG information extraction subtasks, such as coreference resolution, entity linking, and relation extraction. Using these components, Plumber dynamically generates suitable information extraction pipelines and offers 264 distinct pipelines overall. We study the optimization problem of choosing suitable pipelines based on input sentences. To do so, we train a transformer-based classification model that extracts contextual embeddings from the input and finds an appropriate pipeline. We study the efficacy of Plumber for extracting KG triples using standard datasets over two KGs: DBpedia and the Open Research Knowledge Graph (ORKG). Our results demonstrate the effectiveness of Plumber in dynamically generating KG information extraction pipelines, outperforming all baselines regardless of the underlying KG. Furthermore, we provide an analysis of collective failure cases, study the similarities and synergies among integrated components, and discuss their limitations.
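A minimal sketch of the pipeline-selection idea follows, assuming the Hugging Face transformers library and a generic BERT encoder as a stand-in for Plumber's actual classification model. The mean-pooling and the untrained linear head are our simplifications; only the class count (264 pipelines) comes from the abstract.

```python
# Sketch of the pipeline-selection idea: embed the input sentence with a
# transformer and score candidate pipelines with a linear head. A generic
# BERT encoder stands in for Plumber's actual model; the class count
# matches the 264 pipelines, but the head here is untrained.
import torch
from transformers import AutoModel, AutoTokenizer

NUM_PIPELINES = 264

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
classifier = torch.nn.Linear(encoder.config.hidden_size, NUM_PIPELINES)

def select_pipeline(sentence: str) -> int:
    """Return the index of the highest-scoring IE pipeline."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state   # (1, seq, dim)
        embedding = hidden.mean(dim=1)                 # mean pooling
        scores = classifier(embedding)                 # (1, 264)
    return int(scores.argmax(dim=-1))

print(select_pipeline("Albert Einstein was born in Ulm."))
```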
The Internet of Things (IoT) and network softwarization are fast becoming core technologies of information systems and network management for the next-generation Internet. The deployments and applications of IoT range from smart cities to urban computing, and from ubiquitous healthcare to the tactile Internet. As a result, the physical infrastructure of heterogeneous network systems has become more complicated and requires efficient and dynamic solutions for management, configuration, and flow scheduling. Network softwarization in the form of Software Defined Networks (SDN) and Network Function Virtualization (NFV) has been extensively researched for IoT in the recent past. In this article, we present a systematic and comprehensive review of virtualization techniques explicitly designed for IoT networks. We classify the literature into software defined networks designed for IoT, function virtualization for IoT networks, and software defined IoT networks. These categories are further divided into works that present architectural, security, and management solutions. In addition, the paper highlights a number of short-term and long-term research challenges and open issues related to the adoption of the software-defined Internet of Things.