In resource-constrained wireless sensor networks (WSNs), optimizing the sampling and transmission rates of each individual node is a crucial issue. A high volume of redundant data transmitted through the network results in collisions, data loss, and energy dissipation. This paper proposes a novel data reduction scheme that exploits the spatio-temporal correlation among sensor data to determine the optimal sampling strategy for the deployed sensor nodes. This strategy reduces the overall sampling and transmission rates while preserving data quality. Moreover, a back-end reconstruction algorithm is deployed on the workstation (sink). This algorithm reproduces the data that have not been sampled by finding the spatial and temporal correlations among the reported data set and filling the non-sampled parts with predictions. We used real sensor data from a network deployed at the Grand-St-Bernard pass between Switzerland and Italy. We tested our approach on this data set and compared it to a recent adaptive-sampling-based data reduction approach. The results show that our proposed method consumes up to 60% less energy and handles non-stationary data more effectively.
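As an illustration of the kind of scheme this abstract describes, the sketch below shows one simple form of correlation-driven sampling and sink-side reconstruction. The function names, the lag-1 autocorrelation test, the window size, and the interpolation step are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def should_skip_next_sample(readings, window=10, threshold=0.9):
    """Node-side test: skip the next sample when recent readings are
    strongly autocorrelated, i.e. locally predictable at the sink."""
    if len(readings) < window + 1:
        return False                       # not enough history: keep sampling
    recent = np.asarray(readings[-(window + 1):], dtype=float)
    r = np.corrcoef(recent[:-1], recent[1:])[0, 1]  # lag-1 autocorrelation
    return r > threshold

def reconstruct(series):
    """Sink-side reconstruction: fill unsampled slots (NaN) by linear
    interpolation over the temporally correlated reported samples."""
    s = np.asarray(series, dtype=float)
    idx = np.arange(len(s))
    known = ~np.isnan(s)
    s[~known] = np.interp(idx[~known], idx[known], s[known])
    return s

# Example: a slowly varying signal with two skipped (unsampled) slots.
print(reconstruct([20.0, 20.5, np.nan, np.nan, 21.7, 22.0]))
```

A real deployment would also exploit spatial correlation, e.g. by averaging the reports of neighboring nodes into the gaps; the temporal interpolation above is the minimal case.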
Decades of continuous scaling have reduced the energy of unit computing to virtually zero, while energy-efficient communication has remained the primary bottleneck in achieving fully energy-autonomous IoT nodes. This paper presents and analyzes the trade-offs between the energies required for communication and computation in a wireless sensor network deployed in a mesh architecture over a 2400-acre university campus and targeted towards multi-sensor measurement of temperature, humidity, and water nitrate concentration for smart agriculture. Several scenarios involving In-Sensor-Analytics (ISA), Collaborative Intelligence (CI), and Context-Aware-Switching (CAS) of the cluster-head during CI have been considered. A real-time co-optimization algorithm has been developed to minimize the energy consumption in the network and hence maximize the overall battery lifetime of the individual nodes. Measurement results show that the proposed ISA consumes ~467X lower energy than traditional Bluetooth Low Energy (BLE) communication, and ~69,500X lower energy than Long Range (LoRa) communication. When ISA is implemented in conjunction with LoRa, the lifetime of a node increases from a mere 4.3 hours to 66.6 days with a 230 mAh coin cell battery, while preserving more than 98% of the total information. The CI and CAS algorithms help extend the worst-case node lifetime by an additional 50%, yielding an overall network lifetime of ~104 days, which is >90% of the theoretical limit posed by the leakage currents present in the system, while effectively transferring information sampled every second. A web-based monitoring system was developed to archive the measured data continuously and to report anomalies in the measured data.
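As a sanity check on these figures, battery lifetime is simply capacity divided by average current draw. The sketch below infers the average currents implied by the two reported lifetimes (the per-mode currents are not stated in the abstract); it also shows why the ~372X lifetime gain is far smaller than the ~69,500X per-message energy saving: once the radio is mostly idle, leakage and sensing currents dominate, consistent with the stated leakage-imposed limit.

```python
# Back-of-envelope check of the reported lifetimes (illustrative; the
# duty cycle and radio currents are inferred, not values from the paper).
CAPACITY_MAH = 230.0                         # coin cell capacity

def lifetime_days(avg_current_ma):
    """Ideal battery lifetime in days for a given average current draw."""
    return CAPACITY_MAH / avg_current_ma / 24.0

# Average currents implied by the two reported lifetimes:
i_lora = CAPACITY_MAH / 4.3                  # ~53.5 mA: raw LoRa reporting
i_isa = CAPACITY_MAH / (66.6 * 24)           # ~0.14 mA: LoRa + ISA

# Lifetime improves ~372X, far less than the ~69,500X per-message energy
# saving, because static leakage and sensing current dominate once the
# radio is mostly idle.
print(lifetime_days(i_lora), lifetime_days(i_isa), i_lora / i_isa)
```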
In wireless sensor networks (WSNs), the main task of each sensor node is to sense physical activity (e.g., targets or disaster conditions) and report it to the control center for further processing. To this end, sensor nodes are equipped with multiple sensors capable of measuring environmental information. Spatial correlation between nodes arises in such networks from overlapping sensory coverage, which leads to redundant data communication. To study virus spreading dynamics in this scenario, a modified SI epidemic model is derived mathematically by incorporating WSN parameters such as spatial correlation, node density, sensing range, transmission range, and the total number of sensor nodes. The solution of the proposed SI model is also derived to study the dynamics over time. Initially, a small number of nodes are attacked by viruses, and the infection then propagates to neighboring nodes over normal data communication. Since redundant nodes exist in a correlated sensor field, the spreading process can differ under different sensory coverage. The proposed SI model captures spatial and temporal dynamics, unlike existing models, which are global. The infection process leads to network failure. By exploiting the spatial correlation between nodes, a spread control scheme is developed to limit further infection in the network. Numerical results and comparisons are provided for validation.
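For intuition, the classical SI dynamics underlying such a model can be written as dI/dt = βSI/N with S = N − I, where the contact rate β is modulated by the WSN parameters. The sketch below uses an assumed scaling of β by node density, transmission range, and spatial correlation, since the exact modified model is not given in the abstract.

```python
import numpy as np

# Classical SI dynamics with a contact rate scaled by WSN parameters.
# The scaling of beta below (coverage area times density, discounted by
# spatial correlation) is an illustrative assumption.
N = 1000                 # total sensor nodes
rho = 0.01               # node density (nodes per unit area)
r_tx = 10.0              # transmission range
corr = 0.4               # spatial correlation coefficient in [0, 1]
beta = rho * np.pi * r_tx**2 * (1.0 - corr) * 0.05  # effective contact rate

# Forward-Euler integration of dI/dt = beta * S * I / N; the closed-form
# solution is the logistic curve I(t) = N*I0*exp(beta*t) / (N - I0 + I0*exp(beta*t)).
I, dt, T = 5.0, 0.1, 200.0
for _ in range(int(T / dt)):
    S = N - I
    I += beta * S * I / N * dt
print(f"infected after {T:.0f} time units: {I:.0f} of {N}")
```

Note how a higher spatial correlation (more redundant coverage) shrinks the effective contact rate in this sketch, slowing the spread; this mirrors the abstract's point that the spreading process differs with sensory coverage.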
We study the problem of tracking an object moving through a network of wireless sensors. To conserve energy, the sensors may be put into a sleep mode with a timer that determines their sleep duration. It is assumed that a sleeping sensor cannot be communicated with or woken up, and hence the sleep duration must be determined at the moment the sensor goes to sleep, based on all the information available to it. Having sleeping sensors in the network can degrade tracking performance, so there is a tradeoff between energy usage and tracking performance. We design sleeping policies that attempt to optimize this tradeoff and characterize their performance. As an extension of our previous work in this area [1], we consider generalized models for object movement, object sensing, and tracking cost. For discrete state spaces and continuous Gaussian observations, we derive a lower bound on the optimal energy-tracking tradeoff. We show that in the low tracking error regime, the generated policies approach the derived lower bound.
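One simple concrete form such a sleeping policy could take (an assumed sketch, not the optimal policy derived in the paper): a node going to sleep propagates its belief about the object's location through a Markov motion model and sleeps until the object is likely to be back within its sensing range.

```python
import numpy as np

def sleep_duration(P, belief, my_state, p_near=0.15, max_sleep=50):
    """Sleep for k steps, where k is the first step at which the predicted
    probability that the object occupies this node's cell exceeds p_near
    under transition matrix P and the belief held at bedtime."""
    b = np.asarray(belief, dtype=float)
    for k in range(1, max_sleep + 1):
        b = b @ P                      # propagate belief one step forward
        if b[my_state] > p_near:
            return k
    return max_sleep                   # cap when the object stays unlikely

# Example: 5-cell ring, object performs a lazy random walk
# (stay with prob 0.5, step to either neighbor with prob 0.25).
P = 0.5 * np.eye(5) + 0.25 * (np.roll(np.eye(5), 1, 1) + np.roll(np.eye(5), -1, 1))
print(sleep_duration(P, belief=[1, 0, 0, 0, 0], my_state=2))   # -> 5 steps
```

The thresholds and the motion model here are hypothetical; the paper's policies instead optimize an explicit energy-tracking cost, but the structure of deciding the full sleep duration from the bedtime belief is the same.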
Recently, many researchers have studied efficient data gathering in wireless sensor networks to minimize the total energy consumption when a fixed number of data items can be aggregated into one packet. However, minimizing the total energy consumption does not imply that the network lifetime is maximized. In this paper, we study the problem of scheduling data aggregation trees, each working for a different time period, to maximize the network lifetime when a fixed number of data items can be aggregated into one packet. In addition, we propose a heuristic that balances the lifetime of nodes across the data aggregation trees so that the network lifetime is maximized. Simulation results show that the proposed heuristic performs well.
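As a concrete illustration of lifetime balancing across trees (an assumed greedy rule, not the paper's heuristic): repeatedly run the aggregation tree whose bottleneck node has the most residual energy, so that no single relay drains first.

```python
# Illustrative greedy scheduler: each round, pick the tree whose
# bottleneck (minimum-energy) relay node has the most residual energy.
def schedule(trees, energy, rounds, cost_per_round=1.0):
    """trees: list of relay-node-id lists, one per aggregation tree.
    energy: dict node id -> residual energy. Returns rounds run per tree."""
    usage = [0] * len(trees)
    for _ in range(rounds):
        best = max(range(len(trees)),
                   key=lambda t: min(energy[v] for v in trees[t]))
        if min(energy[v] for v in trees[best]) < cost_per_round:
            break                      # network lifetime exhausted
        for v in trees[best]:          # relays in the chosen tree pay energy
            energy[v] -= cost_per_round
        usage[best] += 1
    return usage

# Two trees share relay node 0; alternating them balances node 0's load.
print(schedule([[0, 1], [0, 2]], {0: 10.0, 1: 4.0, 2: 4.0}, rounds=20))  # [4, 4]
```

In this toy run the schedule alternates between the two trees, spending nodes 1 and 2 evenly instead of exhausting one of them early, which is exactly the lifetime-balancing behavior the heuristic targets.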
Recent years have seen rapid deployment of mobile computing and Internet of Things (IoT) networks, which can be attributed largely to the increasing communication and sensing capabilities of wireless systems. Big data analysis, pervasive computing, and eventually artificial intelligence (AI) are envisaged to be deployed on top of the IoT and to create a new world featured by data-driven AI. In this context, a novel paradigm of merging AI and wireless communications, called Wireless AI, which pushes AI frontiers to the network edge, is widely regarded as a key enabler for future intelligent network evolution. To this end, we present a comprehensive survey of the latest studies in Wireless AI from the data-driven perspective. Specifically, we first propose a novel Wireless AI architecture that covers five key data-driven AI themes in wireless networks: Sensing AI, Network Device AI, Access AI, User Device AI, and Data-provenance AI. Then, for each theme, we present an overview of the use of AI approaches to solve the emerging data-related problems and show how AI can empower wireless network functionalities. In particular, compared to other related surveys, we provide an in-depth discussion of Wireless AI applications in various data-driven domains where AI proves extremely useful for wireless network design and optimization. Finally, research challenges and future visions are discussed to spur further research in this promising area.