We present an integrated approach to analysing multi-lead ECG data using the framework of multiplex recurrence networks (MRNs). We explore how their intralayer and interlayer topological features can capture subtle variations in the recurrence patterns of the underlying spatio-temporal dynamics. We find that MRNs from ECG data of healthy cases are significantly more coherent, with high mutual information and low divergence between the degree distributions of the respective layers. In cases of disease, significant differences appear in specific measures of interlayer similarity. Coherence is affected most in diseases associated with localized abnormalities, such as bundle branch block. We note that a comprehensive analysis using all the measures is needed to arrive at disease-specific patterns. Our approach is very general and can therefore be applied in any other domain where multivariate or multi-channel data are available from highly complex systems.
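To make the MRN construction concrete, the following Python sketch builds one recurrence network per channel of a multichannel signal and compares two layers through the mutual information of their degree sequences and the Jensen-Shannon distance between their degree distributions. The synthetic "leads", embedding parameters, recurrence threshold, and binning are illustrative assumptions, not the settings used in the study.

```python
# Minimal sketch of a multiplex recurrence network (MRN) from multichannel signals.
# Signals, embedding parameters, and the recurrence threshold are illustrative.
import numpy as np
from scipy.spatial.distance import pdist, squareform, jensenshannon
from sklearn.metrics import mutual_info_score

def embed(x, dim=3, tau=5):
    """Time-delay embedding of a 1-D signal."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def recurrence_network(x, eps_quantile=0.1):
    """Adjacency matrix of an epsilon-recurrence network (no self-loops)."""
    d = squareform(pdist(embed(x)))
    eps = np.quantile(d[np.triu_indices_from(d, k=1)], eps_quantile)
    a = (d <= eps).astype(int)
    np.fill_diagonal(a, 0)
    return a

# Two hypothetical "leads" standing in for ECG channels.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 1500)
leads = [np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size),
         np.sin(2 * np.pi * t + 0.3) + 0.1 * rng.standard_normal(t.size)]

layers = [recurrence_network(x) for x in leads]
degrees = [a.sum(axis=1) for a in layers]

# Interlayer comparison: mutual information between (binned) degree sequences
# and Jensen-Shannon distance between the two degree distributions.
bins = np.histogram_bin_edges(np.concatenate(degrees), bins=20)
h1, _ = np.histogram(degrees[0], bins=bins)
h2, _ = np.histogram(degrees[1], bins=bins)
mi = mutual_info_score(np.digitize(degrees[0], bins), np.digitize(degrees[1], bins))
jsd = jensenshannon(h1 + 1e-12, h2 + 1e-12)
print(f"interlayer MI = {mi:.3f}, JS distance = {jsd:.3f}")
```

In this sketch, coherent layers (similar recurrence structure across channels) yield high mutual information and a small Jensen-Shannon distance, which is the qualitative signature attributed above to healthy cases.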
In the early stage of an epidemic, individuals' decisions on adopting protective measures, which can reduce their risk of infection and suppress disease spreading, are likely to depend on multiple information sources and their mutual confirmation, owing to the lack of exact information. Here we introduce an inter-layer mutual confirmation mechanism into information-disease interacting dynamics on multiplex networks. In our model, an individual increases the information transmission rate and the willingness to adopt protective measures once he confirms the authenticity of the news and the severity of the disease from his neighbors' status in multiple layers. Using the microscopic Markov chain approach, we analytically calculate the epidemic threshold and the stationary densities of aware and infected individuals, which agree well with simulation results. We find that the increase in the epidemic threshold obtained by confirming aware neighbors on the communication layer is larger than that obtained on the contact layer. Conversely, confirming neighbors' awareness and infection on the contact layer leads to a lower final infection density and a higher awareness density than doing so on the communication layer. The results imply that individuals' explicit exposure of their infection and awareness status to neighbors, especially those with real contacts, helps suppress epidemic spreading.
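For readers unfamiliar with the microscopic Markov chain approach (MMCA) mentioned above, the sketch below iterates a simplified fixed-point scheme for a baseline awareness-disease (UAU-SIS) model on a duplex network. It does not implement the inter-layer mutual-confirmation rules of this paper; the parameter names (lam, delta, beta, mu, gamma) and the random duplex layers are assumptions chosen for illustration only.

```python
# Simplified MMCA-style fixed-point iteration for a baseline UAU-SIS model
# on a duplex network (communication layer + contact layer). Illustrative only.
import numpy as np

def mmca(A_comm, A_cont, lam=0.15, delta=0.6, beta=0.2, mu=0.4,
         gamma=0.5, n_iter=500, tol=1e-8):
    """Iterate awareness prob. pA and infection prob. pI to a fixed point.
    gamma rescales the infection rate of aware individuals (beta_A = gamma*beta)."""
    n = A_comm.shape[0]
    pA = np.full(n, 0.1)     # probability of being aware
    pI = np.full(n, 0.1)     # probability of being infected
    for _ in range(n_iter):
        # prob. of NOT being informed by any aware neighbor (communication layer)
        r = np.prod(1 - lam * A_comm * pA[None, :], axis=1)
        # prob. of NOT being infected, for unaware (qU) and aware (qA) nodes
        qU = np.prod(1 - beta * A_cont * pI[None, :], axis=1)
        qA = np.prod(1 - gamma * beta * A_cont * pI[None, :], axis=1)
        pA_new = (1 - pA) * (1 - r) + pA * (1 - delta)
        q_eff = pA_new * qA + (1 - pA_new) * qU
        pI_new = (1 - pI) * (1 - q_eff) + pI * (1 - mu)
        if max(np.abs(pA_new - pA).max(), np.abs(pI_new - pI).max()) < tol:
            pA, pI = pA_new, pI_new
            break
        pA, pI = pA_new, pI_new
    return pA, pI

# Usage: two random layers over the same node set.
rng = np.random.default_rng(0)
n = 200
def sym_er(p):
    a = np.triu((rng.random((n, n)) < p).astype(float), 1)
    return a + a.T
pA, pI = mmca(sym_er(0.04), sym_er(0.03))
print(f"stationary awareness density = {pA.mean():.3f}, infection density = {pI.mean():.3f}")
```

The stationary densities returned by such an iteration are the quantities compared against Monte Carlo simulations in the abstract; the mutual-confirmation mechanism would enter by making lam, beta, and the adoption willingness depend on neighbors' joint status across the two layers.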
In many data sets, crucial elements co-exist with non-essential ones and noise. For data represented as networks in particular, several methods have been proposed to extract a network backbone, i.e., the set of most important links. However, the question of how the resulting compressed views of the data can effectively be used has not been tackled. Here we address this issue by putting forward and exploring several systematic procedures to build surrogate data from various kinds of temporal network backbones. In particular, we explore how much information about the original data needs to be retained alongside the backbone so that the surrogate data can be used in data-driven numerical simulations of spreading processes. We illustrate our results using empirical temporal networks with a broad variety of structures and properties.
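The Python sketch below illustrates one possible surrogate-construction pipeline of the kind discussed above: extract a backbone, rebuild a surrogate temporal network from it, and feed the surrogate into a spreading simulation. The backboning criterion (top edges by contact count), the time-reassignment rule, and the SI parameters are assumptions for illustration, not the procedures evaluated in the paper.

```python
# Toy pipeline: backbone extraction -> surrogate temporal network -> SI spreading.
import random
from collections import Counter

def backbone(contacts, keep_frac=0.2):
    """Keep the most frequently active edges (illustrative backboning criterion)."""
    counts = Counter((min(i, j), max(i, j)) for i, j, _ in contacts)
    top = sorted(counts, key=counts.get, reverse=True)[:max(1, int(keep_frac * len(counts)))]
    return {e: counts[e] for e in top}

def surrogate(bb, t_max):
    """Redraw, for each backbone edge, as many contact times as it had originally."""
    events = [(i, j, random.randrange(t_max)) for (i, j), c in bb.items() for _ in range(c)]
    return sorted(events, key=lambda e: e[2])

def si_spread(contacts, seed, p=0.5):
    """Susceptible-Infected process over a time-ordered list of (i, j, t) contacts."""
    infected = {seed}
    for i, j, _ in contacts:
        if i in infected and j not in infected and random.random() < p:
            infected.add(j)
        elif j in infected and i not in infected and random.random() < p:
            infected.add(i)
    return len(infected)

# Usage with a synthetic temporal network of (i, j, t) contact events.
random.seed(1)
t_max = 1000
contacts = [(random.randrange(50), random.randrange(50), random.randrange(t_max))
            for _ in range(5000)]
contacts = [(i, j, t) for i, j, t in contacts if i != j]
surr = surrogate(backbone(contacts), t_max)
print("final outbreak size on surrogate:", si_spread(surr, seed=0))
```

The design question raised in the abstract is exactly which pieces of the original data (here, only per-edge contact counts are kept) must accompany the backbone for outbreak sizes on the surrogate to reproduce those on the original contact list.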
We introduce the sandpile model on multiplex networks with more than one type of edge and investigate its scaling and dynamical behaviors. We find that the introduction of multiplexity does not alter the scaling behavior of avalanche dynamics; the system is critical with an asymptotic power-law avalanche size distribution with an exponent $\tau = 3/2$ on duplex random networks. The detailed cascade dynamics, however, is affected by the multiplex coupling. For example, higher-degree nodes such as hubs in scale-free networks fail more often in the multiplex dynamics than in the simplex network counterpart in which different types of edges are simply aggregated. Our results suggest that multiplex modeling would be necessary in order to gain a better understanding of cascading failure phenomena of real-world multiplex complex systems, such as the global economic crisis.
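As a rough illustration of this kind of model, the Python sketch below runs a Bak-Tang-Wiesenfeld-type sandpile on a duplex random network: a node topples when its load reaches its total (both-layer) degree and sends one grain along every edge of both layers, with a small dissipation probability so that avalanches terminate. The network sizes, dissipation rate, and number of grains are illustrative assumptions, not the paper's simulation settings.

```python
# Sandpile avalanches on a duplex random network (illustrative parameters).
import random
import networkx as nx

def duplex_sandpile(n=500, k=4, f=0.01, n_grains=10000, seed=0):
    rng = random.Random(seed)
    layers = [nx.gnm_random_graph(n, n * k // 2, seed=seed + s) for s in (1, 2)]
    nbrs = [list(layers[0][v]) + list(layers[1][v]) for v in range(n)]
    thresh = [len(nbrs[v]) for v in range(n)]   # toppling threshold = total degree
    load = [0] * n
    sizes = []
    for _ in range(n_grains):
        v0 = rng.randrange(n)
        load[v0] += 1
        if not thresh[v0] or load[v0] < thresh[v0]:
            continue
        toppled, active = 0, [v0]
        while active:
            v = active.pop()
            if load[v] < thresh[v]:
                continue
            load[v] -= thresh[v]
            toppled += 1
            for u in nbrs[v]:
                if rng.random() > f:            # grain survives dissipation
                    load[u] += 1
                    if load[u] >= thresh[u]:
                        active.append(u)
            if load[v] >= thresh[v]:            # v may still be unstable
                active.append(v)
        sizes.append(toppled)
    return sizes

sizes = duplex_sandpile()
print("avalanches recorded:", len(sizes), "largest avalanche:", max(sizes) if sizes else 0)
```

Collecting the avalanche sizes and fitting their distribution is how a power-law exponent such as $\tau = 3/2$ would be estimated; comparing per-node toppling counts between the duplex dynamics and an aggregated single-layer version exposes the hub-failure effect described above.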
Large-scale research endeavors can be hindered by logistical constraints limiting the amount of available data. For example, global ecological questions require a global dataset, and traditional sampling protocols are often too inefficient for a small research team to collect an adequate amount of data. Citizen science offers an alternative by crowdsourcing data collection. Despite growing popularity, the community has been slow to embrace it, largely due to concerns about the quality of data collected by citizen scientists. Using the citizen science project Floating Forests (http://floatingforests.org), we show that consensus classifications made by citizen scientists produce data of comparable quality to expert-generated classifications. Floating Forests is a web-based project in which citizen scientists view satellite photographs of coastlines and trace the borders of kelp patches. Since its launch in 2014, over 7,000 citizen scientists have classified over 750,000 images of kelp forests, largely in California and Tasmania. Each image is classified by 15 users. We generated consensus classifications by overlaying all citizen classifications and assessed accuracy by comparison with expert classifications. The Matthews correlation coefficient (MCC) was calculated for each vote threshold (1-15), and the threshold with the highest MCC was considered optimal. We show that the optimal user threshold was 4.2, with an MCC of 0.400 (0.023 SE) for Landsats 5 and 7 and an MCC of 0.639 (0.246 SE) for Landsat 8. These results suggest that citizen science data derived from consensus classifications are of comparable accuracy to expert classifications. Citizen science projects should implement methods such as consensus classification, in conjunction with a quantitative comparison to expert-generated classifications, to address concerns about data quality.
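The threshold-based consensus procedure described above can be sketched in a few lines of Python: overlay the binary masks from several citizen classifications, keep pixels marked by at least t users, and pick the threshold whose consensus mask best matches the expert mask by MCC. The masks below are synthetic stand-ins; the image size, noise level, and number of classifiers are assumptions, not Floating Forests data.

```python
# Consensus classification by vote threshold, scored against an expert mask via MCC.
import numpy as np
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(42)
expert = rng.random((64, 64)) < 0.2                      # hypothetical expert kelp mask

# 15 noisy citizen classifications of the same scene (each flips ~10% of pixels).
citizen = [expert ^ (rng.random(expert.shape) < 0.1) for _ in range(15)]
votes = np.sum(citizen, axis=0)                          # per-pixel vote count

best_t, best_mcc = None, -1.0
for t in range(1, 16):                                   # candidate thresholds 1..15
    consensus = votes >= t
    mcc = matthews_corrcoef(expert.ravel(), consensus.ravel())
    if mcc > best_mcc:
        best_t, best_mcc = t, mcc
print(f"optimal vote threshold = {best_t}, MCC = {best_mcc:.3f}")
```

In the study, this scan is performed per image set (Landsats 5/7 versus Landsat 8), and averaging the per-image optima is what yields a non-integer optimal threshold such as 4.2.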
In this paper, we study information transport in multiplex networks comprised of two coupled subnetworks. The upper subnetwork, called the logical layer, employs the shortest-path protocol to determine the logical paths for packet transmission, while the lower subnetwork acts as the physical layer, in which packets are delivered by a biased random-walk mechanism characterized by a parameter $\alpha$. Through simulation, we obtain the optimal $\alpha$ corresponding to the maximum network lifetime and the maximum number of arriving packets. Assortative coupling outperforms both random and disassortative coupling, achieving much better transmission performance. In general, the more homogeneous the lower subnetwork, the better the transmission performance, whereas the opposite holds for the upper subnetwork. Finally, we propose an attack centrality for nodes based on the topological information of both subnetworks and further investigate the transmission performance under targeted attacks. Our work helps in understanding the spreading and robustness issues of multiplex networks and provides some clues for the design of more efficient and robust routing architectures in communication systems.
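The two-layer delivery scheme can be illustrated with the Python sketch below: the logical layer fixes a sequence of waypoints via shortest paths, and each logical hop is realized on the physical layer by a degree-biased random walk whose next-hop probability is proportional to $k^{\alpha}$. The graph models, network sizes, and the value of $\alpha$ are illustrative assumptions; queueing, packet generation rates, and node lifetimes, which the paper's performance metrics depend on, are omitted here.

```python
# Logical-layer shortest paths realized by degree-biased random walks on the physical layer.
import random
import networkx as nx

def biased_walk(G_phys, src, dst, alpha=-0.5, max_hops=10_000):
    """Walk on the physical layer; step to neighbor j with prob ~ deg(j)**alpha."""
    node, hops = src, 0
    while node != dst and hops < max_hops:
        nbrs = list(G_phys[node])
        weights = [G_phys.degree(j) ** alpha for j in nbrs]
        node = random.choices(nbrs, weights=weights, k=1)[0]
        hops += 1
    return hops

def deliver(G_logic, G_phys, src, dst, alpha=-0.5):
    """Follow the logical shortest path; realize each logical hop by a biased walk."""
    route = nx.shortest_path(G_logic, src, dst)
    return sum(biased_walk(G_phys, u, v, alpha) for u, v in zip(route, route[1:]))

# Usage: same node set in both layers; a heterogeneous logical layer over a
# near-homogeneous physical layer (illustrative choice of graph models).
random.seed(3)
n = 300
G_logic = nx.barabasi_albert_graph(n, 3, seed=3)
G_phys = nx.connected_watts_strogatz_graph(n, 6, 0.3, seed=3)
print("physical hops used:", deliver(G_logic, G_phys, src=0, dst=42, alpha=-0.5))
```

Sweeping $\alpha$ in such a setup and recording delivery cost (and, with queues added, lifetime and arrival counts) is the kind of simulation from which an optimal $\alpha$ is identified.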