
Deleting edges to restrict the size of an epidemic in temporal networks

Added by Viktor Zamaraev
Publication date: 2018
Language: English





Spreading processes on graphs are a natural model for a wide variety of real-world phenomena, including information spread over social networks and biological diseases spreading over contact networks. Often, the networks over which these processes spread are dynamic in nature, and can be modeled with temporal graphs. Here, we study the problem of deleting edges from a given temporal graph in order to reduce the number of vertices (temporally) reachable from a given starting point. This could be used to control the spread of a disease, rumour, etc. in a temporal graph. In particular, our aim is to find a temporal subgraph in which a process starting at any single vertex can be transferred to only a limited number of other vertices using a temporally-feasible path. We introduce a natural edge-deletion problem for temporal graphs and provide positive and negative results on its computational complexity and approximability.
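To make the reachability notion concrete, here is a minimal sketch (not from the paper) of computing the temporally reachable set, under the common convention that a temporally-feasible path uses strictly increasing edge times; the edge-list representation and function name are illustrative assumptions.

```python
from collections import defaultdict

def temporally_reachable(edges, source):
    """Vertices reachable from `source` via a time-respecting path,
    i.e. a path whose edge times are strictly increasing.
    `edges` is an iterable of undirected time-stamped edges (u, v, t)."""
    # earliest[v] = earliest time at which v can be reached; the source
    # is available from the start.
    earliest = defaultdict(lambda: float("inf"))
    earliest[source] = float("-inf")
    # Scan edges in time order; an edge (u, v, t) extends a path ending
    # at u only if u was already reached strictly before time t.
    for u, v, t in sorted(edges, key=lambda e: e[2]):
        if earliest[u] < t and t < earliest[v]:
            earliest[v] = t
        if earliest[v] < t and t < earliest[u]:
            earliest[u] = t
    return {v for v, t in earliest.items() if t < float("inf")}

# c is reachable from a (edge a-b at time 1, then b-c at time 2):
print(temporally_reachable([("a", "b", 1), ("b", "c", 2)], "a"))
# contains 'a', 'b' and 'c'
```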



Related research

Motivated by applications in network epidemiology, we consider the problem of determining whether it is possible to delete at most $k$ edges from a given input graph (of small treewidth) so that the resulting graph avoids a set $\mathcal{F}$ of forbidden subgraphs; of particular interest is the problem of determining whether it is possible to delete at most $k$ edges so that the resulting graph has no connected component of more than $h$ vertices, as this bounds the worst-case size of an epidemic. While even this special case of the problem is NP-complete in general (even when $h=3$), we provide evidence that many of the real-world networks of interest are likely to have small treewidth, and we describe an algorithm which solves the general problem in time linear in $n$ (with a constant depending on $w$ and $r$) on an input graph having $n$ vertices and whose treewidth is bounded by a fixed constant $w$, if each of the subgraphs we wish to avoid has at most $r$ vertices. For the special case in which we wish only to ensure that no component has more than $h$ vertices, we improve on this to give an algorithm running in time $O((wh)^{2w}n)$, which we have implemented and tested on real datasets based on cattle movements.
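For illustration, here is a brute-force sketch of the special case (not the treewidth-based algorithm from the abstract): it checks whether some set of exactly $k$ edge deletions leaves every component with at most $h$ vertices. Since deleting extra edges never enlarges a component, exactly-$k$ subsets suffice whenever $k \le m$. All names below are illustrative assumptions.

```python
from itertools import combinations

def components_bounded(vertices, edges, h):
    """True iff every connected component of (vertices, edges) has <= h vertices."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:                     # union the endpoints of each edge
        parent[find(u)] = find(v)
    sizes = {}
    for v in vertices:
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) <= h

def can_restrict(vertices, edges, k, h):
    """Can deleting at most k edges leave no component of more than h vertices?
    Brute force over all k-subsets of edges -- exponential, for tiny instances only."""
    for deleted in combinations(edges, k):
        remaining = [e for e in edges if e not in deleted]
        if components_bounded(vertices, remaining, h):
            return True
    return False

# A path on 4 vertices: one deletion splits it into components of size <= 2.
print(can_restrict([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)], 1, 2))  # True
```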
We consider a natural variant of the well-known Feedback Vertex Set problem, namely the problem of deleting a small subset of vertices or edges so that the resulting graph is a full binary tree. This version of the problem is motivated by real-world scenarios that are best modeled by full binary trees. We establish that bo
Temporal networks are widely used to represent a vast diversity of systems, including in particular social interactions, and the spreading processes unfolding on top of them. The identification of structures playing important roles in such processes remains largely an open question, despite recent progress in the case of static networks. Here, we consider as candidate structures the recently introduced concept of span-cores: the span-cores decompose a temporal network into subgraphs of controlled duration and increasing connectivity, generalizing the core decomposition of static graphs. To assess the relevance of such structures, we explore the effectiveness of strategies aimed either at containing or maximizing the impact of a spread, based respectively on removing span-cores of high cohesiveness or duration to decrease the epidemic risk, or on seeding the process from such structures. The effectiveness of such strategies is assessed in a variety of empirical data sets and compared to baselines that use only static information on the centrality of nodes and static concepts of coreness, as well as to a baseline based on a temporal centrality measure. Our results show that the most stable and cohesive temporal cores indeed play an important role in epidemic processes on temporal networks, and that their nodes are likely to represent influential spreaders.
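As a rough illustration of the span-core concept (a hedged sketch based on the published definition, not this paper's code): for an interval $\Delta = [t_s, t_e]$, keep only the edges present in every snapshot of the interval, and the $(k, \Delta)$-span-core is the ordinary $k$-core of that intersection graph. The data layout (consecutive integer timestamps mapping to edge sets) is an assumption.

```python
def span_core(snapshots, k, t_start, t_end):
    """(k, [t_start, t_end])-span-core: the k-core of the graph whose edges
    are present in *every* snapshot of the interval.
    `snapshots` maps an integer timestamp to a set of frozenset({u, v}) edges."""
    edges = set.intersection(
        *(set(snapshots[t]) for t in range(t_start, t_end + 1))
    )
    adj = {}
    for e in edges:
        u, v = tuple(e)
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Standard k-core peeling: repeatedly remove vertices of degree < k.
    while True:
        low = [v for v, nbrs in adj.items() if len(nbrs) < k]
        if not low:
            break
        for v in low:
            for u in adj.pop(v):
                if u in adj:
                    adj[u].discard(v)
    return set(adj)

# Triangle a-b-c present at times 0 and 1; edge a-d only at time 0.
snaps = {
    0: {frozenset("ab"), frozenset("bc"), frozenset("ac"), frozenset("ad")},
    1: {frozenset("ab"), frozenset("bc"), frozenset("ac")},
}
print(span_core(snaps, 2, 0, 1))  # {'a', 'b', 'c'}
```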
Most previous studies of epidemic dynamics on complex networks suppose that the disease will eventually stabilize at either a disease-free state or an endemic one. In reality, however, some epidemics always exhibit sporadic and recurrent behaviour in one region because of invasion from an endemic population elsewhere. In this paper we address this issue and study a susceptible-infected-susceptible epidemiological model on a network consisting of two communities, where the disease is endemic in one community but alternates between outbreaks and extinctions in the other. We provide a detailed characterization of the temporal dynamics of epidemic patterns in the latter community. In particular, we investigate the time duration of both outbreak and extinction, and the time interval between two consecutive inter-community infections, as well as their frequency distributions. Based on mean-field theory, we theoretically analyze these three timescales and their dependence on the average node degree of each community, the transmission parameters, and the number of inter-community links, finding good agreement with simulations except when the probability of overlaps between successive outbreaks is too large. These findings aid us in better understanding the bursty nature of disease spreading in a local community, and thereby suggest effective time-dependent control strategies.
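A minimal discrete-time Monte Carlo sketch of this two-community SIS setting (all parameters and the random-graph construction are illustrative assumptions, not the paper's model details): community A is seeded and stays endemic, and the lengths of the recorded activity runs for community B are precisely the outbreak and extinction durations discussed above.

```python
import random

def sis_two_communities(n=200, inter_links=3, beta=0.05, mu=0.2,
                        steps=2000, seed=1):
    """Discrete-time SIS on two random communities joined by a few
    inter-community edges. Returns, per step, whether community B
    has at least one infected node; run lengths in this boolean
    sequence are outbreak/extinction durations."""
    rng = random.Random(seed)
    A, B = range(n), range(n, 2 * n)
    edges = []
    for comm in (A, B):                       # dense-ish intra-community edges
        nodes = list(comm)
        for _ in range(4 * n):
            u, v = rng.sample(nodes, 2)
            edges.append((u, v))
    for _ in range(inter_links):              # sparse inter-community coupling
        edges.append((rng.choice(list(A)), rng.choice(list(B))))
    infected = set(rng.sample(list(A), 5))    # seed the epidemic in A only
    b_active = []
    for _ in range(steps):
        new = set(infected)
        for u, v in edges:                    # transmission along each edge
            if (u in infected) != (v in infected) and rng.random() < beta:
                new.add(u); new.add(v)
        for v in list(new):                   # recovery back to susceptible
            if v in infected and rng.random() < mu:
                new.discard(v)
        infected = new
        b_active.append(any(v in infected for v in B))
    return b_active

activity = sis_two_communities()
print("fraction of time B has an outbreak:", sum(activity) / len(activity))
```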
Ewan Davies, Will Perkins (2021)
We determine the computational complexity of approximately counting and sampling independent sets of a given size in bounded-degree graphs. That is, we identify a critical density $\alpha_c(\Delta)$ and provide (i) for $\alpha < \alpha_c(\Delta)$ randomized polynomial-time algorithms for approximately sampling and counting independent sets of given size at most $\alpha n$ in $n$-vertex graphs of maximum degree $\Delta$; and (ii) a proof that unless NP=RP, no such algorithms exist for $\alpha > \alpha_c(\Delta)$. The critical density is the occupancy fraction of the hard-core model on the clique $K_{\Delta+1}$ at the uniqueness threshold on the infinite $\Delta$-regular tree, giving $\alpha_c(\Delta) \sim \frac{e}{1+e}\frac{1}{\Delta}$ as $\Delta \to \infty$.
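For intuition, here is a tiny brute-force counter (feasible only for very small $n$; the hardness result concerns scalable approximate counting) together with the asymptotic critical density from the abstract. Function names are illustrative assumptions.

```python
from itertools import combinations
from math import e

def count_independent_sets(n, edges, size):
    """Exact count of independent sets of a given size by enumeration.
    Exponential in n -- exactly the regime the hardness result rules out
    at scale for alpha > alpha_c(Delta)."""
    edge_set = {frozenset(ed) for ed in edges}
    return sum(
        1
        for S in combinations(range(n), size)
        if all(frozenset((u, v)) not in edge_set for u, v in combinations(S, 2))
    )

def alpha_c_asymptotic(delta):
    """Leading-order critical density: alpha_c(Delta) ~ e/(1+e) * 1/Delta."""
    return e / (1 + e) / delta

# 4-cycle 0-1-2-3: the independent sets of size 2 are {0,2} and {1,3}.
print(count_independent_sets(4, [(0, 1), (1, 2), (2, 3), (3, 0)], 2))  # 2
print(round(alpha_c_asymptotic(3), 4))  # asymptotic threshold for Delta = 3
```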
