
Could social media data aid in disaster response and damage assessment? Countries face both an increasing frequency and intensity of natural disasters due to climate change, and during such events citizens increasingly turn to social media platforms for disaster-related communication and information. Social media improves situational awareness, facilitates dissemination of emergency information, enables early warning systems, and helps coordinate relief efforts. Additionally, the spatiotemporal distribution of disaster-related messages helps with real-time monitoring and assessment of the disaster itself. Here we present a multiscale analysis of Twitter activity before, during, and after Hurricane Sandy. We examine the online response of 50 metropolitan areas of the United States and find a strong relationship between proximity to Sandy's path and hurricane-related social media activity. We show that real and perceived threats -- together with the physical disaster effects -- are directly observable through the intensity and composition of Twitter's message stream. We demonstrate that per-capita Twitter activity strongly correlates with the per-capita economic damage inflicted by the hurricane. Our findings suggest that massive online social networks can be used for rapid assessment (nowcasting) of damage caused by a large-scale disaster.
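As a minimal illustration of this kind of nowcasting analysis (a sketch, not the authors' pipeline), the per-area correlation can be computed directly from two aligned vectors. The numbers below are hypothetical placeholders for per-metro-area aggregates, and the log transform is an assumption rather than the paper's stated method.

```python
import numpy as np

# Hypothetical per-metro-area aggregates (placeholders, not real data):
# normalized hurricane-related tweets per capita and damage (USD) per capita.
tweets_per_capita = np.array([0.8, 2.4, 0.3, 5.1, 1.7, 0.2, 3.9, 0.6])
damage_per_capita = np.array([120.0, 410.0, 40.0, 980.0, 260.0, 15.0, 730.0, 90.0])

# Pearson correlation on log-transformed values, a common choice when both
# quantities span orders of magnitude (an assumption made here for illustration).
r = np.corrcoef(np.log(tweets_per_capita), np.log(damage_per_capita))[0, 1]
print(f"log-log Pearson r = {r:.2f}")
```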
Information flow during catastrophic events is a critical aspect of disaster management. Modern communication platforms, in particular online social networks, provide an opportunity to study such flow, and a means to derive early-warning sensors, improving emergency preparedness and response. The performance of the social network sensor method, based on topological and behavioural properties derived from the friendship paradox, is studied here for over 50 million Twitter messages posted before, during, and after Hurricane Sandy. We find that differences in users' network centrality effectively translate into a moderate awareness advantage (up to 26 hours), and that the geo-location of users within or outside of the hurricane-affected area plays a significant role in determining the scale of this advantage. Emotional response appears to be universal regardless of position in the network topology, and displays characteristic, easily detectable patterns, opening the possibility of implementing a simple sentiment-sensing technique to detect and locate disasters.
Recent research has focused on the monitoring of global-scale online data for improved detection of epidemics, mood patterns, movements in the stock market, political revolutions, box-office revenues, consumer behaviour and many other important phenomena. However, privacy considerations and the sheer scale of data available online are quickly making global monitoring infeasible, and existing methods do not take full advantage of local network structure to identify key nodes for monitoring. Here, we develop a model of the contagious spread of information in a global-scale, publicly-articulated social network and show that a simple method can yield not just early detection, but advance warning of contagious outbreaks. In this method, we randomly choose a small fraction of nodes in the network and then randomly choose a friend of each node to include in a group for local monitoring. Using six months of data from most of the full Twittersphere, we show that this friend group is more central in the network and helps us detect viral outbreaks of novel hashtags about 7 days earlier than an equal-sized randomly chosen group. Moreover, the method works better than expected from network structure alone, because highly central actors are both more active and exhibit increased diversity in the information they transmit to others. These results suggest that local monitoring is not just more efficient, it is more effective, and it is possible that other contagious processes in global-scale networks may be similarly monitored.
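A minimal sketch of the sensor-selection step described above, assuming the follower graph is available as a networkx graph; the toy Barabási-Albert graph and the 1% sample fraction are illustrative choices, not the study's data or parameters.

```python
import random
import networkx as nx

def friend_sensor_group(G, sample_fraction=0.01, seed=0):
    """Pick a random node sample, then one random friend of each sampled node.

    By the friendship paradox, the returned friend group tends to be more
    central than the random sample, so it can serve as an early-warning sensor set.
    """
    rng = random.Random(seed)
    nodes = list(G.nodes())
    k = max(1, int(sample_fraction * len(nodes)))
    random_group = rng.sample(nodes, k)
    friend_group = []
    for node in random_group:
        neighbors = list(G.neighbors(node))
        if neighbors:  # isolated nodes contribute no friend sensor
            friend_group.append(rng.choice(neighbors))
    return random_group, friend_group

# Toy usage on a scale-free graph: the friend group's mean degree
# should exceed the random group's mean degree.
G = nx.barabasi_albert_graph(10_000, 3)
rand_grp, friend_grp = friend_sensor_group(G, 0.01)

def mean_degree(group):
    return sum(dict(G.degree(group)).values()) / len(group)

print(mean_degree(rand_grp), mean_degree(friend_grp))
```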
We perform laboratory experiments to elucidate the role of historical information in games involving human coordination. Our approach follows prior work studying human network coordination using the task of graph coloring. We first motivate this research by showing empirical evidence that the resolution of coloring conflicts depends upon the recent local history of that conflict. We also conduct two tailored experiments that manipulate the game history available to humans in order to determine (i) whether humans use historical information, and (ii) whether they use it effectively. In the first variant, during the course of each coloring task, the network positions of the subjects were periodically swapped while maintaining the global coloring state of the network. In the second variant, participants completed a series of 2-coloring tasks, some of which were restarts from checkpoints of previous tasks. Thus, the participants restarted the coloring task from a point in the middle of a previous task without knowledge of the history that led to that point. We report on the game dynamics and average completion times for the diverse graph topologies used in the swap and restart experiments.
Granovetter's strength of weak ties hypothesizes that isolated social ties offer limited access to external prospects, while heterogeneous social ties diversify one's opportunities. We analyze the most complete record of college student interactions to date (approximately 80,000 interactions by 290 students -- 16 times more interactions with almost 3 times more students than previous studies on educational networks) and compare the social interaction data with the academic scores of the students. Our first finding is that social diversity is negatively correlated with performance. This is explained by our second finding: highly performing students interact in groups of similarly performing peers, and this effect is stronger the higher the student's performance. Indeed, low-performing students tend to initiate many transient interactions independently of the performance of their target. In other words, low-performing students act disassortatively with respect to their social network, whereas high-scoring students act assortatively. Our data also reveal that highly performing students establish persistent interactions before mid- and low-performing ones, and that they use more structured and longer cascades of information from which low-performing students are excluded.
In a genetic algorithm, fluctuations of the entropy of a genome over time can be interpreted as fluctuations of the information that the genome's organism stores about its environment, which is reflected in more complex organisms. The computation of this entropy presents technical problems due to the small population sizes used in practice. In this work we propose and test an alternative way of measuring the entropy variation in a population by means of algorithmic information theory, where the entropy variation between two generational steps is the Kolmogorov complexity of the first step conditioned on the second one. As an example application of this technique, we report experimental differences in entropy evolution between systems in which sexual reproduction is present or absent.
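As a rough illustration of how a conditional Kolmogorov complexity can be approximated in practice, the sketch below uses a standard compressor (the common compression-based approximation, not necessarily the estimator used in this work): K(x | y) is approximated by C(y + x) - C(y), where C is the compressed length.

```python
import zlib

def compressed_len(data: bytes) -> int:
    """Compressed length in bytes, used as a proxy for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def conditional_complexity(x: bytes, y: bytes) -> int:
    """Approximate K(x | y) as C(y + x) - C(y)."""
    return compressed_len(y + x) - compressed_len(y)

# Toy usage: serialize two consecutive generations of a population (e.g. the
# concatenated genomes) as byte strings; the entropy variation between the
# steps is then estimated directly. The strings below are hypothetical.
generation_t = b"010110101001" * 50
generation_t1 = b"010110111001" * 50
print(conditional_complexity(generation_t, generation_t1))
```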
In a previous work, the authors proposed a Grammatical Evolution algorithm to automatically generate Lindenmayer Systems which represent fractal curves with a pre-determined fractal dimension. This paper gives strong statistical evidence that the probability distribution of the execution time of that algorithm exhibits a heavy tail with a hyperbolic probability decay for long executions, which explains the erratic performance of different executions of the algorithm. Three different restart strategies have been incorporated into the algorithm to mitigate the problems associated with heavy-tailed distributions: the first assumes full knowledge of the execution-time probability distribution, while the second and third assume no knowledge. These strategies exploit the fact that the probability of finding a solution in short executions is non-negligible, and they yield a severe reduction both in the expected execution time (up to one order of magnitude) and in its variance, which is reduced from an infinite to a finite value.
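The sketch below illustrates the simplest flavour of such a strategy, a fixed-cutoff restart loop that assumes no knowledge of the runtime distribution; the search function, cutoff, and success probability are hypothetical stand-ins for the Grammatical Evolution run, not the strategies evaluated in the paper.

```python
import random

def run_with_restarts(search, cutoff, max_restarts=1000, seed=0):
    """Repeatedly run search(rng, cutoff) until it returns a solution.

    Because short runs succeed with non-negligible probability under a
    heavy-tailed runtime distribution, restarting bounds the expected
    total time and removes the infinite-variance tail.
    """
    total_evals = 0
    for attempt in range(max_restarts):
        rng = random.Random(seed + attempt)
        solution, evals_used = search(rng, cutoff)
        total_evals += evals_used
        if solution is not None:
            return solution, total_evals
    return None, total_evals

# Toy stand-in for the evolutionary run: each evaluation succeeds with small probability.
def toy_search(rng, cutoff, p=0.001):
    for i in range(1, cutoff + 1):
        if rng.random() < p:
            return "solution", i
    return None, cutoff

print(run_with_restarts(toy_search, cutoff=500))
```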
In this paper we discuss the threat of malware targeted at extracting information about the relationships in a real-world social network as well as characteristic information about the individuals in the network, which we dub Stealing Reality. We present Stealing Reality, explain why it differs from traditional types of network attacks, and discuss why its impact is significantly more dangerous than that of other attacks. We also present our initial analysis and results regarding the form that an SR attack might take, with the goal of promoting the discussion of defending against such an attack, or even just detecting the fact that one has already occurred.
Corporate response to illness is currently an ad-hoc, subjective process that has little basis in data on how disease actually spreads at the workplace. Additionally, many studies have shown that productivity is not an individual factor but a social one: any study on epidemic responses has to take this social factor into account. The barrier to addressing this problem has been the lack of data on the interaction and mobility patterns of people in the workplace. We have created a wearable Sociometric Badge that senses interactions between individuals using an infra-red (IR) transceiver and proximity using a radio transmitter. Using the data from the Sociometric Badges, we are able to simulate diseases spreading through face-to-face interactions with realistic epidemiological parameters. In this paper we construct a curve trading off productivity against epidemic potential. We are able to take into account impacts on productivity that arise from social factors, such as interaction diversity and density, which studies that take an individual approach ignore. We also propose new organizational responses to diseases that take into account behavioral patterns associated with more virulent disease spread. This is advantageous because it allows companies to decide on appropriate responses based on the organizational context of a disease outbreak.
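For illustration, a discrete-time SIR simulation over a contact graph of the kind the badges provide can be written in a few lines; the random graph, transmission probability, and recovery time below are placeholders, not the paper's calibrated epidemiological parameters.

```python
import random
import networkx as nx

def simulate_sir(G, beta=0.05, recovery_days=5, initial_infected=1, seed=0):
    """Discrete-time SIR on contact graph G; returns the total number ever infected."""
    rng = random.Random(seed)
    infected = {n: recovery_days for n in rng.sample(list(G.nodes()), initial_infected)}
    recovered = set()
    while infected:
        new_infected = {}
        for node, days_left in infected.items():
            for neighbor in G.neighbors(node):
                susceptible = (neighbor not in infected and
                               neighbor not in recovered and
                               neighbor not in new_infected)
                if susceptible and rng.random() < beta:  # per-contact daily transmission
                    new_infected[neighbor] = recovery_days
            if days_left > 1:
                new_infected[node] = days_left - 1
            else:
                recovered.add(node)
        infected = new_infected
    return len(recovered)

# Toy workplace contact graph (random, for illustration only).
G = nx.erdos_renyi_graph(100, 0.05, seed=1)
print(simulate_sir(G))
```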
It is now commonplace to see the Web as a platform that can harness the collective abilities of large numbers of people to accomplish tasks with unprecedented speed, accuracy and scale. To push this idea to its limit, DARPA launched its Network Challenge, which aimed to explore the roles the Internet and social networking play in the timely communication, wide-area team-building, and urgent mobilization required to solve broad-scope, time-critical problems. The challenge required teams to provide the coordinates of ten red weather balloons placed at different locations in the continental United States. This large-scale mobilization required the ability to spread information about the task widely and quickly, and to incentivize individuals to act. We report on the winning team's strategy, which utilized a novel recursive incentive mechanism to find all balloons in under nine hours. We analyze the theoretical properties of the mechanism, and present data about its performance in the challenge.
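The sketch below illustrates a recursive, geometrically decaying payout of the kind described: the balloon finder receives a base reward and each recruiter up the referral chain receives half of the amount paid to the level below. The base amount and names are illustrative assumptions, not necessarily the exact figures used in the challenge.

```python
def recursive_payouts(referral_chain, base_reward=2000.0):
    """referral_chain: list of people from the balloon finder up to the root recruiter."""
    payouts = {}
    reward = base_reward
    for person in referral_chain:
        payouts[person] = reward
        reward /= 2.0  # each level up the chain receives half of the level below
    return payouts

# Example: Dave found a balloon; Carol recruited Dave, Bob recruited Carol,
# and Alice recruited Bob.
print(recursive_payouts(["Dave", "Carol", "Bob", "Alice"]))
# Total paid per balloon is bounded by 2 * base_reward regardless of chain length,
# since the geometric series 1 + 1/2 + 1/4 + ... converges to 2.
```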
