In the past few decades, constitution-making processes have shifted from closed elite drafting to incorporating democratic mechanisms. Yet little is known about democratic participation in deliberative constitution-making processes. Here, we study a deliberative constituent process held by the Chilean government between 2015 and 2016. The Chilean process had the highest level of citizen participation in the world for such a process ($204,402$ people, i.e., $1.3\%$ of the population) and covered $98\%$ of the national territory. In its participatory phase, people gathered in self-convoked groups of 10 to 30 members, and they collectively selected social rights, deliberated on them, and wrote down arguments on why the new constitution should include them. To understand the drivers of citizen participation in this voluntary process, we first identify its determinants at the municipality level. We find that educational level, engagement in politics, support for the (left-wing) government, and Internet access increased participation. In contrast, population density and the share of evangelical Christians decreased participation. Moreover, we do not find evidence of political manipulation of citizen participation. In light of these determinants, we analyze the collective selection of social rights and the content produced during the deliberative phase. The findings suggest that the knowledge embedded in cities, proxied by education levels and main economic activity, facilitates deliberation about themes, concepts, and ideas. These results can inform the organization of new deliberative processes that involve voluntary citizen participation, from citizen consultations to constitution-making processes.
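A minimal sketch of the kind of municipality-level analysis described above, assuming a hypothetical input file and column names (participation_rate, education, and so on); this is not the authors' exact specification.

```python
# Sketch: regress municipality-level participation rates on the determinants
# named in the abstract. File name and columns are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("municipalities.csv")  # hypothetical: one row per municipality

model = smf.ols(
    "participation_rate ~ education + political_engagement + govt_support"
    " + internet_access + pop_density + share_evangelical",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(model.summary())
```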
In this work, we reveal the structure of global news coverage of disasters and its determinants using a large-scale news coverage dataset collected by the GDELT (Global Database of Events, Language, and Tone) project, which monitors news media in over 100 languages worldwide. The significant variables in our hierarchical (mixed-effects) regression model, such as population size, political stability, and the extent of damage, align well with previous research. Yet the strong regionalism we find in news geography highlights the necessity of such a comprehensive dataset for the study of global news coverage.
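An illustrative sketch of a hierarchical (mixed-effects) regression in the spirit described above; the variable names and grouping are assumptions, not actual GDELT fields or the authors' model.

```python
# Sketch: mixed-effects model of (log) coverage volume per disaster, with a
# random intercept per world region. Data layout is hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

events = pd.read_csv("disaster_coverage.csv")  # hypothetical: one row per disaster

m = smf.mixedlm(
    "log_coverage ~ log_population + political_stability + log_damage",
    data=events,
    groups=events["region"],  # random intercept for each region
).fit()
print(m.summary())
```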
This paper was presented as the 8th annual Transactions in GIS plenary address at the American Association of Geographers annual meeting in Washington, DC. The spatial sciences have recently seen growing calls for more accessible software and tools that better embody geographic science and theory. Urban spatial network science offers one clear opportunity: from multiple perspectives, tools to model and analyze nonplanar urban spatial networks have traditionally been inaccessible, atheoretical, or otherwise limiting. This paper reflects on this state of the field. It then discusses the motivation, experience, and outcomes of developing OSMnx, a tool intended to help address this. Next it reviews this tool's use in the recent multidisciplinary spatial network science literature to highlight the upstream and downstream benefits of open-source software development. Tool-building is an essential but poorly incentivized component of academic geography and social science more broadly. To conduct better science, we need to build better tools. The paper concludes with paths forward, emphasizing open-source software and reusable computational data science beyond mere reproducibility and replicability.
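A small illustration of the kind of workflow OSMnx enables (not an example from the paper): download a city's drivable street network from OpenStreetMap and summarize it. Exact function locations can vary slightly between OSMnx versions.

```python
# Sketch: build and summarize an urban street network with OSMnx.
import osmnx as ox

G = ox.graph_from_place("Washington, District of Columbia, USA", network_type="drive")
stats = ox.basic_stats(G)  # node/edge counts, street lengths, etc.
print(stats["n"], "nodes;", stats["m"], "edges")
ox.plot_graph(G)
```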
In this paper, we apply techniques of ensemble analysis to understand the political baseline for Congressional representation in Colorado. We generate a large random sample of reasonable redistricting plans, determine the partisan balance of each district using returns from statewide elections in 2018, and analyze the districts enacted in 2011/2012 in this context. Colorado recently adopted a new framework for redistricting, creating an independent commission to draw district boundaries, prohibiting partisan bias and incumbency considerations, requiring that political boundaries (such as counties) be preserved as much as possible, and requiring that mapmakers maximize the number of competitive districts. We investigate the relationships between partisan outcomes, the number of counties that are split, and the number of competitive districts in a plan. This paper also features two novel methodological improvements: a more rigorous statistical framework for understanding the necessary sample size, and a weighted-graph method for generating random plans that split approximately as few counties as acceptable human-drawn maps.
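A simplified sketch of ensemble outlier analysis, assuming a hypothetical file of per-plan summary statistics; the enacted-plan numbers below are placeholders for illustration only, not results from the paper.

```python
# Sketch: locate an enacted plan within the distribution of an ensemble of
# randomly generated plans. Data layout and values are hypothetical.
import pandas as pd

ensemble = pd.read_csv("ensemble_plans.csv")  # hypothetical: one row per sampled plan
enacted = {"dem_seats": 3, "counties_split": 7, "competitive_districts": 2}  # placeholder values

for stat, value in enacted.items():
    share_below = (ensemble[stat] < value).mean()
    print(f"{stat}: enacted={value}, share of ensemble below = {share_below:.1%}")
```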
With a majority of Yes votes in the Constitutional Referendum of 2017, Turkey continues its transition from democracy to autocracy. By the will of the Turkish people, this referendum transferred practically all executive power to President Erdogan. However, the referendum was confronted with a substantial number of allegations of electoral misconduct and irregularities, ranging from state coercion of No supporters to the controversial validity of unstamped ballots. In this note we report the results of an election forensic analysis of the 2017 referendum to clarify to what extent these voting irregularities were present and whether they were able to influence the outcome of the referendum. We apply novel statistical forensics tests to identify the specific nature of the electoral malpractices. In particular, we test whether the data contain fingerprints of ballot-stuffing (submission of multiple ballots per person during the vote) and voter rigging (coercion and intimidation of voters). Additionally, we perform tests to identify numerical anomalies in the election results. We find systematic and highly significant support for the presence of both ballot-stuffing and voter rigging. In 6% of polling stations we find signs of ballot-stuffing, with an error probability (the probability that no ballot-stuffing occurred) of 0.15% (a 3-sigma event). The influence of these vote distortions was large enough to tip the overall balance from No to a majority of Yes votes.
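A hedged sketch of one standard election-forensics diagnostic in the spirit of the fingerprint tests mentioned above (not the exact procedure of this note): a 2D histogram of turnout versus Yes-share per polling station, where ballot-stuffing tends to appear as a smear toward the high-turnout, high-Yes corner. Column names are hypothetical.

```python
# Sketch: per-station "election fingerprint" of turnout vs. Yes share.
import pandas as pd
import matplotlib.pyplot as plt

stations = pd.read_csv("referendum_stations.csv")  # hypothetical columns below
turnout = stations["votes_cast"] / stations["registered_voters"]
yes_share = stations["yes_votes"] / stations["votes_cast"]

plt.hist2d(turnout, yes_share, bins=100, cmap="viridis")
plt.xlabel("Turnout")
plt.ylabel("Share of Yes votes")
plt.title("Election fingerprint (per polling station)")
plt.show()
```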
Social media have radically changed how information is consumed and reported. Moreover, social networks elicited disintermediated access to an unprecedented amount of content. The World Health Organization (WHO) coined the term infodemic to identify the information overabundance during an epidemic. Indeed, the spread of inaccurate and misleading information may alter behaviors and complicate crisis management and health responses. This paper addresses information diffusion during the COVID-19 pandemic through a massive data analysis on YouTube. First, we analyze more than 2M users' engagement with 13,000 videos released by 68 different YouTube channels with different political bias and fact-checking indexes. We then investigate the relationship between each user's political preference and their consumption of questionable versus reliable information. Our results, quantified using information theory measures, provide evidence for the existence of echo chambers across two dimensions, represented by the political bias and the trustworthiness of information channels. Finally, we observe that the echo chamber structure cannot be reproduced after properly randomizing the users' interaction patterns.
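A minimal sketch of the kind of randomization check described above, under an assumed data layout (not the authors' exact measure): compare the mutual information between a user's political leaning and the reliability of the channels they engage with against a null distribution obtained by shuffling the interaction pattern.

```python
# Sketch: observed vs. shuffled mutual information between user leaning and
# channel reliability labels. Column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.metrics import mutual_info_score

users = pd.read_csv("user_engagement.csv")  # hypothetical: one row per user
observed = mutual_info_score(users["leaning"], users["channel_reliability"])

null = [
    mutual_info_score(users["leaning"],
                      np.random.permutation(users["channel_reliability"]))
    for _ in range(1000)
]
print(f"observed MI = {observed:.4f}, null 95th percentile = {np.percentile(null, 95):.4f}")
```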