
A Generalized Framework for Measuring Pedestrian Accessibility around the World Using Open Data

Added by Geoff Boeing
Publication date: 2021
Research language: English





Pedestrian accessibility is an important factor in urban transport and land use policy and critical for creating healthy, sustainable cities. Developing and evaluating indicators measuring inequalities in pedestrian accessibility can help planners and policymakers benchmark and monitor the progress of city planning interventions. However, measuring and assessing indicators of urban design and transport features at high resolution worldwide to enable city comparisons is challenging due to limited availability of official, high quality, and comparable spatial data, as well as spatial analysis tools offering customizable frameworks for indicator construction and analysis. To address these challenges, this study develops an open source software framework to construct pedestrian accessibility indicators for cities using open and consistent data. It presents a generalized method to consistently measure pedestrian accessibility at high resolution and spatially aggregated scale, to allow for both within- and between-city analyses. The open source and open data methods developed in this study can be extended to other cities worldwide to support local planning and policymaking. The software is made publicly available for reuse in an open repository.
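To make the kind of indicator the abstract describes concrete, the following is a minimal sketch of an open-data pedestrian accessibility measure: the share of street network nodes within a 500 m walk of a fresh-food destination. It assumes OpenStreetMap data retrieved through the OSMnx and NetworkX libraries; the study area, distance threshold, and amenity tags are illustrative assumptions, not the parameters or code of the published framework.

```python
# Minimal sketch: estimate pedestrian access to amenities from open data.
# Assumes OSMnx >= 1.3 and NetworkX are installed; the place, the 500 m
# threshold, and the "fresh food" shop tags are illustrative assumptions.
import osmnx as ox
import networkx as nx

place = "Bologna, Italy"  # hypothetical study area

# Download the walkable street network and points of interest from OpenStreetMap.
G = ox.graph_from_place(place, network_type="walk")
pois = ox.features_from_place(place, tags={"shop": ["supermarket", "greengrocer"]})

# Snap each POI to its nearest street network node.
poi_points = pois.geometry.centroid
poi_nodes = ox.distance.nearest_nodes(G, poi_points.x, poi_points.y)

# Mark every node that lies within 500 m of network distance of any POI node.
reachable = set()
for n in set(poi_nodes):
    lengths = nx.single_source_dijkstra_path_length(G, n, cutoff=500, weight="length")
    reachable.update(lengths.keys())

share_with_access = len(reachable) / G.number_of_nodes()
print(f"Share of network nodes within 500 m of fresh food: {share_with_access:.1%}")
```

Per-node results like these can then be aggregated to grid cells or neighbourhoods to support the within- and between-city comparisons the abstract mentions.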



Related research

The objective of this study is to examine spatial patterns of community impact and recovery based on variances in credit card transactions. Such variances can capture the collective effects of household impacts, disrupted access, and business closures, and thus provide an integrative measure for examining disaster impacts and community recovery. Existing studies rely mainly on survey and sociodemographic data to evaluate disaster impacts and recovery efforts, but such data have limitations, including the large effort required for collection and delayed availability of results. In addition, very few studies have concentrated on the spatial patterns and disparities of disaster impacts and short-term community recovery, although such investigation can enhance situational awareness during disasters and support the identification of disparate spatial patterns of impact and recovery across the affected region. This study examines credit card transaction data from Harris County (Texas, USA) during Hurricane Harvey in 2017 to explore spatial patterns of disaster impact and recovery from the perspective of community residents and businesses at the ZIP code and county scales, respectively, and to further investigate their spatial disparities across ZIP codes. The results indicate that, for most business sectors, individuals in higher-income ZIP codes experienced more severe disaster impacts yet recovered more quickly than those in lower-income ZIP codes. Our findings not only enhance the understanding of spatial patterns and disparities in disaster impacts and recovery for better community resilience assessment, but could also benefit emergency managers, city planners, and public officials in harnessing population activity data, using credit card transactions as a proxy for activity, to improve situational awareness and resource allocation.
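As a rough illustration of the transaction-variance approach described above (not the study's actual pipeline), the sketch below aggregates a hypothetical transaction table to weekly totals per ZIP code and sector, measures impact as percent deviation from a pre-landfall baseline, and flags the first week each ZIP code/sector pair returns to within 5% of baseline as a simple recovery indicator. The column names, input file, landfall date, and 5% threshold are assumptions for illustration.

```python
# Minimal sketch of a variance-based impact/recovery measure from transaction data.
# Assumes a hypothetical table with columns 'zip_code', 'sector', 'date', 'amount'.
import pandas as pd

tx = pd.read_csv("transactions.csv", parse_dates=["date"])  # hypothetical input file

# Weekly transaction totals per ZIP code and business sector.
weekly = (tx.set_index("date")
            .groupby(["zip_code", "sector"])["amount"]
            .resample("W").sum()
            .reset_index())

# Pre-disaster baseline: mean weekly total before landfall (date is illustrative).
landfall = pd.Timestamp("2017-08-25")
baseline = (weekly[weekly["date"] < landfall]
            .groupby(["zip_code", "sector"])["amount"].mean()
            .rename("baseline"))

# Impact = percent deviation of each post-landfall week from the baseline.
post = weekly[weekly["date"] >= landfall].join(baseline, on=["zip_code", "sector"])
post["pct_change"] = (post["amount"] - post["baseline"]) / post["baseline"]

# Simple recovery indicator: first week back within 5% of the baseline.
recovered = (post[post["pct_change"] > -0.05]
             .groupby(["zip_code", "sector"])["date"].min()
             .rename("recovery_week"))
print(recovered.head())
```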
Noisy data is prevalent in both the training and testing phases of machine learning systems, and it inevitably degrades model performance. Over the last decade, plenty of work has concentrated on learning with in-distribution (IND) noisy labels, i.e., training samples assigned incorrect labels that do not correspond to their true classes. Nonetheless, in real application scenarios it is also necessary to consider the influence of out-of-distribution (OOD) samples, i.e., samples that do not belong to any known class, which has not yet been sufficiently explored. To remedy this, we study a new problem setup, namely Learning with Open-world Noisy Data (LOND). The goal of LOND is to simultaneously learn a classifier and an OOD detector from datasets with mixed IND and OOD noise. In this paper, we propose a new graph-based framework, namely Noisy Graph Cleaning (NGC), which collects clean samples by leveraging the geometric structure of the data and the model's predictive confidence. Without any additional training effort, NGC can detect and reject OOD samples directly in the testing phase based on the learned class prototypes. We conduct experiments on multiple benchmarks with different types of noise, and the results demonstrate the superior performance of our method against state-of-the-art approaches.
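The abstract notes that NGC rejects OOD samples at test time using learned class prototypes. The sketch below shows a generic prototype-based rejection rule in that spirit (not NGC's exact scoring function): a test embedding is assigned the class of its most similar prototype, or rejected as OOD when its best cosine similarity falls below a threshold. The feature dimensions, threshold, and toy data are illustrative assumptions.

```python
# Generic prototype-based OOD rejection sketch (not NGC's exact rule).
import numpy as np

def class_prototypes(features: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Mean L2-normalized embedding per class, shape (num_classes, dim)."""
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    protos = np.stack([feats[labels == c].mean(axis=0) for c in np.unique(labels)])
    return protos / np.linalg.norm(protos, axis=1, keepdims=True)

def predict_or_reject(test_feats: np.ndarray, protos: np.ndarray, threshold: float = 0.5):
    """Return the predicted class per sample, or -1 where the sample is flagged as OOD."""
    test = test_feats / np.linalg.norm(test_feats, axis=1, keepdims=True)
    sims = test @ protos.T                      # cosine similarity to each prototype
    preds = sims.argmax(axis=1)
    preds[sims.max(axis=1) < threshold] = -1    # reject low-similarity samples as OOD
    return preds

# Toy usage with random embeddings standing in for a trained encoder's features.
rng = np.random.default_rng(0)
train_feats, train_labels = rng.normal(size=(100, 16)), rng.integers(0, 5, size=100)
protos = class_prototypes(train_feats, train_labels)
print(predict_or_reject(rng.normal(size=(10, 16)), protos))
```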
Anissa Tanweer (2017)
Ethics in the emerging world of data science are often discussed through cautionary tales about the dire consequences of missteps taken by high-profile companies or organizations. We take a different approach by foregrounding the ways that ethics are implicated in the day-to-day work of data science, focusing on instances in which data scientists recognize, grapple with, and conscientiously respond to ethical challenges. This paper presents a case study of ethical dilemmas that arose in a data science for social good (DSSG) project focused on improving navigation for people with limited mobility. We describe how this particular DSSG team responded to those dilemmas, and how those responses gave rise to still more dilemmas. While the details of the case discussed here are unique, the ethical dilemmas they illuminate are common across many DSSG projects. These include: the risk of exacerbating disparities; the thorniness of algorithmic accountability; the evolving opportunities for mischief presented by new technologies; the subjective and value-laden interpretations at the heart of any data-intensive project; the potential for data to amplify or mute particular voices; the possibility of privacy violations; and the folly of technological solutionism. Based on our tracing of the team's responses to these dilemmas, we distill lessons for an ethical data science practice that can be applied more generally across DSSG projects. Specifically, this case highlights the importance of: 1) setting the scene early on for ethical thinking; 2) recognizing ethical decision-making as an emergent phenomenon intertwined with the quotidian work of data science for social good; and 3) approaching ethical thinking as a thoughtful and intentional balancing of priorities rather than a binary differentiation between right and wrong.
Increasingly available high-frequency location datasets derived from smartphones provide unprecedented insight into trajectories of human mobility. These datasets can play a significant and growing role in informing preparedness for and response to natural disasters. However, few tools exist for rapid analytics using mobility data, and those that do tend not to be tailored specifically for disaster risk management. We present Mobilkit, an open-source, Python-based toolkit designed to conduct replicable and scalable post-disaster analytics using GPS location data. Privacy considerations, system capabilities, and potential extensions of Mobilkit are discussed.
Web accessibility for people with disabilities has posed a challenge to societies that claim to uphold the principles of equal opportunity and nondiscrimination. Concrete measures have been taken to narrow the digital divide between users of Internet technology with and without disabilities. These efforts have resulted in legislation, mass awareness of the discriminatory nature of the accessibility problem, and the development of technological tools for building and testing Web accessibility. The World Wide Web Consortium's (W3C) Web Accessibility Initiative (WAI) has framed a comprehensive document comprising a set of guidelines for making Web sites accessible to users with disabilities. This paper surveys the issues and aspects surrounding Web accessibility. Its scope is kept limited, in line with the paper's aim to raise awareness and provide a basis for in-depth investigation.