
Data Leverage: A Framework for Empowering the Public in its Relationship with Technology Companies

 Added by Nicholas Vincent
 Publication date 2020
Research language: English





Many powerful computing technologies rely on implicit and explicit data contributions from the public. This dependency suggests a potential source of leverage for the public in its relationship with technology companies: by reducing, stopping, redirecting, or otherwise manipulating data contributions, the public can reduce the effectiveness of many lucrative technologies. In this paper, we synthesize emerging research that seeks to better understand and help people act on this data leverage. Drawing on prior work in areas including machine learning, human-computer interaction, and fairness and accountability in computing, we present a framework for understanding data leverage that highlights new opportunities to change technology company behavior related to privacy, economic inequality, content moderation, and other areas of societal concern. Our framework also points towards ways that policymakers can bolster data leverage as a means of changing the balance of power between the public and tech companies.
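To make the core idea concrete, below is a minimal sketch (not from the paper) of one data leverage tactic the abstract alludes to: a "data strike" in which some fraction of users withhold their contributions. It trains a simple classifier on synthetic data and reports how test accuracy degrades as participation drops; the dataset, model, and strike fractions are all illustrative assumptions.

```python
# Minimal data-strike sketch: measure how a classifier degrades as users
# withhold training data. Purely illustrative; not the paper's methodology.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for data contributed by the public.
X, y = make_classification(n_samples=5000, n_features=20, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for strike_fraction in [0.0, 0.5, 0.9, 0.99]:
    # Users participating in the strike withhold their examples.
    keep = rng.random(len(X_train)) >= strike_fraction
    if keep.sum() < 10:          # guard against an empty training set
        continue
    model = LogisticRegression(max_iter=1000).fit(X_train[keep], y_train[keep])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"strike fraction {strike_fraction:>4.0%}: "
          f"{keep.sum():>5d} contributions remain, test accuracy {acc:.3f}")
```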




Related research

Public feeding programs continue to be a major source of nutrition for a large part of the population across the world. Any disruption to these activities, like the one during the Covid-19 pandemic, can lead to adverse health outcomes, especially among children. Policymakers and other stakeholders must balance the need for continuing the feeding programs with ensuring the health and safety of workers engaged in the operations. This has led to several innovations that leverage advanced technologies like AI and IoT to monitor the health and safety of workers and ensure hygienic operations. However, there are practical challenges in implementing them at scale. This paper presents an implementation framework to build resilient public feeding programs using a combination of intelligent technologies. The framework is the result of piloting the technology solution at a facility run as part of a large mid-day meal feeding program in India. Using existing resources like CCTV cameras and new technologies like AI and IoT, hygiene and safety compliance anomalies can be detected and reported in a resource-efficient manner. It will guide stakeholders running public feeding programs as they seek to restart suspended operations and build systems that better adapt to future crises.
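As a rough illustration of the monitoring idea described above (not the piloted system itself), the sketch below samples frames from a few camera feeds at a low rate and collects flagged compliance anomalies into a report. The detector is a random stub standing in for a real vision model, and the camera names, sampling rate, and anomaly types are assumptions.

```python
# Hypothetical hygiene-compliance monitoring loop; the detector is a stub.
import random
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Anomaly:
    camera_id: str
    timestamp: datetime
    kind: str  # e.g. "no_mask", "no_gloves", "no_hairnet"

def detect_violations(camera_id: str, timestamp: datetime) -> list[Anomaly]:
    """Stub standing in for a vision model scoring one sampled frame."""
    kinds = ["no_mask", "no_gloves", "no_hairnet"]
    return [Anomaly(camera_id, timestamp, k) for k in kinds if random.random() < 0.05]

def monitor(camera_ids: list[str], hours: int = 1, frames_per_minute: int = 2) -> list[Anomaly]:
    """Sample frames at a low rate (resource-efficient) and collect anomalies."""
    anomalies: list[Anomaly] = []
    start = datetime(2021, 1, 1, 9, 0)
    for minute in range(hours * 60):
        ts = start + timedelta(minutes=minute)
        for _ in range(frames_per_minute):
            for cam in camera_ids:
                anomalies.extend(detect_violations(cam, ts))
    return anomalies

if __name__ == "__main__":
    random.seed(0)
    report = monitor(["kitchen-1", "packing-2"])
    for a in report[:5]:
        print(f"{a.timestamp:%H:%M} {a.camera_id}: {a.kind}")
    print(f"{len(report)} anomalies flagged in total")
```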
Katie O'Connell, 2016
Individual neighborhoods within large cities can benefit from independent analysis of public data in the context of ongoing efforts to improve the community. Yet existing tools for public data analysis and visualization are often mismatched to community needs, for reasons including geographic granularity that does not correspond to community boundaries, siloed data sets, inaccurate assumptions about data literacy, and limited user input in design and implementation phases. In Atlanta this need is being addressed through a Data Dashboard developed under the auspices of the Westside Communities Alliance (WCA), a partnership between Georgia Tech and community stakeholders. In this paper we present an interactive analytic and visualization tool for public safety data within the WCA Data Dashboard. We describe a human-centered approach to understand the needs of users and to build accessible mapping tools for visualization and analysis. The tools include a variety of overlays that allow users to spatially correlate features of the built environment, such as vacant properties, with criminal activity as well as crime prevention efforts. We are in the final stages of developing the first version of the tool, with plans for a public release in the fall of 2016.
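The sketch below illustrates, in simplified form, the kind of spatial overlay analysis described above; it is not the WCA Data Dashboard's code. It bins synthetic vacant-property and crime-incident points into a shared grid and reports how strongly the two layers co-vary; the coordinates, grid size, and clustering pattern are all assumed for illustration.

```python
# Toy spatial overlay: aggregate two point layers to a common grid and correlate.
import numpy as np

rng = np.random.default_rng(1)
GRID = 10  # 10 x 10 cells covering a unit-square study area

# Synthetic layers: crime incidents are drawn more densely near vacant properties.
vacant = rng.random((300, 2))
noise = rng.normal(scale=0.05, size=(900, 2))
crime = np.clip(vacant[rng.integers(0, len(vacant), 900)] + noise, 0, 0.999)

def counts_per_cell(points: np.ndarray) -> np.ndarray:
    """Aggregate point locations into counts on the shared grid."""
    idx = np.floor(points * GRID).astype(int)
    grid = np.zeros((GRID, GRID))
    np.add.at(grid, (idx[:, 0], idx[:, 1]), 1)
    return grid

vacant_grid = counts_per_cell(vacant)
crime_grid = counts_per_cell(crime)

# Pearson correlation between the two overlaid layers, cell by cell.
r = np.corrcoef(vacant_grid.ravel(), crime_grid.ravel())[0, 1]
print(f"cell-level correlation between vacancy and crime counts: r = {r:.2f}")
```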
This paper introduces the concept of Open Source Intelligence (OSINT) as an important application in intelligent profiling of individuals. With a variety of tools available, significant data can be obtained about an individual by analyzing his/her internet presence, but all of this comes at the cost of low relevance. To increase the relevance score in profiling, PeopleXploit is introduced. PeopleXploit is a hybrid tool which helps in collecting publicly available information that is reliable and relevant to the given input. The tool is used to track and trace the given target through digital footprints like name, email, phone number, user IDs, etc.; it scans and searches other associated data from publicly available records on the internet and creates a summary report on the target. PeopleXploit profiles a person using authorship analysis and finds the best-matching guess. The type of analysis performed (professional/matrimonial/criminal entity) also varies with the requirement of the user.
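The following sketch is a loose, hypothetical illustration of the relevance-scoring idea described above, not PeopleXploit itself: candidate profiles (hard-coded here in place of real source collectors) are scored against the query footprint and the best-matching guess is reported. Source names, field weights, and data are invented for the example.

```python
# Toy relevance scoring over candidate profiles from stand-in public sources.
from dataclasses import dataclass, field

@dataclass
class Profile:
    source: str
    fields: dict[str, str] = field(default_factory=dict)

# Stand-ins for results that source-specific collectors would return.
CANDIDATES = [
    Profile("social",  {"name": "J. Doe",    "user_id": "jdoe42"}),
    Profile("records", {"name": "Jane Doe",  "email": "jane@example.com", "phone": "555-0100"}),
    Profile("forum",   {"name": "John Dory", "user_id": "jdory"}),
]

# Strong identifiers count more toward relevance than a name match.
WEIGHTS = {"email": 3.0, "phone": 3.0, "user_id": 2.0, "name": 1.0}

def relevance(query: dict[str, str], candidate: Profile) -> float:
    """Weighted overlap between the query footprint and a candidate profile."""
    score = 0.0
    for key, value in query.items():
        if candidate.fields.get(key, "").lower() == value.lower():
            score += WEIGHTS.get(key, 1.0)
    return score

query = {"name": "Jane Doe", "email": "jane@example.com"}
best = max(CANDIDATES, key=lambda c: relevance(query, c))
print(f"best-matching guess from '{best.source}' with score {relevance(query, best):.1f}")
```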
In the last decades, data have become a cornerstone component in many business decisions, and copious resources are being poured into the production and acquisition of high-quality data. This emerging market possesses unique features and has thus come under the spotlight for stakeholders and researchers alike. In this work, we aspire to provide the community with a set of tools for making business decisions, as well as for analysis of markets behaving according to certain rules. We supply, to the best of our knowledge, the first open source simulation platform, termed Open SOUrce Market Simulator (OSOUM), to analyze trading markets and specifically data markets. We also describe and implement a specific data market model, consisting of two types of agents: sellers who own various datasets available for acquisition, and buyers searching for relevant and beneficial datasets for purchase. The current simulation treats data as an infinite-supply product, yet other market settings may be easily implemented using OSOUM. Although commercial frameworks intended for handling data markets already exist, we provide a free and extensive end-to-end research tool for simulating possible behavior for both buyers and sellers participating in (data) markets.
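Below is a minimal sketch of the two-agent market model the abstract outlines, not the OSOUM implementation: sellers post prices for datasets treated as infinite-supply goods, and each buyer purchases the affordable dataset with the highest net value. Prices, valuations, and the matching rule are illustrative assumptions.

```python
# Toy data-market simulation with seller and buyer agents.
import random
from dataclasses import dataclass

@dataclass
class Seller:
    name: str
    price: float          # posted price for (unlimited) copies of the dataset
    quality: float        # proxy for how useful the dataset is

@dataclass
class Buyer:
    name: str
    budget: float
    value_per_quality: float  # how much this buyer values one unit of quality

def simulate(sellers: list[Seller], buyers: list[Buyer]) -> list[tuple[str, str, float]]:
    """Each buyer purchases the affordable dataset with the highest positive net value."""
    trades = []
    for b in buyers:
        affordable = [s for s in sellers if s.price <= b.budget]
        if not affordable:
            continue
        best = max(affordable, key=lambda s: b.value_per_quality * s.quality - s.price)
        if b.value_per_quality * best.quality - best.price > 0:
            trades.append((b.name, best.name, best.price))
    return trades

if __name__ == "__main__":
    random.seed(0)
    sellers = [Seller(f"seller{i}", random.uniform(1, 10), random.uniform(0, 1)) for i in range(3)]
    buyers = [Buyer(f"buyer{j}", random.uniform(5, 15), random.uniform(5, 20)) for j in range(5)]
    for buyer, seller, price in simulate(sellers, buyers):
        print(f"{buyer} buys from {seller} at {price:.2f}")
```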
Pedestrian accessibility is an important factor in urban transport and land use policy and critical for creating healthy, sustainable cities. Developing and evaluating indicators measuring inequalities in pedestrian accessibility can help planners and policymakers benchmark and monitor the progress of city planning interventions. However, measuring and assessing indicators of urban design and transport features at high resolution worldwide to enable city comparisons is challenging due to limited availability of official, high quality, and comparable spatial data, as well as spatial analysis tools offering customizable frameworks for indicator construction and analysis. To address these challenges, this study develops an open source software framework to construct pedestrian accessibility indicators for cities using open and consistent data. It presents a generalized method to consistently measure pedestrian accessibility at high resolution and spatially aggregated scale, to allow for both within- and between-city analyses. The open source and open data methods developed in this study can be extended to other cities worldwide to support local planning and policymaking. The software is made publicly available for reuse in an open repository.
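As a simplified illustration of the kind of indicator the framework computes (not the published software), the sketch below measures the share of synthetic residential points within a 500 m straight-line distance of the nearest amenity and aggregates it to grid cells for within-city comparison. The threshold, cell size, and use of straight-line rather than network distance are all assumptions.

```python
# Toy pedestrian accessibility indicator on synthetic data.
import numpy as np

rng = np.random.default_rng(2)
THRESHOLD_M = 500.0
CITY_SIZE_M = 5000.0   # square study area, 5 km per side
CELL_M = 1000.0        # aggregation cells of 1 km

homes = rng.random((2000, 2)) * CITY_SIZE_M
amenities = rng.random((40, 2)) * CITY_SIZE_M

# Nearest-amenity distance for every home (brute force; fine at this scale).
dists = np.linalg.norm(homes[:, None, :] - amenities[None, :, :], axis=2).min(axis=1)
accessible = dists <= THRESHOLD_M

# Aggregate to grid cells: share of homes in each cell that meet the threshold.
cells = np.floor(homes / CELL_M).astype(int)
n = int(CITY_SIZE_M // CELL_M)
share = np.full((n, n), np.nan)
for i in range(n):
    for j in range(n):
        in_cell = (cells[:, 0] == i) & (cells[:, 1] == j)
        if in_cell.any():
            share[i, j] = accessible[in_cell].mean()

print(f"city-wide share of homes within {THRESHOLD_M:.0f} m of an amenity: "
      f"{accessible.mean():.2f}")
print("per-cell shares (rows = west to east):")
print(np.round(share, 2))
```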