
Towards Privacy-assured and Lightweight On-chain Auditing of Decentralized Storage

Added by Yuefeng Du
Publication date: 2020
Research language: English





How to audit outsourced data in centralized storage such as the cloud is well studied, but auditing remains largely under-explored for the rising decentralized storage network (DSN), which bodes well for a billion-dollar market. To realize DSN as a usable service in a truly decentralized manner, the blockchain comes in handy: it records and verifies audit trails in the form of proofs of storage and, based on them, handles fair payment with the necessary dispute resolution. Leaving the audit trails on the blockchain offers transparency and fairness, yet it 1) sacrifices privacy, as they may leak information about the data under audit, and 2) overwhelms on-chain resources, as they can be large in practice and expensive to verify. Prior auditing designs in centralized settings are not directly applicable here, and the handful of proposals targeting DSN cannot satisfactorily address these issues either. We present an auditing solution that addresses on-chain privacy and efficiency through a synergy of homomorphic linear authenticators with polynomial commitments for succinct proofs, and the sigma protocol for provable privacy. Per audit, the solution writes a 288-byte proof to the blockchain and verifies it at constant cost. It can sustain long-term operation and easily scale to thousands of users on Ethereum.
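To make the homomorphic linear authenticator (HLA) idea concrete, the following is a toy, secretly keyed linear MAC in the spirit of classic proof-of-storage schemes. It only illustrates why a challenge over many blocks compresses into a constant-size response; it is not the paper's construction, which further adds polynomial commitments for succinctness and a sigma protocol for privacy. All names and parameters below are made up for the example.

# Toy homomorphic linear authenticator (HLA) sketch in the spirit of
# PDP/PoR schemes. Illustrative only; parameters are not secure choices.
import hashlib, secrets

P = 2**127 - 1  # working prime field (illustrative)

def prf(key: bytes, i: int) -> int:
    return int.from_bytes(hashlib.sha256(key + i.to_bytes(8, "big")).digest(), "big") % P

def tag_blocks(blocks, alpha, key):
    # sigma_i = alpha * m_i + PRF_key(i)  (mod P)
    return [(alpha * m + prf(key, i)) % P for i, m in enumerate(blocks)]

def prove(blocks, tags, challenge):
    # challenge: list of (index, coefficient) pairs chosen by the auditor
    mu = sum(nu * blocks[i] for i, nu in challenge) % P
    sigma = sum(nu * tags[i] for i, nu in challenge) % P
    return mu, sigma  # constant-size response regardless of file size

def verify(mu, sigma, challenge, alpha, key):
    expected = (alpha * mu + sum(nu * prf(key, i) for i, nu in challenge)) % P
    return sigma == expected

# Usage: tag 1000 blocks, answer a 10-block random challenge with two field elements.
blocks = [int.from_bytes(secrets.token_bytes(8), "big") for _ in range(1000)]
alpha, key = secrets.randbelow(P), secrets.token_bytes(16)
tags = tag_blocks(blocks, alpha, key)
chal = [(secrets.randbelow(len(blocks)), secrets.randbelow(P)) for _ in range(10)]
assert verify(*prove(blocks, tags, chal), chal, alpha, key)

The point of the sketch is the compression: whatever the file size, the prover answers a random challenge with two field elements, which is the property the on-chain audit trail exploits.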




Read More

Clients of permissionless blockchain systems, like Bitcoin, rely on an underlying peer-to-peer network to send and receive transactions. It is critical that a client is connected to at least one honest peer, as otherwise the client can be convinced to accept a maliciously forked view of the blockchain. In such an eclipse attack, the client is unable to reliably distinguish the canonical view of the blockchain from the view provided by the attacker. The consequences of this can be catastrophic if the client makes business decisions based on a distorted view of the blockchain transactions. In this paper, we investigate the design space and propose two approaches for Bitcoin clients to detect whether an eclipse attack against them is ongoing. Each approach chooses a different trade-off between average attack detection time and network load. The first scheme is based on the detection of suspicious block timestamps. The second scheme allows blockchain clients to utilize their natural connections to the Internet (i.e., standard web activity) to gossip about their blockchain views with contacted servers and their other clients. Our proposals improve upon previously proposed eclipse attack countermeasures without introducing any dedicated infrastructure or changes to the Bitcoin protocol and network, and we discuss an implementation. We demonstrate the effectiveness of the gossip-based schemes through rigorous analysis using original Internet traffic traces and real-world deployment. The results indicate that our protocol incurs a negligible overhead and detects eclipse attacks rapidly with high probability, and is well-suited for practical deployment.
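As a rough illustration of the first scheme, the sketch below flags a possible eclipse attack when no sufficiently recent block has been observed; an eclipsed client only sees attacker-mined blocks, which arrive far more slowly than the network average. The threshold is a made-up placeholder, not the detection rule or parameters calibrated in the paper.

# Minimal sketch of a timestamp-based eclipse-attack heuristic.
# Thresholds are illustrative, not the paper's calibrated parameters.
import time

EXPECTED_INTERVAL = 600   # Bitcoin's target block interval, in seconds
ALARM_FACTOR = 6          # hypothetical: alarm after ~6 blockless intervals

def eclipse_suspected(latest_block_timestamp, now=None):
    now = int(time.time()) if now is None else now
    return now - latest_block_timestamp > ALARM_FACTOR * EXPECTED_INTERVAL

# Example: a chain tip whose timestamp is over an hour old triggers the alarm.
print(eclipse_suspected(int(time.time()) - 4000))  # True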
Virtual reality (VR) is an emerging technology that enables new applications but also introduces privacy risks. In this paper, we focus on Oculus VR (OVR), the leading platform in the VR space, and we provide the first comprehensive analysis of personal data exposed by OVR apps and the platform itself, from a combined networking and privacy policy perspective. We experimented with the Quest 2 headset, and we tested the most popular VR apps available on the official Oculus and the SideQuest app stores. We developed OVRseen, a methodology and system for collecting, analyzing, and comparing network traffic and privacy policies on OVR. On the networking side, we captured and decrypted network traffic of VR apps, which was previously not possible on OVR, and we extracted data flows (defined as <app, data type, destination>). We found that the OVR ecosystem (compared to the mobile and other app ecosystems) is more centralized, and driven by tracking and analytics, rather than by third-party advertising. We show that the data types exposed by VR apps include personally identifiable information (PII), device information that can be used for fingerprinting, and VR-specific data types. By comparing the data flows found in the network traffic with statements made in the apps' privacy policies, we discovered that approximately 70% of OVR data flows were not properly disclosed. Furthermore, we provided additional context for these data flows, including the purpose, which we extracted from the privacy policies, and observed that 69% were sent for purposes unrelated to the core functionality of apps.
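A minimal sketch of the flow-versus-policy comparison described above, using the <app, data type, destination> tuples; the app name and destinations are hypothetical, and the real matching in OVRseen is considerably more involved.

# Illustrative comparison of observed data flows against policy-disclosed
# flows. Names and hosts below are hypothetical examples.
observed = {
    ("BeatGame", "device_id", "analytics.example.com"),
    ("BeatGame", "email", "tracker.example.net"),
}
disclosed = {
    ("BeatGame", "device_id", "analytics.example.com"),
}
undisclosed = observed - disclosed
print(f"{len(undisclosed) / len(observed):.0%} of observed flows not disclosed")  # 50%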
Decentralized exchanges (DEXs) allow parties to participate in financial markets while retaining full custody of their funds. However, the transparency of blockchain-based DEX in combination with the latency for transactions to be processed, makes market-manipulation feasible. For instance, adversaries could perform front-running -- the practice of exploiting (typically non-public) information that may change the price of an asset for financial gain. In this work we formalize, analytically exposit and empirically evaluate an augmented variant of front-running: sandwich attacks, which involve front- and back-running victim transactions on a blockchain-based DEX. We quantify the probability of an adversarial trader being able to undertake the attack, based on the relative positioning of a transaction within a blockchain block. We find that a single adversarial trader can earn a daily revenue of over several thousand USD when performing sandwich attacks on one particular DEX -- Uniswap, an exchange with over 5M USD daily trading volume by June 2020. In addition to a single-adversary game, we simulate the outcome of sandwich attacks under multiple competing adversaries, to account for the real-world trading environment.
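The toy constant-product market maker below (x * y = k, as in Uniswap v2) shows the mechanics being quantified: the attacker buys just before the victim, the victim trades at the worsened price, and the attacker sells just after. Pool sizes, trade sizes, and the 0.3% fee are illustrative; the paper's profitability analysis additionally accounts for gas costs, block positioning probabilities, and competing adversaries.

# Toy constant-product AMM illustrating a sandwich attack. Illustrative
# numbers only; not the paper's model.
FEE = 0.003

def swap_x_for_y(pool, dx):
    # Sell dx of asset X into the pool; return new reserves and Y received.
    x, y = pool
    dx_eff = dx * (1 - FEE)
    dy = y * dx_eff / (x + dx_eff)
    return (x + dx, y - dy), dy

def swap_y_for_x(pool, dy):
    x, y = pool
    dy_eff = dy * (1 - FEE)
    dx = x * dy_eff / (y + dy_eff)
    return (x - dx, y + dy), dx

pool = (1_000.0, 1_000.0)                 # initial reserves of X and Y

# 1. Attacker front-runs: buys Y before the victim, pushing the price up.
pool, atk_y = swap_x_for_y(pool, 50.0)
# 2. Victim's swap executes at the worsened price.
pool, victim_y = swap_x_for_y(pool, 100.0)
# 3. Attacker back-runs: sells the Y acquired in step 1 at the inflated price.
pool, atk_x_back = swap_y_for_x(pool, atk_y)

print(f"attacker profit in X: {atk_x_back - 50.0:.3f}")   # positive
print(f"victim received Y:    {victim_y:.3f}")            # less than without the sandwich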
This document describes and analyzes a system for secure and privacy-preserving proximity tracing at large scale. This system, referred to as DP3T, provides a technological foundation to help slow the spread of SARS-CoV-2 by simplifying and accelerating the process of notifying people who might have been exposed to the virus so that they can take appropriate measures to break its transmission chain. The system aims to minimise privacy and security risks for individuals and communities and guarantee the highest level of data protection. The goal of our proximity tracing system is to determine who has been in close physical proximity to a COVID-19 positive person and thus exposed to the virus, without revealing the contact's identity or where the contact occurred. To achieve this goal, users run a smartphone app that continually broadcasts an ephemeral, pseudo-random ID representing the user's phone and also records the pseudo-random IDs observed from smartphones in close proximity. When a patient is diagnosed with COVID-19, she can upload pseudo-random IDs previously broadcast from her phone to a central server. Prior to the upload, all data remains exclusively on the user's phone. Other users' apps can use data from the server to locally estimate whether the device's owner was exposed to the virus through close-range physical proximity to a COVID-19 positive person who has uploaded their data. In case the app detects a high risk, it will inform the user.
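A simplified sketch of the ephemeral-ID mechanism described above: a daily secret key is ratcheted forward with a hash and expanded into many short-lived broadcast identifiers. The constants and derivation functions below are placeholders and do not follow the exact DP3T specification.

# Simplified DP3T-style ephemeral ID generation. Constants and
# derivations are illustrative, not the exact specification.
import hashlib, hmac, os

EPHIDS_PER_DAY = 96      # e.g. one EphID per 15-minute epoch (illustrative)
EPHID_LEN = 16           # bytes broadcast over BLE

def next_day_key(sk: bytes) -> bytes:
    return hashlib.sha256(sk).digest()          # ratchet: SK_t = H(SK_{t-1})

def ephids_for_day(sk: bytes):
    seed = hmac.new(sk, b"broadcast key", hashlib.sha256).digest()
    stream = b"".join(
        hmac.new(seed, i.to_bytes(4, "big"), hashlib.sha256).digest()
        for i in range(EPHIDS_PER_DAY)
    )
    return [stream[i * EPHID_LEN:(i + 1) * EPHID_LEN] for i in range(EPHIDS_PER_DAY)]

sk_day0 = os.urandom(32)
sk_day1 = next_day_key(sk_day0)
# A diagnosed user uploads a day key; other phones re-derive that day's
# EphIDs locally and compare them against the IDs they recorded nearby.
print(len(ephids_for_day(sk_day1)), "ephemeral IDs for day 1")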
Contact tracing is an essential tool in containing infectious diseases such as COVID-19. Many countries and research groups have launched or announced mobile apps to facilitate contact tracing by recording contacts between users with some privacy considerations. Most of the focus has been on using random tokens, which are exchanged during encounters and stored locally on users' phones. Prior systems allow users to search over released tokens in order to learn if they have recently been in the proximity of a user that has since been diagnosed with the disease. However, prior approaches do not provide end-to-end privacy in the collection and querying of tokens. In particular, these approaches are vulnerable to either linkage attacks by users using token metadata, linkage attacks by the server, or false reporting by users. In this work, we introduce Epione, a lightweight system for contact tracing with strong privacy protections. Epione alerts users directly if any of their contacts have been diagnosed with the disease, while protecting the privacy of users' contacts from both central services and other users, and provides protection against false reporting. As a key building block, we present a new cryptographic tool for secure two-party private set intersection cardinality (PSI-CA), which allows two parties, each holding a set of items, to learn the intersection size of two private sets without revealing the intersection items. We specifically tailor it to the case of large-scale contact tracing where clients have small input sets and the server's database of tokens is much larger.
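A toy version of the DDH-style blinding commonly used for PSI-CA is sketched below: each side raises hashed items to a secret exponent, so after the double exponentiation the client can count matches without learning which tokens matched. The group choice and message flow are illustrative and insecure, and this is not Epione's exact PSI-CA variant, which is further tailored to small client sets against a large server database.

# Toy DDH-style PSI-CA sketch. Illustrative and insecure parameters.
import hashlib, secrets

P = 2**521 - 1                      # toy prime modulus (not a secure group choice)

def h(item: bytes) -> int:
    return int.from_bytes(hashlib.sha512(item).digest(), "big") % P

def psi_ca(client_items, server_items) -> int:
    a = secrets.randbelow(P - 2) + 1            # client's secret exponent
    b = secrets.randbelow(P - 2) + 1            # server's secret exponent

    # Client -> server: blinded tokens H(x)^a (shuffled in a real protocol).
    c_blind = [pow(h(x), a, P) for x in client_items]
    # Server -> client: double-blinded H(x)^(ab), plus its own blinded H(y)^b.
    c_double = {pow(v, b, P) for v in c_blind}
    s_blind = [pow(h(y), b, P) for y in server_items]
    # Client finishes the exponentiation on the server's set and counts matches.
    return sum(1 for v in s_blind if pow(v, a, P) in c_double)

client = [b"tok1", b"tok2", b"tok3"]
server = [b"tok2", b"tok3", b"tok4", b"tok5"]
print(psi_ca(client, server))        # 2: the intersection size only, not the items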