
CIPM: Common Identification Process Model for Database Forensics Field

Published by: Dr Arafat Aldhaqm
Publication date: 2021
Research field: Informatics engineering
Paper language: English





Database Forensics (DBF) is a branch of digital forensics concerned with the identification, collection, reconstruction, analysis, and documentation of database crimes. Different researchers have introduced several identification models to handle database crimes. The majority of the proposed models, however, are unspecific and redundant, which is problematic given the multidimensional nature and high diversity of database systems. Accordingly, using a metamodeling approach, the current study proposes a unified identification model applicable to the database forensics field. The model integrates and harmonizes all existing identification processes into a single abstract model, called the Common Identification Process Model (CIPM). The model comprises six phases: 1) notifying an incident, 2) responding to the incident, 3) identifying the incident source, 4) verifying the incident, 5) isolating the database server, and 6) providing an investigation environment. CIPM was found capable of helping practitioners and newcomers to the forensics domain to control database crimes.
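Since the six phases form a strict sequence, the model can be read as an ordered workflow. Below is a minimal Python sketch of that reading; the phase names come from the abstract, while InvestigationContext, run_identification, and the per-phase handler callbacks are illustrative assumptions, not part of CIPM itself.

    from dataclasses import dataclass, field
    from enum import Enum, auto

    class Phase(Enum):
        # The six CIPM phases, in the order given in the abstract.
        NOTIFY_INCIDENT = auto()
        RESPOND_TO_INCIDENT = auto()
        IDENTIFY_INCIDENT_SOURCE = auto()
        VERIFY_INCIDENT = auto()
        ISOLATE_DATABASE_SERVER = auto()
        PROVIDE_INVESTIGATION_ENVIRONMENT = auto()

    @dataclass
    class InvestigationContext:
        # Hypothetical case state threaded through the phases.
        incident_id: str
        findings: dict = field(default_factory=dict)
        completed: list = field(default_factory=list)

    def run_identification(ctx, handlers):
        # Run the six phases strictly in order; each handler inspects the
        # context and records what it found (hypothetical callback interface).
        for phase in Phase:
            handlers[phase](ctx)
            ctx.completed.append(phase)
        return ctx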




Read also

Byzantine fault-tolerant (BFT) protocols allow a group of replicas to come to consensus even when some of the replicas are Byzantine faulty. There exist multiple BFT protocols that securely tolerate an optimal number of faults $t$ under different network settings. However, if the number of faults $f$ exceeds $t$, then security can be violated. In this paper we mathematically formalize the study of forensic support of BFT protocols: we aim to identify (with cryptographic integrity) as many of the malicious replicas as possible, and in as distributed a manner as possible. Our main result is that the forensic support of BFT protocols depends heavily on minor implementation details that do not affect the protocol's security or complexity. Focusing on popular BFT protocols (PBFT, HotStuff, Algorand), we exactly characterize their forensic support, showing that there exist minor variants of each protocol for which the forensic support varies widely. We show the strong forensic support capability of LibraBFT, the consensus protocol of the Diem cryptocurrency; our lightweight forensic module, implemented on a Diem client, is open-sourced and is under active consideration for deployment in Diem. Finally, we show that forensic support is inherently nonexistent for all secure BFT protocols designed for $2t+1$ replicas communicating over a synchronous network; this impossibility result holds even if one has access to the states of all replicas (including Byzantine ones).
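One concrete forensic primitive behind results like these is equivocation detection: a replica that signs two conflicting votes in the same round has produced cryptographically verifiable evidence against itself. The sketch below illustrates the idea; the Vote fields and the assumption that signatures were already verified are illustrative, and this is not the paper's actual forensic module.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Vote:
        replica: str      # replica identifier (assumed field layout)
        round: int        # consensus round / view number
        block_hash: str   # value voted for
        signature: bytes  # assumed already verified against the replica's key

    def find_equivocators(votes):
        # Return replicas that signed two different values in the same round;
        # the pair of conflicting signed votes is the proof of misbehavior.
        seen = defaultdict(dict)   # replica -> round -> block_hash
        guilty = set()
        for v in votes:
            prev = seen[v.replica].get(v.round)
            if prev is not None and prev != v.block_hash:
                guilty.add(v.replica)
            seen[v.replica][v.round] = v.block_hash
        return guilty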
We present two subtle charge transport problems revealed by the statistics of flat fields. Mark Downing has presented photon transfer curves showing variance dips of order 25% at signal levels around 80% of blooming. These dips appear when the substrate voltage is raised above zero, for a 0 V to 8 V parallel clock swing. We present a modified parallel transfer sequence that eliminates the dip, based on the hypothesis that it is caused by charge spillage from the last line into the second-to-last line. We discuss an experiment to test whether the electrode map is incorrectly reported in the data sheet. A more subtle dip in the variance occurs at signals around 6000 e-. This is eliminated by raising the serial clock high level by a few volts, suggesting the existence of a small structural trap at the parallel-serial interface. Tails above blooming stars are suppressed by using inverted clocking during readout and positive clocking during exposure to maintain the sharpness of the PTC. We show that integrating under three parallel phases, instead of the two recommended, reduces pixel area variations from 0.39% to 0.28%, while also eliminating striations observed along central columns in pixel area maps. We show that systematic line and column width errors at stitching boundaries (~15 nm) are now an order of magnitude smaller than the random pixel area variations.
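For context, the variance statistics quoted above come from photon transfer curves (PTCs), which are typically built from pairs of flat fields at each exposure level. The sketch below shows the standard pair-difference construction under assumed inputs; it is a generic PTC computation, not the authors' pipeline.

    import numpy as np

    def photon_transfer_curve(flat_pairs):
        # flat_pairs: iterable of (flat_a, flat_b) 2-D arrays taken at the
        # same exposure level. Differencing each pair cancels fixed-pattern
        # noise; half the variance of the difference estimates the temporal
        # variance, which is plotted against the mean signal.
        means, variances = [], []
        for a, b in flat_pairs:
            a = a.astype(np.float64)
            b = b.astype(np.float64)
            means.append(0.5 * (a.mean() + b.mean()))
            variances.append(np.var(a - b) / 2.0)
        return np.array(means), np.array(variances)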
Yuezun Li, Xin Yang, Pu Sun (2019)
AI-synthesized face-swapping videos, commonly known as DeepFakes, are an emerging problem threatening the trustworthiness of online information. The need to develop and evaluate DeepFake detection algorithms calls for large-scale datasets. However, current DeepFake datasets suffer from low visual quality and do not resemble the DeepFake videos circulated on the Internet. We present a new large-scale challenging DeepFake video dataset, Celeb-DF, which contains 5,639 high-quality DeepFake videos of celebrities generated using an improved synthesis process. We conduct a comprehensive evaluation of DeepFake detection methods and datasets to demonstrate the escalated level of challenge posed by Celeb-DF.
Metamodeling is used as a general technique for integrating and defining models from different domains. The technique can be applied in diverse application domains, especially for purposes of standardization. The process focuses mainly on identifying the general concepts that exist in various problem domains and their relations, and on resolving the complexity, interoperability, and heterogeneity aspects of different domains. Several diverse metamodeling development approaches have been proposed in the literature, each with its own advantages and disadvantages. The objective of this paper is therefore to provide a comprehensive review of existing metamodeling development approaches and to conduct a comparative study among them, eventually selecting the best approach for metamodel development from the perspective of digital forensics. A sketch of the concept-extraction step common to these approaches follows.
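The sketch below illustrates that shared first step: collecting concept names from several domain models and keeping those that recur as metamodel candidates. The example model sets and the frequency threshold are illustrative assumptions, not taken from any of the reviewed approaches.

    def common_concepts(domain_models, threshold=2):
        # domain_models: list of sets of concept names from individual models.
        # Keep concepts that appear in at least `threshold` models.
        counts = {}
        for model in domain_models:
            for concept in model:
                counts[concept] = counts.get(concept, 0) + 1
        return {c for c, n in counts.items() if n >= threshold}

    models = [
        {"incident", "evidence", "database server", "investigator"},
        {"incident", "evidence", "log file"},
        {"incident", "database server", "report"},
    ]
    print(common_concepts(models))  # concepts shared by at least two models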
IoT devices have been adopted widely in the last decade, enabling the collection of various data from different environments. The collected data are crucial in applications where IoT devices serve critical infrastructure or systems whose failure may have catastrophic results. For such critical applications in particular, data storage poses challenges, since the data may be compromised during storage and their integrity violated without being noticed. In such cases, integrity and data provenance are required in order to detect the source of any incident and to prove it in legal cases if there is a dispute with the involved parties. Blockchain offers excellent opportunities to address these issues, since its distributed structure protects the integrity of the data. However, it comes at a cost: storing a huge amount of data in a public blockchain incurs significant transaction fees. In this paper, we propose a highly cost-effective and reliable digital forensics framework that exploits multiple inexpensive blockchain networks as temporary storage before the data are committed to Ethereum. To reduce Ethereum costs, we utilize Merkle trees, which hierarchically store hashes of the collected event data from IoT devices. We evaluate the approach on popular blockchains such as EOS, Stellar, and Ethereum by presenting a cost and security analysis. The results indicate that we can achieve significant cost savings without compromising the integrity of the data.
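To make the fee-saving step concrete, here is a minimal sketch of computing a Merkle root over collected event records, so that only the 32-byte root needs to go into an Ethereum transaction while any single record can later be proven against it. The event encoding, the duplicate-last-node rule for odd levels, and the commit step are assumptions for illustration, not the paper's exact construction.

    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(events):
        # events: list of byte strings (serialized IoT event records).
        # Hash each record, then hash pairwise up to a single root.
        level = [h(e) for e in events]
        if not level:
            return h(b"")
        while len(level) > 1:
            if len(level) % 2:                 # duplicate last node on odd levels
                level.append(level[-1])
            level = [h(level[i] + level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0]

    root = merkle_root([b"sensor-1:23.4C", b"sensor-2:ok", b"door:open"])
    # Only `root` (32 bytes) goes into the Ethereum transaction, keeping
    # fees low regardless of how many events were collected off-chain.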