
SoK: Exploring the State of the Art and the Future Potential of Artificial Intelligence in Digital Forensic Investigation

Added by Mark Scanlon
Publication date: 2020
Language: English





Multi-year digital forensic backlogs have become commonplace in law enforcement agencies throughout the globe. Digital forensic investigators are overloaded with the volume of cases requiring their expertise, compounded by the volume of data to be processed. Artificial intelligence is often seen as the solution to many big data problems. This paper summarises existing artificial intelligence-based tools and approaches in digital forensics. Automated evidence processing leveraging artificial intelligence-based techniques shows great promise in expediting the digital forensic analysis process while increasing case processing capacities. For each application of artificial intelligence highlighted, a number of current challenges and the future potential impact are discussed.



related research

Xiaoyu Du, Mark Scanlon (2019)
The ever-increasing volume of data in digital forensic investigation is one of the most discussed challenges in the field. Usually, most of the file artefacts on seized devices are not pertinent to the investigation. Manually retrieving suspicious files relevant to the investigation is akin to finding a needle in a haystack. In this paper, a methodology for the automatic prioritisation of suspicious file artefacts (i.e., file artefacts that are pertinent to the investigation) is proposed to reduce the manual analysis effort required. This methodology is designed to work in a human-in-the-loop fashion. In other words, it predicts/recommends that an artefact is likely to be suspicious rather than giving the final analysis result. A supervised machine learning approach is employed, which leverages the recorded results of previously processed cases. The processes of feature extraction, dataset generation, training, and evaluation are presented in this paper. In addition, a toolkit for data extraction from disk images is outlined, which enables this method to be integrated with the conventional investigation process and to work in an automated fashion.
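To make the approach above concrete, the following minimal sketch trains a classifier on hypothetical file-metadata features and ranks unseen artefacts by predicted suspiciousness. The feature set, labels, and use of scikit-learn's RandomForestClassifier are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of supervised file-artefact prioritisation (illustrative only).
# Features, labels, and model choice are assumptions, not the paper's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-file features: [size_kb, entropy, extension_mismatch, path_depth]
X_train = np.array([
    [120.0, 7.9, 1, 6],   # packed executable deep in a user directory
    [ 45.0, 4.2, 0, 2],   # ordinary document
    [300.0, 7.5, 1, 5],   # renamed archive
    [ 10.0, 3.1, 0, 3],   # system log
])
y_train = np.array([1, 0, 1, 0])  # 1 = marked pertinent in previously processed cases

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Score new artefacts and present the highest-ranked files to the investigator;
# the final relevance decision stays with the human analyst (human-in-the-loop).
X_new = np.array([[250.0, 7.8, 1, 4], [20.0, 3.5, 0, 2]])
scores = model.predict_proba(X_new)[:, 1]
for idx in np.argsort(scores)[::-1]:
    print(f"artefact {idx}: suspiciousness score {scores[idx]:.2f}")
```

In such a workflow, the ranked list only guides which artefacts are examined first; it does not replace the investigator's judgement.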
Many future innovative computing services will use Fog Computing Systems (FCS), integrated with Internet of Things (IoT) resources. These new services, built on the convergence of several distinct technologies, need to fulfil time-sensitive functions, provide variable levels of integration with their environment, and incorporate data storage, computation, communications, sensing, and control. There are, however, significant problems to be solved before such systems can be considered fit for purpose. The high heterogeneity, complexity, and dynamics of these resource-constrained systems bring new challenges to their robust and reliable operation, which implies the need for integral resilience management strategies. This paper surveys the state of the art in the relevant fields, and discusses the research issues and future trends that are emerging. We envisage future applications that have very stringent requirements, notably high-precision latency and synchronization between a large set of flows, where FCSs are key to supporting them. Thus, we hope to provide new insights into the design and management of resilient FCSs that are formed by IoT devices, edge computer servers and wireless sensor networks; these systems can be modelled using Game Theory, and flexibly programmed with the latest software and virtualization platforms.
Graph-structured data are an integral part of many application domains, including chemoinformatics, computational biology, neuroimaging, and social network analysis. Over the last two decades, numerous graph kernels, i.e. kernel functions between graphs, have been proposed to solve the problem of assessing the similarity between graphs, thereby making it possible to perform predictions in both classification and regression settings. This manuscript provides a review of existing graph kernels, their applications, software and data resources, and an empirical comparison of state-of-the-art graph kernels.
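As a concrete illustration of the idea, the sketch below implements one of the simplest graph kernels, a vertex-label histogram kernel: each graph is mapped to a vector of node-label counts, and the kernel value is the dot product of those vectors. The kernel choice and the toy node labels are assumptions made for brevity; the surveyed literature covers far more expressive kernels such as Weisfeiler-Lehman or shortest-path kernels.

```python
# Minimal sketch of a vertex-label histogram graph kernel (illustrative only).
from collections import Counter

def vertex_histogram_kernel(labels_g1, labels_g2):
    """k(G1, G2) = <phi(G1), phi(G2)>, where phi counts node labels."""
    h1, h2 = Counter(labels_g1), Counter(labels_g2)
    return sum(h1[label] * h2[label] for label in h1.keys() & h2.keys())

# Node labels of two small molecule-style graphs (hypothetical data).
g1_labels = ["C", "C", "O", "H", "H"]
g2_labels = ["C", "O", "O", "H"]

print(vertex_histogram_kernel(g1_labels, g2_labels))  # 2*1 + 1*2 + 2*1 = 6
```

The resulting kernel values form a kernel matrix that can be plugged into any kernel machine, such as an SVM, for graph classification or regression.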
Recent developments in AI have enabled the expansion of its application to multiple domains, from medical treatment, gaming, and manufacturing to daily business processes. A huge amount of money has been poured into AI research due to its exciting discoveries. Technology giants like Google, Facebook, Amazon, and Baidu are the driving forces in the field today. But the rapid growth and excitement that the technology offers can obscure the impact it has on our society. This short paper gives a brief history of AI and summarizes various social, economic, and ethical issues that are impacting our society today. We hope that this work will provide a useful starting point and perhaps a reference for newcomers and stakeholders in the field.
We present a personal view of the state of the art in turbulence research. We summarize first the main achievements in the recent past, and then point ahead to the main challenges that remain for experimental and theoretical efforts.
