Article

Computer Forensics: The Need for Standardization and Certification.

... Finally, state and federal governments must assume active roles in the oversight and accreditation of credentialing bodies, with measurable results. Meyers and Rogers (2004) identify the following three areas where the computer forensics field needs improvement: the creation of a flexible standard, the qualification of expert witnesses, and standards regarding the analysis, preservation, and presentation of digital evidence. Any standard developed for use in the computer forensics discipline must allow for flexibility, so that it may adapt to continuous changes in technology and the forensic process. ...
... Therefore, in adjudication processes, the courts accept persons as expert witnesses based on their skills and previous professional work experience. While this process has not been challenged thus far, Meyers and Rogers (2004) anticipate that in the future, expert witnesses' qualifications will be more commonly challenged. ...
... The final area identified by the authors as needing improvement is standards regarding the analysis, preservation, and presentation of digital evidence. Meyers and Rogers (2004) state that there should be "rigorous" standards and requirements, along with continuous updates to the forensic process. Currently, the common method used to analyze digital evidence relies mostly on the software and/or hardware an expert uses in the analysis of the evidence; the authors argue that relying solely on software/hardware does not allow experts to understand the digital forensics process fully enough to articulate it to a judge in court proceedings. ...
... The latter has developed, together with its fundamental biological sciences, over several decades (Beebe and Clark, 2005; Palmer, 2001), whereas DF science is still being formed and therefore lags considerably behind the better-developed, underlying Computer Science (CS) (Beebe and Clark, 2005; Carrier and Spafford, 2003). The DF discipline has developed without any of the initial research required to provide an essential, thorough scientific foundation (Valjarevic and Venter, 2015; US-CERT, 2012; Peisert et al., 2008; Meyers and Rogers, 2004; Carrier, 2002). The United States Computer Emergency Readiness Team (US-CERT, 2012) notes, "Because computer forensics is a new discipline, there is little standardization and consistency across the courts and industry." ...
... The United States Computer Emergency Readiness Team (US-CERT, 2012) notes, "Because computer forensics is a new discipline, there is little standardization and consistency across the courts and industry." Meyers and Rogers (2004) take an even stronger view, warning that DF is branded a "junk science" because of the absence of certifications, standards, and peer-reviewed methods. Similarly, Carrier (2002) highlights that issues are raised in considering DF as a science due to the absence of generally agreed-upon standards and procedures. ...
... This shows that despite the fact that the Daubert case was heard in 1993, its influence is still strong in relation to digital evidence, as further demonstrated by the more recent consultation paper issued by the Law Commission for England and Wales, which effectively mimics the Daubert test (Edmond, 2010). However, when applying the Daubert test to cases involving digital forensic tools and techniques, it appears that regarding digital forensics as a science causes some issues, in particular the lack of generally accepted standards and procedures (Carrier, 2002; Meyers & Rogers, 2004). Peisert et al. (2008) suggest a reason for this is that the discipline has developed without the typical initial research that would have provided the sound scientific basis necessary for admitting digital forensic evidence. ...
... Peisert et al. (2008) suggest a reason for this is that the discipline has developed without the typical initial research that would have provided the sound scientific basis necessary for admitting digital forensic evidence. This view has also been strongly expressed by Meyers and Rogers (2004), who warned of digital forensics being labelled 'junk science' due to the lack of certifications, standards or peer-reviewed methods. This view is understandable given that the practice of digital forensics was initially undertaken by practitioners who were not scientists but law enforcement officers, and only more recently has it become a role for IT professionals. ...
... As a result, it is not yet recognised as a formal 'scientific' discipline. (p. 1) Contrary to the contention of Buskirk and Liu (2006), who suggest that digital evidence is automatically presumed to be reliable, we have a situation in which, in the absence of anything better, courts are often using methods that apply to 'classical' science to determine the reliability of objects from digital forensics (Calhoun, 2008; Cheng, 2007; Kenneally, 2005; Kessler, 2010; Limongelli, 2008; Meyers & Rogers, 2004). In relation to this question of evidence reliability, two of Palmer's (2001) six phases of digital forensics relate directly to the acquisition of digital evidence: Preservation and Collection. ...
Article
Full-text available
As with other types of evidence, the courts make no presumption that digital evidence is reliable without some evidence of empirical testing in relation to the theories and techniques associated with its production. The issue of reliability means that courts pay close attention to the manner in which electronic evidence has been obtained and in particular the process in which the data is captured and stored. Previous process models have tended to focus on one particular area of digital forensic practice, such as law enforcement, and have not incorporated a formal description. We contend that this approach has prevented the establishment of generally-accepted standards and processes that are urgently needed in the domain of digital forensics. This paper presents a generic process model as a step towards developing such a generally-accepted standard for a fundamental digital forensic activity–the acquisition of digital evidence.
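The reliability concerns discussed above centre on how data is captured and stored. As a rough illustration of the kind of verifiable acquisition step such a process model would formalise, the following Python sketch (an illustration with assumed file paths, not part of the cited model) images a source and records a SHA-256 digest so that the result can be re-hashed and compared later:

```python
import hashlib

def acquire_image(source_path: str, image_path: str, chunk_size: int = 1 << 20) -> str:
    """Copy a source device/file into an image file, hashing while reading.

    Returns the SHA-256 digest of the acquired data, so the stored image
    can later be re-hashed and compared to demonstrate integrity.
    """
    sha256 = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            sha256.update(chunk)
            dst.write(chunk)
    return sha256.hexdigest()

def verify_image(image_path: str, expected_digest: str, chunk_size: int = 1 << 20) -> bool:
    """Re-hash the stored image and compare it with the acquisition digest."""
    sha256 = hashlib.sha256()
    with open(image_path, "rb") as img:
        for chunk in iter(lambda: img.read(chunk_size), b""):
            sha256.update(chunk)
    return sha256.hexdigest() == expected_digest
```

In practice an acquisition would also involve write-blocking and chain-of-custody documentation; the hash step shown here is only the part that makes the copy independently verifiable.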
... Besides, the functionality of the cloud architecture was not changed, which had the advantage of reducing cost. Consequently, a comparative study of different pieces of research shows other related works [35]-[42] that have addressed the need to incorporate standardization/international standards into digital forensics, as shown in Table 2. ...
... Literature [35, 37], key aspects on the need for standardization: these studies focus on the need for standardized forensic processes in computer forensics in general, owing to the maturity of the forensic discipline. Lack of standardization has been identified as a core problem due to difficulties in interpreting and mandating forensic techniques. ...
Article
Full-text available
An increase in the use of cloud computing technologies by organizations has led to cybercriminals targeting cloud environments to orchestrate malicious attacks. Conversely, this has led to the need for proactive approaches through the use of digital forensic readiness (DFR). Existing studies have attempted to develop proactive prototypes using diverse agent-based solutions that are capable of extracting forensically sound potential digital evidence (PDE). As a way to address this limitation and further evaluate the degree of PDE relevance in an operational platform, this study sought to develop a prototype in an operational cloud environment to achieve DFR in the cloud. The prototype is deployed and executed in cloud instances hosted on OpenStack: the operational cloud environment. The experiments performed in this study show that it is viable to attain DFR in an operational cloud platform. Further observations show that the prototype is capable of harvesting digital data from cloud instances and storing the data in a forensically sound database. The prototype also prepares the operational cloud environment to be forensically ready for digital forensic investigations, without altering the functionality of the OpenStack cloud architecture, by leveraging the ISO/IEC 27043 guidelines on security monitoring.
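The abstract's "forensically sound database" is not specified here; one common way to make stored potential digital evidence tamper-evident is to hash-chain the inserted records. The following Python/SQLite sketch is a minimal illustration of that idea under assumed table and field names, not the authors' prototype:

```python
import hashlib
import json
import sqlite3
import time

def init_store(conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pde ("
        "id INTEGER PRIMARY KEY, collected_at REAL, record TEXT, "
        "prev_hash TEXT, record_hash TEXT)"
    )

def append_record(conn: sqlite3.Connection, record: dict) -> str:
    """Append one harvested record, chaining its hash to the previous row.

    Modifying any stored row later breaks the chain, so tampering becomes
    detectable when the chain is re-verified.
    """
    row = conn.execute(
        "SELECT record_hash FROM pde ORDER BY id DESC LIMIT 1"
    ).fetchone()
    prev_hash = row[0] if row else "GENESIS"
    payload = json.dumps(record, sort_keys=True)
    record_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    conn.execute(
        "INSERT INTO pde (collected_at, record, prev_hash, record_hash) "
        "VALUES (?, ?, ?, ?)",
        (time.time(), payload, prev_hash, record_hash),
    )
    conn.commit()
    return record_hash

# Usage: conn = sqlite3.connect("pde.db"); init_store(conn)
# append_record(conn, {"host": "instance-1", "event": "login", "uid": 1001})
```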
... Most techniques used by forensic analysts to analyse data are not standardised [Meyers, 2004]. As discussed further in Chapter 4, techniques are generally developed which meet the constraints of the team involved in the investigation. ...
... The structure and format of storage media vary between manufacturers, and this affects the methodology used; each device type needs handling in a specific way to ensure the most thorough examination is completed [Meyers, 2004]. If incorrectly identified or handled, data may be lost, and in the worst case the media may become unrecoverable. ...
Thesis
This thesis establishes the evidential value of thumbnail cache file fragments identified in unallocated space. A set of criteria to evaluate the evidential value of thumbnail cache artefacts was created by researching the evidential constraints present in Forensic Computing. The criteria were used to evaluate the evidential value of live system thumbnail caches and thumbnail cache file fragments identified in unallocated space. Thumbnail caches can contain visual thumbnails and associated metadata which may be useful to an analyst during an investigation; the information stored in the cache may provide information on the contents of files and any user or system behaviour which interacted with the file. There is a standard definition of the purpose of a thumbnail cache, but not of its structure or implementation; this research has shown that this has led to some thumbnail caches storing a variety of other artefacts, such as network place names. The growing interest in privacy and security has led to an increase in users attempting to remove evidence of their activities; information removed by the user may still be available in unallocated space. This research adapted popular methods for the identification of contiguous files to enable the identification of single-cluster-sized fragments in Windows 7, Ubuntu, and Kubuntu. Of the four methods tested, none were able to identify each of the classifications without false positive results; this led to the creation of a new approach which improved the identification of thumbnail cache file fragments. After the identification phase, further research was conducted into the reassembly of file fragments, based solely on the potential thumbnail cache file fragments and structural and syntactical information. In both the identification and reassembly phases, image-only file fragments proved the most challenging, presenting a potential area of future research. Finally, this research compared the evidential value of live system thumbnail caches with identified and reassembled fragments. It was determined that both types of thumbnail cache artefacts can provide unique information which may assist with a digital investigation. This research has produced a set of criteria for determining the evidential value of thumbnail cache artefacts; it has also identified the structure and related user and system behaviour of popular operating system thumbnail cache implementations. It has also adapted contiguous file identification techniques to single-fragment identification and developed an improved method for thumbnail cache file fragment identification. Finally, this research has produced a proof-of-concept software tool for the automated identification and reassembly of thumbnail cache file fragments.
... Carving is done on the basis of file signatures; unfortunately, not all file types have a standard footer signature, and therefore locating the end of a file is difficult. But if a file is carved in a forensically sound manner, it is then acceptable in a court of law [7]. Forensic soundness provides assurance that during the investigation the evidence was not destroyed or corrupted. ...
Article
This paper proposes an approach to extract hidden information as sensitive data can be hidden by the criminal in free space or slack space. But there might be cases when there exists no file system metadata information for file recovery. For this purpose, Data Carving techniques are used by forensic examiners. If a file is carved in a forensically sound manner, it is then acceptable in a court of law. Many automated tools exist to carve data out of a hard drive. In this paper, we looked at how to carve data in an old-fashioned way followed by carving data using tools.
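Because carving works from file signatures, a toy example makes the header/footer idea concrete. The Python sketch below carves JPEGs (header bytes FF D8 FF, footer FF D9) out of a raw buffer; production carvers must also handle fragmentation, embedded thumbnails and missing footers, which this sketch deliberately ignores:

```python
# Minimal signature-based carver for JPEG data in a raw disk image.
JPEG_HEADER = b"\xff\xd8\xff"
JPEG_FOOTER = b"\xff\xd9"

def carve_jpegs(data: bytes, max_size: int = 10 * 1024 * 1024) -> list[bytes]:
    carved = []
    pos = 0
    while True:
        start = data.find(JPEG_HEADER, pos)
        if start == -1:
            break
        end = data.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1 or end - start > max_size:
            pos = start + 1  # no plausible footer for this header; move on
            continue
        carved.append(data[start:end + len(JPEG_FOOTER)])
        pos = end + len(JPEG_FOOTER)
    return carved

# Usage: fragments = carve_jpegs(open("disk.img", "rb").read())
```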
... However, standardisation has been debated within the DF community. Some have promoted it to safeguard the credibility of the field and ensure the admissibility of evidence in court (e.g., Grobler, 2012; Guo & Hou, 2018; Marshall & Paige, 2018; Meyers & Rogers, 2004; Zahadat, 2019). Others have been sceptical about standardisation as the best solution to these challenges, due to the cost and effort involved in complying with the standards and the complexity of validation in an environment of rapidly changing technologies (see, e.g., Sommer, 2010), partly since the quality standard does not address the financial viability of a company (Tully et al., 2020). ...
Thesis
Full-text available
The topic of the thesis is the influence of cognitive and human factors on the evidential value of digital traces. Digital evidence is of high importance for solving crime. Therefore, it is essential that digital traces are collected, examined, analysed, and presented in a way that safeguards their evidential value and minimises erroneous or misleading outcomes. A large body of research has been concerned with developing new methods, tools, processes, procedures, and frameworks for handling new technology or novel implementations of technology. In contrast, relatively few empirical studies have examined digital forensics (DF) practice. The thesis contributes to filling this knowledge gap by providing novel insights concerning DF investigative practices during the analysis and presentation stages of the DF process. The thesis draws on theories and research from several scholarly traditions, such as DF, forensic science, police science, and cognitive psychology, as well as social science traditions such as digital criminology and science and technology studies (STS). A mixed-methods approach is applied to explore the research question: How could a better understanding of the DF practitioners' role in constructing digital evidence within a criminal investigation enable mitigation of errors and safeguard a fair administration of justice? The thesis is made up of five articles exploring the research question from different perspectives. The first article aims to bring insights about cognitive bias from the forensic science domain into the DF discipline and discusses their relevance and plausible implications for DF casework. The analysis suggests that cognitive and human factors influence decision-making during the DF process and that there is a risk of bias in all its stages. The second article applies an experimental design (the DF experiment). The article examines two aspects of DF decision-making: first, whether DF practitioners' decision-making is biased by contextual information, and second, whether those who receive similar information produce consistent results (between-practitioner reliability). The results indicate that the context influenced the number of traces discovered by the DF practitioners and showed low between-practitioner reliability for those receiving similar contexts. The third article applies a qualitative lens to examine how the low between-practitioner reliability materialises in the DF reports and whether and how the trace descriptions influence the evidential value in a legal context. The article demonstrates how the DF practitioners interpret the same traces differently and develops the concept of "evidence elasticity" to describe the interpretative flexibility of digital traces. The article shows how the evidence elasticity of digital traces enables the construction of substantially different narratives related to the criminal incident, and how this sometimes results in misinformation with the propensity to mislead actors in the criminal justice chain. The fourth article is based on a survey of the DF practitioners' accounts of their investigative practice during the DF experiment. The article explores how they handled contextual information, examiner objectivity, and evidence reliability during the analysis of an evidence file. The results show that many started the analysis with a single hypothesis in mind, which introduces a risk of a one-sided investigation. Approximately a third of the DF participants did not apply any techniques to safeguard examiner objectivity or control evidence reliability. The fifth article examines the DF practitioners' reporting and documentation practices. It centres on the conclusion types, the content relevant to the evidential value, and the applied (un)certainty expressions. The results were compared to a study of eight forensic science disciplines. The analysis showed that the DF practitioners typically applied categorical conclusions or strength-of-support conclusion types. They used a plethora of certainty expressions but lacked an explanation of their meaning or reference to an established framework. However, the most critical finding was substantial deficiencies in documentation practices for content essential for enabling audit of the DF investigative process and results, a challenge which seemed to be shared with other forensic science disciplines. https://www.duo.uio.no/handle/10852/97851
... The Daubert criteria are widely accepted in the classical fields, like medical forensics. It can also be, and is, applied in the much younger field of IT-forensics (see e.g., [16,17]). It has to be admitted that the field of media forensics, which is the focus of this thesis, is still lacking maturity in this regard. ...
Article
Full-text available
Academic research in media forensics mainly focuses on methods for the detection of the traces or artefacts left by media manipulations in media objects. While the resulting detectors often achieve quite impressive detection performances, when tested under lab conditions, hardly any of those have yet come close to the ultimate benchmark for any forensic method, which would be courtroom readiness. This paper tries first to facilitate the different stakeholder perspectives in this field and then to partly address the apparent gap between the academic research community and the requirements imposed onto forensic practitioners. The intention is to facilitate the mutual understanding of these two classes of stakeholders and assist with first steps intended at closing this gap. To do so, first a concept for modelling media forensic investigation pipelines is derived from established guidelines. Then, the applicability of such modelling is illustrated on the example of a fusion-based media forensic investigation pipeline aimed at the detection of DeepFake videos using five exemplary detectors (hand-crafted, in one case neural network supported) and testing two different fusion operators. At the end of the paper, the benefits of such a planned realisation of AI-based investigation methods are discussed and generalising effects are mapped out.
... The qualifications and skills of expert witnesses are also a serious issue. Meyers and Rogers (2004) question whether one can be considered an expert based on the ability to use a tool or software package, but without the ability to clearly define how the tool works or without reviewing the source code. In many cases, forensic experts may apply a particular tool not because it is the most effective tool but because it is available, cheap, and familiar to the expert (Precilla, Dimpe & Okuthe, 2017). ...
Article
Full-text available
The advancement of Information and Communication Technologies (ICT) opens new avenues and ways for cybercriminals to commit crime. The primary goal of this paper is to raise awareness regarding gaps that exist with regards to Nigeria's capabilities to adequately legislate, investigate and prosecute cases of cybercrime. The major source of cybercrime legislation in Nigeria is an act of the National Assembly, which is largely symbolic rather than full and active legislation. In pursuing these avenues of inquiry, the authors seek to identify systemic impediments which hinder law enforcement agencies, prosecutors, and investigators from properly carrying out their duties as expected.
... These are incredibly important for both organizations and governments eager to improve in the field of information security. Nowadays, managerial tools like an information security management system are well-known constructs, and their importance for company success is undoubted, as they leverage overall security and enable universally understandable auditing and measuring of information security [12][13][14]. Standardization is an important aspect of technology development. Information technologies are facilitating process standardization and process control and creating business value. ...
Article
Full-text available
Since the introduction of Bitcoin, the term "blockchain" has attracted many start-ups and companies over the years, especially in the financial sector. However, the technology is evolving faster than standardization frameworks. This has left the industry in the position of having to use this emerging technology without being backed by any international standards organization, either for the technology itself or for a blockchain-specific information security framework. In times of the General Data Protection Regulation and growing international trade conflicts, protecting information is more relevant than ever. Standardization of blockchains is an appeal to raise the development of information technologies to the next level. Therefore, this paper provides an overview of standardization organizations' publications about blockchains/distributed ledger technologies, a set of comparison criteria for future work, and a comparison of the existing standards work itself. With that information, aligning to existing standardization efforts becomes easier, and it might even be possible to create frameworks where there are none at the moment.
... As a result, it is likely that peer review approaches are divergent between organisations suggesting that the quality and robustness of such peer reviews may also vary. Ineffective peer review in DF has been noted for over 15 years (Meyers and Rogers, 2004) and if the field is to secure its status as an important and reliable forensic science, it must take steps towards rigorous peer review procedural development. ...
Article
Full-text available
The importance of peer review in the field of digital forensics cannot be overstated, as it often forms the primary, and sometimes only, form of quality assurance an organisation will apply to its practitioners' casework. Whilst there is clear value in the peer review process, it remains an area which is arguably undervalued and under-researched, where little academic and industrial commentary can be found describing best-practice approaches. This work forms the first of a two-part series discussing why the digital forensics discipline and its organisations should conduct peer review in their laboratories, what they should review as part of this process, and how this should be undertaken. Here in part one, a critical review of the need for peer review is offered, along with a discussion of the limitations of existing peer review mechanisms. Finally, the 'Peer Review Hierarchy' is offered, outlining the seven levels of peer review available for reviewing practitioner findings.
... [59] Although valid warrants have always been generally required by the courts, there are exceptions that allow law enforcement to engage in warrantless searches. The exceptions include: Detention Short of Arrest: Stop-and-Frisk [60]; Search Incident to Arrest [61]; Vehicular Searches [62]; Vessel Searches; Consent Searches; Border Searches; "Open Fields" [63]; Plain View [64]; Public Schools [65]; Prisons and Regulation of Probation [66]; and Drug Testing [67]. In United States v. Schesso, 730 F.3d 1040 (9th Cir. ...
... Digital forensics is a discipline whose main objective is the application of computer tools in legal proceedings or official investigations of some kind. It is used by experts to analyze, preserve and produce data stored on both volatile and non-volatile media, contributing to the veracity of digital evidence that is legally provided [6][7][8]. One of the fields of this discipline is audio forensics, where the aim is to acquire, analyze, or evaluate audio recordings which may be presented as admissible evidence in some official venue [9][10][11]. ...
Article
Full-text available
The verification of the integrity and authenticity of multimedia content is an essential task in the forensic field, in order to make digital evidence admissible. The main objective is to establish whether the multimedia content has been manipulated with significant changes to its content, such as the removal of noise (e.g., a gunshot) that could clarify the facts of a crime. In this project we propose a method to generate a summary value for audio recordings, known as a hash. Our method is robust, which means that if the audio has been modified slightly (without changing its significant content) with perceptual manipulations such as MPEG-4 AAC, the hash value of the new audio is very similar to that of the original audio; on the contrary, if the audio is altered and its content changes, for example with a low-pass filter, the new hash value moves away from the original value. The method starts with the application of MFCC (Mel-frequency cepstrum coefficients) and dimensionality reduction through principal component analysis (PCA). The reduced data is encrypted using as inputs two values from a particular binarization system based on the Collatz conjecture. Finally, a robust 96-bit code is obtained, which varies little when perceptual modifications such as compression or amplitude modification are made to the signal. According to experimental tests, the BER (bit error rate) between the hash value of the original audio recording and that of the manipulated audio recording is low for perceptual manipulations, i.e., 0% for FLAC and re-quantization, 1% on average for volume (−6 dB gain), and less than 5% on average for MPEG-4 and resampling (using the FIR anti-aliasing filter); but more than 25% for non-perceptual manipulations such as low-pass filtering (3 kHz, fifth order), additive noise, cutting and copy-move.
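As a rough sketch of the MFCC-then-PCA-then-binarisation pipeline described in the abstract (omitting the Collatz-based encryption stage, and assuming parameters such as 20 MFCCs and a 96-bit output), one could write the following in Python with librosa and scikit-learn:

```python
import numpy as np
import librosa
from sklearn.decomposition import PCA

def audio_hash(path: str, n_bits: int = 96) -> np.ndarray:
    """Perceptual-hash sketch: MFCC -> PCA -> median binarisation."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)   # shape (20, frames)
    # Project each frame's 20 coefficients onto one principal component.
    reduced = PCA(n_components=1).fit_transform(mfcc.T).ravel()
    # Sample a fixed number of frames, then threshold at the median.
    idx = np.linspace(0, len(reduced) - 1, n_bits).astype(int)
    sampled = reduced[idx]
    return (sampled > np.median(sampled)).astype(np.uint8)

def bit_error_rate(h1: np.ndarray, h2: np.ndarray) -> float:
    """Fraction of differing bits between two equal-length hashes."""
    return float(np.mean(h1 != h2))

# Usage: bit_error_rate(audio_hash("original.wav"), audio_hash("suspect.wav"))
```

A low BER between two recordings then suggests perceptual-only changes, while a high BER suggests content manipulation, mirroring the thresholds reported in the abstract.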
... II. RELATED WORK Meyers and Rogers [5] examined the need for standardisation and certification by analysing federal and state court cases (criminal and civil). They looked at some issues within the legal system, which include search and seizure, expert qualifications, and analysis and preservation. ...
Conference Paper
A great deal of research has been conducted with regard to digital forensic investigations. The investigation aspect has been the main focus, however, without a proper and adaptable framework that can accommodate the technological evolution and changes that have made some of the existing digital forensic requirements frameworks obsolete. This study therefore proposes the development of a standard that specifies the requirements that must be considered in an investigation. Derived through a literature review methodology, the requirements include the investigation framework, skills, legal aspects, and how evidence must be handled. This standard can be used as a reference point that governs how a forensic investigation should be conducted. The generic digital forensic requirements developed are simple, adaptable, and easy to understand.
... The demand for computerized forensic tool testing is high [3,28]. Enhancing the value of digital forensics research, which builds upon itself, requires a clear strategy; otherwise this research will fail to meet market expectations. ...
Article
Full-text available
Computer forensic techniques are important for the prevention, detection, and investigation of electronic crime. Computer forensic investigators need computer forensic tools to produce reliable results that meet legal requirements and are acceptable in the courts. Most of these tools are closed-source, making the software a black box for testing purposes. This paper illustrates a different black-box testing method for evaluating computer forensic tools based on functional scenarios.
... Automating the digital forensics process of course has its own challenges. James et al. (2013) and Meyers and Rogers (2004) caution that automation of the digital forensics process should not let the forensics profession be "dumbed down" by expert investigators relying on automation more than their own knowledge. Instead, they suggest that it is more important that untrained investigators conduct their investigations at the level of expert investigators. ...
Article
Full-text available
Open source software tools designed for disk analysis play a critical role in today's forensic investigations. These tools typically are onerous to use and rely on expertise both in investigation techniques as well as in the tools and disk structures. In previous work we presented the design and initial development for a toolkit that can be used as an automated assistant for forensic investigations. In this paper, we expand on previous work and illustrate our advanced automated disk investigation toolkit (AUDIT) which has been substantially improved and now uses a dynamic knowledge base and database. It also now supports reporting and inference functions. AUDIT can support the investigative process by handling the core IT expertise including choice and operational sequence of tool use as well as their proper configuration. Its capabilities as an intelligent digital assistant are evaluated through a series of tests comparing it against standard benchmark disk images as well as its support for a human investigator.
... The case for the requirement of an international standard in digital forensics has been well made (Barbara, 2005; Jones, 2004; Meyers & Rogers, 2004; Schwerha, 2008; Yasinsac et al., 2003; Young, 2007). The second consultation draft of the Forensic Science Regulator's Codes of Practice and Conduct states that "The forensic examination of computers and telephones is a common aspect of police investigations leading to the production of intelligence and evidence. ...
Article
Full-text available
The numerous advantages offered by cloud computing has fuelled its growth and has made it one of the most significant of current computing trends. The same advantages have created complex issues for those conducting digital forensic investigations. Digital forensic investigators rely on the ACPO (Association of Chief Police Officers) or similar guidelines when conducting an investigation, however the guidelines make no reference to some of the issues presented by cloud investigations. This study investigates the impact of cloud computing on ACPO’s core principles and asks whether these principles can still be applied in a cloud investigation and the challenges presented thereof. We conclude that the ACPO principles can generally be upheld but that additional precautions must be taken throughout the investigation.
... However, this creates problems in the case of cross-border crimes, since digital forensic evidence collected in one country may need to be presented in the courts of other countries. [21] Ever since the first Digital Forensic Research Workshop (DFRWS) in 2001, the need for a standard digital forensic investigation framework has been recognised and emphasised, yet to date there has been little progress in developing a generally accepted methodology. ...
... However, this creates issues when cross-border crimes are committed, since digital forensic evidence acquired in one country may need to be presented in the courts of another (Meyers, Rogers, 2004). One result of the diverse approaches to collection and analysis of digital forensic evidence is that it has become increasingly difficult to show why the process used in any particular case is reliable, trustworthy and accurate (Cohen, 2008). Since the first Digital Forensic Research Workshop (DFRWS) in 2001, the need for a standard digital forensics framework has been understood, yet there has been little progress on one that is generally accepted. ...
Article
Full-text available
Given the omnipresence of digital evidence, it is the rare crime that does not have some associated data stored and transmitted using computer systems. Despite its diffusion, few people are well versed in the evidentiary, technical, and legal issues related to digital evidence, and as a result, digital evidence is often overlooked, collected incorrectly, or analyzed ineffectively. This article presents the basic steps of the digital evidence handling process, based on ISO/IEC 27037, the DFRWS model and best practices from other professional sources, which can be abstractly defined to produce a model that is not dependent on a particular technology or electronic crime.
... integrating commonly used open source tools using that dynamically updated knowledge base and database; and providing inference reporting about the logic of how certain tools are automatically used and why. References [8] and [11] caution that the digital forensics process should not be "dumbed down" by automation, with expert investigators relying on automation more than their own knowledge. Our goal for AUDIT is to ensure that this does not happen. ...
Conference Paper
Full-text available
Open source software tools designed for disk analysis play a critical role in today's forensic investigations. These tools typically are onerous to use and rely on expertise both in investigation techniques as well as in the tools and disk structures. In previous work we presented the design and initial development for a toolkit that can be used as an automated assistant for forensic investigations. In this paper, we expand on previous work and illustrate our advanced automated disk investigation toolkit (AUDIT) which has been substantially improved and now uses a dynamic knowledge base and database. It also now supports reporting and inference functions. AUDIT can support the investigative process by handling the core IT expertise including choice and operational sequence of tool use as well as their proper configuration. Its capabilities as an intelligent digital assistant are evaluated through a series of tests comparing it against standard benchmark disk images as well as its support for a human investigator.
... Information encryption technology. Information encryption encodes information using a mathematical algorithm together with key information, producing output that is very difficult to understand without the key [9]. It can give electronic commerce information confidentiality, integrity and authenticity while it is being transmitted, exchanged and stored. ...
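A modern way to obtain the confidentiality, integrity and authenticity properties mentioned above with a single primitive is authenticated encryption. The Python sketch below uses AES-GCM from the `cryptography` library to illustrate the general technique; it is not drawn from the cited text, and the sample message is invented:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_message(key: bytes, plaintext: bytes, associated_data: bytes):
    """AES-GCM provides confidentiality plus an integrity/authenticity tag."""
    nonce = os.urandom(12)  # must be unique per message under a given key
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, associated_data)
    return nonce, ciphertext

def decrypt_message(key: bytes, nonce: bytes, ciphertext: bytes,
                    associated_data: bytes) -> bytes:
    """Raises InvalidTag if the ciphertext or associated data was altered."""
    return AESGCM(key).decrypt(nonce, ciphertext, associated_data)

# Usage:
key = AESGCM.generate_key(bit_length=256)
nonce, ct = encrypt_message(key, b"order #123: 2 units", b"order-header")
assert decrypt_message(key, nonce, ct, b"order-header") == b"order #123: 2 units"
```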
... Automating the digital forensics process of course has its own challenges. It is cautioned in [52] and [62] that automation of the digital forensics process should not let the forensics profession be "dumbed down" by expert investigators relying on automation more than their own knowledge. Instead, they suggest that it is more important that untrained investigators conduct their investigations at the level of expert investigators. ...
Thesis
Full-text available
Software tools designed for disk analysis play a critical role today in digital forensics investigations. However, these digital forensics tools are often difficult to use, usually task specific, and generally require professionally trained users with IT backgrounds. The relevant tools are also often open source requiring additional technical knowledge and proper configuration. This makes it difficult for investigators without some computer science background to easily conduct the needed disk analysis. In this dissertation, we present AUDIT, a novel automated disk investigation toolkit that supports investigations conducted by non-expert (in IT and disk technology) and expert investigators. Our system design and implementation of AUDIT intelligently integrates open source tools and guides non-IT professionals while requiring minimal technical knowledge about the disk structures and file systems of the target disk image. We also present a new hierarchical disk investigation model which leads AUDIT to systematically examine the disk in its totality based on its physical and logical structures. AUDIT's capabilities as an intelligent digital assistant are evaluated through a series of experiments comparing it with a human investigator as well as against standard benchmark disk images.
Chapter
The history of digital forensics and digital evidence is fascinating. For this discussion of the history of digital forensics and digital evidence, we have identified six time periods: (1) Foundation (pre-1980, 1980–1990), (2) Pre-automated Tools & Grassroot efforts (1991–2000), (3) Automated Tools/Acquisition (2001–2010), (4) Analysis & Examination (2011–2015), (5) Meaning & Context (2016–2019), (6) Autonomous Tools (2020-Present). These periods or eras were selected based on the clustering of common operational, legal, and scientific needs and demands. While we use the six eras as a framework for our discussion, these are conceptual and not meant to indicate actual hard demarcation points.
Chapter
Criminal law might sometimes be perceived to be at the margins of the automation process that involves increasing sectors of our society. While the expansion of automated-driven cars is by now an established fact, the technological upgrade of criminal justice mechanisms still tends to evoke sci-fi images and Minority Report-style dystopian scenarios for non-specialists. However, the high variety of applications that can already be counted in this domain, and the acceleration of this process in the last few years, clearly speaks for a tangible expansion of AI technology in this field as well. In particular, against some more-established scenarios, especially related to phenomena of so-called predictive policing, Multi-Agent Systems (MAS) today open the perspective of a much deeper involvement of automated technologies in the very shaping of investigative proceedings. The chapter offers an analysis of such potential, both with respect to their possible contribution to the efficiency of investigations (in their preventive and repressive dimension), and to avoiding or reducing certain negative biases typical of "purely human" investigation processes, first of all the tunnel-vision effect.
Article
Full-text available
Crime Scene Investigation is the backbone of any criminal investigation. Expert reports generated by Crime Scene Investigators are based on scientific evidence and are reported in a standard form. The content, when published, should be reviewed rigorously by peers. The expert report is a document detailing how the investigation of a crime scene was performed. Collecting and processing evidence does not only entail identifying, collecting and storing evidence for later analysis; it is a step-by-step process that is well structured in a document called the Standard Operating Procedure (SOP) for Crime Scene Investigators. The SOP is a complex document and sometimes overwhelmingly technical for the uninitiated, but if understood it can be a good guide for law practitioners to determine if a piece of evidence should be included in a case. This article details the SOP, which entails groundwork before entering a crime scene, processing of a crime scene, and wrapping up of the crime scene investigation. At each step, this paper looks at how the SOP is adopted in current practice, its weaknesses, and suggested improvements that should be adopted to ensure a final quality expert report.
Article
This paper investigates whether computer forensic tools (CFTs) can extract complete and credible digital evidence from digital crime scenes in the presence of file system anti-forensic (AF) attacks. The study uses a well-established six-stage forensic tool testing methodology based on black-box testing principles to carry out experiments that evaluate four leading CFTs for their potential to combat eleven different file system AF attacks. Results suggest that only a few AF attacks are identified by all the evaluated CFTs, while most of the attacks considered by the study go unnoticed. These AF attacks exploit basic file system features, can be executed using simple tools, and even attack CFTs to accomplish their task. These results imply that evidence collected by CFTs in digital investigations is not complete and credible in the presence of AF attacks. The study suggests that practitioners and academicians should not rely absolutely on CFTs for evidence extraction from a digital crime scene, highlights the implications of doing so, and makes many recommendations in this regard. The study also points towards the immediate and aggressive research efforts that are required in the area of computer forensics to address the pitfalls of CFTs.
Chapter
This chapter primarily focuses on recommendations, suggestions, and directives from the legal systems on matters related to investigation of software copyright infringement and then presents them as positive contributory gestures by the legal systems across the world. Samples are taken from various laws, judicial suggestions and recommendations, and legal directives on copyright in order to discuss the ways in which these laws, recommendations, and directives can add credibility to the entire forensic procedure as well as value to the final forensic answers. These samples address judicial recommendations on a variety of software copyright issues such as software authorship, copyright protection of various software elements (including literal and non-literal software parts), the constitution of “substantial part,” the interpretation of software ideas, forensic exclusion policies of various unprotectable elements, “mining,” etc. The chapter concludes stressing the importance of imparting extensive cyber forensic education to judicial officers for making them fit not only to take intelligent judicial decisions but also to put forward wise judicial recommendations on software copyright infringement cases.
Chapter
This chapter covers the evolution of cyber forensics from the 1980s, when computer forensics came into existence after the personal computer became a viable option for consumers, to the current era. The formation of digital forensics is also discussed here, as is the formation of cyber forensic investigation agencies. The cyber forensic life cycle and its related phases are discussed in detail. The role of international organizations on computer evidence is discussed, with emphasis on the Digital Forensic Research Workshop (DFRWS), the Scientific Working Group on Digital Evidence (SWGDE), and the involvement of chief police officers. Authenticity-, accuracy-, and completeness-related pieces of evidence are also discussed. Most importantly, the chapter discusses cyber forensics data.
Article
In the field of digital forensics it is crucial for any practitioner to possess the ability to make reliable investigative decisions which result in the reporting of credible evidence. This competency should be considered a core attribute of a practitioner's skill set, and it is often taken for granted that all practitioners possess this ability; in reality this is not the case. A lack of dedicated research and formalisation of investigative decision-making models to support digital forensics practitioners is an issue, given the complexity of many digital investigations. Often, the ability to make forensically sound decisions regarding the reliability of any findings is arguably an assumed trait of the practitioner, rather than a formally taught competency. As a result, the digital forensic discipline is facing increasing scrutiny with regards to the quality and validity of the evidence its practitioners are producing. This work offers the Digital Evidence Reporting and Decision Support (DERDS) framework, designed to help the practitioner assess the reliability of their 'inferences, assumptions or conclusions' in relation to any potentially evidential findings. The structure and application of the DERDS framework is discussed, demonstrating the stages of decision making a practitioner must undergo when evaluating the accuracy of their findings, whilst also recognising when content may be deemed unsafe to report.
Thesis
Full-text available
With consideration to how information technology has become embedded in our everyday lives, the activities of computer forensic specialists are becoming increasingly important in the field of criminal proceedings. The isolated specialist fields of previous times are becoming increasingly overlapped and the borders between them are often blurred. Currently, and even more so in the near future, computer forensic specialists are becoming essential in criminal proceedings in almost all forensic specialist fields. Those operating in the specialist field are only able to meet this challenge if their work, their profession has an appropriate scientific background. This basically includes knowledge of the prevailing legislative environment, use of internationally recognised methods and procedures, knowledge of the field’s international and domestic standards, and the use of the validated software and hardware components required for the implementation of the methods and procedures. The present study deals with each of the basic conditions listed, with the emphases determined by the author, which is manifested in the length and degree of detail of the section on the topic under discussion. Therefore, the essay presents merely a momentary picture of the current, yet changing, legislative environment, leaving space for the later insertion of the legislation to be finalised in the near future (e.g. the new act on criminal procedures), also providing a short historical and international outlook. The focus of this work is the specialist work of the computer forensic specialist and its scientific basis, which is also the most extensive independent part of the essay. It is here that the origins of the specialist field and the history of its science are discussed, as is its position within forensics and within the interdisciplinary field of other, larger areas of science. An investigation into the actual structure of the specialist field was performed via empirical research. The sources for this were the author’s own records of some three hundred cases, which, in addition to the legal requirements, have been supplemented with data assisting the purpose of the research and made suitable for detailed analysis. Although due to the features described above the research cannot be deemed to be representative, it still serves to identify the emphasis within the field, which helps to reveal the focus of the section on methodology.
Conference Paper
Based on cloud computing characteristics such as resource sharing and distributed storage, and on an analysis of the difficulties that cloud computing environments create for digital forensics, this paper proposes new digital forensics methods and a new digital forensics architecture for cloud-based platforms, in order to meet rapid forensics needs in the era of cloud computing and to deal with problems of effectiveness, usefulness, depth, real-time operation and reliability.
Chapter
Cloud computing has become one of the most game changing technologies in the recent history of computing. It is gaining acceptance and growing in popularity. However, due to its infancy, it encounters challenges in strategy, capabilities, as well as technical, organizational, and legal dimensions. Cloud service providers and customers do not yet have any proper strategy or process that paves the way for a set procedure on how to investigate or go about the issues within the cloud. Due to this gap, they are not able to ensure the robustness and suitability of cloud services in relation to supporting investigations of criminal activity. Moreover, both cloud service providers and customers have not yet established adequate forensic capabilities that could assist investigations of criminal activities in the cloud. The aim of this chapter is to provide an overview of the emerging field of cloud forensics and highlight its capabilities, strategy, investigation, challenges, and opportunities. This paper also provides a detailed discussion in relation to strategic planning for cloud forensics.
Book
This book demonstrates the use of a wide range of strategic engineering concepts, theories and applied case studies to improve the safety, security and sustainability of complex and large-scale engineering and computer systems. It first details the concepts of system design, life cycle, impact assessment and security to show how these ideas can be brought to bear on the modeling, analysis and design of information systems with a focused view on cloud-computing systems and big data analytics. This informative book is a valuable resource for graduate students, researchers and industry-based practitioners working in engineering, information and business systems as well as strategy.
Conference Paper
Digital forensics is a fast-evolving field of study in contemporary times. One of the challenges of forensic analysis is the quality of evidence captured from computing devices and networks involved in a crime. The credibility of forensic evidence is dependent on the accuracy of established timelines of captured events. Despite the rising orders of magnitude in data volume captured by forensic analysts, the reliability and independence of the timing data source may be questionable due to the underlying network dynamics and the skew in the large number of intermediary system clocks that dictate packet time stamps. Through this paper, we propose a mechanism to verify the accuracy of forensic timing data through collaborative verification of forensic evidence obtained from multiple third party servers. The proposed scheme does analysis of HTTP response headers extracted from network packet capture (PCAP) files and validity testing of third party data through the application of statistical methods. We also develop a proof of concept universal time agreement protocol to independently verify timestamps generated by local logging servers and to provide a mechanism that may be adopted in digital forensics procedures.
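To make the header-based verification idea concrete, the sketch below compares third-party HTTP `Date` headers against the local timestamps recorded when the responses were captured, and flags the local clock when the median offset is large. It is an assumption-laden illustration rather than the authors' protocol, and it assumes the headers and capture times have already been extracted from the PCAP file:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
from statistics import median

def clock_offsets(observations: list[tuple[str, datetime]]) -> list[float]:
    """Offsets in seconds between server Date headers and local capture times."""
    offsets = []
    for date_header, local_ts in observations:
        server_ts = parsedate_to_datetime(date_header)
        offsets.append((local_ts - server_ts).total_seconds())
    return offsets

def local_clock_suspicious(offsets: list[float], tolerance_s: float = 5.0) -> bool:
    """Flag the local logging clock when its median disagreement with
    independent servers exceeds the tolerance (the Date header has 1 s
    resolution, so the tolerance absorbs rounding and network delay)."""
    return abs(median(offsets)) > tolerance_s

# Usage (capture timestamps must be timezone-aware):
obs = [("Wed, 21 Oct 2015 07:28:00 GMT",
        datetime(2015, 10, 21, 7, 28, 2, tzinfo=timezone.utc))]
print(clock_offsets(obs), local_clock_suspicious(clock_offsets(obs)))
```

Taking the median across many independent servers limits the influence of any single skewed third-party clock.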
Article
Full-text available
Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
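The extension described above builds on the classic architecture-based formulation in which each component is a state of a discrete-time Markov chain and system reliability is the probability of reaching a successful exit. The following Python sketch shows that standard (Cheung-style) formulation with invented toy numbers; it is not the paper's COSMIC-FFP extension:

```python
import numpy as np

def system_reliability(P: np.ndarray, R: np.ndarray, exit_p: np.ndarray) -> float:
    """Architecture-based reliability from a component transition model.

    P[i, j]   : probability control transfers from component i to component j
    R[i]      : probability component i executes correctly
    exit_p[i] : probability execution terminates successfully after i
    (each row must satisfy P[i].sum() + exit_p[i] == 1)

    A component failure absorbs the chain in a failure state, so system
    reliability is the probability of reaching the success exit from
    component 0: solve (I - diag(R) P) x = R * exit_p and take x[0].
    """
    n = len(R)
    Q = np.diag(R) @ P          # survive component i, then transfer to j
    b = R * exit_p              # survive component i, then exit successfully
    x = np.linalg.solve(np.eye(n) - Q, b)
    return float(x[0])

# Toy usage: three components in sequence, each 99% reliable.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
R = np.array([0.99, 0.99, 0.99])
exit_p = np.array([0.0, 0.0, 1.0])
print(system_reliability(P, R, exit_p))   # ~0.9703, i.e. 0.99 ** 3
```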
Conference Paper
Cyber forensics investigations and the associated training/education are relatively new. The nature of cyber forensics is complex, requiring multidisciplinary skills, knowledge and abilities. Although training/education programs have proliferated, from a few days' workshop to a Master's degree in Cyber Forensics, the world lacks cyber forensics investigators due to several factors. Consequently, this paper focuses on Competency Identification Requirements. The majority of the respondents (an average of 95.66%) agreed that various stakeholders should carry out the Competency Identification stage, while 86.7% of the respondents supported the characteristics of Competency Identification. Moreover, there is a significant relationship between Competency Identification and Cyber Forensics Investigator Proficiency.
Conference Paper
This paper reports experiences and lessons learned in the process of developing and implementing an undergraduate curriculum for digital forensics over the last three years at the University of Illinois at Urbana-Champaign. The project addresses the challenges of developing a higher-education standardized curriculum for digital forensics that meets the needs of the digital forensics community. The curriculum provides degree options and considers the growing employability of digital forensics students in an increasing range of jobs. The approach builds on the multidisciplinary nature of the field. The findings include a curriculum model, detailed course content, exams, and an evaluation package for measuring how students respond to the courses. This paper summarizes the model, results, challenges, and opportunities.
Article
Full-text available
Forensic Computing is a new and quickly developing field. It is in the process of becoming an academic discipline or sub-discipline with all the features from full undergraduate and postgraduate course provision to conferences and journals. An important question in this process of turning into an established discipline is whether it will coincide with the recognition of the graduates as professionals. This paper hopes to stimulate the debate as to whether forensic computing is or should be a discipline. In order to approach this question, the paper will discuss the concept of forensic computing including the most salient topics of interest and the problems it has to contend with. This will lead to a discussion of the notion of professions and professionals, which will be expanded with a view to the debate on computing as a profession. Based on these considerations the paper will conclude by asking whether there is merit in promoting the debate on the status of forensic computing as a profession above and beyond the arguments already rehearsed for computing in general.
Conference Paper
Contrary to traditional crimes, for which there exist deep-rooted standards, procedures and models upon which courts of law can rely, there are no formal standards, procedures or models for digital forensics to which courts can refer. Although there are already a number of digital investigation process models, these tend to be ad-hoc procedures. In order for a case to prevail in a court of law, the processes followed to acquire digital evidence and the terminology utilised must be thorough and generally accepted in the digital forensic community. The proposed novel process model is aimed at addressing both the practical requirements of digital forensic practitioners and the needs of courts for a formal computer investigation process model which can be used to process digital evidence in a forensically sound manner. Moreover, unlike existing models which focus on one aspect of the process, the proposed model describes the entire lifecycle of a digital forensic investigation.
Article
The rapid evolution of information technology has led to devices being used in criminal activities. Criminals have been using the Internet to distribute a wide range of illegal materials globally, making tracing difficult for the purpose of initiating the digital investigation process. Forensic digital analysis is unique and inherently mathematical, and generally comprises more data from an investigation than is present in other types of forensic investigations. Providing appropriate and sufficient security measures has become a difficult job due to the large volume of data and the complexity of the devices, making the investigation of digital crimes even harder. Data mining and data fusion techniques have been used as useful tools for detecting digital crimes. In this study, we introduce a forensic classification problem and apply the ID3 decision tree learning algorithm for supervised exploration of the forensic data, which also enables visualization and reduces the complexity involved in the digital investigation process.
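As an illustration of ID3-style supervised exploration of forensic data, the sketch below uses scikit-learn's decision tree with the entropy criterion, which selects splits by information gain as ID3 does (scikit-learn implements CART, so this is an approximation); the per-file features and labels are invented for the example:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical per-file features gathered during an investigation:
# [size_kb, byte_entropy, extension_mismatch (0/1), in_system_dir (0/1)]
X = [
    [120, 7.9, 1, 0],
    [430, 7.8, 1, 0],
    [ 15, 3.1, 0, 1],
    [ 80, 4.2, 0, 1],
    [300, 7.7, 1, 1],
    [ 22, 2.9, 0, 0],
]
y = ["suspicious", "suspicious", "benign", "benign", "suspicious", "benign"]

clf = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)

# The printed tree doubles as the kind of visualization the abstract mentions.
print(export_text(clf, feature_names=["size_kb", "byte_entropy",
                                      "ext_mismatch", "in_system_dir"]))
print(clf.predict([[200, 7.6, 1, 0]]))   # classify an unseen file
```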