Article

Building theoretical underpinnings for digital forensics research

Author: Sarah Mocas

Abstract

In order for technical research in digital forensics to progress a cohesive set of electronic forensics characteristics must be specified. To date, although the need for such a framework has been expressed, with a few exceptions, clear unifying characteristics have not been well laid out. We begin the process of formulating a framework for digital forensics research by identifying fundamental properties and abstractions.

... Inaccuracies in attribution of authorship and the content of digital evidence, for example, are common occurrences affecting legal argument as to the completeness, correctness, authenticity, and faithfulness to an original source, thereby raising doubts as to the worth of the evidence (Akester, 2004; Mocas, 2004). More disturbing is that even in the absence of any obvious irregularity in the software platforms housing the evidence, examination of any material of evidentiary value does not in itself attest to the accuracy or integrity of the evidence (Spenceley, 2003). ...
... For example, defining what is required to validate digital evidence in a broad range of legal settings. The terms accuracy, certainty, authenticity, and integrity, as they bear on the chain of custody, are mentioned in some literature but are not comprehensively defined in any standards (Mocas, 2004). ...
... For example, the primary evidence may be a threatening email sent by a suspect, while other digital evidence, such as user access log entries, links the suspect to the computer at the critical time. However, this evidence does not exist in isolation, as there is associative information that may corroborate or refute the primary evidence (Mocas, 2004). The access logs do not prove the suspect was using the computer at a specific time any more than a fingerprint or DNA conclusively links the suspect to the computer keyboard at a particular time; other corroboration is required to confirm or refute the assertion. ...
Article
Digital evidence, now more commonly relied upon in legal cases, requires an understanding of the processes used in its identification, preservation, analysis and validation. Business managers relying on digital evidence in the corporate environment need a greater understanding of its true nature and the difficulties affecting its usefulness in criminal, civil and disciplinary proceedings. This chapter describes digital evidence collection and analysis, and the implications of common challenges diminishing its admissibility. Determining the evidentiary weight of digital evidence can be perplexing and confusing because of the complexity of the technical domain. Digital evidence present on computer networks is easily replaced, altered, destroyed or concealed and requires special protection to preserve its evidentiary integrity. Consequently, business managers seeking the truth of a matter can find it a vexing experience, unless provided with a clear appraisal and interpretation of the relevant evidence. Validating evidence that is often complex and incomplete requires expert analysis to determine its value in legal cases and to provide timely guidance to business managers and their legal advisers. While soundly configured security systems and procedures enhance data protection and recovery, they are often limited in the way they preserve digital evidence. Unprepared personnel can also contaminate evidence unless procedural guidelines and training are provided. The chapter looks at the benefits for prudent organisations that may wish to include cyber forensic strategies as part of their security risk contingency planning, to minimise the loss or degradation of digital evidence which, if overlooked, may have adverse legal repercussions.
... In addition it stipulates a philosophical account and common understanding for DFS as a scientific discipline [9]. It also assists in (i) the development of abstractions, principles, standards and models which are flexible, technology neutral, and generally acceptable [10,11], (ii) the development of DF process models which are mutually understood, observed and valid in both legal and technical contexts [12], and (iii) embedding scientific rigor, trust and reliability to facilitate education, application and research in DFS [13,14]. ...
... 3. Protection: DF is applied in the areas of (i) Law Enforcement, (ii) Military Information Warfare Operations, and (iii) Business & Industry [18]. During the review, frameworks and models were studied from the investigative context of law enforcement settings [10]. The goal in law enforcement settings is to produce reliable and admissible evidence under an ordered list of constraints comprising Law, Time, Resources and Technical Limits (in order of preference) [10]. Therefore, compliance with the law is the most important constraint. ...
Article
In this research, a literature review was conducted in which twenty (n=20) frameworks and models highlighting preservation of the integrity of digital evidence and protection of basic human rights during digital forensic investigations were studied. Models not discussing the process at an abstract level were excluded; therefore, thirteen (n=13) of the studied models were included in our analysis. The results indicated that the published abstract models lack explicit overarching umbrella principles for preserving the integrity of digital evidence and protecting basic human rights. To overcome this problem, we proposed an extension to Reith's abstract digital forensics model explicating preservation of integrity and protection of human rights as the two necessary umbrella principles.
... Repeatability, which is one of the fundamental precepts of digital forensics practice, is supported by the documentation of each and every action carried out on the digital evidence. Open source software is also the best choice for assessing accuracy, especially when conversion or migration occurs, because the transparency of its code allows for a practical demonstration that nothing could be altered, lost, planted, or destroyed in the process (Mocas, 2004; Carrier, 2003b). ...
... A second type of integrity digital forensics experts are concerned with is duplication integrity, that is, the fact that, given a data set, the process of creating a duplicate of the data does not modify the data either intentionally or accidentally, and the duplicate is an exact bit copy of the original data set (Mocas, 2004). This type of integrity is extremely important because one can only preserve digital records by reproducing them. ...
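Duplication integrity of this kind is typically operationalised by comparing cryptographic digests of the source and the duplicate: if the two digests match, the copy is, with overwhelming probability, bit-identical. The following minimal Python sketch shows one such check; the file paths and function names are illustrative, not drawn from the cited work.

```python
import hashlib

def file_digest(path: str, algorithm: str = "sha256", chunk_size: int = 1 << 20) -> str:
    """Hash a file in fixed-size chunks so arbitrarily large disk images fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_duplication_integrity(original: str, duplicate: str) -> bool:
    """A duplicate passes only if its digest matches the original's exactly."""
    return file_digest(original) == file_digest(duplicate)

# Hypothetical paths: a source image and the working copy made from it.
if verify_duplication_integrity("evidence/original.dd", "evidence/working_copy.dd"):
    print("duplication integrity verified")
```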
... Regardless of the elements of the computer/system that are examined to verify it, computer/system integrity can be inferred on the basis of repeatability, verifiability, objectivity and transparency. More generically, an inference of system integrity can be made if the theory, procedure or process on which the system design is based 1) has been tested or cannot be tampered with; 2) has been subjected to peer review or publication (or follows a standard); 3) has a known or potential error rate that is acceptable; and 4) is generally accepted within the relevant scientific community (Mocas, 2004). ...
Article
Trust has been defined in many ways, but at its core it involves acting without the knowledge needed to act. Trust in records depends on four types of knowledge about the creator or custodian of the records: reputation, past performance, competence, and the assurance of confidence in future performance. For over half a century society has been developing and adopting new computer technologies for business and communications in both the public and private realm. Frameworks for establishing trust have developed as technology has progressed. Today, individuals and organizations are increasingly saving and accessing records in cloud computing infrastructures, where we cannot assess our trust in records solely on the four types of knowledge used in the past. Drawing on research conducted at the University of British Columbia into the nature of digital records and their trustworthiness, this article presents the conceptual archival and digital forensic frameworks of trust in records and data, and explores the common law legal framework within which questions of trust in documentary evidence are being tested. Issues and challenges specific to cloud computing are introduced.
... Digital forensics is practiced in an investigative context, regardless of the domain of the investigation. The roots for the development of a theory of digital forensics, then, may be found in the early practice guidelines and principles developed by law enforcement and technical working groups (Mocas 2004). ...
... Records are considered trustworthy if they can be shown to be authentic (by establishing their identity and assessing their integrity), reliable, and accurate. In the digital environment, archivists benefit from also incorporating concepts from digital forensics: concepts of authentication, reproducibility, non-interference, and minimization (Mocas 2004), and laws of association, context, access, intent, and validation (Andrew 2007). ...
Article
Full-text available
The field of digital forensics seems at first glance quite separate from archival work and digital preservation. However, professionals in both fields are trusted to attest to the identity and integrity of digital documents and traces – they are regarded as experts in the acquisition, interpretation, description and presentation of that material. Archival science and digital forensics evolved out of practice and grew into established professional disciplines by developing theoretical foundations, which then returned to inform and standardize that practice. They have their roots in legal requirements and law enforcement. A significant challenge to both fields, therefore, is the identification of records (archival focus) and evidence (digital forensics focus) in digital systems, establishing their contexts, provenance, relationships, and meaning. This paper traces the development of digital forensics from practice to theory and presents the parallels with archival science.
... i. The development of abstractions, principles, standards and models that are flexible, technology neutral, and generally acceptable (Carrier and Spafford, 2004; Mocas, 2004), ii. The development of DF process models that are mutually understood, observed and valid in both the legal and technical contexts (Pollitt, 1995), and ...
... i. The development of abstractions, principles, standards and models which can be flexible, technology neutral, and generally acceptable (Mocas, 2004; Carrier and Spafford, 2004); ...
... (Schwarz, Newby, & Carroll, 2009; Steel, 2006) A further aspect of reliability in relation to the activities of digital forensic practitioners who are handling digital evidence is the concept of "forensically sound tasks" as identified by Rogers (2006), who states that they are derived from properties of digital forensics and comprise Authenticity, Chain of Custody, Integrity, Minimization and Reproducibility. Rogers cites McKemmish (1999) and Mocas (2004) as sources for these properties. ...
... Rogers cites McKemmish (1999) and Mocas (2004) as inspiration for these properties. However, on reviewing the tasks in detail, it appears that Rogers has created separate tasks for what could realistically be one task in practice. ...
Article
Full-text available
As with other types of evidence, the courts make no presumption that digital evidence is reliable without some evidence of empirical testing in relation to the theories and techniques associated with its production. The issue of reliability means that courts pay close attention to the manner in which electronic evidence has been obtained, and in particular to the process by which the data is captured and stored. Previous process models have tended to focus on one particular area of digital forensic practice, such as law enforcement, and have not incorporated a formal description. We contend that this approach has prevented the establishment of generally-accepted standards and processes that are urgently needed in the domain of digital forensics. This paper presents a generic process model as a step towards developing such a generally-accepted standard for a fundamental digital forensic activity: the acquisition of digital evidence.
... What properties are necessary and/or sufficient for digital materials to be viable in a specific investigative context? Digital forensics bases trustworthiness on data integrity, authentication, reproducibility, non-interference and relevance (Mocas, 2004). ...
... and Archivematica, a digital preservation workflow suite conforming to ISO-OAIS and the findings of InterPARES (www.archivematica.org). In the digital forensics professional community, the literature has been predominantly technical, although there is literature calling for digital forensics to be situated within a broader social and theoretical framework (Mocas, 2004; Palmer, 2001, 2002; Pollitt, 2009). The field of diplomatics has been introduced into the forensics literature by forensic scientist Fred Cohen, now also a researcher with InterPARES Trust at UBC (Cohen, 2011, 2012). ...
Article
Purpose – This paper aims to explore a new model of “record” that maps traditional attributes of a record onto a technical decomposition of digital records. It compares the core characteristics necessary to call a digital object a “record” in terms of diplomatics or “evidence” in terms of digital forensics. It then isolates three layers of abstraction: the conceptual, the logical and the physical. By identifying the essential elements of a record at each layer of abstraction, a diplomatics of digital records can be proposed. Design/methodology/approach – Digital diplomatics, a research outcome of the International Research on Permanent Authentic Records in Electronic Systems (InterPARES) project, gives archivists a methodology for analyzing the identity and integrity of digital records in electronic systems and thereby assessing their authenticity (Duranti and Preston, 2008; Duranti, 2005) and tracing their provenance. Findings – Digital records consist of user-generated data (content), system-generated metadata identifying source and location, application-generated metadata managing the look and performance of the record (e.g., native file format), application-generated metadata describing the data (e.g., file system metadata from the OS), and user-generated metadata describing the data. Digital diplomatics, based on a foundation of traditional diplomatic principles, can help identify digital records through their metadata and determine what metadata need to be captured, managed and preserved. Originality/value – The value and originality of this paper lie in the application of diplomatic principles to a deconstructed, technical view of digital records, through functional metadata, for assessing the identity and authenticity of digital records.
... (Schwarz, Newby, & Carroll, 2009; Steel, 2006) A further aspect of reliability in relation to the activities of digital forensic practitioners who are handling digital evidence is the concept of "forensically sound tasks" as identified by Rogers (2006), who states that they are derived from properties of digital forensics and comprise Authenticity, Chain of Custody, Integrity, Minimization and Reproducibility. Rogers cites McKemmish (1999) and Mocas (2004) as sources for these properties. ...
... Rogers cites McKemmish (1999) and Mocas (2004) as inspiration for these properties. However, on reviewing the tasks in detail, it appears that Rogers has created separate tasks for what could realistically be one task in practice. ...
Thesis
Full-text available
Given the pervasive nature of information technology, the nature of evidence presented in court is now less likely to be paper-based and in most instances will be in electronic form. However, evidence relating to computer crime is significantly different from that associated with the more ‘traditional’ crimes for which, in contrast to digital forensics, there are well-established standards, procedures and models to which law courts can refer. The key problem is that, unlike some other areas of forensic practice, digital forensic practitioners work in a number of different environments and existing process models have tended to focus on one particular area, such as law enforcement, and fail to take into account the different needs of those working in other areas such as incident response or ‘commerce’. This thesis makes an original contribution to knowledge in the field of digital forensics by developing a new process model for digital data acquisition that addresses both the practical needs of practitioners working in different areas of the field and the expectation of law courts for a formal description of the process undertaken to acquire digital evidence. The methodology adopted for this research is design science, on the basis that it is particularly suited to the task of creating a new process model and an ‘ideal approach’ in the problem domain of digital forensic evidence. The process model employed is the Design Science Research Process (DSRP) (Peffers, Tuunanen, Gengler, Rossi, Hui, Virtanen and Bragge, 2006) that has been widely utilised within information systems research. A review of current process models involving the acquisition of digital data is followed by an assessment of each of the models from a theoretical perspective, by drawing on the work of Carrier and Spafford (2003), and from a legal perspective, by reference to the Daubert test. The result of the model assessment is that none provide a description of a generic process for the acquisition of digital data, although a few models contain elements that could be considered for adaptation as part of a new model. Following the identification of key elements for a new model (based on the literature review and model assessment), the outcome of the design stage is a three-stage process model called the Advance Data Acquisition Model (ADAM) that comprises three UML Activity diagrams, overriding Principles and an Operation Guide for each stage. Initial testing of the ADAM (the Demonstration stage from the DSRP) involves a ‘desk check’ using both in-house documentation relating to three digital forensic investigations and four narrative scenarios. The results of this exercise are fed back into the model design stage and alterations made as appropriate. The main testing of the model (the DSRP Evaluation stage) involves independent verification and validation of the ADAM utilising two groups of ‘knowledgeable people’. The first group, the Expert Panel, consists of international ‘subject matter experts’ from the domain of digital forensics. The second group, the Practitioner Panel, consists of peers from around Australia who are digital forensic practitioners and includes a representative from each of the areas of relevance for this research, namely: law enforcement, commerce and incident response. Feedback from the two panels is considered and modifications applied to the ADAM as appropriate.
This thesis builds on the work of previous researchers and demonstrates how the UML can be practically applied to produce a generic model of one of the fundamental digital forensic processes, paving the way for future work in this area that could include the creation of models for other activities undertaken by digital forensic practitioners. It also includes the most comprehensive review and critique of process models incorporating the acquisition of digital evidence yet undertaken.
... For these methods to be suitable to provide expert evidence, it is no less important that the methods (1) avoid subjective interpretation, (2) withstand peer review and publication, and (3) be accepted by the respective scientific community. A failure to withstand such scrutiny early on can cause a method's use to become restricted (Mocas, 2003). The impact of these issues on event recovery can be examined by considering event data recovery as a component process of event reconstruction. ...
... When they repair data, they should identify where and how data have been repaired or altered. Where the data are repaired, two desirable properties for the tool are "duplication integrity" (Mocas, 2003) for the body of the log, and "identifiable interference" for the header, in which changes in the original data are made clearly identifiable. So, it would be desirable for a tool that repairs artifacts to report separately, for each repair, the location and hashes of the altered and unaltered portions of the data, in order to clearly demonstrate how the exported data and the in situ source data differ. ...
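As an illustration of that reporting requirement, the sketch below (hypothetical; it is not the tool described in the cited paper) takes the original and repaired byte sequences plus a list of repaired ranges, and emits per-repair hashes alongside a single digest of the untouched remainder, so that duplication integrity of the unaltered portion and identifiable interference in the altered portion can both be demonstrated.

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def repair_report(original: bytes, repaired: bytes, repaired_ranges):
    """Report, for each repaired (start, end) byte range, the hashes of the data
    before and after repair, plus one digest over everything left untouched."""
    report, untouched_orig, untouched_rep, prev = [], bytearray(), bytearray(), 0
    for start, end in sorted(repaired_ranges):
        untouched_orig += original[prev:start]
        untouched_rep += repaired[prev:start]
        report.append({
            "range": (start, end),
            "original_sha256": digest(original[start:end]),
            "repaired_sha256": digest(repaired[start:end]),
        })
        prev = end
    untouched_orig += original[prev:]
    untouched_rep += repaired[prev:]
    # Identifiable interference: every alteration is enumerated above, and the
    # untouched remainder must hash identically in the source and the output.
    assert digest(bytes(untouched_orig)) == digest(bytes(untouched_rep))
    report.append({"untouched_sha256": digest(bytes(untouched_orig))})
    return report
```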
Article
Full-text available
This paper proposes methods to automate recovery and analysis of Windows NT5 (XP and 2003) event logs for computer forensics. Requirements are formulated and methods are evaluated with respect to motivation and process models. A new, freely available tool is presented that, based on these requirements, automates the repair of a common type of corruption often observed in data-carved NT5 event logs. This tool automates repair of multiple event logs in a single step without user intervention. The tool was initially developed to meet immediate needs of computer forensic engagements. Automating recovery, repair, and correlation of multiple logs makes these methods more feasible for consideration in both a wider range of cases and earlier phases of cases, and hopefully, in turn, in standard procedures. The tool was developed to fill a gap between the capabilities of certain other freely available tools that may recover and correlate large volumes of log events, and consequently permit correlation with various other kinds of Windows artifacts. The methods are examined in the context of an example digital forensic service request intended to illustrate the kinds of civil cases that motivated this work.
... In recent times, more and more cases are being reported to investigation agencies which involve criminal activity caused by the misuse of social media platforms [7]. These investigation agencies employ various digital forensics tools to extract key evidence from the mobile devices of culprits, to help get them convicted in a court of law. ...
Article
Full-text available
Abstract: In this paper, a plugin is developed for our automated digital forensics framework to extract and preserve evidence from the iOS-based mobile phone application Olx. This plugin extracts personal details of Olx users, e.g., user name, mobile number, user location, country name, state name, city name, last check-in attempt, and ad images exchanged between different Olx users. While developing the plugin, we identified resources available in iOS-based devices holding key forensic artifacts. We highlight the poor privacy scheme employed by Olx. This work has shown how the sensitive data posted in the Olx mobile application can easily be reconstructed, and how the traces, as well as the URL links of visual messages, can be used to breach the privacy of any Olx user without any critical credential verification. We also employed anti-forensics methods on the Olx iOS application and were able to restore the application from an altered or corrupted database file, which any criminal mind could use to set up or trap someone else. The outcome of this research is a plugin for our digital forensics ready framework software, which could be used by law enforcement and regulatory agencies to reconstruct the digital evidence available in the Olx mobile application directories on iOS-based mobile phones.
... This is the justification behind Srinivasan's proclamation of 10 rules that the DF inquiry procedure must follow. Given the central role that individuals and their personal information will play in present and future digital research, the authors in [6,7] aim to shed light on this intriguing, highly demanding topic. They also examined the role of anonymity in digital forensics. ...
Article
Full-text available
Rapid technological breakthroughs, a surge in the use of digital devices, and the enormous amount of data that these devices can store continuously put the state of digital forensic investigation to the test. The prevention of privacy breaches during a digital forensic investigation is a significant challenge, even though data privacy protection is not a performance metric. This research offers solutions to the problems listed above that centre on the efficiency of the investigative process and the protection of data privacy. However, it is still an open problem to find a way to shield data privacy without compromising the investigator's capabilities or the investigation's overall efficiency. This system proposes an efficient digital forensic investigation process which enhances validation, resulting in more transparency in the inquiry process. Additionally, the suggested approach uses machine learning techniques to find the most pertinent sources of evidence while protecting the privacy of non-evidential private files.
... Many models have been presented to speed up the investigative process or tackle issues during forensic investigations (Du, Le-Khac, & Scanlon, 2017). These features and terms establish research and tool development principles (Mocas, 2004). The investigator determines the model or methodology for crime scene processing and evidence confiscation. ...
Article
Full-text available
Information and communication technologies (ICT) have changed every area of our lives. Cyberspace-related areas have reflected these shifts. Cyberspace has an undeniable positive impact on information, trade, industry, and communication. On the other hand, cybercrime is a dark side of the Internet that degrades its peaceful use. Any illegal activity carried out by or via cyberspace and its electronic environment is characterized as cybercrime. Unlike traditional crimes, cybercrimes present a real dilemma because the identities of criminals may be hidden in the virtual domain. Digital forensics has emerged to formulate possible ways for the cybercrime investigation and analysis process. In this paper, we explore the idea of digital forensics in the context of cybercrimes. The positive impact of digital forensics in combating cybercrimes is discussed. In today’s world of computers, any information can be made available within a few clicks for different endeavors. The information may be tampered with by changing its statistical properties and can be further used for criminal activities.
... They also work for private industry and firms, by contrast, so they need to be very expert in their jobs. Their hours are not as regular, because an investigator [4] can earn money per hour, so working all the time is not necessary for them. ...
Article
In this paper the importance and usage of digital forensics are discussed. As the world develops and moves towards technology, new and better ways of investigation should be introduced. Investigative methods such as footprint analysis are used, but with old technology. A new method that is still not commonly used is discussed for awareness. The growth of cybercrime is increasing through advanced methods; to overcome this problem, investigation teams should be given special training. This will not only reduce crime but also save the precious time and money of the investigation department.
... What properties are necessary and/or sufficient for digital materials to be viable in a specific investigative context? Digital forensics bases trustworthiness on data integrity, authentication, reproducibility, non-interference, and relevance (Mocas, 2004). ...
Article
Full-text available
ABSTRACT Purpose – This article aims to explore a new model of "record" by analyzing its attributes through a technical analysis of digital records. The study compares the core characteristics necessary to name a digital object a "record" in relation to diplomatics, or "evidence" in relation to digital forensics. This study divides digital records into three layers of abstraction, namely: conceptual, logical and physical. Our proposal is to apply the diplomatics of digital records to identify the key elements at each of these levels of abstraction. Design/methodology/approach – Digital diplomatics is an outcome of the International Research on Permanent Authentic Records in Electronic Systems (InterPARES) project, which provides archivists with a methodology for analyzing the identity and integrity of digital records in electronic systems, as well as assessing their authenticity (Duranti and Preston, 2008; Duranti, 2005) and tracing their provenance. Findings – Digital records are structured as: user-generated data (content), system-generated metadata identifying source and location, application-generated metadata managing the look and performance of the record (for example, the born-digital file format), application-generated metadata describing the data (for example, the file system metadata of the operating system used by the file), and metadata describing the user-generated data. Digital diplomatics, based on the principles of traditional diplomatics, can support the identification of digital records through their metadata and determine which metadata need to be captured, managed and preserved. Originality/value – The value and originality of this article lie in the application of diplomatic principles to a deconstructed, technical view of digital records, through functional metadata for assessing the identity and authenticity of these digital records.
... For civil investigations in particular, laws may limit the ability of experts to undertake examinations. Constraints against network monitoring or the reading of personal marketing and sales communications often exist [4]. Evidence from computer forensics investigations is usually held to the same guidelines and practices as other digital evidence. ...
... Mocas identified the following standards for digital evidence: Duplication integrity, Authentication, Reproducibility, Non-interference, Identifiable non-interference, and Minimisation. These properties and terms establish principles for research and tool development (Mocas, 2004). For crime scene processing and evidence confiscation, the choice of model or methodology is dependent upon the investigator. ...
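Two of those standards, duplication integrity and authentication, are frequently approximated in practice by hashing collected material and sealing the resulting manifest. The Python sketch below is an illustration under that assumption, not a formulation from Mocas's paper; the directory layout and key handling are hypothetical.

```python
import hashlib, hmac, json, os

def sealed_manifest(evidence_root: str, examiner_key: bytes) -> dict:
    """Walk an evidence directory, record a SHA-256 digest per file (supporting
    duplication integrity checks), then seal the manifest with an HMAC so its
    own integrity can later be authenticated by anyone holding the key."""
    entries = {}
    for dirpath, _, filenames in os.walk(evidence_root):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            entries[os.path.relpath(path, evidence_root)] = h.hexdigest()
    body = json.dumps(entries, sort_keys=True).encode()
    seal = hmac.new(examiner_key, body, hashlib.sha256).hexdigest()
    return {"files": entries, "hmac_sha256": seal}
```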
Article
Full-text available
The Internet of Things (IoT) represents the seamless merging of the real and digital world, with new devices created that store and pass around data. Processing large quantities of IoT data will proportionately increase workloads of data centres, leaving providers facing new security, capacity and analytics challenges. Handling this data conveniently is a critical challenge, as the overall application performance is highly dependent on the properties of the data management service. This paper explores the challenges posed by cybercrime investigations and digital forensics concerning the shifting landscape of crime – the IoT and the evident investigative complexity – moving to the Internet of Anything (IoA)/Internet of Everything (IoE) era. IoT forensics requires a multi-faceted approach where evidence may be collected from a variety of sources such as sensor devices, communication devices, fridges, cars and drones, to smart swarms and intelligent buildings.
... For example, toxicology forensics was established in the 1800s (Levine, 2003) and fingerprint forensics in the 1890s (Tilstone, Savage, & Clark, 2006). According to Pollitt (2010), ... Within the digital forensics field, there are multiple digital forensic process models for investigations (Beebe & Clark, 2005; Carrier, Spafford et al., 2003; McKemmish, 1999; Mocas, 2004). A general digital forensics model was proposed by Rigby and Rogers (2007), which consists of seven different phases: (1) preparation, (2) identification, (3) preservation, ...
Thesis
Full-text available
WhatsApp is the most popular instant messaging application worldwide. Since 2016, users can send and receive messages through desktop clients, either through the WhatsApp desktop application or the web client accessible from supported web browsers. The author identified a gap in the literature in terms of WhatsApp forensics for desktop and web clients. The aim of the study was to locate forensic artifacts on WhatsApp clients. These clients included the desktop application on both Windows and Mac operating systems. Chrome and Firefox web clients were also analyzed for the Windows operating system, as well as Chrome and Safari web clients on the Mac operating system. A WhatsApp log file was identified as the main artifact providing information throughout all clients analyzed. Cached profile pictures were also found, as well as history information about visited websites and applications run.
... This software guarantees that the collection of data meets generally accepted standards for legal proceedings. These practices include ensuring the integrity of the data (by precluding voluntary or accidental modification of the digital evidence), the authenticity of the evidence, and the reproducibility of all procedures related to evidence collection and analysis (Mocas, 2004). Before starting the collection phase of the study, we obtained approval (#CERAS-2011-12-255-A) from the ethics committee of the University of Montreal. ...
Article
Full-text available
This study analyzed the evolution over time of the activity of consumers of child sexual exploitation material (CSEM). To this end, images and metadata were extracted from the hard drives of 40 individuals convicted of possession of child pornography and analyzed. A sample of these images (N = 61,244) was categorized by the age of the subjects depicted and—using the Combating Paedophile Information Networks in Europe (COPINE) scale—by severity of the acts depicted. Collecting activity was observed to follow four patterns. The most prevalent pattern was a progressive decrease in the age of the person depicted and a progressive increase in the severity of the sexual acts. In light of the results, we propose four explanations of the nature of, and variations in, child-pornography collections.
... Another independent examination of digital forensic models, and of their implications with respect to the challenges highlighted in the DFRWS 2001 report, was presented by Reith et al. [19]. To advance the digital forensics field, in 2003 Mocas [20] recognized three key challenges. These challenges, as stated by him, were: 1. ...
... In today's era, rootkits are popular among the hacker community for thwarting live response methodologies on a Windows machine. Forensic evidence is fragile to handle in the sense that it can easily be modified, duplicated, restored or destroyed [8]. The live response methodology came into use when it was discovered that the traditional forensic methodology, i.e., pulling the plug on the system and then imaging the hard drive, is no longer a viable option for acquiring evidence from a machine. ...
... Due to the nature and scope of digital evidence, however, it is very difficult to present this evidence in a court of law. Much research currently focuses on this problem (Rogers & Seigfried, 2004; Baryamureeba & Tushabe, 2004; Beebe & Clark, 2004; Oppliger & Rytz, 2003; Yasinsac, Erbacher, Marks, Politt & Sommer, 2003; Mocas, 2004; Leigland & Krings, 2004; Meyers & Rogers, 2004). It was established in the above research efforts that the following requirements, as devised by Daubert and Frye (Gianelli & Imwinkelried, 1999), have to hold before any court of law will accept the digital evidence (Bernstein, 2000; Jonakait, 1993; Carrier, 2002): ...
Thesis
Full-text available
The new and upcoming field of wireless sensor networking is unfortunately still lacking in terms of both digital forensics and security. All communications between different nodes (also known as motes) are sent out in a broadcast fashion. These broadcasts make it quite difficult to capture data packets forensically and, at the same time, retain their integrity and authenticity. The study presents several attacks that can be executed successfully on a wireless sensor network, after which the dissertation delves more deeply into the flooding attack as it is one of the most difficult attacks to address in wireless sensor networks. Furthermore, a set of factors is presented to take into account while attempting to achieve digital forensic readiness in wireless sensor networks. The set of factors is subsequently discussed critically and a model is proposed for implementing digital forensic readiness in a wireless sensor network. The proposed model is next transformed into a working prototype that is able to provide digital forensic readiness to a wireless sensor network. The main contribution of this research is the digital forensic readiness prototype that can be used to add a digital forensics layer to any existing wireless sensor network. The prototype ensures the integrity and authenticity of each of the data packets captured from the existing wireless sensor network by using the number of motes in the network that have seen a data packet to determine its integrity and authenticity in the network. The prototype also works on different types of wireless sensor networks that are in the frequency range of the network on which the prototype is implemented, and does not require any modifications to be made to the existing wireless sensor network. Flooding attacks pose a major problem in wireless sensor networks due to the broadcasting of communication between motes in wireless sensor networks. The prototype is able to address this problem by using a solution proposed in this dissertation to determine a sudden influx of data packets within a wireless sensor network. The prototype is able to detect flooding attacks while they are occurring and can therefore address the flooding attack immediately. Finally, this dissertation critically discusses the advantages of having such a digital forensic readiness system in place in a wireless sensor network environment.
... As early as 2003, Mocas [142] identified three main challenges that researchers need to overcome to advance the field of digital forensics from a theoretical standpoint. These challenges are: ...
Article
Full-text available
Digital forensics is the process of employing scientific principles and processes to analyze electronically stored information and determine the sequence of events which led to a particular incident. In this digital age, it is important for researchers to become aware of the recent developments in this dynamic field and understand the scope for the future. The past decade has witnessed significant technological advancements to aid during a digital investigation. Many methodologies, tools and techniques have found their way into the field, designed on forensic principles. Digital forensics has also witnessed many innovative approaches that have been explored to acquire and analyze digital evidence from diverse sources. In this paper, we review the research literature since 2000 and categorize developments in the field into four major categories. In recent years, the exponential growth of technology has also brought with it some serious challenges for digital forensic research, which are elucidated. Within each category, research is sub-classified into conceptual and practical advancements. We highlight the observations made by previous researchers and summarize the research directions for the future.
... The practice of digital forensics is firmly rooted in computer science, and there are many articles and books by practitioners devoted to the development and testing of investigation techniques, issues and toolkits. However, there is a growing body of literature that focuses on more abstract theoretical questions of how digital forensics addresses authenticity, reliability and evidentiary potential of digital materials extracted or recovered from their native systems (see for example Palmer 2001; Mocas 2004). "When contemplation is codified in disciplined study, we have the sense of theory as that part of a technical subject devoted to elucidating the general facts, principles, or propositions on which the subject depends, as distinguished from the practice of it" (Eastwood 1994, p. 124). ...
Article
Full-text available
Rapid technological advances are fuelling trust requirements and concerns about public and private sector records alike. To guarantee the reliability and authenticity of records requires a framework of policies, procedures, technologies, and intentional action or intervention by “trusted custodians” who have the knowledge required for attesting to and ensuring the continuing authenticity of the records. Records professionals have claimed the role of trusted keepers of the authentic record of our times, but how do they earn that trust? To begin, by acquiring competence. In order to define what kind of education would contribute to qualifying records professionals as competent, it is necessary to identify the components of knowledge they require and the role that society at large expects of them. The responsibilities and challenges presented by managing digital records through time are ones that records professionals should not meet in isolation. In order for records to be able to serve as evidence of actions and events, they must be protected as such. Records-related knowledge requirements are being articulated in the related disciplines of archival science and records management, law, digital forensics, and information assurance and cyber-security. The need for interdisciplinary knowledge to understand and manage the complexities of digital records is being realized in new research alliances that foster the development of knowledge that can support the role of trusted keepers of the authentic record of our times. One such alliance is the Digital Records Forensics Project at the University of British Columbia.
... Software packages have been known to be purchased based on nothing more than web-based reports [25]. Therefore, researchers argue that it is vital for digital forensics to develop an evaluation methodology for tools that will be accepted by vendors, practitioners, and the scientific community [21][26]. Failure to provide quality assessments of the tools used could impact the credibility of digital forensics practitioners, along with the credibility of the evidence produced. ...
Article
Full-text available
This paper outlines the specification of the Cloud-based D-FET platform, which is used to evaluate the performance of digital forensics tools that aim to detect the presence of trails of evidence, such as the presence of illicit images and the determination of user accounts from a host. Along with measuring key quality metrics, such as true-positives and false-positives, it also measures operational performance, such as the speed of success, CPU utilization and memory usage. This is used to determine the basic footprint of the package-under-test. The paper presents a proof-of-concept of the system using the VMware vSphere Hypervisor (ESXi) within the vCenter Cloud management infrastructure, which provides a cluster environment, and supports the creation and instantiation of a well-defined virtual test operating system. The infrastructure has been used within a teaching environment for two semesters, and has been shown to cope well in terms of performance and administration. Two key evaluation points related to whether a cloud-based infrastructure will improve on existing stand-alone and workstation-based virtualisation concern the improvement in energy consumption and in the CPU utilization footprint for each virtual machine. Thus the results show some metrics related to the energy and CPU consumption of the created digital forensics instances, which can be used to justify the improvements in energy consumption, as opposed to stand-alone instances, and the scalability of the infrastructure.
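Quality metrics of the kind D-FET measures are usually summarised as precision and recall over a ground-truth corpus of planted artefacts. The short sketch below shows that standard calculation; the counts are invented for illustration and are not results from the D-FET platform.

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """tp/fp/fn come from comparing tool output against a ground-truth corpus:
    tp = planted artefacts found, fp = spurious detections, fn = artefacts missed."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Hypothetical run: the tool flags 48 files, 45 of them genuine, and misses 5.
p, r = precision_recall(tp=45, fp=3, fn=5)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.94 recall=0.90
```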
... We therefore present an analysis of this particular problem from an investigative perspective. Additional motivation for this work is highlighted by Mocas (2003), who states that any work conducted in the field of digital investigation is unique, in that the researcher must be conversant with theoretical research and must obtain a knowledge of the realities of investigation, whether procedural or technical. Thus, in the development of this framework, we hope to bridge the gap between the two essential fields of theoretical research and practical application. ...
Article
Full-text available
Network-based forensic investigations often rely on data provided by properly configured network-based devices. The logs from interconnected devices such as routers, servers and Intrusion Detection Systems (IDSs) can yield important information, which can be used during an investigation to piece together the events of a security incident. A device, such as a router, which performs its intended duties as well as logging tasks, can be defined as having in-line logging capabilities. A system that only performs these duties, such as an IDS, can be described as an out-of-line logging system. The usefulness of these logs, though, must be compared against the impact that they have on the systems that produce them. It is thus possible to introduce a detrimental burden on in-line devices. This can reduce the capability of the device to provide core functionality, and the extra evidence generated could place an increased burden on the forensic investigator. Therefore, when configuring network devices, the security practitioner is the key to producing a careful balance between security, performance and generated data volume. This paper outlines an intensive experiment to compare and contrast different logging schemes. These tests are placed within the scenario of a forensic investigation, which involves extensive data logging and analysis. The metrics compare CPU utilisation, bandwidth usage, memory buffers, usefulness of these records to the investigation, and so on. The two logging systems examined are the Cisco 20x series based routers, for in-line logging capabilities with Syslog, and the IDS Snort for out-of-line logging. This work provides an empirical perspective by plotting the footprint that this logging scheme has on the core network infrastructure, thus providing a proposed optimal logging approach for a network, along with the comparative merits of in-line and out-of-line auditing systems.
... have indicated that practitioners lack a conceptual framework for thinking about computer security/information assurance that encourages proactive approaches to digital forensics [40, 42, 43]. The methodology introduced in this paper, and the theoretical analysis that led to its development, are offered as a possible beginning. ...
Conference Paper
Full-text available
When incident responders collect network forensic data, they must often decide between expending resources collecting forensically sound data, and restoring the network as quickly as possible. Organizational network forensic readiness has emerged as a discipline to support these choices, with suggested checklists, procedures and tools. This paper proposes a life cycle methodology for "operationalizing" organizational network forensic readiness. The methodology, and the theoretical analysis that led to its development, are offered as a conceptual framework for creating more efficient, proactive approaches to digital forensics on networks
... Consequent with this increase in computer crime is the increase in evidence contained on computers that must be secured if it is to be admissible as evidence in a court of law (Wolfe-Wilson and Wolfe, 2003). Data integrity and authentication must be assured, methods to gather and examine the evidence must be reproducible, and it must be able to be shown that the gathering of the evidence did not change either the data itself or the system from which it was taken (Mocas, 2004). ...
Article
Full-text available
Computer security is of concern to those in IT (Information Technology) and forensic readiness (being prepared to deal effectively with events that may require forensic investigation) is a growing issue. Data held only on magnetic or other transient media require expert knowledge and special procedures to preserve and present them as valid in a criminal or employment court. Staff required to handle possible forensic evidence should be forensically knowledgeable. Having policies and procedures in place is one inexpensive way to protect the forensic data and can mean the difference between a valid case and no case. This paper presents the results of a survey of IT managers in New Zealand (NZ) examining the state of awareness of IT management in NZ regarding the field of digital forensics in general and their state of preparation for protection of forensic data in the case of an event requiring forensic analysis.
... Following the first DFRWS, Mocas proposed a framework to help build "theoretical underpinnings for digital forensics research" (Mocas, 2004). The purpose of the framework was to "define a set of properties and terms that can be used as organizing principles for the development and evaluation of research in digital forensics." ...
Article
Full-text available
Today’s Golden Age of computer forensics is quickly coming to an end. Without a clear strategy for enabling research efforts that build upon one another, forensic research will fall behind the market, tools will become increasingly obsolete, and law enforcement, military and other users of computer forensics products will be unable to rely on the results of forensic analysis. This article summarizes current forensic research directions and argues that to move forward the community needs to adopt standardized, modular approaches for data representation and forensic processing.
... A live CD is a kind of operating system distribution which can be booted from a read-only medium (such as a CD-ROM or DVD) without actually being installed on a hard disk [11]. Usually, it is named according to the medium on which it is stored. ...
Conference Paper
As the popularity of the Internet continues to grow, it changes the nature of computer crime. The number of computer crimes has increased dramatically in recent years, and investigators have been facing the difficulty of establishing the admissibility of digital evidence. To solve this problem, we must collect evidence using digital forensics techniques and analyze the digital data, or recover damaged data. In this research, we integrate several open source digital forensics tools and create a graphical user interface to develop a user-friendly environment for investigators. To avoid evidence loss due to the shutdown of target hosts, we use the live analysis technique to collect volatile data by executing commands from an external USB device. We also create a live USB so that target hosts can boot from the USB, which contains a functional operating system with tools for forensic discovery.
... Digital evidence stored in a computer can play a major role in a wide range of crimes, including murder, rape, computer intrusions, espionage, and child pornography, in proving a fact about what did or did not happen [3, 17]. Digital information is fragile in that it can be easily modified, duplicated, restored or destroyed [10]. During the course of the investigation, the investigator should ensure that digital evidence is not modified without proper authorization [9]. ...
Conference Paper
As the popularity of the Internet continues to grow, it not only changes our lives but also changes the nature of crime. The number of crimes committed with computers as tools, venues or targets is increasing, and traditional investigators have been unable to establish the admissibility of evidence in such cases. To solve this problem, we must collect evidence with digital forensics tools and analyze the digital data, or recover the damaged data. In this research, we use open source digital forensics tools based on Linux, verify the stability of the software, and then prove the evidence we have obtained. To avoid data loss due to the shutdown of machines, we use live analysis to collect data and design a live DVD/USB to create image files and analyze the images. We use MD5 and SHA-1 hashes to identify files before the final report and to ensure the reliability of forensic evidence in court.
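Computing both digests named in this abstract is a one-pass operation with Python's standard hashlib; a minimal sketch follows. The file path is illustrative, and modern practice would typically add a stronger digest such as SHA-256 alongside.

```python
import hashlib

def dual_digest(path: str, chunk_size: int = 1 << 20) -> dict:
    """Compute MD5 and SHA-1 in a single pass over a file, as commonly done to
    identify an evidence file before it goes into the final report."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
            sha1.update(chunk)
    return {"md5": md5.hexdigest(), "sha1": sha1.hexdigest()}

# Hypothetical evidence image:
print(dual_digest("acquired/disk_image.dd"))
```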
... Some recent works have addressed challenges in the effective acquisition of volatile memory [14, 15, 17] and specifically in Windows-based memory analysis on a computer [13, 18], while Buchholz and Spafford have studied the role of file system metadata in digital forensics [4]. Since digital forensics has predominantly been reactionary, some research contributions have been reported in formal methods for event reconstruction [10] and in building theoretical foundations [12] for digital forensics. Turner has applied the DEB model to selective imaging of hard disk drives [20], and Beebe and Clark [3] introduce a text string search engine for thematic searching in digital evidence. ...
Conference Paper
Full-text available
The analysis and value of digital evidence in an investigation has been the domain of discourse in the digital forensic community for several years. While many works have considered different approaches to model digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cyber crimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. In this paper, we present the forensic integration architecture (FIA) which provides a framework for abstracting the evidence source and storage format information from digital evidence and explores the concept of integrating evidence information from multiple sources. The FIA architecture identifies evidence information from multiple sources that enables an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology independent approach. FIA is also open and extensible, making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value it brings into the field.
... acquisition, duplication, storage, reconstruction: it is important that the evidence have continuity [14] from crime scene to court. Therefore, as the evidence is processed along the way, there is a need to ensure that the processes preserve the evidentiary properties [8] of the evidence, such as provenance [14], integrity and admissibility. • Design of abstract evidential data constructs satisfying legal admissibility requirements. ...
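Continuity of this kind is commonly recorded as a hash-chained custody log, in which each processing step commits to its input, its output, and the previous entry. The sketch below is a generic illustration of that idea, not a construct from the cited papers; the field names are hypothetical.

```python
import hashlib, json, time

def custody_entry(step: str, operator: str, input_sha256: str,
                  output_sha256: str, prior_entry_hash: str) -> dict:
    """One link in a continuity (chain-of-custody) log: each step records what
    went in, what came out, who performed it and when, and chains to the hash
    of the previous entry so the log itself resists silent alteration."""
    entry = {
        "step": step,
        "operator": operator,
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "input_sha256": input_sha256,
        "output_sha256": output_sha256,
        "prior_entry_hash": prior_entry_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    return entry

# Example: log the acquisition step that produced a working copy; a verbatim
# duplicate has identical input and output digests (values are placeholders).
first = custody_entry("acquisition", "examiner-01",
                      input_sha256="ab12", output_sha256="ab12",
                      prior_entry_hash="0" * 64)
```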
Conference Paper
For security-emphasizing fields that deal with evidential data acquisition, processing, communication, storage and presentation, for instance network forensics, border security and enforcement surveillance, ultimately the outcome is not the technical output but rather physical prosecutions in court (e.g. of hackers, terrorists, law offenders) or counter-attack measures against the malicious adversaries. The aim of this paper is to motivate the research direction of formally linking these technical fields with the legal field: notably, deriving technical representations of evidential data such that they are useful as evidence in court, while aiming for the legal parties to understand the technical representations in a better light. More precisely, we design the security notions of evidence processing and acquisition, guided by the evidential requirements from the legal perspective, and discuss example relations to forensics investigations.
... Stephenson suggests an outcome-oriented classification of goals: improved system defenses and accurately restored systems vs. legal or military action [4]. Other literature classifies goals according to the environment, each associated with an emphasis on particular concerns: law enforcement, commercial, and military [7]. Obviously, these categorizations are not orthogonal. ...
Conference Paper
The purpose of this research is to investigate the automated analysis of network based evidence in response to cyberspace attacks. The automated analysis techniques to be developed and studied will combine the efficiency of both existing and novel local search techniques with the scalability and robustness of evolutionary computation and other computational intelligence techniques.
Chapter
Traditionally, records and archives were primarily paper-based, and the infrastructures and ecosystems of the paper world were easily determined. Trust in records was guaranteed because they were kept in physical filing cabinets over which the archivist and records manager had total control. Today, in the digital world, the formats of records and archives are no longer physical and are constantly changing, resulting in loss and inaccessibility, which breeds distrust and impedes decision making. The purpose of this chapter is to explore AI in records and archives management in an era where computers are becoming knowledge-processing machines. As the world moves into the fifth Industrial Revolution (5IR), it is imperative for records and archives organizations to address the surrounding issues of trust, policies and ethical considerations in records and archives management so that records remain effective sources for decision making.
Article
Full-text available
Abstract: Considering the persistent difficulties in managing digital archival records, studies in Archival Science establish relationships with other sciences in search of possible answers. Just as Diplomatics is relevant to Archival Science for verifying authenticity through the application of the external and internal elements of archival records, forensic research is providing support in administrative and archival routines in the attempt to achieve and guarantee documentary trustworthiness (completeness, reliability and authenticity). The objective of this article is thus to incorporate the concepts of Digital Forensic Science into Archival Science, work being developed in international projects, mainly in the United States and Canada, which takes advantage of the maturity of forensic studies in the digital domain. To integrate the two sciences, the article reflects on the forensic functions (identification, collection, preservation, verification, analysis and presentation) together with the archival functions (creation/production, appraisal, classification, description, dissemination, preservation and acquisition). The methodology draws on the scientific literature of ongoing national and international research, making visible both the convergences and the divergences between the forensic phases or steps and the archival functions. The results point to the possibility of linking the functions of both sciences, while reflecting on some concepts that need to be better defined in the effort to establish a Forensic-Archival science. The article concludes that national studies need to be deepened in order to delimit the elements and concepts that can contribute to archival studies in the digital reality. Keywords: Archival Science. Digital Forensics. Born-Digital Archival Records. Trustworthiness.
Article
Full-text available
This case study presents a qualitative assessment of the reliability of digital forensic investigation in criminal cases in Norway. A reliability validation methodology based on international digital forensic standards was designed to assess to what extent those standards are implemented and followed by law enforcement in their casework. 124 reports related to the acquisition, examination, and analysis of three types of digital data sources (computers, mobile phones, and storage devices) were examined. The reports were extracted from the criminal case management system used by the police and prosecution services, and were examined at the technology, method, and application levels in order to assess the reliability of digital evidence for criminal proceedings. The study found that the digital forensic investigations in 21 randomly sampled criminal cases in Norway were insufficiently documented to assess the reliability of the digital evidence. It was not possible to trace the digital forensic actions performed on each item or to link the digital evidence to its source. None of the cases were shown to comply with digital forensic methodology, justify the methods and tools used, or validate tool results and error rates.
Article
Contemporary criminal investigation assisted by computing technology imposes challenges to the right to a fair trial and the scientific validity of digital evidence. This paper identifies three categories of unaddressed threats to fairness and the presumption of innocence during investigations: (i) the inappropriate and inconsistent use of technology; (ii) old procedural guarantees that are not adapted to contemporary digital evidence processes and services; and (iii) the lack of reliability testing in digital forensics practice. Further, the solutions that have been suggested to overcome these issues are critically reviewed to identify their shortcomings. Ultimately, the paper argues for legislative intervention and the enforcement of standards and validation procedures for digital evidence, in order to protect innocent suspects and all parties in criminal proceedings from the negative consequences of technology-assisted investigations.
Chapter
Thousands of years ago in China, fingerprints were used on all business documents in the same way signatures are used today: to provide approval of the document, or authorization to perform the actions the document outlines. This was the very first application of forensics in the world. Since then, law enforcement around the world has gradually developed the forensic skills and tools to investigate crime scenes, using forensic science and methods to learn what happened. An example is Song Ci, an outstanding forensic scientist in ancient China, who documented his lifetime of experience and thinking on forensic medicine in his book “Washing Away of Wrongs: Forensic Medicine”, the first work of its kind in the world (Fig. 1.1).
Article
Full-text available
The advancement of smartphone technology has attracted many companies to the development of mobile operating systems (OS). Mozilla Corporation recently released a Linux-based open source mobile OS named Firefox OS. The emergence of Firefox OS has created new challenges, concerns and opportunities for digital investigators. In general, Firefox OS is designed to allow smartphones to communicate directly with HTML5 applications using JavaScript and the newly introduced WebAPI. However, the use of JavaScript in HTML5 applications, with essentially no OS restriction, may lead to security issues and potential exploits. Forensic analysis for Firefox OS is therefore urgently needed in order to investigate any criminal intentions. This paper presents an overview and methodology of mobile forensic procedures for Firefox OS in a forensically sound manner.
Article
This paper presents a real-time forensics model based on fuzzy theory, along with a prototype implementation. The model exploits the strength of fuzzy theory in handling uncertainty. Information records that may indicate intrusions are dynamically transferred to a secure object 'M', in which every key file has a corresponding image. So that they can be stored longer and with first priority, the information records are assigned one of three security levels. The security of the model was evaluated and verified: 95% of the potentially suspicious information records were safely transferred to an authentic place and could then be selectively stored.
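The model's actual membership functions and record schema are not given in this summary, so the sketch below only illustrates the general technique: fuzzy membership over a single observable, defuzzified into the three security levels the abstract mentions. The observable, thresholds and level mapping are all invented.

def suspicion_membership(failed_logins: int) -> dict:
    """Toy fuzzy sets over one observable; the degrees always sum to 1."""
    low = max(0.0, min(1.0, (5 - failed_logins) / 5))
    high = max(0.0, min(1.0, (failed_logins - 5) / 5))
    medium = max(0.0, 1.0 - low - high)
    return {"low": low, "medium": medium, "high": high}

def security_level(record: dict) -> int:
    """Defuzzify to one of three levels (1 = store first, keep longest)."""
    memberships = suspicion_membership(record["failed_logins"])
    strongest = max(memberships, key=memberships.get)
    return {"high": 1, "medium": 2, "low": 3}[strongest]

print(security_level({"failed_logins": 12}))  # -> 1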
Article
Today's Golden Age of computer forensics is quickly coming to an end. Without a clear strategy for enabling research efforts that build upon one another, forensic research will fall behind the market, tools will become increasingly obsolete, and law enforcement, military and other users of computer forensics products will be unable to rely on the results of forensic analysis. This article summarizes current forensic research directions and argues that to move forward the community needs to adopt standardized, modular approaches for data representation and forensic processing.
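As a loose illustration of the standardized, tool-independent representations the article argues for, the snippet below serializes one file artifact as a DFXML-style XML fragment; the element names and values are merely indicative, not a formal schema.

import xml.etree.ElementTree as ET

# A hypothetical file artifact expressed as a small XML document that
# any conforming tool could emit or consume.
fileobject = ET.Element("fileobject")
ET.SubElement(fileobject, "filename").text = "invoice.pdf"
ET.SubElement(fileobject, "filesize").text = "18432"
ET.SubElement(fileobject, "hashdigest", type="md5").text = (
    "0123456789abcdef0123456789abcdef")  # placeholder digest

print(ET.tostring(fileobject, encoding="unicode"))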
Conference Paper
Research in the area of memory forensics has been flourishing in recent years, and powerful analysis frameworks such as Volatility have been developed. While these frameworks permit examining a forensic memory snapshot in great detail, they mainly target experienced investigators with a thorough knowledge of operating system internals. Result correlation and interpretation, on the other hand, is especially demanding for personnel with only marginal forensic expertise, as often found in the IT departments of small and medium-sized enterprises. This task becomes even more challenging when attempting to detect traces of sophisticated malicious applications such as rootkits. In this paper, we present rkfinder, a plug-in for the well-known forensic framework DFF that integrates major capabilities of Volatility into an intuitive and easy-to-use graphical user interface. Rkfinder generates an abstract, tree-like view of the system state, implements checks that are capable of revealing inconsistencies, and automatically highlights suspicious objects that may indicate the presence of a threat. As a result, potential sources of a system infection become more visible and can be better addressed in the course of incident response.
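rkfinder's concrete checks are not enumerated in this summary, but the classic way memory-forensics tools reveal such inconsistencies is cross-view comparison: enumerate the same objects in two independent ways and flag the difference. A minimal, hypothetical sketch (the PIDs are invented):

def cross_view_diff(api_view: set[int], carved_view: set[int]) -> set[int]:
    """Processes found by low-level scanning of a memory image but
    absent from the OS-reported list are suspicious, since rootkits
    typically unlink themselves from the latter."""
    return carved_view - api_view

reported = {4, 88, 1024, 2048}        # e.g. walked from the kernel process list
carved = {4, 88, 1024, 2048, 3333}    # e.g. recovered by signature scanning
print(cross_view_diff(reported, carved))  # {3333}: candidate hidden process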
Conference Paper
Full-text available
For a Global Service-Oriented Architecture (GSOA), we need to make sure that the Web Services provided by service providers are visible to all potential consumers across the globe. In this paper, we discuss Universal Description Discovery and Integration (UDDI) and its limitations as a public repository, and propose using Really Simple Syndication (RSS) as a repository instead. RSS is based on the Extensible Markup Language (XML) and hence enables easy data transformation; it is also platform independent, lightweight, and widely used open source technology. Due to the push nature of RSS, the global RSS repository sends only the recently added Uniform Resource Identifier (URI) links to subscribed consumers. A client can maintain a local cache to support faster execution of the most frequently used Web Services; the new URI links provided by the server are appended to the client's local cached repository, thereby reducing data traffic over the Internet. We also discuss logic that assigns ranks to Web Services and helps compare Web Services providing similar functionality, so that a Web Service consumer can dynamically select the best available Web Service for the user's criteria. The local RSS-based repository implements local ranking, wherein the user's relative preferences are stored; most importantly, the local RSS ranks take priority over the global RSS ranks.
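The ranking logic is not spelled out in the abstract, so the following is only a plausible sketch of the stated priority rule, with plain dictionaries standing in for parsed RSS entries; the URIs, ranks and sentinel value are invented.

UNRANKED = 10**6  # sentinel so services without any rank sort last

def effective_rank(uri: str, global_ranks: dict, local_ranks: dict) -> int:
    """Local (user-preference) ranks override global RSS ranks."""
    if uri in local_ranks:
        return local_ranks[uri]
    return global_ranks.get(uri, UNRANKED)

def select_best(candidates: list, global_ranks: dict, local_ranks: dict) -> str:
    return min(candidates,
               key=lambda uri: effective_rank(uri, global_ranks, local_ranks))

best = select_best(
    ["http://a.example/ws", "http://b.example/ws"],
    global_ranks={"http://a.example/ws": 3, "http://b.example/ws": 2},
    local_ranks={"http://a.example/ws": 1},
)
print(best)  # http://a.example/ws: the local preference wins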
Conference Paper
Forensic investigations on networks are not scalable in terms of time and money [1]. Those investigations that do occur consume months of attention from the very experts who should be investing in more productive activities, like designing and improving network performance [1]. Given these circumstances, organizations often must select which cases to pursue, ignoring many that could be prosecuted if time allowed. Recognizing the exponential growth in the number of crimes that employ computers and networks subject to digital evidence procedures, researchers and practitioners alike have called for embedded forensics, essentially integrating the cognitive skills of a detective into the network itself [2, 3, 4]. The premise is that the level of effort required to document incidents can thus be reduced significantly. This paper considers which technical factors might reflect those detective skills, leading to solutions that could offset the inefficiencies of current practice.
Article
Full-text available
We describe an investigation of authorship gender and language background cohort attribution mining from e-mail text documents. We used an extended set of predominantly topic content-free e-mail document features such as style markers, structural characteristics and gender-preferential language features together with a Support Vector Machine learning algorithm. Experiments using a corpus of e-mail documents generated by a large number of authors of both genders gave promising results for both author gender and language background cohort categorisation.
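As a rough sketch of the general approach (topic-free style features fed to a Support Vector Machine), not the authors' actual feature set or corpus, the fragment below uses scikit-learn's LinearSVC with three toy style markers; the e-mails and cohort labels are fabricated for illustration.

from sklearn.svm import LinearSVC

def style_features(text: str) -> list:
    """Three toy content-free markers: mean word length,
    exclamation density, lowercase-word ratio."""
    words = text.split() or [""]
    return [
        sum(len(w) for w in words) / len(words),
        text.count("!") / max(len(text), 1),
        sum(w.islower() for w in words) / len(words),
    ]

emails = ["Hi there! See you soon!!", "Please find attached the report."]
cohorts = ["A", "B"]  # hypothetical author-cohort labels

clf = LinearSVC().fit([style_features(e) for e in emails], cohorts)
print(clf.predict([style_features("Attached is the revised report.")]))

In practice the cited work uses hundreds of such markers (structural and gender-preferential features) rather than the three shown here.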
Article
Full-text available
Law enforcement is in a perpetual race with criminals in the application of digital technologies, and requires the development of tools to systematically search digital devices for pertinent evidence. Another part of this race, and perhaps more crucial, is the development of a methodology in digital forensics that encompasses the forensic analysis of all genres of digital crime scene investigations. This paper explores the development of the digital forensics process, compares and contrasts four particular forensic methodologies, and finally proposes an abstract model of the digital forensic procedure. This model attempts to address some of the shortcomings of previous methodologies, and provides the following advantages: a consistent and standardized framework for digital forensic tool development; a mechanism for applying the framework to future digital technologies; a generalized methodology that judicial members can use to relate technology to non-technical observers; and, the potential for incorporating non-digital electronic technologies within the abstraction.
Article
A strong factor in the early development of computers was security: the computations that motivated their development, such as decrypting intercepted messages, generating gunnery tables, and developing weapons, had military applications. But the computers themselves were so big and so few that they were relatively easy to protect simply by limiting physical access to their programmers and operators. Today, computers have shrunk so that a web server can be hidden in a matchbox, and have become so common that few people can give an accurate count of the number they have in their homes and automobiles, much less the number they use in the course of a day. Computers constantly communicate with one another; an isolated computer is crippled. The meaning and implications of "computer security" have changed over the years as well. This paper reviews major concepts and principles of computer security as it stands today. It strives not to delve deeply into specific technical areas such as operating system security, access control, network security, intrusion detection, and so on, but to paint the topic with a broad brush.
Article
For many years, the security research community has focused on the confidentiality aspect of security, and a solid analytical foundation for addressing confidentiality issues has evolved. Now it is recognized that integrity is at least as important as confidentiality in many computer systems; it is also apparent that integrity is not well understood. The purpose of this paper is to lay a foundation for understanding integrity and investigate how it can be promoted and preserved in computer systems. The paper begins by exploring what is meant by integrity. It identifies three primary subgoals of integrity: (1) prevent unauthorized users from accessing system resources or modifying data, (2) maintain traditional data consistency, as well as the consistency of a system with respect to its environment, and (3) prevent authorized users from improperly accessing system resources or modifying data. Having articulated a vision of integrity, the paper discusses principles underlying the preservation of integrity, analyzes manual and automated integrity-preserving mechanisms, and examines integrity models and proposed implementation of the models. The paper concludes that although some gaps in understanding still exist, it is possible to begin to standardize integrity properties of systems.
Article
Despite the potentially grave ramifications of relying on faulty information in the investigative or probative stages, the uncertainty in digital evidence is not being evaluated at present, making it difficult to assess the reliability of evidence stored on and transmitted using computer networks. As scientists, forensic examiners have a responsibility to reverse this trend and formally address the uncertainty in any evidence they rely on to reach conclusions. This paper discusses inherent uncertainties in network-related evidence that can be compounded by data corruption, loss, tampering, or errors in interpretation and analysis. Methods of estimating and categorizing uncertainty in digital data are introduced and examples are presented.
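The paper's contribution here is methodological, but to show the kind of categorical scale it argues for, the sketch below encodes a certainty ladder of that general sort (the labels are paraphrases, not the paper's exact wording) with a deliberately conservative combination rule.

from enum import IntEnum

class Certainty(IntEnum):
    """Coarse certainty levels for an item of network-derived evidence."""
    ERRONEOUS = 0            # contradicted by tamper-proof sources
    HIGHLY_QUESTIONABLE = 1
    QUESTIONABLE = 2
    POSSIBLE = 3             # plausible but uncorroborated
    PROBABLE = 4             # corroborated by one independent source
    ALMOST_CERTAIN = 5       # several independent, tamper-resistant sources
    CERTAIN = 6

def combined_certainty(items: list) -> Certainty:
    """A conclusion is no more certain than its weakest supporting item."""
    return min(items)

chain = [Certainty.ALMOST_CERTAIN, Certainty.POSSIBLE]
print(combined_certainty(chain).name)  # POSSIBLE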
Article
This paper discusses some of the unique military requirements and challenges in Cyber Forensics. A definition of Cyber Forensics is presented in a military context. Capabilities needed to perform cyber forensic analysis in a networked environment are discussed, along with a list of current shortcomings in providing these capabilities and a technology needs list. Finally, it is shown how these technologies and capabilities are transferable to civilian law enforcement, critical infrastructure protection, and industry.
Article
This paper uses the theory of abstraction layers to describe the purpose and goals of digital forensic analysis tools. Using abstraction layers, we identify where tools can introduce errors and provide requirements that the tools must follow.
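As a toy instance of a single abstraction layer of the kind the paper analyzes, the function below translates four raw bytes into a FAT/DOS date and time; a decoding mistake at this layer (swapped bit fields, wrong endianness) is exactly the sort of tool-introduced abstraction error such requirements are meant to expose. The input bytes are arbitrary.

import struct

def decode_dos_timestamp(raw: bytes) -> tuple:
    """One abstraction layer: 4 raw on-disk bytes -> (Y, M, D, h, m, s).
    FAT packs time as 5/6/5 bits (hour/minute/second*2) and date as
    7/4/5 bits (years since 1980/month/day), both little-endian."""
    time_val, date_val = struct.unpack("<HH", raw)
    return (
        1980 + (date_val >> 9),   # year
        (date_val >> 5) & 0x0F,   # month
        date_val & 0x1F,          # day
        time_val >> 11,           # hour
        (time_val >> 5) & 0x3F,   # minute
        (time_val & 0x1F) * 2,    # second (2-second resolution)
    )

print(decode_dos_timestamp(bytes.fromhex("2208A158")))  # (2024, 5, 1, 1, 1, 4)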
Article
This document represents some of the work that has been performed in researching this problem and in defining and building a solution to it. Although many different technical approaches to creating traceable time are possible, with varying levels of automation, the following describes our approach.