Wireless Pers Commun
DOI 10.1007/s11277-009-9886-x
An Ontology-driven Model for Digital Forensics
Investigations of Computer Incidents under
the Ubiquitous Computing Environments
Hai-Cheng Chu ·Der-Jiunn Deng ·Han-Chieh Chao
© Springer Science+Business Media, LLC. 2009
Abstract Innumerable firms are integrating state-of-the-art Information Communication Technology (ICT) to boost their competitiveness in all aspects. Simultaneously, the availability of Ubiquitous Computing (UC) networks and mobile devices is growing at an unprecedented rate. Unfortunately, judging from the current volume of computer crime incidents, ICT deployments over UC infrastructures may jeopardize organizations that ignore the imminent necessity of Digital Forensics (DF) in their homogeneous/heterogeneous Information Systems (ISs). Most enterprises lack vigilance concerning these issues, even though they may be aware that serious computer crimes can silently devastate a company's intangible assets. Vandalism of intellectual property and industrial espionage against valuable assets conducted over trusted UC networks are an approaching menace. Hence, DF plays an essential role in the information security arena. Demonstrably, no single DF suite can cover all aspects or purposes, owing to the dynamic diversity of computer crimes. Utilizing various DF tools interchangeably is a sound approach to finding the causes of computer crimes and preventing related information security incidents from recurring. Finally, a DF scenario review utilizing the proposed ontology-driven model with respect to the UC environment is conducted and demonstrated.
H.-C. Chu
Department of Information Management/International Business, Tunghai University, Taichung, Taiwan
e-mail: hcchu@thu.edu.tw
D.-J. Deng
Department of Computer Science and Information Engineering, National Changhua University
of Education, Changhua, Taiwan
e-mail: djdeng@cc.ncue.edu.tw
H.-C. Chao
Department of Electronic Engineering, Institute of Computer Science & Information Engineering,
National Ilan University, Ilan, Taiwan
e-mail: hcc@niu.edu.tw
H.-C. Chao
Department of Electrical Engineering, National Dong Hwa University, Hualien, Taiwan
Keywords Ubiquitous computing · Digital forensics · Computer crime · Information security · Ontology-driven model
1 Introduction
As Ubiquitous Computing (UC) becomes prevalent and dominant, more unscrupulous Internet-technology hacktivists commit swindles against innocent victims via this ever-growing technology. As Information Communication Technology (ICT) progresses on a daily basis, many enterprises have integrated homogeneous/heterogeneous Information Systems (ISs) and Application Programs (APs) into their core operations in order to achieve competitiveness in the related arenas. Indeed, business operation efficiency has improved because a tremendous amount of corporate data is digitalized, transmitted, shared, processed and stored in diverse databases across the Internet. However, due to the vulnerability of ISs and security leakages in APs under UC environments, deceitful information predators and crime syndicates exist somewhere in the networks, desperately hunting for innocent victims in pursuit of their criminal and lucrative purposes.
Currently, devices from desktop PCs to mobile handsets widely utilize ubiquitous wireless communication infrastructures. MWLAN (Metropolitan Wireless Local Area Network), Wireless Personal Communication (WPC) networks, Wi-Fi, 3G, and WiMAX are playing essential roles in Future Communication Computing (FCC). Under such circumstances, information leakage and security threats have become a seriously challenging issue. An untrustworthy wireless Access Point (AP) may reveal classified or precious intangible assets to hackers, resulting in unexpected disaster for individuals or enterprises without the current users being aware of it. Digital Forensics (DF) is a newly emerging research field, distinct from computer security, which emphasizes the prevention of computer crimes. Digital Forensics focuses on digital evidence: the digital trail that a suspect intentionally or accidentally leaves behind at the crime scene, which may be volatile in nature [1]. Digital Forensics is practiced in both the public and private sectors. In the public sector, law enforcement prosecutors play a fundamental role in this field. In the private sector, some IS consulting corporations are capable of providing this kind of service. Undoubtedly, most firms prefer to handle computer crimes themselves instead of filing with law enforcement, because the chain of custody of digital evidence may accidentally reveal the company's confidential documents, a risk the organizations cannot afford in terms of business profitability.
Generally speaking, the distinct DF stages encompass the procedures of identification, collection, extraction, preservation and interpretation of the digital evidence present in a computer incident or confidentiality breach [13]. Once a computer incident occurs, the organization needs to find the causes immediately in order to stop continuing information security breaches. As stated above, most corporations are not willing to disclose a security incident to law enforcement, because disclosure might give competitors the chance to take advantage of the exploited vulnerability, which may result in irrecoverable losses of market share. Consequently, DF specialists in the private sector become crucial once such a situation occurs. If the DF staff is not capable of capturing, timely and appropriately, the intangible digital trails the suspect left behind, the digital evidence could disappear or be altered or erased by the intruder [7]. In many cases, the digital evidence is unrecoverable. The DF staff is indispensable in unveiling the causes of the incident in order to prevent similar potential computer crimes from occurring or threatening the enterprise. Accordingly, in this research, we propose an
ontology-driven model and guidelines with respect to UC environments for the DF staff to consider and follow when a computer incident occurs.
The scientific contribution of the paper is the proposed ontology-driven model for DF investigations under UC environments, which provides ontological guidelines for the DF staff to integrate with the investigation topology, an area few studies have explored. Additionally, the DF staff needs to utilize open-source software or proprietary toolkits to deal with obfuscation. We also provide a DF scenario review adopting some popular DF tools (ProDiscover Basic, AccessData-FTK, and Guidance Software-EnCase eDiscovery), integrated with the distinct DF procedures presented in this paper. Finally, we carried out a DF tool assessment among these software suites based on the key forensic functionalities during the investigation. The assessment criteria for the above tools provide clear guiding principles for the DF staff to ponder. This paper also attempts to shrink the theory-practice gap in this challenging DF arena.
2 Related Research Concerning DF
In the past two decades, computer crimes and computer-related crimes have mushroomed in all countries because networks have become ubiquitous. Generally speaking, computer crime means that computers are the target of cyber attacks, such as denial-of-service attacks or virus attacks. Computer-related crime stands for the use of computer storage devices and ICT tools to commit cybercrimes, such as corporate fraud [4,14]. Most computer crackers may accidentally leave some kind of digital trail; such trails represent digital evidence that can be used to prosecute the criminals in a court of law, in order to hinder these ever-growing high-tech crimes that have resulted in tremendous financial losses worldwide [6]. DF has emerged as the critical factor in prosecuting the culprit through law enforcement and preventing associated cybercrime incidents from recurring in the public sector. Unfortunately, very few college/university programs worldwide offer a comprehensive curriculum that encompasses this emerging knowledge and the corresponding skill sets [2,10,17]. Basically, DF and Computer Forensics are used interchangeably in the research literature. Given the voluminous literature on the problem of computer crime, there is an urgent need for DF specialists in both the public and private sectors.
Obviously, DF is still in its infant stage and possesses a pressing need for direction and definition so that colleges/universities can establish the associated certifications, curriculum programs and pedagogical models. ICT-savvy hacktivists are committing heinous swindles on the Internet. Hence, the forensic science community has become aware of the importance of DF, and it must be addressed as a profession and a science, one that will be closely related to many court cases in the public sector [16]. DF consists of a multidisciplinary curriculum based on the disciplines of criminology and profound ICT knowledge. Research has pointed out that the essential DF categories include crime, computer, security, legislation, investigation processes and forensic tools. Based on existing research, generally speaking, DF is a serial investigation procedure for an information security threat where there is digital evidence of suspicious behavior on the sinister side of computer manipulation. It encompasses the stages of Planning, Identification, Collection, Classification, Preservation, Analysis, Reconstruction, Documentation, and Presentation of electronic evidence in a legitimate manner, either to show the suspect to be inculpatory or exculpatory in a court of law in the public sector, or to terminate an employee for his/her policy violation concerning computer usage in the private sector. In this paper, we condensed these into the following concise categories, without losing the spirit of the detailed stages, to go through
Fig. 1 The DF investigation topology
the scenario review as well as the assessment criteria of the DF tools in a later section of the paper. These categories are the sequential stages of Identifying, Collecting, Analyzing, and Presenting, as depicted in Fig. 1. The figure illustrates the topology of the DF investigation.
The Identifying stage includes the initial evaluation of the characteristics of the computer incident at the crime scene. The DF staff should select the appropriate commercial or open-source detection tools to deal with computer crimes under UC environments. We have seen employees sell the company's precious information, such as new product design blueprints or customer databases, to its competitors. The DF staff should take this seriously, since vandalism of corporate digital intellectual property occurs in a disruptive manner and is typical of an industrial espionage criminal conspiracy [5]. As long as a possibly negligible digital trail has been left behind at the crime scene, the DF staff should report to the CISO (Chief Information Security Officer) within the enterprise immediately once the crime scene has been spotted. As Internet-related technologies update rapidly, the crime scene could be thousands of miles away from the suspect's desktop PC. Versatile, flexible and reliable DF tools thus become all the more imperative in dealing with such incidents. The application and assessment criteria of common commercial DF toolkits are presented in a later section for illustration.
In the Identifying stage, preparation of the appropriate forensic tools and adoption of the right procedures are essential, since live data acquisition is quite different from the seizure involved in static data acquisition in terms of digital evidence [3]. For live data acquisition, digital evidence could be cached in a router, an online chat room, Instant Messaging (IM) or RAM (Random Access Memory). Once the power is off or certain APs are terminated, the digital trails are gone forever. Live data acquisition may need special toolkits, which should be identified before the DF operation. Hence, identifying the crime scene and the characteristics of the computer incident determines the corresponding SOP (Standard Operating Procedure) in later stages. Sometimes, seizing peripherals and digital storage media such as cell phones, external hard drives, thumb drives, memory sticks or disk arrays is mandatory. The suspect's computer and the associated peripherals are bagged and tagged, which means the storage media are preserved in antistatic evidence bags after search and seizure, because static electricity could destroy the integrity of the digital data.
The Collecting stage can be decomposed into Data Acquisition, Integrity, and Discrimination processes. As we stated above, static data acquisition and live data acquisition are two
distinct categories. Live data acquisition in particular can hardly be repeated, because some digital evidence is volatile in nature. Seizing digital evidence at the crime scene requires special attention as well as tools, and these depend on the nature of the case and the alleged crime or policy violation. Thus, the DF specialist must have completely evaluated the incident and be fully equipped by the end of the Identifying stage. The bit-stream (bit-to-bit) duplication should create associated digital fingerprints of the suspect's devices to authenticate or ensure data integrity. An integrity check mechanism is embedded within several commercial acquisition tools, applying Cyclic Redundancy Check (CRC32), Message Digest 5 (MD5) or Secure Hash Algorithm (SHA-1) to guarantee the data integrity of the bit-stream copy of the original digital evidence. CRC32, MD5 and SHA-1 are mathematical algorithms that produce a practically unique hexadecimal value for a file or storage medium, which can determine whether the data has been modified. If the hash value of the digital evidence (a single file, hard drive, thumb drive or disk array) remains the same after the investigation, we can confirm the integrity of the digital evidence. If the digital evidence has been damaged or tampered with during the investigation, the hash values will change accordingly. In that situation, the digital evidence will not be admissible before a jury in a court of law in the United States [8].
The Analyzing stage consists of the Recovery and Reconstruction processes, whose goal is to infer the possible causes of the computer crime or policy violation. Several DF workstations should be set up in the lab, based on the diversity of operating systems and ISs in the organization. The examiner should only ever use DF workstations to analyze the digital evidence and should preserve the results on the workstation that holds the bit-stream copy of the original digital evidence. The original digital evidence should be bagged and tagged in antistatic evidence bags and stored in the safe of the DF lab [9]. The Recovery process recuperates the digital evidence that was collected and preserved. This is the most demanding task of all the stages. Several DF toolkits provide data viewing, keyword searching, data carving or bookmarking functions to assist the DF specialists during this stage. Looking for hidden or encrypted files is essential and imperative at this point. Some DF toolkits can retrieve information about deleted files or erased e-mails, whether sent via SMTP or retrieved via POP3. The Reconstruction process focuses on re-creating or simulating the suspect's environment to demonstrate the scenarios surrounding the incident. Throughout, the DF specialists work on a bit-stream clone of the suspect's drive in order to avoid inadvertently modifying or damaging the original digital evidence. Without loss of generality, the Reconstruction process includes disk-to-disk clone, disk-to-image clone, partition-to-partition clone and image-to-partition clone.
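As a minimal sketch of a disk-to-image clone, assuming the source is exposed as a readable raw device or file (the paths and function name below are hypothetical, not from any named toolkit), the bit-stream copy and its authenticating fingerprint can be produced in a single pass:

```python
import hashlib

def disk_to_image(source_path, image_path, chunk_size=1 << 20):
    """Bit-stream copy of a source drive/file into an image file.
    The MD5 digest is computed over exactly the bytes written, so the
    resulting image can be authenticated against the source immediately."""
    md5 = hashlib.md5()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while chunk := src.read(chunk_size):
            dst.write(chunk)
            md5.update(chunk)
    return md5.hexdigest()
```

In practice the source would sit behind a hardware write blocker (e.g. a raw device node such as /dev/sdb on Linux, an assumption not stated in the paper), and all later analysis runs only against the image file.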
The Presenting stage is the final step of the DF investigation. The DF investigator should compile all the digital evidence and translate the sophisticated technical statements and terminology into plain text that is easily understandable by the board of trustees in the private sector or the jury in the public sector. Furthermore, several DF tools have built-in report generators as well as log reports, which record the procedures that the examiner performed.
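An examiner log of the kind described can be approximated by a simple append-only record; the class below is our own illustrative sketch of the idea, not the API of any tool named in this paper.

```python
import time

class ExaminationLog:
    """Append-only log of examiner actions, rendered as a plain-text
    report that a non-technical audience (board or jury) can follow."""

    def __init__(self, case_id):
        self.case_id = case_id
        self.entries = []

    def record(self, action, detail=""):
        # Each entry is timestamped at the moment the examiner performs the step.
        self.entries.append((time.strftime("%Y-%m-%d %H:%M:%S"), action, detail))

    def report(self):
        """Render the recorded steps as numbered plain-text lines."""
        lines = [f"Case {self.case_id}: {len(self.entries)} recorded step(s)"]
        for i, (ts, action, detail) in enumerate(self.entries, 1):
            lines.append(f"{i}. [{ts}] {action}: {detail}")
        return "\n".join(lines)
```

A log object would typically be created once per case, with every acquisition, hashing and recovery step recorded as it happens.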
3 The Proposed Ontology-driven Model for DF Investigations Under
the UC Environments
Ubiquitous Computing is the phenomenon of computing power being invisibly and intangibly embedded in the digital era. Evidently, the ultimate goal of UC is to provide end users with a seamlessly integrated wireless computing environment acting as a daily critical infrastructure
Fig. 2 The ontology-driven model for DF investigators with respect to UC
without the traditional barrier of advanced technologies. The UC environment is a complex system comprising computer hardware, software and intelligent interfaces that fulfill wireless connection with reliability, dependability, availability and interoperability. Nowadays, a variety of UC elements are present in numerous devices and objects, which are addressable and connected. Related research indicates that identification, location, sensing, and connectivity are the key elements in a UC environment [11,15]. Consequently, we propose an ontology-driven model for DF investigators under UC environments, as illustrated in Fig. 2.
We define the scope of collecting digital evidence with respect to each element in the proposed ontology-driven model. Hence, we have the following representations:

UC = I ∪ L ∪ S ∪ C
I = {MAC address, IP address, e-mail account, IM account, SIM card, (I, operating system)} where
(I, operating system) = (I, operating system, proprietary) ∪ (I, operating system, open source) with
(I, operating system, proprietary) = {Windows Vista, Windows XP, Windows 2000/2003};
(I, operating system, open source) = {Fedora, FreeBSD, RedHat};
L = {IP address, DNS resolution};
S = {Wi-Fi, WiMAX, 3G adapter, IrDA, Bluetooth};
C = {(C, context awareness), (C, service discovery), (C, ambient information), telepresence, mapped network drive} where
(C, context awareness) = {RSS, e-mail subscription};
(C, service discovery) = {available Wi-Fi/WiMAX/IrDA/Bluetooth hotspot, available mobile phone network hotspot};
(C, ambient information) = (C, ambient information, volatile data acquisition) ∪ (C, ambient information, storage devices) with
(C, ambient information, volatile data acquisition) = {cache/registry, routing table/process table/RAM, temporary file system};
(C, ambient information, storage devices) = {thumb drive, memory card (CF/MMC/SD/miniSD), (C, ambient information, storage devices, external HD)} with
(C, ambient information, storage devices, external HD) = {(C, ambient information, storage devices, external HD, Windows) ∪ (C, ambient information, storage devices, external HD, Linux/Unix), HFS Plus} and
(C, ambient information, storage devices, external HD, Windows) = {NTFS, FAT};
(C, ambient information, storage devices, external HD, Linux/Unix) = {root directory, extended file system}
For each single element, we can apply the topology illustrated in Fig. 1 to carry out the digital investigation of a given computer incident.
4 A DF Scenario Review for a UC e-Crime
The scenario occurred at a successful electronic switch design corporation in Asia: one employee quit his marketing job without notifying his superiors, which was quite abnormal for the organization. Meanwhile, the CEO of the company speculated that an industrial espionage conspiracy via ICT might have occurred, because a competitor had announced a newly designed product whose specification was almost identical to the one the company had just launched. The unusual coincidence led the CISO of the corporation to immediately check the vanished employee's desktop PC, where he initially found that all the contents of the data drive had been purposely erased. In addition, one thumb drive was found near the PC. The CISO decided to carry out a DF investigation of this suspicious computer incident. As stated above, due to the pervasiveness of UC infrastructures, end users can easily access wireless networks and process transactions. Some mobile computing users use a wireless adapter to connect to ubiquitous networks other than the corporate wireless network in order to avoid the auditing mechanism embedded in the corporate network. Under such circumstances, breaches of corporate intangible assets become a critical information security issue. In our proposed scenario, the CISO was also suspicious that the suspect had accessed other wireless networks from the office to stay away from the auditing and network traffic monitoring procedures enforced within the corporation. To find the possible causes of the potential computer incident, the CISO followed the ontology-driven model presented above.
Step 1 The CISO searched for nearby wireless networks whose signals were strong enough to make an Internet connection at the crime scene; the result is shown in Fig. 3. From the figure, the CISO was not able to conclude which wireless AP was
Fig. 3 The available wireless AP near the crime scene
Fig. 4 Capturing the disk-to-image file of the suspect's thumb drive
in use. However, two wireless APs were open to the public, which means the suspect may have taken advantage of the current UC environment. In other words, (C, service discovery) and S were positively verified.
Step 2 The write-blocker function was enabled before the following investigations to ensure the integrity of the digital evidence [12]. Initially, the DF specialist could not find any files in the thumb drive (ASUS 2GB) under the Windows XP operating system. In order to preserve the integrity of the digital evidence, the DF specialist should make an image of the thumb drive. Hence, the CISO applied ProDiscover Basic [19] to capture the disk-to-image file of the suspect's thumb drive, as indicated in Fig. 4. (C, ambient information, storage devices) was carried out based on the ontology-driven model. The obtained image file, 0101.eve, with a file size of 1.914GB, was stored on another newly formatted thumb drive (SONY 4GB) with the FAT file format [4], in accordance with (C, ambient information, storage devices, external HD, Windows) presented above. At this point, the suspect's thumb drive was tagged and bagged in an antistatic evidence bag and kept in the evidence safe. The following operations were performed on the new thumb drive (SONY 4GB) to preserve the original digital evidence from being lost or accidentally damaged during the investigation procedures.
Step 3 In order to guarantee the integrity of the probative digital evidence, the DF staff needed to calculate the MD5 and SHA-1 hash values before the subsequent forensic operations. Hence, the DF staff utilized AccessData-FTK Imager [20] to get the hash
Fig. 5 Digital signatures of the thumb drive with MD5 and SHA-1 hash values before examination
Fig. 6 The erased contents in the suspect’s thumb drive
values of the thumb drive. The values representing the digital signatures of the thumb drive are shown in Fig. 5. Hence, we obtained the following:

Hash_MD5(0101.eve, time stamp i) = 3c795cbc9add496f15b479d2dd53760a
Hash_SHA1(0101.eve, time stamp j) = 329662451980c4ac3b14b9c6d2e2d53b3179416f
Step 4 To analyze the imaged file of the suspect's thumb drive, the DF staff used ProDiscover Basic to import the disk-to-image file, 0101.eve, adding it as digital evidence on the DF workstation (ACER notebook with Microsoft Windows XP Professional version 2002, service pack 3, Intel Pentium M processor 2.00GHz, 1.00 GB RAM). The DF staff used ProDiscover Basic to read this disk-to-image file as though it were the original disk and found three hidden files (PST, JPG, and DOC files) in the thumb drive whose contents had been erased, as indicated in Fig. 6. The DF staff used the data viewing and data carving functions to visualize the contents of the files. From the file names of the JPG and DOC files, the DF specialist legitimately speculated that the employee might have stolen the new product blueprint, which requires super-user access authority on the company's Intranet; obviously, nobody but the CEO held such authority.
Hence, the DF staff applied ProDiscover Basic to recover the deleted file, product_spec.jpg, and demonstrated that the missing employee had been holding the blueprint of the newly designed product, which was highly confidential, as indicated in Fig. 7. From a DF point of view, the computer incident could be classified as vandalism of an intangible asset of the corporation.
Fig. 7 The blueprint of the newly designed product recovered from the suspect's thumb drive
Fig. 8 The detailed content of the deleted e-mails from the suspect’s thumb drive
Step 5 Furthermore, the DF specialist recovered the deleted PST file, trade_exam.pst, using ProDiscover Basic. The DF specialist then used AccessData-FTK to examine the trade_exam.pst file, which revealed the detailed content of the deleted e-mail, as Fig. 8 indicates. The DF specialist proved that the vanished employee had possessed some tremendously secret information regarding the new product and had mailed it out through Microsoft Outlook to the competitor. So far, the digital evidence above suggested that industrial espionage and the vandalizing of precious intangible assets had occurred within this enterprise, conducted by the missing employee.
Step 6 In order to verify the integrity of the collected digital evidence, the DF staff utilized AccessData-FTK Imager to recalculate the hash values of the thumb drive. The result, shown in Fig. 9, yielded the same MD5 and SHA-1 hash values as those in Fig. 5. This verified the integrity of the digital evidence, meaning that the collected digital evidence was not unexpectedly altered or damaged during the investigation stages. The data integrity was based on mathematical proof via the MD5 and SHA-1 algorithms.

Hash_MD5(0101.eve, time stamp m) = 3c795cbc9add496f15b479d2dd53760a
Hash_SHA1(0101.eve, time stamp n) = 329662451980c4ac3b14b9c6d2e2d53b3179416f

The DF staff obtained Hash_MD5(0101.eve, time stamp i) = Hash_MD5(0101.eve, time stamp m) and Hash_SHA1(0101.eve, time stamp j) = Hash_SHA1(0101.eve,
Fig. 9 The digital signatures of the imaged file of the thumb drive with MD5 and SHA-1 hash values after examination
Fig. 10 The suspect was using Microsoft Outlook Express to communicate with the outsider
time stamp n). Accordingly, we can conclude and guarantee that the digital evidence was not modified or tampered with during the investigation processes and could be probative in a court of law.
Step 7 Since all the files on the data drive of the suspect's desktop PC had been deleted, the DF specialist applied FINALeMAIL [21] to recover the deleted e-mails on the desktop and found that the suspect had been using Microsoft Outlook Express to communicate with the company's competitor, clearly an industrial espionage and computer-related crime, as indicated in Fig. 10. I and L were verified at this step.
5 Generic Assessment Criteria of DF Toolkits and the Related Issues
This paper does not intend to focus on any particular DF toolkit. However, our assessment was performed based on the similarities, strengths and weaknesses of some generic, common commercial software suites in the DF arena, as proposed above. Frankly speaking, no single DF toolkit can fulfill all purposes. The DF specialist should be versatile and skillful with a combination of several DF tool suites. In this section, we present three typical commercial DF tool suites (ProDiscover Basic, AccessData-FTK, and Guidance Software-EnCase eDiscovery) that are prevalent in the market, for reference [4,13,18]. The ontological and tabular assessment criteria for the above DF toolkits were based on the following functionalities and attributes: Acquisition, Integrity, Discrimination (during the Collecting stage), Recovery, Reconstruction (during the Analyzing stage), Log Reports and Report Generator (during the Presenting stage).
Duplication of the suspect’s hard drive or attached data storage is the first priority to avoid
the magnetic data from being lost or interfered during the investigation stages. Basically,
there are two ways being applied to fulfill the above goal: physical data duplication and
logical data duplication. In some cases, we must utilize the logical data duplication due to
the issue that the suspect’s drive may have been encrypted, which is increasingly empha-
sized. The Graphical User Interface (GUI) approach is definitely a user friendly mecha-
nism. However, in some DF software suites, the GUI approach can consume lots of system
resources. Consequently, the command line approach provides an alternative for the DF
staffs, especially when the Windows operating system was not able to be normally booted.
Under such circumstances, the command line approach could be the only one way to start
the case. As the UC infrastructures become pervasive, the DF specialists can remotely pro-
cess the computer security incident whenever and wherever the trustworthy UC networks
can be available. In that case, the efficiency of the DF for a certain computer incident will
be highly increased and this will definitely stimulate and speed up the prevalent estab-
lishment for the cyber forensics lab under the UC environments to face the unprecedented
computer crimes challenges. For the purpose of checking the integrity of the digital evi-
dence collected, the DF staffs have to perform the hashing algorithm for the digital evidence
being examined. MD5 or SHA-1 hash value certifies the integrity of the digital evidence
being explored because we can not perform the investigation on the original digital evi-
dence.
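The integrity check described above can be sketched in a few lines of chunked hashing; the file name and the recorded acquisition digest in the usage comment are hypothetical.

```python
import hashlib

def evidence_digest(path, algorithm="sha1", chunk_size=1 << 20):
    """Hash a potentially very large image file in chunks, so the whole
    drive image never has to fit in memory; returns the hex digest."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Integrity check: the working copy must reproduce the digest recorded at
# acquisition time (file name and recorded digest are hypothetical).
# assert evidence_digest("suspect_drive.dd") == recorded_acquisition_sha1
```

Any later mismatch between the two digests indicates that the working copy no longer matches the evidence as acquired.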
The filtering function can dramatically reduce search time when the capacity of the storage device is very large; finding digital trails on a 500 GB hard drive without it would be a time-consuming and tedious task. Furthermore, in some computer crime incidents, renaming the file extension is a common strategy for evading the filtering function. File header analysis therefore provides the functionality to unveil data hidden in this way within the image file of the suspect’s drive.
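A file-header check of the kind described can be sketched as follows; the signature table is a small assumed subset for illustration, not an exhaustive reference.

```python
# Assumed subset of file signatures (magic numbers) for illustration.
SIGNATURES = {
    b"\xff\xd8\xff": "jpg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"PK\x03\x04": "zip",
}

def header_type(first_bytes):
    """Identify a file type from its leading bytes, if the signature is known."""
    for magic, ftype in SIGNATURES.items():
        if first_bytes.startswith(magic):
            return ftype
    return None

def extension_mismatch(filename, first_bytes):
    """Flag a file whose extension disagrees with its header signature,
    a common trick for evading extension-based filtering."""
    actual = header_type(first_bytes)
    claimed = filename.rsplit(".", 1)[-1].lower()
    return actual is not None and actual != claimed

# A JPEG renamed to .txt is flagged:
print(extension_mismatch("notes.txt", b"\xff\xd8\xff\xe0\x00\x10JFIF"))  # True
```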
The data viewing mechanism gives the examiner a convenient shortcut for previewing files in their original formats, while the keyword search function filters out irrelevant entries and speeds up the data crunching procedure. The data carving function is essential for extracting data from unallocated areas of a drive or an image file: once an entire file structure or its fragments are located, DF staff can carve the data out and copy it into a new file for detailed investigation. Unfortunately, encrypted files and file systems impose additional overhead on DF staff; if a DF software suite embeds decryption functionality, it obviously alleviates the burden on the DF team.
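The carving idea above can be sketched for a single file type. Real carving tools handle fragmentation, nested markers, and validation; this shows only the core scan-and-copy loop, and the image file name in the usage comment is hypothetical.

```python
# Start-of-image and end-of-image markers for JPEG files.
JPEG_START, JPEG_END = b"\xff\xd8\xff", b"\xff\xd9"

def carve_jpegs(data):
    """Yield byte spans in `data` that look like complete JPEG files."""
    pos = 0
    while True:
        start = data.find(JPEG_START, pos)
        if start == -1:
            return
        end = data.find(JPEG_END, start + len(JPEG_START))
        if end == -1:
            return
        yield data[start:end + len(JPEG_END)]
        pos = end + len(JPEG_END)

# Usage against a hypothetical image of unallocated clusters:
# with open("unallocated.dd", "rb") as f:
#     for i, blob in enumerate(carve_jpegs(f.read())):
#         with open("carved_%d.jpg" % i, "wb") as out:
#             out.write(blob)
```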
Image-to-disk cloning is a typical approach in many cases, and disk-to-disk cloning is also convenient during the Reconstruction stage. However, the DF examiner must be aware of the limitations of certain software suites. In some situations, when DF staff perform a disk-to-disk clone, the target disk needs to be identical to the suspect’s device, including the make, size, cylinder, sector, and track counts. Consequently, some DF software suites have built-in functionality that can force the geometry of the target drive to match that of the suspect’s drive. Additionally, image-to-disk and image-to-partition cloning may take a longer period of time, because this kind of data transfer is inherently slower or the suspect’s disk capacity is relatively large. Some software suites bundle functions to generate log reports, which precisely record the operations performed by the DF staff, and a report generator can facilitate the final official documentation of the investigation to be submitted to the board of trustees in the private sector or to the court of law in the public sector.
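A minimal operation log of the kind such suites generate might look like the sketch below; the field set, case number, and examiner name are assumptions for illustration.

```python
from datetime import datetime, timezone

class OperationLog:
    """Sketch of an examiner's operation log; the field set mimics what
    bundled log generators record and is an assumption for illustration."""

    def __init__(self, case_id, examiner):
        self.case_id, self.examiner, self.entries = case_id, examiner, []

    def record(self, action, target):
        # Each entry is timestamped in UTC so the sequence of operations
        # can be reconstructed unambiguously later.
        self.entries.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "examiner": self.examiner,
            "action": action,
            "target": target,
        })

    def report(self):
        lines = ["Case %s - operation log" % self.case_id]
        lines += ["%s %s: %s (%s)" % (e["time"], e["examiner"],
                                      e["action"], e["target"])
                  for e in self.entries]
        return "\n".join(lines)

# Hypothetical case number and examiner name:
log = OperationLog("2009-017", "examiner_a")
log.record("disk-to-disk clone", "suspect HDD to target HDD")
log.record("hash verification", "SHA-1 of clone vs. original image")
print(log.report())
```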
Finally, the generic assessment criteria for the proposed DF software suites are presented in an ontological and tabular format, as illustrated in Fig. 11. Although DF examiners may use different toolkits, the generic functionalities presented in this paper provide clear guidelines as assessment criteria for adopting suitable DF toolkits.
An Ontology-driven Model for Digital Forensics Investigations of Computer Incidents
Fig. 11 Generic assessment criteria of DF toolkits
6 Conclusion
As UC infrastructures become prevalent, escalating computer crimes and the continuously exploited vulnerabilities of ISs under UC environments compel DF specialists to take the phenomenon seriously. Nowadays, ICT is widely incorporated into the operations of enterprises via diverse trustworthy UC infrastructures, and there is an urgent need for DF specialists to combat ICT-savvy criminals and e-crime syndicates. In this paper, we provided an ontology-driven model for DF investigators based on the basic elements of UC. Demonstrably, DF specialists rely heavily on DF tools, and no single DF tool suite can encompass all possible situations because of the dynamic complexity of every computer incident. Consequently, some generic assessment criteria for DF toolkits were discussed and presented along with an ontological and tabular representation. This paper does not intend to recommend any particular DF suite; rather, it argues that assessment criteria for DF toolkit suites are extremely important in this challenging field. We also conducted a DF scenario review based on the ontology-driven model we presented, which few studies have done.
References
1. Andrew, M. W. (2007). Defining a process model for forensic analysis of digital evidence devices
and storage media. In Proceedings of the 2nd international workshop on systematic approaches to
digital forensic engineering (SADFE 2007). IEEE.
2. Anson, S., & Bunting, S. (2007). Windows network forensics and investigation. New York: Wiley
Publishing.
3. Battistoni, R., Biagio, A. D., Pietro, R. D., Formica, M., & Mancini, L. V. (2008). A live digital forensic system for windows networks (Vol. 278, pp. 653–667). International Federation for Information Processing (IFIP). Boston: Springer.
4. Brinson, A., Robinson, A., & Rogers, M. (2006). A cyber forensic ontology: Creating a new approach
to studying cyber forensics. Digital Investigation, 3, 37–43.
5. Burmester, M., & Mulholland, J. (2006). The advent of trusted computing: Implications for digital forensics. In Proceedings of the 2006 ACM symposium on applied computing (pp. 283–287), April 23–27, Dijon, France.
6. Casey, E. (2006). Investigating sophisticated security breaches. Communications of the ACM, 49(2), 48–54.
7. Cassidy, R. F., Chavez, A., Trent, J., & Urrea, J. (2008). Remote forensic analysis of process control systems (Vol. 253, pp. 223–235). International Federation for Information Processing (IFIP), Critical Infrastructure Protection. Boston: Springer.
8. Chaikin, D. (2006). Network investigations of cyber attacks: The limit of digital evidence. Crime, Law and Social Change, 46, 239–256.
9. Chen, H., et al. (2004). Crime data mining: A general framework and some examples. IEEE
Computer, 37(4), 50–56.
10. Craiger, P. (2007). Training and education in digital evidence. In Handbook of digital and multimedia forensic evidence (pp. 11–22). Totowa, NJ: Humana Press.
11. Ley, D. (2007). Ubiquitous computing. Emerging technologies for learning. Becta, 2, 64–79.
12. Lyle, J. R. (2006). A strategy for testing hardware write block devices. Digital Investigation, 3, 3–9.
13. Nelson, B., Phillips, A., Enfinger, F., & Steuart, C. (2008). Guide to computer forensics and investigations
(3rd ed., pp. 223–235). Course Technology.
14. Nena, L. (2006). Crime investigation: A course in computer forensics. Communications of AIS, 18, 2–34.
15. Oh, S. M., Kim, Y. M., Jang, J. H., Koh, B. S., & Choi, Y. R. (2007). A study of volatile information collection of computer forensics system for computer emergency based on ubiquitous computing. In 3rd International Conference on Natural Computation (ICNC 2007). IEEE.
16. Pollitt, M. (2007). An ad hoc review of digital forensic models. In Proceedings of the 2nd international
workshop on Systematic Approaches to Digital Forensic Engineering (SADFE 2007). IEEE.
17. Richard, G. G., III, & Roussev, V. (2006). Next-generation digital forensics. Communications of the ACM, 49(2), 76–80.
18. http://www.encase.com.
19. http://www.techpathways.com.
20. http://www.accessdata.com.
21. http://www.finaldata.com.
Author Biographies
Hai-Cheng Chu received his Ph.D. in Systems Science and Industrial Engineering in 1996 and his M.S. in Computer Science in 1992, both from SUNY at Binghamton. In 1996 he was a senior software engineer at Cheyenne Software, New York, U.S.A. Dr. Chu has lectured at the University of Northern British Columbia (UNBC) in Canada. Currently, he is an associate professor in the Department of Information Management/International Business/EMBA of Tunghai University, Taiwan. He has devoted himself to e-Commerce over the past decade in both academic and industrial arenas. Dr. Chu attended the Harvard Business School PCMPCL V program in 2007. He has authored several textbooks on Management Information Systems, e-Commerce, Global Logistics Management, Commercial Automation, and Systems Analysis and Design. His current research focuses on cyber terrorism and digital forensics.
Der-Jiunn Deng received the B.S. degree in computer science from Tunghai University, the M.S. degree in information management from National Taiwan University of Science and Technology, and the Ph.D. degree in electrical engineering from National Taiwan University in 1997, 1999, and 2005, respectively. He joined National Changhua University of Education as an assistant professor in the Department of Computer Science and Information Engineering in August 2005 and became an associate professor in February 2009. Dr. Deng visited Iowa State University, USA, in 2006. His research interests include multimedia communication, quality of service, and wireless networks. Dr. Deng was the guest editor of the special issue on Ubiquitous Multimedia Computing: Systems, Networking, and Applications for the International Journal of Ad Hoc and Ubiquitous Computing (IJAHUC) and of the special issue on Internet Resource Sharing and Discovery for the Journal of Internet Technology (JIT). He also served on several technical program committees for IEEE and other international conferences, and is now serving as a program chair for the 2nd International Conference on Computer Science and its Applications (CSA-09). Dr. Deng is a member of the IEEE.
Han-Chieh Chao is a jointly appointed Professor of the Department of Electronic Engineering and the Institute of Computer Science and Information Engineering, National Ilan University, I-Lan, Taiwan. He also holds a joint professorship in the Department of Electrical Engineering, National Dong Hwa University, Hualien, Taiwan. His research interests include high speed networks, wireless networks, and IPv6-based networks and applications. He received his M.S. and Ph.D. degrees in Electrical Engineering from Purdue University in 1989 and 1993, respectively. Dr. Chao also serves as an IPv6 Steering Committee member and Deputy Director of the R&D division of NICI Taiwan, and as Co-chair of the Technical Area for the IPv6 Forum Taiwan. Dr. Chao is an IEEE senior member and a Fellow of the IET and the BCS.