Requirements in digital forensics method definition:
observations from a UK study
Angus M. Marshall, Richard Paige
Dept. of Computer Science, University of York, UK
Email address: angus.marshall@york.ac.uk (Angus M. Marshall)
This work was supported by the University of York Research Priming Fund.
Preprint submitted to Digital Investigation, August 31, 2018. (DIIN_2018_193_R1, accepted for publication 11 September 2018.)
Abstract
During a project to examine the potential usefulness of evidence of tool
verification as part of method validation for ISO 17025 accreditation, the
authors have examined requirements statements in several digital forensic
method descriptions and tools. They have identified that there is an absence
of clear requirements statements in the methods and a reluctance or inability
to disclose requirements on the part of tool producers. This leads to a break
in evidence of correctness for both tools and methods, resulting in incomplete
validation. They compare the digital forensics situation with other ISO 17025
accredited organisations, both forensic and non-forensic, and propose a means
to close the gap and improve validation. They also review existing projects
which may assist with their proposed solution.
Keywords: ISO 17025, ISO 27041, quality standards, method validation, tool verification, forensic tool development
1. Introduction
ISO/IEC 27041 [1], as part of a group of standards dealing with digital investigations, is the standard which describes a process by which a method can be shown to be fit for its intended purpose. To achieve this, it proposes a process for the validation of methods used in a digital investigation. Within the description of validation it suggests that evidence of a tool's verification against a declared set of requirements can be used as a means to reduce the amount of validation required for processes in which the tool participates, i.e. it suggests that those process requirements which are wholly satisfied by the tool, and for which evidence of verification exists, need not be subjected to further testing.

Note: in this project we have concentrated solely on the validation and verification issue. The other standards in the group propose models of evidence gathering and processing which, although useful, are not considered core issues for this work.
From the perspective of software engineering, the proposal in ISO/IEC 27041 [1] is entirely acceptable. However, for such a mechanism to succeed, the tool and the process in which it participates must be specified in terms of requirements which can be mapped against each other to show how the tool conforms to, or partially fulfils, the requirements of the process.

In effect, the proposal is that there is some degree of overlap between tool requirements and method requirements, ranging from the possibility that a tool's requirements are a complete subset of a method's requirements (Figure 1) to the, potentially, less likely situation where a method's requirements are a subset of a tool's (Figure 2).
Figure 1: Tool requirements are a subset of method. Typical of specialist tools or small tools produced to assist with part of a method. (Shaded area = the set of requirements which must be satisfied for validation.)
Figure 2: Method requirements are a subset of tool. Considered rare, but possible where a method exactly follows a process defined by the tool producer and uses only a subset of the tool functionality. (Shaded area = the set of requirements which must be satisfied for validation.)

In practice, because some of the requirements for a method with an investigative context will be non-technical in nature, it is believed that the most common situation will be that shown in Figure 3, where a tool's requirements intersect with those of a method, and only those tool requirements lying in the intersection are relevant to the validation of the method.
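To make the intended mapping concrete, the following sketch (in Python, with entirely hypothetical requirement identifiers of our own invention; neither standard prescribes any notation) treats tool and method requirements as sets, so that the subset case of Figure 1 and the intersection case of Figure 3 can be checked directly:

# Illustrative sketch (not from the standards): modelling method and tool
# requirements as sets so the Figure 1-3 relationships can be checked.
# All requirement identifiers are hypothetical.

method_reqs = {"write-block source", "hash image (SHA-256)",
               "log acquisition errors", "record chain of custody"}
tool_reqs = {"write-block source", "hash image (SHA-256)",
             "log acquisition errors"}

# Figure 1 case: every tool requirement is also a method requirement.
is_subset = tool_reqs <= method_reqs

# Figure 3 case: only requirements in the intersection are relevant to
# method validation; verified tool requirements need no further testing.
covered_by_tool = method_reqs & tool_reqs
still_to_validate = method_reqs - tool_reqs

print(f"Tool reqs are a subset of method reqs: {is_subset}")
print(f"Covered by verified tool: {sorted(covered_by_tool)}")
print(f"Remaining for method validation: {sorted(still_to_validate)}")

In the Figure 3 case, only the intersection is dischargeable by tool verification evidence; the remainder (here, the non-technical chain-of-custody requirement) must still be validated at method level.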
Figure 3: Tool requirements intersect with the method. Common where the tool fulfils some or all of the technical requirements, but there are other non-technical requirements to be satisfied. (Shaded area = the set of requirements which must be satisfied for validation.)

During research into how this mechanism could be applied in practice, particularly to allow producers of tools for digital forensic processes to support their customers' compliance with ISO 17025's validation requirement [2] (in this document we concentrate on the use of ISO 17025:2005 as the currently deployed standard; we consider the implications of transition to the 2017 version in the Conclusion), through disclosure of evidence of testing and without compromising commercially sensitive information such as details of test data, the authors have found that such a mapping appears, at the time of writing, to be impossible to perform. This is because it has proved impossible to obtain the necessary levels of information about requirements from any of the participants in the study. Two main factors appear to affect this:

Firstly, the process definitions examined in our study do not contain any technical requirements which can be mapped. Rather, they contain primarily non-technical requirements aligned to the needs of the Criminal Justice System.
Secondly, the tool producers are either unable (in the case of most small providers) or unwilling (in the case of most larger providers) to provide information about how they capture customer requirements, let alone disclose what those requirements are.

Some even went as far as responding to the request for information with statements such as "The information you seek is commercially sensitive as we operate in a very competitive landscape. Unfortunately, we can't give out any specifics on our product development techniques to third parties." The authors struggle to understand this type of response as our questions related to high-level development models and requirements capture methods rather than specific details of implementation of tools or tests. We can only surmise that the tool providers who responded in this way either lack confidence in their own products or believe that they are using innovative development techniques which no other developer has considered.
2. Principles of ISO 17025
Before examining the concept of validation more closely, it may be helpful to review some of the principles which underpin ISO 17025, which are embodied in the earlier version and which have influenced its use in "non-forensic" organisations such as those carrying out calibration of tools or testing of chemical compounds or metal alloys.
Gravel [3], writing in 2002 about the 1999 version of ISO 17025, described eight principles embodied within the standard:
Capacity: Concept that a laboratory has the resources (people with the required skills and knowledge, the environment with the required facilities and equipment, the quality control, and the procedures) in order to undertake the work and produce competent results.

Exercise of responsibility: Concept that persons in the organisation have the authority to execute specific functions within the overall scope of work and that the organisation can demonstrate accountability for the results of the work.

Scientific method: Concept that the work carried out by the organisation is based on accepted scientific approaches, preferably consensus-based, and that any deviations from accepted scientific approaches can be substantiated in a manner considered generally acceptable by experts in that field.

Objectivity of results: 1. Concept that the results produced within the scope of work of the organisation are mainly based on measurable or derived quantities. 2. Concept that subjective test results are produced only by persons deemed qualified to do so and that such results are noted as being subjective, or are known by experts in that field of testing to be mainly subjective.

Impartiality of conduct: Concept that the pursuit of competent results through the use of generally accepted scientific approaches is the primary and overriding influence on the work of persons executing tests - all other influences being considered secondary and not permitted to take precedence.

Traceability of measurement: 1. Concept that the results produced, within the scope of work of the laboratory, are based on a recognised system of measurement that derives from accepted, known quantities (SI system) or other intrinsic or well-characterised devices or quantities. 2. Concept that the chain of comparison of measurement between these accepted, known quantities or intrinsic devices or quantities, and the device providing the objective result, is unbroken for the transfer of measurement characteristics, including uncertainty, for the whole of the measurement chain.

Repeatability of test: Concept that the test which produced the objective results will produce the same results, within accepted deviations, during subsequent testing, and within the constraints of using the same procedures, equipment and persons used during a previous execution of the test.

Transparency of process: Concept that the processes existent within the laboratory producing the objective results are open to internal and external scrutiny, so that factors which may adversely affect the laboratory's pursuit of objective results based on scientific method can be readily identified and mitigated.
With the exceptions of Capacity and Exercise of responsibility, these principles establish a need to show, not just that a chosen method satisfies requirements for an intended use, but that the method is fundamentally correct or sound, and satisfies broader-ranging technical requirements.

From our reviews of both the 2005 and 2017 versions of ISO 17025, it appears that these principles have been retained in the most recent version of the standard.
3. Application of ISO 17025:2005 to "non-forensic" disciplines
A regularly voiced criticism of ISO 17025 is that it is, as its title suggests, intended for Testing and Calibration laboratories. In order to understand how ISO 17025 is applied in these "non-forensic" organisations, and to determine if or how it is applied differently in a forensic context, the authors carried out a review of publicly available accreditation records.

The United Kingdom Accreditation Service (UKAS) maintains a register of accredited bodies [4] which is open for public inspection. The entries in this register include detail of each test for which a body has been accredited, giving a brief description of the method used where appropriate or necessary.
Examination of a sample of 100 accredited organisations in a range of "non-forensic" and "non-medical" areas reveals that these organisations apply two approaches to defining the requirements for their accredited processes:

Physical properties: Where precise measurement of physical properties is possible (e.g. for volumetric, force, torque, acoustics), the schedules of accreditation specify, using SI units, the range of measurement possible and the tolerances (uncertainty) allowed for that measurement.

External standards: In other circumstances, where an industry has defined its own standards, the accreditation is based on implementation of the published standard, which either defines the range and uncertainty for the measurement, or defines the method itself.
In both of these cases, the requirements for the method, and thus its validation, are available in published form (either directly in the schedule of accreditation or in the published standard) and thus can be subjected to independent scrutiny and adopted by others practising in the same technical field. In fact, the published requirements allow an independent verification of the method to show correctness in the form of conformance to a general set of standardised requirements rather than just conformance to the requirements for a particular use-case.

Moreover, the presence of these published criteria allows customers to identify those testing bodies whose methods may satisfy their needs before entering into discussions with the testing body. In effect, the listed requirements and associated tests become a menu from which the customer and test body can choose the most appropriate way of meeting the customer's particular needs.
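As an illustration of how such published criteria function as a menu, the following sketch (our own construction; the scope entries, ranges and uncertainties are invented, not taken from any real UKAS schedule) checks a customer's measurement need against a published scope:

# Illustrative sketch: how a published schedule of accreditation (range plus
# uncertainty, in SI units) lets a customer check coverage before approaching
# a testing body. All scope entries and values are hypothetical.
from dataclasses import dataclass

@dataclass
class ScopeEntry:
    quantity: str        # e.g. "torque (N*m)"
    low: float           # lower bound of accredited range
    high: float          # upper bound of accredited range
    uncertainty: float   # published expanded uncertainty for the range

    def covers(self, value: float, max_uncertainty: float) -> bool:
        """True if the value lies within the accredited range and the
        published uncertainty is acceptable for the customer's need."""
        return (self.low <= value <= self.high
                and self.uncertainty <= max_uncertainty)

schedule = [
    ScopeEntry("torque (N*m)", 0.1, 1000.0, 0.5),
    ScopeEntry("force (N)", 1.0, 50_000.0, 2.0),
]

# Customer needs a 250 N*m torque measurement with uncertainty <= 1 N*m.
suitable = [e for e in schedule
            if e.quantity.startswith("torque") and e.covers(250.0, 1.0)]
print("Suitable scope entries:", suitable)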
4. A Discussion of Validation
In many discussions of accreditation against the standard, the concept of "validation of the tool" or even "tool accreditation" is raised by users and vendors as a means of shortening or eliminating the process. To the authors, this hints that there may be either some confusion about the meanings of these terms, or a different use of language in effect. It is, therefore, instructive to consider the software engineering distinction between verification and validation and contrast it with the ISO 17025 view.
4.1. ISO 17025:2005 approach to validation
ISO 17025:2005 [2] contains no direct definition of validation but, in accordance with ISO practice, refers the reader to ISO 17000 and ISO 9000 for inheritance of relevant definitions. This practice, of relying on definitions found in other standards, is common across the ISO range of standards, but can cause problems for some users: they may perceive a requirement to have access to the defining standard as well as the standard they are trying to implement, or they may rely solely on common usage of the word as opposed to ISO's stipulative definitions (aka the "Humpty Dumpty" rule: "When I use a word... it means just what I choose it to mean" [5]). In practice, ISO provides an Online Browsing Platform [6] (OBP) which allows access to definitions and some other text without further expenditure.

Using the OBP, the authors have found that ISO 17000 contains no definition of validation. Thus the ISO 9000:2005 [7] definition should be used, as this is the most recently published version prior to the publication of ISO 17025:2005. This gives the following definition of validation:
Confirmation, through the provision of objective evidence, that requirements for a specific intended use or application have been fulfilled.

NOTE 1 The term validated is used to designate the corresponding status.

NOTE 2 The use conditions for validation can be real or simulated.

and defines objective evidence as

Data supporting the existence or verity of something.

NOTE: Objective evidence may be obtained through observation, measurement, test, or other means.

with requirement as

Need or expectation that is stated, generally implied or obligatory.

Note 1 to entry: Generally implied means that it is custom or common practice for the organization (3.3.1), its customers (3.3.5) and other interested parties (3.3.7), that the need or expectation under consideration is implied.

Note 2 to entry: A qualifier can be used to denote a specific type of requirement, e.g. product requirement, quality management requirement, customer requirement.

Note 3 to entry: A specified requirement is one that is stated, for example in a document (3.7.2).

Note 4 to entry: Requirements can be generated by different interested parties (3.3.7).

Note 5 to entry: This definition differs from that provided in 3.12.1 of ISO/IEC Directives, Part 2:2004: "requirement: expression in the content of a document conveying criteria to be fulfilled if compliance with the document is to be claimed and from which no deviation is permitted".
This suggests that validation is a demonstration of suitability for a particular use-case, that the requirements for a validated process should be derived from the intended use-case, and that validation should be the process of obtaining data which shows that a method or process meets those specific requirements.
4.2. Software Engineering approach to verification and validation
In the world of digital forensics we tend to rely on third-party tools which we trust have been produced in accordance with good engineering practices. For the most common analytical tools, this is software which we trust has been correctly specified, implemented and tested. However, the responses to our questions about development models suggest that there is some disconnect between the tool producers and the way end-users are expected to provide evidence of fitness for purpose. In order to understand how this may have arisen, we turned to a consideration of software engineering terminology to discover if there is a fundamental conceptual difference.

In software engineering, we commonly paraphrase verification as "are we building the product right?" and validation as "are we building the right product?" [8], i.e. verification is a demonstration of the correctness of the product whereas validation is a demonstration of suitability for a particular use. More formally, the IEEE Standard Glossary of Software Engineering Terminology [9] states these as:
Verification
(1) The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.
(2) Formal proof of program correctness.

Validation
The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.

For completeness, [9] also defines a requirement as

(1) A condition or capability needed by a user to solve a problem or achieve an objective.
(2) A condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed documents.
(3) A documented representation of a condition or capability as in (1) or (2).
These definitions are completely consistent with those found in the ISO and ISO/IEC standards under consideration.

Software products should, therefore, be subjected to verification during development, to show that they are correct and complete, and to validation post-development, to show that they meet the requirements for their intended use-cases. In more common terms, the validation test can be considered to be an acceptance test.

In the case of custom software, produced in response to a particular problem, the process of verification could result in validation for that problem. In the case of off-the-shelf software (e.g. word processors, spreadsheets, common forensic tools), however, verification during the development phases is based on a generic statement of requirements which meets the needs of a perceived customer or a group of idealised customers. It is the responsibility of the customer to ensure that the verified tool provides a valid solution to their problem as part of the procurement and pre-deployment process.

It is, thus, entirely possible to verify a product which cannot be validated because it does not provide a suitable solution to the problem under consideration (e.g. a custom-built spreadsheet may be completely correctly built but unusable as a presentation package), and it is also possible to validate an unverified product by showing that, despite its inherent flaws, the product satisfies a particular case-specific set of requirements. For example, a calculator which always states that 2+2=5 is unlikely to be verifiable, but can participate in a validated method where the requirement is to calculate that 3+3=6. Similarly, a tool designed to parse FAT filesystems only will not parse NTFS. It is, therefore, not verifiable for NTFS but can participate in methods which are validated for examination of a FAT-formatted filesystem.

In the latter case, the unverified product cannot be shown to have any utility beyond the limited circumstances for which it is validated.

In the former case, however, the verified product may be useful in other situations and the presence of evidence of verification can be used to assist the process of choosing it as a potential solution - i.e. the evidence of verification may show that the validation requirements have already been met during the development process.

This depends entirely on the existence of suitable statements of requirements for both the tool as it was developed and the situation in which it is to be used, and satisfactory evidence that those requirements have been satisfied.
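The calculator example above can be restated as a short sketch (our own illustration; the specification and test cases are invented) showing a product that fails verification against its general specification yet passes a case-specific validation test:

# Illustrative sketch of the calculator example. The adder is deliberately
# faulty: it fails verification against its general specification, yet
# still passes a case-specific validation test.

def broken_add(a: int, b: int) -> int:
    # Faulty implementation: 2 + 2 yields 5.
    return 5 if (a, b) == (2, 2) else a + b

def verify() -> bool:
    """Verification: does the product satisfy its general specification
    ('add returns the arithmetic sum for all inputs')?"""
    cases = [(0, 0, 0), (2, 2, 4), (3, 3, 6)]
    return all(broken_add(a, b) == expected for a, b, expected in cases)

def validate_for_use_case() -> bool:
    """Validation: does the product meet the requirements of this specific
    intended use (here, only 3 + 3 = 6 is ever needed)?"""
    return broken_add(3, 3) == 6

print(f"Verified (correct in general): {verify()}")                  # False
print(f"Validated for the 3+3 use-case: {validate_for_use_case()}")  # True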
4.3. Implications for method validation
Given that the definitions and usage of validation and verification, as outlined above, appear to be consistent, it should, therefore, be possible to use software engineering evidence of verification, as suggested in ISO/IEC 27041 [1], as part of the validation of a suitably documented method.
5. Our study

5.1. Laboratory documentation
In our study, we examined a small, randomly chosen set of Standard Operating Procedures (SOPs) and validation plans and records from two accredited digital forensic laboratories. The SOPs were written in a format which appears to be based on the SWGDE model [10] and to be consistent with the accepted standard format within forensic science laboratories in the UK. These contain sections detailing Purpose, Scope, Equipment, Limitations, Procedure, Processing, Success/Failure Criteria and References. None of these SOPs contained any obvious definitions of technical requirements. Rather, they tend to define success in terms of processing completing without any errors being reported, and give a broad area of application in the Scope statement.
Validation plans contained some identified requirements, but these were arranged as End User (the Criminal Justice System), Legal (including compliance with ISO 17025), Compatibility (output format only) and Ethical. No obvious low-level technical requirements were specified in any of the plans.

Validation records showed that validation processes tended to consist of evidence that the process under test produced the same results as the same process run on other equipment, or that it produced expected results from a particular test case.

The testing thus satisfied the letter of the ISO 17025:2005 description of validation, but may not have achieved the level suggested by the principles in [3], particularly in respect of Traceability and Transparency.
This apparent failing is not thought to be a problem for other forensic disciplines whose roots lie in other sciences such as chemistry, physics or biology, where the methods used in forensic laboratories are specific adaptations of well-known methods which are used for other purposes and which have been subjected to rigorous peer review through publication and extensive use in other work.

Digital forensics, however, has its roots in engineering and is highly reliant on reverse-engineering of decisions and implementations made by others. Many of these implementations (e.g. hard disc firmware, filesystem implementations, data caching) are not published or reviewed as they are commercially sensitive and/or there is no need for the majority of users/customers to have any particular interest in the low-level implementational detail which is of particular interest to a digital forensic examiner or analyst. As a result, it may be considered to be difficult for producers or users of forensic tools to show that the tools are actually correct except by potentially lengthy and costly empirical methods.
This is compounded by a fundamental difference in the nature of the way in which off-the-shelf software (OTSS) is used. In a non-forensic context, OTSS is typically intended to process inputs provided by a user in order to generate a particular output. In this situation, the inputs are known, or can be examined, before the output is seen and thus detection of incorrect results can be simple. In the forensic context, however, examinations start with a source of potential evidence whose contents are unknown. Thus the inputs to the whole forensic process are unknown. Although the user may have some experience of what abnormal outputs look like, this depends entirely on the tool actually producing abnormal outputs or indications of errors. It is entirely possible for a tool to process inputs incorrectly and produce something which still appears to be consistent with correct operation. In the absence of objective verification evidence, assessment of the correctness, or otherwise, of any results produced by a tool relies solely on the experience of the operator.

It should also be borne in mind that updates to hardware and software may have no apparent effect on system behaviour as far as a typical user is concerned, but may dramatically change the way in which internal processing is carried out and data is stored. This impacts both on the ability to recover and interpret data and on the behaviour of the tools used to perform these operations.
5.2. Vendor evidence of verification
Our study circulated a questionnaire and received 14 responses from tool providers. Of these, two could be considered major providers, although one is more focussed on e-Discovery than criminal investigations.

The 12 small providers seemed confused about what was meant by customer requirements, with responses including "I'm my own customer", "Sorry, I don't understand the question", "Forums, social media" and "I do not - many potential customers seem utterly bemused why they should be interested at all". Of the 14, three identified the use of JIRA/Confluence/GitHub as a means of deriving requirements and three others identified meetings and communications with end users as the mechanisms used.

When asked how they demonstrated that their tool satisfied user requirements, responses included use of NIST test disc images, use within ISO 17025 accredited laboratories, and meetings. Only one of the survey group mentioned compliance testing.
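Respondents' references to NIST test disc images suggest one concrete form of objective evidence: comparing tool output against published reference data. A minimal sketch of such a check follows (the file path and expected digest are hypothetical placeholders, not real NIST values):

# Illustrative sketch: checking a tool's output against a reference test
# image by comparing cryptographic hashes. Path and digest are placeholders.
import hashlib
from pathlib import Path

EXPECTED_SHA256 = "0" * 64  # placeholder: published digest for the test image

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def acquisition_is_verified(image: Path) -> bool:
    """Objective evidence: the acquired image's digest matches the
    published reference digest for the test data set."""
    return sha256_of(image) == EXPECTED_SHA256

# Usage (hypothetical path):
# print(acquisition_is_verified(Path("acquired_test_image.dd")))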
We also, as noted in the introduction, met with considerable resistance from some of the better-known providers when we asked for information about this topic. As a result, we cannot provide objective evidence for any degree of confidence that tool providers are meeting the genuine requirements of the digital forensic laboratories.

Customers for the tools have little incentive to consider the technical requirements as it seems possible to obtain accreditation to ISO 17025:2005 without them, and most tool providers are either unable or unwilling to provide evidence that they have verified their tools against any customer or technical requirements.
6. Transition to ISO 17025:2017

The position in respect of accreditation to ISO 17025:2017 [11] may be somewhat different, as this now contains definitions of validation and verification which are very similar to those used in ISO 27041 and the software engineering world, viz:

Validation: Verification, where the specified requirements are fit for an intended use

Verification: Provision of objective evidence that a given item fulfils specified requirements

Thus validation appears, in the newer version, to be reliant on verification against specified requirements and comparison of those requirements with the requirements of the intended use-case.
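Read this way, the 2017 definitions compose naturally. The sketch below paraphrases our reading (the function names and set-based model are ours, not the standard's):

# Illustrative paraphrase of the ISO 17025:2017 definitions as we read them.
# Function names and structure are our own; the standard defines no code.

def verified(item_evidence: dict, specified_requirements: set) -> bool:
    """Verification: objective evidence exists that the item fulfils each
    specified requirement."""
    return all(item_evidence.get(req) for req in specified_requirements)

def validated(item_evidence: dict, specified_requirements: set,
              intended_use_requirements: set) -> bool:
    """Validation: verification, where the specified requirements are also
    fit for (i.e. cover) the intended use."""
    return (verified(item_evidence, specified_requirements)
            and intended_use_requirements <= specified_requirements)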
7. Conclusion

Contrary to previous arguments that ISO 17025 [12] is an unwieldy standard for digital forensics because of the complexity of validation, we believe that it can be applied if certain preconditions are met.

For ISO 17025 to be successfully applied, the existing understanding of requirements needs to be reconsidered. Rather than relying on the concept of "customer requirements" [13] (where the customer is the customer of the laboratory, i.e. law enforcement agents, lawyers, the criminal justice system etc.) to provide the baseline for method validation, forensic science providers should consider the technical requirements for their own processes and use the customer requirements as a means of selecting the most appropriate processes to deploy. This would be consistent with the way other "non-forensic" accredited testing and calibration organisations operate.

Within forensic science disciplines, we suggest that all labs will have the same common core technical requirements for generic method types (e.g. in digital forensics, hard disc imaging is a core process, as is extraction of data from devices running specific iOS versions etc.), that these should be established by technical working groups from within each discipline, and documented in agreed international standards which can be maintained for use and development by the community.
The requirements contained in these standards can then form the basis of a specification mechanism for methods. Clear identification of the technical requirements vs. the non-technical would allow producers and users to identify priority areas for new tool development.

Publication, and public maintenance, of this common set of requirements would also allow transparency in the verification and validation process. Rather than relying on "commercially sensitive" information, which may or may not be correct, it would become possible for all those involved to use the disclosed information and make claims (with appropriate substantiating evidence) based upon it.

Furthermore, if the suggestion of ISO/IEC 27041:2015 [1] that processes should be designed to be atomic in nature (i.e. small, single purpose, with low coupling and high cohesion to other processes) can be followed, the set of requirements for any one process can be kept to a minimum, resulting in a better-defined set of conditions for validation and an elimination of revalidation being triggered by changes elsewhere in the process. All the methods which were volunteered for our study were monolithic in nature and contained a high degree of repetition of tightly coupled (by virtue of being included in each SOP) initial process stages (e.g. retrieval of physical items from an evidence store) before progressing to the unique elements of the process.
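The atomicity suggestion can be sketched as follows (process names and requirement labels are invented for illustration): each atomic process carries its own requirement set, so a change, such as a tool update, invalidates only the affected step rather than the whole monolithic SOP:

# Illustrative sketch of ISO/IEC 27041-style atomic processes: each step is
# single-purpose and carries its own requirements, so a change to one step
# triggers revalidation of that step only. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AtomicProcess:
    name: str
    requirements: set = field(default_factory=set)
    validated: bool = False

    def change(self):
        # A change invalidates only this process, not the whole method.
        self.validated = False

method = [
    AtomicProcess("retrieve exhibit from store", {"continuity record"}, True),
    AtomicProcess("image hard disc", {"write blocking", "hash verification"}, True),
    AtomicProcess("parse FAT filesystem", {"correct FAT parsing"}, True),
]

method[2].change()  # e.g. a tool update affecting the parser
to_revalidate = [p.name for p in method if not p.validated]
print("Needs revalidation:", to_revalidate)  # only the parsing step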
8. Existing related work

8.1. Introduction

Since starting the original project, we have been made aware of some projects which may provide, at least in part, some of the missing requirements, specifications and evidence of correctness. A brief review of two of these, in the context of our analysis and proposals, is given below.
8.2. NIST/DHS Computer Forensics Tool Testing

The National Institute of Standards and Technology (NIST) and the Dept. of Homeland Security (DHS) have started some of this work in their Computer Forensics Tool Testing programme [14] (CFTT). In this project, a steering group defines the requirements for particular tool functions and NIST then tests tools against the resulting specifications. At the time of writing, the coverage is somewhat limited, concentrating on a few areas which may be particularly common in investigations, but a good range of tools has been considered and an online catalogue of tools and results has been produced.

The Federated Tool Testing project, as a sub-project of this initiative, may be a particularly useful model as it makes available a test suite which can be used by anyone who wishes to test tools against the requirements already defined by the project and share their results.
It is unclear, however, how the programme's priority areas are established or how the requirements are, themselves, validated, as this part of the process does not appear to be documented. It is also noteworthy that the requirements are purely at the tool level rather than the broader method level. This may result in an undue emphasis on producing requirements for existing tools, at the expense of producing requirements which have not yet been satisfied but which should be considered high priority as they reflect an emerging real problem area.

We also suggest that a broader consideration could create opportunities for better tool integration (i.e. improved exchange of data between tools and better cohesion for improved process flows) as well as improved concordance with external requirements such as legal issues.
8.3. SWGDE guidance on testing and validation

The Scientific Working Group on Digital Evidence (SWGDE) has issued a number of documents which are intended to assist in the design, implementation and validation of methods for digital forensic processes. Of these, the two which appear to have most direct application to the area we are investigating are:

SWGDE Recommended Guidelines for Validation Testing [15]

SWGDE Minimum Requirements for Testing Tools used in Digital and Multimedia Forensics [16] (at the time of writing, this document was in draft form and had been issued for consultation).

The SWGDE validation guidance [15] states that

Validation testing should be applied to all tools, techniques and procedures

and further that

Tools, techniques and procedures, which, by virtue of their widespread use, duration of use, and acceptability by the larger information technology community, are generally acknowledged as reliable and trustworthy. Consideration may be given to the general acceptance of a tool, technique, or procedure in the determination of whether validation is required.
The latter paragraph appears, to some extent, to contradict the former. In our experience, it seems that this is generally interpreted to mean that something which is in widespread use may be considered reliable.
We argue that this is not the intent of the "general acceptance" statement. In part, this is because of the presence of the phrase "larger information technology community", which is a clear indication that the tools, techniques and procedures under consideration are of a more general-purpose nature than the specialist tools deployed in an investigative context. Spreadsheets, word processors, email programs etc. may generally be considered acceptably reliable because they have minimal impact on evidential product and, should they prove to have an error, the sheer number of users worldwide means that it is likely to be detected and documented relatively quickly.

More importantly, however, if this general acceptance principle is allowed to apply to commonly adopted "forensic" tools, techniques and procedures, it has the potential to result in bad evidence. If the tool, technique or procedure has not been subjected to independent scrutiny (e.g. through peer-reviewed publication or properly evidenced validation testing) there is insufficient evidence that it does work correctly. As we note above, digital forensics relies heavily on reverse engineering in order to process and interpret data. At the level at which most users operate, it does not have sufficient foundational scientific principles to allow a reversion to first principles to be applied in order to demonstrate correctness. There is always likely to be some doubt or uncertainty about the way the data is being processed and interpreted. This can be reduced only through production of evidence of correctness and adequacy through appropriate software engineering methods, such as testing.
Note: we do not see this as a flaw in the SWGDE guidance, but rather in the way that a large part of the community has chosen to interpret this particular recommendation. It should be noted that similar phrases appear in other guidance and, in our experience, are similarly interpreted.

The remainder of the guidance document [15] gives a high-level overview of the development of a testing procedure which, if underpinned by well-defined requirements which allow the identification of appropriate test cases, could result in good evidence of validation and identification of boundary cases for methods.
The tool testing guide [16] is more detailed in its recommendations and gives advice about specific tool types and the conditions which should be considered for their testing. Again, however, it makes little reference to using a well-defined set of requirements to assist in the identification of test cases. It does acknowledge that the testing proposed is purely a minimum and that organisations should consider their own particular requirements.

It is our view that evidence of testing, produced in the recommended way, could be applied as an adjunct to method validation, provided the requirements are properly defined and documented. It should be remembered, however, that tool testing alone is unlikely to produce the evidence of validation required by either ISO 17025 [2][11] or ISO/IEC 27041 [1], unless it can be clearly shown that the method is wholly and solely implemented by the tool (see Figure 1).
9. Final thoughts

While the NIST and SWGDE projects outlined above may start to provide the type of evidence that is necessary to demonstrate that a method is valid, the potential lack of transparency in the requirements definition processes introduces another element of uncertainty, i.e. if the requirements cannot be shown to be correct, can tests based on those requirements show correctness? This can, to a large extent, be addressed by adopting the "non-forensic" accredited organisation model of using publicly available, agreed standard specifications/requirements and/or methods which can be subjected to external independent scrutiny.

It may also be useful to engage in a more open process, similar to those proposed for use in the specification and testing of safety-critical systems [17].
References

[1] ISO/IEC, ISO/IEC 27041:2016 Guidance on assuring the suitability and adequacy of digital investigation method (2016).

[2] ISO, ISO 17025:2005 General requirements for the competence of testing and calibration laboratories (2005).

[3] J. Gravel, Principles behind the requirements of ISO 17025, online at http://www.cala.ca/ISO-IEC 17025 Principals.pdf, 2002. Last accessed 25th April 2018.

[4] UKAS, Directory of accredited organisations, online at https://www.ukas.com/services/other-services/directory-of-accredited-organisations/, 2018. Last viewed 4th June 2018.

[5] Rev. Charles Dodgson (Lewis Carroll), Through the Looking Glass, 1872.

[6] ISO, ISO Online Browsing Platform, online at https://www.iso.org/obp/ui/, 2018. Last accessed 13th August 2018.

[7] ISO, ISO 9000:2005 Quality management systems - fundamentals and vocabulary (2005).

[8] B. W. Boehm, Verifying and validating software requirements and design specifications, IEEE Software 1 (1984) 75-88.

[9] IEEE, IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990 (1990) 1-84.

[10] Scientific Working Group on Digital Evidence (SWGDE), SWGDE Model Standard Operating Procedures for Computer Forensics, online at https://www.swgde.org/documents/Current Documents/SWGDE QAM and SOP Manuals/SWGDE Model SOP for Computer Forensics, 2012. Last viewed 5th June 2018.

[11] ISO, ISO 17025:2017 General requirements for the competence of testing and calibration laboratories (2017).

[12] P. Sommer, Accrediting digital forensics - what are the choices?, Digital Investigation (2018).

[13] International Laboratory Accreditation Cooperation, ILAC G19:08/2014 Modules in a forensic science process (2014).

[14] National Institute of Standards and Technology (NIST), Computer Forensics Tool Testing programme, online at https://www.nist.gov/itl/ssd/software-quality-group/computer-forensics-tool-testing-program-cftt, 2018. Last viewed 13th August 2018.

[15] Scientific Working Group on Digital Evidence (SWGDE), SWGDE Recommended Guidelines for Validation Testing (Version 2.0) (2014). Last accessed 13th August 2018.

[16] Scientific Working Group on Digital Evidence (SWGDE), SWGDE Minimum Requirements for Testing Tools used in Digital and Multimedia Forensics (2018). Draft Version 1.0 dated 9th July 2018. Last accessed 13th August 2018.

[17] L. E. G. Martins, T. Gorschek, Requirements engineering for safety-critical systems: A systematic literature review, Information and Software Technology 75 (2016) 71-89.