Daniel L. Moody’s research while affiliated with Simplot Australia Pty. Ltd. and other places


Publications (65)


Visual notation design 2.0: Towards user comprehensible requirements engineering notations
  • Conference Paper

July 2013 · 143 Reads · 87 Citations

Nicolas Genon · [...] · Daniel L. Moody
The success of requirements engineering depends critically on effective communication between business analysts and end users, yet empirical studies show that business stakeholders understand RE notations very poorly. This paper proposes a novel approach to designing RE visual notations that actively involves naïve users in the process. We use i*, one of the most influential RE notations, to demonstrate the approach, but the same approach could be applied to any RE notation. We present the results of 5 related empirical studies that show that novices outperform experts in designing symbols that are comprehensible to novices: the differences are both statistically significant and practically meaningful. Symbols designed by novices increased semantic transparency (their ability to be spontaneously interpreted by other novices) by almost 300% compared to the existing i* notation. The results challenge the conventional wisdom about visual notation design: that it should be conducted by a small group of experts; our research suggests that it should instead be conducted by large numbers of novices. The approach is consistent with Web 2.0, in that it harnesses the collective intelligence of end users and actively involves them in the notation design process as “prosumers” rather than passive consumers. We believe this approach has the potential to radically change the way visual notations are designed in the future.



Towards a More Semantically Transparent i* Visual Syntax

March 2012 · 42 Reads · 13 Citations

Lecture Notes in Computer Science

Nicolas Genon · [...] · Hubert Toussaint · [...] · Daniel Moody

[Context and motivation] i* is one of the most popular modelling languages in Requirements Engineering. i* models are meant to support communication between technical and non-technical stakeholders about the goals of the future system. Recent research has established that the effectiveness of model-mediated communication heavily depends on the visual syntax of the modelling language. A number of flaws in the visual syntax of i* have been uncovered and possible improvements have been suggested. [Question/problem] Producing effective visual notations is a complex task that requires taking into account various interacting quality criteria. In this paper, we focus on one of those criteria: Semantic Transparency, that is, the ability of notation symbols to suggest their meaning. [Principal ideas/results] Complementing previous research, we take an empirical approach. We give a preview of a series of experiments designed to identify a new symbol set for i* and to evaluate its semantic transparency. [Contribution] The reported work is an important milestone on the path towards cognitively effective requirements modelling notations. Although it does not solve all the problems in the i* notation, it illustrates the usefulness of an empirical approach to visual syntax definition. This approach can later be transposed to other quality criteria and other notations.


Representing Classes of Things and Properties in General in Conceptual Modelling

January 2012 · 13 Reads

How classes of things and properties in general should be represented in conceptual models is a fundamental issue. For example, proponents of object-role modelling argue that no distinction should be made between the two constructs, whereas proponents of entity-relationship modelling argue the distinction is important but provide ambiguous guidelines about how the distinction should be made. In this paper, the authors use ontological theory and cognition theory to provide guidelines about how classification should be represented in conceptual models. The authors experimented to test whether clearly distinguishing between classes of things and properties in general enabled users of conceptual models to better understand a domain. They describe a cognitive processing study that examined whether clearly distinguishing between classes of things and properties in general impacts the cognitive behaviours of the users. The results support the use of ontologically sound representations of classes of things and properties in conceptual modelling.


The physics of notations: a scientific approach to designing visual notations in software engineering

June 2010 · 224 Reads · 47 Citations

Proceedings - International Conference on Software Engineering

Visual notations form an integral part of the language of software engineering (SE). Yet historically, SE researchers and notation designers have ignored or undervalued issues of visual representation. In evaluating and comparing notations, details of visual syntax are rarely discussed. In designing notations, the majority of effort is spent on semantics, with graphical conventions often an afterthought. Typically no design rationale, scientific or otherwise, is provided for visual representation choices. While SE has developed mature methods for evaluating and designing semantics, it lacks equivalent methods for visual syntax. This tutorial defines a set of principles for designing cognitively effective visual notations: ones that are optimised for human communication and problem solving. Together these form a design theory, called the Physics of Notations as it focuses on the physical (perceptual) properties of notations rather than their logical (semantic) properties. The principles were synthesised from theory and empirical evidence from a wide range of fields and rest on an explicit theory of how visual notations communicate. They can be used to evaluate, compare and improve existing visual notations as well as to construct new ones. The tutorial identifies serious design flaws in some of the leading SE notations together with practical suggestions for improving them. It also showcases some examples of visual notation design excellence from SE and other fields.


Table 1: Semiotic clarity analysis results
Fig. 1: i* consists of two diagram types (from [90])
Table 2: Symbol overload analysis of i* relationships
Table 3: Semantic analysis of contribution links
Fig. 8: Metaclass hierarchy for i*: each element on the diagram corresponds to a semantic construct (metaclass), with dotted elements showing abstract metaclasses


Visual syntax does matter: Improving the cognitive effectiveness of the i* visual notation
  • Article
  • Full-text available

June 2010 · 1,275 Reads · 130 Citations

Requirements Engineering

Goal-oriented modelling is one of the most important research developments in the requirements engineering (RE) field. This paper conducts a systematic analysis of the visual syntax of i*, one of the leading goal-oriented languages. Like most RE notations, i* is highly visual. Yet surprisingly, there has been little debate about or modification to its graphical conventions since it was proposed more than a decade ago. We evaluate the i* visual notation using a set of principles for designing cognitively effective visual notations (the Physics of Notations). The analysis reveals some serious flaws in the notation together with some practical recommendations for improvement. The results can be used to improve its effectiveness in practice, particularly for communicating with end users. A broader goal of the paper is to raise awareness about the importance of visual representation in RE research, which has historically received little attention.

Keywords: Goal modelling · i* · Visualisation · Visual syntax · Evaluation · Visual notation · Visual language


Representing Classes of Things and Properties in General in Conceptual Modelling: An Empirical Evaluation

April 2010 · 93 Reads · 25 Citations

How classes of things and properties in general should be represented in conceptual models is a fundamental issue. For example, proponents of object-role modelling argue that no distinction should be made between the two constructs, whereas proponents of entity-relationship modelling argue the distinction is important but provide ambiguous guidelines about how the distinction should be made. In this paper, the authors use ontological theory and cognition theory to provide guidelines about how classification should be represented in conceptual models. The authors experimented to test whether clearly distinguishing between classes of things and properties in general enabled users of conceptual models to better understand a domain. They describe a cognitive processing study that examined whether clearly distinguishing between classes of things and properties in general impacts the cognitive behaviours of the users. The results support the use of ontologically sound representations of classes of things and properties in conceptual modelling.


The “Physics” of Notations: Toward a Scientific Basis for Constructing Visual Notations in Software Engineering

January 2010 · 1,145 Reads · 1,012 Citations

IEEE Transactions on Software Engineering

Visual notations form an integral part of the language of software engineering (SE). Yet historically, SE researchers and notation designers have ignored or undervalued issues of visual representation. In evaluating and comparing notations, details of visual syntax are rarely discussed. In designing notations, the majority of effort is spent on semantics, with graphical conventions largely an afterthought. Typically, no design rationale, scientific or otherwise, is provided for visual representation choices. While SE has developed mature methods for evaluating and designing semantics, it lacks equivalent methods for visual syntax. This paper defines a set of principles for designing cognitively effective visual notations: ones that are optimized for human communication and problem solving. Together these form a design theory, called the Physics of Notations as it focuses on the physical (perceptual) properties of notations rather than their logical (semantic) properties. The principles were synthesized from theory and empirical evidence from a wide range of fields and rest on an explicit theory of how visual notations communicate. They can be used to evaluate, compare, and improve existing visual notations as well as to construct new ones. The paper identifies serious design flaws in some of the leading SE notations, together with practical suggestions for improving them. It also showcases some examples of visual notation design excellence from SE and other fields.


Figure 2. Relationships among top 6 theories. Scope: there is a high degree of overlap in the scope (boundaries) of the theories. TAM's scope is a subset of UTAUT's: both explain and predict IS usage; however, UTAUT covers both voluntary and non-voluntary usage, while TAM only addresses voluntary usage. TAM and UTAUT form subsets of the domain of IDT, which explains and predicts adoption of technology generally rather than only information technology. TAM forms a subset of the domain of TRA, while UTAUT forms a subset of the domain of TPB (as TRA and TPB explain and predict all human behaviour, not just computer usage behaviour). TAM and UTAUT form subsets of the scope of ISM, as usage is included within ISM as one component of IS success. Usage also appears in TPC, which means that it overlaps with TAM, UTAUT and ISM. Competition: there is also a great degree of competition among the theories. TAM, UTAUT, TRA and IDT are competitors, as they can all be used to explain the same dependent variable (IS usage); ISM and TPC both attempt to explain the same dependent variable (IS impact).

In Search of Paradigms: Identifying the Theoretical Foundations of the Information Systems Field

January 2010 · 113 Reads · 4 Citations

The goal of this paper is to identify the theoretical foundations of the IS field. Currently there is a lack of consensus about what the core IS theories are, or even if we have any at all. If we do, they certainly don't appear in IS curricula or textbooks as they do in more mature disciplines. So far, most of the debate on this issue has been conducted at a subjective and prescriptive (normative) level. We attempt to broaden the debate by taking a descriptive (positive) approach, using relatively objective data. We do this by consulting the "geological record": the pattern of citations in the leading IS journals. We use a combination of quantitative and qualitative techniques to identify the most influential theories in the IS field. The results of our analysis are surprisingly positive, especially in the light of warnings about IS being overly dependent on reference disciplines (a discipline with no theory to call its own) and being obsessed with research methodology (emphasising how to research at the expense of what to research). This suggests that the negative views often expressed about the progress of IS may be unjustified and that its development has followed the normal evolutionary pattern of any research field. Being aware of our theoretical foundations will help clarify our disciplinary identity and guide teaching and scholarship.


Table 4. Theoretical Foundations of the IS Field (native theories shaded) 
In Search of Paradigms: Identifying the Theoretical Foundations of the IS Field.

January 2010 · 521 Reads · 29 Citations

The goal of this paper is to identify the theoretical foundations (the core theories) of the IS field. Currently there is a lack of consensus about what the core IS theories are, or even if we have any at all. If we do, they certainly don't appear in IS curricula or textbooks as they do in more mature disciplines. So far, most of the debate on this issue has been conducted at a subjective and prescriptive (normative) level. We attempt to broaden the debate by taking a descriptive (positive) approach, using relatively objective data. We do this by consulting the "geological record": the pattern of citations in the leading IS journals. We use a combination of quantitative and qualitative techniques to identify the most influential theories in the IS field. The results of our analysis are surprisingly positive, especially in the light of warnings about IS being overly dependent on reference disciplines (a discipline with no theory to call its own) and being obsessed with research methodology (emphasising how to research at the expense of what to research). This suggests that the negative views often expressed about the progress of IS may be unjustified and that its development has followed the normal evolutionary pattern of any research field. Being aware of our theoretical foundations will help clarify our disciplinary identity and guide teaching and scholarship.


Citations (54)


... In total, we found 32 studies that propose a diverse set of pedagogical strategies to improve student factors, which we show in Fig. 15 (note, studies may pertain to more than one student factor). The most commonly addressed student factors are motivation/enthusiasm (11 studies in total i.e., [5,31,32,34,50,51,94,98,117,138,180]), understanding/ comprehension (8 studies, i.e., [44,63,103,117,159,170,181,188]), retention/learning (7 studies, i.e., [6,15,16,34,66,122,187]), and engagement/ interest (also 7 studies, i.e., [5,31,32,94,117,138,180]). The remaining six studies target a diverse, yet more abstract set of student factors, i.e., "combating students being overwhelmed" [190], "effort and aggravation" [58], "review effectiveness" [134], "acceptance of uncertainty" [14], "process competency" [129], and "introspection" into the validation process (i.e., [11,13,41], which for the purpose of this discussion, we consider one contribution). ...

Reference:

A systematic literature review of requirements engineering education
Incorporating quality assurance processes into requirements analysis education
  • Citing Conference Paper
  • January 2003

ACM SIGCSE Bulletin

... The first approach mentioned and added in the model by Savi et al. (2011), analyzes the students' perception regarding short-term and long-term learning variables [Moody e Sindre 2003]. For each variable, a question was provided with answers varying on the Likert scale, following the same process as the previous subsections, as illustrated in Figure 5. ...

Evaluating the Effectiveness of Learning Interventions: An Information Systems Case Study
  • Citing Article
  • January 2003

... The problem with relying solely on these properties is that there is no guarantee that a model element satisfying some topological requirement (e.g., a node with more edges connected to it) by necessity represents the model's most important concepts. This is related to the work by [19,20], that while criticizing existing CM-CM methods, referred to it as lack of cognitive justification. ...

A decomposition method for entity relationship models: A systems theoretic approach
  • Citing Article
  • January 2000

... In addition, to the "pure" PJBL courses, there have been experiences with many smaller projects in the context of more traditional lecture courses. Those projects either spanned several courses, taking place in the 4th semester of both the aforementioned study programs (Sindre et al., 2003c), or were embedded inside single courses, exploring more focused tasks like programming (Sindre, Line and Valvåg, 2003b), requirements elicitation (Sindre, 2005), and document review (Sindre et al., 2003a). PJBL classes have also been offered in a more flexible manner, using the form of an intensive course. ...

Introducing Peer Review in an IS Analysis Course

... In the case of PHRs, practitioners' perception of effectiveness was considered much more important than adoption in practice. For this purpose, the effectiveness perception parameters measured the effects on the decision to use the framework [16][17][18]. Only perception effectiveness in MEM was previously hypothesised in the literature, and it was based on TAM and the Theory of Reasoned Action (TRA). ...

Comparative Evaluation of Large Data Model Representation Methods: The Analyst's Perspective
  • Citing Conference Paper
  • September 2002

Lecture Notes in Computer Science

... The principle is defined as: "…the extent to which the meaning of a symbol can be inferred from its appearance" [113]. When signs indicate low ST, not only does the cognitive load increase, but users also have difficulty to differentiate between concepts and need more time to learn a new language [56,57,114,116]. The principle has been recognized as essential to ensure that non-experts can better understand visual notation [96,117]. ...

Towards a More Semantically Transparent i* Visual Syntax
  • Citing Conference Paper
  • March 2012

Lecture Notes in Computer Science

... Furthermore, according to Caire et al. [29], semantic transparency is crucial to designing comprehensible visual notations for inexperienced users. This property ensures that the meaning of a symbol is evident (i.e., intuitive and transparent) from its visual representation alone, akin to onomatopoeia in spoken languages. ...

Visual notation design 2.0: Towards user comprehensible requirements engineering notations
  • Citing Conference Paper
  • July 2013

... The results of the study are analysed for the entire group (all) and the individual groups as outlined above. This research somewhat extends the contribution by Caire et al. (2013) and El Kouhen et al. (2015) who challenge the conventional approach of 'design by committee'. Caire et al. (2013) showed that novice users propose symbol sets that are more semantically transparent when compared with experts. ...

Caire, Patrice; Genon, Nicolas; Heymans, Patrick; Moody, Daniel L.: Visual Notation Design 2.0: Towards User-Comprehensible RE Notations, RE 2013, Requirements Engineering, Proceedings of the 21st IEEE International Requirements Engineering Conference.
  • Citing Article
  • July 2013

... This paper responds to calls to increase rigor and relevance in DSR by addressing an important, but not well-understood, aspect of design theorizing: the question of whether specific design choices used to instantiate a design theory should be considered and accounted for in the design theory. It is widely contended that many IS theories (including design theories) are midrange theories [14][15][16][17]. Merton [18] defines mid-range theories as theories that are moderately abstract (i.e., they do not purport to explain everything) but "close enough to observed data to be incorporated in propositions that permit empirical testing" (p. ...

In Search of Paradigms : identifying the theoretical foundations of the information system field