Analysis of serious games implementation for project management courses

R. Bonazzi¹, S. Missonier¹, D. Jaccard², P. Bienz¹, B. Fritscher¹, E. Fernandes¹

¹ University of Lausanne, School of Economics, Lausanne, Switzerland, stephanie.missonier@unil.ch
² HEIG-VD, Yverdon-les-Bains, Switzerland, dominique.jaccard@heig-vd.ch
Abstract Previous research in pedagogy and project management has already underlined the positive contribution of serious games to project management courses. However, the empirical outcomes of these studies have not yet been translated into functional and technical specifications for serious game designers. Our study aims at obtaining a set of technical and functional design guidelines for serious game scenario editors to be used in large classes of project management students. We have conceived a framework to assess the influence of different serious game components on students' perceived acquired competency. Such a framework will allow us to develop a software module for reflective learning, which is meant to extend the theory of serious game design.
Introduction
Information system (IS) project management courses are known to be challenging to conceive, since most of the skills required of project managers cannot be acquired ex cathedra. Problems in IS are characterized by incomplete, contradictory and changing requirements, and solutions are often difficult to recognize because of complex interdependencies. This leads to an educational dilemma in teaching such problems, because a rich background of knowledge and intuition is needed for effective problem solving; hence complexity is added rather than reduced as the understanding of the problem increases (Connolly and Stanfield, 2006).
As a consequence of the large number of failed projects, a strong challenge to traditional project management methods based on universal best practices (such as those of the Project Management Institute) has emerged in the academic world and among practitioners (Hodgson and Cicmil, 2007; Sauer and Reich, 2009). Traditional approaches are part of a very instrumental and functionalist vision of project management principles that does not reflect the reality of projects, which are ambiguous, fragmented, complex, socio-technically constructed and strongly political in character. Therefore, project management is a discipline that requires knowledge and reflective practice allowing players to lead the project team in an emergent way. Such an approach requires a high degree of interaction between teacher and students, but face-to-face exchanges are hard to manage when the number of students exceeds forty (Smith and Kampf, 2004).
Game-based learning (also known as serious games) uses simulation to allow students to actively acquire the competences required to solve problems. Hence game-based learning scenarios might be the solution for introducing large classes to IS project management, since they are known to have an effect on students' self-efficacy as well as on the acquisition and retention of declarative and procedural knowledge (Sitzmann, 2011). Yet little attention has been paid so far to how to design a scenario editor to support an IS project management course by means of game-based learning. In software engineering courses, game-based simulations are far less used than other types of educational approaches (e.g. industrial partnerships or team learning) and they fail to incorporate model-based instruction and reflective learning (Navarro and Van Der Hoek, 2009). We expect a similar trend in IS project management courses. Therefore our research question is:
How to design a game-based learning scenario editor to support an information system project management course for more than forty students?
By adopting a design science methodology, this study aims at obtaining a framework for designing game-based learning scenario editors that enhance the project management competences of the students attending the course. Such a framework is induced by testing different software components to assess their influence on students' acquired competency. Therefore the creation of a model to assess the software components described in this paper is the initial step of this study. We start here by assessing the gaps in the existing literature and by deriving a conceptual model in the next section. The third section illustrates the methodology we adopt to test our conceptual model and to assess the pedagogical effect of different software tools. The results of a first assessment performed in one of these teaching courses are presented in the fourth section as an example. The paper ends by discussing the results obtained and by highlighting the next steps of our study.
Literature review
This section briefly assesses the state of the art in game-based learning for project management courses. We are looking for concrete evidence regarding the link between game-based learning and the performance of the IS project management course. Hence we use the guidelines of Okoli and Schabram (2010) as a protocol to assess the existing literature. For the sake of simplicity we decide to limit our Google Scholar search to articles published in the period 2005–2010. Using the selected keywords (“project management”; “information systems”; “game-based learning”) we obtain 59 results, among which 21 are cited by at least one other paper and are accessible to us. Since we are interested in articles that have assessed the performance of the serious game analysed, we narrow our set of articles down to only a few. For those papers we perform forward and backward analysis, i.e. we assess the papers that cite them or are cited by them. At the end we obtain two streams of research: ex-ante evaluation and ex-post evaluation. Since we wish to connect these two streams of research, we derive three concepts: the student's perceived acquired competency (1), which is the set of measured capabilities that the student acquires in class; the perception of the serious game design (2), which we consider here as the set of features that the game-based learning software possesses to empower the teacher; and the student's engagement (3), i.e. the student's will to take part actively in the game-based learning experience.
The first stream of research focuses on the ex ante evaluation of the effect that serious game design has on the student's perceived acquired competency. This group of papers claims that, while traditional methods are based on an instructivist methodology, game-based learning provides a constructivist learning environment where learners can practice the formulation of requirements specifications through requirements elicitation and learning by doing (Hainey and Connolly, 2010). In addition, game-based learning provides a challenging and complex real-world environment within which students can apply their theoretical knowledge and overcome difficulties in dealing with ambiguity and vagueness, while developing self-confidence and increased motivation (De Freitas and Oliver, 2006).
The second stream of research focuses on the ex post evaluation of the effect that the engagement of students who have played the serious game has on their perceived acquired competency. Researchers collect students' suggestions for game changes (Pfahl et al., 2004; Dantas et al., 2004) and the competences they perceive as needed (Pfahl et al., 2004; Gresse von Wangenheim et al., 2009; Zapata, 2010; Mawdesley et al., 2011).
To link these two streams of research we suggest considering the student's engagement as a mediator between serious game design and the student's perceived acquired competency. We therefore derive the following set of hypotheses, sketched as a pair of regression equations below: (H1) the perception of the serious game design influences the student's perceived acquired competency; (H2) the student's engagement influences the student's perceived acquired competency; (H3) the perception of the serious game design influences the student's engagement.
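Read as a path model, the three hypotheses amount to two linear equations estimated per student i. This is a minimal sketch in our own notation; the coefficient symbols are not taken from any of the cited papers.

\mathrm{Competency}_i = \beta_0 + \beta_1\,\mathrm{Design}_i + \beta_2\,\mathrm{Engagement}_i + \varepsilon_{1i} \qquad \text{(H1: } \beta_1,\ \text{H2: } \beta_2)
\mathrm{Engagement}_i = \gamma_0 + \gamma_1\,\mathrm{Design}_i + \varepsilon_{2i} \qquad \text{(H3: } \gamma_1)

Under this reading, engagement mediates part of the effect of serious game design on perceived acquired competency, while \beta_1 captures the remaining direct effect.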
In the next section we illustrate how we intend to design an experiment to test
our hypotheses.
Methodology
In this section we briefly describe the methodology we use to perform our experiment. Design science seeks outcomes that are relevant for practitioners and that have been obtained in a rigorous way; the purpose of this kind of study is usefulness rather than truth. Although design science has been practised for many decades, it has been officially accepted in information systems research since the Management Information Systems Quarterly article by Hevner et al. (2004). In our study and in this paper we adopt the methodology suggested by Peffers et al. (2008), which proposes a process composed of six steps. Following the first step, we clearly identify our problem, using the literature review, as summarized by our research question.
The second step of the methodology identifies the objectives of the solution. In this sense, in the previous section we have identified two gaps in the literature: the first concerns the link between ex ante and ex post evaluation criteria, whereas the second regards the use of reflective learning by means of serious games. Thus our study should start by conceiving a framework to assess the correlation between ex ante and ex post evaluation criteria. Then we will move towards the development of an additional module for reflective learning on the students' achieved skills and towards the assessment of its added value.
Design and development: In the third step of the methodology the design and development of the new component occurs. Yet in the first part of our study, described here, the development is minimal, since we have decided to reuse an existing serious game. The platform selected to test our assessment framework is a game-based learning scenario editor called Albasim. The main reasons underlying the choice of this platform are its large set of existing features and the direct link that the authors have with the development team of the software. This will be very useful during the second part of the study, when we will be developing an additional component. Figure 1 illustrates the dashboard used by the game players by means of a web browser. In the top right corner are the key performance indicators. In the top left corner of the screen the four stages of the game are illustrated: the players start with project initiation (1), then move on to planning the project (2) and executing it (3) before closing it (4). The central part of the screen is multifunctional, whereas the right side of the central screen allows the player to manage resources and tasks and to read e-mails sent by the central system. As for reflective learning, the system does not have a dedicated feature, leaving to the teachers the task of arranging students' presentations to share lessons learned, as explained in the following section.
Figure 1: Dashboard of Albasim (Source: www.albasim.com)
The pedagogical scenario implemented: The fourth step of the methodology of Peffers et al. (2008) requires a demonstration of the artefact. In our case the game requires two four-hour sessions, for a total of eight class hours over two weeks. Before the first session the students receive the software manual and the business case. At the beginning of the first session the students get familiar with the idea of a serious game and with the functionalities of the software (e.g. the dashboard). Then the students are asked to gather in groups and to collect and process the information owned by the different fictional players in the game, in order to deliver a project proposal to be validated with the client (i.e. the professor). During the rest of the week the students are supposed to work in groups to complete the assignment and send the improved project proposal to the professor, who chooses two proposals among them. At the beginning of the second game session the chosen groups are asked to give a short presentation of their project proposal to the rest of the class. Once the two student groups have presented, the teacher gives them constructive feedback and adds some remarks about the overall performance of the other groups (best and worst practices). After the presentations the teacher recalls key theoretical concepts regarding project planning to the class. Then the students are asked to work in groups to make and justify their planning decisions, while taking into account a set of constraints (time, cost, quality, resource availability and risks). In the rest of the week the student groups are asked to finalize the Work Breakdown Structure, the Program Evaluation and Review Technique chart and the Gantt diagrams, together with cost estimations.
Evaluation: The fifth step of the methodology concerns the evaluation of the artefact. To operationalize our constructs we reuse existing items from the two streams of literature whenever possible, and we obtain a set of five-point Likert scale items collected through a questionnaire handed out once the students have completed the assignments of the second game session. For the student's perceived acquired competency we derive four items inspired by Zapata (2010) and Mawdesley et al. (2011). For the serious game design we implement seven items inspired by Hainey and Connolly (2010) and De Freitas and Oliver (2006). For the student's engagement we use seven items inspired by Gresse von Wangenheim et al. (2009) and Dantas et al. (2004). A set of open questions has been collected as well, but the answers are not presented here for the sake of brevity.
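To make the scoring step concrete, the following is a minimal Python sketch of how such questionnaire answers could be aggregated into construct scores. The file name survey_raw.csv, the column names and the item labels are hypothetical placeholders introduced for illustration; they are not the actual instrument.

import pandas as pd

# Hypothetical mapping from constructs to five-point Likert items; the real item
# wordings come from the cited instruments and are not reproduced here.
ITEMS = {
    "competency": [f"comp_{i}" for i in range(1, 5)],     # 4 items
    "design":     [f"design_{i}" for i in range(1, 8)],   # 7 items
    "engagement": [f"engage_{i}" for i in range(1, 8)],   # 7 items
}

raw = pd.read_csv("survey_raw.csv")  # one row per respondent, one column per item

# Construct scores as the plain average of their items (tau-equivalent assumption).
scores = pd.DataFrame({name: raw[cols].mean(axis=1) for name, cols in ITEMS.items()})
scores.to_csv("survey_scores.csv", index=False)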
Current results
We have tested the serious game with a sample of bachelor students enrolled in a project management course with a special focus on information systems. We have collected the students' perceptions by means of an electronic survey and obtained 74 answers out of a total of 104 students. Although limited in size, we consider this sample representative for our study and a good starting point to perform statistical analysis using Stata 11. We started by computing Cronbach's alpha for each set of items to measure how well each set represented its concept. A Cronbach's alpha value of 1.00 would be optimal, whereas a value below 0.70 would lead to rejection of the scale. In our case we obtained the following results: acquired competency = 0.79; design = 0.80; engagement = 0.77.
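As a transparency aid, this is a minimal sketch of the Cronbach's alpha computation applied to one item set. The function name and the design_1 to design_7 column names are ours (hypothetical, matching the placeholder file above); the snippet is not the Stata procedure actually used.

import numpy as np
import pandas as pd

def cronbach_alpha(items: np.ndarray) -> float:
    # items: matrix of shape (n_respondents, n_items) holding Likert scores
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_var / total_var)

raw = pd.read_csv("survey_raw.csv")  # same hypothetical export as in the previous sketch
design_items = raw[[f"design_{i}" for i in range(1, 8)]].to_numpy(dtype=float)
print(round(cronbach_alpha(design_items), 2))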
To test the hypothesized causal effects we have performed seemingly unrelated regressions among the three constructs, each obtained as the average of its set of items (i.e. assuming tau-equivalent factor loadings). In other words, we have asked Stata 11 to estimate all the regressions at once.
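The following sketch reproduces that joint-estimation step outside Stata, using the third-party Python package linearmodels. The construct column names and the survey_scores.csv file are the hypothetical placeholders introduced above; the snippet illustrates the approach rather than the original analysis script.

import pandas as pd
import statsmodels.api as sm
from linearmodels.system import SUR  # pip install linearmodels

df = pd.read_csv("survey_scores.csv")  # one averaged score per construct and respondent

equations = {
    # H1 and H2: competency regressed on design and engagement
    "competency": {
        "dependent": df["competency"],
        "exog": sm.add_constant(df[["design", "engagement"]]),
    },
    # H3: engagement regressed on design
    "engagement": {
        "dependent": df["engagement"],
        "exog": sm.add_constant(df[["design"]]),
    },
}

res = SUR(equations).fit()  # all equations estimated at once
print(res)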
Figure 2 presents the results we obtained. It shows that serious game design also has a direct effect on the student's acquired competencies which is statistically significant (p < 0.01). It also appears that the student's engagement has an effect on the student's perceived acquired competency that is statistically significant (p < 0.05). Finally, the serious game design has an effect on the student's engagement that is statistically significant (p < 0.01). Thus all three hypotheses are supported.
Figure 2: Results of the preliminary test
The direct and indirect effects of serious game design explain almost 50% of the variance in students' perceived acquired competency (R² = 0.47), which is to say that neither of the two effects should be neglected. In addition, the variability of student engagement among students is largely explained by serious game design (R² = 0.56), which leads us to believe that this model has good explanatory power. We have also controlled for the effects of sex and nationality, and these were not statistically significant.
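In standard mediation terms, and reusing the coefficient notation of the sketch in the literature review section (our notation, shown only symbolically because the individual coefficient estimates are not reported here):

\text{total effect of design on competency} = \underbrace{\beta_1}_{\text{direct (H1)}} + \underbrace{\gamma_1\,\beta_2}_{\text{indirect via engagement (H3 then H2)}}

The R² = 0.47 quoted above then refers to the competency equation as a whole, i.e. to the joint contribution of both terms.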
Conclusions and further work
We start this section by recalling our research question: How to design a game-based learning scenario editor to support an information system project management course for more than forty students? In this paper we present our framework linking ex ante and ex post evaluation criteria to assess a game-based learning editor. Now that the framework is in place, we can develop the reflective learning module and assess its added value by using the module with a subset of the overall student sample, treating the rest of the class as a control group. The results obtained so far lead us to believe that serious game design has a direct and an indirect effect on students' perceived acquired competency, the latter mediated by student engagement. The module we wish to develop has a graphical interface that allows the scenario designer to represent the scenario as a graph. The module is expected to be able to mine the log of student groups' actions and to represent them as graphs, in order to benchmark the different groups' experiences.
In the next iteration we intend to have student groups play different versions of the same game, and the students' acquired competency will be tested with a set of questions in the final exam of the course. These improvements should increase the reliability of our results against endogeneity due to common method variance (Antonakis et al., 2010).
References
1. Antonakis, J., Bendahan, S., Jacquart, P., & Lalive, R. (2010). On making causal claims: A review and recommendations. The Leadership Quarterly, 21(6), 1086-1120.
2. Connolly, T., & Stanfield, M. (2006). Using Games-Based eLearning Technologies in Overcoming Difficulties in Teaching Information Systems. Journal of Information Technology Education, 5(2006), 459-476.
3. Dantas, A. R., Barros, M. O., & Werner, C. M. L. (2004). A simulation-based game for project management experiential learning. 2004 International Conference on Software Engineering and Knowledge Engineering.
4. De Freitas, S., & Oliver, M. (2006). How can exploratory learning with games and simulations within the curriculum be most effectively evaluated? Computers & Education, 46(3), 249-264.
5. Gresse von Wangenheim, C., Thiry, M., & Kochanski, D. (2009). Empirical evaluation of an educational game on software measurement. Empirical Software Engineering, 14(4), 418-452.
6. Hainey, T., & Connolly, T. M. (2010). Evaluation of a game to teach requirements collection and analysis in software engineering at tertiary education level. Computers & Education, 56(1), 21-35.
7. Hevner, A. R., March, S. T., Park, J., & Ram, S. (2004). Design science in information systems research. Management Information Systems Quarterly, 28(1), 75-106.
8. Hodgson, D., & Cicmil, S. (2007). The Politics of Standards in Modern Management: Making "The Project" a Reality. Journal of Management Studies, 44(3), 431-450.
9. Mawdesley, M., Long, G., Al-jibouri, S., & Scott, D. (2011). The enhancement of simulation based learning exercises through formalised reflection, focus groups and group presentation. Computers & Education, 56(1), 44-52.
10. Navarro, E. O., & Van Der Hoek, A. (2009). On the Role of Learning Theories in Furthering Software Engineering Education. In Ellis, H. J. C., Demurjian, S. A., & Naveda, J. F. (Eds.), Software Engineering: Effective Teaching and Learning Approaches and Practices (pp. 38-59). IGI Global.
11. Okoli, C., & Schabram, K. (2010). A Guide to Conducting a Systematic Literature Review of Information Systems Research. Retrieved from http://sprouts.aisnet.org/10-26/
12. Peffers, K., Tuunanen, T., Rothenberger, M. A., & Chatterjee, S. (2008). A Design Science Research Methodology for Information Systems Research. Journal of Management Information Systems, 24(3), 45-77.
13. Pfahl, D., Laitenberger, O., Ruhe, G., Dorsch, J., & Krivobokova, T. (2004). Evaluating the learning effectiveness of using simulations in software project management education: results from a twice replicated experiment. Information and Software Technology, 46(2), 127-147.
14. Sauer, C., & Reich, B. H. (2009). Rethinking IT project management: Evidence of a new mindset and its implications. International Journal of Project Management, 27(2), 182-193.
15. Sitzmann, T. (2011). A Meta-analytic Examination of the Instructional Effectiveness of Computer-based Simulation Games. Personnel Psychology, 64(2), 489-528.
16. Smith, K., & Kampf, C. (2004). Developing writing assignments and feedback strategies for maximum effectiveness in large classroom environments. Professional Communication Conference, 2004. IPCC 2004, Minneapolis, Minnesota, USA.
17. Zapata, C. M. (2010). A Classroom Game for Teaching Management of Software Companies. Dyna, 77(163), 290-299.