AAEE 2016 CONFERENCE
Coffs Harbour, Australia
This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit
http://creativecommons.org/licenses/by/4.0/
Teaching Battery Basics in Laboratories:
Comparing Learning Outcomes of
Hands-on Experiments and Computer-based Simulations
Fabian Steger (a,b), Alexander Nitsche (b), Hans-Georg Schweiger (b), and Iouri Belski (a)
(a) School of Engineering, RMIT, Melbourne, Australia
(b) Faculty of Electrical Engineering and Computer Science, UAS Ingolstadt, Germany
Corresponding Author Email: fabian.steger@thi.de
BACKGROUND
Understanding the characteristics of rechargeable batteries is essential for a successful career in the
field of research and development of hybrid and electric cars. It has been shown that hands-on
laboratory work can significantly influence the outcomes of student learning. However, universities and
vocational training institutions need proper laboratory equipment to engage students in effective
learning of batteries' behaviour. The increased supervision required to conduct hands-on labs safely, as well as the cost of specialised laboratory equipment, makes hands-on laboratories expensive. Therefore,
many universities conduct such laboratories as simulated experiments.
PURPOSE
The aim of this study was to compare the learning outcomes of laboratory work on lithium-ion battery
cells and components of battery systems conducted in two different modes: as a practical hands-on
exercise and by means of computer-based simulation. The research focused strongly on the learning mode of the laboratory experiment; the method was designed to exclude other influences on the result.
DESIGN/METHOD
The students were split into two comparable groups based on their prior practical experience to
ensure a similar background level of the two groups. Each group was taught four content areas: two
as practical hands-on experiments and two as computer-based simulations. One group completed the
even laboratory sessions as hands-on experiments and the odd ones as computer-based simulations.
The other group completed the odd laboratory sessions as hands-on experiments and the even as
computer-based simulations. To evaluate the influence of the learning mode on student learning, anonymous 10-minute tests on knowledge gained during the previous experiment were conducted at the beginning of the next laboratory session. The average group results for the hands-on and simulated modes were compared to determine which mode transferred the knowledge more successfully. The method excludes effects of learning synchronicity, distance learning and supervision, and focuses on the mode alone.
RESULTS
Forty students took part in the study. Three of the four content areas showed a weak to moderate effect: hands-on laboratory sessions led to better knowledge acquisition than simulated experiments. One content area showed no effect of study mode. Overall, learning results of hands-on experiments were slightly better than those of simulated laboratories (weak effect, Cohen's d = 0.22), but the difference in performance was not statistically significant.
CONCLUSIONS
This study showed that the described methodology is suitable for isolating the comparison of two learning modes. The slightly better learning results in hands-on mode are not statistically significant; further data collection is necessary to obtain statistically significant results.
KEYWORDS
Hands-on experiment, simulated experiment, student experiment, battery experiment, comparing learning modes
Background
Understanding the characteristics of rechargeable batteries is essential for a successful
career in the field of development and maintenance of electric cars (Müller and Goericke,
2012). More than thirty electric mobility study programs in Germany currently deliver subjects on electric automotive engineering (NQuE, 2016). Universities of Applied Sciences
(UAS) are German institutions of higher education that differ from the traditional university in
Germany through their more vocational/practical orientation and wider utilization of
laboratories (Unseld and Reucher, 2010). Magin, Churches, and Reizes (1986) found that
laboratory work could significantly influence the outcomes of student learning. To provide
hands-on learning of batteries’ behaviour, training institutions need adequate laboratory
equipment. However, industrial battery test benches are very expensive. Moreover, in order to study temperature-dependent effects of batteries, test benches need to be used together with bulky temperature cabinets. In addition, an increased amount of supervision is necessary to guarantee students' safety while handling hazardous objects such as battery cells.
Therefore, many universities that are unable to use proper industrial equipment for student
experiments conduct such laboratories as computer-based simulations and/or as remote
experiments (Ma and Nickerson, 2006). In German UASs teaching is based on hands-on lab experiments, so a hands-on lab course was also chosen here to enhance the employability of the graduates of this course. The university therefore funded the production of small-size battery test benches for hands-on laboratory practicals; the prototypes of these benches had been developed with funding from the German Federal Ministry of Education and Research within the scope of the project "Academic Education Initiative for Electric Mobility Bavaria/Saxony".
The aim of the experiment discussed in this study was to evaluate the influence of this newly
developed hands-on training on student learning. More specifically, the authors planned to
assess whether it led to better understanding than equivalent simulation-based laboratory work.
Purpose
There are several reasons to compare the effectiveness of learning when laboratory work is
conducted in different modes. Using the more successful mode will lead to an improvement
in student learning outcomes. Better trained engineers will develop superior products, in this
case electrified vehicles. If the effort to create hands-on training laboratory facilities is not
justified by improved learning results, it is possible to save money using cheaper solutions
like computer-based training. In this case, the funding could be allocated to other activities
that improve student learning.
Several researchers have investigated the learning effectiveness of laboratory exercises conducted in different modes, for example Engum, Jeffries, & Fisher (2003), McAteer, Neil, Barr, Brown, Draper, & Henderson (1996) and Edward (1996). Reflecting on the results of such studies, Ma and Nickerson (2006) concluded that in many studies the number of student-participants was too small to allow researchers to reach definite conclusions. Additionally, they found that the relative effectiveness of different kinds of laboratories was seldom explored. Corter, Nickerson, Esche, Chassapis, Im, & Ma (2007)
compared learning outcomes for traditional hands-on labs, remotely operated labs, and
simulations in a physics engineering course. Learning outcomes of 306 students in two cantilever beam experiments were assessed; outcomes were equal or higher after remote or simulated experiments than after hands-on laboratories. As an outcome of the present research, the community of engineering educators will gain more information that could help answer the question of whether real hands-on training enhances learning more than simulated experiments.
Compared to past studies, this research had a strong focus on the learning mode of the laboratory experiment. The target of the piloted methodology was to compare the knowledge outcomes of the same laboratory experiment executed in real or simulated form. In the literature, researchers often replace a well-tried hands-on experiment with a newly created simulation and compare the two, optimising each experiment (hands-on and simulated) independently towards the best solution available in its mode. For example, students may learn in groups at the university (hands-on) but alone at their workplace (simulated). Because the learning mode is then mixed with other influences (in this case supervision, cooperative learning effects, distance learning and instructional papers), such research compares two combinations of aspects rather than the mode itself. Another example is the abovementioned study of Corter et al. (2007), in which the 3D view in simulation mode was enriched with colour-coded stress values, which may have caused the better results in that mode. Keller et al. (2006) compared two levels of enrichment in a simulation on electric current. They found no significant differences in conceptual understanding, but the less enriched simulation was rated by the students as significantly more enjoyable and more useful for learning.

In the present research the laboratory experiments were developed so that every step of the student experiment was identical, except for the use of the hardware.
Design/Method
Creation of content areas A to D
The work was based on the identification of the main learning objectives that support the existing theoretical subject on battery cell behaviour and battery system design. These objectives were grouped into four main content areas: A) contact resistance (including four-conductor measurement); B) open-circuit voltage curve; C) internal resistance and power; D) capacity and energy.
Based on these four content areas, laboratory experiments were developed in two modes: as a practical hands-on exercise using the abovementioned laboratory equipment and as a set of computer-based simulations.
• A1 "low resistance measurements": In this laboratory exercise students conduct low-resistance measurements. They are expected to discover that a multimeter is not the right tool for low-ohmic measurements, and why. As a result of this exercise students learn how to use the right alternatives and different devices to conduct a four-wire measurement with AC and DC.
• A2 "contact resistance": Here students discover exemplary values of the contact resistances of different electrical connections used in battery systems. They build up knowledge of designing a cable-lug connection and avoiding the main pitfalls.
• A3 "insulation resistance": This laboratory exercise deals with the use of the appropriate measurement equipment. Students learn to estimate the influence of moisture and measurement period on the insulation resistance.
• B "open circuit voltage curve": In this experiment students investigate the dependency of the open-circuit voltage curve on the state of charge for two different lithium-ion cell types. They are expected to learn that cells reach a stable state only after an extended period of time.
• C1 "internal resistance": This exercise is devoted to the importance of the internal resistance for the efficiency of a battery system. Students learn to use AC and DC methods to measure internal resistances. Being aware of the temperature dependency, students learn to approximate the temperature change caused by the power loss in a cell. They also learn to work with industry standards, select the right measurement procedure and recognise the effects that cause misleading and faulty results.
Proceedings, AAEE 2016 Conference
Coffs Harbour, Australia 4
• C2 "power": In this laboratory exercise students learn to estimate the maximum discharge rate of battery cells. They practise reading and understanding a cell data sheet and estimating various cell limits. Students also learn how to calculate the power density and comprehend the dependency of the maximum discharge power on state of charge, pulse duration and temperature.
• D "energy and capacity": In this experiment students determine the capacity of a lithium-ion cell and learn about the factors influencing it. They familiarise themselves with Peukert's law and the energy efficiency of a cell charge and discharge cycle. They also learn how to calculate the energy density of a battery cell (typical relations are sketched after this list).
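As an illustration of the core quantities handled in content areas C and D, the following standard textbook relations (written here with generic symbols, not quoted from the laboratory instructions) summarise the main calculations the students perform:

\[
R_i = \frac{\Delta U}{\Delta I}, \qquad
P_{\mathrm{loss}} = R_i \, I^2, \qquad
\Delta T \approx \frac{P_{\mathrm{loss}} \, t}{m \, c_p}
\]
\[
I^{k} \, t = \mathrm{const.}\ \text{(Peukert's law)}, \qquad
E = \int U(t)\, I(t)\, \mathrm{d}t, \qquad
e = \frac{E}{m}
\]

Here R_i is the DC internal resistance determined from the voltage step ΔU caused by a current step ΔI, ΔT the approximate temperature rise of a cell of mass m and specific heat capacity c_p over time t (heat exchange neglected), k the Peukert exponent, E the discharged energy and e the gravimetric energy density.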
The time required to create these practical experiments was very similar in both modes.
Instructions affect the learning outcome of an experiment; for example, Chamberlain et al. (2014) showed, using an interactive simulation, that the level of guidance can strongly influence student exploration. In this research, a single set of instructions was developed for each experiment and used in both laboratory modes. These instructions contained introductory questions for preparation, guides for the experiments, and suggestions for the analysis of the collected data and measurement results. Since the study program was delivered in German, all documents were prepared in German.
Arrangement of students into two groups
Forty students were enrolled in the laboratory subject in the summer semester of 2016. As this module was a mandatory subject, the full semester group in the study program "Elektrotechnik und Elektromobilität (B. Eng.)" ("Electrical Engineering and Electric Mobility") at the UAS in Ingolstadt, Germany, was asked to participate in the study.
To conduct the educational experiment as a cross-over study it was necessary to separate
the enrolled students into two comparable groups. It was assumed that students with more
practical experience may perform better in laboratories than their peers with a lesser
practical background. Therefore, in order to assess the level of students’ practical experience
a questionnaire was developed. The questionnaire consisted of 17 statements that focused on prior hands-on experience (e.g. "I have changed the tires of a car"). A four-point Likert scale from "full yes" to "full no" was deployed. After analysis of the student responses, students were assigned to two laboratory groups in a way that ensured a similar mix of 'practical' students in each group. Student names were not recorded. Instead, each student created a code-word that could be used by the experimenters to identify the same individual while keeping her/him anonymous. Later, two lists of code-words were published, telling the students the weekday of their practical laboratory sessions.
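The exact assignment procedure is not described in detail; the following is a minimal sketch of one possible way to split students into two groups with a similar mix of practical experience, assuming each student is represented by a (code-word, questionnaire score) pair. All names and values are hypothetical.

def assign_groups(students):
    """students: list of (code_word, practical_experience_score) tuples."""
    # Rank students by prior-experience score, then deal them out in "snake"
    # order (W, F, F, W, W, F, F, W, ...) so both groups receive a similar mix
    # of more and less practically experienced students.
    ranked = sorted(students, key=lambda s: s[1], reverse=True)
    wednesday, friday = [], []
    for i, student in enumerate(ranked):
        (wednesday if i % 4 in (0, 3) else friday).append(student)
    return wednesday, friday

if __name__ == "__main__":
    demo = [("anker", 14), ("birke", 9), ("comet", 15),
            ("delta", 7), ("eiche", 11), ("fichte", 12)]
    wed, fri = assign_groups(demo)
    print("Wednesday:", [code for code, _ in wed])
    print("Friday:", [code for code, _ in fri])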
During the introductory meeting the research aims and methods were clearly explained to the
students.
Conducting laboratories in content areas A to D
In order to ensure very similar experiences for students from both groups, laboratory experiments were conducted in accordance with the schedule shown in Figure 1. Each group
completed experiments in four main content areas, two as practical hands-on experiments
and two as computer-based simulations.
One group completed the even laboratory sessions as hands-on experiments and the odd as
computer-based simulations. The other group completed the odd laboratory sessions as
hands-on experiments and the even as computer-based simulations. Both groups attended
hands-on and simulated experiments equally. Topics A1 to A3 were taught in one session; C1 and C2 shared two sessions. This arrangement of laboratory work further reduced the influence of the laboratory mode and of participants' practical inclinations on the assessment of learning outcomes of hands-on and simulated sessions.
Figure 1: Order of content areas over the semester
For content area A in simulation mode, a newly created simulation website was used. For areas B to D, a black-box simulation of the hands-on equipment and the battery cell was accessed through the same graphical user interface as the real hands-on devices. The intention was to exclude any influence from the interface used by the students to control the experiments. The simulation model emulates all observed effects of the real battery cell and of the hands-on devices used in hands-on mode. The cell simulation model was parametrised to match the outcome of the hands-on experiments.
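As an illustration only, a very simple equivalent-circuit cell model of the kind that could sit behind such a black-box simulation is sketched below (open-circuit voltage plus internal resistance). This is a hypothetical sketch, not the authors' actual simulation model; the parameter values and the OCV curve are placeholders.

import math

class SimpleCellModel:
    """Minimal OCV-plus-internal-resistance cell model (placeholder parameters)."""

    def __init__(self, capacity_ah=2.5, r_internal_ohm=0.030, soc=0.5):
        self.capacity_ah = capacity_ah        # nominal capacity in Ah
        self.r_internal_ohm = r_internal_ohm  # DC internal resistance in ohms
        self.soc = soc                        # state of charge, 0..1

    def ocv(self):
        # Placeholder open-circuit-voltage curve of a lithium-ion cell; a real
        # model would be parametrised from measured OCV data of the cell used.
        return 3.0 + 1.2 * self.soc - 0.2 * math.cos(3.0 * self.soc)

    def step(self, current_a, dt_s):
        """Apply a discharge (positive) or charge (negative) current for dt_s
        seconds and return the simulated terminal voltage."""
        self.soc -= current_a * dt_s / (self.capacity_ah * 3600.0)
        self.soc = min(max(self.soc, 0.0), 1.0)
        return self.ocv() - self.r_internal_ohm * current_a

# Example: terminal voltage during a 10 A discharge pulse lasting one second
cell = SimpleCellModel()
print(round(cell.step(10.0, 1.0), 3))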
The laboratory sessions were conducted at the same time of day. Each group (Wednesday n=19, Friday n=21) was split into five smaller learning groups of three to five students. Webb (1989) found that the same student may have different experiences in different groups, with consequent effects on his or her learning. The learning groups remained unchanged for all sessions to exclude any effects on the results caused by changes in cooperative learning.
The students worked autonomously in a supervised environment. Each learning group used
a set of hands-on devices or one simulation PC. All groups were asked to prepare a written
laboratory report for each content area before the next session.
Physical activity and the environment may influence the learning outcome (Larson et al., 2015). For the hands-on sessions the students stood at tables, whereas the simulation sessions took place seated in a computer lab.
Data collection/Testing the learning outcome
Anonymous written tests on knowledge gained during the laboratory exercises were conducted at the beginning of the session that followed the respective laboratory session. These "tests on the content area of the past laboratory session" lasted ten minutes and occupied around four percent of the overall class time. They contained a mix of descriptive and multiple-choice questions, free-text answers and drawings. The questions were directly related to the learning objectives defined for the content area under test. A positive point system (similar to that used for graded tests) was used to evaluate the results.
No names were recorded. Students were identified through the same self-created code-word used in the grouping questionnaire, keeping everything anonymous. This prepares the analysis of individual students' test results in the future. The test papers were not returned to the students.
The target was to keep the time lapse between an experiment and the corresponding test equal for both groups (A: 7 days, B: 7 days, C: 14 days, D: 14 days). For organizational reasons, this was not possible for the first content area, A. Nevertheless, despite the extended period between the laboratory session and the test in hands-on mode (9 days), the results in this mode were better.
For the tests, the computer lab was used to provide the same environment while writing the tests (seated at a desk, as in usual written exams). An exception was made for the test on content area A in simulated mode, which for organizational reasons was written in the chemistry lab.
Anonymity/Research Ethics
Any direct positive or negative effects of the study on individual students' marks in the study program had to be precluded. As mentioned above, both groups attended hands-on and simulated experiments equally. The result of the laboratory itself is a simple pass/fail, depending on regular attendance and the abovementioned laboratory reports. The marks of the accompanying theoretical module were based on the students' performance in a written test created and conducted by an independent lecturer. Although differences in learning outcomes depending on the mode were expected, no inequitable results in the theoretical test were anticipated, because both groups attended hands-on and simulated experiments equally.
From the researchers' side, it was not possible to identify individual students who did not take part in the research. All students had the free choice not to return any of the documents (grouping questionnaire, 10-minute tests on the preceding sessions).
Analysis
The tests on the individual laboratory sessions were evaluated and rated using a point system. The average group results for hands-on and simulated mode were compared to determine which mode transferred the knowledge more successfully. After more data has been collected, it is planned to investigate whether individual students benefit more from one mode or the other. This study continues every year until 2018.
Results
Table 1 shows group-wise results for all content areas covered; Table 2 compares both learning modes.
Table 1: Results (average reached points) of groups

Group     | Content Area | Return Rate | Sample Size | Mean (% of points) | Std. Dev. (% of points) | Learning Mode
----------|--------------|-------------|-------------|--------------------|-------------------------|--------------
Wednesday | A            | 100%        | 18          | 38%                | 17%                     | hands-on
Wednesday | B            | 100%        | 19          | 49%                | 16%                     | simulated
Wednesday | C            | 100%        | 15          | 52%                | 16%                     | hands-on
Wednesday | D            | 100%        | 17          | 45%                | 17%                     | simulated
Wednesday | all          |             |             | 45%                | 18%                     | all
Friday    | A            | 100%        | 19          | 33%                | 16%                     | simulated
Friday    | B            | 100%        | 20          | 54%                | 16%                     | hands-on
Friday    | C            | 100%        | 20          | 47%                | 20%                     | simulated
Friday    | D            | 100%        | 20          | 45%                | 14%                     | hands-on
Friday    | all          |             |             | 46%                | 17%                     | all
Grouping
Both groups performed similarly in total over both modes (Table 1); no group bias had to be taken into account. The Wednesday group reached 45 per cent and the Friday group 46 per cent of the maximum points overall. Standard deviations were also very similar in all tests (between 14% and 20%). Therefore, it was assumed that the grouping was successful for the experiment. When enough data has been collected, it is planned to investigate the correlation between individual performance and questionnaire score, to check the abovementioned assumption that practically experienced students perform better in laboratories.
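A minimal sketch of such a planned correlation check is given below, assuming that each student's questionnaire score and average test result can be paired via the code-word. The data shown are placeholders, not results from the study.

from scipy import stats

# Hypothetical prior-experience scores and corresponding average test results
questionnaire_score = [10, 14, 7, 12, 9, 16]
test_performance = [0.41, 0.52, 0.35, 0.48, 0.44, 0.58]   # fraction of points

r, p = stats.pearsonr(questionnaire_score, test_performance)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")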
Group results
A full return rate from the students present was reached (Table 1). The authors state clearly that no data was omitted, except for one completed test that was rejected because the student reported not having attended the preceding session. One question in the test on content area B was worded incorrectly (and not answered by the students) and was therefore not taken into account in the evaluation.
Mode results
Individual results ranged from 12% to 85% of the points in hands-on mode and from 12% to 88% in simulated mode. The Shapiro-Wilk test indicated that the distribution of all results was normal.

For three content areas (A to C) a Cohen's d of around 0.3 was reached (Table 2). According to the literature (e.g. Rubin, 2013), this is to be interpreted as a weak to moderate effect. It hints that hands-on experiments led to better understanding and knowledge retention than simulated experiments. Content area D showed no effect. Across all returned tests, the data showed a weak effect (Cohen's d = 0.22) towards better results in hands-on mode.
Table 2: Effect hands-on vs. simulated

Content Area | Mean (% of points), hands-on | Mean (% of points), simulated | Std. Dev. (% of points), both modes | Effect size (Cohen's d), "hands-on led to better learning outcome"
-------------|------------------------------|-------------------------------|-------------------------------------|--------------------------------------------------------------------
A            | 38%                          | 33%                           | 16%                                 | 0.28
B            | 54%                          | 49%                           | 16%                                 | 0.32
C            | 52%                          | 47%                           | 18%                                 | 0.30
D            | 45%                          | 45%                           | 15%                                 | 0.04
all          | 47%                          | 44%                           | 17%                                 | 0.22
Nevertheless, the 95% confidence intervals for the mean percentage of points overlapped widely for the two modes (43% to 51% for hands-on, 40% to 48% for simulated). Grouped by learning mode, Levene's test indicated equality of variances (p = 0.741 >> 0.05). An independent-samples t-test showed that, although the point percentages reached after hands-on experiments were higher than after simulated experiments, the difference in performance was not statistically significant (p = 0.215 >> 0.05).
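The statistical tests reported above could, for example, be reproduced along the lines of the following sketch (Shapiro-Wilk, Levene's test, independent-samples t-test and Cohen's d with a pooled standard deviation). This is illustrative only; it is not the authors' analysis script, and the arrays contain placeholder data rather than the collected test results.

import numpy as np
from scipy import stats

hands_on = np.array([0.38, 0.54, 0.52, 0.45, 0.60, 0.33])   # placeholder scores (fraction of points)
simulated = np.array([0.33, 0.49, 0.47, 0.45, 0.41, 0.29])  # placeholder scores (fraction of points)

# Normality of each sample
print("Shapiro-Wilk hands-on:", stats.shapiro(hands_on))
print("Shapiro-Wilk simulated:", stats.shapiro(simulated))

# Equality of variances between the modes
print("Levene:", stats.levene(hands_on, simulated))

# Independent-samples t-test (equal variances assumed, as Levene's test suggested)
print("t-test:", stats.ttest_ind(hands_on, simulated, equal_var=True))

# Cohen's d with pooled standard deviation
n1, n2 = len(hands_on), len(simulated)
pooled_sd = np.sqrt(((n1 - 1) * hands_on.std(ddof=1) ** 2 +
                     (n2 - 1) * simulated.std(ddof=1) ** 2) / (n1 + n2 - 2))
d = (hands_on.mean() - simulated.mean()) / pooled_sd
print("Cohen's d:", round(d, 2))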
Conclusions
This study showed that the described methodology is suitable for isolating the comparison of two learning modes. With identical instructions and learning objectives, and with cooperative learning effects held constant, fewer outside influences acted on the learning outcome generated by the different modes. The modest difference also indicates that some of the excluded factors may have a greater impact on student learning than previously estimated. Just as this study focused on the learning mode while excluding other influences, it is recommended to investigate the quantitative impact of those other factors with a similarly strong focus on a single factor at a time.
The slightly better learning results in hands-on mode are not statistically significant. To obtain statistically significant results, more data collection is necessary. This study will continue every year until 2018 at UAS Ingolstadt, and the methodology will be tested at different universities around the world.

The method used to distribute the students can also be applied in other situations if correlations between individual performance and questionnaire score are found. It might then be feasible, for example, to offer students the learning mode that best suits their predisposition, so that overall performance can be increased even further.
References
Chamberlain, J. M., Lancaster, K., Parson, R. & Perkins K.K., (2014). How guidance affects student
engagement with an interactive simulation. Chem. Educ. Res. Pract., 15, 628
Corter, J., Nickerson, J., Esche, S., Chassapis, C., Im, S., & Ma, J. (2007). Constructing Reality: A
Study of Remote, Hands-on, and Simulated Laboratories, ACM Transactions on Computer-Human
Interaction, Vol. 14, No 2, Article 7
Edward, N. S. (1996). Evaluation of computer based laboratory simulation. Computers & Education
26, 1-3, 123-130
Engum, S. A., Jeffries, P., & Fisher, L. (2003). Intravenous catheter training system: Computer-Based
education versus traditional learning methods. American J. Surgery 186, 1, 67-74
Keller, C. J., Finkelstein, N. D., Perkins, K. K., & Pollock, S. J., (2006). Assessing the Effectiveness of
a Computer Simulation in Introductory Undergraduate Environments. PERC Proceedings, AIP
Conf. Proc. 883, 121
Larson, M. J., Le Cheminant J. D., Hill K., Carbine K., Masterson T., & Christenson E. (2015).
Cognitive and Typing Outcomes Measured Simultaneously with Slow Treadmill Walking or Sitting:
Implications for Treadmill Desks. PLoS ONE 10(4), e0121309
Ma, J., & Nickerson J. V. (2006). Hands-On, Simulated, and Remote Laboratories: A Comparative
Literature Review, ACM Computing Surveys, Vol. 38, No 3, Article 7
Magin, D. J., Churches, A. E., & Reizes, J. A. (1986). Design and experimentation in undergraduate
mechanical engineering. Proceedings of a Conference on Teaching Engineering Designers.
Sydney, Australia. Institution of Engineers, 96-100
McAteer, E., Neil, D., Barr, N., Brown M., Draper, S., & Henderson, F. (1996). Simulation software in a
life sciences practical laboratory. Comput. and Education 26, 1-3, 102-112
Müller, K., & Goericke, D. (2012), Kompetenz-Roadmap, NPE Nationale Plattform Elektromobilität, AG
6 Ausbildung und Qualifizierung, Page 13
NQuE Netzwerk Qualifizierung Elektromobilität. (2016). Database Qualifizierungsangebote im Bereich
akademische Bildung. Retrieved June 18, 2016, from
http://www.nque.de/de/datenbank_akademisch.php
Rubin, A. (2013). Statistics for Evidence-Based Practice and Evaluation, Third Edition. Belmont:
Brooks/Cole. 144-145
Unseld, C., & Reucher, G. (2010). University types: Universities of applied science. Retrieved
September 15, 2015, from http://dw.com/p/Ovf8
Webb, N. M. (1989). Peer interaction and learning in small groups. International Journal of Educational
Research, Volume 13, Issue 1, 21-39
Acknowledgements
This research was approved by the Faculty of Electrical Engineering and Computer Science,
UAS Ingolstadt and the College Human Ethics Advisory Network of RMIT, Melbourne.
Many thanks to the lab engineer Sönke Barra for his assistance in planning, preparing and conducting the experiments. This research would not have been possible without the students, who generously took part in the study to improve the learning outcomes of future groups.
As part of the German Federal Government’s Showcase Regions for Electric Mobility, the
Federal Ministry of Education and Research provided funding for the project “Akad.
Bildungsinitiative zur Elektromobilität Bayern/Sachsen” (“Academic Education Initiative for
Electric Mobility Bavaria/Saxony”) which was used for the development of the prototypes of
the devices. The devices used in the laboratory were built with funding from the Faculty of Electrical Engineering and Computer Science.