Conference Paper

Characterizing Digital Educational Tools for School STEM Education using Educational Metadata

Panagiotis Zervas, Department of Digital Systems, University of Piraeus, Greece, Eleni Boulmanou, Department
of Digital Systems, University of Piraeus, Greece, Demetrios G. Sampson, Department of Digital Systems,
University of Piraeus, Greece & School of Education, Curtin University, Australia
Science, Technology, Engineering and Mathematics (STEM) education has been recognized as a top priority for
school education worldwide [1]. To support this, major initiatives have been developed to engage students in
interesting and motivating STEM experiences [2], [3]. As highlighted by several policy reports, such
initiatives should follow the inquiry-based teaching model [4]. Inquiry is the process in which students
engage with scientifically oriented questions, perform active experimentation, formulate explanations from
evidence, evaluate their explanations in light of alternative explanations, and communicate and justify their
proposed explanations [5]. Inquiry-based learning can be supported and enhanced by several Digital
Educational Tools (DETs) available online, which provide students with ample opportunities to engage in the
inquiry process [6], [7].
However, existing DETs are scattered around the web without a commonly accepted method for characterizing
them. This is a hindering factor that prevents STEM teachers from efficiently and effectively searching for and
retrieving DETs for use in their day-to-day inquiry-based teaching activities. Thus, the aim of this paper is
twofold: (a) to collect a sample of DETs that are currently available online and characterize them with a common
educational metadata schema, and (b) to perform an analysis of the educational metadata of the collected DETs in
order to provide insights into their characteristics.
The remainder of the paper is structured as follows. Section “Background” presents the background of this work, namely the
different types of DETs and their main characteristics, as well as the selected metadata schema that has been
used for describing the collected sample of DETs. Section “Study Methodology” presents the methodology that
was followed for collecting the sample of DETs and the metrics that were used for the metadata analysis. Section
“Results: Analysis of Educational Metadata” presents the analysis of specific metadata elements for the collected
DETs and provides useful insights about their characteristics. Finally, section “Conclusions and Future Work”
concludes the paper and presents potential future work.
Types of Digital Educational Tools for STEM Education
Within the rich literature, there are many types of DETs for STEM education [6], [7], [8], [9], [10]. Based on a
literature review that we have performed, we identified eight main distinct types of DETs for STEM
education, as follows:
- Remote Labs: physical laboratories that can be operated at a distance; they provide students with the opportunity to conduct real experiments and collect real data from a physical laboratory in a remote location [11].
- Virtual Labs: interactive environments for designing and conducting simulated experiments. They can range from simple simulations of physical processes that allow students to manipulate a few variables to accurate simulations of experimental processes complemented with functionalities to measure experimental errors [12].
- Data Set Analysis Tools: data sets are outcomes of investigations with physical or virtual equipment; they often come with dedicated analysis and visualization tools that help students organize and interpret the data set [13].
- Virtual Reality Tools: tools that offer the following features: (a) the illusion of being in a 3D space, (b) the ability to build and interact with 3D objects, (c) digital representations of students in the form of avatars, and (d) the ability to communicate with other students in the virtual reality tool [14].
- Augmented Reality Tools: tools that provide students with technology-mediated immersive experiences in which real and virtual worlds are blended and students’ interactions and engagement are augmented [15].
- Modelling Tools: tools with calculation, exploration and visualization functionalities that can be used for learning abstract STEM concepts, as well as key elements of scientific modelling. These can be concept modelling tools or hypothesis generation tools [16].
- Assessment Tools: tools that students can use to assess their competences and get feedback on their inquiry process. These can be quiz tools or drill-and-practice tools [17].
- Educational Games: learning environments that provide students with a sense of autonomy, identity, and interactivity towards reaching specific educational goals. They are equipped with achievement levels and reward systems; students in these environments can strategize their moves, test hypotheses, and solve problems [18].
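The eight DET types above form a controlled vocabulary for the “Type” metadata element. As a rough sketch (the enum and member names below are ours, not part of any published schema), this vocabulary could be encoded as:

```python
from enum import Enum

class DETType(Enum):
    """Controlled vocabulary for the eight DET types identified in the review."""
    REMOTE_LAB = "Remote Lab"
    VIRTUAL_LAB = "Virtual Lab"
    DATA_SET_ANALYSIS_TOOL = "Data Set Analysis Tool"
    VIRTUAL_REALITY_TOOL = "Virtual Reality Tool"
    AUGMENTED_REALITY_TOOL = "Augmented Reality Tool"
    MODELLING_TOOL = "Modelling Tool"
    ASSESSMENT_TOOL = "Assessment Tool"
    EDUCATIONAL_GAME = "Educational Game"

# Look up a type from its vocabulary string, e.g. when parsing a metadata record:
det_type = DETType("Virtual Lab")
```

Encoding the vocabulary this way rejects misspelled or out-of-vocabulary values at parse time rather than letting them silently enter the repository.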
Metadata Schema for Characterizing Digital Educational Tools for STEM Education
A major European initiative, namely the Inspiring Science Education (ISE) Project (http://www.inspiring-science-), has already developed a metadata schema for characterizing DETs for STEM education. This
metadata schema is presented in Table 1.
Table 1 ISE Project Metadata Schema for Characterizing Digital Educational Tools for STEM Education

| Group | # | Element Name | Description | Vocabulary |
| --- | --- | --- | --- | --- |
| General | 1 | Title | Refers to the complete title of the DET | No |
| General | 2 | Description and Primary Aims | Provides a textual description of the DET, as well as its primary aims | No |
| General | 3 | Type | Refers to the specific type of the DET | Yes |
| General | 4 | Language | Refers to the languages that the DET is available in | Yes |
| General | 5 | Keyword | Refers to a set of terms that characterize the content of the DET | |
| Administrative | 6 | Registration | Denotes whether accessing the DET requires registration | Yes |
| Administrative | 7 | Rights Holder | Refers to the entities that hold the DET’s copyrights | No |
| Administrative | 8 | Lifecycle Dates | Refers to critical dates related to the DET’s lifecycle | No |
| Administrative | 9 | Contact Details | Provides contact details of the person or organization responsible for the DET | No |
| Administrative | 10 | Cost | Refers to any payment required for using the DET | Yes |
| Administrative | 11 | Licence | Provides information about copyrights and restrictions applied to the use of the DET | |
| Administrative | 12 | Provider | Provides information about the provider of the DET | No |
| Administrative | 13 | Status | Provides information about the availability status of the DET | Yes |
| Pedagogical | 14 | Subject Domain | Refers to the DET’s subject domain | Yes |
| Pedagogical | 15 | Age Range | Refers to the age range for which the DET can be used | Yes |
| Pedagogical | 16 | Educational Objectives | Refers to the educational objectives that the DET addresses | No |
| Pedagogical | 17 | Level of Difficulty | Refers to the level of difficulty of the DET | Yes |
| Pedagogical | 18 | Level of Interaction | Refers to the level of interaction that the DET offers | Yes |
| Technical | 19 | Location URL | Provides a URL for accessing the DET | No |
| Technical | 20 | Requirements | Refers to the technical requirements that are needed for using the DET | |
| Additional Material | 21 | Student’s Manual | Refers to the URL of the student’s manual for using the DET | No |
| Additional Material | 22 | Teacher’s Lesson Plan | Provides the URL for accessing the lesson plan that can be used for exploiting the DET | No |
As Table 1 depicts, the metadata schema includes 22 metadata elements divided into five metadata element
groups, namely (a) General Metadata (5 elements), (b) Administrative Metadata (8 elements), (c) Pedagogical
Metadata (5 elements), (d) Technical Metadata (2 elements) and (e) Additional Material Metadata (2 elements). This
metadata schema is exploited in our study for characterizing the sample of DETs that has been collected, as
described in the next section.
Study Methodology
The scope of this section is to present the methodology that has been adopted in our study. More specifically, we
present the process for selecting our sample of DETs, namely the search terms used and the inclusion criteria
applied, as well as the procedure that was followed for analyzing the educational metadata records of our sample.
In order to select a considerable number of DETs, we performed web searches, between February and May of
2015, using the Google search engine. The search terms included: “virtual laboratory”, “remote laboratory”, “data
set”, “analysis tool”, “virtual and remote laboratory”, “virtual and remote lab”, “online laboratory”, “e-laboratory”,
“virtual reality tool”, “VR tool”, “augmented reality tool”, “AR tool”, “modelling tool”, “assessment tool”,
“e-assessment”, “educational game”, “digital game”.
Because the number of DETs currently available on the web is considerably high, we explicitly
determined a set of inclusion criteria for the sample (Table 2).
Table 2 Inclusion Criteria
DETs that are offered under an open access license
DETs that address STEM fields, namely Science, Technology, Engineering, Mathematics
DETs that address school education age ranges
DETs that do not require third-party plugins (such as Java, Flash, MS Silverlight)
DETs that are developed with HTML5 technologies
After deleting duplicate records, the search process yielded 195 DETs. These DETs were characterized
with educational metadata following the metadata schema presented in section “Background”.
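The de-duplication step can be sketched as follows (titles and URLs are invented, and the assumption that a DET is identified by its access URL is ours, not stated in the paper):

```python
# Hypothetical candidate list collected from the web searches; duplicates arise
# when the same DET is found under several search terms.
candidates = [
    {"title": "Pendulum Lab", "url": "https://example.org/pendulum"},
    {"title": "Pendulum Virtual Lab", "url": "https://example.org/pendulum"},  # duplicate URL
    {"title": "Circuit Kit", "url": "https://example.org/circuits"},
]

seen, unique = set(), []
for det in candidates:
    if det["url"] not in seen:      # keep only the first record per URL
        seen.add(det["url"])
        unique.append(det)
```

Keeping the first occurrence preserves the order in which tools were found while removing later duplicates.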
The procedure that was followed for analyzing the educational metadata records of the DETs of the sample
comprises the following steps:
Step 1: Select Metadata Elements to be Analyzed: we focused on metadata elements that are "vocabulary
elements", namely elements that can be filled with a limited set of predetermined values established by the
metadata schema presented in section “Background”. The metadata elements selected for analysis were key
elements of the presented metadata schema, namely “Type”, “Subject Domain” and “Age Range”.
Step 2: Measure the Entropy of the Metadata Values: the next step was to measure the entropy of the values
(as proposed in [19]) contained in the elements selected in Step 1, in order to determine the amount of information
they carry. Entropy as a measurement of the information contained in a message was proposed by Shannon [20], and
it denotes the variety of information included in the message; small values denote small variety of the information
included in the message. For example, if all the metadata records in our sample have the field “Subject Domain”
set to “Science”, a new instance with this field set to “Science” carries little information, meaning that it does not
help to distinguish this particular DET from the rest. On the other hand, if the new educational metadata
record has the “Subject Domain” field set to “Mathematics”, it is highly likely (based on the DETs of our
sample) that this value helps to differentiate this new DET from the others. The entropy values for the
aforementioned vocabulary elements have been calculated based on the formula below, which has been
proposed in [19]:
Entropy(value) = 1 - log(times(value)) / log(n)
where times(value) is the number of times that the vocabulary value is present in this metadata field across the
educational metadata records of our sample, and n is the total number of educational metadata records, namely
195 for the sample of DETs that has been collected. When times(value) is 0 (the value is not present in the
sample), the entropy is, by convention, 1. On the other hand, if times(value) is equal to n (all the instances have the same
value), the entropy is 0. For each of the aforementioned metadata elements, entropy values have been
calculated and are presented in the next section. Further to the calculation of the entropy values for each
metadata element, we have calculated the occurrence frequency of each vocabulary value per metadata element
within our sample of online DETs.
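The calculation above can be sketched in Python; the sample values in the snippet are illustrative, not the paper’s data, and the convention for an absent value is handled explicitly:

```python
import math
from collections import Counter

def entropy(times_value: int, n: int) -> float:
    """Entropy of a vocabulary value per Ochoa & Duval [19]:
    1 - log(times(value)) / log(n), with entropy 1 when the value is absent."""
    if times_value == 0:
        return 1.0      # value not present in the sample (by convention)
    return 1.0 - math.log(times_value) / math.log(n)

# Occurrence frequencies of a vocabulary element across metadata records
# (these values are illustrative, not taken from the collected sample):
values = ["Science", "Science", "Science", "Mathematics"]
freq = Counter(values)
entropies = {v: entropy(c, len(values)) for v, c in freq.items()}
```

The base of the logarithm cancels out in the ratio, so natural logarithms, base-2 or base-10 logarithms all give the same result.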
Results: Analysis of Educational Metadata
In this section, we present the results from the analysis of the educational metadata records of the DETs in
our sample. More specifically, Figures 1, 3, 5 and 7 present heat map views of the entropy values for the
different vocabulary values of the metadata elements (a) “DET Type”, (b) “Subject Domain” and (c) “Age Range”.
Moreover, Figures 2, 4, 6 and 8 present the occurrence frequency of the vocabulary values within the
educational metadata records of our sample.
| DET Type | Entropy |
| --- | --- |
| Virtual Lab | 0.265844617 |
| Remote Lab | 0.336903498 |
| Modelling Tool | 0.368062813 |
| Assessment Tool | 0.368062813 |
| Virtual Reality Tool | 0.405368011 |
| Educational Game | 0.431873217 |
| Augmented Reality Tool | 0.583306512 |
| Data Set Analysis Tool | 0.660201096 |

Figure 1 Heat map Table of the Entropy Values for the Vocabulary Values of the Metadata Element “DET Type”

| DET Type | Occurrence Frequency |
| --- | --- |
| Virtual Lab | 48 |
| Remote Lab | 33 |
| Modelling Tool | 28 |
| Assessment Tool | 28 |
| Virtual Reality Tool | 23 |
| Educational Game | 20 |
| Augmented Reality Tool | 9 |
| Data Set Analysis Tool | 6 |

Figure 2 Occurrence Frequency of the Vocabulary Values of the Metadata Element “DET Type”
| Subject Domain | Entropy |
| --- | --- |
| Science | 0.104019789 |
| Mathematics | 0.178638491 |
| Technology & Engineering | 0.572002040 |

Figure 3 Heat map Table of the Entropy Values for the Vocabulary Values of the Metadata Element “Subject Domain”

[Figure 4: Occurrence Frequency of the Vocabulary Values of the Metadata Element “Subject Domain”; chart values not recoverable]
| Subject Domain: Science | Entropy |
| --- | --- |
| Physics | 0.193106057 |
| Astronomy | 0.342231739 |
| Chemistry | 0.392387939 |
| Biology | 0.412362144 |
| Geography and Earth Science | 0.412362144 |
| Environmental Education | 0.611644026 |

Figure 5 Heat map Table of the Entropy Values for the Vocabulary Values of the Metadata Element “Subject Domain” for the Value “Science”

| Subject Domain: Science | Occurrence Frequency |
| --- | --- |
| Physics | 57 |
| Astronomy | 27 |
| Chemistry | 21 |
| Biology | 19 |
| Geography and Earth Science | 19 |
| Environmental Education | 7 |

Figure 6 Occurrence Frequency of the Vocabulary Values of the Metadata Element “Subject Domain” for the Value “Science”
| Age Range | Entropy |
| --- | --- |
| 9-12 | 0.167504434 |
| 12-15 | 0.246885262 |
| 15-18 | 0.259481541 |
| 6-9 | 0.295783156 |

Figure 7 Heat map Table of the Entropy Values for the Vocabulary Values of the Metadata Element “Age Range”

| Age Range | Occurrence Frequency |
| --- | --- |
| 9-12 | 134 |
| 12-15 | 84 |
| 15-18 | 78 |
| 6-9 | 63 |

Figure 8 Occurrence Frequency of the Vocabulary Values of the Metadata Element “Age Range”
Based on the aforementioned metadata analysis results, we note the following main findings:
Virtual labs are dominant in our sample. This can be explained by the fact that virtual labs are not very costly to
implement and do not require advanced technical skills. On the other hand, other types of DETs, such as
augmented reality tools or data set analysis tools, require advanced technical skills (rendering, visualizing and
augmenting real objects in the case of augmented reality tools) or specialized knowledge (the structure and format
of the data set in the case of data set analysis tools) to be developed. Finally, all other types of DETs are almost
equally distributed in our sample, which suggests that there are no major hindering factors for interested parties to
develop them and offer them online for use in STEM education.
Most of the DETs support science education rather than Mathematics and/or Technology and Engineering. This
can be explained by the fact that most DETs are mainly used to engage students in science experiments.
However, there is also a considerable number of DETs that can be used for engaging students in understanding
and exploring mathematical, as well as technology and engineering, concepts. Moreover, it is worth mentioning that
the dominant sub-subject of science education addressed by most DETs is Physics. This is explained by
the fact that Physics includes many concepts that can be taught with the support of DETs following the inquiry-based teaching model.
The age ranges in our sample are almost equally distributed. This means that DETs for supporting inquiry-based
teaching are available for both primary (6-9 and 9-12 age ranges) and secondary education (12-15 and 15-18 age
ranges). This can be explained by the fact that most school curricula worldwide aim to engage students in the
inquiry process during the entire spectrum of school education.
Conclusions and Future Work
In this paper, we presented an analysis of the educational metadata records of DETs that were collected and
will be included in the DETs Repository of the Inspiring Science Education Project. Based on this analysis, we
concluded that the DETs currently available online: (a) are mainly virtual labs and modelling tools, (b) support the
science subject domain, and more specifically physics, and (c) can support both primary and secondary education.
Future work in this agenda will be performed in two axes, as follows: (a) we will further analyze this sample towards
producing additional conclusions about other pedagogical characteristics of existing DETs (e.g., interactivity level and
difficulty level), as well as other characteristics of existing DETs (e.g., technical or administrative characteristics),
and (b) we will focus on the development of recommender mechanisms that can facilitate STEM teachers in the
process of selecting DETs for their day-to-day inquiry-based teaching activities.
Acknowledgements
The work presented in this paper has been partly supported by the Inspiring Science Project that is funded by
the European Commission's CIP-ICT Policy Support Programme (Project Number: 325123). This document does
not represent the opinion of the European Commission, and the European Commission is not responsible for any
use that might be made of its content.
References
1. Johnson, L., Adams, S., Cummins, M., & Estrada, V. (2012). Technology Outlook for STEM+ Education 2012-
2017: An NMC Horizon Report Sector Analysis. Austin, Texas: The New Media Consortium.
2. De Jong, T., Linn, M. C., & Zacharia, Z. C. (2013). Physical and virtual laboratories in science and engineering
education. Science, 340(6130), pp. 305-308.
3. De Jong, T., Van Joolingen, W. R., Giemza, A., Girault, I., Hoppe, U., Kindermann, J., ... & Van Der Zanden, M.
(2010). Learning by creating and exchanging objects: The SCY experience. British Journal of Educational
Technology, 41(6), pp. 909-921.
4. Rocard, M., Csermely, P., Jorde, D., Lenzen, D., Walberg-Henriksson, H., & Hemmo, V. (2007). Science
education now: A renewed pedagogy for the future of Europe. Brussels: European Commission.
5. National Research Council. (2000). Inquiry and the national science education standards: A guide for teaching
and learning. Washington DC: National Academy Press.
6. Bell, T., Urhahne, D., Schanze, S., & Ploetzner, R. (2010). Collaborative inquiry learning: Models, tools, and
challenges. International Journal of Science Education, 32(3), pp. 349-377.
7. Clark, D., Nelson, B., Sengupta, P., & D’Angelo, C. (2009). Rethinking science learning through digital games and
simulations: Genres, examples, and evidence. In Learning science: Computer games, simulations, and education
workshop sponsored by the National Academy of Sciences. Washington, DC.
8. Webb, M. E. (2005). Affordances of ICT in science learning: implications for an integrated pedagogy. International
journal of science education, 27(6), pp. 705-735.
9. McFarlane, D. A. (2013). Understanding the Challenges of Science Education in the 21st Century: New
Opportunities for Scientific Literacy. International Letters of Social and Humanistic Sciences, 4, pp. 35-44.
10. de Jong, T., Linn, M.C., & Zacharia, Z.C. (2013). Physical and virtual laboratories in science and engineering
education. Science, 340(6130), pp. 305-308.
11. Gomes L., & Bogosyan, S. (2009). Current trends in remote laboratories. IEEE Transactions on Industrial
Electronics, 56(12), pp. 4744-4756.
12. Balamuralithara B., & Woods P.C. (2009). Virtual laboratories in engineering education: The simulation lab and
remote lab. Computer Applications in Engineering Education, 17(1), pp. 108-118.
13. Renear, A. H., Sacchi, S., & Wickett, K. M. (2010). Definitions of dataset in the scientific and technical literature.
Proceedings of the American Society for Information Science and Technology, 47(1), pp. 1-4.
14. Bailenson, J. N., Yee, N., Blascovich, J., Beall, A. C., Lundblad, N., & Jin, M. (2008). The use of immersive virtual
reality in the learning sciences: Digital transformations of teachers, students, and social context. The Journal of
the Learning Sciences, 17(1), pp. 102-141.
15. Wu, H. K., Lee, S. W. Y., Chang, H. Y., & Liang, J. C. (2013). Current status, opportunities and challenges of
augmented reality in education. Computers & Education, 62, pp. 41-49.
16. Teodoro, V. D., & Neves, R. G. (2011). Mathematical modelling in science and mathematics education. Computer
Physics Communications, 182(1), pp. 8-10.
17. Ramos, J., Trenas, M. A., Gutiérrez, E., & Romero, S. (2013). E-assessment of Matlab assignments in Moodle:
Application to an introductory programming course for engineers. Computer Applications in Engineering
Education, 21(4), pp. 728-736.
18. Conrad, S., Clarke-Midura, J., & Klopfer, E. (2014). A framework for structuring learning assessment in a
massively multiplayer online educational game: experiment centered design. International Journal of Game-Based
Learning (IJGBL), 4(1), pp. 37-59.
19. Ochoa, X., & Duval, E. (2009). Automatic evaluation of metadata quality in digital repositories. International
Journal on Digital Libraries, 10(2-3), pp. 67-91.
20. Shannon, C. E. (2001). A mathematical theory of communication. ACM SIGMOBILE Mobile Computing and
Communications Review, 5(1), pp. 3-55.