University students' digital spaces in online learning communities: Implications for understanding, protecting and maintaining privacy

Wiley
British Journal of Educational Technology
Abstract

In addition to learning efficiently and effectively in online learning communities (OLCs), students must be in a secure environment in which their privacy is respected. The study aimed to identify privacy violations that university students encounter in OLCs and to identify recommendations and strategies that can be implemented to ensure privacy in the OLC. A qualitative research method was applied in the study. First, focus group interviews were conducted with a total of 108 students who participated actively in the OLC. After analysing the interview data, a report of privacy violations in the OLC (RPV‐OLC) was prepared. Based on the RPV‐OLC, a problem description for the Delphi technique was designed and 17 experts were identified for interview. The Delphi panels with the experts were completed in 5 months. Content analysis was applied to the data obtained. Students reported encountering problems with information sharing and access, system/course settings, resistance to negotiation, communication problems and lack of knowledge. The experts, in turn, suggested that various strategies could be implemented to ensure privacy in the OLC, such as partial privacy, mutual trust, contextual privacy agreements, role‐based or anonymous identity use, and privacy education. The study focused on defining the concept of privacy and student privacy violations in online learning environments and on providing solutions for educators, students and system administrators.

Practitioner notes

What is already known about this topic
- Digital learning environments pose a threat to the privacy of students.
- Privacy violations in the OLC can trigger concerns among students, which has a negative impact on learning processes.
- There is currently high interest in how to plan the most effective teaching in online learning environments, but privacy is being neglected.

What this paper adds
- This article explains the privacy violations university students encounter in the OLC.
- It contains guidelines on how to mitigate and prevent privacy violations in the OLC.
- The results of the study also provide suggestions for privacy protection in the OLC.

Implications for practice and/or policy
- It is important that instructors who participate in the OLC are informed about the necessary measures, rules, principles, activities and follow‐up to protect their personal privacy.
- If students' concerns about privacy violations in the OLC can be overcome, more comfortable learning spaces can be created.
- The aim of learning should not be peak performance alone; the most important responsibility of educators and system administrators is to preserve each student's private space while achieving the best performance.
Article
In a digital ecosystem where large amounts of data related to user actions are generated every day, important concerns have emerged about the collection, management, and analysis of these data and, accordingly, about user privacy. In recent years, users have become accustomed to organizing in and relying on digital communities to support and achieve their goals. In this context, the present study aims to identify the main privacy concerns in user communities on social media and how these affect users’ online behavior. In order to better understand online communities in social networks, privacy concerns, and their connection to user behavior, we developed an innovative and original methodology that combines elements of machine learning as a technical contribution. First, a complex network visualization algorithm known as ForceAtlas2 was used through the open-source software Gephi to visually identify the nodes that form the main communities in the sample of UGC collected from Twitter. Then, a sentiment analysis was applied with TextBlob, and machine learning experiments were developed with a support vector classifier (SVC), multinomial naïve Bayes (MNB), logistic regression (LR), and a random forest classifier (RFC) under the theoretical frameworks of computer-aided text analysis (CATA) and natural language processing (NLP). As a result, a total of 11 user communities were identified: three positive (protection software, cybersecurity, and eCommerce), three negative (privacy settings, personal information, and social engineering), and five neutral (privacy concerns, hacking, false information, impersonation, and cookie data). The paper concludes with a discussion of the results and their relation to user behavior in digital environments and outlines valuable and practical insights into some techniques and challenges related to users’ personal data.
Article
With the rapid development of mobile positioning technologies, location-based services (LBSs) have become more widely used. The amount of user location information collected and applied has increased, and if these datasets are directly released, attackers may infer other unknown locations through partial background knowledge in their possession. To solve this problem, a privacy-preserving method for trajectory data publication based on local preferential anonymity (LPA) is proposed. First, the method considers suppression, splitting, and dummy trajectory adding as candidate techniques. Second, a local preferential (LP) function based on the analysis of location loss and anonymity gain is designed to effectively select an anonymity technique for each anonymous operation. Theoretical analysis and experimental results show that the proposed method can effectively protect the privacy of trajectory data and improve the utility of anonymous datasets.
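The selection step of the local preferential (LP) function can be illustrated with a small sketch: each candidate technique (suppression, splitting, dummy adding) is scored by the anonymity it gains against the location information it loses, and the best-scoring one is applied. The ratio form of the score and all numbers below are hypothetical stand-ins; the paper defines its own LP function.

```python
# Hypothetical sketch of one LP-based selection step. The score prefers
# techniques that gain the most anonymity per unit of location utility lost.

def lp_score(anonymity_gain: float, location_loss: float) -> float:
    """Gain-per-loss ratio; epsilon avoids division by zero."""
    return anonymity_gain / (location_loss + 1e-9)

def select_technique(candidates):
    """candidates: list of (name, anonymity_gain, location_loss) tuples."""
    return max(candidates, key=lambda c: lp_score(c[1], c[2]))[0]

candidates = [
    ("suppression", 0.6, 0.5),  # drop a sensitive location point
    ("splitting",   0.4, 0.2),  # split the trajectory into sub-trajectories
    ("dummy",       0.7, 0.4),  # add a plausible dummy trajectory
]
print(select_technique(candidates))  # → splitting (highest gain per loss)
```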
Article
The datafication of learning has created vast amounts of digital data which may contribute to enhancing teaching and learning. While researchers have successfully used learning analytics, for instance, to improve student retention and learning design, the topic of privacy in learning analytics from students' perspectives requires further investigation. Specifically, there are mixed results in the literature as to whether students are concerned about privacy in learning analytics. Understanding students' privacy concern, or lack of privacy concern, can contribute to successful implementation of learning analytics applications in higher education institutions. This paper reports on a study carried out to understand whether students are concerned about the collection, use, and sharing of their data for learning analytics, and what contributes to their perspectives. Students in a laboratory session (n = 111) were shown vignettes describing data use in a university and an e-commerce company. The aim was to determine students' concern about their data being collected, used, and shared with third parties, and whether their concern differed between the two contexts. Students' general privacy concerns and behaviours were also examined and compared to their privacy concern specific to learning analytics. We found that students in the study were more comfortable with the collection, use, and sharing of their data in the university context than in the e-commerce context. Furthermore, these students were more concerned about their data being shared with third parties in the e-commerce context than in the university context. Thus, the study findings contribute to deepening our understanding about what raises students’ privacy concern in the collection, use and sharing of their data for learning analytics. We discuss the implications of these findings for research on and the practice of ethical learning analytics.
Article
Continuously and rapidly developing information and communication technologies continue to permeate every area and every moment of our lives at the same pace. These developments bring with them digital identity, citizenship and life. Social media platforms, which have proliferated with the spread of smart devices and the Internet, also facilitate the transition to digital life. However, while this digital life offers many advantages, it also harbours certain uncertainties and risks. Therefore, coping with the challenges of living in the digital world and becoming a conscious and responsible digital citizen requires a set of technical, cognitive and social skills. According to the World Economic Forum, privacy management is one of the eight key digital life skills that children should possess in the 21st century. The aim of this study is to evaluate conceptual and research trends concerning digital privacy, a fairly new concept. The study was designed as a literature review covering previously published research in Turkish. Relevant studies were accessed through national and international journals, the Council of Higher Education (YÖK) graduate thesis database and the Google search engine. The studies found were summarised using the 4N1K (5W1H) hourglass model.
Chapter
Modern democracies need an educated citizenry to survive and to thrive. “If a nation expects to be ignorant and free, in a state of civilization,” Thomas Jefferson wrote in 1816, “it expects what never was and what never will be.” Democratic societies are both enriched and challenged by a diverse citizenry. Democratic education encompasses the varied institutional structures and curricular contents that are suitable for educating free citizens of democratic societies. This entry considers key aspects of democratic education: the question of governance, challenges of multiculturalism, democratic higher education, and education in transitional democracies.
Article
Aula is a mandatory public school platform in Denmark with more than two million users. The idea behind Aula was to provide a shared space for communication and cooperation around children, both within the school/municipality setting and between teachers and parents, while adhering to the requirements of the EU's General Data Protection Regulation. In this article we examine the incorporation of Aula into the daily practices of teachers, especially as they relate to children's privacy and data protection. Based on qualitative interviews with nine teachers and four experts, and drawing on practice theory, platform theory, and theories on children's privacy, we find that Aula, despite the intentions behind it, fails to support the complex nature of teachers' work practices and, therefore, to provide a solid data protection framework for the children. As such, the teachers mainly view the platform as being conducive to their non-sensitive communication with parents and deploy a range of other digital tools to support, for instance, cooperation with colleagues. Consequently, gaps in children's privacy and data protection arise.
Article
The question of how best to integrate the views of underrepresented and marginalized groups in the evaluation process is of critical importance to many evaluation theorists and practitioners. In this article the Delphi technique, a method used to achieve consensus on a set of issues with the participation of all interested parties without incident or confrontation that could compromise the validity of collected data, is offered as a procedure for enhancing marginalized group participation in the evaluation process. Demonstrated by a case example, the Delphi technique is used to help ensure that all relevant stakeholders have a voice and that sometimes-silenced voices have equal influence. As a result, it is suggested that this technique lends itself to implementation with social justice evaluation models. The benefits of and lessons learned when using the Delphi technique to promote marginalized group participation and representation in evaluations are discussed.
Article
Industry 4.0 has become a reality by fusing the Industrial Internet of Things (IIoT) and Artificial Intelligence (AI), providing huge opportunities in the way manufacturing companies operate. However, the adoption of this paradigm shift, particularly in the field of smart factories and production, is still in its infancy, suffering from various issues such as a lack of high-quality data, data with high class imbalance, or poor diversity, leading to inaccurate AI models. Moreover, data is severely fragmented across different silos owned by several parties for a range of reasons, such as compliance and legal concerns, preventing discovery and insight-driven IIoT innovation. Notably, valuable and even vital information often remains unutilized, as the rise and adoption of AI and IoT have gone hand in hand with concerns and challenges around privacy and security. This adversely influences inter- and intra-organizational collaborative use of IIoT data. To tackle these challenges, this article leverages emerging multi-party technologies, privacy-enhancing techniques (e.g., federated learning), and AI approaches to present a holistic, decentralized architecture as a foundation for a cross-company collaboration platform and a federated data space addressing the creeping fragmentation of the data landscape. Moreover, to evaluate the efficiency of the proposed reference model, a collaborative predictive diagnostics and maintenance case study is mapped to an edge-enabled IIoT architecture. Experimental results show the potential advantages of using the proposed approach for multi-party applications, accelerating sovereign data sharing through the Findable, Accessible, Interoperable, and Reusable (FAIR) principles.
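As a concrete illustration of the privacy-enhancing technique named above (federated learning), here is a minimal federated averaging loop in which two parties fit a shared linear model without ever exchanging raw data, only model parameters. The model, datasets, and learning rate are illustrative assumptions, not the article's architecture.

```python
# Minimal FedAvg sketch: each party takes a local gradient step on its own
# private data; the server only averages the resulting parameters.

def local_update(w, data, lr=0.1):
    """One gradient step of 1-D linear regression y ~ w*x on local data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(local_weights):
    """Server step: average the parameters contributed by each party."""
    return sum(local_weights) / len(local_weights)

# Two factories with private datasets drawn from the same trend y = 2x
party_a = [(1.0, 2.0), (2.0, 4.0)]
party_b = [(3.0, 6.0), (4.0, 8.0)]
w = 0.0
for _ in range(50):  # communication rounds
    w = fed_avg([local_update(w, party_a), local_update(w, party_b)])
print(round(w, 2))  # → 2.0, the shared slope, learned without pooling data
```

Real deployments add secure aggregation and differential privacy on top of this loop, since shared gradients can themselves leak information about local data.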
Article
The protection of personal data and privacy are important issues closely related to the use of social media, information and communication technologies, and the Internet in education. The treatment of academic information and the use of tools and programs for instruction, communication, and learning involve the handling of a significant volume of personal data from different sources. It is essential to protect this information from possible privacy violations. This descriptive study, of transversal non-experimental design, focuses on how 384 pre-service teachers enrolled in educational technology courses in their education programs view the protection of personal data. The goals are to describe and analyze how these teachers perceive the risks associated with the protection of data on the Internet and what they know about data protection in primary education. We administered a questionnaire within the framework of an educational activity focused on digital competence in data protection in education. The results show a high perception of risk in topics such as accepting cookies when surfing the Internet or transferring banking information. The knowledge the students claim to have shows a lack of information on the protection of minors' data in issues related to the development and schooling of primary school students, as well as their health, background, and family environment. Curricular treatment of these areas in pre-service teacher education, including content and practice on regulations and adopting a situated, critical, and responsible approach, is recommended.