Anastasija Nikiforova
University of Tartu · Institute of Computer Science

PhD
data management, data quality, open data, OGD, data science, IoT, Smart city, Society 5.0, disruptive technologies

About

75
Publications
20,462
Reads
433
Citations (since 2017)
Introduction
Anastasija Nikiforova is a researcher (PhD in Computer Science – Data Processing Systems and Data Networking) whose research interests include data management with a focus on data quality, open government data, IoT, sustainable development, Smart City, Society 5.0, and digitization. She is an Assistant Professor of Information Systems at the University of Tartu (Institute of Computer Science) and a member of the European Open Science Cloud Task Force “FAIR Metrics and Data Quality”. More at anastasijanikiforova.com.
Additional affiliations
April 2022 - present
University of Tartu
Position
  • Assistant professor (Lecturer)
January 2022 - June 2022
Delft University of Technology
Position
  • Visiting researcher
January 2021 - December 2021
Latvian Biomedical Research and Study Centre
Position
  • IT-expert
Description
  • H2020 INTEGROMED: inspection of the current data ecosystem of the Latvian Biomedical Research and Study Centre, which resulted in a set of guidelines for efficient data management by heterogeneous data holders and exchangers, developed and presented during European Biobank Week 2021. ERDF DECIDE: development of a dynamic informed consent system for biobank and citizen science data management, quality control, and integration; the latter involved software engineering tasks.
Education
September 2013 - June 2020
University of Latvia
Field of study
  • Computer Sciences

Publications

Publications (75)
Conference Paper
Full-text available
Open Government Data (OGD) are seen as one of the trends that have the potential to benefit the economy, improve the quality, efficiency, and transparency of public administration, and change the lives of citizens and society as a whole, facilitating efficient sustainability-oriented data-driven services. However, the quick achievement of these...
Article
Full-text available
The development of new technologies brings with it "technical superiority", but new technologies can also be a stress test for existing political systems, which may fail as a result, leading to cascading effects that threaten fundamental precepts of democratic societies and their key institutions. If policymakers fail to recognise these challenges,...
Chapter
Full-text available
Open data are characterized by a number of economic, environmental, technological, innovative, and social benefits. They are seen as a significant contributor to a city’s transformation into a smart city. This is all the more so when society is on the border of Society 5.0, that is, the shift from the information society to a super smart society or...
Preprint
Full-text available
The OGD is seen as a political and socio-economic phenomenon that promises to promote civic engagement and stimulate public sector innovations in various areas of public life. To bring the expected benefits, data must be reused and transformed into value-added products or services. This, in turn, sets another precondition for data that are expected...
Article
Full-text available
During the COVID-19 pandemic, open government data (OGD) was often used as a valuable crisis management resource. Unfortunately, there is limited research that explores how OGD can be used during times of crisis as a crisis management tool. To ensure that OGD can be used effectively in future crises, there is a need to understand how it may be used...
Chapter
Full-text available
Today, large amounts of data are being continuously produced, collected, and exchanged between systems. As the number of devices, systems and data produced grows, the risk of security breaches increases. This is all the more relevant in times of Covid-19, which has affected not only the health and lives of human beings but also the lifestyle of...
Article
Full-text available
Users should trust the data that are managed by software applications constituting the Information Systems (IS). This means that organizations should ensure an appropriate level of quality of the data they manage in their IS. Therefore, the requirement for the adequate level of quality of data to be managed by IS must be an essential requirement fo...
Chapter
Full-text available
Nowadays, there are billions of interconnected devices forming Cyber-Physical Systems (CPS), Internet of Things (IoT) and Industrial Internet of Things (IIoT) ecosystems. With an increasing number of devices and systems in use, and a growing amount and value of data, the risks of security breaches increase. One of these risks is posed by open data sources, whic...
Preprint
Purpose The purpose of this paper is to highlight the drivers, barriers, benefits and risks affecting the integration of Internet of Things (IoT) into the e-government and to provide a future research agenda. Design/methodology/approach Existing literature examining the relationships between e-government and IoT is scanned and evaluated by concept...
Preprint
Full-text available
The European Open Science Cloud (EOSC) Association leverages thirteen Task Forces (TFs), grouped into five Advisory Boards, to help steer the implementation of EOSC. This document is released by the Data Quality subgroup of the “FAIR Metrics and Data Quality” TF. Data quality is critical in ensuring the credibility, legitimacy, and actionability of...
Chapter
Full-text available
A hackathon is known as a form of civic innovation in which participants representing citizens can point out existing problems or social needs and propose a solution. Given the high social, technical, and economic potential of open government data (OGD), the concept of open data hackathons is becoming popular around the world. This concept has beco...
Article
Full-text available
The rise of mobile applications has helped to provide information about a broader range of products remotely. They simplify the identification of products by using their barcode or even an image of the item. This paper, therefore, aims to create an e-commerce assistant Android application that incorporates machine learning, more precisely, image cla...
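The paper above targets an Android app; purely as an illustration of the image-classification step it describes, here is a minimal Python/Keras sketch using a pretrained MobileNetV2. The model choice, generic ImageNet labels, and file name are assumptions for illustration, not the classifier built in the paper.

```python
# Illustrative sketch of image-based product identification with a pretrained
# MobileNetV2 classifier (Keras). The real application runs on Android; this
# Python version only demonstrates the classification step.
import numpy as np
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, preprocess_input, decode_predictions)
from tensorflow.keras.preprocessing import image

model = MobileNetV2(weights="imagenet")  # generic labels, used as a stand-in

def classify_product(img_path: str, top: int = 3):
    """Return the top predicted labels for a product photo."""
    img = image.load_img(img_path, target_size=(224, 224))
    batch = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    preds = model.predict(batch)
    return [(label, float(score)) for _, label, score in
            decode_predictions(preds, top=top)[0]]

if __name__ == "__main__":
    print(classify_product("product_photo.jpg"))  # hypothetical image file
```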
Book
Full-text available
This book constitutes selected and revised papers presented at the First International Conference on Electronic Governance with Emerging Technologies, EGETC 2022, held in Tampico, Mexico, in September 2022. The 15 full papers and 2 short papers presented were thoroughly reviewed and selected from the 54 submissions. This volume focuses on the rec...
Article
Full-text available
Although FAIR Research Data Principles are targeted at and implemented by different communities, research disciplines, and research stakeholders (data stewards, curators, etc.), there is no conclusive way to determine the level of FAIRness intended or required to make research artefacts (including, but not limited to, research data) Findable, Acces...
Preprint
Full-text available
Recently, there has been increasing awareness of the tremendous opportunities inherent in quantum computing (QC). Specifically, the speed and efficiency of QC will significantly impact the Internet of Things, cryptography, finance, and marketing. Accordingly, there has been increased QC research funding from national and regional governments and pr...
Article
Full-text available
Although FAIR Research Data Principles are targeted at and implemented by different communities, research disciplines, and research stakeholders (data stewards, curators, etc.), there is no conclusive way to determine the level of FAIRness intended or required to make research artefacts (including, but not limited to, research data) Findable, Acces...
Article
Full-text available
With the increase in the amount and variety of data that are constantly produced, collected, and exchanged between systems, the efficiency and accuracy of solutions/services that use data as input may suffer if an inappropriate or inaccurate technique, method, or tool is chosen to deal with them. This paper presents a global overview of urban data...
Conference Paper
Full-text available
Open Government Data (OGD) is a fundamental source for sustainability-oriented and data-driven innovation by citizens, companies, and other actors. However, many government agencies are reluctant to openly share their data with the public. While the resistance of public organizations to openly share government data has been investigated in previous...
Article
Full-text available
Consolidation of the research information improves the quality of data integration, reducing duplicates between systems and enabling the required flexibility and scalability when processing various data sources. We assume that the combination of a data lake as a data repository and a data wrangling process should allow low-quality or “bad” data to...
Conference Paper
Full-text available
Digitization in the research domain refers to the increasing integration and analysis of research information in the process of research data management. However, it is not clear whether this information is actually used and, more importantly, whether the data are of sufficient quality and whether value and knowledge can be extracted from them. FAIR principles (Findability, A...
Conference Paper
Full-text available
Object detection is one of the major research problems in the computer vision domain. Object detection based on images from unmanned aerial vehicles (UAVs, i.e. drones) has versatile applications in defence and security, agriculture, and GIS. However, real-time object detection in UAV scenarios remains quite a tedious problem d...
Preprint
Purpose With the development of information technology (IT), governments around the globe are using state-of-the-art IT interfaces to implement the so-called 3E’s in public service delivery, that is, economy, efficiency and effectiveness. Two of these IT interfaces relate to Artificial Intelligence (AI) and Internet of Things (IoT). While AI focuse...
Conference Paper
A hackathon is a form of social innovation in which participants can point out existing problems or social needs and offer solutions. Generation Z is supposed to be the most appropriate audience representing "digital natives". Gen Z open data hackathons are organized annually in Latvia. However, the organizer assumes that the goal of the hackathons...
Preprint
Full-text available
Today, more and more people are reporting allergies, whose reactions can range from simple discomfort to anaphylactic shock. Other people may not be allergic but avoid certain foods for personal reasons. The daily food shopping of these people is hampered by the fact that unwanted ingredients can be hidden in any food, and it is difficult to...
Preprint
Full-text available
A hackathon is known as a form of civic innovation in which participants representing citizens can point out existing problems or social needs and propose a solution. Given the high social, technical, and economic potential of open government data (OGD), the concept of open data hackathons is becoming popular around the world. This concept has beco...
Conference Paper
Full-text available
Integrating artificial intelligence (AI) technologies into customer service is of great interest to firms that face a mass of feedback originating from multiple channels (Poser et al., 2022), including phone calls, emails, and social media. Particularly social media channels have increased their popularity in recent years as a notable channel of cu...
Chapter
Full-text available
Today, the popularity of semantic models and ontologies is increasing rapidly. This leads not only to the high number of general ontologies, but also to a variety of domain-specific ontologies where the medical or healthcare domain plays a major role. The increasing popularity of ontologies, particularly in different non-IT domains, has an impact o...
Article
Full-text available
This paper focuses on the issue of the transparency maturity of open data ecosystems seen as the key for the development and maintenance of sustainable, citizen-centered, and socially resilient smart cities. This study inspects smart cities’ data portals and assesses their compliance with transparency requirements for open (government) data. The ex...
Preprint
Full-text available
Today, large amounts of data are being continuously produced, collected, and exchanged between systems. As the number of devices, systems and data produced grows, the risk of security breaches increases. This is all the more relevant in times of COVID-19, which has affected not only the health and lives of human beings but also the lifestyle of...
Article
Full-text available
The data management process is characterised by a set of tasks where data quality management (DQM) is one of the core components. Data quality, however, is a multidimensional concept, where the nature of the data quality issues is very diverse. One of the most widely anticipated data quality challenges, which becomes particularly vital when data co...
Article
Full-text available
Purpose This paper aims to provide insights into the integration of blockchain technology in e-government services. Design/methodology/approach The article invokes an exploratory approach to emphasize the possibilities of integrating blockchain technology in e-government services. A cybernetic model is detailed in the paper for bridging the gulf b...
Preprint
Full-text available
Open data are characterized by a number of economic, technological, innovative and social benefits. They are seen as a significant contributor to a city's transformation into a Smart City. This is all the more so when society is on the border of Society 5.0, i.e., the shift from the information society to a super smart society or society of imagina...
Article
Full-text available
Since the turn of the millennium, the volume of data has increased significantly in both industries and scientific institutions. The processing of the volumes and variety of data we are dealing with is unlikely to be accomplished with conventional software solutions. Thus, new technologies belonging to the big data processing area, able to distr...
Article
Full-text available
Purpose Open government data (OGD) are considered a technology capable of promoting transparency, openness, and accountability, which in turn has a positive impact on innovation activities and creates responsive government, collaboration, cooperation, co-creation and participation. The purpose of this paper is to explore the adoption of OGD and o...
Article
Full-text available
The rise of mobile applications has helped to provide information about a broader range of products remotely. They simplify the identification of products by using their barcode or even an image of the item. This paper, therefore, aims to create an e-commerce assistant Android application that incorporates machine learning, more precisely, image cla...
Preprint
Full-text available
Nowadays, there are billions of interconnected devices forming Cyber-Physical Systems, Internet of Things (IoT) and Industrial Internet of Things (IIoT) ecosystems. With an increasing number of devices and systems in use, and a growing amount and value of data, the risks of security breaches increase. One of these risks is posed by open data sources, by which a...
Article
Full-text available
The authors offer a method for detecting potentially incorrect execution of concurrent business processes. It is achieved by using symbolic execution of business process descriptions. The proposed method consists of six steps: create a detailed business process description, define transactions, define the incorrect business process execution, create a...
Conference Paper
Full-text available
This study aims to analyze the state of security of open data databases, i.e. databases accessible from outside the organization, covering both relational and NoSQL databases in the three Baltic countries: Latvia, Lithuania, and Estonia. This is done by using a previously proposed tool for non-intrusive detection of vulnerable data sources called ShoBE...
Conference Paper
Full-text available
The paper presents a study aimed at identifying the most widely occurring data quality issues that affect users’ experience with data and their reuse, the presence of which may not only disrupt the willingness to work with data but also cause losses for businesses. The list of defects is intended to be identified as a result of the following activi...
Conference Paper
Full-text available
The paper proposes a tool for non-intrusive testing of open data sources for detecting their vulnerabilities, called ShoBeVODSDT (Shodan-and Binary Edge-based vulnerable open data sources detection tool). The use of Open Source Intelligence (OSINT) tools, more precisely the Internet of Things Search Engines (IoTSE), allows the tool to inspect a lis...
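The abstract above describes querying Internet of Things search engines to detect exposed data sources. As a rough illustration of that IoTSE-query step only, and not of the ShoBeVODSDT implementation itself, here is a minimal sketch using the official shodan Python client; the API key location, product list, and country filter are assumptions for illustration.

```python
# Minimal illustration of the IoTSE-based scanning idea: ask Shodan which
# database services in a given country are indexed as publicly reachable.
# This is a simplified sketch, not the ShoBeVODSDT tool described above.
import os
import shodan

api = shodan.Shodan(os.environ["SHODAN_API_KEY"])  # assumed env variable

# Data stores that are frequently found exposed without authentication.
PRODUCTS = ["MongoDB", "MySQL", "Redis"]

def find_exposed_services(country_code="LV", per_product=10):
    """Collect IP/port pairs of indexed database services in one country."""
    findings = []
    for product in PRODUCTS:
        query = f'product:"{product}" country:"{country_code}"'
        try:
            results = api.search(query)
        except shodan.APIError as err:
            print(f"Query failed for {product}: {err}")
            continue
        for match in results["matches"][:per_product]:
            findings.append((product, match.get("ip_str"), match.get("port")))
    return findings

if __name__ == "__main__":
    for product, ip, port in find_exposed_services():
        print(f"{product}: {ip}:{port}")
```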
Article
Full-text available
This paper aims to provide a broad perspective on the development of benchmarking open data efforts through indices and rankings over the years, both at the level of countries and allowing for a cross-country comparison. The methodology follows a systematic search for the relevant resources, their classification and identification of six open data...
Conference Paper
Full-text available
Open Government Data (OGD) are seen as one of the trends that can potentially benefit the economy. However, this fact is closely linked to the “value” of the OGD, i.e. the extent to which the data provided by the OGD portals are interesting, useful and valuable for their reuse, creating value for society and the economy. Here, the concept of “high-...
Article
Open data are freely available and can be used by every stakeholder for their own purposes. However, practice demonstrates that it is important to ensure that the source from which they are available is usable and facilitates the re-use of data by the widest possible range of stakeholders. This task is carried out by open government data (OGD) po...
Article
Full-text available
Nowadays, governments launch open government data (OGD) portals that provide data that can be accessed and used by everyone for their own needs. Although the potential economic value of open (government) data is estimated in the millions and billions, not all open data are reused. Moreover, the open (government) data initiative as well as users’ intent...
Chapter
Full-text available
Today, the topic of children’s sport has become crucial, not only because, in the 21st century, computer games and social networks are the most common way for a child to spend leisure time (a shift from sports to eSports has taken place), but also because, in 2020, Covid-19 and the associated restrictions gave parents even more stress over visits to sports...
Article
Full-text available
Transparency in the public sector is one of the most important topics of the current debates on accountable, participatory, and responsive governance. An open government addresses these major topics and aims to encourage the relationships and flows of information between involved stakeholders. This article explores the role of open data portals in...
Chapter
Full-text available
In order to develop reliable software, its operation must be verified for all possible cases of use. This can be achieved, at least partly, by means of model-based testing (MBT), by establishing tests that check all conditions covered by the model. This paper presents Data Quality Model-based Testing (DQMBT) using the data quality model (DQ-mod...
Article
Full-text available
This paper aims to explore the patterns of online interaction of users of the Pretty Good Privacy (PGP) algorithm to identify the most important and influential users in the social network. While PGP is widely used in protecting email privacy, there are some encryption defects that can raise users’ concerns about data privacy and security. It is th...
Article
Full-text available
Open government data (OGD) initiatives can deliver many cultural and institutional benefits. This is why many governments are trying to establish an OGD ecosystem. However, although many countries have made good progress in doing so, some face significant challenges. In such cases, country-specific studies can prove valuable in understanding not on...
Conference Paper
Full-text available
This paper presents first steps towards a solution aimed to provide concurrent business processes analysis methodology for predicting the probability of incorrect business process execution. The aim of the paper is to (a) look at approaches to describing and dealing with the execution of concurrent processes, mainly focusing on the transaction mech...
Conference Paper
Full-text available
The paper proposes a data quality model-based testing methodology aimed at improving testing methodology of information systems (IS) using previously proposed data quality model. The solution supposes creation of a description of the data to be processed by IS and the data quality requirements used for the development of the tests, followed by perf...
Article
Full-text available
Open government data, as a phenomena, may be considered an important and influential innovation that has the potential to drive the creation of public value via enabling the prevention of corruption, increase in accountability and transparency, and driving the co-creation of new and innovative services. However, in order for open government data to...
Conference Paper
Full-text available
The paper addresses the “timeliness” of data in open government data (OGD) portals. It is one of the primary principles of open data, which is considered to be a success factor, while at the same time it is one of the biggest barriers that can disrupt users’ trust in data and even the desire to use the entire open data portal. However, assessing thi...
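As a back-of-the-envelope illustration of how such a timeliness check can be automated, the sketch below reads last-modification timestamps from a CKAN-based portal's package_search API and flags stale datasets. The demo portal URL and the 90-day freshness threshold are assumptions for illustration; the paper's own assessment method is not reproduced here.

```python
# Illustrative sketch: estimate dataset "timeliness" on a CKAN-based open data
# portal by checking how long ago each dataset's metadata was last modified.
# Portal URL and the 90-day threshold are assumptions, not the paper's method.
from datetime import datetime, timezone
import requests

PORTAL = "https://demo.ckan.org"       # hypothetical/demo CKAN portal
FRESHNESS_THRESHOLD_DAYS = 90          # assumed threshold for "timely" data

def dataset_ages(rows: int = 50):
    """Yield (dataset name, days since last metadata modification)."""
    resp = requests.get(
        f"{PORTAL}/api/3/action/package_search",
        params={"rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    now = datetime.now(timezone.utc)
    for pkg in resp.json()["result"]["results"]:
        modified = datetime.fromisoformat(pkg["metadata_modified"]).replace(
            tzinfo=timezone.utc
        )
        yield pkg["name"], (now - modified).days

if __name__ == "__main__":
    stale = [(name, age) for name, age in dataset_ages()
             if age > FRESHNESS_THRESHOLD_DAYS]
    print(f"{len(stale)} datasets older than {FRESHNESS_THRESHOLD_DAYS} days")
```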
Conference Paper
Full-text available
This paper proposes a model-based testing approach by offering to use the data quality model (DQ-model) instead of the program's control flow graph as a testing model. The DQ-model contains definitions and conditions for data objects to consider the data object as correct. The study proposes to automatically generate a complete test set (CTS) using...
Article
Full-text available
The data quality issue has been known since the end of the 1960s; however, more than 50 years later, it remains unresolved and is still current, mainly due to the popularity of data and open data. The paper proposes a data object-driven approach to data quality evaluation. This user-oriented solution is based on 3 main components: data object, data quality sp...
Conference Paper
Full-text available
This paper focuses on the analysis of the usability of national open data portals. Open [government] data are considered one of the most influential tools for preventing and reducing corruption and reaching innovative solutions that create added value for society. Thus, it is important to ensure that they are provided in a form that is useful...
Conference Paper
Full-text available
Nowadays, more and more countries are launching their own open data portals, seeking to provide their citizens with open data in a form that is useful and suitable for the original purpose of the open data, and Latvia is not an exception. Despite the fact that the Latvian open data portal was launched only in 2017, it is considered to be a fast-tra...
Conference Paper
Full-text available
The main idea of the solution is to improve testing methodology of information systems (IS) by using data quality models. The idea of the approach is as follows: (a) first, a description of the data to be processed by IS and the data quality requirements used for the development of the test are created, (b) then, an automated test of the system on...
Article
Full-text available
The paper proposes a new data object-driven approach to data quality evaluation. It consists of three main components: (1) a data object, (2) data quality requirements, and (3) a data quality evaluation process. As data quality is relative in nature, the data object and quality requirements are (a) use-case dependent and (b) defined by the user in ac...
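The three components listed above translate naturally into a small executable structure. The sketch below is a generic illustration under assumed names (a DataObject holding records, with requirements expressed as predicates); it is not the DSL or tooling proposed in the paper.

```python
# Generic sketch of a data-object-driven quality check: (1) a data object,
# (2) user-defined quality requirements, (3) an evaluation step that reports
# which records violate which requirements. Names and rules are illustrative.
from dataclasses import dataclass, field
from typing import Callable

Requirement = Callable[[dict], bool]

@dataclass
class DataObject:
    records: list[dict]
    requirements: dict[str, Requirement] = field(default_factory=dict)

    def evaluate(self) -> list[tuple[int, str]]:
        """Return (record index, requirement name) for every violation."""
        violations = []
        for i, record in enumerate(self.records):
            for name, check in self.requirements.items():
                if not check(record):
                    violations.append((i, name))
        return violations

# Usage example with hypothetical company-register records
companies = DataObject(
    records=[
        {"reg_no": "40003000001", "name": "Example SIA", "founded": 1995},
        {"reg_no": "", "name": "Broken Ltd", "founded": 2199},
    ],
    requirements={
        "reg_no_present": lambda r: bool(r["reg_no"]),
        "founded_plausible": lambda r: 1800 <= r["founded"] <= 2025,
    },
)
print(companies.evaluate())  # [(1, 'reg_no_present'), (1, 'founded_plausible')]
```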
Conference Paper
Full-text available
This research focuses on the analysis of the quality of open health data that are freely available and can be used by everyone for their own purposes. The quality of open data is crucial, as poor quality can lead to unreliable decision-making and financial losses; the quality of open health data, however, has an even more critical role. Despite its importance, th...
Conference Paper
Full-text available
Data quality issues have been topical for many decades. However, a unified data quality theory has not been proposed yet, since many concepts associated with the term “data quality” are not straightforward enough. The paper proposes a user-oriented data quality theory based on clearly defined concepts. The concepts are defined by using three groups...
Conference Paper
Full-text available
This paper discusses data quality checking during business process execution by using runtime verification. While runtime verification verifies the correctness of business process execution, data quality checks ensure that a particular process did not negatively impact the stored data. Both runtime verification and data quality checks run in paralle...
Conference Paper
Full-text available
This research is an extension of the data object-driven approach to data quality evaluation, allowing data object quality to be analysed in the scope of multiple data objects. The previously presented approach was used to analyse one particular data object, mainly focusing on syntactic analysis. It means that the primary data object quality can be analysed agains...
Experiment Findings
Full-text available
This time, the practical solution proposed previously has evolved into a more theoretical solution - an informal theory of data quality. Data quality issues have been topical for many decades. However, a unified data quality theory has not been proposed yet, since many concepts associated with the term “data quality” are not straightforward enough....
Article
Full-text available
Nowadays open data is entering the mainstream - it is freely available to every stakeholder and is often used in business decision-making. It is important to be sure data is trustworthy and error-free, as quality problems can lead to huge losses. The research discusses how (open) data quality could be assessed. It also covers the main points which shou...
Preprint
Nowadays open data is entering the mainstream - it is freely available to every stakeholder and is often used in business decision-making. It is important to be sure data is trustworthy and error-free, as quality problems can lead to huge losses. The research discusses how (open) data quality could be assessed. It also covers the main points which shou...
Conference Paper
Full-text available
This research proposes a new approach to data quality evaluation comprising 3 aspects: (1) definition of the data object whose quality will be analyzed, (2) specification of quality requirements for the data object using a Domain Specific Language (DSL), and (3) implementation of an executable data quality model that would enable scanning of the data object and detec...
Conference Paper
Full-text available
This paper is devoted to the analysis of the open data quality of company registers in four different countries. The data quality evaluation was obtained using a methodology that involves the creation of a three-part data quality model: (1) the definition of a data object to analyse its quality, (2) data object quality specification using DSL, (3) th...
Preprint
Full-text available
The research deals with the application of LEAN principles to the business processes of a typical IT company. The paper discusses LEAN principles, highlighting the advantages and shortcomings of their application. The authors suggest using LEAN principles as a tool to identify improvement potential for an IT company's business processes and work-flow effici...
Article
Full-text available
The research deals with the application of LEAN principles to the business processes of a typical IT company. The paper discusses LEAN principles, highlighting the advantages and shortcomings of their application. The authors suggest using LEAN principles as a tool to identify improvement potential for an IT company’s business processes and work-flow effici...
Preprint
Full-text available
The research discusses how (open) data quality could be described, what should be considered when developing a data quality management solution, and how it could be applied to open data to check its quality. The proposed approach focuses on the development of a data quality specification which can be executed to obtain data quality evaluation results, find errors...
Conference Paper
Full-text available
The research discusses how (open) data quality could be described, what should be considered when developing a data quality management solution, and how it could be applied to open data to check its quality. The proposed approach focuses on the development of a data quality specification which can be executed to obtain data quality evaluation results, find errors...
Experiment Findings
Full-text available
As the volume of open data increases, solutions are needed that are suitable for users without in-depth knowledge of data quality and IT, as open data becomes a daily phenomenon and quality analysis is becoming an integral part of everyday activity. The quality of open data is crucial, as poor quality can lead to unreliable decision-making and fina...
