Fig 2 - uploaded by Marek Suchánek
Data Stewardship Wizard: architecture


Source publication
Conference Paper
Full-text available
Every year, the amount of data in science grows significantly as information technologies are used more intensively across various domains of human activity. Biologists, chemists, linguists, and others are not data experts but often just regular users who need to capture and process huge amounts of data. This is where serious problems emerge-ba...

Context in source publication

Context 1
... The schema used to describe these files is public, together with a guide on how to publicly share KMs from the Wizard. This setup, together with the hierarchy of KMs, is depicted in Figure 2. MongoDB is used as the storage for all the data in the Wizard. ...
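The idea of a hierarchy of KMs kept in a document store can be illustrated with a short sketch. The document shape and field names below are assumptions for illustration, not the Wizard's actual schema; only the notion of versioned KM documents linked by a parent reference comes from the text.

```python
# Hypothetical sketch of a hierarchical knowledge model (KM) record as it
# might be stored in a document database such as MongoDB. The field names
# are illustrative assumptions, not the Wizard's real schema.

def make_km(km_id, name, version, parent_id=None):
    """Build a KM document; parent_id links a customised KM to its base KM."""
    return {
        "_id": f"{km_id}:{version}",
        "kmId": km_id,
        "name": name,
        "version": version,
        "parentId": parent_id,   # None for a root KM in the hierarchy
    }

# A root KM and a customisation derived from it:
core = make_km("dsw:core", "Core KM", "1.0.0")
local = make_km("org:local", "Organisation KM", "0.1.0", parent_id=core["_id"])

# An in-memory stand-in for the document collection:
store = {doc["_id"]: doc for doc in (core, local)}

def ancestry(doc, store):
    """Walk the parent references from a KM up to its root."""
    chain = [doc["_id"]]
    while doc["parentId"] is not None:
        doc = store[doc["parentId"]]
        chain.append(doc["_id"])
    return chain

print(ancestry(local, store))  # ['org:local:0.1.0', 'dsw:core:1.0.0']
```

In a real deployment the `store` dictionary would be a MongoDB collection and `ancestry` a sequence of lookups by `_id`; the parent-reference pattern itself is unchanged.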

Similar publications

Conference Paper
Full-text available
Graduate output, or undergraduate students' performance, is a key quality parameter of any higher education institute. Therefore, the management of the ABC Institute annually spends a considerable amount of money on capital and operational costs to provide a better teaching and learning environment. However, the major issue at present is that the stude...

Citations

... Inadequate data management can result in the loss of critical data, unverifiable results, and wasted money. Over the next few years, thousands of qualified data hosts will be required to address these issues [23]. The main challenge is not only the amount of data but also the variety of formats and types. ...
Chapter
Full-text available
In recent years, an increasing understanding of software development as a set of interconnected processes has affected configuration management (CM). The purpose of CM is to understand network behaviour by taking an inventory of users, network devices, and bandwidth usage, and analysing these data to provide information on current usage patterns. Usage quotas can be set for individual users or groups, whose optimal access points can be reached after several iterations and some fixes; this is where data stewardship (DS) comes into the picture, to maintain high-quality data in a consistent and accessible way. Once this is achieved, continuous measurement produces information related to the billing process, and an assessment of the fair and optimal use of resources within the network system should be conducted. Hence, this study explores the challenges faced primarily by high schools in maintaining the configuration process of the laboratory network for teaching and learning, by assessing the implementation against a set of CM and DS criteria.
... All of the reported work is framed within a wider scope of having the DCSO adopted as the official semantic-based serialisation of the DCS application profile. "Utility and discussion" section presents a use case, through the description of the adoption of DCSO by the Data Stewardship Wizard (DSW) DMP creation tool [7]. Finally, "Conclusions" section provides a summarised review on the contents of this paper, as well as a description of the future goals for DCSO. ...
Article
Full-text available
The concept of a Data Management Plan (DMP) has emerged as a fundamental tool to help researchers through the systematic management of data. The Research Data Alliance DMP Common Standard (DCS) working group developed a set of universal concepts characterising a DMP so that it can be represented as a machine-actionable artefact, i.e., a machine-actionable Data Management Plan (maDMP). The technology-agnostic approach of the current maDMP specification: (i) does not explicitly link to related data models or ontologies, (ii) has no standardised way to describe controlled vocabularies, and (iii) is extensible but has no clear mechanism to distinguish between the core specification and its extensions. This paper reports on a community effort to create the DMP Common Standard Ontology (DCSO) as a serialisation of the DCS core concepts, with a particular focus on a detailed description of the components of the ontology. Our initial result shows that the proposed DCSO can become a suitable candidate for a reference serialisation of the DMP Common Standard.
Preprint
Full-text available
The concept of a Data Management Plan (DMP) has emerged as a fundamental tool to help researchers through the systematic management of data. The Research Data Alliance DMP Common Standard (DCS) working group developed a core set of universal concepts characterising a DMP in the pursuit of producing a DMP as a machine-actionable information artefact, i.e., a machine-actionable Data Management Plan (maDMP). The technology-agnostic approach of the current maDMP specification: (i) does not explicitly link to related data models or ontologies, (ii) has no standardised way to describe controlled vocabularies, and (iii) is extensible, but has no clear mechanism to distinguish between the core specification and its extensions. Currently, the maDMP specification provides a JSON serialisation and schema to operationalise the approach. Such an approach, however, does not address the concerns above. This paper reports on the community effort to create the DMP Common Standard Ontology (DCSO) as a serialisation of the DCS core concepts, with a particular focus on a detailed description of the components of the ontology. Our initial result shows that the proposed DCSO can become a suitable candidate for a reference serialisation of the DMP Common Standard.
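The JSON serialisation mentioned in the abstract above can be sketched as follows. The field names loosely follow the general shape of the DCS specification, but this fragment is simplified and illustrative; it should not be read as a complete or authoritative maDMP, and the identifier and dataset values are invented for the example.

```python
import json

# Minimal, simplified sketch of a machine-actionable DMP (maDMP) document.
# Field names loosely follow the RDA DMP Common Standard's JSON shape, but
# this is an illustrative fragment, not the full specification.
madmp = {
    "dmp": {
        "title": "Example project DMP",
        "created": "2021-01-01T00:00:00Z",
        "dmp_id": {"identifier": "https://doi.org/10.0000/example",
                   "type": "doi"},
        "dataset": [
            {"title": "Survey results",
             "personal_data": "no",
             "sensitive_data": "no"}
        ],
    }
}

serialised = json.dumps(madmp, indent=2)
# A round trip shows the plan is a plain machine-actionable artefact:
assert json.loads(serialised)["dmp"]["dataset"][0]["title"] == "Survey results"
```

The point the abstracts make is that such a JSON document is machine-actionable but technology-agnostic; the DCSO work layers an ontology on top so the same concepts gain explicit links to vocabularies and related data models.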
... To close the metadata gap, we need to understand why metadata is lost and to change user behaviour through incentives, or through funders requiring the storage of spatio-temporal metadata where appropriate, so that reusing data is properly rewarded in employment, promotion, and tenure decisions [21,22]. Interestingly, data stewardship outlines useful steps for applying curation and storage rules to mission datasets, from initial conceptualisation or acceptance through an iterative cycle of activities that begins with recognition and awareness [23][24][25]. The basic concept for quality managers is an environment in which feedback is used iteratively and incrementally in every activity phase, relying on individual efforts to evaluate and improve the results, as mentioned in Figure 3. From the customer perspective, positive feedback mechanisms are required to provide and facilitate problem solving. ...
Conference Paper
Full-text available
The role of data stewardship satisfies various requirements of users, employees, and staff by providing the awareness necessary to ensure that an information security strategy takes place accordingly. It requires a properly understood definition and standard across the organization in terms of quality, accuracy, workflows, usage, compliance, and the format and attributes of content and metadata. Unfortunately, many organizations have neglected the importance of data stewardship as an aid to leveraging data, a valuable asset, at optimal capacity. One advantage that can be generated automatically relates to improving the awareness of relevant parties within the environment in protecting information assets based on the implemented policy. This study identifies the relationship by assessing data stewardship against an awareness campaign strategy through the security awareness domain and resources (SADAR) framework. The study compares utilization in the banking industry, using enterprise resource planning (ERP), with utilization in education, using an open platform for a database management system (DBMS).
Chapter
Independent and preferably atomic services that send messages to each other are a significant application of the Separation of Concerns principle. Standardised formats and protocols that enable easy implementation already exist. In this paper, we go deeper and introduce evolvable and machine-actionable reports that can be sent between services. This is not just a way of encoding reports and composing them together; it allows linking semantics using technologies from the semantic web and ontology engineering, mainly JSON-LD and Schema.org. We demonstrate our design on the Data Stewardship Wizard project, where reports from evaluations are crucial functionality; thanks to its versatility and extensibility, however, the approach can be used in any message-oriented software system or subsystem.
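The idea of a machine-actionable report whose fields are linked to shared semantics can be sketched as follows. The report structure and the Schema.org terms chosen here are illustrative assumptions, not the actual DSW report format; only the JSON-LD/Schema.org approach itself comes from the abstract.

```python
import json

# Illustrative sketch of a machine-actionable evaluation report expressed
# as JSON-LD, linking field names to Schema.org vocabulary terms via the
# @context. The structure is an assumption for demonstration, not the
# Data Stewardship Wizard's real report format.
report = {
    "@context": {
        "@vocab": "https://schema.org/",
        "parts": "hasPart",   # local field name mapped to a shared term
    },
    "@type": "Report",
    "name": "Evaluation report",
    "dateCreated": "2021-06-01",
    "parts": [
        {"@type": "Report", "name": "Metrics section"},
        {"@type": "Report", "name": "Indications section"},
    ],
}

# Because the payload is plain JSON, services can exchange it over any
# standard message protocol, while consumers that understand JSON-LD can
# resolve each field's meaning through the @context.
message = json.dumps(report)
received = json.loads(message)
print([p["name"] for p in received["parts"]])
```

The nesting under `parts` is what makes such reports composable: a service can wrap reports received from other services as sub-reports, while the `@context` keeps the semantics of every level machine-interpretable.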