Conference Paper · PDF Available

Enterprise Master Data Architecture: Design Decisions and Options

Abstract

The enterprise-wide management of master data is a key prerequisite for companies to respond to strategic business drivers such as compliance with regulatory requirements, integrated customer management, and global business process integration. Among other things, this demands the systematic design of the enterprise master data architecture. The current state of the art, however, falls short of guiding practitioners with regard to the design decisions they have to make and the design options from which they can choose. This paper aims at closing this gap. It reports on the findings of three case studies and uses morphological analysis to structure design decisions and options for the management of an enterprise master data architecture.
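Morphological analysis arranges a design problem as a matrix: each design decision forms a row, the mutually exclusive options for that decision form the row's values, and one concrete architecture is a selection of one option per row. As a minimal sketch only, the decisions and options below are common examples from the MDM literature, not the specific set derived from the paper's case studies:

```python
# Illustrative morphological box for master data architecture design;
# the decisions and options shown are assumed examples, not the paper's findings.
morphological_box = {
    "architecture style": ["registry", "coexistence", "centralized", "consolidation"],
    "data distribution": ["central storage", "replicated", "federated"],
    "update propagation": ["real-time", "batch", "on demand"],
}

# One concrete architecture = one option chosen per design decision.
chosen_design = {decision: options[0] for decision, options in morphological_box.items()}
print(chosen_design)
```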
... Master data are available to and used by several applications enterprise-wide. The main characteristic features of master data are [1,2]: ...
Patent
Full-text available
The application proposes a system for managing asset master data containing: 1) a data model that combines different views of master data about the same asset park, and 2) modules that implement methods for checking the completeness and correctness of links between different representations in this model. Each independent view of the asset master data is organized in a separate hierarchy. Each such hierarchy reflects the ownership of assets (their inclusion in the assets of a higher level). Each view of the asset master data model is associated with at least one unique classifier of the assets included in the model, which is a hierarchical system of classes based on the principles of inheritance and encapsulation; the class of each asset master data object completely defines all the attributes of that object. For each object of any of the views, the model includes a set of links to objects of other views, used to navigate between different representations of the same asset. In each view, the model can include a set of network relationships between assets belonging to the same view. Such network links can be of different types, can be characterized by sets of attributes, and can have their own classifier of links. In addition, the model can include in each view a set of structural models describing the internal relationships and component composition of specific asset types, as well as a set of functional models that use the attribute values of individual asset instances and of the master data objects related to them. Methods for checking the correctness and completeness of links between different representations of the same assets include the following main nested actions:
• a loop over all hierarchical views present in the asset structure;
• a traversal of all nodes of each hierarchy from top to bottom and sequentially along the branches of each subtree;
• a traversal of all rings of the existing lattice of links between different representations of the same assets, starting from a given node;
• the formation of report elements on broken, missing, and unanticipated links.
When traversing all the rings of each lattice, links are marked up so that no ring is traversed more than once.
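To make the ring traversal concrete, here is a minimal, hypothetical Python sketch of the cross-view link check; all names and the single-successor ring representation are assumptions for illustration, since the patent describes the method only at the level of nested traversals and link markup:

```python
# Hypothetical sketch of the ring check over cross-view links. Names and the
# single-successor ring representation are illustrative assumptions; the
# patent text describes the method only at the level of nested traversals.

class Rep:
    """One representation of an asset in one view (hierarchy)."""
    def __init__(self, oid, view):
        self.oid = oid
        self.view = view
        self.next_rep = None  # cross-view link to the next representation


def check_ring(start, all_views, visited):
    """Walk the ring of cross-view links from `start`, reporting broken,
    missing, and unanticipated links; `visited` is the link markup that
    keeps each ring from being traversed once per member."""
    issues = []
    seen_views = {start.view}
    node = start
    while True:
        if node.next_rep is None:
            if (node.oid, None) not in visited:        # report each break once
                visited.add((node.oid, None))
                issues.append(f"broken link: {node.oid} ({node.view}) has no successor")
            break
        edge = (node.oid, node.next_rep.oid)
        if edge in visited:                            # ring already checked
            break
        visited.add(edge)
        node = node.next_rep
        if node is start:                              # ring closed: check coverage
            missing = all_views - seen_views
            if missing:
                issues.append(f"missing views in ring of {start.oid}: {sorted(missing)}")
            break
        if node.view in seen_views:                    # second object in one view
            issues.append(f"unanticipated link into view {node.view} at {node.oid}")
            break
        seen_views.add(node.view)
    return issues


# Usage: one pump represented in three views, with the closing link missing.
views = {"financial", "maintenance", "topology"}
fin = Rep("pump-001-fin", "financial")
mnt = Rep("pump-001-mnt", "maintenance")
top = Rep("pump-001-top", "topology")
fin.next_rep, mnt.next_rep = mnt, top                  # top.next_rep left unset

visited = set()
for rep in (fin, mnt, top):
    for issue in check_ring(rep, views, visited):
        print(issue)   # -> broken link: pump-001-top (topology) has no successor
```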
... For example, in [20], a case study was used to investigate data quality problems at a multinational manufacturer in China. However, the case study limited the generalizability of the findings, and the further implementation of the proposed solution required more attention [20], [42], [77]. ...
Article
Full-text available
Data quality draws major concern when dealing with data, especially when insightful outputs are needed. Research in data quality has emerged across various topics, and diversification in both the body of knowledge and the approaches used is inevitable. In this paper, we apply a systematic review to map the landscape of data quality research and to identify available research gaps by means of categorization and mapping. Our search scope is limited to research articles from journals, conference proceedings, and magazines published between 2010 and 2016. We defined three main categorizations to map the selected research articles and to answer our research questions; these focus on research topic, research type, and contribution type. On average, fifty-four research articles related to data quality were published every year. This number shows the importance of data quality research across topics such as online users, databases, web information, sensors, and big data. The study also indicates that almost half of the selected articles propose a novel solution or an essential extension of an existing data quality technique. Moreover, most of the selected articles belong to the model type in the contribution category. Our mapping also suggests an obvious disparity between contributions of the metric type and the model type.
... Good governance will hasten the adoption of innovation by local governments, such as the adoption of cloud computing (Ali et al., 2016), e-services (Wang & Feeney, 2016), web accessibility standards (Velleman et al., 2015), and electronic health records (McCullough et al., 2015). MDM, which comprises numerous design decisions and involves multiple parties in its implementation, requires the identification of roles and responsibilities in managing the shared master data, also known as data governance (Otto & Schmidt, 2010). Data governance is a subset of information governance, which involves processes and controls of information at the data level (Smallwood, 2014). ...
Article
Master Data Management (MDM) is an approach for the effective management of shared master data across organizations. In the public sector, MDM initiatives have been developed; however, adoption among local governments remains slow, and there has been little interest in MDM adoption in extant research. Building on the Technology-Organization-Environment (TOE) framework, a conceptual model highlighting a set of potential determinants affecting the adoption of MDM by local government was developed. To validate the model, data were collected via a survey yielding 224 responses from Malaysian local government department units. Using SEM-PLS, the study confirmed that data quality and data governance are two determinants of MDM adoption specific to the context of Malaysian local government, and four other determinants (complexity, top management support, technological competence, and citizen demand) are found to have significant effects on MDM adoption by local government. Surprisingly, three determinants (relative advantage, data security, and government policy) are found to have non-significant relationships with the adoption of MDM by local government. In addition, top management support is revealed to be a cornerstone of MDM technological competence in local government. The study contributes to the theoretical, contextual, and practical knowledge of MDM and IT adoption in the context of local government.
... To construct the ISAA, it is necessary to know which system should be the leading system in the application system landscape. The leading system is the one that triggers actions within other systems and holds the master data (Otto and Schmidt, 2010). One option would be to enrich IT service production systems with enterprise management functionality, as is done in cases C1 and C24, and make the ITSP the leading system. ...
Thesis
Full-text available
Information technology (IT) service providers struggle with efficient and integrated production processes when compared to modern manufacturers. Manufacturers produce products build-to-order in a mass-customization approach or engineer-to-order in a highly customized but streamlined production process, using computer-integrated manufacturing approaches. Software's inherent complexity and its heterogeneous implementations make such consistent management of application service production difficult. This thesis examines whether the IT service provider type of application system landscape provider can implement a production process as efficient and integrated as that of manufacturers. The thesis argues that software for operations automation, such as infrastructure-as-a-service and configuration management software, can wrap application software's complexity and its heterogeneous implementations. Operations automation approaches facilitate an automated, modularized, and standardized build- and engineer-to-order production process. This thesis creates its artifact based on an analysis of current technology, a case study of various IT service providers, including application system landscape providers, as well as literature. It follows the design science paradigm of information systems research. The main contribution is an information system architecture for application system landscape providers (ISAA). The ISAA explores the limits of standardization, automation, and modularization for application system landscape production. A domain model explicates the relationships between relevant entities of application system landscape production on the layers of business, process, integration, software, and infrastructure. Application system landscape providers can describe the application system landscapes that underlie the application services they provide to customers using three different models: software, infrastructure, and orchestration configuration models. The ISAA's leading application system, the enterprise management system, treats these models as materials, which facilitates integration with secondary activities such as controlling. The thesis proposes a production execution system that orchestrates the production of application services between the ASLP's enterprise management (i.e., an ERP system), IT service management, and IT service production systems (e.g., infrastructure-as-a-service and configuration management systems), similarly to manufacturing execution systems. The evaluation follows a recognized methodology. It presents a prototypical implementation of the ISAA after a discussion of the research's relevance as well as the ISAA's consistency, applicability, and adaptability. Comprehensive tests with the prototype show the ISAA's feasibility. Expert interviews validate the overall utility of the ISAA. Targeted companies can improve their application service production in terms of quality and efficiency by leveraging this novel automation- and model-based as well as integrated approach.
... If the organization has chosen to carry out this task using Service-oriented architecture (SOA), Web services can be developed to publish the data, and all future updates and changes can be synchronized with the help of these Web services. In the case of mastering, the consuming applications must be modified to look up the new master data (Murthy et al. 2010, Otto and Schmidt 2010). ...
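As a minimal illustration of the lookup pattern the excerpt describes (the endpoint URL and response shape below are assumptions, not taken from the cited works), a consuming application resolves master data through the published service instead of a local copy:

```python
# Hypothetical consumer-side lookup against a published master data service;
# the endpoint and JSON shape are assumptions for illustration.
import json
import urllib.request

MDM_SERVICE = "https://mdm.example.com/api/customers/"  # assumed endpoint

def lookup_customer(customer_id: str) -> dict:
    """Fetch the golden record from the central service instead of keeping a
    local copy, so synchronized updates reach every consuming application."""
    with urllib.request.urlopen(MDM_SERVICE + customer_id) as response:
        return json.load(response)
```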
Article
Full-text available
This survey report aims to provide a data definition of one master data set for cross-application consistency. The concepts related to master data management are discussed in a broader spectrum, and the current challenges companies face while implementing MDM solutions are outlined. We have taken a case study to highlight why Master Data Management is imperative for enterprises in optimizing their business, and we have identified some of the long-term benefits for enterprises that implement MDM. We take a close look at the importance, challenges, and business value of excellence in master data maintenance, explore the causes of problems, and present a solution for simplified and improved automated master data maintenance. The example of a manufacturing company illustrates the roadmap to sustainable value through simplified master data maintenance: it shows how both transparency and efficiency can be ensured through objective assessments, clear responsibilities, and traceable ownership of data fields and profiles. In today's business world, data is a valuable corporate asset which, when managed properly, can support a company's ability to achieve strategic goals and financial results. A recent Hackett Group study revealed that over 70% of companies are planning to implement a Business Intelligence or analytics application. In addition, these same companies are planning to establish data stewardship rules, standardize master data, and cleanse existing data, all components of a Master Data Management (MDM) strategy. Executives can improve their ability to quickly access accurate data by adopting MDM best practices. MDM typically involves a series of consistent processes and policies with proper governance and oversight. Master Data Management is focused on several key and actionable business segments including, but not limited to, the material, customer, supplier, and employee master. Strong Master Data Management governance can drive greater consistency and accuracy of data, which can be an asset in driving world-class operations, providing the ability to use data as a competitive advantage, and reducing unnecessary waste. In contrast, without proper governance, there is limited accountability and ownership of data, which creates compliance risk as well as higher expenses and lost revenue.
...
• Framework for the argumentation of advantages of Product Information Management (PIM) (Osl and Otto 2007)
• Reference architecture for data synchronization across organizations between the retail and consumer goods industries (Schemm 2008)
• Functional architecture for company-wide management of master data (Otto and Hüner 2009)
• Reference model for data governance (Weber 2009)
• Method for the specification of business-oriented data quality key performance indicators (Hüner 2010)
• Method for master data integration (Schmidt 2010)
• Management systems for controlling corporate data quality (Hüner 2011)
• Semantic MediaWiki for the management of metadata (Hüner et al. 2011b; Hüner et al. 2011c)
• Reference model for company-wide management of data quality (Otto 2011b)
• Typology of data governance models (Otto 2011c)
• Reference model for the management of business rules
• Catalog of requirements for the management of the master data of the future (Otto and Ofner 2011)
• Approach integrating a data quality perspective into business process management (Ofner et al. 2012; Ofner 2013)
• Descriptive model for semantic information system standards
• Concept for the management of the life cycle of master data (Ofner et al. 2013b)
• Method for designing a company's data architecture (Ebner 2014)
• Method for the design and implementation of master data management as a corporate support function (Reichert 2014)
• Methods for the strategy development of company-wide data quality management in global companies (Falge 2015)
• Approaches to economic valuation of data quality (Otto 2015)
4 Factors for Success and Immediate ...
Preprint
Social media platforms have empowered the democratization of the pulse of people in the modern era. Due to their immense popularity and high usage, data published on social media sites (e.g., Twitter, Facebook, and Tumblr) is a rich ocean of information. Data-driven analytics of social imprints has therefore become a vital asset for organisations and governments seeking to improve their products and services. However, due to the dynamic and noisy nature of social media data, performing accurate analysis on raw data is a challenging task. A key requirement is to curate the raw data before it is fed into analytics pipelines. This curation process transforms the raw data into contextualized data and knowledge. We propose a data curation pipeline, namely CrowdCorrect, to enable analysts to cleanse and curate social data and prepare it for reliable analytics. Our pipeline provides automatic feature extraction from a corpus of social media data using existing in-house tools. Further, we offer a dual-correction mechanism using both automated and crowd-sourced approaches. The implementation of this pipeline also includes a set of tools for automatically creating micro-tasks to facilitate the contribution of crowd users in curating the raw data. For the purposes of this research, we use Twitter as our motivating social media data platform due to its popularity.
Conference Paper
Full-text available
Master Data Management (MDM) refers to the central management of shared master data across disparate business units in an organization. Despite the outward benefits of MDM, the adoption of data sharing by data provider organizations into the MDM central repository remains slow. This is due to critical technological, organizational, individual, and environmental challenges to which organizations may be exposed. Hence, the primary aim of this study is to identify factors that influence data sharing adoption for MDM implementation using a systematic literature review. We based our review on relevant keyword searches across eight databases covering journals, proceedings, books, and book chapters. The study categorizes MDM adoption factors into four main dimensions: Technological (relative advantage, cost, security and privacy, and complexity), Organizational (sufficient resources, and data governance), Individual (personnel competence, and top management support), and Environmental (policy and regulation, customer influence, and data quality). It is expected that the findings of this study will contribute to a deeper understanding of the factors that lead to successful MDM adoption.
Article
Full-text available
The paper describes the capabilities of current MDM (Master Data Management) solutions and the prospects of multidomain and multivector MDM solutions. The paper presents the reasons why single-domain MDM solutions for asset data have not been used successfully, in contrast to existing MDM solutions for other data domains such as customers, suppliers, products, and employees. There are challenges in combining several different representations of the same assets in an MDM solution for asset data. The conclusion is that as long as relevant single-domain MDM solutions for asset data have not been developed and implemented successfully, it is too early to move this subject area to multidomain systems. To solve the problems described above, the authors propose a model of asset master data that enables different representations to be combined. This model includes multiple independent hierarchies for different representations of the same assets, non-hierarchical links specific to each subject area, grids of links allowing navigation between different representations of the same asset, a set of asset classifiers whose classes define the sets of attributes describing assets, classifiers of links between assets, as well as structural and functional models for individual asset types. In order to implement the proposed model of asset master data, the authors have developed a special architecture for the asset MDM solution, as well as an algorithm for checking the integrity of links between different representations across the whole data model. Key requirements are defined for the tools used to develop a prototype of the asset MDM solution: they must provide the functionality of a graph DBMS and, at the same time, of a graph engine to perform complex algorithms on the graph as a whole. Keywords: MDM solution, master data management of asset data solution, data model, system architecture, link integrity checking algorithm, SAP HANA Graph.
Book
Full-text available
An enterprise architecture tries to describe and control an organisation's structure, processes, applications, systems, and techniques in an integrated way. The unambiguous specification and description of components and their relationships in such an architecture requires a coherent architecture modelling language. Lankhorst and his co-authors present such an enterprise modelling language that captures the complexity of architectural domains and their relations and allows the construction of integrated enterprise architecture models. They provide architects with concrete instruments that improve their architectural practice. Since this alone is not enough, they additionally present techniques and heuristics for communicating with all relevant stakeholders about these architectures. Since an architecture model is useful not only for providing insight into the current or future situation but can also be used to evaluate the transition from "as-is" to "to-be", the authors also describe analysis methods for assessing both the qualitative impact of changes to an architecture and the quantitative aspects of architectures, such as performance and cost issues. The modelling language and the other techniques presented have been proven in practice in many real-life case studies. This book is thus an ideal companion for enterprise IT or business architects in industry as well as for computer or management science students studying the field of enterprise architecture. © Springer-Verlag Berlin Heidelberg 2005. All rights are reserved.
Article
Full-text available
With increasing size and complexity of the implementations of information systems, it is necessary to use some logical construct (or architecture) for defining and controlling the interfaces and the integration of all of the components of the system. This paper defines information systems architecture by creating a descriptive framework from disciplines quite independent of information systems, then by analogy specifies information systems architecture based upon the neutral, objective framework. Also, some preliminary conclusions about the implications of the resultant descriptive framework are drawn. The discussion is limited to architecture and does not include a strategic planning methodology.
Article
Full-text available
Most financial service companies view the requirements imposed by the Insurance Mediation Directive (IMD) as a tiresome administrative burden, causing high costs but providing little benefit. In contrast, this article argues that the documentation requirements demanded by the IMD lead to higher data quality and also to higher economic benefits. A significant improvement is proclaimed regarding the data quality dimensions of correctness and completeness. To substantiate this hypothesis, we develop two metrics based on existing approaches: one for correctness and one for completeness. The development is guided by six general requirements on data quality metrics. Moreover, we analyse the influence of the IMD on data quality by applying these metrics in a case study: based on data from a major German insurance company, we illustrate how economic benefits arise from documenting particular categories of customer data (e.g. the customer's professional background, financial circumstances, and goals), while the documentation of other customer data categories does not provide similar benefits.
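The article's own metric definitions are not reproduced in this excerpt; purely as an illustration of the ratio style such metrics often take, a sketch might look like this (the field names and scoring rules are assumptions, not the article's six-requirement derivation):

```python
# Illustrative ratio-style data quality metrics; the article derives its own
# correctness and completeness metrics from six requirements, which this
# sketch does not reproduce.

def completeness(record: dict, required_fields: list) -> float:
    """Share of required fields that are actually documented."""
    filled = sum(1 for f in required_fields if record.get(f) not in (None, ""))
    return filled / len(required_fields)

def correctness(record: dict, reference: dict) -> float:
    """Share of documented values that agree with a trusted reference."""
    shared = [k for k in record if k in reference]
    if not shared:
        return 1.0  # nothing to check against
    return sum(1 for k in shared if record[k] == reference[k]) / len(shared)

# Example: a customer record with one missing and one incorrect value.
rec = {"profession": "engineer", "income": None, "goal": "retirement"}
ref = {"profession": "engineer", "goal": "house purchase"}
print(completeness(rec, ["profession", "income", "goal"]))  # ~0.67
print(correctness(rec, ref))                                # 0.5
```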
Article
This paper describes the process of inducting theory using case studies, from specifying the research questions to reaching closure. Some features of the process, such as problem definition and construct validation, are similar to hypothesis-testing research. Others, such as within-case analysis and replication logic, are unique to the inductive, case-oriented process. Overall, the process described here is highly iterative and tightly linked to data. This research approach is especially appropriate in new topic areas. The resultant theory is often novel, testable, and empirically valid. Finally, framebreaking insights, the tests of good theory (e.g., parsimony, logical coherence), and convincing grounding in the evidence are the key criteria for evaluating this type of research.
Book
The key to a successful MDM initiative isn't technology or methods, it's people: the stakeholders in the organization and their complex ownership of the data that the initiative will affect. Master Data Management equips you with a deeply practical, business-focused way of thinking about MDM: an understanding that will greatly enhance your ability to communicate with stakeholders and win their support. Moreover, it will help you deserve their support: you'll master all the details involved in planning and executing an MDM project that leads to measurable improvements in business productivity and effectiveness.
• Presents a comprehensive roadmap that you can adapt to any MDM project.
• Emphasizes the critical goal of maintaining and improving data quality.
• Provides guidelines for determining which data to "master."
• Examines special issues relating to master data metadata.
• Considers a range of MDM architectural styles.
• Covers the synchronization of master data across the application infrastructure.
Article
The Enterprise Architecture Planning (EAP™) methodology and model are a seminal part of the common body of EA knowledge; they remain relevant in their own right and have influenced a number of other current frameworks, methodologies, and best practices in the public and private sector. However foundational the EAP™ approach has become, according to some EA practitioners it is somewhat dated in its content, presentation, the technological examples utilized, and its relationship to some aspects of how EA is practiced today. The intent of this article is to refresh one part of the EAP approach, the famous EAP model (also known as the Wedding Cake Model), and to provide explanations of each part of the updated model.