Volume 29
Article 3
Organizing Data Governance: Findings from the Telecommunications Industry
and Consequences for Large Service Providers
Boris Otto
Institute of Information Management, University of St. Gallen, Switzerland
Boris.Otto@unisg.ch
Many companies see Data Governance as a promising approach to ensuring data quality and maintaining its value
as a company asset. While the practitioners’ community has been vigorously discussing the topic for quite some
time, Data Governance as a field of scientific study is still in its infancy. This article reports on the findings of a case
study on the organization of Data Governance in two large telecommunications companies, namely BT and
Deutsche Telekom. The article proposes that large, service-providing companies in general have a number of
options when designing Data Governance and that the individual organizational design is context-contingent.
Despite their many similarities, BT and Deutsche Telekom differ with regard to their Data Governance organization.
BT has followed a more project-driven, bottom-up philosophy; Deutsche Telekom, on the other hand, favors a rather
constitutive, top-down approach. The article also proposes a research agenda for further studies in the field of Data
Governance organization.
Keywords: Data Governance, data quality, case study, organizational design, contingency factors, research agenda
Volume 29, Article 3, pp. 45-66, August 2011
The manuscript was received 9/13/2010 and was with the authors 3 months for 2 revisions.
I. INTRODUCTION
Motivation
Data Governance is a topic that is attracting growing attention, both within the practitioners’ community and among
Information Systems (IS) researchers. Consulting companies, software vendors, and analysts have emerged, giving
recommendations on the establishment of Data Governance in organizations [Bitterer and Newman, 2007; IBM,
2007; Newman and Logan, 2006]. Researchers have proposed initial frameworks for Data Governance [Khatri and
Brown, 2010; Otto et al., 2007] and have analyzed influencing factors [Weber et al., 2009] as well as the current
status of implementation [Pierce et al., 2008].
The fact that Data Governance is currently being discussed so vividly is due to a growing number of business
requirements which a company’s data is expected to meet [Kagermann et al., 2010]. In the telecommunications
industry, working with high-quality data is widely seen as a competitive advantage. A company aiming at the
provision of Internet Protocol Television (IPTV) on a broad scale, for example, needs to have available both
customer data (address data, for example) and network infrastructure data (regarding the bandwidth available in
specific regions, for example) in a complete and accurate form. False-negative responses during availability checks
(i.e., when customers are told that the product is not available for them, although the bandwidth would be sufficient)
due to incorrect or inaccurate data result in lost revenues and reduced market share. What data quality means to
telecommunications companies can be seen from a statement by the consulting company Deloitte: "Data ascends from the basement to the boardroom" [Deloitte, 2009].
The issue of data quality and data quality management has been under scientific investigation since the 1990s. The
respective studies deal with the definition of data quality [Lee et al., 2006; Wang and Strong, 1996], the identification
and description of tasks to be addressed by data quality management [English, 2001; Wang, 1998], or the
description and analysis of data quality management in concrete, real-world cases. Interestingly, only very few
studies deal with the organization of Data Governance on a companywide level. This omission is all the more
surprising as companies clearly express their need for effective support in their efforts to establish Data
Governance. Despite the fact that the first attempts have been made to formulate recommendations regarding the
organization of data quality management (Wang [1998] proposing to view information as a product and to appoint
information product managers, or Weber et al. [2009] applying the principles of contingency theory, for example),
there has so far been no research into what companies should actually do when trying to organize Data
Governance.
Research Problem and Approach
The article at hand addresses this gap and sets out to investigate the organization of Data Governance. The aim is
to shed light on the organizational dimensions of Data Governance and the design options companies have when
organizing Data Governance. For this purpose, the article uses case study research which is a qualitative research
method particularly suited to researching contemporary phenomena that cannot be separated from the environment
they are embedded in (unlike laboratory experiments, for example) and that have not been scientifically studied to a
large extent so far [Benbasat et al., 1987; Stake, 1995; Yin, 2002].
The case study comprises two organizations that have similar key company data and similar goals (to reduce
confounding effects in the study). Thus the article follows Blanton et al. [1992] and their analysis of IT organization in
the banking industry. The two cases presented in this article are taken from the telecommunications industry,
namely BT Group (the successor to the former public organization British Telecommunications) and Deutsche Telekom. The telecommunications industry was chosen because of the relatively dominant role, compared to other industries such as manufacturing, that information technology (IT) in general [Potter et al., 2010] and data in particular play for business success [Deloitte, 2009].
This article contributes both to the advancement of the scientific knowledge base and to the practitioners’
community. The scientific contribution results mainly from the article’s ambition to increase the theoretical
understanding of Data Governance organization. It follows the suggestion of previous research to analyze "more complex organizational" [Weber et al., 2009, p. 23] models for Data Governance. Epistemologically, the result of the article represents a "theory for explaining," hence a typical outcome of case study research [Gregor, 2006, p. 624].
Practitioners will benefit from the study because they can transfer the results to their own business context, thus
fostering the organization of Data Governance.
Section II of the article analyzes the theoretical background. The result of the analysis lays the foundation for
Section III which formulates the research question and introduces the research process. The two cases of BT and
Deutsche Telekom are presented in Section IV before they are analyzed and interpreted in Sections V and VI. The
article concludes with Section VII, giving a summary of the article and an outlook for future research.
II. THEORETICAL BACKGROUND
Data Governance and Related Concepts
Data is sometimes referred to as the "raw material" of information, while information is defined as data in context
[Boisot and Canals, 2004] or processed data [van den Hoven, 1999]. In the practitioners’ community, however, both
terms are often used synonymously. Thus, for the purposes of this article, information and data are used
interchangeably, following the example of Wang [1998].
A common definition of the term Data Governance has yet to be established. Some studies point out that several
definitions exist, but they do not come up with any clarification in this regard [Pierce et al., 2008]. Other studies take
up Weill's [2004] definition of IT Governance as a "framework for decision rights and accountabilities to encourage desirable behavior in the use of IT." Consequently, Weber et al. [2009], for example, see Data Governance as a framework for decision rights and accountabilities to encourage desirable behavior in the use of data. Khatri and Brown [2010] provide a similar description: "Data governance refers to who holds the decision rights and is held accountable for an organization's decision-making about its data assets."
In the practitioners’ community, the issue of Data Governance has been taken up in recent years by analysts,
software vendors, and consulting companies. Gartner sees Data Governance as establishing "disciplined processes for managing information assets" [Bitterer and Newman, 2007]. Forrester shares the process view and defines Data Governance as the "process by which an organization formalizes the fiduciary duty for the management of data assets critical to its success" [Karel, 2007]. In contrast, IBM sees Data Governance not as a process but as a "quality control discipline for adding new rigor and discipline to the process of managing, using, improving and protecting organizational information" [IBM, 2007]. And Hewlett-Packard considers Data Governance to be the enterprise function focused on ensuring the quality of an organization's data [Hewlett-Packard, 2007].
The scientific state of the art and the state of the art in the practitioners’ community converge with regard to the
following aspects of Data Governance:
Data Governance is based on the notion of data as a company asset, the value of which organizations
need to maintain and/or increase.
Data Governance specifies who in a company is allowed to make what decisions regarding the
handling of data (rights), and what the tasks related to this decision-making are (duties).
Data Governance demands binding guidelines and rules for data quality management.
The article defines Data Governance as a companywide framework for assigning decision-related rights and duties
in order to be able to adequately handle data as a company asset.
Both researchers and practitioners frequently use the term Data Governance in the context of data quality and data
quality management. Pierce et al. [2008] have shown that in many organizations activities targeted at Data
Governance as well as those targeted at data quality management are initiated by the same individuals or groups of
individuals. And almost all publications state that, among other things, Data Governance is about making
decisions with regard to data quality guidelines and data quality management [Cheong and Chang, 2007; Khatri and
Brown, 2010; Otto et al., 2007].
Data quality is typically determined by the data's "fitness for use," i.e., the capability of data to meet certain requirements defined by the user in order to accomplish a certain goal in a given context [Olson, 2003; Redman, 2001]. Thus, data quality seems to be in the eye of the beholder. However, despite the absence of a universal definition, a number of variables (so-called data quality dimensions) are commonly used to determine data quality. Examples of such dimensions are completeness, consistency, accuracy, relevance, and timeliness [Wang and Strong, 1996].
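To make these dimensions concrete, the following is a minimal sketch of how two of them, completeness and timeliness, could be computed on customer records; the field names, sample records, and the one-year freshness threshold are illustrative assumptions and are not taken from the cases.

```python
from datetime import date

# Minimal sketch of two data quality dimensions (completeness and timeliness)
# measured on customer records. Field names and thresholds are assumptions.
REQUIRED_FIELDS = ["name", "street", "city", "postal_code"]
MAX_AGE_DAYS = 365  # assumed freshness threshold

records = [
    {"name": "A. Smith", "street": "High St 1", "city": "London",
     "postal_code": "E1 6AN", "last_verified": date(2011, 3, 1)},
    {"name": "B. Jones", "street": None, "city": "Cardiff",
     "postal_code": None, "last_verified": date(2009, 7, 15)},
]

def completeness(record):
    """Share of required fields that are actually populated."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f))
    return filled / len(REQUIRED_FIELDS)

def is_timely(record, today=date(2011, 8, 1)):
    """True if the record was verified within the assumed freshness window."""
    return (today - record["last_verified"]).days <= MAX_AGE_DAYS

avg_completeness = sum(completeness(r) for r in records) / len(records)
timeliness_ratio = sum(is_timely(r) for r in records) / len(records)
print(f"completeness: {avg_completeness:.2f}, timeliness: {timeliness_ratio:.2f}")
```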
Data quality management comprises activities for the improvement of data quality [Batini and Scannapieco, 2006].
Going beyond mere reactive action (e.g. identification and correction of data defects), data quality management
48
works as a proactive and preventive concept, characterized by a continuous cycle consisting of activities to define,
measure, analyze, and improve data quality including the design of appropriate framework conditions [English, 2001;
Eppler and Helfert, 2004; Wang et al., 1998].
The close connection between Data Governance and data quality management results from the perspective on data
as a company asset. This notion of data goes back to the Hawley Report, in which practitioners in the 1990s defined
a number of basic principles for handling "information assets" [Horne, 1995]. Data assets are considered to have a certain value for an organization, which is why they need to be managed in the same way that physical goods are managed. Data only has value if it is used. The fitness of data for use is what Wang [1998] considers data quality. Poor data quality, i.e., data "unfit" for use and of low utility, reduces the value of data assets in an enterprise
[Even and Shankaranarayanan, 2007, p. 80]. Thus, the value of data depends on its quality. Data Governance
provides guidelines and rules regarding data quality and data quality management. Figure 1 illustrates the
relationships of these basic concepts.
Figure 1. Data Governance and Related Concepts (Data Governance refers to data assets; data assets have a value; this value depends on data quality; data quality is ensured by data quality management; data quality management is governed by Data Governance)
Organizing Data Governance
Practitioners and researchers agree upon the fact that the establishment of Data Governance is an organizational
design task [Lee et al., 2006; Pierce et al., 2008; Thomas, 2006]. However, a holistic conceptualization of Data
Governance organization is missing. Existing work either discusses particular organizational aspects such as data stewardship [Friedman, 2007; Laurent, 2005] or remains too generic, not considering the specific characteristics of different industries, for example [Khatri and Brown, 2010; Weber et al., 2009].
Therefore, this article looks at existing research on organizational design in general and on the organization of the
IS/IT Governance in particular in order to lay the theoretical ground for the case study. The transfer of findings from
IS/IT Governance to Data Governance follows the lines of argumentation adopted by Khatri and Brown [2010] and
Weber et al. [2009].
In general, organizational design comprises three dimensions. The first organizational dimension relates to an
organization’s goals which can be further divided into formal goals and functional goals [Grochla, 1982; Gross, 1969;
Mohr, 1973]. Whereas the former measure an organization's performance, the latter refer to the tasks an organization has to fulfill. Transferred to the research area of Data Governance, the formal goals relate to maintaining
or raising the value of a company’s data assets [Fisher, 2009, p. 63ff.], whereas the functional goals are represented
by the decision rights defined. Typical decision rights organized by Data Governance refer to the definition of data
quality metrics, the specification of metadata, or the design of a data architecture and a data lifecycle [Weber et al.,
2009].
The second organizational dimension is the organizational form. This materializes in its organizational structure and
its process organization [Galbraith, 2002]. Whereas the former defines the specification and assignment of
responsibilities, the latter focuses on the activities required to meet the organization’s business objectives. Very few
studies address these dimensions from a Data Governance perspective: Wang et al. [1998], suggesting that
information be viewed as a product, propose the appointment of an information product manager and the
specification and control of the information production process, but give no recommendations as to how this should
be done. Weber et al. [2009] propose a reference model for Data Governance, enabling the identification of basic
decision areas and roles, but do not explain how companies might design the individual model elements according
Volume 29
Article 3
49
to their needs. Furthermore, one can distinguish between primary and secondary organization [Galbraith, 1977;
Vahs, 1999]. Primary organization refers to the hierarchical structure of a company. The main characteristic of a
hierarchical relation is the unilateral assignment of decision-making power to a higher hierarchy level. In contrast,
secondary organization is the result of an effort to mitigate dysfunctional aspects produced by the primary
organization, typically by introducing hierarchy-overarching and hierarchy-adding measures [Galbraith, 1977;
Galbraith, 2002; Lawrence and Lorsch, 1967]. Examples are coordinating departments and committees. It is widely
agreed that Data Governance should comprise the specification of roles and decision rights as well as the
assignment of decision rights to roles. Typical roles are data steward, data owner, or Data Governance council
[Bitterer and Newman, 2007; Weber et al., 2009].
The third organizational dimension is the organizational transformation which consists of a transformation process
on the one hand and organizational change measures on the other. Recommendations on what to do when
introducing a continuous improvement process for data quality can be found in the work of Loshin [2001] and that of
Batini and Scannapieco [2006]. However, neither is specific when it comes to the question of how such a process
could be permanently established in companies. Some authors from the practitioners’ community propose maturity
models for Data Governance which are supposed to guide the transformation [Dember, 2006; IBM, 2007]. These
models, however, often remain at the level of "checklists" with no application support in concrete business settings.
Figure 2 shows the conceptual framework for the case study outlined in this article based on an analysis of the
theoretical background. The framework shows Data Governance as an organizational design task which comprises
the design of organizational goals, the design of the organizational form, and organizational transformation. Its use
in the research process is described in further detail in Section III.
Figure 2. Conceptual Framework of Data Governance Organization (Data Governance as an organizational design task comprising organizational goals: formal goals and functional goals; organizational form: structure and processes, secondary organization; and organizational transformation: transformation process and organizational change)
Research on the instantiation of the framework in practice, i.e., the actual organization of Data Governance in a
company, is still in its infancy, in contrast to IS/IT Governance, which is a highly researched area in this regard. One example is the question of where to locate IS/IT-related decision-making power in the organization. Boynton et al. [1992] propose an approach for supporting companies that want to find a solution for splitting IS/IT-related decision rights between functional departments and the IT department. In the context of Data Governance there have been some
basic reflections about this issue both in the scientific [Khatri and Brown, 2010; Weber et al., 2009] and in the
practitioners’ community [Laurent, 2005]. A second example relates to the organizational form of the IS/IT function.
Numerous studies have examined alternatives in the continuum between centralized and decentralized organization
[Brown, 1997; Ein-Dor and Segev, 1978]. On this basis, various researchers have determined contingency factors
by transferring the principles of contingency theory to the organization of the IS/IT function [Brown and Magill, 1998;
Sambamurthy and Zmud, 1999]. The "IT Governance archetypes" (business monarchy, IT monarchy, federal governance, IT duopoly, and feudal governance) described by Weill and Ross [2005] are also built on these studies.
Comparable studies relating to Data Governance have not been conducted so far. Instead, Weber et al. [2009]
showed the general applicability of contingency theory to Data Governance. They identified the performance
strategy of a company, the organizational structure and also the degree of market regulation and business process
harmonization as contingency factors, but did not elaborate further on resulting "organizational design patterns," for example. In addition, a number of organizational "prototypes" exist in certain individual cases. Cheong and Chang
[2007], for example, describe a Data Governance structure which consists of five roles (Data Governance council,
data custodian, user group, IT council, and IT technical staff). And Friedman [2007] proposes a sample organization
chart with data stewards per business department, a data quality sponsor, and a data quality operations team.
Data Governance in the Telecommunications Industry
Few available case studies deal with the management of data quality in the telecommunications industry. Reid and
Catterall [2005] examine the significance of data quality in the process of introducing Customer Relationship
Management (CRM) in a telecommunications company. They show that a data management strategy and a ―data
governor‖ are required prior to implementing a CRM system in order to avoid poor customer data quality. Umar et al.
[1999] propose a methodology for integrating data quality and data architecture management, based on findings
from the telecommunications industry. A similar study was conducted at AT&T, resulting in a three-step methodology
for ensuring data quality [Redman, 1995]. Another case study collects and analyzes experiences gained in the
process of data integration and the development of a companywide conceptual data model in the course of a Data
Warehousing project at Telecom Italia [Trisolini et al., 1999]. Heinrich et al. [2009] illustrate their procedure for
developing data currency metrics at a German mobile phone provider. They provide a means for estimating the
economic impact of measures to improve data currency. However, each of these case studies deals with a single
aspect of data quality management (specification of data quality metrics, for example). None of them addresses
Data Governance as a companywide organizational framework.
III. RESEARCH METHODOLOGY
Research Design
This article aims at investigating the organization of Data Governance. For this purpose, case study research was
chosen as the underlying research method because it allows the examination of contemporary phenomena in the
early state of research in their real-world context [Benbasat et al., 1987; Yin, 2002].
The design of case study research comprises five components [Yin, 2002, pp. 20-27], namely
the research questions
propositions for the research
the unit of analysis
the logic which links the data to the propositions
the criteria for interpreting the findings
The central research question of this article is motivated by the lack of research on Data Governance organization
and asks what options companies have in practice with regard to organizing Data Governance and whether an
individual Data Governance organization is contingent on its context.
The case study is of an exploratory nature, i.e., it aims at laying the foundation for pertinent hypotheses or
propositions for further inquiry. In such exploratory cases, Yin [2002, p. 21] concedes that no elaborated propositions
can be specified beforehand (in contrast to descriptive and explanatory case studies). Nevertheless, he stipulates
that case studies be purpose-oriented, i.e., that there has to be a preliminary conceptual framework guiding the
exploration. This article uses the conceptual framework of Data Governance organization (see Figure 2) as
preliminary proposition, assuming that Data Governance comprises three different dimensions and that companies
have options when designing the dimensions. Moreover, previous research on Data Governance suggests the
assumption that individual instantiation of the design options is contingent on context [Khatri and Brown, 2010;
Weber et al., 2009].
The unit of analysis of the study at hand is the organization of Data Governance in the telecommunications industry.
From a Data Governance standpoint, the telecommunications industry is characterized by high IT spending [Potter
et al., 2010], big data volumes, many legal and regulatory requirements (e.g., related to security and privacy of
customer data) and intensive consumer interaction, as well as complex application system architectures (all
compared to other industries such as discrete manufacturing, for example).
The above mentioned conceptual framework for Data Governance organization is also used as the logic which links
the data to the propositions in the case analysis (see Section V). Moreover, the framework, together with the proposition that its individual instantiation is context-contingent, serves as guidance for the interpretation of findings
in Section VI. In doing so, the article results in a theory for explaining because "explanations are given for how and why things happened in some particular real-world situation" [Gregor, 2006, p. 624].
Case Selection
Along with a clear understanding of the unit of analysis, case selection is crucial for building theory from case
studies because it is case selection that determines the external validity of the case study [Yin, 2002, pp. 35-36], hence the limits for generalizing the findings [Eisenhardt, 1989, pp. 536-537]. The companies to be investigated in
the case study were selected on the basis of key company data [Benbasat et al., 1987, p. 373]. Key company data
comprises aspects such as company size, geographic coverage, organizational structure, or customer base
structure. To avoid confounding effects, the study was limited to the examination of companies with comparable
environments.
The two companies that were chosen are BT and Deutsche Telekom. Both companies rank among the world’s
leading providers of communications solutions. BT’s activities include networked IT services, local, national, and
international telecommunications services, and higher-value broadband products and services. BT Group plc is the
holding company for an integrated group encompassing the four lines of business. Deutsche Telekom offers
telecommunications and IT services. Deutsche Telekom’s business strategy is oriented toward socioeconomic
developments, such as digitalization of central areas of life, personalization of products and services, and the
increasing mobility of individuals. The two companies show similarities regarding their key company data (see Table
1). Previous case studies followed similar principles for the selection of companies, as did Blanton et al. [1992] with
their analysis of IT organization in the banking industry.
The main contacts for the case study were two employees in charge of data quality management introduction at BT
and the head of the organizational unit "Data Governance" at Deutsche Telekom. Case A was analyzed first, in the second quarter of 2008. In a second step, case B was selected for replication purposes [Yin, 2002, pp. 45-46].
Thus, the Deutsche Telekom case was chosen because it allowed for investigation of the same phenomenon.
Table 1: Case Overview
Case A: BT. Headquarters: London, United Kingdom. Revenue in 2009: GBP 21.4bn. Staff in 2009: 96,000. Lines of business: BT Retail, BT Wholesale, Openreach, BT Global Services. Markets served: operations in 70 countries.
Case B: Deutsche Telekom. Headquarters: Bonn, Germany. Revenue in 2009: EUR 64.6bn. Staff in 2009: 260,000. Lines of business: Broadband/Fixed Network, Mobile Communication, Business Customers. Markets served: operations in more than 50 countries.
(bn = billion)
Data Sources, Collection, and Exposition
In order to prepare the companies for the case study research project, they were both provided with information
material outlining the objectives of the project and a structured interview guide. The interview guide reflected the fact
that Data Governance is considered a design activity [Khatri and Brown, 2010] which follows a process of reshaping
an initial organizational state toward a desired state [Galbraith, 2002]. Therefore, the questions in the interview guide
were structured into three groups: first, the initial situation (including "pain points" and the "need for action"); second,
the process of Data Governance organization covering the three organizational dimensions of the conceptual
framework; and third, the new situation.
In accordance with accepted recommendations on case study research [Stake, 1995; Yin, 2002], multiple sources
were used for data collection. However, interviews provided the main data source at both BT and Deutsche
Telekom. Table 2 summarizes the data sources used.
All interviews were documented in writing by two researchers working in parallel. The documents were then
analyzed and transferred into an integrated case document (one for each company). The first versions of this
document were then sent to the interview participants for feedback and clarification of open points. Once all the
additional information feedback had been incorporated, the final version was reviewed and discussed several times
on the phone with the main contacts at BT and Deutsche Telekom. The final versions, which are available as
working reports [Otto and Weber, 2009; Schmidt et al., 2010], were also reviewed by the companies’ communication
departments.
The presentation of the cases follows a chronological structure and intentionally deviates from the analytical
structure provided by the conceptual framework. This enables the case presentation to take into account the fact
that the case study covers an event over time [Yin, 2002, p. 139]. Subsequently, the conceptual framework of Data
Governance organization is used as the underlying structure to analyze and compare the cases in Section V, before
the results are interpreted in Section VI. This approach follows the recommendations given by Benbasat et al. as a "path which readers can readily follow" [1987, p. 374].
Table 2: Data Sources

Case A: BT
Interviews:
2008-04-14, 13:30-17:00, London (UK): Head of Information Management Practice, BT Design; IM Manager, Business Revenue Management, BT Wholesale; IT Manager, BT Wholesale; Lead Data Management Consultant, BT Design
2008-04-14, 19:30-21:00, London (UK): Head of Unit Customer Management ICT Transformation, BT Design; IT Manager, BT Retail; Lead Data Management Consultant, BT Design
2008-05-15, 12:30-13:45, Cardiff (UK): 2 DM Consultants, BT Design; Lead Data Management Consultant, BT Design
2008-04-15, 14:45-15:30, telephone: Enterprise Data Architect, BT Design; Lead Data Management Consultant, BT Design
Documents (external presentations): Presentation by [Hill, 2003] on the IM Program; Presentation by Turner and Evans [2007] at ICIQ 2007
Archival records (internal documents): Information Management Forum Terms of Reference, BT Group plc, London, UK, 2002; BT Group Information Policy, BT Group plc, London, UK, 2001

Case B: Deutsche Telekom
Interviews:
2009-11-06, 09:00-10:30, Darmstadt (DE): Head of Data Governance, Central IT
2009-11-06, 11:00-12:30, Darmstadt (DE): External consultant to Central IT
2009-11-06, 13:00-14:30, Darmstadt (DE): Data Architect, Central IT
Documents (official publications): Presentation by [Grewe, 2009] at the Data Management Congress 2009 in Cologne; TM Forum, Information Framework (SID), as per 04.02.2010 [TM-Forum, 2010]
Archival records (company communications): DTAG, Connected life and work, annual report 2008; DTAG, Deutsche Telekom AG, company presentation, Bonn, 2009

DM: Data Management; IM: Information Management; IT: Information Technology; TM Forum: Tele Management Forum; SID: Shared Information and Data
IV. CASE PRESENTATION
BT
Initial Situation
In the mid-1990s, BT introduced a strategy to increase the speed of service deliveries to customers, as a response
to the effects of market deregulation and increasing competition. At that time, BT’s business processes were very
complex, and in many cases processes were supported by more than one application system. Business processes
were mainly designed to fit the functional orientation of BT’s organization. Hence, end-to-end processes were split
across many different organizational units. Also, as BT was undergoing continuous organizational change, business
processes were permanently being designed "behind reality." The gap between business process design and the
demands of reality was typically closed by increased manual activities and workarounds.
The central problem at BT with regard to data was that no standards existed for the creation, use, and maintenance
of data. These activities tended to be spread across the entire organization. For example, each line of business had
its own customer data, a diverse set of products, and different contractual agreements. Another problem was
meeting the need for high-quality customer address data as an essential precondition for business processes such
as billing, delivery, repair, and marketing.
BT’s Operational Support Systems supported all business processes. However, the systems landscape was very
complex and heterogeneous. Over time, the proliferation of systems had led to complicated interfaces and data
flows. Moreover, acceptance of IT services was limited because the business often failed to realize the expected benefits from the systems. In this context, BT's IT strategy aimed at improving existing systems and increasing their use,
rather than developing a new application architecture design.
Other factors causing BT to deal with the issue of data quality resulted from its functional organization, which did not
support integrated business processes. Business units did not know where the data they were using came from, and
which other units in subsequent process steps were using the same data for what purpose. As a consequence,
employees did not focus on entering seemingly unnecessary data correctly, which resulted in poor data quality in
subsequent process steps. For example, repair personnel were sent to the wrong locations because of incorrect
customer address data, which was costly for BT and resulted in a poor experience for the customer.
The Information Management Program
BT’s data quality efforts started in 1997 with an initial project and over time evolved into the Information
Management (IM) Program. The IM Program was sponsored by the Group Chief Information Officer (CIO), which
was a business function at that time. The initial project had a budget of GBP 20,000 and aimed at identifying
opportunities to better leverage investments in information systems in BT Wholesale. The study covered various topics and their impact on business. It emerged that data quality was rated as the number one priority.
A team of two people was formed in response to this finding. The first area which was selected dealt with the data
quality of customer names and addresses used by BT Retail. The project team analyzed and evaluated appropriate
software tools to improve data quality in this domain. The one-year license fees incurred by the purchase of the software were recovered within three months through postage savings achieved after cleansing the address data. With
this success as the starting point, more projects followed. The largest project aimed at replacing two marketing
systems by one new system called SWIFT. It was based on the introduction of standards for customer names and
addresses. The increasing number of data quality projects led to the establishment of the IM Steering Group in
1998, whose main objective was to oversee projects and make sure they were on time and within budget.
In 1999, BT Wholesale joined the initiative, driven by a lack of transparency with regard to its assets. The Steering
Group then gave rise to the IM Forum. Among the major tasks of the IM Forum were portfolio management for data
quality projects, planning and budgeting processes for data quality activities, coordinated identification of
opportunities for data quality projects, and alignment with BT’s overall business goals. Also in 1999, the IM Forum
issued the first version of the BT Group Information Policy, which aimed at maximizing exploitation of "information assets" across the organization.
In 2000, the IM Forum initiated a one-day information event called "Shaping the Future BT Conference." The event
was chaired by BT’s CEO, who gave a keynote speech emphasizing the importance of information in BT’s strategic
goals. Both BT Retail and BT Wholesale presented selected data quality projects demonstrating that increased data
quality results in improved business performance. As a result of the conference, the demand for data quality projects increased.
Also in 2000, the team developed a methodology for data quality projects (see below). The approach was
characterized by a focus on cost-benefit analyses. Project proposals that could not prove a reasonable cost-benefit
ratio were not accepted. While the Data Quality Methodology was an answer to the need for a structured approach
to efficiently managing the growing number of projects, it was also a means for fostering adoption of the Information
Policy. The team learned that a simple directive regarding the use of information and data was not enough to
achieve sufficient support from the business. Instead, the team had to speak a "language the business understood,"
i.e., be able to quantify the monetary benefits of improved data quality in a business process or function. By 2003,
the number of data quality projects managed under the IM Program had grown to more than fifty.
The IM Forum
Members of the IM Forum were the Group CIO, the CIOs of the different lines of business, and a representative from
central Information and Knowledge Management (IKM) practice. The projects coordinated by the IM Forum were
funded by a central budget provided by the Group CIO. However, the benefiting lines of business were to contribute from the savings they made as a result of the projects so that the IM Forum would be self-funding.
Figure 3. BT's Information Governance (Information Management Forum connecting the Group CIO, Line of Business CIOs, Line of Business Information Managers, IM Team experts/consultants, Associate Policy Owners, and Internal Audit through 1-to-1 meetings, reviews and improvement, audits, and projects as required)
In the course of the IM Program, specific roles in both business and IS/IT departments were established. As the data
held by each line of business needed to reflect the different market and organizational needs, one senior manager
from every line of business was appointed "Information Manager." Information Managers were responsible for data
quality communications, organizational change and improvement activities, and data cleansing. Business owned the
definitions of data quality and the metrics that define success across data quality projects. An Internal Audit group
carried out audits to ensure compliance of the lines of business with the Information Policy. Figure 3 shows an
overview of BT’s information governance roles and responsibilities.
As a corporate-wide instrument, the Information Policy set out directives for the different lines of business, such as
conducting data quality assessments, developing measures and targets for data quality, and developing
communication plans. Implementation of the Information Policy was one of the IM Forum’s main areas of
responsibility.
However, BT realized that the Information Policy in itself was just a first step and not sufficient to create sustained
awareness of the importance of data quality in the business. It was not that business managers were per se
reluctant to implement the Information Policy, but certain business requirements (such as "do more with less," organizational changes, etc.) conflicted with data quality objectives. The IM Forum reacted by changing its approach
and introducing the Data Quality Methodology, which allowed managers to focus on the business impact of data
quality. Speaking the "language the business understood," i.e., quantifying the monetary benefit of each data quality
project, facilitated the deployment of sustainable, reusable solutions and hence sustainable data quality
improvements.
Data Quality Methodology
BT’s Data Quality Methodology consists of five phases (see Figure 4). The first phase, Problem & Opportunity
Identification, aims at identifying pain holders and "champions" and at relating complaints and suggestions regarding
data quality to business objectives. Within the second phase, Diagnosis, the project team carries out a quick data
analysis which comprises data discovery and data profiling techniques in order to identify the current data quality level as a baseline (also called a "data quality health check"). The health check is followed by a workshop with
stakeholders to identify problem sources and potential business benefits. Proposals with no indication of business
benefit are rejected. For approved proposals, stakeholders and the project team create a commercial proposition
document within the Proposal phase. The document has to be approved by the business sponsor and then forms the project agreement. The commercial proposition ensures that the business owns the project. In the fourth phase, Re-engineering, the project team designs and implements the solution. Business leads and IT supports the solution,
which includes changing culture, systems, processes, and technology. With the fifth phase, Consolidation, the
project team aims at ensuring a sustainable solution.
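The reject-on-commercial-grounds gate implied by the methodology can be illustrated with a minimal sketch; the acceptance criterion (a simple benefit-to-cost ratio threshold) and the sample figures are assumptions made for illustration, as the article does not specify how proposals were scored.

```python
# Minimal sketch of the commercial gate applied to data quality project proposals.
# The ratio threshold and the sample figures are illustrative assumptions.
MIN_BENEFIT_COST_RATIO = 1.0  # assumed: benefits must at least cover costs

def review_proposal(name, estimated_annual_benefit, estimated_cost):
    """Accept a proposal only if it demonstrates a reasonable cost-benefit ratio."""
    if estimated_cost <= 0:
        raise ValueError("estimated_cost must be positive")
    ratio = estimated_annual_benefit / estimated_cost
    decision = "accept" if ratio >= MIN_BENEFIT_COST_RATIO else "reject"
    return decision, ratio

# Hypothetical proposals, loosely inspired by the address cleansing example:
for proposal in [("address cleansing", 80_000, 20_000), ("nice-to-have report", 5_000, 40_000)]:
    decision, ratio = review_proposal(*proposal)
    print(f"{proposal[0]}: {decision} (benefit/cost = {ratio:.1f})")
```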
In addition to the Data Quality Methodology, the Architectural Forum (which was formed as a subgroup of the IM Forum) developed data quality principles for system development as part of the design procedures called Joined-Up Design (JUD). New system development projects had to pass the JUD test, which ensured that the prerequisites
for data quality were taken into consideration in the design of new systems. For example, only existing data sources
were to be used. The Architectural Forum was empowered to stop system development projects until they passed
the test.
Figure 4. BT's Data Quality Methodology, adapted from [Turner and Evans, 2007] (five phases: 1. Problem & Opportunity Identification, 2. Diagnosis using data discovery, data profiling, and data modeling tools, 3. Proposal, 4. Re-Engineering using data re-engineering, extract-transform-load, and data profiling tools, 5. Consolidation, "Hold the Gains," using data reconciliation tools; proposals are reviewed, reused, or rejected on commercial grounds)
New Situation
BT achieved a cumulative GBP 700 million in business benefits during the seven-year IM Program. The overall
benefit is a result of the benefit contributions of the numerous individual data quality projects which were coordinated
under the IM Program (in 2003, for example, the program included more than fifty projects). Sources of benefits
were mainly process improvements, i.e., reduced costs of failure, less scrap and rework, enhanced productivity, and
better morale. Reduced lifecycle costs and faster deployment of enhancements led to an increased return on
investment in Information Technology, which was one of the main goals of the IM Program.
Within BT Wholesale, the business signed off over GBP 600 million in benefits, which amounts to 85 percent of all of
BT’s data quality benefits. Benefits resulted from decreased inventory costs, avoidance of capital expenditure,
revenue recovery and creation, and improved asset utilization. Furthermore, lost assets could be found, asset status
was corrected, correct bills for products and features were issued, electronic business was enabled, and customer
satisfaction and process efficiency was improved.
Every year, the team was given targets for savings in the asset investment budget, and the team succeeded in
meeting these targets. When it came to defining business benefits, there were no generic measures; these were specific to every project.
An example is the "lost asset" project, which improved accuracy and availability of network asset data. This was achieved through harmonizing asset data records, improving business processes between BT and external network suppliers, and supporting "asset recovery." In 2003, approximately 15 percent of all network assets were not recorded in the inventory systems (i.e., they were considered "lost") [Hill, 2003]. And in the same year, BT reported
on GBP 1.8 billion in new network capital expenditure [BT Group plc, 2003]. Avoidance of just 10 percent (as a
conservative estimate) of the monetary value of the average 15 percent losses equals savings of GBP 27 million.
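The savings figure can be reproduced from the numbers given above; the following is a minimal check of the arithmetic (the 15 percent loss rate, the 10 percent avoidance assumption, and the GBP 1.8 billion expenditure are the article's figures).

```python
# Reproducing the conservative savings estimate for the "lost asset" project.
capex_gbp = 1_800_000_000   # new network capital expenditure reported by BT in 2003
lost_share = 0.15           # share of network assets not recorded in inventory systems
avoidance_share = 0.10      # conservative share of the lost value assumed avoidable

savings_gbp = capex_gbp * lost_share * avoidance_share
print(f"Estimated savings: GBP {savings_gbp / 1e6:.0f} million")  # -> GBP 27 million
```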
The IM Program is seen as a successful initiative within BT. However, with growing maturity and stability, the
program became self-contained and developed into "business as usual," which is why it now no longer exists as a
strategic goal in its own right. At BT Wholesale, there is a relatively small team that has continued to run internal
data quality projects using the Data Quality Methodology. Part of the core IM Team now works at BT Client Services,
providing data quality services to BT’s customers using the Data Quality Methodology. The Architectural Forum
turned into a group dealing with Enterprise Master Data Management.
Deutsche Telekom
Initial Situation
To validate whether the company’s strategic business goals are met on a short-term basis, Deutsche Telekom
defined six priorities (the BIG 6) containing measurable objectives for each business year. In 2008, for example, the
BIG 6 referred to the company’s expansion of its leading position in the broadband sector, entrance into the
entertainment market, and the fulfillment of customer expectations with regard to products and services.
Before 2006, Deutsche Telekom pursued neither systematic data quality management nor any form of Data
Governance. Activities related to data quality management were limited to casual consistency and completeness
checks of data inventories for single applications or ad hoc data cleansing initiatives when new software
components or products were implemented. All these measures were demand-driven and did not take into account
business requirements to be met by data. Moreover, the question of how to improve data quality on a companywide
level and in a preventive manner was not addressed.
When the two business divisions, T-Com and T-Online, merged in June 2006, Deutsche Telekom faced a growing
need to ensure data quality on a companywide level. For example, it became necessary to consolidate data
inventories from both divisions in order to ensure consistency of customer master data, as customers were
increasingly being offered a combination of products from both divisions (i.e., telephone and Internet products and
services). The following problems occurred with the merger of T-Com and T-Online:
Alignment of data quality management with the overall company strategy and the BIG 6 was missing.
Companywide guidelines and processes for maintenance and modification of business objects and data
objects were not defined and/or implemented.
Responsibilities for maintenance and modification of business objects and data objects were not clearly defined.
The origin and the distribution of data objects across the entire IT landscape were not transparent.
Data defects, although identified (inconsistency of customer data, for example), could not be quantified.
There was no companywide transparency as to which data had to be available in which quality, or as to
what methods and techniques could be used for effective data quality management.
There was no common understanding of business objects and data objects (i.e., no common
understanding of the elements to be addressed by data quality management).
Data Governance Organization
In the course of integrating T-Com and T-Online to form the new business unit T-Home, Deutsche Telekom decided
to set up organizational units solely addressing the issue of data quality management. These organizational units
were supposed to act as a line function in order to bundle and control existing data quality management activities,
which until then had been conducted independently of one another and in a decentralized fashion.
In April 2007, a project was started in the IT board area, aiming at preparing the regular operations of a business function for data quality management. The project resulted in the establishment of two departments for data quality
management, one within a business function and the other within the central IT department. The business
department is called MQM27 and deals with the consolidation of business requirements to be met by data
(particularly customer data). It is the central unit for data quality management on the business side. Within the
central IT department, known as ZIT (Zentrum für Informationstechnik, Center for Information Technology), a department called MDM (Master Data Management, ZIT72) was established, which is responsible for putting forward concepts for data quality management and translating business requirements to the IT level. ZIT also
encompasses another unit, Data Management (ZIT73/74), which deals exclusively with data cleansing activities. For
the purpose of planning and executing data quality measures, the departments ZIT72 and MQM27 cooperate closely
with a unit named IT2 (Enterprise IT Architecture). ZIT72, on the other hand, consists of two teams. One of these
(ZIT722) deals with short- and mid-term tasks such as data quality measurements. The other, Data Governance
(ZIT721), is responsible for long-term measures, such as defining standards for Data Governance, developing
guidelines and rules to ensure high data quality, specifying data object ownership, or designing and implementing
companywide data models and data architectures. Figure 5 shows how data quality management is embedded in
the organizational structure of Deutsche Telekom. Grayed out organizational units are the ones dealing with the
functional goals of Data Governance.
Figure 5. Data Governance Organization at Deutsche Telekom (Deutsche Telekom AG with the business units T-Home, T-Mobile, and T-Systems; the Line of Business CIO; MQM, Marketing and Quality Management, with MQM2, Quality Management, and MQM27, Data Quality Management; IT1, IT Strategy and Quality; IT2, Enterprise IT Architecture; ZIT7, Information Processing, with ZIT72, MDM, ZIT73/74, Data Management, ZIT721, Data Governance, and ZIT722, DQ Measurement and Assurance)
In parallel with the design of the organizational structure, the NDQ2010 project was initiated (NDQ stands for Nachhaltige Datenqualität, Sustainable Data Quality). One of its goals was the identification and assignment of data-related responsibilities (a schematic sketch of these assignments follows the list below):
Data owners are responsible for the data class or attribute assigned to them. They make the final
decisions with regard to maintenance or modification of their data classes and attributes. The role of
data owner is typically assumed by the head of the functional department/division.
Besides the data owners, two data architects, one from the business side and one from the central IT
department, are assigned to each data class. Data architects are responsible for the management of
metadata and data structures of the respective data class (i.e., the data modeling processes).
Data managers (again one from the business side and one from the central IT department) are
responsible for data maintenance and for the related processes, tools, and resources on behalf of the
data owner.
Business and IT data quality managers are responsible for controlling data quality processes.
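The assignment pattern described above can be expressed as a simple data structure; the class, field names, and example values below are illustrative assumptions rather than Deutsche Telekom's actual artifacts.

```python
from dataclasses import dataclass, field

# Minimal sketch of the per-data-class responsibility assignment described above.
# Role holders and the example data class are illustrative assumptions.
@dataclass
class DataClassAssignment:
    data_class: str
    data_owner: str                     # final decision authority for the data class
    business_data_architect: str        # metadata and data structures, business side
    it_data_architect: str              # metadata and data structures, IT side
    business_data_manager: str          # maintenance processes, tools, resources (business)
    it_data_manager: str                # maintenance processes, tools, resources (IT)
    data_quality_managers: list = field(default_factory=list)  # control of DQ processes

customer_master = DataClassAssignment(
    data_class="customer master data",
    data_owner="Head of the functional department",
    business_data_architect="Business data architect",
    it_data_architect="IT data architect, central IT",
    business_data_manager="Business data manager",
    it_data_manager="IT data manager, central IT",
    data_quality_managers=["Business DQ manager", "IT DQ manager"],
)
print(customer_master.data_owner)
```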
New Situation
At Deutsche Telekom, guidelines and rules exist for data quality management. While guidelines (such as "clear and unambiguous definition of and ownership for metadata") tend to constitute a framework aimed at directing employees' actions and "way of thinking," rules (such as "the central model of reference data for all T-Home applications is the current version of the business object model") are more concrete and binding.
For the purpose of aligning data quality management with the overall company strategy, the BIG 6 are used as an
orientation pattern when performance indicators for data quality management and target values for data quality are
defined. This ensures that data quality management activities always support strategic company goals. Moreover,
clear data-related responsibilities are assigned per business object. And a standard data model was introduced on a
companywide basis.
The Data Governance team (as part of the MDM department) has evaluated the benefits of a consistent data
modeling approach and a standardized, companywide data model:
The standardized data model provides a common and consistent terminology facilitating communication
between functional departments and the central IT department as well as across domain boundaries.
The definition of business objects across projects and domains fosters business object reuse so that
repeated redefinition can be avoided and costs and additional effort can be reduced.
A standardized data model is a prerequisite for the consolidation of distributed system landscapes and
the migration of systems.
The comprehensive definition of business objects fosters a common understanding of data elements
and reduces errors due to inadequate usage.
A standardized data model is a prerequisite for efficient integration of internal and external partners, as
all data elements have the same semantics.
Apart from that, Deutsche Telekom quantified the overall savings through those benefits by estimating the cost
reduction potential for new system development activities, maintenance activities, and system operations activities.
For the budget for new system developments, for example, Deutsche Telekom estimated a portion of 20 percent being "generally influenceable," of which between 12.5 and 22.5 percent might be positively affected by data quality
measures. Deutsche Telekom expected to reduce the overall IT budget by 0.2 percent in the first year after
introducing a standardized data model, by 0.67 percent in the second year, and (leveraging learning curve effects)
by 1.46 percent in the third year, resulting in an average annual saving potential of approx. 0.8 percent of the IT
budget [see Schmidt et al., 2010]. With an estimated average proportion of the IT budget amounting to 5.3 percent
of a telecommunications company’s turnover [Smith and Potter, 2008], Deutsche Telekom’s cost-saving potential in
2009 equaled EUR 27 million.
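The EUR 27 million figure follows from the percentages reported above; the following is a minimal sketch of that calculation, using Deutsche Telekom's 2009 revenue of EUR 64.6 billion from Table 1 as the turnover figure (the exact turnover basis used internally is an assumption).

```python
# Reproducing Deutsche Telekom's estimated annual cost-saving potential.
turnover_eur = 64.6e9            # 2009 revenue from Table 1, assumed as the turnover basis
it_budget_share = 0.053          # estimated IT budget share of turnover [Smith and Potter, 2008]
yearly_reductions = [0.002, 0.0067, 0.0146]  # IT budget reductions in years 1-3

avg_reduction = sum(yearly_reductions) / len(yearly_reductions)   # approx. 0.8 percent
savings_eur = turnover_eur * it_budget_share * avg_reduction
print(f"average reduction: {avg_reduction:.2%}, saving potential: EUR {savings_eur / 1e6:.0f} million")
```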
V. CASE ANALYSIS
Organizational Goals
With regard to the organizational goals related to Data Governance, the cases are similar insofar as the motivation
to organize Data Governance was derived from the companies’ overall strategic goals (more implicitly in the case of
BT, very explicitly in the case of Deutsche Telekom with its BIG 6).
Also, the economic benefit was quantified in both cases. However, BT did so only for individual projects, whereas at
Deutsche Telekom the benefit (of a standardized data model, for example) was not calculated until data quality
management had been organized as a line function. At BT, formal goals were set as targets for cost savings (in
particular, for the asset investment budget), whereas they were not specified at Deutsche Telekom.
A comparison of the quantified benefits in the two cases shows that BT realized cost savings of approximately GBP
100 million per year while Deutsche Telekom’s annual saving potential equals EUR 27 million. However, the
conclusion that BT’s approach to Data Governance was more effective is not valid, because the benefit quantification approaches applied in the two cases differ. For example, BT assigned the business benefits of all data quality projects to the IM Program, whereas Deutsche Telekom only quantified the effects of introducing a standardized data model. Moreover, BT was able to consider the full amount of realized benefits, whereas Deutsche Telekom calculated with an attenuation factor (12.5 to 22.5 percent in the example given above).
Similar functional goals exist with regard to support for data quality projects. However, whereas all of BT’s actions
were project-driven, at Deutsche Telekom companywide design tasks also played an important role (for the data
architecture, for example).
Organizational Form
The organizational structure of Data Governance differs in the two cases. During the time of the IM Program there
was no line function at BT, whereas at Deutsche Telekom the two departments MQM27 and ZIT72 were set up right
from the start. At the same time, differences also exist with regard to the process organization. At BT, the Data
Quality Methodology was developed and introduced as a new process for guiding data quality projects in a
business-oriented manner. In contrast, Deutsche Telekom uses existing processes such as the change request
management for IS/IT services.
And while at BT the IM Forum was a central element in the secondary organization of Data Governance, at
Deutsche Telekom interaction of the different departments and projects took place via existing project management
structures. The translation of activities into a line function was not performed at BT until Data Governance, in the form of the IM Program, had proven to be successful.
Organizational Transformation
When it came to the organizational transformation, BT decided in favor of a “start small, grow big” approach (i.e., the
initial budget for the project was quite small). Deutsche Telekom, on the other hand, decided to implement new line
functions first, and set up projects to elaborate on detailed aspects of data quality management afterwards (e.g., the
NDQ project).
As for organizational change management, Deutsche Telekom only executed measures in the course of the merger
process for the two divisions. Apart from this, no explicit change management took place. BT, however, continuously took care to maintain the commitment of both top management and the functional departments.
Table 3 summarizes the result of the case analysis as a case-ordered meta-matrix [Miles and Huberman, 1994].
VI. INTERPRETATION OF FINDINGS
Propositions
The case analysis shows that companies have options when designing the three organizational dimensions
described in the framework of Data Governance organization. The results of the case analysis in Table 3 show a
number of similarities in the way the two studied companies organize Data Governance. Similarities can be found
mainly with regard to the formal goals (e.g., derivation from the companies’ strategic business goals, limited degree
of formalization) and to functional goals (e.g., setting companywide policies and standards, companywide Data
Governance roles). Also, the case analysis reveals many differences which lie mainly in the organizational structure
(project organization at BT vs. line functions at Deutsche Telekom) and the transformation process (“think big, start small” at BT vs. “top down” from the outset at Deutsche Telekom).
Proposition 1: There is no “off-the-shelf” approach for organizing Data Governance. The
organizational design of Data Governance rather depends on how companies “configure” the
variety of organizational dimensions related to Data Governance.
The fact that even companies with many similarities (with regard to key company data, for example) have numerous design options in organizing Data Governance confirms the assumption in existing research that Data Governance is context-contingent [Khatri and Brown, 2010; Weber et al., 2009].
Proposition 2: The effective design of Data Governance organization is contingent on external and
internal factors because these factors determine the configuration of organizational dimensions
related to Data Governance.
Proposition 2 leads to the question of exactly which factors are contingencies. As an appropriate tactic for further
case investigation, Eisenhardt recommends the juxtaposition of seemingly similar cases [1989, p. 541]. As
mentioned before, the two companies show many similarities: They are almost the same in terms of key company
data (revenue, number of employees, markets served, product portfolio, etc.), are former state-owned companies,
have undergone significant organizational changes, and operate in the same industry (one which is characterized by
intensive customer interaction, high infrastructure investments, and high IT spending). However, they display
significant differences in two other aspects.
First, the “mandate for action” was given by different units in the two companies. At BT, both the initial project in 1997 and the IM Program were sponsored by a business function (in fact by the CIO, who was then organizationally positioned on the business side). In contrast, at Deutsche Telekom the project which prepared the setting up of the organizational structure for Data Governance was IT-driven. This might explain why, at BT, Data Governance was for a long time organized as a program with no formal allocation to a line function: business units were interested in solving data quality issues on a project-by-project basis, and the “constitutive” approach which Deutsche Telekom chose might have been “harder to sell” to the business units at BT.
Second, there was a difference in the awareness of Data Governance as an important topic. When BT started the IM Program, Data Governance was a relatively new concept with an extremely limited knowledge base. The initial project at BT was not even considered a data quality project, and data quality did not turn out to be an issue until the results had been analyzed. As a consequence, there were no references BT could learn from, and all Data Governance-related design options had to be learned “from scratch.” At Deutsche Telekom, the situation was very different.
Table 3: Case Analysis
(Variables cf. Figure 2; Case A = BT, Case B = Deutsche Telekom)

Organizational Goals / Formal Goals (congruence: High)
- BT: driven by strategic business requirements; implicitly related to strategic business goals; annual targets for the IM Program.
- Deutsche Telekom: driven by strategic business requirements; explicitly related to the BIG 6, but not formalized.

Organizational Goals / Functional Goals (congruence: Medium)
- BT: initially defined in the IM Policy; goal definition on a case-wise basis (e.g., data quality targets); central portfolio management (including budgeting).
- Deutsche Telekom: job descriptions for units MQM27 and ZIT72; “constitutive” goals (e.g., standard data model, guidelines and rules).

Organizational Structure / Structure and Process Organization (congruence: Low)
- BT: organizational unit not introduced until the IM Program turned out to be successful; establishment of IM Managers in business units.
- Deutsche Telekom: organizational units MQM27 and ZIT72; integration into existing processes (e.g., change request management).

Organizational Structure / Secondary Organization (congruence: Medium)
- BT: IM Forum as a means of secondary organization; Data Quality Methodology to facilitate project organization.
- Deutsche Telekom: identification and assignment of Data Governance roles across the organization; projects such as NDQ (dedicated DQ projects); no formal board structure.

Organizational Transformation / Transformation Process (congruence: Low)
- BT: evolution over time, starting with a small project and growing to the IM Forum (no “master plan” existed upfront); continuous management of a project portfolio.
- Deutsche Telekom: small preparatory project in 2006; NDQ project to elaborate tasks and define roles.

Organizational Transformation / Organizational Change (congruence: Low)
- BT: “Shaping the Future BT Conference” including CEO speech; task of Information Managers; focus on benefits to obtain support from business.
- Deutsche Telekom: part of larger initiative (integration of two lines of business); benefits exemplarily analyzed to show impact on the business.
At the start of its Data Governance activities, the topic was still in its infancy within the scientific community but was being intensively discussed within the practitioners’ community. For example, IBM had founded a Data Governance Council consisting of a number of large US-based firms as far back as 2004 [IBM, 2006]. Thus, Data Governance was considered an accepted approach, which in turn reduced the perceived risk for decision-makers of organizing Data Governance as a line function.
Proposition 3: The positioning of the mandate for action and the awareness of Data Governance
within the company are two contingency factors for Data Governance organization. They determine
the configuration of organizational dimensions related to Data Governance and, thus, the
effectiveness of Data Governance.
At BT, the mandate for action was allocated to a business function from the early stages of the Data Governance
activities. This approach to mandate allocation can be interpreted as a reason why Data Governance was always
required to demonstrate its business benefit (which also led to a mandatory cost-benefit analysis in the Data Quality Methodology). Moreover, it might also explain why BT (in contrast to Deutsche Telekom) did not use a rather “conservative” approach to quantifying the business benefits. BT’s Data Quality Methodology ensured that business functions were continuously “engaged” in the justification of Data Governance.
Proposition 4: In companies with the mandate for action allocated to a business function, the
business benefits related to or caused by Data Governance are eventually attributed to Data
Governance to a larger extent (compared to companies with a mandate for action in IT, for
example).
The propositions also raise the question of the validity of the findings, in particular their external validity. External validity concerns the domain to which the findings can be generalized [Yin, 2002, pp. 35ff.]. The comparison of the companies
studied suggests that the findings hold true for other large, multinational service providers with intensive use of
complex IT and a large customer base. Similar companies can be found, for example, in the retail banking and utility
industries. Cheong and Chang [2007] present the case study of a large utility service provider in which Data
Governance activities were driven by data quality issues in asset management (similar to BT) and in which a Data
Governance council was established (similar to the IM Forum at BT).
In contrast, the findings are not expected to be transferable to companies with fundamentally different characteristics from those of BT and Deutsche Telekom. Such firms might be found in manufacturing industries, among others. Wijnhoven et al. [2007] discuss Total Data Quality Management (TDQM) at the gas valve producer Honeywell Emmen in The Netherlands. This case differs in many ways from the cases presented in this article. The focus of activities was on product data (not on customer or asset data), the company operates a central product database (typical for manufacturing firms and in contrast to the complex and heterogeneous system landscapes of telecommunications firms), and it comprises only one location. Organizational activities associated with TDQM at Honeywell Emmen mainly relate to the setting up of a project team. In contrast to BT and Deutsche Telekom, neither a primary organization (in the form of a line function, as at Deutsche Telekom) nor a secondary organization (like the IM Forum and IM Program at BT) was set up.
Research Agenda Outline
The propositions mentioned above form a first step in the development of a theory on the organization of Data
Governance. In order to prepare the ground for further investigation and to allow such a theory to be operationalized, the
article suggests empirical indicators for the organizational dimensions of Data Governance and a set of contingency
factors. Following the line of argumentation used by Wheeler [2002] in his work on net-enablement, these
suggestions are not exhaustive, but rather form a foundation which can be examined for evidence in future research.
Table 4 shows sample empirical indicators.
An initial set of contingency factors for Data Governance is proposed by Weber et al. [2009]. Combined with the
factors which emerge from the evidence of the case study, the article proposes a number of external and internal
contingencies. External contingencies comprise the company size, the industry, the volatility of the markets served,
and the extent of business-to-consumer activities (in contrast to business-to-business activities). Among the internal
factors are the allocation of the mandate for Data Governance, the awareness in the organization of the topic, the
organizational structure in general, the degree of business process harmonization, and the heterogeneity of the
application system landscape.
Future research on the organization of Data Governance should include both positivist and interpretivist approaches.
Positivist research approaches would transform the propositions into testable hypotheses which could then be used in quantitative research designs. They would also allow for substantiating the applicability of contingency theory to the field of Data Governance. Interpretivist research approaches would include further case studies which allow for gaining a deeper understanding of feasible Data Governance “configurations” and prevailing “organizational patterns.”
VII. CONCLUSION AND OUTLOOK
Companies organize Data Governance in order to be able to ensure data quality and maintain the value of data as a
company asset. This article presents a case study on the organization of Data Governance based on two of the
largest companies from the telecommunications industry, namely BT and Deutsche Telekom.
The findings from the case study lead to four propositions regarding the organization of Data Governance in
telecommunication firms in particular and in large service-providing companies in general. The case study findings propose, for example, that the “configuration” of Data Governance is contingent on external and internal factors.
Table 4: Empirical Indicators Related to Organizational Dimensions

Organizational Goals / Formal Goals
- Data Governance goals derived from business goals
- Existence of metrics for data quality
- Integration of data quality metrics in the performance management system (e.g., key performance indicators of business processes, balanced scorecard)

Organizational Goals / Functional Goals
- Existence and maintenance of a data strategy
- Existence of preventive data quality management
- Existence of and compliance with data standards
- Data architecture and data lifecycle management defined

Organizational Structure / Structure and Process Organization
- Mandate for action allocated in the organization
- Data Governance roles reflect the organizational structure of the company (central and decentral departments, business and IT departments, different business units)
- Existence of standard operating procedures
- Integration of Data Governance processes in existing processes and practices (e.g., IT service management)

Organizational Structure / Secondary Organization
- Existence of a companywide Data Governance body
- Clear responsibilities for portfolio management, budgeting, and resource allocation related to Data Governance

Organizational Transformation / Transformation Process
- Existence of a project and time plan for Data Governance organization
- Existence of a “roll-out” plan

Organizational Transformation / Organizational Change
- Involvement and engagement of all hierarchical levels
- Existence of communication measures
- Existence of training and support activities
These factors determine the individual design of organizational dimensions of Data Governance and, thus, its
effectiveness. Moreover, both companies studied in the case are able to quantify the business benefits related to Data Governance. The organizational design of Data Governance at BT, however, seems to be favorable in terms of eventually attributing these benefits to Data Governance activities.
The article also ventures to propose a research agenda for further investigation of Data Governance. Further studies should combine both positivist and interpretivist approaches. On the one hand, they should aim at validating the organizational design options through an investigation of companies on a larger scale. On the other hand, a focus of future work should be placed on the further investigation of contingency factors and the confirmation of “organizational patterns” for Data Governance. From the perspective of the practitioners’ community, the results of the work presented here can be considered valuable, as they specify different options to be taken into consideration when establishing Data Governance. Reflecting on these options will help to avoid approaching the topic prematurely and in too simplified a manner.
Moreover, the article indicates some more general implications for the IS community. It shows that two of the world’s leading telecommunications companies have undertaken a significant endeavor to improve the way data are managed and to sustainably assure the quality of their data. While the discussion within the IS community over recent years has emphasized topics such as IS strategies, architectures, and software design (which are undoubtedly of high relevance for both practitioners and researchers), data as a research topic has often been underrepresented. With the increasing prominence of Data Governance, though, the study of data management in general might experience a “renaissance” in the community. In the 1990s, Wang [1998] proposed the idea of managing data in the same way as physical goods. He compared data with raw material and software systems with manufacturing systems. Keeping his analogy in mind, one could get the impression from past discussions in the IS community that companies should focus only on factory planning, manufacturing machinery design, shop-floor production systems, and the like, while losing sight of managing raw materials, semi-finished goods, and finished goods. As this is certainly not an appropriate approach, the future discussion around data in the IS community should be more balanced than in the past.
ACKNOWLEDGMENTS
The research presented in this article is a result of the Competence Center Corporate Data Quality (CC CDQ) at the
Institute of Information Management at the University of St. Gallen. The CC CDQ is a consortium research project and part of the research program Business Engineering at the University of St. Gallen (BE HSG).
REFERENCES
Editor’s Note: The following reference list contains hyperlinks to World Wide Web pages. Readers who have the
ability to access the Web directly from their word processor or are reading the article on the Web, can gain direct
access to these linked references. Readers are warned, however, that:
1. These links existed as of the date of publication but are not guaranteed to be working thereafter.
2. The contents of Web pages may change over time. Where version information is provided in the
References, different versions may not contain the information or the conclusions referenced.
3. The author(s) of the Web pages, not AIS, is (are) responsible for the accuracy of their content.
4. The author(s) of this article, not AIS, is (are) responsible for the accuracy of the URL and version
information.
Batini, C. and M. Scannapieco (2006) Data Quality: Concepts, Methodologies and Techniques, Berlin, Germany: Springer.
Benbasat, I., D.K. Goldstein, and M. Mead (1987) “The Case Research Strategy in Studies of Information Systems”, MIS Quarterly (11)3, pp. 369–386.
Bitterer, A. and D. Newman (2007) Organizing for Data Quality, Stamford, CT: Gartner, Inc., G00148815.
Blanton, J.E., H.J. Watson, and J. Moody (1992) “Toward a Better Understanding of Information Technology Organization: A Comparative Case Study”, MIS Quarterly (16)4, pp. 531–555.
Boisot, M. and A. Canals (2004) “Data, Information and Knowledge: Have We Got It Right?”, Journal of Evolutionary Economics (14)1, pp. 43–67.
Boynton, A.C., G.C. Jacobs, and R.W. Zmud (1992) “Whose Responsibility Is IT Management?”, Sloan Management Review (33)4, pp. 32–38.
Brown, C.V. (1997) “Examining the Emergence of Hybrid IS Governance Solutions: Evidence from a Single Case Site”, Information Systems Research (8)1, pp. 69–94.
Brown, C.V. and S.L. Magill (1998) “Reconceptualizing the Context-Design Issue for the Information Systems Function”, Organization Science (9)2, pp. 176–194.
BT Group plc (2003) Annual Report and Form 20-F 2003.
Cheong, L.K. and V. Chang (2007) “The Need for Data Governance: A Case Study” in Toleman, M., A. Cater-Steel, and D. Roberts (eds.) Proceedings of the 18th Australasian Conference on Information Systems (ACIS 2007), Toowoomba: The University of Southern Queensland, pp. 999–1008.
Deloitte (2009) Telecommunications Predictions: TMT Trends 2009, Deloitte Touche Tohmatsu.
Dember, M. (2006) “7 Stages for Effective Data Governance”, Architecture & Governance Magazine (2)4, http://www.architectureandgovernance.com/content/7-stages-effective-data-governance (current July 29, 2011).
Ein-Dor, P. and E. Segev (1978) “Organizational Context and the Success of Management Information Systems”, Management Science (24)10, pp. 1064–1077.
Eisenhardt, K.M. (1989) “Building Theories from Case Study Research”, Academy of Management Review (14)4, pp. 532–550.
English, L.P. (2001) “Total Quality Data Management (TQdM)” in Piattini, M.G., C. Calero, and M. Genero (eds.) Information and Database Quality, Boston, MA: Kluwer, pp. 85–110.
Eppler, M.J. and M. Helfert (2004) “A Framework for the Classification of Data Quality Costs and an Analysis of Their Progression” in Chengalur-Smith, S. et al. (eds.) Proceedings of the 9th International Conference on Information Quality (ICIQ 2004), Cambridge, MA: Massachusetts Institute of Technology, pp. 311–325.
Even, A. and G. Shankaranarayanan (2007) “Utility-Driven Assessment of Data Quality”, ACM SIGMIS Database (38)2, pp. 75–93.
Fisher, T. (2009) The Data Asset: How Smart Companies Govern Their Data for Business Success, Hoboken, NJ: Wiley.
Friedman, T. (2007) Best Practices for Data Stewardship, Stamford, CT: Gartner, Inc., G00153470.
Galbraith, J.R. (1977) Organization Design, Reading, MA: Addison-Wesley.
Galbraith, J.R. (2002) Designing Organizations: An Executive Guide to Strategy, Structure, and Process, 2nd edition, San Francisco, CA: Jossey-Bass.
Gregor, S. (2006) “The Nature of Theory in Information Systems”, MIS Quarterly (30)3, pp. 611–642.
Grewe, A. (2009) “Datenarchitekturen – Welche Datenmodelle braucht das Unternehmen?” in IIR Data Management Kongress 2009, Cologne, Germany: IIR Technology.
Grochla, E. (1982) Grundlagen der organisatorischen Gestaltung, Stuttgart, Germany: Poeschel.
Gross, E. (1969) “The Definition of Organizational Goals”, The British Journal of Sociology (20)3, pp. 277–294.
Heinrich, B., M. Klier, and M. Kaiser (2009) “A Procedure to Develop Metrics for Currency and Its Application in CRM”, ACM Journal of Data and Information Quality (1)1, Article 5.
Hewlett-Packard (2007) Managing Data as a Corporate Asset: Three Action Steps Toward Successful Data Governance, Hewlett-Packard Development Company.
Hill, J. (2003) “Information Management in British Telecom” in DIMACS Workshop on Data Quality, Data Cleaning and Treatment of Noisy Data, Piscataway, NJ.
Horne, N.W. (1995) “Information as an Asset – The Board Agenda”, Computer Audit Update (9), pp. 5–11.
IBM (2006) “IBM Delivers New Data Governance Service to Help Companies Protect Sensitive Information”, IBM Corporation, http://www-03.ibm.com/press/us/en/pressrelease/20769.wss (current May 20, 2011).
IBM (2007) The IBM Data Governance Council Maturity Model: Building a Roadmap for Effective Data Governance, IBM Corporation, http://www.935.ibm.com/services/uk/cio/pdf/leverage_wp_data_gov_council_maturity_model.pdf (current July 11, 2011).
Kagermann, H., H. Österle, and J.M. Jordan (2010) IT-Driven Business Models: Global Case Studies in Transformation, Hoboken, NJ: John Wiley.
Karel, R. (2007) Data Governance: What Works And What Doesn’t, Cambridge, MA: Forrester Research.
Khatri, V. and C.V. Brown (2010) “Designing Data Governance”, Communications of the ACM (53)1, pp. 148–152.
Laurent, W. (2005) “The Case for Data Stewardship”, DM Review (15)2, pp. 26–28.
Lawrence, P.R. and J.W. Lorsch (1967) Organization and Environment, Boston, MA: Harvard University Press.
Lee, Y.W. et al. (2006) Journey to Data Quality, Boston, MA: MIT Press.
Loshin, D. (2001) Enterprise Knowledge Management: The Data Quality Approach, San Diego, CA: Morgan Kaufmann.
Miles, M.B. and A.M. Huberman (1994) Qualitative Data Analysis: An Expanded Sourcebook, 2nd edition, Thousand Oaks, CA: SAGE.
Mohr, L.B. (1973) “The Concept of Organizational Goal”, The American Political Science Review (67)2, pp. 470–481.
Newman, D. and D. Logan (2006) Governance Is an Essential Building Block for Enterprise Information Management, Stamford, CT: Gartner, Inc., G00139707.
Olson, J. (2003) Data Quality: The Accuracy Dimension, San Francisco, CA: Morgan Kaufmann.
Otto, B. and K. Weber (2009) From Health Checks to the Seven Sisters: The Data Quality Journey at BT, University of St. Gallen, Institute of Information Management, BE HSG/CC CDQ/8.
Otto, B. et al. (2007) “Towards a Framework for Corporate Data Quality Management” in Toleman, M., A. Cater-Steel, and D. Roberts (eds.) Proceedings of the 18th Australasian Conference on Information Systems (ACIS 2007), Toowoomba: The University of Southern Queensland, pp. 916–926.
Pierce, E., W.S. Dismute, and C.L. Yonke (2008) The State of Information and Data Governance – Understanding How Organizations Govern Their Information and Data Assets, IAIDQ and UALR-IQ.
Potter, K. et al. (2010) IT Metrics: IT Spending and Staffing Report, 2010, Stamford, CT: Gartner, Inc., G00173877.
Redman, T.C. (1995) “Improve Data Quality for Competitive Advantage”, Sloan Management Review (36)2, pp. 99–107.
Redman, T.C. (2001) Data Quality: The Field Guide, Boston, MA: Digital Press.
Reid, A. and M. Catterall (2005) “Invisible Data Quality Issues in a CRM Implementation”, Journal of Database Marketing & Customer Strategy Management (12)4, pp. 305–314.
Sambamurthy, V. and R.W. Zmud (1999) “Arrangements for Information Technology Governance: A Theory of Multiple Contingencies”, MIS Quarterly (23)2, pp. 261–290.
Schmidt, A., K. Hüner, and A. Grewe (2010) Fallstudie Deutsche Telekom AG – Einheitliche Datenarchitektur als Grundlage für unternehmensweites Datenqualitätsmanagement, University of St. Gallen, Institute of Information Management, BE HSG/CC CDQ/23.
Smith, M. and K. Potter (2008) Preliminary Findings: 2009 IT Spending and Staffing Report, Stamford, CT: Gartner, Inc., G00163078.
Stake, R.E. (1995) The Art of Case Study Research, Thousand Oaks, CA: SAGE.
Thomas, G. (2006) The DGI Data Governance Framework, The Data Governance Institute.
TM-Forum (2010) “Information Framework (SID)”, TeleManagement Forum, http://www.tmforum.org/InformationFramework/1684/home.html (current Feb. 4, 2010).
Trisolini, S.M., M. Lenzerini, and D. Nardi (1999) “Data Integration and Warehousing in Telecom Italia”, ACM SIGMOD Record (28)2, pp. 538–539.
Turner, N. and D. Evans (2007) “Data Quality? Don’t Waste Your Time” in Proceedings of the 12th International Conference on Information Quality, Cambridge, MA: Massachusetts Institute of Technology.
Umar, A. et al. (1999) “Enterprise Data Quality: A Pragmatic Approach”, Information Systems Frontiers (1)3, pp. 279–301.
Vahs, D. (1999) Organisation: Einführung in die Organisationstheorie und -praxis, 2nd edition, Stuttgart, Germany: Schäffer-Poeschel.
van den Hoven, J. (1999) “Information Resource Management: Stewards of Data”, Information Systems Management (16)1, pp. 88–91.
Wang, R. (1998) “A Product Perspective on Total Data Quality Management”, Communications of the ACM (41)2, pp. 58–65.
Wang, R.Y. et al. (1998) “Manage Your Information as a Product”, Sloan Management Review (39)4, pp. 95–105.
Wang, R.Y. and D.M. Strong (1996) “Beyond Accuracy: What Data Quality Means to Data Consumers”, Journal of Management Information Systems (12)4, pp. 5–34.
Weber, K., B. Otto, and H. Österle (2009) “One Size Does Not Fit All – A Contingency Approach to Data Governance”, ACM Journal of Data and Information Quality (1)1, Article 4.
Weill, P. (2004) “Don’t Just Lead, Govern: How Top-Performing Firms Govern IT”, MIS Quarterly Executive (3)1, pp. 1–17.
Weill, P. and J. Ross (2005) “A Matrixed Approach to Designing IT Governance”, MIT Sloan Management Review (46)2, pp. 25–34.
Wheeler, B.C. (2002) “NEBIC: A Dynamic Capabilities Theory for Assessing Net-Enablement”, Information Systems Research (13)2, pp. 125–146.
Wijnhoven, F. et al. (2007) “Total Data Quality Management: A Study of Bridging Rigor and Relevance” in Österle, H., J. Schelp, and R. Winter (eds.) Proceedings of the 15th European Conference on Information Systems (ECIS 2007), St. Gallen, pp. 925–937.
Yin, R.K. (2002) Case Study Research: Design and Methods, 2nd edition, Thousand Oaks, CA: SAGE.
ABOUT THE AUTHOR
Dr. Boris Otto is Assistant Professor at the University of St. Gallen, Switzerland, and Research Fellow at the Center
for Digital Strategies at the Tuck School of Business at Dartmouth College in Hanover, NH, USA. His main areas of
research include Data Governance, data quality management, and master data management. His work is published
in numerous international journals and proceedings of international conferences. Prior to his current positions he
worked for the Fraunhofer Institute for Industrial Engineering in Stuttgart, Germany, PricewaterhouseCoopers in
Hamburg, Germany, and SAP in Walldorf, Germany.
Copyright © 2011 by the Association for Information Systems. Permission to make digital or hard copies of all or part
of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for
profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for
components of this work owned by others than the Association for Information Systems must be honored.
Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists
requires prior specific permission and/or fee. Request permission to publish from: AIS Administrative Office, P.O.
Box 2712 Atlanta, GA, 30301-2712, Attn: Reprints; or via e-mail from ais@aisnet.org.
Services
... No one, single, "one size fits all" approach to data governance exist (Brous, Janssen, & Vilminko-Heikkinen, 2016). Rather, data governance formats in an organization are often peculiar and specific to such organization (Brous, et al., 2016), with outlined organization roles, decision areas, responsibilities, and a unique outlay of specialized personnel that are hired, trained, nurtured, and integrated into the organization (Khatri & Brown, 2010;Otto, 2011). Based on the proposed initial frameworks for data governance by some researchers, and the attendant influencing structural factors, four data governance key concepts or principles have emerged: (a) Organization, which included Decision rights (Thompson, Ravindran, & Nicosia, 2015), Balanced roles (Al-Khouri, 2012), Stewardship (Thompson, et al., 2015), Ownership (Al-Khouri, 2012), Separation of duties and concern (Malik, 2013), and Improved coordination of decision making (Otto, 2011). ...
... Rather, data governance formats in an organization are often peculiar and specific to such organization (Brous, et al., 2016), with outlined organization roles, decision areas, responsibilities, and a unique outlay of specialized personnel that are hired, trained, nurtured, and integrated into the organization (Khatri & Brown, 2010;Otto, 2011). Based on the proposed initial frameworks for data governance by some researchers, and the attendant influencing structural factors, four data governance key concepts or principles have emerged: (a) Organization, which included Decision rights (Thompson, Ravindran, & Nicosia, 2015), Balanced roles (Al-Khouri, 2012), Stewardship (Thompson, et al., 2015), Ownership (Al-Khouri, 2012), Separation of duties and concern (Malik, 2013), and Improved coordination of decision making (Otto, 2011). (b) Alignment, that involved Meeting business needs (Dawes, 2010), Aligning business and IT (Tallon, 2013), Developing data strategy (Malik, 2013), Defining data quality requirements (Otto, 2011), Reducing error of use (Panian, 2010), and Effective policies and procedures (Hripcsak, et al., 2014). ...
... Based on the proposed initial frameworks for data governance by some researchers, and the attendant influencing structural factors, four data governance key concepts or principles have emerged: (a) Organization, which included Decision rights (Thompson, Ravindran, & Nicosia, 2015), Balanced roles (Al-Khouri, 2012), Stewardship (Thompson, et al., 2015), Ownership (Al-Khouri, 2012), Separation of duties and concern (Malik, 2013), and Improved coordination of decision making (Otto, 2011). (b) Alignment, that involved Meeting business needs (Dawes, 2010), Aligning business and IT (Tallon, 2013), Developing data strategy (Malik, 2013), Defining data quality requirements (Otto, 2011), Reducing error of use (Panian, 2010), and Effective policies and procedures (Hripcsak, et al., 2014). ...
Article
Full-text available
Recent developments in big data have heightened the need for Sustainable Data Governance (SDG). SDG is significant in realizing sustainable economic development in Nigeria. Information and Communication Technologies (ICTs) have made landmark innovational trends in empowering data governance globally. Despite these global impacts of ICT on data governance, numerous investigations have shown that poor sustainability of ICT in Nigeria poses barriers that impede progress related to data governance. SDG which is the pivot for economic growth has remained relatively nonexistence or unattended to due to corrupt policies and practices, ignorance, and illiteracy that plagued sustainable ICT innovations in Nigeria. For this study, the Unified Theory of Acceptance and Use of Technology (UTAUT) was adopted as the conceptual framework. UTAUT model claims that the benefits of using technology and the factors that drive users' decision to use it, is what determines users' acceptance behavior. In this study, the authors explored a narrative review, analysis, and synthesis of prior research that focused on the theoretical underpinnings of vast works of literature that revealed significant information on the impact of sustainable ICT on Sustainable Data Governance in Nigeria. The authors also extracted peer-reviewed articles within the last five years from electronic databases, using some keywords such as "ICT and SDG", "ICT and national economic development", "Trends for ICT", etc. The result of this study revealed that strict adherence to policies, laws, and guidelines on the adoption of ICT coupled with good formulation and communication of same, are the major impact of sustainable ICT that can leverage SDG in Nigeria. The result from this study may increase understanding, minimize corrupt practices and encourage trust in ICT innovations, ICT adoption, its acceptance and sustainability that can positively impact SDG and national economic development in Nigerian.
... DG formats in any nation or organization are often peculiar and specific to such nation or organization [10]. However, DG generally goes with welldefined organizational roles, policies, responsibilities, and a unique behavior that sustains the organization's culture [27] and [36]. There is a consensus among some researchers on initial frameworks for data governance, and the attendant influencing structural factors. ...
... There is a consensus among some researchers on initial frameworks for data governance, and the attendant influencing structural factors. These researchers postulated four data governance key concepts or principles: (a) Organization, which included sound decision making [43], good judgment [5], management oversight [43], right of possession [5], Separation of duties and concern [33], and enhanced and synchronized decision making [36]. (b) The agreement that involved meeting organization needs [14], synchronizing ICT protocol with DG [42], advancing excellent data strategy [33], defining data quality necessities [26] and [36], minimizing the error of use [38], and efficient policies and procedures [25]. ...
... These researchers postulated four data governance key concepts or principles: (a) Organization, which included sound decision making [43], good judgment [5], management oversight [43], right of possession [5], Separation of duties and concern [33], and enhanced and synchronized decision making [36]. (b) The agreement that involved meeting organization needs [14], synchronizing ICT protocol with DG [42], advancing excellent data strategy [33], defining data quality necessities [26] and [36], minimizing the error of use [38], and efficient policies and procedures [25]. (c) compliance and enforcing data accountability [28] and [43], procedure enforcement [42], expected diligence [25], Privacy [5], ingenuousness [28], Security [5], data quality assessment [25] and [26], and (d) shared understanding, based on shared data utilization [5], and [36], standards and metadata management usability [27], [36], and [43], standardized data models and operations [36] and [43], and sustainable enhanced communication [33]. ...
Article
Full-text available
Big data technologies have intensified the need for Sustainable Data Governance (SDG). Significant empirical evidence from literature revealed that of the 2.7 zettabytes of data now in the digital universe, only 67% of organizations deployed data governance or data intelligence solutions, 76% of company executives consider information "mission-critical" or most important asset, while 46% including Nigeria had no formal governance strategy in place. Globally, significant relationships exist between SDG and Sustainable Information and Communication Technology (SICT). Data governance (DG) that is driven by SICT is agile, holistic, security-embedded, accurate, high-quality, sustainable, and on a real-time enterprise data pipeline, required in revamping any nation’s economy. Despite these global impacts of SDG in revamping the national economy, numerous investigations have shown that DG in Nigeria has remained relatively non-existent or unattended to because its approach is driven by IT that adopted rigid and fragmented processes that were carried out on a system by system basis, lacked a single version of the truth or one single reference for critical master data across geographies, business structure, and wider support of the organization. Poor sustainability of ICT in Nigeria also posed barriers that impede progress related to DG due to corrupt policies and practices, ignorance, and illiteracy that plagued SICT innovations in Nigeria. The authors adopted the Unified Theory of Acceptance and Use of Technology (UTAUT) as the conceptual framework for this study. UTAUT model claimed that users’ acceptance behavior towards technology is determined by users’ perceived benefits of using technology and the factors that drive users’ decision to use it. A narrative review methodology was adopted in this study to review significant information based on the study conceptual framework, and existing systems that enhance SDG in revamping Nigeria’s economy. Articles reviewed include peer-reviewed articles and other documentaries within the last 5 years, extracted from electronic databases, using keywords such as “ICT and SDG”, “SDG and national economic development”, “Trends for SDG”, etc. Results from this study revealed that better decision-making, analytics, and regulatory compliance to policies, laws, and guidelines on the adoption of ICT, coupled with good formulation and communication of same, are the major drivers of sustainable DG for revamping the national economy. The result of this study may increase understanding, minimize corrupt practices and encourage trust and regulatory compliance of ICT innovations, adoption, and sustainability that can positively impact SDG for revamping Nigeria’s economy.
... These matters include people, processes, and technology. According to [33] and [34], to support the implementation of good data governance in implementing MDM, a business team and IT staff are needed. The roles and responsibilities in data governance are shown in Table IV [33], [34]. ...
... According to [33] and [34], to support the implementation of good data governance in implementing MDM, a business team and IT staff are needed. The roles and responsibilities in data governance are shown in Table IV [33], [34]. The roles and responsibilities of organizational actors to manage master data must be Unclear master data definition [25], [26] 2. Lack of responsibilities for data maintenance [25], [27] 3. ...
... Organizations need to define roles, functions, and responsibilities in data governance clearly so that data quality is maintained to support MDM implementation following organizational needs [9], [18], [33], [34]. ...
Article
Full-text available
Master data management (MDM) is a method of maintaining, integrating, and harmonizing master data to ensure consistent system information. The primary function of MDM is to control master data to keep it consistent, accurate, current, relevant, and contextual to meet different business needs across applications and divisions. MDM also affects data governance, which is related to establishing organizational actors’ roles, functions, and responsibilities in maintaining data quality. Poor management of master data can lead to inaccurate and incomplete data, leading to lousy stakeholder decision-making. This article is a literature review that aims to determine how MDM improves the data quality and data governance and assess the success of MDM implementation. The review results show that MDM can overcome data quality problems through the MDM process caused by data originating from various scattered sources. MDM encourages organizations to improve data management by adjusting the roles and responsibilities of business actors and information technology (IT) staff documented through data governance. Assessment of the success of MDM implementation can be carried out by organizations to improve data quality and data governance by following the existing framework.
... Data governance assists organisations to ensure data quality and to maintain the value of data as an organisational asset. Many companies see DG as a promising approach to ensuring data quality (Otto 2011a(Otto , 2011b. Successful DG may answer certain data challenges of many organisations. ...
Article
Full-text available
Background: This study aimed to investigate data governance (DG) related to challenges associated with healthcare information systems (HIS), by reviewing guidelines emerging from academic sources as part of a consolidated systematic literature review (SLR). The research contributed theoretically towards the body of knowledge, by reviewing challenges and guidelines related to DG within the healthcare environment. It contributed practically to the body of knowledge through understanding the healthcare information’s systems status. The study also contributed methodologically and significantly to SLR strategies. Objectives: The objective of this study was to understand the features of HIS; acquire information about DG success and understand the influence noted on DG. Method: The study conducted an SLR over the period 2010–2020. Literature collection was not only restricted to South African publications but was extended to international sources. This study adapted a mono method. Results: The study revealed that many organisations have realised that the only method to fix the data problem is the implementation of effective DG. With the increased adoption and rise of cloud computing, DG is gaining interest amongst specialists. Conclusion: The shift from paper-based systems led organisations to seek organisational change through digital transformation. The proper collection and utilisation of electronic healthcare record is the foundation of the digital healthcare. Many organisations value DG as a promising method of maintaining data as a valuable asset.
... improved through relational mechanisms needs to be measured by the performance of 150 the overall interaction level of the port logistics system (Otto, 2011). Therefore, how to 151 comprehensively assess the performance of port logistics data governance from the 152 perspective of multi-subjects, multi-dimensions and dynamic interactions and optimize 153 the data governance structure from the perspective of relationship governance 154 mechanism is the issue to be addressed in this paper. ...
Preprint
Full-text available
Port logistics data governance is characterized by multi-subject, multi-dimensional and dynamic interaction, which brings significant challenges for enterprises to perform the strategic value of data information. In order to maximize the potential of data information, this paper systematically and quantitatively assesses and optimizes port logistics data governance capabilities through an enhanced Meta-Network Analysis-Simulated Annealing Algorithm (MNA-SAA)-based approach. This approach first applies the MNA method to conceptualize port logistics data governance as “Data Information-Skilled People-Technology-Process” (I-A-T-P) meta-networks, which portray the dynamic interaction behavior of data information between different subjects. Then, multi-level meta-network metrics (i.e., people/process data information waste congruence, data information actual/potential load, organization people data information needs congruence, and perform as accuracy) are used to the port logistics data governance capabilities. The SAA is applied to optimize the data governance structure from the skilled people-data information interaction (AI) network. This proposed approach is validated by a generic port logistics data governance case study. Based on the optimization results of the generic port logistics data governance meta-networks, its data governance capability improvement strategies and the advantages of the enhanced MNA-SAA approach are discussed. Overall, this enhanced MNA-SAA approach promotes an understanding of port data governance by systematically conceptualizing complex data governance structures and quantifying data governance capabilities. This study provides decision-makers with implementable support to advance stakeholder collaboration and knowledge sharing to improve future port logistics data governance capabilities.
... A governança de dados especiVica quem na instituição tem permissão para tomar quais decisões sobre o tratamento e dados (direitos) e quais são as tarefas relacionadas a essa tomada de decisão (deveres). Em suma, a governança de dados regulamenta as diretrizes e regras para o gerenciamento da qualidade de dados (Otto, 2011), aleḿ de atribuir papeís e responsabilidades (Reĝo, 2013). ...
Article
Full-text available
Objetivo: Instituições de Ensino Superior (IES) possuem uma variedade de dados com potencial de gerar informações que podem contribuir com a tomada de decisões e uma melhor administração dos recursos. Nesse contexto, este artigo visa discutir o tema Academic Analytics em relação aos seus conceitos, estágios e desafios. Desenho: foi empregado o gênero de ensaio teórico para o desenvolvimento de reflexões a partir da abordagem sociotécnica, considerando a experiencia deste autor, bem como estudos teóricos e empíricos da literatura. Resultados: a partir da articulação de diferentes conceitos, uma definição integrativa sobre Academic Analytics é apresentada; bem como os estágios e desafios gerenciais no desenvolvimento da inteligência analıtica em IES, incluindo dimensões como liderança, governança de dados, cultura e pessoas, política, ética e tecnologia. Contribuições acadêmicas e práticas: a idiossincrasia do contexto, a lente da abordagem sociotécnica, e a perspectiva focada em gestão permite um olhar sistêmico e prático sobre o tema. Espera-se ainda que gestores e lideranças institucionais de IES, bem como pesquisadores, possam tomar as reflexões como um ponto de partida para projetos no âmbito organizacional e acadêmico que envolvam o desenvolvimento de suas competências analíticas.
Article
The study at hand measures the value of improving data governance and access in the Supporting Soil Health Interventions (SSHI) project in Ethiopia. We applied two separate but interlinked models, one qualitative and one quantitative, to create a new framework enhancing the traditional cost–benefit analysis. The qualitative analysis provided novel insights into the specific types of value and the mechanisms through which they are generated. These results underpinned the development of an innovative framework to measure this perceived value quantitatively. By combining the quantitative and qualitative framework, the study demonstrated that it is possible to generate plausible and credible quantitative estimates of both costs and benefits of data governance and access. While acknowledging that the estimates are only illustrative, the case study results suggested on a direct cost measure, at a particular point in time, the SSHI data governance activities yielded a negative return. However, indirect social and public benefits are rarely quantified, but this paper shows that relatively few “indirect” benefits (current but unmeasured, or measurable but in the future) are necessary to reverse that view, at least from the point of the economy more generally.
Article
Full-text available
The amount of data and the speed at which it increases grows rapidly. Companies and public institutions try to manage this increasing flood of data effectively and in a manner that adds value. Besides, the companies and public institutions also join corporate networks or platforms to increase their value by sharing their data. The evolution of traditional business intelligence into business analytics, including real-time analysis, increases the high demand for qualitative data. Data governance tries to create a framework to manage these issues. This interdisciplinary research field has now been in existence for nearly two decades. With this contribution, we attempt to provide the research field with a blueprint. This paper aims to explore the past to understand the present and shape the future of data governance. We give an overview of how the research field changed from 2005 to 2020, commenting on its development and pointing out future research paths based on our findings. We, therefore, conducted a bibliometric analysis to describe the research field’s bibliometric and intellectual structure. The findings show that for years the research field concentrated on a few topics, which currently undergoes change and has led to an opening up of the research field. Finally, the results are discussed and future research strands are highlighted
Article
We discuss the main methodological and technological issues arosen in the last years in the development of the enterprise integrated database of Telecom Italia and, subsequently in the management of the primary data store for Telecom Italia data warehouse applications. The two efforts, although driven by different needs and requirements can be regarded as a continous development of an integrated view of the enterprise data. We review the experience accumulated in the integration of over 50 internal databases, highlighting the benefits and drawbacks of this scenario for data warehousing and discuss the development of a large dedicated data store to support the analysis of data about customers and phone traffic.
Article
- This paper describes the process of inducting theory using case studies from specifying the research questions to reaching closure. Some features of the process, such as problem definition and construct validation, are similar to hypothesis-testing research. Others, such as within-case analysis and replication logic, are unique to the inductive, case-oriented process. Overall, the process described here is highly iterative and tightly linked to data. This research approach is especially appropriate in new topic areas. The resultant theory is often novel, testable, and empirically valid. Finally, framebreaking insights, the tests of good theory (e.g., parsimony, logical coherence), and convincing grounding in the evidence are the key criteria for evaluating this type of research.
Article
The organizational goal concept is important for significant types of organizational research but its utility has been downgraded in recent scholarship. This paper reviews critically key contributions to conceptualizing the organizational goal and synthesizes many of their elements into a more concrete and comprehensive conceptualization. The efforts of Etzioni, Seashore and Yuchtman, Simon, and Thompson to bypass the need for a goal concept in evaluative and other behavioral research are unconvincing in important respects. However, they are persuasive in underscoring the importance of viewing organizational goals as multiple and as empirically determined. Perrow, Gross, and others convincingly suggest a dual conceptualization, so that goals are dichotomized into those with external referents (transitive goals) and those with internal referents (reflexive goals). Deniston et al. contribute the desirability of subsetting the goals of organizations into “program goals” and of differentiating goals from both subgoals and activities. The existence and relative importance of organizational goals and an allied concept, “operative goals,” may be operationally determined by current social science methods. The goal concept as presented here has implications for the evaluation of organizational effectiveness, for research on organizational behavior, for organization theory, and for views of the role of organizations in society.
Article
The authors develop theory for predicting the distribution of decision making between the corporate and business-unit levels of management for a subset of information systems (IS) resources referred to as systems development. Drawing on literature from the fields of MIS, strategic management, and organization theory, they first determine how potentially influential context factors are likely to affect the locus of the lead decision making role from a multiple-contingencies perspective. Then they theorize how conflicting corporate and business-unit contingencies are likely to be resolved. They present a set of six propositions that predict a centralized, decentralized, or compromise design solution for a given business unit on the basis of (1) business-level strategy, (2) whether or not information technology (IT) plays a strategic role for the business unit, (3) the degree of line managers' IT knowledge at the business-unit level, and (4) the level at which opportunities for IT-related synergies across business units are being pursued at the corporate level.
Article
This article defines and discusses one of these qualitative methods--the case research strat- egy. Suggestions are provided for researchers who wish to undertake research employing this approach. Criteria for the evaluation of case research are established and several characteristics useful for categorizing the studies are identified. A sample of papers drawn from information systems journals is reviewed. The paper concludes with examples of research areas that are particularly well- suited to investigation using the case research approach.