Abstract
Despite the large number of academic contributions, there is no uniform definition of data marketplaces. However, different data marketplaces may vary from each other in terms of their underlying business model, type of data offered, functionality, market mechanisms, etc.
... Most data sharing occurs in a bilateral Business-to-Business (B2B) context, as businesses possess the necessary technological means and a unique position in society to collect, retain, share, and profit from data generated by a large number and range of actors (Spiekermann, 2019). However, despite the abundance and value of datasets, widespread data sharing and trading remain limited (Koutroumpis et al., 2020). ...
... The rationale behind the policy interventions of the Data Governance Act and Data Act is rooted in the recognition that without enough legal clarity about who can use the shared data for what objectives, businesses will fear that their data will be used for unagreed purposes by third parties against their business interests. This deficit of trust is present in both B2B and Business-to-Government (B2G) data sharing relations (Klievink et al., 2018;Spiekermann, 2019). ...
... Section 2 presents the problem statement by illustrating how three types of common risks (competition, privacy, and reputational) generate distrust when businesses engage in three emerging approaches to sharing data: data marketplaces, data collaboratives, and data philanthropy, respectively. The selection of these exemplary cases has been made taking into consideration that, despite being perceived as avenues to unlock the value of private sector data, they have failed to scale to a satisfactory extent (Lev-Aretz, 2019;Spiekermann, 2019;Ruijer, 2021). ...
Enabling private sector trust stands as a critical policy challenge for the success of the EU Data Governance Act and Data Act in promoting data sharing to address societal challenges. This paper attributes the widespread trust deficit to the unmanageable uncertainty that arises from businesses’ limited usage control to protect their interests in the face of unacceptable perceived risks. For example, a firm may hesitate to share its data with others in case it is leaked and falls into the hands of business competitors. To illustrate this impasse, competition, privacy, and reputational risks are introduced, respectively, in the context of three suboptimal approaches to data sharing: data marketplaces, data collaboratives, and data philanthropy. The paper proceeds by analyzing seven trust-enabling mechanisms comprised of technological, legal, and organizational elements to balance trust, risk, and control and assessing their capacity to operate in a fair, equitable, and transparent manner. Finally, the paper examines the regulatory context in the EU and the advantages and limitations of voluntary and mandatory data sharing, concluding that an approach that effectively balances the two should be pursued.
... We are living in the data economy, where almost all aspects of everyday life are increasingly digitized, and a plethora of data is stored for analysis and subsequent value generation. The emergence of new technologies leads to an exponential increase in available data (Spiekermann 2019). At the same time, advances in data analysis, data storage, data sharing, and computing power accelerate the rise of the data economy (Zuboff 2019). ...
... The data economy depicts an economic perspective that understands data as an economic good with two primary purposes: the use and the trade of data (Bründl et al. 2015;Hüllmann et al. 2021). We focus on the use of data, where data is monetized by building a value chain around it (Spiekermann 2019). Consequently, data-driven business models are Business Models (BMs) that create value from data and data processing. ...
... The value generated by collecting and processing data can be captured through novel products and services. Compared to traditional value chains, data scales up and never depletes (Shapiro and Varian 1999;Spiekermann 2019). Just how lucrative data as an economic good is, is being showcased by the financial success of major players (e.g., Google and Facebook) and start-ups that purely operate on data and deliver data-related products and services (Klein and Hüllmann 2018). ...
We are now living in the data economy with data as the central fuel for operating data-driven business models. Especially incumbent companies are constantly challenged by rapid technological change and emerging business models that utilize data for value creation. Consequently, every company must rethink and, possibly, renew its business model over time to remain successful. Various tools have been proposed by practice and academia in order to enable and facilitate business model innovation. Although IT tools for supporting business model innovation proved to be meaningful, IT tools for data-driven business model innovation are relatively scarce. Hence, we aim for the design of an IT tool to enable and facilitate data-driven business model innovation. To reach the research objective, we employ a design science research approach accompanied by an experimental evaluation design. In this research, we propose four design features for IT tools supporting data-driven business model innovation.
... The availability of data and advanced technologies offer substantial potential for creating value. Therefore, within the business landscape, data has transitioned from being merely an enabler of products to becoming valuable products in and of themselves, serving as strategic resources for companies [37]. For example, it is predicted that the European Union's data economy will attain a value of €829 billion, accompanied by a notable expansion in the workforce of data professionals, expected to reach 10.9 million by 2025 [32]. ...
... Ideally, the marketplace should possess metadata for all data records to enable future trades [30]. In addition to the expensive maintenance of this type of platform, the distribution of exchange volumes and profits will heavily favour the larger entities that provide high volumes of data at a fixed cost of metadata [37]. A centralized platform can enforce specific entrance policies and fees, which creates a clear boundary. ...
... This lack of visibility and control raises concerns among data providers, as they are unable to monitor how their data is being used and whether it aligns with the agreed-upon terms. The fear of competitors benefiting from their data in unforeseen ways further amplifies the apprehension [37]. This situation not only creates uncertainties surrounding the intended usage of the data but also introduces potential privacy risks [19], as the extent of data utilization and the resulting implications on individual privacy become uncertain. ...
Traditional data monetization approaches face challenges related to data protection and logistics. In response, digital data marketplaces have emerged as intermediaries simplifying data transactions. Despite the growing establishment and acceptance of digital data marketplaces, significant challenges hinder efficient data trading. As a result, few companies can derive tangible value from their data, leading to missed opportunities in understanding customers, pricing decisions, and fraud prevention. In this paper, we explore both technical and organizational challenges affecting data monetization. Moreover, we identify areas in need of further research, aiming to expand the boundaries of current knowledge by emphasizing where research is currently limited or lacking.
... Otto and Jarke 2019, Reiberg et al. 2022 Data Marketplaces A data marketplace is a third-party platform acting as a neutral intermediary and allowing others to sell standardized data products for commercial purposes. Abbas et al. 2021, Spiekermann 2019, Sterk et al. 2022 ...
... Furthermore, data may be enriched with other related data, such as combining product data with sales or customer data (de Corbière, 2009;Otto & Jarke, 2019). Subsequently, data can already be processed with various algorithms, machine learning models, or other tools for analysis (Spiekermann 2019;Susha et al. 2017). While shared data sets can be a combination of various data, the data sets can e.g., contain modified and processed data, resulting in the dimension being non-exclusive. ...
... Organizational sovereignty involves an organization's control over its data, such as data from its production line sensors (Azkan et al. 2020;Lis and Otto 2021;Schäffer and Stelzer 2018). When data is captured or acquired from individuals or other organizations, the organization must obtain permission before sharing this data, e.g., through data purchase, license agreements, or data funding (Gelhaar and Otto 2020;Opriel et al. 2021;Spiekermann 2019). Shared sovereignty occurs when data is freely available (Gelhaar and Otto 2020; Lis and Otto 2021). ...
With the increasing abundance of data, organizations may not only leverage internal data sources to create value but also share data across organizations. However, successful real-world applications of data sharing are still scarce, and support for the systematic development of data sharing practices remains weak. Realizing the true potential of data sharing requires understanding and making informed choices among a wide range of design options, such as bilateral sharing or data ecosystems. In this study, we draw on a systematic literature review as well as 72 real-world data sharing practices to develop a comprehensive taxonomy consisting of 15 dimensions across three meta-dimensions. From a theoretical perspective, our work contributes to structuring and systematizing existing knowledge on data sharing to form the basis for future theorizing processes. In practical terms, it should enable organizations to systematically embrace and evaluate strategic data sharing opportunities.
... According to [1,2], projections indicate a surge in the global data sphere to 181 ZB by 2025, as shown in Figure 1a, alongside an anticipated revenue boost for the global big data market to 655.53 billion dollars by 2029, as shown in Figure 1b. The convergence of mobile cloud computing and communications, the Internet of things (IoT), artificial intelligence (AI), big data analytics, and blockchain technologies has created unprecedented economic prospects for individuals and organizations to capitalize on their data [3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18]. However, the path to effective monetization of data is riddled with challenges, mainly stemming from the limitations of traditional online data marketplaces [4][5][6][7][8][9][10][11][12]. ...
... The convergence of mobile cloud computing and communications, the Internet of things (IoT), artificial intelligence (AI), big data analytics, and blockchain technologies has created unprecedented economic prospects for individuals and organizations to capitalize on their data [3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18]. However, the path to effective monetization of data is riddled with challenges, mainly stemming from the limitations of traditional online data marketplaces [4][5][6][7][8][9][10][11][12]. To address these challenges, the establishment of peer-to-peer (P2P) marketplaces is imperative, facilitating direct transactions between data providers (sellers) and consumers (buyers) over the Internet [7][8][9][10][11][12][13][14][15][16][17][18]. ...
... However, traditional data marketplaces are inadequate. Operating as centralized platforms, they lack the necessary levels of trust, transparency, fairness, accountability, and security [3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20][21][22]. Despite their extensive adoption, these centralized data marketplaces often face vulnerabilities such as single points of failure (SPF) and fail to offer adequate ownership and consent control over data use [16][17][18][19][20], thus contravening data protection regulations such as the General Data Protection Regulation (GDPR) [23]. ...
In contemporary data-driven economies, data has become a valuable digital asset that is eligible for trading and monetization. Peer-to-peer (P2P) marketplaces play a crucial role in establishing direct connections between data providers and consumers. However, traditional data marketplaces exhibit inadequacies. Functioning as centralized platforms, they suffer from issues such as insufficient trust, transparency, fairness, accountability, and security. Moreover, users lack consent and ownership control over their data. To address these issues, we propose DataMesh+, an innovative blockchain-powered, decentralized P2P data exchange model for self-sovereign data marketplaces. This user-centric decentralized approach leverages blockchain-based smart contracts to enable fair, transparent, reliable, and secure data trading marketplaces, empowering users to retain full sovereignty and control over their data. In this article, we describe the design and implementation of our approach, which was developed to demonstrate its feasibility. We evaluated the model’s acceptability and reliability through experimental testing and validation. Furthermore, we assessed the security and performance in terms of smart contract deployment and transaction execution costs, as well as the blockchain and storage network performance.
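The abstract above describes smart-contract-mediated trading but does not reproduce the contract logic, so the following is a minimal, hypothetical Python model of the escrow-style exchange that such P2P data marketplaces typically encode: the buyer's payment is locked until delivery of the data reference is confirmed, and only then released to the provider. All class names, the `data_uri` field, and the confirmation step are illustrative assumptions, not the DataMesh+ implementation.

```python
from dataclasses import dataclass


@dataclass
class DataListing:
    provider: str
    price: int
    data_uri: str  # e.g., a content hash pointing to an off-chain storage network


@dataclass
class Escrow:
    listing: DataListing
    buyer: str
    locked_funds: int = 0
    delivered: bool = False
    settled: bool = False

    def deposit(self, amount: int) -> None:
        """Buyer locks the payment before any data changes hands."""
        assert amount >= self.listing.price, "insufficient payment"
        self.locked_funds = amount

    def confirm_delivery(self) -> None:
        """In an on-chain system this would be triggered by a delivery proof."""
        self.delivered = True

    def settle(self) -> str:
        """Release funds to the provider only after delivery is confirmed."""
        assert self.delivered and not self.settled, "cannot settle yet"
        self.settled = True
        return f"paid {self.locked_funds} to {self.listing.provider}"


deal = Escrow(DataListing("provider_A", 100, "ipfs://..."), buyer="buyer_B")
deal.deposit(100)
deal.confirm_delivery()
print(deal.settle())  # paid 100 to provider_A
```

The point of the escrow pattern is that neither side has to trust the other or a central operator: funds and data release are coupled by the contract rules rather than by an intermediary's goodwill.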
... These include Schomm et al. [47] who provide an initial set of dimensions and Stahl et al. [53] that extend these. Meisel and Spiekermann [38] derive five classification characteristics and Spiekermann [49] provides economic and technological characteristics of data marketplaces. Täuscher and Laudien [54] list key business model attributes of marketplaces, which are however not exclusive to data marketplaces, and Azcoitia and Laoutaris [6] classify data exchange entities including data marketplaces through business model attributes. ...
... To classify the EDMP, we studied data marketplace characteristics. As outlined in the previous section, these are provided through various research articles such as [6,20,32,35,38,47,49,[52][53][54]. The characteristics range from aspects like marketplace ownership over the value proposition, data access methods, monetization aspects to the underlying architecture. ...
... Meisel and Spiekermann [38] provide a classification framework by combining characteristics identified through various research articles including [32,35,47,52]. Spiekermann [49] also provides a data marketplace classification framework based on a taxonomy developed explicitly for classifying data marketplaces based on their business models. By combining both of these frameworks an overview covering various dimensions of data marketplace characteristics can be obtained. ...
In this big data era, multitudes of data are generated and collected which contain the potential to gain new insights, e.g., for enhancing business models. To leverage this potential through, e.g., data science and analytics projects, the data must be made available. In this context, data marketplaces are used as platforms to facilitate the exchange and thus, the provisioning of data and data-related services. Data marketplaces are mainly studied for the exchange of data between organizations, i.e., as external data marketplaces. Yet, the data collected within a company also has the potential to provide valuable insights for this same company, for instance to optimize business processes. Studies indicate, however, that a significant amount of data within companies remains unused. In this sense, it is proposed to employ an Enterprise Data Marketplace, a platform to democratize data within a company among its employees. Specifics of the Enterprise Data Marketplace, how it can be implemented or how it makes data available throughout a variety of systems like data lakes has not been investigated in literature so far. Therefore, we present the characteristics and requirements of this kind of marketplace. We also distinguish it from other tools like data catalogs, provide a platform architecture and highlight how it integrates with the company’s system landscape. The presented concepts are demonstrated through an Enterprise Data Marketplace prototype and an experiment reveals that this marketplace significantly improves the data consumer workflows in terms of efficiency and complexity. This paper is based on several interdisciplinary works combining comprehensive research with practical experience from an industrial perspective. We therefore present the Enterprise Data Marketplace as a distinct marketplace type and provide the basis for establishing it within a company.
... Stahl et al. (2016) describe DMs generally as electronic platforms facilitating data exchange. They represent a neutral intermediary allowing authorized actors to upload and trade their data (Spiekermann, 2019). Since both industry and research face a rising need to obtain appropriate data to promote innovation and new business potential, the popularity of DMs has grown in recent years (Spiekermann, 2019;Fruhwirth et al., 2020). ...
... They represent a neutral intermediary allowing authorized actors to upload and trade their data (Spiekermann, 2019). Since both industry and research face a rising need to obtain appropriate data to promote innovation and new business potential, the popularity of DMs has grown in recent years (Spiekermann, 2019;Fruhwirth et al., 2020). Likewise, the number of DMs joining the global data market increases constantly. ...
... Likewise, the number of DMs joining the global data market increases constantly. Hence, data consumers have more possibilities to acquire external data to improve their business, explore new revenue opportunities, and foster innovation and development (Spiekermann, 2019). ...
Since the emerging information economy relies heavily on data for advancement and growth, data markets have gained increasing attention. However, while global data economies are evolving and data are increasingly shared among organizations in various data ecosystems, marketplaces for personal data (PDMs) exhibited considerable start-up difficulties, which doomed their majority to either fail quickly or to operate in legal grey zones. Apparently, in recent times, novel PDMs have arisen which seem economically and technically viable. The study investigates this “new generation” from both an economic and a technological perspective. Adhering to a rigorous methodology for taxonomy building and evaluation, 18 dimensions and 59 characteristics are presented alongside which these new PDMs can be designed. Additionally, archetypes are derived. The findings reveal that PDMs tend to follow certain design commonalities holding for data markets generally but comprise specific design elements distinguishing them both from conventional data markets and among each other.
... These challenges manifest in a variety of issues within different data governance decision domains, including how to protect data, ensure data quality, and define and model data consistently. Regarding data security, companies fear a loss of control over their data, which could lead to a competitive disadvantage (Roman & Stefano, 2016;Spiekermann, 2019;van den Broek & van Veenstra, 2015). Concerning data quality, data consumers require insights into the quality of data products before they can use them for certain purposes (Janssen et al., 2012). ...
... Data goods encompass manually and automatically created personal and commercial data, such as age, gender, purchase history, and IoT sensor data. Data-related services comprise capabilities such as data aggregation, analysis, and visualization (Roman & Stefano, 2016;Spiekermann, 2019). In our study, we focus on data marketplaces that act as independent intermediaries connecting two or more market participants (Stahl et al., 2016). ...
... The main actors involved in data trades are data providers offering data, and data consumers buying data. Marketplace providers offer an infrastructure that allows these actors to upload, discover, buy, and sell data (Spiekermann, 2019;Stahl et al., 2016). ...
Commercializing data and data-related services has gained in importance in recent years. Driven by digitalization and the Internet-of-Things (IoT), companies and individuals continuously generate vast amounts of data. Data marketplaces have emerged to support these data providers in selling their data to different data consumers. However, data marketplaces face challenges in different data governance decision domains that inhibit their adoption. To get a better understanding of how data marketplaces counteract these challenges, this paper develops a taxonomy of data governance decision domains in data marketplaces. We used a taxonomy development method to inspect 13 data marketplaces from eight countries. The resulting taxonomy shows an overview of mechanisms concerning data quality, data security, data architecture, metadata, data lifecycle, data storage, and data pricing. We discuss common instantiation patterns, highlight gaps, and propose possible solutions. The taxonomy sets a foundation for further research and theory-building on data marketplaces. Practitioners can use the taxonomy to develop customized data governance strategies for data marketplaces.
... The companies can create an immense business all over the globe, as one can envision, in any language using AI (Samsung, 2020). These real-time business models not only benefit the company but also make our lives simpler in many respects (Afuah & Tucci, 2001;Kiriyama, 2012;Spiekermann, 2019). In fact, the main objectives of such models have been to be customer oriented first and then plan how to make money (Meroño-Cer, 2014). ...
... It is definitely going to be the survival of the fittest, and the better the model, the better the business it is expected to give (Stefan et al., 2014). Teams are working day and night to make sure their models are at par with the current trend (Spiekermann, 2019). They are trying their level best to make the consumers happy and satisfied, because the consumers will not stay if they get a better offer elsewhere (Beierle et al., 2015;Rong et al., 2015;Meroño-Cerdán et al., 2014). ...
... They are trying their level best to make the consumers happy and satisfied, because the consumers will not stay if they get a better offer elsewhere (Beierle et al., 2015;Rong et al., 2015;Meroño-Cerdán et al., 2014). On the other hand, a team of analysts keeps working to find flaws within a model, how to overcome those flaws, and how to come up with a better model (Spiekermann, 2019). ...
This paper discusses the comparative analysis of different attributes of Google and Facebook business model and their novel features for handling innovative business framework. We have compared Google and Facebook business model on different key attributes and also discussed the statistical analysis of business models using Google business analytics platform. We have argued performance analysis of these models. One important point which we discuss and analyze in this paper is that a business model is not about just building revenue generating machine, but it is indeed more than that. It explores the strategy and business approaches of both the models of revenue generating line of attacks. Our research contributes a considerate understanding of Google and Facebook architectural model and its influence on business framework. Statistical enactment and results are analyzed, precisely when big data and media are applied. This paper also provides better understanding of the digital marketplace for both of the platforms and its earning methodology.
... Buyers can express their demand for digital goods and benefit from the range of suppliers and the simplicity of locating goods and services on marketplaces (Tkachuk et al. 2022). New IoT, AI, and blockchain technologies provide a basis for the data economy, and many growing marketplaces such as Dawex, IOTA, Databroker DAO, and Streamr have emerged (Spiekermann 2019;Palviainen and Suksi 2023). Digital marketplaces offer incentives to provide Data and Processing Capabilities (DPCs) as sellable data products that assist in the development of innovative DPC-based solutions such as water data solutions to aid cities in achieving the Sustainable Development Goals (SDGs) (Palviainen and Suksi 2023). ...
... Data marketplaces were seen to reduce the need for administrative effort, assist in the settlement of partnership contracts, and make existing or potential water data visible in the ecosystem. A similar observation is made in (Spiekermann 2019), which considers a marketplace to be a digital platform that handles market transactions and enables more efficient and straightforward interaction between market participants. The interviewees from the public sector organizations identified a need for varied pricing models for sellable data. ...
This paper studies the factors that affect the emergence of water data ecosystems using a case study as a research method. The study is based on interviews conducted with partners in a comprehensive business ecosystem focused on the development of smart water network management. Eleven representatives from six private companies, the waterworks of a city, and three organizations that provide water supply management services for municipalities were interviewed. The paper presents analysis of the interview results focusing on the interviewees’ thoughts on the state of water data systems in Finland and on the factors that affect the emergence of water data ecosystems in Finland.
The interview results indicate a clear need for water data ecosystems but also obstacles preventing their emergence. Inadequate understanding on the part of customers, a lack of water data, regulations, and underdeveloped agreements were seen to hinder the development of water data solutions. In addition to ecosystem development, the emergence of water data ecosystems requires investment and the development of water data solutions, solution concepts, and demonstrations to show the value of the ecosystem. The results show that ecosystems need a clear rationale and vision, effective management of water data sharing, and mechanisms to ensure the scalability of water data ecosystems.
... The network mode is characterized by lateral relationships between data ecosystem members, emphasizing social agreements and collaborative approaches (Otto & Jarke, 2019). The market mode has recently gained traction, where data is shared as a commercial product through formal contracts via data marketplaces (Spiekermann, 2019). Such marketplaces are a subtype of digital platforms that create value by connecting data providers with consumers, facilitating smooth data sharing, and maintaining a modular infrastructure for third-party providers to add additional offerings and services (Abbas et al., 2021;Fruhwirth et al., 2020;Spiekermann, 2019). ...
... The market mode has recently gained traction, where data is shared as a commercial product through formal contracts via data marketplaces (Spiekermann, 2019). Such marketplaces are a subtype of digital platforms that create value by connecting data providers with consumers, facilitating smooth data sharing, and maintaining a modular infrastructure for third-party providers to add additional offerings and services (Abbas et al., 2021;Fruhwirth et al., 2020;Spiekermann, 2019). We focus on the market mode, which is the most complicated setting for sovereignty issues. ...
In the data economy, data sovereignty is often conceptualized as data providers’ ability to control their shared data. While control is essential, the current literature overlooks how this facet interrelates with other sovereignty facets and contextual conditions. Drawing from social contract theory and insights from 31 expert interviews, we propose a data sovereignty conceptual framework encompassing protection, participation, and provision facets. The protection facets establish data sharing foundations by emphasizing baseline rights, such as data ownership. Building on this foundation, the participation facet, through responsibility divisions, steers the provision facets. Provision comprises facets such as control, security, and compliance mechanisms, thus ensuring that foundational rights are preserved during and after data sharing. Contextual conditions (data type, organizational size, and business data sharing setting) determine the level of difficulty in realizing sovereignty facets. For instance, if personal data is shared, privacy becomes a relevant protection facet, leading to challenges of ownership between data providers and data subjects, compliance demands, and control enforcement. Our novel conceptualization paves the way for coherent and comprehensive theory development concerning data sovereignty as a complex, multi-faceted construct.
... i.e., organizations which own data and offer them to others for a fee (Spiekermann 2019). • A public sector, PS, whose activity concerns the growing and training of scientists and the purchasing of data. ...
... Table 1 provides some further information. The list of features is based on the simple taxonomy by Spiekermann (2019). Regarding market positioning, both sides of the market are operated by data providers, i.e., the same actors involved in the trading. ...
This paper contributes to the understanding of the relationship between the nature of data and the artificial intelligence (AI) technological trajectories, on the one hand, and the dynamic processes triggered by demand during the evolution of an industry, on the other hand. We develop an agent-based model in which firms are data producers that compete on the markets for data and AI. The model is enriched by a public sector that fuels the purchase of data and trains the scientists that will populate firms as workforce. Through several simulation experiments, we analyze the determinants of each market structure, the corresponding relationships with innovation attainments, the pattern followed by labor and data productivity, the quality of data traded in the economy, and the ways in which demand affects innovation and the dynamics of industries. We question the established view in the literature of industrial organization according to which technological imperatives are enough to experience divergent industrial dynamics on both the markets for data and AI blueprints. Although technical change is necessary for any industry pattern to emerge, the actual unfolding is not the outcome of a specific technological trajectory, but the result of the interplay between technology-related factors and the availability of data-complementary inputs such as labor and AI capital, the market size, preferences, and public policies.
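To make the agent-based setup more concrete, below is a minimal, hypothetical sketch of the kind of simulation loop such a model implies: firms hold stocks of data and scientists, produce AI output with a simple production rule, and a public sector buys data each period. All functional forms and parameters are invented for illustration and do not reproduce the authors' specification.

```python
import random

random.seed(42)


class Firm:
    """Toy data-producing firm with a stock of data and a scientist workforce."""

    def __init__(self, name: str):
        self.name = name
        self.data = random.uniform(1.0, 10.0)   # stock of data
        self.scientists = random.randint(1, 5)  # labor input
        self.revenue = 0.0

    def produce(self) -> float:
        """Illustrative production rule: AI output grows with data and labor."""
        return self.data ** 0.5 * self.scientists ** 0.5


def simulate(periods: int = 20, public_data_budget: float = 5.0) -> list[Firm]:
    firms = [Firm(f"firm_{i}") for i in range(5)]
    for _ in range(periods):
        total_data = sum(f.data for f in firms)
        for firm in firms:
            # Income from selling AI output plus the public sector's data purchases.
            income = firm.produce() + public_data_budget * firm.data / total_data
            firm.revenue += income
            firm.data += 0.1 * income  # part of the income is reinvested in new data
    return sorted(firms, key=lambda f: f.revenue, reverse=True)


for firm in simulate():
    print(firm.name, round(firm.revenue, 1), round(firm.data, 1))
```

Even in this toy version, initial endowments of data and labor compound over periods, which hints at why market structure in the paper depends on complementary inputs and demand rather than on technology alone.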
... The national and EU-level data-related regulation is evolving, and there is a vast amount of data-type-specific, e.g., personal data under GDPR (EU 2016) and sector-specific regulations and the upcoming Data Governance Act (EU 2022), Data Act (European Commission 2022), and AI Act (European Commission 2021) that must be considered in data exchange and trade (Palviainen and Suksi 2023). Data productization requires legal and contractual frameworks to decrease legal uncertainty regarding trading data (Duch-brown et al. 2017;Spiekermann 2019;Bornholdt 2021). ...
The smart city infrastructures, such as digital platforms, edge computing, and fast 5G/6G networks, bring new possibilities to use near-real-time sensor data in digital twins, AR applications, and Machine-to-Machine applications. In addition, AI offers new capabilities for data analytics, data adaptation, event/anomaly detection, and prediction. However, novel data supply and use strategies are needed when going toward higher-granularity data trade, in which a high volume of short-term data products is traded automatically in dynamic environments. This paper presents offering-driven data supply (ODS), demand-driven data supply (DDS), event and offering-driven data supply (EODS), and event and demand-driven data supply (EDDS) strategies for high-granularity data trade. Computer simulation was used as a method to evaluate the use of these strategies in supply of air quality data for four user groups with different requirements for the data quality, freshness, and price. The simulation results were stored as CSV files and analyzed and visualized in Excel. The simulation results and SWOT-analysis of the suggested strategies show that the choice between the strategies is case-specific. DDS increased efficiency in data supply in the simulated scenarios. There was higher profit and revenues and lower costs in DDS than in ODS. However, there are use cases that require the use of ODS, as DDS does not offer ready prepared data for instant use of data. EDDS increased efficiency in data supply in the simulated scenarios. The costs were lower in EODS, but EDDS produced clearly higher revenues and profits.
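As a hedged illustration of the difference between offering-driven (ODS) and demand-driven (DDS) supply, the toy simulation below prepares a data product in every period (ODS) or only when a buyer request arrives (DDS) and compares preparation cost against sales revenue. Prices, costs, and the demand probability are invented parameters, not the values used in the paper's simulations.

```python
import random

random.seed(1)

PRICE = 2.0        # revenue per sold data product (assumed)
PREP_COST = 0.5    # cost of preparing one data product (assumed)
DEMAND_PROB = 0.3  # probability a buyer requests data in a period (assumed)


def simulate(strategy: str, periods: int = 1000) -> float:
    """Return profit of offering-driven ('ODS') vs demand-driven ('DDS') supply."""
    profit = 0.0
    for _ in range(periods):
        demand = random.random() < DEMAND_PROB
        if strategy == "ODS":
            profit -= PREP_COST              # data is prepared whether or not it sells
            if demand:
                profit += PRICE
        elif strategy == "DDS":
            if demand:
                profit += PRICE - PREP_COST  # data is prepared only on request
    return profit


print("ODS profit:", round(simulate("ODS"), 1))
print("DDS profit:", round(simulate("DDS"), 1))  # higher under these assumptions
```

The trade-off the paper identifies is visible even here: DDS avoids wasted preparation cost, but only ODS has ready-prepared data available for instant use when a request arrives.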
... Agahari et al. [2] and [3] offer a business perspective on MPC for data sharing, building on the business model for data marketplaces from [45]. They conduct semi-structured interviews in the privacy and security domain to study the perceived value propositions, architecture and financial models [2], as well as control, trust, and perceived risks [3]. ...
This paper explores the integration of advanced cryptographic techniques for secure computation in data spaces to enable secure and trusted data sharing, which is essential for the evolving data economy. In addition, the paper examines the role of data intermediaries, as outlined in the EU Data Governance Act, in data spaces and specifically introduces the idea of trustless intermediaries that do not have access to their users' data. Therefore, we exploit the introduced secure computation methods, i.e. Secure Multi-Party Computation (MPC) and Fully Homomorphic Encryption (FHE), and discuss the security benefits. Overall, we identify and address key challenges for integration, focusing on areas such as identity management, policy enforcement, node selection, and access control, and present solutions through real-world use cases, including air traffic management, manufacturing, and secondary data use. Furthermore, through the analysis of practical applications, this work proposes a comprehensive framework for the implementation and standardization of secure computing technologies in dynamic, trustless data environments, paving the way for future research and development of a secure and interoperable data ecosystem.
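Since the paper's secure computation building blocks (MPC, FHE) are named but not spelled out here, the following is a minimal sketch of additive secret sharing, one standard MPC primitive: each provider splits its input into random shares so that no single node sees the raw value, yet a joint sum can still be reconstructed. The three-node setting, field size, and variable names are illustrative assumptions rather than the framework proposed in the paper.

```python
import secrets

MODULUS = 2**61 - 1  # large prime modulus; the size is an illustrative choice


def share(value: int, num_parties: int) -> list[int]:
    """Split `value` into additive shares that sum to `value` mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(num_parties - 1)]
    last = (value - sum(shares)) % MODULUS
    return shares + [last]


def reconstruct(shares: list[int]) -> int:
    """Recombine shares; only possible when all parties cooperate."""
    return sum(shares) % MODULUS


# Two data providers secret-share their inputs among three compute nodes.
shares_a = share(120, 3)
shares_b = share(45, 3)

# Each node adds the shares it holds locally, never seeing 120 or 45.
sum_shares = [(a + b) % MODULUS for a, b in zip(shares_a, shares_b)]

print(reconstruct(sum_shares))  # 165: the joint result, without revealing the inputs
```

This is what makes a "trustless" intermediary conceivable: the intermediary can coordinate the computation on shares without ever holding its users' plaintext data.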
... This stream of discussions represents an emerging but important direction for research, spanning a few streams of discussion, viz., data liquidity, data commodity, data business models, data marketplaces, and data valuation, which signals a diverse coverage of the topic, often cross-disciplinary. Under this research theme, data trading emerges as the main scene; the value creation process, mostly inter-organizational, involves a two-sided market coordination, namely, buyers and sellers (Parvinen et al., 2020;Spiekermann, 2019). Investigations also include data-driven business models or service offerings (Alfaro et al., 2019;Lange, Drews and Höft, 2021;Najjar and Kettinger, 2013;Schüritz, Seebacher and Dorner, 2017;Ye et al., 2021) as well as data pricing strategies or policies (Chen and Huang, 2016;Mehta et al., 2021). ...
Despite the substantial body of evidence detailing the multifaceted use of data within organizations, the conceptualizations of data and their value propositions remain disjoint and require updating. Information Systems scholars contend that the traditional ways of conceiving data now appear inadequate in framing this ever-evolving data-driven phenomenon. In this context, we argue for a reassessment of the fundamental assumptions about data in the field. This paper offers a comprehensive literature review, through which we conceptualize the role of data into four distinguishable types: data as a tool, as a commodity, as a practice, and as algorithmic intelligence. Each type possesses a set of identifiable characteristics, usage, and unique pathways of value creation. Together these elements form a typology, which is aimed to provide an explanation for the intricate and complex nature of data use in organizations and the diverse sources of their value.
Key words: data use, data artifacts, data value, analytics and AI value, IT artifacts.
... Although not explicitly mentioned in the DGA, DMs and DSPs fall under broad regulatory parameters. DMs may differ in governance and structural arrangements according to various factors: accessibility, domain specificity, technical architecture, and business models related to pricing and revenue (Spiekermann, 2019). ...
Data are a strategic asset for organizations in both the private and public sectors that spans multiple domains and sectoral boundaries. For innovation ecosystems, the ability to frictionlessly exchange data across borders between stakeholders for better decision-making, predictive capability, and automation represents a competitive advantage in the market. Data are also inputs for providing and receiving services online. Recent regulations such as the Data Governance Act (DGA) have placed the role of data intermediaries for cross-border data sharing at the forefront. However, the impact of the regulation on small- and medium-sized enterprises (SMEs) and the role of data intermediaries are still uncertain. This exploratory study investigated these dynamics by focusing on the perspective of SMEs in the Nordic-Baltic region through a sense-making policy and regulatory impact analysis. SMEs face significant legal uncertainties under the DGA, which impact cross-border uptake. The silver economy is a prime cross-sectoral market for cross-border data sharing, and established data intermediary solutions in the region could be leveraged to achieve innovation in this area.
... They conclude that PETs are not frequently used in this setting, despite relevant use cases; and that there is no consensus on a general architecture, in particular regarding the usage of blockchain. (Agahari et al., 2021) and (Agahari et al., 2022) offer a business perspective on MPC for data sharing, building on the business model for data marketplaces from (Spiekermann, 2019). They conduct semi-structured interviews in the privacy and security domain to study the perceived value propositions, ar-chitecture and financial models (Agahari et al., 2021), as well as control, trust, and perceived risks (Agahari et al., 2022). ...
... Green development, characterized by high efficiency, low pollution, and low energy consumption, has been included in the five development concepts in China (Zeng and Gu 2023), which requires the joint efforts of labor, capital, land, knowledge, technology, management, and other production factors, especially the data factor (Spiekermann 2019;Koutroumpis et al. 2020). In the era of data technology, China's digital economy has gradually become an important component and growth engine of the national economy. ...
Improving carbon productivity is of great significance to China’s “30 · 60” carbon target, while the development of the digital economy is a driving force for green transformation. However, few studies discuss the relationship between the digital economy and carbon productivity. We investigate the influence of digital economic development on carbon productivity using panel data from 30 Chinese provinces from 2011 to 2020. Spatial econometric and moderating effects are considered. The results show that (i) digital economy has a positive direct and negative spatial spillover effect on carbon productivity, and this conclusion is still valid after the robustness test and endogeneity test; (ii) digital infrastructure has a greater impact on carbon productivity than digital industrialization and industrial digitalization; (iii) the mechanism analysis shows that environmental regulation negatively moderates the relationship between the digital economy and carbon productivity; (iv) heterogeneity analysis shows that the effect of the digital economy on carbon productivity is more obvious in the central region compared to the western region, while it is not significant in the eastern region. Overall, this paper not only provides a new analytical perspective for understanding the improvement of carbon productivity in the digital economy but also provides policy inspiration for promoting carbon peak and carbon neutrality goals.
... In order for data sharing to scale, alignment is required between the business models of the various participants of a data sharing business ecosystem: data providers, data consumers, intermediary platforms, software and services providers. Various studies have shown that this alignment is lacking in many data ecosystems because of different perceptions between participants on the value of data [9][10][11] and the revenue sharing models [12,13]. ...
Potential economic benefits of data sharing have been estimated and described in many reports and research. However, data sharing initiatives are limited by a lack of alignment on the business models for data sharing among stakeholders. In this research we aim specifically to investigate the perceptions of the stakeholders of such data sharing platforms on business models for data sharing to identify ways to foster alignment on business models. In our qualitative exploratory analysis (interviews and focus groups with stakeholders involved in an EU-funded project, MobiDataLab) we identify 6 criteria used by the stakeholders to compare business models and rank 5 archetypal business models according to them. We conclude with insights useful for business model design for mobility data sharing platforms and possible future research.
... For instance, prior research discussed GDPR and HIPAA's impact on wearable health devices [18]. Furthermore, research by Spiekermann [25] outlines challenges related to trust, security, and the lack of established regulatory frameworks for data trading. ...
This study outlines essential elements needed to develop a Health Data Marketplace (HDM) by building upon an existing data platform in Norway. A comprehensive framework is proposed that accounts for technical, legal, financial, and additional considerations. The results highlight the pivotal roles of key HDM actors - Marketplace Operators, Marketplace Users, and Legal Authorities - and emphasize critical enablers such as Data Standardization, Interoperability, Integration, Security, Trust, and Legal Frameworks. Such a marketplace has the potential to catalyze the effective, secure, and ethical use of health data, contributing to enhanced healthcare outcomes, research, and innovation.
... Data marketplaces are platforms that enable the matching of the supply and demand of data or data products/services. These platforms act as 'neutral intermediaries' in data flows as (i) they do not actively intervene in data value chains but solely facilitate the matching of supply and demand (Spiekermann, 2019), and (ii) the data intermediation service is open to any third party that respects the terms and conditions of the intermediary and the legal framework (see Box 3 above). Hence, data marketplaces fall under the scope of Chapter III of the DGA and are considered a prime example of data intermediaries in that legal text. ...
The report provides a landscape analysis of key emerging types of data intermediaries. It reviews and syntheses current academic and policy literature, with the goal of identifying shared elements and definitions. An overall objective is to contribute to establishing a common vocabulary among EU policy makers, experts, and practitioners. Six types are presented in detail: personal information management systems (PIMS), data cooperatives, data trusts, data unions, data marketplaces, and data sharing pools. For each one, the report provides information about how it works, its main features, key examples, and business model considerations. The report is grounded in multiple perspectives from sociological, legal, and economic disciplines. The analysis is informed by the notion of inclusive data governance, contextualised in the recent EU Data Governance Act, and problematised according to the economic literature on business models.
The findings highlight the fragmentation and heterogeneity of the field. Data intermediaries range from individualistic and business-oriented types to more collective and inclusive models that support greater engagement in data governance, while certain types do aim at facilitating economic transactions between data holders and users, others mainly seek to produce collective benefits or public value. In the conclusions, it derives a series of take-aways regarding main obstacles faced by data intermediaries and identifies lines of empirical work in this field.
... Data marketplace. Data trading involves a two-sided market, buyers and sellers (Spiekermann, 2019;Parvinen et al., 2020). While an information gap may exist between them (Agarwal, Dahleh and Sarkar, 2019), in some cases, a demonstration of data offerings would help seal a deal (Ray, Menon and Mookerjee, 2020). ...
Information Systems researchers have sought to demonstrate the strategic value of data in organizations through robust evidence. Over time, the ways data benefit organizations have evolved and become more diverse, yet definitions of data and their value propositions have not kept up and remain disconnected. The field still lacks a clear understanding of the various roles data play in organizations and how to define them. This paper presents a comprehensive review of related literature in the Information Systems and Management fields from the past two decades. We first conduct a systematic literature search and organize them into key research themes by the purpose of data use. We then propose a reconceptualization of data that takes into account their distinct features. Our aim is to provide an explanation for the unique nature of data and the diverse sources of their value in organizations.
... Data marketplaces provide an infrastructure for trading data and data-related services, i.e. data-driven services often use or operate on data marketplaces. Data marketplaces are expected to have high impact on data-related business activities that transform the data economy (Spiekermann, 2019). A digital ecosystem "is an interdependent group of enterprises, people and/or things that share standardized digital platforms for a mutually beneficial purpose" (Gartner Group, 2017). ...
In many industrial sectors, the current digitalization trend resulted in new products and services that exploit the potential of built-in sensors, actuators, and control systems. The business models related to these products and services usually are data-driven and integrated into digital ecosystems. Quantified products (QP) are a new product category that exploits data of individual product instances and fleets of instances. A quantified product is a product whose instances collect data about themselves that can be measured or, by design, leave traces of data. The QP design has to consider what dependencies exist between the actual product, services related to the product, and the digital ecosystem of the services. By investigating three industrial case studies, the paper contributes to a better understanding of typical features of QP and the implications of these features for the design of products and services. For this purpose, we combine the analysis of features of QP potentially affecting design with an analysis of dependencies between features. The main contributions of the work are (1) three case studies describing QP design and development, (2) a set of recurring features of QPs derived from the cases, and (3) a feature model capturing design dependencies of these features.
... An intuitive approach toward this problem of valuable and scarce information for data sharing and reusing is data marketplaces [9]. However, a simple selling platform for datasets and models has limitations, such as the problem of eroding property claims. ...
... The growing demand to unleash the full potential of the Data Economy has led to the emergence of data marketplaces: multi-sided platforms that facilitate business data exchange among enterprises (Spiekermann, 2019). This phenomenon is particularly evident in the European context, where efforts to strengthen the European Data Economy have accelerated the proliferation of these marketplaces (European Commission, 2020). ...
The landscape of platform ecosystems is becoming increasingly complex, with new types of platforms emerging that glue together otherwise fragmented ecosystems. One recent case is meta-platforms that can contribute to the European Data Economy by interconnecting data marketplaces; however, meta-platforms may intensify data sovereignty concerns: the inability of data providers to own and control the exchanged data. While smart contracts and certification can generally enhance data sovereignty, it is unknown whether data providers perceive these control mechanisms as valuable in the complex meta-platform setting. This study aims to evaluate the perceived efficacy of the control mechanisms to ensure data sovereignty in meta-platforms. The findings from a survey study (n=93) indicate that
respondents perceive high data sovereignty. One potential explanation is that smart contracts can potentially enable providers to maintain ownership and control over their exchanged data; meanwhile, certification may signal meta-platforms’ responsibility to deliver secure data exchange infrastructure and assist providers in adhering to relevant regulations. This study contributes to advancing design knowledge for meta-platforms, showcasing that meta-platforms can be designed in a way to resolve fragmentation without neglecting data sovereignty principles.
... Unfortunately, organizations still face obstacles regarding data sharing (Fassnacht et al. 2023;Heinz et al. 2022;Kraemer et al. 2021) and refrain from participating in data spaces. This may be due to the fact that many platforms have faced failures in recent years (Özcan et al. 2022;Spiekermann 2019). Despite this reluctance in adopting data spaces, they have been identified as a solution for data sovereignty (Haße et al. 2020;Hutterer and Krumay 2022;Winter et al. 2022) and have been recognized as a promising approach for increasing global data availability (Schleimer et al. 2023). ...
In the data economy, data has become an essential strategic resource for gaining a competitive advantage. Data spaces represent a relatively new phenomenon aimed at encouraging businesses to fully leverage the potential of data. Despite various approaches to definition, there remains a lack of clarity surrounding the conceptualization of data space, its perceived value, and the factors that drive its adoption. The conceptual ambiguity and synonymous usage of the term in academic and business literature present significant obstacles to targeted conceptualization and use. This paper addresses these issues by proposing primary properties of data space and contributes to the field by applying a semantic decomposition. Through this approach, we identified data space as having the following conceptual aspects: Nature, Element, Function, Utility, and Governance. These primitives highlight the growing need for security and privacy when sharing interorganizational data. In addition, we offer an initial definition of data spaces.
This article is a study substantiating the relationship between Ukraine's economic potential and the opportunities for monetizing digital services in the European Union. It analyzes the impact of technological innovation on the development of the IT sector in Ukraine and examines the various monetization models used in the EU with a view to adapting them to the Ukrainian context. Particular attention is paid to the legislative initiatives regulating the digital market and their influence on the development of the digital economy in Ukraine. The results underscore Ukraine's significant potential for integration into the European digital space, which could foster economic growth and enhance the country's competitiveness. The purpose of the article is to analyze Ukraine's economic potential in the context of digital services monetization, with an emphasis on integration into the European digital market. The impact of technological innovation on the development of the IT sector, a key driver of economic growth, is examined. The scientific novelty of the work lies in the systematization of the digital service monetization models used in the EU and their adaptation to the Ukrainian context. The article also highlights the role of legislative initiatives such as the GDPR, DSA, and DMA in shaping the conditions for the development of digital business. The practical significance of the results lies in the development of recommendations for Ukrainian IT companies on effective integration into the European market, including the export of IT services, outsourcing, FinTech development, and the attraction of venture capital. The study also formulates recommendations for Ukrainian IT companies on optimizing monetization strategies and adapting to European conditions. The article proposes ways of using EU financial and institutional resources, such as venture capital, to support and develop innovative projects. The main conclusions indicate Ukraine's significant potential for integration into the European digital space through the adoption of innovative technologies and adaptation to current global trends, which could enhance the country's competitiveness and economic growth. Future research should focus on a deeper analysis of digital service monetization mechanisms in the EU and their adaptation to the Ukrainian market.
Data markets offer a mechanism for augmenting the data supply, particularly in data-scarce domains such as personalized medicine and services. Potential data sellers are incentivized and are expected to enter the market. A significant challenge for data buyers in these markets is the selection of the most valuable data points from a data seller. It is imperative to control data quality to ensure that the service offerings meet the user’s requirements. In DQ management, inspection is the process of measuring, examining, and testing to evaluate one or more characteristics of data goods and to compare these against specified requirements to ascertain conformity. Inspection can also be applied to products, processes, and other outcomes within data goods to ensure that the provided data goods are accurate and in compliance with specifications.
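The inspection step described in this abstract lends itself to a short illustration. The snippet below is a rough sketch under assumed field names and thresholds: it measures completeness and range conformity of offered data records and compares them against a buyer-specified requirement; real marketplaces would negotiate far richer specifications.

```python
from dataclasses import dataclass


@dataclass
class Requirement:
    required_fields: tuple[str, ...]              # fields every record must contain
    min_completeness: float                       # share of non-missing values required
    value_ranges: dict[str, tuple[float, float]]  # allowed numeric ranges per field


def inspect(records: list[dict], req: Requirement) -> dict:
    """Measure a few quality characteristics and compare them to the requirement."""
    total_cells = len(records) * len(req.required_fields)
    filled = sum(
        1 for r in records for f in req.required_fields if r.get(f) is not None
    )
    completeness = filled / total_cells if total_cells else 0.0
    out_of_range = sum(
        1
        for r in records
        for f, (lo, hi) in req.value_ranges.items()
        if r.get(f) is not None and not (lo <= r[f] <= hi)
    )
    return {
        "completeness": completeness,
        "out_of_range": out_of_range,
        "conforms": completeness >= req.min_completeness and out_of_range == 0,
    }


req = Requirement(("sensor_id", "pm25"), 0.95, {"pm25": (0, 500)})
sample = [{"sensor_id": 1, "pm25": 12.3}, {"sensor_id": 2, "pm25": 612.0}]
print(inspect(sample, req))  # conforms == False: one reading is outside the range
```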
Despite the substantial body of evidence detailing the multifaceted use of data within organizations, the conceptualizations of data and their value propositions remain disjointed and require updating. Information Systems scholars contend that the traditional ways of conceiving data now appear inadequate in framing this ever-evolving data-driven phenomenon. In this context, we argue for a reassessment of the fundamental assumptions about data in the field. This paper offers a comprehensive literature review, through which we conceptualize the role of data into four distinguishable types: data as a tool, as a commodity, as a practice, and as algorithmic intelligence. Each type possesses a set of identifiable characteristics, usage, and unique pathways of value creation. Together these elements form a typology, which provides an explanation for the intricate and complex nature of data use in organizations and the diverse sources of their value.
Data are undoubtedly a contested commodity. On the one hand, data commodification is largely under way including through the operation of law. This is notably visible with the new EU data policy (Data Strategy notably followed by the Data Act, the Data Governance Act) that aims to establish data markets in keeping with European values. On the other hand, this phenomenon is heavily contested based on a wide range of different arguments, which have not been subjected to a systematic clustering. Data commodification is often understood simplistically as a binary and monolithic phenomenon whereby data would be either ‘commodified’ or not. This leads to misunderstandings of this phenomenon and of the ways in which it manifests. For example, many conceptual misunderstandings surround the relationship between ‘data access’ or ‘data sharing’ and data commodification and markets. This paper clarifies the phenomenon of data commodification, by approaching it as a spectrum with degrees following M. J. Radin (Contested Commodities, 1996). Following Radin, the paper clusters the different data governance arguments – or even paradigms – found in the literature along a data commodification spectrum. A specific attention is paid to the importance of market discourses in commodification dynamics, especially on the law. The paper offers a novel systematic synthesis of the data governance normative arguments, including data commons, data trusts, data sharing, data intermediation, against the background of the data commodification phenomenon. This synthesis brings conceptual clarity and allows to bring together different strands of the data literature (especially welfare economics, law and economics, commons, critical data studies, infrastructure studies) comprehensively, while they have until now remained siloed. Data governance normative arguments would greatly benefit from taking into account (conceptual and/or normative) arguments found in other strands of the literature. The paper can also be used to evaluate how data governance arguments or legislations relate to data commodification and take into account the specificities of data, thus enabling for more systematic analysis. The paper finds that data is actually a very contested commodity: The very conceptualization of data as a commodity is ontologically contestable. The framework in which situations are conceptualized – whether as data market ones or not – can indeed play a powerful but often invisible discursive role on commodification dynamics. Finally, the very identification and regulation of ‘data’ alone necessarily brings about certain commodification affordances.
With the proliferation of data and advanced analytics, organizations are increasingly recognizing the potential value of sharing data across organizational boundaries. However, there is a lack of empirical evidence and systematic frameworks to guide the design of effective data sharing practices. Realizing the full potential of data sharing requires the effective design and implementation of data sharing practices by considering the interplay of data, organizational structures, and network dynamics. This study presents an empirically and theoretically grounded taxonomy of data sharing practices drawing on existing literature and real-world data sharing cases. The subsequent cluster analysis identifies four generic archetypes of data sharing practices, differing in their primary orientation toward compliance, efficiency, revenue, or society. From a theoretical perspective, our work conceptualizes data sharing practices as a foundation for a more systematic and detailed exploration in future research. At the practitioner level, we enable organizations to strategically develop and scale data sharing practices to effectively leverage data as a strategic asset.
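The archetype-deriving cluster analysis can be illustrated with a minimal sketch: assuming, purely for illustration, that each data sharing case is coded as a binary vector over a handful of taxonomy dimensions, a standard k-means run with four clusters yields candidate archetypes. The dimensions and cases below are invented and do not reproduce the study's coding scheme or data.

```python
# Minimal, illustrative sketch (not the study's actual coding scheme):
# data sharing cases are coded as binary vectors over hypothetical taxonomy
# dimensions and partitioned into four candidate archetypes with k-means.
import numpy as np
from sklearn.cluster import KMeans

DIMENSIONS = [
    "regulatory_driver",   # sharing mandated or encouraged by regulation
    "monetized",           # data offered against payment
    "open_access",         # data released publicly
    "multilateral",        # more than two sharing parties
    "realtime",            # continuous or streaming exchange
]

# Hypothetical cases coded along the dimensions above.
cases = {
    "compliance_report_exchange": [1, 0, 0, 0, 0],
    "supply_chain_pooling":       [0, 0, 0, 1, 1],
    "data_marketplace_offering":  [0, 1, 0, 1, 0],
    "open_mobility_data":         [0, 0, 1, 1, 0],
    "regulatory_sandbox_feed":    [1, 0, 0, 1, 1],
    "api_subscription_product":   [0, 1, 0, 0, 1],
    "research_data_donation":     [0, 0, 1, 0, 0],
    "audit_trail_sharing":        [1, 0, 0, 0, 1],
}
assert all(len(v) == len(DIMENSIONS) for v in cases.values())

X = np.array(list(cases.values()), dtype=float)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

for cluster in range(4):
    members = [name for name, lab in zip(cases, labels) if lab == cluster]
    print(f"archetype candidate {cluster}: {members}")
```

In practice the feature coding would follow the taxonomy's actual dimensions, and the resulting clusters would be interpreted against the compliance, efficiency, revenue, and society orientations reported in the study.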
Data platforms enable actors to exchange personal and business data. While data is relevant for any digital platform, data platforms revolve exclusively around data artifacts. This paper argues that the specific characteristics of data artifacts challenge our understanding of platform openness. Specifically, it is argued that data artifacts are editable, interactive, and distributable, which means that the consequences of opening up a data platform extend far beyond the focal platform and its context. From this, the study infers that the scope of platform openness extends beyond the data platform on which data artifacts originate. At the same time, the very nature of data artifacts affords new mechanisms for realizing openness and for reducing its risks. New avenues are suggested for studying platform openness in the realm of data platforms. These avenues include (1) exploring and incorporating novel consequences of platform openness in a data platform setting, (2) examining new arenas for defining openness beyond a focal platform’s confines, and (3) theorizing the implications of new mechanisms for realizing openness while maintaining apparent control over data artifacts.
Starting from the notions of onerousness, price, and consideration in the general theory of obligations and contracts, this text analyses the question of payment with personal data in consumer contracts under Spanish and Portuguese law. Despite the legislator's reluctance to use the word "consideration" in this context, it is argued that we are dealing with payment with personal data whenever three conditions are met: (i) the data processing is based on the consent of the data subject (the consumer); (ii) the trader responsible for the processing is bound by the consumer's (data subject's) consent in such a way that the synallagma do (res) ut des (data) arises; and (iii) the consumer's liability is a consequence of the freedom to consent. From this last point it follows that the mere reciprocal restitution of performances cannot be regarded as a detriment to the withdrawal of consent.
Data integration, which aims to solve problems and create new services by combining datasets, has attracted considerable attention. Discovering similar datasets that can be combined is critical. In the literature on similar dataset discovery, it is important to select an appropriate discovery method for each information need, such as the domain. However, previous studies have evaluated discovery methods under differing conditions, such as domains, test datasets, and evaluation metrics, which prevents the appropriate method from being selected for each situation. Furthermore, the specific effects of combining different methods are not well known, even though previous studies argue for the importance of such combinations. This study attempts to understand (1) which similarity indicators should be employed for each domain and (2) how combining different indicators affects performance. We evaluated 16 inter-dataset clustering models based on different metadata-based similarity indicators, using unified evaluation metrics and datasets for 15 domains. Our results (1) suggest which similarity indicators should be used for each domain and (2) demonstrate that most combinations of different methods can improve clustering performance.
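As a rough illustration of metadata-based similarity indicators and their combination, the sketch below computes a tag-overlap indicator and a title-similarity indicator for a few invented dataset descriptions and blends them with assumed weights; the datasets, indicators, and weights are illustrative and are not those evaluated in the study.

```python
# Illustrative sketch of metadata-based similarity indicators and their
# weighted combination for similar dataset discovery; all metadata below
# is invented for demonstration purposes.
from difflib import SequenceMatcher
from itertools import combinations

datasets = {
    "air_quality_berlin":  {"title": "Berlin air quality measurements",
                            "tags": {"air", "environment", "berlin"}},
    "air_quality_hamburg": {"title": "Hamburg air quality measurements",
                            "tags": {"air", "environment", "hamburg"}},
    "bike_sharing_trips":  {"title": "Bike sharing trip records",
                            "tags": {"mobility", "bike", "trips"}},
    "charging_stations":   {"title": "Electric vehicle charging stations",
                            "tags": {"mobility", "energy", "ev"}},
}

def tag_overlap(a: set, b: set) -> float:
    """Jaccard overlap of keyword tags (one possible similarity indicator)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def title_similarity(a: str, b: str) -> float:
    """Character-based similarity of dataset titles (another indicator)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def combined(d1: dict, d2: dict, w_tags: float = 0.6, w_title: float = 0.4) -> float:
    """Weighted combination of the two indicators (weights are assumptions)."""
    return (w_tags * tag_overlap(d1["tags"], d2["tags"])
            + w_title * title_similarity(d1["title"], d2["title"]))

# Pairwise combined similarities; a clustering step would group high-scoring pairs.
for (n1, d1), (n2, d2) in combinations(datasets.items(), 2):
    print(f"{n1} ~ {n2}: {combined(d1, d2):.2f}")
```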
The onerous exchange of data is a reality that needs to be regulated by the European Union. Despite the reluctance expressed by the EDPS and the EDPB during the debate on the current EU Directive 2019/770, the scientific literature is beginning to accept, and indeed demand, that the rules take into account the reality that data are another economic asset. An example of data functioning as an economic asset is the existence of data marketplaces, in which different parties exchange data packages in the same way as goods and services are exchanged in other markets. Data marketplaces face two major regulatory challenges: on the one hand, the aforementioned need for regulation to accept that data function as an economic asset; on the other hand, their adaptation, like other digital markets, to the rules established by the Digital Markets Act.
Keywords: data, personal data protection, data marketplaces, Digital Markets Act, GDPR
In this paper, we offer an original framework for studying Artificial Intelligence (AI). The perspective we propose is based on the idea that AI is a system technology and that a useful description of AI cannot abstain from mapping the components of the system, their interdependence, and how the synergies they create fundamentally shape the directions of AI development. We adopt the concept of the Large Technical System (LTS) to give substance and structure to our idea. Using LTS, we are able to scaffold AI and the forces at work steering its production, deployment, and evolution. We find that AI as a system shares essential features with infrastructural technologies such as the Internet. The LTS framework proves very useful for capturing important nuances of the technology, and it allows us to trace the connections and cross-influences among its constituting domains: algorithms (software), compute (hardware), and data. We compare our proposed framework with other concepts usually associated with radical innovations and suggest in which respects AI differs from these ideal types. We consider ours a timely exercise, as we witness the formation of an AI industry. While still in the making, this industry is rapidly ossifying, together with its specific problems, power imbalances, and development scenarios; the focus on the system-ness of AI allows us to uncover the deeper structure of this technological breakthrough.
Data-driven markets depend on access to data as a resource for products and services. Since the quality of information that can be drawn from data increases with the available amount and quality of the data, businesses involved in the data economy have a great interest in accessing data from other market players. However, companies still appear to be reluctant to share their data. Therefore, the key question is how data sharing can be incentivized. This article focuses on data sharing platforms, which are emerging as new intermediaries and can play a vital role in the data economy, as they may increase willingness to share data. By comparing data sharing to the exchange of patents based on the FRAND principles, this article suggests a possible way for self-regulation to provide more transparency and fairness in the growing markets for data sharing.
Digital transformation implies the development of data-driven business models and thus the management of data goods. While marketplaces for data are being established and platforms for the exchange of data are being created, companies have to adapt their data management to increasing requirements. One central question can be deduced: how can data goods be described in a standardized way? This paper describes the development of the metadata model for data goods (M4DG). The M4DG, based on an analysis of existing data marketplaces and of metadata models from related areas, makes it possible to describe data sources with defined properties. This creates a unified understanding of the properties of data goods, facilitating their selection and trading. We are convinced that the M4DG will contribute to the practical design of data management.
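Since the abstract does not enumerate the actual M4DG properties, the following sketch only illustrates the general idea of a standardized, machine-readable description of a data good; every attribute name in it is an assumption made for illustration rather than part of the published model.

```python
# Sketch of a standardized description of a data good, in the spirit of a
# metadata model such as M4DG; the attribute set is assumed for illustration.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class DataGood:
    identifier: str
    title: str
    provider: str
    domain: str                  # e.g. mobility, energy, retail
    update_frequency: str        # e.g. "hourly", "daily", "static"
    data_format: str             # e.g. "CSV", "JSON", "Parquet"
    license_terms: str           # usage rights attached to the good
    price_model: str             # e.g. "subscription", "pay-per-use"
    quality_indicators: dict = field(default_factory=dict)

offer = DataGood(
    identifier="dg-0001",
    title="Hourly traffic counts, city ring road",
    provider="Example Mobility Ltd.",
    domain="mobility",
    update_frequency="hourly",
    data_format="CSV",
    license_terms="non-exclusive, internal analytics only",
    price_model="subscription",
    quality_indicators={"completeness": 0.97, "latency_minutes": 15},
)

# A uniform, machine-readable description eases comparison, selection, and trading.
print(json.dumps(asdict(offer), indent=2))
```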
Digitalization of the economy requires enterprises from all industries to revisit their current business models and prepare their organizations for the digital age. One task is the (re-)design of hybrid and digital products and services. The foundation for this is the improved interchangeability of data and the availability of external data sources through data markets and platforms. This creates the need for structured decision making when mapping data sources to digital products. In order to successfully transform their business and develop valuable new products, companies require methodological help. This paper proposes a high-level conceptual model for assessing the value of data sources. It consists of an approach for comparing data sources based on a common description of the data, with individually defined metrics enabling a benchmarking process. The development of the model and its practicability have been validated in a case study with an industrial partner.
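A minimal sketch of such an assessment, assuming a common description of candidate data sources and individually weighted metrics on a common 0-1 scale, might look as follows; the metrics, weights, and scores are illustrative and are not taken from the paper or its case study.

```python
# Illustrative benchmark of candidate data sources for a digital product:
# each source is scored against individually weighted metrics on a 0-1 scale.
sources = {
    "sensor_feed_vendor_a": {"coverage": 0.9, "timeliness": 0.8, "cost_efficiency": 0.4},
    "open_data_portal":     {"coverage": 0.6, "timeliness": 0.3, "cost_efficiency": 1.0},
    "partner_crm_extract":  {"coverage": 0.7, "timeliness": 0.6, "cost_efficiency": 0.7},
}

# Weights express how much each metric matters for the product at hand.
weights = {"coverage": 0.5, "timeliness": 0.3, "cost_efficiency": 0.2}

def value_score(metrics: dict, weights: dict) -> float:
    """Weighted aggregate used to rank candidate data sources."""
    return sum(weights[m] * metrics[m] for m in weights)

ranking = sorted(sources, key=lambda s: value_score(sources[s], weights), reverse=True)
for name in ranking:
    print(f"{name}: {value_score(sources[name], weights):.2f}")
```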
The survey presented in this work investigates emerging markets for data and is the third of its kind, providing a deeper understanding of this emerging type of market. The findings indicate that data providers focus on a limited set of business models and that data remains individualized and differentiated. Nevertheless, a trend towards commoditization for certain types of data can be foreseen, which allows an outlook on further developments in this area.
Mobile computing and the Internet of Things promise massive amounts of data for big data analytics and machine learning. A data sharing economy is needed to make these data available to companies that wish to develop smart systems and services. While digital markets for trading data are emerging, there is no consolidated understanding of how to price data products and thus offer data vendors incentives for sharing data. This paper uses a combined keyword search and snowballing approach to systematically review the literature on the pricing of data products offered on marketplaces. The results give insights into the maturity and character of data pricing. They enable practitioners to select a pricing approach suitable for their situation and researchers to extend and mature data pricing as a topic.
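To make the subject concrete, the sketch below implements three pricing schemes that recur in the data pricing literature (flat rate, usage-based, and tiered pricing); the selection of schemes and all numbers are assumptions for illustration and do not reproduce the paper's findings.

```python
# Illustrative pricing schemes for a data product; all rates are assumptions.
def flat_rate(records: int, monthly_fee: float = 500.0) -> float:
    """Fixed subscription price, independent of consumption."""
    return monthly_fee

def pay_per_use(records: int, price_per_1k: float = 0.75) -> float:
    """Volume-based price that scales with the number of records consumed."""
    return records / 1000 * price_per_1k

def tiered(records: int) -> float:
    """Step-wise price: cheaper marginal rates for larger volumes."""
    tiers = [(100_000, 1.00), (1_000_000, 0.50), (float("inf"), 0.25)]
    cost, lower = 0.0, 0
    for upper, rate_per_1k in tiers:
        in_tier = max(0, min(records, upper) - lower)
        cost += in_tier / 1000 * rate_per_1k
        if records <= upper:
            break
        lower = upper
    return cost

for volume in (50_000, 500_000, 5_000_000):
    print(f"{volume:>9} records | flat {flat_rate(volume):8.2f} | "
          f"per-use {pay_per_use(volume):8.2f} | tiered {tiered(volume):8.2f}")
```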
Abstract
Trading in data is establishing itself as an increasingly important economic sector, in which data marketplaces play a key role as trading platforms. Accordingly, research on data marketplaces is growing, and new research areas and directions are being identified and pursued by different research teams across disciplines. This article provides, for the first time, an overview of current research on data marketplaces in different disciplines. It analyses which topics are being researched and which research areas remain largely untouched; in addition, research from related areas is compared and the overall context is outlined.
This article introduces a dynamic cloud-based marketplace of near-realtime human sensing data (MARSA) for different stakeholders to sell and buy near-realtime data. MARSA is designed for environments where information technology (IT) infrastructures are not well developed but the need to gather and sell near-realtime data is great. To this end, we present techniques for selecting data types and managing data contracts based on different cost models, quality of data, and data rights. We design our MARSA platform by leveraging different data transferring solutions to enable an open and scalable communication mechanism between sellers (data providers) and buyers (data consumers). To evaluate MARSA, we carry out several experiments with the near-realtime transportation data provided by people in Ho Chi Minh City, Vietnam, and simulated scenarios in multicloud environments.
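The sketch below illustrates, under assumptions, how an offer that carries a cost model, quality-of-data attributes, and data rights could be matched against a buyer request before a contract is formed; it is not the MARSA implementation, and all names and values are invented.

```python
# Hypothetical matching of a near-realtime data offer against a buyer request;
# this is a sketch of the general idea, not the MARSA platform's contract model.
from dataclasses import dataclass

@dataclass
class DataOffer:
    data_type: str
    price_per_minute: float       # simple usage-based cost model (assumption)
    max_latency_seconds: int      # quality-of-data attribute
    allowed_use: set              # data rights granted by the provider

@dataclass
class BuyerRequest:
    data_type: str
    budget_per_minute: float
    required_latency_seconds: int
    intended_use: str

def can_contract(offer: DataOffer, request: BuyerRequest) -> bool:
    """True only if data type, price, latency, and rights are all compatible."""
    return (offer.data_type == request.data_type
            and offer.price_per_minute <= request.budget_per_minute
            and offer.max_latency_seconds <= request.required_latency_seconds
            and request.intended_use in offer.allowed_use)

offer = DataOffer("traffic_speed", 0.05, 30, {"routing", "analytics"})
request = BuyerRequest("traffic_speed", 0.10, 60, "routing")
print(can_contract(offer, request))  # True under these example values
```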
Trading data as a commodity has become increasingly popular in recent years, and data marketplaces have emerged as a new business model where data from a variety of sources can be collected, aggregated, processed, enriched, bought, and sold. They are effectively changing the way data are distributed and managed on the Internet. To get a better understanding of the emergence of data marketplaces, we have conducted several surveys in recent years to systematically gather and evaluate their characteristics. This paper takes a broader perspective and relates data marketplaces as currently discussed in computer science to the neoclassical notions of market and marketplace from economics. Specifically, we provide a typology of electronic marketplaces and discuss their approaches to the distribution of data. Finally, we provide a distinct definition of data marketplaces, leading to a classification framework that can provide structure for the emerging field of data marketplace research.
General morphological analysis (GMA) is a method for structuring and investigating the total set of relationships contained in multidimensional, usually non-quantifiable, problem complexes. Pioneered by Fritz Zwicky at the California Institute of Technology in the 1930s and 1940s, it relies on a constructed parameter space, linked by way of logical relationships, rather than on causal relationships and a hierarchical structure. During the past 10 years, GMA has been computerized and extended for structuring and analysing complex policy spaces, developing futures scenarios, and modelling strategy alternatives. This article gives a historical and theoretical background to GMA as a problem structuring method, compares it with a number of other ‘soft-OR’ methods, and presents a recent application in structuring a complex policy issue: the development of an extended producer responsibility (EPR) system in Sweden.
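A minimal sketch of the morphological procedure, assuming an invented parameter field and cross-consistency matrix that do not reproduce the EPR case, is shown below: the full configuration space is enumerated and internally inconsistent configurations are filtered out.

```python
# Minimal sketch of general morphological analysis: enumerate the configuration
# space of a (hypothetical) parameter field and apply a cross-consistency
# assessment that removes configurations containing excluded value pairs.
from itertools import product

parameters = {
    "financing":  ["producer fee", "public subsidy"],
    "collection": ["municipal", "retailer take-back"],
    "target":     ["recycling quota", "reuse quota"],
}

# Pairs of values judged mutually inconsistent (the cross-consistency matrix).
inconsistent = {
    frozenset({"public subsidy", "retailer take-back"}),
}

def consistent(config: tuple) -> bool:
    """A configuration survives if none of its value pairs is excluded."""
    values = set(config)
    return not any(pair <= values for pair in inconsistent)

space = list(product(*parameters.values()))
solutions = [c for c in space if consistent(c)]
print(f"{len(solutions)} of {len(space)} configurations remain:")
for c in solutions:
    print(dict(zip(parameters, c)))
```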
In recent years it has become clear that application-independent techniques and tools must be supplemented with an application-specific approach. We begin to identify the foundations needed for application-specific software research by examining the role and nature of current application taxonomies.
A growing number of providers use the cloud computing paradigm to trade data and analytical services. In this qualitative study, we present the results of interviews with twelve established providers. Our findings reveal, in particular, considerable uncertainty regarding price setting and the choice of pricing models. They further allow the examined marketplaces to be abstracted into a unified schema with seven actors as well as six atomic and two hybrid pricing strategies. In addition, this paper offers, for the first time, structured decision support for selecting a suitable pricing model for data marketplaces and thus lays the groundwork for algorithmic support in pricing model selection and price determination.
Currently, multiple data vendors utilize the cloud-computing paradigm for trading raw data, associated analytical services, and analytic results as a commodity good. We observe that these vendors often move the functionality of data warehouses to cloud-based platforms. On such platforms, vendors provide services for integrating and analyzing data from public and commercial data sources. We present insights from interviews with seven established vendors about their key challenges with regard to pricing strategies in different market situations and derive associated research problems for the business intelligence community.
Recent rapid advances in Information and Communication Technologies (ICTs) have highlighted the rising importance of the Business Model (BM) concept in the field of Information Systems (IS). Despite agreement on its importance to an organization’s success, the concept is still fuzzy and vague, and there is little consensus regarding its compositional facets. Identifying the fundamental concepts, modeling principles, practical functions, and reach of the BM relevant to IS and other business concepts is by no means complete. This paper, following a comprehensive review of the literature, principally employs the content analysis method and utilizes a deductive reasoning approach to provide a hierarchical taxonomy of the BM concepts from which to develop a more comprehensive framework. This framework comprises four fundamental aspects. First, it identifies four primary BM dimensions along with their constituent elements, forming a complete ontological structure of the concept. Second, it cohesively organizes the BM modeling principles, that is, guidelines and features. Third, it explains the reach of the concept, showing its interactions and intersections with strategy, business processes, and IS so as to place the BM within the world of digital business. Finally, the framework explores three major functions of BMs within digital organizations to shed light on the practical significance of the concept. Hence, this paper links the BM facets in a novel manner, offering an intact definition. In doing so, this paper provides a unified conceptual framework for the BM concept that we argue is comprehensive and appropriate to the complex nature of businesses today. This leads to fruitful implications for theory and practice and also enables us to suggest a research agenda using our conceptual framework.