Article

Strength in Numbers: How Does Data-Driven Decisionmaking Affect Firm Performance?


Abstract

We examine whether firms that emphasize decision making based on data and business analytics (“data-driven decision making” or DDD) show higher performance. Using detailed survey data on the business practices and information technology investments of 179 large publicly traded firms, we find that firms that adopt DDD have output and productivity that is 5-6% higher than what would be expected given their other investments and information technology usage. Furthermore, the relationship between DDD and performance also appears in other performance measures such as asset utilization, return on equity, and market value. Using instrumental variables methods, we find evidence that the effect of DDD on productivity does not appear to be due to reverse causality. Our results provide some of the first large-scale data on the direct connection between data-driven decision making and firm performance.
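The instrumental-variables strategy described in the abstract can be illustrated with a textbook two-stage least squares (2SLS) estimator. The sketch below is a toy reconstruction, not the authors' specification: the variable names, the instrument, and the simulated data are all hypothetical, chosen only to show how instrumenting an endogenous regressor removes the bias that a shared unobserved driver (the reverse-causality concern) induces in plain OLS.

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """Basic 2SLS: project X onto the instrument space spanned by Z,
    then run OLS of y on the fitted values."""
    X_hat = Z @ (np.linalg.pinv(Z) @ X)              # stage 1: P_Z X
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]  # stage 2

# Toy data with endogeneity: u drives both x (think: DDD adoption)
# and y (think: log productivity), so OLS on x is biased.
rng = np.random.default_rng(0)
n = 5000
u = rng.normal(size=n)                      # unobserved confounder
z = rng.normal(size=n)                      # instrument: shifts x, not y directly
x = 2.0 * z + u + 0.5 * rng.normal(size=n)
y = 0.5 * x + u + 0.5 * rng.normal(size=n)  # true effect of x is 0.5

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

ols_beta = np.linalg.lstsq(X, y, rcond=None)[0]  # slope biased upward by u
iv_beta = two_stage_least_squares(y, X, Z)       # slope close to 0.5
```

On this simulated data the OLS slope overshoots the true effect of 0.5 while the 2SLS slope recovers it; in the paper's setting the instruments would be firm characteristics that predict DDD adoption but plausibly have no direct effect on productivity.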


... "The best management is a true science, resting upon clearly defined laws, rules, and principles, which are applicable to all kinds of human activities, from our simplest individual act to the work of our great corporations" (Frederick W. Taylor, 1911, p. 7). In today's fast-paced business environment, data analysis is essential for informed decision-making, enabling strategic adaptation and a sustained competitive advantage (Brynjolfsson & McElheran, 2016; Hedgebeth, 2007; Brynjolfsson et al., 2011; Teece et al., 1997). For instance, using advanced data analytics techniques and state-of-the-art technologies, Amazon converts raw data into valuable insights that drive its operations, marketing strategies, and customer-focused approach. ...
... Nevertheless, the swift advancement and pervasive integration of information technology tools, like artificial intelligence and machine learning, represent a viable solution to deal with it (Balasubramanian et al., 2022); they allow exploiting data for competitive advantage through data-driven decision making. This refers to basing decisions on data analysis rather than intuition (Brynjolfsson et al., 2011), opening a prescriptive approach to Decision Theory (how to improve decision making). ...
... Test. Finally, as per the scientific method, growth hackers must engage in 'testing the hypothesis' (hack) with the highest priority using well-designed and implemented experiments capable of yielding statistically valid results that allow gathering sufficient information to enable data-driven evaluation and decision making (Brynjolfsson et al., 2011;Ellis & Brown, 2017). This aligns also with scientific management principles (Taylor, 1911) which suggest observing and measuring every aspect of a process, conducting controlled experiments, and refining the process based on results of the experiments (Brecht et al., 2021). ...
Article
Full-text available
Today’s businesses necessitate data-driven decisions to continuously adapt to (and even shape) their environment to stay competitive. Growth hacking, with its emphasis on experimentation and data analysis, offers a promising approach to meet this need. Even though interest in growth hacking is increasing, the literature on the topic is still developing, and no clear guidance on how to implement it has yet been provided. Combining the scientific method and Taylor’s scientific management principles with growth hacking insights from academic research and practice, we present growth hacking as a scientific approach for data-driven decision making in organisations. Through its iterative cycle of analysis, ideation, prioritisation, testing, and evaluation of prerequisites and facilitators, growth hacking empowers companies to make data-driven decisions, enabling them to navigate uncertainty, identify and seize opportunities, and transform their operations to adapt to or shape their environment. We also point out tools for the real-world business application of growth hacking.
... Given the rapid global increase in information and data within and outside the corporate world, this "data-driven decision-making" will play an increasingly important role in existing and future business activities [34]. However, with the increasing availability of data, it is also becoming increasingly difficult to maintain an overview of the data and to make decisions on this basis [35,36]. ...
... Further, empirical research has shown that companies that use data-driven decision-making outperform their competitors on a financial and operational level. Therefore, it has long been argued that data-driven companies tend to make better decisions [34,55,56]. However, the use of data by humans also has its limits. ...
Article
Full-text available
Artificial Intelligence (AI) is revolutionising the economy and society by automating processes, driving innovation and enabling new business models, leading to a significant increase in productivity and competitiveness. Until now, the aspect of "Innovation Development" has always been assigned to Humans as Entrepreneurs (or Intrapreneurs), but in the course of AI-Development, the question must increasingly be asked whether AI will not only take on a passive support role in this field but also an active development and decision-making role. Against this background, there is a growing need for research in the field of “Human versus Machine” for driving innovation and enabling new business models. (Human) Entrepreneurs are characterised by recognising, evaluating and exploiting entrepreneurial opportunities. According to Schumpeter's understanding, the "Human Entrepreneur" appears in particular as an innovator by developing innovative ideas through their creative power and establishing them on the market. To do this, they must make entrepreneurial decisions based on the available information and data. However, this decision-making process based on information or data is increasingly being taken over by Artificial Intelligence, which is much more powerful in handling this information or data. But what happens when Artificial Intelligence not only supports the decision-making process of a "Human Entrepreneur" in a formative way but also takes it over as an "Artificial Entrepreneur" based on its own transformative creativity? The aim of the following article is to conceptually describe the prerequisites for the takeover of creative destruction by a machine in the sense of Schumpeter. The result is the development of a Framework which forms the basis for a new field of research: "Artificial Entrepreneurship".
... Part of this trend is due to the widespread diffusion of enterprise information technology such as Enterprise Resource Planning (ERP), Supply Chain Management (SCM), and Customer Relationship Management (CRM) systems (Aral et al., 2006) [2], which capture and process vast quantities of data as part of their regular operations. Increasingly these systems are imbued with analytical capabilities, and these capabilities are further extended by Business Intelligence (BI) systems that enable a broader array of data analytic tools to be applied to operational data (Brynjolfsson et al., 2011) [3]. These opportunities for data analytics outside of the firm-owned operational systems have increased substantially, driven by the digitalization of the marketplace and the emergence of e-commerce in not just B2C but also B2B settings. ...
Article
In today's data-rich business environment, organizations and their Sales and Marketing department grapple with ever-increasing pressure to leverage analytics for strategic decision-making and sustainable growth. This paper explores the space of data and analytics and proposes the idea of establishing an effective Business Insights (BI) team that enables sales and marketing departments to move beyond standing up a traditional data and analytics function. Drawing from the transformation of a Business Insights team supporting six commercial sales areas in the U.S. based organization, the paper outlines seven critical steps: defining objectives aligned with business goals, developing comprehensive analytics frameworks, implementing advanced data tools, building skilled collaborative teams, fostering cross-functional engagement, establishing continuous monitoring mechanisms, and driving a culture of innovation. The paper provides actionable guidance for each step, emphasizing the importance of stakeholder alignment, advanced data visualization, and tailored reporting processes. Using an illustrative example of a USD 1 billion revenue organization, the paper demonstrates how implementing these strategies can generate significant value through improved sales growth, enhanced upsell opportunities, operational improvements, and reduced customer churn. The paper concludes that organizations that invest in standing up a Business Insights function are better positioned to navigate market complexity and maintain sustained competitive advantage in today's dynamic business landscape while fostering a culture of collaboration and continuous improvement.
... Furthermore, research on retail marketing indicates that nearly 67% of chains struggle with insufficient customer analytics for guiding cross-channel budgeting. Lack of data connectivity restricts dynamic decision-making on forward-looking media optimization and audience prioritization (Brynjolfsson, 2011). By harnessing analytics, Coolshik can enrich its understanding of channel efficacy for targeted consumer clusters. ...
... The datasets provide rich raw data capturing digital marketing campaign performance across several crucial dimensions like campaign types, channels, audience demographics, and time periods. This multifaceted data offers fertile ground for business intelligence and data analytics to derive penetrating insights that can critically inform marketing decision-making (Brynjolfsson, 2011). In totality, the systematic data processing, mathematically rigorous metrics, intelligent visualizations, and interactive analytics interface demonstrate sound grasp of business intelligence systems and analytical proficiency. The data-to-decisions workflow steers a model practice of data analytics for augmenting marketing efficiencies. ...
Article
Full-text available
Coolshik, a leading supermarket chain in 5 major US cities, implemented over 350 creative digital marketing campaigns targeted at diverse consumer segments in 2022. However, with fragmented teams, de-centralized analytics, and no unified data standards, dynamically optimizing spends for higher ROI remained a persistent gap. Accordingly, this report approaches Coolshik’s opportunity with a concentrated analytical lens across 3 facets:
➢ Audience Evaluation: Segmenting target groups based on affinity drivers and mapping engagement levels, sentiment, and channel preferences through statistical cluster analysis for sharper selections,
➢ Channel Diagnostics: Estimating channel efficacy for each audience cluster by tracking multi-touch attribution propensities and predicting creative resonance,
➢ Campaign Blueprint: Constructing an integrated promotional calendar encompassing regions, consumer categories, and media mix, complemented by machine-learning-guided creatives based on the analytics-backed decisions.
Through measurable frameworks grounded in marketing science and data analytics, the structured analysis seeks to activate superior customer prioritization, channel productivity, financial accountability, and adaptive preparedness across Coolshik’s operating terrain (Chaffey & Ellis-Chadwick, 2022). In summary, the structured analysis aims to equip Coolshik with actionable, data-backed directives on acquiring and engaging customers across priority demographics and high-ROI channels while judiciously optimizing promotional budgets seasonally.
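The statistical cluster analysis this abstract invokes for audience evaluation can be sketched with k-means on standardized engagement features. Everything below is hypothetical: the feature names and synthetic customer segments are illustrative stand-ins, not Coolshik data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-customer engagement features:
# [visits/week, avg basket spend ($), email open rate]
rng = np.random.default_rng(42)
light = rng.normal([1, 20, 0.1], [0.5, 5, 0.05], size=(100, 3))
spender = rng.normal([5, 80, 0.4], [0.5, 5, 0.05], size=(100, 3))
engaged = rng.normal([3, 40, 0.7], [0.5, 5, 0.05], size=(100, 3))
X = np.vstack([light, spender, engaged])

# Standardize so spend (large scale) does not dominate the distance metric
X_scaled = StandardScaler().fit_transform(X)

model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_scaled)
labels = model.labels_  # cluster assignment per customer
```

Each recovered cluster can then be profiled (mean spend, preferred channel, and so on) to prioritize audiences and channel budgets per segment.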
... Moreover, this relationship was observed to vary between the United States and South Korea. Similarly, ref. [23] emphasized that the association between data-driven decision-making and performance extends beyond conventional metrics, encompassing aspects such as asset utilization, return on equity and market value [9]. Utilizing instrumental variables methods, they discerned evidence indicating that the impact of data-driven decision-making on productivity is unlikely to stem from reverse causality. ...
... For instance, ref. [16]'s described research indicated that customer engagement did not directly influence Sustainable Financial Performance but rather exhibited a favorable indirect influence mediated by brand equity. In the realm of data-driven decision-making, ref. [23] emphasized that the relationship between data-driven decision-making and performance extends to various performance metrics, including asset utilization, return on equity and market value. The findings about customer engagement, customer satisfaction, and data-driven decision-making underscore the presence of economies of scale facilitated by the integration of marketing AI. ...
Article
Full-text available
This study aims to examine the impact of marketing artificial intelligence (AI) and the specific channels it uses to influence the performance of Jordanian SMEs. In contrast to prior research that focused solely on the direct effects of marketing AI on organizational performance, our study introduces innovative pathways involving customer engagement, customer satisfaction, and data-driven decision-making. These channels serve as indirect mechanisms through which the impact of marketing AI extends to influence the Sustainable Financial Performance of SMEs. Using a structural equation modeling framework, we analysed the responses of 250 small and medium-sized enterprises (SMEs) chosen through cluster sampling from industrially active areas in Jordan. Our findings reveal that the adoption of AI technologies leads to a notable 42.5% enhancement in Sustainable Financial Performance among SMEs, driven by a 50% increase in customer engagement and a 76% improvement through data-driven decision-making processes. While our study did not establish a direct correlation between marketing AI and long-term financial performance, it demonstrated the connection between technology adoption, customer-focused strategies, and data-driven practices. The findings offer actionable insights and recommendations for Jordanian SMEs leveraging marketing AI to achieve competitive advantage and sustainable growth. The study offers significant contributions to how marketing AI improves the Sustainable Financial Performance of Jordanian SMEs by identifying crucial indirect pathways. Unlike previous studies, it focuses on the roles of customer engagement, satisfaction, and data-driven decision-making as mediators rather than emphasizing direct impacts. The findings highlight the importance of integrating technology with customer-centric and data-oriented strategies to drive sustainable growth and competitiveness in SMEs.
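The indirect pathways this study estimates with SEM can be illustrated, in a much simpler form, by the classic product-of-coefficients approach to mediation: regress the mediator on the predictor, regress the outcome on both, and multiply the two path coefficients. This is a sketch on synthetic data, not the authors' SmartPLS model; the variable names and effect sizes are invented for illustration.

```python
import numpy as np

def ols(y, X):
    """OLS coefficients with an intercept prepended to X."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

# Synthetic data mimicking a mediated pathway:
# AI adoption -> data-driven decision-making (DDDM) -> performance
rng = np.random.default_rng(1)
n = 2000
ai_adoption = rng.normal(size=n)
dddm = 0.8 * ai_adoption + rng.normal(size=n)
performance = 0.6 * dddm + 0.1 * ai_adoption + rng.normal(size=n)

a = ols(dddm, ai_adoption)[1]                          # path X -> M (true 0.8)
coef = ols(performance, np.column_stack([dddm, ai_adoption]))
b, c_prime = coef[1], coef[2]                          # M -> Y (0.6), direct X -> Y (0.1)

indirect = a * b                                       # mediated effect, ~0.48
```

The point of the decomposition is visible here: most of the predictor's total effect on the outcome flows through the mediator (a*b), with only a small direct path (c'), which mirrors the study's finding of indirect rather than direct influence.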
... Data is the primary foundation for managers' decision-making, and the success of an organisation depends on the timely and appropriate decisions they make (Brynjolfsson et al., 2011; Popovič et al., 2018). The data-driven decision-making (DDDM) process involves using evidence and data analysis to guide and verify action plans before implementation (Cippà et al., 2021). ...
Article
Full-text available
Purpose The exponential growth of organisational data has thrust big data into the spotlight, making data analysis, information extraction and data-driven decision-making (DDDM) critical for organisational success. This study aims to systematically review the literature to identify key research trends, methodologies and opportunities within the DDDM domain. Design/methodology/approach This research employs bibliometric analysis and systematic review methodologies to synthesise findings from existing studies. The analysis categorises research methods into eight primary groups, highlighting their applications and contributions to DDDM. Findings The review identifies machine learning, statistical models and qualitative methods as the most widely used approaches, while multi-criteria decision-making and simulation emerge as promising avenues for future research. Research has predominantly focused on production and operations and business management and organisation. However, underexplored domains with significant potential for future breakthroughs are marketing and sales, development and education and social and financial. Originality/value This study underscores critical gaps in the application of DDDM across less-explored fields, including engineering, biomedical sciences and safety and security. By identifying emerging trends and under-represented areas, the research provides a roadmap for advancing DDDM scholarship and practice. Keywords: Data-driven decision-making, Bibliometric analysis, Content analysis, Systematic literature review
... By leveraging real-time data insights, organizations can make informed decisions, reducing uncertainty and optimizing outcomes (Brynjolfsson et al., 2011). The ability to tailor strategies to unique contexts through detailed analytics allows for more targeted and effective negotiation processes. Furthermore, the integration of artificial intelligence and machine learning capabilities enables predictive modeling, aiding in the anticipation of market trends and proactive strategy adjustments. ...
Thesis
Full-text available
In the context of global procurement and sourcing strategies, the integration of Fact-Based Negotiation (FBN) with artificial intelligence (AI) and machine learning (ML) technologies marks a crucial transformative step. This research proposal, entitled "Enhancing Sourcing Efficacy through Fact-Based Negotiation: The Role of Supportive Intelligence," delves into the significant role AI and ML play in refining FBN, a data-driven approach that embraces objective, transparent and informed decision-making in contract negotiations. The study demonstrates how the adept incorporation of advanced analytics and automation can substantially streamline negotiation processes within organizations, leading to improved outcomes, higher efficiency, and reduced costs. This research highlights the importance of systematic and data-centric approaches in today's digital era, where data equates to value. It advocates for novel sourcing negotiation strategies through a meticulous literature review and empirical analysis, specifically exploring AI and ML within the parameters of lean thinking. The literature reviewed highlights the urgent requirement to modernize conventional negotiation practices to excel in the evolving data-centric business environment. Moreover, the research illuminates the pivotal shift needed in sourcing methodologies, proposing that the synergistic application of FBN, AI, and ML can induce a paradigm shift in procurement processes. This alignment is anticipated to set a new standard for management and operations, applicable across various industry sectors.
... about market trends, and in this way they can create a sustainable competitive advantage [4]. For example, [5] found that companies that make data-driven decisions can raise productivity by 5-6%. ...
Article
Full-text available
The article examines the integration of big data technologies and decision-making processes in the context of software development for computing systems. The focus is on the problem of low effectiveness of big data projects caused by neglect of decision-making aspects. The author proposes a conceptual model, BD-Da, that unites three levels: data, data analysis, and decision making. At the data level, the emphasis is on managing the characteristics of big data (volume, velocity, variety, veracity) and its sources. The analysis level covers the application of analytical methods and tools for processing data, visualizing it, and presenting it in structured form. The decision-making level builds on an extended version of the DMN standard, which makes it possible to formalize business rules, decision logic, and process-automation requirements. The BD-Da model aims to eliminate data silos and to link extracted information to its practical use. Particular attention is paid to modeling decisions with Decision Requirements Diagrams (DRD) and decision logic, which supports the verification and automation of processes in computing systems. The study emphasizes the importance of integrating big data with decision-support systems to increase the speed and quality of managerial decisions. The results are of practical value for developing software systems oriented toward big data processing and can be used to optimize the architecture of computing systems, including the real-time implementation of analytical models.
... The global economy heavily relies on data as a multifaceted asset. Creating, sharing, and using data are expected to lead to innovative new business models (BMs) [1]. According to the European Commission [2], data-driven innovations are anticipated to be crucial in enhancing productivity and resource efficiency. ...
Article
Full-text available
Creating, sharing, and using data are expected to lead to the development of new innovative business models. This study investigates the interplay between business model change, ethical data practices, and participation in data ecosystems in fostering data-driven innovation. Using survey data from 1200 European companies, analyzed through partial least squares structural equation modeling (PLS-SEM), the findings reveal that while firms recognize the potential benefits of data, business model change alone is insufficient to drive innovation. Instead, active engagement in data ecosystems and adherence to ethical data practices together have a significant positive impact on data-driven innovation. This research contributes to the business model innovation literature by highlighting the role of ecosystems and ethical governance in shaping sustainable data-driven innovations. This study also provides practical insights for firms seeking to transition toward more collaborative and ethically grounded data-driven business models.
... In today's rapidly evolving business environment, companies face increasing challenges in workforce management and performance optimization. Many organizations, particularly those lacking advanced technologies, rely on intuition or fragmented data for critical HR decisions [1]. This reliance can result in inefficient HR practices, high employee turnover, and misalignment between HR functions and broader organizational goals, ultimately hindering organizational performance. ...
Article
Full-text available
Strategic human resource management plays a crucial role in fostering long-term organizational performance through data-driven decision-making. Human Resource Analytics (HRA), using advanced business intelligence and integrated reporting tools, provides insights that optimize decision-making and strategy alignment. Despite its potential, the impact of HRA on organizational performance remains insufficiently explored. This study addresses this gap by examining the effects of HRA on organizational performance in Ethiopian organizations. A quantitative research design was employed, utilizing a survey method to collect data from 269 valid responses across 55 organizations in Addis Ababa, Ethiopia. Structural Equation Modeling (SEM) via SmartPLS 3.0 software was used for data analysis. The findings reveal that HRA significantly enhances organizational performance, with this relationship mediated by strategic alignment between HR and organizational goals. Additionally, firm size was found to moderate the impact of HRA on performance, with larger firms deriving greater benefits. The results suggest that HRA serves as a powerful driver of enhanced organizational performance, with larger firms potentially reaping even greater benefits from its implementation. These results also underscore the importance of strategic alignment in leveraging HRA for improved performance, particularly in the context of Ethiopian organizations, where HRA adoption is still evolving. This study offers practical implications for organizations seeking to enhance workforce management and performance through data-driven HR strategies.
... de Medeiros, Hoppen and Maçada (2020) look more broadly, at businesses, and find that the increased agility with which insights may be obtained and the management of organisational performance are leading benefits of using data science. Studies have suggested that data science and the use of big data might improve productivity by as much as 7% (Brynjolfsson, Hitt and Kim, 2011; Müller, Fay and vom Brocke, 2018). These productivity benefits are likely to come from automating processes, improving the quality of decision-making through better or more timely insights, and, via large language models, from providing aids to people, whether for writing, coding, or even tutoring for subjects like mathematics. ...
Preprint
Full-text available
Economies are fundamentally complex and becoming more so, but the new discipline of data science-which combines programming, statistics, and domain knowledge-can help cut through that complexity, potentially with productivity benefits to boot. This chapter looks at examples of where innovations from data science are cutting through the complexities faced by policymakers in measurement, allocating resources, monitoring the natural world, making predictions, and more. These examples show the promise and potential of data science to aid policymakers, and point to where actions may be taken that would support further progress in this space.
... This motivated the authors to collect real time data from a flexoprinting and cutting unit to understand the correlation between various parameters and to develop a suitable model for predicting the output of a point bottom sealing and cutting machine. Proper management of data and its utilization for data driven decision making can give an edge to an industry over their competitors [1], [2]. The field of artificial intelligence (AI) has been successful in dealing with large amounts of data to draw meaningful conclusions [3]. ...
Article
Full-text available
The packaging sector utilizes polypropylene-based flexible materials for diverse product packaging, with customization options in size and design achieved through advanced flexographic printing and point bottom sealing and cutting machines. Accurately estimating production time and quantity is vital for efficient planning and cost estimation, with factors like material dimensions, thickness, and cutting machine speed influencing production output. Understanding the intricate relationship between these parameters is essential for comprehending their impact on production time and quantity. Predicting production quantity before production begins helps in determining machine runtime and associated costs. In large-scale production systems, machine learning (ML) has proven to be a useful tool for resource allocation and predictive scheduling. An attempt has been made in this paper to develop an intelligent model for predicting the yield of a cutting machine using artificial neural network (ANN), support vector regression (SVR), regression tree ensemble (RTE) and Gaussian process regression (GPR) models. The most crucial features for prediction were identified and the hyperparameters of the ML models were optimized to create efficient models for prediction. A comparative analysis of the four models revealed that the GPR model was simple and effective, with the least training time and prediction error.
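A minimal version of this kind of model comparison can be sketched with scikit-learn, here for three of the four model families (SVR, a random forest standing in for a regression tree ensemble, and GPR) on synthetic machine data. The feature names, data-generating function, and hyperparameters are assumptions for illustration, not the paper's dataset or tuned models.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

# Synthetic stand-in for machine data: yield as a nonlinear function of
# (hypothetical) material width, thickness, and cutting speed, plus noise.
rng = np.random.default_rng(7)
X = rng.uniform(0, 1, size=(400, 3))
y = 100 * X[:, 0] - 30 * X[:, 1] + 20 * np.sin(3 * X[:, 2]) + rng.normal(0, 1, 400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "SVR": SVR(C=100.0),
    "RTE": RandomForestRegressor(n_estimators=100, random_state=0),
    "GPR": GaussianProcessRegressor(normalize_y=True),
}
# Fit each model and score it on held-out data by mean absolute error
scores = {name: mean_absolute_error(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
best = min(scores, key=scores.get)  # model with the lowest test error
```

In practice the same scaffold would be extended with hyperparameter search (e.g. cross-validated grid search) before the test-set comparison, which is the tuning step the abstract refers to.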
... This research stream advances the idea that quantitatively data-driven organizations are apt to be better equipped to test their assumptions about the market (Thomke, 2003, 2020; Camuffo et al., 2020; Koning, Hasan, and Chatterji, 2022) and to process large volumes of external information to align their products to customer preferences (Tambe, 2014; Hitt, Jin, and Wu, 2015; Müller, Fay, and vom Brocke, 2018; Wu, Hitt, and Lou, 2020; Brynjolfsson, Jin, and McElheran, 2021). Such organizations may also be less prone both to undisciplined organizational politics and to confirmation bias that supports management's pet projects (Brynjolfsson, Hitt, and Kim, 2011; Brynjolfsson and McElheran, 2019; Thomke, 2020; Kim et al., 2024). All these advantages would, theoretically, augment organizations' ability to produce the most-promising innovations. ...
Article
Full-text available
Prior research on data-driven innovation, which assumes quantitative analysis as the default, suggests a tradeoff: Organizations that rely heavily on data-driven analysis tend to produce familiar, incremental innovations with moderate commercial potential, at the expense of risky, novel breakthroughs or hit products. We argue that this tradeoff does not hold when quantitative and qualitative analysis are used together. Organizations that substantially rely on both types of analysis in the new-product innovation process will benefit by triangulating quantifiably verifiable demand (which prompts more moderate successes but fewer hits) with qualitatively discernible potential (which prompts more novelty but more flops). Although relying primarily on either type of analysis has little impact on overall new-product sales due to the countervailing strengths and weaknesses inherent in each, together they have a complementary positive effect on new-product sales as each compensates for the weaknesses of the other. Drawing on a unique dataset of 3,768 new-product innovations from NielsenIQ linked to employee résumé job descriptions from 55 consumer-product firms, we find support for our hypothesis. The highest sales and number of hits were observed in organizations that demonstrated methodological pluralism: substantial reliance on both types of analyses. Further mixed-method research examining related outcomes—hits, flops, and novelty—corroborates our theory and confirms its underlying mechanisms.
... • Data-driven decision-making enables companies to identify market trends, opportunities, and risks to make faster and more informed decisions. Thus, enabling companies to capitalize on opportunities, mitigate potential losses, and improve profitability (Brynjolfsson and McElheran, 2019;Brynjolfsson et al., 2011). • The combination of advanced devices and technologies allows companies to monitor equipment conditions, analyze data, and predict potential failures before they occur. ...
Article
Full-text available
In response to urgent global challenges posed by climate change and environmental degradation, integrating sustainable production practices into the entire product life cycle (PLC) has become essential. This paper proposes a comprehensive framework addressing the gap between sustainability models and product life cycle assessment (PLCA), addressing the need for a holistic approach encompassing economic, social, and environmental dimensions. The framework outlines optimal sustainable practices from material extraction to end-of-life disposal. It emphasizes reduced ecological footprints, resource conservation, pollution mitigation, and enhanced sustainability. Furthermore, it underscores the role of governmental and non-governmental organizations (GOs and NGOs) in promoting this integrated approach. Additionally, this research addresses key questions about integrating sustainable practices, implementation challenges, and economic feasibility. It aims to guide businesses toward holistic approaches that balance economic growth, environmental stewardship, and social equity across the entire PLC.
... Extant research on data-driven firms encompasses a variety of fields, from decision-making (Brynjolfsson et al., 2011), information systems (Maass et al., 2018) and business intelligence (Rostek et al., 2012) to business models (Hartmann et al., 2016), international management (Akter et al., 2021) and knowledge management (Wang & Wang, 2020). ...
Article
Full-text available
Digital transformation and the possibility of collecting large amounts of data, the so-called big data, from different sources can lead to the redefinition of business processes, models and infrastructures. In this respect, data-driven decision making (DDDM) emphasizes the need to rethink management practices to view data as a driving force that can improve the effectiveness of decisions and nurture innovation. Over time, data-driven principles have been reframed as the foundations for the development of a new mind-set that treats data as a key asset, one that redesigns entrepreneurship and encompasses orientation, culture and human resources management to help entrepreneurs catch, evaluate and launch entrepreneurial opportunities that can improve technology and knowledge by stimulating innovation. For this reason, the study aims at reconceptualizing entrepreneurship according to a multi-levelled perspective based on the integration of cultural, human, knowledge-based and technological dimensions to assess the impact of data-driven management on the creation of entrepreneurial opportunities and on the development of different kinds of innovation. To assess these goals, empirical research based on constructivist grounded theory and on the administration of semi-structured interviews is performed through the investigation of an Italian public-private Consortium specialized in big data. The findings allow the elaboration of a conceptual framework which classifies the activities of data-driven entrepreneurial processes, the phases of opportunity creation and the different kinds of innovation enabled in the Consortium, guiding entrepreneurs and managers in the elaboration of effective data analysis strategies.
... The use of data analytics can help organizations know their customers, improve their operations, and even predict market changes. For instance, Brynjolfsson, Hitt, and Kim (2011) found that firms that incorporated data-analysis techniques had 5% higher productivity gains relative to competitors and 6% higher operating profits [6]. This correlation implies that companies that embrace data analytics are more likely to transform successfully and achieve their strategic objectives [7]. ...
Article
Full-text available
This paper explores the role of data analytics and big data in management decision-making. It shows how the increased adoption of data analytics has shifted conventional decision-making methods, which were previously informed mainly by heuristics. Big data technologies facilitate data-driven decision-making by offering real-time and predictive information to managers. Quantitative data were obtained from 200 managers across sectors including health care, finance, retail, manufacturing, and technology, while qualitative data were obtained from focus groups and interviews. Approximately 78% of managers indicated that they often apply data analytics tools; the most frequently used are business intelligence systems, predictive analytics, and data visualization tools. The study also examines the difficulties organizations experience when adopting a data-driven culture, including data quality, information overload, and organizational culture. In summary, this research establishes that data analytics increases the speed and accuracy of decision-making and promotes cooperation, enabling organizations to prepare for volatile business environments.
... Firms that use Data Science for decision-making have higher productivity and market value. Moreover, there is evidence that decisions made with the support of Data Science are associated with certain profitability measures, such as asset utilization and return on equity (Brynjolfsson, 2011). ...
Article
Full-text available
With the growing amount of accessible data, companies are using this information to gain competitive advantages in decision-making, including through the application of data science techniques. This work investigates the impacts of Data Science, starting from the hypothesis that managers of companies that adopt these techniques perceive advantages in their use for decision-making. To that end, a case study was conducted, with semi-structured interviews and author observation. The results of this analysis indicate that data science is important for the company but is still little used in some areas of the organization. Based on the perceptions of the studied company's manager, practices for implementing data science are suggested and validated by specialists.
... Decisions are made about people, and are made by people using machine-learning-based tools for support. Many emerging application domains are now shifting to data-driven decision-making because more information is captured digitally and there is a desire to be more scientific rather than relying on (fallible) gut instinct [49]. These applications present many safety-related challenges. ...
Preprint
Machine learning algorithms increasingly influence our decisions and interact with us in all parts of our daily lives. Therefore, just as we consider the safety of power plants, highways, and a variety of other engineered socio-technical systems, we must also take into account the safety of systems involving machine learning. Heretofore, the definition of safety has not been formalized in a machine learning context. In this paper, we do so by defining machine learning safety in terms of risk, epistemic uncertainty, and the harm incurred by unwanted outcomes. We then use this definition to examine safety in all sorts of applications in cyber-physical systems, decision sciences, and data products. We find that the foundational principle of modern statistical machine learning, empirical risk minimization, is not always a sufficient objective. Finally, we discuss how four different categories of strategies for achieving safety in engineering, including inherently safe design, safety reserves, safe fail, and procedural safeguards can be mapped to a machine learning context. We then discuss example techniques that can be adopted in each category, such as considering interpretability and causality of predictive models, objective functions beyond expected prediction accuracy, human involvement for labeling difficult or rare examples, and user experience design of software and open data.
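The empirical risk minimization principle the abstract critiques can be sketched minimally as follows. This is an illustration only: the squared loss, linear model, and synthetic data are assumptions chosen for brevity, not taken from the paper.

```python
import numpy as np

# Empirical risk minimization (ERM): choose model parameters that minimize
# the average loss over the observed sample. Here we use squared loss for
# a linear predictor, which ERM solves in closed form (least squares).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                  # 100 samples, 3 features
w_true = np.array([1.0, -2.0, 0.5])            # hypothetical true weights
y = X @ w_true + rng.normal(scale=0.1, size=100)

def empirical_risk(w, X, y):
    """Average squared loss of the linear predictor x -> x @ w."""
    return np.mean((X @ w - y) ** 2)

# The ERM solution under squared loss is the least-squares fit.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted weights attain lower empirical risk than the zero predictor.
assert empirical_risk(w_hat, X, y) < empirical_risk(np.zeros(3), X, y)
```

The abstract's point is that this sample-average objective alone can be insufficient for safety: it says nothing about epistemic uncertainty, nor about rare outcomes whose harm far exceeds their contribution to the average loss.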
... Initially, most firms employed only a few modelers (e.g., data scientists) who were responsible for a small number of models. This changed as companies turned to analytics on an enterprise-wide basis and hired more analytics professionals (e.g., data engineers), created centralized repositories of data (e.g., data warehouses, lakes), acquired advanced analytics tools and software, and used these resources to build models at a large scale in order to compete on analytics (Brynjolfsson, Hitt, & Kim, 2011;Davenport, 2007). ...
Article
Full-text available
Companies are moving from a cottage industry to a factory approach to analytics, especially in regard to machine learning (ML) models. This change is motivating companies to adopt ML operations (MLOps) as a methodology for the timely development, deployment, and maintenance of ML models in order to positively impact business outcomes. The adoption of MLOps requires changes in processes, technology, and people, and these changes are informed by previous work on decision support systems (DSS), development operations (DevOps), and data operations (DataOps). The processes, technologies, and people needed for MLOps are discussed and illustrated using a customer purchase recommendation example. Current and future directions for MLOps practice driven by artificial intelligence (AI) are explored. Suggestions for further academic research are provided.
... Will Big Data abolish models and theories (Anderson, 2008)? In fact, what Big Data proposes is to move from a hypothesis-driven approach to an evidence-based, data-driven approach (Brynjolfsson et al., 2011). Obviously, the Big Data approach is not the end of hypotheses and theories but simply allows the numbers to speak for themselves (Anderson, 2008). ...
Book
Full-text available
The work focuses on the morphological standardisation of ceramic production in continental Italy between the 2nd and the first half of the 1st millennium BC, employing a quantitative approach linked to the fields of machine learning and Big Data. This research is made possible through the creation of a substantial and structured dataset comprising over 20,000 complete profiles and archaeological information from more than 600 different contexts. The study of ceramics has enabled the utilisation of recent analysis methods, such as non-linear dimension reduction techniques or autoencoders, which are versatile artificial intelligence tools for the study of archaeological pottery. This includes feature extraction methods as well as a method for reconstructing ceramic shapes from fragments. The archaeological results support the link between standardisation and specialisation, and their relationship with the development of social complexity. The detailed methods have been compiled into a Python library for easy sharing and utilisation. The entire study adheres to the principles of open data, as the dataset, methods, and analyses will be made publicly available.
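The feature-extraction idea behind such studies can be sketched as follows. The synthetic profiles and the plain linear PCA used here are stand-ins for illustration; the actual work uses real profile data and also non-linear methods such as autoencoders.

```python
import numpy as np

# Illustrative pipeline: encode each ceramic profile as a fixed-length
# vector of radii sampled along the vessel's height, then reduce the
# collection to a few latent shape features with PCA computed via SVD.
rng = np.random.default_rng(1)
heights = np.linspace(0.0, 1.0, 50)

def synthetic_profile(belly, neck):
    # Toy profile: radius as a smooth function of height, controlled by
    # two hypothetical shape parameters (for illustration only).
    return belly * np.sin(np.pi * heights) + neck * heights ** 2

profiles = np.array([
    synthetic_profile(rng.uniform(0.5, 1.5), rng.uniform(-0.5, 0.5))
    for _ in range(200)
])                                             # 200 profiles x 50 radii

# PCA: center the data, then project onto the top-k right singular vectors.
centered = profiles - profiles.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
k = 2
features = centered @ vt[:k].T                 # 200 profiles -> 2 features

# Because the toy profiles live in a 2-D shape space, two components
# capture essentially all of the variance.
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
```

Low-dimensional features of this kind are what make it possible to compare thousands of vessel shapes quantitatively and to measure morphological standardisation across contexts.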
Chapter
This chapter explores the emergence and transformative impact of generative artificial intelligence (Gen AI) on organizational intellectual capital (IC) management and the shift from a data-driven to an information-driven management paradigm. It contributes to future IC research and practice by outlining several implications that Gen AI will have on an organization’s IC management. First, regarding human capital, it challenges organizations to develop and retain a specific breed of information engineering experts, sets demand for personnel reskilling including prompt engineering, and emphasizes the central role of collaboration between technical Gen AI experts and domain experts. Second, the chapter argues that organizations must pay attention to certain facets of their structural capital, including the controlled generation of new information through established business processes and the design and maintenance of adaptable information systems. Further, there is a need for an information-driven culture, which drives the utilization of Gen AI outputs in practice. Third, it emphasizes an increasing need to augment an organization’s proprietary information with extra-organizational information, as higher volumes of complementary information improve Gen AI performance. This development will lead to a market of commercialized information products, where organizations may operate as clients and providers or participate through different networks and ecosystems.
Article
Full-text available
Advancements in technology and digitalisation have paved the way for multidimensional possibilities in the availability, extraction, and implantation of data. Consequently, the application of big data in understanding the spectra of different fields of knowledge has been evolving. Entrepreneurship research is one such developing area. Within this field, research has called for the advancement and application of methodological approaches to understand entrepreneurial phenomena. This study attempts to respond to this call by addressing how big data can be applied to entrepreneurship research. Therefore, the use of big data analysis and analytics in entrepreneurship research was explored and several theoretical perspectives and methodological possibilities for applying big data to entrepreneurship research were investigated. Furthermore, benefits, challenges, and ethical considerations were considered with a focus on how to navigate these challenges and implement them. The study concludes with future directions for emerging technologies that can be adopted in entrepreneurship research. Finally, recommendations are offered to researchers on how to apply big data not only to collect information but also to analyse and present it in a meaningful way.
Cover Page
Full-text available
Technological readiness has become one of the pillars of modern economic growth based on digital technology, the Internet, and the rapid, expanding digital communication of the economy. Electronic readiness is an attractive and stimulating factor for technology-based international economic transactions and investments, reflecting capabilities in areas such as financial system development, private sector finance, research and development, talent, and media. It has become a driver of economic development, particularly in developed countries and in developing countries seeking to modernize their economies and catch up with global economic and technological development. Hence, this research attempts to measure the effect of the aggregate technological readiness index on GDP growth using a Vector Auto Regression (VAR) model for Egypt over the period 2010-2022. The research finds a positive long-term correlation between the aggregate e-readiness index and economic growth as measured by GDP in Egypt. Introduction: In recent times, there has been a remarkable leap forward in communication systems and the widespread adoption of information technology. This has brought about a sea change in how we work and what we measure; as a result, we now analyze, process, and exchange economic activities in entirely new ways. This has given rise to a new type of economic system built on "electronic readiness," which is crucial for meeting the social and economic challenges that nations face today, particularly in areas like sustainable development, which aims to alleviate poverty and unemployment. Put simply, e-readiness is a measure of how well an economy is prepared to conduct its day-to-day operations in the digital sphere.
Article
Digital transformation has been a trend in higher education institutions (HEIs), significantly increasing the volume of data generated by these organizations. In response to the changes enabled by digitalization, HEIs have begun developing strategies to use data in support of their processes and institutional mission. Recognizing data orientation as fundamental to improving organizational efficiency, this study aims to expand a data-orientation maturity model specifically adapted for HEIs, based on good practices. The methodology included a literature review to map relevant publications and identify key practices to be incorporated into a reference model. The study contributes in two main respects: it explores maturity models specific to HEIs, examining the literature and the current context, and it addresses the opportunities and challenges of data orientation in HEIs, highlighting critical factors and good practices to improve data use and decision-making. The literature review identified 45 good data-orientation practices across six organizational dimensions, which universities can use to assess their current situation, identify gaps, and guide their transition strategies across the different maturity levels.
Article
Full-text available
The objective of this study is to identify the activities required to achieve effective outcomes in data utilization within large Japanese corporations. Building upon research on Big Data Analytics Capabilities, as represented by Gupta and George (2016), we categorized corporate structure into three primary organizational segments: “Data Utilization Organization,” “Executive/Corporate Level,” and “Business Units.” A hypothesis model was developed to examine how the activities of each segment contribute to data utilization outcomes. The results indicate that while data resources, analytical systems, and skilled personnel within the Data Utilization Organization are essential for achieving data utilization outcomes, they are not sufficient as direct effects alone. Two key mediating factors were identified: (1) executive awareness and understanding of data utilization, along with a corporate culture that supports organizational transformation; and (2) business unit awareness and understanding of data utilization, as well as collaboration with the Data Utilization Organization. Importantly, (1) was shown to have a greater impact than (2).
Article
With the emerging trend of artificial intelligence (AI) and its application in various fields, AI ethics and its related incidents have aroused concern and caused wide discussion in both society and academia around the world. In this paper, we discuss AI ethics and governance with respect to public perspectives. Based on the existing literature, policies, and guidelines on AI ethics, we sorted AI ethics concerns into eight dimensions: safety, transparency, fairness, personal data protection, liability, truthfulness, human autonomy, and human dignity. Combining online survey data with social media data, we quantified people's concerns on each dimension, and their attitudes toward AI governance policies and goals. The results shed light on how the public understands and views AI ethics and related governance. Finally, we propose several future directions in the development of AI ethics.
Article
Purpose This paper aims to explore the underlying mechanisms and boundary conditions through which equipment manufacturing enterprises can capture market value from digital transformation, with a specific focus on the roles of knowledge search and knowledge recombination. Design/methodology/approach This study uses a double fixed-effects model to test the hypotheses, using a unique data set of “firm-year” observations from 739 publicly listed equipment manufacturing companies in China, spanning the period from 2018 to 2022. Findings Digital transformation drives market value creation in equipment manufacturing enterprises through both breakthrough knowledge recombination (BKR) and progressive knowledge recombination (PKR). In addition, the analysis of marginal conditions reveals that diversified knowledge search serves as a substitute for digital transformation in promoting BKR, while also positively moderating the relationship between digital transformation and PKR. Originality/value Grounded in the knowledge-based view theoretical framework, this study introduces the novel concepts of BKR and PKR and systematically examines how digital transformation impacts market value in equipment manufacturing enterprises.
Article
Research Summary We study how humans learn from artificial intelligence (AI), leveraging an introduction of an AI‐powered Go program (APG) that unexpectedly outperformed the best professional player. We compare the move quality of professional players to APG's superior solutions around its public release. Our analysis of 749,190 moves demonstrates significant improvements in players' move quality, especially in the early stages of the game where uncertainty is highest. This improvement was accompanied by a higher alignment with AI's suggestions and a decreased number and magnitude of errors. Young players show greater improvement, suggesting potential inequality in learning from AI. Further, while players of all skill levels benefit, less skilled players gain higher marginal benefits. These findings have implications for managers seeking to adopt and utilize AI in their organizations. Managerial Abstract We examine how professionals can learn from artificial intelligence (AI) by studying an AI‐powered Go program (APG) that outperformed the best professional player. By analyzing 749,190 moves, we find that players' move quality improved significantly, closely aligning with the AI's recommendations. The number and magnitude of errors also decreased. This learning effect was particularly strong early in the game where decisions are more uncertain. Young players showed greater effect, suggesting that learning from AI may vary by age. While players of all skill levels benefited, those with less skill saw the greatest improvement. These findings highlight the instructional role of AI and offer guidance on how to effectively integrate AI into organizations to enhance worker performance across different age groups and skill levels.
Article
Full-text available
In the modern era, a major avenue of dissemination for cultural and artistic events is the World-Wide Web, where every such event leaves a multifaceted, distinct digital footprint. This digital footprint indicates how strongly each event influences the public's perception and to what extent it becomes part of the audiovisual art landscape. This study presents how the impact of an audiovisual event may be estimated using quantitative data collected through its online presence. This data-driven approach is made possible through web data extraction techniques and the use of generative artificial intelligence, which allows structured information extraction from an endless variety of websites. Based on an event's innate characteristics, web outreach, estimated scope, and thematic popularity, an encompassing impact factor is calculated, which may be used to rank events by perceived influence. For the purposes of this study, a dataset consisting of thousands of events in Greece was collected over an extended period. These data were used for a computational statistical analysis. Through this process of data collection, impact calculation, and analysis, data-driven insights were derived concerning the landscape of audiovisual art events.
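A composite impact factor of this kind is typically a weighted combination of normalized components. The sketch below is a hypothetical illustration: the component names, log normalization, and weights are assumptions, not the study's actual formula.

```python
import math

# Hypothetical composite impact factor for an audiovisual event. The
# component names, normalization, and weights are illustrative assumptions,
# not the study's actual formula.
def impact_factor(web_mentions, scope, theme_popularity,
                  weights=(0.5, 0.3, 0.2)):
    # Log-scale the raw mention count so a single viral event does not
    # dominate; scope and theme_popularity are assumed already in [0, 1].
    outreach = min(math.log10(1 + web_mentions) / 6.0, 1.0)
    w_out, w_scope, w_theme = weights
    return w_out * outreach + w_scope * scope + w_theme * theme_popularity

# A widely covered event should outrank a niche one.
big = impact_factor(web_mentions=50_000, scope=0.9, theme_popularity=0.7)
small = impact_factor(web_mentions=120, scope=0.2, theme_popularity=0.4)
```

The design choice to log-scale outreach keeps the score comparable across events whose raw web footprints differ by orders of magnitude.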
Article
Full-text available
Big Data Analytics (BDA) has revolutionized financial forecasting by enabling the processing and analysis of vast and complex datasets in real time. This study explores the impact of BDA techniques on the accuracy and efficiency of financial forecasting within the [specific industry/sector]. The research employs a comparative approach, evaluating the performance of traditional statistical methods against advanced BDA techniques, such as machine learning, predictive analytics, and sentiment analysis. Key metrics, including forecasting precision, computational efficiency, and adaptability to dynamic market changes, are assessed. The findings reveal that BDA techniques significantly outperform traditional methods in accuracy and timeliness, enabling businesses to make informed financial decisions. Furthermore, the study highlights the role of domain-specific data sources and feature engineering in enhancing forecasting performance. The implications of these findings underscore the transformative potential of BDA in financial strategy and risk management.
Article
Full-text available
The rapid expansion of financial markets and the proliferation of data have made effective financial forecasting increasingly dependent on robust data management frameworks. This research focuses on developing an optimal data management framework tailored to the unique requirements of big data analytics in financial forecasting. The proposed framework integrates advanced data processing techniques, scalable storage solutions, and high-performance computing to handle the velocity, variety, and volume of financial data. Leveraging cutting-edge machine learning algorithms and real-time data pipelines, the framework ensures accurate trend prediction, risk assessment, and decision-making support. Key innovations include the integration of data quality assessment protocols, automated feature engineering, and adaptive models that evolve with changing market dynamics. Case studies demonstrate the framework's ability to enhance predictive accuracy and operational efficiency across diverse financial domains, such as stock market analysis, portfolio optimization, and credit risk modeling. This work highlights the transformative potential of optimized data management in harnessing the full capabilities of big data analytics for superior financial forecasting, paving the way for more resilient and informed financial systems.
Chapter
This chapter highlights the critical role of artificial intelligence (AI) and big data analytics in advancing sustainability in the construction industry. Early-stage design decisions have a significant impact on the long-term environmental, economic and social outcomes of construction projects, as fundamental choices determine material use, energy efficiency and waste generation. The chapter highlights the transformative potential of AI-driven methods for analysing large datasets to improve decision-making processes and promote sustainable practices from the outset of construction projects. By using AI, particularly machine learning techniques, the construction sector can achieve more accurate predictions of building energy performance and assess the economic feasibility of retrofits. The integration of Energy Performance Certificates (EPCs) and neural networks makes it possible to predict payback periods for investments, enabling stakeholders to make informed decisions during the feasibility and design phases. The chapter also addresses the challenges of data quality, advocating for robust data governance to ensure reliable AI applications. Ultimately, the chapter presents AI as a central tool for achieving sustainable construction practices. By embedding sustainability considerations early in the project lifecycle, the industry can contribute to decarbonisation targets, reduce environmental impacts and promote long-term economic viability.
Article
The objective of the study was to develop a model of Big Data Analytics and Artificial Intelligence (BDA-AI) technology acceptance in the hospitality and tourism industry in Malaysia. The model developed in this study is the Comprehensive Theory of Use and Adoption of Technology (CTUAT). This is an empirical, quantitative study based on a unified model developed through an extensive literature review. The study adopted a cross-sectional online survey of 343 owners/managers of tourism and hospitality firms. Applying structural equation modeling with AMOS software, the data were purified and analyzed. The study identified strategic orientation, performance effectiveness, top management support, organizational resources, employee readiness, and technology expectancy as predictors of the behavioral intention to accept BDA-AI technology; innovation was not a significant predictor of behavioral intention.
Article
This special issue looks at how big data affects business decisions, processes, and organizational change within organizations. The issue starts with a review of the latest research in the field, including key developments and ongoing debates. The literature review shows how Big Data is affecting how organizations work, including ethical issues, internal rules, and the use of new technology. Next, the issue presents three key papers on how Big Data affects modern organizations. The first paper looks at how Big Data is helping to make cities smarter. The third paper looks at how new technologies like artificial intelligence, blockchain, and quantum computing affect financial organizations. Together, these contributions show the need to balance innovation with risk management. They advocate for ethical considerations and policy frameworks as organizations navigate the complexities of the Big Data era. This synthesis, from literature review to focused studies on decision-making, operations, and organizational change, provides a holistic understanding of the role of Big Data in shaping the future of business.
Chapter
Data analytics and reporting technologies play an important role in leadership decision-making. This chapter examines how data analytics and reporting technology support decision-making in leadership. Previous research shows that modern data analytics tools have been incorporated to enhance decision-making processes, improve reporting accuracy, and drive business performance. Organizations using advanced analytics tools tend to see substantial gains in speed, accuracy, reporting quality, and strategic decision-making. Modern analytical and reporting tools help leadership understand data through interactive dashboards, graphs, and real-time views, which supports data-based strategy. Using data analytics and reporting technology in leadership decision-making brings great benefits: it improves efficiency, productivity, and overall organizational growth.
Chapter
Big data has changed how businesses operate by creating digital spaces; however, marketers are having a hard time using big data to create helpful information about customers that leads to good results for their business and the market. This paper aims to show how businesses can make the most of big data in their marketing strategies. The research adopts a qualitative methodology. The study finds that using big data can help improve marketing results: big data supports marketing strategies and is especially important for understanding customers, predicting what they will do, and targeting ads more accurately. However, big data is difficult to analyze and process, and problems with privacy and security pose risks worldwide. Thus, the study recommends that businesses hire marketing experts to handle the complexity of big data and enhance the analytic skills of current management.
Article
Full-text available
This research aimed to develop a Dynamic Financial Growth Model (DFGM) to enhance corporate growth by promoting strategic agility through data-driven decision-making. The main objective was to optimize corporate value by integrating real-time data, dynamic decision-making, risk management, and scenario analysis. The research employed a mathematical modelling framework that combined predictive analytics, real options theory, and scenario-based optimization to represent dynamic corporate financial decisions. The numerical example demonstrated how the model adjusts strategic decisions in response to changes in market data and evaluates corporate value under optimistic, pessimistic, and baseline scenarios. The main results indicated that the DFGM is effective in optimizing corporate value by allowing continuous adjustments and strategic flexibility, distinguishing it from traditional static financial models that lack real-time adaptability. The findings highlighted the value of incorporating risk constraints and scenario analysis, resulting in a balanced approach that manages both growth and uncertainty. However, the study identified limitations, including the need for empirical validation, more complex predictive analytics, and accounting for behavioral factors affecting decision-making. The conclusion emphasizes that the DFGM provides an adaptable, data-driven framework that enhances corporate strategic agility, making it a valuable tool for managing growth in rapidly changing environments, while also suggesting future research to refine the model's practical application.
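The scenario-analysis component of such a model can be sketched as a probability-weighted valuation. All names and figures below are hypothetical illustrations, not the DFGM's actual equations.

```python
# Minimal sketch of the scenario-analysis idea behind a dynamic financial
# growth model: value a strategy as the probability-weighted corporate
# value across baseline, optimistic, and pessimistic scenarios. All
# figures are hypothetical illustrations, not the DFGM's actual equations.
def expected_value(scenarios):
    """scenarios: list of (probability, corporate_value) pairs."""
    total_p = sum(p for p, _ in scenarios)
    assert abs(total_p - 1.0) < 1e-9, "scenario probabilities must sum to 1"
    return sum(p * v for p, v in scenarios)

# A flexible (agile) plan improves the value realized in each scenario,
# which is the intuition behind valuing strategic flexibility.
flexible = [(0.5, 120.0), (0.3, 180.0), (0.2, 90.0)]  # baseline/optimistic/pessimistic
static = [(0.5, 110.0), (0.3, 150.0), (0.2, 60.0)]

ev_flexible = expected_value(flexible)   # 0.5*120 + 0.3*180 + 0.2*90 = 132
ev_static = expected_value(static)       # 0.5*110 + 0.3*150 + 0.2*60 = 112
```

In the full model this comparison would be re-run as real-time data shifts the scenario probabilities and payoffs, which is what makes the framework dynamic rather than static.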
Article
Full-text available
This paper argues that the slowdown in labor productivity growth that has occurred since 1968 and particularly since 1973 has probably been caused by a decline in the services of capital and labor relative to the measured quantities of these inputs. There is enough suggestive evidence of a decrease in effective labor input relative to measured labor hours to attribute about one-seventh of the productivity growth decline to this source. These effects have been concentrated outside the main manufacturing and industrial sectors. The most important cause of the growth slowdown in recent years seems to be a decline in the services of capital, caused by obsolescence and by the diversion of some part of capital spending to saving energy or product conversion. According to the model in this paper, conventional estimates based on the measured capital stock overstated the rate of total factor productivity growth through the mid-1960s, and the steady-state productivity growth rate of the US economy is lower than has been thought. Thus some part of the recent productivity slowdown is simply a return to the long-run steady-state path. An implication of this paper is that investment may do more to improve productivity growth than a conventional analysis predicts. There is an important qualification to this conclusion. We will gain little by adding substantially to the growth rate of gross output if we add little to output net of economic depreciation. The payoff to investment will be exceptionally large provided that new capital can avoid the problem of obsolescence that slowed productivity during the past decade.
Article
Full-text available
Considers structural inertia in organizational populations as an outcome of an ecological-evolutionary process. Structural inertia is considered to be a consequence of selection as opposed to a precondition. The focus of this analysis is on the timing of organizational change. Structural inertia is defined to be a correspondence between a class of organizations and their environments. Reliably producing collective action and accounting rationally for their activities are identified as important organizational competencies. This reliability and accountability are achieved when the organization has the capacity to reproduce structure with high fidelity. Organizations are composed of various hierarchical layers that vary in their ability to respond and change. Organizational goals, forms of authority, core technology, and marketing strategy are the four organizational properties used to classify organizations in the proposed theory. Older organizations are found to have more inertia than younger ones. The effect of size on inertia is more difficult to determine. The variance in inertia with respect to the complexity of organizational arrangements is also explored.
Article
Full-text available
This paper focuses on patterns of technological change and on the impact of technological breakthroughs on environmental conditions. Using data from the minicomputer, cement, and airline industries from their births through 1980, we demonstrate that technology evolves through periods of incremental change punctuated by technological breakthroughs that either enhance or destroy the competence of firms in an industry. These breakthroughs, or technological discontinuities, significantly increase both environmental uncertainty and munificence. The study shows that while competence-destroying discontinuities are initiated by new firms and are associated with increased environmental turbulence, competence-enhancing discontinuities are initiated by existing firms and are associated with decreased environmental turbulence. These effects decrease over successive discontinuities. Those firms that initiate major technological changes grow more rapidly than other firms.
Article
Full-text available
This paper uses newly collected panel data that allow for significant improvements in the measurement and modeling of IT productivity to address some long-standing empirical limitations in the IT business value literature. First, we show that using GMM-based estimators to account for the endogeneity of IT spending produces coefficient estimates that are only about 10% lower than unadjusted estimates, suggesting that the effects of endogeneity on IT productivity estimates may be relatively small. Second, analysis of the expanded panel suggests that a) IT returns are substantially lower in small and mid-size firms than in Fortune 500 firms, b) that they materialize more slowly in large firms -- unlike in larger firms, the short-run contribution of IT to output in small and mid-size firms is similar to the long-run output contribution, and c) that the measured marginal product of IT spending is higher from 2000-2006 than in any previous period, suggesting that firms, and especially large firms, have been continuing to develop new, valuable IT-enabled business process innovations. Furthermore, we show that the productivity of IT investments is higher in manufacturing sectors, and that our productivity results are robust to controls for IT labor quality and outsourcing levels.
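The production-function framework that this strand of literature estimates can be sketched as a log-linear regression of output on IT capital, ordinary capital, and labor. The sketch below uses plain OLS on synthetic data with invented elasticities; it is a minimal illustration of the framework, not the paper's GMM estimator, which additionally instruments IT spending to handle endogeneity:

```python
# Minimal sketch of the IT-productivity framework:
#   ln(Output) = a + b1*ln(IT) + b2*ln(K) + b3*ln(L) + noise
# Plain OLS on synthetic data -- NOT the paper's GMM estimator.
# The "true" elasticities (0.05, 0.25, 0.60) are invented for the example.
import numpy as np

rng = np.random.default_rng(0)
n = 500
ln_it = rng.normal(3.0, 1.0, n)   # log IT capital
ln_k = rng.normal(5.0, 1.0, n)    # log non-IT capital
ln_l = rng.normal(4.0, 1.0, n)    # log labor
noise = rng.normal(0.0, 0.1, n)
ln_y = 1.0 + 0.05 * ln_it + 0.25 * ln_k + 0.60 * ln_l + noise

X = np.column_stack([np.ones(n), ln_it, ln_k, ln_l])
beta, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
it_elasticity = beta[1]   # estimated output elasticity of IT capital
```

The endogeneity concern the paper addresses is that `ln_it` may itself respond to productivity shocks in `noise`; an instrumental-variables or GMM estimator replaces the OLS step to correct for that.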
Article
Full-text available
This paper presents a technique for estimating a firm's brand equity that is based on the financial market value of the firm. Brand equity is defined as the incremental cash flows which accrue to branded products over unbranded products. The estimation technique extracts the value of brand equity from the value of the firm's other assets. This technique is useful for two purposes. First, the macro approach assigns an objective value to a company's brands and relates this value to the determinants of brand equity. Second, the micro approach isolates changes in brand equity at the individual brand level by measuring the response of brand equity to major marketing decisions. Empirically, we estimate brand equity using the macro approach for a sample of industries and companies. Then we use the micro approach to trace the brand equity of Coca-Cola and Pepsi over three major events in the soft drink industry from 1982 to 1986.
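The residual logic behind the macro approach reduces to simple arithmetic: start from the financial market value of the firm and peel off the value of its non-brand assets. The figures below are hypothetical, not estimates from the paper:

```python
# Sketch of the macro approach's residual decomposition.
# All balance-sheet figures ($M) are hypothetical illustrations.
market_value = 5_000.0       # market value of the firm (equity + debt)
tangible_assets = 2_800.0    # value attributable to tangible assets
other_intangibles = 1_200.0  # value of non-brand intangibles (e.g. R&D, patents)

# Brand equity as the residual market value not explained by other assets.
brand_equity = market_value - tangible_assets - other_intangibles
```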
Conference Paper
Full-text available
We find three-way complementarities among IT, performance pay, and monitoring practices. We model these practices as a tightly-knit incentive system that produces the largest productivity premium when implemented in concert. We assess our model by combining fine-grained data on Human Capital Management (HCM) software adoption with detailed survey data on performance pay and monitoring practices at 90 firms from 1995-2006. HCM adoption is associated with a disproportionately large productivity premium when it is implemented within a system of organizational incentives that includes both monitoring and performance pay, but has little benefit when adopted alone. We find no evidence of reverse causality: the complementarities appear when the software goes live, not when the purchase decision is made. Furthermore, we can distinguish two components of performance pay: motivation (inducing employees to increase effort), and talent selection (attracting higher quality employees). We find that the complementarities are entirely explained by talent selection.
Conference Paper
Full-text available
We combine detailed survey data on firms’ organizational practices with information technology (IT) investment measures to test the hypothesis that in addition to decentralization, external focus is another important determinant of returns to IT investment. We argue that IT-intensive firms characterized by decentralization are able to more effectively process and respond to information from their competitive environments, which drives productivity through superior innovation and product development. Our estimates from a regression model including organizational practices indicate that IT investments only increase productivity for those firms that are decentralized and externally focused. IT investments in firms that have only one or neither of these organizational assets in place do not appear to significantly increase productivity.
Conference Paper
Full-text available
The media hype surrounding the growth of electronic commerce has led to considerable firm interest in making the significant investments required to participate in this growing market. However, the evidence on benefits to firms from e-commerce is far from unequivocally positive, as popular accounts would lead us to believe. In this paper, we explore the following questions: What are the economic returns to firms from engaging in e-commerce? How do the returns to non-net, brick and mortar firms from e-commerce initiatives compare with returns to the new breed of net firms? How do returns from business-to-business e-commerce compare with returns from business-to-consumer e-commerce? We examine these issues using event study methodology and assess the cumulative abnormal returns (CARs) for 305 e-commerce announcements between October and December 1998. The results suggest that e-commerce initiatives announced in this period do indeed lead to positive CARs for firms. However, the hypothesis drawing on the resource-based view of the firm: that the CAR to non-net firms is significantly more than the CAR to net firms is not supported. Further, the CARs associated with business-to-consumer e-commerce announcements are higher than the CARs for business-to-business e-commerce, a result contrary to the hypothesized direction. The results are robust to the removal of outliers and time windows of varying length between firm announcements and capital market adjustments of prices. Most importantly, the magnitudes of CARs (between 3% and 11%) observed in response to e-commerce announcements are considerably larger than those observed for a variety of firm actions in the prior literature. This paper presents the first empirical test of the dot com effect, validating the popular notion that capital markets recognize the transformational potential of e-commerce and expect significant future benefits to firms entering into e-commerce arrangements.
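The event-study mechanics used here (fit a market model over a pre-event estimation window, then cumulate abnormal returns over the event window) can be sketched compactly. The returns below are synthetic, and the window lengths and the injected 4% announcement effect are illustrative assumptions, not figures from the paper:

```python
# Sketch of event-study methodology: market-model estimation window,
# abnormal returns, and cumulative abnormal return (CAR).
# Synthetic daily returns; the 4% announcement jump is an assumption.
import numpy as np

rng = np.random.default_rng(1)
est_len, event_len = 120, 5              # estimation / event window lengths
market = rng.normal(0.0005, 0.01, est_len + event_len)
firm = 0.0002 + 1.1 * market + rng.normal(0, 0.002, est_len + event_len)
firm[est_len] += 0.04                    # abnormal jump on announcement day

# Market model R_firm = alpha + beta * R_market, fitted on the
# estimation window only.
X = np.column_stack([np.ones(est_len), market[:est_len]])
(alpha, beta), *_ = np.linalg.lstsq(X, firm[:est_len], rcond=None)

# Abnormal returns and CAR over the event window.
expected = alpha + beta * market[est_len:]
abnormal = firm[est_len:] - expected
car = abnormal.sum()
```

A study like the one abstracted above would compute a CAR of this kind for each of the 305 announcements and then test whether the average CAR differs from zero across subsamples (net vs. non-net firms, B2B vs. B2C).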
Article
Full-text available
The business value of information technology (IT) has been debated for a number of years. While some authors have attributed large productivity improvements and substantial consumer benefits to IT, others report that IT has not had any bottom line impact on business profitability. This paper focuses on the fact that while productivity, consumer value, and business profitability are related they are ultimately separate questions. Accordingly, the empirical results on IT value depend heavily on which question is being addressed and what data are being used. Applying methods based on economic theory, we are able to define and examine the relevant hypotheses for each of these three questions, using recent firm-level data on IT spending by 370 large firms. Our findings indicate that IT has increased productivity and created substantial value for consumers. However, we do not find evidence that these benefits have resulted in supranormal business profitability. We conclude that while modeling techniques need to be improved, these results are collectively consistent with economic theory. Thus, there is no inherent contradiction between increased productivity, increased consumer value, and unchanged business profitability.
Article
Full-text available
This paper demonstrates that the traditional categorization of innovation as either incremental or radical is incomplete and potentially misleading and does not account for the sometimes disastrous effects on industry incumbents of seemingly minor improvements in technological products. We examine such innovations more closely and, distinguishing between the components of a product and the ways they are integrated into the system that is the product "architecture," define them as innovations that change the architecture of a product without changing its components. We show that architectural innovations destroy the usefulness of the architectural knowledge of established firms, and that since architectural knowledge tends to become embedded in the structure and information-processing procedures of established organizations, this destruction is difficult for firms to recognize and hard to correct. Architectural innovation therefore presents established organizations with subtle challenges that may have significant competitive implications. We illustrate the concept's explanatory force through an empirical study of the semiconductor photolithographic alignment equipment industry, which has experienced a number of architectural innovations.
Article
Harrah's Entertainment may not offer the most dazzling casinos in the business, but it is the most profitable gaming company in the United States. Since 1998, Harrah's has recorded 16 straight quarters of same-store revenue growth. It boasts the most devoted clientele in the casino industry, a business notorious for fickle customers. Yet its casinos eschew the must-see amenities characteristic of its competitors in Las Vegas - the volcanoes, knights on horseback, gondolas, and mini-Manhattans. In this article, Harrah's Entertainment CEO and former Harvard Business School professor Gary Loveman explains how his company has trumped its competitors by mining customer data, running experiments using customer information, and using the findings to develop and implement marketing strategies that keep customers coming back for more. Harrah's identified its best customers - who were not typical high rollers - and taught them to respond to the casino's marketing efforts in a way that added to their individual value. The company took customer preference data collected through its Total Rewards incentive program and used decision-science-based analytical tools and database marketing to widen the gap between Harrah's and other casino operators that base their customer incentives more on intuition than empirical data. This deep data mining has succeeded because Harrah's has simultaneously maintained its focus on satisfying its customers. Loveman outlines the specific strategies and employee-performance measures that Harrah's uses to nurture customer loyalty across its 26 casinos. By refining the same-store tactics used by retailers and by delving into customer data, the first nationwide casino business has managed to fare well even in a bad economy.
Chapter
Introduction In the standard economic treatment of the principal–agent problem, compensation systems serve the dual function of allocating risks and rewarding productive work. A tension between these two functions arises when the agent is risk averse, for providing the agent with effective work incentives often forces him to bear unwanted risk. Existing formal models that have analyzed this tension, however, have produced only limited results. It remains a puzzle for this theory that employment contracts so often specify fixed wages and more generally that incentives within firms appear to be so muted, especially compared to those of the market. Also, the models have remained too intractable to effectively address broader organizational issues such as asset ownership, job design, and allocation of authority. In this article, we will analyze a principal–agent model that (i) can account for paying fixed wages even when good, objective output measures are available and agents are highly responsive to incentive pay; (ii) can make recommendations and predictions about ownership patterns even when contracts can take full account of all observable variables and court enforcement is perfect; (iii) can explain why employment is sometimes superior to independent contracting even when there are no productive advantages to specific physical or human capital and no financial market imperfections to limit the agent's borrowings; (iv) can explain bureaucratic constraints; and (v) can shed light on how tasks get allocated to different jobs.
Article
Managers regularly implement new ideas without evidence to back them up. They act on hunches and often learn very little along the way. That doesn't have to be the case. With the help of broadly available software and some basic investments in building capabilities, managers don't need a PhD in statistics to base consequential decisions on scientifically sound experiments. Some companies with rich consumer-transaction data (Toronto-Dominion, CKE Restaurants, eBay, and others) are routinely testing innovations well outside the realm of product R&D. As randomized testing becomes standard procedure in certain settings (website analysis, for instance), firms learn to apply it in other areas as well. Entire organizations that adopt a "test and learn" culture stand to realize the greatest benefits. That said, firms need to determine when formal testing makes sense. Generally, it's much more applicable to tactical decisions (such as choosing a new store format) than to strategic ones (such as figuring out whether to acquire a business). Tests are useful only if managers define and measure desired outcomes and formulate logical hypotheses about how proposed interventions will play out. To begin incorporating more scientific management into your business, acquaint managers at all levels with your organization's testing process. A shared understanding of what constitutes a valid test, and how it jibes with other processes, helps executives to set expectations and innovators to deliver on them. The process always begins with creating a testable hypothesis. Then the details of the test are designed, which means identifying sites or units to be tested, selecting control groups, and defining test and control situations. After the test is carried out for a specified period, managers analyze the data to determine results and appropriate actions. Results ideally go into a "learning library," so others can benefit from them.
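The analysis step of such a "test and learn" cycle, comparing treated stores against controls, can be sketched with the standard library alone. The store counts and sales figures below are made up for illustration:

```python
# Sketch of the test-vs-control comparison in a "test and learn" cycle.
# Ten treated stores vs. ten controls; all sales figures are made up.
from statistics import mean, stdev
from math import sqrt

test_sales = [105, 112, 98, 120, 110, 108, 115, 102, 111, 109]
control_sales = [100, 101, 97, 99, 103, 98, 102, 96, 100, 104]

# Mean lift of the new store format over the controls, with a
# two-sample t-style statistic on the difference in means.
lift = mean(test_sales) - mean(control_sales)
se = sqrt(stdev(test_sales) ** 2 / len(test_sales)
          + stdev(control_sales) ** 2 / len(control_sales))
t_stat = lift / se

# A |t| well above ~2 suggests the treatment beat the controls.
significant = abs(t_stat) > 2.0
```

In practice the result (lift, uncertainty, and the decision taken) is what would be filed in the "learning library" the article describes.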
Article
The growth of U.S. labor productivity rebounded in the second half of the 1990s, after nearly a quarter century of sluggish gains. We assess the contribution of information technology to this rebound, using the same neoclassical framework as in our earlier work. We find that a surge in the use of information technology capital and faster efficiency gains in the production of computers account for about two-thirds of the speed-up in productivity growth between the first and second halves of the 1990s. Thus, to answer the question posed in the title of the paper, information technology largely is the story.
Article
In this paper, Tobin's q theory is applied to evaluate the informativeness of the traditional accounting measures of business performance and the measures derived from the cash recovery rate (CRR). The informativeness is defined in terms of the correlation of the performance measure and the internal rate of return implied in a ratio known as Tobin's q. The paper shows that the traditional ROI does weakly reflect the underlying profitability. The internal rate of return derived from the cash recovery rate approach, however, seems to be more informative than the ROI.
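The two families of measures the paper compares can be written down directly. The balance-sheet figures below are hypothetical, chosen only to show the arithmetic:

```python
# Sketch of the two performance measures compared in the paper:
# Tobin's q (market-based) vs. traditional accounting ROI.
# All figures ($M) are hypothetical illustrations.
market_value_equity = 900.0
market_value_debt = 300.0
replacement_cost = 1000.0    # replacement cost of the firm's assets

# Tobin's q: market value of the firm over replacement cost of assets.
tobins_q = (market_value_equity + market_value_debt) / replacement_cost

# Traditional accounting ROI for the same hypothetical firm.
operating_income = 110.0
book_assets = 1000.0
roi = operating_income / book_assets
```

The paper's question is then which of these measures correlates better with the firm's underlying internal rate of return; a q above 1, as here, signals that the market values the firm above the cost of reproducing its assets.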
Article
The emergence of the Internet has pushed many established companies to explore this radically new distribution channel. Like all market discontinuities, the Internet creates opportunities as well as threats-it can be performance-enhancing as readily as it can be performance-destroying. Making use of event-study methodology, the authors assess the net impact of adding an Internet channel on a firm's stock market return, a measure of the change in expected future cash flows. The authors find that, on average, Internet channel investments are positive net-present-value investments. The authors then identify firm, introduction strategy, and marketplace characteristics that influence the direction and magnitude of the stock market reaction. The results indicate that powerful firms with a few direct channels are expected to achieve greater gains in financial performance than are less powerful firms with a broader direct channel offering. In terms of order of entry, early followers have a competitive advantage over both innovators and later followers, even when time of entry is controlled for. The authors also find that Internet channel additions that are supported by more publicity are perceived as having a higher performance potential.
Article
Is increased knowledge or enhanced skills the primary result of learning from experience? This study addresses this question by examining the effects of experience of administrators and the average experience of the administrators' units on four aspects of information-processing performance: need for breadth of information, need for depth of information, receiving more information than needed, and receiving less information than needed. That is, the administrator is viewed as an individual learner operating within an ecology of other learning administrators. Researchers have assumed that skills (information processing abilities gained from learning by doing) are more important than knowledge (the relatively formal and established facts, rules, policies, and procedures within the organization) in predicting how the individual and context effects of experience affect administrators' information-processing performance. However, using a survey of administrators in a multi-unit organization (N = 415), it is demonstrated that a model that assumes that knowledge is the primary intervening variable between experience and enhanced information processing correctly predicts both the individual and context effects of experience on information processing, such as the negative relationship between individual experience and the need for breadth and depth of information and getting less information than needed, and the negative relationships between average experience of an administrators' unit and receiving more information than needed. A model based on skills-acquisition as the primary intervening variable between experience and information-processing performance predicts contrary, and hence incorrect, results, leading us to conclude that knowledge is the primary result of experience for administrators. The experience/knowledge relationship is argued to have implications for understanding worker satisfaction and the liability of newness.
Article
The purpose of this paper is to explain why task uncertainty is related to organizational form. In so doing the cognitive limits theory of Herbert Simon was the guiding influence. As the consequences of cognitive limits were traced through the framework various organization design strategies were articulated. The framework provides a basis for integrating organizational interventions, such as information systems and group problem solving, which have been treated separately before.
Article
Recently, the relative demand for skilled labor has increased dramatically. We investigate one of the causes, skill-biased technical change. Advances in information technology (IT) are among the most powerful forces bearing on the economy. Employers who use IT often make complementary innovations in their organizations and in the services they offer. Our hypothesis is that these co-inventions by IT users change the mix of skills that employers demand. Specifically, we test the hypothesis that it is a cluster of
Article
This paper presents the results of a natural experiment conducted at a U.S. high-tech manufacturer. The experiment had as its treatment the adoption, at a single point in time, of a comprehensive enterprise information system throughout the functional groups charged with customer order fulfillment. This information technology (IT) adoption was not accompanied by substantial contemporaneous business process changes. Immediately after adoption, lead time and on-time delivery performance suffered, causing a “performance dip” similar to those observed after the introduction of capital equipment onto shop floors. Lead times and on-time delivery percentages then improved along a learning curve. After several months, performance in these areas improved significantly relative to preadoption levels. These observed performance patterns could not be well explained by rival causal factors such as order, production, and inventory volumes; head count; and new product introductions. Thus, this longitudinal research presents initial evidence of a causal link between IT adoption and subsequent improvement in operational performance measures, as well as evidence of the timescale over which these benefits appear.
Book
How can you know when someone is bluffing? Paying attention? Genuinely interested? The answer, writes Alex Pentland in Honest Signals, is that subtle patterns in how we interact with other people reveal our attitudes toward them. These unconscious social signals are not just a back channel or a complement to our conscious language; they form a separate communication network. Biologically based "honest signaling," evolved from ancient primate signaling mechanisms, offers an unmatched window into our intentions, goals, and values. If we understand this ancient channel of communication, Pentland claims, we can accurately predict the outcomes of situations ranging from job interviews to first dates. Pentland, an MIT professor, has used a specially designed digital sensor worn like an ID badge--a "sociometer"--to monitor and analyze the back-and-forth patterns of signaling among groups of people. He and his researchers found that this second channel of communication, revolving not around words but around social relations, profoundly influences major decisions in our lives--even though we are largely unaware of it. Pentland presents the scientific background necessary for understanding this form of communication, applies it to examples of group behavior in real organizations, and shows how by "reading" our social networks we can become more successful at pitching an idea, getting a job, or closing a deal. Using this "network intelligence" theory of social signaling, Pentland describes how we can harness the intelligence of our social network to become better managers, workers, and communicators.
Article
This paper studies a key driver of the demand for the products and services of the global IT industry---returns from IT investments. We estimate an intercountry production function relating IT and non-IT inputs to GDP output, on panel data from 36 countries over the 1985-1993 period. We find significant differences between developed and developing countries with respect to their structure of returns from capital investments. For the developed countries in the sample, returns from IT capital investments are estimated to be positive and significant, while returns from non-IT capital investments are not commensurate with relative factor shares. The situation is reversed for the developing countries subsample, where returns from non-IT capital are quite substantial, but those from IT capital investments are not statistically significant. We estimate output growth contributions of IT and non-IT capital and discuss the contrasting policy implications for capital investment by developed and developing economies.
Article
Despite increasing anecdotal evidence that information technology (IT) assets contribute to firm performance and future growth potential of firms, the empirical results relating IT investments to firm performance measures have been equivocal. However, the bulk of the studies have relied exclusively on accounting-based measures of firm performance, which largely tend to ignore IT's contribution to performance dimensions such as strategic flexibility and intangible value. In this paper, we use Tobin's q, a financial market-based measure of firm performance and examine the association between IT investments and firm q values, after controlling for a variety of industry factors and firm-specific variables. The results based on data from 1988-1993 indicate that, in all of the five years, the inclusion of the IT expenditure variable in the model increased the variance explained in q significantly. The results also showed that, for all five years, IT investments had a significantly positive association with Tobin's q value. Our results are consistent with the notion that IT contributes to a firm's future performance potential, which a forward-looking measure such as the q is better able to capture.
Article
Empirical research has revealed differences in the economic impact of information technology (IT) across industries. However, the source of these differences is unclear. In this study we analyze the role of the competitive environment in moderating the productive impact of information technology and regular capital. We focus on two important features of an industry's competitive environment: industry concentration and industry dynamism. Industry concentration is the degree to which the output of an entire industry is produced by a few firms and is considered an inverse proxy for industry competitiveness. Industry dynamism denotes change that is difficult to predict, measured as the deviation of industry sales from a trend line. We analyze the moderating impact of concentration and dynamism on the output elasticity of information technology and regular capital by estimating a production function using 5211 firm–year observations spanning the years 1987 to 1994. We find that the marginal product of IT is lower in more concentrated industries, while the opposite is true for regular capital. There is limited evidence that the marginal product of IT is higher in more dynamic industries, and strong evidence that the marginal product of regular capital is lower in more dynamic industries. Taken together, our results suggest that IT provides enhanced productivity impacts to firms in more competitive industries without any productivity loss in dynamic industries, in contrast to regular capital. The findings underscore the salience of inclusion of the competitive environment in studies of the productive impacts of information technology.
Article
A significant relationship is found between the market value of the firm and its ‘intangible’ capital, proxied by past R&D expenditures and the number of patents, based on a time-series cross-section analysis of data for large U.S. firms.
Article
This paper looks directly at the impact of firms' age and (process) innovations on productivity growth. A model that specifies productivity growth as an unknown function of these variables is devised and estimated using semiparametric methods. Results show that firms enter the market experiencing high productivity growth and that above-average growth rates tend to last for many years, but also that productivity growth of surviving firms converges. Process innovations at some point then lead to extra productivity growth, which also tends to persist somewhat attenuated for a number of years.
Conference Paper
While it is now well established that IT intensive firms are more productive, a critical question remains: Does IT cause productivity or are productive firms simply willing to spend more on IT? We address this question by examining the productivity and performance effects of enterprise systems investments in a uniquely detailed and comprehensive data set of 623 large, public U.S. firms. The data represent all U.S. customers of a large vendor during 1998-2005 and include the vendor's three main enterprise system suites: Enterprise Resource Planning (ERP), Supply Chain Management (SCM), and Customer Relationship Management (CRM). A particular benefit of our data is that they distinguish the purchase of enterprise systems from their installation and use. Since enterprise systems often take years to implement, firm performance at the time of purchase often differs markedly from performance after the systems go live. Specifically, in our ERP data, we find that purchase events are uncorrelated with performance while go-live events are positively correlated. This indicates that the use of ERP systems actually causes performance gains rather than strong performance driving the purchase of ERP. In contrast, for SCM and CRM, we find that performance is correlated with both purchase and go-live events. Because SCM and CRM are installed after ERP, these results imply that firms that experience performance gains from ERP go on to purchase SCM and CRM. Our results are robust against several alternative explanations and specifications and suggest that a causal relationship between ERP and performance triggers additional IT adoption in firms that derive value from their initial investment. These results provide an explanation of simultaneity in IT value research that fits with rational economic decision-making: Firms that successfully implement IT react by investing in more IT.
Our work suggests replacing either-or views of causality with a positive feedback loop conceptualization in which successful IT investments initiate a virtuous cycle of investment and gain. Our work also reveals other important estimation issues that can help researchers identify relationships between IT and business value.
Conference Paper
As part of an effort to examine the value of intangible assets in the firm, our study is the first to create IT-related intangible asset stocks from firm-level survey data. We also use data on IT-related business practices in order to understand the distribution of IT-related intangibles, and we create asset stocks to value research and development (R&D) and brand. Using a panel of 130 firms over the period 2003-2006, we find that intangible assets are correlated with significantly higher market values beyond their cost-based measures. Moreover, we estimate that there is a 30-55% premium in market value for the firms with the highest organizational IT capabilities (based on a measure of HR practices, management practices, internal IT use, external IT use, and Internet use) as compared to those with the lowest organizational IT capabilities.
Conference Paper
Capital One has exploited an innovative approach to targeted marketing, based on customer profitability analysis, to achieve impressive performance as a leading credit card issuer. It is sustaining its advantage through investment in infrastructure and personnel, and by constantly improving its expertise through a practice known as test-and-learn. Moreover, it is attempting to generalize this information-based strategy to other industries.
Article
Enterprise Resource Planning (ERP) software systems integrate key business and management processes within and beyond a firm's boundary. Although the business value of ERP implementations has been extensively debated in trade periodicals in the form of qualitative discussion or detailed case studies, there is little large-sample statistical evidence on whether the benefits of ERP implementation exceed the costs and risks. With multiyear multi-firm ERP implementation and financial data, we find that firms that invest in ERP tend to show higher performance across a wide variety of financial metrics. Even though there is a slowdown in business performance and productivity shortly after the implementation, financial markets consistently reward the adopters with higher market valuation (as measured by Tobin's q). Due to the lack of mid- and long-term post-implementation data, future research on the long-run impact of ERP is proposed.
Article
This paper examines the relationship between information technology (IT) and the organizational architecture of firms. Firms that are extensive users of information technology tend to adopt a complementary set of organizational practices that include: decentralization of decision authority, emphasis on subjective incentives, and a greater reliance on skills and human capital. We explore these relationships using detailed data on work systems and information technology spending for 273 large firms. Overall, we find that increased investment in IT is linked to a system of decentralized authority and related practices. Our findings may help resolve some of the questions about the relationships of information technology to internal organization and provide insight into the optimal organization of knowledge work.
Article
This paper develops empirical proxy measures of information technology (IT) risk and incorporates them into the usual empirical models for analyzing IT returns: production function and market value specifications. The results suggest that IT capital investments make a substantially larger contribution to overall firm risk than non-IT capital investments. Further, firms with higher IT risk have a higher marginal product of IT relative to firms with low IT risk. In the market value specification, the impact of IT risk is positive and significant, and inclusion of the IT risk term substantially reduces the coefficient on IT capital. We estimate that about 30% of the gross return on IT investment corresponds to the risk premium associated with IT risk. Taken together, our results show that IT risk provides part of the explanation for the unusually high valuations of IT capital investment in recent research.
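The production function and market value specifications mentioned above are, at their core, log-linear regressions. A minimal sketch on synthetic data (all variable names, sample sizes, and coefficient values here are invented for illustration, not taken from the study) might look like:

```python
import numpy as np

# Synthetic firm-level data; coefficients below are illustrative assumptions
rng = np.random.default_rng(42)
n = 500
ln_it, ln_k, ln_l = (rng.normal(size=n) for _ in range(3))  # log IT capital, other capital, labor
ln_y = 0.5 + 0.15 * ln_it + 0.25 * ln_k + 0.55 * ln_l + rng.normal(0, 0.05, n)

# OLS on the log-linear (Cobb-Douglas) form: ln Y = a + b*ln IT + c*ln K + d*ln L
X = np.column_stack([np.ones(n), ln_it, ln_k, ln_l])
coef, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
# coef[1] estimates the output elasticity of IT capital; a study of IT risk
# would add a risk proxy and its interaction with ln_it as extra columns
```

Splitting the sample by an IT-risk proxy, or interacting the proxy with `ln_it`, is one way to ask whether high-risk firms show a higher marginal product of IT.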
Article
The relationship between investment in information technology (IT) and its effect on organizational performance continues to interest academics and practitioners. In many cases, due to the nature of the research design employed, this stream of research has been unable to identify the impact of individual technologies on organizational performance. This study posits that the driver of IT impact is not the investment in the technology, but the actual usage of the technology. This proposition is tested in a longitudinal setting of a healthcare system comprising eight hospitals. Monthly data for a three-year period on various financial and nonfinancial measures of hospital performance and technology usage were analyzed. The data analysis provides evidence for the technology usage–performance link after controlling for various external factors. Technology usage was positively and significantly associated with measures of hospital revenue and quality, and this effect occurred after time lags. The analysis was triangulated using three measures of technology usage. The general support for the principal proposition of this paper that “actual usage” may be a key variable in explaining the impact of technology on performance suggests that omission of this variable may be a missing link in IT payoff analyses.
Article
In this concluding article to the Management Science special issue on “Managing Knowledge in Organizations: Creating, Retaining, and Transferring Knowledge,” we provide an integrative framework for organizing the literature on knowledge management. The framework has two dimensions. The knowledge management outcomes of knowledge creation, retention, and transfer are represented along one dimension. Properties of the context within which knowledge management occurs are represented on the other dimension. These properties, which affect knowledge management outcomes, can be organized according to whether they are properties of a unit (e.g., individual, group, organization) involved in knowledge management, properties of relationships between units or properties of the knowledge itself. The framework is used to identify where research findings about knowledge management converge and where gaps in our understanding exist. The article discusses mechanisms of knowledge management and how those mechanisms affect a unit's ability to create, retain and transfer knowledge. Emerging themes in the literature on knowledge management are identified. Directions for future research are suggested.
Article
An identification and evaluation of the possible explanations for the high market valuation of Y2K spending is presented. A market-valuation model is used to determine the valuation multiple on Y2K spending of the sample Fortune 1000 firms on March 31, 1999. The multiple on Y2K spending of 62.10 for March 31, 1999 indicates that about $60 of firm value was associated with each $1 of Y2K spending. Due to the high costs of remediating and maintaining legacy systems, many companies were also found to use the Y2K opportunity to replace an existing cluster of legacy systems with an enterprise resource planning (ERP) system integrating various internal applications through a common database.
Article
Despite the importance to researchers, managers, and policy makers of how information technology (IT) contributes to organizational performance, there is uncertainty and debate about what we know and don’t know. A review of the literature reveals that studies examining the association between information technology and organizational performance are divergent in how they conceptualize key constructs and their interrelationships. We develop a model of IT business value based in the resource-based view of the firm that integrates the various strands of research into a single framework. We apply the integrative model to synthesize what is known about IT business value and guide future research by developing propositions and suggesting a research agenda. A principal finding is that IT is valuable, but the extent and dimensions are dependent upon internal and external factors, including complementary organizational resources of the firm and its trading partners, as well as the competitive and macro environment. Our analysis provides a blueprint to guide future research and facilitate knowledge accumulation and creation concerning the organizational performance impacts of information technology.
Article
This paper investigates the impact of IT investments and worker composition on the productivity of life insurance companies. The majority of previous IT productivity studies follow a technological imperative, hypothesizing a direct relationship between higher IT investments and increased productivity. This paper shifts the focus toward the organizational imperative, which views returns on IT investments as a result of the alignment between technology and other critical management choices. Specifically, the study focuses on the alignment between IT investments and worker composition, measured in terms of relative numbers of clerical, managerial, and professional positions to the total number of employees. Hypotheses are tested using a data set compiled over a 10-year period for 52 life insurance companies. With respect to prior research, the study is novel in its adoption of a model of productivity that accounts for both separate and combined effects of IT investments and worker composition. Premium income per employee and total operating expense to premium income are used as indicators of productivity. Study findings show that increases in IT expenses are associated with productivity benefits when accompanied by changes in worker composition. Life insurance companies that have decreased their proportion of clericals and professionals while at the same time investing in IT have experienced productivity improvements. On the other hand, companies decreasing their proportion of managers while investing in IT are found to have reduced productivity.
Article
Payoffs from information technology (IT) continue to generate interest and debate both among academicians and practitioners. The extant literature cites inadequate sample size, lack of process orientation, and analysis methods among the reasons some studies have shown mixed results in establishing a relationship between IT investment and firm performance. In this paper we examine the structural variables that affect IT payoff through a meta-analysis of 66 firm-level empirical studies between 1990 and 2000. Employing logistic regression and discriminant analyses, we present statistical evidence of the characteristics that discriminate between IT payoff studies that observed a positive effect and those that did not. In addition, we conduct ordinary least squares (OLS) regression on a continuous measure of IT payoff to examine the influence of structural variables on the result of IT payoff studies. The results indicate that the sample size, data source (firm-level or secondary), and industry in which the study is conducted influence the likelihood of the study finding greater improvements in firm performance. The choice of the dependent variable(s), the type of statistical analysis conducted, and whether the study adopted a cross-sectional or longitudinal design also appear to influence the outcome (although we did not find support for process-oriented measurement). Finally, we present implications of the findings and recommendations for future research.
Article
Firms are undertaking growing numbers of e-commerce initiatives and increasingly making significant investments required to participate in the growing online market. However, empirical support for the benefits to firms from e-commerce is weaker than glowing accounts in the popular press based on anecdotal evidence would lead us to believe. In this paper, we explore the following questions: What are the returns to shareholders in firms engaging in e-commerce? How do the returns to conventional, brick and mortar firms from e-commerce initiatives compare with returns to the new breed of net firms? How do returns from business-to-business e-commerce compare with returns from business-to-consumer e-commerce? How do the returns to e-commerce initiatives involving digital goods compare to initiatives involving tangible goods? We examine these issues using event study methodology and assess the cumulative abnormal returns to shareholders (CARs) for 251 e-commerce initiatives announced by firms between October and December 1998. The results suggest that e-commerce initiatives do indeed lead to significant positive CARs for firms' shareholders. While the CARs for conventional firms are not significantly different from those for net firms, the CARs for business-to-consumer (B2C) announcements are higher than that for business-to-business (B2B) announcements. Also, the CARs with respect to e-commerce initiatives involving tangible goods are higher than for those involving digital goods. Our data were collected in the last quarter of 1998 during a unique bull market period and the magnitudes of CARs (between 4.9 and 23.4 percent for different sub-samples) in response to e-commerce announcements are larger than those reported for a variety of other firm actions in prior event studies. This paper presents the first empirical test of the dot com effect, validating popular anticipations of significant future benefits to firms entering into e-commerce arrangements.
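The event study methodology behind these CAR figures can be sketched in a few lines: fit a market model over an estimation window, then cumulate the abnormal (actual minus predicted) returns over the event window. The series below is synthetic, and the injected 2% abnormal return is an invented example, not a number from the study.

```python
import numpy as np

def market_model_car(stock_ret, mkt_ret, est_window, event_window):
    """Cumulative abnormal return: fit alpha/beta on the estimation
    window, then sum (actual - predicted) over the event window."""
    beta, alpha = np.polyfit(mkt_ret[est_window], stock_ret[est_window], 1)
    expected = alpha + beta * mkt_ret[event_window]
    return float((stock_ret[event_window] - expected).sum())

# Synthetic example: a stock tracking the market, plus a 2% daily
# abnormal return injected over a 5-day event window
rng = np.random.default_rng(0)
mkt = rng.normal(0.0005, 0.01, 130)
stock = 0.001 + 1.2 * mkt
stock[120:125] += 0.02
car = market_model_car(stock, mkt, slice(0, 120), slice(120, 125))
```

On this noise-free synthetic series the five injected 2% days cumulate to a CAR of 10%; with real returns, the fitted market model only approximates expected returns, and significance tests on the CARs are needed.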
Article
An important management question today is whether the anticipated economic benefits of Information Technology (IT) are being realized. In this paper, we consider this problem to be measurement related, and propose and test a new process-oriented methodology for ex post measurement to audit IT impacts on a strategic business unit (SBU) or profit center's performance. The IT impacts on a given SBU are measured relative to a group of SBUs in the industry. The methodology involves a two-stage analysis of intermediate and higher level output variables that also accounts for industry and economy wide exogenous variables for tracing and measuring IT contributions. The data for testing the proposed model were obtained from SBUs in the manufacturing sector. Our results show significant positive impacts of IT at the intermediate level. The theoretical contribution of the study is a methodology that attempts to circumvent some of the measurement problems in this domain. It also provides a practical management tool to address the question of why (or why not) certain IT impacts occur. Additionally, through its process orientation, the suggested approach highlights key variables that may require managerial attention and subsequent action. Copyright © 1995, Institute for Operations Research and the Management Sciences.
Book
This study develops an evolutionary theory of the capabilities and behavior of business firms operating in a market environment. It includes both general discussion and the manipulation of specific simulation models consistent with that theory. The analysis outlines the differences between an evolutionary theory of organizational and industrial change and a neoclassical microeconomic theory. The antecedents to the former are studies by economists like Schumpeter (1934) and Alchian (1950). It is contrasted with the orthodox theory in the following aspects: while the evolutionary theory views firms as motivated by profit, their actions are not assumed to be profit maximizing, as in orthodox theory; the evolutionary theory stresses the tendency of most profitable firms to drive other firms out of business, but, in contrast to orthodox theory, does not concentrate on the state of industry equilibrium; and evolutionary theory is related to behavioral theory: it views firms, at any given time, as having certain capabilities and decision rules, as well as engaging in various 'search' operations, which determine their behavior; while orthodox theory views firm behavior as relying on the use of the usual calculus maximization techniques. The theory is then made operational by the use of simulation methods. These models use Markov processes and analyze selection equilibrium, responses to changing factor prices, economic growth with endogenous technical change, Schumpeterian competition, and the Schumpeterian tradeoff between static Pareto-efficiency and innovation. The study's discussion of search behavior complicates the evolutionary theory. With search, the decision making process in a firm relies as much on past experience as on innovative alternatives to past behavior. This view combines Darwinian and Lamarckian views on evolution; firms are seen as both passive with regard to their environment, and actively seeking alternatives that affect their environment.
The simulation techniques used to model Schumpeterian competition reveal that there are usually winners and losers in industries, and that the high productivity and profitability of winners confer advantages that make further success more likely, while decline breeds further decline. This process creates a tendency for concentration to develop even in an industry initially composed of many equal-sized firms. However, the experiments conducted reveal that the growth of concentration is not inevitable; for example, it tends to be smaller when firms focus their searches on imitating rather than innovating. At the same time, industries with rapid technological change tend to grow more concentrated than those with slower progress. The abstract model of Schumpeterian competition presented in the study also allows us to see more clearly the public policy issues concerning the relationship between technical progress and market structure. The analysis addresses the pervasive question of whether industry concentration, with its associated monopoly profits and reduced social welfare, is a necessary cost if societies are to obtain the benefits of technological innovation. (AT)
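The success-breeds-success dynamic described here can be caricatured with a toy replicator simulation; this is a loose sketch in the spirit of Nelson and Winter's models, with invented parameters, not a reproduction of their actual simulations.

```python
import numpy as np

def simulate_concentration(n_firms=20, periods=50, innov_sd=0.02, seed=1):
    """Toy Schumpeterian dynamic: each period firms draw stochastic
    productivity improvements, and market share grows in proportion to
    share-weighted productivity. Returns the path of the Herfindahl
    index (sum of squared market shares) as a concentration measure."""
    rng = np.random.default_rng(seed)
    prod = np.ones(n_firms)
    share = np.full(n_firms, 1.0 / n_firms)
    hhi = []
    for _ in range(periods):
        prod *= np.exp(rng.normal(0, innov_sd, n_firms))  # random innovation
        fitness = share * prod
        share = fitness / fitness.sum()  # success breeds success
        hhi.append(float((share ** 2).sum()))
    return hhi

hhi_path = simulate_concentration()  # concentration rises as winners pull away
```

Shrinking `innov_sd` (or letting laggards copy the leader's productivity, as an imitation regime) should slow the growth of the Herfindahl index in this toy model, mirroring the imitation-versus-innovation result described above.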
Article
Investments in certain technologies do confer a competitive edge - one that has to be constantly renewed, as rivals don't merely match your moves but use technology to develop more potent ones and leapfrog over you. That's the conclusion of a comprehensive analysis that Harvard Business School professor McAfee and MIT professor Brynjolfsson conducted of all publicly traded U.S. companies in all industries over the past few decades. They found a clear correlation between levels of IT spending and a new competitive dynamic: Since the mid-1990s, when the rate of spending on IT began to rise sharply, the spread between the leaders and laggards in an industry has widened. There are more winner-take-all markets. But the increased concentration has ramped up, rather than dampened, churn among the remaining players. And these dynamics are greatest in those industries that are more IT intensive. This pattern is already familiar to the makers of digital products, but it has now spread to traditional industries, the authors contend, not because more products are becoming digital but because more processes are. Enterprise software like ERP and CRM systems, coupled with cheap networks, is allowing companies to replicate their unique business processes quickly, widely, and faithfully, in the same way that a digital photo can be endlessly reproduced. In this new environment, top managers must pay careful attention to which processes to make consistent and which to vary locally. And while standardizing some ways of working, they must also encourage employees to come up with creative process improvements to outdo competitors' innovations. Competing at such high speeds isn't easy, and not everyone will be able to keep up - but the companies that do may realize vastly improved business processes as well as higher market share and increased market value.
Article
This paper provides a survey of studies that analyze the macroeconomic effects of intellectual property rights (IPR). The first part of this paper introduces different patent policy instruments and reviews their effects on R&D and economic growth. This part also discusses the distortionary effects and distributional consequences of IPR protection as well as empirical evidence on the effects of patent rights. Then, the second part considers the international aspects of IPR protection. In summary, this paper draws the following conclusions from the literature. Firstly, different patent policy instruments have different effects on R&D and growth. Secondly, there is empirical evidence supporting a positive relationship between IPR protection and innovation, but the evidence is stronger for developed countries than for developing countries. Thirdly, the optimal level of IPR protection should trade off the social benefits of enhanced innovation against the social costs of multiple distortions and income inequality. Finally, in an open economy, achieving the globally optimal level of protection requires international coordination (rather than harmonization) of IPR protection.