Article

Data Quality for Data Science, Predictive Analytics, and Big Data in Supply Chain Management: An Introduction to the Problem and Suggestions for Research and Applications


Abstract

Today’s supply chain professionals are inundated with data, motivating new ways of thinking about how data are produced, organized, and analyzed. This has provided an impetus for organizations to adopt and perfect data analytic functions (e.g. data science, predictive analytics, and big data) in order to enhance supply chain processes and, ultimately, performance. However, management decisions informed by the use of these data analytic methods are only as good as the data on which they are based. In this paper, we introduce the data quality problem in the context of supply chain management (SCM) and propose methods for monitoring and controlling data quality. In addition to advocating for the importance of addressing data quality in supply chain research and practice, we also highlight interdisciplinary research topics based on complementary theory.
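The abstract's proposal to monitor and control data quality is commonly operationalized with statistical process control: treat a data quality metric (for example, the daily fraction of records failing validation) as a process output and chart it against control limits. The Python sketch below is illustrative only, using assumed metric values and a standard Shewhart-style three-sigma rule, not the paper's specific procedure.

```python
from statistics import mean, stdev

def control_limits(baseline, k=3.0):
    """Shewhart-style limits estimated from an in-control baseline sample."""
    m, s = mean(baseline), stdev(baseline)
    return m - k * s, m + k * s

def out_of_control(observations, lcl, ucl):
    """Indices of observations that breach the control limits."""
    return [i for i, x in enumerate(observations) if x < lcl or x > ucl]

# Illustrative daily error rates: fraction of records failing validation.
baseline = [0.020, 0.022, 0.019, 0.021, 0.020, 0.023, 0.018, 0.021]
lcl, ucl = control_limits(baseline)

new_days = [0.021, 0.019, 0.065, 0.020]  # the third day spikes
print(out_of_control(new_days, lcl, ucl))  # → [2]
```

Here the third new observation breaches the upper limit and would trigger a data quality investigation before the data feeds downstream analytics.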


... Numerous studies show how ABA is applied in various resource allocation domains. In supply chain management, for instance, researchers have demonstrated how predictive analytics can forecast demand and optimize inventory levels, lowering costs and improving service quality (Hazen et al., 2014; Ghadge et al., 2019). ABA has also been shown to enhance workforce planning and talent acquisition in human resource management by analyzing employee data to predict turnover and identify skill gaps (Huang et al., 2020; Jha et al., 2022). ...
... Every domain uses distinct analytical methods to tackle certain problems. In supply chains, for example, predictive analytics reduces costs by optimizing inventory levels in addition to forecasting demand (Hazen et al., 2014; Ghadge et al., 2019). This cross-domain applicability emphasizes how companies must take a customized approach to analytics, matching tools and methods to particular operational objectives. ...
... In a variety of fields, such as supply chain management, human resources, project management, and finance management, advanced business analytics (ABA) exhibits adaptability. For instance, predictive analytics efficiently predicts demand and maximizes inventory, demonstrating the necessity of customized analytical methods that complement certain operational objectives (Hazen et al., 2014; Ghadge et al., 2019). ...
Article
Full-text available
Allocating resources optimally is crucial for businesses aiming for both strategic and operational success in today's complex and fiercely competitive business environment. By converting data into usable insights, advanced business analytics (ABA) provides useful techniques and tools that improve decision-making. The purpose of this study is to examine how well ABA optimizes resource allocation, highlighting its contribution to increased operational performance and tackling the difficulties encountered in dynamic contexts. To learn more about the use and effects of ABA, the study uses a qualitative research methodology based on secondary sources, examining case studies, industry reports, and existing literature. The main topic addressed is the underutilization of ABA in businesses, a result of obstacles including a shortage of qualified staff, poor data quality, and aversion to change. Important conclusions show that firms that successfully use ABA benefit from better decision-making, greater agility, and more efficient use of resources. However, the broad use of analytics is constrained by ongoing issues with data governance and cultural resistance. One of the study's shortcomings is its dependence on secondary data, which might not fully represent the range of organizational experiences with ABA. Nevertheless, the results have important theoretical and practical ramifications, indicating that to properly utilize advanced business analytics, firms need to invest in training, data quality enhancements, and cultural change. This study adds to the expanding corpus of research on ABA and offers practitioners practical advice for improving resource management techniques.
... For example, cloud-based systems enable smooth stakeholder communication, while IoT devices offer insightful data on logistics performance and inventory levels [6]. According to Hazen et al. [7], the integration of technology not only lowers operating expenses but also improves responsiveness to client demands, which in turn leads to higher customer satisfaction. For businesses looking to maximise return on investment (ROI) and optimise resource allocation, marketing efficiency is essential. ...
... This deeper comprehension enables the creation of more focused marketing plans, which in turn raises consumer satisfaction and engagement [10]. Moreover, IT makes inventory management easier, guaranteeing that goods are available when and where customers need them, a critical component of keeping a competitive edge in the marketplace [7]. Research has indicated that companies that successfully integrate IT into their supply chains see enhanced marketing outcomes, such as increased revenue and market share [12]. ...
... The examined literature as shown in Table 2, reveals significant insights into the integration of IT in supply chains and its impact on marketing outcomes. Companies that effectively integrate IT solutions tend to experience higher customer satisfaction and operational efficiency [7]. Utilizing big data and advanced analytics allows businesses to better predict demand and manage inventories, which reduces costs and enhances service levels [2]. ...
Article
Full-text available
This study examines how supply chain management (SCM) uses information technology (IT) and how that integration affects marketing effectiveness. IT plays a critical role in improving SCM procedures as organisations aim for operational excellence in a world going digital. The current body of knowledge regarding the connection between IT integration in SCM and marketing outcomes is compiled in this study through the use of a systematic literature review (SLR). Important conclusions show that using IT solutions, including cloud computing and advanced analytics, greatly enhances the communication, accessibility, and reactivity of data in marketing initiatives. In order to maximise the allocation of marketing resources and drive client interaction, the discussion highlights the revolutionary potential of IT. The study ends with proposals for additional research to examine the changing landscape of IT in SCM, as well as tips for practitioners.
... Big data technologies daily face the rapid evolution in volume as well as variety and velocity of processed data [1]. Such big data characteristics routinely force analytics pipelines to underperform, requiring continuous maintenance and optimization. ...
... Such big data characteristics routinely force analytics pipelines to underperform, requiring continuous maintenance and optimization. One major reason for this is poor data quality.¹ Poor data quality leads to low data utilisation efficiency and even brings forth serious decision-making errors [2]. ...
... In a similar way, many scientists make the case that improving data quality by focusing on the content is a crucial factor in achieving better results [3]. A plethora of available data sources and datasets in the organisations' data centres and lakes poses a significant challenge: deciding which datasets should be selected and passed into analytic workflows in order to produce more accurate results/predictions. ...
¹ https://tinyurl.com/de62sf48
Preprint
The massive increase in the data volume and dataset availability for analysts compels researchers to focus on data content and select high-quality datasets to enhance the performance of analytics operators. While selecting the highest quality data for analysis highly increases task accuracy and efficiency, it is still a hard task, especially when the number of available inputs is very large. To address this issue, we propose a novel methodology that infers the outcome of analytics operators by creating a model from datasets similar to the queried one. Dataset similarity is performed via projecting each dataset to a vector embedding representation. The vectorization process is performed using our proposed deep learning model NumTabData2Vec, which takes a whole dataset and projects it into a lower vector embedding representation space. Through experimental evaluation, we compare the prediction performance and the execution time of our framework to another state-of-the-art modelling operator framework, illustrating that our approach predicts analytics outcomes accurately. Furthermore, our vectorization model can project different real-world scenarios to a lower vector embedding representation and distinguish between them.
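The abstract's core pattern, projecting each dataset to a vector embedding and then selecting similar datasets in that space, can be sketched without the proposed NumTabData2Vec model. The toy encoder below (per-column mean and spread) is a stand-in assumption for a learned embedding; only the overall select-by-cosine-similarity workflow mirrors the description.

```python
from math import sqrt
from statistics import mean, stdev

def embed(dataset):
    """Toy stand-in for a learned encoder: summarize each numeric
    column of the dataset by its mean and standard deviation."""
    vec = []
    for col in zip(*dataset):
        vec += [mean(col), stdev(col) if len(col) > 1 else 0.0]
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def most_similar(query, candidates):
    """Index of the candidate dataset closest to the query in embedding space."""
    q = embed(query)
    return max(range(len(candidates)),
               key=lambda i: cosine(q, embed(candidates[i])))

query = [(1.0, 10.0), (2.0, 11.0), (3.0, 12.0)]       # rows of a small dataset
near  = [(1.1, 10.2), (2.1, 11.1), (2.9, 12.3)]       # similar distribution
far   = [(100.0, -5.0), (200.0, -7.0), (300.0, -6.0)]  # very different scale
print(most_similar(query, [far, near]))  # → 1 (the similar dataset)
```

A learned encoder such as NumTabData2Vec would replace `embed`, but neighbour selection over the embedding space proceeds the same way.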
... magnitude, variety, and speed of data created daily within supply chain networks [7,8]. Big data models in process control, for instance, can be used to manage natural resources sustainably and reduce pollution, creating the best possible plans to increase the sustainability of green supply chain management [9,10,11,5]. ...
... First, our study's findings indicate a strong positive correlation between the GSC and BDAC. These findings corroborate the claims made by several recent conceptual and empirical studies that the implementation of BDAC improves cooperation among supply chain participants [30,7]. One reason could be that the focal firm's big data technical competence mostly comprises its own systems and data analytics capabilities, which could facilitate information sharing and communication across various firm functions as well as with suppliers and customers. ...
Article
Full-text available
In recent years, scholars and practitioners have been more interested in big data analytics capabilities. Though typically understudied, research in this new sector is growing. An enhanced green supply chain can be achieved, according to this article, provided businesses adopt and restructure some of the big data resources and capabilities inside their supply function. 332 managers of supply chain, production, and information systems from shareholder manufacturing companies listed on the Amman Stock Exchange participated in a survey employing a questionnaire. To examine the data, structural equation modeling (SEM) was used. The researched hypotheses were evaluated using Amos V.22. The empirical findings demonstrate how supply chain visibility and agility, as well as green supply chain, are influenced by big data analytics capability. Green supply chains have an impact on supply chain agility and visibility. Additionally, the influence of big data analytics capabilities on green supply chains is mediated by supply chain visibility and agility. Additionally, the results provide managers with concrete evidence that enhancing supply visibility and agility via BDAC development can increase the degree of GSC.
... A global retailer successfully deployed an advanced IAM solution, incorporating role-based access controls and multi-factor authentication across all cloud platforms. The implementation resulted in a substantial 55% reduction in unauthorized access incidents within the first year of deployment (Hazen et al., 2014). Continuous monitoring, including monthly access reviews and real-time alerts, improved detection rates of suspicious activities by 60% (Davenport, Barth, & Bean, 2012). ...
... The implementation streamlined incident workflows, significantly reducing average incident response times from 8 hours to under 2 hours (Waller & Fawcett, 2013). Automation of repetitive tasks led to a 45% increase in productivity among security analysts, enabling them to focus more effectively on complex threat investigations (Hazen et al., 2014). Real-time alerts improved the detection rate of security incidents by approximately 70%, ensuring quicker containment and minimal damage (Davenport, Barth, & Bean, 2012). ...
Article
Full-text available
Retail organizations increasingly leverage cloud computing to enhance operational flexibility, scalability, and efficiency. However, migrating retail data to the cloud introduces critical security risks, particularly regarding sensitive consumer and transaction data. Effective cloud security practices are essential to mitigate threats and safeguard information integrity, confidentiality, and availability. This white paper addresses the critical areas of securing retail data in cloud environments, outlining best practices in cloud security management, data encryption, identity and access management (IAM), regulatory compliance, and incident response strategies. Retailers must carefully consider the cloud service provider's security measures and their alignment with organizational requirements and compliance standards. Employing comprehensive encryption solutions for data in transit and at rest significantly reduces the likelihood of unauthorized data access. Robust IAM frameworks ensure only authorized personnel can access sensitive data, reducing internal threats and managing user permissions effectively. Compliance with regulations such as PCI DSS, GDPR, and HIPAA requires thorough auditing and transparent reporting procedures to demonstrate adherence. This paper presents case studies demonstrating effective cloud data protection, outlines emerging trends in cloud security technologies, and provides a roadmap for retailers to strengthen their security posture. Implementing these best practices ensures that retailers maintain trust with customers, mitigate security risks, and achieve sustainable business resilience.
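The role-based access controls this abstract recommends reduce to a simple check: a request is granted only if one of the user's roles carries the required permission. A minimal sketch, with hypothetical roles and permission names:

```python
# Hypothetical role/permission mapping for a retail cloud deployment.
ROLE_PERMS = {
    "analyst": {"read:sales"},
    "admin":   {"read:sales", "read:pii", "write:catalog"},
}

def authorize(user_roles, permission):
    """Grant access only if some role held by the user carries the permission."""
    return any(permission in ROLE_PERMS.get(role, set()) for role in user_roles)

print(authorize(["analyst"], "read:pii"))  # → False (least privilege holds)
print(authorize(["admin"], "read:pii"))    # → True
```

Production IAM systems layer multi-factor authentication, auditing, and policy languages on top, but this grant-by-role check is the core of the model.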
... Nevertheless, the combination of recovery and service investment cost-sharing strategies can recognize this under certain conditions. Some scholars have studied a class of service investment strategies such as delivery services (Chen et al. 2022b), warranty services, logistics services (Qin et al. 2020, 2021), and big data services (Hazen et al. 2014). These studies have interesting findings regarding service investment. ...
... Under an online retailer's service investment, the service level s incurs a quadratic cost s²/2 (Zhang et al. 2018). With big data and intelligent cloud technology applications, the hybrid e-commerce platform can comprehensively monitor online retailers' sales indicators in the marketplace channel and screen online retailers that meet reputation certification's requirements using algorithms (Hazen et al. 2014; Chae 2015; Han et al. 2018). However, e-commerce platforms also require cost c for data collection, screening, and calculation (Zhou et al. 2020). ...
Article
Full-text available
Hybrid e-commerce platforms have launched reputation certification strategies for third-party online retailers to encourage service investment. A crucial aspect of the reputation certification strategy is its potential impact on consumers' purchase decisions, affecting online retailers' service investment strategies. The study examines these problems by constructing four decision models that combine whether the platform launches reputation certification with whether the online retailer invests in service. The study uses the game theory method to explore the impact of reputation certification strategies on the online retailer's service investment in a hybrid e-commerce platform. First, the platform launches reputation certification when its implementation cost is low, and the commission rate, the marketplace channel's retailing inefficiency, the reputation certification's inefficiency, and consumer sensitivity to service meet certain conditions. Second, reputation certification does not necessarily help the online retailer set higher prices, and a high reputation certification standard may hinder the online retailer from service investment. Finally, the platform's launch of reputation certification increases consumer welfare and social welfare only when consumers' sensitivity to service is low, whereas service investment by the online retailer always increases consumer surplus and social welfare. The findings provide new insights into hybrid e-commerce platforms that aim to launch reputation certification strategies.
... The adoption of advanced technologies has become increasingly crucial in this transformation, enabling organizations to optimize their processes, reduce operational costs, and improve service delivery (Gunasekaran et al., 2017) [3]. Technologies such as Artificial Intelligence (AI), the Internet of Things (IoT), blockchain, and advanced data analytics have significantly reshaped traditional logistics practices, providing businesses with the capability to respond rapidly to market demands and consumer expectations (Hazen et al., 2014; Waller & Fawcett, 2013) [5,11]. India's logistics and supply chain industry is a vital component of the nation's economic framework, contributing approximately 13%-14% to the country's Gross Domestic Product (GDP) and employing millions of people (Ministry of Commerce and Industry, 2020) [7]. ...
Article
Full-text available
Objectives: This research investigates the adoption of advanced technologies in India’s logistics and supply chain sector, with a particular focus on Small and Medium Enterprises (SMEs). The study aims to evaluate the advantages and challenges of technology adoption, specifically in terms of improving operational efficiency, reducing costs, and enhancing customer service. While technologies such as Artificial Intelligence (AI), the Internet of Things (IoT), and blockchain present significant opportunities, SMEs face considerable barriers, including financial constraints, a lack of technical expertise, and organizational resistance to change. Methodology: Using a mixed-methods approach, this study combines quantitative surveys and qualitative interviews with 33 SMEs to provide comprehensive insights into the barriers and facilitators of technology adoption. The research assesses the extent of adoption, the perceived benefits, and the challenges encountered by SMEs in integrating these technologies into their operations. Findings: Findings indicate that over 50% of SMEs experience operational improvements, 45% report cost reductions, and 60% note enhanced supply chain visibility. However, 70% of SMEs identify high implementation costs as a major barrier, while 55% face challenges related to skill shortages, and 40% encounter organizational resistance, particularly in more traditional firms. AI emerges as the most widely adopted technology, utilized by 69.7% of respondents, followed by IoT and blockchain at 42.4%. Despite widespread adoption, full integration remains limited, with many SMEs still in the early stages of technology deployment. External factors such as market competition and customer demand are significant drivers of adoption, while regulatory frameworks and access to funding are less influential. 
Organizational culture also plays a crucial role, with 21.2% of respondents acknowledging management’s support for technology adoption and 33.3% of employees involved in the decision-making process. The research concludes with recommendations for policymakers and industry stakeholders, emphasizing the need for financial support, skill development programs, and phased adoption strategies to help SMEs overcome barriers and enhance their competitiveness in the logistics and supply chain sectors. Keywords: Logistics, supply chain, technology, organizational, digital transformation, sustainability, cyber security.
... Data serves as the fuel for scientific advancement in the contemporary world, playing a pivotal role in research. Quality data enables researchers to establish baselines and benchmarks, conduct experiments, derive insights, and tackle problems effectively [1]. Presently, researchers can swiftly acquire and develop a substantial volume of rich, varied datasets, routinely applying mathematical models and conducting statistical experiments with ever-increasing precision. ...
... We perform the first task (Task 1) of dataset extraction on the RCC-1 data, and the second task (Task 2) of dataset linking on the RCC-2 data. Details about the RCC-1 and RCC-2 datasets are in Section 3. Our approach for Task 1 consists of two stages: (1) classification to identify sentences containing dataset mentions, and (2) identification of the actual dataset mentions within those sentences. ...
Article
Full-text available
Datasets are a crucial artifact in research, and there is always a high demand for good datasets. Due to rapid scientific progress and exponential growth in the scientific literature, there has been a proportionate increase in the number of datasets. However, many datasets and research studies are left unexplored and under-utilized as these are not easily discoverable, leading to duplicate efforts. Building good datasets is costly in terms of time, money, and human effort. Hence, automated tools to facilitate the search and discovery of datasets are crucial to the scientific community. In this work, we investigate a deep neural network-based architecture to automate dataset discovery from scientific publications. In this paper, we perform two tasks, namely dataset mention extraction and entity linking. Our method outperforms the earlier ones and achieves an F1 score of 56.24 in extracting dataset mentions from research papers on a popular corpus of social science publications. Our approach also outperforms the prior research and achieves a precision score of 88.63 in linking research papers to a dataset knowledge base for another popular corpus of social science publications. We hope that this system will further promote data sharing, reduce researchers' workload in identifying the right dataset, and increase the reusability of datasets.
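The two-stage structure described in the excerpts, first classify sentences, then extract the mention spans, can be sketched with deliberately simple stand-ins: a cue-word filter in place of the paper's sentence classifier, and a capitalized-phrase pattern in place of its neural mention extractor. Both heuristics are illustrative assumptions, not the paper's models.

```python
import re

# Illustrative cue words; the paper's stage 1 is a learned classifier.
CUES = ("dataset", "corpus", "survey", "benchmark")

def stage1_has_mention(sentence):
    """Stage 1: does the sentence likely contain a dataset mention?"""
    return any(cue in sentence.lower() for cue in CUES)

# Illustrative pattern; the paper's stage 2 is a learned extractor.
MENTION = re.compile(r"(?:[A-Z][\w-]*\s)+(?:Dataset|Corpus|Survey|Benchmark)")

def extract_mentions(text):
    """Run the two stages over each sentence and collect mention spans."""
    mentions = []
    for sent in re.split(r"(?<=[.!?])\s+", text):
        if stage1_has_mention(sent):           # stage 1: filter sentences
            mentions += MENTION.findall(sent)  # stage 2: extract spans
    return mentions

text = ("We trained on the General Social Survey. "
        "Results improved over the baseline. "
        "Evaluation used the Penn Treebank Corpus.")
print(extract_mentions(text))  # → ['General Social Survey', 'Penn Treebank Corpus']
```

The filter-then-extract structure is what the sketch preserves; in the paper both stages are learned from annotated data.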
... A digital supply chain's effectiveness depends on the accuracy and reliability of its data. Many failures in digital transformation projects stem from poor data quality and lack of standardization [16,20]. Best practices for data governance include (Table VII): ...
Article
Full-text available
Digitalization is transforming supply chain management by enhancing agility, quality, and operational resilience. This paper explores how digital technologies, such as real-time data analytics, artificial intelligence, blockchain, and automation, drive improvements in supply chain efficiency. The study examines industry trends, best practices, and case studies that demonstrate the benefits of digital transformation in achieving faster response times, reducing errors, and improving traceability. Furthermore, the paper discusses challenges in implementation and how organizations can effectively integrate digital solutions to enhance competitiveness. The findings highlight the critical role of digitalization in building adaptive and high-quality supply chains.
... Predictive analytics enables the organizations to assess the bottlenecks well in advance, thereby helping to enhance resilience (Dubey et al., 2021). However, its implementation seems cumbersome because of data quality issues as highlighted by Hazen (2014), who further suggested alternative methods to overcome these concerns. In continuation of the same, Choi and Luo (2019) emphasized that reduced data quality negatively impacts profit. ...
Article
Purpose The research paper assesses the anticipated risk mitigation strategies (RMS) for successfully adopting digital supply chain management (DSCM). It helps to enhance supply chain performance and achieve organizational excellence. Design/methodology/approach Risk mitigation strategies are analyzed using the fuzzy-analytic hierarchy process (F-AHP). Furthermore, sensitivity analysis is done by changing the weight of RMS to analyze their impact on the final ranking. Findings The RMS were categorized into five categories, namely Cybersecurity and Data Protection (CDP), Digitalization and Organization (DO), Supply Chain Visibility and Transparency (SCVT), Data Quality and Management (DQM), Change Management and Alliance (CMA). The research findings show that “developing a roadmap for the latest digital systems and processes” and “investing in integrating technologies for streamlined data flows in the supply chain” are the top risk mitigation strategies. Practical implications The research results benefit industry personnel and practitioners considering DSCM adoption. Prioritization of RMS will help to understand its relative importance, thereby facilitating management to focus more on critical aspects. Therefore, organizations can adopt these strategies to enhance supply chain agility, responsiveness and overall business performance. Originality/value This research paper contributes to the field of DSCM by employing the F-AHP method to evaluate RMS. Considering uncertainties and ambiguities, incorporating fuzzy logic adds originality and robustness to the assessment process. Furthermore, the study’s focus on RMS prioritization in the context of DSCM adds novelty to the existing body of knowledge.
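Crisp AHP, the non-fuzzy core of the F-AHP method this paper applies, derives a priority ranking from a pairwise comparison matrix; one standard route is the geometric-mean method. The matrix below is a made-up 3x3 example, not the paper's data, and the fuzzy extension (comparisons expressed as fuzzy numbers) is omitted.

```python
from math import prod

def ahp_weights(matrix):
    """Priority vector from a pairwise comparison matrix using the
    geometric-mean method (row geometric means, normalized to sum 1)."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Made-up comparison of three strategy categories on Saaty's 1-9 scale,
# e.g. CDP vs DQM vs CMA; the values are illustrative, not the paper's data.
M = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(M)
print([round(w, 3) for w in weights])  # highest weight ranks first
```

Sensitivity analysis, as in the paper, would perturb these weights and check whether the resulting ranking of risk mitigation strategies changes.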
... AI also enables automated exception handling, reducing manual intervention. Retailers like Home Depot and Tesco have used AI-enhanced WMS solutions to improve inventory accuracy and throughput (Hazen et al., 2014). Integration allows for seamless communication between AI tools and existing warehouse infrastructure. ...
Article
Full-text available
Warehouse automation is undergoing a revolutionary transformation driven by the integration of Artificial Intelligence (AI) and Robotics. These technologies are redefining the operational paradigms of inventory management, order fulfillment, predictive maintenance, and resource optimization. AI-driven solutions enable real-time decision-making, improve accuracy, and minimize human error through advanced data analysis, machine learning algorithms, and predictive capabilities. Robotics enhances physical efficiency, enabling high-speed, accurate movement of goods, dynamic navigation in complex warehouse environments, and round-the-clock operation. Combined, AI and robotics facilitate streamlined workflows, reduced operational costs, and increased throughput. The deployment of intelligent automation leads to smarter warehouses capable of adapting to changes in demand, customer preferences, and market fluctuations. This white paper explores key areas where AI and robotics are making a significant impact, including intelligent inventory tracking, autonomous picking and packing, fleet and task orchestration, and safety enhancements. It also highlights best practices for implementing AI-robotic solutions, addresses challenges such as system integration and data interoperability, and reviews emerging trends in AI-powered edge processing and collaborative robotics. Real-world examples and case studies provide tangible evidence of performance improvements and cost savings. Ultimately, the adoption of AI and robotics not only improves warehouse efficiency and accuracy but also builds operational resilience and scalability. As e-commerce and consumer expectations continue to grow, intelligent automation will become an indispensable component of modern warehouse ecosystems.
... This is an egregious gap. Data quality has been often considered from a technical perspective of data representing something [11,12,20,36,59,94], a serious bias in the world where data is a social artifact produced in some social context [2-4, 38, 44, 50, 67, 70, 88]. ...
Article
Full-text available
The value of data hinges on its quality, which is not solely defined by accuracy or completeness but also by ethical, legal, and contextual considerations. This article reviews the concept of data, examines the evolution of definitions of data quality (information quality), and introduces the FACT+ Framework, comprising Fairness, Accuracy, Completeness, Timeliness, and other contextually relevant dimensions (PLUS), as a comprehensive approach to understanding and improving data quality. FACT+ provides a long-overdue update to the understanding of data quality in support of data-driven developments such as analytics, artificial intelligence, and smart products and services.
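Two of the FACT+ dimensions lend themselves to direct measurement. The sketch below operationalizes completeness and timeliness with simple, assumed definitions (fraction of non-missing values; fraction of records inside a freshness window); these are illustrations, not the article's formal definitions.

```python
from datetime import date

def completeness(records, field):
    """Fraction of records with a non-missing value for `field`."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def timeliness(records, field, max_age_days, today):
    """Fraction of records updated within the freshness window."""
    fresh = sum(1 for r in records if (today - r[field]).days <= max_age_days)
    return fresh / len(records)

today = date(2024, 6, 1)
records = [
    {"sku": "A1", "price": 9.9,  "updated": date(2024, 5, 30)},
    {"sku": "B2", "price": None, "updated": date(2024, 5, 31)},
    {"sku": "C3", "price": 4.5,  "updated": date(2023, 1, 15)},
]
print(round(completeness(records, "price"), 3))             # 2 of 3 prices present
print(round(timeliness(records, "updated", 30, today), 3))  # 2 of 3 records fresh
```

Fairness and the contextual PLUS dimensions resist such one-line formulas, which is precisely the framework's point: quality is partly contextual.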
... This approach reduces production interruptions, cuts maintenance costs, and extends the useful life of significant assets. Moreover, Explainable AI frameworks in SM predictive-maintenance systems provide transparency and decision support, helping engineers intervene based on the factors behind the formulated predictions (Hazen et al., 2014). ...
Article
Full-text available
AI and ML have gained ground across industries, and predictive analytics and automation are now the centerpieces of operational excellence. This work presents a framework of predictive analytics coupled with automation through intelligent systems to improve decisions and process actions in complex industrial settings. Predictive analytics lets organizations respond to emerging trends promptly; for example, when there are indications of inefficiency within particular processes, an organization will be ready to address the issue immediately. On the other hand, AI automation means that current practices are controlled by specific algorithms that are developed to learn regularly. In aggregate, all these technologies enhance real-time decision-making and thus 'eliminate' almost all avoidable mistakes. This is a clear demonstration that, overall, AI and ML improve predictive performance by 25% and rates of task completion by 40%. These enhancements minimize operational interruptions and maximize resource utilization, making the framework useful in the healthcare, finance, and manufacturing industries. Because the identified gap shifts from data analysis to operationalization, the presented approach offers only the beginning of potentially more comprehensive, long-term, generalizable, and scalable industrial solutions in the future.
INTRODUCTION
1.1 Background to the Study
AI and the subset technology of ML have progressed past the point of being simple rule-following algorithms to something much more complex. Prior work emerged from quantitative tools and computational algorithms, while modern AI and ML are based on neural network (NN) and deep learning (DL) techniques. These enhancements have widened the scope of AI across disciplines, resulting in the execution of AI abilities including natural language processing and predictive analytics (Lu, 2019). This progression enhanced computational speed and enabled AI to become a part of most decision-making in today's industries.
One of the major fields that is now regarded as strategic for any organization that uses information to make improved strategic decisions is predictive analytics, which is a key facet of AI solutions. Predictive analytics allows organizations to define trends and form preventive approaches that help
... Some of the predictions developed using this tool may be fairly simple and completed at relatively low cost, while other tools may require expensive data systems, the acquisition of specialized staff, or needless integration with the financial system. Therefore, a cost-benefit analysis of predictive analytics is important, especially for SMEs and where profit margins are thin [32]. ...
Article
Full-text available
Large enterprises use predictive analytics as a primary instrument to manage financial risks arising from foreign exchange (FX) exposure. Small and medium-sized enterprises (SMEs) encounter substantial obstacles when implementing data-driven approaches, although large corporations have already embraced this methodology. This evaluation analyzes how organizational factors contribute to the implementation of predictive analytics for FX exposure management within SMEs, along with major adoption barriers and enabling elements. The potential of predictive analytics for risk management in SMEs depends strongly on technological capabilities, financial resources, data access limitations, workforce expertise, and regulatory demands. Predictive analytics solutions for SMEs require dedicated development to match their needs, while training programs and policy changes will help increase widespread adoption. Future studies should concentrate on building economical, easy-to-use technology models and on investigating the behavioral factors that determine acceptance rates. SMEs that successfully overcome the obstacles to adopting predictive analytics will build better financial stability while maintaining strong market competitiveness.
... The Bibliometrix historiography, which uses Big Data technology to highlight the publication of papers over 22 years, is shown in Figure 6 below. During the research period of 2014-2018, the authors described a methodical approach for breaking down Big Data systems into four sequential modules: data generation, data acquisition, data storage, and data analytics (Hu et al., 2014), and addressed the problem of data quality in the context of a digital supply chain, suggesting techniques for monitoring and managing data quality (Hazen et al., 2014). Dynamic capabilities theory envisions using BDA as a unique information-processing capability that gives firms a competitive advantage (D. ...
Chapter
The rapid evolution of technology has been pivotal in reshaping industries, and the supply chain sector is no exception. As companies strive to remain competitive, innovation emerges as a critical driver of adaptation and transformation. This chapter aims to explore the profound impact of disruptive technologies (DTs) on supply chain management (SCM), specifically focusing on technologies such as the Internet of Things (IoT), blockchain, artificial intelligence (AI), and data science. In doing so, the chapter aims to provide a comprehensive overview of the historical development of DTs, analyze their current applications in enhancing the efficiency and efficacy of supply chain activities, and outline strategies for leveraging these technologies to gain a competitive advantage. The chapter employs a thorough review of existing literature and case studies to examine the integration of DTs in smart supply chains. Our review reveals that DTs have significantly improved the monitoring, creation, and transportation of commodities, fostering a more autonomous and cognitive-aware supply chain. Additionally, the chapter highlights the dual role of DTs in driving corporate digital transformation and enhancing overall performance. By dissecting the benefits and challenges associated with implementing DTs, this chapter contributes to the body of knowledge on smart supply chain management, offering valuable insights for practitioners and researchers aiming to harness the full potential of technological advancements in this field.
... The success of a health tracking system rests on its ability to provide correct, fast, and valuable data. Hazen et al. (2014) noted that a successful tracking system should ensure accuracy, data quality, and quick reporting methods. However, many low- and middle-income countries, including Nigeria, face problems such as poor infrastructure, limited human resources, and weak data management methods (Opele, 2017). ...
Article
Full-text available
Introduction: Comprehensive health surveillance systems are needed to identify, monitor, and manage infectious disease outbreaks. Deficits in disease surveillance have resulted in poor resource allocation, high rates of morbidity and mortality, and delays in responding to epidemics in Nigeria. Incorporating cutting-edge technologies such as electronic reporting systems, mobile health (mHealth) applications, artificial intelligence (AI), and geospatial mapping offers a revolutionary chance to fortify Nigeria's health surveillance infrastructure in light of the expanding global adoption of digital health innovations. This article evaluates the strengths and weaknesses of Nigeria's present health surveillance system, examines the role of digital technology in epidemic planning and response, and suggests using digital technologies to enhance disease monitoring and control. Materials and Methods: This study adopted a PRISMA-compliant systematic review technique to guarantee an organised and complete examination of the available material. Data were sourced from multiple electronic databases, including Web of Science, Scopus, IEEE Xplore, ACM Digital Library, and Google Scholar, using targeted search terms such as "health system surveillance," "digital health innovations," "electronic reporting system," "data-driven approach," and "epidemic preparedness." Inclusion criteria comprised peer-reviewed journal papers, conference proceedings, and book chapters published in English between 2010 and 2020, concentrating on health monitoring, digital innovations, and pandemic preparation. Studies lacking empirical evidence or presenting only expert opinion were excluded. A total of 1,697 items were initially retrieved, with 1,375 remaining after duplicate removal. Title and abstract screening removed 798 articles, and a further quality evaluation led to a final selection of 205 suitable sources.
Data were extracted using a standardized pro forma, capturing key information such as research objectives, methods, findings, and implications. The study employed thematic synthesis and narrative synthesis methodologies, supported by the Critical Appraisal Skills Programme (CASP) and the Mixed Methods Appraisal Tool (MMAT), to promote validity and reliability.
... This enables the identification of decisions that maximize desired objectives and minimize risks based on current circumstances [22]. As enterprises implement decision automation powered by prescriptive analytics, they realize enhanced operational optimization, risk management, and strategic planning powered by IoT- and Machine Learning-based intelligence [23]. ...
Research
Full-text available
In today's complex business landscape, organizations contend with an avalanche of data. Yet the true value lies in the ability to transform this extensive data repository into insights that inform more strategic corporate decisions. This is precisely where IoT- and Machine Learning-based decision-making comes in. By harnessing the potential of data and leveraging artificial intelligence (AI) capabilities, enterprises can make well-informed decisions that ultimately lead to better outcomes. This paper explores the concept of Machine Learning- and IoT-powered decision-making and scrutinizes the pivotal role of AI in shaping these business decisions.
... The OIPT theory explains that organizations are multifaceted and, for smooth functioning and effective decision-making, rely strongly on harnessing data [34]. Due to shifting market demands, economic fluctuations, competition, and technological advancements [35], there is an urgent need to develop and enhance the information-processing capability of organizations, especially hospitals, which depend on efficient and effective supply chains to provide high-quality patient care [36]. ...
Article
Full-text available
Purpose Despite the growing interest in Big Data Analytics Capabilities (BDAC), its significant impact on hospital operations and supply chains in shaping hospital performance remains elusive. The study investigates the pivotal role of BDAC within the framework of hospital supply chains across India. Drawing upon the Resource-Based View, Dynamic Capability View, and Organisation Information Processing Theory, this research explores the intricate relationships among the organization's capability factors, BDAC, and hospital performance indicators. Design/Methodology/Approach A conceptual model was developed and empirically tested using survey data collected from 446 hospital managers. The analysis was carried out by using partial least square-structural equation modeling (PLS-SEM). Findings The results of this study support the significant mediating impact of BDAC on Operational Flexibility, Supply Chain Sustainability, and Organisation Revenue leading to the enhancement of organizational performance. The findings highlight the strategic importance of cultivating BDAC to improve operational efficiency and overall effectiveness in the context of Indian multispeciality hospitals. Originality/Value This research contributes to the existing knowledge by highlighting the relationship between organization capability factors, BDAC, and performance indicators in the different settings of Indian multispeciality hospitals.
... Data analytics plays a crucial role in quality measurement, enabling organizations to track performance trends and identify improvement opportunities [51]. Advanced analytics tools support decision-making by providing insights into procurement effectiveness and insurance policy impacts. ...
Article
Full-text available
This comprehensive review examines the complex relationship between healthcare procurement processes and health insurance policies within the Medicare and Medicaid systems in the United States. The study analyzes how procurement strategies influence healthcare delivery, cost management, and patient outcomes while considering the intricate policy frameworks governing these public insurance programs. Through analysis of existing literature, policy documents, and administrative data, this review identifies key challenges and opportunities at the intersection of procurement and insurance administration. The research reveals significant variations in procurement practices across states and their impact on healthcare access, quality, and cost-effectiveness. Notable findings include the critical role of value-based purchasing initiatives, the influence of formulary management on pharmaceutical procurement, and the evolving landscape of managed care contracting. The study also highlights the importance of data integration and technological infrastructure in improving procurement efficiency and insurance administration. The findings suggest that optimizing the alignment between procurement strategies and insurance policies could significantly enhance healthcare delivery efficiency while maintaining program sustainability.
... SRE practices must adapt to these environments by developing strategies for managing resources across different cloud providers, ensuring consistent performance and compliance (Leinwand and Caccavale, 2020). Future directions for SRE in these contexts will likely focus on optimizing cloud resource management, enhancing interoperability, and leveraging advanced cloud-native technologies to support high availability and low latency (Hazen et al., 2021; Lee and Kim, 2021; Tian, 2016; Xie et al., 2021). ...
Article
Full-text available
Site Reliability Engineering (SRE) has emerged as a critical discipline in cloud environments, focused on maintaining high availability, low latency, and overall system reliability. This review explores the strategies and practices that SRE teams employ to achieve these objectives. Central to SRE in cloud environments is the integration of automation, monitoring, and proactive incident management. By leveraging infrastructure as code (IaC) and continuous integration/continuous deployment (CI/CD) pipelines, SRE teams can automate repetitive tasks, reduce human error, and ensure consistent deployment processes. Additionally, the use of advanced monitoring tools and real-time analytics allows for the early detection of potential issues, enabling rapid response and minimizing downtime. Another key strategy is the implementation of scalable architectures that can dynamically adjust to varying load demands, thus maintaining optimal performance and low latency during peak times. Furthermore, the adoption of chaos engineering practices enables SRE teams to identify system weaknesses and improve resilience by simulating failures in controlled environments. The review also highlights the importance of a collaborative culture between development and operations teams, facilitated by SRE, which fosters continuous improvement and innovation. By integrating these strategies, organizations can enhance the reliability, performance, and scalability of their cloud-based applications, ensuring a seamless user experience. This study underscores the vital role of SRE in achieving operational excellence in cloud environments, particularly in maintaining high availability and low latency.
... Artificial intelligence algorithms optimize the process at Carbon Engineering's DAC facility in Squamish, Canada, lowering costs and increasing productivity. Using artificial intelligence to improve carbon capture and storage, Climeworks runs multiple DAC facilities across the globe [9]. Enhanced Oil Recovery (EOR): Methods for Enhanced Oil Recovery provide the dual benefit of increasing oil extraction while simultaneously burying carbon dioxide. ...
Article
Full-text available
The world of technology and corporate operations is always evolving, so it's crucial to keep up with the latest trends. To further understand how these developments have affected ERP optimization, this analysis assesses the state of machine learning (ML) integration with ERP systems. There has been a substantial improvement in the incorporation of ML technology into ERP settings in the last several years. Enterprise resource planning (ERP) systems are able to make better data-driven decisions and forecasts thanks to ML algorithms that can extract complex patterns from massive datasets. In conclusion, ML allows ERP systems to dynamically change according to real-time insights, leading to improved efficiency and adaptability. In addition, a growing number of companies are seeking out AI solutions to help make ML models within ERP more understandable and accessible to stakeholders. Implementing these technologies allows ERP systems to handle and respond to data as it comes in, thanks to ML models. This allows businesses to adapt successfully to changing conditions. Keywords: enterprise resource planning (ERP) systems, artificial intelligence (AI), machine learning (ML), Supply Chain INTRODUCTION In order to make predictions and automate processes, AI analyses the data that ERP systems process. At the same time, Machine Learning sets up an automatic machine-human interface with ERP systems to guarantee that targeted adjustments are made simultaneously [1]. Strategically improving responsiveness and decreasing complexity, this technique overhauls ERP system functionality. Automating repetitive tasks like invoicing, report generation, and data entry is one way that modern technology improves the efficiency of enterprise resource planning (ERP) systems in financial management [2]. The use of AI in the supply chain allows for more precise inventory management by analyzing historical data to optimize the supply chain, demand, and operating capacities.
Improving these systems' AI predictive power can also help cut costs, which means higher margins. Personalized customer management service is possible with ERP systems that have AI chatbots. These AI chatbots are able to swiftly determine client needs using NLP technology and provide immediate responses regardless of time or challenges [3]. In turn, this helps companies increase their conversion rates and customer satisfaction. Work efficiency and task completion time are both enhanced by the fact that this process does not require any human intervention. These cutting-edge solutions also enable organizations to be ready for market swings by using Machine Learning
... Future advancements in digital tools will enhance the industry's ability to meet and exceed these standards by providing more accurate and timely data on environmental performance (Anjorin et al., 2024; Onita & Ochulor, 2024) [21,138,168]. This will be crucial in maintaining regulatory compliance and achieving long-term sustainability goals (Hazen et al., 2014). In summary, the future of digital transformation in the oil and gas sector holds significant promise for enhancing sustainability. ...
Article
Full-text available
Digital transformation is reshaping the oil and gas industry, offering powerful tools to enhance sustainability while maintaining operational efficiency. This review examines how digital technologies such as artificial intelligence (AI), the Internet of Things (IoT), and data analytics contribute to reducing environmental impact, improving resource management, and driving innovation in oil and gas operations. The sector faces increasing pressure to transition towards more sustainable practices due to regulatory demands, stakeholder expectations, and the global shift towards a low-carbon economy. In response, companies are adopting digital solutions to optimize energy use, minimize emissions, and enhance safety and transparency. Key technologies like AI and machine learning are instrumental in predictive maintenance, helping companies to foresee equipment failures and reduce operational downtime. IoT devices enable real-time monitoring of energy consumption and emissions, allowing companies to adjust operations to meet environmental standards more efficiently. Data analytics further enhances decision-making by providing insights into energy use, waste reduction, and resource allocation. These digital tools support a shift toward more circular economy models, where waste is minimized, and energy efficiency is maximized. Moreover, blockchain technology is being employed to ensure transparency and traceability in supply chains, enabling more sustainable procurement and resource management. Digital twins, virtual replicas of physical assets, are used to simulate processes, reduce risks, and improve performance while reducing environmental impacts. This integration of digital technologies not only helps companies meet regulatory compliance but also opens new avenues for innovation and competitiveness in a sustainable market.
As oil and gas companies navigate the complexities of digital transformation, it is evident that these technologies are key to achieving long-term sustainability goals. This study highlights how digital transformation is crucial for creating a more sustainable future in the oil and gas industry by enhancing operational efficiency, reducing emissions, and fostering innovation.
... Such finding suggests that the relationship between digital technology and other parts of the digital transformation and overall performance needs to be revisited. The argument that organizations must go beyond the adoption of digital technologies alone in order to derive value from their investments [103] might be realizing itself. ...
Article
Full-text available
Background: This study investigates digital transformation as a moderating variable in determining the effect of digital technologies, automation, and data integration with upstream and downstream providers on supply chain performance. In addressing an existing research gap, the study highlights the need to better understand how digital transformation initiatives affect the effectiveness of these technologies in industrial supply chains. Methods: A structured survey was administered to 181 supply chain managers in manufacturing firms across Jordan. Results: The findings, based on statistical analysis in SmartPLS, indicated that automation has the strongest positive effect on supply chain performance, followed by data integration. Digital technology, however, did not have a significant direct effect unless accompanied by broader digital transformation initiatives. Conclusions: Theoretically, this study reinforces digital transformation theory as a vital framework; in practice, it calls for the strategic deployment of automation and integrated data applications to underpin supply chain efficiency and competitiveness. Finally, the study offers practical guidance for practitioners seeking to employ digital transformation in today's dynamic business environment.
Article
Full-text available
The complexity of global procurement has intensified in the face of volatile supply chains, geopolitical uncertainties, and fluctuating market dynamics. In this challenging environment, predictive analytics has emerged as a transformative tool, enabling procurement professionals to proactively manage risks and drive cost efficiency. This paper explores the application of predictive analytics in global procurement, focusing on its capacity to forecast demand, evaluate supplier performance, anticipate disruptions, and optimize spend. It examines how data-driven forecasting enhances strategic sourcing decisions and minimizes exposure to risks, such as supplier insolvency, price volatility, and logistical delays. The study highlights the evolving role of analytics in creating agile, resilient, and cost-effective procurement systems, and proposes best practices for organizations seeking to integrate predictive technologies into their procurement strategies.
Chapter
The growth of consumer demand has greatly accelerated resource consumption, with significant impact on supply chain sustainability. How to improve eco-efficiency and balance economic and environmental performance is a key to sustainable supply chain management. This chapter focuses on illustrating the application of multi-criteria decision making (MCDM) in optimizing the eco-efficiency of the supply chain, including: a data-driven multi-objective optimization modeling approach to reduce the inherent risk, carbon emissions, and operational cost of hazardous materials flowing through the supply chain; a system dynamics modeling approach to simulate the recycling capacity of wastes in a closed-loop supply chain system for agro-products; and a multiple-indicator system integrated with data envelopment analysis and fuzzy decision-making to assess the vulnerability of core production enterprises in a typical chemical product supply chain.
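The multi-criteria trade-off the chapter describes can be illustrated with a simple weighted-sum MCDM scoring. The sketch below is a minimal Python example; the alternatives, criteria, and weights are hypothetical, and the chapter itself applies richer models such as DEA, fuzzy decision-making, and system dynamics:

```python
# Hypothetical supplier alternatives scored on cost (minimize),
# CO2 emissions (minimize), and service level (maximize).
alternatives = {
    "A": {"cost": 100.0, "co2": 40.0, "service": 0.95},
    "B": {"cost": 80.0,  "co2": 60.0, "service": 0.90},
    "C": {"cost": 120.0, "co2": 30.0, "service": 0.97},
}
weights = {"cost": 0.4, "co2": 0.3, "service": 0.3}   # illustrative weights
minimize = {"cost", "co2"}                            # cost-type criteria

def weighted_score(alts, w, cost_type):
    """Weighted-sum MCDM: min-max normalize each criterion to [0, 1],
    invert cost-type criteria, then sum the weighted partial scores."""
    scores = dict.fromkeys(alts, 0.0)
    for crit, weight in w.items():
        vals = [a[crit] for a in alts.values()]
        lo, hi = min(vals), max(vals)
        for name, a in alts.items():
            x = (a[crit] - lo) / (hi - lo) if hi > lo else 1.0
            if crit in cost_type:
                x = 1.0 - x
            scores[name] += weight * x
    return scores

scores = weighted_score(alternatives, weights, minimize)
print(max(scores, key=scores.get))  # → A
```

With these illustrative numbers, alternative A wins because it balances moderate cost and emissions against a high service level; changing the weights shifts the ranking, which is exactly the economic-environmental trade-off the chapter explores.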
Article
Full-text available
Artificial Intelligence (AI) is transforming business operations by enabling smarter decisions, automating tasks, and uncovering insights across sectors such as finance, retail, and customer service. However, the effectiveness of AI systems is closely tied to the quality of the data they are built on. This study examines the impact of data quality on AI model performance within business applications. Using a mixed-method approach-combining empirical analysis with a comprehensive review of existing literature-we explore how key dimensions of data quality, including accuracy, completeness, consistency, timeliness, and relevance, affect the reliability and outcomes of AI models. Case studies from real-world business environments show that poor data quality leads to reduced model accuracy, biased predictions, and suboptimal decisions. In contrast, high-quality data significantly enhances model precision, boosts operational efficiency, and delivers greater business value. The study emphasizes the importance of strong data governance and continuous quality control to ensure successful AI deployment. As organizations invest in AI, maintaining data integrity must become a strategic priority.
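The data-quality dimensions this abstract names (accuracy, completeness, consistency, timeliness, relevance) can be operationalized as simple profile metrics computed before data reaches a model. A minimal Python sketch over hypothetical customer records (the field names, dates, and thresholds are illustrative, not from the study):

```python
from datetime import date

# Hypothetical customer master records; None marks a missing field.
records = [
    {"id": 1, "email": "a@x.com", "country": "US", "updated": date(2024, 6, 1)},
    {"id": 2, "email": None,      "country": "US", "updated": date(2023, 1, 5)},
    {"id": 3, "email": "c@x.com", "country": None, "updated": date(2024, 5, 20)},
]

def completeness(rows, field):
    """Share of rows where `field` is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def timeliness(rows, field, cutoff):
    """Share of rows refreshed on or after `cutoff`."""
    return sum(r[field] >= cutoff for r in rows) / len(rows)

print(round(completeness(records, "email"), 2))                    # → 0.67
print(round(timeliness(records, "updated", date(2024, 1, 1)), 2))  # → 0.67
```

Tracking such per-field scores over time is one way to make the governance and continuous quality control the study advocates measurable.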
Article
The market for supply chain analytics is expected to grow at a CAGR of 17.3 percent from 2019 to 2024, more than doubling in size. This figure demonstrates how supply chain organizations understand the advantages of being able to forecast the future with a decent degree of accuracy. Google, Netflix, and Amazon are among the major corporations that use predictive analytics. According to Gartner research, organizations that use predictive supply chains see a good return on investment. Furthermore, owing to more precise demand forecasting, they may reduce inventory by 20-30%. Predictive analytics can help remove much of the guesswork from planning and decision-making. Supply chain predictive analytics, as opposed to historical analytics, helps supply chain management foresee and plan. Supply chain leaders may use this data to address supply chain difficulties, cut costs, and enhance service levels all at the same time. Predictive analytics approaches enable businesses to uncover hidden patterns and trends in their data, allowing them to better analyze market trends, identify demand, and set suitable pricing strategies.
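Demand forecasting of the kind described above is often built on smoothing methods. A minimal sketch of simple exponential smoothing in Python; the demand series and smoothing constant are hypothetical, and production systems would use richer models (seasonality, trend, external signals):

```python
def ses_forecast(demand, alpha=0.3):
    """Simple exponential smoothing: the smoothed level after the last
    observation serves as the next-period demand forecast."""
    level = demand[0]
    for d in demand[1:]:
        level = alpha * d + (1 - alpha) * level
    return level

history = [100, 120, 110, 130, 125]  # hypothetical monthly demand
print(round(ses_forecast(history), 1))  # → 117.3
```

A forecast like this feeds directly into inventory decisions; the more accurate the forecast, the less safety stock is needed, which is the mechanism behind the 20-30% inventory reductions the article cites.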
Article
Full-text available
Real-time data analytics significantly transforms retail supply chain efficiency by providing immediate insights into operational performance, consumer behavior, and inventory management. This white paper examines how real-time analytics enhances decision-making, reduces costs, and optimizes inventory within retail supply chains. Leveraging real-time data allows retailers to respond swiftly to market demands, preventing inventory shortages and reducing overstock situations. Integration of real-time analytics supports enhanced forecasting accuracy, inventory visibility, and streamlined logistics operations. Retailers adopting these capabilities experience improved customer satisfaction through increased availability and faster delivery times. However, implementing real-time analytics also poses challenges, including high initial investment, technical complexity, and data privacy concerns. Addressing these challenges requires careful strategic planning and robust infrastructure development. This paper presents practical case studies, examines critical success factors, and outlines best practices in deploying real-time data analytics solutions. Ultimately, adopting real-time data analytics positions retailers strategically by improving operational agility, reducing costs, and enhancing customer experience.
Article
The objective of this work is to analyze business intelligence for inventory management in companies that import structural footwear components, specifically those located in the city of Ambato, Ecuador, framed within the positivist paradigm under a quantitative approach. The theoretical perspective was drawn from Rodríguez (2021), Zabaleta (2021), González (2020), Ivanov (2020), and Li (2019), among others. Information was obtained from primary sources through a survey applied to the owners and/or managers of the companies importing structural footwear components in Ambato. Sampling was non-probabilistic, covering 9 importing companies located in Ambato and registered with the Ministerio de Producción and the Cámara Nacional de Calzado. The information obtained was processed through statistical analysis, which revealed that these importing companies make little use of the information that business intelligence provides for decision-making and little use of business intelligence tools for inventory management. Nevertheless, the operational and financial benefits of implementing such tools in the inventory area are recognized. Thus, business intelligence would represent a fundamental tool for optimizing inventory management, improving efficiency, reducing costs, increasing profitability, and strengthening competitiveness.
Article
As the trend toward the digitization of complex business processes continues, the relevance of data quality for corporate success has increased. Especially, in multistep processes where data are created, modified, and transferred between different systems and departments, ensuring high data quality through continuous improvement is a competitive advantage. The interdependencies within multistep processes make troubleshooting more difficult and complex, as is typically the case in supply chains and logistics. At present, research on improving the data quality in complex process chains is relatively limited compared to the vast body of literature in operations research. Therefore, this exploratory study begins with a literature review on the measurement and monitoring of data quality in logistics and supply chains. Based on the findings from literature and the identified total data quality management model, a case study was conducted. As the first measuring approach, a survey was distributed to 148 employees in the central logistics department of a multinational automobile manufacturer to analyze the quality of billing‐relevant data in vehicle logistics. Although both subjective and objective approaches for measuring data quality have been described in the literature, automated techniques for continuous assessment of data quality have only increased in popularity in recent years. There is still potential for further research in the fields of process‐oriented measurement and monitoring that consider the interdependencies between systems and departments involved in multistage logistics processes. In the logistics and supply chain literature, the most common dimensions of data quality that can be measured automatically were accuracy, completeness, consistency, and timeliness. Consistency and accuracy were also found critical in the reference case, which could potentially be the result of unsatisfactory system interfaces, data quality checks, and system landscape. 
The statements related to the data quality checks, the system landscape, and the understandability dimension were rated quite differently by the different departments. The survey helped identify weaknesses that should be further investigated and improved in the future to ensure continuous process operation and profitability.
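The four automatically measurable dimensions named above (accuracy, completeness, consistency, timeliness) can be scored with simple rules. The following is an illustrative sketch only, not the study's instrument: the field names (`vin`, `dest`, `billed_at`, `shipped_at`, `updated_at`) and the validity rules are hypothetical assumptions chosen to mirror billing-relevant vehicle-logistics records.

```python
from datetime import datetime, timedelta

def quality_scores(records, now, max_age=timedelta(days=1)):
    """Score a batch of record dicts on four data quality dimensions.

    All field names and rules are illustrative assumptions.
    """
    n = len(records)
    # Completeness: all mandatory fields are populated.
    complete = sum(all(r.get(f) is not None for f in ("vin", "dest", "billed_at"))
                   for r in records) / n
    # Accuracy proxy: a domain validity rule (a VIN has 17 characters).
    accurate = sum(r.get("vin") is not None and len(r["vin"]) == 17
                   for r in records) / n
    # Consistency: billing must not precede shipping.
    consistent = sum(r.get("shipped_at") is not None and r.get("billed_at") is not None
                     and r["billed_at"] >= r["shipped_at"]
                     for r in records) / n
    # Timeliness: the record was updated within the freshness window.
    timely = sum(r.get("updated_at") is not None and now - r["updated_at"] <= max_age
                 for r in records) / n
    return {"completeness": complete, "accuracy": accurate,
            "consistency": consistent, "timeliness": timely}
```

Scores of this kind can be tracked per department or system interface over time, which is the kind of process-oriented monitoring the study calls for.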
Article
Purpose The COVID-19 pandemic, geopolitical conflicts, anti-globalization and the digital economy have led to accelerated changes in the market, forcing companies to use big data to achieve precise and agile product or service delivery, thereby improving performance. Existing research has not yet explored the mechanisms for data-driven supply chain agility and supply chain performance. Based on dynamic capability theory and organizational information processing theory, this paper constructs a conceptual model to investigate how big data analytics can facilitate the implementation of high-level supply chain agility and performance through customer integration, internal integration and collaborative knowledge creation. Design/methodology/approach We collected a sample of the Chinese food industry and conducted an empirical study using partial least squares structural equation modeling (PLS-SEM) and fuzzy set qualitative comparative analysis (fsQCA). Findings The results show that big data analytics has an impact on supply chain agility through three paths. Moreover, big data analytics capability and supply chain agility are considered dynamic capabilities and the effect of configuration under different conditions is empirically tested. Four solutions to improve the performance of the supply chain are obtained. Practical implications This research sheds light on the implementation process of big data-driven supply chain performance, which is of good theoretical and practical value for expanding the theory of organizational information processing and helping enterprises achieve high-level agile supply and performance. Originality/value We provide a new perspective on supply chain agility by exploring the antecedents of supply chain agility and its impact on supply chain performance from the perspective of information processing and dynamic capabilities. Existing studies have not focused on the role of big data analytic capabilities in improving supply chain agility.
This study attempts to establish a clear relationship between big data analytics capabilities and supply chain agility via three mediating paths (customer integration, internal integration and collaborative knowledge creation). In addition, we use the empirical methods of PLS-SEM and fsQCA to better substantiate the conclusions.
Chapter
In the present scenario, the majority of organizations want to attain sustainable development goals in their various business practices. The main focus of the current chapter is to explore how digital technologies can be integrated with the various practices of an organization for the attainment of the SDGs and the advancement of sustainability. It begins with an overview of the SDGs, highlighting their importance in addressing global challenges, including climate change, resource depletion, and social inequality. The chapter defines the concept of the digital circular economy, highlighting how it encourages businesses to transition from a linear model of consumption and production to a circular one. Key digital technologies, including blockchain, AI, and IoT, are examined for their capability to enhance resource efficiency, optimize supply chains, and facilitate sustainable practices. The discussion also addresses the challenges organizations face in adopting these technologies, such as organizational inertia, technological limitations, and regulatory hurdles. However, it emphasizes the opportunities presented by these advancements, including the potential for innovation, the creation of sustainable value, and the development of novel business models. The chapter further outlines the future implications of digital technologies in fostering a circular economy, emphasizing the need for collaboration among governments, industries, and organizations to establish supportive regulatory frameworks. Ultimately, the current work serves as a call to action for both organizations and policymakers to embrace digital technologies as critical tools for fostering sustainability. By leveraging these technologies and collaborating across sectors, organizations can not only contribute to the SDGs but also drive significant positive influence on the environment and society.
Article
Full-text available
In the contemporary manufacturing landscape, the integration of data analytics has emerged as a pivotal driver for informed decision-making, enabling organizations to enhance operational efficiency, reduce costs, and foster innovation. As manufacturers confront increasing competition, fluctuating market demands, and the necessity for sustainable practices, data analytics provides actionable insights derived from vast amounts of data generated across production processes. This abstract explores the various dimensions of data analytics in manufacturing, including descriptive, diagnostic, predictive, and prescriptive analytics, each playing a unique role in transforming raw data into strategic intelligence. Descriptive analytics helps organizations understand historical performance by summarizing past data, while diagnostic analytics identifies the root causes of issues, facilitating proactive interventions. Predictive analytics leverages machine learning algorithms to forecast future trends and demand patterns, allowing manufacturers to optimize inventory levels and production schedules. Prescriptive analytics goes a step further by recommending specific actions based on predictive insights, guiding decision-makers toward optimal strategies. Moreover, the role of advanced technologies such as the Internet of Things (IoT) and artificial intelligence (AI) is critical in enhancing data collection and analysis capabilities. IoT devices enable real-time monitoring of equipment and processes, generating valuable data that, when analyzed, can lead to significant improvements in performance and resource utilization. AI enhances these analyses, providing deeper insights and enabling automated decision-making processes. The implementation of data analytics in manufacturing not only improves operational decision-making but also drives a culture of continuous improvement and innovation. 
As organizations harness the power of data, they can respond swiftly to market changes, enhance product quality, and achieve greater sustainability. This abstract underscores the transformative potential of data analytics in shaping the future of manufacturing, emphasizing its importance in creating resilient and adaptive manufacturing environments capable of thriving in an increasingly dynamic global market.
Chapter
Over the past few decades, advancements in data analytics and artificial intelligence, particularly machine learning, have underscored the significance of data-driven applications. Retailers in the textile industry now leverage predictive software for operational decision-making. This chapter examines the manufacturing data analytics of retail businesses, introducing a scalable business intelligence framework based on a graph data model and its management system. It also emphasizes how big data technologies and tools, including the Internet of Things, facilitate the real-time capture, storage, processing, and sharing of data. This capability allows companies to make quicker and more effective operational decisions. To illustrate the analytical potential of the graph database model for business intelligence, the chapter presents an algorithm designed to extract insights from stored business data.
Article
Full-text available
Rising data volumes and heightened security requirements have made cloud data management operations significantly more complex, and maintaining a database by hand is both inefficient and error-prone. The paper examines how Python and AI-based monitoring tools enable automated management of cloud databases. It explains multiple database automation techniques together with the benefits of AI-supervised monitoring and reviews several Python libraries for automation purposes. The study demonstrates the capability of AI-based predictive analytics for anomaly detection and performance optimization. It concludes by examining the future opportunities of AI-operated cloud database management systems and their potential organizational effects.
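A hypothetical sketch of the kind of anomaly detection the paper surveys: flag query latencies that deviate from a baseline window by more than a z-score threshold. The function name, the metric, and the threshold of three standard deviations are all assumptions for illustration, not the paper's method.

```python
from statistics import mean, stdev

def latency_anomalies(baseline, observed, z_threshold=3.0):
    """Return indices of observed latencies that deviate from the
    baseline mean by more than z_threshold standard deviations."""
    mu = mean(baseline)
    sd = stdev(baseline)
    return [i for i, x in enumerate(observed)
            if sd > 0 and abs(x - mu) > z_threshold * sd]
```

In a monitoring loop, the baseline would be refreshed periodically from recent in-control measurements, and flagged indices would feed an alerting or auto-remediation step.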
Article
Full-text available
Objective: This study examines the impact of integrating Internet of Things (IoT) technologies and sophisticated analytics on supply chain efficiency within the automobile sector. It examines how alleviating issues such as data security, system integration difficulties, and workforce skill deficiencies helps attain optimal operational results. Through this research we aim to contribute to Sustainable Development Goals (SDGs) 8 and 9, Decent Work and Economic Growth and Industry, Innovation and Infrastructure: the study builds resilient infrastructure for the industry using innovative technologies like IoT and analytics, thereby promoting decent work and economic growth. Design/methodology/approach: The study used structural equation modeling (SEM) to examine data gathered from 150 automotive OEMs and their tier-one supplier companies in and around Chennai, India. The research utilizes frameworks such as the Technology Acceptance Model (TAM) and the Theory of Constraints (TOC) to elucidate the connections among IoT problems, predictive analytics, and supply chain performance. Results and Discussion: The results show that predictive analytics enhances supply chain transparency, reduces inefficiencies, and enables accurate demand forecasting. Addressing IoT-related risks and operational inefficiencies significantly enhances supply chain performance metrics. Real-time data monitoring and strategic problem resolution were found to be essential for the successful integration of IoT and analytics. Practical implications: The research offers practical techniques for the use of IoT and analytics, highlighting real-time monitoring and data-driven decision-making to enhance supply chain responsiveness and efficiency. This research addresses IoT-related difficulties, enhancing academic debate on digital transformation in supply chains and providing a practical framework for the successful utilization of IoT and analytics.
This study offers an effective framework for organizations in the automotive industry to successfully implement IoT and analytics by addressing key challenges. Originality/value: By focusing on the mediating role of recognizing and mitigating challenges in IoT and analytics adoption, this research contributes to the academic discourse on digital transformation in supply chains. It provides actionable insights for practitioners aiming to optimize supply chain operations through advanced technological solutions. The findings point out the importance of implementing proactive, data-driven strategies and fostering real-time visibility to achieve substantial gains in supply chain efficiency and responsiveness.
Chapter
Big data analytics uses data-driven decision making to optimize supply chain management. An optimized supply chain results in reduced lead times, heightened customer satisfaction, increased cost savings, and enhanced overall system performance. This chapter elaborates the different ways in which big data analytics can be used to enhance the effectiveness of a supply chain. It also discusses case studies where big data has provided better insights and enhanced customer service.
Article
Full-text available
Operational efficiency and reliability throughout various sectors depend heavily on industrial pump maintenance practices. Pumps are critical system components in the manufacturing, oil and gas, and water treatment industries, where unexpected failures disrupt processes, cause substantial financial losses, and endanger personnel safety. Traditional maintenance approaches using both reactive and preventive methods encounter difficulties when managing the intricate operations of current industrial facilities: they regularly lead to equipment malfunctions at unpredictable times, inefficient resource distribution, and prolonged equipment downtime. The research demonstrates how data science will transform pump maintenance systems as traditional practices gradually transition toward predictive, data-based methods. Analysis of past failure types alongside operating variables and maintenance records allows data science tools to recognize familiar failure indicators. Early anomaly detection and precise failure forecasting, along with maintenance program optimization, can be achieved through combinations of machine learning algorithms, statistical modeling, and real-time analytics. Continuous data acquisition through IoT sensors further improves prediction accuracy. Industry adoption of these technological solutions produces two main effects: it decreases equipment outages and boosts equipment lifespan, while ensuring enhanced operational safety combined with significant cost reductions. This paper gives readers an implementation blueprint for data-driven maintenance operations while tackling crucial issues that include data cleanliness requirements, integration difficulties, and employee development needs.
The paper outlines future perspectives that combine artificial intelligence with advanced analytics to develop smarter and more reliable maintenance systems. The research targets multiple industries that need a complete guide to adopting data science for efficient, sustainable pump maintenance practices.
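The early-warning logic described above can be sketched with a simple run rule over streaming sensor readings: alert when at least k of the last n readings exceed an engineering limit. This is a minimal assumed illustration, not the paper's algorithm; the parameters and the vibration-limit framing are hypothetical.

```python
from collections import deque

def run_rule_alerts(readings, limit, k=3, n=5):
    """Return indices at which at least k of the last n readings exceed limit."""
    window = deque(maxlen=n)  # sliding window of exceedance flags
    alerts = []
    for i, x in enumerate(readings):
        window.append(x > limit)
        if sum(window) >= k:
            alerts.append(i)
    return alerts
```

A rule of this shape is robust to isolated sensor glitches (a single spike never triggers it) while still reacting within n readings of a sustained shift.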
Article
Full-text available
One of the critical objectives underlying the digital transformation initiatives of numerous enterprises is the introduction of novel data-driven business models (DDBMs) aimed at facilitating the creation, delivery, and capture of value. While DDBMs have gained immense traction among scholars and practitioners, their implementation and scaling leave much to be desired. One widely argued reason is our poor understanding of the factors that enable DDBMs' effective implementation. Using a mixed-methods approach, this study identifies a comprehensive set of enablers, explores the enablers' interdependencies, and discusses how the empirical findings are of value in DDBMs' implementation from theoretical and practical viewpoints.
Chapter
This chapter explores how technology and entrepreneurial innovation enhance supply chain efficiency. It highlights the transformative impact of advanced technologies such as cloud computing, data analytics, and IoT on supply chain operations, improving responsiveness and reducing costs. The entrepreneurial orientation is emphasized as a key driver of innovation, enabling organizations to adapt to changing market conditions. Through case studies, the chapter illustrates successful collaborations between academia and industry, demonstrating how theoretical insights translate into practical solutions. Challenges in technology integration, such as data security and the need for skilled personnel, are also addressed. By bridging the gap between academia and industry, this work aims to inspire future research and applications that enhance supply chain management, underscoring the importance of fostering a culture of innovation to maintain competitiveness in the global marketplace.
Chapter
Full-text available
Cryptography is a subfield of cryptology. Its main function is to transform the original data into a very different content using mathematical operations and to prevent malicious individuals from accessing the original data content. This explains data confidentiality and holds a very important place in the field of security. In addition to ensuring data confidentiality, cryptology is also used for purposes such as data integrity, authorization, access control, and non-repudiation. The objectives of cryptography in data security have materialized in areas such as the Internet of Things (IoT), blockchain applications, digital signatures, and cloud computing. In these fields, numerous cryptographic algorithms have been developed from past to present to ensure data security. These algorithms, which can be described as traditional, are sufficient in terms of security in today's world. However, the widespread adoption of quantum computers in the near future is anticipated. Due to the high computational power of quantum computers, it is inevitable that the data security provided by traditional cryptographic algorithms will be compromised. Post-quantum cryptography (PQC) studies conducted in recent years aim to eliminate this threat and ensure post-quantum security. Cryptography has found many different application areas. In this book chapter, cryptography's most widely used areas, such as digital signature applications, cryptographic applications in the Internet of Things, blockchain technology applications, and cryptography in cloud computing, have been presented to the reader with a straightforward explanation.
Article
Full-text available
A multivariate extension of the exponentially weighted moving average (EWMA) control chart is presented, and guidelines are given for designing this easy-to-implement multivariate procedure. A comparison shows that the average run length (ARL) performance of this chart is similar to that of multivariate cumulative sum (CUSUM) control charts in detecting a shift in the mean vector of a multivariate normal distribution. As with Hotelling's χ2 and multivariate CUSUM charts, the ARL performance of the multivariate EWMA chart depends on the underlying mean vector and covariance matrix only through the value of the noncentrality parameter. Worst-case scenarios show that Hotelling's χ2 charts should always be used in conjunction with multivariate CUSUM and EWMA charts to avoid potential inertia problems. Examples are given to illustrate the use of the proposed procedure.
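The MEWMA statistic described above can be sketched compactly for the bivariate case. This pure-Python sketch smooths centered observations with weight λ and computes T² against the asymptotic smoothing covariance (λ/(2-λ))Σ; in practice the signal limit h would be chosen for a target in-control ARL, so no limit is hard-coded here.

```python
def mewma_t2(xs, mu, sigma, lam=0.2):
    """Return the MEWMA T^2 statistic for each 2-d observation in xs.

    xs: list of [x1, x2] observations; mu: in-control mean vector;
    sigma: 2x2 in-control covariance matrix; lam: smoothing weight.
    """
    a, b = sigma[0]
    c, d = sigma[1]
    # Asymptotic covariance of the smoothed vector z, then its 2x2 inverse.
    f = lam / (2.0 - lam)
    sa, sb, sc, sd = f * a, f * b, f * c, f * d
    det = sa * sd - sb * sc
    ia, ib, ic, id_ = sd / det, -sb / det, -sc / det, sa / det
    z = [0.0, 0.0]
    t2 = []
    for x in xs:
        # EWMA recursion on the centered observation.
        z = [lam * (x[0] - mu[0]) + (1 - lam) * z[0],
             lam * (x[1] - mu[1]) + (1 - lam) * z[1]]
        # Quadratic form z' * inv(Sigma_z) * z.
        t2.append(z[0] * (ia * z[0] + ib * z[1]) + z[1] * (ic * z[0] + id_ * z[1]))
    return t2
```

With in-control data the statistic stays near zero; after a sustained mean shift it climbs quickly, which is the chart's signal mechanism.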
Article
Full-text available
Concepts of uncertainty and information processing are used to integrate the diverse organization design/structure literatures. This approach more fully explicates the concept of congruence which lies at the heart of contingency ideas. The review suggests a contingency approach to design which develops a feasible set of structural alternatives from which the organization can choose.
Article
Full-text available
Purpose This paper attempts to seek answers to four questions. Two of these questions have been borrowed (but adapted) from the work of Defee et al.: RQ1. To what extent is theory used in purchasing and supply chain management (P&SCM) research? RQ2. What are the prevalent theories to be found in P&SCM research? Following on from these questions an additional question is posed: RQ3. Are theory-based papers more highly cited than papers with no theoretical foundation? Finally, drawing on the work of Harland et al., the authors have added a fourth question: RQ4. To what extent does P&SCM meet the tests of coherence, breadth and depth, and quality necessary to make it a scientific discipline? Design/methodology/approach A systematic literature review was conducted in accordance with the model outlined by Tranfield et al. for three journals within the field of "purchasing and supply chain management". In total 1,113 articles were reviewed. In addition a citation analysis was completed covering 806 articles in total. Findings The headline features from the results suggest that nearly a decade-and-a-half on from its development, the field still lacks coherence. There is the absence of theory in much of the work and although theory-based articles achieved on average a higher number of citations than non-theoretical papers, there is no obvious contender as an emergent paradigm for the discipline. Furthermore, it is evident that P&SCM does not meet Fabian's test necessary to make it a scientific discipline and is still some way from being a normal science. Research limitations/implications This study would have benefited from the analysis of further journals, however the analysis of 1,113 articles from three leading journals in the field of P&SCM was deemed sufficient in scope. In addition, a further significant line of enquiry to follow is the rigour vs relevance debate.
Practical implications This article is of interest to both an academic and practitioner audience as it highlights the use of theories in P&SCM. Furthermore, this article raises a number of important questions. Should research in this area draw more heavily on theory and if so which theories are appropriate? Social implications The broader social implications relate to the discussion of how a scientific discipline develops and builds on the work of Fabian and Amundson. Originality/value The data set for this study is significant and builds on a number of previous literature reviews. This review is both greater in scope than previous reviews and is broader in its subject focus. In addition, the citation analysis (not previously conducted in any of the reviews) and statistical test highlights that theory-based articles are more highly cited than non-theoretically based papers. This could indicate that researchers are attempting to build on one another's work.
Article
Full-text available
Purpose Theory is needed for a discipline to mature. This research aims to provide a summary analysis of the theories being used in contemporary logistics and supply chain management (SCM) studies. Design/methodology/approach A comprehensive literature review of articles appearing in five top tier logistics and SCM journals is conducted in order to identify how often theory is used and to classify the specific theories used. An analysis of the theoretical categories is presented to explain the type and frequency of theory usage. Findings Over 180 specific theories were found within the sampled articles. Theories grouped under the competitive and microeconomics categories made up over 40 per cent of the theoretical incidences. This does not imply all articles utilize theory. The research found that theory was explicitly used in approximately 53 per cent of the sampled articles. Practical implications Two implications are central. First, in the minds of editors, reviewers and authors is approximately 53 per cent theory use enough? Literature suggests there continues to be a need for theory‐based research in the discipline. A first step may be to increase our theory use, and to clearly describe the theory being used. Second, the vast majority of theories used in recent logistics and SCM research originated in other disciplines. Growth in the discipline dictates the need for greater internal theory development. Originality/value Despite multiple calls for the use of theory in logistics and SCM, little formal research has been produced examining the actual theories being used. This research provides an in‐depth review and analysis of the use of theory in logistics and SCM research during the period 2004‐2009.
Article
Full-text available
Purpose To analyze the state of supply chain quality management in manufacturing companies by testing several hypotheses regarding the knowledge these companies have about their different supply chain partners, the attributes that characterize customer‐supplier relationships and the factors that determine the development of quality specifications in a supply chain, and the effect of supply chain quality management activities of companies on product quality. Design/methodology/approach Six hypotheses related to supply chain quality management have been developed through literature review and tested using survey data from US manufacturing companies. Findings Provides information about the results of each hypothesis, their implications, and how these findings relate to the previous literature. Research limitations/implications The study offers insights into what the findings suggest and provides guidelines for future research to tackle issues raised by these findings. There were also some research limitations. For instance, the study relied on the perceptions of the respondents to operationalize the survey instrument, and the variables were mostly operationalized using single measures. Practical implications The study recommends ways managers can use the study's findings to improve supply chain quality. Originality/value This paper fills a void in the literature by focusing on quality in supply chain management.
Article
Full-text available
In this editorial, we share our vision and expectation that articles published in the Journal of Business Logistics will be grounded in sound theory and make a clear contribution to theory development. By helping us make sense out of chaos, theory’s explanatory power leads to better decision making — a goal we should proactively pursue.
Article
In a field study, decision makers were found to choose information sources based on accessibility rather than quality. Some variation in source use was associated with individual characteristics such as motivation and tenure. Implications of the results are discussed for studies of communication and decision making.
Article
A control chart is considered for the problem of monitoring a process when all items from the process are inspected and classified into one of two categories. The objective is to detect changes in the proportion, p, of items in the first category. The control chart being considered is a cumulative sum (CUSUM) chart based on the Bernoulli observations corresponding to the inspection of the individual items. Bernoulli CUSUM charts can be constructed to detect increases in p, decreases in p, or both increases and decreases in p. The properties of the Bernoulli CUSUM chart are evaluated using exact Markov chain methods and by using a corrected diffusion theory approximation. The corrected diffusion theory approximation provides a relatively simple method of designing the chart for practical applications. It is shown that the Bernoulli CUSUM chart will detect changes in p substantially faster than the traditional approach of grouping items into samples and applying a Shewhart p-chart. The Bernoulli CUSUM chart is also better than grouping items into samples and applying a CUSUM chart to the sample statistics. The Bernoulli CUSUM chart is equivalent to a geometric CUSUM chart which is based on counting the number of items in the second category that occur between items in the first category.
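The upward Bernoulli CUSUM described above accumulates log-likelihood-ratio increments for each 0/1 inspection result and signals when the sum reaches a decision limit h. The sketch below follows that standard construction; the particular values of p0, p1, and h used in the test are illustrative assumptions (in practice h is chosen for a target in-control ARL).

```python
from math import log

def bernoulli_cusum(obs, p0, p1, h):
    """Return the index of the first signal that p increased from p0
    toward p1, or None if no signal. obs is a stream of 0/1 results."""
    # Log-likelihood-ratio increments for a defective (1) and a good item (0).
    up = log(p1 / p0)            # positive: evidence for the shift
    down = log((1 - p1) / (1 - p0))  # negative: evidence against it
    s = 0.0
    for i, x in enumerate(obs):
        s = max(0.0, s + (up if x else down))  # CUSUM reset at zero
        if s >= h:
            return i
    return None
```

Because every individual item updates the statistic, this chart reacts to a shift in p faster than grouping items into samples for a Shewhart p-chart, which is the comparison the abstract reports.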
Article
Control-charting has been successfully applied to two National Highway Traffic Safety Administration data systems to help control and thus assure the quality of their data. This article describes the methods used, illustrates the approach through various examples, and discusses various technical issues in applying control charts to these traffic safety data. The article also explains the rationale of the methods in terms of the logic of statistical process control. Finally, an example of nonrandomly missing data is given.
Article
The paper offers simple robust algorithms for checking consistency of large volumes of measured data. The checks differentiate between data collected on a spatial grid at one time point; and data collected on a spatial grid over many time points, as well as several related measurements collected on a spatial grid over time. The checking process involves computationally efficient methods of estimating expected values and variances used to judge measurement consistency. Three-sigma control limits are applied to flag inconsistent measurements. CUSUM and EWMA plans are advocated for flagging consistently small biased measures.
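In the spirit of the checks described above, a measurement on a one-dimensional spatial grid can be flagged as inconsistent when it falls outside three-sigma limits estimated from its grid neighbours. This is an assumed minimal sketch, not the paper's algorithm; the window size and the leave-one-out neighbourhood are illustrative choices.

```python
from statistics import mean, stdev

def flag_inconsistent(values, half_window=3, k=3.0):
    """Flag values[i] when it lies more than k standard deviations from
    the mean of its up-to-2*half_window grid neighbours."""
    flags = []
    for i, x in enumerate(values):
        # Neighbouring measurements, excluding the point being checked.
        nbrs = values[max(0, i - half_window):i] + values[i + 1:i + 1 + half_window]
        if len(nbrs) < 2:
            flags.append(False)  # too few neighbours to judge
            continue
        mu, sd = mean(nbrs), stdev(nbrs)
        flags.append(sd > 0 and abs(x - mu) > k * sd)
    return flags
```

The same leave-one-out idea extends to grids observed over time, where the expected value can be estimated from both spatial and temporal neighbours before applying the control limits.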
Article
Understanding sources of sustained competitive advantage has become a major area of research in strategic management. Building on the assumptions that strategic resources are heterogeneously distributed across firms and that these differences are stable over time, this article examines the link between firm resources and sustained competitive advantage. Four empirical indicators of the potential of firm resources to generate sustained competitive advantage-value, rareness, imitability, and substitutability are discussed. The model is applied by analyzing the potential of several firm resources for generating sustained competitive advantages. The article concludes by examining implications of this firm resource model of sustained competitive advantage for other business disciplines.
Article
This paper describes the present status of statistical process control from the view-point of the statistician. It summarizes twenty papers dating from early 1930 to the present. It indicates the relationship between the traditional concepts in quality control and the new concept of adaptive control.
Article
In today’s global business environment, supply chains have increased in both length and complexity. This increase in length and complexity coupled with a focus on improving efficiency, such as lean manufacturing practices, may lead to higher levels of supply chain risk where the likelihood of a disruption severely impacting supply chain performance increases. Resilient supply chains have been touted as a means to reduce the likelihood and severity of supply chain disruptions. However, there is little empirical evidence relative to the factors that contribute to or detract from supply resiliency. Using systems theory and the resource-based view of the firm as the theoretical underpinnings, this study provides an in-depth systematic investigation of supply resiliency. Adopting a theory-building approach based on a multi-industry empirical investigation, this study derives empirical generalizations linking 19 supply chain characteristics to supply resiliency. The study culminates in a framework that could be used to assess the level of resiliency in a supply base. Building on this framework, the study also provides a supply resiliency matrix that can be utilized to classify supply chains, or supply chains segments according to the level of resiliency realized. This article concludes by proposing several future research directions and issues that may be investigated in more detail.
Article
We illuminate the myriad of opportunities for research where supply chain management (SCM) intersects with data science, predictive analytics, and big data, collectively referred to as DPB. We show that these terms are not only becoming popular but are also relevant to supply chain research and education. Data science requires both domain knowledge and a broad set of quantitative skills, but there is a dearth of literature on the topic and many questions. We call for research on skills that are needed by SCM data scientists and discuss how such skills and domain knowledge affect the effectiveness of an SCM data scientist. Such knowledge is crucial to develop future supply chain leaders. We propose definitions of data science and predictive analytics as applied to SCM. We examine possible applications of DPB in practice and provide examples of research questions from these applications, as well as examples of research questions employing DPB that stem from management theories. Finally, we propose specific steps interested researchers can take to respond to our call for research on the intersection of SCM and DPB.
Article
Logistics customer service has received considerable attention over the past several decades. Evidence exists that superior logistics customer service leads to better overall firm performance. Yet mixed findings were observed, and this relationship has been tested across multiple operationalizations and diverse industry settings, which may contribute to these mixed findings. There is thus a need for a systematic analysis that examines all of the prior evidence in an aggregate inquiry of logistics customer service. Meta‐analysis, which is a relatively under‐utilized methodology in supply chain management research, is applied to provide a quantitative examination of 37 sample studies and an assessment of overall population effects. The main contribution of this research is that we statistically aggregate and summarize existing research on logistics customer service. In addition, moderators that affect the relationship between logistics customer service and firm performance are examined. The results provide evidence that logistics customer service has a significant positive relationship with firm performance; however, significant heterogeneity was detected. This points to areas in need of additional research in order to obtain generalizable evidence.
Article
Purpose ‐ This paper aims to explore the factors of quality control (QC) among key members of a supply chain and investigate the effect on supply chain management (SCM). Design/methodology/approach ‐ This research employs a case study approach of five firms in the fresh fruit and vegetable supply chain in Jordan. Cases are first analysed individually. Then a cross-analysis supplemented with archival material and non-participant observation is made. A questionnaire is also conducted in order to analyse the effect of QC on SCM. Findings ‐ The findings identify the high-order factors of QC and demonstrate the role of QC in SCM, acting as the main strategy to improve supply chains. Practical implications ‐ The case studies draw on the experiences and views of supply chain members in order to improve the understanding of the role of QC in SCM. The proposed conceptual framework can help managers in understanding the factors of supply chain QC. Originality/value ‐ This is one of only a few studies that examine QC in the supply chain. It is also one of only a few research studies to provide empirical evidence of the role of QC in SCM for the fruit and vegetable industry.
Article
In many production firms it is common practice to financially reward managers for firm performance improvement. The use of financial incentives for improvement has been widely researched in several analytical and empirical studies. Literature has also addressed the strategic effect of incentives, in particular what the effect of certain incentive structures would be on the behavior of a firm's competitor(s). Most of these studies, however, focus on sales incentives. In this paper we investigate the effects of strategic incentives for product quality and process improvement using a game theoretic model that considers two owner–manager pairs in competition. We find that if one of the managers is told to only maximize firm profits (which in fact is similar to profit incentives), the other manager will be offered positive incentives for product quality and process improvement. These product quality and process improvement incentives result in increased profits, at the expense of the profits of the other firm. Also we find that if both firm owners have the possibility to offer incentives for product quality and process improvement, they will both do so. However, this equilibrium essentially entails a prisoner's dilemma, in which the two firms earn lower profits compared to a situation in which the owners instruct their managers only to maximize firm profits. Insights into the normalization of the problem and the aggregation of multiple product quality and process improvement variables are also discussed.
Article
Research suggests that there are other, more granular factors within the domain of innovation diffusion theory that influence the adoption of technological innovations. In this study, the circumstances that affect a firm's intention to adopt cloud computing technologies in support of its supply chain operations are investigated by considering tenets of classical diffusion theory as framed within the context of the information processing view. We posit that various aspects of an organization and its respective environment represent both information processing requirements and capacity, which influence the firm's desire to adopt certain information technology innovations. We conducted an empirical study using a survey method and regression analysis to examine our theoretical model. The results suggest that business process complexity, entrepreneurial culture and the degree to which existing information systems embody compatibility and application functionality significantly affect a firm's propensity to adopt cloud computing technologies. The findings support our theoretical development and suggest complementarities between innovation diffusion theory and the information processing view. We encourage other scholars to refine our model in future supply chain innovation diffusion research. The findings of this study might also be used by industry professionals to aid in making more informed adoption decisions in regard to cloud computing technologies in support of the supply chain.
Article
As the volume and variety of available data continues to proliferate, organizations increasingly turn to analytics in order to enhance business decision-making and ultimately, performance. However, the decisions made as a result of the analytics process are only as good as the data on which they are based. In this article, we examine the data quality problem and propose the use of control charting methods as viable tools for data quality monitoring and improvement. We motivate our discussion using an integrated case study example of a real aircraft maintenance database. We include discussions of the measures of multiple data quality dimensions in this online process. We highlight the lack of appropriate statistical methods for analysis of this type of problem and suggest opportunities for research in control chart methods within the data quality environment. This paper has supplementary material online.
Article
This paper represents an attempt to offer a comprehensive bibliography of references on control charting using attribute data. A brief overview and perspective is given of some of the work in this area. Suggestions are made for future research topics.
Article
That we cannot make all pieces of a given kind of product identically alike is accepted as a general truth. It follows that the qualities of pieces of the same kind of product differ among themselves, or, in other words, the quality of product must be expected to vary. The causes of this variability are, in general, unknown. The present paper presents a scientific basis for determining when we have gone as far as it is economically feasible to go in eliminating these unknown or chance causes of variability in the quality of a product. When this state has been reached, the product is said to be controlled because it is then possible to set up limits within which the quality may be expected to remain in the future. By securing control, we attain the five economic advantages discussed in Part III.
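The limits within which a controlled process may be expected to remain can be illustrated with a minimal three-sigma sketch. Estimating the mean and standard deviation directly from in-control reference data is a simplification here; Shewhart's procedure typically estimates dispersion from subgroup ranges.

```python
import statistics

def shewhart_limits(samples, sigma_mult=3.0):
    """Control limits for individual measurements.

    samples: in-control reference data used to estimate the process
    mean and standard deviation. Returns (lower, center, upper),
    with limits placed sigma_mult standard deviations from the mean.
    """
    center = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    return (center - sigma_mult * sigma, center, center + sigma_mult * sigma)
```

Future observations falling outside these limits suggest an assignable (non-chance) cause of variability worth investigating.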
Article
A geometrical moving average gives the most recent observation the greatest weight, and all previous observations weights decreasing in geometric progression from the most recent back to the first. A graphical procedure for generating geometric moving averages is described in which the most recent observation is assigned a weight r. The properties of control chart tests based on geometric moving averages are compared to tests based on ordinary moving averages.
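The weighting scheme described above has a simple recursive form: each new observation receives weight r, and all earlier observations decay geometrically by (1 - r) per step. A minimal sketch, with the starting value z0 assumed to be supplied by the user:

```python
def geometric_moving_average(observations, r, z0):
    """Geometrically weighted moving average (today usually called EWMA).

    z_t = r * x_t + (1 - r) * z_{t-1}, so the most recent observation
    carries the greatest weight and earlier observations carry weights
    decreasing in geometric progression. Returns the successive z_t.
    """
    z = z0
    out = []
    for x in observations:
        z = r * x + (1 - r) * z
        out.append(z)
    return out
```

With r close to 1 the average tracks recent observations closely (approaching a Shewhart chart); small r smooths heavily and resembles an ordinary moving average over a long window.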
Article
The purpose of this paper is to explain why task uncertainty is related to organizational form. In so doing the cognitive limits theory of Herbert Simon was the guiding influence. As the consequences of cognitive limits were traced through the framework various organization design strategies were articulated. The framework provides a basis for integrating organizational interventions, such as information systems and group problem solving, which have been treated separately before.
Article
This article presents the design procedures and average run lengths for two multivariate cumulative sum (CUSUM) quality-control procedures. The first CUSUM procedure reduces each multivariate observation to a scalar and then forms a CUSUM of the scalars. The second CUSUM procedure forms a CUSUM vector directly from the observations. These two procedures are compared with each other and with the multivariate Shewhart chart. Other multivariate quality-control procedures are mentioned. Robustness, the fast initial response feature for CUSUM schemes, and combined Shewhart-CUSUM schemes are discussed.
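The first of the two procedures, reducing each multivariate observation to a scalar before cumulating, can be illustrated with a minimal sketch. The projection onto an assumed shift direction and the identity covariance are simplifying assumptions made here for brevity, and the reference value k and decision interval h are left to the user rather than derived from the paper's design tables.

```python
def scalar_reduction_cusum(observations, mu0, direction, k, h):
    """Sketch of a scalar-reduction multivariate CUSUM.

    Each p-dimensional observation is reduced to a scalar by projecting
    its deviation from the in-control mean mu0 onto an assumed shift
    direction (identity covariance assumed for simplicity). A one-sided
    CUSUM with reference value k and decision interval h is then run on
    the scalars. Returns the 1-based index of the first signal, or None.
    """
    s = 0.0
    for t, x in enumerate(observations, start=1):
        # Scalar reduction: projection of the deviation onto `direction`
        scalar = sum(d * (xi - mi) for d, xi, mi in zip(direction, x, mu0))
        s = max(0.0, s + scalar - k)
        if s >= h:
            return t
    return None
```

The paper's second procedure, which cumulates a vector and shrinks it toward zero, is not sketched here; it avoids having to pre-specify a shift direction.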
Article
In this article, the author reviews and synthesizes the varying definitions of product quality arising from philosophy, economics, marketing, and operations management. He then goes on to build an eight-dimensional framework to elaborate on these definitions. Using this framework, he addresses the empirical relationships between quality and variables such as price, advertising, market share, cost, and profitability.
Article
This paper, presented orally to the Gordon Research Conference on Statistics in Chemistry in July 1960, traces the development of process inspection schemes from the original methods of Shewhart to the new charts using cumulative sums, and surveys the present practice in the operation of schemes based on cumulative sums. In spite of the completely different appearance of the visual records kept for Shewhart and cumulative sum charts, a continuous sequence of development from the one type of scheme to the other can be established. The criteria by which a particular process inspection scheme is chosen are also developed and the practical choice of schemes is described.
Article
Reviews the dynamic operation of supply chains and reaches some simple conclusions for reducing demand amplification, which consequently attenuates swings in both production rates and stock levels. The results are based on one particular supply chain, for which the use of systems simplification techniques has generated valuable insight into supply chain design. Although different strategies are compared for reducing demand amplification as witnessed by one particular supply chain model, the conclusions are nevertheless thought to have wide application and, indeed, implication. Comments in depth on the significance of the simulation results for the demand chain as a whole, and for the role of an individual business within the chain. In the first instance, supply chain integration, and in particular free exchange of information, is a prerequisite for progress. In the second case, shows that reduction in lead times throughout the supply chain via JIT is similarly beneficial. Clearly pinpoints the limitation to supply chain improvement which can be obtained as a result of using JIT alone. This can be an expensive and ongoing process of improvement with many spin-off benefits. Nevertheless, shows that the improvement possible by JIT operation of an individual business can be negated by the failure to design and manage the supply chain dynamics as a total system. The message for an individual business is thus quite specific. Not only must lead times be reduced via JIT, but also the business must seek to be part of the right supply chain, if it is to remain competitive and stable.
Article
Purpose – The objective of the study is to examine the extent to which quality management practices, tools and methods are employed and adopted in Australian companies. Design/methodology/approach – To address the aim of the study, a survey instrument was developed and data were collected from a field research on a sample of manufacturing, retail, and logistics companies. Findings – The results of this study indicate that the primary obstacles for not implementing quality programs are “establishing employee ownership of the quality process”, “changing the corporate culture”, and “training and education of employees”. The results also show that the most important component that identifies quality in logistics is “on‐time delivery”. Originality/value – This study demonstrates that integration of quality programs with corporate strategy, development of closer links with suppliers, and consistency in order cycle are critical initiatives required for improvement.
Article
Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data-but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.
Article
Back in the 1990s, computer engineer and Wall Street "quant" were the hot occupations in business. Today data scientists are the hires firms are competing to make. As companies wrestle with unprecedented volumes and types of information, demand for these experts has raced well ahead of supply. Indeed, Greylock Partners, the VC firm that backed Facebook and LinkedIn, is so worried about the shortage of data scientists that it has a recruiting team dedicated to channeling them to the businesses in its portfolio. Data scientists are the key to realizing the opportunities presented by big data. They bring structure to it, find compelling patterns in it, and advise executives on the implications for products, processes, and decisions. They find the story buried in the data and communicate it. And they don't just deliver reports: They get at the questions at the heart of problems and devise creative approaches to them. One data scientist who was studying a fraud problem, for example, realized it was analogous to a type of DNA sequencing problem. Bringing those disparate worlds together, he crafted a solution that dramatically reduced fraud losses. In this article, Harvard Business School's Davenport and Greylock's Patil take a deep dive on what organizations need to know about data scientists: where to look for them, how to attract and develop them, and how to spot a great one.