Figure 2 - uploaded by Onur Dogan
Define Phase: One of the goals of LSS is to eliminate variation in the process. In the Measure phase, ANOVA can identify the variation causing quality problems. Tally charts, descriptive statistics, and predictive statistics can be used to measure the current state of the process. Figure 3 indicates the suggested roadmap for the Measure phase.
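
As a concrete illustration of these Measure-phase tools, the following minimal Python sketch runs a one-way ANOVA over measurements from three hypothetical machines and prints basic descriptive statistics; the data, group names, and 0.05 threshold are assumptions made purely for illustration.

```python
# Minimal sketch of Measure-phase statistics (hypothetical data and groups).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical measurements of a critical-to-quality characteristic from three machines.
machine_a = rng.normal(loc=10.0, scale=0.20, size=30)
machine_b = rng.normal(loc=10.1, scale=0.20, size=30)
machine_c = rng.normal(loc=10.6, scale=0.25, size=30)

# Descriptive statistics summarize the current state of the process.
for name, sample in [("A", machine_a), ("B", machine_b), ("C", machine_c)]:
    print(f"Machine {name}: mean={sample.mean():.3f}, std={sample.std(ddof=1):.3f}")

# One-way ANOVA: does the machine factor explain a significant share of the variation?
f_stat, p_value = stats.f_oneway(machine_a, machine_b, machine_c)
print(f"F={f_stat:.2f}, p={p_value:.4f}")
if p_value < 0.05:
    print("At least one machine differs significantly: a candidate source of variation.")
```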

Source publication
Conference Paper
Full-text available
Almost all quality improvement methods require data collection and analysis to solve quality problems. The combination of six sigma and lean manufacturing creates the lean six sigma methodology, which aims to reach six sigma quality levels, fewer than 3.4 defective parts per million, by reducing variation and waste within processes. Achieving the goal d...
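
For context, the 3.4 parts-per-million figure corresponds to a 4.5 sigma distance to the nearest specification limit (six sigma minus the conventional 1.5 sigma long-term shift). The short worked sketch below, using hypothetical inspection counts, shows how defects per million opportunities (DPMO) is computed and how the 3.4 PPM value arises.

```python
# Worked example: defects per million opportunities (DPMO) and the 3.4 PPM target.
from scipy.stats import norm

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """DPMO = defects / (units * opportunities per unit) * 1,000,000."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Illustrative (hypothetical) inspection results: 17 defects in 5,000 units with 4 opportunities each.
print(dpmo(defects=17, units=5_000, opportunities_per_unit=4))  # 850.0 DPMO

# Six sigma quality with the conventional 1.5 sigma shift: tail probability beyond 4.5 sigma, per million.
print(norm.sf(6.0 - 1.5) * 1_000_000)  # about 3.4 defective parts per million
```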

Context in source publication

Context 1
... process discovery, the status of the process can be shown visually. Figure 2 indicates the suggested roadmap for the Define phase. ...
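
To illustrate the process-discovery step mentioned in this excerpt, the minimal sketch below derives a directly-follows graph from a small hypothetical event log using only pandas; the case, activity, and timestamp columns and the log itself are assumptions, and in practice a dedicated process mining library would typically be used.

```python
# Minimal process-discovery sketch: a directly-follows graph from a hypothetical event log.
from collections import Counter
import pandas as pd

# Hypothetical event log (case id, activity, timestamp) for illustration only.
log = pd.DataFrame({
    "case_id":   ["c1", "c1", "c1", "c2", "c2", "c2", "c2"],
    "activity":  ["register", "check", "ship", "register", "check", "rework", "ship"],
    "timestamp": pd.to_datetime([
        "2018-01-01 08:00", "2018-01-01 09:00", "2018-01-01 12:00",
        "2018-01-02 08:30", "2018-01-02 09:10", "2018-01-02 10:00", "2018-01-02 15:00",
    ]),
})

# Count how often one activity directly follows another within the same case.
dfg = Counter()
for _, trace in log.sort_values("timestamp").groupby("case_id"):
    activities = trace["activity"].tolist()
    for a, b in zip(activities, activities[1:]):
        dfg[(a, b)] += 1

# The resulting counts can be drawn as a graph to show the process status visually.
for (a, b), count in sorted(dfg.items()):
    print(f"{a} -> {b}: {count}")
```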

Similar publications

Article
Full-text available
Manufacturing companies strive for ever-increasing competitiveness through productivity and quality. This goal can be achieved through the implementation of lean manufacturing and six sigma methodologies. Lean manufacturing adds value by reducing waste, while six sigma eliminates variability. In this context, this article proposes a new framework w...

Citations

... Non-HEI institutions or smaller educational providers have a different resource base or capital to invest in digitalisation. While there are limited studies on Lean Six Sigma (LSS) and Industry 4.0, the literature discusses the benefits of combining Industry 4.0 technologies with Six Sigma problem-solving (Dogan and Gurcan, 2018; Sodhi, 2020). In addition, Industry 4.0 data analytics and data mining tools can aid statistical analysis and define, measure, analyse, improve, and control (DMAIC) problem-solving to aid improvements (Antony et al., 2021). ...
... Data quality directly affects a company's decision-making, operational efficiency, and customer satisfaction, which in turn affects the achievement of organizational dexterity (O'Cass et al., 2014). Today's businesses generate valuable data that can be mined and processed to enhance the effectiveness and quality of business processes. Process mining and analysis of high-quality data can assist businesses in making decisions that maximize their business, but for this to be possible the quality of the data must adhere to certain standards (Dogan & Gurcan, 2018). This means that quality improvement techniques must be used to address data quality issues to ensure the reliability of the information obtained from the data analysis (Van der Aalst et al., 2011). ...
Article
Full-text available
Study design/Methodology: In today's rapidly evolving digital landscape, the convergence of organizational dexterity and data quality has become paramount for organizations seeking to thrive amid the data revolution and navigate the complex web of opportunities and problems generated by data. This paper explores the essential intersection of data excellence, showing how organizational dexterity and well-informed decision-making are based on the synergy of big data and data quality. The paper is divided into three fundamental elements: first, the vast field of big data and its capacity to reveal revolutionary discoveries; next, the idea of organizational dexterity (the capacity of an organization to quickly adjust to changing conditions), data quality as the keystone that guarantees the accuracy and dependability of insights obtained from big data analytics, and the obstacles that organizations encounter in preserving data of superior quality; and finally, an integration of contemporary research, in which the practical advantages of combining organizational dexterity, data quality, and big data analytics are highlighted via case studies and real-world examples. All these points were obtained through a review of the literature and articles from global databases. Findings: This paper emphasizes the necessity of data excellence as a critical strategic initiative by coordinating big data endeavors with organizational dexterity and dedication to data integrity. In conclusion, organizations should prioritize data quality through governance, cleansing, and validation and foster a data-driven culture with training and leadership to enhance decision-making. Since developing agility within organizations is crucial for adapting to market changes, companies can not only prosper in the digital environment but also foster an environment of perpetual innovation and achievement.
... There are numerous optimization concepts in the field of production, which have already proved their worth through successful implementation in practice, such as lean production or the theory of constraints. However, these two optimization concepts have little interaction with the corresponding information flows [55,56]. Hence, the idea is to conceive and develop a platform for analyzing and optimizing processes in fixed mining facilities by accessing a combination of execution data and data from operational systems. ...
Article
Full-text available
Given the competition in the mining markets and the rapid evolution of customer requirements, the Moroccan mining group OCP (Office Chérifien des Phosphates) has been forced to improve the performance of its production systems. Thus, continuous performance improvement and optimization of production processes are prerequisites to remain competitive. However, in Morocco, data analytics-based mining process improvements do not fully utilize the data generated during process execution. They lack prescriptive methodologies to translate analytic results into improvement actions, which is the major goal of this work. Indeed, we propose a new platform for optimizing the production processes of a Moroccan mine based on knowledge extraction from data, allowing mine managers to rapidly and continuously improve the performance of their production chains. The platform will be an effective and efficient tool for mining companies to generate prescriptive action recommendations during the execution of the processes.
... They interviewed different manufacturing companies in Italy, and in the end, they classified Industry 4.0 technologies under the DMAIC stages (Chiarini and Kumar 2021). Furthermore, Dogan and Gurcan (2018) approached the subject from a data perspective, intending to improve quality with different methods for each DMAIC stage by covering the concepts of statistics, quality tools, data mining, big data, and process mining. In addition, Anvari, Edwards, and Agung (2021) presented the results of an ongoing study that aims to show mutual support between Industry 4.0 and LSS. ...
... Machine learning could be useful for making connections between entities, making intelligent decisions, and providing a deeper understanding of the system (Dogan and Gurcan 2018). In a cause-and-effect analysis, data derived from machine learning can be used to support dynamic decision-making and forecast potential causes of errors. ...
... Appropriate exploratory techniques are essential to analyze hidden causes of errors and different patterns. For this purpose, data mining provides association analysis, clustering, classification, and prediction for the system (Dogan and Gurcan 2018), and it is applicable for root cause analysis in terms of providing relevant data. ...
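
As a hedged illustration of such data mining for root cause analysis, the sketch below fits a shallow decision tree that separates defective from good parts using two hypothetical process variables; the variables, data, and defect rule are invented for illustration, and the learned splits merely point to candidate causes to investigate.

```python
# Minimal root-cause-analysis sketch: classify defects from process variables (hypothetical data).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 500

# Hypothetical process variables: furnace temperature and line speed.
temperature = rng.normal(200.0, 5.0, n)
line_speed = rng.normal(1.2, 0.1, n)

# Hypothetical ground truth: high temperature combined with high speed causes defects.
defect = ((temperature > 205) & (line_speed > 1.25)).astype(int)

X = np.column_stack([temperature, line_speed])
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, defect)

# The learned splits point to candidate root causes (variables and cut-off values).
print(export_text(tree, feature_names=["temperature", "line_speed"]))
```
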
... Furthermore, insights from Milena Rajić's "Lean Six Sigma: Integrating Knowledge, Data, and Innovation for Organizational Excellence" highlight the significance of data-driven decision-making in optimizing organizational processes, suggesting a novel DMAIC 4.0 framework that aligns digital technologies with specific stages of the Lean Six Sigma process, as shown in Figure 3 [17]. In conclusion, the synergistic integration of Lean Six Sigma with AI, ML, IoT, and Blockchain technologies is paving the way for a new paradigm in manufacturing and service operations [18,19]. This evolution of LSS, driven by digital transformation, not only enhances process efficiency and quality but also significantly contributes to the sustainability and resilience of modern manufacturing landscapes. ...
Preprint
Full-text available
Purpose: This study delves deep into systematically integrating AI, Blockchain, and IoT within manufacturing, guided by Lean Six Sigma (LSS) philosophy, aiming to promote higher precision, human safety, sustainability, reduced errors, and wastage while maintaining minimal human involvement. Design/methodology/approach: The study rigorously examines cases of the manufacturing industry integrating these concepts into real industrial or experimental setups and discusses their potential implications. It explores the step-by-step integration, control, and regulation of intricate manufacturing aspects. Findings: Artificial Intelligence is beneficial for real-time regulation and prompt corrective measures during manufacturing operations. The Internet of Things provides real-time feedback, ensuring synchronization between teams and departments to maintain flaw detection and correction. Blockchain offers security and transparency in task performance, supply chain management, and payments, resulting in a seamless and efficient manufacturing experience. Originality: This study is not a mere overview or surface-level juxtaposition of Industry 5.0 concepts. It offers a comprehensive analysis of the holistic integration of advanced technologies within manufacturing, linking developments to sustainable manufacturing. It provides new insights and perspectives not discussed in current literature found in databases like Scopus and Web of Science. Research limitations/implications: The study encourages ongoing research and development to meet modern economic needs and environmental challenges. It emphasizes that the integration is not the endpoint; rather, continuous improvement (kaizen) should prevail. Practical implications: This study serves as an authoritative source of information for the scientific community to further advancements in the field, guiding them to develop evidence-based opinions. Social implications: The findings support the vision of sustainable digital manufacturing, promoting economic efficiency and environmental sustainability, crucial for the current and future industrial landscape.
... In the context of the efforts made in the literature to find an implementation framework for LSS4.0, the attempts remain very limited (Antosz & Stadnicka, 2018; Chiarini & Kumar, 2021; Dogan & Gurcan, 2018). Despite recent efforts to combine I4.0 technologies with the LSS approach, it is still difficult to achieve an advanced level of automation (Ghobakhloo & Fathi, 2019). ...
... Since LSS relies on data and traditional methods require more time and cost, integrating I4.0 technologies, especially BDA and process mining, will have a powerful impact by providing information in real time, enabling effective decisions, and producing better-quality products. For this reason, Dogan and Gurcan (2018) proposed a theoretical framework to integrate BDA techniques into each phase of DMAIC or DMADV. Sony (2020) proposed a theoretical framework to implement I4.0 in organizations. ...
... Antosz and Stadnicka (2018) attempted to conduct a case study where Six Sigma is used to collect data, lean to identify waste, and I4.0 to improve the maintenance service. Dogan and Gurcan (2018) proposed a model that uses mining techniques across the whole LSS cycle to reach optimal and powerful decisions at each stage. Chiarini and Kumar (2021) provide a set of prerequisites to fulfill for successfully transitioning to LSS4.0. ...
... Machine Learning (ML) and Data Mining serve the purpose of categorising the data (Oussous et al., 2017), thus enabling the conversion from data to usable information. Humans are not able to cope with the significant amount of data that is collected (Dogan and Gurcan, 2018), let alone analyse it in real-time. Recent developments in the area of algorithms have significantly improved the capabilities of ML, which can capture and process large amounts of data at high velocity (Günther et al., 2017). ...
Article
Full-text available
There is a clearly identified need for adjusting the currently implemented standards and methods in the area of process improvement, like Six Sigma, to be aligned with technology advances in the context of Industry 4.0. Thus, this research aims to focus on the Six Sigma DMAIC methodology and introduce a new quality improvement cycle toward Industry 4.0. The proposed new Six Sigma implementation procedure is called the DMAISE (Pronunciation: də-mɛjz/də-mayz) improvement cycle, which consists of five main phases: Data Measurement, Analysis, Interpretation, Simulation and Enhancement. The DMAISE cycle is introduced to obtain all the benefits of DMAIC while not being affected by its limitations, which result from the lack of proper integration of the technologies available through the advancements inspired by Industry 4.0. A questionnaire survey was developed to collect data from practitioners, experienced employees, and academics in the available organisations to evaluate and validate the proposed new cycle. The results demonstrated that the proposed cycle is considered a viable quality improvement cycle for the new challenges that arise with the Industry 4.0/digital era and the smart technologies being developed for manufacturing environments.
... The literature shows multiple successful individual implementations of the aforementioned approaches, with significant contributions to process improvement, production costs, quality, response speed, delivery times, and flexibility (Dogan & Gurcan, 2018). ...
... This philosophy focuses on adding value for the customer and eliminating waste through the application of value stream mapping (VSM) (Chen & Weng, 2009) and problem solving to achieve continuous improvement of production systems (Dogan & Gurcan, 2018). ...
Article
Full-text available
Productive organizations currently face an era of constant challenges linked to adapting their processes to the new production paradigm of smart industries, globalized markets, high competitiveness, and product customization. In this scenario, operational excellence methodologies integrated with digital twin technologies play a leading role in improving business performance and generating competitive advantages. This work proposes, as its central research axis, the joint application of the TLS model (TOC, Lean, Six Sigma) with discrete-event simulation techniques and designs of experiments as a basis for improving the production capacity of a porcelain tile line in a ceramics plant located in the province of Buenos Aires. The methodology used for its development consists of four (4) main phases. The first phase characterizes the process and quantifies resources to develop a conceptual model. The second phase develops and validates the computational model, using FlexSim® as the simulation tool. The third phase is a diagnosis in which, through the VSM technique and the theory of constraints, the firing operation was identified as the bottleneck of the system. The final phase is an analysis in which, through a single-factor design of experiments, different scenarios were evaluated to boost the production capacity of the line by implementing changes to the current structure of the kiln.
... The literature review identified several barriers related to the I4.0 and LSS approach. For instance, Dogan and Gurcan (2018) highlighted that the lack of intelligent monitoring and automation systems, a lack of comprehensive understanding of data analysis, and poor knowledge of modelling with I4.0 and LSS are the top three barriers to LSS and I4.0 adoption in an organisation. Johansson (2019) considered a lack of comprehensive understanding of data analysis, poor prevention and resolution of problems through AI technology, and a lack of the ecosystem, infrastructure, and funds to deploy I4.0 and LSS as critical barriers to LSS and I4.0 adoption. ...
Article
Purpose The integration of Lean Six Sigma (LSS) and Industry 4.0 (I4.0) is in the nascent stage and promises to achieve new optimums in operational excellence. This study aims to empirically examine the enablers, barriers, benefits and application of I4.0 technologies in LSS and I4.0 integration. Design/methodology/approach A pilot survey was chosen as an appropriate methodology, as LSS and I4.0 integration is still budding. The survey targeted senior quality management professionals, quality managers, team leaders, LSS Black Belts and operations managers to collect the relevant research data. The questionnaire was sent to 200 respondents and received 53 valid responses. Findings This study reveals that "top management support" is an essential enabler for LSS and I4.0 integration. The most significant barriers were "poor understanding of data analysis" and "lack of top management support". The findings further illustrated that LSS and I4.0 integration resulted in greater efficiency, lower operational costs, improved productivity, improved customer satisfaction and improved quality. Regarding I4.0 technology integration at different phases of LSS, the authors noticed that big data analytics and artificial intelligence (AI) are the most prominent technologies used in all phases of LSS implementation. Research limitations/implications One of the limitations of this study is the sample size. LSS and I4.0 are emerging concepts; hence, obtaining a larger sample size is difficult. In addition, the study used non-parametric tests to analyse the data. Therefore, future studies should be conducted with large sample sizes across different continents and countries to understand differences in the key findings. Practical implications The outcomes of this study can be useful for organisational managers to understand the enablers and barriers before integrating LSS and I4.0 for adoption in their organisations. Secondly, it helps to convince top management and human resource personnel by providing a list of benefits of LSS and I4.0 integration. Finally, it can help decision-makers understand which I4.0 technologies can be used in different stages of LSS methodology. Originality/value LSS and I4.0 integration was studied at a conceptual level. This is the first empirical study targeted toward understanding the LSS and I4.0 integration. In addition, this study investigates the application of widely used I4.0 technologies in different phases of LSS.
... There is an increase in the valuable data generated in companies today, which can be mined and processed to improve the quality and performance of business processes. Process mining and analysis of good-quality data can help companies take effective decisions to optimize their business; for this to be possible, the quality of the data must meet certain standards (Dogan & Gurcan, 2018). This means quality improvement methods need to be applied to solve data quality problems so that the information obtained from data analysis can be reliable. ...
Article
Full-text available
Timestamps play a key role in process mining because it determines the chronology of which events occurred and subsequently how they are ordered in process modelling. The timestamp in process mining gives an insight on process performance, conformance, and modelling. This therefore means problems with the timestamp will result in misrepresentations of the mined process. A few articles have been published on the quantification of data quality problems but just one of the articles at the time of this paper is based on the quantification of timestamp quality problems. This article evaluates the quality of timestamps in event log across two axes using eleven quality dimensions and four levels of potential data quality problems. The eleven data quality dimensions were obtained by doing a thorough literature review of more than fifty process mining articles which focus on quality dimensions. This evaluation resulted in twelve data quality quantification metrics and the metrics were applied to the MIMIC-III dataset as an illustration. The outcome of the timestamp quality quantification using the proposed typology enabled the user to appreciate the quality of the event log and thus makes it possible to evaluate the risk of carrying out specific data cleaning measures to improve the process mining outcome.