Mark Treleven’s research while affiliated with University of North Carolina at Chapel Hill and other places


Publications (9)


Job-shop vs. hybrid flow-shop routing in a dual resource constrained system
  • Article

June 2007 · 45 Reads · 19 Citations · Decision Sciences

Douglas A. Elvers · Mark D. Treleven

This paper studies the impact of job routing pattern on the performance of dual resource constrained (DRC) systems. Three markedly different job routing patterns are simulated using a DRC system model that incorporates labor transfer times. The problem is viewed from two perspectives: one portion of the analysis focuses on the relative performances of five different dispatching rules as the routing pattern varies; the other examines the effect of the routing pattern on each rule's individual performance. One generalized conclusion of this study is that routing pattern has no significant impact on the relative effectiveness of various dispatching rules. Consequently, findings from pure job-shop research studies may be applied to situations where a mixture of job-shop and flow-shop routings is present.


A Review of the Dual Resource Constrained System Research

September 1989 · 100 Reads · 160 Citations · IIE Transactions

This paper reviews the recent dual resource constrained (DRC) system literature. The DRC system research has gained new significance through the emphasis placed on cross-training by Just-In-Time (JIT) system advocates. Cross-training is the key to labor flexibility, one of the central issues in DRC system research. This review covers over twenty-five articles about DRC systems and should help familiarize potential DRC system researchers with the field. The DRC system research and results, categorized into design and operating decisions, are summarized, and suggestions for future research are provided. A table summarizes the characteristics of the DRC systems modeled, the decision rules examined, the statistical analysis methods employed, and the performance criteria used in each study. An appendix highlights characteristics that are common to many of the models employed in the studies surveyed.

Handled by the Manufacturing and Automated Production Department


A Risk/Benefit Analysis of Sourcing Strategies: Single Vs. Multiple Sourcing

December 1988 · 922 Reads · 206 Citations · Journal of Operations Management

As more companies recognize the importance of quality to their long-term survival, they are, among other actions, attempting to formulate sourcing strategies consistent with their quality strategies and policies. The two major sourcing strategy options are multiple and single sourcing. Unfortunately, the literature has lagged behind practitioners' need for direction in this area and therefore provides little help to managers wrestling with this important and difficult issue. This study uses a risk/benefit approach to examine sourcing strategies from the perspective of the purchaser, or vendee. To ensure a complete and accurate identification and description of the factors that influence sourcing decisions, intensive interviews were conducted with purchasing and quality personnel at the manager and director levels within a number of large, international organizations. Most of the interview data were collected at Control Data, General Mills, Honeywell, Pillsbury, and Unisys; these data were supplemented with information from managers at other companies and the existing literature in this area.

The paper begins by describing the various definitions of single sourcing and the definition used for the purposes of this paper. Next, the factors that should influence managers' sourcing decisions are identified and classified into five risk/benefit categories. These categories are then incorporated into a conceptual risk/benefit assessment model for managers to use in evaluating their sourcing strategy. The conceptual model is an adaptation of the additive rating model used for facility location decisions [1]. It is suggested that the model is most appropriately applied at the family-of-parts level; application at this level would likely result in different sourcing strategies for different part families (and different organizations), depending on the characteristics of each family. This is consistent with the conclusions of Hahn, Kim and Kim [10].

This work contributes to the literature in several ways. The first contribution is a complete definition of the various levels of single sourcing. Another is the compilation of the many factors associated with both single and multiple sourcing decisions; prior to this research, the identification of these factors had been incomplete, limited to single sourcing, fragmented across a number of articles, and disseminated largely by word of mouth. The construction of a framework within which these factors can be categorized is also new to the literature. The final contribution is the development of a conceptual model to analyze the combined effects of these factors.
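The additive rating model the abstract adapts can be sketched as a weighted sum of category scores. The category weights and the per-strategy scores below are hypothetical illustrations, not values from the paper:

```python
def additive_rating(weights, scores):
    """Weighted additive score for one sourcing strategy.

    weights: importance weight per risk/benefit category (summing to 1).
    scores:  how well the strategy performs in each category.
    """
    assert len(weights) == len(scores)
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical weights for five risk/benefit categories, for one part family.
weights = [0.30, 0.25, 0.20, 0.15, 0.10]

# Hypothetical 0-10 scores for each strategy against those categories.
single_sourcing = [9, 8, 6, 4, 7]
multiple_sourcing = [6, 5, 8, 9, 6]

best = max(("single", additive_rating(weights, single_sourcing)),
           ("multiple", additive_rating(weights, multiple_sourcing)),
           key=lambda t: t[1])
```

With these illustrative numbers single sourcing scores higher, but the point of the model is the comparison itself: applied family by family, different part families (and organizations) may favor different strategies.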


A Comparison of Flow and Queue Time Variances in Machine-Limited Versus Dual-Resource-Constrained Systems

March 1988 · 27 Reads · 5 Citations · IIE Transactions

This paper addresses the issue of which performance time measure, flow or queue time, should be used to evaluate heuristics such as dispatching and labor assignment rules in both machine-limited and dual-resource-constrained production systems. Since the difference between the mean flow time and the mean queue time is simply the mean processing time, any reporting of differences in these means would be uninteresting. Therefore, this research focuses on the variances of these performance time measures. Using these measures, differences between machine-limited and dual-resource-constrained systems are also examined. The methodology employed to conduct this experimentation is computer simulation. The results indicate that there are major differences in the performance of machine-limited compared to dual-resource-constrained systems. Arguments for using queue time instead of flow time as a performance criterion are supported by the results of additional experimentation. These results have important implications for the interpretation of much of the existing and future research into dispatching and labor assignment rules.
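The identity the abstract relies on (flow time equals queue time plus processing time, so the means differ by exactly the mean processing time) can be checked with a small sketch. The simulated per-job times below are hypothetical, not the paper's model:

```python
import random

random.seed(7)
# Hypothetical per-job times: flow time is queue time plus processing time.
queue = [random.uniform(0, 10) for _ in range(1000)]
proc = [random.uniform(1, 3) for _ in range(1000)]
flow = [q + p for q, p in zip(queue, proc)]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Mean flow time minus mean queue time is exactly the mean processing time,
# by linearity of the mean...
assert abs(mean(flow) - mean(queue) - mean(proc)) < 1e-9
# ...but the variances do not decompose as simply: Var(flow) also carries
# Var(proc) and a covariance term, which is why the paper studies variances.
```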


The sources, measurements, and managerial implications of process commonality

October 1987 · 7 Reads · 10 Citations · Journal of Operations Management

This article explores the issue of commonality of processes. While the benefits of high process commonality (similarity of processes) are generally recognized, this research represents the first clear identification of the sources, development of measurements, and discussion of the managerial implications of process commonality. The first section of the article identifies and categorizes the sources of process commonality. These categories are: low set-up times, flexibility to change from one operation to another, and flexibility in making expedite decisions. Indices are developed in the second section for each of these three categories at the work center level. These indices are examined using sensitivity analysis and are found to respond appropriately to changes in parameters. Formulation of a composite index is also discussed. The final section discusses the managerial implications of process commonality at the strategic, tactical, and operational levels. The impact of improved process commonality covers a wide range of decisions, including product/process selection, production/inventory control system selection, and definition of group technology cells, among others.



Single Sourcing: A Management Tool for the Quality Supplier

March 1987 · 427 Reads · 135 Citations · Journal of Supply Chain Management

Single sourcing is a purchasing policy that purportedly can produce significant benefits for both parties involved. This article takes a close look at the pros and cons of single sourcing, particularly from the perspective of the vendor. It is important that buyers understand the factors that are likely to motivate a vendor to enter into this type of relationship. With this understanding, a buyer is better prepared to explore the possibility of single sourcing arrangements with selected vendors.


Component part standardization: An analysis of commonality sources and indices

February 1986 · 232 Reads · 120 Citations · Journal of Operations Management

Higher component part standardization has been recognized as an important area of empirical investigation, since it has been hypothesized to reduce inventory levels by reducing safety stock requirements, to reduce planned load through larger lot sizes, and to reduce planning complexity by reducing the number of items to be planned. Component part standardization therefore offers considerable promise for managers wishing to improve their production capabilities. To achieve higher standardization, measures indicating the degree of standardization are necessary. The most traditional measure of component part standardization is the degree of commonality index (DCI), which indicates the average number of uses per component part. Unfortunately, this measure has several theoretical limitations. First, it is a cardinal measure and therefore cannot capture the degree of uncommon part numbers that frequently cause production planning problems; nor, as a cardinal measure, can it be used for summary comparisons of planning complexity across organizations. This study develops a relative index bounded between 0 and 1, corresponding to lay-language usage: each item being unique (no standardization) and one item used everywhere (complete standardization).

A second major weakness of the DCI is that it does not recognize sources of standardization for decision making. There are at least three principal decisions that component part standardization indices can inform: (1) within-product decisions, (2) between-product decisions, and (3) make-buy decisions. The within-product decision refers to using each component as frequently as possible within each end item; this increased usage means fewer unique items within each end item and is expected to reduce that item's planning complexity. The between-product index is used to examine the design of new end items. Its purpose is to indicate the additional planning complexity introduced by new component parts; hence, between-product indices give information for reducing planning complexity when new products are introduced. This article develops these two types of indices to indicate the relative proliferation of new component parts.

The make-buy decision involves indices that describe manufactured versus purchased components. For manufactured components, the level index computes commonality at each product level of the bill of material. This index supports the design of a new, more automated manufacturing system by analyzing each level's standardization for possible inclusion in a more automated system (group technology, cellular manufacturing, flexible manufacturing system, or automated factory). For the buy decision, the indices developed here can be used to reduce the number of vendors attributable to "uncommon" components. Both level and buy indices can be used for analyzing and reducing the planning complexity of the production system.

A third fundamental problem with the DCI is its lack of realistic dimensions of end-item volume, quantity per assembly, and cost. The DCI weights each end item equally regardless of its volume, so an item produced once every two years is weighted exactly the same as the best-selling item. Similarly, the DCI does not consider the quantity per assembly (Q/A) of each component: a component with a very small Q/A in an end item carries exactly the same weight as one with a high Q/A. Finally, the DCI ignores the price of the component, which further limits its usefulness by leaving it without a cost dimension. This study develops relative within-product, between-product, and total commonality indices that include end-item volumes, quantities per assembly, and component prices. These indices can provide valuable insights for reducing relative planning complexity and improving system performance through analysis of the relative costs of component part usage.
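The contrast between the cardinal DCI and a relative [0, 1] index can be illustrated with a minimal sketch. The relative formula below is one common formulation of a total constant commonality index with the boundary behavior the abstract describes; the paper's own indices additionally weight by end-item volume, quantity per assembly, and price, which this sketch omits:

```python
def dci(parent_counts):
    """Degree of commonality index: the average number of immediate
    parents (uses) per distinct component part. Cardinal and unbounded."""
    return sum(parent_counts) / len(parent_counts)

def relative_commonality(parent_counts):
    """Relative commonality index bounded on [0, 1]:
    0 when every component is unique (each used exactly once),
    1 when a single component is used everywhere."""
    i = len(parent_counts)           # number of distinct components
    total_uses = sum(parent_counts)  # total parent-component links
    if total_uses == 1:              # degenerate single-use case
        return 1.0
    return 1.0 - (i - 1) / (total_uses - 1)
```

For example, three components each used once give a DCI of 1.0 and a relative index of 0.0 (no standardization), while one component used by three parents gives a relative index of 1.0 (complete standardization).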


An investigation of labor assignment rules in a dual-constrained job shop

November 1985 · 26 Reads · 72 Citations · Journal of Operations Management

One of the management decisions required to operate a dual-constrained job shop is the labor assignment rule. This study examines the effects of various labor assignment rules on the shop's performance. Eleven different labor assignment rules are simulated. A longest-queue rule and the traditional counterparts of the first-in-system, first-served, shortest operation time, job due date, critical ratio and shortest processing time dispatching rules are used to determine to which work center available workers should be transferred. Also tested are five new labor assignment rules that use an average of the priority values of all jobs in queue at a particular work center to determine whether that work center should receive the available worker. A SIMSCRIPT simulation program that models nine work centers provided the mechanism by which these rules were tested. Five dispatching rules, the counterparts of the five “traditional counterpart” labor assignment rules mentioned earlier, provided different shop environments. Also, the level of staffing of the work centers was altered to provide additional shop environments; staffing levels of 50% and 67% were employed.

The results show that none of the eleven labor assignment rules had a significant impact on shop performance. This is an important result because it implies that a manager can make the labor assignment decision based on other criteria, such as ease or cost of applying the rules. These results were relatively insensitive to the shop environment, as represented by the dispatching rule and the staffing level.
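The average-priority labor assignment idea described above can be sketched as follows. The data structure and priority values are hypothetical illustrations, not the paper's SIMSCRIPT model:

```python
def assign_worker(work_centers):
    """Pick the work center that should receive an available worker.

    work_centers maps a work-center name to the list of priority values
    of the jobs currently queued there (lower value = more urgent, as
    with shortest-processing-time or critical-ratio priorities). The
    rule transfers the worker to the center whose queued jobs have the
    lowest average priority value; centers with empty queues are skipped.
    """
    candidates = {wc: q for wc, q in work_centers.items() if q}
    if not candidates:
        return None  # no work anywhere; the worker stays put
    return min(candidates,
               key=lambda wc: sum(candidates[wc]) / len(candidates[wc]))

# Hypothetical queues of job priority values at three work centers.
queues = {"WC1": [5.0, 9.0], "WC2": [2.0, 4.0, 6.0], "WC3": []}
```

Here WC2's queue has the lowest average priority value, so the available worker would be transferred to WC2. A longest-queue rule would instead compare `len(q)` across centers.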

Citations (7)


... Manufacturing processes is another area that can benefit from standardization. The sources of process commonality have been investigated by (Treleven and Wacker 1987). They developed metrics to measure the degree of commonality – or variety – and analyzed their managerial implications. ...

Reference:

Production Planning and Control for Mass Customization – A Review of Enabling Technologies
The sources, measurements, and managerial implications of process commonality
  • Citing Article
  • October 1987

Journal of Operations Management

... Since co-creation endeavors are typically characterized by closer client-vendor relationships than in traditional procurement relationships (Mudambi and Helper 1998), early co-creation strategies focused on supplier base reduction and even advocated single sourcing (Treleven 1987), before academics recognized the shortcomings of this strategy (Mudambi and Helper 1998, Choi and Krause 2006). As firms become increasingly aware of the disadvantages of single sourcing, multiple sourcing in client-vendor co-creation is again gaining attention (Bapna et al. 2010, Wu et al. 2010, Feng and Shi 2012, Mishra et al. 2015). ...

Single Sourcing: A Management Tool for the Quality Supplier
  • Citing Article
  • March 1987

Journal of Supply Chain Management

... One of the first studies in this domain was conducted by Fryer (1974), who investigated the effects of different structures on the performance of a DRC job-shop. Later, Treleven and Elvers (1985) examined the impact of various assignment rules on a shop's performance, and ElMaraghy et al. (1999) developed a genetic algorithm (GA) with a new chromosome representation for the DRC job-shop. The GA was compared against six dispatching rules. ...

An investigation of labor assignment rules in a dual-constrained job shop
  • Citing Article
  • November 1985

Journal of Operations Management

... There will be a mismatch between the number of devices and the amount of manpower, and equipment and labour costs are high [13][14][15][16][17][18]. Therefore, the optimized scheduling of job-shop resources that considers the production cycle and capacity constraints is a prerequisite for reducing production cost and further improving the production efficiency of the job-shop, and is of important practical significance [19]. ...

Job-shop vs. hybrid flow-shop routing in a dual resource constrained system
  • Citing Article
  • June 2007

Decision Sciences

... For example, Collier (1981) measured commonality between product variants based on the number of reoccurring parent items for each distinct component in a product assembly structure. Wacker and Treleven (1986) measured the degree of commonality by dividing the number of physically identical components by the total number of components in a product family. Ishii (1996, 1997) measured commonality based on the ratio between the total number of unique components to the total number of components in a product family. ...

Component part standardization: An analysis of commonality sources and indices
  • Citing Article
  • February 1986

Journal of Operations Management

... While single sourcing brings in benefits of lower inventories, economies of scale and learning effect on production cost (reduction), higher raw material quality due to tandem improvement initiatives along with a just-in-time strategy, secured information sharing, etc., it also brings disruption risks into play (Namdar et al., 2018). Consequently, multiple sourcing acts as a good deterrent to risks such as natural disasters, shortages, employee strikes and techno-uncertainty (Treleven and Bergman Schweikhart, 1988). This also results in maintaining supplier competition and, hence, ensures that quality is not compromised (Elmaghraby, 2000). ...

A Risk/Benefit Analysis of Sourcing Strategies: Single Vs. Multiple Sourcing
  • Citing Article
  • December 1988

Journal of Operations Management