## No full-text available

To read the full-text of this research, you can request a copy directly from the authors.

Traditionally, data envelopment analysis (DEA) evaluates the performance of decision-making units (DMUs) with the most favorable weights on the best-practice frontier. In this regard, less emphasis is placed on non-performing or distressed DMUs. To identify the worst performers in risk-taking industries, the worst-practice frontier (WPF) DEA model has been proposed. However, the model does not account for evaluation under an uncertain environment. In this paper, we examine the WPF-DEA model from its basics and propose novel robust WPF-DEA models in the presence of interval data uncertainty and non-discretionary factors. The proposed approach is based on robust optimization, where uncertain input and output data are constrained in an uncertainty set. We first discuss the applicability of worst-practice DEA models to a broad range of application domains and then consider the selection of the worst-performing suppliers in supply chain decision analysis, where some factors are unknown and not under the discretion of management. Using Monte-Carlo simulation, we compute the conformity of rankings under interval efficiency and determine the price of robustness for selecting the worst-performing suppliers.
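The worst-practice orientation can be sketched as a small linear program: instead of maximizing a DMU's weighted output-to-input ratio, the ratio is minimized subject to every DMU lying on or above the worst-practice frontier. The Python sketch below is a minimal illustration of this idea only; the data, and the uniform sampling used to mimic interval uncertainty, are invented for the example and are not the authors' model.

```python
import numpy as np
from scipy.optimize import linprog

def wpf_score(X, Y, k):
    """Worst-practice CCR-style score of DMU k: minimize u . y_k subject to
    v . x_k = 1 and u . y_j - v . x_j >= 0 for every DMU j."""
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([Y[k], np.zeros(m)])           # minimize u . y_k
    A_ub = np.hstack([-Y, X])                         # v . x_j - u . y_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[k]])[None]  # v . x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return res.fun

X = np.array([[1.0], [1.0], [1.0]])   # one input per DMU (toy data)
Y = np.array([[1.0], [2.0], [4.0]])   # one output per DMU
scores = [wpf_score(X, Y, k) for k in range(3)]
# The worst performer sits on the worst-practice frontier (score 1);
# better performers obtain larger scores.
print(scores)  # -> approximately [1.0, 2.0, 4.0]

# Interval uncertainty can be mimicked by sampling outputs in +/-10% bands
rng = np.random.default_rng(0)
sampled = [wpf_score(X, Y * rng.uniform(0.9, 1.1, Y.shape), 1)
           for _ in range(100)]
print(min(sampled), max(sampled))     # spread of DMU 1's score
```

The sampling loop is a crude stand-in for the paper's robust counterpart: it only shows how interval data makes the worst-practice score itself an interval.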


... They applied their model to assess cross-efficiency and real-life bank data. Considering a similar adaptation of interval data and non-discretionary factors, Arabmaldar et al. [2] proposed a robust worst-practice model for the worst-performing suppliers, where some factors in the supply chain decision analysis are not under the discretion of management. Hatami-Marbini and Arabmaldar [28] extended the RDEA to estimate Farrell's cost efficiency in situations of endogenous and exogenous uncertainties. ...

... The relationship between the input-oriented models (2) and (4) and the output-oriented models (3) and (5) is established in the literature (see [14]). Mathematically, the relation ϕ* = 1/θ* holds. ...
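Under constant returns to scale, this reciprocal relationship can be checked numerically. The following sketch (illustrative two-DMU data of my own choosing, not from the cited models) solves the input- and output-oriented CCR envelopment problems with `scipy` and confirms ϕ* = 1/θ*.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [4.0]])  # inputs of two DMUs (toy data)
Y = np.array([[2.0], [2.0]])  # outputs
k = 1                         # DMU under evaluation
n = X.shape[0]

# Input-oriented CCR: min theta s.t. X'lam <= theta * x_k, Y'lam >= y_k
c = np.concatenate([[1.0], np.zeros(n)])
A_ub = np.vstack([
    np.hstack([-X[[k]].T, X.T]),                   # X'lam - theta*x_k <= 0
    np.hstack([np.zeros((Y.shape[1], 1)), -Y.T]),  # -Y'lam <= -y_k
])
b_ub = np.concatenate([np.zeros(X.shape[1]), -Y[k]])
theta = linprog(c, A_ub=A_ub, b_ub=b_ub,
                bounds=[(0, None)] * (1 + n), method="highs").fun

# Output-oriented CCR: max phi s.t. X'lam <= x_k, Y'lam >= phi * y_k
c = np.concatenate([[-1.0], np.zeros(n)])
A_ub = np.vstack([
    np.hstack([np.zeros((X.shape[1], 1)), X.T]),   # X'lam <= x_k
    np.hstack([Y[[k]].T, -Y.T]),                   # phi*y_k - Y'lam <= 0
])
b_ub = np.concatenate([X[k], np.zeros(Y.shape[1])])
phi = -linprog(c, A_ub=A_ub, b_ub=b_ub,
               bounds=[(0, None)] * (1 + n), method="highs").fun

print(theta, phi)  # theta = 0.5, phi = 2.0, so phi = 1/theta
```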

... Uncertainty analysis in an equality constraint may lead to infeasibility by restricting the feasibility region (see [5]). Salahi et al. [51] converted the normalization equation of model (2) into two inequality constraints. Omrani [41] instead replaced the equality con- ...

Robust Data Envelopment Analysis (RDEA) is a DEA-based conservative approach used for modeling uncertainties in the input and output data of Decision-Making Units (DMUs) to guarantee stable and reliable performance evaluation. The RDEA models proposed in the literature apply robust optimization techniques to the linear, conventional DEA models, which makes it difficult to obtain a robust efficient DMU. To overcome this difficulty, this paper tackles uncertainty in DMUs from the original fractional DEA model. We propose a robust fractional DEA (RFDEA) model in both input and output orientations, which enables us to overcome the deficiency of existing RDEA models. The linearized models of the fractional DEA are further used to establish duality relations from a pessimistic and an optimistic view of the data. We show that the primal worst of the multiplier model is equivalent to the dual best of the envelopment model. Furthermore, we show that robust efficiency in the input- and output-oriented DEA models remains equivalent in the new approach, which is not the case in conventional RDEA models. We finally present a study of the largest airports in Europe to illustrate the efficacy of the proposed models. The proposed RDEA approach is found to provide an effective management evaluation strategy under uncertain environments.
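The fractional CCR model max (uᵀy₀)/(vᵀx₀) subject to uᵀy_j/vᵀx_j ≤ 1 is classically linearized with the Charnes-Cooper transformation, which normalizes vᵀx₀ = 1. The sketch below shows that deterministic linearization only, as a hedged illustration of the starting point this abstract builds on, not of the paper's robust fractional models.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_multiplier(X, Y, k):
    """Charnes-Cooper linearization of the fractional CCR model:
    max u . y_k  s.t.  v . x_k = 1,  u . y_j - v . x_j <= 0,  u, v >= 0."""
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[k], np.zeros(m)])          # maximize u . y_k
    A_ub = np.hstack([Y, -X])                         # u . y_j - v . x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[k]])[None]  # v . x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

X = np.array([[2.0], [4.0]])   # toy data: one input, one output
Y = np.array([[2.0], [2.0]])
print(ccr_multiplier(X, Y, 0), ccr_multiplier(X, Y, 1))  # 1.0, 0.5
```

By LP duality this multiplier score coincides with the envelopment score, which is the primal-dual equivalence the abstract exploits.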

... Moreover, traditional DEA models represent efficiency values as fixed numerical values, which hinder the detection of potential efficiency changes. Arana-Jiménez et al. [17] used slack-based interval DEA to calculate efficiency scores, Wei et al. [18] proposed two forms of intervals for performance assessment, Lei et al. [19] calculated DEA sorting intervals and ordering, Aliasghar et al. [20] investigated the sorting problem of integer interval DEA, and Toloo et al. [21] proposed an interval efficiency approach that considers dual role variables. ...

The data envelopment analysis (DEA) models have been widely recognized and applied in various fields. However, these models have limitations: they cannot globally rank DMUs, they represent efficiency as definite numerical values, they are unable to reflect potential efficiency changes, and they fail to adequately reflect the degree of the decision maker’s preference. To address these shortcomings, this paper combines possibility theory with self-interest and non-self-interest principles to improve the DEA model and provide a more detailed reflection of the differences between DMUs. First, the self-interest and non-self-interest principles are employed to establish the DEA evaluation model, and the determined numerical efficiency is transformed into efficiency intervals. Second, an attitude function is added to the common possibility-degree formula to reflect the decision maker’s preference, and a more reasonable method for solving the attitude function is presented. Finally, the improved possibility-degree formula proposed in this paper is used to rank and compare the interval efficiencies. This improved method not only provides more comprehensive ranking information but also better captures the decision maker’s preferences. The model takes preference issues into account and has improved stability and accuracy compared with existing models. Its application to airlines shows that the model proposed in this paper effectively achieves a full ranking. From a developmental perspective, the efficiency levels of Chinese airlines were generally comparable. Joyair and One Two Three performed poorly, exhibiting significant gaps compared with other airlines.
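A common possibility-degree formula for comparing two efficiency intervals A = [aL, aU] and B = [bL, bU] is P(A ≥ B) = min{max((aU − bL)/((aU − aL) + (bU − bL)), 0), 1}. The sketch below implements that standard formula together with a simple Hurwicz-style attitude parameter λ; the latter is my own illustrative stand-in for an attitude function, not the paper's actual construction.

```python
def possibility_degree(a, b):
    """P(A >= B) for intervals a = (aL, aU), b = (bL, bU)."""
    aL, aU = a
    bL, bU = b
    width = (aU - aL) + (bU - bL)
    if width == 0:                      # both intervals degenerate to points
        return 1.0 if aL >= bL else 0.0
    return min(max((aU - bL) / width, 0.0), 1.0)

def attitude_value(a, lam):
    """Hurwicz-style representative value: lam weights the optimistic bound."""
    aL, aU = a
    return lam * aU + (1 - lam) * aL

A, B = (1.0, 3.0), (2.0, 4.0)
print(possibility_degree(A, B))   # 0.25: A is unlikely to dominate B
print(attitude_value(A, 0.8))     # 2.6 for an optimistic decision maker
print(attitude_value(B, 0.2))     # 2.4 for a pessimistic one
```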

... In addition, Arabmaldar et al. [61,62] recently proposed that, among the commonly used exogenous directions, proportional directional distances (with directional weights being respective observed inputs and outputs of the DMU under evaluation) yield the best performance against data uncertainty. However, also these directional proposals do not consider different preference structures and value judgments. ...

... They applied their model to assess cross-efficiency and real-life bank data. Considering a similar adaptation of interval data and non-discretionary factors, Arabmaldar et al. [34] proposed a robust worst-practice model for the worst-performing suppliers, where some factors in the supply chain decision analysis are not under the discretion of management. Robust DEA (henceforth RDEA) is the application of RO in DEA. ...

Due to the nonlinear and discrete nature of BCC (Banker, Charnes, and Cooper [1]) models for determining the most efficient decision-making unit, it is practically impossible to evaluate the models' dual and, consequently, the optimistic case. Thus, in this paper, the linear model with linear constraints proposed by Akhlaghi et al. [2] is used to investigate the duality between the model's robust problem and the optimistic case of the new model's dual under VRS uncertainty. The model proposed in this paper is novel compared with previous models because it determines the most efficient decision-making unit in a single solve, without relying on uncertain data to determine its rank. The paper demonstrates how the proposed robust model can also ascertain the most efficient decision-making unit when uncertainty exists. Furthermore, the dual problems raised by robust counterparts in the new linear programming (LP) model are addressed to identify the most efficient decision-making unit. The robust counterpart is demonstrated to be equivalent to a linear program under interval uncertainty, and the dual of the robust counterpart is shown to be equal to the optimistic counterpart of the dual problem. Consequently, this study aims to demonstrate that the dual problem is equivalent to a decision-maker operating under optimal data, whereas the primal robust problem is equivalent to a decision-maker operating through the worst possible data scenario.

... First, they presented robust DEA models to tackle uncertain inputs and outputs and then developed a pair of robust models to attend to uncertain input prices. Arabmaldar et al. (2021) examined the worst-practice DEA frontier and then proposed a robust model in the presence of interval data and non-discretionary factors. The excerpt then spills entries of the citing paper's Table 1 into the text; the recoverable rows are:

| Study | Model type | Model variant | Orientation | Uncertainty | Application |
|---|---|---|---|---|---|
| Shokouhi et al. (2014) | Radial | Interval CCR (Wang et al., 2005) | Input | Interval | Banking |
| Lu (2015) | Radial | BCC (Banker et al., 1984) | Output | Polyhedral | Meta-heuristics algorithm |
| Aghayi & Maleki (2016) | Non-radial | DDF (Zanella et al., 2015) | Output | Interval | Banking |
| Arabmaldar et al. (2017) | Radial | CCR (Charnes et al., 1978) | … | … | … |

A subsequent study (2022b) developed two robust non-radial DEA models to compute efficiency under data uncertainty and presented a simulation study to identify a suitable range of the conservatism level. We report in Table 1 some key differences between the aforesaid studies and our proposed study (shown in the last row) in terms of model types (radial/non-radial), model variants, orientation types, uncertainty types, and application areas. ...

In the data envelopment analysis literature, the directional distance function (DDF) model is commonly used to measure efficiency improvement, as it allows the decision maker to choose an appropriate direction that permits input contraction and output expansion. However, choosing the right direction is challenging in empirical research. Additionally, efficiency measurement becomes problematic when input and output data are uncertain. To address these issues, we present an equivalent DDF model in multiplier form and use the robust optimization approach to construct a technology in order to develop a generalized robust-DDF measure of efficiency. Among the three commonly used predefined directions (input-oriented, output-oriented, and proportional) considered in this study, we define the robust direction as the one with the minimum price that the decision maker must pay to be immune to data uncertainty. To demonstrate the usefulness of our proposed robust direction measure, we apply it to real-life data on life insurance companies in India over eight years (2011-12 to 2018-19). Our results show that the proportional direction exhibits the lowest price of robustness and is therefore the most appropriate for measuring potential efficiency improvement. Additionally, the increasing efficiency trend in the life insurance industry confirms the evidence of increased work intensity due to competition resulting from insurance reforms, supporting the competition and X-efficiency hypothesis.
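The deterministic DDF efficiency measure solves max β such that (x₀ − β g_x, y₀ + β g_y) stays inside the technology set. With the proportional direction g = (x₀, y₀), a minimal CRS sketch (toy data, not the insurance dataset, and without the robust layer) is:

```python
import numpy as np
from scipy.optimize import linprog

def ddf(X, Y, k, gx, gy):
    """max beta s.t. X'lam <= x_k - beta*gx and Y'lam >= y_k + beta*gy."""
    n = X.shape[0]
    c = np.concatenate([[-1.0], np.zeros(n)])   # maximize beta
    A_ub = np.vstack([
        np.hstack([gx[None].T, X.T]),           # X'lam + beta*gx <= x_k
        np.hstack([gy[None].T, -Y.T]),          # beta*gy - Y'lam <= -y_k
    ])
    b_ub = np.concatenate([X[k], -Y[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (1 + n), method="highs")
    return -res.fun

X = np.array([[2.0], [4.0]])   # toy data: one input, one output
Y = np.array([[2.0], [2.0]])
k = 1
beta = ddf(X, Y, k, gx=X[k], gy=Y[k])   # proportional direction (x_k, y_k)
print(beta)  # 1/3: inputs can shrink and outputs grow by a third
```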

... Moreover, the extant energy and environmental performance results of NMI are all obtained from the optimistic production frontier. In reality, however, decision makers may care not only how close they are to the best-performing DMU, but also how far they are from the worst-performing DMU (Arabmaldar et al. 2021). Therefore, to obtain more reliable results, it is necessary to consider both the optimistic production frontier and the pessimistic production frontier (Ganji et al. 2019). ...

As an energy-intensive industry, the non-ferrous metals industry receives increasing attention for its energy and environmental issues. Dividing the production of non-ferrous metals into two sub-stages, mining and processing (M&P) and smelting and pressing (S&P), a non-cooperative game network data envelopment analysis model with dual viewpoints (optimistic and pessimistic) is developed to evaluate the energy and environmental efficiency of China's non-ferrous metals industry. Moreover, we construct a novel network Malmquist productivity index to identify the dynamic evolution of energy and environmental performance. The results for 25 provinces over 2010–2017 show that the energy and environmental efficiency of China's non-ferrous metals industry is very low during the study period, and the efficiency gap between the two sub-stages increases significantly over time. Additionally, the energy and environmental efficiency in the eastern region is more affected by the M&P sub-stage, while that in the central and western regions is greatly influenced by the S&P sub-stage. Technical progress plays a key role in the productivity improvement of the M&P sub-stage, but the main driving force of productivity change in the S&P sub-stage differs across the three regions. Based on the above findings, several targeted policy implications are suggested to promote the sustainable development of China's non-ferrous metals industry.

... Sensitivity analysis is also used to determine how sensitive the decision-making units' (DMUs) solution values and efficiency scores are to numerical observations. By changing the DMUs' reference set, the proposed new model tests the robustness of DEA efficiency scores (Arabmaldar et al., 2021). For this purpose, in this study, the DEA method has been used. ...

Cotton is one of the important crops that play an important role in creating a livelihood for rural people in many parts of Iran. Cotton production necessitates a large amount of resources (e.g., fossil energy and agrochemicals, all of which have the potential to damage the environment in various ways). The purpose of the current study was to evaluate the environmental effects of cotton production in the South Khorasan Province of Iran. For this purpose, life cycle assessment (LCA) and data envelopment analysis (DEA) techniques have been applied to investigate the environmental impacts of cotton production. LCA is a practical method to evaluate the environment on the product flow, in which all aspects of the product life cycle are examined by a comprehensive approach. Furthermore, combining the LCA method with other managerial strategies such as DEA could allow researchers to provide decision-makers with more practical and interpretable data. The findings of the efficiency test showed that the average technical efficiency, pure technical efficiency, and scale efficiency were 0.81, 0.92, and 0.87, respectively. Respiratory inorganics (i.e., respiratory effects resulting from winter smog caused by emissions of dust, sulfur, and nitrogen oxides to air) posed the greatest environmental burden in cotton production, followed by non-renewable energy, carcinogens, and global warming. In addition, the highest effects were on human health, and then, on resources and climate change. Energy, on-system pollution, and waste played a crucial role in the environmental impacts of cotton processing. This study suggests improving farmers' knowledge toward the optimum application of chemical fertilizers, or their substitution with green fertilizers, which reduces the environmental effect of growing cotton in the area.
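The three reported efficiency notions are linked per farm by the standard DEA decomposition TE(CRS) = PTE(VRS) × SE, so scale efficiency can be recovered from the other two. A minimal check with invented scores (the reported 0.81, 0.92, and 0.87 are averages, for which the identity holds only approximately):

```python
crs = [0.72, 0.90, 0.81]   # technical efficiency under CRS (invented scores)
vrs = [0.80, 0.95, 0.90]   # pure technical efficiency under VRS
scale = [c / v for c, v in zip(crs, vrs)]   # scale efficiency per unit
print(scale)
# The identity TE = PTE * SE holds exactly for each unit by construction.
```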

... Thus one may combine it with our proposed ASBM models in the presence of undesirable outputs. Moreover, the ideas of uncontrollable inputs [33], non-discretionary factors [24], and uncertainty in the dataset [1,23,28] also have pervasive applications, so including them in the proposed models would be an absorbing future research direction. ...

The slacks-based measure (SBM) and additive SBM (ASBM) models are two widely used DEA models that act on input and output slacks and give efficiency scores between zero and unity. In this paper, we use both models with the weak disposability axiom for outputs to evaluate efficiency in a two-stage structure in the presence of undesirable outputs. In the external evaluation, the SBM model is reformulated as a linear program and the ASBM model as a second-order cone program (SOCP), which is a convex programming problem. In the internal evaluation, the SBM model is linearized for a specific choice of weights, while the ASBM model is presented as an SOCP for an arbitrary choice of weights. Finally, the proposed models are applied to a real dataset, for which an efficiency comparison and a Pearson correlation coefficient analysis show the advantages of the ASBM model over the SBM model.

... DEA (see e.g., Landete et al. (2017); Bȃdin et al. (2019)), robust DEA (see e.g., Hatami-Marbini & Arabmaldar (2021); Ebrahimi et al. (2021); Arabmaldar et al. (2021); Toloo et al. (2022)), and fuzzy-robust DEA (Gholizadeh et al., 2022) approaches. ...

Russell measure (RM) and enhanced Russell measure (ERM) are popular non-radial measures for efficiency assessment of decision-making units (DMUs) in data envelopment analysis (DEA). Input and output data of both original RM and ERM are assumed to be deterministic. However, this assumption may not be valid in some situations because of data uncertainty arising from measurement errors, data staleness, and multiple repeated measurements. Interval DEA (IDEA) has been proposed to measure the interval efficiencies from the optimistic and pessimistic viewpoints while the robustness of the assessment is questionable. This paper draws on a class of robust optimisation models to surmount uncertainty with a high degree of robustness in the RM and ERM models. The contribution of this paper is fivefold; (1) we develop new robust non-radial DEA models to measure the robust efficiency of DMUs under data uncertainty, which are adjustable based upon conservatism levels, (2) we use Monte-Carlo simulation in an attempt to identify an appropriate range for the budget of uncertainty in terms of the highest conformity of ranking results, (3) we introduce the concept of the price of robustness to scrutinise the effectiveness and robustness of the proposed models, (4) we compare the developed robust models in this paper with other existing approaches, both radial and non-radial models, and (5) we explore an application to assess the efficiency of the Master of Business Administration (MBA) programmes where data uncertainties influence the quality and reliability of results.
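Two of the diagnostics named here are easy to state in code: the price of robustness as the relative efficiency given up to be immune to uncertainty, and rank conformity as agreement between two rankings. The sketch below uses a Kendall-style concordant-pair count as one plausible conformity measure; the score vectors are invented, and this is not the paper's exact simulation design.

```python
def price_of_robustness(nominal, robust):
    """Relative efficiency loss per DMU from robustification."""
    return [(n - r) / n for n, r in zip(nominal, robust)]

def rank_conformity(a, b):
    """Fraction of DMU pairs ordered the same way by both score vectors."""
    pairs = [(i, j) for i in range(len(a)) for j in range(i + 1, len(a))]
    agree = sum((a[i] - a[j]) * (b[i] - b[j]) > 0 for i, j in pairs)
    return agree / len(pairs)

nominal = [1.00, 0.90, 0.70, 0.60]   # invented nominal efficiency scores
robust  = [0.95, 0.80, 0.65, 0.50]   # invented robust scores
por = price_of_robustness(nominal, robust)
print(por)
print(rank_conformity(nominal, robust))  # 1.0: the two rankings fully agree
```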

... Together with the widespread use of the RO approach in diverse fields, several studies have been carried out to tackle the problems of uncertainty in DEA (see e.g. Shokouhi et al. 2010, 2014; Sadjadi et al. 2011; Omrani 2013; Lu 2015; Arabmaldar et al. 2017; Tavana et al. 2021; Arabmaldar et al. 2021; Hatami-Marbini and Arabmaldar 2021). This strand of DEA models is generally based on the three RO approaches proposed by Mulvey et al. (1995), Ben-Tal and Nemirovski (2000), and Bertsimas and Sim (2004). ...

This paper aims to contribute to the contemporary and imperative research on the performance and productivity growth of the oil industry. Among cutting-edge methods, frontier analysis is a successful approach that has been widely used to assess the efficiency and productivity of entities with multiple resources and multiple outputs. This study first develops a unique framework based upon data envelopment analysis (DEA) to measure efficiency and productivity in a way that tackles uncertainty in data and undesirable outputs and, in turn, provides useful information to decision-makers. An adaptive robust optimisation (RO) approach is utilised to combat uncertain data whose distributions are unknown and to consider the nexus between the level of conservatism and decision makers’ risk preference. The key advantage of the proposed robust DEA approach is that the results remain relatively unchanged when uncertain conditions exist in the problem. An empirical study on oil refining is presented in situations of data uncertainty, with CO2 emissions considered as the undesirable output, to conduct an environmental efficiency and productivity analysis of 25 countries over the period 2000-2018. The empirical results obtained from the proposed approach yield some imperative implications. First, the results show that the price of robustness does not affect varying technologies identically when assessing productivity in a global oil market, and the USA's oil industry exhibits the highest productivity growth in all cases, confirming its efforts for the rapid rise in oil extraction and production at low costs. There may be practical lessons for other nations to learn from the American oil industry to improve productivity. Findings also support a considerable regress in the oil industry during the 2008 Global Financial Crisis compared with the rest of the periods in question and, due to monetary and fiscal stimulus, a sharp productivity growth from 2009 to 2011. The other implication that can be drawn is that the GDP growth rate and technology innovation can more effectively improve the productivity of the oil industry across the globe.

... In the DEA literature, there exist four main approaches to deal with imperfect knowledge of data: (i) fuzzy DEA (see e.g., Sengupta, 1992; Hatami-Marbini et al., 2010), (ii) chance-constrained DEA (see e.g., Land et al., 1993; Tavana et al., 2012, 2013; Izadikhah et al., 2020), (iii) interval DEA (see e.g., Cooper et al., 1999; Hatami-Marbini et al., 2014; Toloo et al., 2018; Toloo et al., 2021a), and (iv) robust DEA (see e.g., Shokouhi et al., 2010, 2014; Toloo and Mensah, 2019; Salahi et al., 2019; Salahi et al., 2020; Arabmaldar et al., 2021; Toloo et al., 2021b). Amongst them, the fuzzy DEA approach is the most visible, aiming to deal with the imprecision or vagueness frequently observed in real-world problems. ...

Performance-based budgeting (PBB) aims to formulate and manage public budgetary resources to improve managerial decisions based on actual performance measures of agencies. Although the PBB system has been overwhelmingly applied by various agencies, the progress and maturity of its implementation process are not satisfactory at large. Therefore, it is warranted to find, evaluate and improve the performance of organisations in relation to implementing a PBB system. To do so, composite indicators (CIs) have been proposed to aggregate multiple indicators associated with the PBB system, but their employment is contentious as they often lean on ad-hoc and troublesome assumptions. Data envelopment analysis (DEA) methods, as a powerful and established tool, help to contend with key limitations of CIs. Although the original DEA method ignores an internal production process, knowledge of the internal structure of the PBB systems and indicators is important to provide further insights when assessing the performance of PBB systems. In this paper, we present a budget assessment framework by breaking a PBB system into two parallel stages, operations performance (OP) and financial performance enhancement (FPE), to open up the black-box structure of the system and consider the indicator hierarchy configuration of each stage. In situations of hierarchical indicator configuration, we develop a multilayer parallel network DEA-based CIs model to measure the PBB maturity levels of the system and its stages. It is shown that the discrimination power of the proposed multilayer model is better than that of the existing single-layer models, and in situations with a relatively small number of DMUs, the model developed in this paper can be a good solution for reducing the dimensionality of indicators. Moreover, this research leverages fuzzy logic to surmount the subjective information that is often available in collecting indicators of the PBB systems. The major contribution of this research is to examine a case study of a PBB maturity award in Iran, as a developing country with a myriad of financial challenges, to adopt a PBB maturity model as well as point towards the efficacy and applicability of the proposed framework in practice.

Supply chain operations reference (SCOR) is an appropriate method for assessing supply chains. SCOR defines supply chains' stages as follows: plan, source, make, deliver, and return. To examine these phases in terms of economic, social, and environmental aspects, the SCOR model is incorporated into the sustainable supply chain management (SSCM) concept. Given that the case study deals with two-stage networks, i) a new network data envelopment analysis (NDEA) model is developed to evaluate the divisional and overall efficiencies of supply chains. ii) In this study, NDEA and SCOR models are used to evaluate the sustainability of supply chains. iii) Furthermore, to deal with the interval data, a new NDEA model is developed. Some theorems prove the capabilities of the proposed new model. iv) To assess the sustainability of supply chains, given the interval data, the objective of this paper is to develop an interval network directional distance function of inefficiency (INDDFI) model based on the SCOR framework. v) The preference-based approach for comparing the efficiency intervals is provided to present the final ranking. vi) The proposed method is applied to assess the sustainability of supply chains in the printing industry. vii) A sensitivity analysis of the results is presented. According to the sensitivity analysis, it is found that the activities in the Making Division have the highest effect on the whole network structure.

A one-shot device is defined as a one-time unit that cannot be subjected to destructive tests or used more than once. Each component or sub-component used in one-shot systems, regardless of how it is supplied, has a specific risk level. On the other hand, each component affects the factors influencing total system failure. The main objective of the present paper is to provide a multi-objective model for determining suppliers of components used in one-shot systems. In this model, the risk associated with each supplier is initially estimated using risk assessment indicators. In the second stage, the risks associated with sub-components are assessed using cause-and-effect matrices. Based on the likely catastrophic incidents to the system, fault tree analysis is used and the relevant diagrams are drawn to identify the best composition of equipment suppliers. The results obtained from a case study are then used in the specially developed multi-objective model to identify the allocations. The optimum values for the first and second objective functions are calculated to be 1489.349 and 0.809, respectively. Finally, the performance of the model is validated by sensitivity analysis.

Performance assessment can play an effective role in providing quality services in healthcare organizations. To develop and compete, hospitals need a performance evaluation system to measure the effectiveness and efficiency of their programs, processes, and workforce. However, hospital data are often uncertain, imprecise, or unknown. In this paper, a fuzzy stochastic data envelopment analysis (DEA) method based on the slacks-based measure of efficiency (SBM) is presented and employed to evaluate the performance of 11 hospitals in an urban setting. Due to the multiplicity of the initially identified criteria, an integrated algorithm based on Decision-Making Trial and Evaluation Laboratory (DEMATEL) and Analytic Network Process (ANP) is used to determine the main and effective criteria. The result of the algorithm shows that the criterion “total inpatient visits” is the most important criterion in this evaluation. Finally, the proposed model is implemented based on different levels of significance, and the performance of each unit is determined in uncertain conditions.

Data envelopment analysis (DEA) based productivity index models are widely applied to evaluate the productivity of decision-making units over a period. This study proposes a productivity index for evaluating environmentally sensitive productivity growth while excluding the possibility of spurious technical regress. This innovative index has been created by combining directional distance functions, sequential DEA, undesirable data, and the concept of interval DEA. With this combination, the traditional sequential Malmquist-Luenberger productivity index (SMLPI) has been reformulated as an interval DEA problem to present a novel productivity index named the interval SMLPI. We propose a decomposition of the developed index utilizing both constant-returns-to-scale and variable-returns-to-scale frontiers as the benchmark, which allows us to quantify scale efficiency change with interval data. To exhibit the capability of the proposed extension of the SMLPI, we model a framework for Indian commercial banks and measure productivity change intervals for twenty-one banks from 2011 to 2018. The empirical findings elucidate that scale efficiency change plays an essential role in driving productivity change. ICICI Bank had the highest average marginal productivity gain of 1.5007 between 2011 and 2018, whereas Karur Vysya Bank had the highest average marginal productivity decline of 0.9411.
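The standard Malmquist machinery behind such indexes writes MI = EC × TC, where EC is efficiency change and TC is the geometric mean of frontier shifts, all computed from four distance-function values. A small numeric sketch of that classical decomposition, assuming the distance values are given (this is not the paper's interval or Luenberger extension):

```python
from math import sqrt, isclose

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """d_a_b: distance of the period-b observation against the period-a
    frontier. Returns (Malmquist index, efficiency change, technical change)."""
    ec = d_t1_t1 / d_t_t                               # efficiency change
    tc = sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))   # technical change
    mi = sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))   # Malmquist index
    return mi, ec, tc

mi, ec, tc = malmquist(d_t_t=0.8, d_t_t1=1.1, d_t1_t=0.7, d_t1_t1=0.9)
print(mi, ec, tc)
print(isclose(mi, ec * tc))  # True: the decomposition is exact by algebra
```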

The ability to deal with a system's parametric uncertainties is an essential issue for heavy self-driving vehicles in unconfined environments. In this sense, robust controllers prove to be efficient for autonomous navigation. However, uncertainty matrices for this class of systems are usually defined by algebraic methods that demand prior knowledge of the system dynamics. In this case, the control system designer depends on the quality of the uncertain model to obtain optimal control performance. This work proposes a robust recursive controller designed via multiobjective optimization to overcome these shortcomings. Furthermore, a local search approach for multiobjective optimization problems is presented. The proposed method applies to any multiobjective evolutionary algorithm already established in the literature. The results show that this combination of model-based control and machine learning improves the effectiveness of the system in terms of robustness, stability, and smoothness.

Data envelopment analysis (DEA) is a data-driven and benchmarking tool for evaluating the relative efficiency of production units with multiple outputs and inputs. Conventional DEA models are based on a production system by converting inputs to outputs using input-transformation-output processes. However, in some situations, it is inescapable to think of some assessment factors, referred to as dual-role factors, which can play simultaneously input and output roles in DEA. The observed data are often assumed to be precise although it needs to consider uncertainty as an inherent part of most real-world applications. Dealing with imprecise data is a perpetual challenge in DEA that can be treated by presenting the interval data. This paper develops an imprecise DEA approach with dual-role factors based on revised production possibility sets. The resulting models are a pair of mixed binary linear programming problems that yield the possible relative efficiencies in the form of intervals. In addition, a procedure is presented to assign the optimal designation to a dual-role factor and specify whether the dual-role factor is a nondiscretionary input or output. Given the interval efficiencies, the production units are categorized into the efficient and inefficient sets. Beyond the dichotomized classification, a practical ranking approach is also adopted to achieve incremental discrimination through evaluation analysis. Finally, an application to third-party reverse logistics providers is studied to illustrate the efficacy and applicability of the proposed approach.

Degenerate optimal weights and uncertain data are two challenging problems in conventional data envelopment analysis (DEA). Cross-efficiency and robust optimization are commonly used to handle such problems. We develop two DEA adaptations to rank decision-making units (DMUs) characterized by uncertain data and undesirable outputs. The first adaptation is an interval approach, where we propose lower- and upper-bounds for the efficiency scores and apply a robust cross-efficiency model to avoid problems of non-unique optimal weights and uncertain data. We initially use the proposed interval approach and categorize DMUs into fully efficient, efficient, and inefficient groups. The second adaptation is a robust approach, where we rank the DMUs, with a measure of cross-efficiency that extends the traditional classification of efficient and inefficient units. Results show that we can obtain higher discriminatory power and higher-ranking stability compared with the interval models. We present an example from the literature and a real-world application in the banking industry to demonstrate this capability.
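Cross-efficiency scores each DMU with every other DMU's optimal CCR weights and then averages. The deterministic mechanic is sketched below on toy data; the paper's robust and interval variants, and the secondary goals used to resolve non-unique weights, are more involved than this.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_weights(X, Y, k):
    """Optimal multiplier weights (u, v) of DMU k in the CCR model."""
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[k], np.zeros(m)])          # maximize u . y_k
    A_eq = np.concatenate([np.zeros(s), X[k]])[None]  # v . x_k = 1
    res = linprog(c, A_ub=np.hstack([Y, -X]), b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return res.x[:s], res.x[s:]

X = np.array([[1.0], [1.0]])   # toy data: equal inputs
Y = np.array([[1.0], [2.0]])   # DMU 1 produces twice the output
n = len(X)
cross = np.zeros((n, n))
for k in range(n):             # row k: scored with DMU k's weights
    u, v = ccr_weights(X, Y, k)
    for j in range(n):
        cross[k, j] = (u @ Y[j]) / (v @ X[j])
print(cross.mean(axis=0))      # column means are the cross-efficiency scores
```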

This paper extends the conventional DEA models to a robust DEA (RDEA) framework by proposing new models for evaluating the efficiency of a set of homogeneous decision-making units (DMUs) under ellipsoidal uncertainty sets. Four main contributions are made: (1) we propose new RDEA models based on two uncertainty sets: an ellipsoidal set that models unbounded and correlated uncertainties and an interval-based ellipsoidal uncertainty set that models bounded and correlated uncertainties, and study the relationship between the RDEA models of these two sets, (2) we provide a robust classification scheme where DMUs can be classified into fully robust efficient, partially robust efficient and robust inefficient, (3) the proposed models are extended to the additive DEA model and their efficacy is analyzed against two imprecise additive DEA models from the literature, and finally, (4) we apply the proposed models to study the performance of banks in the Italian banking industry. We show that only a few banks that were resilient in their performance can be robustly classified as partially or fully efficient in an uncertain environment.
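The ellipsoidal robust counterpart can be illustrated on a single linear constraint. The sketch below is a minimal illustration, assuming uncorrelated coefficient deviations (a diagonal shape matrix) and a hypothetical helper name; the protection term a'x + Ω·sqrt(Σ(δ_j x_j)²) is the standard form from robust linear optimization, not the paper's full DEA model:

```python
import math

def ellipsoidal_robust_lhs(a, x, dev, omega):
    """Worst-case left-hand side of a'x <= b when the coefficient vector
    lies in an ellipsoid centred at the nominal `a`, with semi-axes `dev`
    and radius `omega` (diagonal shape matrix assumed)."""
    nominal = sum(aj * xj for aj, xj in zip(a, x))
    protection = omega * math.sqrt(sum((dj * xj) ** 2 for dj, xj in zip(dev, x)))
    return nominal + protection

# Nominal constraint 3*x1 + 4*x2 <= 8, with 10% deviations on each coefficient.
lhs = ellipsoidal_robust_lhs([3.0, 4.0], [1.0, 1.0], [0.3, 0.4], omega=1.0)
# lhs = 7 + sqrt(0.3**2 + 0.4**2) = 7.5, less conservative than the
# box (interval) worst case 7 + 0.3 + 0.4 = 7.7.
```

Embedding such a protection term in every uncertain constraint of a DEA multiplier model is the usual reason ellipsoidal RDEA models are solved as second-order cone programs.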

Robust goal programming (RGP) is an emerging field of research in decision-making problems with multiple conflicting objectives and uncertain parameters. RGP combines robust optimization (RO) with variants of goal programming techniques to achieve stable and reliable goals for previously unspecified aspiration levels of the decision-maker. The RGP model proposed in Kuchta (2004) and recently advanced in Hanks, Weir, and Lunday (2017) uses classical robust methods. The drawback of these methods is that they can produce optimal values far from the optimal value of the "nominal" problem. To overcome this drawback, we propose light RGP models generalized for the budget of uncertainty and ellipsoidal uncertainty sets in the framework discussed in Schöbel (2014) and compare them with the previous RGP models. Conclusions regarding the use of different uncertainty sets for the light RGP are drawn. Most importantly, we show that the total goal deviations of the decision-maker depend far more on the chosen thresholds than on the type of uncertainty set used.

Performance evaluation of decision-making units (DMUs) via data envelopment analysis (DEA) is confronted with multiple conflicting objectives, complex alternatives and significant uncertainties. Visualizing the risk of uncertainties in the data used in the evaluation process is crucial to understanding the need for cutting-edge solution techniques in organizational decisions. A greater management concern is to have techniques and practical models that can evaluate their operations and make decisions that are not only optimal but also consistent with the changing environment. Motivated by the pressing need to mitigate the risk of uncertainties in performance evaluations, this thesis focuses on finding robust and flexible evaluation strategies for the ranking and classification of DMUs. It studies performance measurement with the DEA tool and addresses the uncertainties in data via the robust optimization technique.
The thesis develops new models in robust data envelopment analysis with applications to management science, pursued in four research thrusts. In the first thrust, a robust counterpart optimization with nonnegative decision variables is proposed, which is then used to formulate new budget-of-uncertainty-based robust DEA models. The proposed model is shown to save computational cost in robust optimization solutions to operations research problems involving only positive decision variables. The second research thrust studies the duality relations of models within the worst-case and best-case approaches in the input-output orientation framework. A key contribution is the design of a classification scheme that utilizes the conservativeness and the risk preference of the decision maker. In the third thrust, a new robust DEA model based on ellipsoidal uncertainty sets is proposed, which is further extended to the additive model and compared with imprecise additive models. The final thrust applies modelling techniques including goal programming, robust optimization and data envelopment analysis to a transportation problem concerned with the efficiency of the transport network, uncertainties in the demand and supply of goods, and a compromise solution to the decision maker's multiple conflicting objectives.
Several numerical examples and real-world applications are presented to explore and demonstrate the applicability of the developed models and their relevance to management decisions. Applications include the robust evaluation of banking efficiency in Europe, in particular in Germany and Italy. Given the proposed models and their applications, the efficiency analysis explored in this research corresponds to the practical framework of industrial and organizational decision making and further advances the course of robust management decisions.

Determining the optimal scale size of a combined cycle power plant is inherently a complex problem, often with multiple and conflicting criteria as well as uncertain factors. The complexity of the problem is compounded by the production of undesirable outputs and the presence of natural and managerial disposability. We propose a customized data envelopment analysis (DEA) method for solving the return to scale (RS) problem in the presence of uncertain data and undesirable outputs. A combined cycle power plant is considered a decision making unit (DMU) which consumes fuels to produce electricity and emissions. The uncertainty of the inputs and outputs is modeled with interval data, and the emissions are treated as undesirable outputs. The proposed DEA method determines the interval efficiency scores of the DMUs and offers a practical benchmark for enhancing the efficiency scores. We demonstrate the applicability of the proposed method and exhibit the efficacy of the procedure with a six-year study of 17 combined cycle power plants in Iran. The main contributions of this paper are sixfold: we (1) model the uncertainties in the input and output data using interval data; (2) consider undesirable outputs; (3) determine the efficiency scores of the DMUs as interval values; (4) develop a group of indices to distinguish between the efficient and inefficient DMUs; (5) determine the most economic scale size for the efficient DMUs; and (6) determine practical benchmarks for the inefficient DMUs.
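The interval efficiency idea can be sketched in the simplest possible setting: one input and one output, where CCR efficiency reduces to a productivity ratio. The construction (the evaluated unit at its best data against the others at their worst, and vice versa) follows the standard interval-DEA recipe; the function name and the toy data are assumptions, and multi-input/multi-output models require solving linear programs instead:

```python
def interval_ccr_1d(x_lo, x_hi, y_lo, y_hi):
    """Interval CCR efficiency bounds for single-input/single-output DMUs.
    Upper bound: the evaluated DMU at its most favourable data against the
    others at their least favourable data; lower bound: the reverse."""
    n = len(x_lo)
    best = [y_hi[j] / x_lo[j] for j in range(n)]   # each DMU at its best data
    worst = [y_lo[j] / x_hi[j] for j in range(n)]  # each DMU at its worst data
    bounds = []
    for o in range(n):
        e_hi = best[o] / max([best[o]] + [worst[j] for j in range(n) if j != o])
        e_lo = worst[o] / max([worst[o]] + [best[j] for j in range(n) if j != o])
        bounds.append((e_lo, e_hi))
    return bounds

# Three DMUs with interval fuel input and electricity output (toy numbers).
bounds = interval_ccr_1d([2, 4, 3], [2, 5, 4], [4, 8, 3], [4, 10, 6])
# → [(0.8, 1.0), (0.8, 1.0), (0.3, 1.0)]: every DMU can be efficient in the
# best case, but the third DMU's wide data interval drags its lower bound down.
```

Indices built on such [lower, upper] pairs are what allow efficient and inefficient DMUs to be distinguished without collapsing the uncertainty to point estimates.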

Traditional models of decision-making under uncertainty assume perfect information, i.e., accurate values for the system parameters and specific probability distributions for the random variables. However, such precise knowledge is rarely available in practice, and a strategy based on erroneous inputs might be infeasible or exhibit poor performance when implemented. The purpose of this tutorial is to present a mathematical framework that is well-suited to the limited information available in real-life problems and captures the decision-maker's attitude towards uncertainty; the proposed approach builds upon recent developments in robust and data-driven optimization. In robust optimization, random variables are modeled as uncertain parameters belonging to a convex uncertainty set and the decision-maker protects the system against the worst case within that set. Data-driven optimization uses observations of the random variables as direct inputs to the mathematical programming problems. The first part of the tutorial describes the robust optimization paradigm in detail in single-stage and multi-stage problems. In the second part, we address the issue of constructing uncertainty sets using historical realizations of the random variables and investigate the connection between convex sets, in particular polyhedra, and a specific class of risk measures.

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. The standard DEA models assume that all inputs and outputs are crisp and can be changed at the discretion of management. While crisp input and output data are fundamentally indispensable in the standard DEA evaluation process, input and output data in real-world problems are often imprecise or ambiguous. In addition, real-world problems may also include non-discretionary factors that are beyond the control of a DMU's management. Fuzzy logic and fuzzy sets are widely used to represent ambiguous, uncertain or imprecise data in DEA by formalizing the inaccuracies inherent in human decision-making. In this paper, we show that considering bounded factors in DEA models disregards the concept of relative efficiency, since the efficiency of the DMUs is calculated by comparing the DMUs with their lower and/or upper bounds. In addition, we present a fuzzy DEA model with discretionary and non-discretionary factors in both the input- and output-oriented CCR models. A numerical example is used to demonstrate the applicability and the efficacy of the proposed models.

Optimal solutions of linear programming problems may become severely infeasible if the nominal data is slightly perturbed. We demonstrate this phenomenon by studying 90 LPs from the well-known NETLIB collection. We then apply the Robust Optimization methodology (Ben-Tal and Nemirovski [1–3]; El Ghaoui et al. [5, 6]) to produce "robust" solutions of the above LPs which are, in a sense, immune to uncertainty. Surprisingly, for the NETLIB problems these robust solutions lose almost nothing in optimality.

A robust approach to solving linear optimization problems with uncertain data was proposed in the early 1970s and has recently been extensively studied and extended. Under this approach, we are willing to accept a suboptimal solution for the nominal values of the data in order to ensure that the solution remains feasible and near optimal when the data changes. A concern with such an approach is that it might be too conservative. In this paper, we propose an approach that attempts to make this trade-off more attractive; that is, we investigate ways to decrease what we call the price of robustness. In particular, we flexibly adjust the level of conservatism of the robust solutions in terms of probabilistic bounds of constraint violations. An attractive aspect of our method is that the new robust formulation is also a linear optimization problem. Thus we naturally extend our methods to discrete optimization problems in a tractable way. We report numerical results for a portfolio optimization problem, a knapsack problem, and a problem from the NETLIB library. Subject classifications: Programming, stochastic: robust approach for solving LP/MIP with data uncertainties. Area of review: Financial Services. History: Received September 2001; revision received August 2002; accepted December 2002.
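The budget-of-uncertainty idea can be made concrete for a single constraint and a fixed solution x. The sketch below is a minimal illustration (the full method embeds the LP dual of this maximization inside the optimization; the function name and data are assumptions): at most Γ coefficients deviate from their nominal values, so the worst case adds the Γ largest possible contributions, with a fractional term when Γ is not an integer.

```python
import math

def bs_protection(dev, x, gamma):
    """Protection term for a row a'x <= b under a budget of uncertainty:
    worst-case extra usage when at most `gamma` coefficients (the last one
    possibly fractionally) deviate by up to dev_j from their nominal values."""
    contrib = sorted((d * abs(xj) for d, xj in zip(dev, x)), reverse=True)
    k = int(math.floor(gamma))
    prot = sum(contrib[:k])
    if k < len(contrib):
        prot += (gamma - k) * contrib[k]  # fractional part of the budget
    return prot

# Nominal row 4*x1 + 5*x2 + 3*x3 <= 13 with 10% coefficient deviations.
a, dev, x = [4.0, 5.0, 3.0], [0.4, 0.5, 0.3], [1.0, 1.0, 1.0]
nominal = sum(ai * xi for ai, xi in zip(a, x))   # 12.0
print(nominal + bs_protection(dev, x, 2.0))      # 12.9: still feasible
print(nominal + bs_protection(dev, x, 3.0))      # 13.2: full protection
```

Raising Γ from 2 to 3 makes this x infeasible for the right-hand side 13: the extra protection that must be paid in objective value is exactly the "price of robustness" the abstract refers to.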

Measuring economic and cost efficiency receives ever-increasing attention from the executives and managers of small- and medium-sized enterprises (SMEs) seeking to minimise total production costs. The conventional Farrell cost efficiency (CE), as a key determinant, requires precise information on inputs, outputs and input prices, while in practice uncertainty is inherent and inevitable in data, and neglecting it conceivably results in a poor approximation of CE measures. This paper is concerned with Farrell CE in situations of both endogenous and exogenous uncertainty. The source of uncertainty allows us to define two different scenarios: (i) situations of endogenous uncertainty in input and output data, where the uncertainty is affected by the decision maker, and (ii) situations of uncertain input prices, where the uncertainty is exogenously given. In the first scenario, the theory of robust optimisation is adopted to develop robust data envelopment analysis (DEA) models with the aim of handling uncertainties in input and output data when measuring technical and cost efficiencies. The second scenario accommodates uncertainty in price information by developing a pair of robust DEA models, based upon robust optimisation, that estimate the upper and lower bounds of CE measures. This study provides a generalised framework for economic efficiency under uncertainty in which the conventional properties of Farrell measures are fulfilled. In addition to comparing the developed approach with other existing approaches through a simple numerical example, the usefulness and applicability of the suggested framework are studied in detail in an empirical application in the context of allocation problems.
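For reference, Farrell cost efficiency in the deterministic setting is computed from a cost-minimising LP over the DEA technology; the robust variants above replace the data in these constraints with uncertainty sets. A standard formulation (notation assumed, not taken from the paper):

```latex
\min_{x,\lambda}\; c_o^{\top} x
\quad\text{s.t.}\quad
\sum_{j=1}^{n} \lambda_j x_j \le x,\qquad
\sum_{j=1}^{n} \lambda_j y_j \ge y_o,\qquad
\lambda \ge 0,
```

with $CE_o = c_o^{\top}x^{*}/c_o^{\top}x_o \in (0,1]$, which Farrell decomposes as $CE_o = TE_o \times AE_o$ (technical times allocative efficiency). Endogenous uncertainty perturbs the $x_j, y_j$ in the constraints, while exogenous uncertainty perturbs the price vector $c_o$ in the objective.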

Flexibility in selecting the weights of inputs and outputs in data envelopment analysis models and uncertainty associated with the data might lead to unreliable efficiency scores. In this paper, to avoid these problems, we first discuss a robust Charnes, Cooper, Rhodes (CCR) model under the Bertsimas and Sim approach. Then, the robust CCR solutions are used to find a robust common set of weights under norm-1 and the Bertsimas and Sim approach. Finally, on two real-world numerical examples, the performance of the proposed approach is compared with a similar recent approach from the literature to show the advantages of the new method and its applicability.

Transportation plays an important role in the development of countries and in linking different sectors of the economy. Efficiency evaluation is also an essential part of controlling an organization and assessing its performance. Therefore, evaluating the efficiency of the transportation industry, especially air transport, as the fastest and most expensive mode, has particular importance in today's competitive world. Attention to this issue can have a significant impact on the effectiveness of transport policies. The purpose of this study is to evaluate the efficiency of airlines using a new robust data envelopment analysis (DEA) model. In this manuscript, the slack-based measure (SBM) model is extended by adding undesirable outputs and uncertainty to reflect real-world situations. The proposed model is then applied to a real case study, and the efficiency of 14 Iranian airlines is calculated. The results reveal that under the non-robust model, Mahan, Taban, Pouya Air and Caspian are the efficient airlines, whereas under the new robust model all efficiency values are reduced and every efficiency score is less than one.

The Russell measure is among the non-radial measures for the efficiency evaluation of decision making units in data envelopment analysis. Due to the nonlinearity of its objective function, an enhanced version has been proposed that can be linearized using the well-known Charnes–Cooper change of variables. In this article, we give equivalent formulations of the robust Russell measure and its enhanced models under interval and ellipsoidal uncertainty in their best and worst cases. We show that the resulting formulations remain convex for both the best and worst cases under interval uncertainty, as well as for the worst case under ellipsoidal uncertainty. In other words, these formulations are nonconvex only for the best case under ellipsoidal uncertainty. Some illustrative examples are provided to validate the new models.
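For context, the enhanced Russell (graph) measure is usually written as the following fractional program; this is the standard formulation from the literature (notation assumed), whose nonlinear objective is what the Charnes–Cooper transformation linearizes:

```latex
\min\; \frac{\tfrac{1}{m}\sum_{i=1}^{m}\theta_i}{\tfrac{1}{s}\sum_{r=1}^{s}\varphi_r}
\quad\text{s.t.}\quad
\sum_{j=1}^{n}\lambda_j x_{ij} \le \theta_i x_{io},\;\;
\sum_{j=1}^{n}\lambda_j y_{rj} \ge \varphi_r y_{ro},\;\;
\theta_i \le 1,\; \varphi_r \ge 1,\; \lambda_j \ge 0 .
```

Scaling all variables by $t = \big(\tfrac{1}{s}\sum_r \varphi_r\big)^{-1}$ turns the objective into the linear form $\tfrac{1}{m}\sum_i t\,\theta_i$ with the normalisation constraint $\tfrac{1}{s}\sum_r t\,\varphi_r = 1$, which is the linearization the robust formulations are built on.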

Robust optimization has become the state-of-the-art approach for solving linear optimization problems with uncertain data. Though relatively young, the robust approach has proven to be essential in many real-world applications. Under this approach, robust counterparts to prescribed uncertainty sets are constructed for general solutions to corresponding uncertain linear programming problems. It is remarkable that in most practical problems, the variables represent physical quantities and must be nonnegative. In this paper, we propose alternative robust counterparts with nonnegative decision variables – a reduced robust approach which attempts to minimize model complexity. The new framework is extended to robust Data Envelopment Analysis (DEA) with the aim of reducing the computational burden. In the DEA methodology, we first deal with the equality in the normalization constraint and then propose a robust DEA based on the reduced robust counterpart. The proposed model is examined with numerical data from 250 European banks operating across the globe. The results indicate that the proposed approach (i) reduces almost 50% of the computational burden required to solve DEA problems with nonnegative decision variables; (ii) retains only essential (non-redundant) constraints and decision variables without altering the optimal value.

Performance measurement is one of the most essential real-life decision-making problems, involving several criteria and different types of uncertainty and vagueness. Human expertise is necessary in such situations, although it is not sufficient to handle such complicated cases. Expert systems which can sense, analyze and incorporate all details of real-life decision-making problems are required. In this paper, an expert system is proposed in the form of an uncertain data envelopment analysis (DEA) approach. More formally, mixed fuzzy-robust uncertainty is addressed in the proposed DEA. Two scenario-based robust DEA models under constant returns to scale (CRS) and variable returns to scale (VRS) conditions are proposed. Since the observed values of the inputs and outputs of the decision making units (DMUs) in each scenario may be ambiguous or vague, fuzzy DEA models corresponding to each of the robust scenarios are developed. Thus, this paper proposes four fuzzy-robust DEA models which simultaneously retain the advantages of the fuzzy and robust approaches and, at the same time, calculate the upper and lower bounds of the efficiency scores of the DMUs under CRS and VRS conditions. Finally, to evaluate the validity and applicability of the proposed models, two numerical examples and a real case study in small and medium-sized enterprises (SMEs) are presented and discussed.

An alternative approach to super-efficiency slack-based measure data envelopment analysis has been proposed to make the projection of an efficient decision making unit strongly Pareto-efficient. The approach, however, did not examine the simultaneous effect of non-discretionary factors and integer requirements on efficiency measures. To obtain more accurate efficiency measures, this paper proposes a two-stage super-efficiency slack-based measure approach with non-discretionary factors and mixed-integer requirements. Both new factors and requirements were first integrated into the existing approach. The optimal solution of the proposed approach was then obtained by transforming its fractional form into a linear form. Thus, the scalar measures of the proposed approach can now deal directly with discretionary mixed-integer input savings and output surpluses as well as discretionary mixed-integer input excesses and output shortfalls. The practicability of the proposed approach was tested using empirical data from Malaysian community colleges.

Efficiency analyses are crucial to managerial competency for evaluating the degree to which resources are consumed in the production process of gaining desired services or products. Among the vast available literature on performance analysis, Data Envelopment Analysis (DEA) has become a popular and practical approach for assessing the relative efficiency of Decision-Making Units (DMUs) which employ multiple inputs to produce multiple outputs. However, in addition to inputs and outputs, some situations might include certain factors that simultaneously play the role of both inputs and outputs. In contrast to conventional DEA models, which assume precise values for inputs, outputs, and dual-role factors, we develop a methodology for quantitatively handling imprecision and uncertainty where the degree of imprecision is too large to be ignored in efficiency analysis. In this regard, we first construct a pair of interval DEA models based on the pessimistic and optimistic standpoints to measure the interval efficiencies where some or all observed inputs, outputs, and dual-role factors are assumed to be characterized by interval measures. The optimal multipliers associated with the dual-role factors are then used to determine whether a factor is designated as an output, an input, or is in equilibrium, even though the status of the dual-role factors may not be unique under the pessimistic and optimistic standpoints. To deal with this problem, we present a new model which integrates both the pessimistic and optimistic models. The integrated model enables us to identify a unique status for each imprecise dual-role factor as well as to develop a structure for calculating an optimal reallocation of each dual-role factor among the DMUs. As another method to investigate the role of dual-role factors, we introduce a fuzzy decision-making model which evaluates all DMUs simultaneously.
We finally present an application to a data set of 20 banks to showcase the applicability and efficacy of the proposed procedures and algorithm.

This article examines the potential benefits of solving a stochastic DEA model over solving a deterministic DEA model. It demonstrates that wrong decisions could be made whenever a potentially stochastic DEA problem is solved with stochastic information that is either unobserved or limited to a measure of central tendency. We propose two linear models: a semi-stochastic model, where the inputs of the DMU of interest are treated as random while the inputs of the other DMUs are frozen at their expected values, and a stochastic model, where the inputs of all of the DMUs are treated as random. These two models can be used with any empirical distribution in a Monte Carlo sampling approach. We also define the value of the stochastic efficiency (or semi-stochastic efficiency) and the expected value of the efficiency.
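The semi-stochastic model can be sketched with Monte-Carlo sampling in the one-input/one-output case, where CCR efficiency is just a productivity ratio. Everything here (the data, the uniform noise, the function names) is an illustrative assumption, not the paper's setup:

```python
import random

def ccr_eff_1d(x, y, o):
    """CCR efficiency with one input and one output: the DMU's
    productivity ratio divided by the best observed ratio."""
    ratios = [yj / xj for xj, yj in zip(x, y)]
    return ratios[o] / max(ratios)

def semi_stochastic_eff(x_mean, y, o, spread=0.2, runs=5000, seed=1):
    """Monte-Carlo estimate of the expected efficiency when only DMU o's
    input is random (uniform around its mean) while the other DMUs are
    frozen at their expected values."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        x = list(x_mean)
        x[o] = x_mean[o] * rng.uniform(1 - spread, 1 + spread)
        total += ccr_eff_1d(x, y, o)
    return total / runs

x_mean, y = [2.0, 4.0, 5.0], [10.0, 18.0, 20.0]
print(ccr_eff_1d(x_mean, y, 0))           # 1.0 at the mean data
print(semi_stochastic_eff(x_mean, y, 0))  # strictly below 1.0
```

The gap between the two printed numbers is exactly the kind of wrong decision the abstract warns about: the deterministic model certifies DMU 0 as efficient, while its expected efficiency under input noise is lower.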

Applications of traditional data envelopment analysis (DEA) models require knowledge of crisp input and output data. However, real-world problems often deal with imprecise or ambiguous data. In this paper, the problem of uncertainty in the equality constraints is analyzed, and using the equivalent form of the CCR model, a suitable robust DEA model is derived to analyze the efficiency of decision-making units (DMUs) under uncertainty in both the input and output spaces. A new model based on the robust optimization approach is suggested. Using the proposed model, it is possible to evaluate the efficiency of DMUs in the presence of uncertainty in fewer steps than other models require. In addition, using the proposed robust DEA model and the envelopment form of the CCR model, two linear robust super-efficiency models for the complete ranking of DMUs are proposed. Two case studies from different contexts are taken as numerical examples to compare the proposed model with other approaches. The examples also illustrate various possible applications of the new models.

In this paper we propose robust efficiency scores for the scenario in which the specification of the inputs/outputs to be included in the DEA model is modelled with a probability distribution. This probabilistic approach allows us to obtain three different robust efficiency scores: the Conditional Expected Score, the Unconditional Expected Score and the Expected Score under the Maximum Entropy principle. The calculation of the three efficiency scores involves the resolution of an exponential number of linear problems. The algorithm presented in this paper solves over 200 million linear problems in an affordable time when considering up to 20 inputs/outputs and 200 DMUs. The proposed approach is illustrated with an application to the assessment of professional tennis players.

This paper provides a review of stochastic Data Envelopment Analysis (DEA). We discuss extensions of deterministic DEA in three directions: (i) deviations from the deterministic frontier are modeled as stochastic variables, (ii) random noise in terms of measurement errors, sample noise, and specification errors is made an integral part of the model, and (iii) the frontier is stochastic, as is the underlying Production Possibility Set (PPS). Stochastic DEA utilizes non-parametric convex or conical hull reference technologies based upon axioms from production theory, accompanied by a statistical foundation in terms of axioms from statistics or distributional assumptions. The approaches allow for an estimation of stochastic inefficiency compared to a deterministic or a stochastic PPS and for statistical inference while maintaining an axiomatic foundation. Focus is on bridges and differences between approaches within the field of stochastic DEA, including semi-parametric Stochastic Frontier Analysis (SFA) and Chance Constrained DEA (CCDEA). We argue that statistical inference based upon homogeneous bootstrapping, in contrast to a management science approach, imposes a restrictive structure on inefficiency which may not facilitate the communication of the results of the analysis to decision makers. Semi-parametric SFA and CCDEA differ with respect to the modeling of noise and stochastic inefficiency. The two approaches are, in spite of their inherent differences, shown to be complements in the sense that the stochastic PPSs obtained by the two approaches share basic similarities in the case of one output and multiple inputs. Recent contributions related to (i) disentangling random noise and random inefficiency and (ii) obtaining smooth shape-constrained estimators of the frontier are discussed.

Within Data Envelopment Analysis, several alternative models allow for an environmental adjustment. The majority of them deliver divergent results. Decision makers face the difficult task of selecting the most suitable model. This study is performed to overcome this difficulty. By doing so, it fills a research gap. First, a two-step web-based survey is conducted. It aims (1) to identify the selection criteria, (2) to prioritize and weight the selection criteria with respect to the goal of selecting the most suitable model and (3) to collect the preferences about which model is preferable to fulfil each selection criterion. Second, Analytic Hierarchy Process is used to quantify the preferences expressed in the survey. Results show that the understandability, the applicability and the acceptability of the alternative models are valid selection criteria. The selection of the most suitable model depends on the preferences of the decision makers with regards to these criteria.

Recent advances in state-of-the-art meta-heuristics feature the incorporation of probabilistic operators aiming to diversify search directions or to escape from local optima. This feature results in non-deterministic outputs: solutions vary from one run of a meta-heuristic to another. Consequently, both the average and the variation of outputs over multiple runs have to be considered when evaluating the performance of different configurations of a meta-heuristic or of distinct meta-heuristics. To this end, this work considers each algorithm as a decision-making unit (DMU) and develops robust data envelopment analysis (DEA) models taking into account not only the average but also the standard deviation of an algorithm's output for evaluating the relative efficiencies of a set of algorithms. The robust DEA models describe uncertain output using an uncertainty set and aim to maximize a DMU's worst-case relative efficiency with respect to that uncertainty set. The proposed models are employed to evaluate a set of distinct configurations of a genetic algorithm and a set of parameter settings of a simulated annealing heuristic. Evaluation results demonstrate that the robust DEA models are able to identify efficient algorithmic configurations. The proposed models contribute not only to the evaluation of meta-heuristics but also to the DEA methodology.

Determining the appropriate supplier is a crucial strategic consideration in supply chain management. The worst-practice frontier data envelopment analysis (WPF-DEA) model is one of the newer models in data envelopment analysis (DEA). In this paper, the imprecise data envelopment analysis (IDEA) approach and the dual-role factor concept are used to develop a WPF-DEA model. The proposed model is then applied to the supplier selection problem. A numerical example demonstrates the application of the proposed model.

Data envelopment analysis (DEA), considering the best condition for each decision making unit (DMU), assesses relative efficiency and partitions DMUs into two sets: efficient and inefficient. In practice, traditional DEA models recognize more than one DMU as efficient and cannot rank these efficient DMUs. Some studies have aimed at ranking efficient DMUs, although in some cases only identifying the most efficient unit is desirable. Furthermore, several investigations have been devoted to finding the most CCR-efficient DMU. The basic idea of most of them is to introduce an integrated model that achieves an optimal common set of weights (CSW). These weights help identify the most efficient unit under identical conditions.
Recently, Toloo (2012) [13] proposed a new mixed integer programming (MIP) model to find the most BCC-efficient unit. Building on this study, we propose a new basic integrated linear programming (LP) model to identify candidate DMUs for the most efficient unit; next, a new MIP integrated DEA model is introduced for determining the most efficient DMU. Moreover, these models exclude the non-Archimedean epsilon, and consequently their optimal solutions can be obtained straightforwardly. We claim that the most efficient unit obtained from any other integrated model must be one of the candidates produced by the basic integrated LP model. Two numerical examples illustrate the use of these models in different important cases.

One of the primary issues in data envelopment analysis (DEA) models is the reduction of weight flexibility. There are several studies on determining common weights in DEA, but none of them considers uncertainty in the data. This paper introduces a robust optimization approach to find common weights in DEA with uncertain data. Uncertainty is considered in both inputs and outputs, and a suitable robust counterpart of the DEA model is developed. The proposed robust DEA model is solved and the ideal solution is found for each decision making unit (DMU). Then, common weights for all DMUs are found by means of the goal programming technique. To illustrate the performance of the proposed model, a numerical example is solved. The proposed model is also implemented using actual data from provincial gas companies in Iran.

This paper provides an overview of developments in robust optimization since 2007. It seeks to give a representative picture of the research topics most explored in recent years, highlight common themes in the investigations of independent research teams and highlight the contributions of rising as well as established researchers both to the theory of robust optimization and its practice. With respect to the theory of robust optimization, this paper reviews recent results on the cases without and with recourse, i.e., the static and dynamic settings, as well as the connection with stochastic optimization and risk theory, the concept of distributionally robust optimization, and findings in robust nonlinear optimization. With respect to the practice of robust optimization, we consider a broad spectrum of applications, in particular inventory and logistics, finance, revenue management, but also queueing networks, machine learning, energy systems and the public good. Key developments in the period from 2007 to present include: (i) an extensive body of work on robust decision-making under uncertainty with uncertain distributions, i.e., “robustifying” stochastic optimization, (ii) a greater connection with decision sciences by linking uncertainty sets to risk theory, (iii) further results on nonlinear optimization and sequential decision-making and (iv) besides more work on established families of examples such as robust inventory and revenue management, the addition to the robust optimization literature of new application areas, especially energy systems and the public good.

We evaluate, by means of mathematical programming formulations, the relative technical and scale efficiencies of decision making units (DMUs) when some of the inputs or outputs are exogenously fixed and beyond the discretionary control of DMU managers. This approach further develops the work on efficiency evaluation and on estimation of efficient production frontiers known as data envelopment analysis (DEA). We also employ the model to provide efficient input and output targets for DMU managers in a way that specifically accounts for the fixed nature of some of the inputs or outputs. We illustrate the approach, using real data, for a network of fast food restaurants.
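
A minimal sketch of this idea, assuming a Banker–Morey style input-oriented VRS envelopment model in which the radial contraction theta applies only to the discretionary inputs; the non-discretionary inputs enter the constraints at their observed levels. The data and function name are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def bm_efficiency(XD, XN, Y, o):
    """Input-oriented VRS model with non-discretionary inputs.

    XD: (n, mD) discretionary inputs (scaled by theta),
    XN: (n, mN) non-discretionary inputs (not scaled),
    Y:  (n, s) outputs.  Variables: [theta, lambda_1..lambda_n].
    """
    n = XD.shape[0]
    c = np.concatenate([[1.0], np.zeros(n)])           # minimise theta
    rows, rhs = [], []
    for i in range(XD.shape[1]):   # sum_j lam_j xd_ij - theta xd_io <= 0
        rows.append(np.concatenate([[-XD[o, i]], XD[:, i]])); rhs.append(0.0)
    for i in range(XN.shape[1]):   # sum_j lam_j xn_ij <= xn_io (no theta)
        rows.append(np.concatenate([[0.0], XN[:, i]])); rhs.append(XN[o, i])
    for r in range(Y.shape[1]):    # sum_j lam_j y_rj >= y_ro
        rows.append(np.concatenate([[0.0], -Y[:, r]])); rhs.append(-Y[o, r])
    A_eq = np.concatenate([[0.0], np.ones(n)])[None, :]  # VRS: sum lam = 1
    res = linprog(c, A_ub=np.array(rows), b_ub=rhs, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None), method="highs")
    return res.fun

XD = np.array([[2.0], [4.0], [3.0]])   # discretionary input
XN = np.array([[8.0], [5.0], [5.0]])   # exogenously fixed input
Y  = np.array([[2.0], [2.0], [3.0]])
thetas = [bm_efficiency(XD, XN, Y, o) for o in range(3)]
```

Note how the second DMU cannot be benchmarked against the first one here, despite its lower discretionary input, because the first DMU's fixed input (8) exceeds the evaluated level (5).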

Data envelopment analysis (DEA) is a non-parametric method for measuring the relative efficiency of a set of decision making units using multiple precise inputs to produce multiple precise outputs. Several extensions to DEA have been made for the case of imprecise data, as well as to improve the robustness of the assessment for these cases. Prevailing robust DEA (RDEA) models are based on mirrored interval DEA models, including two distinct production possibility sets (PPS). However, this approach renders the distance measures incommensurate and violates the standard assumptions for the interpretation of distance measures as efficiency scores. We propose a modified RDEA (MRDEA) model with a unified PPS to overcome the present problem in RDEA. Based on a flexible formulation for the number of variables perturbed, MRDEA calculates the empirical distribution for the interval efficiency for the case of a random number of variables affected. The MRDEA approach also decreases the computational complexity of the RDEA model, as well as significantly increases the discriminatory power of the model without additional information requirements. The properties of the method are demonstrated for four different numerical instances.

Since many decision making units (DMUs) are classified as efficient when evaluated by traditional data envelopment analysis (DEA) models, a large number of methods for fully ranking both efficient and inefficient DMUs have been proposed. In this paper a ranking method is suggested which differs fundamentally from previous methods, although its models are similar to traditional DEA models such as BCC and the additive model. In this ranking method, DMUs are compared against a full-inefficient frontier, which will be defined in this paper. Based on this point of view many models can be designed; we present a radial model and a slacks-based one. This method can be used to rank all DMUs to obtain analytic information about the system, and also to rank only the efficient DMUs in order to discriminate between them.

This note formulates a convex mathematical programming problem in which the usual definition of the feasible region is replaced by a significantly different strategy. Instead of specifying the feasible region by a set of convex inequalities, fi(x) ≦ bi, i = 1, 2, …, m, the feasible region is defined via set containment. Here n convex activity sets {Kj, j = 1, 2, …, n} and a convex resource set K are specified and the feasible region is given by X = {x ∈ Rⁿ | x₁K₁ + x₂K₂ + ⋯ + xₙKₙ ⊆ K, xⱼ ≥ 0}, where the binary operation + refers to addition of sets. The problem is then to find x̄ ∈ X that maximizes the linear function c · x. When the res...

Abstract
Data envelopment analysis (DEA) is a nonparametric model which evaluates the relative efficiencies of decision-making units (DMUs). These DMUs produce multiple outputs by using multiple inputs and the relative efficiency is evaluated using a ratio of total weighted output to total weighted input. In this paper an alternative interpretation of efficiency is first given. The interpretation is based on the fuzzy concept even though the inputs and outputs data are crisp numbers. With the interpretation, a new model for ranking DMUs in DEA is proposed and a new perspective of viewing other DEA models is now made possible. The model is then extended to incorporate situations whereby some inputs or outputs, in a fuzzy sense, are almost discretionary variables.

The objective of the present paper is to propose a novel pair of data envelopment analysis (DEA) models for measuring the relative efficiencies of decision-making units (DMUs) in the presence of non-discretionary factors and imprecise data. Compared to traditional DEA, the proposed interval DEA approach measures the efficiency of each DMU relative to the inefficiency frontier, also called the input frontier, and the resulting score is called the worst relative efficiency or pessimistic efficiency. In traditional DEA, by contrast, the efficiency of each DMU is measured relative to the efficiency frontier and is called the best relative efficiency or optimistic efficiency. The pair of proposed interval DEA models takes crisp, ordinal, and interval data, as well as non-discretionary factors, into account simultaneously when measuring the relative efficiencies of DMUs. Two numerical examples are provided to illustrate the applicability of the interval DEA models.

Data Envelopment Analysis (DEA) is a nonparametric approach to evaluating the relative efficiency of decision making units (DMUs) that use multiple inputs to produce multiple outputs. An assumption underlying DEA is that all the data assume the form of specific numerical values. In some applications, however, the data may be imprecise. For instance, some of the data may be known only within specified bounds, while other data may be known only in terms of ordinal relations. DEA with imprecise data or, more compactly, the Imprecise Data Envelopment Analysis (IDEA) method developed in this paper permits mixtures of imprecisely- and exactly-known data, which the IDEA models transform into ordinary linear programming forms. This is carried even further in the present paper to comprehend the now extensively employed Assurance Region (AR) concepts in which bounds are placed on the variables rather than the data. We refer to this approach as AR-IDEA, because it replaces conditions on the variables with transformations of the data and thus also aligns the developments we describe in this paper with what are known as cone-ratio envelopments in DEA. As a result, one unified approach, referred to as the AR-IDEA model, is achieved which includes not only imprecise data capabilities but also assurance region and cone-ratio envelopment concepts.

The purpose of this paper is to introduce the concept of worst practice DEA, which aims at identifying worst performers by placing them on the frontier. This is particularly relevant for our application to credit risk evaluation, but this also has general relevance since the worst performers are where the largest improvement potential can be found. The paper also proposes to use a layering technique instead of the traditional cut-off point approach, since this enables incorporation of risk attitudes and risk-based pricing. Finally, it is shown how the use of a combination of normal and worst practice DEA models enable detection of self-identifiers. The results of the empirical application on credit risk evaluation validate the method. The best combination of layered normal and worst practice DEA models yields an impressive 100% bankruptcy and 78% non-bankruptcy prediction accuracy in the calibration data set, and equally convincing 100% and 67% out-of-sample classification accuracies.
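
A worst-practice model in this spirit can be sketched with a pessimistic multiplier formulation: every DMU is kept at or above the inefficiency frontier (u'y_j ≥ v'x_j), and the weighted output of the evaluated DMU is minimised, so a score of 1 flags a worst performer. This is our own simplified rendering of the worst-practice idea, not the paper's exact credit-risk model; the data are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def pessimistic_efficiency(X, Y, o):
    """Pessimistic (worst-practice) multiplier model for DMU `o`.

    Minimise u'y_o subject to v'x_o = 1 and u'y_j - v'x_j >= 0 for
    every DMU j.  A score of 1 places DMU `o` on the worst-practice
    frontier; larger scores mean it dominates the worst performers.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([Y[o], np.zeros(m)])            # minimise u'y_o
    A_ub = np.hstack([-Y, X])                          # v'x_j - u'y_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None), method="highs")
    return res.fun

X = np.array([[2.0], [4.0], [3.0]])
Y = np.array([[2.0], [2.0], [3.0]])
phi = [pessimistic_efficiency(X, Y, o) for o in range(3)]
```

Here the second DMU, which produces the least output per unit of input, scores 1 and lands on the worst-practice frontier, while the other two score above 1.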

Evaluation of performance using DEA requires models consistent with the underlying technology. There have been a number of models proposed for analyzing performance in the presence of non-discretionary inputs. Banker and Morey (Operations Research 34 (1986) 513–521) provided the first DEA model to measure technical efficiency. Other single- and multiple-stage models that incorporate DEA have been developed. This paper discusses the various approaches and provides a simulation analysis to compare the relative performance of each.

This paper presents a Data Envelopment Analysis (DEA) model with uncertain data for performance assessment of electricity distribution companies. During the past two decades, DEA has been widely used for benchmarking electricity distribution companies. However, among the many existing DEA approaches there is none in which uncertainty in the data is allowed while, at the same time, the distribution of the random data is permitted to be unknown. The proposed method develops a new DEA model that accounts for uncertainty in the output parameters. The method is based on an adaptation of recently developed robust optimization approaches proposed by Ben-Tal and Nemirovski [2000. Robust solutions of linear programming problems contaminated with uncertain data. Mathematical Programming 88, 411–421] and Bertsimas et al. [2004. Robust linear optimization under general norms. Operations Research Letters 32, 510–516]. The results are compared with an existing parametric Stochastic Frontier Analysis (SFA) using data from 38 electricity distribution companies in Iran to show the effects of data uncertainty on DEA outputs. The results indicate that the robust DEA approach can be a relatively more reliable method for efficiency estimation and ranking strategies.
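
The budget-of-uncertainty idea behind such robust models can be illustrated on a single weighted-output term: if at most Γ outputs may drop to their lower bounds, the worst case subtracts the Γ largest protection terms. This is a simplified sketch of the protection function only, not the full robust LP; the function name and data are ours.

```python
def robust_weighted_output(u, y, yhat, gamma):
    """Worst-case weighted output u'y when at most `gamma` outputs may
    fall to their lower bound y_r - yhat_r (budget-of-uncertainty idea).
    """
    nominal = sum(ur * yr for ur, yr in zip(u, y))
    # protection terms: loss u_r * yhat_r if output r hits its lower bound
    protections = sorted((ur * dr for ur, dr in zip(u, yhat)), reverse=True)
    return nominal - sum(protections[:gamma])

# "price of robustness": the guaranteed score degrades as Gamma grows
u, y, yhat = [1.0, 1.0], [10.0, 20.0], [2.0, 5.0]
curve = [robust_weighted_output(u, y, yhat, g) for g in range(3)]
print(curve)   # -> [30.0, 25.0, 23.0]
```

Gamma = 0 recovers the nominal score, while Gamma equal to the number of uncertain outputs recovers the fully conservative worst case; intermediate budgets trade protection against conservatism.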

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. In this study, we provide a taxonomy and review of the fuzzy DEA methods. We present a classification scheme with four primary categories, namely, the tolerance approach, the α-level based approach, the fuzzy ranking approach and the possibility approach. We discuss each classification scheme and group the fuzzy DEA papers published in the literature over the past 20 years. To the best of our knowledge, this paper appears to be the only review and complete source of references on fuzzy DEA.

In original data envelopment analysis (DEA) models, inputs and outputs are measured by exact values on a ratio scale. Cooper et al. [Management Science, 45 (1999) 597–607] recently addressed the problem of imprecise data in DEA, in its general form. We develop in this paper an alternative approach for dealing with imprecise data in DEA. Our approach is to transform a non-linear DEA model to a linear programming equivalent, on the basis of the original data set, by applying transformations only on the variables. Upper and lower bounds for the efficiency scores of the units are then defined as natural outcomes of our formulations. It is our specific formulation that enables us to proceed further in discriminating among the efficient units by means of a post-DEA model and the endurance indices. We then proceed still further in formulating another post-DEA model for determining input thresholds that turn an inefficient unit to an efficient one.

Crisp input and output data are fundamentally indispensable in traditional data envelopment analysis (DEA). However, the input and output data in real-world problems are often imprecise or ambiguous. Some researchers have proposed interval DEA (IDEA) and fuzzy DEA (FDEA) to deal with imprecise and ambiguous data in DEA. Nevertheless, many real-life problems use linguistic data that cannot be used as interval data, and a large number of input variables in fuzzy logic could result in a significant number of rules that are needed to specify a dynamic model. In this paper, we propose an adaptation of the standard DEA under conditions of uncertainty. The proposed approach is based on a robust optimization model in which the input and output parameters are constrained to be within an uncertainty set with additional constraints based on the worst case solution with respect to the uncertainty set. Our robust DEA (RDEA) model seeks to maximize efficiency (similar to standard DEA) but under the assumption of a worst case efficiency defined by the uncertainty set and its supporting constraints. A Monte-Carlo simulation is used to compute the conformity of the rankings in the RDEA model. The contribution of this paper is fourfold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA; (2) we address the gap in the imprecise DEA literature for problems not suitable or difficult to model with interval or fuzzy representations; (3) we propose a robust optimization model in which the input and output parameters are constrained to be within an uncertainty set with additional constraints based on the worst case solution with respect to the uncertainty set; and (4) we use Monte-Carlo simulation to specify a range of Gamma in which the rankings of the DMUs occur with high probability.
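
The Monte-Carlo conformity idea can be sketched for the single-input, single-output case, where CCR efficiency reduces to the normalized output/input ratio and no LP is needed per draw: sample data uniformly from the intervals and count how often the ranking matches the midpoint-data ranking. The interval data, draw count, and function names below are illustrative.

```python
import random

def efficiencies(x, y):
    """Single-input, single-output CCR efficiency: output/input ratio
    normalised by the best observed ratio."""
    ratios = [yi / xi for xi, yi in zip(x, y)]
    best = max(ratios)
    return [r / best for r in ratios]

def ranking(scores):
    """DMU indices ordered from best to worst score."""
    return sorted(range(len(scores)), key=lambda i: -scores[i])

def conformity(x_lo, x_hi, y_lo, y_hi, draws=2000, seed=7):
    """Fraction of random draws whose ranking matches the ranking
    obtained from the interval midpoints."""
    rng = random.Random(seed)
    mid = lambda lo, hi: [(a + b) / 2 for a, b in zip(lo, hi)]
    nominal = ranking(efficiencies(mid(x_lo, x_hi), mid(y_lo, y_hi)))
    hits = 0
    for _ in range(draws):
        x = [rng.uniform(a, b) for a, b in zip(x_lo, x_hi)]
        y = [rng.uniform(a, b) for a, b in zip(y_lo, y_hi)]
        if ranking(efficiencies(x, y)) == nominal:
            hits += 1
    return hits / draws

# narrow, non-overlapping intervals: the ranking never flips
score = conformity([2, 4, 3], [2.1, 4.1, 3.1], [4, 2, 3], [4.1, 2.1, 3.1])
```

Widening the intervals until the efficiency ranges overlap pushes the conformity below 1, which is how such simulations quantify the stability of a ranking under data uncertainty.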

This paper studies how to conduct efficiency assessment using data envelopment analysis (DEA) in interval and/or fuzzy input–output environments. A new pair of interval DEA models is constructed on the basis of interval arithmetic, which differs from the existing DEA models handling interval data in that the former is a linear CCR model without the need of extra variable alternations and uses a fixed and unified production frontier (i.e. the same constraint set) to measure the efficiencies of decision-making units (DMUs) with interval input and output data, while the latter is usually a nonlinear optimization problem with the need of extra variable alternations or scale transformations and utilizes variable production frontiers (i.e. different constraint sets) to measure interval efficiencies. Ordinal preference information and fuzzy data are converted into interval data through the estimation of permissible intervals and α-level sets, respectively, and are incorporated into the interval DEA models. The proposed interval DEA models are developed for measuring the lower and upper bounds of the best relative efficiency of each DMU with interval input and output data, which are different from the interval formed by the worst and the best relative efficiencies of each DMU. A minimax regret-based approach (MRA) is introduced to compare and rank the efficiency intervals of DMUs. Two numerical examples are provided to show the applications of the proposed interval DEA models and the preference ranking approach.
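
A sketch of interval efficiency bounds in this unified-frontier spirit: the common constraint set is built from every DMU's best-case data (x^L, y^U), while the evaluated DMU's own data switch between best case (upper bound) and worst case (lower bound). The formulation, names, and data here are our illustrative rendering, not the paper's models verbatim.

```python
import numpy as np
from scipy.optimize import linprog

def interval_efficiency(XL, XU, YL, YU, o):
    """Lower and upper efficiency bounds for DMU `o` with interval data.

    The unified frontier uses u'y_j^U - v'x_j^L <= 0 for every DMU j;
    the objective and normalisation switch between the worst case
    (y_o^L, x_o^U) and the best case (y_o^U, x_o^L) of DMU `o`.
    """
    n, m = XL.shape
    s = YL.shape[1]
    A_ub = np.hstack([YU, -XL])            # common (unified) constraint set
    b_ub = np.zeros(n)

    def solve(y_obj, x_norm):
        c = np.concatenate([-y_obj, np.zeros(m)])        # maximise u'y
        A_eq = np.concatenate([np.zeros(s), x_norm])[None, :]  # v'x = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=(0, None), method="highs")
        return -res.fun

    return solve(YL[o], XU[o]), solve(YU[o], XL[o])      # (theta^L, theta^U)

XL = np.array([[2.0], [4.0], [3.0]]); XU = np.array([[2.5], [4.5], [3.5]])
YL = np.array([[1.8], [1.8], [2.8]]); YU = np.array([[2.0], [2.0], [3.0]])
bounds = [interval_efficiency(XL, XU, YL, YU, o) for o in range(3)]
```

Because every bound is measured against the same constraint set, the resulting intervals are directly comparable across DMUs and can be ranked, for instance, by a minimax-regret rule.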

The standard data envelopment analysis (DEA) method requires that the values for all inputs and outputs be known exactly. When some outputs and inputs are unknown decision variables such as bounded data, ordinal data, and ratio bounded data, the DEA model becomes a non-linear programming problem and is called imprecise DEA (IDEA). There are two different approaches in dealing with imprecise outputs and inputs. One uses scale transformations and variable alternations to convert the non-linear IDEA model into a linear program. The other converts imprecise data into exact data and then uses the standard linear DEA model. The current paper reviews and compares the two approaches through an efficiency analysis of a set of telephone offices. A simplified approach is developed to reduce the computational burden if one uses the first approach. The treatment of weight restrictions in IDEA is discussed. It is shown that weight restrictions on imprecise data are redundant. New developments and improvements to both approaches are provided.

This paper proposes a hybrid approach that predicts the failure of firms based on past business data, combining the rough set approach and worst-practice data envelopment analysis (DEA). Worst-practice DEA can identify worst performers (in quantitative financial data) by placing them on the frontier, while the rules developed by the rough set approach use non-financial information to predict the characteristics of failed firms. Both DEA and rough sets are commonly used in practice, and both have limitations. The hybrid Rough DEA model takes the best of both, avoiding the pitfalls of each. For the experiment, financial data of 396 Taiwan firms during the period 2002-2003 were selected. The results show that the hybrid approach is a promising alternative to conventional methods for failure prediction.

This paper presents a framework in which data envelopment analysis (DEA) is used to measure overall profit efficiency with interval data. Specifically, it is shown that when the input, output and price vectors each vary within intervals, the DMUs cannot be easily evaluated. A new method for computing the efficiency of DMUs with interval data is therefore presented, in which an interval is defined for the efficiency score of each unit. In addition, all the DMUs are divided into three groups defined according to the interval obtained for the efficiency value of each DMU.

The original data envelopment analysis (DEA) model evaluates each decision-making unit (DMU) with a set of most favorable weights on performance indices. The efficient DMUs obtained from the original DEA construct an efficient (best-practice) frontier. The original DEA can thus be seen as identifying good (efficient) performers under the most favorable scenario. To identify bad performers, such as bankrupt firms, under the most unfavorable (worst-case) scenario, a radial worst-practice frontier DEA (WPF–DEA) model, in which the "worst efficient" DMUs construct a worst-practice frontier, has been proposed. To identify bad performers together with their slack values, we formulate another model called WPF–SBM. We then develop the HypoSBM model to distinguish the worst performers from the bad ones. Finally, a solution approach is suggested to fully rank worst efficiencies in the worst-case scenario.

This paper discusses the interpretation of exogenously fixed, non-discretionary factors in data envelopment analysis (DEA) and suggests a generalised model for incorporating different types of inputs and outputs in DEA. The paper first compares two approaches to the inclusion of continuous non-discretionary factors in a one-stage model, one introduced by Banker and Morey [Operat. Res. 34 (4) (1986) 513–521] and the other by Lovell [TOP 2 (1994) 175–248] and Ruggiero [Eur. J. Operat. Res. 90 (1996) 553–565]. The approaches are compared theoretically, on the basis of axiomatic systems, and empirically, on the basis of examples and simulated data. The key difference is that some models require non-discretionary factors to be scale independent, i.e. indices, and some require them to be scale dependent, i.e. volume measures. Also, the returns to scale properties vary depending on the manner in which non-discretionary factors are treated. In practice, we may want to utilise the properties of the different models simultaneously. A generalised DEA model allowing for this possibility is presented.

One of the primary concerns in target setting for electricity distribution companies is uncertainty in the input/output data. In this paper, an interactive robust data envelopment analysis (IRDEA) model is proposed to determine the input and output target values of electricity distribution companies while accounting for perturbations in the data. Target setting is implemented with the uncertain data, and the decision maker (DM) can search the envelopment frontier and find targets based on his preferences. In order to search the envelopment frontier, the paper combines DEA with a multi-objective linear programming method, STEM. The proposed method is capable of handling uncertainty in the data and finding target values according to the DM's preferences. To illustrate the ability of the proposed model, a numerical example is solved. The input and output target values for some of the electricity distribution companies in Iran are also reported. The results indicate that the IRDEA model is suitable for target setting based on the DM's preferences and with consideration of uncertain data.

In recent years, determining an appropriate supplier has become a crucial strategic consideration in a competitive market, with data envelopment analysis (DEA) methods increasingly important in this respect. DEA traditionally requires that the values of all inputs and outputs be known exactly. However, this assumption may not hold, because data in many real applications cannot be measured precisely. A successful approach to addressing uncertainty in data is to replace deterministic data with random variables, leading to chance-constrained DEA. In this article, the concept of chance-constrained programming is used to develop a worst-practice-frontier Charnes–Cooper–Rhodes model along with its deterministic equivalent. Furthermore, it is shown that the latter can be formulated as a quadratic program. Finally, a numerical example demonstrates the application of the proposed model.