Article

Introductory Management Science

... We provide two examples of 0-1 IP problems in Appendix B. Example 1 is the classical knapsack problem that can be found in various Operations Research textbooks (Gould et al., 1993; Hillier and Lieberman, 2021; Winston and Albright, 2019). Example 2 is a Set-Covering problem and similar problems can be found in several textbooks (Hillier and Lieberman, 2021; Winston, 2004; Taylor, 2018). ...
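The 0-1 knapsack model mentioned in the snippet above can be illustrated with a minimal brute-force sketch in Python. The item values, weights, and capacity below are hypothetical and are not the textbook's Appendix B data; the point is only to show the binary-selection structure of the integer program.

from itertools import product

# Hypothetical data: values v_i, weights w_i, and a capacity b.
values = [10, 13, 18, 31, 7, 15]
weights = [2, 3, 4, 5, 1, 4]
capacity = 10

best_value, best_x = 0, None
# Enumerate every binary vector x in {0,1}^n: maximize sum(v_i * x_i)
# subject to sum(w_i * x_i) <= capacity.
for x in product([0, 1], repeat=len(values)):
    weight = sum(w * xi for w, xi in zip(weights, x))
    value = sum(v * xi for v, xi in zip(values, x))
    if weight <= capacity and value > best_value:
        best_value, best_x = value, x

print("optimal selection:", best_x, "value:", best_value)

Exhaustive enumeration is only viable for toy instances; textbook treatments solve the same model with branch-and-bound or an IP solver.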
Article
Full-text available
The COVID-19 epidemic has severely harmed the worldwide higher education sector, resulting in an enormous health and socioeconomic tragedy that will be remembered for a long time. The COVID-19 pandemic has underlined the need for improved international and global perspectives to assess its numerous ramifications in the short, medium, and long term. Several higher education organizations and associations, including student groups and other higher education bodies, conducted surveys with a specific focus on a topic or problem that emerged. This article investigates the pandemic's initial consequences for education and research activities. We examine how the coronavirus has affected academic research in higher education, using a geographically distributed respondent survey of Albanian private and public higher education institutions.
... Many introductory textbooks for management science (Eppen et al. 1993, Hillier and Hillier 2003, Anderson et al. 2019, Taylor 2019), operations research (Winston 1994), and quantitative methods (Balakrishnan et al. 2007, Render et al. 2009) provide the same traditional linear programming (LP) formulation for minimizing total crashing cost subject to a completion time constraint under the assumption that the time-cost tradeoff is a continuous linear function. This formulation has two sets of decision variables. ...
Article
Problems associated with time–cost trade-offs in project networks, which are commonly referred to as crashing problems, date back nearly 60 years. Many prominent management science textbooks provide a traditional linear programming (LP) formulation for a classic project crashing problem, in which the time–cost trade-off for each activity is continuous (and linear) over a range of possible completion times. We have found that, for students who are being introduced to time–cost trade-offs and the principles of project crashing, an alternative LP formulation facilitates a greater conceptual understanding. Moreover, the alternative formulation uses only half of the decision variables in the traditional formulation and has fewer constraints for many problems encountered in management science textbooks. Results from an MBA section of operations management suggest that students prefer the alternative formulation. Additionally, we have developed an Excel workbook that generates all possible paths for a network, allows students to manually evaluate crashing decisions, and generates the alternative LP formulation. We demonstrate the workbook using a small synthetic example and a larger, real-world network from the literature. We also show that the alternative formulation can be adapted easily to accommodate discrete project crashing problems for which the time–cost trade-offs for activities are not necessarily linear.
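As a point of reference for the traditional formulation the abstract contrasts with its alternative, a generic activity-on-arc crashing LP can be sketched with event times t_i and crash amounts y_ij. The notation below is assumed for illustration and is not taken from the article:

\begin{align*}
\min \quad & \sum_{(i,j)} c_{ij}\, y_{ij} \\
\text{s.t.} \quad & t_j \ge t_i + d_{ij} - y_{ij} && \text{for every activity } (i,j), \\
& 0 \le y_{ij} \le u_{ij} && \text{(maximum crash for activity } (i,j)\text{)}, \\
& t_{\text{end}} - t_{\text{start}} \le T, \qquad t_i \ge 0,
\end{align*}

where d_ij is the normal duration, c_ij the crash cost per unit of time saved, u_ij the maximum possible reduction, and T the required completion time. The event times t_i and crash amounts y_ij are the two sets of decision variables referred to above; the alternative formulation described in the abstract reportedly works with roughly half as many variables.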
... For these reasons, the concept of digital competencies emerges in response to these transformations and will be addressed in the next section. Models, according to Anderson et al. (1991), represent objects and situations, and for Eppen et al. (1987) they constitute a selective abstraction of reality. ...
Thesis
Full-text available
As students face rapid changes in Distance Education (EAD) mediated by Digital Technologies (TD), an increasingly broad set of competencies is demanded of them, one that supports the needs of learning exclusively through online resources. This thesis therefore proposes the construction and validation of a Model of Digital Competencies in Distance Education (EAD), called MCompDigEAD, focused on the student in this modality. The objective is to enable the improvement of these competencies and their constituent elements of knowledge, skills, and attitudes (CHA), helping students evaluate and identify their needs in digital media. A model is understood as a way of establishing a simplified relation, by analogy, with reality, constituting a figurative system (Behar et al., 2009). This is exploratory-descriptive research with a qualitative approach carried out in two stages. In the first stage, the digital competencies to be integrated into MCompDigEAD were mapped and validated. This mapping was developed through the analysis and comparison of digital competencies from the theoretical framework and competencies mapped through case studies with EAD students. In the second stage, MCompDigEAD was constructed and validated through the following procedures outlined over the course of this research: 1) Conception; 2) Planning; 3) Modeling; and 4) Validation. The result is a Model of Digital Competencies in EAD composed of three digital competencies, Digital Alphabetization (Alfabetização Digital), Digital Literacy (Letramento Digital), and Digital Fluency (Fluência Digital), and fourteen specific competencies detailed through knowledge, skills, and attitudes (CHA), totaling 328 elements. Each specific competency has three proficiency levels, 1) Initial, 2) Intermediate, and 3) Advanced, with example use cases. MCompDigEAD is intended to guide and support learning processes in EAD by integrating Digital Technologies from a competency perspective. It may serve as a reference and be adapted to the profile of students and institutions, thus constituting an instrument that supports the identification of digital competencies in distance education contexts.
... It also accounts for uncertainty in important parameters but in a more rudimentary way (typically by specifying the probabilities that the reserves fall into broad classes, such as large, small, or zero). This approach is considered to be somewhere in between a present value analysis and a real-options approach and is not considered here; see Boyle and Irwin (2004) for discussion and Eppen et al. (1993) for a good overview of the approach. ...
Book
This Element on cost-benefit analysis provides a summary of recent theoretical and empirical developments and summarizes state-of-the-art stated-preference and revealed-preference valuation methods. The Element discusses how to assess small (or marginal) as well as large (or non-marginal) projects that have a significant impact on prices and/or other economic variables. It also discusses distortions like taxes, market power, and sticky prices. In addition, risk/uncertainty is considered. A novel feature is the elaboration on flexible evaluation rules for reasonably small projects. Conventional point-estimates of projects should be used with care, because they typically give biased results.
... One exception is the textbook by Levine, Berenson and Stephan (1998). Decision risk analysis is also much more commonly taught in operations research textbooks (Chelst 1998; Anderson, Sweeney, and Williams 1994; Gould, Eppen, and Schmidt 1991; Hillier and Lieberman 1990; Ragsdale 1995; Winston 1994). As the next section will show, even these textbooks provide fairly limited coverage of decision theory. ...
Article
Full-text available
There has been much concern about making the curriculum for engineering statistics more relevant to the needs of industry. One proposed solution is to include decision risk analysis in the curriculum. However, the current coverage of decision risk analysis in statistics textbooks is either nonexistent or very introductory. In part, this reflects the fact that decision risk analysis, as currently taught, relies on the complex notion of a utility function. Recent research in decision theory suggests a way of comprehensively and rigorously discussing decision theory without using utility functions. In this new approach, the decision risk analysis course focuses on making decisions so as to maximize the probability of meeting a target. This allows decision theory to be integrated with reliability theory. This course would be more comprehensive than the conventional introductory treatment of decision theory and no more difficult to teach. In addition, integrating decision theory with reliability theory facilitates its incorporation in curricula that currently exclude decision theory.
... The optimal plan of problem (1)-(3) coincides with one of the plans of the system of equations (2). The simplex method consists of successively improving the plan. ...
... As proposed by Abbagnano (1970), the model is one of the fundamental scientific concepts. A model is a selective abstraction of reality, a representation of real objects and situations (Eppen et al., 1987; Anderson et al., 1991; Pidd, 1998), and it aims to clarify the relationships between different elements, indicating effective causalities and interactions (Harding and Long, 1998). Models are relevant because they allow a better understanding of something that will be built. ...
... The single-period model is concerned with the planning and control of inventory items for which only one replenishment opportunity exists. Unconstrained single-period stochastic inventory systems with constant unit costs were treated by Gould [1], Nahmias [2] and Tersine [3]. Also, Silver [4] examined the single-period model when the demand during the period is normally distributed. ...
... where c_2 ≥ 0 is interpreted as the earnings obtained by B2 per unit produced. Nash equilibrium (Luce and Raiffa 1957; Imbert Tamayo and Petrosian 1980): Player B1 must calculate the set of optimal vectors X*(u) that solve the linear parametric programming problem (Gould et al. 1993; Hillier and Lieberman 2001): max c_1 X = c_1 X*(u) (7) subject to XA ≤ u + a; X ≥ 0; a ≥ 0, where u is a parameter vector and the vector a represents resources that do not depend on A0. ...
Article
Full-text available
This paper deals with the application of Game Theory with Perfect Information to an agricultural economics problem. The goal of this analysis is to demonstrate the possibility of obtaining an equilibrium point, as proposed by Nash, for an agricultural company considered together with its three sub-units in a game with perfect information. Production results for several crops are considered in this game, together with the parameters needed to formulate the corresponding linear programming problems. In the game with perfect information, with a hierarchical structure established between the four players considered (a management center and three production units), a Nash equilibrium point is reached: once the strategies of the other players are known, if any player were to use a strategy different from the one proposed, its earnings would be lower than those obtained with the proposed strategies. When the four linear programming problems are solved, a particular case of an equilibrium point is reached.
... A "greedy" heuristic provides a viable tool for doing this and guarantees a feasible schedule will be generated because all resource constraints are satisfied at each stage. It is known that this approach is not optimal, but it is widely used as giving good results (Gould, Eppen, and Schmidt, 1993). ...
... Courses the author has taught include statistical methods, operations research methods, mathematical programming, regression analysis, categorical data analysis, mathematical probability and statistics, design and analysis of experiments, and survival analysis. In teaching the introductory survey course in operations research, the author has used textbooks written by Cook and Russell (1981), Eppen et al. (1987), Anderson et al. (1994), Ragsdale (1998), Lapin (1994), and Lapin and Whisler (2002). ...
Article
Full-text available
This paper provides a review of the history of operations research pedagogy and the progress the discipline has made in improving the quality of education it provides college students since its recognized inception during World War II. Recent and current trends are examined and ongoing activities and initiatives in operations research pedagogy are discussed. Finally, implications for the future of operations research are considered.
... Since the solution to a model is often merely a starting point and is, in itself, the least interesting part of the analysis of the real problem, a manager will pose numerous additional questions before there is sufficient confidence (Gould et al., 1993). The possible additional questions for mall M are as follows: ...
... On the basis of this tool decision-makers could gradually gain substantial insights. Eppen et al. (1993) favor the idea of tools in the form of a simple map. According to them, decision theory provides a framework for analyzing a wide variety of management problems. ...
Article
Small and medium-sized enterprises (SMEs) need help in making better decisions about e-commerce investments. In this paper a tool for the continuous improvement of the decision-making process is presented. The tool is designed to meet the needs of SMEs by framing the concept of e-commerce as a combination of organizational aspects and the integration of information and communication technology (ICT). Using the tool, decision-makers will be able to quickly acquire substantial knowledge about e-commerce investments, leading to better decisions on such investments. The tool is part of a conceptual framework introduced here with the objective of supporting decision-makers in the initial steps of a decision-making process.
... Models are widely used in organizational theory, particularly in operations management, for estimating and forecasting and also in the decision-making process. They are representations of real objects and situations (Anderson et al., 1991); however, they constitute a selective abstraction of reality (Eppen et al., 1987). For Harding and Long (1998), a model is a dynamic representation of reality whose purpose is to clarify the relationships between different elements, indicating effective causalities and interactions. ...
... For example, if the right-hand side of the constraint is changed by an amount (d), the optimal objective value will increase or decrease by (Xi) x (d). The linear programming term for this effect is "shadow price", and it is represented by dual variables [4]. In the previous example, if two units are added to the right-hand side, Y increases by (1.5)(2) = 3 units. ...
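The shadow-price arithmetic in the snippet above can be checked numerically by re-solving an LP after perturbing a right-hand side. The two-variable LP below is an assumed toy example, not the snippet's actual problem, and the sketch uses SciPy's linprog only as a generic LP solver.

from scipy.optimize import linprog

# Toy LP (assumed data): maximize 3*x1 + 5*x2
# subject to  x1 + 2*x2 <= 10,  3*x1 + x2 <= 15,  x1, x2 >= 0.
c = [-3, -5]                      # linprog minimizes, so negate the objective
A_ub = [[1, 2], [3, 1]]
b_ub = [10, 15]

base = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")

# Perturb the first right-hand side by d and observe the objective change;
# the change divided by d approximates that constraint's shadow price.
d = 1.0
bumped = linprog(c, A_ub=A_ub, b_ub=[b_ub[0] + d, b_ub[1]], method="highs")

shadow_price_estimate = (-bumped.fun - (-base.fun)) / d
print("base optimum:", -base.fun, "shadow price of constraint 1:", shadow_price_estimate)

The estimate matches the dual value as long as the perturbation stays within the allowable range reported by sensitivity analysis.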
Article
Investment managers seek to minimize portfolio risk while achieving an ideally large expected return. By evaluating the variances and covariances of given stocks, the number of shares to purchase for a predetermined return can be determined. Lagrange multipliers, used to handle the nonlinear equations, extend the linear programming capabilities of analytical software. These tools can be used effectively by portfolio managers in the quest for superior returns and lower associated risk.
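As a sketch of the Lagrange-multiplier approach the abstract describes, the equality-constrained minimum-variance weights can be recovered by solving the KKT linear system. The covariance matrix, expected returns, and target return below are assumed for illustration and are not data from the paper.

import numpy as np

# Assumed inputs: covariance matrix and expected returns for three stocks,
# plus a target portfolio return.
Sigma = np.array([[0.10, 0.02, 0.04],
                  [0.02, 0.08, 0.01],
                  [0.04, 0.01, 0.12]])
mu = np.array([0.08, 0.10, 0.14])
target = 0.11

# Minimize w' Sigma w  subject to  mu'w = target and 1'w = 1.
# Stationarity of the Lagrangian gives the linear KKT system solved below
# (short positions are allowed in this simple version).
n = len(mu)
ones = np.ones(n)
KKT = np.block([[2 * Sigma, mu[:, None], ones[:, None]],
                [mu[None, :], np.zeros((1, 2))],
                [ones[None, :], np.zeros((1, 2))]])
rhs = np.concatenate([np.zeros(n), [target, 1.0]])

solution = np.linalg.solve(KKT, rhs)
weights = solution[:n]
print("weights:", weights, "portfolio variance:", weights @ Sigma @ weights)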
... areas across multiple disciplines such as the nonlinear algebraic problems of optima in mathematics and statistics (Anton 1999; Johnson and Bhattacharyya 1987), applied optimization problems in science and engineering (Dorf 2005; MATLAB 2011), constrained/unconstrained optima in operations studies (Fabrycky 1984), computational optimization in computer science and software engineering (Press et al. 1992; Tucker 1992; Wang 2007; 2008c), behavioral optimization in artificial intelligence (Bender, 2000; Wang, 2008d; 2010; Wang et al. 2009b), rational decisions in decision theories and economics (Brue 2001; Wang and Ruhe 2007; Wang and Chiew 2010), and optimal strategies in management science (Eppen et al. 1993). A number of numerical methods were developed for optimization, such as the golden section search method, the quadratic regression method, Newton's method, the random search method, and the steepest ascent method (Chapra and Canale 2002; Dennis and Schnabel 1996; Fletcher and Powell 1963; Fletcher 1987; Gilat and Subramaniam 2011; Gill et al. 1981; Nocedal and Wright 1999; Rice 1993; Scheid 1988). ...
Article
Full-text available
Optimization is a search process for finding the maximum and/or minimum of a one- or multi-dimensional function. Optimization is widely used to obtain the best outcomes of a design, solution, prototype, or project in science and engineering. This paper presents two novel numerical methods for function optimization, known as the binary section search (BSS) method and the sliced reduction search (SRS) method. For each of the numerical methods developed in this paper, its conceptual model, mathematical model, and algorithm are formally modeled and analyzed. A set of comparative experimental results in MATLAB® is provided. Interesting findings are reported based on comparative studies of numerical optimization methods and their complexity and performance. It is found that the golden section search method is not the fastest-converging bracketing method for 1-D function optimization, as conventionally supposed. The traditional random search method and the steepest ascent method for 2-D and n-D function optimization are extended by a more efficient sliced reduction search method.
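For comparison with the bracketing methods discussed in the abstract, a standard golden-section search for a 1-D minimum can be sketched as follows. This is the generic textbook version, not the paper's BSS or SRS methods, and the test function is an arbitrary example.

import math

def golden_section_min(f, a, b, tol=1e-6):
    """Locate a minimum of a unimodal f on [a, b] by golden-section search."""
    inv_phi = (math.sqrt(5) - 1) / 2           # 1/phi, about 0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):                        # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                  # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Example: the minimum of (x - 2)^2 + 1 on [0, 5] is at x = 2.
print(golden_section_min(lambda x: (x - 2) ** 2 + 1, 0.0, 5.0))

Each iteration shrinks the bracket by the golden ratio; a production version would also reuse one function evaluation per iteration instead of recomputing both interior points.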
... Models are widely used in organizational theory, particularly in operations management, for estimating and forecasting and also in the decision-making process. They are representations of real objects and situations (Anderson et al., 1991); however, they constitute a selective abstraction of reality (Eppen et al., 1987). For Harding and Long (1998), a model is a dynamic representation of reality whose purpose is to clarify the relationships between different elements, indicating effective causalities and interactions. ...
Article
Full-text available
Many studies have been devoted to representing, understanding and explaining organizational action. The purpose of this work is to contribute to the Theory of Operations Management. An understanding of the causal relationships between strategy, structure and performance is still of major importance in the area of 'Strategy and Organizations'. This paper discusses the theoretical assumptions of a reference framework that can be used in the construction of models aimed at studying organizational action from the standpoint of structuring, implementation and performance. The approach employed here is structuralism, into which concepts of resources and capabilities have been incorporated based on an approach centered on the development of organizational competences. The framework presented here is founded on the hypertext metaphor and interrelates the dimensions of organizational processes and structures in the context of organizational 'spaces'. The framework incorporates the dimensions of space, the structuralist approach founded on the definition of 'form', and the 'hypertext' architecture to integrate the dimensions. The framework articulates and represents the horizontal and vertical coordination of activities and processes in an integrated and coherent way within the context of participative management and knowledge management.
... From its struggle to be taken seriously in the 1950s, the science of management and administration has become a principal component of management theory and practice in the 1990s. It is prominent in the academic literature (Assad et al., 1992; Austin, 1993; Culbert, 1996; Mingers and Gill, 1997; Plane, 1994; Reisman, 1992; Sproull, 1997) and in business school classrooms (e.g., Eppen et al., 1998; Taylor, 1998). In public administration, a "new" science of administration (Daneke, 1990; Dennard, 1996; Kiel, 1994; Neumann, 1996) is establishing a presence with the literature of traditional administrative science (Dunsire, 1973; Lee, 1990; White, 1975). ...
... Coordination in this sense refers to the efficient allocation of resources across multiple tasks, setting and communicating task deadlines, and scheduling activities to minimize total project duration. Managers address such problems using PERT charts and 'critical path analysis' (Eppen et al., 1993, pp. 662-711). ...
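Critical path analysis, as referenced in the snippet above, reduces to a longest-path (forward-pass) computation over the activity network. The activities, durations, and precedence relations below are hypothetical, invented purely to illustrate the calculation.

# Hypothetical activity network: activity -> (duration, list of predecessors).
activities = {
    "A": (3, []),
    "B": (5, ["A"]),
    "C": (2, ["A"]),
    "D": (4, ["B", "C"]),
    "E": (6, ["C"]),
    "F": (1, ["D", "E"]),
}

earliest_finish = {}

def ef(act):
    """Earliest finish time via a forward pass (memoized recursion)."""
    if act not in earliest_finish:
        duration, preds = activities[act]
        earliest_finish[act] = duration + max((ef(p) for p in preds), default=0)
    return earliest_finish[act]

project_duration = max(ef(a) for a in activities)
print("earliest finish times:", earliest_finish)
print("project duration:", project_duration)

A backward pass over the same network would give latest start times and hence the zero-slack activities that form the critical path.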
Article
Managers and scholars emphasize the importance of inter-departmental integration to successful product development and emphasize the importance of product development capabilities to firm performance. Interdepartmental integration can increase shared knowledge; this in turn can reduce the severity and incidence of seemingly unnecessary and costly mistakes leading to more efficient and effective product development. This paper examines the sources of shared knowledge problems across functional and product-based groups. Using interviews, product development plans, and time-sheet data, we look at the underlying factors behind two sources of knowledge sharing problems: (1) a serial product development process (i.e. little or no face-to-face communications) and (2) ineffective use of integrating mechanisms because of sticky knowledge. For each problem source, we find two key causal factors. We discuss the problems, their factors, and the implications for management and theory.
... Theory suggests that durable organizational change of the kind required by CCM is most likely when stakeholders are involved in design and implementation [12,13]. Yet classical continuous quality improvement (CQI) for depression, which maximizes participation, does not improve outcomes [14,15]. ...
Article
Full-text available
Meta-analyses show collaborative care models (CCMs) with nurse care management are effective for improving primary care for depression. This study aimed to develop CCM approaches that could be sustained and spread within Veterans Affairs (VA). Evidence-based quality improvement (EBQI) uses QI approaches within a research/clinical partnership to redesign care. The study used EBQI methods for CCM redesign, tested the effectiveness of the locally adapted model as implemented, and assessed the contextual factors shaping intervention effectiveness. The study intervention is EBQI as applied to CCM implementation. The study uses a cluster randomized design as a formative evaluation tool to test and improve the effectiveness of the redesign process, with seven intervention and three non-intervention VA primary care practices in five different states. The primary study outcome is patient antidepressant use. The context evaluation is descriptive and uses subgroup analysis. The primary context evaluation measure is naturalistic primary care clinician (PCC) predilection to adopt CCM. For the randomized evaluation, trained telephone research interviewers enrolled consecutive primary care patients with major depression in the evaluation, referred enrolled patients in intervention practices to the implemented CCM, and re-surveyed at seven months. Interviewers enrolled 288 CCM site and 258 non-CCM site patients. Enrolled intervention site patients were more likely to receive appropriate antidepressant care (66% versus 43%, p = 0.01), but showed no significant difference in symptom improvement compared to usual care. In terms of context, only 40% of enrolled patients received complete care management per protocol. PCC predilection to adopt CCM had substantial effects on patient participation, with patients belonging to early adopter clinicians completing adequate care manager follow-up significantly more often than patients of clinicians with low predilection to adopt CCM (74% versus 48%, p = 0.003). Depression CCM designed and implemented by primary care practices using EBQI improved antidepressant initiation. Combining QI methods with a randomized evaluation proved challenging, but enabled new insights into the process of translating research-based CCM into practice. Future research on the effects of PCC attitudes and skills on CCM results, as well as on enhancing the link between improved antidepressant use and symptom outcomes, is needed. ClinicalTrials.gov: NCT00105820.
... Moreover, managerial decision-making should not be so dependent on one person. Decision-making theory has taken as its subject matter how individuals and groups make decisions [2], [4], [5]. The goal for much of this work has been the production of a model of decision making: a model general enough to describe individual cases of decision making while drawing out important generalities across different individuals and situations. ...
Article
Full-text available
A manager who is accustomed to thinking about the decision process in terms of intuition, the rational model, and the model of bounded rationality is usually confused by the many technical details describing something he does not understand. To take better advantage of decision support tools, managers need IT professionals to speak the same language. This article is an attempt to describe decision support tools in managerial language.
Article
Full-text available
This research evaluates the property insurance investment portfolio in the Egyptian market through a mathematical model that helps property insurance companies select a suitable investment portfolio by formulating the problem as a quadratic programming model over the period 2008/2009 to 2019/2020. The study found that most investments of property insurance companies in the Egyptian market are concentrated in only two types of investments, non-government securities at 46.11% and fixed deposits at 32.65%, which is inconsistent with the principles of diversification and liquidity that an investment portfolio should satisfy. At the same time, there is large dispersion in the rates of return on both real estate and other investments, reflected in the values of the variance, standard deviation, and coefficient of variation. The researchers found that the lowest-variance investment return is 7.3% for the investment portfolio of property insurance companies in the Egyptian market. They recommend relying on investment funds to ensure diversified and safe investment, establishing a genuine partnership between property insurance companies in the Egyptian market and institutions specialized in managing securities portfolios to benefit from their expertise, and reconsidering the legal restrictions imposed on the investments of property insurance companies by the supervisory authority in order to achieve the highest possible return at an acceptable level of risk.
Conference Paper
Full-text available
The subject of this research is the standard situational indicators of the basketball players who participated in the 2012 Olympic Games in London. Based on this subject, we defined the aim of the research: to determine the relationship between situational efficiency in basketball and the final ranking of the national teams that took part in the Olympic basketball tournament in London in 2012. The sample of variables comprises eleven variables for assessing situational efficiency in basketball defined by the International Basketball Federation (FIBA). Based on the results obtained, we concluded that participating teams with a better two-point shooting percentage, more defensive rebounds, assists, personal fouls committed, and steals during the Olympic basketball tournament achieved a better final ranking.
Conference Paper
Full-text available
The scientific nature of this research is evident from the defined thematic approach, which emphasizes the need for research through a structured program of work, under the assumption that, within a certain time interval, the implementation of basketball content can influence not only the refinement of technical elements and of motor and situational-motor abilities, but also the transformation and shaping of the morphological status of basketball players among university students and secondary and primary school pupils. The aim of this research was to determine the qualitative and quantitative changes in the morphological status of students of a faculty of sport under the influence of an additional basketball teaching program offered as an elective course. Significant indicators of the transformation of the morphological status of the studied sample of students were obtained, as well as important information that will substantially improve the basketball curriculum at the faculty and thereby positively affect the quality of teaching at the university.
Conference Paper
This study was conducted to identify students' perceptions of the retail management skills gained through managing a business simulation at the Centre of Retail Excellence (CORE). The sample consists of Diploma in Retail Management (DRM) students at Politeknik Sultan Azlan Shah (PSAS), Perak, who managed CORE during their studies. At PSAS, CORE is a business simulation used in the teaching and learning process for DRM students. The sample size was 125 respondents, randomly selected, to whom questionnaires were distributed through an online survey. The study involved DRM students in semesters 4, 5, and 6 as well as DRM alumni. The objectives of this study were to identify students' perceptions of retail management skills gained by managing CORE and to identify the types of skills acquired by students through managing CORE. The data were analysed using Excel in Google Drive. From this study it can be concluded that giving students the opportunity to manage CORE produces positive perceptions of increased retail skills among DRM students. In addition, it is concluded that the skills used in managing CORE are skills needed by employers. Keywords: work, business simulations, retail management, retail skills
Chapter
General Decision Making (Including Bibliographies); Analytic Hierarchy Process; Applications of Decision Aids; Behavioral Aspects of Decision Making; Chemical Process Safety and Decision Making; Cost-Benefit Analysis and Cost-Effectiveness Analysis; Decision Analysis; Decision Support Systems; Economics; Expert Judgment; Expert Systems; Future Research and Development; Fuzzy Sets; Game Theory; Group Decision Making; Issues Surrounding the Use of Decision Aids; Kepner-Tregoe Decision Analysis; Mathematical Programming; Multiattribute Utility Analysis; Multiple Criteria Decision Making; Operations Research/Management Science; Other Decision Aids; Payoff Matrix Analysis; Probability and Statistics; Risk Analysis and Risk Assessment; Social Choice Theory; Software; Utility Theory; Value of Life; Voting; Weighted Scoring Methods
Chapter
Purpose of Chapter; Overview of Mathematical Programming; Explanation of Mathematical Programming; Case Study: Underground Pipeline; Extensions of Mathematical Programming; Implementation Needs; Summary
Chapter
Just about any direction you turn nowadays, the term “network” pops up. So pervasive, in fact, is the idea of a network that it’s actually entered the English language as a verb, as in “networking,” which is used to express the idea of having lots of contacts, friends and colleagues. To get a feel for how the term is used in everyday life, let us briefly look at a few examples of networks.
Article
Students (N = 112) rank-ordered the suitability of three applicants for a graduate teaching assistantship position. Six rank-order combinations were possible, each indicating a different decision style. Contrary to expectations, groups in both high- and low-conflict conditions preferred (51%) the applicant whose lowest score among the criterion categories was the highest (the Maximin, or Wald, decision strategy).
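A minimal illustration of the Maximin (Wald) rule mentioned above, with an entirely assumed score table for three applicants rated on three criterion categories (not the study's materials), is:

# Assumed scores: each applicant rated on three criterion categories.
scores = {
    "applicant_1": [7, 4, 9],
    "applicant_2": [6, 6, 6],
    "applicant_3": [9, 3, 8],
}

# Wald / Maximin rule: pick the alternative whose worst score is best.
maximin_choice = max(scores, key=lambda a: min(scores[a]))
print(maximin_choice, "worst-case score:", min(scores[maximin_choice]))

Here applicant_2 wins under Maximin even though other applicants have higher best-case scores, which is the conservative pattern the study observed.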
Chapter
Thanks to rapid developments in hardware and software, simulation has become an important instrument for supporting decision making. This is especially so because practically relevant problems are very quickly characterized by a high degree of complexity, so that the use of mathematical optimization methods within decision models may no longer be feasible. This is particularly the case for many dynamic, stochastic systems with mutual dependencies between the individual processes. In such situations only simulation models remain applicable, even though this means accepting a "downgrade" from the level of optimization models to that of explanatory models ("if-then" causal relationships). Contrary to widespread opinion, however, simulation involves more than merely experimenting with a model. Besides model development and model implementation, the effective and efficient planning and execution of the simulation experiments becomes centrally important; otherwise, working with simulation models becomes very costly in terms of both time and money, which is one of the main reasons why many simulation models that have been developed are never used or deployed. In planning stochastic simulation experiments, the question therefore arises of which parameter configurations must be simulated, and for how long, in order to draw conclusions from the results. The question of effectiveness, i.e., the selection of suitable, important, or "correct" parameter configurations, is answered within experimental design. The question of efficiency, i.e., the optimal number of simulation runs per parameter configuration, is treated within the tactical planning of simulation experiments. This required number of simulation runs is determined by the computational accuracy, which in turn must be aligned with the accuracy of the data and the model.
Article
Full-text available
Architectural energetics, subsumed within replicative archaeology, provides a means through which buildings are translated into labor-time estimates. To date, the majority of architectural energetics analyses have generated comparative measures of architectural costs, equating these with a vertical structure of political power and authority within and among societies. The present analysis expands the application of architectural energetics by subjecting construction labor costs to an analysis based on concepts central to the Theory of Constraints, which is widely applied in modern operations management. This modeling generates a hypothetical set of behavioral patterns performed by general laborers within a construction project and explicates a method which allows further exploration into the question of labor organization (i.e., allocation and articulation of workers), as well as perhaps other economic organization, in an archaeological context. The case example is Structure 10L-22, a large Mayan palace at the site of Copan, Honduras.
Article
Option pricing, decision trees and Monte Carlo simulations are three methods used for evaluating projects. In this paper their similarities and differences are compared from three points of view: how they handle uncertainty in the values of key parameters such as the reserves, the oil price and costs; how they incorporate the time value of money; and whether they allow for managerial flexibility. We show that despite their obvious differences, they are in fact different facets of a general project evaluation framework which has the static base case scenario as its simplest form. Compromises have to be made when modelling the complexity of the real world. These three approaches can be obtained from the general framework by focussing on certain aspects.
Article
Full-text available
This paper (SPE 57894) was revised for publication from paper 52949, originally presented at the 1999 SPE Hydrocarbon Economics and Evaluation Symposium held in Dallas, 20–23 March. Original manuscript received 3 January 1999. This paper has not been peer reviewed.
Summary
Option pricing, decision trees, and Monte Carlo simulations are three methods used to evaluate projects. In this paper, we compare their similarities and differences from three points of view: how they handle uncertainty in the values of key parameters, such as reserves, oil price, and costs; how they incorporate the time value of money; and whether they allow for managerial flexibility. We show that, despite their obvious differences, they are in fact different facets of a general project-evaluation framework that has the static base-case scenario as its simplest form. Compromises have to be made when modeling the complexity of the real world. These three approaches can be obtained from the general framework by focusing on certain aspects.
Introduction
Option pricing, decision trees, and Monte Carlo simulations are three methods for evaluating oil projects that seem at first radically different. Option pricing comes from the world of finance. In its most common form, it incorporates the Black and Scholes [1] model for spot prices and expresses the value of the project as a stochastic differential equation. Decision trees, which come from operations research and game theory, neglect the time variations in prices but concentrate on estimating the probabilities of possible values of the project, sometimes with Bayes' theorem and pre- and post-probabilities (see Ref. 2). In their simplest form, Monte Carlo simulations merely require the user to specify the marginal distributions of all the parameters appearing in the equation for the net present value (NPV) of the project. All three approaches seek to determine the expected value (or maximum expected value) of the project and possibly the histogram of project values; however, they make different assumptions about the underlying distributions, the variation with time of input variables, and the correlations between these variables. Another important difference is the way they handle the time value of money. Decision trees and Monte Carlo simulations use the traditional discount rate; option pricing makes use of the financial concept of risk-neutral probabilities. One of the difficulties in estimating the value of a project is that it usually is a nonlinear function of the input variables (for example, tax is treated differently in years with a profit than in years with a loss). Starting out from the NPV calculated on the base case, this paper shows how Monte Carlo simulations and decision trees build uncertainty and managerial flexibility into the evaluation method. Option pricing starts out by defining the options available to management and then models the uncertainty in key parameters. The three approaches are, in fact, different facets of a general framework. They can be obtained from this framework by focusing on certain aspects and simplifying or ignoring others.
First Step: NPV for the Base Case
The first step in evaluating any project is to set up a base-case scenario and to calculate its NPV with the parameter values that have been agreed upon. This assumes that the values of the input parameters are known: original oil in place, decline rate, oil prices for each year, costs for each year, discount rate, and tax structure, among others.
It further assumes that the scenario and the project life are fixed and that management will not intervene because of changes in the oil price, new technological developments, and other such factors. In the real world, the values of the variables are uncertain and management does react to changing situations, so it is vital to incorporate these two factors into the evaluation procedure. Ideally, distributions of all the variables should be modeled together with the correlations over time and the complex links between variables should be modeled for a wide variety of management scenarios. However, as Smith and McCardle3 demonstrated, this rapidly becomes very unwieldy and the sheer complexity of the situation forces compromise. Monte Carlo simulations, decision trees, and option pricing address this problem in different ways; each focuses on certain aspects and simplifies or ignores others. We show how these methods build up from the NPV equation in the base case incorporating uncertainty in the input variables for all three methods and incorporating managerial flexibility for decision trees and option pricing.
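As a sketch of the Monte Carlo branch of the framework described above, the distribution of project NPV can be simulated by drawing uncertain prices and costs year by year. All of the figures below (investment, production, decline rate, price and cost distributions, discount rate) are purely illustrative assumptions, not values from the paper.

import random
import statistics

def simulated_npv(years=10, discount_rate=0.10):
    """One NPV draw under assumed (illustrative) distributions."""
    npv = -60.0                                   # upfront investment, $MM
    production = 500.0                            # initial volume, thousand units/year
    for t in range(1, years + 1):
        price = random.gauss(60, 15)              # uncertain price, $/unit
        cost = random.gauss(25, 5)                # uncertain unit cost, $/unit
        cash_flow = production * (price - cost) / 1e3   # annual cash flow, $MM
        npv += cash_flow / (1 + discount_rate) ** t
        production *= 0.88                        # fixed decline rate
    return npv

draws = [simulated_npv() for _ in range(10_000)]
print("mean NPV:", statistics.mean(draws))
print("P(NPV < 0):", sum(d < 0 for d in draws) / len(draws))

The output is a histogram of NPVs rather than a single point estimate; decision-tree and option-pricing treatments layer managerial flexibility on top of the same base-case cash-flow model.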
Article
Poor-performing students in required technical courses tend to assume that their failure reflects a lack of ability within the discipline. The suggestion made here is that tacit student strategies for learning and problem solving are also crucial factors and, when ineffective, function as concealed contributors to poor performance. It is further suggested that experts in various domains rely on some commonly held tacit strategies, and that these can be partially transferred to students through a set of implementable steps. This process benefits students by addressing some underlying issues and by providing useful interventions.
Article
The paper reports a study carried out in a large UK industrial R&D establishment which had recognised that queuing for service from support groups was becoming a serious problem. The main part of the study was carried out on the operations of an analytical department. Interviews were carried out with both 'servers' and 'customers' at all levels in the organisation. These were supplemented by observations of the actual process by which service was demanded and rendered, and by analysis of records. The study revealed why the queuing problem arose and the reasons for its undesirable effects. The authors concluded that resolution of the problems is a matter for the organization as a whole, and not merely for servers and customers. The situation is further analysed in terms of queuing theory. It is argued that improvement could be achieved if a systematic priority allocation system which took into account both urgency and importance were instituted. However, rational scheduling is possible only over a short time horizon, given current human and computing abilities. To achieve optimization over a longer time horizon the authors recommend the development of an expert system incorporating past experience into its knowledge base.
Article
Purpose: This paper aims to provide a solution that allows a contractor to present its maximum competitiveness in meeting the multiple objectives defined by a project client in bidding for a contract. Design/methodology/approach: By applying the goal programming (GP) technique, this paper presents an alternative quantitative method for assisting a contractor to find its own most competitive strategy in bidding for construction contracts where multiple selection criteria are adopted. A goal programming optimal bidding strategy (GP-OBS) model is introduced to help a contractor generate an optimal resource allocation solution. A hypothetical example is used to show the procedure for using the GP-OBS model. Findings: This paper demonstrates that the GP technique can help contractors produce an optimal bidding strategy which enables the contractor to present his/her maximum competitiveness in meeting the project's multiple objectives. Originality/value: The solution from the model helps contractors to decide the optimal level of resources to be consumed within the resource limitations. The GP-OBS model provides a new approach to assessing a contractor's overall competitiveness.
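A minimal goal-programming sketch in the spirit of the GP-OBS idea above minimizes weighted under-achievement of two client goals subject to a hard resource limit. The goal targets, weights, coefficients, and budget are assumed for illustration and are not the article's model or data.

from scipy.optimize import linprog

# Variables: [x1, x2, d1_minus, d1_plus, d2_minus, d2_plus]
# x1, x2 = effort allocated to two bid objectives;
# d*_minus / d*_plus = under- / over-achievement of each goal.
weights = [0, 0, 2, 0, 1, 0]        # penalize under-achievement only

# Goal equations: 4*x1 + 2*x2 + d1- - d1+ = 40  and  3*x1 + 5*x2 + d2- - d2+ = 30
A_eq = [[4, 2, 1, -1, 0, 0],
        [3, 5, 0, 0, 1, -1]]
b_eq = [40, 30]

# Hard resource limit: x1 + x2 <= 8
A_ub = [[1, 1, 0, 0, 0, 0]]
b_ub = [8]

res = linprog(weights, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
x1, x2, d1m, d1p, d2m, d2p = res.x
print("allocation:", (x1, x2), "under-achievement of goals:", (d1m, d2m))

Because the budget cannot satisfy both goals fully, the weights determine which goal is sacrificed, which is the essential trade-off a GP bidding model captures.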
Article
In this paper we consider the problem of maximizing a non-linear or linear objective function subject to non-linear and/or linear constraints. The approach used is an adaptive random search with some non-random searches built in. The algorithm begins with a given point, which is replaced by another point if the latter satisfies each of the constraints and yields a larger functional value. The process of moving from one point to a better point is repeated many times. The value of each coordinate of the next point is determined in one of several ways; for example, a coordinate is sometimes forced to have the same value as the corresponding coordinate of the current feasible point. In this algorithm, a candidate point receives no further computational consideration as soon as it is found to be infeasible; this makes the algorithm general. Computer programs illustrating the details of the new algorithm are given, and computational results for two numerical test problems from the literature are presented. Optimality was reached in each of these two problems.
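A stripped-down version of the constrained random-search idea described above might look like the sketch below. The objective, constraint, step size, and coordinate-keeping probability are all assumed for illustration; this is not the paper's exact coordinate-selection scheme.

import random

def objective(x):
    """Assumed nonlinear objective to maximize."""
    return -(x[0] - 2) ** 2 - (x[1] - 1) ** 2 + 10

def feasible(x):
    """Assumed nonlinear constraint: stay inside a disc of radius 3."""
    return x[0] ** 2 + x[1] ** 2 <= 9

def random_search(start, iterations=20_000, step=1.0, keep_prob=0.3):
    best = list(start)
    for _ in range(iterations):
        candidate = [
            # Occasionally keep a coordinate at its current value,
            # otherwise perturb it randomly around the current point.
            b if random.random() < keep_prob else b + random.uniform(-step, step)
            for b in best
        ]
        # A candidate is considered further only if it is feasible and better.
        if feasible(candidate) and objective(candidate) > objective(best):
            best = candidate
    return best

best = random_search(start=[0.0, 0.0])
print("best point:", best, "objective:", objective(best))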
Article
One of the fascinating things about multiple criteria decision making (MCDM) is the degree to which the contributions that have built the field have come from all over the world. In this paper the international nature of the authorships of the 1216 refereed journal articles published on MCDM between 1987 and 1992, and the journals in which they have appeared, is examined. Also, an analysis of the 217 books and 31 journal special issues that have been published on MCDM, and 143 conferences that have been held on MCDM since the inception of the field, is similarly conducted. The paper concludes with a conference organizer, author and special issue guest editor index.
Article
Full-text available
Much recent thought in strategy has stressed the importance of organizational integration for competitive advantage. Empirical studies of product development have supported this emphasis by correlating integrating practices and superior performance. We propose, from a resource-based or capability view, that this correlation results from integration leading to patterns of shared knowledge among firm members, with the shared knowledge constituting a resource underlying product development capability. To explore this connection, we examine the product development efforts of a scientific software company. We define the ‘glitch’ as a costly error possible only because knowledge was not shared, and measure the influence of glitches on firm performance. At this company, gaps in shared knowledge did cause the company to incur significant excess costs. We also identify a set of ‘syndromes’ that can lead to glitches, and measure the relative importance of these syndromes. The glitch concept may offer a general tool for practical measurement of the marginal benefits of shared knowledge. Copyright © 1999 John Wiley & Sons, Ltd.
Article
The advent of Decision Support Systems redefines the primary professional role of institutional researchers.
Article
Although the Gap Procedure that Brams and Kilgour (2001) proposed for determining the price of each room in the housemates problem has many favorable properties, it also has one drawback: Its solution is not always envy-free. Described herein is an approach that uses linear programming to find an envy-free solution closest (in a certain sense) to the Gap solution when the latter is not envy-free. If negative prices are allowed, such a solution always exists. If not, it sometimes exists, in which case linear programming can find it by disallowing negative prices. Several examples are presented.
Article
Full-text available
A hospital's intensive care unit (ICU) is a limited and critical resource. The efficient utilization of ICU capacity impacts on both the welfare of patients and the hospital's cost effectiveness. Decisions made in the ICU affect the operations of other departments. Yet, decision making in an ICU tends to be mainly subjective and lacking in clear criteria upon which to base any given decision. This study analyzes the admission-and-discharge processes of one particular ICU, that of a public hospital in Hong Kong, by using queuing and computer simulation models built with actual data from the ICU. The results provide insights into the operations management issues of an ICU facility to help improve both the unit's capacity utilization and the quality of care provided to its patients.
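As a generic illustration of the kind of queuing calculation such a study might involve, an M/M/c/c loss model gives the probability that an arriving patient finds all beds occupied. The arrival rate, mean stay, and bed count below are assumed figures, not the Hong Kong ICU's data, and the Erlang-B formula is a standard textbook result rather than the authors' simulation model.

def erlang_b(offered_load, servers):
    """Blocking probability for an M/M/c/c loss system (Erlang-B recursion)."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Assumed figures: 1.5 admissions/day, mean ICU stay 4 days, 8 beds.
arrival_rate = 1.5
mean_stay = 4.0
beds = 8

load = arrival_rate * mean_stay            # offered load in Erlangs
blocking = erlang_b(load, beds)
print(f"offered load = {load:.1f} Erlangs, P(all {beds} beds full) = {blocking:.3f}")

A simulation model, as used in the study, relaxes the exponential-stay and blocked-arrivals assumptions and can also capture discharge and transfer policies.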
Article
This paper describes advanced interval methods for finding a global optimum and finding all solutions of a system of nonlinear equations, as implemented in the Premium Solver Platform, an extension of the Solver bundled with Microsoft Excel. It also describes the underlying tools that allow Excel spreadsheets to be evaluated over real and interval numbers, with fast computation of real gradients and interval gradients. The advanced interval methods described include mean value (MV) and generalized interval (GI) representations for functions, constraint propagation for both the MV and GI forms, and a linear programming test for the GI form, in the context of an overall interval branch and bound algorithm. Numerical results for a set of sample problems demonstrate a significant speed advantage for the GI techniques, compared to alternative methods.
Article
Decision problems often consist of numerous smaller decisions that are aggregated and interrelated while spanning multiple domains, paradigms, and/or perspectives. Therefore, the decision making process should be structured in such a flexible and iterative manner that enables a range of structured to unstructured decisions to be considered, built and solved in an appropriate manner. We propose and implement a framework and architecture that uses the three pillars of flexibility in decision making (sequential, parallel, convergence, and interwoven), versatility (of paradigm and/or domain), and independence of components (value, dimension, and purpose) to support the decision making process and the entire modelling lifecycle.