Science topic

Operational Research - Science topic

Explore the latest questions and answers in Operational Research, and find Operational Research experts.
Questions related to Operational Research
  • asked a question related to Operational Research
Question
3 answers
I am looking for a method to find the best path when there are several alternative paths. I know Dijkstra's algorithm finds the shortest path, but my problem looks more like finding a longest path, the paths form a tree, and I do not know which method applies. For example: we have a Financial Manager position. Before becoming Financial Manager, a person must be either Assistant Financial Manager (A) or Assistant Budget Manager (B). The Assistant Financial Manager has an optimal performance score (4 points) and the Assistant Budget Manager a potential score (3 points). Before becoming Assistant Financial Manager one must be Financial and Accounting Supervisor (C), and before becoming Assistant Budget Manager one must be General Affairs Supervisor (D); both C and D have potential scores of 3 points. How do we determine the best path that maximizes the performance score? Imagine the paths as Z - A(4) - C(3) and Z - B(3) - D(3).
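Since the positions form a tree/DAG with a score attached to each one, the maximum-score path can be found with simple dynamic programming (a longest-path computation, which is well defined here because there are no cycles). A minimal sketch, using the scores from the question as hypothetical data:

from functools import lru_cache

# Hypothetical encoding of the career graph in the question: each position has a
# score, and `predecessors` lists which positions can lead into it.
score = {"Z": 0, "A": 4, "B": 3, "C": 3, "D": 3}
predecessors = {"Z": ["A", "B"], "A": ["C"], "B": ["D"], "C": [], "D": []}

@lru_cache(maxsize=None)
def best_path(node):
    """Return (total_score, path) of the maximum-score chain ending at `node`."""
    best = (score[node], (node,))
    for pred in predecessors[node]:
        total, path = best_path(pred)
        if total + score[node] > best[0]:
            best = (total + score[node], path + (node,))
    return best

total, path = best_path("Z")
print(total, " -> ".join(path))   # prints: 7 C -> A -> Z

The same recursion extends to deeper career ladders; only the two dictionaries change.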
Relevant answer
Answer
Optimal Design of Truss Structures with Frequency Constraints: A Comparative Study of DE, IDE, LSHADE, and CMAES Algorithms (published 2021)
  • asked a question related to Operational Research
Question
6 answers
Preliminary results of the EURO WISDOM survey to understand career paths in Operational Research (OR) by gender suggest OR is carried out in more Engineering departments than in Schools of Business.
Faculty survey respondents to date are affiliated to:
Engineering 34%
Business 30%
Computer Science 12%
Mathematics 10%
Research Centre/Other 15%
Is that true for you?
We invite anyone working in OR (male/female/academic/practitioner) to complete the 10-minute anonymous survey by Sunday 25th July 2021. The survey is here: https://ucdbusiness.eu.qualtrics.com/jfe/form/SV_3UfzP74RaNRhbxQ
The EURO WISDOM Forum provides a platform to support, empower and encourage the participation of all genders in Operational Research within EURO. You'll find more details about the EURO WISDOM Forum here: https://www.euro-online.org/web/pages/1654/wisdom
Relevant answer
Answer
Operational Research (OR) is a core part of mathematics, but some engineering subjects such as industrial engineering are closely related to OR. Many theories of OR are used in engineering to optimize production costs, maximize profit and reduce the use of resources.
  • asked a question related to Operational Research
Question
12 answers
Any recommendations for a scientific journal to which I could submit a paper on operations research applying linear programming and vehicle routing (VRP) with the B&B algorithm?
Relevant answer
Answer
You can ask your thesis advisor about what journal they think would be best to submit your work. It is hard to suggest a journal for you without seeing the actual paper.
  • asked a question related to Operational Research
Question
2 answers
I am using queueing theory to determine premium pricing through aggregate loss, but I have not found suitable data.
Relevant answer
Answer
Dear sir,
Could you elaborate a bit more?
Thank you
  • asked a question related to Operational Research
Question
10 answers
We have 1000 items, each with a weight Wk, where:
Wk = max[(1 - Xm)*Wm + Pk]
with Xk a binary variable indicating whether or not we include item k,
Xm the corresponding variable of each item "m" related to item "k" (Wm being its weight), and
Pk the basis weight of item k.
Relevant answer
Answer
From your statement of the problem, it is not clear what MIP you are solving. Is your MIP problem the following quadratic MIP:
Min W1 X1 + ... + Wn Xn
s.t.
Wi >= (1-Xj)*Wj + Pj,   for all i, j = 1, ..., n with i != j;
Wi >= 0, i=1, ... ,n
Xi = 0 or 1, i=1, ..., n
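If that is indeed the intended model, one possible route (a sketch, not necessarily what the questioner needs): because each Xi is binary and each Wi is bounded, the products Wi*Xi can be linearised exactly with an auxiliary variable Yi and a big-M, which turns the quadratic MIP into a MILP. A PuLP sketch with hypothetical data; M is assumed to be a valid upper bound on every Wi:

from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, LpStatus

n = 4
P = [3.0, 1.0, 2.0, 4.0]      # hypothetical basis weights P_k
M = 100.0                     # assumed upper bound on every W_i (must be valid)

prob = LpProblem("weight_mip", LpMinimize)
X = [LpVariable(f"X{i}", cat=LpBinary) for i in range(n)]
W = [LpVariable(f"W{i}", lowBound=0) for i in range(n)]
Y = [LpVariable(f"Y{i}", lowBound=0) for i in range(n)]   # Y_i stands for W_i * X_i

prob += lpSum(Y)              # objective: min sum_i W_i X_i

for i in range(n):
    # exact linearisation of Y_i = W_i * X_i, valid because 0 <= W_i <= M
    prob += Y[i] <= W[i]
    prob += Y[i] <= M * X[i]
    prob += Y[i] >= W[i] - M * (1 - X[i])

for i in range(n):
    for j in range(n):
        if i != j:
            # W_i >= (1 - X_j) * W_j + P_j, rewritten using Y_j = W_j * X_j
            prob += W[i] >= W[j] - Y[j] + P[j]

prob.solve()
print(LpStatus[prob.status], [w.value() for w in W], [x.value() for x in X])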
  • asked a question related to Operational Research
Question
24 answers
Mathematical programming is the best optimization tool, with many years of strong theoretical background. It has been demonstrated that it can efficiently solve complex optimization problems with on the order of one million design variables, and the methods are very reliable. Besides, there are mathematical proofs for the existence of the solution and the globality of the optimum.
However, when there are discontinuities in the objective function, difficulties arise because the problem becomes non-differentiable. Methods such as subgradients have been proposed to handle this. Still, I cannot find many state-of-the-art papers on engineering optimization of discontinuous problems using mathematical programming; engineers mostly use metaheuristics for such cases.
Can all problems with discontinuities be solved with mathematical programming? Is it easy to implement subgradients for large-scale industrial problems? Do they work for non-convex problems?
A simple example of such a function is attached here.
Relevant answer
Answer
Your ideas for dividing the region and using local optimizer are so nice!
Thanks a lot!
  • asked a question related to Operational Research
Question
22 answers
Is there really a significant difference between the performance of the different meta-heuristics, other than "ϵ"? At the moment we have many different meta-heuristics and the set keeps expanding. Every once in a while you hear about a new meta-heuristic that outperforms the other methods, on a specific problem instance, by ϵ. Most of these algorithms share the same idea: randomness with memory, or selection, or whatever you want to call it, to learn from previous steps. In MIC, CEC and SIGEVO you see many repetitions of new meta-heuristics. Does it make sense to stay stuck here? Now the same thing is repeating with hyper-heuristics, and so on.
Relevant answer
Answer
Apart from the foregoing discussion, all metaheuristic optimization approaches are alike, on average, in terms of their performance. Extensive research in this field shows that an algorithm may be the top choice for some classes of problems while at the same time being an inferior selection for other types of problems. On the other hand, since most real-world optimization problems have different needs and requirements that vary from industry to industry, there is no universal algorithm or approach that can be applied to every circumstance, and it therefore becomes a challenge to pick the right algorithm that sufficiently suits these essentials.
A discussion of this issue is in section two of the following reference:
  • asked a question related to Operational Research
Question
3 answers
We have conducted an AHP (analytic hierarchy process) to understand experts' opinions of the relative importance of different aspects of natural resource management, specifically for fisheries governance.
From our data, it appears group coherency is low - our experts have very different opinions about which components of governance are important in the situation we proposed to them.
To confirm this, we want to compute a measure of group coherence as described by Zahir in two papers in 1999. However, this calculation is beyond my statistical abilities. Does anyone have any guide, or ideally code (Python, R script) to calculate this statistic?
Zahir S (1999a). Clusters in a group: Decision making in the vector space formulation of the analytic hierarchy process. European Journal of Operational Research 112: 620-634.
Zahir S (1999b). Geometry of decision making and the vector space formulation of the analytic hierarchy process. European Journal of Operational Research 112: 373-396.
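While waiting for an implementation of Zahir's exact statistic, a rough proxy in the same vector-space spirit is the average pairwise cosine similarity between the experts' normalised priority vectors (1 means identical priorities, lower values mean more dispersion). A sketch with hypothetical data; this is an illustration, not Zahir's formula:

import numpy as np

# hypothetical priority vectors (rows = experts, columns = governance criteria)
priorities = np.array([
    [0.50, 0.30, 0.20],
    [0.10, 0.60, 0.30],
    [0.45, 0.35, 0.20],
])

unit = priorities / np.linalg.norm(priorities, axis=1, keepdims=True)
cosine = unit @ unit.T                                  # pairwise similarities
k = len(priorities)
mean_similarity = (cosine.sum() - k) / (k * (k - 1))    # mean off-diagonal entry
print(np.round(cosine, 3))
print("mean pairwise similarity:", round(mean_similarity, 3))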
Relevant answer
Answer
For this, a better tool is ANP!
  • asked a question related to Operational Research
Question
10 answers
As we know, in this pandemic everything seems quite hard to control. I hope researchers can discuss and share their thoughts on doing research at this time, especially on topics related to Industrial Engineering.
Relevant answer
Answer
As we all know, "necessity is the mother of invention". During World War II the whole world faced scarcity in every field, and that is when Operations Research was born. Likewise, COVID-19 has created disruption in all industrial functions. Every function of industry can therefore be tackled by Industrial Engineers, who can provide remedies through an analytical approach.
  • asked a question related to Operational Research
Question
58 answers
Are the so-called "novel" metaheuristic methods threatening to lead the area of metaheuristics away from scientific rigor?
Why do we need yet more metaheuristic methods when all the new methods solve the same problems in the same context, the unsolved problems remain unsolved, and there is no standard rule for judging the mathematical models and pseudo-code?
In recent years, the field of combinatorial optimization has witnessed a true tsunami of so-called "novel" metaheuristic methods, most of them based on a metaphor of some natural or man-made process. The behavior of virtually any species of insects, the flow of water, musicians playing together -- it seems that no idea is too far-fetched to serve as an inspiration to launch yet another metaheuristic.
Sources:
1- Del Ser, Javier, Eneko Osaba, Daniel Molina, Xin-She Yang, Sancho Salcedo-Sanz, David Camacho, Swagatam Das, Ponnuthurai N. Suganthan, Carlos A. Coello Coello, and Francisco Herrera. "Bio-inspired computation: Where we stand and what's next." Swarm and Evolutionary Computation 48 (2019): 220-250.
2- Sörensen, Kenneth. "Metaheuristics—the metaphor exposed." International Transactions in Operational Research 22, no. 1 (2015): 3-18.
Relevant answer
Answer
Metaheuristics are here to stay, and as has been discussed before they have their potentials in some circumstances.
But: being a professor in applied math, I would probably very seldom recommend to start by running metaheuristics: I would first always look at the problem at hand, figure out a few equivalent formulations (always profitable!), and run small/medium instances that make sense (with an array of existing math programming tools, of course), to see whether a few of the models appear to yield better results than some others, and ultimately have one to three model representations that I would let "compete" on both small instances and large-scale ones. By doing that you can collect observations that will be a good help to fix, for example, parameter settings for subsets of instances having different characteristics.
It's a far better way to "learn" the problem than to throw it at a random metaheuristic and write "Q.E.D." - laziness is not a virtue. And you wouldn't have a clue how far away you are from an optimum.
  • asked a question related to Operational Research
Question
3 answers
Dear all,
I used those two types of regression to estimate the effects of environmental factors on hospital efficiency.
However, they gave me opposite results. For instance, in FR the percentage of people above 65 was negatively associated with the DEA score, whereas in DBTR it was positively associated. The same was true for the turnover rate.
I would appreciate it if you could explain such a contradiction.
Relevant answer
Answer
  • asked a question related to Operational Research
Question
13 answers
Dear Operational Research Society,
Although it may sound like a general question: when is it more appropriate to use mathematical modeling instead of simulation, and vice versa?
What is the key factor in making a decision here?
Thanks a lot
Relevant answer
Answer
Simulations generally require mathematical modeling. From a personal experience, mathematical models allow humans to make predictions about the behaviors of the process systems. A ‘good’ model gives reliable predictions, while a ‘bad’ model gives inaccurate predictions. One example is the first-ever picture of a black hole, by the Event Horizon Telescope (EHT).
Models also allow developers to scale the prototype of a product before making the real thing, which can be massive and expensive.
  • asked a question related to Operational Research
Question
5 answers
In India, a separate Ministry of AYUSH has been established for the progress and development of these systems.
The AYUSH system comprises Ayurveda, Unani, Siddha and Homeopathy. My question is: today we have the modern allopathic literature (anatomy, physiology, pharmacology, medicine, etc.) as a science for understanding health; the AYUSH systems of medicine also follow science, but they may not always fit in line with the modern medical sciences, so there may be a problem in integrating AYUSH and allopathy at the internal-medicine level.
However, the WHO has recently published reports on "Traditional Medicine" and "Self-care health practices", indicating the importance of these interventions for achieving overall health, acceptance of health at the ground level, and progress towards Universal Health Coverage.
I personally believe that for many OPD and IPD diseases, illnesses and symptoms there is scope for these alternative medical sciences, particularly given the rising burden of NCDs. These alternative medical sciences may be preventive, promotive and curative in nature.
I also believe that these sciences need more operational research at the community level and in the area of public health. Today there is a need for documentation, research and scientific support for these types of sciences.
In view of the above, what best practices, national policies and public health interventions are available at present around the world that support and help the redevelopment of alternative medical sciences?
Relevant answer
Answer
Thanks a lot.
In India, Yoga and AYUSH have long been a source of treatment and are built into the culture of India. Hence, even if today's modern science may not be able to explain the various positive effects (the so-called placebo) of these indigenous treatments, I believe that one day science will be mature enough, or developed enough, to justify this knowledge. I also believe that each region of the world has health sciences built into its societies and culture, and so without integrating these sciences, achieving "wellness and holistic health" would be like water off a duck's back.
  • asked a question related to Operational Research
Question
6 answers
Please give their appropriate cases.
Relevant answer
Answer
We can choose the linear or the dual formulation depending on the constraints.
  • asked a question related to Operational Research
Question
3 answers
Can anyone kindly share resource material on Operational Research in Health Systems Strengthening projects.
Relevant answer
Answer
Dear Farooq Khan,
I suggest you see the links and attached files on this topic.
Operational Research to Improve Health Services - International ...
Systems thinking for health systems strengthening - Service ...
Strengthening Health Systems for an Effective HIV/AIDS Response ...
Framework for Operations and Implementation Research in Health ...
Best regards
  • asked a question related to Operational Research
Question
27 answers
I want to know how we should write the values for alternatives when we have quantitative attributes (in AHP).
As you know, when all attributes are qualitative, we use the Saaty scale (1 - 9)...
Example: we have price as a quantitative attribute ...
alternative 1 = 1000 $
alternative 2 = 5000 $
alternative 3 = 50000 $
How should we fill the pairwise comparison matrix (for price)?
        A1     A2     A3
A1    1
A2             1
A3                       1
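One common convention for quantitative criteria (an assumption, not the only valid one) is to derive the pairwise entries from the value ratios themselves, inverted for a cost-type criterion such as price and clipped to Saaty's 1/9-9 range; the priority vector then follows from the usual principal-eigenvector step. A sketch:

import numpy as np

prices = np.array([1000.0, 5000.0, 50000.0])
n = len(prices)

A = np.ones((n, n))
for i in range(n):
    for j in range(n):
        ratio = prices[j] / prices[i]           # cheaper i vs. pricier j gives > 1
        A[i, j] = min(max(ratio, 1 / 9), 9)     # clip into Saaty's 1/9 .. 9 range

eigvals, eigvecs = np.linalg.eig(A)             # principal eigenvector = priorities
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()
print(np.round(A, 2))
print("priorities:", np.round(w, 3))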
Relevant answer
Answer
Excellent answers
  • asked a question related to Operational Research
Question
25 answers
Production Engineering is an engineering area that deals with the problems of productive operations, with emphasis on the production of goods and services. Operational Research (OR) is defined as the area responsible for solving real problems arising in decision-making situations by means of mathematical models. OR is an applied science focused on solving real problems that applies knowledge from other disciplines, such as mathematics, statistics and computing, to improve rationality in decision-making processes.
Operational Research (OR) is responsible for solving real problems through mathematical and statistical models. How have we used OR in our own research?
Relevant answer
Answer
Dear Leopoldino
I guess that you are referring to OR (Operational Research) and use the Brazilian PO (Pesquisa Operacional). If so, I believe that the most important aspect of your remark is that OR, and most especially Linear Programming (LP), solves real problems, which is possible because of its algebraic structure of inequalities. It allows the construction of a scenario model far more representative than in any MCDM method.
LP was developed in 1939 by Leonid Kantorovich, and for this development he was awarded the Nobel Prize in Economics in 1975. It is thus the granddad of all present-day methods for MCDM.
The actual algorithm, the Simplex, is due to the genius of George Dantzig, who developed it in 1947. The same algorithm is still used today, after 70 years, and is, according to some sources, used by about 70,000 large companies worldwide.
LP processes large amounts of information; the number of alternatives and criteria is irrelevant and can be in the thousands each, and it is so important that Excel has incorporated it as an add-in since 1991.
However, LP has two severe drawbacks: it works with only one objective and only with quantitative criteria, which is not very realistic in today's projects. These two problems have been overcome by new methods and software based on LP, which do not yield an optimal solution, as the original Simplex does, but one which is probably very close to it.
What is important is that LP allows for modelling very complex scenarios, incorporating features that none of the more than two dozen MCDM methods on the market can handle, and for this reason it is able to model real problems by establishing restrictions, dependencies and even correlations.
In addition, it does not produce Rank Reversal.
I have been using LP for decades, and in the late 70s I was fortunate enough to act as a counterpart of the Massachusetts Institute of Technology (MIT) to solve by LP, in a two-year project, a very complex problem related to a river basin.
Since then, I solved more than one hundred problems in very different areas, and many of them have been published in my books, and at present, I am trying to promote its use in RG, where, as you properly say, rationality is paramount.
LP mathematics is a little complex; however, a user does not need to know it, in the same way as he does not need to know the mathematics of AHP, PROMETHEE or TOPSIS. As a matter of fact, LP is easier to use than other methods, since no weights are needed.
A couple of weeks ago I proposed in RG to develop a course on LP, however, nobody was interested.
  • asked a question related to Operational Research
Question
6 answers
For the application of Industry 4.0, and hence making the machine self-aware, what optimization techniques could be used for a machining process? (Preferably, please explain a mathematical model or a case study regarding this.)
Relevant answer
Answer
Sometimes, in order to identify the important parameters, it is necessary to use a methodology such as experimental design. Although it is time-consuming, it is worthwhile in order to obtain better results.
  • asked a question related to Operational Research
Question
4 answers
I am an undergraduate student in Production and Industrial Engineering, looking for a research proposal topic for applying to a doctoral program. It would also be great if you could suggest some reading. Any suggestions?
Thank you for your time.
Relevant answer
There are a number of research works in the area of FMS scheduling. I would suggest that you take a broader view of scheduling challenges under different types and levels of automation. It would be instructive to look at the practical utility of the FMS concept in today's manufacturing environments. Please think through the features of an FMS and their impacts on scheduling rules and their preferences.
  • asked a question related to Operational Research
Question
25 answers
Is Shannon entropy a good technique for weighting in Multi-Criteria Decision-Making?
As you know, we use Shannon entropy for weighting criteria in multi-criteria decision-making.
I think it is not a good technique for weighting in the real world because:
It only uses the decision-matrix data.
If we add some new alternatives, the weights change.
If we change the period of time, the weights change.
For example, we have 3 criteria: price, speed, safety.
Over several periods of time the weights of the criteria vary.
For example, if our period of time is one month:
this month price may get 0.7 (speed = 0.2, safety = 0.1);
next month price may get 0.1 (speed = 0.6, safety = 0.3).
It is against reality! What is your opinion?
Relevant answer
Answer
Once I was working with several variables and I wanted to weight them. In such cases people usually say that we had better provide a questionnaire and then define the weights for the variables through AHP, ANP or other related methods. That is quite common, but what about the bias of those who fill in the questionnaire? Therefore, I looked for other methods to weight the variables based on reality, and I came across entropy. In fact, I weighted the variables with each of these methods and then compared the results. The entropy results were much closer to what is going on in the real world.
  • asked a question related to Operational Research
Question
8 answers
Hey,
I have a mixed-integer linear programming problem that has complicating (coupling) variables as well as one complicating constraint. The complicating variables are binary and the complicating constraint contains only binary variables. You can see the structure in the image.
It is a unit commitment problem with chance constraints.
How can I decompose this problem?
Should I apply Benders decomposition and split the problem into one master problem with all binary variables and subproblems with only continuous variables?
Relevant answer
Answer
Hi, you can take a look at our cross decomposition algorithm for problems with coupling variables and constraints, which we refer to as weak and strong linking.
  • asked a question related to Operational Research
Question
3 answers
How much does the size of a spatial area affect traffic generation within the area and the resulting traffic volumes in the relevant city (public transport or individual transport)?
Is there any theoretical approach that describes this issue?
Relevant answer
Answer
So, you are looking for a direct link between traffic and area extension. To the best of my knowledge this issue has not been investigated. Traditionally, I would refer to the four-step modelling approach rather than a gravity model. For your purposes, gravity models should work only if the population and/or other attributes are uniformly distributed.
  • asked a question related to Operational Research
Question
3 answers
I'm looking for some advice on a suitable technique for optimising subject to constraints. My initial problem started out as a basic simplex optimisation problem finding a weight vector $w$, but now I'm trying to add more special constraints. I've heard of techniques to transform absolute-value constraints into multiple linear constraints, but I don't know about sign functions. Here is my problem: given input values $x_{i}^{j}$ and $y_{i}^{j}$, I'm trying to find a weight vector $w_{i}^{j}$ to optimise:
\[ \sum^{m}_{j=1}\sum^{n}_{i=1}(w_{i}^{j}+1)y_{i}^{j} \]
subject to:
\[ \sum^{m}_{j=1}\left| \sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right| \leq H \]
and
\[ -1 \leq w_{i}^{j} \leq 1 \] for all $i$ and $j$
and
\[ \left|\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right|\leq \left|\sum^{n}_{i=1}x_{i}^{j}\right| \] for all $j$ and
and
\[ sign\left(\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right) \left(sign\left(\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right) + sign\left(\sum^{n}_{i=1}x_{i}^{j}\right) \right) = 0\] for all $j$
Any advice anyone can give would be greatly appreciated. I don't really want to go down the genetic algorithm route straight away; if there is a numerical programming technique that is suitable I'd prefer to work with that, and if this can be transformed into basic linear constraints so that I can carry on using simplex optimisation, that would be great!
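One observation (a sketch, treating the $x_{i}^{j}$ as data, as the question implies): writing $s^{j}=\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}$, every special constraint above can be made linear, because the absolute values only bound quantities from above and the signs are compared against known constants. Introduce auxiliary variables $t^{j}$ with
\[ t^{j} \ge s^{j}, \quad t^{j} \ge -s^{j}, \quad \sum^{m}_{j=1} t^{j} \le H, \]
replace the third constraint by
\[ -\left|\sum^{n}_{i=1}x_{i}^{j}\right| \le s^{j} \le \left|\sum^{n}_{i=1}x_{i}^{j}\right| \]
(the right-hand side is a constant computed from the data), and note that the sign condition reduces, case by case on the constant $\sum^{n}_{i=1}x_{i}^{j}$, to $s^{j}\le 0$ when that constant is positive, $s^{j}\ge 0$ when it is negative, and $s^{j}=0$ when it is zero. With these substitutions the whole model stays linear and a simplex-based solver remains applicable.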
Relevant answer
Answer
Thank you very much, guys; apologies for my vagueness, I meant to say it's maximise - it's my first attempt at using LaTeX since my PhD many years ago. Thank you for the suggestions; my only real experience of operational research was a second-year course during my maths undergrad where I learnt about the simplex method. I had a quick look at CPLEX and it does indeed have a lot more functionality than the standard Microsoft Solver Foundation I'm currently using. As you mentioned, I am just using those sign functions like logical OR statements; I didn't even realise you could use logical constraints like that - it could be very powerful for my research and far more intuitive for my applications.
  • asked a question related to Operational Research
Question
7 answers
We all know that data is very important in decision-making processes, and it is also obvious that raw data cannot lead to a precise decision.
Operations Research (OR) can help transform the data to useful information by which accurate decisions can be made. Decision Support Systems are good examples in which this transformation and process of making accurate decisions occur.
Health care problems have attracted too much attention and researchers are using OR tools to solve them.
Now a critical question is: What are the main trends of using operations research tools in healthcare problems?
Relevant answer
Answer
Thank you, Behzad, for sharing such a significant question. The answers to this question are important for me too; I am following this.
  • asked a question related to Operational Research
Question
4 answers
Suppose we have a set of players, i.e., {p1, p2, ..., pn}, and each player has three different strategies, i.e., {s1, s2, s3}. They play m games. In each game, each player seeks to maximize its profit by selecting the strategy with the highest payoff. The profit associated with each strategy is as follows.
1) Payoff for selecting strategy s1 is zero
2) Payoff for selecting strategy s2 is a real number, which is calculated using some formula f1
3) Payoff for selecting strategy s3 is also a real number, however, it is calculated using another formula f2
I want to prove the existence of Nash equilibrium when all the players select one of the available strategies.
I have searched the web and found several documents; however, I could not get a clear idea of how to prove it mathematically.
Any help is deeply appreciated. Please let me know if I have missed any information. Thank you in advance.
Relevant answer
Answer
@Felipe Please find attachment to see the formulas.
  • asked a question related to Operational Research
Question
12 answers
Given a set of m (>0) trucks and a set of k (>=0) parcels, each parcel has a fixed payment for the trucks (which may be the same for all or differ between them). The problem is to pick up the maximum number of parcels such that the profit of each truck is maximized. There may be 0 to k parcels in the service region of a particular truck. Likewise, a parcel can be located in the service region of 0 to m trucks. There are certain constraints, as follows.
1. Each truck can pick up exactly one parcel.
2. A parcel can be loaded to a truck if and only if it is located within the service region of the truck.
The possible cases are as follows
Case 1. m > k
Case 2. m = k
Case 3. m < k
As far as I know, to prove a given problem H is NP-hard, we need to give a polynomial-time reduction from an NP-hard problem L to H. Therefore, I am looking for a similar NP-hard problem.
Kindly suggest some NP-hard problem which is similar to the stated problem. Thank you in advance.
Relevant answer
Answer
Let p_{ij} denote the profit gained if parcel j is loaded onto truck i. If the parcel cannot be loaded onto that particular truck, we just set p_{ij} to zero. It looks like we just need to solve the following 0-1 linear program:
maximise
\sum_{i=1}^m \sum_{j=1}^k p_{ij} x_{ij}
subject to
\sum_{i=1}^m x_{ij} \le 1 (for all j)
\sum_{j=1}^k x_{ij} \le 1 (for all i)
x_{ij} \in \{0,1\} (for all i and all j).
If that's right, the problem is very easy. As stated by Przemysław and Helmut, it is equivalent to the linear assignment problem, which in turn is equivalent to maximum-weight bipartite matching.
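A sketch with hypothetical profits of exactly that assignment model, using SciPy's linear-assignment routine; infeasible truck-parcel pairs get profit 0, which is equivalent to leaving them unmatched:

import numpy as np
from scipy.optimize import linear_sum_assignment

p = np.array([            # rows = trucks, columns = parcels (hypothetical profits)
    [4.0, 0.0, 2.0],
    [0.0, 3.0, 1.0],
])

rows, cols = linear_sum_assignment(p, maximize=True)
for i, j in zip(rows, cols):
    if p[i, j] > 0:       # zero-profit matches mean "truck/parcel left unassigned"
        print(f"truck {i} -> parcel {j} (profit {p[i, j]})")
print("total profit:", p[rows, cols].sum())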
  • asked a question related to Operational Research
Question
12 answers
I am looking for examples of such construction companies because I am interested in seeing whether they have improved their performance by applying OR techniques.
Relevant answer
Answer
I would say this is indeed a very broad question and difficult to answer on this platform.
But I will take the opportunity to share my experience: prior to academia I was in industry for almost 14 years, and the application of CPM and PERT was common in the execution of the large projects that I handled in India. Multiple-criteria decision analysis has now become an important tool for managing large construction projects.
To answer your question, I would say without any hesitation that Operations Research techniques are most valuable in managing projects. Consider that the contractors involved in large projects have been successful using OR analytics.
  • asked a question related to Operational Research
Question
3 answers
I have a problem where the objective expression changes with the problem instance size.
There are n given circles (centers and radii). The circles do not intersect; they are separable. I need to visit every circle, starting from the first circle and returning to the first circle. The sequence of visiting the circles is known. The task is to find points on these circles (described by the radius) that minimize the path.
For 3 circles we have to find P1, P2 and P3 that minimize d = d(P1, P2) + d(P2, P3) + d(P3, P1) (for the permutation (1, 2, 3)). When the number of circles changes, the minimized expression changes.
I would like to have a module invoked from C++/Python that will solve such a problem for me. I would like to send the number of circles and their parameters, and the solver should solve the problem and respond with P1, P2, P3, ..., Pn.
In the attachment you can find an image of two circles and some points P1, P2 (not an optimal solution, of course).
Relevant answer
Answer
I can recommend Pyomo: http://pyomo.org
It allows you to create your optimization problem in plain Python and calls a solver to solve it. It supports both CPLEX and Gurobi, as well as many interesting open-source solvers.
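A minimal Pyomo sketch of the circle problem (hypothetical circles and visiting order; each point is parametrised by an angle on its circle, and a local NLP solver such as Ipopt is assumed to be installed):

import math
import pyomo.environ as pyo

centers = [(0.0, 0.0), (10.0, 2.0), (4.0, 8.0)]   # visited 0 -> 1 -> 2 -> back to 0
radii = [1.0, 2.0, 1.5]
n = len(centers)

m = pyo.ConcreteModel()
m.I = pyo.RangeSet(0, n - 1)
m.theta = pyo.Var(m.I, bounds=(0, 2 * math.pi), initialize=1.0)   # angle on circle i

def px(i):   # x-coordinate of the chosen point on circle i (a Pyomo expression)
    return centers[i][0] + radii[i] * pyo.cos(m.theta[i])

def py(i):
    return centers[i][1] + radii[i] * pyo.sin(m.theta[i])

m.obj = pyo.Objective(
    expr=sum(pyo.sqrt((px(i) - px((i + 1) % n)) ** 2 + (py(i) - py((i + 1) % n)) ** 2)
             for i in range(n)),
    sense=pyo.minimize)

pyo.SolverFactory("ipopt").solve(m)
points = [(centers[i][0] + radii[i] * math.cos(pyo.value(m.theta[i])),
           centers[i][1] + radii[i] * math.sin(pyo.value(m.theta[i])))
          for i in range(n)]
print(points)

Because the objective is non-convex in the angles, a local solver only guarantees a local minimum; restarting from a few different initial angles is a cheap safeguard.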
  • asked a question related to Operational Research
Question
10 answers
list parameters used
Relevant answer
Answer
Dear Prof. Subhi
Vehicle category(LMV, HMV, Off-road, SUV and so on),
Time restriction,
Direction restriction and
Number of Inter-junctions
shall also be considered.
  • asked a question related to Operational Research
Question
6 answers
I'm going to use a utility function in my nonlinear optimization model (for an asset allocation problem).
It can be a logarithmic function like log(cTx) (c transpose x).
Please suggest some other popular utility functions besides log.
Relevant answer
Answer
If the payoff variable is non-negative, you can use square root for risk aversion and square for risk seeking. If the payoff can be negative, then you first have to add a constant to make it non negative.
  • asked a question related to Operational Research
Question
1 answer
Is anyone using vPlanner for the implementation of Last Planner? I have some questions about defining the Lookahead plan manually in vPlanner. I would appreciate it if anyone familiar with it could message me on ResearchGate.
  • asked a question related to Operational Research
Question
9 answers
There are 4 options with different distance and carbon footprint values (as per the attachment). What method can be used to choose the best (minimum) value for both distance and carbon footprint from these 4 options?
Individually, the minimum distance is Option 4, but the minimum carbon footprint is Option 3. How can an option be selected by considering both distance and carbon footprint in that data set?
Relevant answer
Answer
Try Data Envelopment Analysis, which does not require normalization of the data and allows selecting a Pareto-optimal solution under conflicting objectives. Thanks to assurance regions you can also give more relevance to one of these criteria. However, you would need more than 2 evaluation criteria.
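As a complement, a sketch of the most elementary approach (made-up numbers, since the attachment is not reproduced here): keep only the Pareto-efficient options, then rank them with a normalised weighted sum once the decision maker supplies weights:

options = {                      # option: (distance, carbon footprint) - hypothetical
    "Option 1": (120, 30),
    "Option 2": (110, 28),
    "Option 3": (105, 20),
    "Option 4": (100, 25),
}

def dominated(a, b):             # b dominates a: no worse on both, strictly better on one
    return all(y <= x for x, y in zip(a, b)) and any(y < x for x, y in zip(a, b))

pareto = {k: v for k, v in options.items()
          if not any(dominated(v, w) for o, w in options.items() if o != k)}

w_dist, w_co2 = 0.5, 0.5         # assumed equal importance of the two criteria
max_d = max(d for d, _ in options.values())
max_c = max(c for _, c in options.values())
scores = {k: w_dist * d / max_d + w_co2 * c / max_c for k, (d, c) in pareto.items()}
print("Pareto-efficient options:", sorted(pareto))
print("best by weighted sum:", min(scores, key=scores.get))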
  • asked a question related to Operational Research
Question
7 answers
In academic scheduling problems we often consider the minimization of the weighted sum of completion times (∑WiCi).
What can the optimization of this criterion correspond to in real industrial problems?
Thank you for your answers :)
Relevant answer
Answer
Hi,
As mentionned in Pinedo's famous book "Scheduling: Theory and applications" (Springer 2008), The sum of the weighted completion times of the n jobs gives an indication of the total holding or inventory costs incurred by the schedule. In the weighted version of the mean flow time, the higher the weight of a job, the earlier the latter has to be completed.
I hope this helps,
Kind regards,
Imène.
  • asked a question related to Operational Research
Question
2 answers
Through an operational leadership perspective, how does one approach a misinterpreted labeling of operations without disrupting the synergy of the department? Is there any supporting evidence of shifting operational labeling without hindering morale?
Relevant answer
Answer
First, I would say that the key to defeating issues of employee morale stemming from organizational change is to properly communicate the change initiative with the employees. This is the key for any change initiative.
Second, I would suggest looking into the literature on change management. For example, Lewin's change model requires that an organization unfreeze the old "way", introduce the new way, and refreeze. This should help facilitate the change without alienating employees.
I hope this helps
Sincerely,
Sean
  • asked a question related to Operational Research
Question
17 answers
I am writing a paper on finding the optimal location for a port. I used AHP for determining the criteria weights and PROMETHEE for outranking the alternatives. I know we have about 7 pre-defined preference functions and more can be defined by the researcher, but which one should be used in which context? (Please suggest an article on this subject if you know one.) I am really confused about determining the appropriate preference function.
Thank you so much for your kind responses 
Relevant answer
Answer
(3) Concerning robustness.
Definitely, it is fine if the models we use are robust [certainly, we should approach these and related notions rigorously, in the mathematical sense...].
Should we use only robust (stable) models (with structural stability only)?
If we did that and avoided using other models, we would never have encountered such famous notions as bifurcations, strange attractors, stochastic dynamics and catastrophes (those from dynamical systems).
The situation in decision analysis may be similar.
I am not a follower of AHP! Moreover, 10 years ago I did not like AHP, because of possible rank reversals (as with TOPSIS and PROMETHEE), etc. I teach my students all the basic MCDA methods and ask them each year about their attitude to the different methods. Students with a good knowledge of mathematics like the more complex methods (MAVT, MAUT, probabilistic and fuzzy methods); managers and economists prefer the (seemingly) simpler ones - TOPSIS, AHP. Is that bad? No!
Is the rank-reversal problem in AHP, TOPSIS and PROMETHEE (and other possible drawbacks) a reason to forget them? My point of view - no!
I am a mathematician, and for me rank reversal was, at the time, a big reason to consider such a method a bad one. But now I hold another position.
I have dealt (and deal) with ranking, and I see that sometimes an intuitively understandable relation with some good properties (distinguishability, etc.) is not transitive, and that in some procedures rank reversals are possible.
Is that too bad? I think not. It may happen when alternatives are 'close'; and if they are not 'close', they are distinguishable by different methods, including AHP, without rank reversals.
So, in such situations, such a 'drawback' can serve as an (additional) reason for careful analysis of the MCDA problem (and with different MCDA methods) - definitely so, if we want to find "a 'good' trade-off decision" and not be in a hurry to rank the alternatives as soon as possible.
  • asked a question related to Operational Research
Question
10 answers
I am thinking about on-site decision support for incident commanders.
In my opinion, currently descriptive and naturalistic models are exploited. Why not prescriptive or normative ones?
If we consider a human as the decision maker, the factors underlying the use of descriptive or naturalistic models are: the inability to comprehend and analytically process all the information, courses of action, consequences and costs of alternative activities under mental and time pressure.
If we consider a computer system as the decision maker, the factors are:
  • lack of information - we cannot ask firefighters to enter data into a computer system because they do not have the time;
  • a poor sensory layer for recognizing phenomena or victims - so far there are no sensors in buildings that make it possible to track fire dynamics, people's locations and their physical state;
  • huge uncertainty in modeling and foreseeing the behavior of the fire and of people, the reaction of the building to fire exposure, changes in ventilation, extinguishing effects and many other factors.
What do you think about this problem?
Relevant answer
Answer
Hi Adam, I think this is a really interesting question.
I very much agree with answers above. An additional thought on the role of environment...
Klein's work demonstrated that expert firefighters can develop effective heuristics. That is, through experience, firefighters develop mental shortcuts that help them to classify rapidly and accurately a scenario and make an effective decision. A pure normative approach takes time. So, in contexts where time is critical and humans can develop effective heuristics, the descriptive approach will be most appropriate.
We should consider employing normative/prescriptive models in contexts where ineffective heuristics are developed, thus helping to reduce overconfidence and reckless decision making.
A great reference for this: Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: a failure to disagree. The American Psychologist, 64(6), 515–526. http://doi.org/10.1037/a0016755
This is a current topic I'm wrangling with in Defence context, so welcome your thoughts.
Simon
  • asked a question related to Operational Research
Question
9 answers
What is the difference between optimal Pareto solutions of the following bi-objective problems? Is there any difference in general case?
1) Min f(x)
Min g(x)
H(x,y)<=B
2) Min f(x)
Min g(y)
H(x,y)<=B
Relevant answer
Answer
In that case you should rewrite it as
Min f(x,y) over x
Min g(x,y) over y
H(x,y)<=B
If that is the case, I agree with Behrang: it is a bi-level optimization problem.
The way you wrote it in the question it is neither, but if it is as I wrote it above (both objectives involve both variable vectors, but we optimize each objective over only one of them), then it is bi-level, not bi-objective.
  • asked a question related to Operational Research
Question
3 answers
I need to generate multiple sets of realistic data by random variation in order to test my model. I'm working in scheduling operations. Is there a way to do it with cplex?
Relevant answer
Answer
Hi dear Laila,
CPLEX is optimization software and cannot generate random variation by itself.
CPLEX just solves your optimization model and returns the optimal solution, if one exists.
One of the advantages of CPLEX is that it can easily be connected with Java or C#.
You can write your model and call CPLEX from Java or C#; in that code you can generate the random variations.
Another option: write your model in GAMS and use CPLEX to solve it;
in GAMS you can generate random variations.
  • asked a question related to Operational Research
Question
13 answers
Hi
Task assignment means assigning several tasks to several persons. I know that the Hungarian algorithm is commonly used to solve this problem, but its limitation is that if there are more tasks than workers, some tasks are left unassigned; it only assigns tasks to workers in a one-to-one mapping.
What I need is simply that a worker can be assigned more than one task. This happens when the number of tasks is bigger than the number of workers. In other words, we need several tasks to one person.
For example, I have 9 tasks (cost is man-hour) and 6 workers. An example of solving this might be 
Person 1 -> 1 task
Person 2 -> 2 tasks
Person 3 -> 3 tasks
Person 4 -> 1 task
Person 5 -> 1 task
Person 6 -> 1 task
---------------------------
6 Persons -> 9 tasks
- So far, there is no assumption that the tasks are related.
- The cost differs per task.
Is there an algorithm that solves such a problem, where all tasks are definitely assigned (while minimizing the total cost)?
Thank you
Relevant answer
Answer
Here is a very straightforward LP solution using EXCEL solver -- note that this is a standard format for a model typically called a bounded transportation problem
I set the coefficients to be "costs" in which case lower is better and the values were simply generated (by me) to be between 1 and 4 --- say "1" indicating the best match / lowest cost for a worker to a task, and "4" meaning a high cost. The 5 workers in this example are assigned to the 8 tasks for a total cost of 9 -- so almost everything is covered by a "good" low cost match (the lowest possible would be 8, right?)
Your question suggested that you wanted an "algorithm" and obviously some of the other answers are guiding you to ways to form this as a network problem. Since some of your other comments suggested a more elementary need, I hope this basic model is useful to you.
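A sketch of the same bounded-transportation idea written with an algebraic modelling library instead of a spreadsheet (random hypothetical costs; the per-worker workload bounds are assumptions):

from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value
import random

random.seed(0)
workers, tasks = range(6), range(9)
cost = {(w, t): random.randint(1, 4) for w in workers for t in tasks}   # man-hours
lo, hi = 1, 3        # each worker handles between 1 and 3 tasks (assumption)

prob = LpProblem("multi_task_assignment", LpMinimize)
x = {(w, t): LpVariable(f"x_{w}_{t}", cat=LpBinary) for w in workers for t in tasks}

prob += lpSum(cost[w, t] * x[w, t] for w in workers for t in tasks)
for t in tasks:                      # every task goes to exactly one worker
    prob += lpSum(x[w, t] for w in workers) == 1
for w in workers:                    # workload bounds per worker
    prob += lpSum(x[w, t] for t in tasks) >= lo
    prob += lpSum(x[w, t] for t in tasks) <= hi

prob.solve()
for w in workers:
    print(f"worker {w}: tasks {[t for t in tasks if x[w, t].value() == 1]}")
print("total cost:", value(prob.objective))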
  • asked a question related to Operational Research
Question
13 answers
As far as I know, most mixed-integer programming problems with block-diagonal structure are suitable for Dantzig-Wolfe decomposition and column generation. Analogously, which kinds of problems or which structures are preferably solved with Benders decomposition? Is Benders decomposition also feasible for general MIP problems, and which one gives better performance for a general MIP, Benders decomposition or Dantzig-Wolfe decomposition?
Relevant answer
Answer
Generally, Benders decomposition is suitable for problems with complicating variables, i.e. if you fix some variables the problem becomes simpler to solve. DW is suitable for problems with complicating constraints. In LP, the dual of a problem with complicating variables will be a problem with complicating constraints.
For MINLP problems, using BD, the problem is converted into a set of IP and NLP problems which are solved iteratively.
  • asked a question related to Operational Research
Question
6 answers
Hi, could anyone tell me which Operational Research (OR) method is best for agricultural planning for the optimization of crop production, and why? I would be very thankful if anyone could give me relevant literature evidence and some other useful links. Thanks in advance.
Relevant answer
Answer
Hi, 
Initially, you need a model or meta-model of your project in order to use the mathematics of OR. To get an exact model, I advise you to use the tools of experimental design. In agriculture it can be a little difficult to get data in a short time. However, experimental design tools and statistical methods such as factorial design or Taguchi will give you fewer combinations of factor levels to test in your experiments while still reaching a closed-form formulation of your problem. Once you have the model, the choice between linear, nonlinear or heuristic solution methods depends on the order of your regression model. I recommend Design and Analysis of Experiments by Montgomery.
  • asked a question related to Operational Research
Question
7 answers
Given a homogeneous system of linear equations AX = 0 (X ≥ 0), how can one obtain the set of nonnegative basic solutions?
Relevant answer
Answer
So what you have is a convex polyhedral cone. For those you need of course the extreme rays - these are the ones you are looking for. Here is a classic paper on the subject:
But I should think that you find descriptions on this topic in any linear programming book.
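For small systems the extreme rays can even be enumerated by brute force straight from their characterisation (the columns of A on the ray's support must have a one-dimensional nullspace with a strictly one-signed generator). A sketch, exponential in n and meant only as an illustration:

import itertools
import numpy as np
from scipy.linalg import null_space

def extreme_rays(A, tol=1e-9):
    """Enumerate extreme rays of {x : Ax = 0, x >= 0} by support enumeration."""
    m, n = A.shape
    rays = []
    for size in range(1, n + 1):
        for support in itertools.combinations(range(n), size):
            N = null_space(A[:, support])
            if N.shape[1] != 1:                      # need a 1-dimensional nullspace
                continue
            v = N[:, 0]
            if np.all(v > tol) or np.all(v < -tol):  # strictly one-signed generator
                ray = np.zeros(n)
                ray[list(support)] = np.abs(v)
                rays.append(ray / ray.max())         # scale for readability
    return rays

A = np.array([[1.0, 1.0, -1.0, 0.0],
              [0.0, 1.0, 1.0, -2.0]])
for r in extreme_rays(A):
    print(np.round(r, 4))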
  • asked a question related to Operational Research
Question
5 answers
Can anyone provide recommendations or articles relating to organizational structure w.r.t managed services / outsourcing within the telecommunication sector.
I really appreciate any assistance offered.
Relevant answer
Answer
Hi Elisha,
Positive impacts can include reduction of human errors & risk, improved efficiency & productivity, improved SLA & customer satisfaction as well as cost reduction. These positive impacts can be achieved when IT managed services / outsourcing service providers adopt the following organizational structures:
1) Organizational Structure based on Full Time Equivalent (FTE) – instead of based on headcount which is more costly, some service providers design their structure based on FTE / effort model in which IT staff are delivering their services remotely / off-shore from delivery centers all over the world. This virtual structure enables concentration of IT staff in one center which can develop economy of scale that can reduce the operation costs.
2) Organizational Structure based on Specialization – e.g. even though an IT job like server administration can be handled by an IT staff, some service providers further break down this role into multiple sub-tasks e.g. eyeballing server event, responding to event when first alerted, troubleshooting level 1, 2, 3 etc, patch management, break-fix support, vendor management etc so that each sub-task is specially handled by a pool of IT staff that only focusing on their sub-task day-in & day-out for different customers. This structure enables the team to be very specialized and increases their productivity. However, the team members feel boring easily and once they leave, replacement can be found easily. This structure can reduce operation cost but job dissatisfaction and attrition can be high.
3) Organizational Structure based on Advanced Technologies – some service providers invest in technologies like IT process automation, orchestrated automation, cloud computing etc in which organizational structure generally is flatter and consists of lesser IT staff. However, these IT staff are very skilled in using these technologies to deliver their work in which human errors are reduced, system exposure & risk are decreased, job efficiency & productivity increased, improved SLA, cost reduction as well as customer satisfaction.
4) Organizational Structure based on Project Team – this longer lasting structure normally consists of skilled IT staff hand-picked to join as team members to serve a dedicated outsourcing customer until contract ends / renewal. This structure usually consist of IT staff that can demonstrate higher team work, team cohesion, team effectiveness and sometimes increase customer satisfaction.
Regards,
Fung
  • asked a question related to Operational Research
Question
1 answer
-
Relevant answer
Answer
I think the ProModel software could work.
  • asked a question related to Operational Research
Question
4 answers
As already stated in the robust optimization literature, there are some cases when the adaptive (or adjustable) robust problem and the static robust problem of uncertain adjustable linear programming problem are equivalent. These include the cases where the uncertainty affecting every one of the constraints is independent of the uncertainty affecting all other constraints.
Does this statement hold for an MIP problem where the uncertainty only applies to upper bounds of continuous variables, and the uncertainty of each constraint is independent of the others? If so, could you provide me with a reference that supports this?
Relevant answer
Answer
Unfortunately this paper is a mathematical one ! I can't understand what Nemirovski says! I have tried many of his papers but ...
I am looking for a very simple example to understand the concept. 
  • asked a question related to Operational Research
Question
28 answers
Hello,
I need to optimize a model using Subgradient Lagrangian Relaxation (LR) method.
Assume the general form as below:
.
Min f(x,y)
s.t.
Ax+By <= E1 (inequality constraint)
Cx+Dy = E2 (equality constraint)
.
Clearly, its LR counterpart will be as follows
Min f(x,y) + lambda * (Ax + By - E1) + u * (Cx + Dy - E2)
.
If there were only one constraint, I would know how to solve it (i.e. updating only one Lagrange multiplier, only lambda or only u), but my model has two complicating constraints and as a result I do not know how to update both lambda and u simultaneously.
I would be grateful if you could help me with an algorithm that updates both lambda and u simultaneously (and not only one of them).
.
Thanks,
.
Morteza
Relevant answer
Answer
Dear Morteza, 
You can find some details on Lagrangian method with a numerical example here (dualizing two constraints): 
I hope it will be helpful ! 
Regards.
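As a complement to the reference, a generic sketch of the update step itself: both multipliers are updated in every iteration from the violations of their own constraints, and only the inequality multiplier is projected back onto the nonnegative orthant (the function arguments and the diminishing step rule are placeholders/assumptions):

import numpy as np

def subgradient_lr(solve_relaxed, g, h, n_lam, n_u, iters=100, step0=1.0):
    """solve_relaxed(lam, u) returns (x, dual value) of min_x f(x) + lam.g(x) + u.h(x);
    g(x) gives the inequality residuals (Ax + By - E1), h(x) the equality residuals."""
    lam = np.zeros(n_lam)                 # multipliers of the inequality block, kept >= 0
    u = np.zeros(n_u)                     # multipliers of the equality block, free sign
    best_bound = -np.inf
    for k in range(1, iters + 1):
        x, dual_value = solve_relaxed(lam, u)
        best_bound = max(best_bound, dual_value)       # best lower bound so far
        step = step0 / k                                # one common diminishing step rule
        lam = np.maximum(0.0, lam + step * np.asarray(g(x)))   # projected update
        u = u + step * np.asarray(h(x))                        # unprojected update
    return lam, u, best_bound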
  • asked a question related to Operational Research
Question
4 answers
Hello scholars, I am interested in Economic Lot Scheduling Problems where products are produced in a cyclic pattern. There are some existing modelling approaches, among which the extended basic period approach is more effective than the others. Could someone help me understand the concept of the basic period (which is assumed to be equal for all products) and the relationship/difference between the basic period and the cycle time of each product?
Relevant answer
Answer
The basic period (BP) is a period of time of sufficient long duration to accommodate the production of all economic production lots specified for all products. It is a concept advanced by Bomberger and has the advantage of permitting a dynamic programming model to be constructed to solve the problem of "how much to produce" of each item (the "lot size"), and "how many times" should it be produced in the BP (the individual cycle times). It is assumed that the BP repeats forever (infinite planning horizon).
To gain a deeper insight into the concept, its historical development and its utility (and drawbacks), I suggest reading the papers by Elmaghraby and the references cited therein:
 "An Extended Basic Period Approach to the Economic Lot Scheduling Problem (ELSP)," Proceedings, the 4th International Conference on Production Research, Tokyo, Japan, August 27-30, (1977); Proceedings of 4th ICPR.
"The Economic Lot Scheduling Problem (ELSP): Review and Extensions," (1978), Management Sci. 24(6): 587-598.
  • asked a question related to Operational Research
Question
28 answers
This means converting a problem with multiple objectives into a single-objective optimization problem, e.g. by using a weighted-sum objective function.
Relevant answer
Answer
There are several reasons why scalarization is often used for solving multi-objective problems, and whether it is useful or not depends on your application and the structure of your multiobjective problem.
  1. for convex multiobjective problems there is a theorem that states that they can be (essentially) solved by linear scalarization.
  2. there is a large class of optimization methods - the descent methods - which can efficiently solve single objective problems, and scalarization converts a multiobjective to a (series of) single objective problems.
  3. sometimes in applications you can express the various objectives in a single unit (e.g. costs) by weighting them properly, and making them comparable in that way. Then the multiobjective problem can by scalarization be solved as  a single objective problem.
  4. If scalarization is performed carefully, the solution of the scalarized problem will be a Pareto optimal point
  5. Nonlinear scalarization can be used to incorporate a priori preference information into a problem
  6. No preference multiobjective problems are usually solved by norm-scalarization (e.g. global criterion solving)
We also need to discuss what solving a multiobjective problem actually means. This also depends on your application. First, it is clear that a point that is dominated by another point (i.e. every objectives at the second point is smaller than the corresponding objective of the first point) cannot be a solution of the multiobjective problem. The remaining points form the Pareto frontier. That is the reason why many multiobjective algorithms try to identify the Pareto frontier, and it is a good reason to call a point only then a solution if it is Pareto optimal.
However, for an application it is not necessarily important to find the whole Pareto frontier. Often, it is enough to find the one point on the Pareto frontier that "is preferred". Usually, it is difficult to find a proper mathematical model for that preference. However, you can try to use a priori preference information (i.e. model the preferences), you can choose a posteriori (this is when you need to find (most of) the Pareto frontier), and you can choose interactively (has been mentioned already by Michael Patriksson). Many methods to find one (or a few) of the preferred Pareto optimal points use scalarization to do so.
One more thing: I would be very careful to say that the "right alternative" is to use evolutionary algorithms to solve the problem. Evolutionary algorithms lack convergence theory, they often work not well unless the number of function evaluations is really high, and at no point in time while the solution algorithm works you have the faintest idea whether you have all of the Pareto frontier, just parts of it, or even a single Pareto optimal point. They usually do not even work fast in the single objective case (see various comparisons, e.g. by Rios and Sahinidis). If the evolutionary algorithm produces some "Pareto set" there is not even a guarantee that this set contains a single Pareto optimal point. Just in the black-box case when absolutely nothing is known about the objective functions except for a computationally cheap method to compute function values the evolutionary algorithms can usually expected to provide good solutions.
If gradients of the functions can be computed, descent like methods can be used, and optimality conditions can be checked to rule out non-Pareto-optimal points, etc. Usually, if something is know about the structure of the objective functions, algorithms can be used that make use of more mathematical theory, and those algorithms usually come up with better solution in less time.
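A sketch of points 1 and 2 above, sweeping the weight in a linear scalarisation of a toy convex bi-objective problem (the two objectives are made up for illustration); each weight yields one point of the Pareto frontier:

import numpy as np
from scipy.optimize import minimize

f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2          # toy objective 1
f2 = lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2          # toy objective 2

pareto_points = []
for w in np.linspace(0.0, 1.0, 11):
    res = minimize(lambda x: w * f1(x) + (1 - w) * f2(x), x0=[0.0, 0.0])
    pareto_points.append((round(float(f1(res.x)), 3), round(float(f2(res.x)), 3)))
print(pareto_points)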
  • asked a question related to Operational Research
Question
9 answers
Some optimization problems are well known, but the instances or the benchmarks used are not available. Generally what can we do in this situation?
Relevant answer
Answer
Dear Dahi Abd Zakaria Abd El Moiz,
please have a look at the discussion "How to evaluate the quality of metaheuristic algorithms without any benchmarks?" on the link https://www.researchgate.net/post/How_to_evaluate_the_quality_of_metaheuristic_algorithms_without_any_benchmarks
and in particular on the link  http://www.fedulov.ge  as an application.
  • asked a question related to Operational Research
Question
11 answers
I'm using an evolutionary algorithm with powerful exploration to solve some optimization problems, and I want to add some local search to the EA for the sake of better exploitation. I'm looking for a source on different methods of local search.
Relevant answer
Answer
In addition to those already suggested, I would recommend the classic:
Handbook of Metaheuristics
[Note there are two editions, the first (2003) by Glover and Kochenberger, the second (2010) by Gendreau and Potvin.]
Since you're looking to hybridise local search with a GA, also check out previous research in this area, nowadays known as 'memetic algorithms', e.g.:
Handbook of Memetic Algorithms
Note there's a chapter on local search in the latter book:
  • asked a question related to Operational Research
Question
2 answers
Imagine you have a problem like:
min z=Cx
subject to Ax=b
If you just replace b by b2, I wonder how to find the common feasible region of the two problems (Ax = b and Ax = b2), if there is one.
Of course, for a two-variable problem you can draw it, but for more than three variables that is not really possible.
Relevant answer
Answer
First, I assume that you mean a standard LP problem which would include the constraints x >= 0 (non-negativity constraints) as well. If you want to describe the intersection of the sets defined by the constraints
Ax = b
x >= 0
and
Ax = b2
x >= 0
you just need to write
Ax = b
Ax = b2
x >= 0
On the other hand, notice that if you really do mean that the matrix A is duplicated and b is not equal to b2, then a feasible solution x with respect to the first set of constraints would imply that Ax = b, and the second set of constraints can then only be satisfied if in fact b2 = b. So this exercise does not make much sense.
It could make a little more sense if A actually also includes an m x m identity matrix, indicating that the system of equations stems from a set of m linear inequalities. If you then have two different sets of constraints of the form
B x <= b1
B x <= b2
x >= 0,
for two vectors b1 and b2, then this polyhedron could be non-empty. You of course would replace these constraints by
B x <= b, where b element-wise is the min of b1 and b2.
Where does this idea come from?
... and sorry if I actually misunderstood something! :-)
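To illustrate the last point, here is a minimal sketch (made-up B, b1 and b2; scipy used only as a convenient LP solver) that combines the two right-hand sides with an element-wise minimum and then checks whether the resulting polyhedron is non-empty by solving a feasibility LP with a zero objective.

```python
# Minimal sketch: checking whether {x >= 0 : B x <= min(b1, b2)} is non-empty.
# B, b1, b2 are made-up data for illustration only.
import numpy as np
from scipy.optimize import linprog

B = np.array([[1.0, 2.0],
              [3.0, 1.0]])
b1 = np.array([10.0, 15.0])
b2 = np.array([8.0, 20.0])

b = np.minimum(b1, b2)          # element-wise min combines the two right-hand sides

res = linprog(c=np.zeros(B.shape[1]),   # zero objective: we only test feasibility
              A_ub=B, b_ub=b,
              bounds=[(0, None)] * B.shape[1],
              method="highs")

print("intersection non-empty" if res.status == 0 else "intersection empty")
```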
  • asked a question related to Operational Research
Question
6 answers
Operations management tools
Relevant answer
Answer
Definition of Total Quality Management: the strategy, culture and management style of a company in which everyone studies, practises, participates in and promotes continuous quality improvement. This approach has evolved in recent years and today tends to extend to subcontractors, suppliers, distributors, etc. Total quality management is therefore based on the participation of all members of the organization and influences all activities taking place in the organization, in order to ensure effective management, the competitiveness of the organization, and excellence.
TQM helps to improve operational processes through:
-Effective decision making.
-Corrective and preventive actions.
-Orienting the company towards the customer and its goals.
-Optimal delivery times and service.
-A wider and more comprehensive view of the organization and its internal relationships.
-Reduced operating costs.
  • asked a question related to Operational Research
Question
1 answer
I have already carried out a Monte Carlo simulation for analyzing a PERT or CPM network with one variable, either time or cost. I now want to extend this to a Monte Carlo simulation that analyzes a PERT or CPM network with both time and cost variables.
Relevant answer
Answer
What is the relationship between time and cost in your problem setting? If costs depend on the project duration, e.g. tardiness costs for late project completion, they are obtained as a transformation of the (simulated) distribution of the project duration. If, on the other hand, there are time-cost trade-offs, a decision problem arises, for which the following paper might help: http://www.sciencedirect.com/science/article/pii/S0377221704002401
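For the first case, here is a minimal sketch under my own assumptions (a toy activity-on-node network, triangular durations, uniform direct costs and an assumed lateness penalty) of a Monte Carlo run that samples both activity durations and costs in each replication, so that the joint time-cost behaviour of the project can be examined.

```python
# Minimal sketch of a Monte Carlo simulation sampling both durations and costs.
# The network, the triangular duration parameters and the cost model are assumptions.
import random

# Activity -> (predecessors, (min, mode, max) duration, (min, max) direct cost)
activities = {
    "A": ([],         (2, 4, 7), (100, 150)),
    "B": (["A"],      (3, 5, 9), (200, 260)),
    "C": (["A"],      (1, 2, 4), (80, 120)),
    "D": (["B", "C"], (2, 3, 5), (150, 210)),
}

def simulate_once():
    finish, total_cost = {}, 0.0
    for name, (preds, (lo, mode, hi), (c_lo, c_hi)) in activities.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(lo, hi, mode)
        total_cost += random.uniform(c_lo, c_hi)
    makespan = max(finish.values())
    # Example of a duration-dependent cost: a penalty if the project finishes after day 14.
    total_cost += 50.0 * max(0.0, makespan - 14.0)
    return makespan, total_cost

samples = [simulate_once() for _ in range(10000)]
mean_time = sum(t for t, _ in samples) / len(samples)
mean_cost = sum(c for _, c in samples) / len(samples)
print(f"mean duration {mean_time:.2f}, mean cost {mean_cost:.2f}")
```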
  • asked a question related to Operational Research
Question
3 answers
I want to model a classical project scheduling problem with a Cmax (makespan) objective function. For example, the first and the second activity can both start at the beginning of the project. I want to constrain the model so that the 1st and 2nd activities never run at the same time, but I don't know which one is scheduled earlier. Moreover, in my model only these two activities use the same resource. This resource is bounded, hence the model is a special case of the RCPSP. How can I handle the resource constraint in my model?
Relevant answer
Answer
Just for the record, this is exactly the type of problem that the classical project scheduling model is designed to solve (for a very thorough coverage, see Herroelen and Demeulemeester 2002, Project Scheduling: A Research Handbook). Resource constraints are at the heart of the issue. In your example, you would define a resource with one unit of capacity that is required by both activities at a level of one. (In general, there may be many other resources with different capacities that apply to other subsets of activities, each of which requires different levels of capacity, such that the total capacity utilized at any time cannot exceed the resource capacity. Some activities may require more than one resource.) As such, it can indeed be formulated as a mixed integer program, but only very small problems can be solved that way. A branch and bound approach is better, but real-life projects are typically too large to yield to B&B either. If you decide to resort to heuristics, there are many competing approaches, usually based on a search tree that is similar to the search tree of a B&B but using priority rules to curtail the branching.
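For the two-activity case specifically, the unit-capacity resource can also be written directly as a disjunction with one sequencing binary. Here is a minimal sketch (made-up durations and horizon; PuLP is used only as an illustrative modelling tool) in which the binary y decides which activity goes first without fixing it in advance.

```python
# Minimal sketch (assumed data): two activities sharing a unit-capacity resource.
# A sequencing binary y and big-M constraints ensure they never overlap.
import pulp

d1, d2, M = 3, 4, 100          # durations and a big-M (planning horizon) -- made-up values

prob = pulp.LpProblem("two_activity_rcpsp", pulp.LpMinimize)
s1 = pulp.LpVariable("s1", lowBound=0)          # start time of activity 1
s2 = pulp.LpVariable("s2", lowBound=0)          # start time of activity 2
cmax = pulp.LpVariable("Cmax", lowBound=0)      # makespan
y = pulp.LpVariable("y", cat="Binary")          # 1 if activity 1 precedes activity 2

prob += cmax                                    # objective: minimize the makespan
prob += cmax >= s1 + d1
prob += cmax >= s2 + d2
prob += s1 + d1 <= s2 + M * (1 - y)             # if y = 1: activity 1 finishes before 2 starts
prob += s2 + d2 <= s1 + M * y                   # if y = 0: activity 2 finishes before 1 starts

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(cmax), pulp.value(s1), pulp.value(s2), pulp.value(y))
```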
  • asked a question related to Operational Research
Question
1 answer
Our paper is about dealing with (reducing/exploiting) degeneracy in column generation. Linear relaxations of combinatorial optimization problems are known to suffer from this defect; I would be interested in learning about your experiences.
Relevant answer
Answer
I encountered a lot of degeneracy when I used column generation on circuit partitioning problems some years ago. I've heard that this is very common but I don't know a good reference to cite.
I think that the combinatorial nature of problems that require interlocking pieces to fit together is a major cause of degeneracy. For example, suppose you need to make deliveries to locations 1..8. You have two columns, one for a vehicle that delivers to 1234 and the other for a vehicle that delivers to 5678. A better (lower-cost) solution has one vehicle delivering 1357 and the other 2468. The column for 1357 has a good reduced cost, so you bring it into your column set. However, when you solve the LP, the variable for that column equals 0 because it is useless without 2468. That is, the only feasible solution that delivers exactly once to each of 1, 2, ..., 8 using columns 1234, 5678 and 1357 puts weight 1 on the first two columns and 0 on the third. So when you do column generation, your new columns get weight zero until there are enough of them to work together to move you to a different solution.
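Here is a minimal sketch of that example as a set-partitioning LP (my own toy costs; scipy is used only as an LP solver): even though the new column 1357 is cheaper, the unique feasible solution still assigns it weight zero.

```python
# Minimal sketch of the degeneracy effect described above: with columns
# {1,2,3,4}, {5,6,7,8} and the newly generated {1,3,5,7}, the only feasible
# solution still puts weight 0 on the attractive new column.
import numpy as np
from scipy.optimize import linprog

# Rows = delivery locations 1..8, columns = candidate routes.
A = np.array([
    #  1234  5678  1357
    [1, 0, 1],   # location 1
    [1, 0, 0],   # location 2
    [1, 0, 1],   # location 3
    [1, 0, 0],   # location 4
    [0, 1, 1],   # location 5
    [0, 1, 0],   # location 6
    [0, 1, 1],   # location 7
    [0, 1, 0],   # location 8
])
b = np.ones(8)
cost = np.array([10.0, 10.0, 6.0])   # the new column is attractive (lower cost)

res = linprog(cost, A_eq=A, b_eq=b, bounds=[(0, None)] * 3, method="highs")
print(res.x)   # -> [1. 1. 0.]: the cheap new column still gets weight zero
```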
  • asked a question related to Operational Research
Question
1 answer
Is there any place to obtain detailed ideas about Meta-RaPS?
Relevant answer
Answer
Meta-RaPS stands for meta-heuristic for randomized priority search. These are generic algorithms that use a high-level strategy to combine greedy procedures.
Some concepts can be found in these papers:
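For a concrete flavour of the randomized-priority idea, here is a minimal sketch under my own assumptions (a toy knapsack instance, a value-to-weight priority rule, and %priority / %restriction style parameters); it is not a faithful reproduction of any particular paper, just the general pattern of randomizing a greedy construction.

```python
# Minimal sketch of a Meta-RaPS-like randomized greedy construction.
# The knapsack instance, the priority rule and the parameters are assumptions.
import random

def construct(values, weights, capacity, p_priority=0.6, p_restriction=0.2):
    chosen, load = [], 0
    while True:
        candidates = [i for i in range(len(values))
                      if i not in chosen and load + weights[i] <= capacity]
        if not candidates:
            break
        # Priority rule: value-to-weight ratio (the greedy criterion).
        candidates.sort(key=lambda i: values[i] / weights[i], reverse=True)
        if random.random() < p_priority:
            pick = candidates[0]                       # take the best candidate
        else:
            best = values[candidates[0]] / weights[candidates[0]]
            # Otherwise pick randomly among candidates close enough to the best.
            rcl = [i for i in candidates
                   if values[i] / weights[i] >= (1 - p_restriction) * best]
            pick = random.choice(rcl)
        chosen.append(pick)
        load += weights[pick]
    return chosen

random.seed(1)
values = [10, 7, 12, 4, 9]
weights = [4, 3, 6, 2, 5]
best = max((construct(values, weights, capacity=10) for _ in range(200)),
           key=lambda sol: sum(values[i] for i in sol))
print(best, sum(values[i] for i in best))
```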
  • asked a question related to Operational Research
Question
8 answers
I am new to scheduling and got really interested in meta-heuristics. I have read through papers, and every paper explains how a genetic algorithm works, but how do I actually carry it out with a set of data?
Relevant answer
Answer
You need to devise a procedure that can create a collection of random initial solutions, a "reproduction" procedure that takes a pair of "parent" solutions and creates a "child" solution from them, and a "mutation" procedure that takes a child solution and modifies it slightly. The details of these procedures will depend on the specific scheduling problem that you are trying to solve.
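As an illustration of how those pieces fit together on actual data, here is a minimal sketch under my own assumptions (a toy single-machine total-tardiness instance with five jobs; the operators are a simple order crossover and a swap mutation) of a permutation-encoded genetic algorithm.

```python
# Minimal sketch of a GA for a toy scheduling instance: random initial population,
# order crossover ("reproduction") and swap mutation. All data are made up.
import random

proc = [4, 2, 6, 3, 5]        # processing times (the "set of data")
due = [6, 4, 15, 8, 12]       # due dates

def tardiness(order):
    t, total = 0, 0
    for job in order:
        t += proc[job]
        total += max(0, t - due[job])
    return total

def crossover(p1, p2):
    # Order crossover: copy a slice from parent 1, fill the rest in parent 2's order.
    i, j = sorted(random.sample(range(len(p1) + 1), 2))
    middle = p1[i:j]
    rest = [g for g in p2 if g not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(perm, rate=0.2):
    perm = perm[:]
    if random.random() < rate:
        a, b = random.sample(range(len(perm)), 2)
        perm[a], perm[b] = perm[b], perm[a]
    return perm

def genetic_algorithm(pop_size=20, generations=100):
    pop = [random.sample(range(len(proc)), len(proc)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=tardiness)
        survivors = pop[:pop_size // 2]            # simple truncation selection
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=tardiness)

random.seed(0)
best = genetic_algorithm()
print(best, tardiness(best))
```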
  • asked a question related to Operational Research
Question
9 answers
The constraint:
x1 < 2 or x1 > 8
The value x1 = 5 is not valid.
Relevant answer
Answer
You can't have strict inequalities ("<" and ">") in mathematical programming. Perhaps you meant (x1 <= 2) or (x1 >= 8). This can be modelled provided three conditions hold:
(i) You are allowed to include an additional binary variable.
(ii) A lower bound L on the value of x1 is available.
(iii) An upper bound U on the value of x1 is available.
Under these conditions, it suffices to let y1 be a binary variable, taking the value 1 if and only if x1 >= 8, and add the constraints x1 <= 2 + (U-2) y1 and x1 >= L + (8-L) y1.
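Here is a minimal sketch of exactly that construction (the bounds L = 0 and U = 20 and the objective are made-up values, and PuLP is used only as an illustrative modelling tool): y1 = 0 forces x1 <= 2, y1 = 1 forces x1 >= 8, so x1 = 5 is excluded.

```python
# Minimal sketch of the either-or construction described above.
# L, U and the objective are illustrative assumptions.
import pulp

L, U = 0, 20                                   # assumed lower/upper bounds on x1

prob = pulp.LpProblem("either_or", pulp.LpMaximize)
x1 = pulp.LpVariable("x1", lowBound=L, upBound=U)
y1 = pulp.LpVariable("y1", cat="Binary")

prob += x1                                     # made-up objective, just to get a solution
prob += x1 <= 2 + (U - 2) * y1                 # y1 = 0  =>  x1 <= 2
prob += x1 >= L + (8 - L) * y1                 # y1 = 1  =>  x1 >= 8

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(x1), pulp.value(y1))          # e.g. x1 = 20 with y1 = 1; x1 = 5 is infeasible
```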
  • asked a question related to Operational Research
Question
11 answers
There was recently a post in SIMSOC (Newsgroup discussing the use of Computer Simulation in the Social Sciences) asking for examples of agent-based models that have actually been used to support policy decision making or for other purposes "in the real world". The poster (Nigel Gilbert) was looking for examples which have been used for practical purposes rather than those that have been designed to be applied but have never in fact been used.
The post and a summary of responses can be found here:
Are there any such projects related to OR that you have worked on or know of? Mine are unfortunately all examples that have been designed to be applied but have never in fact been used. Is this a general tendency?
Relevant answer
Answer
I actually think there are many cases in which research projects led to actual decisions... I only provide a few examples in an area I know well (pedestrian simulation).
One of these cases, about the management of a public event generating potentially problematic crowding conditions, is described in a paper by Michael Batty (Michael Batty, Jake Desyllas, Elspeth Duxbury: The discrete dynamics of small-scale spatial events: agent-based models of mobility in carnivals and street parades. International Journal of Geographical Information Science 17(7): 673-697 (2003) - http://dx.doi.org/10.1080/1365881031000135474).
Fewer cases led to the birth of spin-off companies that turned out to be successful in the market (a nice example is represented by TraffGo GmbH - http://www.traffgo-ht.com/en/company/index.html).
Sometimes models developed in the academia have been later encapsulated in the engine of commercial products (e.g. the Social Force model of the above mentioned Prof. Dirk Helbing in the VisWalk pedestrian simulator - http://vision-traffic.ptvgroup.com/fileadmin/files_ptvvision/Downloads_N/0_General/2_Products/3_PTV_Viswalk/EN-UK_PTV_Viswalk_Brochure.pdf).
Similar examples can be surely found in the traffic area, and I'm pretty sure in other fields of application.
Sincerely
Giuseppe
  • asked a question related to Operational Research
Question
17 answers
When I want to obtain the importance of every criterion and alternative, how can I prepare a questionnaire for this purpose? Is there any source that could be helpful?
Relevant answer
Answer
Here is another file. Better than a link.
Cheers
  • asked a question related to Operational Research
Question
18 answers
MCDM
Relevant answer
Answer
Dear Iman, there are two software packages that can help you with your calculations:
Super Decisions (for ANP & AHP)
Expert Choice (for AHP)
But the best alternative is MS Excel; you just have to learn how to use its functions and formulas to do the calculations.
Good luck!
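If you do go the spreadsheet (or scripting) route, the core AHP calculation is small. Here is a minimal sketch with an illustrative 3x3 pairwise comparison matrix of my own: the priority weights are the normalized principal eigenvector of the matrix, and 0.58 is Saaty's random index for n = 3 used in the consistency ratio.

```python
# Minimal sketch of the AHP priority-weight calculation (illustrative matrix).
import numpy as np

# A[i, j] = how strongly criterion i is preferred over criterion j (Saaty scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalize so the weights sum to 1

# Consistency ratio: (lambda_max - n) / (n - 1) divided by the random index for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58                                 # 0.58 = Saaty's random index for n = 3
print(weights, cr)
```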
  • asked a question related to Operational Research
Question
6 answers
What is the best software to visualize or solve linear programming and other operations research problems?
Relevant answer
Answer
If you are familiar with SAS statistical software, it has a package (SAS/OR) that facilitates linear programming, non-linear programming, mixed-integer programming, quadratic programming, etc. It can even develop Gantt charts and incorporate Microsoft Project 98 (and later) files into SAS data sets that can be processed. SAS is expensive, but in my experience there is no statistical or mathematical problem that cannot be addressed by SAS. I have included the *.pdf file for the latest SAS/OR manual (1276 pages) so that you can view it and determine its usefulness for you.
  • asked a question related to Operational Research
Question
1 answer
We have conducted many training programs. However, we are still confused about the outcome measurement procedure; how should we do it?
Relevant answer
Answer
1. Check the objectives of the training and see whether the participants' knowledge has increased.
2. Do alumni interviews and check what they are doing afterwards and whether they are performing well. N.B.: you cannot rule out confounding effects.
Belaynew