
# Operational Research - Science topic

Explore the latest questions and answers in Operational Research, and find Operational Research experts.

Questions related to Operational Research

I'm looking for a method to find the best path among several alternative paths. I know Dijkstra's algorithm for the shortest path, but my problem is more like finding the longest path, and the paths form a tree; I don't know which method applies. For example: before becoming a Financial Manager, a candidate must have been an Assistant Financial Manager (A) or an Assistant Budget Manager (B). Suppose the Assistant Financial Manager has an optimal performance score of 4 points and the Assistant Budget Manager a potential score of 3 points. Before becoming an Assistant Financial Manager, a candidate must have been a Finance and Accounting Supervisor (C), and before becoming an Assistant Budget Manager, a General Affairs Supervisor (D); both C and D have potential scores of 3 points. How do we determine the best path that maximizes the total performance score? Picture the paths as Z - A(4) - C(3) versus Z - B(3) - D(3).
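While the longest-path problem is hard on general graphs, on a tree (or any DAG) the maximum-score path can be found with a simple recursive dynamic program. A minimal sketch using the hypothetical roles and scores from the example above:

```python
# Hypothetical career tree from the example above: node -> (score, predecessors),
# with the target role Z read as the root. Scores follow the question.
tree = {
    "Z": (0, ["A", "B"]),   # Financial Manager (target)
    "A": (4, ["C"]),        # Assistant Financial Manager, score 4
    "B": (3, ["D"]),        # Assistant Budget Manager, score 3
    "C": (3, []),           # Finance and Accounting Supervisor, score 3
    "D": (3, []),           # General Affairs Supervisor, score 3
}

def best_path(node):
    """Maximum-score root-to-leaf path by recursive dynamic programming."""
    score, children = tree[node]
    if not children:
        return score, [node]
    child_score, child_path = max(best_path(c) for c in children)
    return score + child_score, [node] + child_path

print(best_path("Z"))  # (7, ['Z', 'A', 'C']): the Z - A(4) - C(3) path wins
```

Each node is visited once, so this scales to much larger role trees.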

Preliminary results of the EURO WISDOM survey to understand career paths in Operational Research (OR) by gender suggest OR is carried out in more Engineering departments than in Schools of Business.

Faculty survey respondents to date are affiliated to:

Engineering 34%

Business 30%

Computer Science 12%

Mathematics 10%

Research Centre/Other 15%

Is that true for you?

We invite anyone working in OR (male/female/academic/practitioner) to complete the **10-minute anonymous** survey by Sunday 25th July 2021. The survey is here: **https://ucdbusiness.eu.qualtrics.com/jfe/form/SV_3UfzP74RaNRhbxQ**

The EURO WISDOM Forum provides a platform to support, empower and encourage the participation of all genders in Operational Research within EURO. You'll find more details about the EURO WISDOM Forum here: https://www.euro-online.org/web/pages/1654/wisdom

Can anyone recommend a scientific journal for submitting a paper on operations research applying linear programming and vehicle routing (VRP) using the branch-and-bound (B&B) algorithm?

I use queuing theory to determine premium pricing through aggregate loss, but I haven't found suitable data.

We have 1000 items, each with a weight Wk, defined by:

Wk = max[(1 − Xm)·Wm + Pk]

where:

- Xk is a binary variable indicating whether item k is selected,
- m ranges over the items related to item k (Xm their selection variables, Wm their weights),
- Pk is the base weight of item k.

Mathematical programming is a powerful optimization tool backed by many years of strong theory. It has been demonstrated to solve complex optimization problems on the scale of a million design variables efficiently, and the methods are reliable. Moreover, there are mathematical proofs for the existence of solutions and, in the convex case, for the globality of the optimum.

However, when there are discontinuities in the objective function, difficulties arise because the problem becomes non-differentiable. Methods such as subgradients have been proposed for such problems, yet I cannot find many state-of-the-art papers on engineering optimization of discontinuous problems using mathematical programming; engineers mostly use metaheuristics for such cases.

Can all problems with discontinuities be solved with mathematical programming? Is it easy to implement sub-gradients for large scale industrial problems? Do they work in non-convex problems?

A simple example of such a function is attached.
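For context, here is a minimal sketch of the subgradient method on a toy nondifferentiable convex function (the function and step-size rule are illustrative assumptions, not taken from the attachment):

```python
def f(x):
    # Illustrative nondifferentiable convex function with kinks at 0 and 3.
    return abs(x - 3) + 0.5 * abs(x)

def subgrad(x):
    sgn = lambda v: (v > 0) - (v < 0)   # a valid subgradient takes sign(0) = 0
    return sgn(x - 3) + 0.5 * sgn(x)

x, best = 10.0, float("inf")
for k in range(1, 2001):
    best = min(best, f(x))
    x -= (1.0 / k) * subgrad(x)         # diminishing steps: sum diverges,
                                        # sum of squares converges
print(round(best, 2))  # converges to the optimum f(3) = 1.5
```

The method is simple to implement at scale (each step only needs one subgradient), but unlike gradient descent it is not a descent method, so the best value seen so far must be tracked; convergence on non-convex problems is only to stationary points at best.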

Is there really a significant difference between the performance of different metaheuristics beyond some small ϵ? At the moment we have many different metaheuristics, and the set keeps expanding: every so often you hear about a new metaheuristic that outperforms the other methods, on a specific problem instance, by ϵ. Most of these algorithms share the same idea: randomness combined with memory, selection, or some similar mechanism to learn from previous steps. At MIC, CEC, and SIGEVO you see many repetitions of "new" metaheuristics. Does it make sense to stay stuck here? Now the same pattern is repeating with hyper-heuristics.

We have conducted an AHP (analytic hierarchy process) to understand experts' opinions of the relative importance of different aspects of natural resource management, specifically for fisheries governance.

From our data, it appears group coherency is low: our experts have very different opinions about which components of governance are important in the situation we proposed to them.

To confirm this, we want to compute a measure of group coherence as described by Zahir in two papers from 1999. However, this calculation is beyond my statistical abilities. Does anyone have a guide, or ideally code (Python or an R script), to calculate this statistic?

Zahir S (1999a). Clusters in a group: Decision making in the vector space formulation of the analytic hierarchy process. European Journal of Operational Research 112: 620-634.

Zahir S (1999b). Geometry of decision making and the vector space formulation of the analytic hierarchy process. European Journal of Operational Research 112: 373-396.
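I am not certain of the exact form of Zahir's statistic, but in the vector-space formulation priorities correspond to unit vectors (square roots of the normalized priorities), so a rough coherence proxy is the average pairwise dot product of the experts' unit vectors. This sketch, with hypothetical judgment matrices, should be checked against Zahir (1999a) before use:

```python
import numpy as np

def priority_vector(M):
    """Principal-eigenvector priorities of a pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(M)
    v = np.abs(vecs[:, np.argmax(vals.real)].real)
    return v / v.sum()

# Hypothetical judgments from three experts over three criteria; expert 2
# is deliberately opposed to experts 1 and 3.
experts = [
    np.array([[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]]),
    np.array([[1, 1/3, 1/5], [3, 1, 1/3], [5, 3, 1]]),
    np.array([[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1]]),
]

# Map priorities to unit vectors via square roots (they have unit norm
# because priorities sum to 1); average pairwise dot products as a rough
# coherence proxy: 1 = full agreement, lower = dispersed group.
units = [np.sqrt(priority_vector(M)) for M in experts]
coh = float(np.mean([units[i] @ units[j]
                     for i in range(3) for j in range(i + 1, 3)]))
print(round(coh, 3))  # well below 1 for this dispersed group
```
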

As we know, in this pandemic everything seems quite hard to control. I hope researchers can discuss and share their thoughts on doing research at this time, especially in relation to Industrial Engineering.

Are the so-called "novel" metaheuristic methods threatening to lead the area of metaheuristics away from scientific rigor?

Why do we need yet more metaheuristic methods when all the new methods solve the same problems in the same context, the unsolved problems remain unsolved, and there is no existing standard rule for judging the mathematical model and pseudo-code?

In recent years, the field of combinatorial optimization has witnessed a true tsunami of so-called "novel" metaheuristic methods, most of them based on a metaphor of some natural or man-made process. The behavior of virtually any species of insects, the flow of water, musicians playing together -- it seems that no idea is too far-fetched to serve as an inspiration to launch yet another metaheuristic.

Sources:

1- Del Ser, Javier, Eneko Osaba, Daniel Molina, Xin-She Yang, Sancho Salcedo-Sanz, David Camacho, Swagatam Das, Ponnuthurai N. Suganthan, Carlos A. Coello Coello, and Francisco Herrera. "Bio-inspired computation: Where we stand and what's next." Swarm and Evolutionary Computation 48 (2019): 220-250.

2- Sörensen, Kenneth. "Metaheuristics—the metaphor exposed." International Transactions in Operational Research 22, no. 1 (2015): 3-18.

Dear all,

I used those two types of regression to estimate the effects of environmental factors on hospital efficiency.

However, they gave me opposite results. For instance, in FR the percentage of people above 65 was negatively associated with the DEA score, whereas in DBTR it was positively associated. The same happened for the turnover rate.

I would appreciate it if you could explain this contradiction.

Dear Operational Research Society,

Although it may sound like a general question: when is it more appropriate to use mathematical modeling instead of simulation, and vice versa?

What is the key factor in making a decision here?

Thanks a lot

In India, a separate Ministry of AYUSH has been established for its own progress and development.

The AYUSH system comprises Ayurveda, Unani, Siddha and Homeopathy. My question: today we have modern allopathic literature (anatomy, physiology, pharmacology, medicine, etc.) as a science for understanding health. The AYUSH systems of medicine also follow science, but these systems may not always fit in line with modern medical science, so there may be a problem integrating AYUSH and allopathy at the internal-medicine level.

However, the WHO has recently published reports on "Traditional Medicine" and "Self-care health practices", indicating the importance of these interventions for achieving overall health, acceptance of health care at the ground level, and progress towards Universal Health Coverage.

I personally believe that for many OPD and IPD diseases, illnesses and symptoms there is scope for these alternative medical sciences, particularly given the rising burden of NCDs. These alternative medical sciences may be preventive, promotive and curative in nature.

I also believe that these sciences need more operational research at the community level, and hence in the area of public health. Today there is a need for documentation, research and scientific support for these types of sciences.

In view of the same: what best practices, national policies and public health interventions are currently available around the world that support and help the redevelopment of alternative medical sciences?

Please give their appropriate cases.

Can anyone kindly share resource material on Operational Research in Health Systems Strengthening projects?

I want to know how we should write values for alternatives when we have quantitative attributes (in AHP).

As you know, when all attributes are qualitative we use the Saaty scale (1-9)...

Example: we have price as a quantitative attribute:

alternative 1 = 1000 $

alternative 2 = 5000 $

alternative 3 = 50000 $

How should we fill pairwise comparison matrix (for price)?

|    | A1 | A2 | A3 |
|----|----|----|----|
| A1 | 1  |    |    |
| A2 |    | 1  |    |
| A3 |    |    | 1  |
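One common convention (an assumption, not the only valid one) is to fill the matrix with the ratios of the values themselves, inverted for a cost criterion such as price so the cheaper alternative is preferred, and capped at Saaty's 9. A sketch with the prices above:

```python
import numpy as np

prices = [1000, 5000, 50000]   # alternatives A1, A2, A3; price is a cost criterion

# One common convention (an assumption): fill the pairwise matrix with value
# ratios, inverted so cheaper alternatives are preferred, capped at Saaty's 9
# (reciprocals capped at 1/9) to stay on the 1-9 scale.
n = len(prices)
A = np.array([[min(prices[j] / prices[i], 9.0) if prices[j] >= prices[i]
               else max(prices[j] / prices[i], 1 / 9.0)
               for j in range(n)] for i in range(n)])

# Priorities via normalized geometric means of the rows:
gm = A.prod(axis=1) ** (1 / n)
w = gm / gm.sum()
print(np.round(w, 3))   # A1 (cheapest) gets the largest weight
```

Other conventions exist (e.g. mapping value differences onto the 1-9 scale), so the choice should be justified in your write-up.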

Production Engineering is an engineering area that deals with the problems of productive operations, with emphasis on the production of goods and services.

**Operational Research** (OR) is defined as the area responsible for solving real problems in decision-making situations using mathematical models. OR is an applied science focused on solving real problems, drawing on knowledge from other disciplines such as mathematics, statistics and computation to improve rationality in decision-making processes. How have we used OR in our research?

For the application of Industry 4.0, and hence making a machine self-aware, what optimization techniques could be used for a machining process? (Preferably, please explain a mathematical model or a case study.)

I am an undergraduate student in Production and Industrial Engineering looking for a research proposal to apply to a doctoral program. It would also be great if you could suggest some reading. Any suggestions?

Thank you for your time.

Is Shannon entropy a good technique for weighting in multi-criteria decision-making?

As you know, we use Shannon entropy for weighting criteria in multi-criteria decision-making.

I think it is not a good technique for weighting in the real world because:

- It uses only the decision-matrix data.

- If we add new alternatives, the weights change.

- If we change the period of time, the weights change.

For example, we have 3 criteria: price, speed, safety. Over different periods of time the criteria weights vary. If our period is one month, this month price may get 0.7 (speed = 0.2, safety = 0.1), while next month price may get 0.1 (speed = 0.6, safety = 0.3).

This is against reality! What is your opinion?
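The objection can be demonstrated directly. A sketch of the standard entropy-weighting computation, with a hypothetical decision matrix, showing that adding one alternative changes the weights:

```python
import numpy as np

def entropy_weights(X):
    """Shannon-entropy criterion weights from a decision matrix
    (rows = alternatives, columns = criteria, all values positive)."""
    P = X / X.sum(axis=0)                           # normalize each column
    E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
    d = 1 - E                                       # divergence of each criterion
    return d / d.sum()

X = np.array([[200.0, 8, 9],    # columns: price, speed, safety (hypothetical)
              [500.0, 9, 7],
              [300.0, 7, 8]])
w1 = entropy_weights(X)

# Adding a single new alternative changes the weights, exactly as objected:
# the weights depend only on the sample of alternatives in the matrix.
X2 = np.vstack([X, [2000.0, 8, 8]])
w2 = entropy_weights(X2)
print(np.round(w1, 3), np.round(w2, 3))
```

This sensitivity is inherent to the method: entropy weights measure how much the criteria discriminate among the alternatives at hand, not any external notion of importance.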

Hey,

I have a mixed-integer linear programming problem with complicating (coupling) variables as well as one complicating constraint. The complicating variables are binary, and the complicating constraint contains only binary variables. The image shows the structure.

It's about unit commitment with chance constraints.

How can I decompose this problem?

Should I use Benders decomposition and split the problem into one master problem with all binary variables and subproblems with only continuous variables?

How much does the size of a spatial area affect traffic generation within the area, and the resulting traffic volumes in the relevant city (public transport or individual transport)?

Is there any theoretical approach to describing this issue?

I'm looking for advice on a suitable technique for optimising subject to constraints. My initial problem started out as a basic simplex optimisation problem finding a weight vector $w$, but now I'm trying to add more special constraints. I've heard of techniques to transform absolute-value constraints into multiple linear constraints, but I don't know about sign functions. Here is my problem: given input values $x_{i}^{j}$ and $y_{i}^{j}$, I'm trying to find a weight vector $w_{i}^{j}$ to optimise:

\[ \sum^{m}_{j=1}\sum^{n}_{i=1}(w_{i}^{j}+1)y_{i}^{j} \]

subject to:

\[ \sum^{m}_{j=1}\left| \sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right| \leq H \]

and

\[ -1 \leq w_{i}^{j} \leq 1 \] for all $i$ and $j$

and

\[ \left|\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right|\leq \left|\sum^{n}_{i=1}x_{i}^{j}\right| \] for all $j$ and

and

\[ sign\left(\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right) \left(sign\left(\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}\right) + sign\left(\sum^{n}_{i=1}x_{i}^{j}\right) \right) = 0\] for all $j$

Any advice would be greatly appreciated. I don't really want to go down the genetic-algorithm route straight away; if there is a suitable numerical programming technique I'd prefer to work with that, and if this can be transformed into basic linear constraints so I can carry on using simplex optimisation, that would be great!
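For what it's worth, under one reading the whole problem can stay linear. If the $x_{i}^{j}$ are input data (not variables), write $a^{j}=\sum^{n}_{i=1}w_{i}^{j}x_{i}^{j}$ and $b^{j}=\sum^{n}_{i=1}x_{i}^{j}$, so each $b^{j}$ is a known constant. The absolute-value constraints then linearize with auxiliary variables $t^{j}\ge 0$:

\[ -t^{j} \leq a^{j} \leq t^{j}, \qquad \sum^{m}_{j=1} t^{j} \leq H, \qquad -\left|b^{j}\right| \leq a^{j} \leq \left|b^{j}\right| \]

and, when $b^{j}\neq 0$, the sign constraint says exactly that $a^{j}=0$ or $\mathrm{sign}(a^{j})=-\mathrm{sign}(b^{j})$, which collapses to the single linear inequality

\[ b^{j}\, a^{j} \leq 0 \quad \text{for all } j \]

(if $b^{j}=0$, the sign constraint forces $a^{j}=0$, which is again linear). Under this reading simplex still applies; this is only a sketch, so please check it against your intended semantics of the sign constraint.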

We all know that data is very important in decision-making processes, and it is also obvious that raw data cannot lead to a precise decision.

Operations Research (OR) can help transform the data to useful information by which accurate decisions can be made. Decision Support Systems are good examples in which this transformation and process of making accurate decisions occur.

Health care problems have attracted much attention, and researchers are using OR tools to solve them.

**Now a critical question is:** What are the main trends in using operations research tools for healthcare problems?

Suppose we have a set of players {p1, p2, ..., pn}, and each player has three different strategies {s1, s2, s3}. They play m games. In each game, each player seeks to maximize its profit by selecting the strategy with the highest payoff. The payoffs associated with the strategies are as follows:

1) The payoff for selecting strategy s1 is zero.

2) The payoff for selecting strategy s2 is a real number, calculated using some formula f1.

3) The payoff for selecting strategy s3 is also a real number, but calculated using another formula f2.

I want to prove the existence of a Nash equilibrium when all the players select one of the available strategies.

I have searched the web and found several documents, but I couldn't get a clear idea of how to prove it mathematically.

Any help is deeply appreciated. Please let me know if I have missed any information. Thank you in advance.
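For a finite game like this (finitely many players, three strategies each), Nash's theorem already guarantees a mixed-strategy equilibrium; a pure-strategy equilibrium need not exist and is usually checked directly as a mutual best response. A minimal brute-force check for two players, with hypothetical payoff matrices standing in for whatever f1 and f2 produce:

```python
from itertools import product

# Hypothetical payoff matrices for two players, three strategies each:
# P1[i][j] is player 1's payoff when 1 plays i and 2 plays j; P2 likewise.
# In the setting above these numbers would come from the formulas f1 and f2.
P1 = [[0, 0, 0],
      [3, 1, 2],
      [2, 4, 1]]
P2 = [[0, 2, 1],
      [1, 0, 3],
      [2, 1, 4]]

def pure_nash(P1, P2):
    """All profiles where each player's strategy is a best response."""
    eq = []
    for i, j in product(range(3), repeat=2):
        if P1[i][j] == max(P1[k][j] for k in range(3)) and \
           P2[i][j] == max(P2[i][k] for k in range(3)):
            eq.append((i, j))
    return eq

print(pure_nash(P1, P2))  # [(1, 2)]
```

For a proof rather than a check, a common route is to show the game is a potential game or that best-response dynamics converge; which route applies depends on the structure of f1 and f2.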

Given a set of m (>0) trucks and a set of k (>=0) parcels. Each parcel has a fixed payment for the trucks (which may be the same for all or different for each). The problem is to pick up the maximum number of parcels such that the profit of each truck is maximized. There may be 0 to k parcels in the service region of a particular truck; likewise, a parcel can be located in the service region of 0 to m trucks. The constraints are:

1. Each truck can pick up exactly one parcel.

2. A parcel can be loaded onto a truck if and only if it is located within the service region of the truck.

The possible cases are:

Case 1. m > k

Case 2. m = k

Case 3. m < k

As far as I know, to prove a given problem H NP-hard, we need a polynomial-time reduction from a known NP-hard problem L to H. Therefore, I am searching for a similar NP-hard problem.

Kindly suggest an NP-hard problem similar to the one stated. Thank you in advance.

I am looking for examples of such construction companies because I am interested in seeing whether they have improved their performance by applying OR techniques.

I have a problem where the objective function changes with the instance size.

There are n given circles (centers and radii). The circles do not intersect; they are separable. I need to visit every circle, starting from the first circle and returning to it. The sequence in which the circles are visited is known. The task is to find points on these circles (one per circle) that minimize the path length.

For 3 circles we have to find P1, P2 and P3 that minimize d = d(P1, P2) + d(P2, P3) + d(P3, P1) (for the permutation (1, 2, 3)). When the number of circles changes, the minimized expression changes. I would like a module, invoked from C++/Python, that solves this problem for me: I send the number of circles and their parameters, and the solver responds with P1, P2, ..., Pn.

The attachment shows an image of two circles and some points P1, P2 (not an optimal solution, of course).
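As a sketch of one practical approach (not necessarily the packaged module you are after): parameterize the point on circle i by a single angle, which makes the path length a smooth function of n variables, and minimize it with a general-purpose optimizer. Here with scipy and hypothetical instance data:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical instance: centers and radii of n circles, fixed visiting
# order 1 -> 2 -> ... -> n -> 1.
centers = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
radii = np.array([1.0, 2.0, 1.5])
n = len(radii)

def tour_length(theta):
    # The point on circle i is parameterized by its angle theta[i].
    P = centers + radii[:, None] * np.column_stack([np.cos(theta), np.sin(theta)])
    return np.linalg.norm(P - np.roll(P, -1, axis=0), axis=1).sum()

res = minimize(tour_length, x0=np.zeros(n), method="Nelder-Mead")
print(round(res.fun, 3))   # optimized cyclic path length
```

The objective is nonconvex, so for larger n it is worth running several random restarts of the starting angles and keeping the best result.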

I'm going to use a utility function in my nonlinear optimization model (for an asset allocation problem).

It can be a logarithmic function such as log(c^T x).

Please suggest some other popular utility functions besides log.

Is anyone using vPlanner for the implementation of Last Planner? I have some questions about defining the lookahead plan manually in vPlanner. I would appreciate it if anyone familiar with it could message me on ResearchGate.

There are 4 options with different distance and carbon footprint values (as per the attachment). What method can be used to choose the option with the best (minimum) value for both distance and carbon footprint?

Individually, the minimum distance is Option 4, but the minimum carbon footprint is Option 3. How can an option be selected considering both distance and carbon footprint in that data set?
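A minimal sketch of two standard approaches, with hypothetical numbers standing in for the attachment's data: first drop dominated options (Pareto filtering), then rank the survivors with a normalized weighted sum whose weights express how much you value each criterion:

```python
# Hypothetical values standing in for the attachment's data: (distance, carbon).
options = {"Option 1": (120, 55), "Option 2": (110, 60),
           "Option 3": (100, 40), "Option 4": (85, 50)}

# Step 1: Pareto filter -- drop any option beaten on both criteria.
pareto = {k: v for k, v in options.items()
          if not any(o[0] <= v[0] and o[1] <= v[1] and o != v
                     for o in options.values())}

# Step 2: rank survivors with a normalized weighted sum; the 0.5/0.5
# weights (equal importance of distance and carbon) are an assumption.
dmax = max(v[0] for v in options.values())
cmax = max(v[1] for v in options.values())
score = {k: 0.5 * v[0] / dmax + 0.5 * v[1] / cmax for k, v in pareto.items()}
best_opt = min(score, key=score.get)
print(sorted(pareto), best_opt)
```

More elaborate multi-criteria methods (TOPSIS, PROMETHEE, goal programming) follow the same pattern: the weights, not the arithmetic, carry the decision maker's preferences.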

In academic scheduling problems we often consider minimizing the weighted sum of completion times (∑WiCi).

What can optimizing this criterion correspond to in real industrial problems?

Thank you for your answers :)
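As a computational reference point for what the criterion means: on a single machine, ∑WiCi is minimized exactly by Smith's WSPT rule (sort jobs by weight/processing-time ratio, descending), which is often interpreted as serving the most important customers per unit of work first. A tiny sketch with hypothetical jobs:

```python
# Hypothetical jobs: job -> (processing time p, weight w).
jobs = {"J1": (3, 2), "J2": (1, 5), "J3": (2, 1)}

# Smith's rule: sort by w/p in decreasing order; this provably minimizes
# the weighted sum of completion times on a single machine.
order = sorted(jobs, key=lambda j: jobs[j][1] / jobs[j][0], reverse=True)

t = total = 0
for j in order:
    p, w = jobs[j]
    t += p             # completion time of job j
    total += w * t
print(order, total)    # ['J2', 'J1', 'J3'] 19
```

Industrially, Ci is the time a job (order, batch, patient) leaves the system and Wi its holding cost or priority, so ∑WiCi models weighted work-in-process or flow cost.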

From an operational leadership perspective, how does one approach a misinterpreted labeling of operations without disrupting the synergy of the department? Is there any supporting evidence on shifting operational labels without hindering morale?

I am writing a paper on finding the optimal location for a port. I used AHP to determine the criteria weights and PROMETHEE to outrank the alternatives. I know there are about 7 pre-defined preference functions, and more can be defined by the researcher, but which one should be used in which context? (Please point me to an article on this subject if you know one.) I am really confused about determining the appropriate preference function.

Thank you so much for your kind responses

I am thinking about on-site decision support for incident commanders.

In my opinion, currently descriptive and naturalistic models are exploited. Why not prescriptive or normative ones?

If we consider a human as the decision maker, the factors underlying the use of descriptive or naturalistic models are: the inability to comprehend and analytically process all the information, courses of action, consequences and costs of alternative activities under mental and time pressure.

If we consider a computer system as a decision maker, the factors are:

- lack of information: we cannot ask firefighters to enter data into a computer system because they do not have the time;
- a poor sensory layer for recognizing phenomena or victims: so far there are no sensors in buildings that make it possible to track fire dynamics, people's locations and their physical state;
- huge uncertainty in modeling and forecasting fire and crowd behavior, the building's reaction to fire exposure, changes in ventilation, extinguishing effects and many other factors.

What do you think about this problem?

What is the difference between the Pareto-optimal solutions of the following bi-objective problems? Is there any difference in the general case?

1) Min f(x)

Min g(x)

H(x,y)<=B

2) Min f(x)

Min g(y)

H(x,y)<=B

I need to generate multiple sets of realistic data by random variation in order to test my model (I am working on scheduling operations). Is there a way to do this with CPLEX?

Hi

Task assignment means assigning several tasks to several persons. I know the Hungarian algorithm is commonly used for this problem, but its limitation is that if there are more tasks than workers, some tasks are left unassigned: it only produces a 1-1 mapping of tasks to workers.

What I need is simply for a worker to be assigned more than one task. This happens when the number of tasks is larger than the number of workers; in other words, we need a several-tasks-to-one-person mapping.

For example, I have 9 tasks (the cost is in man-hours) and 6 workers. A possible solution might be:

Person 1 -> 1 task

Person 2 -> 2 tasks

Person 3 -> 3 tasks

Person 4 -> 1 task

Person 5 -> 1 task

Person 6 -> 1 task

---------------------------

6 persons -> 9 tasks

- So far, there is no assumption that tasks are related.

- The cost differs per task.

Is there an algorithm that solves such a problem, where all tasks are definitely assigned while the total cost is minimized?

Thank you
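One standard workaround (a sketch, with hypothetical costs) is to duplicate each worker's column as many times as that worker's task capacity and run the Hungarian algorithm on the enlarged rectangular matrix; each copy of a worker absorbs at most one task, so every task gets assigned and workers can receive several:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n_tasks, n_workers = 9, 6
cost = rng.integers(1, 10, size=(n_tasks, n_workers))  # man-hours, hypothetical

# Duplicate each worker's column so there are at least as many columns as
# tasks, then solve the rectangular assignment problem (Hungarian method).
copies = -(-n_tasks // n_workers)          # ceil(9 / 6) = 2 copies per worker
big = np.repeat(cost, copies, axis=1)      # columns: [w0, w0, w1, w1, ...]
rows, cols = linear_sum_assignment(big)
assignment = {int(t): int(c // copies) for t, c in zip(rows, cols)}
total = int(big[rows, cols].sum())
print(assignment, total)
```

Duplicating every worker the same number of times caps each worker at that many tasks; to allow uneven loads like the 3-2-1-1-1-1 split above, give some workers more copies, or model the problem as a min-cost flow with per-worker capacities.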

As far as I know, most mixed-integer programming problems with block-diagonal structure are suitable for Dantzig-Wolfe decomposition and column generation. Analogously, which kinds of problems or structures are preferable for Benders decomposition? Is Benders decomposition also feasible for general MIP problems, and which performs better on a general MIP: Benders decomposition or Dantzig-Wolfe decomposition?

Hi, could anyone tell me which Operational Research (OR) method is best for agricultural planning to optimize crop production, and why? I would be very thankful for relevant literature evidence and other useful links. Thanks in advance.

Given a homogeneous linear system **AX** = **0** (X ≥ 0), how can one obtain the set of nonnegative basic solutions?

Can anyone provide recommendations or articles relating to organizational structure w.r.t. managed services / outsourcing within the telecommunications sector?

I really appreciate any assistance offered.

As already stated in the robust optimization literature, there are some cases when the adaptive (or adjustable) robust problem and the static robust problem of uncertain adjustable linear programming problem are equivalent. These include the cases where the uncertainty affecting every one of the constraints is independent of the uncertainty affecting all other constraints.

Does this statement hold for an MIP problem where the uncertainty only applies to upper bounds of continuous variables, and the uncertainty of each constraint is independent of the others? If so, could you provide me with a reference that supports this?

Hello,

I need to optimize a model using the subgradient method for Lagrangian relaxation (LR).

Assume the general form below:

Min f(x,y)

s.t.

Ax + By <= E1 (inequality constraint)

Cx + Dy = E2 (equality constraint)

Clearly, its LR counterpart is:

Min f(x,y) + λ·(Ax + By − E1) + u·(Cx + Dy − E2)

.

If there were only one constraint, I would know how to solve it (updating a single Lagrange multiplier, i.e. only λ or only u), but my model has two complicating constraints, and I do not know how to update both λ and u simultaneously.

I would be grateful if you could help me revise the algorithm to update both λ and u simultaneously (and not only one of them).

Thanks,

Morteza
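In the standard subgradient scheme both multipliers are simply updated in the same iteration, each along its own constraint violation, with only the inequality multiplier projected back onto the nonnegative orthant. A toy sketch (the quadratic model and step rule are illustrative assumptions):

```python
# Toy model (hypothetical):  min x^2 + y^2
#   s.t.  x + y >= 4   (multiplier lam, must stay >= 0)
#         x - y  = 1   (multiplier u, free in sign)
# Both multipliers are updated in the SAME iteration, each along its own
# constraint violation; only the inequality multiplier is projected.
lam, u = 0.0, 0.0
x = y = 0.0
for k in range(1, 2001):
    # Inner minimization of the Lagrangian is analytic for this toy model:
    # grad_x = 2x - lam + u = 0,  grad_y = 2y - lam - u = 0.
    x = (lam - u) / 2.0
    y = (lam + u) / 2.0
    step = 1.0 / k                              # diminishing step size
    lam = max(0.0, lam + step * (4 - x - y))    # inequality: project onto >= 0
    u = u + step * (x - y - 1)                  # equality: no projection
print(round(x, 2), round(y, 2))  # approaches the optimum (2.5, 1.5)
```

In your model the inner step is replaced by solving the relaxed subproblem for fixed (λ, u), and the update directions are the measured violations Ax + By − E1 and Cx + Dy − E2.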

Hello scholars, I am interested in Economic Lot Scheduling Problems where products are produced in a cyclic pattern. There are several existing modelling approaches, among which the extended basic period approach is more effective than the others. Could someone help me understand the concept of the basic period (which is assumed equal for all products) and the relationship/difference between the basic period and the cycle time of each product?

I mean converting a problem with multiple objectives into a single-objective optimization problem, e.g. using a weighted-sum objective function.

Some optimization problems are well known, but the instances or benchmarks used are not available. What can we generally do in this situation?

I'm using an evolutionary algorithm with powerful exploration to solve some optimization problems, and I want to add a local search to the EA for the sake of better exploitation. I'm looking for a source on different local search methods.
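As one concrete example of a local-search step that is easy to embed in an EA (giving a memetic algorithm), here is first-improvement bit-flip hill climbing on a bit-string encoding; for permutation encodings the analogous moves are swaps or 2-opt:

```python
import random

def local_search(x, f):
    """First-improvement bit-flip hill climbing: try flipping single bits,
    keep any flip that improves f, stop when no flip helps (local optimum)."""
    x, fx = x[:], f(x)
    improved = True
    while improved:
        improved = False
        for i in random.sample(range(len(x)), len(x)):  # random scan order
            x[i] ^= 1                                   # try flipping bit i
            fnew = f(x)
            if fnew > fx:
                fx, improved = fnew, True               # keep the flip
            else:
                x[i] ^= 1                               # revert it
    return x, fx

# Demo on OneMax (count of ones), which bit-flip local search solves outright.
random.seed(1)
start = [random.randint(0, 1) for _ in range(20)]
best, val = local_search(start, f=sum)
print(val)  # 20: every bit ends up 1
```

A typical memetic design applies this to each offspring (or only to the elite, to save evaluations) before reinsertion into the population.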

Imagine you have problem like:

min z=Cx

subject to Ax=b

If you just change b to b2, I wonder how to find the common feasible region of the two problems (Ax = b and Ax = b2), if one exists.

Of course, for a two-variable problem you can draw it, but for more than 3 variables it is not really possible.

I have already carried out a Monte Carlo simulation to analyze a PERT/CPM network with one variable, either time or cost. I now want to extend this to a Monte Carlo simulation that analyzes a PERT/CPM network for time and cost together.
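A sketch of the joint extension: sample duration and cost for every activity in the same replication, compute the project finish time via the longest path, and accumulate both statistics. The 4-activity network and triangular distributions below are hypothetical:

```python
import random

# Hypothetical 4-activity network: A and B start at time 0, C follows A,
# D follows both B and C. Each activity has triangular (low, mode, high)
# distributions for duration and for cost.
acts = {
    "A": ((2, 4, 8),  (100, 120, 160)),
    "B": ((3, 5, 9),  (80, 100, 150)),
    "C": ((1, 2, 4),  (50, 60, 90)),
    "D": ((2, 3, 5),  (70, 90, 130)),
}

random.seed(42)
times, costs = [], []
for _ in range(10_000):
    # Note: random.triangular takes (low, high, mode), not (low, mode, high).
    d = {a: random.triangular(t[0], t[2], t[1]) for a, (t, _) in acts.items()}
    c = sum(random.triangular(cc[0], cc[2], cc[1]) for _, (_, cc) in acts.items())
    finish = max(d["B"], d["A"] + d["C"]) + d["D"]   # longest path into D
    times.append(finish)
    costs.append(c)

mean_t = sum(times) / len(times)
mean_c = sum(costs) / len(costs)
p90_t = sorted(times)[int(0.9 * len(times))]
print(round(mean_t, 2), round(mean_c, 1), round(p90_t, 2))
```

Because time and cost are drawn in the same replication, you can also estimate joint quantities (e.g. correlation between project duration and total cost, or cost conditional on being late), which separate simulations cannot give.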

I want to build on a classical project scheduling problem with a Cmax objective function. For example, the first and second activities can both start at the beginning of the project, and I want to constrain the model so that activities 1 and 2 never run at the same time, but I don't know which one is scheduled earlier. Moreover, in my model only these two activities use the same resource. This resource is bounded, hence the model is a special case of the RCPSP. How can I handle the resource constraint in my model?
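One standard way to model "activities 1 and 2 never overlap" when their order is not known in advance is a big-M disjunction with a single binary sequencing variable (a sketch: $S_{i}$ are start times, $p_{i}$ durations, $M$ a sufficiently large constant):

\[ S_{1} + p_{1} \leq S_{2} + M(1-y), \qquad S_{2} + p_{2} \leq S_{1} + M y, \qquad y \in \{0,1\} \]

With $y = 1$, activity 1 must finish before activity 2 starts; with $y = 0$, the reverse. Since only these two activities share the bounded resource, this single disjunctive pair can replace the general RCPSP resource constraint in your model.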

Our paper is about dealing with (reducing/exploiting) degeneracy in column generation. Linear relaxations of combinatorial optimization problems are known to suffer from this defect; I would be interested in learning about your experiences.

I am new to scheduling and have become really interested in metaheuristics. I have read through papers, and every paper explains how a genetic algorithm works, but how do I actually carry it out on a set of data?
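To make this concrete, here is a minimal GA on a small scheduling instance (hypothetical data): job permutations are the chromosomes, order crossover and swap mutation produce children, and the weighted sum of completion times is the fitness to minimize:

```python
import random

random.seed(0)
# Hypothetical data: 6 jobs with processing times p and weights w; the GA
# searches over job orders minimizing the weighted sum of completion times.
p = [4, 2, 7, 3, 5, 1]
w = [1, 3, 2, 4, 2, 5]

def cost(perm):
    t = total = 0
    for j in perm:
        t += p[j]
        total += w[j] * t
    return total

def crossover(a, b):
    # Order crossover (OX): copy a slice from parent a, fill the remaining
    # positions with the missing jobs in parent b's order.
    i, j = sorted(random.sample(range(6), 2))
    mid = a[i:j]
    rest = [g for g in b if g not in mid]
    return rest[:i] + mid + rest[i:]

pop = [random.sample(range(6), 6) for _ in range(30)]
for gen in range(200):
    pop.sort(key=cost)
    nxt = pop[:5]                          # elitism: carry over the 5 best
    while len(nxt) < 30:
        a, b = random.sample(pop[:15], 2)  # parent selection: better half
        child = crossover(a, b)
        if random.random() < 0.2:          # swap mutation
            i, j = random.sample(range(6), 2)
            child[i], child[j] = child[j], child[i]
        nxt.append(child)
    pop = nxt

best = min(pop, key=cost)
print(best, cost(best))
```

For this tiny instance you can sanity-check the GA against Smith's WSPT rule (sort by w/p, descending), which gives the exact optimum on a single machine; for realistic instances only the cost function and the encoding change.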

The constraint:

x1 < 2 or x1 > 8

means the value x1 = 5 is not valid.
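Such an either-or constraint can be modelled in a MIP with one binary variable $y$ and a big-M constant (a sketch; $\epsilon > 0$ is only needed if the inequalities must be strict):

\[ x_{1} \leq 2 - \epsilon + M\,y, \qquad x_{1} \geq 8 + \epsilon - M\,(1-y), \qquad y \in \{0,1\} \]

With $y = 0$ the first inequality enforces $x_{1} < 2$ and the second is inactive; with $y = 1$ the second enforces $x_{1} > 8$. Either way $x_{1} = 5$ is infeasible.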

There was recently a post in SIMSOC (Newsgroup discussing the use of Computer Simulation in the Social Sciences) asking for examples of agent-based models that have actually been used to support policy decision making or for other purposes "in the real world". The poster (Nigel Gilbert) was looking for examples which have been used for practical purposes rather than those that have been designed to be applied but have never in fact been used.

The post and a summary of responses can be found here:

Are there any such projects related to OR that you have worked on or know of? Mine are unfortunately all examples that were designed to be applied but never actually used. Is this a general tendency?

When I want to obtain the importance of every criterion and alternative, how can I prepare a questionnaire for this goal? Is there any source that could be helpful?

What is the best software for visualizing or solving linear programming and other operations research problems?

We have conducted many training programs but are still confused about the outcome measurement procedure. How should it be done?