Article

The New Palgrave: A Dictionary of Economics.

Taylor & Francis
Journal of the American Statistical Association
... Therefore, this study aims to apply Game Theory to resource optimization of concreting operations. Game Theory has been applied mainly in economics (Eatwell et al., 1987). As a decision technique, Game Theory makes use of available information to devise the best plan to achieve one's objective. ...
... Therefore, a formal study of decision-making in a strategic situation is known as Game Theory (Theodore and Bernard, 2003). Eatwell et al. (1987) defined Game Theory as an interactive decision theory and a rational analysis of a strategic situation. Kartik (2009) described it as a formal methodology and a set of techniques for studying the interaction of rational agents in strategic settings. ...
... The 'decision tree' is a form of 'decision analysis' that maps out all of the possibilities in a decision structure, making it easier to assess the pay-offs and make optimal decisions (Erhun and Keskinocak, 2003; Dixit and Barry, 1991). A game that evolves over time is better represented by a decision tree than by a pay-off matrix. The pay-off matrix contains redundancies, whereas the decision tree reflects the temporal aspect and formally describes the game, with a specification of the sequence of decisions, the available information, and the pay-offs (Eatwell et al., 1987; Theodore and Bernard, 2003). The technique for solving a game of perfect information is referred to as 'regressive induction' (backward induction); it is used to determine the best move, or the optimal strategy, at each decision node. ...
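The backward ("regressive") induction procedure described in the snippet above can be sketched in a few lines. The tree layout, payoffs, and function names below are illustrative assumptions for a tiny two-player sequential game, not taken from the cited works.

```python
# Minimal sketch of backward induction on a game tree of perfect information.
# Internal nodes record which player moves; leaves carry (player0, player1)
# payoff tuples. All names and numbers here are hypothetical.

def backward_induction(node):
    """Return the payoff pair reached under optimal play from `node`."""
    if "payoff" in node:               # leaf: the game is over
        return node["payoff"]
    player = node["player"]            # index of the player moving here
    # Evaluate every child subtree, then let the mover pick the branch
    # that maximizes *their own* component of the payoff pair.
    outcomes = [backward_induction(child) for child in node["children"]]
    return max(outcomes, key=lambda payoff: payoff[player])

# A tiny sequential game: player 0 moves first, player 1 replies.
game = {
    "player": 0,
    "children": [
        {"player": 1, "children": [{"payoff": (3, 1)}, {"payoff": (0, 0)}]},
        {"player": 1, "children": [{"payoff": (2, 2)}, {"payoff": (1, 3)}]},
    ],
}
print(backward_induction(game))
```

Working from the leaves up, player 1's best replies prune each subtree first, and player 0 then chooses between the surviving outcomes, exactly the regressive step the snippet describes.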
... In a state of atomicity, specifically, the utility of an outcome depends on decisions that pertain directly to the exact time or condition involving the decision maker, subjectively and under certainty. A typical scenario is health insurance, where the decision to enrol or pay a premium depends on illness or health condition (Eatwell et al., 1987; Zank & Wakker, 1999). The theory therefore does not provide any condition of probability. ...
... Other critics, such as Eatwell et al. (1987) and Zank and Wakker (1999), make this assertion, which is refuted by the assumptions of SDUT. ...
Thesis
Full-text available
Social Health Insurance Schemes (SHIs), as Social Health Protection (SHP) interventions, are an important tool for reducing poverty and ill health. For these reasons, governments have employed SHI as a policy framework to promote access to healthcare and to ensure financial protection among the poorest households so as to improve their health conditions. However, there are limited empirical studies on what motivates the poorest to enrol onto the NHIS and how it helps them save income for consumption and other health outcomes. The study was conducted by engaging Livelihood Empowerment Against Poverty (LEAP) household heads to identify the empirical evidence. The study also compared consumption between the insured and uninsured, and analysed the effects of NHIS membership on healthcare use and out-of-pocket health expenditure (OOPHE) among the poorest households. Decision-making theories (Expected Utility Theory (EUT) and State Dependent Utility Theory (SDUT)) and a health behaviour theory (the Health Belief Model (HBM)) were used as the theoretical lens for the study. The researcher adopted a pragmatic approach involving a mixed methodology, using both qualitative and quantitative approaches. A cross-sectional design was also adopted. The study was conducted in two districts, Shai Osudoku in the Greater Accra Region and Amansie West in the Ashanti Region, and engaged LEAP beneficiary households. A thematic analysis approach was used to reveal the results on NHIS enrolment decisions. In the analyses, the theoretical constructs of the HBM proved useful in uncovering factors that influence enrolment decisions among the poorest households. The study also found illness vulnerability and guaranteed financial access to healthcare to be the dominant factors that generally influenced household heads' decisions to enrol onto the NHIS.
To address possible selection bias due to non-random enrolment onto the NHIS, the propensity score matching (PSM) technique was used to estimate the difference in outcomes between the treated and control groups. The average treatment effects on the treated reveal that participation in the NHIS tends to increase consumption expenditure by GH₵ 263.43 and hospital visits by 0.74 visits, and to reduce OOPHE by a statistically significant GH₵ 79.77, for insured household members relative to uninsured ones. By employing a mixed-method approach instead of a quantitative approach alone, the study has contributed to existing knowledge by revealing a unique perspective on the effects of the NHIS on enrolment decisions among the poorest households in Ghana. These positive outcomes point to future research options.
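The matching estimator behind the figures above can be sketched as follows. This is a minimal nearest-neighbour illustration with made-up propensity scores and outcomes, not the study's data or its exact procedure; in practice the scores come from a logit or probit model of enrolment on household covariates.

```python
# Hypothetical sketch: average treatment effect on the treated (ATT) via
# nearest-neighbour propensity-score matching. Scores and outcomes below
# are invented for illustration only.

def att_nearest_neighbour(treated, control):
    """treated/control: lists of (propensity_score, outcome) pairs."""
    total_diff = 0.0
    for score, outcome in treated:
        # Match each treated unit to the control unit closest in score.
        _, matched_outcome = min(control, key=lambda c: abs(c[0] - score))
        total_diff += outcome - matched_outcome
    return total_diff / len(treated)

insured = [(0.81, 950.0), (0.64, 870.0), (0.55, 910.0)]    # (score, consumption)
uninsured = [(0.78, 700.0), (0.60, 640.0), (0.35, 600.0)]
print(round(att_nearest_neighbour(insured, uninsured), 2))
```

The ATT is the mean outcome gap between each treated unit and its matched control; real applications add common-support checks and standard errors.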
... In the security domain, having attack choices favorable to both the attacker and the defender is rather unlikely. A very practical class of games here is strictly competitive games [11], where all outcomes are Pareto optimal. In particular, if the attacker deviates to a lower utility, the defender incurs a smaller loss; thus, the attacker playing rationally is the worst case for the defender. ...
... Our numerical results show that PT yields significant improvement for homogeneous populations and for high risk-aversion; for heterogeneous populations, however, MMR moderately improves the defender loss while also achieving much lower regret. Finally, GEBRA is valuable in the strictly competitive [11] setting, where previous model-free approaches for handling bounded rationality prove ineffective, particularly for attackers with a high deviation from rationality. ...
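The defining property invoked in the snippets above, that in a strictly competitive game one player is better off between two outcomes exactly when the other is worse off, can be checked mechanically. The payoff pairs below are illustrative assumptions, not values from the cited model.

```python
# Hypothetical check that a finite two-player game is strictly competitive:
# over every pair of outcomes the players' preferences are exactly opposed.
# Payoffs are illustrative (attacker_utility, defender_utility) pairs.
from itertools import combinations

def sign(x):
    return (x > 0) - (x < 0)

def strictly_competitive(outcomes):
    """outcomes: list of (u_attacker, u_defender) payoff pairs."""
    for (a1, d1), (a2, d2) in combinations(outcomes, 2):
        # One player's gain must mirror the other's loss (ties must match).
        if sign(a1 - a2) != -sign(d1 - d2):
            return False
    return True

# If the attacker deviates to a lower-utility outcome, the defender's loss
# shrinks -- so the rational attacker is the defender's worst case.
print(strictly_competitive([(5, -5), (3, -2), (1, 0)]))
print(strictly_competitive([(5, -5), (3, -4), (3, -2)]))
```

The second game fails because the attacker is indifferent between two outcomes while the defender is not, breaking the exact opposition of preferences.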
Chapter
Full-text available
Recent works have increasingly shown that cyber deception can effectively impede the reconnaissance efforts of intelligent cyber attackers. Recently proposed models that optimize a deceptive defense based on camouflaging network and system attributes have shown effective numerical results on simulated data. However, these models possess a fundamental drawback due to the assumption that an attempted attack is always successful: as a direct consequence of the deceptive strategies being deployed, the attacker runs a significant risk that the attack fails. Further, this risk or uncertainty in the rewards magnifies the boundedly rational behavior of humans, which the previous models do not handle. To that end, we present Risk-based Cyber Camouflage Games, a general-sum game model that captures the uncertainty in the attack's success. For rational attackers, we show that optimal defender strategy computation is NP-hard even in the zero-sum case. We provide an MILP formulation for the general problem with constraints on cost and feasibility, along with a pseudo-polynomial time algorithm for the special unconstrained setting. Second, for risk-averse attackers, we present a solution based on prospect-theoretic modeling, along with a robust variant that minimizes regret. Third, we propose a solution that does not rely on an attacker behavior model or past data, and is effective in the broad setting of strictly competitive games, where previous solutions against bounded rationality prove ineffective. Finally, we provide numerical results showing that our solutions effectively lower the defender loss.
... The social multiplier effect describes how behavioral changes at the individual level spill over to the community level (44). This was observed in the MINCOME project. ...
... That is because only one-third of families received MINCOME payments, with most being small top-ups. As such, investigators hypothesized that grade 11 students whose families received little to no MINCOME might have based their decision to continue into grade 12 on friends making the same decision (10,44). The social multiplier effect can also be used to explain the significant reduction (46 percent) in the minor illness and injury rate in the Indian project. ...
Article
Objectives: To a) familiarize readers with the concept of a basic income guarantee (BIG) and its different forms; b) consider how BIG could improve oral health and decrease oral health disparities; and c) motivate readers to advocate for the evaluation of oral health outcomes in BIG experiments. Methods: Published articles and book chapters that have analyzed and reviewed data from past BIG pilot projects were examined for their findings on health and socioeconomic outcomes. Results: Our findings suggest various areas and mechanisms whereby BIG can influence oral health-related outcomes, whether through impacts on work, illness and injury, education, a social multiplier effect, expenditure behavior, and/or mental illness and other health outcomes. Conclusion: Our findings illustrate the importance of assessing oral health-related outcomes in future BIG pilot projects.
... To this end, we asked participants to list the best alternative use of an Amazon gift card immediately before sampling, which made the best alternative use highly accessible during sampling. We asked them to list the best alternative use because the normative value of a medium of exchange derives from the best consumption it provides (Buchanan, 2008; Eatwell et al., 1987). ...
Article
Full-text available
Sampling provides limited experience with an offering to promote its purchase, either now or later. Sampling involves an ongoing choice about whether to buy the sampled option. We propose that ongoing choice feels more like a choice when people consider opportunity costs. Consequently, we predict that opportunity cost consideration will accentuate the impact of ongoing choosing on enjoyment of the sampled option over time (i.e., a slope effect). It follows that when the ongoing decision evolves toward not choosing the sampled option today, its negative impact on enjoyment should become more pronounced when people consider their opportunity costs, decreasing overall enjoyment. Studies 1, 2, and 3 provided support for this key prediction. Studies 4 and 5 showed that when the best alternative use of a resource people considered was more attractive, they experienced accelerated satiation with an unchosen sampled option. While previous research showed that opportunity cost consideration accentuated the impact of one-time choice on evaluation (i.e., an intercept effect), we showed that it accentuates the impact of ongoing choice on enjoyment over time (i.e., a slope effect). We also contribute to the understanding of the factors that increase overall enjoyment of a sampling experience, which should influence future purchase likelihood.
... Over the years, many economists have attempted to develop Walras' theory of capital accumulation further within the Walrasian framework (e.g., Morishima, 1964, 1977; Diewert, 1977; Eatwell, 1987; Dana et al., 1989; Montesano, 2008). Nevertheless, no study has succeeded in solving the common problem of lacking a proper microeconomic foundation for wealth accumulation. ...
Chapter
Full-text available
The chapter proposes a general equilibrium growth model with wealth and human capital accumulation in a dynamic multi-race economy. Coexistence of multiple races in the same labor and goods markets is common in modern economies, yet endogenous growth theories offer no proper theoretical treatment of income and wealth distribution on the issue. Marx's theory of capital accumulation is too simplified to capture the complexity of income and wealth distribution between heterogeneous households. In Capital in the Twenty-First Century, Piketty examines issues related to wealth accumulation and income and wealth distribution; he collects many interesting data across many countries, but without a profound analytical framework. This chapter focuses on how racial differences in preferences and knowledge accumulation affect national economic growth and racial income and wealth distributions.
... There are various measures of real integration. Traditionally, real integration has conceptually been considered the extent to which international barriers and other restrictions impede international trade of goods (Eatwell, Milgate, & Newman, 1987). However, there is an issue with trade barriers as non-tariff barriers cannot be easily measured (Ammer & Mei, 1996). ...
Article
Full-text available
The paper examines the impact of economic integration on the relationship between the currency and equity markets for a group of Asian emerging economies using both linear and non‐linear frameworks. We first derive the dynamic conditional correlations between the two markets and then examine the impact of economic integration on their relationship. Our main results are: (a) there is a negative correlation between real exchange rate changes and equity return differentials for all countries apart from China, which becomes deeper during the global financial crisis (GFC) for some of the countries; (b) economic integration, both real and financial, has an asymmetric impact on the relationship between the two markets both in the short‐run and in the long‐run; and (c) applying a linear framework does not bring out the impact of financial integration.
... This is considered a benchmark rate that some of the world's leading banks charge each other for short-term loans. Lastly, there is the prime overdraft rate (POR), which is used by banks to price the lending rates offered to clients at either above or below a particular rate (Eatwell, Milgate, & Newman, 1987). ...
Article
Full-text available
The objective of this study is to provide empirical evidence on the short- and long-run relationships between the short-term interest rate, the London interbank offered rate (LIBOR), and macroeconomic policy objectives such as price stability, economic growth, and stability of the exchange rate market. For this purpose, we deploy quarterly data from the United Kingdom between 2000 and 2015 and adopt a multiple regression model. Furthermore, this study uses the Johansen and Stock-Watson cointegration tests and the Granger causality test to examine the dynamic short- and long-run relationships among LIBOR, the consumer price index as a proxy for price stability, real gross domestic product as a proxy for economic growth, and the exchange rate as a proxy for exchange rate market stability. The results showed that all variables have the same order of integration and that long-run equilibrium relationships exist between them, with strong evidence of unidirectional Granger causality flowing from GDP, CPI, and exchange rates to LIBOR. The recommendations proposed in this study have important policy implications for the U.K. government. It is therefore recommended that policy makers and government authorities, together with the Bank of England, develop and pursue sensible fiscal and monetary policies aimed at stabilizing both micro- and macroeconomic indicators, such as the inflation rate, interest rate, exchange rate, and money supply, to enhance the growth of the economy, especially for the period after the BREXIT decision.
... The elasticity can be expressed in terms of logarithmic derivatives, as mentioned by Newman (2008). There is also a calculus of elasticities in which it is possible to derive expressions for the elasticity of the addition and multiplication of functions, and therefore of the subtraction and division of functions, for the continuous case, as mentioned by Allen (1938, pp.
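The logarithmic-derivative definition and the product/sum rules alluded to in the snippet above are standard results and can be written out explicitly for reference:

```latex
% Point elasticity as a logarithmic derivative
\varepsilon_{y,x} \;=\; \frac{dy}{dx}\cdot\frac{x}{y} \;=\; \frac{d\ln y}{d\ln x}

% Elasticity of a product and of a sum of functions f(x), g(x)
\varepsilon_{fg,x} = \varepsilon_{f,x} + \varepsilon_{g,x},
\qquad
\varepsilon_{f+g,x} = \frac{f\,\varepsilon_{f,x} + g\,\varepsilon_{g,x}}{f+g}
```

The product rule follows directly from $\ln(fg) = \ln f + \ln g$, and the sum rule from differentiating $f+g$ and multiplying through by $x/(f+g)$.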
Article
Full-text available
This article presents a geometric and algebraic approach to teaching the concept of elasticity to undergraduate students with different levels of knowledge in mathematics, pointing out why it is necessary to introduce it from the idea of the slope of a function. To achieve this objective, a short review of the literature on the main properties associated with elasticity is carried out, without separating them from their historical development, to illustrate a way of presenting the concept in a discrete or continuous way, using geometric arguments or algebraic expressions. Due to the flexibility of the concept, and taking into account the prior knowledge of the student, it is concluded that this approach is a more adequate way to teach the concept.
... For the payments, the net gain or loss from three of the thirty contests, chosen at random, was used. This was done in order to minimize wealth effects [22]. ...
Preprint
Existing literature on information sharing in contests has established that sharing contest-specific information influences participant behaviors and, thereby, the outcomes of a contest. However, in the context of engineering design contests, such as crowdsourcing, there is still a significant gap in our understanding of how contest design decisions, such as what information to share, influence participants' design behaviors and the outcomes of a design process. In particular, there is a lack of knowledge about how information about the historical performance of competitors influences a participant's design behaviors and the outcomes of a design contest. To address this gap, the objective of this paper is to quantify the influence of information about competitors' past performance on a participant's design behaviors and outcomes. The objective is achieved by (i) developing a descriptive contest model of strategic information acquisition decisions, based on an optimal one-step look-ahead strategy utilizing expected improvement maximization, and (ii) using the model in conjunction with a controlled behavioral experiment. An agent is designed as a competitor such that the agent's past performance is quantified via a performance distribution. A behavioral experiment is conducted in which design contests with design optimization problems are considered. The participants face agents with strong or poor performance records and are either made aware of these records or not. Our results indicate that participants expend greater effort when they are aware that their opponent has a strong performance record than when the opponent has a poor performance record. Moreover, our model parameter is able to quantify the influence of contest-specific information sharing on a participant's sequential design behaviors.
We observe that sharing information about an opponent with a strong past performance record "polarizes" the participants, such that their average performance distribution has higher variation than when they have no information about the opponent, or when they know that their opponent has a poor performance record. Moreover, we find that, where possible, contest designers are better off not providing historical performance records if past design qualities do not match the expectations set for a given design contest.
... In standard neo-classical theory, no distinction is made between the principal and the agent(s). Where the distinction is made, it is assumed that both parties will strive towards the same goal (Stiglitz, 1987, in Eatwell et al., 1991). The principal-agent problem occurs when one takes into account that the principal (employer or manager) and the agent (employee) have diverse needs and are working in a world characterised by asymmetric information. ...
Article
Full-text available
There are many factors that may lead to inefficiencies in a firm. One is the existence of a principal-agent problem. Linked with this problem are asymmetric information, unaligned motives of principals and agents, distrust (which was rampant in the era of apartheid in South Africa, though more recently the Basic Conditions of Employment Act can fulfil this role), and conflict. Worker participation schemes can help to alleviate this problem, and different forms of worker participation schemes that can increase the efficiency of firms are discussed.
... To date, however, the modes most similar to the penalty mode in intertemporal choice are speed-up modes (e.g., as seen above, "receive a $75 Amazon gift certificate in 3 months" versus "receive a gift certificate of lesser value that day") or decision-problem formulations with explicit reference to opportunity costs. Notably, opportunity costs rely on the fact that people have to think about the pleasure associated with the outcome but must also consider alternative resources, items, or experiences that could give them the same pleasure (see Buchanan, 2008; Carmon and Ariely, 2000; Eatwell, Milgate, and Newman, 1998; Henderson, 2014). As for intertemporal choice, the opportunity cost of time effect was investigated by Zhao et al. (2015). ...
Article
This paper experimentally investigates the framing effects of intertemporal choice using two different elicitation modes, termed classical and penalty. In the classical mode, participants are given the choice between receiving a smaller amount of money today and a larger amount later (e.g., "€55 today vs. €75 in 61 days"). This is referred to as the standard mode. In the penalty mode, the participant must give up an explicit amount of money in order to choose the smaller, sooner option (e.g., "€75 in 61 days vs. €55 today with a penalty of €20"). This is the explicit mode. We find that estimates of individual discount rates are lower in the explicit mode than in the standard mode. This result suggests that even very simple information about the amount of money one must surrender by choosing the earlier option increases delayed consumption. The finding has relevant implications for self-control and long-term planning in intertemporal choice.
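As an illustration of how an individual discount rate can be inferred from the standard-mode example above ("€55 today vs. €75 in 61 days"), the sketch below assumes continuous exponential discounting, which is one convention among several used in this literature; the helper name is hypothetical.

```python
# Sketch: the annualized discount rate implied by indifference between a
# smaller-sooner and a larger-later amount, assuming exponential discounting
# (sooner = later * exp(-r * delay/365)). Illustrative only.
import math

def implied_annual_rate(sooner, later, delay_days):
    """Solve sooner = later * exp(-r * delay_days/365) for r."""
    return -math.log(sooner / later) * 365.0 / delay_days

r = implied_annual_rate(55.0, 75.0, 61)
print(round(r, 3))
```

A participant who prefers €55 today reveals a discount rate at least this high; lowering such estimates in the explicit mode is exactly the effect the paper reports.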
Article
Full-text available
The objective of this article is to elaborate an economic and historical approach to the notion/concept of strategy. More specifically, it aims to propose a definition of strategy that helps to disentangle this important topic from a paralysing tangle of schools, approaches and definitions. It also seeks to delimit more clearly its nature and what can (and what should not) be considered strategy. To that end, the research adopts a historical perspective and, as its starting point, the approach proposed by Simon (1993). In addition, an effort was made to better specify and enrich Simon's approach with contributions from other pertinent authors, mainly from the Keynesian and Schumpeterian schools.
Book
Full-text available
The economy is an inseparable part of human life, and it plays an important role in maintaining the stability of national life. A country's level of growth and development can be read from its economic indicators. Every country pursues its goals through a different economic system; the systems currently in use around the world are the capitalist, socialist, mixed, and Islamic economic systems. Economic growth is one of the important measures of the success of development in a region: a region is considered to have carried out development successfully if the economic growth of its population is sufficiently high. Economic growth is defined as an increase in GDP (Gross Domestic Product), regardless of whether that increase is larger or smaller than population growth and regardless of whether the economic structure changes. In basic macroeconomic terms, the indicator used to measure economic growth is gross domestic product (PDB) at the national level and Gross Regional Domestic Product (PDRB) at the provincial level. Growth in PDRB from year to year is an indicator of successful regional development and shows whether or not a region's economy is progressing. The larger the contribution of each economic sector to PDRB, the better a region can steer its economic growth, which in turn increases the welfare and prosperity of its people.
Article
This paper discusses some issues related to the triangle between capital accumulation, distribution, and capacity utilization. First, it explains why utilization is a crucial variable for the various theories of growth and distribution, more precisely with regard to their ability to combine an autonomous role for demand (along Keynesian lines) with an institutionally determined distribution (along classical lines). Second, it responds to some recent criticism by Girardi and Pariboni (2019), explaining that their interpretation of the model in Nikiforos (2013) is misguided and that the results of the model can be extended to the case of a monopolist. Third, it provides some concrete examples of why demand is a determinant of the long-run rate of utilization of capital. Finally, it argues that when it comes to the normal rate of utilization it is the expected growth rate of demand that matters, not the level of demand. This insight provides a more straightforward way to link the adjustment at the micro and macro levels.
Article
Full-text available
This analysis evaluated the impact of price innovation on the telecommunication industry using a non-parametric approach based on linear programming, conducted to measure the relative efficiency of a set of similar Decision-Making Units (DMUs). The DEA model evaluated the efficiency of various prices for the different kinds of services, and each company was assigned a set of efficiency scores for the period 2009 to 2016. The data for this study were collected through each company's website and by visiting the companies for the period 2009-2016. The sample comprises four telecommunication companies in Congo. The analysis computed Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH) technical efficiency scores using the Efficiency Measurement System (EMS Version 1.3, 2000). The results showed that two out of seven variables were statistically significant at the 0.05 level. The negative coefficient on firm size indicates that a large firm is more likely to be efficient than a small firm. The results also show that a government-owned firm is more likely to be efficient than a domestically owned firm.
Article
Full-text available
Humanitarian concerns owing to the dreadfulness and impact of human trafficking prompted several stakeholders under the umbrella of the United Nations (UN) to approve legal measures to criminalize this menace. Several states that are parties to the UN anti-trafficking protocols and conventions have domesticated some of the provisions of these regulations by enacting comprehensive laws that criminalize the various components of human trafficking. Unfortunately, this approach has not brought about any significant reduction in the crime. This article adopted a qualitative approach and drew from the findings of a broader doctoral study. It evaluates the efficacy of current South African anti-trafficking legislation in the fight against human trafficking in the country. Findings indicate, among others, that anti-trafficking legislation is at best a stop-gap strategy in combating the crime, and not all-encompassing. It concludes that an effective response to human trafficking transcends the enactment of laws. Moreover, laws do not thrive in a vacuum, but rely on a range of factors, particularly the political will to address the underlying causes of a crime, and effective law enforcement capacity.
Article
Full-text available
This paper focused on the process of coordination between fiscal and monetary policies and the boundaries that prevent compromising the independence of central banks, especially in times of financial and economic crises, when the solutions and mechanisms used to mitigate or manage their consequences weaken the voices calling for the two policies to be separated in a way that ensures the success of each in achieving its goals. The research reviewed important literature and theories that stress the need for independence of central banks and fiscal policy, as well as the need for limited coordination between them in times of crisis, as they are the two most important policies within the framework of macroeconomic activity, provided that this necessary emergency coordination ends with the end of those crises. From the most important literature, the research concluded that coordination between monetary and fiscal policies is very necessary, indeed a moral obligation on decision-makers in both policies, in order to spare the economy and the people the negative effects resulting from crises. This coordination is also important in normal times to maximize welfare and achieve more strength and progress for the economy, provided that it does not affect the independence of central banks and does not negatively affect fiscal policy tools.
Keywords: monetary policy, fiscal policy, independence of the central bank, coordination between fiscal and monetary policy

Introduction: The financial and economic crises to which advanced and developing economies have been subjected, starting with the Great Depression in the 1930s and ending with the most recent economic crisis in 2019, changed many opinions and beliefs about the effectiveness of fiscal and monetary policies viewed in isolation from each other. The monetary-fiscal debate lost much of its force after the solutions applied to those crises proved the importance of both fiscal and monetary tools. In times of crisis, targeting inflation is no longer the only main objective on which independent monetary policy should focus, because of the devastating economic and social effects of deflation. As a result of the recurrence of these crises and the depth of their effects, fiscal policy has become unable on its own to confront and treat them, and unconventional tools have been used, owing to the inability of the traditional tools of monetary policy to work amid a strong economic downturn, especially once interest rates reached the zero bound and the liquidity trap became the dominant feature of customer decisions. This obliged monetary policy to coordinate with fiscal policy, especially in the field of public debt and in rescuing financial institutions troubled by crises. Owing to these developments in the work of both fiscal and monetary policies, a theoretical debate was launched among economists about the effectiveness of coordination between the two policies and its limits, and about the responsibility of decision-makers in the trade-off between saving the economy as a whole and maintaining their belief in the necessity of a complete separation between the tools of the two policies.
Thesis
Full-text available
The purpose of the study was to develop an optimum market-positioning model for the special interest tourism market to support arts festivals in South Africa (SA). Three subareas were deemed essential for the model: determining which attributes contribute to the success of three arts festival scenarios, comparing the different arts festival packages as tourism attractions, and combining these subareas to develop a model enabling future researchers and marketers to present a successful arts festival in South Africa. The three main arts festivals in South Africa, at Potchefstroom, Grahamstown and Oudtshoorn, were studied. Screening questions followed by judgmental and quota sampling were used to select only like-minded respondents from festival attendees on a scenario basis. Data were collected in personal interviews and then analysed using conjoint analysis and game theory. Conjoint analysis was used in a linear regression model with individual ratings for each product; the average of the r-squares in this study was 0.83, indicating a good fit between the data and the model developed. These results were then used in the game-theoretic analysis, comparing the three arts festival scenarios to identify the most successful tourism attraction. A different combination of attributes gave each of the three festival scenarios an optimum market position in its own niche market. The study contributes to the existing body of positioning knowledge, specifically in the festivals and events domain. It also adds value in that this model can be applied to other festivals in South Africa and to other business sectors.
Chapter
Interdisciplinary working is increasingly common in academic research projects, but presents a number of challenges. This chapter focuses on terminology and language across disciplines, and explores why this matters for understanding social responses to renewable energy technology. It demonstrates that definitions are important not just to ensure that representatives from different disciplines can talk to each other, but because they have ramifications for how social responses themselves are conceptualised. Indeed, the nature of the issue itself depends on how concepts are defined. The chapter explores the impact of terminology and suggests how to move forward in future interdisciplinary research.
Article
Full-text available
This paper aims at determining the role and impact of saving, investment and capital formation in the economic development of Nepal. The macroeconomic variables are introduced via an extension of the econometric model, which explicitly includes Almon's (1965) polynomial lag model. The empirical results have been estimated using annual data for the period 1974/75 to 2000/01, at current prices and in real terms, with the entire study period divided into different sub-periods. The study revealed the strong role and impact of saving, investment and capital formation on the economic development of Nepal. The estimated regression equations showed that both current and past values of saving, investment and capital formation have a positive impact on economic development, with the current values having the largest impact. The study also showed that the role played by investment in economic development is weak, while the roles played by saving and capital formation are strong.
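The Almon technique the paper builds on constrains the distributed-lag weights to lie on a low-degree polynomial, so a regression with many lags collapses to a handful of free parameters. A minimal sketch with simulated data (all series and coefficients here are invented for illustration, not Nepal's):

```python
import numpy as np

# Almon polynomial distributed lag: lag weights w_i = sum_j a_j * i**j,
# so L+1 lag coefficients reduce to deg+1 polynomial coefficients.
L, deg = 6, 2                      # number of lags, polynomial degree
rng = np.random.default_rng(0)
x = rng.normal(size=120)           # e.g. a saving or investment series
true_w = np.array([0.1, 0.3, 0.4, 0.3, 0.2, 0.1, 0.05])
y = np.convolve(x, true_w)[: len(x)] + 0.05 * rng.normal(size=len(x))

# Lag matrix: row t corresponds to time t+L, column i holds lag i of x.
T = len(x) - L
Xlag = np.column_stack([x[L - i : L - i + T] for i in range(L + 1)])

# Almon transform: project lags onto the polynomial basis P[i, j] = i**j,
# estimate the few polynomial coefficients, then recover the lag weights.
P = np.array([[i ** j for j in range(deg + 1)] for i in range(L + 1)], float)
a, *_ = np.linalg.lstsq(Xlag @ P, y[L:], rcond=None)
w_hat = P @ a                      # recovered lag weights
print(np.round(w_hat, 2))
```

With only three polynomial coefficients to estimate instead of seven lag weights, multicollinearity among the lagged regressors is greatly reduced.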
Chapter
The chapter explores the concept of productivity and its relationship with work–life integration. It discusses the connection of work process flow with work–life integration and the factors that make up an optimal productivity procedure. The chapter discusses critical inputs for optimal productivity flow within the framework of the work–life integration continuum. It also examines the effects of demographic shift, a changing work environment, and changing family structure on productivity, and the resultant influence on work–life integration. The factors that constitute value-add at the individual level may differ from those at the organizational level, but both affect productivity and ultimately impact work–life integration. The chapter discusses these factors, prioritization, time management, discipline, delayed decision-making, and scheduling, and concludes with a discussion of strategies for productivity improvement and their consequences for work–life integration.
Book
Full-text available
Stocks attract a large group of investors because they can be bought and sold easily. The common purpose of this group is to earn income from their investments. To this end, it is important to evaluate international markets, to analyse the sector and the firm, to compare other investment alternatives, and to take positions knowing the intrinsic value of the stocks to be bought. Knowing the intrinsic value gives investors the opportunity to compare it with the stock's current market price: if the intrinsic value is above the market price, the stock is interpreted as priced below its value; if, on the contrary, the market price exceeds the intrinsic value, the stock is interpreted as trading above it at inflated prices. The purpose of this study is to investigate whether the market prices of stocks traded in the BIST 100 Index in 2017 contained bubbles. To determine the intrinsic value of the stocks, the model developed by Frankel and Lee (1998) was used, and the level of positive or negative bubbles was then determined by comparing intrinsic values with current prices and measuring the deviation from the market price. According to the results, 90% of the 54 firms traded in the BIST 100 Index and included in the sample exhibited a positive bubble, trading in the market above their intrinsic value, while the remaining 10% were priced below it, a negative price bubble. In 2017, the year analysed, there was no firm whose stock price equalled its intrinsic value; in other words, no firm was free of a price bubble. Despite these price bubbles, the BIST 100 Index was the most significant investment instrument of 2017, a year in which the economic recession of 2016 was being overcome.
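The intrinsic-value comparison described above can be sketched with a residual-income valuation in the spirit of the Frankel and Lee (1998) model the study applies. All numbers below are hypothetical; the terminal-value treatment and payout ratio are my own simplifying assumptions, not the study's specification:

```python
# Residual-income intrinsic value: book value plus discounted residual
# income (ROE minus cost of equity, applied to book value); the last
# forecast year is treated as a perpetuity (an assumed terminal value).
def intrinsic_value(book0, roe_forecasts, cost_of_equity, payout=0.5):
    b, value = book0, book0
    for t, roe in enumerate(roe_forecasts, start=1):
        ri = (roe - cost_of_equity) * b          # residual income of year t
        if t < len(roe_forecasts):
            value += ri / (1 + cost_of_equity) ** t
            b += roe * b * (1 - payout)          # clean-surplus book growth
        else:                                    # perpetuity on the last year
            value += ri / (cost_of_equity * (1 + cost_of_equity) ** (t - 1))
    return value

# Hypothetical firm: book value 10, three ROE forecasts, 10% cost of equity.
v = intrinsic_value(book0=10.0, roe_forecasts=[0.15, 0.14, 0.13],
                    cost_of_equity=0.10)
market_price = 14.0
print("intrinsic:", round(v, 2),
      "bubble:", "positive" if market_price > v else "negative")
```

A market price above the computed intrinsic value is classified as a positive bubble, below it as a negative one, which is the comparison the study runs for each of the 54 sampled firms.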
Article
Full-text available
Public expenditure plays a role of great importance in the national economy. If the goal of public spending is to satisfy public needs, the questions are how far a State should go to satisfy them, and at what point urgent claims on expenditure should yield before they cause deterioration in the structure of the national economy and the failure of its various sectors. The structural distortion in the Iraqi economy, produced by political and social pressures, has reached a level that is difficult to diagnose, let alone treat, because priority has been given to urgent needs while the development of the economy has been neglected. This research aims to determine the optimal fiscal policy for achieving general economic and social goals, and to optimize public spending in accordance with the requirements of developing the economy and meeting the needs of society, thereby achieving general economic balance. The trade-off between alternatives in the allocation of resources must rest on a ranking of goals by priority; moreover, the requirements of social development and sustainability cannot be met unless there is structural change in the economy.
Article
Full-text available
In this paper, a Leontief linear production function with one product and one activity is used to derive the production function of the Abyek Cement Factory. The closed mathematical form of the production function, as well as the profit, cost, and demand functions for the factors of production, are obtained for the factory. We set out to calculate the operational production function of the Abyek Cement Factory and found that the Leontief linear production function is applicable and that its mathematical form can properly express the economic structure of production in a cement factory. The efficient production function for the factory is also derived in this research; it exhibits the costs incurred through the factory's inefficient production in different years. According to the findings, if the Abyek Cement Factory produced efficiently, employing optimal amounts of the factors of production, it could reduce costs by 21 to 52 percent without any change in the level of production. Calculations were done for both short-term and long-term periods. JEL: D22, L11, L61
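The Leontief technology underlying the paper is fixed-proportions: output is the minimum over inputs of quantity divided by the per-unit requirement. A minimal sketch (the input names and coefficients below are illustrative, not Abyek's estimated values):

```python
# Leontief production: output = min_k (x_k / a_k), where a_k is the
# amount of input k required per unit of output (fixed proportions).
def leontief_output(inputs, coefficients):
    return min(x / a for x, a in zip(inputs, coefficients))

# Hypothetical per-unit requirements and available input quantities.
a = {"clinker_t": 0.95, "electricity_kwh": 100.0, "labour_hours": 0.6}
x = {"clinker_t": 1900.0, "electricity_kwh": 250000.0, "labour_hours": 1500.0}

q = leontief_output([x[k] for k in a], [a[k] for k in a])
# The binding input (here clinker: 1900 / 0.95) fixes output; any excess
# of the other inputs is wasted, which is exactly the source of the
# inefficiency costs the study measures against the efficient frontier.
print("output:", q)
```

Producing efficiently means scaling all inputs to the same ratio x_k / a_k, so no input is held in excess.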
Chapter
The classical theory of competition is analysed as a dynamic process of rivalry in the struggle of units of capital (or firms) to gain the largest possible market share for themselves at the expense of their rivals. We argue that the classical dynamic theory of competition is characteristically different from the neoclassical static conception of competition as an end-state, where actual prices and quantities produced are compared to those that would have been established had perfect competition prevailed. In fact, the neoclassical analysis of competition is quantitative in nature for its focus is on the number (manyness or fewness) and also the size of contestants. After a comparison of the two characteristically different conceptualizations of competition, the analysis continues with deriving the laws of classical or real competition between and within industries and their integration with the mediation of regulating capital.
Chapter
The theories of value and distribution of Smith, Ricardo and Marx are presented and critically assessed so that they may become operational and, therefore, useful. We explain Ricardo’s successes and failures by referring to his numerical examples, from which we try to extract the core of a realistic approach to the estimation of natural prices as the centre of gravitation for market prices. We argue that Ricardo’s theory of value is intertemporal in character and its fundamental premise can be tested empirically. The discussion of Marx’s labour theory of value follows immediately after, and we explain his notions of abstract labour time, the two senses of socially necessary labour time as well as the concept of labour power. The latter enabled Marx to show the exploitative nature of the capitalist system and the production of value and surplus-value, all discovered through and evaluated by labour time. We further argue that the economic theories advanced by the old classical economists and Marx along with more recent theoretical developments following Sraffa’s (Production of commodities by means of commodities: A prelude to economic theory. Cambridge, UK: Cambridge University Press, 1960) book share the same set of data and may be fruitfully integrated into the classical political economics (CPE).
Article
Full-text available
Installment Financial Sharing (IFS) is a subsystem of the Rastin Profit and Loss Sharing (PLS) Banking System, and the guidelines, instructions, organization, workflow, and electronic mechanism of Rastin PLS Banking have been laid out for this subsystem as well. Profit in this financial sharing method is based upon the yield of the real sector; the bank, as an intermediary of funds, receives a commission like an agent, provides capital management and financial services to the financer (depositor), and participates in the entrepreneur's investment project on behalf of the depositor. In Installment Financial Sharing, the contribution of the depositor is paid back in installments, and ownership of the project is finally transferred to the entrepreneur. The financial innovations of the "Mughasatah Certificate", the "Musharakah Mughasatah Certificate", and the "Rental Mughasatah Certificate" are used in this subsystem. The financer (depositor) of a sharing project receives a certificate that is negotiable in the Rastin Certificate Market via the internet.
Article
Full-text available
Women are under-represented in leadership roles in United Kingdom Higher Education Institutions (HEIs). Existing scholarship focuses on institutional barriers, which include cognitive bias and entrenched homosocial cultures, rather than external factors such as the use of executive search firms (ESFs) in recruitment and selection. Recent research indicates that the use of ESFs is increasing for senior HEI appointments. This analysis offers insights on these firms’ involvement from a gender equality perspective, based on the results from a study that used a ‘virtuous circle’ approach to research and knowledge exchange. The requirement for HEIs to pay ‘due regard’ to equality considerations under the Public Sector Equality Duty provides a framework for analysis. This paper provides new insights on the dynamics within recruitment processes when ESFs are involved and on how a legislative approach can leverage better equality outcomes.
Article
Agent‐based computational economics (ACE) has been used for tackling major research questions in macroeconomics for at least two decades. This growing field positions itself as an alternative to dynamic stochastic general equilibrium (DSGE) models. In this paper, we provide a much-needed review and synthesis of this literature and of recent attempts to incorporate insights from ACE into DSGE models. We first review the arguments raised against DSGE in the macroeconomic ACE (macro ACE) literature, and then review existing macro ACE models, their explanatory power and empirical performance. We then turn to the literature on behavioural New Keynesian models, which attempts to synthesize these two approaches to macroeconomic modelling by incorporating insights of ACE into DSGE modelling. Finally, we provide a thorough description of the internally rational New Keynesian model, and discuss how this promising line of research can progress.
Article
Full-text available
Secondary data analysis can make it possible to address research questions with high-quality data that would otherwise be out of reach, especially for an early career researcher. For my PhD research, I investigated change in food consumption and associated practices across Europe. Coding, analysis, recoding, and further analysis of the data sets for equivalence and descriptive statistics exposed the trends and patterns that existed within them. It became obvious that country differences were still important. What also emerged was that older people in Italy and France have very different food expenditure patterns from older people in Ireland and the United Kingdom, which indicates different food consumption practices. These differences coincide with country differences that have been discussed in the nutritional literature and named "the Mediterranean diet" and "the French paradox", and they provide more insight into the health differences between older people in the countries studied. http://methods.sagepub.com/case/comparative-research-exploring-europes-changing-food-consumption-practices?fromsearch=true
Thesis
Full-text available
This research investigates the interrelations between financial development and economic development and/or economic growth. The concept of economic development is, from a theoretical point of view, broader than that of simple growth, since it also implies a qualitative, not merely quantitative, change in the factors of production and in the goods produced. In our analysis, however, the two terms "growth" and "development" may be used interchangeably, since it is not always easy to distinguish the two concepts in practice. The applied and econometric literature, for example, deals mainly with the concept of growth. Our view is, in any case, that conflating the two concepts does not undermine the results of our analysis, since an economy that does not grow can hardly develop; that is, an increase in the production of goods and services is a precondition for structural change in the system. The emergence of technical or financial constraints on the continuation of productive and commercial activities with a given technology triggers the search for "innovations" to overcome those constraints.
Article
Full-text available
At the end of 2006 I posted on my website a short article entitled "Eureka! Info-Gap is Worst Case Analysis (Maximin) in Disguise!" where I set out a formal, rigorous proof that info-gap's robust-satisficing decision model is a (Wald) maximin model. Since then I have outlined similar formal proofs in other articles, including peer-reviewed articles, and I have posted on my website a wealth of material supplementing this fact. Over the years I repeatedly called the attention of many info-gap scholars, including Prof. Yakov Ben-Haim, the father of info-gap decision theory, to the misleading rhetoric in the info-gap literature concerning the maximin connection. Regrettably, the misconceptions about this connection continue to be promulgated in the professional literature, including peer-reviewed journals such as Risk Analysis, whose referees should know better. They should know better because this matter is as good as self-evident; namely, it can be settled by inspection. For the question is this: is the model on the right-hand side an instance of the model on the left-hand side?
Prototype maximin model:
$$\max_{y\in Y}\ \min_{s\in S(y)}\ \{f(y,s) : \mathrm{con}(y,s),\ \forall s\in S(y)\} \tag{1}$$
Info-gap's robust-satisficing decision model:
$$\max_{q\in Q,\ \alpha\ge 0}\ \{\alpha : r_c \le r(q,u),\ \forall u\in U(\alpha,\tilde{u})\} \tag{1'}$$
where con(y, s) denotes a list of constraints on the (y, s) pairs.
The rhetoric in the info-gap literature on this issue has it that the two models "are different". For, consider this: "These two concepts of robustness—min-max and info-gap—are different, motivated by different information available to the analyst. The min-max concept responds to severe uncertainty that nonetheless can be bounded. The info-gap concept responds to severe uncertainty that is unbounded or whose bound is unknown. It is not surprising that min-max and info-gap robustness analyses sometimes agree on their policy recommendations, and sometimes disagree, as has been discussed elsewhere. (40)" Ben-Haim (2012, p. 7), where reference [40] is Ben-Haim et al. (2009).
The implication therefore must be that Risk Analysis referees are apparently of the opinion that the following model, where R denotes the real line, namely R := (−∞, ∞), is not a minimax model:
$$z^{*} := \min_{x\in\mathbb{R}}\ \max_{y\in\mathbb{R}}\ \{x^{2} + 2xy - y^{2}\}. \tag{2}$$
Or could it be that these referees hold that, insofar as Risk Analysis is concerned, the interval (−∞, ∞) is bounded? One wonders. The incontestable fact obviously is that the above model is a perfectly kosher minimax model and the real line R remains unbounded. The conclusion therefore must be that Risk Analysis referees are unaware of the fact that info-gap's robustness model and info-gap's robust-satisficing decision model are both maximin models. Specifically, they are unaware that these models are rather simple instances of the following prototype maximin model:
$$z^{\circ} := \max_{y\in Y}\ \min_{s\in S(y)}\ \{f(y,s) : \mathrm{con}(y,s),\ \forall s\in S(y)\}. \tag{3}$$
Or, if you will, these models are simple instances of the following "textbook" maximin model:
$$z := \max_{y\in Y}\ \min_{s\in S(y)}\ g(y,s). \tag{4}$$
This being so, the implication is that Risk Analysis referees second the absurd proposition that a simple instance of a prototype model is capable of representing situations that the prototype itself cannot represent. Namely, Risk Analysis referees accept the astounding proposition that while maximin models cannot handle unbounded uncertainty spaces, info-gap's robustness model indeed can! Again, one wonders.
It is important to take note that claims that info-gap's robust-satisficing decision model is not a maximin model are based on a comparison of these two models:
Maximin model:
$$\max_{q\in Q}\ \min_{u\in U(\alpha^{\circ},\tilde{u})}\ r(q,u) \tag{5}$$
Info-gap's robust-satisficing decision model:
$$\max_{q\in Q,\ \alpha\ge 0}\ \{\alpha : r_c \le r(q,u),\ \forall u\in U(\alpha,\tilde{u})\} \tag{5'}$$
where α° is a given value of α. But the point to note here is that this is a non sequitur par excellence. That is, the fact that the model in (5) is dissimilar from the model in (5') does not imply that the latter is not a maximin model. Indeed, it is elementary to show that info-gap's robust-satisficing decision model is a maximin model. It is therefore mind-boggling that info-gap scholars who base their claims on the comparison of (5) with (5') do not bother to consider the following comparison:
Maximin model:
$$\max_{q\in Q,\ \alpha\ge 0}\ \min_{u\in U(\alpha,\tilde{u})}\ \{h(q,\alpha,u) : r_c \le r(q,u),\ \forall u\in U(\alpha,\tilde{u})\} \tag{6}$$
Robust-satisficing decision model:
$$\max_{q\in Q,\ \alpha\ge 0}\ \{\alpha : r_c \le r(q,u),\ \forall u\in U(\alpha,\tilde{u})\} \tag{6'}$$
Because the fact that info-gap's robust-satisficing decision model is an instance of the maximin model shown in (6) simply stares one in the face! So, again, one wonders.
This state of affairs raises a number of questions. For instance, consider these two:
· Considering how easy it is to show/prove/verify that info-gap's robustness model and info-gap's robust-satisficing decision model are both maximin models, on what grounds do info-gap scholars claim, and Risk Analysis referees apparently concur, that these models are not maximin models?
· Why is it important to be clear on the fact that info-gap's robustness model and info-gap's robust-satisficing decision model are simple maximin models?
I take up the first question in the sequel. At this stage I address only the second question, whose answer is in four parts:
· Info-gap decision theory is being proclaimed a new theory that is radically different from all current theories of decision under uncertainty. Showing that its two core models are in fact simple instances of the most famous non-probabilistic robustness model used in the broad area of decision making, risk analysis, and so on, demonstrates how groundless this claim is. More importantly, this fact raises serious questions about the narrative in the info-gap literature on Wald's maximin model, worst-case analysis, control theory, and so on. In short, it calls into question statements made in the info-gap literature about classic decision theory (Luce and
· The info-gap literature is saturated with misleading pronouncements on Wald's maximin paradigm and its many variant models: on its capabilities and limitations and on its relation to info-gap's robustness model and info-gap's robust-satisficing decision model. It is regrettable that such pronouncements have found their way into peer-reviewed journals such as Risk Analysis. It is important therefore to dispense with the fallacies about Wald's maximin paradigm that continue to be disseminated by such journals.
· It is important that readers take special note of the following facts. Articles such as Ben-Haim (2012), denying that info-gap's robust-satisficing decision model is a maximin model, and articles such as Schwartz et al. (2010), seeking to promote info-gap's robust-satisficing approach as a new normative standard of rational decision making, are engaged in a blatant misrepresentation of the state of the art in the broad area of decision making, especially of the field of robust optimization.
· Indeed, despite the fact that both info-gap's robust-satisficing decision model and info-gap's robustness model are simple robust optimization models, not a single reference to robust optimization can be found in these two articles, nor in the three books on info-gap decision theory (Ben-Haim 2001, 2006, 2010). In fact, it would seem that every effort is made to avoid any discussion of robust optimization, and this despite the fact that the robust-satisficing approach advocated by info-gap decision theory is a simplistic, indeed naive, robust optimization approach. It is important that referees of journals such as Risk Analysis be aware of these facts and their implications.
A close examination of info-gap's misleading rhetoric on the maximin connection reveals that info-gap scholars, and by implication Risk Analysis referees, have serious misconceptions about the following:
· The difference between local and global worst-case analysis.
· The difference between local and global robustness.
· The difference between robustness with respect to payoffs and robustness with respect to constraints.
· The relation between a prototype model and its instances.
These misconceptions are merely touched on in this article, for its main objective is to introduce referees of journals such as Risk Analysis to the rhetoric in the info-gap literature on the relationship between this theory and Wald's maximin paradigm. The rhetoric surrounding the profound incongruity between the severity of the uncertainty postulated by info-gap decision theory and the model of local robustness that the theory deploys to manage this uncertainty will be discussed in a separate article entitled "Rhetoric in Risk Analysis, Part II: Anatomy of a Peer-Reviewed Voodoo Decision Theory".
* This article was written for the Risk Analysis 101 Project to provide a Second Opinion on pronouncements on the relationship between Wald's maximin paradigm and info-gap's robust-satisficing approach to decision making under severe uncertainty, published recently in Risk Analysis. See Risk-Analysis-101.moshe-online.com.
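The maximin connection at issue is easy to check numerically. Below is a small toy sketch of my own (not from the article): for a scalar reward r(q, u) = q·u and interval uncertainty U(α, ũ) = [ũ − α, ũ + α], the robust-satisficing value of α is found by applying a worst-case test at every candidate α, which is precisely the inner min of a maximin model:

```python
import numpy as np

# Toy example: reward r(q, u) = q * u with q > 0, and interval uncertainty
# U(alpha, u0) = [u0 - alpha, u0 + alpha]. Info-gap robustness is the
# largest alpha whose WORST u in U(alpha, u0) still meets the target r_c.
def worst_case_reward(q, alpha, u0):
    # r is linear in u, so the worst case sits at an endpoint of the interval.
    return min(q * u for u in (u0 - alpha, u0 + alpha))

def robustness(q, r_c, u0, alphas):
    feasible = [a for a in alphas if worst_case_reward(q, a, u0) >= r_c]
    return max(feasible) if feasible else 0.0

q, r_c, u0 = 2.0, 3.0, 2.5
alphas = np.linspace(0.0, 2.5, 2501)       # grid search over alpha
a_hat = robustness(q, r_c, u0, alphas)

# For this linear toy case the closed form is u0 - r_c / q = 1.0, and the
# grid search agrees: the robust-satisficing alpha is computed by an inner
# worst-case (min over u) analysis, i.e. a maximin structure.
print(round(a_hat, 3))
```

The point of the sketch is structural, not numerical: the feasibility test inside `robustness` is a min over the uncertainty set, so maximizing α subject to it is an instance of the prototype maximin model.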
Article
The concept of equilibrium serves to identify the domain of economic theory within current research in economics. This preponderant place has several historical origins. Even before the notion of equilibrium itself appeared, equilibrium arguments could be found in seventeenth-century writings on trade. It entered the vocabulary of political economists at the end of the eighteenth century, as an analogical and above all normative concept referring to the natural order of the economy. At the end of the nineteenth century it acquired an analytical meaning, chiefly through the distinction between static and dynamic equilibrium. Between the 1940s and the 1960s the concept evolved into a metatheoretical notion, becoming the cornerstone of the axiomatic analysis of general equilibrium, which would in turn become the foundation of today's mathematical modelling. The concept is also found in game theory, where it is the principal tool of analysis without being restricted to the study of competitive markets; it allows the formalization of institutional environments and of strategic interactions at the level of an individual or a group. In almost every macroeconomic model, general equilibrium theory remains the principal point of reference.
Article
The main aim of the paper is to examine the advantages and disadvantages of applying analogies and metaphors based on systems thinking to describing and explaining change and the status quo in international systems. The concept of "stability" serves as the best example for demonstrating the impact of classic cybernetics-based systems thinking on the language of international relations theory and practice. In theoretical discussions it is often replaced with other systems analogies and metaphors - turbulence and chaos, as well as autopoiesis and self-organization. These terms seem particularly relevant to the emerging post-Cold War international situation.
Chapter
This chapter reviews the experimental literature on ambiguity attitudes, focusing on three topics. First, it considers various approaches to operationalize ambiguity in experiments. Second, the chapter reviews basic findings in the field regarding the prevalence of ambiguity aversion and ambiguity seeking in static individual decision situations. Third, it looks at studies that probe the external validity of these basic results. Finally, the chapter summarizes the limited evidence on the link between experimental measures of ambiguity attitude and people's decisions in the field. The chapter considers only experimental work on ambiguity attitude, complementing a few review articles mostly focusing on theoretical work. Evidence on home bias and source preference is also discussed. The chapter highlights that the review of potential moderators of ambiguity attitude, the underlying psychological mechanisms, and its relation to behavior outside the laboratory, revealed mixed results.
Chapter
Economies of scale appear when total average cost of production decreases as the level of production increases. The reverse relationship holds when diseconomies of scale are present. Determinants of the precise nature of this relationship may be the technology of production, the organizational structure of a firm, or possibly the level of expertise in producing a certain product or service.
Article
Hedonic methods are considered state of the art for handling quality changes when compiling consumer price indices. The present article proposes first a mathematical description of characteristics and of elementary aggregates. In a following step, a hedonic econometric model is formulated and hedonic elementary population indices are defined. We emphasise that population indices are unobservable economic parameters that need to be estimated by suitable sample indices. It is shown that within the framework developed here, many of the hedonic index formulae used in practice are identified as sample versions corresponding to particular hedonic elementary population indices. The article closes with an empirical part on quarterly housing data where the considered hedonic indices are estimated along with their bootstrapped confidence intervals. It is shown that the computed confidence intervals together with the results from theory suggest a particular answer to the price index problem.
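One common sample version of a hedonic elementary index is the time-dummy method: regress log price on quality characteristics plus period dummies, and exponentiate the dummy coefficients. A minimal sketch with simulated housing data (the characteristic, sample sizes, and index path below are invented, not the article's data; bootstrapped confidence intervals, which the article computes, are omitted):

```python
import numpy as np

# Simulated quarterly housing data: price depends on a quality
# characteristic (floor size) plus a per-quarter price level.
rng = np.random.default_rng(1)
n, quarters = 400, 4
size = rng.uniform(40, 160, n)                  # m^2, quality characteristic
q = rng.integers(0, quarters, n)                # quarter of sale
true_level = np.array([0.0, 0.02, 0.05, 0.04])  # log price level per quarter
logp = 10 + 0.008 * size + true_level[q] + 0.03 * rng.normal(size=n)

# Time-dummy hedonic regression: intercept, characteristic, and dummies
# for quarters 1..3 (quarter 0 is the base period).
D = np.column_stack([(q == t).astype(float) for t in range(1, quarters)])
A = np.column_stack([np.ones(n), size, D])
beta, *_ = np.linalg.lstsq(A, logp, rcond=None)

# Quality-adjusted price index: exponentiated dummy coefficients,
# normalised to 1.0 in the base quarter.
index = np.exp(np.concatenate([[0.0], beta[2:]]))
print(np.round(index, 3))
```

Because size is controlled for in the regression, the dummy coefficients capture pure price change net of quality differences between the dwellings sold in each quarter, which is the point of hedonic quality adjustment.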