Abstract
While industrial marketing often comprises a process that, at least in principle, mirrors Bayesian reasoning, the notion of Bayesian inference has predominantly been utilized in the marketing field as a methodological tool. This article suggests that the practice of industrial marketing itself should be (re)conceptualized as a Bayesian process of belief-updating that entails a continuous cognitive cycle of formulation of hypotheses (i.e., beliefs about the market) and the subsequent updating of those hypotheses through exposure to market evidence (e.g., data from the market). A Bayesian perspective on industrial marketing enables a synthesis of a broad body of extant research as well as a focus on the interconnection between executives' market beliefs (theories-in-use) and belief-updating (assessing the validity of those beliefs in view of market evidence). A view of industrial marketing as a Bayesian process not only enhances our understanding in general but also fosters insights into market learning in uncertain and volatile situations. A Bayesian conceptualization suggests a new understanding of industrial marketing that also informs a typology of marketing approaches. We outline opportunities for developing a better understanding of the Bayesian foundation of industrial marketing.
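Because the article's thesis is procedural, one turn of the proposed belief-updating cycle can be made concrete. The sketch below is our illustration, not the authors' model; the hypothesis labels, priors, and likelihood values are invented assumptions:

```python
# One turn of the Bayesian market-learning cycle described in the abstract.
# Hypotheses, priors, and likelihoods are illustrative assumptions.
hypotheses = ["demand is growing", "demand is flat", "demand is declining"]
prior = [0.5, 0.3, 0.2]        # executives' initial beliefs (theories-in-use)

# P(evidence | hypothesis), e.g. a quarter of rising customer inquiries
likelihood = [0.7, 0.4, 0.1]

unnorm = [l * p for l, p in zip(likelihood, prior)]   # Bayes' rule numerator
posterior = [u / sum(unnorm) for u in unnorm]         # normalize

for h, p in zip(hypotheses, posterior):
    print(f"P({h} | evidence) = {p:.3f}")
# The posterior becomes the prior for the next exposure to market evidence.
```

The point of the sketch is the cycle itself: each pass turns market evidence into revised beliefs, which then frame the next round of hypothesis formulation.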
... The study's theoretical background is the theory of sectoral markets (Peters et al., 2013; Pedersen and Ritter, 2022) and resource-energy cycles (Mensah and Castro, 2004; Shpilevsky and Lelyuk, 2011). The research is based on the parametric identification method, which involves finding the parameter estimates of a model that bring its outputs as close as possible to the observed values for the same inputs (Ashby, 1957); a minimal sketch follows the abstract below. ...
The paper explores 29 models of the European electricity markets and the degree of their deviation from the single electricity market model established by trans-European legislation. Although all models correspond to the design of the pan-European single market, they remain significantly differentiated by trading forms and pricing methods. The electric power system in Ukraine has developed in the opposite direction to the European ones, and the electricity market is quasi-competitive. The study proposes a novel model for the Ukrainian electricity market, which considers the European acquis communautaire, advanced practice in electricity market development, and the specifics of the Ukrainian electric power system.
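As a hedged illustration of the parametric identification method mentioned in the citing passage above, the sketch below fits the parameters of a toy linear model so that its outputs are closest, in the least-squares sense, to observed outputs for the same inputs. The model form and the data are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

# Parametric identification in Ashby's sense: choose parameter estimates
# that bring model outputs as close as possible to observed values.
# The linear model and the data below are illustrative assumptions.
def model(x, a, b):
    return a * x + b                           # stand-in for a market model

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # inputs
y_obs = np.array([2.1, 3.9, 6.2, 7.8, 10.1])   # observed outputs

(a_hat, b_hat), _ = curve_fit(model, x, y_obs) # least-squares identification
print(f"identified parameters: a = {a_hat:.2f}, b = {b_hat:.2f}")
```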
... The key takeaway regarding marketing competencies is that it is important to temper one's confidence in knowledge, experience, and skills that rest on established mental models and theories [40]. It follows that marketers must update their beliefs and assumptions according to observed data [41]. Availability of IT competencies in self-employed people. ...
Self-employment in the Russian Federation is a special tax regime: a simplified form of entrepreneurship in which personal income is taxed. The self-employed are often associated with freelancers. The exponential growth of information increases uncertainty, while the development of digitalization helps to level out that uncertainty. This work analyses the factors influencing the digitalization of self-employment as an integral indicator that can affect the sustainability of self-employment. The main method is a topological method based on the polymerase chain reaction method, together with a model based on fuzzy-set theory: the Mamdani fuzzy inference algorithm. The data for the study were collected through a survey posted on Google Forms. The respondents were experts in the self-employment sector. Eight people participated in the survey (4 self-employed; 4 university professors). The self-employed represented the following areas: developer (1), service worker (1), online marketer (1), musician and event host (1). Further calculations were performed in MATLAB. According to the study results, the level of factors in the development of self-employed digitalization is 0.502, which corresponds to the third interval of the five-level classifier and has growth potential.
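The Mamdani inference step reported above (the study used MATLAB) can be sketched in a self-contained way. The membership functions, the single input, and the two rules below are illustrative assumptions, not the study's actual rule base:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership; degenerate sides act as shoulders."""
    x = np.asarray(x, dtype=float)
    left = (x - a) / (b - a) if b > a else np.ones_like(x)
    right = (c - x) / (c - b) if c > b else np.ones_like(x)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

u = np.linspace(0.0, 1.0, 501)        # output universe: digitalization level

rating = 0.55                          # assumed expert rating of IT competence
low_in = tri(rating, 0.0, 0.0, 0.6)    # membership in "low competence"
high_in = tri(rating, 0.4, 1.0, 1.0)   # membership in "high competence"

# Mamdani inference: clip each output set by its rule's firing strength (min),
# aggregate with max, then defuzzify by centroid.
low_out = np.minimum(low_in, tri(u, 0.0, 0.0, 0.5))
high_out = np.minimum(high_in, tri(u, 0.5, 1.0, 1.0))
aggregated = np.maximum(low_out, high_out)

crisp = float((aggregated * u).sum() / aggregated.sum())  # discrete centroid
print(f"defuzzified digitalization level: {crisp:.3f}")
```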
Structural health monitoring (SHM) has been recognized as a useful tool for safety management and risk reduction of offshore wind farms. In complex offshore environments, jacket structures of offshore wind turbines are prone to damage due to corrosion and fatigue. Effective SHM of jacket structures can substantially reduce their operational risk and costs. This work reviews the latest progress on the SHM of offshore wind jacket structures. The achievements in structural damage identification, location, quantification, and remaining useful life (RUL) estimation are introduced in detail, and existing challenges are discussed. Possible solutions to the challenges using Digital Twin (DT) technology are put forward. The DT is able to mirror a real jacket structure into a virtual model, and Bayesian updating can refresh the virtual model parameters in real time to keep the virtual model consistent with the physical structure; just-in-time SHM can then be carried out for jacket structures by performing damage detection, location, quantification, and RUL estimation on the virtual model. As a result, the DT may provide engineers and researchers with a practicable tool for safety monitoring and risk reduction of fixed-foundation offshore wind structures.
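The Bayesian-updating step that keeps the virtual model consistent with the physical structure can be illustrated, in its simplest conjugate form, as follows. All parameter values and data are invented assumptions; a real digital twin would update a full finite-element model rather than a single scalar:

```python
import numpy as np

# Conjugate normal-normal update of a stiffness-ratio parameter from
# sensor-identified values. All numbers are illustrative assumptions.
mu0, tau0 = 1.00, 0.10        # prior: design-stage stiffness ratio and std dev
sigma = 0.05                  # assumed measurement noise std dev

rng = np.random.default_rng(0)
y = rng.normal(0.92, sigma, size=20)   # simulated identified-stiffness data
n, ybar = len(y), y.mean()

# Posterior for a normal mean with known noise variance
post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
post_mu = post_var * (mu0 / tau0**2 + n * ybar / sigma**2)

print(f"posterior stiffness ratio: {post_mu:.3f} ± {np.sqrt(post_var):.3f}")
# A posterior drifting below 1.0 flags possible corrosion/fatigue damage.
```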
Empathy, that is, the capacity of understanding another person's perspective and feelings, has conventionally been a pillar of marketing, as the discipline from its very inception has emphasized the importance of understanding and putting oneself in the shoes of customers. Yet, with an increasing focus on rationality, objectivity, and science, marketing has arguably become more empathy‐deprived. This is unfortunate, as the increasing objectification, specialization, and the resultant distancing and fragmentation of the marketing field make it proportionally essential to emphasize an empathetic approach to marketing. Here, I address this gap by (re)introducing an empathetic approach to marketing entitled empathy‐based marketing. Empathy‐based marketing addresses the increasing customer‐distancing and subdivision of the field of marketing, by focusing on an empathetic core, stimulating cross‐fertilization, and accentuating the general well‐being of the field. Beyond discussing what this paradigm is and why the field needs it, an agenda for future research is also outlined.
Customer solutions have been touted as the next service-growth engine. Yet, pursuing a solutions strategy can seriously backfire in times of severe crises. The massive economic shock wave brought on by the recent COVID-19 pandemic challenges some of the presumed advantages of business-to-business customer solutions and reveals downsides of these complex offerings to which academics and managers alike may have given insufficient attention. This editorial focuses on goods-centered companies’ recent foray into the solution business and the pressing managerial questions regarding the evolution of solutions as the world begins to emerge from the COVID-19 pandemic. Based on the key characteristics of solution offerings, we identify seven potential downsides of customer solutions that are revealed by the current global crisis and develop promising research avenues mirroring these challenges. In each area, we propose three illustrative sets of research questions that may guide scholars and provide insights to practitioners for positioning solution businesses in the post-COVID-19 “next-normal” world.
Bayesian analysis constitutes an important pillar for assessing and managing risk, but it also has some weaknesses and limitations. The main aims of the present paper are to summarize the scope and boundaries of Bayesian analysis in a risk setting, point to critical issues and suggest ways of meeting the problems faced. The paper specifically addresses the Bayesian perspective on probability and risk, probability models, the link between probability and knowledge, and Bayesian decision analysis. A main overall conclusion of the paper is that risk analysis has a broader scope and framing than Bayesian analysis, and that it is important for risk assessment and management to acknowledge this and build approaches and methods that extend beyond the Bayesian paradigm. To adequately assess and handle risk it is necessary to see beyond risk as commonly defined in Bayesian analysis.
This study draws on an extensive survey and interview data collected during the COVID-19 pandemic. The respondents were executives of industrial firms whose factories, warehouses, and headquarters are located in Northern Italy. This is undoubtedly the European region first and most extensively affected by the pandemic, and the government implemented radical lockdown measures, banning nonessential travel and mandating the shutdown of all nonessential businesses. Several major effects on both product and service businesses are highlighted, including the disruption of field-service operations and supply networks. This study also highlights the increased importance of servitization business models and the acceleration of digital transformation and advanced services. To help firms navigate through the crisis and be better positioned after the pandemic, the authors present a four-stage crisis management model (calamity, quick & dirty, restart, and adapt), which provides insights and critical actions that should be taken to cope with the expected short- and long-term implications of the crisis. Finally, this study discusses how servitization can enhance resilience for future crises, providing a set of indicators on the presumed role of, and impact on, service operations in relation to what executives expect to be the "next normal."
Regression to the mean is nice and reliable. Regression to the tail is reliably scary. We live in the age of regression to the tail. It is only a matter of time until a pandemic worse than COVID-19 hits us, and climate more extreme than any we have seen. What are the basic principles that generate such extreme risk, and how can government, business, and the public navigate it?
As a powerful means of theory building, conceptual articles are increasingly called for in marketing academia. However, researchers struggle to design and write non-empirical articles because of the lack of commonly accepted templates to guide their development. The aim of this paper is to highlight methodological considerations for conceptual papers: it is argued that such papers must be grounded in a clear research design, and that the choice of theories and their role in the analysis must be explicated and justified. The paper discusses four potential templates for conceptual papers – Theory Synthesis, Theory Adaptation, Typology, and Model – and their respective aims, approach for using theories, and contribution potential. Supported by illustrative examples, these templates codify some of the tacit knowledge that underpins the design of non-empirical papers and will be of use to anyone undertaking, supervising, or reviewing conceptual research.
This article’s objective is to inspire and provide guidance on the development of marketing knowledge based on the theories-in-use (TIU) approach. The authors begin with a description of the TIU approach and compare it with other inductive and deductive research approaches. The benefits of engaging in TIU-based research are discussed, including the development of novel organic marketing theories and the opportunity to cocreate relevant marketing knowledge with practitioners. Next, they review criteria for selecting research questions that are particularly well-suited for examination with TIU-based research. This is followed by detailed suggestions for TIU research: focusing on developing new constructs, theoretical propositions (involving antecedents, moderators, and consequences), and arguments for justifying theoretical propositions. A discussion of TIU tradecraft skills, validity checks, and limitations follows. The authors close with a discussion of future theory-building opportunities using the TIU approach.
Given the scant attention paid to Bayesian inference in the academic sales literature, researchers could be forgiven for believing that frequentist methods provide the only feasible way for sales researchers to derive important insights for both theory and practice. The purpose of this research is to demonstrate that this belief overlooks the considerable value that Bayesian inference can provide to sales theory and practice. In so doing, we outline fundamental differences between Bayesian and frequentist methods, and describe how these differences can lead to different empirical insights. We review the extant literature that employs Bayesian methods, with an emphasis on how these studies provide insight that may elude frequentist methods. Then, using a sample of 146 B2B salespeople, we empirically demonstrate that the use of Bayesian methods is within the methodological reach of the vast majority of sales researchers and can provide empirical insights from the same dataset that differ from those of frequentist methods. We then provide some future research ideas to encourage sales researchers to employ Bayesian methods in their own research. Finally, in hopes that readers do not view Bayesian inference as a "silver bullet", we examine some drawbacks and limitations of this intriguing method of statistical inference.
A new product’s success in the marketplace largely depends on salesforce actions. Many B2B salespeople display conservatism when confronted with new products in their portfolio, such that they maximize their efforts to sell existing products before engaging in efforts to sell the new product. So far, it is unclear whether this conservative selling behavior (CSB) is harmful to new product selling performance, and how this behavior can be managed. Building on perceived risk processing theory, and employing multi-level structural equation modeling on a multi-source dataset, the authors empirically substantiate that salespeople’s CSB makes their effort to sell new products more effective. Remarkably, such effort is then valued less by sales managers. The authors also find that CSB is a result of a risk assessment and evaluation process, in which internal marketing efforts (i.e., providing salespeople with information on the new product) determine the weight of perceived performance risk (i.e., new product radicalness), social risk (i.e., managerial new product orientation), and financial risk (i.e., long-term rewards). Managers looking to control the levels of CSB in their salesforce should carefully align their information support activities with the perceived risk dimensions of the new product selling situation.
This paper introduces the Bayesian revolution that is sweeping across multiple disciplines but has yet to gain a foothold in organizational research. The foundations of Bayesian estimation and inference are first reviewed. Then, two empirical examples are provided to show how Bayesian methods can overcome limitations of frequentist methods: (a) a structural equation model of testosterone's effect on status in teams, where a Bayesian approach allows directly testing a traditional null hypothesis as a research hypothesis and allows estimating all possible residual covariances in a measurement model, neither of which are possible with frequentist methods; and (b) an ANOVA-style model from a true experiment of ego depletion's effects on performance, where Bayesian estimation with informative priors allows results from all previous research (via a meta-analysis and other previous studies) to be combined with estimates of study effects in a principled manner, yielding support for hypotheses that is not obtained with frequentist methods. Data are available from the first author, code for the program Mplus is provided, and tables illustrate how to present Bayesian results. In conclusion, the many benefits and few hindrances of Bayesian methods are discussed, where the major hindrance has been an easily solvable lack of familiarity by organizational researchers.
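One of the paper's two demonstrations, combining prior research with a new estimate through informative priors, reduces in its simplest conjugate form to precision weighting. The sketch below is our illustration with invented numbers, not the paper's Mplus analysis:

```python
# Precision-weighted pooling of a meta-analytic prior with a new study's
# estimate. The effect sizes and standard errors are illustrative assumptions.
prior_mean, prior_se = 0.62, 0.05   # assumed meta-analytic effect and SE
study_mean, study_se = 0.30, 0.20   # assumed new-study effect and SE

w_prior, w_study = 1 / prior_se**2, 1 / study_se**2
post_mean = (w_prior * prior_mean + w_study * study_mean) / (w_prior + w_study)
post_se = (w_prior + w_study) ** -0.5

print(f"posterior effect: {post_mean:.3f} (SE {post_se:.3f})")
# The precise prior dominates the noisy new estimate - exactly how
# informative priors let prior research discipline a single study.
```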
Given the prevalence of statistical techniques that use probability to quantify uncertainty, the aim of this article is to highlight the theoretical aspects and implications of the major current probability interpretations that justify the development and use of such techniques. After briefly sketching the origins and development of the notion of probability, its theoretical interpretations will be outlined. Two main trends will be distinguished: one epistemic and one empirical, corresponding to the twofold meaning characterizing probability. The epistemic type embodies the so-called classical theory put forward by Laplace as well as the logical and subjective approaches. By contrast, the frequency and propensity theories are, in theory, empirical in character. This way of understanding probability contrasts with both the tenet that there is a Bayesian interpretation of probability and the tendency to conflate Bayesian probability with the subjective interpretation, both of which are misleading for reasons that emerge from the following discussion. The final section of the paper addresses the question of which type of probability is best suited for the organization sciences and suggests the subjective interpretation as the best option by virtue of its pluralism and awareness of context.
In management research, empirical data are often analyzed using p-value null hypothesis significance testing (pNHST). Here we outline the conceptual and practical advantages of an alternative analysis method: Bayesian hypothesis testing and model selection using the Bayes factor. In contrast to pNHST, Bayes factors allow researchers to quantify evidence in favor of the null hypothesis. Also, Bayes factors do not require adjustment for the intention with which the data were collected. The use of Bayes factors is demonstrated through an extended example for hierarchical regression based on the design of an experiment recently published in the Journal of Management (Dierdorff et al., 2012). This example also highlights the fact that p-values overestimate the evidence against the null hypothesis, misleading researchers into believing that their findings are more reliable than is warranted by the data.
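For readers who want to see the mechanics, the sketch below computes a Bayes factor for a nested regression comparison via the standard BIC approximation, BF01 ≈ exp((BIC1 − BIC0)/2), on simulated data. It is not the Dierdorff et al. (2012) analysis, and the hierarchical structure of that example is omitted:

```python
import numpy as np

# Simulated data in which x2 truly has no effect.
rng = np.random.default_rng(1)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 0.4 * x1 + rng.normal(size=n)

def bic(y, X):
    """BIC of an OLS regression (Gaussian likelihood, ML variance)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(((y - X @ beta) ** 2).sum())
    m, k = len(y), X.shape[1]
    return m * np.log(rss / m) + k * np.log(m)

X0 = np.column_stack([np.ones(n), x1])        # null model: x2 excluded
X1 = np.column_stack([np.ones(n), x1, x2])    # alternative: x2 included

bf01 = np.exp((bic(y, X1) - bic(y, X0)) / 2)  # BF in favor of the null
print(f"BF01 = {bf01:.2f}")  # > 1: evidence FOR the null, unlike a p-value
```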
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided.
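To make the "ingredients" the abstract refers to concrete, here is a minimal conjugate (beta-binomial) sketch of prior plus data yielding a posterior; the prior and the data are illustrative assumptions, not from the study:

```python
from scipy import stats

# Beta prior + binomial data -> beta posterior (conjugacy).
a, b = 2, 2                  # weakly informative Beta(2, 2) prior on a rate
successes, trials = 14, 20   # observed data (assumed)

posterior = stats.beta(a + successes, b + trials - successes)
lo, hi = posterior.ppf([0.025, 0.975])
print(f"posterior mean = {posterior.mean():.3f}, 95% CrI = [{lo:.3f}, {hi:.3f}]")
# Re-running with Beta(1, 1) or Beta(20, 20) shows that prior choice matters
# most when data are scarce - the pitfall the abstract flags.
```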
Competition in the information age often takes the form of a standards war: a battle for market dominance between incompatible technologies. A company's success or failure can easily hinge on its ability to wage such a standards war. Standards wars are especially bitter in markets with strong network effects, where consumers place great value on compatibility and interconnection with each other. These markets tend to exhibit positive feedback and "tip" to a single winner. Based on a study of dozens of standards wars going back over 100 years, this article offers a "battle guide" for waging a standards war. After classifying standards wars and identifying seven key assets that firms can use to successfully establish a new technology, the authors recommend three tactics in standards battles: building alliances, exploiting first-mover advantages, and managing consumer expectations.
The goal of every author is to write a paper that readers (and reviewers) find convincing. Since writers of papers based on case research do not have recourse to the canonical statement "results are significant at p < 0.05" that helps assuage readers' skepticism of empirical papers, researchers using case research often feel they are fighting an uphill battle to persuade their readers. In this short essay, I provide some thoughts guided by my experience of reading, reviewing, and writing papers based on case-based research over the last decade. These are clearly only the views of this particular writer and thus should be taken with a considerable grain of salt. I am seeking here more to provoke thought than to provide answers. What makes a case study persuasive? The first ...
Fifty years after Converse's (1945) classic statement on the "art or science of marketing", the debate has come full circle. The holy grail of Science has not been attained and its pursuit has not only served to alienate practitioners from academics, but it has also done enormous damage to our discipline. This paper traces the development of the great debate, discusses the damaging postmodern critique of western Science and concludes that, as an Art, marketing should be judged by appropriately aesthetic criteria.
In a situation where several hundred new music albums are released each month, producing sales forecasts in a reliable and consistent manner is a rather difficult and cumbersome task. The purpose of this study is to obtain sales forecasts for a new album before it is introduced. We develop a hierarchical Bayesian model based on a logistic diffusion process. It allows for the generalization of various adoption patterns out of discrete data and can be applied in a situation where the eventual number of adopters is unknown. Using sales of previous albums along with information known prior to the launch of a new album, the model constructs informed priors, yielding prelaunch sales forecasts, which are out-of-sample predictions. In the context of new product forecasting before introduction, the information we have is limited to the relevant background characteristics of a new album. Knowing only the general attributes of a new album, the meta-analytic approach proposed here provides an informed prior on the dynamics of duration, the effects of marketing variables, and the unknown market potential. As new data become available, weekly sales forecasts and market size (number of eventual adopters) are revised and updated. We illustrate our approach using weekly sales data of albums that appeared in Billboard's Top 200 albums chart from January 1994 to December 1995. (Forecasting; Empirical Generalization; Hierarchical Bayes Model)
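The revise-and-update mechanism in this abstract can be miniaturized. The sketch below is our illustration, far simpler than the paper's hierarchical model: the diffusion-curve parameters, prior, noise level, and sales figures are all invented assumptions. It grid-approximates the posterior over the unknown market potential of a logistic adoption path after eight weeks of sales:

```python
import numpy as np

weeks = np.arange(1, 9)
sales = np.array([5, 9, 15, 22, 27, 30, 31, 32]) * 1000.0  # cumulative units

m_grid = np.linspace(20_000, 80_000, 601)                # candidate potentials
prior = np.exp(-0.5 * ((m_grid - 50_000) / 10_000) ** 2) # informed prior

def curve(m, t, p=0.6, t0=4.0):
    """Logistic diffusion path toward market potential m (assumed shape)."""
    return m / (1 + np.exp(-p * (t - t0)))

# Gaussian likelihood of the observed path for each candidate potential
loglik = np.array([-0.5 * ((sales - curve(m, weeks)) ** 2).sum() / 2_000.0 ** 2
                   for m in m_grid])
post = prior * np.exp(loglik - loglik.max())
post /= post.sum()
print(f"posterior mean market potential ≈ {(m_grid * post).sum():,.0f}")
# Re-running with more weeks of data shows the weekly revision the
# abstract describes: the posterior tightens around the true potential.
```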
How do people approach marketing in the face of uncertainty, when the product, the market, and the traditional details involved in market research are unknowable ex ante? The authors use protocol analysis to evaluate how 27 expert entrepreneurs approach such a problem compared with 37 managers with little entrepreneurial expertise (all 64 participants are asked to think aloud as they make marketing decisions in exactly the same unpredictable situation). The hypotheses are drawn from literature in cognitive science on (1) expertise in general and (2) entrepreneurial expertise in particular. The results show significant differences in heuristics used by the two groups. While those without entrepreneurial expertise rely primarily on predictive techniques, expert entrepreneurs tend to invert these. In particular, they use an effectual or nonpredictive logic to tackle uncertain market elements and to coconstruct novel markets with committed stakeholders.
Decision making requires managers to constantly estimate the probability of uncertain outcomes and update those estimates in light of new information. This article provides guidance to managers on how they can improve that process by more explicitly adopting a Bayesian approach. Clear understanding and application of the Bayesian approach leads to more accurate probability estimates, resulting in better informed decisions. More importantly, adopting a Bayesian approach, even informally, promises to improve the quality of managerial thinking, analysis, and decisions in a variety of additional ways.
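A worked micro-example of the explicit updating the article advocates (the scenario and all numbers are hypothetical, not the article's):

```python
# A manager updates the probability of winning a bid after observing a
# buying signal. Base rate and conditional probabilities are assumptions.
p_win = 0.30                 # prior: manager's base rate for winning a bid
p_signal_if_win = 0.80       # P(customer requests a site visit | win)
p_signal_if_lose = 0.20      # P(site visit requested | lose)

p_signal = p_signal_if_win * p_win + p_signal_if_lose * (1 - p_win)
p_win_given_signal = p_signal_if_win * p_win / p_signal
print(f"P(win | site visit requested) = {p_win_given_signal:.2f}")  # 0.63
```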
Among the most critical challenges crisis leaders face is evaluating “the situation”—what is happening and what to do about it. Extensive scholarship on Situational Awareness (SA) has identified a gap: a disciplined process for achieving accurate SA. Further, SA only addresses the first half of that situation equation; awareness is necessary, yet not sufficient, unless linked to and integrated with meaningful decisions and actions. The POP-DOC Loop is a six-step SA tool that combines analysis and action into a continuous process. The analytic side is Perceive, Orient, Predict. The Action side is Decide, Operationalize, Communicate. POP-DOC builds upon Boyd's Observe, Orient, Decide, Act, or OODA Loop. OODA evolves from and focuses upon military command-and-control contexts, though it is applied in other settings as well. The advance design of POP-DOC incorporates a wider range of human factors, including neuro- and decision science research, in order to equip leaders to build SA in high-stress, high-stakes, evolving, and unpredictable situations.
In the light of the current coronavirus crisis, business-to-business firms face a variety of challenges in a complex and fast-changing environment. In order to provide structured analysis and to guide strategic decision-making, we present a novel, five-step approach for analyzing the impact of a crisis on a firm's business model. We applied the approach with eight business-to-business firms and find support for its usefulness. The evidence suggests very different impacts of the coronavirus crisis on business-to-business firms, and that understanding these differences is important for strategizing during the crisis but also to navigating successfully into the future. We also describe six different types of crisis impacts on business models. We conclude by developing managerial implications and questions for future research.
The COVID-19 pandemic has been a sobering reminder of the extensive damage brought about by epidemics, phenomena that play a vivid role in our collective memory, and that have long been identified as significant sources of risk for humanity. The use of increasingly sophisticated mathematical and computational models for the spreading and the implications of epidemics should, in principle, provide policy- and decision-makers with a greater situational awareness regarding their potential risk. Yet most of those models ignore the tail risk of contagious diseases, use point forecasts, and the reliability of their parameters is rarely questioned and incorporated in the projections. We argue that a natural and empirically correct framework for assessing (and managing) the real risk of pandemics is provided by extreme value theory (EVT), an approach that has historically been developed to treat phenomena in which extremes (maxima or minima) and not averages play the role of the protagonist, being the fundamental source of risk. By analysing data for pandemic outbreaks spanning over the past 2500 years, we show that the related distribution of fatalities is strongly fat-tailed, suggesting a tail risk that is unfortunately largely ignored in common epidemiological models. We use a dual distribution method, combined with EVT, to extract information from the data that is not immediately available to inspection. To check the robustness of our conclusions, we stress our data to account for the imprecision in historical reporting. We argue that our findings have significant implications, including on the extent to which compartmental epidemiological models and similar approaches can be relied upon for making policy decisions.
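As a hedged sketch of the peaks-over-threshold EVT machinery the abstract invokes, run on simulated data rather than the historical fatality records the authors analyze, the snippet below fits a generalized Pareto distribution to tail exceedances; a shape parameter well above zero signals the fat tail that point forecasts and averages miss:

```python
import numpy as np
from scipy import stats

# Simulated fat-tailed sample (NOT the authors' 2500-year pandemic data).
rng = np.random.default_rng(42)
data = stats.pareto.rvs(b=0.6, size=2_000, random_state=rng)

threshold = np.quantile(data, 0.95)
exceedances = data[data > threshold] - threshold

# Peaks-over-threshold: fit a generalized Pareto to the exceedances.
# xi > 0 indicates a fat tail; xi > 1 implies the mean itself is unstable.
xi, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)
print(f"estimated tail index xi = {xi:.2f}")
```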
Rare events are common: even though any particular type of 'rare event' - a world war, global economic collapse, or pandemic, for that matter - should only occur once every 100 years, there are enough such types of 'rare events' that, overall, they occur about once every 10 years. As we are currently experiencing with the COVID-19 pandemic, we do not sufficiently leverage the rich toolset that risk management offers to prepare for and mitigate the resulting uncertainty. This article highlights four aspects of risk management and their practical and theoretical implications: 1) risk (in the narrower sense), where possible future outcomes can be captured through probability distributions; 2) uncertainty, where there is transparency regarding what is not known, but the probability distributions, as well as the causal relationships influencing the outcome in question, are unknown; 3) ignorance, where there is no understanding that certain possible future developments are even relevant; and finally, 4) the emergence of organizational and inter-organizational myopia as an effect of risk, uncertainty, and ignorance on collective human behaviour.
The current Coronavirus crisis is having disastrous effects for most B2B firms around the world. The decline in sales provokes intra-organizational and inter-organizational tension, requiring a new approach for managing firms' business operations. Particularly, the direct threat to human beings places the attention of managers on the individual. This study investigates the main differences between prior “traditional” financial-based crises and the practices that managers can adopt to navigate and survive the Coronavirus crisis from a social exchange theory (SET) view. The authors identify eight crisis-comparative dimensions to consider to successfully prevail: (1) formation, (2) focus, (3) temporality, (4) government jurisdiction, (5) preparedness, (6) normality, (7) business, and (8) operational deployment. In addition, the study results propose four intertwined areas to classify the managerial practices: (1) digital transformation, (2) decision-making processes, (3) leadership, and (4) emotions and stress.
Sales is a profession that faces an inordinate amount of failure. When salespeople fail and face rejection from customers, the consequences are widespread and lasting. Perhaps, rather than aiming to prevent inevitable failures, salespeople should instead anticipate and control the timing of when failure occurs in the sales cycle. Across three progressive studies, this research explicates the phenomenon of failing fast within a business-to-business sales context. The authors theoretically conceptualize and operationalize a failing fast process model: prospect intent collection, prospect intent interpretation, and salesperson failing fast. The authors then study the focal relationship between salesperson failing fast and sales performance, contingent on individual-level, organizational-level, and environmental-level moderators. While the direct effect is non-significant, the finding is novel in that it shows that certain forms of failure may not actually be a detriment to performance. The moderator analysis sheds further light on this relationship, revealing a mixture of accentuating and attenuating effects. This research collectively brings greater nuance to the study of sales failure and enables future scholars to understand the consequences, and even potential benefits, of failing early in the sales process.
The literature reflects remarkably little effort to develop a framework for understanding the implementation of the marketing concept. The authors synthesize extant knowledge on the subject and provide a foundation for future research by clarifying the construct's domain, developing research propositions, and constructing an integrating framework that includes antecedents and consequences of a market orientation. They draw on the occasional writings on the subject over the last 35 years in the marketing literature, work in related disciplines, and 62 field interviews with managers in diverse functions and organizations. Managerial implications of this research are discussed.
Marketing academicians and practitioners have been observing for more than three decades that business performance is affected by market orientation, yet to date there has been no valid measure of a market orientation and hence no systematic analysis of its effect on a business's performance. The authors report the development of a valid measure of market orientation and analyze its effect on a business's profitability. Using a sample of 140 business units consisting of commodity products businesses and noncommodity businesses, they find a substantial positive effect of a market orientation on the profitability of both types of businesses.
Considerable progress has been made in identifying market-driven businesses, understanding what they do, and measuring the bottom-line consequences of their orientation to their markets. The next challenge is to understand how this organizational orientation can be achieved and sustained. The emerging capabilities approach to strategic management, when coupled with total quality management, offers a rich array of ways to design change programs that will enhance a market orientation. The most distinctive features of market-driven organizations are their mastery of the market sensing and customer linking capabilities. A comprehensive change program aimed at enhancing these capabilities includes: (1) the diagnosis of current capabilities, (2) anticipation of future needs for capabilities, (3) bottom-up redesign of underlying processes, (4) top-down direction and commitment, (5) creative use of information technology, and (6) continuous monitoring of progress.
Considerable similarities can be found between some of the problems that arise in the practice of marketing and those encountered in the study of history. The author describes the four stages of the historical investigation process and explains the techniques historians use in working with various types of evidence. He gives examples of how these techniques can be applied in marketing practice, arguing that they can be a valuable supplement to the application of scientific method.
Both this article and the preceding one by Harry V. Roberts on "Bayesian Statistics in Marketing" (pp. 1–4) show that Bayesian statistics is a new and potentially powerful tool for systematically working with management judgments. The present article by Paul E. Green shows how this approach can be used in the area of pricing analysis.
Although the role of market knowledge competence in enhancing new product advantage is assumed widely in the literature, empirical studies are lacking because of an absence of the concept definition. In this study, the authors conceptualize market knowledge competence as the processes that generate and integrate market knowledge. The authors test the conceptual model using data collected from the software industry. The findings show that each of the three processes of market knowledge competence exerts a positive influence on new product advantage. The results also reveal a positive association between new product advantage and product market performance. The findings regarding the antecedents indicate that the perceived importance of market knowledge by top management has the largest impact on the processes of market knowledge competence.
Bayesian statistics points the way to an articulation of business judgment and statistical research in approaching marketing problems. We have always known that neither executive nor statistician is sufficient in himself, but now for the first time we have formal tools that show that the two can be brought closer together. This article shows how this may be accomplished. Also, see the succeeding article, "Bayesian Decision Theory in Pricing Strategy," by Paul E. Green, pp. 5–14.
While (managerial) beliefs are central to many aspects of strategic organization, interactive beliefs have been rather neglected. In an increasingly connected and networked economy, firms confront coordination problems that arise because of network effects. The capability to manage beliefs will increasingly be a strategic one, a key source of wealth creation, and a key research area for strategic organization scholars.
The authors provide a critical examination of marketing analytics methods by tracing their historical development, examining their applications to structured and unstructured data generated within or external to a firm, and reviewing their potential to support marketing decisions. The authors identify directions for new analytical research methods, addressing (1) analytics for optimizing marketing-mix spending in a data-rich environment, (2) analytics for personalization, and (3) analytics in the context of customers' privacy and data security. They review the implications for organizations that intend to implement big data analytics. Finally, turning to the future, the authors identify trends that will shape marketing analytics as a discipline as well as marketing analytics education.
Firms in high-technology industries frequently face the dangers and opportunities associated with strategic inflection points in their development trajectory. Strategic inflection points (SIPs) are caused by changes in fundamental industry dynamics, winning strategies, and dominant technologies. SIPs generate strategic dissonance in the organization because they are associated with divergences between the basis of competition and the firm's distinctive competence, and between top management's strategic intent and strategic action. Top management can take advantage of the information generated by strategic dissonance to develop new strategic intent and lead the organization through the turbulence and uncertainty associated with SIPs. This requires a capacity for strategic recognition on the part of top and senior management. Strategic recognition in turn is facilitated by an internal selection environment that allocates resources based on competitive reality and values dissent and debate. Strategic recognition is the foundation for exerting strategic leadership: encouraging debate and bringing debate to a conclusion that realigns the basis of competition and distinctive competence, and strategy and action.
The main objective of this article is to investigate the empirical performances of the Bayesian approach in analyzing structural equation models with small sample sizes. The traditional maximum likelihood (ML) is also included for comparison. In the context of a confirmatory factor analysis model and a structural equation model, simulation studies are conducted with the different magnitudes of parameters and sample sizes n = da, where d = 2, 3, 4 and 5, and a is the number of unknown parameters. The performances are evaluated in terms of the goodness-of-fit statistics, and various measures on the accuracy of the estimates. The conclusion is: for data that are normally distributed, the Bayesian approach can be used with small sample sizes, whilst ML cannot.
This paper addresses the concept of serendipity in entrepreneurship, defined as search leading to unintended discovery. It conceptually delineates serendipity, showing how it is related to the entrepreneurship literature on prior knowledge and systematic search. The paper also discusses how serendipitous entrepreneurship relates to some aspects of evolutionary theory, socio-economic institutions, and social psychology. It is suggested that serendipity may be a quite prevalent feature of entrepreneurship and thus has implications for both research and practice.
This paper investigates the possibilities of a technology of Intelligence, one field of application being to support Industrial Marketing Management. Intelligence Services are not new: in fact they are very old and traditional ways of marshalling information. In recent decades their scope has widened, their concern with command and decision-making has intensified, and speed of operation has accelerated. The great value of Intelligence Systems to management lies in their information-synthesising capabilities and in their concern with the whole organisation and with operating, tactical and strategic decisions. Comparative sketches are given of Intelligence at work in different fields: Government, Economic/Financial, Company, Technical and Person. Espionage is distinguished clearly from Intelligence and briefly discussed as a necessary field of concern. It is found also that Market Research has been evolving towards a more global concern with marketing and the total organisation, while the same shift can be seen in competing management technologies including Management Accounting, Operational Research, Computer Science and others. There is an evolutionary pattern in the quantifying processes used by Marketing, and cause-and-effect relationships between company, product and environmental factors, that help us to relate marketing more closely to different sizes of organisation, different market structures and so on. A possible analogy between human and systematised Intelligence processes is briefly considered, mainly as an aid to describing what a technology of Intelligence would be about and to suggest the main domains of study. The conclusion is that deep-seated trends are lifting us away from fragmented management technologies with a sub-optimising role to a much greater concern with the whole range of decision making. It seems necessary to regain a unity of application for quantifying and information-handling processes and to move towards a single supporting technology matched to the individual needs of the organisation concerned. In effect we need advanced Intelligence Systems. The marketing function, and Industrial Marketing in particular, happens to be well suited to such a development and should be fertile ground. There is a serious prospect that studies and courses in Marketing Intelligence on a large scale could be commenced in Europe. It is largely a matter of timing. A main object of the writer is to encourage response in order to discover how much support might be found for such a major course of study serving both the professional Intelligence and Marketing Management aspects.
The purpose of this article is to discuss two approaches to being market oriented: a market-driven approach and a driving-markets approach. Market driven refers to a business orientation that is based on understanding and reacting to the preferences and behaviors of players within a given market structure. Driving markets, on the other hand, implies influencing the structure of the market and/or the behavior(s) of market players in a direction that enhances the competitive position of the business. There are three generic ways of changing the structure of a market: (1) eliminating players in a market (deconstruction approach), (2) building a new or modified set of players in a market (construction approach), and (3) changing the functions performed by players (functional modification approach). Market behavior can be modified directly or, alternatively, indirectly by changing the mind-set of market players (e.g., customers, competitors, and other stakeholders).
Purpose – The purpose of this commentary is to summarise developments in the science of serendipity and urge marketers to pay more attention to the incorrigible incalculability of commercial life.
Design/methodology/approach – Explains how luck is a crucial component of business success and argues, citing examples of Shelby D. Hunt and Ted Levitt, among others, that it is perhaps time to abandon our fixation with customer focus and start taking serendipity seriously.
Findings – Fortune, clearly, favours the brand. Indeed, the history of management in general and marketing in particular reveals that serendipity plays a significant part in the commercial equation.
Originality/value – Highlights the latter-day advances in the science of serendipity.
Current research offers alternative explanations to the ‘linkage’ between the pattern of diversification and performance. At least four streams of research can be identified. None of these can be considered to be a reliable, predictive theory of successful diversification. They are, at best, partial explanations. The purpose of this paper is to propose an additional ‘linkage’, conceptual at this stage, that might help our understanding of the crucial connection between diversity and performance. The conceptual argument is intended as a ‘supplement’ to the current lines of research, rather than as an alternative explanation.
Did German flying bombs really target the poorer areas of London—or did the poor people underneath them just think that they did? How many millions of dollars will the current re-make of King Kong really gross? Oh, and how many taxis are there in London? The human mind can make very good guesses, it seems—and it uses Bayesian analysis to do so. Tom Griffiths and Joshua Tenenbaum explain.
Much of the current thinking about competitive strategy focuses on ways that firms can create imperfectly competitive product markets in order to obtain greater than normal economic performance. However, the economic performance of firms does not depend simply on whether or not its strategies create such markets, but also on the cost of implementing those strategies. Clearly, if the cost of strategy implementation is greater than returns obtained from creating an imperfectly competitive product market, then firms will not obtain above normal economic performance from their strategizing efforts. To help analyze the cost of implementing strategies, we introduce the concept of a strategic factor market, i.e., a market where the resources necessary to implement a strategy are acquired. If strategic factor markets are perfect, then the cost of acquiring strategic resources will approximately equal the economic value of those resources once they are used to implement product market strategies. Even if such strategies create imperfectly competitive product markets, they will not generate above normal economic performance for a firm, for their full value would have been anticipated when the resources necessary for implementation were acquired. However, strategic factor markets will be imperfectly competitive when different firms have different expectations about the future value of a strategic resource. In these settings, firms may obtain above normal economic performance from acquiring strategic resources and implementing strategies. We show that other apparent strategic factor market imperfections, including when a firm already controls all the resources needed to implement a strategy, when a firm controls unique resources, when only a small number of firms attempt to implement a strategy, and when some firms have access to lower cost capital than others, and so on, are all special cases of differences in expectations held by firms about the future value of a strategic resource. Firms can attempt to develop better expectations about the future value of strategic resources by analyzing their competitive environments or by analyzing skills and capabilities they already control. Environmental analysis cannot be expected to improve the expectations of some firms better than others, and thus cannot be a source of more accurate expectations about the future value of a strategic resource. However, analyzing a firm's skills and capabilities can be a source of more accurate expectations. Thus, from the point of view of firms seeking greater than normal economic performance, our analysis suggests that strategic choices should flow mainly from the analysis of its unique skills and capabilities, rather than from the analysis of its competitive environment.
This article seeks to determine to which types of historical events path-dependence analysis applies. According to the author, these are historical sequences in which contingent events set in motion institutional patterns or chains of events with deterministic properties. Identifying path dependence involves both tracing an outcome back to a series of events and showing how those events are themselves contingent occurrences that cannot be explained by prior historical conditions. These historical sequences are generally of two types: self-reinforcing sequences and reactive sequences.