INTEGRATING ANALYTICS INTO MARKETING CURRICULA: CHALLENGES
AND EFFECTIVE PRACTICES FOR DEVELOPING SIX CRITICAL
COMPETENCIES
Danny Weathers and Oriana Aragón
College of Business, Clemson University, Clemson, SC, USA
As organizations become increasingly dependent on marketing analytics, universities are adapting
their curricula to equip students with skills necessary to operate in data-rich environments. We describe
six competencies that students need to become proficient with analytics: (1) assessing data quality, (2)
understanding measurement, (3) managing datasets, (4) analyzing data, (5) interpreting results, and (6)
communicating results. We discuss what these competencies entail, challenges students may face in
developing them, and effective practices for instructors to foster the competencies. We provide data
that support the value of teaching analytics with a focus on developing these competencies.
INTRODUCTION
The ease of collecting data through means such as con-
sumer surveys and panels, in-store scanner systems, and
online behavior tracking software, just to name a few,
allows marketing decision-makers to have more infor-
mation at their disposal than ever before. Consequently,
the field of marketing analytics, defined here as quanti-
tative data used to make marketing decisions and evalu-
ate marketing performance, is growing rapidly. To
illustrate, a recent CMO Survey (2015) found that com-
panies use analytics for numerous marketing functions
and were expected to devote 11.1% of their marketing
budget to analytics in 2018, up from 6.7% in 2015.
Because the use of analytics is positively related to profits and ROI (Ariker, Diaz, Moorman, & Westover, 2015), the widespread and growing use of marketing analytics, and employers’ desires to find employees with quantitative skills (Schlee & Harich, 2010), are not surprising.
In response to this trend, universities have been adapt-
ing their marketing curricula to train students to operate
in data-rich business environments. These curricular
changes have taken several forms. In some cases, univer-
sities have created new degree programs in “Marketing Analytics” or “Business Analytics.” In other cases, instructors have adapted existing courses by integrating course-relevant analytics content and exercises. Between these extremes, departments have created stand-alone courses dedicated to marketing analytics (Pilling, Rigdon, & Brightman, 2012; Saber & Foster, 2011).
Growth in analytics courses has prompted researchers
to provide guidance to instructors who develop and
teach such courses (see, for example, Houghton, Schertzer, & Beck, 2018; LeClair, 2018; Liu & Burns, 2018; Liu & Levin, 2018; Pilling et al., 2012; Pirog, 2010; Saber & Foster, 2011; Wilson, McCabe, & Smith,
2018). Recommendations tend to focus on course con-
tent, course structure, teaching interventions, and inte-
grating analytics courses into the overall curriculum.
They encourage instructors to introduce metrics and
analyses related to specific areas of marketing, such as
brand management and target market assessment, use
evidence-based pedagogical tools, provide active-
learning opportunities, align course content with
intended learning outcomes, provide students with rele-
vant feedback, and push students to stretch in their
learning. They also tout the merits of integrating market-
ing concepts with exercises that utilize specific software,
such as R and Excel, to illustrate concept-relevant
metrics, and they demonstrate the effectiveness of spe-
cific analytics-related exercises.
The current research complements existing research by
examining in detail the skills students need to become
proficient with analytics. Regardless of whether instruc-
tors expose students to analytics through stand-alone
courses or on a more limited basis through integrating
analytics exercises into existing courses, instructors
should work to develop specific competencies to enable
students to master the topic. Consequently, our goal is
threefold. First, we describe six critical competencies for
becoming proficient with analytics. Second, we discuss
challenges students are likely to face in developing the
competencies. Third, we present effective practices for
instructors attempting to develop these competencies in
students. While we are not the first to identify the six
competencies, we provide guidance to instructors by
synthesizing the competencies, as well as the associated
challenges and effective practices, in a way that makes
them directly relevant to courses that contain marketing
analytics content. Given the call to better integrate ana-
lytics courses into the marketing curriculum (e.g., LeClair,
2018; Mintu-Wimsatt & Lozada, 2018), the competencies
provide a framework for developing a cohesive analytics
concentration, as we describe in the Conclusions section.
We present data that support the value of teaching analy-
tics courses with an eye toward developing these
competencies.
CRITICAL COMPETENCIES, CHALLENGES,
AND EFFECTIVE PRACTICES
General Information about the Competencies
Our experience in developing courses in marketing analytics leads us to advocate for six competencies that are critical to proficiency with analytics: (1) assessing data quality, (2) understanding measurement, (3) managing datasets, (4) analyzing data, (5) interpreting results, and (6) communicating results. To develop the competencies, instructors can use multiple established learning theories, such as proximal learning theory (Vygotsky, 1978), theory of intelligence (Dweck, 1986), and self-efficacy theory (Bandura, 1977). For example, Wilson et al. (2018) provide a useful discussion of creating an innovative marketing analytics curriculum based on experiential learning theory (e.g., Kolb, 1984). Consequently, rather than
defining our efforts through any one theoretical perspec-
tive, we take a more holistic approach by calling upon
various theories to identify challenges and support our
effective practice recommendations.
General Preparation for and Challenges to
Developing the Competencies
Research reveals that the quantitative skills of marketing students are often lacking (e.g., Aggarwal, Vaidyanathan, & Rochford, 2007; Davis, Misra, & Van Auken, 2002), which is likely to lead students to be anxious about analytics courses. Research also demonstrates that anxiety inhibits learning (e.g., Chew & Dillon, 2014; Fredrickson, 2004; Hernandez, Schultz, Estrada, Woodcock, & Chance, 2013; Spielberger, 2013), particularly in quantitative courses (Pilling & Nasser, 2015). To confirm and extend these findings, we surveyed students in our undergraduate Marketing Metrics and Analytics courses.¹ One set of survey items captured students’ perceptions of the difficulty of “statistics,” “data,” and “math.” Another set of items captured students’ perceptions of their experience with “statistics,” “data,” and “math.” These sets of items were adapted from the Survey of Attitudes toward Statistics scale (Schau, Stevens, Dauphinee, & Del Vecchio, 1995). Another set of items, designed to align with our six competencies, assessed students’ comfort with or knowledge of specific topics to be covered in the course (e.g., creating pivot tables, computing z-scores, interpreting graphs). Each item was measured on a seven-point scale where higher numbers indicate greater difficulty, experience, knowledge, or comfort. Means for each measure are provided in Table 1 under the “Beginning of Semester” column. We also obtained each student’s overall college GPA and university math placement test score. We highlight general findings and conclusions here. Interested readers can contact the authors for additional details.
Aligning with previous research (Davis et al., 2002),
students entered the course feeling that they had sig-
nificantly less experience with statistics and data than
with math, and experience with each topic was nega-
tively related to perceptions of the topic’s difficulty.
For the specific topics covered in the course, the aver-
age level of comfort or perceived knowledge fell below
the scale midpoint (4) for 22 of the 29 topics. Students
also indicated significantly less comfort with analytics
(a combination of statistics and data) than with math.
¹ Course objectives included: (a) Learning how metrics are derived from organizational strategies and missions. (b) Learning measurement-relevant concepts, such as construct definition, reliability, and validity. (c) Deriving and computing metrics relevant to a variety of marketing decisions, including, but not limited to, decisions related to pricing, online and social media strategy, advertising, product development, customer targeting, branding, and distribution. (d) Becoming familiar with data sources that provide information needed to compute marketing metrics. (e) Learning when and how marketing metrics should and should not be used. (f) Learning how to clearly and effectively communicate marketing metrics to others both within and outside of the organization. (g) Learning how to use Excel to summarize and present data.
Table 1
Students’ Perceptions of Difficulty, Experience, Knowledge, and Comfort at the Beginning and End of the Semester

Measure | Beginning of Semester (a) | End of Semester (a) | p-value, beginning vs. end (b)
Perceived Difficulty
  Statistics | 4.66 (0.75) | 4.58 (0.91) | .48
  Data | 4.64 (0.70) | 4.57 (0.75) | .54
  Math | 4.61 (0.70) | 4.51 (0.88) | .30
  Significant differences (c) | None | None |
Perceived Experience
  Statistics | 4.37 (0.85) | 4.74 (0.94) | < .01
  Data | 4.35 (1.08) | 4.79 (0.86) | < .01
  Math | 6.33 (0.57) | 6.35 (0.56) | .97
  Significant differences (c) | Math > Statistics, Data | Math > Statistics, Data |
Competency 1: Assessing Data Quality
  Determining data appropriateness for answering research question (d) | 3.38 (1.29) | 4.87 (1.05) | < .01
  Assessing the quality of data (e) | 3.78 (1.40) | 5.40 (1.03) | < .01
Competency 2: Understanding Measurement
  Data units of measurement (d) | 4.06 (1.31) | 4.87 (1.20) | < .01
  Measurement reliability (d) | 2.86 (1.34) | 4.51 (1.09) | < .01
  Measurement validity (d) | 2.65 (1.40) | 4.54 (1.22) | < .01
Competency 3: Managing Datasets
  Data management (d) | 2.59 (1.22) | 4.63 (1.20) | < .01
  Statistical distributions of data (d) | 3.36 (1.24) | 4.54 (1.35) | < .01
  Creating survey items (d) | 3.53 (1.32) | 5.25 (1.07) | < .01
  Weighting scores within an index (d) | 2.44 (1.39) | 4.60 (1.28) | < .01
  Creating a data file (or dataset) (e) | 3.76 (1.61) | 5.31 (1.16) | < .01
  Working with Excel (e) | 4.33 (1.64) | 5.72 (1.00) | < .01
  Organizing data in Excel (e) | 4.40 (1.65) | 5.75 (1.05) | < .01
  Cleaning data in Excel (e) | 2.94 (1.38) | 5.94 (0.98) | < .01
  Creating index variables in Excel (e) | 2.62 (1.29) | 4.86 (1.24) | < .01
  Weighting scored items in Excel (e) | 3.12 (1.65) | 5.10 (1.32) | < .01
  Creating z-scores in Excel (e) | 3.00 (1.43) | 5.36 (1.23) | < .01
Competency 4: Analyzing Data
  Choosing right analysis techniques (d) | 2.74 (1.34) | 4.63 (1.23) | < .01
  Linear regression (d) | 3.64 (1.31) | 4.52 (1.19) | < .01
  Pathway modeling (d) | 1.94 (1.26) | 3.91 (1.43) | < .01
  Creating Pivot Tables in Excel (e) | 3.43 (1.86) | 5.85 (1.03) | < .01
Competency 5: Interpreting Results
  Interpreting data (d) | 4.04 (1.33) | 5.11 (1.13) | < .01
  Interpreting p-values (d) | 3.46 (1.47) | 4.54 (1.30) | < .01
  Interpreting regression coefficients (d) | 2.67 (1.43) | 4.47 (1.26) | < .01
  Interpreting effect sizes (d) | 2.41 (1.25) | 4.43 (1.30) | < .01
  How data are used by business (d) | 3.40 (1.26) | 5.10 (1.03) | < .01
  Interpreting graphs (e) | 4.99 (1.27) | 5.84 (1.01) | < .01
  Statistical error and variance (e) | 3.74 (1.25) | 4.49 (1.32) | < .01
Competency 6: Communicating Results
  Creating graphs in Excel (e) | 4.76 (1.60) | 6.00 (0.95) | < .01
  Choosing the appropriate graph to illustrate your data (e) | 4.61 (1.38) | 5.73 (1.03) | < .01
  Effectively writing about technical information (e) | 3.94 (1.59) | 5.12 (1.16) | < .01
  Effectively presenting technical information orally (e) | 3.70 (1.59) | 5.10 (1.27) | < .01

Notes: (a) Means with standard deviations in parentheses. (b) Based on dependent-samples t-tests. (c) p < .05 based on Bonferroni adjustments for multiple comparisons. (d) Knowledge assessed on a seven-point scale: 1 = No knowledge, 7 = Complete knowledge. (e) Comfort assessed on a seven-point scale: 1 = Extremely uncomfortable, 7 = Extremely comfortable.
Only 15% of students said they were moderately or
extremely comfortable with analytics, while 85% of
students indicated something less than moderate
levels of comfort. Neither GPA nor math placement
was significantly related to comfort with analytics.
Thus, students with higher levels of academic success,
even math-specific success, were no more comfortable
with their ability to handle analytics-related concepts
than were students with less academic success.
Noting the challenge of heterogeneous student preparation, Pilling et al. (2012, p. 188) state, “Although the large majority of students arguably possess the prior knowledge to succeed in the [analytics] course, the level of functional prior knowledge brought to the course is inconsistent.” We also found this to be true; our students’ GPAs ranged from 2.7 to 4.0, and their math placement scores ranged from 40 to 99. LaBarbera and Simonoff (1999) find that marketing majors often consider quantitative coursework as unimportant. Supporting these findings, among our students, only 47% reported any level of interest in analytics. Thus, instructors must first get students to buy in to the importance of analytics, perhaps by following the EPIC model of exposure, persuasion, identification, and commitment (Aragón, Dovidio, & Graham, 2016; Cavanagh et al., 2016). Overall, our findings support the conclusions of previous research regarding the challenges inherent to teaching analytics to marketing students.
Competency 1: Assessing Data Quality
What This Competency Entails
Data can be created either within the organization
using the data (i.e., internally) or by another organization
(i.e., externally), and data can be created either for the
specific problem at hand (i.e., primary) or for other
research/decision-making purposes (i.e., secondary).
Regardless of who creates the data, or for what purpose,
analytics students should be able to assess the data’s quality to understand its capabilities and limitations. Such an
assessment requires an understanding of the data collec-
tion process and the role of this process in obtaining
actionable information.
Data quality has four dimensions: intrinsic, contextual, representational, and accessibility (e.g., Ballou & Pazer, 1985; Delone & McLean, 1992; Goodhue, 1995; Jarke & Vassiliou, 1997; Lee, Strong, Kahn, & Wang, 2002; Wand & Wang, 1996; Wang & Strong, 1996; Zmud, 1978). The
intrinsic dimension includes issues such as whether the
data are accurate and free from bias. The contextual
dimension relates to issues such as whether the data are
sufficiently current and relevant to the problem at hand.
The representational dimension refers to issues such as the
data’s format and readability, and the accessibility dimen-
sion refers to whether the user has ready access to the data.
We focus on intrinsic and contextual data quality here,
and we consider representational and accessibility data
quality as components of subsequent competencies.
Why Developing This Competency is a Challenge
The issue of quality may be both ambiguous (see, for example, Pirsig, 1974) and, because it has multiple dimensions, complex; students with a low tolerance for ambiguity resist assimilating new information that is ambiguous or complex (DeRoma, Martin, & Kessler,
2003). Fully evaluating data quality can require substan-
tial effort. Consider intrinsic data quality. Assessing
accuracy requires students to thoroughly understand
the data collection process and its potential shortcom-
ings. For example, if data collection involves sampling,
students must be able to recognize problems that could
arise due to sampling. The ease of sampling consumers
and other units, such as social media posts and online
product reviews, is driving the marketing analytics
boom. However, analytics that use data collected from
a sample may be misused if one fails to appreciate sam-
pling’s inherent limitations (i.e., sampling error) or
potential shortcomings due to a flawed process (e.g.,
selecting a nonrepresentative sample). Further, much
marketing analytics data are collected through auto-
mated processes (e.g., Google Analytics tracking web
site visitors). The technical nature of these processes
makes understanding the processes and their potential
limitations difficult.
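To make sampling error concrete for students, a short simulation can show how estimates from even well-drawn random samples vary, and how a biased sampling frame shifts them regardless of sample size. The sketch below is our illustration rather than an exercise from the article; the population parameters and sample sizes are hypothetical.

```python
# A minimal sketch (hypothetical numbers) of sampling error and frame bias
# using simulated consumer satisfaction scores on a 1-7 scale.
import numpy as np

rng = np.random.default_rng(42)

# "True" population: 100,000 consumers rating satisfaction on a 1-7 scale.
population = np.clip(rng.normal(loc=4.8, scale=1.2, size=100_000), 1, 7)
print(f"Population mean: {population.mean():.2f}")

# Sampling error: unbiased random samples of n = 200 still vary from sample to sample.
sample_means = [rng.choice(population, size=200, replace=False).mean() for _ in range(1_000)]
print(f"Range of 1,000 sample means: {min(sample_means):.2f} to {max(sample_means):.2f}")

# Frame bias: sampling only highly satisfied consumers (e.g., loyalty-club members)
# overstates satisfaction no matter how large the sample.
biased_frame = population[population >= 4.5]
print(f"Mean from a biased frame: {rng.choice(biased_frame, size=200).mean():.2f}")
```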
Students should also consider the extent to which data are free of bias, yet sources of bias can be difficult to detect.
Although third-party sources may have little incentive to
provide biased or manipulated data, this is not true of all
data sources. For example, businesses may be motivated
to delete negative comments made on their social media
accounts, they can hire services to clean their online
reputations, and they may pay people to post positive
online reviews. However, these sources of bias are not
obvious, particularly when software is used to build ana-
lytics datasets by scraping the Internet for comments and
data related to specific topics. Consequently, data
obtained in this way may not accurately represent con-
sumer sentiment or behavior, compromising the data’s
trustworthiness.
In terms of contextual quality, the data should be rele-
vant to the problem at hand (i.e., the context). This means
the data should be current; however, a consequence of our data-rich world is that data can quickly become outdated. Further, data’s age is not always apparent, which is particularly problematic with automated data collection processes. Someone building a dataset by scraping the Internet for comments related to “Coca-Cola” may capture news articles or social media posts that have been on the Internet for many years. The resulting dataset may
contain outdated information that, even if accurate, is
not appropriate for the current problem.
Effective Practices
Most data used with marketing analytics are
numerical, and our experience suggests that students
tend to view numbers as precise and accurate.
However, instructors should emphasize that not all
numbers are created equally. Because there are multi-
ple points at which data collection can go wrong,
instructors must impress upon students the need to
evaluate data quality to avoid drawing unwarranted
conclusions. To do so, instructors should use activ-
ities that illustrate how data quality can be compro-
mised. The results of the 1948 and 2016
U.S. presidential elections nicely illustrate sampling’s
imperfections. Sampling led people to expect Dewey
to defeat Truman in 1948 and Clinton to defeat
Trump in 2016. However, to the surprise of many,
neither of these outcomes occurred (Edwards-Levy,
2016). A discussion of the polling process serves to
highlight how sampling can lead to wrong conclu-
sions. To illustrate potential biases, we discuss how
trade associations collect data on the industries they
represent. However, because trade associations exist
to promote the industries, they may be reluctant to
provide data that reflect poorly on the industry.
We recommend that instructors develop this com-
petency with activities using common sources of mar-
keting data. For example, marketers often utilize data
from the US Census Bureau to segment and identify
attractive markets. Despite the Census Bureau’s exper-
tise, census data may not accurately represent the
population. For example, the Census Bureau acknowl-
edges that not all demographic groups are equally
likely to participate in the Bureau’s data collection
efforts (Westra & Nwaoha-Brown, 2017). Not only do
exercises using data from the Census Bureau and other
government sources illustrate data quality concerns,
they serve to familiarize students with valuable sources
of marketing-relevant data. We also provide students
with details about automated data collection proto-
cols, such as those used by Google, and have students
identify opportunities for the protocols to go wrong.
Although students expect automated data collection
procedures to be highly accurate, they should be
made aware that this is not always true.
We encourage instructors to utilize tools designed for
evaluating information quality, such as the information
quality assessment (IQA) instrument (Lee et al., 2002).
This easy-to-use, multi-item perceptual measure allows
the various dimensions of data quality to be quantified.
Example items include “This information is objective”
and “The information is sufficiently timely.” Instructors
should present students with data from various sources
and have them evaluate the data along each quality
dimension. Even when students lack sufficient knowl-
edge to fully evaluate the data, this tool can foster discus-
sions about why data may score high or low on each of
the dimensions.
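A small scoring script can make this kind of assessment tangible once students have rated a dataset on the items. The sketch below is a hypothetical illustration in Python (pandas); the items and ratings are stand-ins in the spirit of the published instrument, not the instrument itself.

```python
# A minimal sketch of scoring a multi-item data-quality assessment in the spirit of
# the IQA instrument (Lee et al., 2002). Items and scores below are hypothetical.
import pandas as pd

# One student's 7-point ratings of a dataset on a few illustrative items.
ratings = pd.DataFrame({
    "item": ["This information is objective",
             "This information is believable",
             "The information is sufficiently timely",
             "The information is relevant to our problem"],
    "dimension": ["intrinsic", "intrinsic", "contextual", "contextual"],
    "score": [5, 6, 3, 4],
})

# Average item scores within each quality dimension.
dimension_scores = ratings.groupby("dimension")["score"].mean()
print(dimension_scores)
# A low contextual score prompts discussion of whether the data fit the problem at hand.
```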
Competency 2: Understanding Measurement
What This Competency Entails
Measurement is central to analytics, and becoming
competent in this area involves understanding valid-
ity, reliability, and levels of measurement. Validity
refers to whether data accurately reflect the concept
one intends to measure. Because validity addresses
whether the data are relevant to the problem at
hand, it aligns with the contextual data quality dimen-
sion (e.g., Wang & Strong, 1996; Zmud, 1978).
Reliability refers to whether the data would be
obtained again under similar conditions, and it falls
under the umbrella of intrinsic data quality (e.g.,
Delone & McLean, 1992; Goodhue, 1995). Level of
measurement refers to the nature of the information
the data represent. Data that lack reliability and/or
validity will be of low quality, ultimately leading to
questionable results. Being unable to identify the
appropriate level of measurement may lead to inap-
propriate analyses. Thus, failing to assess validity,
reliability, and measurement level can undermine the
value of analytics.
Why Developing This Competency is a Challenge
First, consider measurement validity. Analytics are
often used to assess abstract, intangible outcomes such
as customer loyalty, satisfaction, or engagement.
Quantifying such latent constructs, and even “objective” outcomes, requires clear, precise definitions; however, developing definitions with sufficient levels of precision to accurately quantify these constructs or outcomes can be a difficult task. Construct or outcome definitions are often unique to the current situation and, thus, must be assessed on a case-by-case basis. Consider, for example, a “click” on a web page, the
basis for much online marketing analytics. Though see-
mingly straightforward, defining a click is highly tech-
nical as illustrated by the Interactive Advertising Bureau
(2009) guidelines. Companies adhering to these guide-
lines take precautions to ensure that clicks are due to
unique, legitimate website visitors. Without knowledge
of these guidelines and precautions, one may misinter-
pret click-based measures due to the possibility of acci-
dental double-clicks, bot-initiated clicks, or deliberate
manual attempts to manipulate the click count.
Further, assessing validity is challenging because there
are multiple types of validity, including face, content,
predictive, concurrent, convergent, and discriminant,
some with subtle distinctions. Specific types of validity
may be relevant in some circumstances but not in
others. Validity is often a matter of degree, and, as
noted, students struggle with such ambiguity. Finally,
if critical information is missing, validity tests are not
possible.
Reliability is a prerequisite for validity. One way to
assess reliability is to compare the same data from
various sources. However, this is difficult due to the
effort and/or expense required to obtain much of the
data used for marketing analytics. Consider, for exam-
ple, Nielsen television ratings. The process of obtain-
ing large-scale measures of television viewership makes
it difficult to verify the reliability of Nielsen’s mea-
sures. In measuring television viewership since 1950,
Nielsen has established elaborate systems involving
diaries and set-top meters and acquired the knowledge
to provide (presumably) accurate television ratings.
Companies with less experience and resources are unli-
kely to have Nielsen’s expertise. Even Nielsen’s mea-
surement system, refined throughout the last half of
the 20th century, may not be reliable in today’s highly
fragmented media environment. Such concerns are
difficult to assess.
Finally, effectively using analytics requires a thorough
understanding of level of measurement. Being able to
classify data as nominal, ordinal, interval, or ratio is
another complex, ambiguous issue that is difficult for
students to grapple with. For example, long debated is
the issue of whether data obtained from scaled response
questions (e.g., Likert) have ordinal or interval proper-
ties. Some disciplines (e.g., sociology) generally consider
such measures as ordinal, while others (e.g., psychology)
consider them as interval. Given that even experts often
disagree (Velleman & Wilkinson, 1993), we should not
be surprised when students struggle with the distinc-
tions between these categories.
Effective Practices
When teaching reliability and validity, abundant real-
world examples of data that either possess or lack high
levels of validity and/or reliability help students tie new
knowledge to existing structures. In teaching validity,
we find it useful to provide examples of data that lack
each type of validity and have students identify poten-
tial problems with using the data to draw specific conclusions. To illustrate, for content validity (i.e., whether a measure fully captures the construct’s domain), we ask students to imagine that they work for a company that owns a chain of restaurants. In a meeting, a coworker states: “Overall, our customers are satisfied. They rated our food quality an average of 5.9 on a 7-point scale.” We then ask students to identify why this statement could be wrong or misleading. Discussion leads students to recognize that food quality is only one factor that might contribute to “overall” satisfaction. Price, atmosphere, and service may also play roles. Thus, the claim
that food quality satisfaction assesses overall satisfaction
lacks content validity. We take similar approaches to
illustrate face, predictive, concurrent, convergent, and
discriminant validity.
Because validity requires a clear, precise definition of
the underlying construct, we have students evaluate the
concept being measured using an existing construct
definition paradigm (e.g., Churchill, 1979; Rossiter,
2005). Gilliam and Voss (2013) present a six-step process
for developing marketing construct definitions: (1) write
a preliminary definition, (2) consult the literature to
build a nomological network, (3) assess value added, (4)
refine the definition, (5) have experts judge the defini-
tion, and (6) revise the definition and iterate. Having
students perform these steps for common marketing
analytics constructs will help them better evaluate
whether available data are valid measures of these con-
structs. For example, marketers desire to foster customer
brand loyalty. A useful exercise involves having students
create a preliminary definition of this abstract construct.
We have found that students often develop definitions
that do not differ from similar constructs. For example,
they may define loyalty as “a customer who is happy
with the brand.” Through employing the process advo-
cated by Gilliam and Voss (2013), students realize that
this definition may reflect satisfaction but not loyalty.
Through iteration, students eventually arrive at
a definition that better reflects loyalty, such as “the
extent to which a customer is devoted to a brand.”
To explain reliability, we begin with true-score theory:
observed score = true score + error. Students find this
equation to be simple and intuitive. We then highlight
the error component as it relates to reliability by provid-
ing examples of random and systematic error. Finally, we
provide examples of measures that lack inter-rater, test-
retest, or parallel forms reliability. For example, we ask
students to imagine a situation in which a retailer mea-
sures the number of units sold in a month both by having
someone do a manual inventory check and another per-
son accessing scanner data records. We indicate that these
processes lead to different results, and we ask students
why this may have occurred. While students usually iden-
tify potential problems with theft or breakage, after dis-
cussion, students come to recognize that the
inconsistency could also be due to one or both measure-
ment processes (e.g., one’s ability to access or count
inventory in the stockroom or problems with the scanner
technology). The lack of parallel-forms reliability leads
students to question data obtained from a single source.
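The true-score equation can also be demonstrated numerically. The following sketch, assuming hypothetical true scores and error levels of our choosing, shows how larger random error drives down a test-retest correlation.

```python
# A minimal sketch (our illustration, not the authors') of true-score theory:
# observed score = true score + error. Larger random error lowers test-retest reliability.
import numpy as np

rng = np.random.default_rng(0)
true_scores = rng.normal(loc=50, scale=10, size=500)  # latent "true" values

for error_sd in (2, 10):
    test = true_scores + rng.normal(0, error_sd, size=500)    # first measurement
    retest = true_scores + rng.normal(0, error_sd, size=500)  # repeated measurement
    r = np.corrcoef(test, retest)[0, 1]
    print(f"Error SD = {error_sd:>2}: test-retest correlation = {r:.2f}")
# Small error -> high reliability (about .96); large error -> low reliability (about .50).
```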
When teaching level of measurement, instructors
should keep scaffold learning in mind (Wood, Bruner, &
Ross, 1976). Because students are unlikely to have sub-
stantial experience with this concept, instructors should
introduce initial building blocks and then expand on
these concepts. For example, a parsimonious
representation of data is as either continuous or grouped.
Under these broad categories lie further distinctions. For
example, continuous data either have a fixed origin (i.e.,
ratio), such as unit sales, or they do not (i.e., interval), such
as shoe size. Grouped data can either be ranked (i.e., ordi-
nal), such as class standing, or they cannot (i.e., nominal),
such as gender. Student proficiency at differentiating
between various levels of measurement requires practice.
Further, we demonstrate how a given concept can be
measured at different levels, depending on how the mea-
sure is obtained. Returning to the customer loyalty exam-
ple, loyalty can be a ratio measure if people provide the
number of times they have purchased a brand in the
past year, an interval measure if people rate the number
of times they have purchased the brand in the past year on
a low/high scale, an ordinal measure if people select from
among several purchase frequency categories (e.g., 0, 1–5,
more than 5), or a nominal measure if people identify
which brands they have purchased in the past year. After
students demonstrate their mastery of these distinctions,
the instructor can introduce debates about topics such as
whether Likert scales are ordinal or interval.
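One way to reinforce these distinctions is to show the same loyalty information recoded at different levels and then ask which summaries remain meaningful. The sketch below uses hypothetical data in Python (pandas); the variable names are ours.

```python
# A minimal sketch (hypothetical data) showing one concept -- brand loyalty --
# captured at different levels of measurement, which constrains later analysis.
import pandas as pd

customers = pd.DataFrame({"purchases_past_year": [0, 3, 12, 7, 1]})  # ratio: fixed zero point

# Ordinal: ranked purchase-frequency categories (0, 1-5, more than 5).
customers["frequency_group"] = pd.cut(
    customers["purchases_past_year"],
    bins=[-1, 0, 5, float("inf")],
    labels=["0", "1-5", "more than 5"],
)

# Nominal: which brand the customer bought most recently (groups with no inherent order).
customers["last_brand"] = ["None", "BrandA", "BrandB", "BrandA", "BrandC"]

print(customers)
# A mean is sensible for the ratio column, but only counts or modes for the nominal one.
```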
Competency 3: Managing Datasets
What This Competency Entails
Students should be able to create and manipulate
datasets. While marketing employees are perhaps more
likely to use existing, rather than create new, datasets,
understanding this part of the analytics process enables
one to identify where problems may arise. For primary
data, this may involve developing and carrying out the
process to obtain the data (e.g., creating and administer-
ing a questionnaire or running a program to extract
information from the Internet), creating a coding
scheme (e.g., assigning numerical values to non-
numerical data), and entering the data into a computer
file. For secondary data, this may involve accessing exist-
ing data, oftentimes by navigating online data reposi-
tories, and entering the data into a computer file. For
both primary and secondary data, this competency
should include data cleaning, data (re)formatting, and
creating new variables. Data cleaning may involve iden-
tifying and removing outliers (Hodge & Austin, 2004),
looking for signs of diminished effort in survey takers
(Huang, Curran, Kenney, Poposki, & DeShon, 2012),
deleting specific cases, or handling missing data (Little
& Rubin, 2014). (Re)formatting the data may involve
specific variables (e.g., converting a date variable from
one format to another) or the entire file (e.g., converting
an Excel file to SAS or SPSS). Creating a new variable may
involve combining multiple variables (e.g., dividing “total time on site” by “number of pages visited” to obtain “average time per page”) or transforming
a single variable (e.g., taking the inverse of response
times to satisfy underlying distribution assumptions of
the analysis to be performed).
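For instructors who want a concrete illustration of these steps, the following sketch works through them on hypothetical web-analytics data: reformatting a date, combining two variables into a new one, and transforming a single variable. The column names are invented for the example.

```python
# A minimal sketch (hypothetical data) of the dataset-management steps named above.
import pandas as pd

visits = pd.DataFrame({
    "visit_date": ["01/15/2019", "01/16/2019", "01/16/2019"],
    "total_time_on_site": [300.0, 45.0, 960.0],   # seconds
    "pages_visited": [5, 1, 12],
})

# (Re)format a variable: convert the date strings to a standard datetime type.
visits["visit_date"] = pd.to_datetime(visits["visit_date"], format="%m/%d/%Y")

# Combine variables: average time per page.
visits["avg_time_per_page"] = visits["total_time_on_site"] / visits["pages_visited"]

# Transform a variable: inverse of time to reduce skew before later analysis.
visits["inv_time"] = 1 / visits["total_time_on_site"]

print(visits)
```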
Why Developing This Competency is a Challenge
Managing datasets requires a holistic view of the ana-
lytics process. Students must understand relationships
between variables, the nature of distributions, scale, why
a variable should be reverse coded, or when a difference
score, a weighted score, an averaged score, or
a standardized score would be appropriate. Many of
these transformations are akin to more abstract concepts
introduced in algebra, and students must understand
why such transformations are necessary. As with each of
the identified competencies, students often lack experi-
ence. Marketing students typically do not take courses
that develop generalized skills for creating and working
with datasets, as reflected by the results in Table 1.
When students use existing datasets, the ability to
establish this competency is, in part, a function of
representational data quality, or whether the data are
presented “in such a way that it is interpretable, easy
to understand, [and] easy to manipulate …” (Lee et al.,
2002, p. 135). Secondary data may have poor represen-
tational data quality due to insufficient documenta-
tion about the data collection process or what the
variables represent. Further, the analysis software
may not easily manipulate the data file due to various
incompatibilities.
In terms of data cleaning, students must have basic statistical knowledge to identify outliers and be familiar with imputation methods to handle missing data.
Further, students must know the capabilities of the
analysis software, such as whether it can handle char-
acter (string) data. When we ask students to create
datasets, we have observed that they may use incon-
sistent formats even for the same variable. For exam-
ple, they may use various units (e.g., sales entered as
units sold and dollar value), state the units for some
observations (e.g., entering “10 dollars”) but not
others (e.g., using “10” to represent 10 dollars), or
use both labels (e.g., “male”) and numbers (e.g., “1”
to represent a male). The challenge of (re)formatting
datasets is, in part, technical in nature. Students must
be proficient with, and understand the structure of
datasets created by, various software. Effectively refor-
matting data by creating new variables also requires
knowledge of measurement. For example, to create a measure of “web site engagement,” students should know that it is inappropriate to add measures of “time on site” and “number of pages visited.”
Effective Practices
Regarding the technical aspects of dataset creation,
we encourage instructors to expose students to various
software, such as Excel, SAS, SPSS, and R, to give stu-
dents experience with common dataset formats (Liu &
Levin, 2018). Further, instructors should assign exer-
cises that require students to merge datasets of the
same and different formats, create new variables,
obtain basic descriptive statistics to identify outliers
and missing data, and convert datasets from one for-
mat (e.g., Excel) to another (e.g., SPSS).
More generally, instructors must emphasize the need
to connect the research objectives (i.e., the information
needed by the decision-maker) to the analytics being
performed. Failing to keep research objectives in mind
leaves students in a quagmire and asking, “but how do I know to do that?” particularly when students must
create or transform variables. Introducing goals as orga-
nizing topics, and the hierarchical nature of these goals,
helps students understand why and when specific
actions are necessary. For example, the concept of case-
wise consistency, or having representative data points
for a given case across all variables of interest, can help to
organize the functions of data cleaning (e.g., imputation
or deletion). The concept of data reduction, or aiming
for the most parsimonious representation of the data,
can help to organize the function of creating new vari-
ables (e.g., assessing internal consistency, standardiza-
tion, weighting, reverse coding, and averaging).
To illustrate these points, an effective exercise has
students compare potential target markets by creating
a measure of economic strength for each market. We
provide students with two measures, average income
and unemployment rate, for each market, and we ask
them to combine these measures into a single
(parsimonious) measure of economic strength.
Although some students are tempted to simply add
the two measures, after reflection, they realize that
higher values for income are good, while higher values
for unemployment are bad. They also realize that the
two measures are provided in different units with dif-
ferent magnitudes. Thus, before combining the mea-
sures, students need to standardize both measures and
reverse code the unemployment measure. Further,
creating a dataset with a few markets that have incon-
sistent values, such as high unemployment and high
average income, gives students experience in identify-
ing these inconsistencies (i.e., case-wise consistency).
Including a few missing values allows students
to develop approaches for replacing these values (i.e.,
imputation).
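A worked version of this exercise can also be shown in software. The sketch below uses hypothetical market figures and mirrors the steps students discover: impute a missing value, standardize both measures, reverse-code unemployment, and average the results into a single index.

```python
# A minimal sketch (hypothetical data) of the target-market exercise described above.
import pandas as pd

markets = pd.DataFrame({
    "market": ["A", "B", "C", "D"],
    "avg_income": [52_000, 61_000, 48_000, None],   # one missing value to impute
    "unemployment_rate": [4.1, 3.2, 9.8, 5.0],
})

# Simple imputation for the missing income value (mean replacement).
markets["avg_income"] = markets["avg_income"].fillna(markets["avg_income"].mean())

# Standardize each measure (z-scores) so different units and magnitudes are comparable.
for col in ["avg_income", "unemployment_rate"]:
    markets[f"z_{col}"] = (markets[col] - markets[col].mean()) / markets[col].std()

# Reverse-code unemployment (higher should mean a weaker economy) and average into an index.
markets["economic_strength"] = (markets["z_avg_income"] - markets["z_unemployment_rate"]) / 2
print(markets[["market", "economic_strength"]])
```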
Competency 4: Analyzing Data
What This Competency Entails
Upon finalizing the dataset, students must be able
to conduct appropriate analyses. This requires under-
standing the research question at hand, which vari-
ables should be used, the nature of these variables
(i.e., level of measurement), what relationships or
effects are being investigated, what analysis is appro-
priate for estimating the relationships or effects, and
what software should be used. Analyses range in com-
plexity from basic univariate descriptive statistics (e.g.,
frequency distribution, mean, standard deviation), to
bivariate analyses (e.g., correlations, t-tests, simple lin-
ear regression), to analyses with three or more vari-
ables (e.g., multiple linear regression, moderation,
mediation, factor analysis, structural equation model-
ing). Liu and Burns (2018) highlight analysis techni-
ques that are relevant to analytics-related jobs,
including predictive modeling and data mining.
Why Developing This Competency is a Challenge
There is often a steep learning curve for mastering
software and specific analysis techniques. Given the
amount and type of data now available, common ana-
lyses have moved beyond simple descriptive statistics
and univariate analyses. To obtain maximum value
from data, students should be able to identify non-
linear, stochastic, and unobservable phenomena. The
analysis techniques for doing so are inherently
complex, with many underlying assumptions. While
friendlier interfaces, such as drop-down menus, have
enhanced the usability of statistical software, it can be
advantageous for users to have more control over the
analysis. Consequently, there may be value in students
learning to write analysis syntax in software such as R,
SPSS, STATA, MATLAB, and SAS.
While students can fairly easily learn to mimic the
process required to run specific analyses, a bigger chal-
lenge in developing this competency may involve
knowing which specific analysis is appropriate.
Students must be able to align (more abstract) research
questions with (more concrete) analyses. As noted, this
requires students to connect several dots. For example,
if the research question is to “understand the relation-
ship between age and loyalty,” students need to know
how age and loyalty are measured (i.e., the level of
measurement of each), which analyses are appropriate
for these types of measures, and which analyses enable
one to quantify a relationship. Only then can students
follow the steps necessary to run appropriate analyses.
Effective Practices
There is no substitute for practice; mastering data
analysis requires students to have substantial experi-
ence conducting various analyses. To develop insight
into which analysis to run, an effective exercise
involves presenting students with a research question
and variations on the data available, and then having
students identify the appropriate analysis. For example, for the research question “what is the relationship between age and purchase frequency,” both age and purchase frequency could have ordinal (categories) or ratio (raw numbers) levels of measurement. As such, appropriate analysis could involve cross-tabs or regression. Presenting variations on the research question, such as “what is the effect of age on purchase frequency,” fosters discussion about differences between “effects” and “relationships” and other signals that are important for determining the appropriate analysis.
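The same exercise can be demonstrated in software. The sketch below, with hypothetical data and assuming the pandas and statsmodels libraries are available, runs a regression when both measures are ratio level and a cross-tab when they are grouped.

```python
# A minimal sketch (hypothetical data) of how the level of measurement steers the
# analysis choice: regression for ratio measures, cross-tabs for grouped measures.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "age": [22, 34, 45, 51, 63, 29, 38, 57],
    "purchases": [2, 4, 6, 7, 9, 3, 5, 8],
})

# Ratio-level measures: a simple linear regression quantifies the relationship.
model = sm.OLS(df["purchases"], sm.add_constant(df["age"])).fit()
print(model.params)  # slope: change in purchases per additional year of age

# Grouped (ordinal) versions of the same measures: a cross-tab is appropriate instead.
df["age_group"] = pd.cut(df["age"], bins=[0, 35, 50, 120], labels=["<=35", "36-50", "50+"])
df["freq_group"] = pd.cut(df["purchases"], bins=[0, 3, 6, 100], labels=["low", "mid", "high"])
print(pd.crosstab(df["age_group"], df["freq_group"]))
```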
Further, we build from analyses and concepts with
which students are comfortable. Our students reported
relatively high levels of comfort in creating and inter-
preting graphs in Excel (see Table 1). When students
are able to create appropriate graphs with, for exam-
ple, error bars or scatterplots, this is a springboard to
asking “how do you know if these are significant?” As
students will have already considered the research
question and level of measurement to create the
graph, applying a t-test, analysis of variance, or corre-
lation becomes an easier leap for students. Upon soli-
difying these basic tests, the instructor can teach more
complex tests by building upon prior knowledge and
using the research question as a guiding goal.
Competency 5: Interpreting Results
What This Competency Entails
Students must be able to explain the results of their
analyses. Interpretation may involve, for example,
determining statistical significance, the nature of any
effects (e.g., positive versus negative relationships), the
practical significance of the effects (i.e., effect sizes),
and identifying and describing trends.
Why Developing This Competency is a Challenge
Findings by Pilling et al. (2012) support the asser-
tion that interpreting data is challenging. A student’s
ability to develop this competency is a function of the
complexity of the analysis. While interpreting results
may be a relatively easy competency to develop for
basic univariate analyses, interpretation is more chal-
lenging for advanced analyses. For example, multiple
tests require adjustments to the significance level (e.g.,
Bonferroni corrections), and logistic regression
requires students to interpret the nonintuitive logit
function (i.e., the log of the odds).
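Two of these interpretation steps lend themselves to a brief numerical illustration. The sketch below uses hypothetical numbers: a Bonferroni-adjusted threshold for multiple tests, and the conversion of a logistic regression coefficient into an odds ratio.

```python
# A minimal sketch (hypothetical numbers) of two interpretation steps named above.
import numpy as np

# Bonferroni: testing 5 hypotheses at an overall alpha of .05 means each test
# is judged against .05 / 5 = .01.
alpha, n_tests = 0.05, 5
print(f"Per-test threshold: {alpha / n_tests:.3f}")

# Logistic regression: a coefficient of 0.40 on a (hypothetical) loyalty-program indicator
# means the odds of purchase are exp(0.40), or about 1.49 times higher for members.
coefficient = 0.40
print(f"Odds ratio: {np.exp(coefficient):.2f}")
```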
Students must interpret not only statistical signifi-
cance but also practical significance. While various
guidelines have been developed to help with the inter-
pretation of practical significance (e.g., a correlation
greater than .8 is “very strong”), these rule-of-thumb
approaches are perhaps more harmful than helpful.
They discourage students from thinking about the spe-
cific research context, which is critical to evaluating
effect sizes. A correlation of .6 could be extremely large
in some marketing realms and extremely small in others.
Not all interpretation requires assessment of statis-
tical significance. However, even students who intui-
tively understand basic descriptive statistics, such as
means, medians, and percentages, may struggle to
master this competency. We have found that the
common “dashboard” approach, in which numerous
metrics are reported to provide a holistic view of per-
formance, can be overwhelming. Students are often
unsure of how to integrate multiple results into
a parsimonious and useful decision-making tool.
Effective Practices
Beyond the simple recommendation of ample prac-
tice, we have found that instructors need to shift stu-
dents’ thought processes from systematic and linear to
broader and more holistic. One way to accomplish this
goal is by having students write about their findings,
making the best argument they can to support their
conclusion. Research reveals that writing in this way
leads people to think more critically and build a more
complete picture of the decision problem, thus reducing
bias and leading to better decisions (Sieck & Yates, 1997).
Research on developing effective critical thinking
skills also suggests that simple prompt questions such
as “Do you have all the necessary information?” and “Is there any conflict in the evidence?” can spur students to think more deeply about what the results
mean (Helsdingen, van Gog, & van Merriënboer,
2011). Doing so encourages students to consider the
specific conditions under which the data were col-
lected, thus helping students interpret the practical
significance of their findings. Given that practical sig-
nificance is context dependent, instructors should
emphasize to students the importance of ignoring gen-
eral effect size guidelines, but instead consider the
context that led to the observed effects.
One exercise we use that encourages students to con-
sider whether they have all the necessary information
involves analyzing data from a retail outlet. We present
students with customer satisfaction data, for both week-
days and weekends, regarding (1) store cleanliness, (2)
product selection, (3) staff friendliness, and (4) checkout
times. We also provide sales volume (in dollars). We first
have students analyze the results across all days, and
they draw general conclusions regarding customer satis-
faction (e.g., “Customers are, on average, satisfied with
product selection”). We then ask students whether it is
possible that these conclusions are not always true. From
personal experience, students posit that shopping on
weekdays and weekends may differ. Thus, we have stu-
dents analyze the data for weekdays and weekends
separately, and a more nuanced picture emerges.
Students find that customer satisfaction with store clean-
liness and product selection is lower on weekends than
weekdays, while customer satisfaction with staff friendli-
ness and checkout times is high for both weekends and
weekdays. Sales volume is higher on weekends than
weekdays. Students often struggle initially to understand what these results suggest. Upon reflection, they reach the (correct) conclusion that during the weekend,
when sales volume is high, the store is shorthanded.
Employees focus on being friendly and checking out
customers promptly, but they do not have enough
time to restock shelves and tidy up between customers.
The only way to arrive at this conclusion is to look at
multiple results and synthesize the pattern of findings
into a coherent account. This exercise also provides prac-
tice with the common dashboard approach to marketing
metrics.
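A compact version of this exercise can be scripted so students see how the aggregate view masks the weekday/weekend pattern. The data in the sketch below are hypothetical and chosen only to reproduce the pattern described above.

```python
# A minimal sketch (hypothetical store data) of the weekday/weekend exercise:
# overall averages hide a pattern that a simple group comparison reveals.
import pandas as pd

daily = pd.DataFrame({
    "day_type": ["weekday"] * 4 + ["weekend"] * 2,
    "cleanliness": [6.1, 6.0, 5.9, 6.2, 4.3, 4.5],
    "selection": [5.8, 5.9, 6.0, 5.7, 4.6, 4.4],
    "friendliness": [6.2, 6.1, 6.3, 6.0, 6.1, 6.2],
    "checkout": [5.9, 6.0, 5.8, 6.1, 5.9, 6.0],
    "sales_dollars": [18_000, 17_500, 19_000, 18_200, 31_000, 33_500],
})

print(daily.drop(columns="day_type").mean().round(2))  # overall view looks fine
print(daily.groupby("day_type").mean().round(2))       # weekend cleanliness/selection drop
```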
As another exercise, we ask students if a five percent unemployment rate for the United States is indicative of a strong economy. As students
are often unfamiliar with the actual unemployment
rate, their initial reactions are usually based on
whether five percent “feels” high or low. When
asked if they have all the necessary information to
draw a conclusion, students realize they do not, and
we have them look up historical US unemployment
rates from the US Bureau of Labor Statistics web site.
At different points over the past 70 years, a five percent unemployment rate could indicate a strong or weak economy, depending on recent
past economic conditions. We then explain how
official unemployment rates are computed, and we
ask whether there is any evidence that conflicts
with their conclusion about the strength of the
economy. When students understand that unem-
ployment rates exclude people who have dropped
out of the labor force, they realize that a low unem-
ployment rate does not necessarily indicate a strong
economy, as a large number of people may have
removed themselves from the labor force due to
being unable to find employment. This example
also illustrates the ability to politicize metrics, as
political parties often focus on one metric (e.g.,
unemployment rate) over another (e.g., number of
people not in the labor force).
Competency 6: Communicating Results
What This Competency Entails
Students must learn to communicate the results to
decision makers, which may involve written reports,
including appropriate tables, charts, graphs, or other
data visualization techniques, and oral presentations.
Critically, students must be able to convey the results
succinctly and in ways that make them managerially
actionable.
Why Developing This Competency is a Challenge
Although students need effective communication
skills for conveying technical information such as sta-
tistical results, they typically receive limited training
in this area (Brownell, Price, & Steinman, 2013; Wright
& Larsen, 2016). This may be particularly true in busi-
ness disciplines, where students are often required to
take a course in Business Writing but not a course in
Technical Writing. Consequently, students learning
analytics typically lack the skills necessary to commu-
nicate their findings effectively. While numerous
charts, graphs, tables, and other visualization tools
can facilitate communication of decision-relevant
information, these tools are useless if students lack
either the technical skills to create them or knowledge
about how to best format them (e.g., labeling points
and axes, appropriate scale, headings and titles).
While insufficient training is likely a major reason that
students struggle to communicate their findings, another
impediment is the curse of knowledge (e.g., Camerer,
Loewenstein, & Weber, 1989). Heath and Heath (2007) state, “…when we know something, it becomes hard for us to imagine not knowing it. As a result, we become lousy communicators.” Students who have become proficient at analyzing data and at the other competencies described here may have difficulty relating to people who do not have similar levels of knowledge. The disconnection between statistical analyses and actionable
decisions is likely to be amplified if the decision-maker
lacks sufficient statistical knowledge. It may be unclear to
the decision-maker how a regression coefficient or
a p-value relates to the decision at hand. For these rea-
sons, communicating results is likely to be difficult.
Effective Practices
Several organizations have created programs designed to improve the communication of technical information. Among these, the Alan Alda Center for Communicating Science at Stony Brook University offers courses, workshops, and outreach (http://www.centerforcommunicatingscience.org/). The American Association for the Advancement of Science created the Center for Public Engagement with Science and Technology, which offers a communication toolkit (https://www.aaas.org/pes). The National EMSC Data Analysis Resource Center (NEDARC) provides guidelines for effectively communicating statistics (http://www.nedarc.org/tutorials/utilizingData/index.html). Example guidelines include: (1) Do not overload the client with statistics. Instead, present only meaningful results that convey the size of the issue, establish the appropriate context, and are new or unique findings. (2) Avoid statistical terminology (e.g., “statistically significant,” “p-value”). Instead, use language that the client is likely to understand (e.g., “more likely” or “less likely”). (3) In general, use words to convey critical points instead of numbers. Specific to the realm of marketing analytics, Xavier University created the Master of Science in Customer Analytics degree (Houghton et al., 2018). This program develops storytelling skills to enhance students’ ability to communicate analytics to decision makers.
More generally, communicating results requires stu-
dents to abandon rote plug-in-the-answer thinking and
embrace a deeper understanding of what is to be
described (Johnson, 2016; Radke-Sharpe, 1991). Scaffolding can aid the transition from “just tell me what to say” to “how do I explain this?” For instance, when introducing bivariate correlation, we provide examples of how to communicate the results, such as “There was a large positive relationship between consumers’ ratings of how cool and how innovative they found the product to be, r = .76, n = 260. The more innovative consumers considered the product, the cooler they also considered it.” For any associated exercises, we initially
encourage students to simply mimic the wording and
format of these examples. As they gain experience and
begin selecting analyses that they deem appropriate, stu-
dents often refer to our examples to recall how to com-
municate results. With practice, students are able to
interpret and communicate results without help.
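The wording template can even be generated alongside the statistic, which reinforces the link between the analysis and the sentence that reports it. The sketch below uses hypothetical ratings and SciPy’s correlation function; the numbers will not match the r = .76 example above.

```python
# A minimal sketch (hypothetical ratings) of moving from a statistic to a plain-language
# sentence, mirroring the wording template shown above.
import numpy as np
from scipy import stats

cool = np.array([5, 6, 4, 7, 3, 6, 5, 7, 4, 6])
innovative = np.array([4, 6, 4, 7, 2, 5, 5, 7, 3, 6])

r, p = stats.pearsonr(cool, innovative)
# p is available if the audience needs it, but the guidelines above favor plain language.
print(f"There was a large positive relationship between consumers' ratings of how cool "
      f"and how innovative they found the product, r = {r:.2f}, n = {len(cool)}. "
      f"The more innovative consumers considered the product, the cooler they also considered it.")
```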
As for overcoming the curse of knowledge, students
must first recognize that it exists. We stress to students
that many people, including the clients or decision
makers requesting data, are unlikely to understand sta-
tistical concepts such as p-values, regression coefficients,
and measures of dispersion. In communicating analytics
to clients, one should get to know the client, including
her/his level of statistical knowledge. Before communicating with the client, the analyst should test and refine the communications on someone less knowledgeable (suggestions provided by http://www.nicholasreese.com/curse-of-knowledge/).
CONCLUSIONS
As the importance of marketing analytics continues to
grow, departments seeking to integrate analytics into
their curricula should focus on developing the six critical
competencies described here. Specifically, students must
learn to assess data quality and measurement concepts,
manage datasets, appropriately analyze data, interpret
results, and communicate the results to clients or deci-
sion-makers. Someone lacking any of these competen-
cies is unlikely to fully realize analytics’ benefits. We also
encourage instructors to supplement the competencies
identified here with a discussion of ethical considera-
tions associated with using analytics and “big data,”as
organizations must consider the types of data they col-
lect about their customers and how they use these data
(Corrigan, Craciun, & Powell, 2014).
Our goal was to present the competencies and, more
importantly, challenges students are likely to face in
developing them in a marketing context. In doing so,
we have pointed instructors to practices, resources, and
theoretically-based recommendations that they can uti-
lize in developing the competencies, many of which we
employ in our classes. The obvious question is whether
teaching analytics courses with an eye toward develop-
ing the competencies presented here is effective. In our
classes referenced in Table 1, we surveyed students at the
end of the semester using the same questions from the
beginning of the semester. Table 1 presents the results in
the column labeled “End of Semester.” While the per-
ceived difficulty of statistics and working with data did
not change, students’ perceptions of their experience
with statistics and data did increase significantly.
Importantly, after being exposed to many of the
practices and examples described here, students felt sig-
nificantly more knowledgeable or comfortable with all
of the specific course components that related to the
various competencies. At a higher level, as shown in
Figure 1,studentsbecamesignificantly more comforta-
ble with (M
Beginning
=4.52,M
End
=5.69,t
122
=11.01,
p< .001) and interested in (M
Beginning
=4.27,M
End
= 4.63,
t
122
= 3.25, p= .001) analytics from the beginning to the
end of the semester.
How should a marketing department modify its
curriculum to develop these competencies? Building
the competencies by integrating them into existing
courses is likely to be difficult (Saber & Foster, 2011),
due both to time and instructor interest/expertise con-
straints (Mintu-Wimsatt & Lozada, 2018). A more
effective approach involves creating stand-alone ana-
lytics-based courses, as advocated by Pilling et al.
(2012) and Liu and Burns (2018), who provide gui-
dance related to specific educational goals, instruc-
tional plans, metrics and analysis tools to be covered,
and evaluation criteria. Wilson et al. (2018), Liu and
Levin (2018), and LeClair (2018) highlight how such
a course could fit into the marketing curriculum, and
the competencies discussed in the current research
complement these recommendations.
We agree with Liu and Burns (2018) that a single ana-
lytics survey course offers value by exposing students to
relevant topics, but is unlikely to create analytics profi-
ciency. Further, a single course may become isolated, dis-
couraging integration of the topic across the curriculum
(LeClair, 2018). For these reasons, we advocate for
a sequence of courses, each focusing on one or more of
the competencies and presented from a marketing per-
spective. A number of universities have developed
undergraduate and/or graduate marketing analytics con-
centrations. We examined concentrations offered by 10
US universities, and several points are noteworthy. First,
concentrations are typically advertised as requiring three
or four courses. However, these courses often have prere-
quisites, suggesting that, in practice, analytics proficiency
requires more than the three or four advertised
courses. Second, perhaps due to resource and expertise
constraints, marketing departments typically outsource at
least some courses to other departments (Liu & Levin,
2018), such as Computer Science, Math/Statistics, and
Communications. Third, most of the concentrations we
examined include at least some existing courses that have
been packaged to create a cluster of analytics-related
courses. In some cases, analytics concentrations (or
courses) are simply rebranded "research" concentrations
(or courses), with new titles (e.g., “Marketing Research and
Analytics”). Fourth, marketing analytics concentrations
are sometimes owned by other departments, such as
math or statistics. This suggests that if marketing
departments are not willing or able to offer
a concentration, strong demand is leading other depart-
ments to fill the void.
While these approaches are understandable, they are
likely to lead to concentrations that do not adequately
develop the competencies identified here or are not well
integrated in the curriculum (LeClair, 2018; Mintu-
Wimsatt & Lozada, 2018). Assuming the opportunity to
develop a marketing analytics concentration from
scratch, Table 2 presents a proposed curriculum map for
a five-course concentration based on the six competencies
presented here. For each course, we describe the compe-
tencies to be developed, examples of major topics cov-
ered, and basic pedagogical approaches for doing so. The
Table 2
Proposed Five-Course Marketing Analytics Concentration
Course 1:
Marketing Data and
Information Management
Competencies Developed Assessing data quality, understanding measurement, managing datasets
Major Topics Covered Secondary sources of marketing data (e.g., governments, trade associations);
Acquiring primary marketing data (e.g., survey design, web scraping
software, in-store scanners, consumer panels, experimentation); sampling;
construct definition; reliability; validity; level of measurement; managing
datasets/databases (creating, merging, formatting, creating new variables)
for common analytics software (e.g., Excel, R, SAS)
Pedagogical
Approach(es)
Lecture-based course with numerous exercises designed to familiarize students
with common sources of marketing data, evaluating data, and preparing
data for analysis
Course 2:
Marketing Analytics I
Competency Developed Analyzing data
Major Topics Covered Univariate and bivariate analyses (e.g., descriptive statistics, t-tests, chi-square
tests, simple regression); marketing metric dashboards, including commonly
used metrics; Google Analytics/Adwords; common analysis software
Pedagogical
Approach(es)
Lecture-based course with numerous exercises designed to familiarize students
with common analyses, dashboards, and analysis software
Course 3:
Marketing Analytics II
Competency Developed Analyzing data
Major Topics Covered Multivariate analyses, including multiple regression, cluster analysis,
multidimensional scaling, factor analysis, predictive modeling; search engine
optimization; textual analysis; data mining; advanced database management
(e.g., SQL)
Pedagogical Approaches Lecture-based course with numerous exercises designed to familiarize students
with more advanced analyses and database management
Course 4:
Communicating Analytics for
Effective Decision Making
Competencies Developed Interpreting results, communicating results
Major Topics Covered Managerial implications; written and oral communications; data visualization
(e.g., Tableau)
Pedagogical Approaches Case-based course with multiple analytics-focused case studies, with associated
data files for analysis, that require oral and/or written presentations;
Supplemental exercises to gain experience in interpreting and presenting
results from analyses covered in previous two courses
Course 5:
Marketing Analytics
Practicum/
Internship
Competencies Developed Assessing data quality, understanding measurement, managing datasets,
analyzing data, interpreting results, communicating results
Major Topics Covered Depends on needs of organizational partner
Pedagogical Approach Real-world project-based course to tie competencies together
14 Marketing Education Review
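To make the Course 1 dataset-management topics concrete (the sketch referenced in that row), the example below merges two small datasets and creates a new variable. It is a minimal illustration in Python's pandas library, one of several tools that could serve alongside the Excel, R, and SAS examples in the table; the column names, values, and the high_value flag are hypothetical, not prescribed course content.

```python
# Minimal sketch (hypothetical data): merge two customer-level datasets and
# create a new variable, as in the Course 1 dataset-management topics.
import pandas as pd

# Hypothetical survey responses and transaction records keyed by customer_id.
survey = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "satisfaction": [6, 4, 7, 5],
})
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 4, 4],
    "spend": [20.0, 35.0, 15.0, 50.0, 10.0, 25.0],
})

# Aggregate transactions per customer, merge with the survey data,
# and create a new variable flagging above-median spenders.
total_spend = transactions.groupby("customer_id", as_index=False)["spend"].sum()
merged = survey.merge(total_spend, on="customer_id", how="left")
merged["high_value"] = merged["spend"] > merged["spend"].median()

print(merged)
```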
The first four courses would be in-class (or online) courses
designed to provide necessary structure for students. The
final course would be an internship or practicum, con-
ducted in conjunction with an outside organization,
designed to provide experiential learning opportunities.
As LeClair (2018, p. 9) argues, and we agree, "it is important for students to be
exposed to and deal with the real imperfections of data." The courses could be structured
around a single topic, such as pricing, branding, or adver-
tising. However, we encourage instructors to include exer-
cises and examples from various topics to impress upon
students the value of analytics in each of these areas.
While specific details of the courses would depend on
class size, available resources (e.g., software), and various
other factors, we refer the interested reader to suggestions
by Liu and Levin (2018, pp. 18–20).
For some students, analytics will become a career
path. Most others will interact with analytics in manage-
rial roles, or analytics will influence how they perform
their jobs. Regardless, it is becoming difficult to
avoid analytics in the workplace. Five of the top ten skills
employers recently said they desired in college graduates
were the ability to make decisions and solve problems
(#1), the ability to obtain and process information (#5),
the ability to analyze quantitative data (#6), proficiency
with computer software programs (#8), and the ability to
create written reports (#9) (Adams, 2014). These desir-
able skills highlight the importance of, and align nicely
with, the analytics competencies identified here.
DISCLOSURE STATEMENT
No potential conflict of interest was reported by the
authors.
NOTES
2. http://www.centerforcommunicatingscience.org/
3. https://www.aaas.org/pes
4. http://www.nedarc.org/tutorials/utilizingData/index.html
5. Suggestions provided by http://www.nicholasreese.com/curse-of-knowledge/

REFERENCES
Adams, S. (2014, November 12). The 10 skills employers
most want in 2015 graduates. Forbes. Retrieved from
https://www.forbes.com/sites/susanadams/2014/11/12/
the-10-skills-employers-most-want-in-2015-graduates
/#e498e4225116
Aggarwal, P., Vaidyanathan, R., & Rochford, L. (2007). The
wretched refuse of a teeming shore? A critical examina-
tion of the quality of undergraduate marketing
students. Journal of Marketing Education,29(3),
223–233. doi:10.1177/0273475307306888
Aragón, O. R., Dovidio, J. F., & Graham, M. J. (2016).
Colorblind and multicultural ideologies are associated
with faculty adoption of inclusive teaching practices.
Journal of Diversity in Higher Education. doi:10.1037/
dhe0000026
Ariker, M., Diaz, A., Moorman, C., & Westover, M. (2015,
November 5). Quantifying the impact of marketing
analytics. Harvard Business Review. Retrieved from
https://hbr.org/2015/11/quantifying-the-impact-of-mar
keting-analytics
Ballou, D. P., & Pazer, H. L. (1985). Modeling data and
process quality in multi-input, multi-output informa-
tion systems. Management Science,31(2), 150–162.
doi:10.1287/mnsc.31.2.150
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of
behavioral change. Psychological Review,84(2), 191–215.
doi:10.1037//0033-295x.84.2.191
Brownell, S. E., Price, J. V., & Steinman, L. (2013). Science
communication to the general public: Why we need to
teach undergraduate and graduate students this skill as
part of their formal training. Journal of Undergraduate
Neuroscience Education,12(1), 6–10.
Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse
of knowledge in economic settings: An experimental
analysis. Journal of Political Economy,97(5), 1232–1254.
doi:10.1086/261651
Cavanagh, A. J., Aragón, O. R., Chen, X., Couch, B.,
Durham, M., Bobrownicki, A., …Graham, M. J.
(2016). Student buy-in to active learning in a college
science course. Life Sciences Education,15(4), ar76.
doi:10.1187/cbe.16-07-0212
Chew, P. K., & Dillon, D. B. (2014). Statistics anxiety update:
Refining the construct and recommendations for a new
research agenda. Perspectives on Psychological Science,9
(2), 196–208. doi:10.1177/1745691613518077
Churchill, G. A., Jr. (1979). A paradigm for developing better
measures of marketing constructs. Journal of Marketing
Research,16(1), 64–73. doi:10.1177/002224377901600110
CMO Survey. (2015, August). CMO survey report: Highlights
and insight. Retrieved from https://cmosurvey.org/
results-august-2016/survey-results-august-2015/
Corrigan, H. B., Craciun, G., & Powell, A. M. (2014). How
does Target know so much about its customers?
Utilizing customer analytics to make marketing
decisions. Marketing Education Review,24(2), 159–166.
doi:10.2753/MER1052-8008240206
Davis, R., Misra, S., & Van Auken, S. (2002). A gap analysis
approach to marketing curriculum assessment: A study
of skills and knowledge. Journal of Marketing Education,
22, 218–224. doi:10.1177/0273475302238044
Delone, W. H., & McLean, E. R. (1992). Information systems
success: The quest for the dependent variable.
Information Systems Research,3(1), 60–95. doi:10.1287/
isre.3.1.60
DeRoma, V. M., Martin, K. M., & Kessler, M. L. (2003). The
relationship between tolerance for ambiguity and need
for course structure. Journal of Instructional Psychology,
30(2), 104–110.
Dweck, C. S. (1986). Motivational processes affecting learning.
American Psychologist, 41(10), 1040–1048. doi:10.1037/
0003-066X.41.10.1040
Edwards-Levy, A. (2016, November 15). Most Americans are
surprised by and unhappy with the election results. The
Huffington Post. Retrieved from http://www.huffington
post.com/entry/election-results-americans-
surprised_us_582b82c3e4b01d8a014b13bd
Fredrickson, B. L. (2004). The broaden-and-build theory of
positive emotions. Philosophical Transactions of the Royal
Society B Biological Sciences,359(1449), 1367–1378.
doi:10.1098/rstb.2004.1512
Gilliam, D. A., & Voss, K. (2013). A proposed procedure for
construct definition in marketing. European Journal of
Marketing,47(1/2), 5–26. doi:10.1108/03090561
311285439
Goodhue, D. L. (1995). Understanding user evaluations of
information systems. Management Science,41(12),
1827–1844. doi:10.1287/mnsc.41.12.1827
Heath, C., & Heath, D. (2007). Made to stick: Why some ideas
survive and others die. New York, NY: Random House.
Helsdingen, A., van Gog, T., & van Merriënboer, J. (2011).
The effects of practice schedule and critical thinking
prompts on learning and transfer of a complex judg-
ment task. Journal of Educational Psychology,103(2),
383–398. doi:10.1037/a0022370
Hernandez, P. R., Schultz, P. W., Estrada, M., Woodcock, A.,
& Chance, R. C. (2013). Sustaining optimal motivation:
A longitudinal analysis of interventions to broaden par-
ticipation of underrepresented students in STEM.
Journal of Educational Psychology,105(1), 89–107.
doi:10.1037/a0029691
Hodge, V. J., & Austin, J. (2004). A survey of outlier detection
methodologies. Artificial Intelligence Review,22(2),
85–126. doi:10.1023/B:AIRE.0000045502.10941.a9
Houghton, D. M., Schertzer, C., & Beck, S. (2018). The MSCA
program: Developing analytics unicorns. Marketing
Education Review,28(1), 41–51. doi:10.1080/
10528008.2017.1409078
Huang, J. L., Curran, P. G., Kenney, J., Poposki, E. M., &
DeShon, R. P. (2012). Detecting and deterring insufficient
effort in responding to surveys. Journal of Business and
Psychology,27,99–114. doi:10.1007/s10869-011-9231-8
Interactive Advertising Bureau. (2009). Measurement protocols
and guidelines. Retrieved from https://www.iab.com/
guidelines/measurement-protocols-guidelines/.
Jarke, M., & Vassiliou, Y. (1997). Data warehouse quality:
A review of the DWQ project. In D. Strong & B. Kahn
(Eds.), Proceedings of the 1997 conference on information
quality (pp. 299–313). Cambridge, MA: MIT Press.
Johnson, K. G. (2016). Incorporating writing into statistics.
In J. Dewar, P. Hsu & H. Pollatsek (Eds.), Mathematics
Education: A Spectrum of Work in Mathematical
Sciences Departments (pp. 319-334). Cham, Switzerland:
Springer.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and
development. Upper Saddle River, NJ: Prentice Hall.
LaBarbera, P. A., & Simonoff, J. (1999). Toward enhancing
the quality and quantity of marketing students. Journal
of Marketing Education,21,4–13. doi:10.1177/
0273475399211002
LeClair, D. (2018). Integrating business analytics in the mar-
keting curriculum: Eight recommendations. Marketing
Education Review,28(1), 6–13. doi:10.1080/
10528008.2017.1421050
Lee, Y. W., Strong, D. M., Kahn, B. K., & Wang, R. Y. (2002).
AIMQ: A methodology for information quality
assessment. Information and Management,40, 133–146.
doi:10.1016/S0378-7206(02)00043-5
Little, R. J., & Rubin, D. B. (2014). Statistical analysis with
missing data. Hoboken, NJ: Wiley & Sons.
Liu, X., & Burns, A. C. (2018). Designing a marketing analy-
tics course for the digital age. Marketing Education
Review,28(1), 28–40. doi:10.1080/
10528008.2017.1421049
Liu, Y., & Levin, M. A. (2018). A progressive approach to
teaching analytics in the marketing curriculum.
Marketing Education Review,28(1), 14–27. doi:10.1080/
10528008.2017.1421048
Mintu-Wimsatt, A., & Lozada, H. R. (2018). Business analy-
tics in the marketing curriculum: A call for integration.
Marketing Education Review, 28(1), 1–5. doi:10.1080/
10528008.2018.1436974
Pilling, B. K., & Nasser. (2015). The early identification of
at-risk students in an undergraduate marketing metrics
course. Analytic Marketing Journal,4(1), 89–106.
Pilling, B. K., Rigdon, E. E., & Brightman, H. J. (2012).
Building a metrics-enabled marketing curriculum: The
cornerstone course. Journal of Marketing Education,34
(2), 179–193. doi:10.1177/0273475312450390
Pirog, S. F., III. (2010). Promoting statistical analysis in the
marketing curriculum: A conjoint analysis exercise.
Marketing Education Review,20, 249–254. doi:10.2753/
MER1052-8008200305
Pirsig, R. M. (1974). Zen and the art of motorcycle maintenance:
An inquiry into values. New York, NY: HarperCollins
Publishers.
Radke-Sharpe, N. (1991). Writing as a component of statis-
tics education. The American Statistician,45(4), 292–293.
Rossiter, J. R. (2005). Reminder: A horse is a horse.
International Journal of Research in Marketing,22(1),
23–25. doi:10.1016/j.ijresmar.2004.11.001
Saber, J. L., & Foster, M. K. (2011). The agony and the
ecstasy: Teaching marketing metrics to undergraduate
business students. Marketing Education Review,21(1),
9–20. doi:10.2753/MER1052-8008210102
Schau, C., Stevens, J., Dauphinee, T. L., & Del Vecchio, A.
(1995). The development and validation of the survey
of attitudes toward statistics. Educational and
Psychological Measurement,55,868–875. doi:10.1177/
0013164495055005022
Schlee, R. P., & Harich, K. R. (2010). Knowledge and skill
requirements for marketing jobs in the 21st century.
Journal of Marketing Education,32(3), 341–352.
doi:10.1177/0273475310380881
Sieck, W., & Yates, J. (1997). Exposition effects on deci-
sion making: Choice and confidence in choice.
Organizational Behavior and Human Decision
Processes,70(3), 207–219. doi:10.1006/obhd.
1997.2706
Spielberger, C. D. (2013). The effects of anxiety on com-
plex learning. In C. Spielberger (Ed.), Anxiety and
behavior (pp. 361–398). New York, NY: Elsevier
Science.
Velleman, P. F., & Wilkinson, L. (1993). Nominal, ordinal,
interval, and ratio typologies are misleading. The
American Statistician,47,65–72.
Vygotsky, L. S. (1978). Mind in society. Cambridge, MA:
Harvard University Press.
Wand, Y., & Wang, R. Y. (1996). Anchoring data quality
dimensions in ontological foundations. Communications
of the ACM,39(11), 86–95. doi:10.1145/240455.240479
Wang, R. Y., & Strong, D. M. (1996). Beyond accuracy: What
data quality means to data consumers. Journal of
Management Information Systems,12(4), 5–34.
doi:10.1080/07421222.1996.11518099
Westra, A., & Nwaoha-Brown, F. (2017). Nonresponse bias
analysis for wave 1 of the 2014 Survey of Income and
Program Participation (SIPP). US Census Bureau
Memorandum. Retrieved from https://www2.census.
gov/programs-surveys/sipp/tech-documentation/com
plete-documents/2014/2014_SIPP_Wave_1_
Nonresponse_Bias_Report.pdf
Wilson, E. J., McCabe, C., & Smith, R. S. (2018). Curriculum
innovation for marketing analytics. Marketing Education
Review,28(1), 52–66. doi:10.1080/10528008.2017.
1419431
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring
in problem solving. Journal of Child Psychology and
Psychiatry,17(2), 89–100. doi:10.1111/j.1469-7610.1976.
tb00381.x
Wright, N. D., & Larsen, V. (2016). Improving marketing
students’writing skills using a one- page paper.
Marketing Education Review,26(1), 25–32. doi:10.1080/
10528008.2015.1091666
Zmud, R. (1978). Empirical investigation of the dimensionality
of the concept of information. Decision Sciences,9(April),
187–189. doi:10.1111/j.1540-5915.1978.tb01378.x