INTEGRATING ANALYTICS INTO MARKETING CURRICULA: CHALLENGES AND EFFECTIVE PRACTICES FOR DEVELOPING SIX CRITICAL COMPETENCIES

Danny Weathers and Oriana Aragón
College of Business, Clemson University, Clemson, SC, USA

Marketing Education Review, vol. 00, no. 00 (2019), pp. 1–17.
Copyright © 2019 Society for Marketing Advances
ISSN: 1052-8008 (print) / ISSN 2153-9987 (online)
DOI: https://doi.org/10.1080/10528008.2019.1673664
Address correspondence to Danny Weathers, College of Business, Clemson University, Sirrine Hall, Clemson, SC 29634, USA. E-mail: pweath2@clemson.edu
As organizations become increasingly dependent on marketing analytics, universities are adapting their curricula to equip students with skills necessary to operate in data-rich environments. We describe six competencies that students need to become proficient with analytics: (1) assessing data quality, (2) understanding measurement, (3) managing datasets, (4) analyzing data, (5) interpreting results, and (6) communicating results. We discuss what these competencies entail, challenges students may face in developing them, and effective practices for instructors to foster the competencies. We provide data that support the value of teaching analytics with a focus on developing these competencies.
INTRODUCTION
The ease of collecting data through means such as consumer surveys and panels, in-store scanner systems, and online behavior tracking software allows marketing decision-makers to have more information at their disposal than ever before. Consequently, the field of marketing analytics, defined here as the use of quantitative data to make marketing decisions and evaluate marketing performance, is growing rapidly. To illustrate, a recent CMO Survey (2015) found that companies use analytics for numerous marketing functions and were expected to devote 11.1% of their marketing budgets to analytics in 2018, up from 6.7% in 2015. Because the use of analytics is positively related to profits and ROI (Ariker, Diaz, Moorman, & Westover, 2015), the widespread and growing use of marketing analytics, and employers' desires to find employees with quantitative skills (Schlee & Harich, 2010), are not surprising.
In response to this trend, universities have been adapting their marketing curricula to train students to operate in data-rich business environments. These curricular changes have taken several forms. In some cases, universities have created new degree programs in "Marketing Analytics" or "Business Analytics." In other cases, instructors have adapted existing courses by integrating course-relevant analytics content and exercises. Between these extremes, departments have created stand-alone courses dedicated to marketing analytics (Pilling, Rigdon, & Brightman, 2012; Saber & Foster, 2011).
Growth in analytics courses has prompted researchers to provide guidance to instructors who develop and teach such courses (see, for example, Houghton, Schertzer, & Beck, 2018; LeClair, 2018; Liu & Burns, 2018; Liu & Levin, 2018; Pilling et al., 2012; Pirog, 2010; Saber & Foster, 2011; Wilson, McCabe, & Smith, 2018). Recommendations tend to focus on course content, course structure, teaching interventions, and integrating analytics courses into the overall curriculum. They encourage instructors to introduce metrics and analyses related to specific areas of marketing, such as brand management and target market assessment, use evidence-based pedagogical tools, provide active-learning opportunities, align course content with intended learning outcomes, provide students with relevant feedback, and push students to stretch in their learning. They also tout the merits of integrating marketing concepts with exercises that utilize specific software, such as R and Excel, to illustrate concept-relevant metrics, and they demonstrate the effectiveness of specific analytics-related exercises.
The current research complements existing research by examining in detail the skills students need to become proficient with analytics. Regardless of whether instructors expose students to analytics through stand-alone courses or on a more limited basis through integrating analytics exercises into existing courses, instructors should work to develop specific competencies to enable students to master the topic. Consequently, our goal is threefold. First, we describe six critical competencies for becoming proficient with analytics. Second, we discuss challenges students are likely to face in developing the competencies. Third, we present effective practices for instructors attempting to develop these competencies in students. While we are not the first to identify the six competencies, we provide guidance to instructors by synthesizing the competencies, as well as the associated challenges and effective practices, in a way that makes them directly relevant to courses that contain marketing analytics content. Given the call to better integrate analytics courses into the marketing curriculum (e.g., LeClair, 2018; Mintu-Wimsatt & Lozada, 2018), the competencies provide a framework for developing a cohesive analytics concentration, as we describe in the Conclusions section. We present data that support the value of teaching analytics courses with an eye toward developing these competencies.
CRITICAL COMPETENCIES, CHALLENGES,
AND EFFECTIVE PRACTICES
General Information about the Competencies
Our experience in developing courses in marketing analytics leads us to advocate for six competencies that are critical to proficiency with analytics: (1) assessing data quality, (2) understanding measurement, (3) managing datasets, (4) analyzing data, (5) interpreting results, and (6) communicating results. To develop the competencies, instructors can use multiple established learning theories, such as proximal learning theory (Vygotsky, 1978), theory of intelligence (Dweck, 1986), and self-efficacy theory (Bandura, 1977). For example, Wilson et al. (2018) provide a useful discussion of creating an innovative marketing analytics curriculum based on experiential learning theory (e.g., Kolb, 1984). Consequently, rather than defining our efforts through any one theoretical perspective, we take a more holistic approach by calling upon various theories to identify challenges and support our effective practice recommendations.
General Preparation for and Challenges to
Developing the Competencies
Research reveals that the quantitative skills of marketing students are often lacking (e.g., Aggarwal, Vaidyanathan, & Rochford, 2007; Davis, Misra, & Van Auken, 2002), which is likely to lead students to be anxious about analytics courses. Research also demonstrates that anxiety inhibits learning (e.g., Chew & Dillon, 2014; Fredrickson, 2004; Hernandez, Schultz, Estrada, Woodcock, & Chance, 2013; Spielberger, 2013), particularly in quantitative courses (Pilling & Nasser, 2015). To confirm and extend these findings, we surveyed students in our undergraduate Marketing Metrics and Analytics courses.¹ One set of survey items captured students' perceptions of the difficulty of "statistics," "data," and "math." Another set of items captured students' perceptions of their experience with "statistics," "data," and "math." These sets of items were adapted from the Survey of Attitudes toward Statistics scale (Schau, Stevens, Dauphinee, & Del Vecchio, 1995). Another set of items, designed to align with our six competencies, assessed students' comfort with or knowledge of specific topics to be covered in the course (e.g., creating pivot tables, computing z-scores, interpreting graphs). Each item was measured on a seven-point scale where higher numbers indicate greater difficulty, experience, knowledge, or comfort. Means for each measure are provided in Table 1 under the "Beginning of Semester" column. We also obtained each student's overall college GPA and university math placement test score. We highlight general findings and conclusions here. Interested readers can contact the authors for additional details.
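For instructors who want to run this kind of pre/post comparison themselves, the analysis reported in Table 1 (dependent-samples t-tests, with Bonferroni adjustments for multiple comparisons) can be scripted in a few lines of R. The sketch below uses fabricated scores, not our survey data:

```r
# Fabricated pre/post comfort ratings for the same eight students
pre  <- c(3, 4, 2, 5, 3, 4, 2, 3)   # beginning of semester
post <- c(5, 5, 4, 6, 4, 5, 3, 5)   # end of semester

# Dependent (paired) samples t-test, as reported in Table 1
t.test(post, pre, paired = TRUE)

# With several such comparisons, adjust the p-values
pvals <- c(0.002, 0.030, 0.480)     # e.g., statistics, data, math
p.adjust(pvals, method = "bonferroni")
```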
Aligning with previous research (Davis et al., 2002), students entered the course feeling that they had significantly less experience with statistics and data than with math, and experience with each topic was negatively related to perceptions of the topic's difficulty. For the specific topics covered in the course, the average level of comfort or perceived knowledge fell below the scale midpoint (4) for 22 of the 29 topics. Students also indicated significantly less comfort with analytics (a combination of statistics and data) than with math.
¹ Course objectives included: (a) Learning how metrics are derived from organizational strategies and missions. (b) Learning measurement-relevant concepts, such as construct definition, reliability, and validity. (c) Deriving and computing metrics relevant to a variety of marketing decisions, including, but not limited to, decisions related to pricing, online and social media strategy, advertising, product development, customer targeting, branding, and distribution. (d) Becoming familiar with data sources that provide information needed to compute marketing metrics. (e) Learning when and how marketing metrics should and should not be used. (f) Learning how to clearly and effectively communicate marketing metrics to others both within and outside of the organization. (g) Learning how to use Excel to summarize and present data.
Table 1
Students' Perceptions of Difficulty, Experience, Knowledge, and Comfort at the Beginning and End of the Semester

Measure | Beginning of Semester (a) | End of Semester (a) | p-value, beginning vs. end (b)

Perceived Difficulty
  Statistics | 4.66 (0.75) | 4.58 (0.91) | .48
  Data | 4.64 (0.70) | 4.57 (0.75) | .54
  Math | 4.61 (0.70) | 4.51 (0.88) | .30
  Significant differences (c) | None | None

Perceived Experience
  Statistics | 4.37 (0.85) | 4.74 (0.94) | < .01
  Data | 4.35 (1.08) | 4.79 (0.86) | < .01
  Math | 6.33 (0.57) | 6.35 (0.56) | .97
  Significant differences (c) | Math > Statistics, Data | Math > Statistics, Data

Competency 1: Assessing Data Quality
  Determining data appropriateness for answering research question (d) | 3.38 (1.29) | 4.87 (1.05) | < .01
  Assessing the quality of data (e) | 3.78 (1.40) | 5.40 (1.03) | < .01

Competency 2: Understanding Measurement
  Data units of measurement (d) | 4.06 (1.31) | 4.87 (1.20) | < .01
  Measurement reliability (d) | 2.86 (1.34) | 4.51 (1.09) | < .01
  Measurement validity (d) | 2.65 (1.40) | 4.54 (1.22) | < .01

Competency 3: Managing Datasets
  Data management (d) | 2.59 (1.22) | 4.63 (1.20) | < .01
  Statistical distributions of data (d) | 3.36 (1.24) | 4.54 (1.35) | < .01
  Creating survey items (d) | 3.53 (1.32) | 5.25 (1.07) | < .01
  Weighting scores within an index (d) | 2.44 (1.39) | 4.60 (1.28) | < .01
  Creating a data file (or dataset) (e) | 3.76 (1.61) | 5.31 (1.16) | < .01
  Working with Excel (e) | 4.33 (1.64) | 5.72 (1.00) | < .01
  Organizing data in Excel (e) | 4.40 (1.65) | 5.75 (1.05) | < .01
  Cleaning data in Excel (e) | 2.94 (1.38) | 5.94 (0.98) | < .01
  Creating index variables in Excel (e) | 2.62 (1.29) | 4.86 (1.24) | < .01
  Weighting scored items in Excel (e) | 3.12 (1.65) | 5.10 (1.32) | < .01
  Creating z-scores in Excel (e) | 3.00 (1.43) | 5.36 (1.23) | < .01

Competency 4: Analyzing Data
  Choosing right analysis techniques (d) | 2.74 (1.34) | 4.63 (1.23) | < .01
  Linear regression (d) | 3.64 (1.31) | 4.52 (1.19) | < .01
  Pathway modeling (d) | 1.94 (1.26) | 3.91 (1.43) | < .01
  Creating Pivot Tables in Excel (e) | 3.43 (1.86) | 5.85 (1.03) | < .01

Competency 5: Interpreting Results
  Interpreting data (d) | 4.04 (1.33) | 5.11 (1.13) | < .01
  Interpreting p-values (d) | 3.46 (1.47) | 4.54 (1.30) | < .01
  Interpreting regression coefficients (d) | 2.67 (1.43) | 4.47 (1.26) | < .01
  Interpreting effect sizes (d) | 2.41 (1.25) | 4.43 (1.30) | < .01
  How data are used by business (d) | 3.40 (1.26) | 5.10 (1.03) | < .01
  Interpreting graphs (e) | 4.99 (1.27) | 5.84 (1.01) | < .01
  Statistical error and variance (e) | 3.74 (1.25) | 4.49 (1.32) | < .01

Competency 6: Communicating Results
  Creating graphs in Excel (e) | 4.76 (1.60) | 6.00 (0.95) | < .01
  Choosing the appropriate graph to illustrate your data (e) | 4.61 (1.38) | 5.73 (1.03) | < .01
  Effectively writing about technical information (e) | 3.94 (1.59) | 5.12 (1.16) | < .01
  Effectively presenting technical information orally (e) | 3.70 (1.59) | 5.10 (1.27) | < .01

Notes:
(a) Means with standard deviations in parentheses.
(b) Based on dependent-samples t-tests.
(c) p < .05 based on Bonferroni adjustments for multiple comparisons.
(d) Knowledge assessed on a seven-point scale: 1 = No knowledge, 7 = Complete knowledge.
(e) Comfort assessed on a seven-point scale: 1 = Extremely uncomfortable, 7 = Extremely comfortable.
Only 15% of students said they were moderately or extremely comfortable with analytics, while 85% of students indicated something less than moderate levels of comfort. Neither GPA nor math placement was significantly related to comfort with analytics. Thus, students with higher levels of academic success, even math-specific success, were no more comfortable with their ability to handle analytics-related concepts than were students with less academic success.
Noting the challenge of heterogeneous student preparation, Pilling et al. (2012, p. 188) state, "Although the large majority of students arguably possess the prior knowledge to succeed in the [analytics] course, the level of functional prior knowledge brought to the course is inconsistent." We also found this to be true; our students' GPAs ranged from 2.7 to 4.0, and their math placement scores ranged from 40 to 99. LaBarbera and Simonoff (1999) find that marketing majors often consider quantitative coursework unimportant. Supporting these findings, among our students, only 47% reported any level of interest in analytics. Thus, instructors must first get students to buy in to the importance of analytics, perhaps by following the EPIC model of exposure, persuasion, identification, and commitment (Aragón, Dovidio, & Graham, 2016; Cavanagh et al., 2016). Overall, our findings support the conclusions of previous research regarding the challenges inherent to teaching analytics to marketing students.
Competency 1: Assessing Data Quality
What This Competency Entails
Data can be created either within the organization using the data (i.e., internally) or by another organization (i.e., externally), and data can be created either for the specific problem at hand (i.e., primary) or for other research/decision-making purposes (i.e., secondary). Regardless of who creates the data, or for what purpose, analytics students should be able to assess the data's quality to understand its capabilities and limitations. Such an assessment requires an understanding of the data collection process and the role of this process in obtaining actionable information.

Data quality has four dimensions: intrinsic, contextual, representational, and accessibility (e.g., Ballou & Pazer, 1985; Delone & McLean, 1992; Goodhue, 1995; Jarke & Vassiliou, 1997; Lee, Strong, Kahn, & Wang, 2002; Wand & Wang, 1996; Wang & Strong, 1996; Zmud, 1978). The intrinsic dimension includes issues such as whether the data are accurate and free from bias. The contextual dimension relates to issues such as whether the data are sufficiently current and relevant to the problem at hand. The representational dimension refers to issues such as the data's format and readability, and the accessibility dimension refers to whether the user has ready access to the data. We focus on intrinsic and contextual data quality here, and we consider representational and accessibility data quality as components of subsequent competencies.
Why Developing This Competency is a Challenge
The issue of quality may be both ambiguous (see, for example, Pirsig, 1974) and, due to having multiple dimensions, complex, and students with a low tolerance for ambiguity resist assimilating new information that is ambiguous or complex (DeRoma, Martin, & Kessler, 2003). Fully evaluating data quality can require substantial effort. Consider intrinsic data quality. Assessing accuracy requires students to thoroughly understand the data collection process and its potential shortcomings. For example, if data collection involves sampling, students must be able to recognize problems that could arise due to sampling. The ease of sampling consumers and other units, such as social media posts and online product reviews, is driving the marketing analytics boom. However, analytics that use data collected from a sample may be misused if one fails to appreciate sampling's inherent limitations (i.e., sampling error) or potential shortcomings due to a flawed process (e.g., selecting a nonrepresentative sample). Further, much marketing analytics data are collected through automated processes (e.g., Google Analytics tracking web site visitors). The technical nature of these processes makes understanding the processes and their potential limitations difficult.
Students should also consider the extent to which data are free of bias, yet sources of bias can be difficult to detect. Although third-party sources may have little incentive to provide biased or manipulated data, this is not true of all data sources. For example, businesses may be motivated to delete negative comments made on their social media accounts, they can hire services to clean their online reputations, and they may pay people to post positive online reviews. However, these sources of bias are not obvious, particularly when software is used to build analytics datasets by scraping the Internet for comments and data related to specific topics. Consequently, data obtained in this way may not accurately represent consumer sentiment or behavior, compromising the data's trustworthiness.
In terms of contextual quality, the data should be relevant to the problem at hand (i.e., the context). This means the data should be current; however, a consequence of our data-rich world is that data can quickly become outdated. Further, data's age is not always apparent, which is particularly problematic with automated data collection processes. Someone building a dataset by scraping the Internet for comments related to "Coca-Cola" may capture news articles or social media posts that have been on the Internet for many years. The resulting dataset may contain outdated information that, even if accurate, is not appropriate for the current problem.
Effective Practices
Most data used with marketing analytics are numerical, and our experience suggests that students tend to view numbers as precise and accurate. However, instructors should emphasize that not all numbers are created equally. Because there are multiple points at which data collection can go wrong, instructors must impress upon students the need to evaluate data quality to avoid drawing unwarranted conclusions. To do so, instructors should use activities that illustrate how data quality can be compromised. The results of the 1948 and 2016 U.S. presidential elections nicely illustrate sampling's imperfections. Sampling led people to expect Dewey to defeat Truman in 1948 and Clinton to defeat Trump in 2016. However, to the surprise of many, neither of these outcomes occurred (Edwards-Levy, 2016). A discussion of the polling process serves to highlight how sampling can lead to wrong conclusions. To illustrate potential biases, we discuss how trade associations collect data on the industries they represent. However, because trade associations exist to promote the industries, they may be reluctant to provide data that reflect poorly on the industry.
We recommend that instructors develop this competency with activities using common sources of marketing data. For example, marketers often utilize data from the US Census Bureau to segment and identify attractive markets. Despite the Census Bureau's expertise, census data may not accurately represent the population. For example, the Census Bureau acknowledges that not all demographic groups are equally likely to participate in the Bureau's data collection efforts (Westra & Nwaoha-Brown, 2017). Not only do exercises using data from the Census Bureau and other government sources illustrate data quality concerns, they also serve to familiarize students with valuable sources of marketing-relevant data. We also provide students with details about automated data collection protocols, such as those used by Google, and have students identify opportunities for the protocols to go wrong. Although students expect automated data collection procedures to be highly accurate, they should be made aware that this is not always true.
We encourage instructors to utilize tools designed for evaluating information quality, such as the information quality assessment (IQA) instrument (Lee et al., 2002). This easy-to-use, multi-item perceptual measure allows the various dimensions of data quality to be quantified. Example items include "This information is objective" and "The information is sufficiently timely." Instructors should present students with data from various sources and have them evaluate the data along each quality dimension. Even when students lack sufficient knowledge to fully evaluate the data, this tool can foster discussions about why data may score high or low on each of the dimensions.
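In a lab setting, students' IQA-style ratings can be tallied quickly. The sketch below is illustrative only: the item wording and dimension assignments are ours, not those of the published instrument (Lee et al., 2002):

```r
# Four students' 1-7 ratings of one dataset on three illustrative items
ratings <- data.frame(
  objective = c(5, 4, 6, 3),  # intrinsic: "This information is objective"
  timely    = c(2, 3, 2, 4),  # contextual: "...sufficiently timely"
  readable  = c(6, 6, 5, 6)   # representational: easy to understand
)

# Score each quality dimension as the mean item rating
colMeans(ratings)
```

Low dimension scores (here, timeliness) then become the focus of class discussion.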
Competency 2: Understanding Measurement
What This Competency Entails
Measurement is central to analytics, and becoming competent in this area involves understanding validity, reliability, and levels of measurement. Validity refers to whether data accurately reflect the concept one intends to measure. Because validity addresses whether the data are relevant to the problem at hand, it aligns with the contextual data quality dimension (e.g., Wang & Strong, 1996; Zmud, 1978). Reliability refers to whether the same data would be obtained again under similar conditions, and it falls under the umbrella of intrinsic data quality (e.g., Delone & McLean, 1992; Goodhue, 1995). Level of measurement refers to the nature of the information the data represent. Data that lack reliability and/or validity will be of low quality, ultimately leading to questionable results. Being unable to identify the appropriate level of measurement may lead to inappropriate analyses. Thus, failing to assess validity, reliability, and measurement level can undermine the value of analytics.
Why Developing This Competency is a Challenge
First, consider measurement validity. Analytics are often used to assess abstract, intangible outcomes such as customer loyalty, satisfaction, or engagement. Quantifying such latent constructs, and even "objective" outcomes, requires clear, precise definitions; however, developing definitions with sufficient levels of precision to accurately quantify these constructs or outcomes can be a difficult task. Construct or outcome definitions are often unique to the current situation and, thus, must be assessed on a case-by-case basis. Consider, for example, a "click" on a web page, the basis for much online marketing analytics. Though seemingly straightforward, defining a click is highly technical, as illustrated by the Interactive Advertising Bureau (2009) guidelines. Companies adhering to these guidelines take precautions to ensure that clicks are due to unique, legitimate website visitors. Without knowledge of these guidelines and precautions, one may misinterpret click-based measures due to the possibility of accidental double-clicks, bot-initiated clicks, or deliberate manual attempts to manipulate the click count.

Further, assessing validity is challenging because there are multiple types of validity, including face, content, predictive, concurrent, convergent, and discriminant, some with subtle distinctions. Specific types of validity may be relevant in some circumstances but not in others. Validity is often a matter of degree, and, as noted, students struggle with such ambiguity. Finally, if critical information is missing, validity tests are not possible.
Reliability is a prerequisite for validity. One way to assess reliability is to compare the same data from various sources. However, this is difficult due to the effort and/or expense required to obtain much of the data used for marketing analytics. Consider, for example, Nielsen television ratings. The process of obtaining large-scale measures of television viewership makes it difficult to verify the reliability of Nielsen's measures. In measuring television viewership since 1950, Nielsen has established elaborate systems involving diaries and set-top meters and acquired the knowledge to provide (presumably) accurate television ratings. Companies with less experience and fewer resources are unlikely to have Nielsen's expertise. Even Nielsen's measurement system, refined throughout the last half of the 20th century, may not be reliable in today's highly fragmented media environment. Such concerns are difficult to assess.
Finally, effectively using analytics requires a thorough understanding of level of measurement. Being able to classify data as nominal, ordinal, interval, or ratio is another complex, ambiguous issue that is difficult for students to grapple with. For example, long debated is the issue of whether data obtained from scaled response questions (e.g., Likert) have ordinal or interval properties. Some disciplines (e.g., sociology) generally consider such measures ordinal, while others (e.g., psychology) consider them interval. Given that even experts often disagree (Velleman & Wilkinson, 1993), we should not be surprised when students struggle with the distinctions between these categories.
Effective Practices
When teaching reliability and validity, abundant real-world examples of data that either possess or lack high levels of validity and/or reliability help students tie new knowledge to existing structures. In teaching validity, we find it useful to provide examples of data that lack each type of validity and have students identify potential problems with using the data to draw specific conclusions. To illustrate, for content validity (i.e., whether a measure fully captures the construct's domain), we ask students to imagine that they work for a company that owns a chain of restaurants. In a meeting, a coworker states: "Overall, our customers are satisfied. They rated our food quality an average of 5.9 on a 7-point scale." We then ask students to identify why this statement could be wrong or misleading. Discussion leads students to recognize that food quality is only one factor that might contribute to "overall" satisfaction. Price, atmosphere, and service may also play roles. Thus, the claim that food quality satisfaction assesses overall satisfaction lacks content validity. We take similar approaches to illustrate face, predictive, concurrent, convergent, and discriminant validity.
Because validity requires a clear, precise definition of the underlying construct, we have students evaluate the concept being measured using an existing construct definition paradigm (e.g., Churchill, 1979; Rossiter, 2005). Gilliam and Voss (2013) present a six-step process for developing marketing construct definitions: (1) write a preliminary definition, (2) consult the literature to build a nomological network, (3) assess value added, (4) refine the definition, (5) have experts judge the definition, and (6) revise the definition and iterate. Having students perform these steps for common marketing analytics constructs will help them better evaluate whether available data are valid measures of these constructs. For example, marketers desire to foster customer brand loyalty. A useful exercise involves having students create a preliminary definition of this abstract construct. We have found that students often develop definitions that do not differ from similar constructs. For example, they may define loyalty as "a customer who is happy with the brand." Through employing the process advocated by Gilliam and Voss (2013), students realize that this definition may reflect satisfaction but not loyalty. Through iteration, students eventually arrive at a definition that better reflects loyalty, such as "the extent to which a customer is devoted to a brand."
To explain reliability, we begin with true-score theory: observed score = true score + error. Students find this equation to be simple and intuitive. We then highlight the error component as it relates to reliability by providing examples of random and systematic error. Finally, we provide examples of measures that lack inter-rater, test-retest, or parallel-forms reliability. For example, we ask students to imagine a situation in which a retailer measures the number of units sold in a month both by having someone do a manual inventory check and by having another person access scanner data records. We indicate that these processes lead to different results, and we ask students why this may have occurred. While students usually identify potential problems with theft or breakage, after discussion, students come to recognize that the inconsistency could also be due to one or both measurement processes (e.g., one's ability to access or count inventory in the stockroom or problems with the scanner technology). The lack of parallel-forms reliability leads students to question data obtained from a single source.
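A brief simulation can also make the true-score equation concrete. The sketch below (purely illustrative numbers) draws a true score for each of 500 hypothetical customers, adds independent random error on two measurement occasions, and shows that test-retest reliability falls as the error component grows:

```r
set.seed(42)
true_score <- rnorm(500, mean = 5, sd = 1)    # latent true values

# Observed score = true score + error, measured on two occasions
time1 <- true_score + rnorm(500, sd = 0.5)
time2 <- true_score + rnorm(500, sd = 0.5)
cor(time1, time2)    # high test-retest correlation

# Noisier measures of the same true scores are less reliable
noisy1 <- true_score + rnorm(500, sd = 2)
noisy2 <- true_score + rnorm(500, sd = 2)
cor(noisy1, noisy2)  # noticeably lower correlation
```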
When teaching level of measurement, instructors should keep scaffold learning in mind (Wood, Bruner, & Ross, 1976). Because students are unlikely to have substantial experience with this concept, instructors should introduce initial building blocks and then expand on these concepts. For example, a parsimonious representation of data is as either continuous or grouped. Under these broad categories lie further distinctions. For example, continuous data either have a fixed origin (i.e., ratio), such as unit sales, or they do not (i.e., interval), such as shoe size. Grouped data can either be ranked (i.e., ordinal), such as class standing, or they cannot (i.e., nominal), such as gender. Student proficiency at differentiating between various levels of measurement requires practice. Further, we demonstrate how a given concept can be measured at different levels, depending on how the measure is obtained. Returning to the customer loyalty example, loyalty can be a ratio measure if people provide the number of times they have purchased a brand in the past year, an interval measure if people rate the number of times they have purchased the brand in the past year on a low/high scale, an ordinal measure if people select from among several purchase frequency categories (e.g., 0, 1–5, more than 5), or a nominal measure if people identify which brands they have purchased in the past year. After students demonstrate their mastery of these distinctions, the instructor can introduce debates about topics such as whether Likert scales are ordinal or interval.
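R's data types map cleanly onto these distinctions, so a short scripted version of the loyalty illustration can reinforce the point. All values below are made up:

```r
# The same construct (purchase behavior) captured at four levels
purchases  <- c(0, 3, 7, 2)                    # ratio: raw counts
rated_freq <- c(1, 4, 7, 3)                    # interval-style 1-7 rating
freq_band  <- factor(c("0", "1-5", "more than 5", "1-5"),
                     levels = c("0", "1-5", "more than 5"),
                     ordered = TRUE)           # ordinal: ranked categories
brand      <- factor(c("A", "B", "A", "C"))    # nominal: unranked labels

# The level of measurement constrains the summaries that make sense
mean(purchases)   # appropriate for ratio data
table(freq_band)  # counts are appropriate for ordinal data
table(brand)      # ...and for nominal data
```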
Competency 3: Managing Datasets
What This Competency Entails
Students should be able to create and manipulate datasets. While marketing employees are perhaps more likely to use existing datasets than to create new ones, understanding this part of the analytics process enables one to identify where problems may arise. For primary data, this may involve developing and carrying out the process to obtain the data (e.g., creating and administering a questionnaire or running a program to extract information from the Internet), creating a coding scheme (e.g., assigning numerical values to non-numerical data), and entering the data into a computer file. For secondary data, this may involve accessing existing data, oftentimes by navigating online data repositories, and entering the data into a computer file. For both primary and secondary data, this competency should include data cleaning, data (re)formatting, and creating new variables. Data cleaning may involve identifying and removing outliers (Hodge & Austin, 2004), looking for signs of diminished effort in survey takers (Huang, Curran, Kenney, Poposki, & DeShon, 2012), deleting specific cases, or handling missing data (Little & Rubin, 2014). (Re)formatting the data may involve specific variables (e.g., converting a date variable from one format to another) or the entire file (e.g., converting an Excel file to SAS or SPSS). Creating a new variable may involve combining multiple variables (e.g., dividing "total time on site" by "number of pages visited" to obtain average time per page) or transforming a single variable (e.g., taking the inverse of response times to satisfy underlying distribution assumptions of the analysis to be performed).
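Several of these manipulations take only a line or two once students can script them. A minimal sketch with a fabricated web-analytics file (the variable names are ours, for illustration):

```r
# Fabricated site-visit records
web <- data.frame(
  total_time = c(300, 480, 150, 9000),  # seconds on site
  pages      = c(5, 8, 3, 4)            # pages visited
)

# Create a new variable by combining two others
web$time_per_page <- web$total_time / web$pages

# Transform a single variable (e.g., to address skew before analysis)
web$inv_time <- 1 / web$total_time

# Screen for potential outliers with standardized scores
round(scale(web$total_time), 2)  # the 9000-second visit stands out
```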
Why Developing This Competency is a Challenge
Managing datasets requires a holistic view of the analytics process. Students must understand relationships between variables, the nature of distributions, scale, why a variable should be reverse coded, or when a difference score, a weighted score, an averaged score, or a standardized score would be appropriate. Many of these transformations are akin to more abstract concepts introduced in algebra, and students must understand why such transformations are necessary. As with each of the identified competencies, students often lack experience. Marketing students typically do not take courses that develop generalized skills for creating and working with datasets, as reflected by the results in Table 1.

When students use existing datasets, the ability to establish this competency is, in part, a function of representational data quality, or whether the data are "presented in such a way that it is interpretable, easy to understand, [and] easy to manipulate ..." (Lee et al., 2002, p. 135). Secondary data may have poor representational data quality due to insufficient documentation about the data collection process or what the variables represent. Further, the analysis software may not easily manipulate the data file due to various incompatibilities.
In terms of data cleaning, students must have basic statistical knowledge to identify outliers and be familiar with imputation methods to handle missing data. Further, students must know the capabilities of the analysis software, such as whether it can handle character (string) data. When we ask students to create datasets, we have observed that they may use inconsistent formats even for the same variable. For example, they may use various units (e.g., sales entered as units sold and dollar value), state the units for some observations (e.g., entering "10 dollars") but not others (e.g., using "10" to represent 10 dollars), or use both labels (e.g., "male") and numbers (e.g., "1" to represent a male). The challenge of (re)formatting datasets is, in part, technical in nature. Students must be proficient with, and understand the structure of datasets created by, various software. Effectively reformatting data by creating new variables also requires knowledge of measurement. For example, to create a measure of "web site engagement," students should know that it is inappropriate to add measures of "time on site" and "number of pages visited."
Effective Practices
Regarding the technical aspects of dataset creation, we encourage instructors to expose students to various software, such as Excel, SAS, SPSS, and R, to give students experience with common dataset formats (Liu & Levin, 2018). Further, instructors should assign exercises that require students to merge datasets of the same and different formats, create new variables, obtain basic descriptive statistics to identify outliers and missing data, and convert datasets from one format (e.g., Excel) to another (e.g., SPSS).
More generally, instructors must emphasize the need to connect the research objectives (i.e., the information needed by the decision-maker) to the analytics being performed. Failing to keep research objectives in mind leaves students in a quagmire, asking "but how do I know to do that?", particularly when students must create or transform variables. Introducing goals as organizing topics, and the hierarchical nature of these goals, helps students understand why and when specific actions are necessary. For example, the concept of case-wise consistency, or having representative data points for a given case across all variables of interest, can help to organize the functions of data cleaning (e.g., imputation or deletion). The concept of data reduction, or aiming for the most parsimonious representation of the data, can help to organize the function of creating new variables (e.g., assessing internal consistency, standardization, weighting, reverse coding, and averaging).
To illustrate these points, an effective exercise has students compare potential target markets by creating a measure of economic strength for each market. We provide students with two measures, average income and unemployment rate, for each market, and we ask them to combine these measures into a single (parsimonious) measure of economic strength. Although some students are tempted to simply add the two measures, after reflection, they realize that higher values for income are good, while higher values for unemployment are bad. They also realize that the two measures are provided in different units with different magnitudes. Thus, before combining the measures, students need to standardize both measures and reverse code the unemployment measure. Further, creating a dataset with a few markets that have inconsistent values, such as high unemployment and high average income, gives students experience in identifying these inconsistencies (i.e., case-wise consistency). Including a few missing values allows students to develop approaches for replacing these values (i.e., imputation).
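The whole exercise can also be scripted, which makes the standardization and reverse coding explicit. Market names and numbers below are fabricated:

```r
# Fabricated market data for the target-market exercise
markets <- data.frame(
  market     = c("A", "B", "C", "D"),
  avg_income = c(52000, 61000, 47000, 58000),  # dollars
  unemp_rate = c(4.1, 6.8, 5.2, 3.7)           # percent
)

# Standardize so different units/magnitudes carry equal weight, and
# reverse-code unemployment so that higher always means "stronger"
z_income <- as.numeric(scale(markets$avg_income))
z_unemp  <- -as.numeric(scale(markets$unemp_rate))

# Parsimonious index: the average of the two z-scores
markets$econ_strength <- (z_income + z_unemp) / 2
markets[order(-markets$econ_strength), ]
```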
Competency 4: Analyzing Data
What This Competency Entails
Upon finalizing the dataset, students must be able to conduct appropriate analyses. This requires understanding the research question at hand, which variables should be used, the nature of these variables (i.e., level of measurement), what relationships or effects are being investigated, what analysis is appropriate for estimating the relationships or effects, and what software should be used. Analyses range in complexity from basic univariate descriptive statistics (e.g., frequency distribution, mean, standard deviation), to bivariate analyses (e.g., correlations, t-tests, simple linear regression), to analyses with three or more variables (e.g., multiple linear regression, moderation, mediation, factor analysis, structural equation modeling). Liu and Burns (2018) highlight analysis techniques that are relevant to analytics-related jobs, including predictive modeling and data mining.
Why Developing This Competency is a Challenge
There is often a steep learning curve for mastering software and specific analysis techniques. Given the amount and type of data now available, common analyses have moved beyond simple descriptive statistics and univariate analyses. To obtain maximum value from data, students should be able to identify nonlinear, stochastic, and unobservable phenomena. The analysis techniques for doing so are inherently complex, with many underlying assumptions. While friendlier interfaces, such as drop-down menus, have enhanced the usability of statistical software, it can be advantageous for users to have more control over the analysis. Consequently, there may be value in students learning to write analysis syntax in software such as R, SPSS, STATA, MATLAB, and SAS.
While students can fairly easily learn to mimic the process required to run specific analyses, a bigger challenge in developing this competency may involve knowing which specific analysis is appropriate. Students must be able to align (more abstract) research questions with (more concrete) analyses. As noted, this requires students to connect several dots. For example, if the research question is to understand "the relationship between age and loyalty," students need to know how age and loyalty are measured (i.e., the level of measurement of each), which analyses are appropriate for these types of measures, and which analyses enable one to quantify a relationship. Only then can students follow the steps necessary to run appropriate analyses.
Effective Practices
There is no substitute for practice; mastering data analysis requires students to have substantial experience conducting various analyses. To develop insight into which analysis to run, an effective exercise involves presenting students with a research question and variations on the data available, and then having students identify the appropriate analysis. For example, for the research question "what is the relationship between age and purchase frequency," both age and purchase frequency could have ordinal (categories) or ratio (raw numbers) levels of measurement. As such, appropriate analysis could involve cross-tabs or regression. Presenting variations on the research question, such as "what is the effect of age on purchase frequency," fosters discussion about differences between "effects" and "relationships" and other signals that are important for determining the appropriate analysis.
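A scripted version of this exercise can show how the same question routes to different analyses depending on how the variables were measured. Everything below is fabricated for illustration:

```r
set.seed(1)
age_years <- sample(18:70, 200, replace = TRUE)
purchases <- rpois(200, lambda = age_years / 15)  # built-in relationship

# Ratio-level measures: regression quantifies the relationship
summary(lm(purchases ~ age_years))

# Ordinal (grouped) measures: a cross-tab and chi-square test instead
age_band  <- cut(age_years, breaks = c(17, 30, 50, 70))
freq_band <- cut(purchases, breaks = c(-1, 1, 3, Inf),
                 labels = c("0-1", "2-3", "4+"))
table(age_band, freq_band)
chisq.test(table(age_band, freq_band))
```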
Further, we build from analyses and concepts with which students are comfortable. Our students reported relatively high levels of comfort in creating and interpreting graphs in Excel (see Table 1). When students are able to create appropriate graphs with, for example, error bars or scatterplots, this is a springboard to asking "how do you know if these are significant?" As students will have already considered the research question and level of measurement to create the graph, applying a t-test, analysis of variance, or correlation becomes an easier leap for students. Upon solidifying these basic tests, the instructor can teach more complex tests by building upon prior knowledge and using the research question as a guiding goal.
Competency 5: Interpreting Results
What This Competency Entails
Students must be able to explain the results of their analyses. Interpretation may involve, for example, determining statistical significance, the nature of any effects (e.g., positive versus negative relationships), the practical significance of the effects (i.e., effect sizes), and identifying and describing trends.
Why Developing This Competency is a Challenge
Findings by Pilling et al. (2012) support the assertion that interpreting data is challenging. A student's ability to develop this competency is a function of the complexity of the analysis. While interpreting results may be a relatively easy competency to develop for basic univariate analyses, interpretation is more challenging for advanced analyses. For example, multiple tests require adjustments to the significance level (e.g., Bonferroni corrections), and logistic regression requires students to interpret the nonintuitive logit function (i.e., the log of the odds).
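For the logistic case, exponentiating the coefficient is the usual bridge from the log-odds scale to language students can interpret. A minimal sketch with fabricated repurchase data:

```r
set.seed(7)
satisfaction <- runif(300, 1, 7)                 # 1-7 satisfaction rating
repurchase   <- rbinom(300, 1, plogis(-3 + 0.6 * satisfaction))

fit <- glm(repurchase ~ satisfaction, family = binomial)
coef(fit)       # slope is on the nonintuitive log-odds (logit) scale
exp(coef(fit))  # odds ratio: "each extra scale point multiplies the
                # odds of repurchasing by roughly this factor"
```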
Students must interpret not only statistical significance but also practical significance. While various guidelines have been developed to help with the interpretation of practical significance (e.g., "a correlation greater than .8 is very strong"), these rule-of-thumb approaches are perhaps more harmful than helpful. They discourage students from thinking about the specific research context, which is critical to evaluating effect sizes. A correlation of .6 could be extremely large in some marketing realms and extremely small in others.
Not all interpretation requires assessment of statistical significance. However, even students who intuitively understand basic descriptive statistics, such as means, medians, and percentages, may struggle to master this competency. We have found that the common "dashboard" approach, in which numerous metrics are reported to provide a holistic view of performance, can be overwhelming. Students are often unsure of how to integrate multiple results into a parsimonious and useful decision-making tool.
Effective Practices
Beyond the simple recommendation of ample practice, we have found that instructors need to shift students' thought processes from systematic and linear to broader and more holistic. One way to accomplish this goal is by having students write about their findings, making the best argument they can to support their conclusion. Research reveals that writing in this way leads people to think more critically and build a more complete picture of the decision problem, thus reducing bias and leading to better decisions (Sieck & Yates, 1997).

Research on developing effective critical thinking skills also suggests that simple prompt questions such as "Do you have all the necessary information?" and "Is there any conflict in the evidence?" can spur students to think more deeply about what the results mean (Helsdingen, van Gog, & van Merriënboer, 2011). Doing so encourages students to consider the specific conditions under which the data were collected, thus helping students interpret the practical significance of their findings. Given that practical significance is context dependent, instructors should emphasize the importance of setting aside general effect-size guidelines and instead considering the context that led to the observed effects.
One exercise we use that encourages students to consider whether they have all the necessary information involves analyzing data from a retail outlet. We present students with customer satisfaction data, for both weekdays and weekends, regarding (1) store cleanliness, (2) product selection, (3) staff friendliness, and (4) checkout times. We also provide sales volume (in dollars). We first have students analyze the results across all days, and they draw general conclusions regarding customer satisfaction (e.g., "Customers are, on average, satisfied with product selection"). We then ask students whether it is possible that these conclusions are not always true. From personal experience, students posit that shopping on weekdays and weekends may differ. Thus, we have students analyze the data for weekdays and weekends separately, and a more nuanced picture emerges. Students find that customer satisfaction with store cleanliness and product selection is lower on weekends than weekdays, while customer satisfaction with staff friendliness and checkout times is high for both weekends and weekdays. Sales volume is higher on weekends than weekdays. Students often initially struggle to understand what these results suggest. Upon reflection, they reach the (correct) conclusion that during the weekend, when sales volume is high, the store is shorthanded. Employees focus on being friendly and checking out customers promptly, but they do not have enough time to restock shelves and tidy up between customers. The only way to arrive at this conclusion is to look at multiple results and synthesize the pattern of findings into a coherent account. This exercise also provides practice with the common dashboard approach to marketing metrics.
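A fabricated version of this dataset makes the segmentation step easy to demonstrate: overall means hide the pattern that day-type means reveal. All numbers below are invented:

```r
set.seed(11)
days <- data.frame(
  day_type    = rep(c("weekday", "weekend"), times = c(20, 8)),
  cleanliness = c(rnorm(20, 5.8, 0.3), rnorm(8, 4.2, 0.3)),  # 1-7 rating
  sales       = c(rnorm(20, 48000, 4000), rnorm(8, 81000, 5000))
)

# Across all days, cleanliness looks acceptable...
colMeans(days[, c("cleanliness", "sales")])

# ...but splitting by day type reveals the weekend problem
aggregate(cbind(cleanliness, sales) ~ day_type, data = days, FUN = mean)
```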
As another exercise, we ask students if a five percent unemployment rate for the United States is indicative of a strong economy. As students are often unfamiliar with the actual unemployment rate, their initial reactions are usually based on whether five percent "feels" high or low. When asked if they have all the necessary information to draw a conclusion, students realize they do not, and we have them look up historical US unemployment rates on the US Bureau of Labor Statistics web site. At different points over the past 70 years, a five percent unemployment rate could indicate a strong or weak economy, depending on recent past economic conditions. We then explain how official unemployment rates are computed, and we ask whether there is any evidence that conflicts with their conclusion about the strength of the economy. When students understand that unemployment rates exclude people who have dropped out of the labor force, they realize that a low unemployment rate does not necessarily indicate a strong economy, as a large number of people may have removed themselves from the labor force due to being unable to find employment. This example also illustrates how metrics can be politicized, as political parties often focus on one metric (e.g., unemployment rate) over another (e.g., number of people not in the labor force).
Competency 6: Communicating Results
What This Competency Entails
Students must learn to communicate the results to
decision makers, which may involve written reports,
including appropriate tables, charts, graphs, or other
data visualization techniques, and oral presentations.
Critically, students must be able to convey the results
succinctly and in ways that make them managerially
actionable.
Why Developing This Competency is a Challenge
Although students need effective communication skills for conveying technical information such as statistical results, they typically receive limited training in this area (Brownell, Price, & Steinman, 2013; Wright & Larsen, 2016). This may be particularly true in business disciplines, where students are often required to take a course in Business Writing but not a course in Technical Writing. Consequently, students learning analytics typically lack the skills necessary to communicate their findings effectively. While numerous charts, graphs, tables, and other visualization tools can facilitate communication of decision-relevant information, these tools are useless if students lack either the technical skills to create them or knowledge about how to best format them (e.g., labeling points and axes, appropriate scale, headings and titles).
While insufficient training is likely a major reason that students struggle to communicate their findings, another impediment is the curse of knowledge (e.g., Camerer, Loewenstein, & Weber, 1989). Heath and Heath (2007) state, "... when we know something, it becomes hard for us to imagine not knowing it. As a result, we become lousy communicators." Students who have become proficient at analyzing data and the other competencies described here may have difficulty relating to people who do not have similar levels of knowledge. The disconnection between statistical analyses and actionable decisions is likely to be amplified if the decision-maker lacks sufficient statistical knowledge. It may be unclear to the decision-maker how a regression coefficient or a p-value relates to the decision at hand. For these reasons, communicating results is likely to be difficult.
Effective Practices
Several organizations have created programs designed to improve the communication of technical information. Among these, the Alan Alda Center for Communicating Science at Stony Brook University offers courses, workshops, and outreach (http://www.centerforcommunicatingscience.org/). The American Association for the Advancement of Science created the Center for Public Engagement with Science and Technology, which offers a communication toolkit (https://www.aaas.org/pes). The National EMSC Data Analysis Resource Center (NEDARC) provides guidelines for effectively communicating statistics (http://www.nedarc.org/tutorials/utilizingData/index.html). Example guidelines include: (1) Do not overload the client with statistics. Instead, present only meaningful results that convey the size of the issue, establish the appropriate context, and are new or unique findings. (2) Avoid statistical terminology (e.g., "statistically significant," "p-value"). Instead, use language that the client is likely to understand (e.g., "more likely" or "less likely"). (3) In general, use words to convey critical points instead of numbers. Specific to the realm of marketing analytics, Xavier University created the Master of Science in Customer Analytics degree (Houghton et al., 2018). This program develops storytelling skills to enhance students' ability to communicate analytics to decision makers.
More generally, communicating results requires students to abandon rote plug-in-the-answer thinking and embrace a deeper understanding of what is to be described (Johnson, 2016; Radke-Sharpe, 1991). Scaffolding can aid the transition from "just tell me what to say" to "how do I explain this?" For instance, when introducing bivariate correlation, we provide examples of how to communicate the results, such as "There was a large positive relationship between consumers' ratings of how cool and how innovative they found the product to be, r = .76, n = 260. The more innovative consumers considered the product, the cooler they also considered it." For any associated exercises, we initially encourage students to simply mimic the wording and format of these examples. As they gain experience and begin selecting analyses that they deem appropriate, students often refer to our examples to recall how to communicate results. With practice, students are able to interpret and communicate results without help.
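Instructors can even hand students a reporting template alongside the analysis itself. In the sketch below (fabricated ratings), the sprintf() template enforces the wording pattern we ask students to mimic:

```r
set.seed(3)
cool       <- round(runif(260, 1, 7))
innovative <- pmin(7, pmax(1, cool + sample(-1:1, 260, replace = TRUE)))

ct <- cor.test(cool, innovative)

# Template wording for students to mimic, then adapt
cat(sprintf(
  "There was a large positive relationship between consumers' ratings of how cool and how innovative they found the product, r = %.2f, n = %d.",
  ct$estimate, length(cool)))
```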
As for overcoming the curse of knowledge, students must first recognize that it exists. We stress to students that many people, including the clients or decision makers requesting data, are unlikely to understand statistical concepts such as p-values, regression coefficients, and measures of dispersion. In communicating analytics to clients, one should get to know the client, including her/his level of statistical knowledge. Before communicating with the client, the analyst should refine the communication by practicing on someone less knowledgeable (a suggestion provided by http://www.nicholasreese.com/curse-of-knowledge/).
CONCLUSIONS
As the importance of marketing analytics continues to grow, departments seeking to integrate analytics into their curricula should focus on developing the six critical competencies described here. Specifically, students must learn to assess data quality and measurement concepts, manage datasets, appropriately analyze data, interpret results, and communicate the results to clients or decision-makers. Someone lacking any of these competencies is unlikely to fully realize analytics' benefits. We also encourage instructors to supplement the competencies identified here with a discussion of ethical considerations associated with using analytics and "big data," as organizations must consider the types of data they collect about their customers and how they use these data (Corrigan, Craciun, & Powell, 2014).
Our goal was to present the competencies and, more importantly, the challenges students are likely to face in developing them in a marketing context. In doing so, we have pointed instructors to practices, resources, and theoretically based recommendations that they can utilize in developing the competencies, many of which we employ in our classes. The obvious question is whether teaching analytics courses with an eye toward developing the competencies presented here is effective. In our classes referenced in Table 1, we surveyed students at the end of the semester using the same questions from the beginning of the semester. Table 1 presents the results in the column labeled "End of Semester." While the perceived difficulty of statistics and working with data did not change, students' perceptions of their experience with statistics and data did increase significantly.
Importantly, after being exposed to many of the practices and examples described here, students felt significantly more knowledgeable or comfortable with all of the specific course components that related to the various competencies. At a higher level, as shown in Figure 1, students became significantly more comfortable with analytics (M_Beginning = 4.52, M_End = 5.69, t(122) = 11.01, p < .001) and more interested in analytics (M_Beginning = 4.27, M_End = 4.63, t(122) = 3.25, p = .001) from the beginning to the end of the semester.
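For instructors who want students to reproduce this kind of before/after comparison, the short Python sketch below runs a paired t-test on simulated ratings. The data are fabricated for illustration only; they are not the survey data reported above, and the means and standard deviations are arbitrary assumptions.

    # A minimal sketch with fabricated data (not the study's survey data):
    # a paired t-test comparing beginning- and end-of-semester comfort ratings.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_students = 123                                         # df = n - 1 = 122
    beginning = rng.normal(4.5, 1.0, size=n_students)        # comfort at semester start
    end = beginning + rng.normal(1.1, 0.9, size=n_students)  # comfort at semester end

    result = stats.ttest_rel(end, beginning)                 # paired (repeated-measures) t-test
    print(f"M_beginning = {beginning.mean():.2f}, M_end = {end.mean():.2f}, "
          f"t({n_students - 1}) = {result.statistic:.2f}, p = {result.pvalue:.3g}")

A paired test is appropriate here because the same students are measured at both time points.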
How should a marketing department modify its curriculum to develop these competencies? Building the competencies by integrating them into existing courses is likely to be difficult (Saber & Foster, 2011), due both to time and instructor interest/expertise constraints (Mintu-Wimsatt & Lozada, 2018). A more effective approach involves creating stand-alone analytics-based courses, as advocated by Pilling et al. (2012) and Liu and Burns (2018), who provide guidance related to specific educational goals, instructional plans, metrics and analysis tools to be covered, and evaluation criteria. Wilson et al. (2018), Liu and Levin (2018), and LeClair (2018) highlight how such a course could fit into the marketing curriculum, and the competencies discussed in the current research complement these recommendations.
We agree with Liu and Burns (2018) that a single analytics survey course offers value by exposing students to relevant topics, but it is unlikely to create analytics proficiency. Further, a single course may become isolated, discouraging integration of the topic across the curriculum (LeClair, 2018). For these reasons, we advocate for a sequence of courses, each focusing on one or more of the competencies and presented from a marketing perspective. A number of universities have developed undergraduate and/or graduate marketing analytics concentrations. We examined concentrations offered by 10 US universities, and several points are noteworthy. First, concentrations are typically advertised as requiring three or four courses. However, these courses often have prerequisites, suggesting that, in practice, analytics proficiency requires more than the three or four advertised courses. Second, perhaps due to resource and expertise constraints, marketing departments typically outsource at least some courses to other departments (Liu & Levin, 2018), such as Computer Science, Math/Statistics, and Communications. Third, most of the concentrations we examined include at least some existing courses that have been packaged to create a cluster of analytics-related courses. In some cases, analytics concentrations (or courses) are simply rebranded "research" concentrations (or courses), with new titles (e.g., Marketing Research and Analytics). Fourth, marketing analytics concentrations are sometimes owned by other departments, such as math or statistics. This suggests that if marketing departments are not willing or able to offer a concentration, strong demand is leading other departments to fill the void.

[Figure 1. Changes in Student Comfort with and Interest in Analytics. Notes: Onset = beginning of semester, Complete = end of semester. Comfort scale: α = .75 (onset), .89 (complete); interest scale: α = .81 (onset), .85 (complete).]
While these approaches are understandable, they are likely to lead to concentrations that do not adequately develop the competencies identified here or are not well integrated in the curriculum (LeClair, 2018; Mintu-Wimsatt & Lozada, 2018). Assuming the opportunity to develop a marketing analytics concentration from scratch, Table 2 presents a proposed curriculum map for a five-course concentration based on the six competencies presented here. For each course, we describe the competencies to be developed, examples of major topics covered, and basic pedagogical approaches for doing so.
Table 2
Proposed Five-Course Marketing Analytics Concentration

Course 1: Marketing Data and Information Management
Competencies Developed: Assessing data quality, understanding measurement, managing datasets
Major Topics Covered: Secondary sources of marketing data (e.g., governments, trade associations); acquiring primary marketing data (e.g., survey design, web scraping software, in-store scanners, consumer panels, experimentation); sampling; construct definition; reliability; validity; level of measurement; managing datasets/databases (creating, merging, formatting, creating new variables) for common analytics software (e.g., Excel, R, SAS) (see the illustrative sketch following the table)
Pedagogical Approach(es): Lecture-based course with numerous exercises designed to familiarize students with common sources of marketing data, evaluating data, and preparing data for analysis

Course 2: Marketing Analytics I
Competency Developed: Analyzing data
Major Topics Covered: Univariate and bivariate analyses (e.g., descriptive statistics, t-tests, chi-square tests, simple regression); marketing metric dashboards, including commonly used metrics; Google Analytics/AdWords; common analysis software
Pedagogical Approach(es): Lecture-based course with numerous exercises designed to familiarize students with common analyses, dashboards, and analysis software

Course 3: Marketing Analytics II
Competency Developed: Analyzing data
Major Topics Covered: Multivariate analyses, including multiple regression, cluster analysis, multidimensional scaling, factor analysis, predictive modeling; search engine optimization; textual analysis; data mining; advanced database management (e.g., SQL)
Pedagogical Approach(es): Lecture-based course with numerous exercises designed to familiarize students with more advanced analyses and database management

Course 4: Communicating Analytics for Effective Decision Making
Competencies Developed: Interpreting results, communicating results
Major Topics Covered: Managerial implications; written and oral communications; data visualization (e.g., Tableau)
Pedagogical Approach(es): Case-based course with multiple analytics-focused case studies, with associated data files for analysis, that require oral and/or written presentations; supplemental exercises to gain experience in interpreting and presenting results from analyses covered in the previous two courses

Course 5: Marketing Analytics Practicum/Internship
Competencies Developed: Assessing data quality, understanding measurement, managing datasets, analyzing data, interpreting results, communicating results
Major Topics Covered: Depends on needs of organizational partner
Pedagogical Approach: Real-world project-based course to tie competencies together
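As one concrete illustration of the dataset-management topics listed for Course 1, the following Python/pandas sketch merges hypothetical survey and transaction records and derives a new variable. The column names, values, and the high-value cutoff are invented for illustration and are not part of the proposed curriculum.

    # A minimal sketch of Course 1's dataset-management topics: merge two
    # hypothetical sources and create a new variable. All names and values
    # are invented for illustration.
    import pandas as pd

    surveys = pd.DataFrame({
        "customer_id": [101, 102, 103],
        "satisfaction": [6, 4, 7],           # 1-7 survey rating
    })
    transactions = pd.DataFrame({
        "customer_id": [101, 101, 102, 103],
        "amount": [25.0, 40.0, 15.0, 60.0],  # purchase amounts in dollars
    })

    # Aggregate to one row per customer, then merge with the survey data
    spend = transactions.groupby("customer_id", as_index=False)["amount"].sum()
    merged = surveys.merge(spend, on="customer_id", how="left")

    # Create a new variable: flag customers whose total spending exceeds $30
    merged["high_value"] = merged["amount"] > 30
    print(merged)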
The first four courses would be in-class (or online) courses designed to provide necessary structure for students. The final course would be an internship or practicum, conducted in conjunction with an outside organization, designed to provide experiential learning opportunities. As LeClair (2018, p. 9) argues, and we agree, "it is important for students to be exposed to and deal with the real imperfections of data." The courses could be structured around a single topic, such as pricing, branding, or advertising. However, we encourage instructors to include exercises and examples from various topics to impress upon students the value of analytics in each of these areas. While specific details of the courses would depend on class size, available resources (e.g., software), and various other factors, we refer the interested reader to suggestions by Liu and Levin (2018, pp. 18–20).
For some students, analytics will become a career path. Most others will interact with analytics in managerial roles, or analytics will influence how they perform their jobs. Regardless, it is becoming difficult for one to avoid analytics in the workplace. Five of the top ten skills employers recently said they desired in college graduates were the ability to make decisions and solve problems (#1), the ability to obtain and process information (#5), the ability to analyze quantitative data (#6), proficiency with computer software programs (#8), and the ability to create written reports (#9) (Adams, 2014). These desirable skills highlight the importance of, and align nicely with, the analytics competencies identified here.

FOOTNOTES

2. http://www.centerforcommunicatingscience.org/
3. https://www.aaas.org/pes
4. http://www.nedarc.org/tutorials/utilizingData/index.html
5. Suggestions provided by http://www.nicholasreese.com/curse-of-knowledge/
DISCLOSURE STATEMENT
No potential conflict of interest was reported by the
authors.
REFERENCES
Adams, S. (2014, November 12). The 10 skills employers most want in 2015 graduates. Forbes. Retrieved from https://www.forbes.com/sites/susanadams/2014/11/12/the-10-skills-employers-most-want-in-2015-graduates/#e498e4225116

Aggarwal, P., Vaidyanathan, R., & Rochford, L. (2007). The wretched refuse of a teeming shore? A critical examination of the quality of undergraduate marketing students. Journal of Marketing Education, 29(3), 223–233. doi:10.1177/0273475307306888

Aragón, O. R., Dovidio, J. F., & Graham, M. J. (2016). Colorblind and multicultural ideologies are associated with faculty adoption of inclusive teaching practices. Journal of Diversity in Higher Education. doi:10.1037/dhe0000026

Ariker, M., Diaz, A., Moorman, C., & Westover, M. (2015, November 5). Quantifying the impact of marketing analytics. Harvard Business Review. Retrieved from https://hbr.org/2015/11/quantifying-the-impact-of-marketing-analytics

Ballou, D. P., & Pazer, H. L. (1985). Modeling data and process quality in multi-input, multi-output information systems. Management Science, 31(2), 150–162. doi:10.1287/mnsc.31.2.150

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191–215. doi:10.1037//0033-295x.84.2.191

Brownell, S. E., Price, J. V., & Steinman, L. (2013). Science communication to the general public: Why we need to teach undergraduate and graduate students this skill as part of their formal training. Journal of Undergraduate Neuroscience Education, 12(1), 6–10.

Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse of knowledge in economic settings: An experimental analysis. Journal of Political Economy, 97(5), 1232–1254. doi:10.1086/261651

Cavanagh, A. J., Aragón, O. R., Chen, X., Couch, B., Durham, M., Bobrownicki, A., … Graham, M. J. (2016). Student buy-in to active learning in a college science course. Life Sciences Education, 15(4), ar76. doi:10.1187/cbe.16-07-0212

Chew, P. K., & Dillon, D. B. (2014). Statistics anxiety update: Refining the construct and recommendations for a new research agenda. Perspectives on Psychological Science, 9(2), 196–208. doi:10.1177/1745691613518077

Churchill, G. A., Jr. (1979). A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16(1), 64–73. doi:10.1177/002224377901600110

CMO Survey. (2015, August). CMO survey report: Highlights and insights. Retrieved from https://cmosurvey.org/results-august-2016/survey-results-august-2015/

Corrigan, H. B., Craciun, G., & Powell, A. M. (2014). How does Target know so much about its customers? Utilizing customer analytics to make marketing decisions. Marketing Education Review, 24(2), 159–166. doi:10.2753/MER1052-8008240206

Davis, R., Misra, S., & Van Auken, S. (2002). A gap analysis approach to marketing curriculum assessment: A study of skills and knowledge. Journal of Marketing Education, 22, 218–224. doi:10.1177/0273475302238044

DeLone, W. H., & McLean, E. R. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1), 60–95. doi:10.1287/isre.3.1.60

DeRoma, V. M., Martin, K. M., & Kessler, M. L. (2003). The relationship between tolerance for ambiguity and need for course structure. Journal of Instructional Psychology, 30(2), 104–110.

Dweck, C. S. (1986). Motivational processes affecting learning. American Psychologist, 41(10), 1040–1048. doi:10.1037/0003-066X.41.10.1040

Edwards-Levy, A. (2016, November 15). Most Americans are surprised by and unhappy with the election results. The Huffington Post. Retrieved from http://www.huffingtonpost.com/entry/election-results-americans-surprised_us_582b82c3e4b01d8a014b13bd

Fredrickson, B. L. (2004). The broaden-and-build theory of positive emotions. Philosophical Transactions of the Royal Society B: Biological Sciences, 359(1449), 1367–1378. doi:10.1098/rstb.2004.1512

Gilliam, D. A., & Voss, K. (2013). A proposed procedure for construct definition in marketing. European Journal of Marketing, 47(1/2), 5–26. doi:10.1108/03090561311285439

Goodhue, D. L. (1995). Understanding user evaluations of information systems. Management Science, 41(12), 1827–1844. doi:10.1287/mnsc.41.12.1827

Heath, C., & Heath, D. (2007). Made to stick: Why some ideas survive and others die. New York, NY: Random House.

Helsdingen, A., van Gog, T., & van Merriënboer, J. (2011). The effects of practice schedule and critical thinking prompts on learning and transfer of a complex judgment task. Journal of Educational Psychology, 103(2), 383–398. doi:10.1037/a0022370

Hernandez, P. R., Schultz, P. W., Estrada, M., Woodcock, A., & Chance, R. C. (2013). Sustaining optimal motivation: A longitudinal analysis of interventions to broaden participation of underrepresented students in STEM. Journal of Educational Psychology, 105(1), 89–107. doi:10.1037/a0029691

Hodge, V. J., & Austin, J. (2004). A survey of outlier detection methodologies. Artificial Intelligence Review, 22(2), 85–126. doi:10.1023/B:AIRE.0000045502.10941.a9

Houghton, D. M., Schertzer, C., & Beck, S. (2018). The MSCA program: Developing analytics unicorns. Marketing Education Review, 28(1), 41–51. doi:10.1080/10528008.2017.1409078

Huang, J. L., Curran, P. G., Kenney, J., Poposki, E. M., & DeShon, R. P. (2012). Detecting and deterring insufficient effort responding to surveys. Journal of Business and Psychology, 27, 99–114. doi:10.1007/s10869-011-9231-8

Interactive Advertising Bureau. (2009). Measurement protocols and guidelines. Retrieved from https://www.iab.com/guidelines/measurement-protocols-guidelines/

Jarke, M., & Vassiliou, Y. (1997). Data warehouse quality: A review of the DWQ project. In D. Strong & B. Kahn (Eds.), Proceedings of the 1997 conference on information quality (pp. 299–313). Cambridge, MA: MIT Press.

Johnson, K. G. (2016). Incorporating writing into statistics. In J. Dewar, P. Hsu, & H. Pollatsek (Eds.), Mathematics education: A spectrum of work in mathematical sciences departments (pp. 319–334). Cham, Switzerland: Springer.

Kolb, D. A. (1984). Experience as the source of learning and development. Upper Saddle River, NJ: Prentice Hall.

LaBarbera, P. A., & Simonoff, J. (1999). Toward enhancing the quality and quantity of marketing students. Journal of Marketing Education, 21, 4–13. doi:10.1177/0273475399211002

LeClair, D. (2018). Integrating business analytics in the marketing curriculum: Eight recommendations. Marketing Education Review, 28(1), 6–13. doi:10.1080/10528008.2017.1421050

Lee, Y. W., Strong, D. M., Kahn, B. K., & Wang, R. Y. (2002). AIMQ: A methodology for information quality assessment. Information and Management, 40, 133–146. doi:10.1016/S0378-7206(02)00043-5

Little, R. J., & Rubin, D. B. (2014). Statistical analysis with missing data. Hoboken, NJ: Wiley & Sons.

Liu, X., & Burns, A. C. (2018). Designing a marketing analytics course for the digital age. Marketing Education Review, 28(1), 28–40. doi:10.1080/10528008.2017.1421049

Liu, Y., & Levin, M. A. (2018). A progressive approach to teaching analytics in the marketing curriculum. Marketing Education Review, 28(1), 14–27. doi:10.1080/10528008.2017.1421048

Mintu-Wimsatt, A., & Lozada, H. R. (2018). Business analytics in the marketing curriculum: A call for integration. Marketing Education Review, 28(1), 1–5. doi:10.1080/10528008.2018.1436974

Pilling, B. K., & Nasser. (2015). The early identification of at-risk students in an undergraduate marketing metrics course. Analytic Marketing Journal, 4(1), 89–106.

Pilling, B. K., Rigdon, E. E., & Brightman, H. J. (2012). Building a metrics-enabled marketing curriculum: The cornerstone course. Journal of Marketing Education, 34(2), 179–193. doi:10.1177/0273475312450390

Pirog, S. F., III. (2010). Promoting statistical analysis in the marketing curriculum: A conjoint analysis exercise. Marketing Education Review, 20, 249–254. doi:10.2753/MER1052-8008200305

Pirsig, R. M. (1974). Zen and the art of motorcycle maintenance: An inquiry into values. New York, NY: HarperCollins Publishers.

Radke-Sharpe, N. (1991). Writing as a component of statistics education. The American Statistician, 45(4), 292–293.

Rossiter, J. R. (2005). Reminder: A horse is a horse. International Journal of Research in Marketing, 22(1), 23–25. doi:10.1016/j.ijresmar.2004.11.001

Saber, J. L., & Foster, M. K. (2011). The agony and the ecstasy: Teaching marketing metrics to undergraduate business students. Marketing Education Review, 21(1), 9–20. doi:10.2753/MER1052-8008210102

Schau, C., Stevens, J., Dauphinee, T. L., & Del Vecchio, A. (1995). The development and validation of the survey of attitudes toward statistics. Educational and Psychological Measurement, 55, 868–875. doi:10.1177/0013164495055005022

Schlee, R. P., & Harich, K. R. (2010). Knowledge and skill requirements for marketing jobs in the 21st century. Journal of Marketing Education, 32(3), 341–352. doi:10.1177/0273475310380881

Sieck, W., & Yates, J. (1997). Exposition effects on decision making: Choice and confidence in choice. Organizational Behavior and Human Decision Processes, 70(3), 207–219. doi:10.1006/obhd.1997.2706

Spielberger, C. D. (2013). The effects of anxiety on complex learning. In C. Spielberger (Ed.), Anxiety and behavior (pp. 361–398). New York, NY: Elsevier Science.

Velleman, P. F., & Wilkinson, L. (1993). Nominal, ordinal, interval, and ratio typologies are misleading. The American Statistician, 47, 65–72.

Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.

Wand, Y., & Wang, R. Y. (1996). Anchoring data quality dimensions in ontological foundations. Communications of the ACM, 39(11), 86–95. doi:10.1145/240455.240479

Wang, R. Y., & Strong, D. M. (1996). Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems, 12(4), 5–34. doi:10.1080/07421222.1996.11518099

Westra, A., & Nwaoha-Brown, F. (2017). Nonresponse bias analysis for wave 1 of the 2014 Survey of Income and Program Participation (SIPP). US Census Bureau Memorandum. Retrieved from https://www2.census.gov/programs-surveys/sipp/tech-documentation/complete-documents/2014/2014_SIPP_Wave_1_Nonresponse_Bias_Report.pdf

Wilson, E. J., McCabe, C., & Smith, R. S. (2018). Curriculum innovation for marketing analytics. Marketing Education Review, 28(1), 52–66. doi:10.1080/10528008.2017.1419431

Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17(2), 89–100. doi:10.1111/j.1469-7610.1976.tb00381.x

Wright, N. D., & Larsen, V. (2016). Improving marketing students' writing skills using a one-page paper. Marketing Education Review, 26(1), 25–32. doi:10.1080/10528008.2015.1091666

Zmud, R. (1978). Empirical investigation of the dimensionality of the concept of information. Decision Sciences, 9(April), 187–189. doi:10.1111/j.1540-5915.1978.tb01378.x