Information Privacy Concerns, Procedural Fairness and Impersonal Trust:
An Empirical Investigation
By
Mary J. Culnan
School of Business
Georgetown University
Washington, D.C. 20057-1008
(202) 687-3802
(202) 687-4031 (fax)
CULNANM@GUNET.GEORGETOWN.EDU
&
Pamela K. Armstrong
Consultant
Potomac, Maryland
PAMELA.ARMSTRONG@EROLS.COM
Correspondence about the paper should be directed to Mary Culnan
Organization Science, forthcoming
Revised October 4, 1997
The authors acknowledge the helpful comments of Jeff Smith and Bob Zmud, the anonymous
reviewers and the Associate Editor, and especially Bob Bies on earlier versions of this paper. A
preliminary version was presented at the INFORMS National Meeting, May 1996.
Information Privacy Concerns, Procedural Fairness and Impersonal Trust: An Empirical
Investigation
Abstract
This research addresses the tensions that arise between the collection and use of personal
information that people provide in the course of most consumer transactions, and privacy. In
today’s electronic world, the competitive strategies of successful firms increasingly depend on
vast amounts of customer data. Ironically, the same information practices that provide value to
organizations also raise privacy concerns for individuals. This study hypothesized that
organizations can address these privacy concerns and gain business advantage through customer
retention by observing procedural fairness: customers will be willing to disclose personal
information and have that information subsequently used to create consumer profiles for business
use when there are fair procedures in place to protect individual privacy. Because customer
relationships are characterized by social distance, customers must depend on strangers to act on
their behalf. Procedural fairness serves as an intermediary to build trust when interchangeable
organizational agents exercise considerable delegated power on behalf of customers who cannot
specify or constrain their behavior. Our hypothesis was supported as we found that when
customers are explicitly told that fair information practices are employed, privacy concerns do not
distinguish consumers who are willing to be profiled from those who are unwilling to have their
personal information used in this way.
KEYWORDS: Information privacy, procedural justice, trust, service quality, organizational
information processing
Information Privacy Concerns, Procedural Fairness and Impersonal Trust: An Empirical
Investigation
INTRODUCTION
Two converging trends, one competitive and the other technological, are driving
American business. First, to survive in the increasingly competitive global economy, companies
depend on vast quantities of information to build strong bonds with current customers, and to
attract new customers. Second, information technology (IT) continues to increase in capability
and to decline in cost, allowing information to be used in ways that were previously impossible or
economically impractical. Technology enables companies to record the details of any customer
transaction at the point-of-sale, to store vast quantities of transaction data in their data
warehouse, and to use these data to execute marketing programs with a business partner or alone.
Technology also enables the development of extensive customer databases, making it possible to
deal with customers as individuals. Instantaneous access to the customer’s history by a customer
service representative allows standardized, impersonal encounters with whoever answers the
800-number to assume the appearance of a personal relationship (Gutek, 1995). Therefore, the
marketing strategies of successful firms increasingly depend on effective use of vast amounts of
detailed customer transaction data (Bessen, 1993; Blattberg & Deighton, 1991; Glazer, 1991).
This research addresses the tensions that arise in today's increasingly electronic world
between the collection and use of personal information people provide in the course of most
consumer transactions, and individual privacy. The hypothesis of the study is that consumers will
be willing to disclose personal information and have that information subsequently used to create
profiles for marketing use when their concerns about privacy are addressed by fair procedures.
The major contribution of the research is that it provides empirical evidence that companies can
gain competitive advantage by behaving ethically.
Transaction data generated by customer contacts before, during and after the sale are a
critical resource in the increasingly competitive global economy that is moving from a paradigm of
mass production and mass merchandising to one of mass customization and personal service
(Glazer, 1991; Pine, 1993). Table 1 illustrates the data typically generated during a sales
transaction. The richness of the data varies depending upon the technology employed, ranging
from a cash register without scanning capability where essentially no customer data is recorded to
an online service where all of the customer’s “mouse tracks” are recorded (Miller, 1996).
Advances in telecommunications and database technology mean that all transaction data should be
accessible on a timely basis to everyone in the firm with a need for the data. For example, data
collected about product returns in Europe can be used by marketers in the U.S. or by a plant
manager in Mexico to address potential problems in product design or changes in customer
preferences as soon as enough products are returned, and the aggregated data about these returns
makes the organization aware that a problem may exist. Transaction data signaling increased
sales or the success of an advertising campaign for a target market segment, or even an absence of
sales data where sales were expected, serve the same signaling function to the firm. Because
these individual transactions are, in reality, "messages" from customers to the firm that should be
distributed as appropriate to functions across the value chain, information systems that process
these transactions are in fact organizational information systems (Culnan, 1992). Organizations
can gain competitive advantage by collecting and using transaction data effectively (Glazer,
1991).
-- Insert Table 1 About Here --
The use of transaction data as an organizational resource can create positive or negative
outcomes to a firm, based on how the information is used. In positive terms, the use of
transaction data to yield better customer service, higher quality products, and new products that
reflect consumer preferences creates benefits for both consumers and the firm. The collection of
detailed information on consumer preferences enables firms to engage in relationship marketing
and to target offers more accurately based on their customers' specific interests (Blattberg &
Deighton, 1991; Glazer, 1991).
There is also a potential downside to the collection and use of greater amounts of
increasingly detailed personal information. Ironically, the same practices that provide value to
organizations and their customers also raise privacy concerns (Bloom, Milne & Adler, 1994).
Privacy is the ability of the individual to control the terms under which personal information is
acquired and used (Westin, 1967). Personal information is information identifiable to an
individual. As Table 1 illustrates, today’s customers leave more electronic footprints detailing
their behavior and preferences; their buying habits are easily profiled, and can be readily shared
with strangers. If the firm’s practices raise privacy concerns resulting from a perception that
personal information is used unfairly, this may lead to customers being unwilling to disclose
additional personal information, customer defections, bad word of mouth, and difficulty attracting
new customers, all of which can negatively impact the bottom line. The growth of the Internet
and other online systems also makes it possible for consumers to engage in “electronic retaliation”
if they object to a company’s practices, by “flaming” the company directly by electronic mail (Bies
& Tripp, 1996), or by posting negative public comments to a computer discussion group. As the
text of Internet discussion groups is archived and can be easily searched by keywords such as
company or product name, these negative comments live on long after they were posted. The
challenge to organizations, then, is to balance the competing forces of the power of information
with privacy in their dealings with their customers.
The failure to use personal information fairly or responsibly may raise two kinds of
information privacy concerns resulting from the inability of an individual to control the use of
personal information. First, an individual's privacy may be invaded if unauthorized access is
gained to personal information as a result of a security breach or an absence of appropriate
internal controls. Second, because computerized information may be readily duplicated and
shared, there is the risk of secondary use, that is information provided for one purpose may be
reused for unrelated purposes without the individual's knowledge or consent. Secondary use
includes sharing personal information with others who were not a party to the original transaction,
or the merging of transaction and demographic data to create a computerized profile of an
individual by the organization that originally collected the information (Culnan, 1993; Godwin,
1991; Foxman & Kilcoyne, 1993; Smith, Milberg & Burke, 1996). This paper addresses the
latter concern, secondary use, where organizations make deliberate choices about reuse of their
customers’ personal information, and where the customer may perceive the reuse as varying from
their expectations for fair use, done without their consent, and therefore unfair.
THEORETICAL BACKGROUND & HYPOTHESES
Privacy and Fairness
Prior research on privacy found that individuals are willing to disclose personal
information in exchange for some economic or social benefit subject to the "privacy calculus," an
assessment that their personal information will subsequently be used fairly and they will not suffer
negative consequences (Laufer & Wolfe, 1977; Milne & Gordon, 1993; Stone & Stone, 1990).
For example, a recent survey of Internet users conducted by Georgia Tech found that 78% of the
survey participants would be willing to provide demographic information about themselves to the
owner of a web site if “a statement was provided regarding how the information was used.” Only
6% of the participants would not disclose demographic information under any circumstances
(Georgia Tech, 1996).
In general, individuals are less likely to perceive information collection procedures as
privacy-invasive when a) information is collected in the context of an existing relationship, b) they
perceive that they have the ability to control future use of the information, c) the information
collected or used is relevant to the transaction, and d) they believe the information will be used to
draw reliable and valid inferences about them. See Bies (1993) and Stone and Stone (1990) for
an extensive review of this literature. While the self-disclosure literature has focused on
interpersonal relationships rather than impersonal customer relationships between individuals and
firms, its findings are consistent regarding a balancing test. People disclose personal information
to gain the benefits of a close relationship; the benefits of disclosure are balanced with an
assessment of the risks of disclosure (Derlega et al., 1993).
Creating a willingness in individuals to disclose personal information, then, requires that
organizations also view the collection of personal information as a "social contract" with their
customers, where in addition to exchanging money for products or services, the customer also makes
non-monetary exchanges of personal information for intangible benefits such as the higher quality
service described above (Glazer, 1991; Milne & Gordon, 1993). Customers will continue to participate
in this social contract as long as the perceived benefits exceed the risks. Developing information
practices that address this perceived risk results in positive experiences with a firm over time,
increasing the customer's perceptions that the firm can be trusted. Trust reflects a willingness to
assume the risks of disclosure (Mayer et al., 1995). Trust creates switching costs, increasing the
likelihood that the customer will continue in the relationship with the firm (Gundlach & Murphy,
1993). Managing this “second exchange” in a marketing transaction by treating customer
information fairly, then, is essential to building trust in a customer relationship.
Some industry groups have argued that privacy is a customer service issue (Direct
Marketing Association, 1994; Dowling, 1993). While the literature on customer service has not
specifically addressed privacy, it has established a link between being treated fairly and customer
satisfaction (Schneider & Bowen, 1995). Berry (1995) found that customers see fairness and
service quality as “inseparable issues” -- since customer perceptions drive service quality, a
service that is perceived as being unfair will also be perceived as being lower in quality.
Conversely, the perception of fair treatment of customers has been shown to be positively related
to higher levels of satisfaction in services (Clemmer & Schneider, 1996). These authors found
this link between fair treatment and customer satisfaction to hold across all four of the service
industries they studied. They found that customers evaluate the fairness of the core service
received, the procedures used in service delivery, and the personal treatment received. Fairness is
inherent in the consumer’s basic need for justice. A violation of this need, such as violating a
psychological contract, will result in angry and disloyal customers as described above. Heskett,
Sasser and Hart (1990) note that “many differences attached to the value of a service by
customers are explained by the level of risk perceived by the customer...and the degree to which
such risks can be minimized by the service provider.” The customer who discloses personal
information runs the risk that the information will not be used fairly. Companies that establish fair
information practices and disclose these practices before collecting personal information from
customers greatly reduce these perceived risks and the subsequent negative consequences.
One goal of offering high quality service is to keep customers coming back and to attract
new ones through positive word-of-mouth. Gutek (1995) notes that as the customer uses the
service over time (assuming he or she continues to perceive the service as fair), trust builds
between the customer and service provider. This trust is crucial, since customers often lack the
expertise or the first-hand knowledge to know whether the service provided is correct (Shapiro,
1987). If the trust is low, then the customer will likely take his or her business elsewhere. If the
customer has absolute trust in the provider, then the provider will be able to learn more about the
customer in order to serve customers better. However, absolute trust also provides a potential
opportunity for the company to exploit the customer (Gutek, 1995; Shapiro, 1987). The literature
on organizational justice suggests that procedural fairness of company practices can have a major
positive impact on trust and privacy perceptions (Bies, 1993).
Procedural Fairness
Procedural fairness refers to the perception by the individual that a particular activity in
which they are a participant is conducted fairly (Lind & Tyler, 1988). Factors that contribute to
perceptions of procedural fairness include providing the consumer with voice, and control over
actual outcomes (Folger & Greenberg, 1985; Lind and Tyler, 1988). Research has shown that
even if outcomes are not favorable to an individual, individuals are less likely to be dissatisfied
with unfavorable outcomes if they believe that the procedures used to derive those outcomes are
fair (Lind & Tyler, 1988; Greenberg, 1987; Folger & Bies, 1989).
For consumer marketing, fair information practices operationalize procedural fairness.
Fair information practices are procedures that provide individuals with control over the disclosure
and subsequent use of their personal information. They are global standards for the ethical use of
personal information and are at the heart of U.S. privacy laws, the privacy directive adopted by
the European Union in July 1995, and the Clinton Administration's June 1995 guidelines for
personal information use by all National Information Infrastructure participants.
At the heart of fair information practices are two concepts: notice and consent. These
two concepts are reflected in the following principles. When they provide personal information,
people have the right to know why the information is being collected, its expected uses, the steps
that will be taken to protect its confidentiality, integrity and quality, the consequences of
providing or withholding information, and any means of redress available to the individual.
People also have the right to control how their personal information will subsequently be used by
objecting to uses of their personal information when information will be collected for one purpose
and used for other purposes. Fair information practices also state that personal information
should not be used in ways that are incompatible with the individual's understanding of how it will
be used unless there is a compelling public interest for such use (U.S. IITF, 1995). Fair
information practices, therefore, mediate the privacy concerns raised by disclosure and subsequent
use of personal information by empowering the individual with control and voice, even if people
do not choose to invoke the procedures, as well as an assurance that the firm will adhere to a set
of principles that most customers find acceptable (Folger & Bies, 1989; Folger & Greenberg,
1985; Greenberg, 1987; Lind & Tyler, 1988; Mayer et. al., 1995; Shapiro, 1987; Stone & Stone,
1990). Fair information practices, then, make the "deal" with the consumer fair (Donaldson &
Dunfee, 1994; Milne & Gordon, 1993).
In marketing, a central element of fair information practices is the ability of individuals to
remove their names from mailing lists. The 1990 Equifax survey found the majority of the public
believes it is acceptable for direct marketers to use names and addresses on a mailing list if people
who do not want to receive mail offers can remove their names from the mailing list. Culnan
(1995) found that people who were aware of name removal procedures had a lower concern for
privacy than those who were not aware of these procedures, suggesting that awareness of fairness
procedures can address the privacy concerns associated with disclosure and use of personal
information (Greenberg, 1987).
Procedural fairness, then, can create a "privacy leverage point" for organizations by
providing an opportunity for the firm to promote customer disclosure of personal information
by disclosing its information policies to the customer or prospective customer, provided its
subsequent practices are consistent with the policy. Figure 1 illustrates the role of procedural
fairness in building trust over the life of a customer relationship where customers must rely on
“strangers” to protect their interests (Shapiro, 1987). Over the life of a customer relationship,
firms potentially gather large amounts of personal information about their customers. Some of
this information is gathered directly as a result of each transaction; other information is acquired from third
parties, allowing the firm to develop an extensive profile for each customer. Data warehouse
technology allows the firm to perform sophisticated analyses on massive amounts of transaction
data and to develop marketing programs for individual customers. As described previously, other
online technologies make these data available for use throughout the organization, independent of
physical location. Based on their subsequent experiences with the firm, customers make an
assessment of whether or not they perceive that their personal information was used consistent
with their expectations. If the information was used consistently with those expectations, the
customer is likely to stay in the relationship. If not, the customer may defect and/or engage in bad
word of mouth (Morgan & Hunt, 1994). The privacy leverage point, then, provides an intervention
opportunity for firms to build trust with their customers as they collect and use personal
information, therefore making customers willing to disclose personal information by minimizing
the risks of disclosure to these individuals.
-- Insert Figure 1 about here--
Hypotheses
This study hypothesizes that procedural fairness can strike a balance between the
competing forces of privacy and information use. When taken together, the literature on privacy,
self-disclosure, and procedural justice suggests that procedural fairness, defined here as fair
information practices, can mediate the privacy concerns that often arise when customer
transaction data and other personal information are merged to create profiles for use in targeted
marketing. However, this relationship has not been tested empirically. This study will test two
hypotheses that address the relationship between procedural fairness and privacy:
H1: When people are not explicitly told that fair procedures will be employed for
managing their personal information, people with a greater concern for privacy will be
less willing to have their personal information used for profiling.
H2: When people are explicitly told that fair procedures will be employed for managing
their personal information, privacy concerns will not distinguish people who are unwilling
to be profiled from those who are willing to have their personal information used for
profiling.
Prior experience should also influence an individual’s willingness to be profiled.
Individuals who have prior experience with direct or targeted marketing are more likely to
understand the benefits of profiling and also to be aware of fair information practices as a
means for exercising control over their personal information (Culnan, 1995). Those with
experience should also have developed a degree of trust in the process as Figure 1 illustrates, and
should be more willing to have their personal information used in this way. For these individuals,
profiling is likely to be perceived as compatible with their existing values and past experiences
(Rogers, 1983). This suggests:
H3: Prior experience with targeted marketing will distinguish people who are willing to
have their personal information used for profiling from those who are not willing.
METHODOLOGY
The context for the research is the use of personal information gathered from prospective
subscribers to interactive home information services and the willingness of consumers to allow
personal information to be used in targeted advertising based on customer profiles compiled by
the interactive service providers. The study is based on a fresh analysis of data from the 1994
Harris Survey on Interactive Services, Consumers and Privacy. The survey was designed and
sponsored by Privacy and American Business. Data were collected by Louis Harris &
Associates by telephone from a random sample of 1,000 U.S. adults age 18 and older.
Dependent Variables
Willingness to have one’s personal information used to develop profiles for targeted
marketing was measured by two variables. First, willingness to have personal information
used for profiling without being explicitly told that fair information practices would be employed
was measured by two four-point Likert-scaled items ranging from “not at all interested” to “very
interested”:
How interested would you be in having this type of advertising [based on
subscriber profiling] presented to you from time to time, on your computer or T.V.
screen? (mean = 2.32, s.d. = 0.97); and
The interactive services provider could also ask you to check off your interests
and activities from a list on the TV or computer screen, so that special offers could
be made to you on-screen. How interested do you think you would be in doing
that? (mean = 2.32, s.d. = 1.01).
The two variables were factor analyzed using a varimax rotation. Both items loaded
unambiguously on a single factor and were combined to form a “use without fair information
practices” (USE-NO FIP) scale (r= .58, p < .001; Cronbach alpha =.74).
Willingness to have personal information used in profiling when individuals were
explicitly told that fair information practices were observed (USE-FIP) was measured by a single item,
“If the rules and safeguards I’ve just mentioned were adopted by companies offering interactive
services, how interested would you be in subscribing to a system that used subscriber profile
marketing?” (mean = 2.59, s.d. = 1.03).
The rules and safeguards comprising fair information practices were read to the survey
participants before the “USE-FIP” question was administered and were defined as follows:
Before you decided to subscribe, the service provider would inform you fully about
the collection of subscriber profile information and how it would be used;
You could control the types of products and services advertised to you as well as
when and for how long advertising messages would be displayed on the screen;
You could indicate what information in your subscriber profile could be used for
marketing and what couldn’t; and
You could review the information in your subscriber profile and correct any errors.
For each item, respondents were asked the importance of the practice on a 4-point Likert scale.
The four items were factor analyzed using a varimax rotation. All four items loaded
unambiguously on a single factor (Cronbach alpha = .88).
The questionnaire items were administered in the following order. The USE-NO FIP
items were administered first. Second, respondents were asked about the importance of fair
information practices. Finally, the USE-FIP item was administered.
Independent Variables
In an absolute sense, individuals surrender a measure of privacy whenever they disclose
any personal information. Therefore, taking overt steps to restrict the disclosure of personal
information should reflect a concern for diminished privacy that would result from disclosure.
The first independent variable, behavior that indicated a concern for privacy, was operationalized
using three dichotomous variables that measured an individual taking steps to restrict the
disclosure of personal information. The first two measured an individual’s unwillingness to
disclose personal information to others. The third measured whether an individual had ever been
unwilling to allow personal information to be reused for targeted marketing by another
organization:
Have you ever refused to give information to a business or company because you
thought it was not really needed, or was too personal, or haven’t you? (Yes = 70%)
Does your household have an unlisted or unpublished telephone number? (Yes = 23%)
Have you ever asked an organization, such as a publication or business with which you
have a relationship, to take your name off of any list they gave out to other
organizations for sending you mail offers, or not? (Yes = 33%).
The three items were summed to form a scale (mean = 1.30, s.d. = 0.87).
The second independent variable, prior experience with direct marketing, was measured by
a series of dichotomous variables. The subjects were asked whether or not they had:
Bought something from a catalog or brochure sent to your residence or workplace
(Yes = 65%)
Bought something offered to you by a telephone call to your residence or workplace
(Yes = 14%)
Bought something from a TV home shopping club (Yes = 13%)
Called a toll-free or 800 number to order something (Yes = 46%)
Used a 900 number that charged for information, products or services (Yes = 4%).
The responses to these items were summed to form a single variable, Direct Marketing
Experience (DMEXP: mean = 1.49, s.d. = 1.08).
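A minimal sketch of how the two summed independent variables could be constructed from the dichotomous items, continuing the hypothetical data frame above (Yes/No responses are assumed to be coded 1/0; the column names are illustrative, not taken from the survey instrument):

    # Yes/No responses coded 1/0; column names are hypothetical.
    privacy_items = ["refused_info", "unlisted_phone", "asked_name_removal"]
    dm_items = ["catalog", "telephone_offer", "tv_shopping", "toll_free", "pay_900"]

    df["privacy"] = df[privacy_items].sum(axis=1)   # 0-3 scale; reported mean 1.30, s.d. 0.87
    df["dm_exp"] = df[dm_items].sum(axis=1)         # 0-5 scale; reported mean 1.49, s.d. 1.08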
Table 2 contains descriptive statistics for the dependent and independent variables.
Table 2 also contains correlations for the two dependent variables and the independent variables.
-- Insert Table 2 About Here --
Prior research has also established that individuals vary in their concern for privacy,
based on their demographics and life experiences. For example, the Harris-Equifax Surveys
found African Americans, Hispanics, women, and less educated people to be most concerned
about privacy. Singer et al. (1993) found that both demographics and concern for privacy were
significantly related to return rates for the 1990 census; however, privacy concerns varied for
white and African-American respondents. Culnan (1995) found that demographics, experience
with direct marketing, and concern for privacy significantly discriminated among individuals who
were versus those who were not aware of name removal procedures. These results suggest that
here, an individual’s willingness to have their personal information used for targeted marketing is
also likely to reflect both their demographics and experience. However, prior research also
suggests that these demographic differences are captured by both attitudinal and behavioral
variables (Ajzen and Fishbein, 1980). Therefore, no additional demographic or experience
variables were used in this study.
RESULTS
The hypotheses were tested using a discriminant analysis which examined the joint
significance of the relationships in the hypothesized model (Hair et al., 1987). Discriminant
analysis is the appropriate statistical technique for determining if significant differences exist
between the profiles of two groups defined by a categorical dependent variable.
Discriminant analysis was used because the first dependent variable, USE-NO
FIP, was not normally distributed. When the two items were summed to form the USE-NO FIP
scale, the resulting distribution consisted only of even values. The use of a dichotomous variable
to operationalize the dependent variable, willingness/unwillingness to be profiled, was appropriate
given the hypotheses to be tested.
Both dependent variables were subsequently recoded as dichotomous variables. For both
variables, observations with values above the mean were coded as one, and those below the mean
were coded as zero. A separate discriminant analysis was performed for each of the two
dependent variables using the two independent variables: privacy and direct marketing
experience. These results are shown in Table 3 and Table 4.
-- Insert Table 3 and Table 4 About Here --
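Continuing the sketch above, the analysis pipeline might look as follows. The holdout proportion and random seed are assumptions (the paper does not report how the holdout sample was drawn), the original analysis was presumably run in a dedicated statistics package, and the structure correlations reported in Tables 3 and 4 (pooled within-group correlations between each predictor and the discriminant scores) would be computed as an additional step.

    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    # Dichotomize the dependent variable at its mean (above the mean = 1, below = 0).
    df["use_no_fip_group"] = (df["use_no_fip"] > df["use_no_fip"].mean()).astype(int)

    X = df[["privacy", "dm_exp"]]                   # the two independent variables
    y = df["use_no_fip_group"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    lda = LinearDiscriminantAnalysis()
    lda.fit(X_train, y_train)
    holdout_accuracy = lda.score(X_test, y_test)    # compared to the proportional chance criterion

    # The same steps are repeated with the USE-FIP item recoded as the dependent variable.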
Table 3 summarizes the results for the first dependent variable, use without being
explicitly told that fair information practices would be observed (USE - NO FIP). Both of the
independent variables are significant discriminators of those who are willing versus those who are
not willing to be profiled without fair information practices. The overall results for the
discriminant function were also significant (Chi-Square = 20.75, 2 d.f., p < 0.001). The function
correctly classified 56.1% of the cases in the holdout sample. This is greater than the
proportional chance criterion of 50.8%, which is calculated as C_p = p^2 + (1-p)^2, where p is the
proportion of people in group 1 and (1-p) is the proportion of people in group 2. We can
interpret the structure correlations as factor loadings to determine the variables that make the
greatest contribution to the discriminant function; generally, the variables with correlations that
exceed |0.30| are considered significant (Hair et al., 1987). Both of the independent variables
were significant.
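As a worked illustration of the proportional chance criterion, the sketch below assumes an approximate group split, since the exact proportions behind the reported 50.8% and 52.0% values are not given in the text.

    def proportional_chance(p: float) -> float:
        # Proportional chance criterion C_p = p^2 + (1-p)^2 for a two-group split.
        return p ** 2 + (1 - p) ** 2

    # An assumed split of roughly 56% / 44% between the two groups yields a C_p of
    # about 0.507, close to the 50.8% reported for the first discriminant analysis.
    print(round(proportional_chance(0.56), 3))   # 0.507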
Table 4 summarizes the results for the discriminant analysis using the second dependent
variable, willingness to be profiled after being told explicitly that fair information practices would
be observed (USE - FIP). Here, only independent variables for direct marketing experience is a
significant discriminator with the structure correlation exceeding 0.30. The overall function is
also significant (Chi-Square = 7.28, 2 d.f., p < 0.05). The discriminant function correctly
classified 60.0% of the holdout sample which is greater than the proportional chance criterion of
52.0%.
These results provide support for the three hypotheses. The first two hypotheses
postulated that privacy would distinguish people who are willing to be profiled from those who
are unwilling to be profiled only when people were not told that fair information practices would
be observed. As hypothesized, the privacy variable was a significant discriminator only in the first
discriminant analysis. In the second discriminant analysis, the privacy variable is not a significant
discriminator, providing support for the hypothesis that privacy concerns can be addressed by
explicitly telling customers that the company observes fair information practices.
The third hypothesis postulated that prior experience would also discriminate between
people who are willing to be profiled and those who are not. In both discriminant functions, the direct marketing
experience variable is significant, providing support for this hypothesis. People who are willing to
be profiled for marketing purposes are more likely to have prior experience with direct marketing
than people who are not willing.
DISCUSSION
Effective use of customer information to support activities across an organization’s value
chain has become a competitive necessity. The key challenge to organizations is to balance the
competitive advantages provided by the use of this information with the privacy concerns that use
of personal information may raise among their customers. This study examined the role of
procedural fairness in addressing the privacy concerns that may be raised when personal
information is used to develop marketing profiles. The results suggest that companies can gain
business advantage through customer retention by observing procedural fairness.
The research has some methodological limitations. As described above, the study was
based on secondary data analysis of a survey designed to measure public opinion; the original
research was not driven by any theoretical model or framework. Individual questionnaire items
were designed to be unbiased, but not necessarily to pass psychometric muster such as the need to
use multiple items to measure attitudes. The variables used in the present study were constructed
after the fact and as a result, some of them lack the psychometric properties one would expect in a
study that was under the total control of the authors. Therefore the results should be viewed with
some caution. The strengths of the research are that the data represent a national random sample
of U.S. adults rather than a convenience sample, and the results are consistent with theory.
The study found that when people were explicitly told that fairness procedures in the form
of fair information practices were observed, only prior experience distinguished individuals who
were willing to be profiled from those who were not willing. When people were not explicitly told
that fair information practices were observed, both privacy and experience distinguished the
individuals who were willing from those who were not willing to be profiled. This suggests that
procedural fairness can successfully address privacy concerns, and when fair information practices
are observed, customers will be more willing to continue in a relationship with a firm, allowing the
firm to benefit from the collection and use of data that results from the relationship. These results
are also consistent with prior research related to disclosure of personal information by Internet
users (Georgia Tech, 1996). While industry codes of conduct have called for firms to observe fair
information practices if they want to be perceived as behaving ethically, this is the first empirical
study to find that observing fair information practices is in the business interests of marketers
because building trust through fairness is one basis for attracting and retaining customers as
Figure 1 illustrates.
Since fairness appears to be a key factor in addressing privacy concerns, the results also
suggest that procedural justice is a promising theoretical basis for future research on information
privacy. Much of the organizational research on justice has focused on the fairness of both
outcomes and procedures related to personnel decisions such as layoffs, pay freezes, or the
introduction of drug testing policies (Bies, 1993). This study suggests that in addition to
understanding the relationships between organizations and their employees, this theory can also be
used to investigate the relationship between organizations and their customers. For example,
Brockner and Siegel (1996) reviewed the procedural justice literature and reported that the level
of procedural fairness influences the degree of trust in exchange relationships. Figure 1 shows
trust moderated by fair information practices as a key factor in an individual’s decision to maintain
a customer relationship with a firm where the customer will disclose large amounts of personal
information over the life of that relationship. The influence of procedural fairness on customer
loyalty, particularly if the customer experiences a negative outcome involving personal
information use without defecting, merits investigation. This is particularly important in
electronic environments, where the evolution of shared norms about fair use of personal
information often lags the capability of the technology.
The study, however, differs from much of the prior research on trust as it focuses on
impersonal trust, or trust in institutions. Much of the prior research on trust has focused on
interpersonal trust where two or more individuals have first-hand knowledge of one another as in
the case of workplace relationships, or business-to-business marketing relationships between
buyers and sales representatives (see for example Kramer & Tyler, 1996 and Morgan & Hunt,
1994). Consumer marketing relationships are usually characterized by great social distance:
customers may not deal with another person in the case of Internet commerce, or are unlikely to
know any of the people they deal with in the case of face-to-face or telephone transactions.
Because customers must depend on strangers to act on their behalf, procedural fairness
operationalized as fair information practices acts as a fiduciary norm to build trust when control
measures derived from social ties and direct contact between the customer and the firm are
unavailable, when faceless and readily interchangeable individual or organizational agents exercise
considerable delegated power or privilege on behalf of customers who can neither specify,
scrutinize, evaluate nor constrain their behavior (Shapiro, 1987; Zucker, 1986).
Because it is impossible for firms to go back to their customers for permission each time a
new use for personal information is contemplated, these findings should also have important
implications for practice. Firms that implement fair information practices and disclose these
practices to their customers can exercise latitude in how they use personal information gathered
from transaction data for marketing, without risking customer defections and the other negative
outcomes described previously, provided they ensure that their practices are consistent with what
they disclosed to their customers. However, if fair information practices are not embedded in the
work practices of all employees, there is a risk that a customer service representative or product
manager may allow personal information to be used in a way that is at odds with the customers’
norms for acceptable use, resulting in a customer, media or regulatory backlash. Creating a
“culture of privacy” within the organization clearly involves more than creating a privacy policy
based on fair information practices. A senior manager needs to champion privacy. Employees
need to be trained and retrained. Periodic audits should be conducted to ensure that practices
conform to policy. Privacy should be made part of the business case for all new uses of personal
information.
There is some evidence that not all U.S. firms have assimilated this message about the
importance of managing customer privacy issues strategically (Schwartz and Reidenberg, 1996).
The Harris-Equifax privacy surveys consistently find that the majority of consumers believe they
have lost all control over how their personal information is used by business (Harris, 1990-1994).
Smith (1994) investigated how seven different organizations in four industries responded to
growing concerns about privacy. He observed a three-phase cycle of response: drift, external
threat, and reaction. Rather than address privacy issues proactively, these firms delegated
responsibility for privacy to lower-level managers. New policies were developed only in response
to an external threat or crisis.
Further research is needed to understand how to measure privacy as an attitude. For
example, Table 2 shows an unexpected significant positive correlation between privacy and direct
marketing experience. It may be that privacy concerns are driven by experience and by
context, and that people do not develop attitudes about privacy until they have had some
experience with a particular use of personal information as prior research has suggested (Culnan,
1995). Smith, Milberg and Burke (1996) developed and validated a scale to measure individuals’
privacy concerns with corporate information practices such as sharing information with third
parties. However, there is no validated scale to measure overall privacy attitudes.
Finally, this study only considered one aspect of fairness, procedural fairness. It addressed
consumer perceptions of the fairness of information use based on what the firm disclosed to the
consumer about its information-handling procedures. The study did not address perceptions of
fairness related to the actual ways the firm subsequently reuses personal information. Distributive
fairness relates to ways a firm uses the personal information in its customer database or data
warehouse on a day-to-day basis, and whether or not the customer perceives these uses as being
fair or unfair. The justice literature suggests that even when a particular outcome is perceived
negatively, customers should be less likely to defect from a relationship if they perceive the
process by which their data were collected and used to be fair (Lind & Tyler, 1988). In order to
understand whether both procedural fairness and trust can buffer a firm from the negative
consequences portrayed in Figure 1 such as defecting when a customer perceives an outcome
negatively, the interaction among procedural fairness, outcomes and trust merits further
investigation.
Tomorrow’s emerging information environments will continue to provide greater
decentralized access to personal information. This study showed that privacy is an organizational
issue. Without an organizational policy governing fair use of personal information, organizations
face the risk that information used inappropriately by a single employee or by a single department
can have negative consequences for the entire firm. Conversely, using personal information fairly
throughout the organization can provide a source of competitive advantage by promoting flows of
customer data over time that, in today's competitive global economy, are critical in support of all
activities in a firm’s value chain.
REFERENCES
Ajzen, Icek and Fishbein, Martin. 1980. Understanding Attitudes and Predicting Social
Behavior. Englewood Cliffs: Prentice Hall.
Berry, Leonard L. 1995. On Great Service: A Framework for Action. New York: The Free
Press.
Bessen, Jim. 1993. Riding the Marketing Information Wave. Harvard Business Review, 71, 5
(September-October), 150-160.
Bies, Robert J. 1993. Privacy and Procedural Justice in Organizations. Social Justice Research,
6, 1, 69-86.
Bies, Robert J. and Tripp, Thomas M. 1996. Beyond Distrust: “Getting Even” and the Need for
Revenge. In Kramer, R.M. and Tyler, T.R., Trust in Organizations: Frontiers of Theory and
Research, 246-260. Thousand Oaks, CA: Sage.
Blattberg, Robert C. and Deighton, John. 1991. Interactive Marketing: Exploiting the Age of
Addressability. Sloan Management Review, 33, 1, 5-14.
Bloom, Paul N., Milne, George R., and Adler, Robert. 1994. Avoiding Misuse of Information
Technologies: Legal and Societal Considerations. Journal of Marketing, 58, 1 (January), 98-110.
Brockner, Joel and Siegel, Phyllis. 1996. Understanding the Interaction Between Procedural and
Distributive Justice: The Role of Trust. In Kramer, R.M. and Tyler, T.R., Trust in
Organizations: Frontiers of Theory and Research, 390-413. Thousand Oaks, CA: Sage.
Clemmer, Elizabeth C. and Schneider, Benjamin. 1996. Fair Service. In Swartz, T.A., Bowen,
D.E. and Brown, S.W., Advances in Services Marketing and Management, 109-126. Greenwich,
CT: JAI Press.
Culnan, Mary J. 1995. Consumer Awareness of Name Removal Procedures: Implications for
Direct Marketing, Journal of Direct Marketing, 9, 2, 10-19.
Culnan, Mary J. 1993. 'How Did They Get My Name'?: An Exploratory Investigation of
Consumer Attitudes Toward Secondary Information Use. MIS Quarterly, 17, 3, 341-364.
Culnan, Mary J. 1992. Processing Unstructured Organizational Transactions: Mail Handling in
the U.S. Senate, Organization Science, 3, 1, 117-137.
Derlega, Valerian J. et al. 1993. Self-Disclosure. Newbury Park: Sage Publications.
Direct Marketing Association. 1994. Fair Information Practices Manual. New York: Direct
Marketing Association.
Donaldson, Thomas and Dunfee, Thomas W. 1994. Toward a Unified Conception of Business
Ethics: Integrative Social Contracts Theory, Academy of Management Review, 19, 3 (June),
252-284.
Dowling, Melissa. 1993. When You Know Too Much, Catalog Age, October, 73-75.
Folger, Robert and Bies, Robert J. 1989. Managerial Responsibilities and Procedural Justice,
Employee Responsibilities and Rights Journal, 2, 2, 79-90.
Folger, Robert and Greenberg, Jerald. 1985. Procedural Justice: An Interpretive Analysis of
Personnel Systems. In Rowland, Kendrith M. and Ferris, Gerald R. (Eds.), Research in Personnel and
Human Resources Management, Vol. 3, 141-183. Greenwich: JAI Press.
Foxman, Ellen R. and Kilcoyne, P. 1993. Information Technology, Marketing Practice, and
Consumer Privacy, Journal of Public Policy & Marketing, 12, 1, Spring, 106-119.
Georgia Tech Research Corporation. 1996. Fifth WWW User Survey. URL:
http://www.cc.gatech.edu/gvu/user_surveys. April.
Glazer, Rashi. 1991. Marketing in an Information-Intensive Environment: Strategic Implications
of Knowledge as an Asset. Journal of Marketing, 55, 4 (October), 1-19.
Godwin, Cathy. 1991. Privacy: Recognition of a Consumer Right, Journal of Public Policy &
Marketing, 10, 1, Spring, 149-166.
Greenberg, Jerald. 1987. A Taxonomy of Organizational Justice Theories, Academy of
Management Review, 12, 1, 9-22.
Gutek, Barbara A. 1995. The Dynamics of Service. San Francisco: Jossey-Bass.
Hair, J.F., Anderson, R.E. and Tatham, R.L. 1987. Multivariate Data Analysis with Readings.
New York: Macmillan.
Louis Harris & Associates. Harris-Equifax Consumer Privacy Surveys, 1990-1994. Atlanta:
Equifax Inc.
Heskett, J. L., Sasser, W. E., and Hart, C. W. L. 1990. Service Breakthroughs: Changing the
Rules of the Game. New York: The Free Press.
Kramer, R.M. and Tyler, T.R. 1996. Trust in Organizations: Frontiers of Theory and Research.
Thousand Oaks: Sage.
Lind, E. Allan and Tyler, Tom R. 1988. The Social Psychology of Procedural Justice, New
York: Plenum Press.
Miller, Leslie. 1996. Think Nobody on the Net Knows Where You Visit? You’re Wrong.
Sacramento Bee, SC1, June 7.
Milne, George R., and Gordon, Mary Ellen. 1993. Direct Mail Privacy-Efficiency Trade-offs
Within an Implied Social Contract Framework, Journal of Public Policy & Marketing, 12, 2, Fall,
206-215.
Morgan, Robert M. and Hunt, Shelby D. 1994. The commitment-trust theory of relationship
marketing. Journal of Marketing, 58, 3 (July), 20-38.
Pine, B.J. 1993. Mass Customization. Boston: Harvard Business School Press.
Rogers, Everett M. 1983. Diffusion of Innovations. Third Edition. New York: The Free Press.
Schneider, Benjamin and Bowen, David E. 1995. Winning the Service Game. Boston: Harvard
Business School Press.
Schwartz, Paul M. and Reidenberg, Joel R. 1996. Data Privacy Law. Charlottesville: Michie.
Shapiro, Susan P. 1987. The social control of impersonal trust. American Journal of Sociology,
93, 3, 623-58.
Singer, Eleanor, Mathiowetz, Nancy A., and Couper, Mick P. 1993. The Impact of Privacy and
Confidentiality Concerns on Survey Participation, Public Opinion Quarterly, 57, Winter, 465-482.
Smith, H. Jeff. 1994. Managing Privacy: Information Technology and Corporate America.
Chapel Hill: University of North Carolina Press.
Smith, H. Jeff, Milberg, Sandra J., and Burke, Sandra J. 1996. Information Privacy: Measuring
Individuals’ Concerns About Corporate Practices. MIS Quarterly, 20, 2, 167-196.
Stone, Eugene F. and Stone, Dianna L. 1990. Privacy in Organizations: Theoretical Issues,
Research Findings, and Protection Mechanisms. In K.M. Rowland and G.R. Ferris (Eds.),
Research in Personnel and Human Resources Management, Vol. 8, 349-411. Greenwich: JAI
Press.
U.S. Department of Health, Education and Welfare, Secretary’s Advisory Committee on
Automated Personal Data Systems. 1973. Records, Computers and the Rights of Citizens.
Washington: U.S. Government Printing Office.
U.S. Information Infrastructure Task Force (IITF). 1995. Privacy and the National Information
Infrastructure: Principles for Providing and Using Personal Information. Washington:
Department of Commerce.
Westin, Alan F. 1967. Privacy and Freedom. New York: Atheneum.
Zucker, Lynne G. 1986. Production of trust: Institutional sources of economic structure, 1840-
1920. In Research in Organizational Behavior, Vol. 8, 53-111. Greenwich: JAI Press.
Table 1
Summary of Transaction Data Collected at Point-of-Sale by Transaction Processing Method

Transaction Processing Method | Representative Technology at Point-of-Sale | Transaction Data Gathered at Point-of-Sale
Manual (customer not identified) | Cash register without scanner | Date, retail location, amount of purchase
Manual (customer identified) | Cash register; credit card | Date, retail location, customer, amount of purchase
Point-of-Sale (customer not identified) | Cash register with scanner; inventory database | Date and time, retail location, items purchased, amount of purchase
Point-of-Sale (customer identified) | Cash register with scanner or mail order; credit card or customer account; inventory and customer databases | Date and time, retail location, items purchased, amount of purchase, customer
Online (customer identified) | Computer-to-computer; credit card or customer account; inventory and customer databases | Date and time, browsing patterns, items purchased, amount of purchase, customer
Table 2
Descriptive Statistics and Inter-Item Correlations for Dependent and Scaled Independent Variables

Variable | Mean | S.D. | 1. Use-No FIP | 2. Use-FIP | 3. Privacy
1. Use-No FIP (2-item scale) | 4.64 | 1.76 | -- | -- | --
2. Use-FIP | 2.59 | 1.03 | 0.63(a) | -- | --
3. Privacy (sum of 3 dichotomous variables) | 1.305 | 0.87 | -0.05 | 0.03 | --
4. Direct Marketing Frequency (DM FREQ) (sum of 5 dichotomous variables) | 1.49 | 1.08 | 0.21(a) | 0.18(a) | 0.09(b)

(a) p < .001
(b) p < .01
Table 3
Discriminators of Willingness to Be Profiled Without Fair Information Practices (USE-NO FIP)

Eigenvalue | Canonical Correlation | Wilks' Lambda | Chi-squared | df | Significance | Holdout Sample Correctly Classified
0.05 | 0.2154 | 0.954 | 20.75 | 2 | 0.00001 | 56.06%

Group Means (Centroids) on the Discriminant Function
0 (Not willing) | -0.23
1 (Willing) | 0.21

Independent Variable | Standardized Canonical Coefficients | Structure Matrix: Pooled-Within-Groups Correlations
Privacy | -0.53 | -0.46
Direct Marketing Frequency (DM FREQ) | 0.89 | 0.84
Table 4
Discriminators of Willingness to Be Profiled With Fair Information Practices (USE-FIP)

Eigenvalue | Canonical Correlation | Wilks' Lambda | Chi-squared | df | Significance | Holdout Sample Correctly Classified
0.0168 | 0.129 | 0.983 | 7.28 | 2 | 0.026 | 60.04%

Group Means (Centroids) on the Discriminant Function
0 (Not willing) | -0.17
1 (Willing) | 0.10

Independent Variable | Standardized Canonical Coefficients | Structure Matrix: Pooled-Within-Groups Correlations
Privacy | 0.14 | 0.19
Direct Marketing Frequency | 0.98 | 0.99
FIGURE 1
Privacy Leverage Point

[Figure 1 is a flow diagram of the privacy leverage point over the life of a customer relationship. The privacy calculus (the customer discloses if the benefits of disclosure exceed the risks) leads to disclosure of transaction information (pre-purchase, purchase, post-purchase) and customer information (demographics, psychographics). The firm collects and uses this personal information; procedural fairness, operationalized as fair information practices, is the privacy leverage point at which policy is matched to practice. When the customer perceives that practice matches policy, trust builds, supporting customer retention and the attraction of new customers. When the customer perceives that practice does not match policy, the result is customer defections, bad word of mouth, and an inability to attract new customers.]
1/17/98
... Three bodies of literature are instructive in framing the current study. First, there is a cross-disciplinary literature on privacy and the extent to which individuals trade off costs and benefits when deciding whether to disclose private information (Anderson and Agarwal, 2011;Culnan and Armstrong, 1999). This cognitive risk-benefit analysis, or "privacy calculus," can be influenced by trust and procedural fairness with respect to who is asking for the information, the type of information they are asking for, and the purpose for which the information will be used (Anderson and Agarwal, 2011). ...
... Even so, they decided to share their mental health information with the representatives of that system. This reality confounds some previous literature suggesting that people decide to share personal health information in part based on their trust in the people or institutions with whom information would be shared (Culnan and Armstrong, 1999;Shen et al., 2019b). Study participants did not describe trusting the criminal legal system or individual actors within that system. ...
Article
Objective: The overrepresentation of people with serious mental illnesses in the criminal legal system has spurred information-sharing initiatives to transmit information between mental health service providers and criminal legal system stakeholders with the goal of improving resources and streamlining access to care. However, no research to date has examined the perspectives of people with mental illnesses who have their information shared across these systems or the perspectives of their family members. This study examined the perspectives on mental health-criminal legal system information sharing among people with serious mental illnesses and a history of arrest, as well as their family members. Methods: Researchers interviewed 24 clients with serious mental illnesses and a history of arrest who are enrolled in a randomized, controlled trial of a police-mental health Linkage System as well as 11 of their family members. Participants were recruited and interviewed between November 2020 and February 2021. A thematic analysis was used to code and analyze all interview transcripts. Results: Study participants articulated perceived benefits and concerns around cross-system information sharing. There was strong support for information sharing in both directions, with the anticipation that such information sharing can prevent unnecessary arrest and/or incarceration, promote positive and safe interactions with criminal legal system professionals, and foster greater understanding and access to treatment. Concerns were more limited and largely related to perceived stigma around mental illnesses and the potential consequences of such stigma. Conclusions: While concerns about information sharing should be considered, study participants overwhelmingly perceived the sharing of information between mental health providers and criminal legal stakeholders as a positive intervention. Such perspectives can be understood as a pragmatic choice in the face of criminal legal system contact and additional research could guide programmatic and policy changes.
... Culnan and Armstrong's privacy calculus theory can explain these findings. The theory indicates that users who disclose personal information in exchange for some economic or social benefit first assess whether their private information will be used improperly and whether they will be adversely affected [23]. Therefore, users are willing to reveal personal information in a transactional environment if the benefits outweigh the risks. ...
Article
Full-text available
Background: China has been promoting the sharing of Electronic Health Record (EHR) data for several years. However, only a few studies have explored Chinese residents' views on sharing personal health data, and the factors that affect EHR sharing have not been fully elucidated. This study sought to explore public attitudes toward sharing EHRs and the factors that affect the sharing of personal health data among Chinese residents. Methods: A multi-stage stratified sampling design was adopted to select residents of Hunan province, yielding 932 randomly sampled responses. The investigation was carried out with a 19-item questionnaire covering demographics, willingness to share EHRs, experience with EHRs, public acknowledgment of the benefits of sharing EHRs, and public awareness of the potential risks of sharing EHRs. Results: The score for general willingness to share EHRs was 5.784 ± 2.031. Among the domain scores, willingness to share EHRs for research was 2.060 ± 0.942, whereas willingness to share anonymized EHRs for other nonmedical services was only 1.805 ± 0.877. Multiple linear regression showed that general willingness to share EHRs was related to having a healthcare-related job (β = 0.520), experience with EHRs (β = 0.192), public awareness of the potential risks of sharing EHRs (β = -0.130), and public acknowledgment of the benefits of sharing EHRs (β = 0.290). Conclusion: Chinese residents' willingness to share EHR data was not high. This willingness is influenced by several factors, primarily having a healthcare-related job, experience with EHRs, public acknowledgment of the benefits of sharing EHRs, and public awareness of the potential risks of sharing EHRs. The results provide a basis for related research and inform the design of public health strategies, such as policies to improve public acceptance of EHR sharing and to promote EHR-based public health services.
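The regression results above come from a multiple linear regression of general willingness to share EHRs on the survey measures. As a rough illustration only, not the authors' analysis, such a model could be fit in Python as follows; the file name and column names are hypothetical:

# Hypothetical sketch: multiple linear regression of willingness to share EHRs
# on survey measures. The CSV file and column names are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ehr_survey.csv")  # assumed survey export, one row per respondent

# Standardize all (numeric) variables first if standardized betas are wanted.
df_z = (df - df.mean()) / df.std()

model = smf.ols(
    "willingness ~ healthcare_job + ehr_experience + perceived_risk + perceived_benefit",
    data=df_z,
).fit()

print(model.summary())  # coefficients, standard errors, and model fit

A real replication would of course require the study's questionnaire items and scoring rather than these placeholder columns.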
... In integrating the concept of data privacy, numerous studies have used the Theory of Planned Behaviour (TPB), the Theory of Reasoned Action (TRA), and the Technology Acceptance Model (TAM) to explain consumer behaviour. These studies show that privacy concern has a positive impact on perceived risk [25] and a negative impact on trust ([21]; [77]; [64]), on online buying behaviour [57], and on the disclosure of personal information [44]. Privacy concerns exert their major effect on perceived behavioural control [29]. ...
Article
Full-text available
Advancements in the Internet of Things (IoT) and the proliferation of smart devices have raised concerns about the data privacy of online users. This study predicts the consequences of perceived data privacy risks for consumer behaviour in Lagos State, Nigeria, using an integrated entropy-weighted Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Entropy was employed to assign a weight to each criterion, and responses were then systematically ranked with TOPSIS to arrive at an inference. Of the respondents, 84.8% agree that any perceived cybersecurity threat or breach of their data privacy would stop them from proceeding with an online transaction or activity or from using a digital product. Similarly, 86.7% agree that it is critical that online businesses ask only for customer information relevant to the use of the product or service. Thus, the findings indicate that the privacy paradox of enlightened online consumers tends to diminish when they face perceived data privacy and cybersecurity risks.
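For readers unfamiliar with the method named above, the following is a minimal, generic sketch of entropy-weighted TOPSIS; the decision matrix, the benefit-criteria assumption, and all values are illustrative and are not taken from the study.

# Generic entropy-weighted TOPSIS sketch; the matrix below is made up.
import numpy as np

# Rows = alternatives, columns = criteria (all treated as benefit criteria here).
X = np.array([[0.84, 0.70, 0.55],
              [0.60, 0.86, 0.40],
              [0.30, 0.50, 0.90]], dtype=float)

# 1. Entropy weighting: criteria whose values vary more receive larger weights.
P = X / X.sum(axis=0)                               # column-wise proportions
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))   # entropy of each criterion
w = (1 - E) / (1 - E).sum()                         # divergence -> normalized weights

# 2. TOPSIS: weighted vector-normalized matrix, ideal/anti-ideal points, closeness.
V = w * (X / np.linalg.norm(X, axis=0))
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_neg / (d_pos + d_neg)                 # higher = better alternative

print(np.argsort(-closeness))                       # ranking, best first

Cost criteria, if present, would be handled by taking the column minimum as the ideal value for those columns.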
Article
Based on the case of the national electronic identification card (New eID) policy in Taiwan, this study integrated government-related components with a privacy calculus model to analyze the factors that affect personal data disclosure intention. Partial least squares structural equation modeling (PLS-SEM) was used to analyze the survey data and to explore how citizens balance the benefits and risks associated with the New eID. The results suggested that financial compensation, personalized services, and service compatibility can enhance citizens' perception of privacy-related benefits, and that this perception in turn increases their willingness to authorize use of their personal data. Moreover, the impact of government elements on citizens' willingness to authorize their personal information was also statistically supported. Citizens' perception of privacy-related risks, however, had no statistically significant effect in the model, contrary to findings from previous studies. The study contributes by extending an existing theoretical framework and discusses privacy-related questions concerning digital government.
Article
A consistent positivity bias has been found on many sharing economy platforms. Extreme positivity bias in reviews makes the review system meaningless, misleads the public, erodes users' trust in the sharing platform, and negatively influences consumers' participation on the platform. Considerable anecdotal evidence shows that privacy concerns are a barrier to providing honest negative reviews. However, research on the impact of consumers' privacy concerns on review behaviors is sparse. This study aims to fill the gap by investigating consumers' privacy concerns in writing negative reviews. Specifically, it examines the effects of two types of privacy concerns, online privacy concerns and physical privacy concerns, on the intention to provide negative reviews, as well as contingent factors (monetary incentives, venting negative feelings, and warning other consumers) that could influence this relationship. The study also examines the effectiveness of two fair information practices (i.e., a privacy policy and an anonymity feature) in mitigating privacy concerns. We evaluate our research model using a scenario-based online survey, which provides broad support for our hypotheses. The study delineates the implications of the results for both practice and research.
Article
Drawing upon the Brands as Intentional Agents Framework, this study investigates the impact of claim specificity of green advertising on consumers’ reactions toward the brand, and reveals a double-edged effect of claim specificity that depends on the dimensions of the brand’s image (perceived competence/warmth). Through two experiments, we show that the positive effect of claim specificity depends on the brand competence level, and that brand warmth interacts with brand competence, such that claim specificity can backfire and lower consumers’ attitudes and purchase intentions when brand warmth and competence are both low. This interactive effect is serially mediated by the perceived manipulative intent and perceived environmental commitment of the brand. Our research contributes to the literature on corporate social responsibility (CSR) communication by providing a nuanced picture of the effects of claim specificity, and offers guidance for companies on how to communicate more effectively about their CSR activities.
Article
The privacy calculus assumes that people weigh perceived privacy risks and benefits before disclosing personal information. So far, empirical studies have investigated the privacy calculus at the between-person level and, therefore, were not able to make statements about intrapersonal psychological processes. In the present preregistered online within-person experiment, participants (N = 485) were asked to imagine three different disclosure situations in which privacy risks were indicated by a privacy score. As personality variables, rational and intuitive privacy decision-making styles and privacy resignation were assessed. Results of a within-between random effects model showed that benefit perceptions were positively associated with self-disclosure intentions at both the between- and within-person levels. The privacy score was found to be effective in supporting users in making more privacy-aware choices (within-person level). Finally, the rational decision-making style was positively related to privacy risk perception, while intuitive decision-makers in particular can benefit from decision aids like the privacy score.
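As a concrete illustration of the within-between decomposition mentioned above, the sketch below person-mean-centers each predictor and fits a random-intercept model; the data file, column names, and model specification are assumptions for illustration, not the authors' analysis.

# Hypothetical within-between random effects sketch: split each predictor into a
# between-person part (person mean) and a within-person part (deviation from it).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("disclosure_long.csv")   # assumed long format: one row per person x situation

for var in ["perceived_benefit", "perceived_risk"]:
    df[var + "_between"] = df.groupby("person")[var].transform("mean")
    df[var + "_within"] = df[var] - df[var + "_between"]

model = smf.mixedlm(
    "disclosure_intention ~ perceived_benefit_within + perceived_benefit_between"
    " + perceived_risk_within + perceived_risk_between",
    data=df,
    groups=df["person"],                  # random intercept per participant
).fit()

print(model.summary())

The within-person coefficients describe intrapersonal change across situations, while the between-person coefficients capture stable individual differences.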
Article
With the marketization of data elements, privacy computing has emerged to resolve the tension between data circulation and its security risks. It is widely used in financial markets, medical services, government administration, and other scenarios. It provides a new way to identify personal data rights, support subject self-determination, enable safe circulation, and achieve corporate compliance, but it also brings new challenges. Examining the technical paths and application scenarios of key privacy computing technologies reveals the personal data protection problems raised by the current technology: privacy computing still carries security risks during data processing, and its industrial business model limits the benefits individuals derive from their own data. Meanwhile, the laws and regulations related to this technology also need to be refined and implemented. To address these problems, it is necessary to promote the optimization of key technologies and the formulation of standards, to build industry self-regulation rules and associations, to strengthen legislation and dedicated supervision of personal data, and to build a global governance framework for the protection of personal data rights in privacy computing. Doing so can enhance the use value of data and promote the integration of data resources and their international circulation and sharing while personal data rights remain fully protected.
Thesis
The phenomenon of young consumer resistance is a behavior of opposition, avoidance, rejection, or even rebellion. Its manifestations fall along a two-quadrant continuum, ranging from silent/individual to expressive/collective. This resistance results from firms' over-solicitation of young consumers through digital and mobile marketing tools, especially mobile and Internet advertising, which is considered the most visible element and gives rise to the phenomenon of advertising intrusion in the media most courted by young adolescents, such as the Internet. This study focuses primarily on mobile advertising on the Internet and allowed us to demonstrate not only the causal relationship between the perceived level of advertising intrusion and the resulting manifestations of adolescent consumer resistance, but also the moderating role that attitude toward advertising plays between these two variables. Our conceptual model also allowed us to identify the influence of personalization and of disregard for privacy on the perceived level of advertising intrusion. In addition, we were able to clarify the mediating effect of perceived vulnerability among these last three variables. Finally, we demonstrated the positive relationship between resistance to advertisements and the relationship with the brands that broadcast them. On the basis of a conceptual model tested on a sample of 125 adolescents from the Île-de-France population using partial least squares structural equation modeling (PLS), we found that perceived advertising intrusion is essentially associated with the excessiveness and repetition of ads. These facets annoy adolescents, but the adolescents do not show marketing resistance. However, these same adolescents confirm that their relationship with brands can be influenced by this marketing resistance.
Article
This paper analyzes the trade-off consumers face between monetary benefits and personal data disclosure. We use survey data from Norway to study respondents’ willingness to share data in exchange for a discount (WSD) and to pay to keep data private (WPP) for a list of personal data often exchanged online. Additionally, we study the effects of various consumer demographics and attitudes on WPP and WSD. We find that WSD and WPP change for different personal data. WPP is lower than WSD for low-sensitivity data, such as age. WSD increases when the data are used for personalization and when users interact with institutions they trust. WPP is higher than WSD for data personally identifying a respondent, such as pictures. Providing paid privacy protection for these data is a valuable service. Financial institutions and mobile operators are better positioned than others to offer this service. Younger respondents show a higher WPP.
Article
A variety of new information technologies have emerged that clearly can improve the efficiency and effectiveness of marketing programs. However, the use of technologies such as computer matching or automatic order-entry systems to support marketing programs also can lead to legal and societal difficulties. The authors review the types of problems that marketers could encounter when using new information technologies. Particular attention is paid to the possibilities of being charged with (1) participating in collusive information exchanges, (2) maintaining an illegal “essential facility,” (3) storing or transmitting inaccurate and harmful information, and (4) violating the privacy rights of individuals. They offer ideas about how marketing managers and researchers can reduce the chances of facing these types of problems.
Article
Relationship marketing—establishing, developing, and maintaining successful relational exchanges—constitutes a major shift in marketing theory and practice. After conceptualizing relationship marketing and discussing its ten forms, the authors (1) theorize that successful relationship marketing requires relationship commitment and trust, (2) model relationship commitment and trust as key mediating variables, (3) test this key mediating variable model using data from automobile tire retailers, and (4) compare their model with a rival that does not allow relationship commitment and trust to function as mediating variables. Given the favorable test results for the key mediating variable model, suggestions for further explicating and testing it are offered.
Article
The author presents a framework for thinking about the impact of information and information technology on marketing. The focus is on the concept of “information” or “knowledge” as both an asset to be managed and a variable to be researched. After developing a particular operationalization of the value of information in marketing contexts, which can be used to describe firms in terms of their relative levels of “information intensity,” the author presents a series of propositions examining the consequences of increasing information intensity for some key components of firm strategy and organizational structure. The concepts discussed are illustrated with a description of the transaction-based information systems that are being implemented in a variety of firms in pursuit of competitive advantage.
Article
The authors conceptualize direct mail as an implied social contract between marketers and consumers. Four attributes constitute the direct mail social contract: volume, targeting, compensation, and permission. Several proposals have been advanced in an effort to protect consumer privacy in the direct mail environment. These proposals would directly or indirectly result in changes in the levels of the social contract attributes. The authors use a conjoint study to measure the trade-offs consumers make among these attributes. The results suggest consumers want improved targeting efficiency and lower mail volume, and they are not willing to pay for these improvements. These findings suggest that consumers consider several attributes in their evaluation of direct mail social contracts. Proposals to alter the direct mail environment must consider all these attributes in concert.
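Conjoint studies of this kind estimate part-worths for attribute levels from preference ratings of hypothetical profiles. The sketch below is a toy, dummy-coded OLS version of that idea; the attribute levels, profiles, and ratings are invented for illustration and do not reproduce the authors' design.

# Toy conjoint-style part-worth estimation via dummy-coded OLS; all data invented.
import pandas as pd
import statsmodels.formula.api as smf

profiles = pd.DataFrame({
    "volume":       ["low", "low", "high", "high", "low", "high", "high", "low"],
    "targeting":    ["broad", "precise", "broad", "precise", "precise", "broad", "precise", "broad"],
    "compensation": ["none", "discount", "discount", "none", "none", "discount", "none", "discount"],
    "permission":   ["opt_out", "opt_in", "opt_in", "opt_out", "opt_in", "opt_out", "opt_in", "opt_out"],
    "rating":       [3, 8, 6, 4, 7, 5, 6, 4],   # one respondent's preference ratings
})

# Treatment-coded dummies give each level's part-worth relative to a baseline level;
# the range of part-worths within an attribute indicates that attribute's importance.
model = smf.ols(
    "rating ~ C(volume) + C(targeting) + C(compensation) + C(permission)",
    data=profiles,
).fit()

print(model.params)

An actual conjoint study would use a fractional factorial design and many respondents rather than this single hand-made profile set.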
Article
This paper presents an overview of consumer privacy that integrates the public policy and behavioral literatures. Consumer privacy is defined in terms of control over information disclosure and the environment in which a consumer transaction occurs. These two dimensions generate a 2×2 matrix, identifying four states of privacy based on control over environment, information disclosure, both, or neither. For each state, managerial and policy implications can be derived.
Article
Marketers’ use of the new information technologies has provided the opportunity for improved market segmentation and target marketing. However, the profession faces ethical conflicts because application of these technologies commonly invades consumer privacy. The authors examine the ethical dimensions of marketing practice in relation to consumer privacy. The meaning of privacy in a marketing context is explored and specific marketing threats to consumer privacy are described. After examining current and potential mechanisms to safeguard consumer privacy, the authors conclude that marketers must make an active commitment to ethical behavior in this area if restrictive legislation is to be avoided.