Information Privacy Concerns, Procedural Fairness and Impersonal Trust:
An Empirical Investigation
Mary J. Culnan
School of Business
Georgetown University
Washington, D.C. 20057-1008
(202) 687-4031 (fax)
CULNANM@GUNET.GEORGETOWN.EDU
Pamela K. Armstrong
Correspondence about the paper should be directed to Mary Culnan
Organization Science, forthcoming
Revised October 4, 1997
The authors acknowledge the helpful comments of Jeff Smith and Bob Zmud, the anonymous
reviewers and the Associate Editor, and especially Bob Bies on earlier versions of this paper. A
preliminary version was presented at the INFORMS National Meeting, May 1996.
Information Privacy Concerns, Procedural Fairness and Impersonal Trust: An Empirical Investigation

ABSTRACT
This research addresses the tensions that arise between the collection and use of personal
information that people provide in the course of most consumer transactions, and privacy. In
today’s electronic world, the competitive strategies of successful firms increasingly depend on
vast amounts of customer data. Ironically, the same information practices that provide value to
organizations also raise privacy concerns for individuals. This study hypothesized that
organizations can address these privacy concerns and gain business advantage through customer
retention by observing procedural fairness: customers will be willing to disclose personal
information and have that information subsequently used to create consumer profiles for business
use when there are fair procedures in place to protect individual privacy. Because customer
relationships are characterized by social distance, customers must depend on strangers to act on
their behalf. Procedural fairness serves as an intermediary to build trust when interchangeable
organizational agents exercise considerable delegated power on behalf of customers who cannot
specify or constrain their behavior. Our hypothesis was supported as we found that when
customers are explicitly told that fair information practices are employed, privacy concerns do not
distinguish consumers who are willing to be profiled from those who are unwilling to have their
personal information used in this way.
KEYWORDS: Information privacy, procedural justice, trust, service quality, organizational
Two converging trends, one competitive and the other technological, are driving
American business. First, to survive in the increasingly competitive global economy, companies
depend on vast quantities of information to build strong bonds with current customers, and to
attract new customers. Second, information technology (IT) continues to increase in capability
and to decline in cost, allowing information to be used in ways that were previously impossible or
economically impractical. Technology enables companies to record the details of any customer
transaction at the point-of-sale, to store vast quantities of transaction data in their data
warehouse, and to use these data to execute marketing programs with a business partner or alone.
Technology also enables the development of extensive customer databases, making it possible to
deal with customers as individuals. Instantaneous access to the customer’s history by a customer
service representative allows standardized, impersonal encounters with whoever answers the
800-number to assume the appearance of a personal relationship (Gutek, 1995). Therefore, the
marketing strategies of successful firms increasingly depend on effective use of vast amounts of
detailed customer transaction data (Bessen, 1993; Blattberg & Deighton, 1991; Glazer, 1991).
This research addresses the tensions that arise in today's increasingly electronic world
between the collection and use of personal information people provide in the course of most
consumer transactions, and individual privacy. The hypothesis of the study is that consumers will
be willing to disclose personal information and have that information subsequently used to create
profiles for marketing use when their concerns about privacy are addressed by fair procedures.
The major contribution of the research is that it provides empirical evidence that companies can
gain competitive advantage by behaving ethically.
Transaction data generated by customer contacts before, during and after the sale are a
critical resource in the increasingly competitive global economy that is moving from a paradigm of
mass production and mass merchandising to one of mass customization and personal service
(Glazer, 1991; Pine, 1993). Table 1 illustrates the data typically generated during a sales
transaction. The richness of the data varies depending upon the technology employed, ranging
from a cash register without scanning capability where essentially no customer data is recorded to
an online service where all of the customer’s “mouse tracks” are recorded (Miller, 1996).
Advances in telecommunications and database technology mean that all transaction data should be
accessible on a timely basis to everyone in the firm with a need for the data. For example, data
collected about product returns in Europe can be used by marketers in the U.S. or by a plant
manager in Mexico to address potential problems in product design or changes in customer
preferences as soon as enough products are returned; the aggregated data about these returns
make the organization aware that a problem may exist. Transaction data signaling increased
sales or the success of an advertising campaign for a target market segment or even an absence of
sales data where sales were expected serve the same signaling function to the firm. Because
these individual transactions are in reality, "messages" from customers to the firm that should be
distributed as appropriate to functions across the value chain, information systems that process
these transactions are in fact organizational information systems (Culnan, 1992). Organizations
can gain competitive advantage by collecting and using transaction data effectively (Glazer, 1991).
-- Insert Table 1 About Here --
The use of transaction data as an organizational resource can create positive or negative
outcomes to a firm, based on how the information is used. In positive terms, the use of
transaction data to yield better customer service, higher quality products, and new products that
reflect consumer preferences creates benefits for both consumers and the firm. The collection of
detailed information on consumer preferences enables firms to engage in relationship marketing
and to target offers more accurately based on their customers' specific interests (Blattberg &
Deighton, 1991; Glazer, 1991).
There is also a potential downside to the collection and use of greater amounts of
increasingly detailed personal information. Ironically, the same practices that provide value to
organizations and their customers also raise privacy concerns (Bloom, Milne & Adler, 1994).
Privacy is the ability of the individual to control the terms under which personal information is
acquired and used (Westin, 1967). Personal information is information identifiable to an
individual. As Table 1 illustrates, today’s customers leave more electronic footprints detailing
their behavior and preferences; their buying habits are easily profiled, and can be readily shared
with strangers. If the firm’s practices raise privacy concerns resulting from a perception that
personal information is used unfairly, this may lead to customers being unwilling to disclose
additional personal information, customer defections, bad word of mouth, and difficulty attracting
new customers, all of which can negatively impact the bottom line. The growth of the Internet
and other online systems also makes it possible for consumers to engage in “electronic retaliation”
if they object to a company’s practices, by “flaming” the company directly by electronic mail (Bies
& Tripp, 1996), or by posting negative public comments to a computer discussion group. Because the
text of Internet discussion groups is archived and can easily be searched by keywords such as a
company or product name, these negative comments live on long after they were posted. The
challenge to organizations, then, is to balance the competing forces of the power of information
with privacy in their dealings with their customers.
The failure to use personal information fairly or responsibly may raise two kinds of
information privacy concerns resulting from the inability of an individual to control the use of
personal information. First, an individual's privacy may be invaded if unauthorized access is
gained to personal information as a result of a security breach or an absence of appropriate
internal controls. Second, because computerized information may be readily duplicated and
shared, there is the risk of secondary use, that is information provided for one purpose may be
reused for unrelated purposes without the individual's knowledge or consent. Secondary use
includes sharing personal information with others who were not a party to the original transaction,
or the merging of transaction and demographic data to create a computerized profile of an
individual by the organization that originally collected the information (Culnan, 1993; Godwin,
1991; Foxman & Kilcoyne, 1993; Smith, Milberg & Burke, 1996). This paper addresses the
latter concern, secondary use, where organizations make deliberate choices about reuse of their
customers’ personal information, and where the customer may perceive the reuse as varying from
their expectations for fair use, done without their consent, and therefore unfair.
THEORETICAL BACKGROUND & HYPOTHESES
Privacy and Fairness
Prior research on privacy found that individuals are willing to disclose personal
information in exchange for some economic or social benefit subject to the "privacy calculus," an
assessment that their personal information will subsequently be used fairly and they will not suffer
negative consequences (Laufer & Wolfe, 1977; Milne & Gordon, 1993; Stone & Stone, 1990).
For example, a recent survey of Internet users conducted by Georgia Tech found that 78% of the
survey participants would be willing to provide demographic information about themselves to the
owner of a web site if “a statement was provided regarding how the information was used.” Only
6% of the participants would not disclose demographic information under any circumstances
(Georgia Tech, 1996).
In general, individuals are less likely to perceive information collection procedures as
privacy-invasive when a) information is collected in the context of an existing relationship, b) they
perceive that they have the ability to control future use of the information, c) the information
collected or used is relevant to the transaction, and d) they believe the information will be used to
draw reliable and valid inferences about them. See Bies (1993) and Stone and Stone (1990) for
an extensive review of this literature. While the self-disclosure literature has focused on
interpersonal relationships rather than impersonal customer relationships between individuals and
firms, its findings are consistent regarding a balancing test. People disclose personal information
to gain the benefits of a close relationship; the benefits of disclosure are balanced with an
assessment of the risks of disclosure (Derlega et al., 1993).
Creating a willingness in individuals to disclose personal information, then, requires that
organizations also view the collection of personal information as a "social contract" with their firm
where in addition to exchanging money for products or services, the customer also makes non-
monetary exchanges of personal information for intangible benefits such as higher quality service
described above (Glazer, 1991; Milne & Gordon, 1993). Customers will continue to participate
in this social contract as long as the perceived benefits exceed the risks. Developing information
practices that address this perceived risk results in positive experiences with a firm over time,
increasing the customer's perceptions that the firm can be trusted. Trust reflects a willingness to
assume the risks of disclosure (Mayer et al., 1995). Trust creates switching costs, increasing the
likelihood that the customer will continue in the relationship with the firm (Gundlach & Murphy,
1993). Managing this “second exchange” in a marketing transaction by treating customer
information fairly, then, is essential to building trust in a customer relationship.
Some industry groups have argued that privacy is a customer service issue (Direct
Marketing Association, 1994; Dowling 1993). While the literature on customer service has not
specifically addressed privacy, it has established a link between being treated fairly and customer
satisfaction (Schneider & Bowen, 1995). Berry (1995) found that customers see fairness and
service quality as “inseparable issues” -- since customer perceptions drive service quality, a
service that is perceived as being unfair will also be perceived as being lower in quality.
Conversely, the perception of fair treatment of customers has been shown to be positively related
to higher levels of satisfaction in services (Clemmer & Schneider, 1996). These authors found
this link between fair treatment and customer satisfaction to hold across all four of the service
industries they studied. They found that customers evaluate the fairness of the core service
received, the procedures used in service delivery, and the personal treatment received. Fairness is
inherent in the consumer’s basic need for justice. A violation of this need, such as violating a
psychological contract, will result in angry and disloyal customers as described above. Heskett,
Sasser and Hart (1990) note that “many differences attached to the value of a service by
customers are explained by the level of risk perceived by the customer...and the degree to which
such risks can be minimized by the service provider.” The customer who discloses personal
information runs the risk that the information will not be used fairly. Companies that establish fair
information practices and disclose these practices before collecting personal information from
customers are greatly reducing these perceived risks and the subsequent negative consequences.
One goal of offering high quality service is to keep customers coming back and to attract
new ones through positive word-of-mouth. Gutek (1995) notes that as the customer uses the
service over time (assuming he or she continues to perceive the service as fair), trust builds
between the customer and service provider. This trust is crucial, since customers often lack the
expertise or the first-hand knowledge to know whether the service provided is correct (Shapiro,
1987). If the trust is low, then the customer will likely take his or her business elsewhere. If the
customer has absolute trust in the provider, then the provider will be able to learn more about the
customer in order to serve customers better. However, absolute trust also provides a potential
opportunity for the company to exploit the customer (Gutek 1995; Shapiro, 1987). The literature
on organizational justice suggests that procedural fairness of company practices can have a major
positive impact on trust and privacy perceptions (Bies, 1993).
Procedural fairness refers to the perception by the individual that a particular activity in
which they are a participant is conducted fairly (Lind & Tyler, 1988). Factors that contribute to
perceptions of procedural fairness include providing the consumer with voice, and control over
actual outcomes (Folger & Greenberg, 1985; Lind and Tyler, 1988). Research has shown that
even if outcomes are not favorable to an individual, individuals are less likely to be dissatisfied
with unfavorable outcomes if they believe that the procedures used to derive those outcomes are
fair (Lind & Tyler, 1988; Greenberg, 1987; Folger & Bies, 1989).
For consumer marketing, fair information practices operationalize procedural fairness.
Fair information practices are procedures that provide individuals with control over the disclosure
and subsequent use of their personal information. They are global standards for the ethical use of
personal information and are at the heart of U.S. privacy laws, the privacy directive adopted by
the European Union in July 1995, and the Clinton Administration's June 1995 guidelines for
personal information use by all National Information Infrastructure participants.
At the heart of fair information practices are two concepts: notice and consent. These
two concepts are reflected in the following principles. When they provide personal information,
people have the right to know why the information is being collected, its expected uses, the steps
that will be taken to protect its confidentiality, integrity and quality, the consequences of
providing or withholding information, and any means of redress available to the individual.
People also have the right to control how their personal information will subsequently be used by
objecting to uses of their personal information when information will be collected for one purpose
and used for other purposes. Fair information practices also state that personal information
should not be used in ways that are incompatible with the individual's understanding of how it will
be used unless there is a compelling public interest for such use (U.S. IITF, 1995). Fair
information practices, therefore, mediate the privacy concerns raised by disclosure and subsequent
use of personal information by empowering the individual with control and voice, even if people
do not choose to invoke the procedures, as well as an assurance that the firm will adhere to a set
of principles that most customers find acceptable (Folger & Bies, 1989; Folger & Greenberg,
1985; Greenberg, 1987; Lind & Tyler, 1988; Mayer et al., 1995; Shapiro, 1987; Stone & Stone,
1990). Fair information practices, then, make the "deal" with the consumer fair (Donaldson &
Dunfee, 1994; Milne & Gordon, 1993).
In marketing, a central element of fair information practices is the ability of individuals to
remove their names from mailing lists. The 1990 Equifax survey found the majority of the public
believes it is acceptable for direct marketers to use names and addresses on a mailing list if people
who do not want to receive mail offers can remove their names from the mailing list. Culnan
(1995) found that people who were aware of name removal procedures had a lower concern for
privacy than those who were not aware of these procedures, suggesting that awareness of fairness
procedures can address the privacy concerns associated with disclosure and use of personal
information (Greenberg, 1987).
Procedural fairness, then, can create a "privacy leverage point" for organizations by
providing an opportunity for the firm to promote the customer disclosure of personal information
by disclosing its information policies to the customer or prospective customer, provided its
subsequent practices are consistent with the policy. Figure 1 illustrates the role of procedural
fairness in building trust over the life of a customer relationship where customers must rely on
“strangers” to protect their interests (Shapiro, 1987). Over the life of a customer relationship,
firms potentially gather large amounts of personal information about their customers. Some of
this is gathered directly as a result of each transaction; other information is acquired from third
parties, allowing the firm to develop an extensive profile for each customer. Data warehouse
technology allows the firm to perform sophisticated analyses on massive amounts of transaction
data and to develop marketing programs for individual customers. As described previously, other
online technologies make these data available for use throughout the organization, independent of
physical location. Based on their subsequent experiences with the firm, customers make an
assessment of whether or not they perceive that their personal information was used consistent
with their expectations. If the information was used consistently, the customer is likely to stay in
the relationship. If not, the customer may be likely to defect and/or to engage in bad word of
mouth (Morgan & Hunt, 1994). The privacy leverage point, then, provides an intervention
opportunity for firms to build trust with their customers as they collect and use personal
information, therefore making customers willing to disclose personal information by minimizing
the risks of disclosure to these individuals.
-- Insert Figure 1 about here--
This study hypothesizes that procedural fairness can strike a balance between the
competing forces of privacy and information use. When taken together, the literature on privacy,
self-disclosure and procedural justice suggest that procedural fairness, defined here as fair
information practices, can mediate the privacy concerns that often arise when customer
transaction data and other personal information are merged to create profiles for use in targeted
marketing. However, this relationship has not been tested empirically. This study will test two
hypotheses that address the relationship between procedural fairness and privacy:
H1: When people are not explicitly told that fair procedures will be employed for
managing their personal information, people with a greater concern for privacy will be
less willing to have their personal information used for profiling.
H2: When people are explicitly told that fair procedures will be employed for managing
their personal information, privacy concerns will not distinguish people who are unwilling
to be profiled from those who are willing to have their personal information used for profiling.
Prior experience should also influence an individual’s willingness to be profiled.
Individuals who have prior experience with direct or targeted marketing are more likely to
understand the benefits of profiling and also to be aware of fair information practices as a
means for exercising control over their personal information (Culnan, 1995). Those with
experience should also have developed a degree of trust in the process as Figure 1 illustrates, and
should be more willing to have their personal information used in this way. For these individuals,
profiling is likely to be perceived as compatible with their existing values and past experiences
(Rogers, 1983). This suggests:
H3: Prior experience with targeted marketing will distinguish people who are willing to
have their personal information used for profiling from those who are not willing.
The context for the research is the use of personal information gathered from prospective
subscribers to interactive home information services and the willingness of consumers to allow
personal information to be used in targeted advertising based on customer profiles compiled by
the interactive service providers. The study is based on a fresh analysis of data from the 1994
Harris Survey on Interactive Services, Consumers and Privacy. The survey was designed and
sponsored by Privacy and American Business. Data were collected by Louis Harris &
Associates by telephone from a random sample of 1,000 U.S. adults age 18 and older.
Willingness to have one’s personal information used to develop profiles for targeted
marketing was measured by two variables. First, willingness to have personal information
used for profiling without being explicitly told that fair information practices would be employed
was measured by two four-point Likert-scaled items ranging from “not at all interested” to “very
interested”:
• How interested would you be in having this type of advertising [based on
subscriber profiling] presented to you from time to time, on your computer or T.V.
screen? (mean = 2.32, s.d. = 0.97); and
• The interactive services provider could also ask you to check off your interests
and activities from a list on the TV or computer screen, so that special offers could
be made to you on-screen. How interested do you think you would be in doing
that? (mean = 2.32, s.d. = 1.01).
The two variables were factor analyzed using a varimax rotation. Both items loaded
unambiguously on a single factor and were combined to form a “use without fair information
practices” (USE-NO FIP) scale (r = .58, p < .001; Cronbach alpha = .74).
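The reliability statistic reported above can be sketched as follows. The response data here are simulated for illustration, since the underlying Harris survey responses are not reproduced in this paper; only the formula for Cronbach's alpha is standard.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    scale_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / scale_var)

# Simulated 4-point Likert responses for two correlated items
rng = np.random.default_rng(0)
item1 = rng.integers(1, 5, size=500)
item2 = np.clip(item1 + rng.integers(-1, 2, size=500), 1, 4)
alpha = cronbach_alpha(np.column_stack([item1, item2]))
```

As a consistency check on the reported figures: for a two-item scale, alpha reduces to 2r / (1 + r), and with r = .58 this gives approximately .73, in line with the .74 reported above (the small difference reflects rounding).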
Willingness to have personal information used in profiling if individuals were
explicitly told that fair information practices would be observed (USE-FIP) was measured by a single item,
“If the rules and safeguards I’ve just mentioned were adopted by companies offering interactive
services, how interested would you be in subscribing to a system that used subscriber profile
marketing?” (mean = 2.59, s.d. = 1.03).
The rules and safeguards comprising fair information practices were read to the survey
participants before the “USE-FIP” question was administered and were defined as follows:
• Before you decided to subscribe, the service provider would inform you fully about
the collection of subscriber profile information and how it would be used;
• You could control the types of products and services advertised to you as well as
when and for how long advertising messages would be displayed on the screen;
• You could indicate what information in your subscriber profile could be used for
marketing and what couldn’t; and
• You could review the information in your subscriber profile and correct any errors.
For each item, respondents were asked the importance of the practice on a 4-point Likert scale.
The four items were factor analyzed using a varimax rotation. All four items loaded
unambiguously on a single factor (Cronbach alpha = .88).
The questionnaire items were administered in the following order. The USE-NO FIP
items were administered first. Second, respondents were asked about the importance of fair
information practices. Finally, the USE-FIP item was administered.
In an absolute sense, individuals surrender a measure of privacy whenever they disclose
any personal information. Therefore, taking overt steps to restrict the disclosure of personal
information should reflect a concern for diminished privacy that would result from disclosure.
The first independent variable, behavior that indicated a concern for privacy, was operationalized
using three dichotomous variables that measured an individual taking steps to restrict the
disclosure of personal information. The first two measured an individual’s unwillingness to
disclose personal information to others. The third measured whether an individual had ever been
unwilling to allow personal information to be reused for targeted marketing by another organization:
• Have you ever refused to give information to a business or company because you
thought it was not really needed, or was too personal, or haven’t you? (Yes = 70%)
• Does your household have an unlisted or unpublished telephone number? (Yes = 23%)
• Have you ever asked an organization, such as a publication or business with which you
have a relationship, to take your name off of any list they gave out to other
organizations for sending you mail offers, or not? (Yes = 33%).
The three items were summed to form a scale (mean = 1.30, s.d. = 0.87).
The second independent variable, prior experience with direct marketing, was measured by
a series of dichotomous variables. The subjects were asked whether or not they had:
• Bought something from a catalog or brochure sent to your residence or workplace
(Yes = 65%)
• Bought something offered to you by a telephone call to your residence or workplace
(Yes = 14%)
• Bought something from a TV home shopping club (Yes = 13%)
• Called a toll-free or 800 number to order something (Yes = 46%)
• Used a 900 number that charged for information, products or services (Yes = 4%).
The responses to these items were summed to form a single variable, Direct Marketing
Experience (DMEXP: mean = 1.49, s.d. = 1.08).
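Because each scale is a simple sum of dichotomous items, its mean should roughly equal the sum of the item "yes" proportions. A quick arithmetic check, under the assumption that the items were summed without weighting:

```python
# Item "yes" proportions reported above
privacy_items = [0.70, 0.23, 0.33]            # refused info, unlisted number, name removal
dmexp_items = [0.65, 0.14, 0.13, 0.46, 0.04]  # catalog, phone, TV club, 800 number, 900 number

privacy_mean = sum(privacy_items)  # close to the reported scale mean of 1.30
dmexp_mean = sum(dmexp_items)      # close to the reported scale mean of 1.49
```

The implied means (about 1.26 and 1.42) sit slightly below the reported 1.30 and 1.49, a gap consistent with the item percentages having been rounded for publication.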
Table 2 contains descriptive statistics for the dependent and independent variables.
Table 2 also contains correlations for the two dependent variables and the independent variables.
-- Insert Table 2 About Here --
Prior research has also established that individuals vary in their concern for privacy,
based on their demographics and life's experiences. For example, the Harris-Equifax Surveys
found African Americans, Hispanics, women, and less educated people to be most concerned
about privacy. Singer et al. (1993) found that both demographics and concern for privacy were
significantly related to return rates for the 1990 census; however, privacy concerns varied for
white and African-American respondents. Culnan (1995) found that demographics, experience
with direct marketing, and concern for privacy significantly discriminated among individuals who
were versus those who were not aware of name removal procedures. These results suggest that
here, an individual’s willingness to have their personal information used for targeted marketing is
also likely to reflect both their demographics and experience. However, prior research also
suggests that these demographic differences are captured by both attitudinal and behavioral
variables (Ajzen & Fishbein, 1980). Therefore, no additional demographic or experience
variables were used in this study.
The hypotheses were tested using a discriminant analysis which examined the joint
significance of the relationships in the hypothesized model (Hair et al., 1987). Discriminant
analysis is the appropriate statistical technique for determining if significant differences exist
between the profiles of two groups defined by a categorical dependent variable.
Discriminant analysis was used because the first dependent variable, USE-NO
FIP, was not normally distributed. When the two items were summed to form the USE-NO FIP
scale, the resulting distribution consisted only of even values. The use of a dichotomous variable
to operationalize the dependent variable, willingness/unwillingness to be profiled, was appropriate
given the hypotheses to be tested.
Both dependent variables were subsequently recoded as dichotomous variables. For both
variables, observations with values above the mean were coded as one, and those below the mean
were coded as zero. A separate discriminant analysis was performed for each of the two
dependent variables using the two independent variables: privacy and direct marketing
experience. These results are shown in Table 3 and Table 4.
-- Insert Table 3 and Table 4 About Here --
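The analysis pipeline, from mean-split recoding of the dependent variable through holdout classification, can be sketched as below. The data are simulated (the effect sizes, sample size, and 70/30 holdout split are illustrative assumptions, not the survey's values), and a hand-rolled two-group Fisher discriminant stands in for the statistical package used in the original analysis.

```python
import numpy as np

def fisher_lda(X: np.ndarray, y: np.ndarray):
    """Two-group Fisher linear discriminant: weight vector and classification cutoff."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-group covariance matrix
    S = (np.cov(X0, rowvar=False) * (len(X0) - 1)
         + np.cov(X1, rowvar=False) * (len(X1) - 1)) / (len(X) - 2)
    w = np.linalg.solve(S, m1 - m0)
    cutoff = ((X0 @ w).mean() + (X1 @ w).mean()) / 2  # midpoint of projected group means
    return w, cutoff

rng = np.random.default_rng(1)
n = 1000
privacy = rng.integers(0, 4, size=n).astype(float)  # 0-3 privacy-behavior scale
dmexp = rng.integers(0, 6, size=n).astype(float)    # 0-5 direct-marketing experience

# Hypothetical latent willingness, then a mean-split dichotomous dependent variable
latent = 0.5 * dmexp - 0.4 * privacy + rng.normal(size=n)
willing = (latent > latent.mean()).astype(int)

X = np.column_stack([privacy, dmexp])
train, hold = slice(0, 700), slice(700, None)       # simple holdout split
w, cutoff = fisher_lda(X[train], willing[train])
hit_rate = ((X[hold] @ w > cutoff).astype(int) == willing[hold]).mean()
```

The hit rate on the holdout slice plays the role of the "percent correctly classified" figures reported in Tables 3 and 4, and the components of w correspond to the discriminant weights whose structure correlations are interpreted in the text.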
Table 3 summarizes the results for the first dependent variable, use without being
explicitly told that fair information practices would be observed (USE-NO FIP). Both of the
independent variables are significant discriminators of those who are willing versus those who are
not willing to be profiled without fair information practices. The overall results for the
discriminant function were also significant (Chi-Square = 20.75, 2 d.f., p < 0.001). The function
correctly classified 56.1% of the cases in the holdout sample. This is greater than the
proportional chance criterion of 50.8%, which is calculated as C_pro = p^2 + (1 - p)^2, where p is the
proportion of people in group 1 and (1 - p) is the proportion of people in group 2. We can
interpret the structure correlations as factor loadings to determine the variables that make the
greatest contribution to the discriminant function; generally, the variables with correlations that
exceed |0.30| are considered significant (Hair et al., 1987). Both of the independent variables have structure correlations exceeding this threshold.
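The proportional chance criterion C_pro = p^2 + (1 - p)^2 can be computed directly as a check on the 50.8% benchmark; the 56.3/43.7 group split used here is inferred from that reported value, not stated in the paper.

```python
def proportional_chance(p: float) -> float:
    """Expected hit rate from chance assignment with group proportions p and 1 - p."""
    return p ** 2 + (1 - p) ** 2

# An approximately 56.3/43.7 group split reproduces the reported 50.8% criterion
c_pro = proportional_chance(0.563)
```

Note that the criterion bottoms out at 50% for equal groups and rises as the split becomes more uneven, which is why the comparison matters when group sizes differ.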
Table 4 summarizes the results for the discriminant analysis using the second dependent
variable, willingness to be profiled after being told explicitly that fair information practices would
be observed (USE-FIP). Here, only the direct marketing experience variable is a significant
discriminator, with a structure correlation exceeding |0.30|. The overall function is
also significant (Chi-Square = 7.28, 2 d.f., p < 0.05). The discriminant function correctly
classified 60.0% of the holdout sample, which is greater than the proportional chance criterion.
These results provide support for the three hypotheses. The first two hypotheses
postulated that privacy would distinguish people who are willing to be profiled from those who
are unwilling to be profiled only when people were not told that fair information practices would
be observed. As hypothesized, the privacy variable was a significant discriminator only in the first
discriminant analysis. In the second discriminant analysis, the privacy variable is not a significant
discriminator, providing support for the hypothesis that privacy concerns can be addressed by
explicitly telling customers that the company observes fair information practices.
The third hypothesis postulated that prior experience would also discriminate between
people who are willing to be profiled and those who are not. In both discriminant functions, the direct marketing
experience variable is significant, providing support for this hypothesis. People who are willing to
be profiled for marketing purposes are more likely to have prior experience with direct marketing
than people who are not willing.
Effective use of customer information to support activities across an organization’s value
chain has become a competitive necessity. The key challenge to organizations is to balance the
competitive advantages provided by the use of this information with the privacy concerns that use
of personal information may raise among their customers. This study examined the role of
procedural fairness in addressing the privacy concerns that may be raised when personal
information is used to develop marketing profiles. The results suggest that companies can gain
business advantage through customer retention by observing procedural fairness.
The research has some methodological limitations. As described above, the study was
based on secondary data analysis of a survey designed to measure public opinion; the original
research was not driven by any theoretical model or framework. Individual questionnaire items
were designed to be unbiased, but not necessarily to pass psychometric muster such as the need to
use multiple items to measure attitudes. The variables used in the present study were constructed
after the fact and as a result, some of them lack the psychometric properties one would expect in a
study that was under the total control of the authors. Therefore the results should be viewed with
some caution. The strengths of the research are that the data represent a national random sample
of U.S. adults rather than a convenience sample, and the results are consistent with theory.
The study found that when people were explicitly told that fairness procedures in the form
of fair information practices are observed, only prior experience distinguished individuals who
were willing to be profiled from those who were not willing. When people were not explicitly told
that fair information practices were observed, both privacy and experience distinguished the
individuals who were willing from those who were not willing to be profiled. This suggests that
procedural fairness can successfully address privacy concerns, and when fair information practices
are observed, customers will be more willing to continue in a relationship with a firm, allowing the
firm to benefit from the collection and use of data that results from the relationship. These results
are also consistent with prior research related to disclosure of personal information by Internet
users (Georgia Tech, 1996). While industry codes of conduct have called for firms to observe fair
information practices if they want to be perceived as behaving ethically, this is the first empirical
study to find that observing fair information practices is in the business interests of marketers
because building trust through fairness is one basis for attracting and retaining customers as
Figure 1 illustrates.
Since fairness appears to be a key factor in addressing privacy concerns, the results also
suggest that procedural justice is a promising theoretical basis for future research on information
privacy. Much of the organizational research on justice has focused on the fairness of both
outcomes and procedures related to personnel decisions such as layoffs, pay freezes, or the
introduction of drug testing policies (Bies, 1993). This study suggests that in addition to
understanding the relationships between organizations and their employees, this theory can also be
used to investigate the relationship between organizations and their customers. For example,
Brockner and Siegel (1996) reviewed the procedural justice literature and reported that the level
of procedural fairness influences the degree of trust in exchange relationships. Figure 1 shows
trust moderated by fair information practices as a key factor in an individual’s decision to maintain
a customer relationship with a firm where the customer will disclose large amounts of personal
information over the life of that relationship. The influence of procedural fairness on customer
loyalty, particularly if the customer experiences a negative outcome involving personal
information use without defecting, merits investigation. This is particularly important in
electronic environments, where the evolution of shared norms about fair use of personal
information often lags the capability of the technology.
The study, however, differs from much of the prior research on trust as it focuses on
impersonal trust, or trust in institutions. Much of the prior research on trust has focused on
interpersonal trust where two or more individuals have first-hand knowledge of one another as in
the case of workplace relationships, or business-to-business marketing relationships between
buyers and sales representatives (see, for example, Kramer & Tyler, 1996, and Morgan & Hunt,
1994). Consumer marketing relationships are usually characterized by great social distance:
customers may not deal with another person in the case of Internet commerce, or are unlikely to
know any of the people they deal with in the case of face-to-face or telephone transactions.
Because customers must depend on strangers to act on their behalf, procedural fairness
operationalized as fair information practices acts as a fiduciary norm to build trust when control
measures derived from social ties and direct contact between the customer and the firm are
unavailable, and when faceless and readily interchangeable individual or organizational agents
exercise considerable delegated power or privilege on behalf of customers who can neither
specify, scrutinize, evaluate, nor constrain their behavior (Shapiro, 1987; Zucker, 1986).
Because it is impossible for firms to go back to their customers for permission each time a
new use for personal information is contemplated, these findings should also have important
implications for practice. Firms that implement fair information practices and disclose these
practices to their customers can exercise latitude in how they use personal information gathered
from transaction data for marketing without risking customer defections and the other negative
outcomes described previously, provided they ensure that their practices are consistent with what
they disclosed to their customers. However, if fair information practices are not embedded in the
work practices of all employees, there is a risk that a customer service representative or product
manager may allow personal information to be used in a way that is at odds with the customers’
norms for acceptable use, resulting in a customer, media or regulatory backlash. Creating an
organizational culture based on fair information practices requires sustained commitment. A
senior manager needs to champion privacy. Employees need to be trained and retrained.
Periodic audits should be conducted to ensure that practices conform to policy. Privacy should
be made part of the business case for all new uses of personal information.
There is some evidence that not all U.S. firms have assimilated this message about the
importance of managing customer privacy issues strategically (Schwartz and Reidenberg, 1996).
The Harris-Equifax privacy surveys consistently find that the majority of consumers believe they
have lost all control over how their personal information is used by business (Harris, 1990-1994).
Smith (1994) investigated how seven different organizations in four industries responded to
growing concerns about privacy. He observed a three-phase cycle of response: drift, external
threat, and reaction. Rather than address privacy issues proactively, these firms delegated
responsibility for privacy to lower-level managers. New policies were developed only in response
to an external threat or crisis.
Further research is needed to understand how to measure privacy as an attitude. For
example, Table 2 shows an unexpected significant positive correlation between privacy and direct
marketing experience. Privacy concerns may be driven by experience and by context; people
may not develop attitudes about privacy until they have had some experience with a particular
use of personal information, as prior research has suggested (Culnan,
1995). Smith, Milberg and Burke (1996) developed and validated a scale to measure individuals’
privacy concerns with corporate information practices such as sharing information with third
parties. However, there is no validated scale to measure overall privacy attitudes.
Finally, this study only considered one aspect of fairness, procedural fairness. It addressed
consumer perceptions of the fairness of information use based on what the firm disclosed to the
consumer about its information-handling procedures. The study did not address perceptions of
fairness related to the actual ways the firm subsequently reuses personal information. Distributive
fairness relates to ways a firm uses the personal information in its customer database or data
warehouse on a day to day basis, and whether or not the customer perceives these uses as being
fair or unfair. The justice literature suggests that even when a particular outcome is perceived
negatively, customers should be less likely to defect from a relationship if they perceive the
process by which their data were collected and used to be fair (Lind & Tyler, 1988). In order to
understand whether both procedural fairness and trust can buffer a firm from the negative
consequences portrayed in Figure 1 such as defecting when a customer perceives an outcome
negatively, the interaction among procedural fairness, outcomes and trust merits further investigation.
Tomorrow’s emerging information environments will continue to provide greater
decentralized access to personal information. This study showed that privacy is an organizational
issue. Without an organizational policy governing fair use of personal information, organizations
face the risk that information used inappropriately by a single employee or by a single department
can have negative consequences for the entire firm. Conversely, using personal information fairly
throughout the organization can provide a source of competitive advantage by promoting flows of
customer data over time that, in today's competitive global economy, are critical in support of all
activities in a firm’s value chain.
Ajzen, Icek and Fishbein, Martin. 1980. Understanding Attitudes and Predicting Social
Behavior. Englewood Cliffs: Prentice Hall.
Berry, Leonard L. 1995. On Great Service: A Framework for Action. New York: The Free Press.
Bessen, Jim. 1993. Riding the Marketing Information Wave. Harvard Business Review, 71, 5
Bies, Robert J. 1993. Privacy and Procedural Justice in Organizations. Social Justice Research,
6, 1, 69-86.
Bies, Robert J. and Tripp, Thomas M. 1996. Beyond Distrust: “Getting Even” and the Need for
Revenge. In Kramer, R.M. and Tyler, T.R., Trust in Organizations: Frontiers of Theory and
Research, 246-260. Thousand Oaks, CA: Sage.
Blattberg, Robert C. and Deighton, John. 1991. Interactive Marketing: Exploiting the Age of
Addressability. Sloan Management Review, 33, 1, 5-14.
Bloom, Paul N., Milne, George R., and Adler, Robert. 1994. Avoiding Misuse of Information
Technologies: Legal and Societal Considerations. Journal of Marketing, 58, 1 (January), 98-110.
Brockner, Joel and Siegel, Phyllis. 1996. Understanding the Interaction Between Procedural and
Distributive Justice: The Role of Trust. In Kramer, R.M. and Tyler, T.R., Trust in
Organizations: Frontiers of Theory and Research, 390-413. Thousand Oaks, CA: Sage.
Clemmer, Elizabeth C. and Schneider, Benjamin. 1996. Fair Service. In Swartz, T.A., Bowen,
D.E. and Brown, S.W., Advances in Services Marketing and Management, 109-126. Greenwich,
CT: JAI Press.
Culnan, Mary J. 1995. Consumer Awareness of Name Removal Procedures: Implications for
Direct Marketing, Journal of Direct Marketing, 9, 2, 10-19.
Culnan, Mary J. 1993. 'How Did They Get My Name'?: An Exploratory Investigation of
Consumer Attitudes Toward Secondary Information Use. MIS Quarterly, 17, 3, 341-364.
Culnan, Mary J. 1992. Processing Unstructured Organizational Transactions: Mail Handling in
the U.S. Senate, Organization Science, 3, 1, 117-137.
Derlega, Valerian J. et al. 1993. Self-Disclosure. Newbury Park: Sage Publications.
Direct Marketing Association. 1994. Fair Information Practices Manual. New York: Direct
Marketing Association.
Donaldson, Thomas and Dunfee, Thomas W. 1994. Toward a Unified Conception of Business
Ethics: Integrative Social Contracts Theory, Academy of Management Review, 19, 3 (June).
Dowling, Melissa. 1993. When You Know Too Much, Catalog Age, October, 73-75.
Folger, Robert and Bies, Robert J. 1989. Managerial Responsibilities and Procedural Justice,
Employee Responsibilities and Rights Journal, 2, 2, 79-90.
Folger, Robert and Greenberg, Jerald. 1985. Procedural Justice: An Interpretive Analysis of
Personnel Systems. In Rowland, Kendrith M. & Ferris, Gerald R. Research in Personnel and
Human Resources Management. Vol 3, 141-183. Greenwich: JAI Press.
Foxman, Ellen R. and Kilcoyne, P. 1993. Information Technology, Marketing Practice, and
Consumer Privacy, Journal of Public Policy & Marketing, 12, 1, Spring, 106-119.
Georgia Tech Research Corporation. 1996. Fifth WWW User Survey. URL:
Glazer, Rashi. 1991. Marketing in an Information-Intensive Environment: Strategic Implications
of Knowledge as an Asset. Journal of Marketing, 55, 4 (October), 1-19.
Godwin, Cathy. 1991. Privacy: Recognition of a Consumer Right, Journal of Public Policy &
Marketing, 10, 1, Spring, 149-166.
Greenberg, Jerald. 1987. A Taxonomy of Organizational Justice Theories, Academy of
Management Review, 12, 1, 9-22.
Gutek, Barbara A. 1995. The Dynamics of Service. San Francisco: Jossey-Bass.
Hair, J.F., Anderson, R.E. and Tatham, R.L. 1987. Multivariate Data Analysis with Readings.
New York: Macmillan.
Louis Harris & Associates. 1990-1994. Harris-Equifax Consumer Privacy Surveys. Atlanta: Equifax Inc.
Heskett, J. L., Sasser, W. E., and Hart, C. W. L. 1990. Service Breakthroughs: Changing the
Rules of the Game. New York: The Free Press.
Kramer, R.M. and Tyler, T.R. 1996. Trust in Organizations: Frontiers of Theory and Research.
Thousand Oaks: Sage.
Lind, E. Allan and Tom R. Tyler. 1988. The Social Psychology of Procedural Justice, New
York: Plenum Press.
Miller, Leslie. 1996. Think Nobody on the Net Knows Where You Visit? You’re Wrong.
Sacramento Bee, SC1, June 7.
Milne, George R., and Gordon, Mary Ellen. 1993. Direct Mail Privacy-Efficiency Trade-offs
Within an Implied Social Contract Framework, Journal of Public Policy & Marketing, 12, 2, Fall.
Morgan, Robert M. and Hunt, Shelby D. 1994. The commitment-trust theory of relationship
marketing. Journal of Marketing, 58, 3 (July), 20-38.
Pine, B.J. 1993. Mass Customization. Boston: Harvard Business School Press.
Rogers, Everett M. 1983. Diffusion of Innovations. Third Edition. New York: The Free Press.
Schneider, Benjamin and Bowen, David E. 1995. Winning the Service Game. Boston: Harvard
Business School Press.
Schwartz, Paul M. and Reidenberg, Joel R. 1996. Data Privacy Law. Charlottesville: Michie.
Shapiro, Susan P. 1987. The social control of impersonal trust. American Journal of Sociology,
93, 3, 623-58.
Singer, Eleanor, Mathiowetz, Nancy A., and Couper, Mick P. 1993. The Impact of Privacy and
Confidentiality Concerns on Survey Participation, Public Opinion Quarterly, 57, Winter, 465-482.
Smith, H. Jeff. 1994. Managing Privacy: Information Technology and Corporate America.
Chapel Hill: University of North Carolina Press.
Smith, H. Jeff, Milberg, Sandra J., and Burke, Sandra J. 1996. Information Privacy: Measuring
Individuals’ Concerns About Corporate Practices. MIS Quarterly, 20, 2, 167-196.
Stone, Eugene F. and Stone, Dianna L. 1990. Privacy in Organizations: Theoretical Issues,
Research Findings, and Protection Mechanisms. In K.M. Rowland and G.R. Ferris (Eds),
Research in Personnel and Human Resources Management, Vol. 8, 349-411. Greenwich: JAI Press.
U.S. Department of Health, Education and Welfare, Secretary’s Advisory Committee on
Automated Personal Data Systems. 1973. Records, Computers and the Rights of Citizens.
Washington: U.S. Government Printing Office.
U.S. Information Infrastructure Task Force (IITF). 1995. Privacy and the National Information
Infrastructure: Principles for Providing and Using Personal Information. Washington:
Department of Commerce.
Westin, Alan F. 1967. Privacy and Freedom. New York: Atheneum.
Zucker, Lynne G. 1986. Production of trust: Institutional sources of economic structure, 1840-
1920. In Research in Organizational Behavior, Vol 8, 53-111. Greenwich: JAI Press.
Table 1
Summary of Transaction Data Collected at Point-of-Sale
by Transaction Processing Method

Manual (customer not identified) -- Cash register without scanner -- Data gathered: date,
retail location, amount of purchase
Manual (customer identified) -- Cash register; credit card -- Data gathered: date, retail
location, customer, amount of purchase
Point-of-Sale (customer not identified) -- Cash register with scanner -- Data gathered: date
and time, retail location, items purchased, amount of purchase
Point-of-Sale (customer identified) -- Cash register with scanner or mail order; credit card or
customer account; inventory and customer databases -- Data gathered: date and time, retail
location, items purchased, amount of purchase, customer
Online (customer identified) -- Computer-to-computer; credit card or customer account;
inventory and customer databases -- Data gathered: date and time, browsing patterns, items
purchased, amount of purchase, customer
Table 2
Descriptive Statistics and Inter-Item Correlations
For Dependent and Scaled Independent Variables

VARIABLE                                  MEAN    S.D.    1. Use-No FIP   2. Use-FIP   3. Privacy
1. Use - No FIP
2. Use - FIP                              2.59    1.03    0.63(a)
3. Privacy
   (Sum of 3 dichotomous items)           1.305   0.87    -0.05           0.03
4. Direct Marketing Frequency (DM FREQ)
   (Sum of 5 dichotomous items)           1.49    1.08    0.21(a)         0.18(a)      0.09(b)

(a) p < .001
(b) p < .01
Table 3
Discriminators of Willingness to Be Profiled
Without Fair Information Practices (USE-NO FIP)

Eigenvalue   Canonical     Wilks'   Chi-squared   df   Significance   Holdout Sample
             Correlation   Lambda                                     Correctly Classified
0.05         .2154         0.954    20.75         2    0.00001        56.06%

Group                                   Discriminant Function Group Means (Centroids)
(Not willing)

Independent Variable                    Standardized Coefficient   Structure Correlation
Privacy                                 -0.53                      -0.46
Direct Marketing Frequency (DM FREQ)
Table 4
Discriminators of Willingness to be Profiled
With Fair Information Practices (USE - FIP)

Eigenvalue   Canonical     Wilks'   Chi-squared   df   Significance   Holdout Sample
             Correlation   Lambda                                     Correctly Classified
0.0168       0.129         0.983    7.28          2    0.026          60.04%

Group                                   Discriminant Function Group Means (Centroids)

Independent Variable                    Standardized Coefficient   Structure Correlation
Privacy                                 0.14                       0.19
Direct Marketing Frequency              0.98                       0.99
Figure 1
Privacy Leverage Point

[Figure: The customer's privacy calculus (the customer discloses information if the benefits of
disclosure exceed the risks) yields customer transaction information (pre-purchase, purchase,
post-purchase) and customer information (demographics, psychographics). Procedural fairness,
in the form of fair information practices, is the leverage point. When the firm's use of the
information matches its disclosed policy, the customer perceives fair use, supporting customer
retention and attracting new customers; when use does not match policy, the customer perceives
unfair use, leading to customer defections and bad word of mouth.]