PADS: Privacy-Preserving Auction Design for Allocating Dynamically Priced Cloud Resources

Jinlai Xu, Balaji Palanisamy, Yuzhe Tang, S.D. Madhu Kumar

School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
Email: {jinlai.xu, bpalan}@pitt.edu
Department of EECS, Syracuse University, Syracuse, NY, USA
Email: ytang100@syr.edu
Department of CSE, National Institute of Technology Calicut, India
Email: madhu@nitc.ac.in
Abstract—With the rapid growth of Cloud Computing technologies, enterprises are increasingly deploying their services in the Cloud. Dynamically priced cloud resources such as Amazon EC2 Spot Instances provide an efficient mechanism for cloud service providers to trade resources with potential buyers using an auction mechanism. In dynamically priced cloud resource markets, cloud consumers can buy resources at a significantly lower cost than statically priced cloud resources such as the on-demand instances in Amazon EC2. While dynamically priced cloud resources make it possible to maximize datacenter resource utilization and minimize costs for consumers, such auction mechanisms unfortunately achieve these benefits only at a significant cost of private information leakage. In an auction-based mechanism, the private information includes the demands of the consumers, which can allow an attacker to understand the consumers' current computing requirements and perhaps even infer their workload patterns. In this paper, we propose PADS, a strategy-proof differentially private auction mechanism that allows cloud providers to privately trade resources with cloud consumers in such a way that the individual bidding information of the cloud consumers is not exposed by the auction mechanism. We demonstrate that PADS achieves differential privacy and approximate truthfulness guarantees while maintaining good performance in terms of revenue gains and allocation efficiency. We evaluate PADS through extensive simulation experiments, which demonstrate that in comparison to traditional auction mechanisms, PADS achieves relatively high revenues for cloud providers while guaranteeing the privacy of the participating consumers.
I. INTRODUCTION
With the rapid growth of Cloud Computing technologies, enterprises are increasingly deploying their services in the Cloud. The evolution of cloud computing and datacenter-enabled technologies has significantly revolutionized the way in which users and businesses use computing resources today. The cumulative market for cloud computing services is expected to increase to more than $100 billion in 2017 [1]. Dynamically priced cloud resources such as Amazon EC2 Spot Instances [2] provide an effective mechanism for cloud service providers to trade resources with potential buyers using an auction mechanism. In dynamically priced cloud resource markets, cloud consumers can buy resources at a cost much lower than statically priced cloud resources such as the on-demand instances in Amazon EC2 [3]. Spot instances can reduce the cost of computing resources by 50% to 90% if the applications running on them can tolerate temporary interruptions during job execution [4]. Thus, Spot Instances are highly recommended for applications such as data mining and batch processing that do not have real-time processing requirements [4].
While dynamically priced cloud resources make it possible to maximize datacenter resource utilization and minimize costs for consumers, such auction mechanisms unfortunately achieve these benefits only at a significant cost of private information leakage. In an auction-based mechanism, private information includes when and which consumers have higher demands for which types of Virtual Machine (VM). Such information can allow an attacker to understand the consumers' current computing requirements and perhaps even infer their workload patterns. For instance, if a consumer makes a higher bid for a spot instance, an adversary may be able to infer that the requested resources are more important to the consumer than the other resources he/she has requested. Such adversaries may also infer other business secrets by combining the bidding information with other background knowledge, and may disrupt the fair operation of the auction market through false-name bids [5].
Protecting consumer privacy in an auction-based resource allocation market is an important task. Earlier works have addressed how to protect privacy in auctions [6]-[9] such that the auction achieves the desired outcomes without revealing the private information of the bidders. While there has been work on privacy-aware auctions in stock and spectrum distribution [10]-[15], privacy-preserving auction design for dynamically priced cloud resource allocation has not yet received attention from the research community. In this paper, we propose a privacy-preserving auction design mechanism called PADS (Privacy-preserving Auction Design for Spot and dynamically priced cloud resources) that protects the private information in the bids throughout the auction process through differential privacy [16] guarantees.
Concretely, this paper makes the following contributions.

1) To the best of our knowledge, the work presented in this paper is the first to design a differentially private and strategy-proof solution for allocating dynamically priced cloud resources through an auction mechanism.

2) We formally model and analyze the problem of dynamically priced cloud resource allocation as a sealed-bid auction problem and design two near optimal privacy-preserving mechanisms. We demonstrate that both mechanisms achieve differential privacy guarantees and hold the strategy-proof property.

3) We propose PADS-ADP, an $(\epsilon, \delta)$-differentially private and truthful auction mechanism. Unlike existing solutions, PADS-ADP simultaneously guarantees differential privacy and provides the desired features of an auction mechanism. We improve on the performance of PADS-ADP with an enhanced mechanism called PADS-DP, which is $\epsilon$-differentially private and approximately truthful. The low computational complexity of both PADS-ADP and PADS-DP makes it possible to compute the auction outcome for low-latency real-time scheduling requests.

4) We experimentally evaluate PADS (both PADS-ADP and PADS-DP) through an extensive simulation study. Our evaluation results show that PADS achieves performance close to that of traditional auction mechanisms such as VCG [17]-[19] while providing the desired differential privacy guarantees in the auction process.
The remainder of this paper is organized as follows. In Section II, we briefly review related work in the areas of auction mechanism design and differential privacy. Section III presents the problem model and reviews a few key concepts related to auction design and differential privacy. Section IV presents the design of a near optimal privacy-preserving mechanism and its properties. In Sections V and VI, we introduce the designs of PADS-ADP and PADS-DP and analyze their properties. Section VII presents our evaluation results. Finally, we conclude the paper in Section VIII.
II. RELATED WORK
In recent years, privacy has become a significant concern to users, as technological innovations often require users' private information for processing. Two kinds of privacy-preserving techniques have been studied extensively in the literature: k-anonymity [20] and differential privacy [16]. Many privacy-preserving solutions have been proposed based on the differential privacy concept, which include two major techniques: the first is represented by perturbation algorithms [21], [22], which add noise to protect the privacy of an individual's contribution to a statistical output; the second includes exponential mechanisms [9], [23], which are used when noise addition is not a reasonable approach to guaranteeing privacy. Privacy-preserving auctions have also been studied extensively in recent years, and some of this work has studied the problem in the context of differential privacy.
of using differential privacy. For the theoretical aspect, there
are many research efforts trying to add new properties to
the previous basic auction mechanisms to make them more
efficient and effective, for example, McSherry and Talwar
Bid Price
-- Spot Price
Time slot Spot Instance can run
Time
Price ($)
Figure 1. Spot Instance Illustration (Spot Instance Type: g2.8xlarge, Bid
Price: $1.95, Date: June 30, 2017)
[9] presents the basic idea of protecting the privacy of the
bids using an exponential mechanism and demonstrate several
critical properties including the approximate truthfulness and
differential privacy guarantees of the mechanism. Huang and
Kannan [23] proposed a nearly optimal differentially private
auction mechanism using the Gibbs Measure which is also
known as Boltzmann distribution in chemistry and physics [24]
to achieve a more effective optimization for the revenue and
social welfare compared to the techniques presented in [9].
From an application perspective, spectrum allocation auctions are the most studied in the area of privacy-preserving auctions. Several research efforts implement privacy-preserving mechanisms in spectrum auctions, such as [25], [26]. These techniques primarily address the spectrum resource allocation problem in a privacy-preserving manner. To the best of our knowledge, there is no prior work addressing the privacy leakage problem in dynamically priced resource allocation in clouds. The PADS privacy-preserving auction mechanism proposed in this work is the first significant effort towards addressing the privacy-preserving cloud resource allocation problem using privacy-aware auctions that provide both differential privacy guarantees and efficiency in terms of resource allocation.
III. CONCEPTS AND MODEL
In this section, we present the problem description for privacy-preserving auction design for dynamically priced resource allocation and introduce the basic concepts of mechanism design and differential privacy.
A. Problem Model

We model the dynamically priced cloud resource allocation problem as a sequence of auctions over discrete time slots. In each time slot, an auction mechanism determines the winning bids and allocates the resources to the winners. As shown in Figure 1, which illustrates an example of a Spot Instance, when the bid of a user is higher than the spot price, the user can run his/her job on a set of virtual machines (VMs) during the time slot. In the cloud resource allocation auction, we refer to the users as bidders or buyers and the Cloud Service Providers (CSPs) as the sellers. The entity performing the auction mechanism is referred to as the auctioneer. There are $K$ types of VMs used as the goods in the auctions. In every time slot $t$, for each type-$k$ VM, the sealed-bid auction mechanism decides which users can run their jobs during the time slot $t$. For simplicity, we model each round of the auction in a time slot assuming only one type of VM is used as the good in the auction. The objective of the seller (CSP) is to allocate the VMs to the users so as to maximize its revenue. We assume that the CSP (seller) has $m$ type-$k$ VMs. There are $n$ users that want to use the type-$k$ VM, and each user $i \in N$ bids for the VMs with a bid value $b_i$, where $N = \{1, 2, ..., n\}$ denotes the set of bidders. The bids are represented by a vector $\vec{b} = \{b_1, b_2, ..., b_n\}$. Each user has a per-VM valuation, which is private to the user, represented by $\vec{v} = \{v_1, v_2, ..., v_n\}$. Depending on the bidding strategy, a bid may or may not be equal to the user's true valuation of the good. The outcome of the auction is determined by the auction mechanism and can be represented by a vector $\vec{x} = \{x_1, x_2, ..., x_n\}$, where $x_i$ is a binary indicator of whether bid $b_i$ wins. The payments are represented by $\vec{p} = \{p_1, p_2, ..., p_n\}$, where $p_i$ is the payment of user $i$ to rent the type-$k$ VM in the current time slot. The objective of each user is to maximize his/her utility, which can be represented by the following utility function:

$$u_i = (v_i - p_i)\, x_i \qquad (1)$$

where $u_i$ is the utility of user $i$. The seller (CSP) also wants to maximize its revenue in the auction mechanism. The revenue of the seller (CSP) can be represented by the sum of the payments:

$$REV = \sum_{i}^{n} p_i x_i \qquad (2)$$

In a privacy-preserving auction mechanism, one objective is to prevent the inference of a bidder's participation from the outcome of the auction. In addition, private information such as the bid value $b_i$ and the user's true valuation of the goods $v_i$ must also be protected from inference through the outcome of the auction.
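To make the model concrete, the following minimal Python sketch encodes the notation above. The names Outcome, utility, and revenue are illustrative and not from the paper; this is a sketch of the bookkeeping only, not of any mechanism.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    x: list  # x[i] = 1 if bid b_i wins, else 0
    p: list  # p[i] = payment charged to user i

def utility(v, out):
    # Eq. (1): u_i = (v_i - p_i) * x_i for each user i
    return [(vi - pi) * xi for vi, pi, xi in zip(v, out.p, out.x)]

def revenue(out):
    # Eq. (2): REV = sum_i p_i * x_i
    return sum(pi * xi for pi, xi in zip(out.p, out.x))
```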
B. Auction Design Concepts

Before introducing the proposed PADS auction mechanism, we review some important concepts related to auction mechanism design and privacy-preserving mechanisms.
Mechanism Design
We first introduce Dominant Strategy [27] from game
theory that forms a fundamental solution concept for auction
mechanism designs.
Definition 1. (Dominant Strategy [28]) Strategy $s_i$ is player $i$'s dominant strategy in a game if, for any strategy $s_i' \neq s_i$ and any strategy profile of the other players $s_{-i}$:

$$u_i(s_i, s_{-i}) \geq u_i(s_i', s_{-i}). \qquad (3)$$
The concept of dominant strategy is related to truthfulness. In
an auction, truthfulness means that revealing truthful informa-
tion is the dominant strategy for every bidder.
Definition 2. (Truthfulness) "Truthfulness" is also called strategy-proofness or incentive compatibility in the auction literature. In game theory, an asymmetric game where players have private information is said to be strategy-proof (SP) if it is a weakly-dominant strategy for every player to reveal his/her private information.
If an auction mechanism is truthful, then the bidders will
tend to bid with their true valuation of the products. This is a
powerful feature for auction mechanism design as it ensures
that both the buyers and sellers can get maximum utility from
the auction without cheating.
Formally, we can state the truthfulness property as

$$E[u_i(s_i, s_{-i})] \geq E[u_i(s_i', s_{-i})]$$

where $u_i$ is the utility of bidder $i$ and $s_i$ is the strategy in which bidder $i$ bids the true value of the product. Here, $s_{-i}$ represents the strategies of the bidders other than bidder $i$, and $s_i'$ represents any strategy other than $s_i$. The inequality states that bidding the true value gives the bidder the highest utility compared to any other strategy. If it holds for all bidders, the auction mechanism is truthful.
However, exact truthfulness sometimes turns out to be
too strict as a solution, and as an alternative, approximate
truthfulness, or γ-truthfulness [9], [29] has been proposed in
the literature.
Definition 3. ($\gamma$-truthfulness) An auction is $\gamma$-truthful in expectation, or $\gamma$-truthful for short, if and only if, for any bidding strategy $s_i' \neq s_i$ and any bid strategies of the other bidders $s_{-i}$:

$$E[u_i(s_i, s_{-i})] \geq E[u_i(s_i', s_{-i})] - \gamma \qquad (4)$$

where $\gamma$ is a small positive constant.
In auction mechanism design, there is another property, called Individual Rationality, guaranteeing that no bidder loses utility in the auction. It is defined as follows:

Definition 4. (Individual Rationality) An auction is individually rational if and only if $u_i \geq 0$ holds for every bidder $i \in N$.
Privacy Concepts and Definitions
We next introduce the concepts and definitions related to
privacy-preserving mechanism designs.
Definition 5. (Differential Privacy) Differential privacy is a privacy-preserving notion that protects an individual user's contribution to a dataset. In a differentially private auction mechanism, the actions of a trusted auctioneer can be modeled as a randomized algorithm $A$. A randomized algorithm $A$ is $\epsilon$-differentially private if, for all datasets $D_1$ and $D_2$ that differ on a single element (i.e., the bid of one person) and all $S \subseteq Range(A)$:

$$\Pr[A(D_1) \in S] \leq e^{\epsilon} \times \Pr[A(D_2) \in S]$$

In the literature, a relaxed definition of differential privacy has also been introduced.

Definition 6. (Approximate Differential Privacy [21]) A randomized algorithm $A$ is $(\epsilon, \delta)$-differentially private if, for all datasets $D_1$ and $D_2$ that differ on a single element (i.e., the bid of one person) and all $S \subseteq Range(A)$:

$$\Pr[A(D_1) \in S] \leq e^{\epsilon} \times \Pr[A(D_2) \in S] + \delta$$
The exponential mechanism is a key technique for achieving differential privacy in privacy-preserving auction design. An exponential mechanism builds a probability distribution over the outputs based on an exponential function to guarantee $\epsilon$-differential privacy [9].
Exponential Mechanism

The exponential mechanism [9] is a general technique for constructing differentially private algorithms over an arbitrary range $\mathcal{R}$ of outcomes and any objective function $F(b, r)$. The goal of the exponential mechanism is to randomly map a set of $n$ inputs, each from a domain $\mathcal{D}$, to some output in a range $\mathcal{R}$ while protecting individual privacy. It is defined as follows:

Definition 7. (Exponential Mechanism [9]) For any function $F : (\mathcal{D}^n \times \mathcal{R}) \rightarrow \mathbb{R}$ and a base measure $\mu$ over $\mathcal{R}$, define:

$$\mathcal{E}_F^{\epsilon}(b) := \text{Choose } r \text{ with probability} \propto e^{\epsilon F(b, r)} \times \mu(r), \quad b \in \mathcal{D}^n,\ r \in \mathcal{R} \qquad (5)$$

The exponential mechanism guarantees $2\epsilon\Delta$-differential privacy, where $\Delta = \bar{b} - \underline{b}$ is an upper bound on the difference between the feasible outcomes of two data sets that differ in a single data item. An immediate theorem can be derived as in [29]:

Theorem 1. When used to select an output $r \in \mathcal{R}$, the exponential mechanism $\mathcal{E}_F^{\epsilon}(b)$ yields $2\epsilon\Delta$-differential privacy. Let $\mathcal{R}_{OPT}$ denote the subset of $\mathcal{R}$ achieving $F(b, r) = \max_r F(b, r)$; then the exponential mechanism ensures that:

$$\Pr\left[F(b, \mathcal{E}_F^{\epsilon}(b)) < \max_r F(b, r) - \frac{\ln(|\mathcal{R}| / |\mathcal{R}_{OPT}|)}{\epsilon} - \frac{t}{\epsilon}\right] \leq e^{-t} \qquad (6)$$
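As a concrete illustration, the following Python sketch samples from the distribution of Definition 7 over a finite outcome set with a uniform base measure $\mu$; by Theorem 1, selecting outputs this way yields $2\epsilon\Delta$-differential privacy when the scores of neighboring inputs differ by at most $\Delta$. The function name and signature are illustrative assumptions, not the paper's implementation.

```python
import math
import random

def exponential_mechanism(b, outcomes, F, epsilon):
    # Pr[r] proportional to exp(epsilon * F(b, r)), i.e., Eq. (5) with a
    # uniform base measure mu; random.choices normalizes the weights.
    weights = [math.exp(epsilon * F(b, r)) for r in outcomes]
    return random.choices(outcomes, weights=weights, k=1)[0]
```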
In our problem, we want to design an auction mechanism
that can allocate the VMs to the users based on the bids
submitted by the users and achieve the strategy-proof property
while preserving privacy and maximizing the CSP revenue.
In the next section, we propose a near optimum mechanism
for solving the privacy-preserving auction design problem for
cloud resource allocation.
IV. STRAIGHT-FORWARD EXPONENTIAL MECHANISM
We first propose a straight-forward near optimum privacy-preserving mechanism based on the exponential mechanism proposed in [9] and [23]. This straight-forward mechanism solves the privacy-preserving dynamically priced resource allocation problem directly with the exponential mechanism. In an auction mechanism for allocating the VMs, $\vec{b}$ represents the bid profile and $p_i$ represents bidder $i$'s payment. The objective of the auction is to maximize the revenue of the CSP, which can be calculated as:
Objective:

$$\max\ REV = \sum_{i}^{n} p_i x_i \qquad (7)$$

Subject to:

$$\sum_{i}^{n} x_i \leq m \qquad (8)$$

As the exponential mechanism proposed in [23] achieves approximate truthfulness, the expected revenue is equal to the expected surplus:

$$E[REV] = E\left[\sum_{i}^{n} v_i x_i\right] \qquad (9)$$
The logic behind this is intuitive: in a truthful auction, where bidders tend to bid their true valuations, the expected revenue of the auction equals the expected surplus.
With the above objective function and the expected surplus analysis, the problem of designing a privacy-preserving and near optimum resource allocation auction can be solved through an exponential mechanism. Based on the principles outlined in [23], the privacy-preserving auction mechanism assigns each possible outcome a probability proportional to the objective function, represented by the revenue $F(\vec{b}, \vec{x}) = \sum_{i}^{n} b_i x_i$. Then, based on these probabilities, the mechanism chooses the outcome. The payment for each winner is assigned using a VCG-like mechanism [23]. The detailed auction works as follows:

1) Each bidder $i$ submits its bid $b_i$;

2) The mechanism chooses an outcome satisfying Eq. (8) with probability proportional to $e$ raised to the power of the objective function:

$$\Pr[\vec{x}] \propto \exp\left(\epsilon \sum_{i}^{n} b_i x_i\right) \qquad (10)$$

3) The payment for each winning bid $b_i$ is assigned by the mechanism proposed in [23].
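The following hedged Python sketch illustrates step 2 by brute force: it enumerates every feasible allocation (at most $m$ winners, per Eq. (8)) and samples one according to Eq. (10). Enumerating all $\binom{n}{m}$ subsets is exactly the intractability identified in Theorem 4 below, so this is only viable for very small $n$; the VCG-like payment rule of [23] is omitted.

```python
import itertools
import math
import random

def straightforward_allocation(bids, m, epsilon):
    n = len(bids)
    # All feasible winner sets with at most m winners (the source of the
    # combinatorial blow-up discussed in Theorem 4).
    outcomes = [frozenset(c) for k in range(m + 1)
                for c in itertools.combinations(range(n), k)]
    # Eq. (10): Pr[x] proportional to exp(epsilon * sum of winning bids).
    weights = [math.exp(epsilon * sum(bids[i] for i in s)) for s in outcomes]
    winners = random.choices(outcomes, weights=weights, k=1)[0]
    return [1 if i in winners else 0 for i in range(n)]
```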
A. Analysis

Next, we analyze the properties of the near optimum privacy-preserving resource allocation mechanism. In particular, we analyze the revenue guarantee, the truthfulness property, and the tractability of the mechanism. Since the expected revenue is equivalent to the expected surplus, we analyze only the expected revenue.
[Figure 2. PADS architecture: each user client contains a bid generator and a job scheduler driven by user input (job type, etc.); user bids are submitted to the PADS resource allocator at the cloud provider over secure tunnels; the mechanism produces differentially private outputs (winners and payments) and publishes a price history for reference, from which an adversary cannot infer information about individual users.]
Theorem 2. The expected surplus

$$E\left[\sum_{i}^{n} v_i x_i\right] \qquad (11)$$

is maximized when the winning bids are chosen by the near optimum privacy-preserving mechanism $\mathcal{E}_F^{\epsilon}(\vec{b})$.
Theorem 3. The near optimum privacy-preserving mechanism is truthful, individually rational, and $\epsilon$-differentially private.
The proofs of the above-mentioned properties are discussed
in [23].
Theorem 4. The exponential mechanism that chooses the outcome according to the objective function to decide the winners for allocating the VMs is intractable.

Proof. The choosing function needs to evaluate all possible outcomes under the constraint defined in Eq. (8). The computational complexity is that of choosing $m$ out of $n$, i.e., $O\big(\binom{n}{m}\big)$. As the number of VMs (represented by $m$) is usually large and the number of users (represented by $n$) is also large, this makes the straight-forward near optimum privacy-preserving mechanism intractable for the dynamically priced resource allocation problem.
V. PADS-ADP: PRIVACY-PRESERVING AUCTION DESIGN WITH APPROXIMATE DIFFERENTIAL PRIVACY GUARANTEES
The straight-forward near optimum privacy-preserving mechanism introduced in the previous section provides near optimum revenue and $\epsilon$-differential privacy. However, its computation cost is prohibitively expensive for practical use. In this section, we propose PADS-ADP, an alternate privacy-preserving auction design that uses an iterative winner decision algorithm and a payment scheme that forces bidders to bid their true valuations. We prove that PADS-ADP provides $(\epsilon, \delta)$-differential privacy while achieving the truthfulness property in the auction.
In PADS, the auctions are conducted in discrete time slots as described in Section III. As shown in Figure 2, similar to the Spot Instances provided by Amazon EC2, PADS assumes that each user has a client which takes care of the bidding and job scheduling. The user submits the maximum price he/she is willing to pay for the VMs, and the user client bids for the required VMs in every time slot with this maximum price. When the bid wins in a time slot, the user client schedules the jobs onto the VMs allocated to it. As shown in Figure 2, PADS protects the private information of the users in the auction, including who the winners are and how much they bid and pay for the resources, from adversaries. When an auction performed using PADS determines the winners and payments, adversaries cannot infer users' information from the published price data (provided by the CSP for potential customers' reference).
A. Design Details

We now describe the detailed design of PADS-ADP. The iterative auction mechanism works in a sequence of four steps: (i) it first calculates the probability distribution over the set of current bids, $R$; (ii) it then randomly selects a bid from the set as the winner of the current round based on the probabilities calculated in the first step; (iii) next, it calculates the payment for the winner; and (iv) finally, it removes the winner from the set $R$ and checks the end condition. These four steps repeat until there are no bids left in the set $R$ or the VMs in the resource pool have been exhausted.
The winners of the auction are determined as follows.
(i) Calculation of Probability Distribution: First, we calculate the probability distribution over the bids, which is used in the exponential mechanism. The difference between the near optimum solution and PADS-ADP is that, instead of choosing the result from all possible outcomes, PADS-ADP chooses one winner in each iteration with probability proportional to $e$ raised to the power of the bid value:

$$\Pr[W \leftarrow W \cup \{i\}] = \frac{\exp(\epsilon' b_i)}{\sum_{j \in R} \exp(\epsilon' b_j)} \qquad (12)$$

where $W = \{w_1, w_2, ...\}$ is the set of winners' bids, $R$ is the current set of bids, and:

$$\epsilon' = \frac{\epsilon}{(e - 1)\, \Delta \ln(e / \delta)} \qquad (13)$$
(ii) Winner Selection: After calculating the probability of each bid in the set $R$ being chosen as a winner, we obtain the probability vector $\vec{Pr} = \{Pr_1, Pr_2, ...\}$. PADS-ADP randomly selects a bid $b_i \in R$ as the winner of the current round according to the probability $Pr_i$ of each bid $i \in R$.
(iii) Payment Scheme: After selecting the winner of the current iteration, PADS-ADP calculates the payment for it. Since we desire the truthfulness property from the auction mechanism, the payment scheme is crucial to making the auction truthful. Here, we use the results developed in [30] to design the payment scheme. The relevant theorem can be stated as follows:

Theorem 5. A mechanism is truthful in expectation if and only if, for any bidder $i$ and any fixed choice of bids by the other bidders $\vec{b}_{-i}$:
1) $\vec{x}$ is monotonically nondecreasing in $b_i$;
2) $p_i = b_i y_i(\vec{b}) - \int_0^{b_i} y_i(z)\, dz$, where $y_i(z)$ is the probability that bidder $i$ is selected as a winner when his bid is $z$.

Regarding the first condition of Theorem 5, since we already apply the exponential mechanism in the winner selection step, the probability of choosing bidder $i$ is proportional to $\exp(\epsilon' b_i)$. Here, $\epsilon'$ is a positive constant and the exponential function $\exp(\cdot)$ is monotonically increasing, so the first condition is satisfied. Next, we set the payment scheme as

$$p_i = b_i y_i(\vec{b}) - \int_0^{b_i} y_i(z)\, dz \qquad (14)$$

to satisfy the second condition. Thus, the payment scheme and the winner selection algorithm above together make the auction truthful.
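The following Python sketch illustrates the payment of Eq. (14) under a simplifying assumption: $y_i(z)$ is taken to be the single-round winning probability of Eq. (12) with the other bids held fixed, and the integral is approximated by a Riemann sum. In the full iterative mechanism, $y_i(z)$ would account for every round in which bidder $i$ can be selected, so this is a sketch for intuition rather than the paper's exact computation.

```python
import math

def win_prob(z, other_bids, eps_prime):
    # One-round probability (Eq. (12)) that a bid of value z is selected,
    # with the competing bids held fixed.
    w = math.exp(eps_prime * z)
    return w / (w + sum(math.exp(eps_prime * b) for b in other_bids))

def payment(b_i, other_bids, eps_prime, steps=1000):
    # Eq. (14): p_i = b_i * y_i(b_i) - integral_0^{b_i} y_i(z) dz,
    # with the integral approximated by a left Riemann sum.
    dz = b_i / steps
    integral = sum(win_prob(k * dz, other_bids, eps_prime) * dz
                   for k in range(steps))
    return b_i * win_prob(b_i, other_bids, eps_prime) - integral
```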
(iv) Post Processing: After calculating the payment, the indicator of the winning bid $b_i$ is set to 1 ($x_i = 1$), and the bid $b_i$ is removed from the set of current bids $R$. Then, PADS-ADP checks whether all the VMs have been allocated or all the bids have been removed from $R$. If either of these two conditions is met, the auction ends.
Algorithm 1: PADS-ADP Mechanism
Input: Type of the VM: $k$; number of VMs: $m$; buy bids: $\vec{b} = \{b_1, b_2, ...\}$
Output: Auction decision: $\vec{x} = \{x_1, x_2, ...\}$; payment scheme: $\vec{p} = \{p_1, p_2, ...\}$
1: Initialize the probability vector $\vec{Pr} = \{Pr_1, Pr_2, ...\}$, where each $Pr_i$ is set by Eq. (12); set $R = \vec{b}$ and the number of winners $w = 0$
2: while $R \neq \emptyset$ and $w < m$ do
3:   for all $b_i \in R$ do
4:     $Pr_i = \exp(\epsilon' b_i) / \sum_{j \in R} \exp(\epsilon' b_j)$
5:   end for
6:   Randomly select $i$ according to the probability vector $\vec{Pr}$
7:   Set $x_i = 1$
8:   $p_i = b_i y_i(\vec{b}) - \int_0^{b_i} y_i(z)\, dz$; remove $b_i$ from $R$
9:   $w = w + 1$
10: end while
The overall algorithm is shown in Algorithm 1. The time complexity of Algorithm 1 is $O(nm)$, where $n$ is the number of users that bid for the VMs and $m$ is the number of VMs that can be allocated. The worst-case time complexity is dominated by the accumulated cost of computing the probabilities $\vec{Pr}$, which is $n + (n-1) + ... + (n-m+1) = \frac{(2n - m + 1)m}{2} = nm - \frac{m^2 - m}{2}$. Generally, the number of users is larger than the number of VMs ($n > m$); therefore, the worst-case time complexity of the algorithm is $O(nm)$.
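Putting the four steps together, a minimal Python sketch of Algorithm 1 follows; it reuses the payment() helper from the sketch above, and the names are illustrative assumptions rather than the paper's code.

```python
import math
import random

def pads_adp(bids, m, eps_prime):
    # Iteratively pick one winner per round with Pr proportional to
    # exp(eps' * b_i) (Eq. (12)) and charge it the Eq. (14) payment.
    remaining = list(range(len(bids)))
    x = [0] * len(bids)
    p = [0.0] * len(bids)
    while remaining and sum(x) < m:
        weights = [math.exp(eps_prime * bids[i]) for i in remaining]
        i = random.choices(remaining, weights=weights, k=1)[0]
        x[i] = 1
        p[i] = payment(bids[i],
                       [bids[j] for j in remaining if j != i],
                       eps_prime)  # payment() from the previous sketch
        remaining.remove(i)
    return x, p
```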
After proposing the mechanism, we analyze and prove the
differential privacy guarantee for PADS-ADP.
Theorem 6. For any $\delta \leq 1/2$, PADS-ADP is $(\epsilon, \delta)$-differentially private.
Proof. Let $\vec{b}$ and $\vec{b}'$ be two input bid vectors that differ in a single bidder $s$'s bid. We show that PADS-ADP preserves bid privacy, including not revealing the order in which the bidders are chosen, for an arbitrary winner selection sequence $W = W' = \{w_1, w_2, ..., w_l\}$ of arbitrary length $l$ for $\vec{b}$ and $\vec{b}'$ respectively. The steps of the proof are inspired by the results presented in [25]. We consider the relative probability of the PADS-ADP results for the given bid inputs $\vec{b}$ and $\vec{b}'$:

$$\frac{\Pr[W = \{w_1, w_2, ..., w_l\}]}{\Pr[W' = \{w_1, w_2, ..., w_l\}]} = \prod_{i=1}^{l} \frac{\exp(\epsilon' b_{w_i}) / \sum_{j \in (N - \pi_i)} \exp(\epsilon' b_j)}{\exp(\epsilon' b'_{w_i}) / \sum_{j \in (N - \pi_i)} \exp(\epsilon' b'_j)} = \prod_{i=1}^{l} \frac{\exp(\epsilon' b_{w_i})}{\exp(\epsilon' b'_{w_i})} \cdot \prod_{i=1}^{l} \frac{\sum_{j \in (N - \pi_i)} \exp(\epsilon' b'_j)}{\sum_{j \in (N - \pi_i)} \exp(\epsilon' b_j)} \qquad (15)$$

where $\pi_i = \{w_1, w_2, ..., w_i\}$. If $b_s > b'_s$, the first product is at most $\exp(\epsilon' \Delta)$:

$$\exp(\epsilon' (b_s - b'_s)) \leq \exp(\epsilon' \Delta) \qquad (16)$$

and the second product is at most 1. If $b_s < b'_s$, the first product is at most 1. Then, we have

$$\frac{\Pr[W = \{w_1, w_2, ..., w_l\}]}{\Pr[W' = \{w_1, w_2, ..., w_l\}]} \leq \prod_{i=1}^{l} \frac{\sum_{j \in (N - \pi_i)} \exp(\epsilon' b'_j)}{\sum_{j \in (N - \pi_i)} \exp(\epsilon' b_j)} = \prod_{i=1}^{l} \frac{\sum_{j \in (N - \pi_i)} \exp(\epsilon' (b'_j - b_j)) \exp(\epsilon' b_j)}{\sum_{j \in (N - \pi_i)} \exp(\epsilon' b_j)} = \prod_{i=1}^{l} E_{j \in (N - \pi_i)}[\exp(\epsilon' (b'_j - b_j))] \qquad (17)$$

Note that for all $\beta \leq 1$, we have $\exp(\beta) \leq 1 + (e - 1)\beta$. Therefore, for all $\epsilon' \leq 1$, we have

$$\prod_{i=1}^{l} E_{j \in (N - \pi_i)}[\exp(\epsilon' (b'_j - b_j))] \leq \prod_{i=1}^{l} E_{j \in (N - \pi_i)}[1 + (e - 1)\epsilon' (b'_j - b_j)] \leq \exp\left((e - 1)\epsilon' \sum_{i=1}^{l} E_{j \in (N - \pi_i)}[b'_j - b_j]\right) \qquad (18)$$

If $\sum_{i=1}^{l} E_{j \in (N - \pi_i)}[b'_j - b_j] \leq \Delta \ln(e / \delta)$, we have

$$\frac{\Pr[W = \{w_1, w_2, ..., w_l\}]}{\Pr[W' = \{w_1, w_2, ..., w_l\}]} \leq \exp((e - 1)\epsilon' \Delta \ln(e / \delta)) = \exp(\epsilon) \qquad (19)$$

By Lemmas A.1 and A.2 in [29], we have $\Pr\left[\sum_{i=1}^{l} E_{j \in (N - \pi_i)}[b'_j - b_j] > \Delta \ln(e / \delta)\right] \leq \delta$. Thus, the theorem follows.
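For intuition, the following hypothetical Monte Carlo check (not part of the paper) estimates the probability of each winner-selection sequence under two neighboring bid vectors and reports the worst observed ratio; Theorem 6 predicts this ratio exceeds $e^{\epsilon}$ only with probability at most $\delta$. Sampling noise means small trial counts can be misleading, so treat it purely as a sanity check.

```python
import math
import random
from collections import Counter

def sample_order(bids, m, eps_prime):
    # One run of the iterative selection; returns the winner sequence.
    remaining, order = list(range(len(bids))), []
    while remaining and len(order) < m:
        weights = [math.exp(eps_prime * bids[i]) for i in remaining]
        i = random.choices(remaining, weights=weights, k=1)[0]
        order.append(i)
        remaining.remove(i)
    return tuple(order)

def max_seq_ratio(b1, b2, m, eps_prime, trials=100000):
    # Empirical Pr[W | b1] / Pr[W | b2] over observed winner sequences.
    c1 = Counter(sample_order(b1, m, eps_prime) for _ in range(trials))
    c2 = Counter(sample_order(b2, m, eps_prime) for _ in range(trials))
    return max(c1[s] / c2[s] for s in c1 if s in c2)
```

For example, max_seq_ratio([0.9, 0.5, 0.3], [0.8, 0.5, 0.3], m=2, eps_prime=0.1) should stay close to $e^{\epsilon}$ for the $\epsilon$ implied by Eq. (13).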
VI. PADS-DP: PRIVACY-PRESERVING AUCTION DESIGN WITH DIFFERENTIAL PRIVACY
In the previous section, we presented PADS-ADP, which provides $(\epsilon, \delta)$-differential privacy and truthfulness. A limiting constraint of PADS-ADP is that it can only provide approximate $(\epsilon, \delta)$-differential privacy, which is relatively weaker than rigorous $\epsilon$-differential privacy. In addition, since the iterative winner selection makes the mechanism more random, with a smaller $\epsilon'$ in each iteration, the mechanism is too random to be efficient, as demonstrated by our results in Section VII. Another weakness of PADS-ADP is that its payment scheme is heterogeneous across winners, since it computes each bidder's payment from that bidder's selection probability distribution in order to force the mechanism to be truthful.
In this section, we propose PADS-DP, which makes a trade-off between differential privacy, the truthfulness property, and the revenue earned by the CSP. PADS-DP provides $\epsilon$-differential privacy with little loss of truthfulness by using a grouping winner selection mechanism.
A. Design Rationale

Providing exact truthfulness as in PADS-ADP requires sacrificing more utility. On the contrary, if the mechanism can tolerate a small loss of truthfulness, called "approximate truthfulness" as defined in Definition 3, we can design a more efficient mechanism with higher revenue and better privacy guarantees. Instead of using an iterative method to choose the winners one by one, PADS-DP chooses the winners using a grouping method. The groups are determined by the bid values, which makes the mechanism individually rational for every user (see Definition 4). PADS-DP then chooses the winner group from the candidates using the exponential mechanism. In the next subsection, we discuss the details of PADS-DP.
B. Design Details

As shown in Figure 2, the mechanism needs to decide both the winners and their payments from the bids, which form the input of the mechanism. If we choose a payment scheme ensuring that every winner pays the same payment, i.e., $p_i = p$ for all $i$ with $x_i = 1$, then the number of possible outcomes of the auction is not $\binom{n}{m}$ but $n$, the number of bids (each bid value is a possible payment outcome $p = b_i$).
The basic idea of PADS-DP is to group the bids by bid value. The mechanism is shown in Algorithm 2. First, we sort the bids in descending order to build a set of possible payment outcomes, $P = \{\rho_1, \rho_2, ..., \rho_n\}$ with $\rho_1 \geq \rho_2 \geq ... \geq \rho_n$. Then, we group the bids by the possible payments in $P$: group $i$ has the payment scheme $p = \rho_i \in P$. Next, based on the payment $\rho_i$, we select candidate winners from the highest bid down to the lowest bid $b_j \geq \rho_i$, or until the number of winners reaches $m$. We use $S_i$ to denote the candidates of group $i$. We then have the following score function for each group:

$$F(S_i, \rho_i) = \rho_i |S_i| \qquad (20)$$

For each group, based on the score function, we calculate the probability:

$$Pr_i = \frac{\exp(\frac{\epsilon}{2\Delta} \rho_i |S_i|)}{\sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j |S_j|)} \qquad (21)$$

Finally, based on these probabilities, we randomly choose one group as the winner group: the candidates $S_i$ become the final winners and the payment is set to $\rho_i$.
Algorithm 2: PADS-DP Mechanism
Input: Type of the VM: $k$; number of VMs: $m$; buy bids: $\vec{b} = \{b_1, b_2, ...\}$
Output: Auction winners: $S$; payment scheme: $p$
1: Initially, sort $\vec{b}$ in descending order and generate $P = \vec{b}$
2: for $\rho_i \in P$ do
3:   while $b_j \geq \rho_i$ and Eq. (8) is satisfied do
4:     $S_i \leftarrow S_i \cup \{j\}$
5:   end while
6:   $Pr_i = \exp(\frac{\epsilon}{2\Delta} \rho_i |S_i|) / \sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j |S_j|)$
7: end for
8: Randomly select the winner group according to the probability vector $\vec{Pr}$
9: Let the selected winner group be $S_i$
10: $S \leftarrow S_i$; $p = \rho_i$
The computational complexity of Algorithm 2 is $O(nm)$, which is determined by the main loop (lines 2 to 7) of Algorithm 2. The loop computes the possible winners for each set $S_i$; the maximum number of winners is bounded by $m$ and the number of sets is $n$, so the computational complexity is $O(nm)$.
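A minimal Python sketch of Algorithm 2 follows. It assumes bids lie in $[0, \Delta]$, with delta_sens playing the role of $\Delta$, and returns the winner indices together with the uniform payment; the names are illustrative.

```python
import math
import random

def pads_dp(bids, m, epsilon, delta_sens=1.0):
    # Each bid value rho is a candidate uniform payment; its group S holds
    # all bids >= rho, capped at m winners to satisfy Eq. (8).
    order = sorted(range(len(bids)), key=lambda i: -bids[i])
    groups, scores = [], []
    for pos, i in enumerate(order):
        rho = bids[i]
        S = order[:min(pos + 1, m)]
        groups.append((S, rho))
        scores.append(rho * len(S))  # Eq. (20): F(S_i, rho_i) = rho_i * |S_i|
    # Eq. (21): Pr_i proportional to exp(epsilon * rho_i * |S_i| / (2 * Delta))
    weights = [math.exp(epsilon * s / (2 * delta_sens)) for s in scores]
    S, rho = random.choices(groups, weights=weights, k=1)[0]
    return S, rho  # winner indices and the uniform payment p = rho
```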
Having presented the mechanism, we next provide a formal theoretical analysis of the desirable properties of the PADS-DP mechanism. First, we prove that PADS-DP is $\epsilon$-differentially private in Theorem 7.
Theorem 7. The PADS-DP auction is $\epsilon$-differentially private.
Proof. We denote by $\vec{b}$ and $\vec{b}'$ two bid profiles that differ in only one bidder's bid. We use $M$ to denote the PADS-DP mechanism. For any $p \in P$, we have:

$$\begin{aligned}
\frac{\Pr[M(\vec{b}) = p]}{\Pr[M(\vec{b}') = p]}
&= \frac{\exp(\frac{\epsilon}{2\Delta} p |S_i|)}{\exp(\frac{\epsilon}{2\Delta} p |S'_i|)} \cdot \frac{\sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j |S'_j|)}{\sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j |S_j|)} \\
&\leq \exp\left(\frac{\epsilon}{2\Delta} p\right) \cdot \frac{\sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j (|S_j| + 1))}{\sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j |S_j|)} \\
&\leq \exp\left(\frac{\epsilon}{2}\right) \cdot \frac{\sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j |S_j| + \frac{\epsilon}{2})}{\sum_{\rho_j \in P} \exp(\frac{\epsilon}{2\Delta} \rho_j |S_j|)} \\
&= \exp\left(\frac{\epsilon}{2}\right) \exp\left(\frac{\epsilon}{2}\right) = \exp(\epsilon)
\end{aligned} \qquad (22)$$
[Figure 3. Evaluation results for revenues: (a) revenue vs. number of users, (b) revenue vs. number of VMs, (c) revenue vs. $\epsilon$; each panel compares OPT, PADS-ADP, and PADS-DP.]
[Figure 4. Evaluation results for social welfare: (a) social welfare vs. number of users, (b) social welfare vs. number of VMs, (c) social welfare vs. $\epsilon$; each panel compares OPT, PADS-ADP, and PADS-DP.]
Therefore, we have:

$$\Pr[M(\vec{b}) = p] \leq \exp(\epsilon)\, \Pr[M(\vec{b}') = p], \quad \forall p \in P \qquad (23)$$

and we arrive at the conclusion that the PADS-DP mechanism is $\epsilon$-differentially private.
Next, we prove that PADS-DP is $\epsilon\Delta$-truthful.

Theorem 8. The PADS-DP auction is $\epsilon\Delta$-truthful.
Proof. The proof is similar to that of Theorem 7. Again, let $\vec{b}$ and $\vec{b}'$ be two bid profiles that differ in only one bidder's bid. We use the conclusion in Eq. (23), which can be transformed to $\Pr[M(\vec{b}) = p] \geq \exp(-\epsilon) \Pr[M(\vec{b}') = p]$. Therefore, the expected utility of any bidder satisfies:

$$\begin{aligned}
E_{p \sim M(\vec{b})}[u_i(p)] &= \sum_{p \in P} u_i(p) \Pr[M(\vec{b}) = p] \\
&\geq \sum_{p \in P} u_i(p) \exp(-\epsilon) \Pr[M(\vec{b}') = p] \\
&= \exp(-\epsilon)\, E_{p \sim M(\vec{b}')}[u_i(p)] \\
&\geq (1 - \epsilon)\, E_{p \sim M(\vec{b}')}[u_i(p)] \\
&= E_{p \sim M(\vec{b}')}[u_i(p)] - \epsilon\, E_{p \sim M(\vec{b}')}[u_i(p)]
\end{aligned} \qquad (24)$$

Since the utility of an individual user is bounded by the utility function $u_i(p) = (v_i - p) x_i \leq \bar{b} - \underline{b} = \Delta$, we get

$$E_{p \sim M(\vec{b})}[u_i(p)] \geq E_{p \sim M(\vec{b}')}[u_i(p)] - \epsilon \Delta \qquad (25)$$

Therefore, by Definition 3, we conclude that PADS-DP is $\epsilon\Delta$-truthful.
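As an illustrative check of Theorem 8 (again, not from the paper), the following Monte Carlo estimate compares a bidder's expected utility under truthful bidding and under a deviation; the observed gap should not exceed $\epsilon\Delta$. It reuses pads_dp from the sketch above.

```python
def expected_utility(bids, v0, m, epsilon, trials=20000):
    # Monte Carlo estimate of bidder 0's expected utility under PADS-DP.
    total = 0.0
    for _ in range(trials):
        S, rho = pads_dp(bids, m, epsilon)
        if 0 in S:  # bidder 0 won and pays the uniform payment rho
            total += v0 - rho
    return total / trials

# With true valuation v0 = 0.7 and bids in [0, 1] (so Delta = 1), the gap
# between truthful bidding and a deviation should be at most epsilon:
truthful = expected_utility([0.7, 0.6, 0.4], 0.7, 1, 0.5)
deviating = expected_utility([0.5, 0.6, 0.4], 0.7, 1, 0.5)
```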
VII. EVALUATION
We have implemented PADS (both PADS-ADP and PADS-DP) in a simulator and extensively evaluated their performance. On the CSP's side, the evaluation results show that PADS-DP achieves relatively high revenue and social welfare compared with PADS-ADP. On the users' side, we analyze the payments made by the users to obtain the resources. In addition, we measure the job completion rate, showing that PADS-DP achieves near optimum results while maintaining a relatively high privacy level.
A. Setup

The default settings of the experimental evaluation are summarized in Table I and described below.

Table I. DEFAULT CONFIGURATION
# of bidders: 5000 | $\epsilon$: 0.1
# of VMs: 200 | $\delta$: 0.25
bids: [0, 1] | time slot length: 5 minutes
simulation time: 1 hour | job running time: 10 minutes

In our experiments, we assume that each CSP provides one type of VM to the users. The simulation time is set to one hour in the default setting, and the time slot length is set to five minutes, similar to the Amazon EC2 Spot Instance. We generate the bids and the jobs for each user to model the interactions between the users and the clients shown in Figure 2. We assume that the jobs are batch processing jobs which can be interrupted during execution. The bids are generated from a uniform distribution $b_i \in [0, 1]$ and are equal to the users' true valuations of the VMs, since the mechanisms we evaluate in our experiments are all truthful or approximately truthful.
[Figure 5. Evaluation results for user payments: (a) payment vs. number of users, (b) payment vs. number of VMs, (c) payment vs. $\epsilon$; each panel compares OPT, PADS-ADP, and PADS-DP.]
The users do not change their bids during the one-hour simulation period. The length of each job is set to ten minutes, so each job requires at least two time slots to complete. When a job is completed, the user client stops bidding for VMs. All experiment results are averaged over 100 trials, and each error bar represents the 95% confidence interval over the 100 trials.
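A hedged sketch of the simulation loop described above: twelve five-minute slots cover the one-hour simulation, each job needs two winning slots, and bids equal the true valuations drawn from U[0, 1]. It reuses pads_dp from Section VI, and the deadline handling is simplified to completion within the simulated hour, so it only approximates the evaluated setup.

```python
import random

def simulate(n_users=5000, n_vms=200, slots=12, slots_needed=2, epsilon=0.1):
    bids = [random.random() for _ in range(n_users)]  # bids = true valuations
    progress = [0] * n_users  # winning slots accumulated by each user's job
    total_revenue = 0.0
    for _ in range(slots):
        # Clients whose jobs are finished stop bidding.
        active = [i for i in range(n_users) if progress[i] < slots_needed]
        if not active:
            break
        winners, rho = pads_dp([bids[i] for i in active], n_vms, epsilon)
        for pos in winners:  # winners index into the `active` list
            progress[active[pos]] += 1
            total_revenue += rho
    completion = sum(pr >= slots_needed for pr in progress) / n_users
    return total_revenue, completion
```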
B. Methodology
In our experiments, we compare PADS-DP and PADS-ADP
with the VCG auction mechanism [17]–[19] (denoted as OPT
in the results) which provides truthfulness but not differential
privacy guarantees. The VCG mechanism is implemented in
the simulator to satisfy the objectives and constraints in the
dynamically priced resource allocation problem in clouds.
To evaluate the performance of PADS, we use the following four metrics:
1) Revenue: the sum of all payments made by the users over the entire simulation time. We conduct three sets of experiments, increasing the number of users and the number of VMs under different settings of $\epsilon$.
2) Social Welfare: the sum of the values obtained by the users [27], calculated as $\sum_i^n v_i x_i$. It is the basic metric for measuring the economic efficiency of auction mechanisms.
3) User's Payment: the average payment made by each winner in each time slot.
4) Completion Rate: the fraction of jobs that complete within their deadline.
C. Experiment Results

First, we measure the revenue earned by the CSPs in the auction process. We conduct three sets of experiments to evaluate the performance of PADS-ADP and PADS-DP, studying (i) the impact of the number of users, (ii) the influence of the number of resources (VMs) on the auction performance, and (iii) the impact of the parameter $\epsilon$ on the auction outcome.

From Figures 3a and 3b, we can see that PADS-DP achieves nearly the same revenue as OPT. In contrast, PADS-ADP attains only half the revenue of OPT.
[Figure 6. Evaluation results for completion rate: per-user completion rate vs. bid value for (a) $\epsilon = 0.01$ and (b) $\epsilon = 0.1$, comparing PADS-DP with OPT.]
This observation reflects the fact that PADS-ADP chooses the winners in a more random, iterative manner compared to PADS-DP, and hence its revenue is lower. As shown in Figure 3a, we can also observe that when the number of bidders (users) increases, the OPT and PADS-DP schemes obtain higher revenue, but PADS-ADP maintains the same revenue. This is also due to the fact that PADS-ADP chooses winners more randomly than the other two mechanisms. In Figure 3b, we analyze the performance as the number of VMs increases. We observe that all three schemes, namely OPT, PADS-DP, and PADS-ADP, obtain higher revenue as resources are increased. We study the influence of the parameter $\epsilon$ on the revenue in Figure 3c. We observe that for $\epsilon$ from 0.1 to 1, the influence is perceptible for PADS-DP. However, it is insignificant for PADS-ADP, because PADS-ADP uses an iterative process to achieve differential privacy, and in each iteration $\epsilon' = \frac{\epsilon}{(e-1)\Delta \ln(e/\delta)}$ is calculated from $\epsilon$. Since $\epsilon'$ is usually tenfold smaller than $\epsilon$, the mechanism is insensitive to the scale of $\epsilon$. For PADS-DP, with larger $\epsilon$, the revenue approaches that of the OPT scheme with smaller variance (as shown by the smaller error bars in the figure).
Next, we evaluate social welfare, one of the metrics for measuring the economic efficiency of auction mechanisms [27]. In Figure 4a, we observe that PADS-DP achieves social welfare close to that of OPT as the number of users increases. It is significantly better than PADS-ADP, which maintains a constant social welfare irrespective of the number of users who participate in the auction. The social welfare measured in Figure 4b suggests that the schemes follow a similar trend to Figure 4a when the number of VMs is increased. From the results in Figure 4c for varying values of $\epsilon$, we can observe that PADS-DP achieves nearly the same social welfare as OPT even for smaller values of $\epsilon$.
In Figure 5, we measure the payments made by the users in the proposed schemes, plotting the average payment per user in each time slot. We observe that in PADS-DP, users pay less than in OPT (Figures 5a and 5b) across different numbers of users and VMs. The trend in Figure 5c is also similar to that of Figure 3c.
Finally, in Figure 6, we consider users' satisfaction, represented by the completion rate. In an auction-based resource allocation mechanism, a user who bids higher should have a higher probability of obtaining the resource. In Figures 6a and 6b, we plot the completion rate of each user (marked as a blue "X") and the results of OPT (marked as solid lines). We can see that with higher $\epsilon$, the completion rate of PADS-DP is closer to that of OPT. Since $\epsilon = 0.1$ already provides strong differential privacy protection, the results demonstrate that PADS-DP can achieve relatively strong differential privacy and high user satisfaction simultaneously.
VIII. CONCLUSION

In this paper, we propose a strategy-proof differentially private auction mechanism for allocating dynamically priced resources in a cloud. We propose three approaches to the privacy-aware auction design problem using differential privacy based on the exponential mechanism. The first approach uses a straight-forward application of a near optimum exponential mechanism that provides truthfulness and $\epsilon$-differential privacy, but the mechanism is intractable for large-scale resource allocations. The second approach, PADS-ADP, uses an iterative algorithm to choose the winners of an auction, achieves $(\epsilon, \delta)$-differential privacy, and runs in polynomial time. The third approach, PADS-DP, employs a grouping algorithm to generate the possible outcome groups and chooses the winner group using the exponential mechanism. We demonstrate that PADS-DP achieves $\epsilon$-differential privacy and $\epsilon\Delta$-truthfulness. Experimental evaluation of the proposed mechanisms shows that they guarantee differential privacy and the truthfulness property of the auction while achieving performance close to that of traditional auctions in terms of revenue and social welfare.
REFERENCES
[1] L. Columbus, "Roundup of cloud computing forecasts and market estimates Q3 update, 2015," Sep. 2015. [Online]. Available: http://www.forbes.com/sites/louiscolumbus/2015/09/27/roundup-of-cloud-computing-forecasts-and-market-estimates-q3-update-2015/#6a2e25b66c7a
[2] Amazon, “Amazon ec2 spot instances,” Jan. 2017. [Online]. Available:
https://aws.amazon.com/ec2/spot/
[3] Amazon, “Amazon ec2 on-demand pricing,” Jan. 2017. [Online].
Available: https://aws.amazon.com/ec2/pricing/on-demand/
[4] L. Zheng, C. Joe-Wong, C. W. Tan, M. Chiang, and X. Wang, “How to
bid the cloud,” in Proceedings of the 2015 ACM Conference on Special
Interest Group on Data Communication. ACM, 2015, pp. 71–84.
[5] M. Yokoo, Y. Sakurai, and S. Matsubara, “The effect of false-name bids
in combinatorial auctions: New fraud in internet auctions,” Games and
Economic Behavior, vol. 46, no. 1, pp. 174–188, 2004.
[6] M. Naor, B. Pinkas, and R. Sumner, “Privacy preserving auctions and
mechanism design,” in Proceedings of the 1st ACM conference on
Electronic commerce. ACM, 1999, pp. 129–139.
[7] K. Q. Nguyen and J. Traoré, "An online public auction protocol protecting bidder privacy," in Information Security and Privacy. Springer, 2000, pp. 427–442.
[8] H. Kikuchi, S. Hotta, K. Abe, and S. Nakanishi, “Distributed auction
servers resolving winner and winning bid without revealing privacy
of bids,” in Parallel and Distributed Systems: Workshops, Seventh
International Conference on, 2000. IEEE, 2000, pp. 307–312.
[9] F. McSherry and K. Talwar, “Mechanism design via differential privacy,”
in Foundations of Computer Science, 2007. FOCS’07. 48th Annual IEEE
Symposium on. IEEE, 2007, pp. 94–103.
[10] J. McMillan, “Why auction the spectrum?” Telecommunications policy,
vol. 19, no. 3, pp. 191–199, 1995.
[11] Q. Huang, Y. Tao, and F. Wu, “Spring: A strategy-proof and privacy pre-
serving spectrum auction mechanism,” in INFOCOM, 2013 Proceedings
IEEE. IEEE, 2013, pp. 827–835.
[12] P. Cramton, “Spectrum auction design,” Review of Industrial Organiza-
tion, vol. 42, no. 2, pp. 161–190, 2013.
[13] S. Liu, H. Zhu, R. Du, C. Chen, and X. Guan, “Location privacy
preserving dynamic spectrum auction in cognitive radio network,” in
Distributed Computing Systems (ICDCS), 2013 IEEE 33rd International
Conference on. IEEE, 2013, pp. 256–265.
[14] F. Wu, Q. Huang, Y. Tao, and G. Chen, “Towards privacy preservation in
strategy-proof spectrum auction mechanisms for noncooperative wireless
networks,” IEEE/ACM Transactions on Networking (TON), vol. 23,
no. 4, pp. 1271–1285, 2015.
[15] H. Huang, X.-Y. Li, Y.-e. Sun, H. Xu, and L. Huang, "PPS: Privacy-preserving strategyproof social-efficient spectrum auction mechanisms," IEEE Transactions on Parallel and Distributed Systems, vol. 26, no. 5, pp. 1393–1404, 2015.
[16] C. Dwork, “Differential privacy: A survey of results,” in International
Conference on Theory and Applications of Models of Computation.
Springer, 2008, pp. 1–19.
[17] W. Vickrey, “Counterspeculation, auctions, and competitive sealed ten-
ders,” The Journal of finance, vol. 16, no. 1, pp. 8–37, 1961.
[18] E. H. Clarke, “Multipart pricing of public goods,” Public choice, vol. 11,
no. 1, pp. 17–33, 1971.
[19] T. Groves, “Incentives in teams,” Econometrica: Journal of the Econo-
metric Society, pp. 617–631, 1973.
[20] L. Sweeney, “k-anonymity: A model for protecting privacy,” Interna-
tional Journal of Uncertainty, Fuzziness and Knowledge-Based Systems,
vol. 10, no. 05, pp. 557–570, 2002.
[21] C. Dwork, F. McSherry, K. Nissim, and A. Smith, “Calibrating noise
to sensitivity in private data analysis,” in Theory of Cryptography
Conference. Springer, 2006, pp. 265–284.
[22] V. Rastogi and S. Nath, “Differentially private aggregation of distributed
time-series with transformation and encryption,” in Proceedings of the
2010 ACM SIGMOD International Conference on Management of data.
ACM, 2010, pp. 735–746.
[23] Z. Huang and S. Kannan, “The exponential mechanism for social wel-
fare: Private, truthful, and nearly optimal,” in Foundations of Computer
Science (FOCS), 2012 IEEE 53rd Annual Symposium on. IEEE, 2012,
pp. 140–149.
[24] A. Le Ny, "Introduction to (generalized) Gibbs measures," Ensaios Matemáticos, vol. 15, pp. 1–126, 2008.
[25] R. Zhu and K. G. Shin, “Differentially private and strategy-proof
spectrum auction with approximate revenue maximization,” in Computer
Communications (INFOCOM), 2015 IEEE Conference on. IEEE, 2015,
pp. 918–926.
[26] R. Zhu, Z. Li, F. Wu, K. Shin, and G. Chen, “Differentially private spec-
trum auction with approximate revenue maximization,” in Proceedings
of the 15th ACM international symposium on mobile ad hoc networking
and computing. ACM, 2014, pp. 185–194.
[27] N. Nisan, T. Roughgarden, E. Tardos, and V. V. Vazirani, Algorithmic
game theory. Cambridge University Press Cambridge, 2007, vol. 1.
[28] M. J. Osborne and A. Rubinstein, A course in game theory. MIT press,
1994.
[29] A. Gupta, K. Ligett, F. McSherry, A. Roth, and K. Talwar, “Differentially
private combinatorial optimization,” in Proceedings of the twenty-first
annual ACM-SIAM symposium on Discrete Algorithms. SIAM, 2010,
pp. 1106–1125.
[30] A. Archer and É. Tardos, "Truthful mechanisms for one-parameter agents," in Foundations of Computer Science, 2001. Proceedings. 42nd IEEE Symposium on. IEEE, 2001, pp. 482–491.

Supplementary resource (1)

... However, when the mechanism is applied to multi-item auctions, it is approximately IC rather than IC. Later, Zhu and Shin [2015] and Xu et al. [2017], Jian et al. [2018] propose mechanisms that combine the exponential mechanism with the payment rule in [Archer and Tardos, 2001], applying to combinatorial auctions and reverse auctions, respectively. ...
... We consider two types of score functions, including linear function, σ(θ, o i ) = v i , and log function, σ(θ, o i ) = log v i . Linear and log score functions are used in previous DP auctions, e.g., [McSherry and Talwar, 2007, Xu et al., 2017, Jian et al., 2018. ...
Preprint
Full-text available
Diffusion auction refers to an emerging paradigm of online marketplace where an auctioneer utilises a social network to attract potential buyers. Diffusion auction poses significant privacy risks. From the auction outcome, it is possible to infer hidden, and potentially sensitive, preferences of buyers. To mitigate such risks, we initiate the study of differential privacy (DP) in diffusion auction mechanisms. DP is a well-established notion of privacy that protects a system against inference attacks. Achieving DP in diffusion auctions is non-trivial as the well-designed auction rules are required to incentivise the buyers to truthfully report their neighbourhood. We study the single-unit case and design two differentially private diffusion mechanisms (DPDMs): recursive DPDM and layered DPDM. We prove that these mechanisms guarantee differential privacy, incentive compatibility and individual rationality for both valuations and neighbourhood. We then empirically compare their performance on real and synthetic datasets.
... Ref. [13] proposes a context-aware hierarchical task allocation framework in the scenario of edge computing while preserving differential privacy for the information of both requesters and participants. Also considering the resource of cloud, the work [14] designs a differentially private and strategy-proof solution for allocating dynamically priced cloud resources. Apart from computing resources, differential privacy is also applied in energy management in smart grids, such as optimal power flow [15] and [16]. ...
Article
Full-text available
This paper studies distributed resource allocation problem where all the agents cooperatively minimize the sum of their cost functions. To prevent private information from being disclosed, agents need to keep their cost functions private against potential adversaries and other agents. We first propose a completely distributed algorithm via deviation tracking that deals with constrained resource allocation problem and preserve differential privacy for cost functions by masking states and directions with decaying Laplace noise. Adopting constant stepsize, we prove that the proposed algorithm converges linearly in mean square. The linear convergence is established under the standard assumptions of Lipschitz gradients and strong convexity instead of the assumption of bounded gradients that is usually imposed in most existing works. Moreover, we show that the algorithm preserves differential privacy for every agent's cost function and establish the trade-off between the privacy and the convergence accuracy. Furthermore, we apply the proposed algorithm to economic dispatch problem in IEEE-14 bus system to verify the theoretical results.
... In addition to spectrum allocation, differentially private auction mechanisms have also been developed for resource allocation in cloud computing. Xu et al. [113] proposed a differentially private auction mechanism for trading cloud resources, which preserves the privacy of individual bidding information and achieves strategy-proofness. Their mechanism iteratively progresses through a set of rounds, where each round consists of four steps. ...
Preprint
Artificial Intelligence (AI) has attracted a great deal of attention in recent years. However, alongside all its advancements, problems have also emerged, such as privacy violations, security issues and model fairness. Differential privacy, as a promising mathematical model, has several attractive properties that can help solve these problems, making it quite a valuable tool. For this reason, differential privacy has been broadly applied in AI but to date, no study has documented which differential privacy mechanisms can or have been leveraged to overcome its issues or the properties that make this possible. In this paper, we show that differential privacy can do more than just privacy preservation. It can also be used to improve security, stabilize learning, build fair models, and impose composition in selected areas of AI. With a focus on regular machine learning, distributed machine learning, deep learning, and multi-agent systems, the purpose of this article is to deliver a new view on many possibilities for improving AI performance with differential privacy techniques.
... In addition to spectrum allocation, differentially private auction mechanisms have also been developed for resource allocation in cloud computing. Xu et al. [113] proposed a differentially private auction mechanism for trading cloud resources, which preserves the privacy of individual bidding information and achieves strategy-proofness. Their mechanism iteratively progresses through a set of rounds, where each round consists of four steps. ...
Article
Full-text available
Artificial Intelligence (AI) has attracted a great deal of attention in recent years. However, alongside all its advancements, problems have also emerged, such as privacy violations, security issues and model fairness. Differential privacy, as a promising mathematical model, has several attractive properties that can help solve these problems, making it quite a valuable tool. For this reason, differential privacy has been broadly applied in AI, but to date no study has documented which differential privacy mechanisms can be, or have been, leveraged to overcome these issues, or the properties that make this possible. In this paper, we show that differential privacy can do more than just preserve privacy. It can also be used to improve security, stabilize learning, build fair models, and impose composition in selected areas of AI. With a focus on regular machine learning, distributed machine learning, deep learning, and multi-agent systems, the purpose of this article is to deliver a new view on the many possibilities for improving AI performance with differential privacy techniques.
... There is also a line of work on auctions with differential privacy, such as [9,31,39]. However, differential privacy protects against inference from the published output by adding noise to that output, which is fundamentally different from secure multi-party computation. ...
Preprint
With the rapid development of secure multi-party computation, many practical secure computation schemes have been proposed. As an example, various secure auction mechanisms have been widely studied, which can protect bid privacy while satisfying various economic properties. However, to the best of our knowledge, none of them solves the secure computation problem for multiple data providers (e.g., secure cloud resource auctions) in the malicious security model. In this paper, we use the techniques of cut-and-choose and garbled circuits to propose a general secure computation framework for multiple data providers against malicious adversaries. Specifically, our framework checks input consistency with the cut-and-choose paradigm, conducts maliciously secure computations by running two independent garbled circuits, and verifies the correctness of the output by comparing two versions of the outputs. Theoretical analysis shows that our framework is secure against a malicious computation party or a subset of malicious data providers. Taking secure cloud resource auctions as an example, we implement our framework. Extensive experimental evaluations show that the performance of the proposed framework is acceptable in practice.
Article
Cloud service providers typically provide different types of virtual machines (VMs) to cloud users with various requirements. Thanks to its effectiveness and fairness, auction has been widely applied in this heterogeneous resource allocation. Recently, several strategy-proof combinatorial cloud auction mechanisms have been proposed. However, they fail to protect the bid privacy of users from being inferred from the auction results. In this article, we design a differentially private combinatorial cloud auction mechanism (DPCA) to address this privacy issue. Technically, we employ the exponential mechanism to compute a clearing unit price vector with a probability proportional to the corresponding revenue. We further improve the mechanism to reduce the running time while maintaining high revenues, by computing a single clearing unit price, or a subgroup of clearing unit prices at a time, resulting in the improved mechanisms DPCA-S and its generalized version DPCA-M, respectively. We theoretically prove that our mechanisms can guarantee differential privacy, approximate truthfulness and high revenue. Extensive experimental results demonstrate that DPCA can generate near-optimal revenues at the price of relatively high time complexity, while the improved mechanisms achieve a tunable trade-off between auction revenue and running time.
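The exponential-mechanism step the DPCA abstract describes can be sketched in a few lines. The snippet below is a simplified single-price toy with invented names (the real DPCA works over clearing unit price vectors for heterogeneous VM types): each candidate clearing price is sampled with probability proportional to exp(eps * revenue / (2 * sensitivity)), where the sensitivity bounds how much one bid can change the revenue.

```python
import numpy as np

rng = np.random.default_rng(0)

def revenue(price, bids):
    # Toy single-price market: every bidder whose bid meets the clearing
    # price buys one unit at that price.
    return price * sum(1 for b in bids if b >= price)

def dp_clearing_price(prices, bids, eps=0.5, sensitivity=5.0):
    # Exponential mechanism: sample a price with probability proportional to
    # exp(eps * score / (2 * sensitivity)). The sensitivity here (assumed to
    # be the largest candidate price) bounds one bid's effect on revenue.
    scores = np.array([revenue(p, bids) for p in prices], dtype=float)
    logits = eps * (scores - scores.max()) / (2.0 * sensitivity)  # stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(prices, p=probs)

bids = [3.0, 5.0, 7.0, 2.0]
print(dp_clearing_price([1.0, 2.0, 3.0, 4.0, 5.0], bids))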
Article
The vast majority of artificial intelligence solutions are founded on game theory, and differential privacy is emerging as perhaps the most rigorous and widely adopted privacy paradigm in the field. However, alongside all the advancements made in both these fields, there is not a single application that is not still vulnerable to privacy violations, security breaches, or manipulation by adversaries. Our understanding of the interactions between differential privacy and game theoretic solutions is limited. Hence, we undertook a comprehensive review of literature in the field, finding that differential privacy has several advantageous properties that can make more of a contribution to game theory than just privacy protection. It can also be used to build heuristic models for game-theoretic solutions, to avert strategic manipulations, and to quantify the cost of privacy protection. With a focus on mechanism design, the aim of this article is to provide a new perspective on the currently held impossibilities in game theory, potential avenues to circumvent those impossibilities, and opportunities to improve the performance of game-theoretic solutions with differentially private techniques.
Article
Full-text available
Many spectrum auction mechanisms have been proposed for the spectrum allocation problem, but unfortunately few of them protect the bid privacy of bidders while achieving good social efficiency. In this paper, we propose PPS, a Privacy-Preserving Strategyproof spectrum auction framework. Based on PPS, we design two schemes: 1) for the Single-Unit Auction model (SUA), where only a single channel is sold in the spectrum market; and 2) for the Multi-Unit Auction model (MUA), where the primary user subleases multiple channels to the secondary users and each secondary user may also want to access multiple channels. Since the social efficiency maximization problem is NP-hard in both auction models, we present allocation mechanisms with approximation factors of (1+ε) for SUA and 3/2 for MUA, and further design privacy-preserving, strategy-proof auction mechanisms based on them. Our extensive evaluations show that our mechanisms achieve good social efficiency with low computation and communication overhead.
Conference Paper
Consider the following problem: given a metric space, some of whose points are "clients," select a set of at most k facility locations to minimize the average distance from the clients to their nearest facility. This is just the well-studied k-median problem, for which many approximation algorithms and hardness results are known. Note that the objective function encourages opening facilities in areas where there are many clients, so given a solution, it is often possible to get a good idea of where the clients are located. This raises the following quandary: what if the locations of the clients are sensitive information that we would like to keep private? Is it even possible to design good algorithms for this problem that preserve the privacy of the clients? In this paper, we initiate a systematic study of algorithms for discrete optimization problems in the framework of differential privacy (which formalizes the idea of protecting the privacy of individual input elements). We show that many such problems indeed have good approximation algorithms that preserve differential privacy, even in cases where it is impossible to preserve cryptographic definitions of privacy while computing any non-trivial approximation to even the value of an optimal solution, let alone the entire solution. In addition to the k-median problem, we consider vertex cover, set cover, min-cut, facility location, and Steiner tree, and give approximation algorithms and lower bounds for these problems. We also consider the recently introduced submodular maximization problem, "Combinatorial Public Projects" (CPP), shown by Papadimitriou et al. [28] to be inapproximable to subpolynomial multiplicative factors by any efficient and truthful algorithm. We give a differentially private (and hence approximately truthful) algorithm that achieves a logarithmic additive approximation.
Article
Amazon's Elastic Compute Cloud (EC2) uses auction-based spot pricing to sell spare capacity, allowing users to bid for cloud resources at a highly reduced rate. Amazon sets the spot price dynamically and accepts user bids above this price. Jobs with lower bids (including those already running) are interrupted and must wait for a lower spot price before resuming. Spot pricing thus raises two basic questions: how might the provider set the price, and what prices should users bid? Computing users' bidding strategies is particularly challenging: higher bid prices reduce the probability of, and thus extra time to recover from, interruptions, but may increase users' cost. We address these questions in three steps: (1) modeling the cloud provider's setting of the spot price and matching the model to historically offered prices, (2) deriving optimal bidding strategies for different job requirements and interruption overheads, and (3) adapting these strategies to MapReduce jobs with master and slave nodes having different interruption overheads. We run our strategies on EC2 for a variety of job sizes and instance types, showing that spot pricing reduces user cost by 90% with a modest increase in completion time compared to on-demand pricing.
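As a back-of-the-envelope illustration of the bidding trade-off this abstract describes (not the paper's actual model), the toy simulation below replays a hypothetical spot-price trace: the job runs, and is billed at the spot price, whenever the bid covers it, and loses some progress each time it is interrupted. All names and numbers are invented for illustration.

```python
def simulate(bid, spot_prices, redo_hours=1):
    # Replay an hourly spot-price trace: the job runs (and is billed at the
    # spot price, not the bid) whenever bid >= price, and loses redo_hours of
    # progress every time it is interrupted.
    cost, hours_done, interruptions = 0.0, 0, 0
    running = False
    for price in spot_prices:
        if bid >= price:
            cost += price
            hours_done += 1
            running = True
        else:
            if running:  # the job was running and is now interrupted
                interruptions += 1
                hours_done = max(0, hours_done - redo_hours)
            running = False
    return cost, hours_done, interruptions

trace = [0.10, 0.12, 0.30, 0.11, 0.09, 0.35, 0.10]  # hypothetical $/hour
for bid in (0.11, 0.20, 0.40):
    print(bid, simulate(bid, trace))
```

Running it with increasing bids shows the trend the paper formalizes: higher bids mean fewer interruptions and faster completion, at the cost of exposure to higher spot prices.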
Article
The problem of dynamic spectrum redistribution has been extensively studied in recent years. Auctions are believed to be among the most effective tools to solve this problem. A great number of strategy-proof auction mechanisms have been proposed to improve spectrum allocation efficiency by stimulating bidders to truthfully reveal their valuations of spectrum, which are the private information of bidders. However, none of these approaches protects bidders' privacy. In this paper, we present PRIDE, which is a PRIvacy-preserving anD stratEgy-proof spectrum auction mechanism. PRIDE guarantees k-anonymity for both single- and multiple-channel auctions. Furthermore, we enhance PRIDE to provide ℓ-diversity, which is an even stronger privacy protection than k-anonymity. We not only rigorously prove the economic and privacy-preserving properties of PRIDE, but also extensively evaluate its performance. Our evaluation results show that PRIDE achieves good spectrum redistribution efficiency and fairness with low overhead.
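For reference, the ℓ-diversity condition PRIDE strengthens to can be stated in a few lines: within every group of records sharing the same quasi-identifier values, the sensitive attribute must take at least ℓ distinct values. The sketch below uses invented field names purely for illustration.

```python
from collections import defaultdict

def is_l_diverse(records, quasi_ids, sensitive, l):
    # Collect the distinct sensitive values seen in each quasi-identifier
    # group; every group must contain at least l of them.
    groups = defaultdict(set)
    for r in records:
        groups[tuple(r[q] for q in quasi_ids)].add(r[sensitive])
    return all(len(vals) >= l for vals in groups.values())

bids = [
    {"region": "east", "tier": "A", "bid": 5},
    {"region": "east", "tier": "A", "bid": 7},
    {"region": "east", "tier": "A", "bid": 5},
]
print(is_l_diverse(bids, ["region", "tier"], "bid", l=2))  # True: bids {5, 7}
```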
Article
Dynamic spectrum redistribution, under which spectrum owners lease out under-utilized spectrum to users for financial gain, is an effective way to improve spectrum utilization. Auctions are a natural way to incentivize spectrum owners to share their idle resources. In recent years, a number of strategy-proof auction mechanisms have been proposed to stimulate bidders to truthfully reveal their valuations. However, it has been shown that truthfulness is not a necessary condition for revenue maximization. Furthermore, in most existing spectrum auction mechanisms, bidders may infer the valuations of other bidders, which are private information, from the auction outcome. In this paper, we propose a Differentially privatE spectrum auction mechanism with Approximate Revenue maximization (DEAR). We theoretically prove that DEAR achieves approximate truthfulness, privacy preservation, and approximate revenue maximization. Our extensive evaluations show that DEAR achieves good performance in terms of both revenue and privacy preservation.
Article
Consider a data holder, such as a hospital or a bank, that has a privately held collection of person-specific, field-structured data. Suppose the data holder wants to share a version of the data with researchers. How can a data holder release a version of its private data with scientific guarantees that the individuals who are the subjects of the data cannot be re-identified, while the data remain practically useful? The solution provided in this paper includes a formal protection model named k-anonymity and a set of accompanying policies for deployment. A release provides k-anonymity protection if the information for each person contained in the release cannot be distinguished from that of at least k-1 other individuals whose information also appears in the release. This paper also examines re-identification attacks that can be realized on releases that adhere to k-anonymity unless the accompanying policies are respected. The k-anonymity protection model is important because it forms the basis on which the real-world systems Datafly, μ-Argus, and k-Similar provide guarantees of privacy protection.
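The k-anonymity condition itself is simple to check. Below is a minimal sketch with made-up field names: a release is k-anonymous if every combination of quasi-identifier values appears in at least k records.

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    # Count records per quasi-identifier combination; every group must have
    # at least k members for the release to be k-anonymous.
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

release = [
    {"zip": "152**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "152**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "152**", "age": "40-49", "diagnosis": "flu"},
]
print(is_k_anonymous(release, ["zip", "age"], k=2))  # False: the 40-49 group has one record
```

Note that a release can pass this check and still leak information when a group's sensitive values are all identical, which is exactly the gap ℓ-diversity (sketched earlier) is designed to close.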
Conference Paper
The problem of dynamic spectrum redistribution has been extensively studied in recent years. Auction is believed to be one of the most effective tools to solve this problem. A great number of strategy-proof auction mechanisms have been proposed to improve spectrum allocation efficiency by stimulating bidders to truthfully reveal their valuations of spectrum, which are the private information of bidders. However, none of these approaches protects bidders' privacy. In this paper, we present SPRING, which is the first Strategy-proof and PRivacy preservING spectrum auction mechanism. We not only rigorously prove the properties of SPRING, but also extensively evaluate its performance. Our evaluation results show that SPRING achieves good spectrum redistribution efficiency with low overhead.