From Service Level Agreements (SLA) to Experience
Level Agreements (ELA): The Challenges of Selling
QoE to the User
Martín Varela†, Patrick Zwickl∗, Peter Reichl∗, Min Xie‡, Henning Schulzrinne§
†VTT Technical Research Centre of Finland, Oulu, Finland
E-mail: martin.varela@vtt.fi
∗Research Group Cooperative Systems (COSY), University of Vienna, Austria
Email: {patrick.zwickl|peter.reichl}@univie.ac.at
‡Telenor Research, Trondheim, Norway
Email: min.xie@telenor.com
§Columbia University, NY, USA
Email: hgs@cs.columbia.edu
Abstract—In contrast to the rather network-centric
notion of Quality of Service (QoS), the concept of Qual-
ity of Experience (QoE) has a strongly user-centric per-
spective on service quality in communication networks
as well as online services. However, related research
on QoE so far has largely neglected the question of
how to operationalize quality differentiation and to
provide corresponding solutions tailored to the end
users. In this paper, we argue that the introduction of
Experience Level Agreements (ELA) as the QoE-enabled counterpart to traditional QoS-based Service Level Agreements (SLA) would provide a key step towards being able to sell service quality to the user. Hence, we investigate various ways to exploit QoE awareness for improving SLAs, ranging from provider-internal aspects such as SLOs to completely novel definitions of ELAs that characterize QoE explicitly, and also discuss important problems and challenges of the proposed transition.
Index Terms—Quality of Experience; Service Level
Agreement
I. Introduction
For a long time, the question of how to define, provide
and measure service quality for end users has been of
utmost interest for network operators, application and ser-
vice providers, as well as their customers. With the advent
of packet-based communication, this has led, already in the
early nineties, to several attempts at thoroughly defining
Quality of Service (QoS) [1], [2]. In the two decades that followed, the primary research directions have pursued a rather technology-driven understanding of QoS [3], [4], leading
among other things to the definition of clearly specified
network parameters as a prerequisite for arranging service-
related binding contracts between providers and users, i.e.,
Service Level Agreements (SLA).
However, in recent years a remarkable focus shift could
be observed in the industry as well as the research commu-
nity, (re-)establishing a more user-centric perspective on
service quality around the notion of Quality of Experience
(QoE) as an augmentation of QoS¹. The reason for this
can be found in the realization that QoS measures do not
trivially translate into quality as experienced by the users.
Pursuing this idea, recent research has achieved remark-
able progress on corresponding metrics and measurement
methodologies as well as the relationship between QoS and
QoE for a broad variety of services and applications [7],
[8].
In this paper, we argue that the time has come to apply those results to the business domain as well. Therefore, we propose to complement the mentioned paradigm shift from QoS to QoE by the analogous step from Service Level Agreements (SLA) towards a novel type of contract between providers and end users which takes the user-centric perspective on service quality explicitly into account, and which we propose to call "Experience Level Agreements" (ELA).
To the best of our knowledge, such a concept has so far been only vaguely mentioned in a few rather specific contexts, for instance facility management² and cloud computing services³. We posit that the idea of guaranteed service levels has to be prominently introduced to the area of QoE, where it can play a key role in bringing QoE into the networks and services. On the other hand, we
identify two main issues with currently available SLAs,
from the end-user’s point of view (be they consumer or
business users). Firstly, SLAs are often non-existent or, when available, very IT-services-oriented (ticket response times, recovery times, availability), and do not convey much in terms of how well the service actually performs for the user. Secondly, when SLAs are present and service performance is a part of them, they mostly deal only with low-level metrics, which do not (except in some specific cases) easily relate to the quality experienced by the users.

¹In fact, while for instance the ITU-T defines QoE as "the overall acceptability of an application or service, as perceived subjectively by the end user" and further mentions that this includes the complete end-to-end system effects and may be influenced by user expectations and context [5], other researchers have gone even further, for instance characterizing QoE as "the degree of delight or annoyance of the user of an application or service…" [6].
²See http://www.experiencelevelagreement.com/
³Cf. [9], mentioning the need for "experience-oriented SLA".
We therefore introduce the notion of a QoE-oriented
SLA — an Experience Level Agreement — motivating
it from a business perspective and discussing the main challenges associated with implementing such agreements in different types of services.
We will start with a short overview of classical SLAs
(Sec. II), and how QoE can interact with them in cer-
tain contexts. We will then introduce and motivate their
QoE-driven ELA counterpart (Sec. III), followed by a discussion of the challenges, both technical and business-related, in making them operational (Sec. IV). Finally,
we will give concluding remarks and an outlook on ELA
in Sec. V.
II. A Brief History of SLAs
Service Level Agreements (SLAs) are a broad and well-
studied topic with a long history in the ICT domain. In
this section we will briefly cover the basic concepts and key
references related to the ideas presented later in this paper,
and will especially focus on the case of telecommunications
and online (over the top) services.
A. SLA Definition and Related Concepts
SLAs have been defined by ITIL [10] as an “agreement
between an IT Service Provider and a Customer. The SLA
describes the IT Service, documents Service Level Targets,
and specifies the responsibilities of the IT Service Provider
and the Customer. A single SLA may cover multiple IT
Services or multiple Customers”. The TM Forum [11]
provides an alternative definition as “a formal negotiated
agreement between two parties. It is a contract that exists
between the Service Provider (SP) and the Customer. It is
designed to create a common understanding about Quality
of Service (QoS), priorities, responsibilities, etc. SLAs
can cover many aspects of the relationship between the
Customer and the SP, such as performance of services,
customer care, billing, service provisioning, etc. However,
although a SLA can cover such aspects, agreement on the
level of service is the primary purpose of a SLA”.
Both definitions convey the same basic ideas: SLAs pro-
vide, among other things, an agreed-upon understanding
of the performance targets of a service. SLAs can cover a
wide variety of service aspects, ranging from performance
(e.g., network QoS) to maximum response times for service
tickets, and can also be applied to non-ICT services.
An SLA commonly has a set of Service Level Objectives
(SLO⁴) associated with it. These are the targets for the service level to be attained, and are often measured by a set of Key Performance Indicators (KPI).

⁴Note that our usage of the term SLO is in line with most related work, except for ITIL, which has defined the concept of SLR (Service Level Requirement) for this purpose, while SLO is used with a rather different meaning.
In general, SLAs can be characterized by:
• A set of KPIs for the service in question, often averaged values over a time period (e.g., monthly packet loss averages), or dependability metrics (MTTF, MTTR, etc.) [12], [13].
• A clear way to measure those KPIs, by either the customer or the provider (or both).
• Penalties for the cases where violations occur (e.g., service refunds, or fines).
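To make the characterization above concrete, the following sketch shows how such an SLA could be captured programmatically; the KPI names, thresholds, and refund amounts are purely illustrative and are not taken from any specific SLA framework.

```python
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List

@dataclass
class KpiTarget:
    name: str         # e.g., "packet_loss"
    threshold: float  # maximum admissible average over the measurement period
    unit: str         # e.g., "%"

@dataclass
class Sla:
    kpi_targets: List[KpiTarget]
    refund_per_violation: float  # penalty (e.g., in EUR) per violated KPI target

    def violations(self, measurements: Dict[str, List[float]]) -> List[str]:
        """Names of the KPI targets whose period average exceeds the agreed threshold."""
        return [t.name for t in self.kpi_targets
                if mean(measurements[t.name]) > t.threshold]

    def penalty(self, measurements: Dict[str, List[float]]) -> float:
        """Refund owed to the customer for the measurement period."""
        return self.refund_per_violation * len(self.violations(measurements))

# Hypothetical example: monthly averages of at most 1% packet loss and 100 ms latency.
sla = Sla(kpi_targets=[KpiTarget("packet_loss", 1.0, "%"),
                       KpiTarget("latency", 100.0, "ms")],
          refund_per_violation=5.0)
print(sla.penalty({"packet_loss": [0.4, 2.1, 1.8], "latency": [80, 95, 90]}))  # 5.0 (loss target violated)
```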
Common KPIs used in SLAs are related to the availabil-
ity of the service (e.g., mean time to failure, mean time to
recovery), or to technical QoS parameters in the case of
network services, for example. However, those KPIs can
only be related to the end users perception of the system
performance, much less to its actual QoE. Hence, recently
the term Key Quality Indicators (KQI) has been used
to describe user-perceivable quality aspects of a service
via certain KPIs that directly affect the perceived quality
(e.g., packet losses for IP telephony services) [14]. KQIs
are in some cases very close to KPIs (e.g., the number of sessions in which the start-up delay of the service is higher than a certain threshold), but for some services (notably media) they can also be estimates of perceptual quality (e.g., listening MOS for VoIP, or some estimation of audiovisual quality for video) [15]. These latter KQIs would provide a
good basis for an ELA (cf. TM Forum recommendations
for KQIs [16]). There is a plethora of literature related to
SLAs, both regarding research and best practices. A recent
survey of European research efforts related to SLAs [9]
provides an excellent overview of on-going work in the
domain (with a focus on cloud services), as well as a meta-
model for an SLA life-cycle. With a more general focus,
the TM Forum has produced a comprehensive handbook
covering basic notions and concepts of SLAs [11] and SLA
management [13].
B. SLAs and QoE
Within the technical domain of SLAs, QoE models can
be a valuable tool for service providers, for example to use
as SLOs, e.g., ensuring that the MOS of a given service
remains above a given acceptability threshold. Having sufficiently accurate parametric models for QoE [17]–[19], or even less accurate dimensioning models [20], makes it possible in some cases to derive performance bounds for some of the QoE-affecting service parameters, enabling, for example, the choice of optimal (e.g., in terms of cost/quality ratio) SLAs. In [21], the authors present a scenario in
which quality models provide an optimal choice of SLAs
between a SaaS provider and its upstream (IaaS, network)
providers, in order to attain the desired performance levels
to ensure the users’ QoE is sufficient, considering budget
constraints.
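As a minimal illustration of the bound-derivation idea above, the sketch below assumes a generic exponential QoS-to-MOS mapping in the spirit of [7]; the coefficients are invented for illustration and are not taken from any of the cited models. Inverting the mapping yields the largest packet loss still compatible with a MOS-based SLO.

```python
import math

# Assumed exponential mapping: MOS = C + A * exp(-B * loss_pct).
# A, B, C are illustrative fit coefficients, not values from [7] or [17]-[20].
A, B, C = 3.5, 0.4, 1.0

def mos_from_loss(loss_pct: float) -> float:
    """Estimate the MOS from the packet loss ratio (in percent)."""
    return C + A * math.exp(-B * loss_pct)

def max_loss_for_target(target_mos: float) -> float:
    """Invert the model: largest packet loss (in %) that still meets the target MOS."""
    if not C < target_mos < C + A:
        raise ValueError("target MOS outside the model's range")
    return -math.log((target_mos - C) / A) / B

# SLO: keep the estimated MOS of the service at or above 3.5.
print(f"packet loss must stay below {max_loss_for_target(3.5):.2f}%")  # about 0.84%
```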
QoE-based SLOs can also be used as part of inter-carrier SLAs or, for OTT services, as part of agreements between content providers and network providers, for instance by setting quality targets over a set of agreed-upon (e.g., standardized) quality models for different service types.
III. ELA: User-centric Quality Level
Agreements
When buying a service today, consumers commonly face
two issues: firstly, the service is provided on a best-effort
basis with no guarantees of any kind (a typical example
of this would be an ISP’s tiered data plans, which are
sold, e.g., as “100Mbps”, followed by copious amounts
of small print that indicate that what it really means is
“up to 100Mbps, under optimal circumstances, which will
probably never occur in practice”). In other cases (e.g.,
cloud services), performance or dependability guarantees
of any kind are rarely made. Secondly, when performance
or dependability are described to the end user, they are
described in technical terms that are not really relatable to
the quality experienced by the user when using the service.
On the other hand, for any user paying for a service,
there is an expectation that the service should work
reliably and properly, which is not currently addressed by
most service providers. This lack of quality guarantees in
most services, together with recent significant advances in
QoE modeling, opens an opportunity for service providers
to differentiate themselves from their competition and
increase their margins by offering customers minimum
QoE guarantees, or different types of guarantees based
on tiered subscription models. Depending on the nature
of the service in question (e.g., network connectivity vs.
online media vs. cloud services), the options available
to the providers in terms of what they can promise to
customers can vary significantly. In the case of media
services, for example, the tiers can include different base
quality levels based on resolution and encoding, but also
different assurances on the delivered quality itself. For
non-media services, the tiers could be based on guaranteed
resource allocation or response times.
For the successful selling of any kind of product, the
information disclosure process is key, whether it refers to
advertisement activities or clearly conveying the essence of
the product itself to the customer. Today’s network and
service market is dominated by figures and notions that
are difficult to communicate or even measure. In common
SLAs, network performance aspects may be conveyed in
terms of QoS metrics or may even only be specified by
aggregate bandwidth estimates or best effort rates. These
are not terms that end users understand or necessarily care
about, as they are not easily relatable to their experience
when using a given service. So far, there are no means
to market services with QoE guarantees to end-users.
We propose the concept of Experience Level Agreements
(ELAs) to both enable the effective communication of the
QoE to be expected for a set of services and to foster
new business practices based on providing minimum
QoE guarantees to the users in terms they can actually
comprehend.
A. Definition
In line with the SLA definitions given above, we can
define an ELA as a special type of SLA designed to establish
a common understanding of the quality levels that the
customer will experience through the use of the service,
in terms that are clearly understandable to the customer
and to which he or she can relate.
Syntactically, the ELA can be very similar to a common SLA⁵, i.e., a product with a defined quality — availability, consistency of performance, or resources — and price is sold for a specified period of time by a provider to a customer. The existing frameworks for defining SLAs
should suffice to formally specify ELAs as well. However,
whereas SLAs comprise a set of low-level performance
metrics (e.g., QoS, availability), the ELA conveys the
performance of the service in terms of QoE (and QoE
only); possibly as a set of QoE indicators to which the
user can readily relate. These could be, for example, some
representation of MOS scores as star ratings, though as we
will discuss later, it is likely that new means for conveying
QoE information to users will be needed.
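As a simple illustration of the last point, the mapping below renders an internal MOS estimate as the kind of star rating an ELA could expose to customers; the bucket boundaries are arbitrary choices for the example and would, in practice, have to be agreed upon between provider and customer.

```python
def stars_from_mos(mos: float) -> str:
    """Render an internal MOS estimate (1..5) as a customer-facing star rating."""
    buckets = [(4.3, 5), (3.8, 4), (3.3, 3), (2.5, 2)]  # (minimum MOS, stars), illustrative only
    stars = next((s for threshold, s in buckets if mos >= threshold), 1)
    return "★" * stars + "☆" * (5 - stars)

print(stars_from_mos(4.1))  # ★★★★☆
```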
B. Scope
It is important to distinguish between consumer and
professional customer markets (expert users, wholesale
customers, corporations, etc.). ELAs, as proposed herein,
provide a clear way to convey the complex nature of
network and service quality to consumers and some
enterprise-type customers, and their exploration may indi-
rectly assist other commercial users through the availabil-
ity of the QoE monitoring tools or experience simulation
facilities required to operationalize ELAs.
Another restriction concerns the general applicability of
QoE-differentiated services and thus ELAs. Due to the
complexity and risk involved in providing service experience guarantees between any destination pair for a certain service type (e.g., due to transit service agreements), the service usage needs to be geographically narrowed down, e.g., to a small region involving a limited number of ISPs.
This line of thought also leads to focusing on services for
which we are capable of both measuring the QoS and/or
QoE (e.g., active monitoring) at the user premises at peak
times for this service, and simulating the effect on the
QoE through “preview” capabilities, i.e., translating QoS
parameters to an experience. The actual ELA is then
handled via the client software used and is specific to this
service. Thus from the current point of view, a focus on
specifically selected Over-The-Top (OTT) services eases
the transition towards QoE marketisation via explicit
ELAs—see Fig. 1(a). More generic arrangements may be
enabled at later points in time (cf. Fig. 1(b)), as the challenges are overcome.

⁵In what follows, we will refer to those SLAs that are not ELAs simply as SLAs.

Fig. 1. The ELA ecosystem making use of SLA and QoE concepts: (a) specific service contracts; (b) service-independent agreements.
C. ELA vs. SLA
ELA and SLA need to coexist in an end-to-end system,
where the SLA is the interface with the service and
content provider whereas the ELA is the interface with
the end users (cf. Fig. 1). The relationship between ELA
and SLA is analogous to that between QoE and QoS,
forming a chain from users to their back-end realization.
One commonly-studied research question in QoE relates
to the creation of mappings to translate QoS to some
dimensions of QoE (usually perceptual ones) and vice-
versa. An analogous mapping will be needed to derive SLA
parameters from the ELA, and conversely, to bound ELAs
based on the SLA parameters. In this sense, ELAs cannot
in general directly involve QoE (i.e., as experienced by the
user and including emotional, socio-economic and other
user factors) but rather an objective representation of it,
agreed upon by both providers and users.
D. In Operations
Operationally, experience levels need to be captured and
transferred to a contractual form. This could, for example, be achieved by experience simulators that allow measuring the user's quality sensitivity for different service types. Based on this assessment, users could choose their desired experience level on a quality scale depicting the available quality tiers (Olympic model, ACR-5, star ratings, or other), price, and typical service usage scenarios.
This would yield an ELA choice reflecting both quality
sensitivity and service preferences, which would then be
automatically translated to QoS parameters, via the QoE
models used, as done for instance in [21]. Whenever
service-aware QoS parameters cannot be explicitly defined,
QoS indicators may provide descriptors for aggregate QoS
bounds—e.g., peak bandwidth up to 10Mbps, latency
smaller than 150ms for the specified set of services at the
tested location. QoS parameters are essentially required in the core network and across network domain borders, due to the absence of direct customer contact and the need to aggregate the demands of individual usage flows. For the business side of ELAs, both customers and ISPs will require certainty about the imminent contract, for which several strategies may exist. Firstly, ISPs may not only aim at assisting customers in understanding the product offer (e.g., via experience simulators), but they may also perform active network measurements in order to understand the network at the customer premises, i.e., probing the QoS at peak times. By studying the performance of the underlying network and infrastructure, the set of services for which QoE guarantees can be provided can be identified. The validation of an ELA, moreover, requires reliable and trustworthy information accessible to both contract parties. In practice, the measured QoS could be translated, for a given service, into a QoE level estimate (i.e., an implicit QoE monitor), on the basis of which contract satisfaction is assessed.
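A minimal sketch of such an implicit QoE monitor is given below; it reuses the illustrative loss-to-MOS mapping from Sec. II-B, estimates the QoE from QoS probes taken at peak times, and assesses contract satisfaction as the fraction of probes that reach the agreed MOS target. All thresholds, fees, and refund fractions are assumed values, not a proposal for concrete contract terms.

```python
import math

def mos_from_loss(loss_pct: float) -> float:
    """Illustrative exponential loss-to-MOS mapping (assumed coefficients, cf. Sec. II-B)."""
    return 1.0 + 3.5 * math.exp(-0.4 * loss_pct)

def ela_satisfied(peak_loss_samples, target_mos=3.5, required_fraction=0.95) -> bool:
    """The ELA is considered met if enough peak-time probes reach the target MOS."""
    ok = sum(1 for loss in peak_loss_samples if mos_from_loss(loss) >= target_mos)
    return ok / len(peak_loss_samples) >= required_fraction

def monthly_payback(peak_loss_samples, monthly_fee=30.0, refund_fraction=0.2) -> float:
    """Refund a fixed fraction of the monthly fee if the agreed QoE level was not met."""
    return 0.0 if ela_satisfied(peak_loss_samples) else monthly_fee * refund_fraction

probes = [0.1, 0.3, 2.5, 0.2, 0.4]  # packet loss (%) measured at peak times
print(monthly_payback(probes))      # 6.0: one probe out of five misses the target
```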
Secondly, ELA validation and the QoS-to-QoE translation could be based on the closest available access speed and QoS measurements, combined with crowd-sourced QoE ratings (by actual customers). On that account, a less cost-intensive solution
may be constructed. For example, ISPs, regulators or an-
other objective third party could use existing network QoS
monitoring infrastructure in order to publish aggregate
results (possibly with regional granularity) for QoS at peak
times. The results would then be automatically mapped to
aggregate QoE levels for commonly used services, e.g.,
YouTube, Netflix, Skype, and may feed ELA validation
mechanisms. Particular solutions for the operationaliza-
tion of ELAs are beyond the scope of this paper.
In any case, penalties, as compensation for not meeting the agreed QoE levels and for the lost free or working time, may then be issued in the form of monthly paybacks or vouchers for future service usage. Those refunds may be (partially) covered by insurance or may directly
affect the business figures. It is implicit in this framework
that on average, the added costs for operators, related
to penalties, will be offset by adequate pricing strategies.
That is, more demanding ELAs will carry heftier prices,
and possibly larger expected margins than best-effort
service tiers.
E. But, Why ELAs?
As discussed so far, and further in the following Sec-
tion, it seems pertinent to address the main reasons for
introducing the concept. A large majority of connectivity
options and services marketed today to consumers share
one or both of the following characteristics: they are
provided on a best-effort basis, and they are mostly sold
on a flat-rate pricing model⁶.
This has led, in many services, to a “race to the bottom”
effect in terms of pricing, which in the long run does not
benefit service providers, who see lower margins, or cus-
tomers, who are stuck with whatever quality of experience
the provider is able to deliver on this pricing model. On the
other hand, some studies [22] indicate that a non-trivial
percentage of customers are indeed willing to pay more for
better quality, with varying degrees of enthusiasm, ranging
from conservative spending, to higher levels of spending
which may even seem irrational. This, in principle, enables
new business opportunities for the service providers, who
can better address different segments of the market by
offering different QoE levels at suitable price points.
It stands to reason, then, that if this type of pricing differentiation is put into practice, there will be a need for
users to make sure that they get their money’s worth of
QoE, and for providers to be accountable when they don’t.
This is precisely what an SLA is meant to do. However,
the traditional approach to SLA definition and monitoring
is not necessarily a good match for end-users, hence the
proposed ELA concept.
IV. The Challenges of ELAs
The idea of integrating QoE into SLAs, either implicitly or as user-facing ELAs, seems, as argued above, like a natural progression, in the same vein as the evolution from QoS towards QoE. There are, however, a number of open issues
that need to be worked out before this transition can
take place. In this section we discuss the main research
challenges and questions we have identified in this area.
A. Framework
Today, SLAs for communication services are not widespread for consumer-level use. Because of this, ELAs cannot yet build upon an existing and sufficient infrastructure
involving consumers, all involved ISPs, and potentially
also content providers. In particular, automatic mecha-
nisms for simplifying the contractual negotiations and
agreements cannot be assumed to be present. In 2013, the EU FP7 project ETICS⁷ concluded by proposing an automated end-to-end QoS agreement concept based on SLAs [23], which, however, does not include consumers. Despite this restriction, the complexity of the proposed mechanism potentially explains the limited endeavors for adopting similar concepts in the industry. Apart from this, services and their customers are in general spread around the globe, thus introducing location considerations and requiring fine-grained end-to-end service quality monitoring in order to attribute contract breaches to subcontracting ISPs.

⁶Network services do often have pricing tiers based on speed or data transfer caps; higher speeds and higher transfer caps are more costly. However, these tiers provide bounds on how well the connection can perform, rather than guaranteeing that it will perform "at least this well" for any particular service.
⁷https://www.ict-etics.eu/, last accessed: 2015-01-29.
This entire range of SLA issues is very likely inherited by ELAs, which mainly differ in their parameter selection and semantic interpretation. Proper mechanisms have to address both the background transactions required to enable ISP cooperation (binding contracts, revenue sharing, etc.) and their automated translation into consumer-facing contracts. For this reason, ELA frameworks should
• be based on automatic mechanisms (for end-to-end solutions);
• be based on agreed-upon, measurable, technically valid, communicable, and understandable metrics;
• be resistant to regional usage variations, e.g., switching service caches or service providers requires statistical modeling or other kinds of treatment;
• come with a cooperation framework among providers as envisioned by [23].
Likewise, a standard set of APIs and a suitable monitoring architecture are also needed, in order to simplify
the inter-domain (not only between carriers, but also be-
tween carriers and service/content providers) interaction
required. For the QoE monitoring aspects, an architecture
such as the one recently proposed by ETSI [24] could be
a good starting point.
B. Language
It is challenging to describe an ELA in a single language that can express technical quality requirements (e.g., QoS) while being easily understood by customers. ELAs should thus
• be expressed both formally and in terms understandable by customers, the latter in terms of QoE. This poses some non-trivial questions on how to convey what a certain quality level (e.g., a score of 4 on a 5-point ACR scale) actually feels like;
• be convenient to measure by both service providers and customers, i.e., they should be able to be quantified, guaranteed, validated and maintained (e.g., in order to also reduce complaint management efforts);
• be consistent across users and platforms, i.e., they should be applicable to a range of user profiles in the service domain, and to all the devices with which users access the service in question⁸.

⁸This may actually vary depending on what service is under consideration.
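To make the first two requirements concrete, the following sketch pairs a formally checkable QoE target with the customer-facing wording an ELA would carry alongside it; the field names, values, and phrasing are hypothetical and not taken from any existing SLA specification language.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ElaClause:
    service: str              # e.g., "video streaming"
    target_mos: float         # formal, measurable QoE target (5-point ACR scale)
    required_fraction: float  # fraction of sessions that must reach the target
    customer_text: str        # the same promise in customer-understandable terms

    def is_met(self, session_mos_estimates: List[float]) -> bool:
        """Formal check of the clause against per-session QoE estimates."""
        ok = sum(1 for m in session_mos_estimates if m >= self.target_mos)
        return ok / len(session_mos_estimates) >= self.required_fraction

clause = ElaClause(
    service="video streaming",
    target_mos=4.0,
    required_fraction=0.9,
    customer_text="At least 9 out of 10 of your streaming sessions will look good or better.",
)
print(clause.is_met([4.2, 4.5, 3.1, 4.4, 4.6, 4.1, 4.3, 4.0, 4.2, 4.4]))  # True
```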
C. Marketing
One significant challenge in implementing ELAs lies
not in the technical aspects, but rather on the marketing side. The prevailing "best effort / flat rate" approach to
selling online services and network connectivity creates a
strong inertia, which may prove difficult to overcome. For
example, while sophisticated pricing schemes for connec-
tivity with differentiated QoS have been studied for a long
time [25], [26], their implementations remain elusive.
Similar issues are likely to occur when considering differ-
entiated QoE levels, unless effective marketing strategies
can be developed to address this problem.
V. Conclusions and Future Work
In this paper we have introduced the concept of Expe-
rience Level Agreements (ELA), as a QoE-oriented aug-
mentation of SLAs, with the aim of enabling new business
models based on providing different QoE guarantees for
users of online services. While the concept is easy to
motivate from the business / economic perspective, and
some studies suggest that users are indeed willing to pay
more for better quality, we have described several significant challenges — both technical and business-related — that must be addressed before ELAs can become operational.
Future work in this domain will address the challenges
explored in this paper, as well as expand the scope of ELAs
to business applications, e.g., for SaaS-type use cases.
Acknowledgements
Part of this work has been funded by the Euro-
pean Community’s 7th Framework Programme under
grant agreement no. 611366 (PRECIOUS); see www.
thepreciousproject.eu for further details. M. Varela’s work
was partly funded by Tekes, the Finnish Funding Agency
for Technology and Innovation, in the context of the Celtic
Plus QuEEN project.
References
[1] ITU-T, "Recommendation E.800 - Definitions of Terms Related
to Quality of Service,” 2008.
[2] ISO/IEC, "7498-1: Open Systems Interconnection – Basic Reference Model: The Basic Model," 1994.
[3] J. Crowcroft and T. Roscoe, “QoS’s Downfall: At the bottom,
or not at all,” in Proceedings of the ACM SIGCOMM 2003
Workshops, 2003, pp. 109–114.
[4] M. Varela, L. Skorin-Kapov, and T. Ebrahimi, “Quality of
Service vs. Quality of Experience,” in Quality of Experience
– Advanced Concepts, Applications and Methods, 1st ed., ser.
T-Labs Series in Telecommunication Services, S. Möller and
A. Raake, Eds. Berlin: Springer, 2014, pp. 35–54.
[5] ITU-T, "Recommendation P.10/G.100 Amendment 2 -
New definitions for inclusion in Recommendation ITU-T
P.10/G.100,” 2008.
[6] P. Le Callet, S. Möller and A. Perkis, Eds., “Qualinet White
Paper on Definitions of Quality of Experience (2012),” Jun.
2012.
[7] M. Fiedler, T. Hoßfeld, and P. Tran-Gia, “A Generic Quantita-
tive Relationship Between Quality of Experience and Quality of
Service,” Network, IEEE, vol. 24, no. 2, pp. 36 –41, march-april
2010.
[8] P. Reichl, S. Egger, R. Schatz, and A. D’Alconzo, “The Loga-
rithmic Nature of QoE and the Role of the Weber-Fechner Law
in QoE Assessment,” in Communications (ICC), 2010 IEEE
International Conference on. IEEE, 2010, pp. 1–5.
[9] L. Blasi, G. Brataas, M. Boniface, J. Butler, F. D’andria,
M. Drescher, R. Jimenez, K. Krogmann, G. Kousiouris,
B. Koller, G. Landi, F. Matera, A. Menychtas, K. Oberle,
S. Phillips, L. Rea, P. Romano, M. Symonds, and
W. Ziegler, “Cloud Computing Service Level Agreements
– Exploitation of Research Results,” European Commission
Directorate General Communications Networks, Content and
Technology, Tech. Rep., June 2013. [Online]. Available: http://ec.europa.eu/digital-agenda/en/news/cloud-computing-service-level-agreements-exploitation-research-results
[10] ITIL, "Glossary of Terms, Definitions and Acronyms, v3,"
May 2007. [Online]. Available: http://www.best-management-
practice.com/gempdf/ITIL_Glossary_V3_1_24.pdf
[11] TM Forum, "SLA Management Handbook: Volume 2, Concepts and Principles, v2.5," 2005.
[12] I. Aib and B. Daheb, SLA Driven Network Management.
ISTE, 2010, pp. 219–246. [Online]. Available: http://dx.doi.
org/10.1002/9780470612118.ch11
[13] TM Forum, "GB045 - SLA Management Handbook, vol. 4," 2004.
[14] E. Toktar, G. Pujolle, E. Jamhour, M. Penna, and M. Fonseca,
"An XML model for SLA definition with key indicators," in IP
Operations and Management, ser. Lecture Notes in Computer
Science, D. Medhi, J. Nogueira, T. Pfeifer, and S. Wu,
Eds. Springer Berlin Heidelberg, 2007, vol. 4786, pp. 196–
199. [Online]. Available: http://dx.doi.org/10.1007/978-3-540-
75853-2_20
[15] H. Batteram, G. Damm, A. Mukhopadhyay, L. Philippart,
R. Odysseos, and C. Urrutia-Valdes, “Delivering quality of ex-
perience in multimedia networks,” Bell Labs Technical Journal,
vol. 15, no. 1, pp. 175–193, June 2010.
[16] TM Forum, "GB938 - Application Note to SLA Management
Handbook: Video over IP (v0.8),” 2007.
[17] M. N. Garcia, R. Schleicher, and A. Raake, “Impairment-
factor-based audiovisual quality model for IPTV: influence
of video resolution, degradation type, and content type,”
EURASIP Journal on Image and Video Processing, vol.
2011, no. 1, p. 629284, Mar. 2011. [Online]. Available: http://jivp.eurasipjournals.com/content/2011/1/629284/abstract
[18] M. Garcia and A. Raake, “Parametric packet-layer video quality
model for IPTV,” in 2010 10th International Conference on
Information Sciences Signal Processing and their Applications
(ISSPA), May 2010, pp. 349–352.
[19] A. C. da Silva, M. Varela, E. de Souza e Silva, R. Leão, and
G. Rubino, “Quality assessment of interactive real time voice
applications," Computer Networks, vol. 52, pp. 1179–1192, Apr.
2008.
[20] ITU-T, “Recommendation G.107 - The E-model: A
Computational Model for Use in Transmission Planning,”
2011. [Online]. Available: http://www.itu.int/
[21] P. Frangoudis, A. Sgora, G. Rubino, and M. Varela, “QoE-driven
Optimal SLA Selection for Enterprise Cloud Communications,”
in Proceedings of the First IEEE Workshop on QoE-Oriented
Network and Application Management (QoENAM), Sydney,
Australia, Jun. 2014.
[22] A. Sackl, P. Zwickl, and P. Reichl, “The trouble with choice: An
empirical study to investigate the influence of charging strate-
gies and content selection on QoE,” in Network and Service
Management (CNSM), 2013 9th International Conference on,
Oct. 2013, pp. 298–303.
[23] FP7 ETICS, Deliverable D4.4: Final ETICS Architecture and
Functional Entities High Level Design, P. Zwickl and H. Weis-
grab, Eds.
[24] ETSI, “TS 103 294 Quality of Experience – A Monitoring
Architecture," Dec. 2014.
[25] R. Cocchi, S. Shenker, D. Estrin, and L. Zhang, "Pricing
in computer networks: Motivation, formulation, and example,”
IEEE/ACM Transactions on Networking, 1993.
[26] P. Maille and B. Tuffin, “Pricing the internet with multibid
auctions," IEEE/ACM Transactions on Networking, vol. 14, no. 5, pp. 992–1004, Oct. 2006.