The Computer Games Journal
https://doi.org/10.1007/s40869-019-00084-2
RESEARCH
Toward Greener Gaming: Estimating National Energy Use
and Energy Efficiency Potential
EvanMills1· NormanBourassa2· LeoRainer3· JimmyMai4· ArmanShehabi5·
NathanielMills6
Received: 9 May 2019 / Accepted: 6 September 2019
© This is a U.S. Government work and not under copyright protection in the US; foreign copyright protection
may apply 2019
Abstract
Rising computing power, improved graphics quality, higher-resolution displays, and
streaming delivery have rendered computer gaming an increasingly energy-intensive
activity. However, the role of gaming-related energy use, and how it varies across
platforms, has not been substantively examined by the energy or gaming research
communities. We measured the energy consumption of 26 gaming systems repre-
senting the spectrum of technology, price, and performance. Among the findings,
energy use varied widely by hardware, but equally widely depending on which of
37 game titles or 11 benchmarks were run. Cloud-gaming energy use in datacenters
and networks is markedly higher than that for local gaming. Virtual-reality gam-
ing can use significantly more or less energy than gaming with conventional dis-
plays, depending on hardware and software choices. In aggregate, we find that gam-
ing represents $5 billion per year in energy expenditures across the United States or
34 TWh/year (2.4% of residential electricity nationally), with 24 MT/year of associ-
ated carbon-dioxide emissions equivalent to that of 85 million refrigerators or over
5 million cars. Targeted hardware and software strategies can reduce the gaming
energy use by approximately half, while maintaining or improving metrics of user
experience. In addition to system designers, gamers and game developers can play a
significant role in managing the energy required for gaming.
Keywords Energy· Power· Efficiency· Performance· Carbon emissions· Green
gaming
* Evan Mills
emills@lbl.gov
Extended author information available on the last page of the article
1 Introduction
A third of humanity plays computer games, and an even higher proportion does so
in the U.S.—about two-thirds of the population (Nielsen 2018). Yet, prior to the
work described in this article, there were no comprehensive estimates of energy
use for this widespread activity. Nor were there corresponding assessments of
how the associated energy demand might evolve in the future or of the potential
for improved energy efficiency.
The per-system energy used for gaming is higher today than in the early days
of the activity. The extremes of this spectrum are defined by the 1970s-era Pong
game at ~10 W per system when played on the original consoles, versus today’s
highest-performance purpose-built gaming PCs, with the potential to draw close
to 700 W. This trend has been accompanied by a growing installed base
of gaming devices together with increasing amounts of time spent gaming (Mills
et al. 2017). This can give rise to a “false-choice” perception of unavoidable
trade-offs between gaming user experience and energy efficiency, yet in recent
years the gaming industry has demonstrated an ability to improve these two fac-
tors simultaneously. That said, limited measures of performance increases have at
times eclipsed absolute energy use reductions that would otherwise have resulted
from efficiency improvements. While the evolution of gaming technology has
been marked by improvement in metrics such as frames per second per watt,
and even declining absolute energy use per system in some cases, the energy use
among the highest-power systems continues to increase. Complicating efforts to
assess progress, frame rate is a highly limited metric for characterizing user expe-
rience, although the gaming industry and gamers themselves focus heavily on it.
The literature on gaming energy use has focused almost exclusively on game
consoles (Urban et al. 2017; Microsoft, Nintendo, and Sony Interactive Entertain-
ment 2017). Until recently, only one formal study had looked in depth at gaming
on desktop computers (Mills and Mills 2015), and no work had been published
regarding gaming on laptops or with television-linked media-streaming devices
such as Apple TV or Android TV. Neither had the energy used by servers and
associated networks for cloud-based gaming—where graphics processing occurs
in a remote data center—been quantified. The energy impacts on PC gaming sys-
tems of many specific ancillary components, e.g., virtual reality (VR) equipment
and high-end displays, had also not previously been analyzed. The effect of game
choice on gaming device energy use had been examined in a limited fashion in
the case of PlayStation consoles (Koomey et al. 2017), but not on other consoles
or gaming computers. The sensitivity of gaming energy use to user behavior (e.g.,
hours spent gaming) had also not been described in the open literature. Aggregate
U.S. energy use by the full range of gaming platforms had not been estimated.
Paralleling the lack of technical research, no U.S. energy policies or programs
have focused on the gaming end use. Computer energy labeling programs and
standards do not currently consider energy use in the active-gaming mode, and
the computer energy standards, such as those in California, tend to exempt high-
performance PCs used for gaming. Thermal Design Power (TDP) ratings are very
poor proxies of actual electrical wattage (varying from 45 to 76% of actual for
CPUs and 63 to 11% of actual for GPUs). Voluntary ENERGY STAR labels
for displays and televisions, and the 80 PLUS rating program for conventional
computer power supplies targeting mainstream computers have limited spillo-
ver benefits for gaming systems. Utility incentive programs, consumer education
campaigns, public-goods R&D, and other time-tested energy policy tools have
also not been applied with the goal of improving energy efficiency and otherwise
managing gaming energy use. Policymakers do not independently track or fore-
cast energy demand for computer gaming, which results in this important end use
being largely obscured or subsumed into vague categories such as “other energy
uses”. The lack of standardized energy measurement protocols and energy-per-
performance metrics impedes progress towards quantifying and managing gam-
ing energy demand. For example, the ratio of frame rate to electrical power does
not consider a broad range of proxies (often unmeasurable) of user experience,
and can even run contrary to other measures of user experience.
This article is an adaptation of the executive summary of a multi-year research
project (Mills et al. 2018). The analysis of aggregate energy use is expanded here
from the state of California to the U.S. as a whole. We find that computer gaming
is one of the most significant residential electric “plug loads”, and among the most
complex energy end uses to understand and manage. Best practices can achieve sig-
nificant energy savings, without compromising (and in some respects improving)
user experience.
2 Project Approach andMethodology
This article summarizes our previous detailed assessments of the computer-gaming
marketplace (technology choice and user behavior). Among the contributions to the
literature, the article:
• Provides in-depth baseline energy use measurements across the spectrum of rep-
resentative gaming devices,
• Quantifies the per-unit energy savings potential, and
• Creates aggregate energy demand scenarios and identifies policy options for sav-
ing energy while maintaining or improving performance and user experience.
We established a green-gaming laboratory to evaluate the power requirements of
gaming equipment. The lab allows us to log power use and frame-rate/quality for
each gaming system in both gaming and non-gaming modes. A data acquisition
platform was also developed to aggregate and analyze the large volumes of
information collected. System Power was measured using a Chroma 66202 Digital
Power Meter, which can measure power at 0.1 mW resolution with an accuracy of
0.1% of reading + 0.1% of range. Data were recorded on a 1-s basis and included
power (watts), rms voltage, rms current (amps), power factor, THDi (%), and
THDv (%). Component Power was measured using a Measurement Computing
USB-1608FS-Plus data acquisition system, which samples at 50 kHz. Currents
were measured using Pico TA234 30-amp current clamps. Voltage, current, and
power (V × A) were recorded on a 1-s average basis. Video Image Output was
captured using a Datapath VisionSC-DP2 capture card capable of capturing 4K
Ultra HD video at up to 60 fps. Video was stored in a RAID 0 (data striping) array
of three 500 GB SSD drives with a maximum write speed of 450 MB/s. The
maximum amount of data coming through with 4K 60 fps video, uncompressed
and uncropped, is about 1.39 GB/s. Frame Times were measured using multiple
methods, including FRAPS, PresentMon, and FCAT VR software running on the
PC system under test. The Nvidia FCAT testing process, which includes the
VirtualDub video analysis application, was used to analyze the video capture test
files. CPU and GPU Temperatures and CPU Power were recorded using Open
Hardware Monitor software to read embedded system sensors.
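As a minimal illustration of the arithmetic behind this logging (the function and sample values below are hypothetical, not the study’s measurements), 1 Hz power samples reduce to average power and total energy as follows:

```python
# Minimal sketch: reducing a 1-second power log (watts, sampled once per
# second, as a bench power meter would produce) to average power and energy.
# The sample data below are invented, not measurements from the study.
def summarize_power_log(watts):
    """Return (average watts, kWh) for a list of 1 Hz power samples."""
    avg_w = sum(watts) / len(watts)
    kwh = sum(watts) / (3600.0 * 1000.0)  # watt-seconds -> kWh
    return avg_w, kwh

# One hour of steady 300 W gameplay -> 300 W average, 0.3 kWh
avg_w, kwh = summarize_power_log([300.0] * 3600)
```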
We specified a set of 26 pre-built and custom-built systems (10 desktop PCs, 5
laptop PCs, 9 consoles, and 2 media streaming devices) that encompass the range of
functionality and user requirements sought in the marketplace circa 2016 (see online
appendix1 for descriptions of these systems). We tested all systems while connected
to a 24-inch 1080p 60 Hz display. A subset of these systems was then modified to
achieve energy savings using more efficient technologies and componentry available
in the marketplace. We evaluated various combinations of 37 popular games, 11
simulated frame-rate benchmarks, and 9 other gaming- and non-gaming-mode tests
on all of these systems. Games were selected to reflect popularity and matched with
compatible systems for testing—together representing 206 game-system combina-
tions [see Bourassa et al. (2018a) and online appendix for the pairings selected]. We
collected one-second data on energy use and component temperatures for all tests
and multiple proxies of user experience (frame rate and frame quality) for all gam-
ing-mode tests. We conducted 1109 tests, grouped into 13 categories. After account-
ing for tests redone to resolve bad or missing data and system configuration changes,
the final total of 876 unique parametric tests spanned a variety of variables and sen-
sitivity studies covering a multi-step duty cycle (ranging from “off” to “active gam-
ing”). Detailed results of all the tests are presented in Bourassa et al. (2018b).
To characterize the market structure, we segmented PCs used for gaming into
entry-level, mid-range, and high-end categories, based on price and computing
power (discrete versus integrated graphics, grade of graphics cards, etc.) (Mills
etal. 2017). We also identify four types of gamers on all platforms: Light, Moder-
ate, Intensive, and Extreme, which reflect the differing duty cycle, i.e., numbers of
hours per day of gameplay and other gaming and non-gaming modes engaged in by
the users of gaming systems. The initial in-depth analysis was performed for the
California marketplace, which has been extended here to the entire United States by
scaling the installed base of systems in California per the share of household com-
puter ownership in each state (U.S. Census Bureau 2016), and similarly weighting
state-level electricity prices and carbon emissions factors.
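The scaling step can be sketched as follows; the function and the state figures here are illustrative assumptions, not the census or installed-base values actually used:

```python
# Hypothetical sketch of the California-to-U.S. extrapolation: scale the
# installed base by each state's household computer ownership, then weight
# electricity prices by each state's share of systems. All numbers invented.
def scale_and_weight(ca_base, households, prices, ca_state="CA"):
    """households/prices: dicts keyed by state; ca_base: CA installed base."""
    bases = {s: ca_base * h / households[ca_state] for s, h in households.items()}
    total = sum(bases.values())
    price = sum(bases[s] * prices[s] for s in bases) / total  # weighted $/kWh
    return total, price

total, price = scale_and_weight(
    10e6, {"CA": 13e6, "TX": 9e6}, {"CA": 0.18, "TX": 0.11})
```

The same share-weighting applies to the state carbon emissions factors mentioned above.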
We thus consider the entire technology and behavioral “ecosystem”, treating gam-
ing as an activity involving diverse combinations of hardware and software and user
1 https://tinyurl.com/y5ca8xsq.
behaviors rather than isolated devices with generic sets of fixed “standard” utilization
assumptions. These ensembles of factors comprise the core gaming platform—includ-
ing computers (desktop and laptop), gaming consoles and media streaming devices—
together with a variety of peripheral devices including external audio, graphics pro-
cessing unit (GPU) docks, displays, televisions, local networking equipment, and VR
headsets and associated separately powered sensors, together with a wide range of
user-driven behavioral choices that also influence energy use. We assess the use of
these systems during gameplay and in modes other than gameplay (e.g., video stream-
ing, web browsing, idle, sleep, and off; see online appendix). We consider purpose-
built gaming equipment, as well as conventional computers that are used—sometimes
quite intensively—for gaming. The scope excludes low-power, battery-powered
gaming devices such as smartphones, which are used little if at all when connected
to AC power.
A foundational methodological question was whether to make measurements
with simulated benchmarks or actual games. Benchmarks offer intrinsic repeat-
ability, while the workload created by human gameplay will vary. We ran a set of
11 benchmarks on four representative gaming systems (laptops through High-end
desktops), and found an enormous range in results, most of which were higher than
actual gameplay by up to a factor of two. We found a weighted mix of the power
levels measured under Fire Strike’s “Graphics 1” and “Physics” tests to best represent
typical power during actual gameplay (there are cases, e.g., with system H1, where it
consistently over-predicts actual use) (Fig. 1).
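That weighted mix is arithmetically simple; the 70/30 weighting below is an invented placeholder, not the study’s calibrated split:

```python
# Hedged sketch: estimating typical gameplay power as a weighted mix of power
# measured under Fire Strike's "Graphics 1" and "Physics" tests. The 70/30
# weighting and the wattages are invented examples, not the study's values.
def gameplay_power_proxy(graphics1_w, physics_w, w_graphics=0.7):
    return w_graphics * graphics1_w + (1.0 - w_graphics) * physics_w

proxy_w = gameplay_power_proxy(420.0, 260.0)  # 0.7*420 + 0.3*260 = 372 W
```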
Because no single benchmark is appropriate for all gaming systems (and no bench-
marks are available for consoles or PC cloud gaming configurations), we identified a
set of popular games on which to test each system tier and developed a highly scripted
Fig. 1 Power under simulated benchmark is a reasonable proxy, although results under actual gameplay vary
test process (route and pace through game) for each configuration. One uncertainty
for human-based testing is the variance among testers. To explore this, we recruited
22 testers to run 89 trials spanning 11 game titles on a variety of systems and found
almost all results within ±10% of our standardized human bench test (Bourassa
et al. 2018a, b), giving us high confidence in our standardized test protocols.
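The tester-variance check above amounts to a simple relative-deviation screen, sketched here with invented numbers:

```python
# Sketch of the tester-variance screen: flag a human trial whose average power
# deviates from the standardized bench test by more than 10%. Values invented.
def within_tolerance(trial_w, reference_w, tol_pct=10.0):
    return abs(trial_w - reference_w) / reference_w * 100.0 <= tol_pct

ok = within_tolerance(318.0, 300.0)   # 6% above reference -> True
bad = within_tolerance(340.0, 300.0)  # ~13% above reference -> False
```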
3 Per-system Power and Energy Use
Not surprisingly, Mid-range and High-end desktop computers emerge as the highest
per-unit energy users (with notable exceptions, however). After a period of increases
in recent years, consoles have achieved absolute reductions in energy use for some
time (Urban et al. 2017) and in most cases consume less energy than desktop com-
puters on a per-unit basis.
The ranges of power use during active gameplay overlap among the various prod-
uct classes, and even more so during modes other than gameplay (Fig. 2a–d). For
example, the most powerful gaming laptop’s power use was greater than or equal to
that measured for two of the Entry-level desktops and one of the Mid-range desk-
tops, while many of the consoles used as much or more power than all but the high-
est-performance laptop and Entry-level desktop systems. User behavior (amount of
gameplay) has a significant influence on total energy use (Figs. 3a, b, 4a, b).
Media streaming devices are by far the least energy intensive gaming technology
locally, although when running cloud-based gaming services a far larger workload
manifests in data centers and intervening networks (Fig. 5) (see Mills et al. 2018
and online appendix for detailed assumptions). We have estimated the associated
Fig. 2 a–d Average system power during gameplay and non-gaming modes: 2016
“remote” power demand in the upstream network together with the data center host-
ing the servers performing the graphics processing at 520 W (more than most local
PCs). The corresponding value is 300 W for cloud gaming on consoles. For con-
ditions prevailing in 2016, cloud gaming adds approximately 40 to 60% to the oth-
erwise total local annual electricity use (all modes) for desktops, 120 to 300% for
laptops, 30 to 200% for consoles, and 130 to 260% for media streaming devices.
Cloud-based gaming is by far the most energy-intensive form of gaming via the
Internet (compared to traditional online gaming or downloading games), and while
the electricity intensity of networks is declining quickly (Aslan et al. 2017), the elec-
tricity intensity of data centers is improving more slowly.
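The overhead percentages above follow from straightforward arithmetic on the remote power figure; the annual hours and local-energy values below are invented for illustration:

```python
# Illustrative sketch of the cloud-gaming overhead: remote (network + data
# center) power, ~520 W for PC cloud gaming per the text, applied over annual
# cloud-gameplay hours and compared with all-modes local electricity use.
def cloud_overhead_pct(remote_w, gaming_h_per_yr, local_kwh_per_yr):
    remote_kwh = remote_w * gaming_h_per_yr / 1000.0
    return 100.0 * remote_kwh / local_kwh_per_yr

# Hypothetical desktop: 500 h/year of cloud gameplay, 600 kWh/year local use
pct = cloud_overhead_pct(520.0, 500.0, 600.0)  # ~43%, within the 40-60% cited
```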
Most gamers would insist that improvements in energy efficiency should not com-
promise the user experience. However, in this regard, gaming is arguably among the
most difficult energy-using activities to characterize in terms of energy requirements
per unit of performance (Bourassa et al. 2018a, b). Given that most forms of user expe-
rience (e.g., “fun”) are not readily measurable and thus highly subjective, we focused
on frame rates, together with frame quality, and other observations. By these limited
measures, user experience and energy use do not appear to be positively correlated.
Indeed, there are multiple indications that energy use can be reduced while maintain-
ing or even improving these proxies of user experience. We also examined component
temperatures as an important factor in equipment preservation and thermal com-
fort for gamers, along with the associated distracting fan noise often deemed undesirable by
Fig. 3 a, b Baseline unit energy consumption for desktops by user type and duty cycle
Fig. 4 a, b Baseline unit energy consumption for consoles by user type and duty cycle
Fig. 5 Network and cloud-gaming energy is significant—often more than half of total electricity use: 2016 conditions
gamers. We find that temperatures tend to decline as more efficient componentry is
introduced.
Figure6a, b show the range of results in our testing across a wide array of combi-
nations of hardware, games, and user settings. It is readily apparent that performance
expressed in terms of framerate doesn’t correlate with PC power. Figure6a shows that
it is not necessary to have high power levels in order to get high frame rates. Figure6b
shows that efficiency, expressed as fps/watt also varies widely, even at a given system
power level. Laptops tend to perform the best, while entry-level systems tend to be the
worst performers. Efficiency trends downward as overall system power increases.
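The fps/W proxy discussed above is computed from synchronized frame counts and average power over the same interval; a minimal sketch with invented numbers:

```python
# Minimal sketch of the frames-per-second-per-watt efficiency proxy, computed
# from a frame count and average power over the same interval. Values invented.
def fps_per_watt(frames, seconds, avg_watts):
    return (frames / seconds) / avg_watts

eff = fps_per_watt(3600, 60.0, 200.0)  # 60 fps at 200 W -> 0.3 fps/W
```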
Some detailed observations from our testing are summarized in Box 1 (2016
conditions).
Fig. 6 a, b Frame rate does not correlate with PC power: Laptop and desktops
4 Drivers ofDemand
We find that power requirements vary widely within gaming system categories
(PCs and consoles) and even within product tiers that make up the broad catego-
ries (Fig.2). Per-system energy use varies significantly depending on the technology
Box1 Key observations
Energy consumption and power during gameplay
• Across 26 systems tested, client-side electricity use ranged from 5 to over 1200 kWh per year, reflect-
ing equipment choice and usage pattern
• When grouped into three product tiers (averaged across user types, duty cycle, and games played),
annual PC electricity use varies threefold over ten desktops and sixfold over five laptops. Use varies
18-fold over nine consoles, and sevenfold over two media-streaming devices
• The fraction of total client-side energy used in gaming mode (across all product tiers and gamer types)
varies from 29 to 32% for PCs (laptops and desktops), 41% for consoles, and 7% for media-streaming
devices (70 to 75% for the “Extreme” gamer types)
• Some gaming laptops use more energy than Entry-level PCs used for gaming
• Some consoles use more energy than gaming laptops
• Across individual systems and game titles, average power during gameplay varied 12-fold for the
desktops, tenfold for the laptops, and 15-fold for the consoles. The two media streaming devices used
similar amounts of local power, ranging from 3 to 10 W
• The fraction of total energy used by the GPU ranges from 45 to 77% when in gaming mode, and is
surprisingly significant when in short-idle mode as well (12 to 33% of total energy use)
• Power in non-gaming modes for consoles is higher than for some PCs while gaming
• Unexpected spikes in PC power during short idle mode corresponded to an average of 9% of total
energy use above that of the expected idle state across all systems (up to 55% on one system). This sug-
gests a need for more realistic test procedures. We did not observe similar patterns for consoles
Operational factors
• Energy use varies more widely by gamer type (intensity of gameplay) than system type
• Annual electricity use (averaged across games) varies by user type by fivefold over ten desktop
systems, 12-fold over five laptops, 75-fold over nine consoles, and eightfold over two media-streaming
devices
• The effectiveness of gaming systems in trimming power use to maintain proportionality with workload
varies widely. Power use during gameplay is the same as that in non-gaming mode for some systems,
with the ratio (energy proportionality) rising to nearly 5:1 for the systems with the best power manage-
ment
• Under-/over-clocking three PCs decreased/increased gaming power by −26 to +37%
• Large variations in power use can occur with in-game user settings on PCs
Game choice
• Energy use while gaming on a given platform varies considerably depending on game choice: up to
3.5-fold (270 W) across various games on PCs and up to 1.6-fold (61 W) on consoles
• Energy use while gaming for a given game varies by eightfold and 21-fold for the two games playable
on the widest range of platforms in our sample (Sims and Skyrim, respectively)
• Energy requirements for PCs and consoles are not correlated with game genre
Displays
• High-definition 4k displays result in significant increases in energy used by PCs (25 to 64%), and
reductions in framerate, resulting in reduced energy efficiency. Consoles also experience increased
power when connected to 4k
• Multiple displays impose extra workload—all across the duty cycle—on PC systems
used, software settings, and game title played, as well as the intensity of gaming
behavior and broader duty cycle, including other uses of the systems such as video
streaming (Figs. 6, 7).
The proportion of total system energy used during gameplay varies widely. For
particularly intensive gamers, gaming can be responsible for up to 70% of total
annual energy use (across all system types), while for light gamers the value falls to
less than 10%.
The effectiveness of power management while in idle or standby mode varies
quite widely. In cases of poor power management, power requirements in non-gam-
ing modes are comparable to those during gameplay. We find that component name-
plate ratings for CPUs and GPUs do not agree well with measured maximum power
use—and manufacturer power ratings for motherboards do not exist—complicating
efforts to optimally design systems and estimate energy use.
Connected peripheral devices are separate energy-using plug loads in their own
right, but also create increased power requirements within the core gaming system
(primarily by working the GPU harder). For example, we found that ultra-high-
definition 4K displays significantly elevate gaming system power. The number of
displays is also a factor: our tests of three side-by-side high-definition (1080p)
displays on one High-end PC system showed a 25% (73 W) increase in base system
power while gaming, over and above the similar amount of power consumed directly by
the displays. Others have observed similar effects on consoles (Microsoft, Nintendo,
and Sony Interactive Entertainment 2017). Use of an external graphics card dock
boosted one laptop’s energy use by twofold and another’s by threefold. External
audio equipment (common among gamers) is relatively low power, but long opera-
tion times translate into significant added energy.
We initially expected VR image rendering loads to drive energy consumption
higher among High-end PCs. Energy use does increase in some of the VR config-
urations we measured, particularly when considering that the existing 2-dimen-
sional display is still used in conjunction with the VR headset (Fig. 8). How-
ever, we found that, compared to rendering on conventional 2D displays, which
Fig. 7 Measured gaming desktop component loads: the role of components varies significantly depending on duty cycle and product tier
requires corner-to-corner high-quality calculations to feed the full screen, VR
“foveated rendering” can take advantage of human cognitive attention factors and
eye anatomy to reduce rendering calculations, and hence power draw, for peripheral
regions of the user’s field of view. Consoles were more difficult to assess, but
energy use appears to be higher during VR operation.
While power and energy use vary dramatically depending on game choice
(Fig.9), we discovered that game genre is not a predictor of energy use (Fig.10);
games that appear relatively simple can use comparable amounts of energy as
high-intensity games due to the quality of imagery and visual effects used. We
found that even the relative rankings of energy use for a given game also varied
according to the system they are played on.
Fig. 8 PC Gaming power changes when shifting from 2D to foveated and full-resolution virtual reality
Fig. 9 The level and pattern of energy use varies considerably by game, even on a given platform: System M4
Users’ graphics-related settings also influence energy use for the desktops and
laptops (consoles vary graphics-related settings in order to maintain frame rate).
Across the range of these settings, system power varies mostly by less than 5%, with
the exception of VSync, which reduced energy use by up to 39% on some systems.
Conversely, in-game “mods” increased power by 35% in one case (Minecraft) we
evaluated.
We find that within each broad technology group (PCs, consoles, and media-
streaming devices), user behavior (duty cycle, game choice, settings) is a stronger
driver of unit energy consumption than technology choice.
A hypothetical “worst-case” setup, involving the average of our two High-end PC
systems, overclocking, three displays at 4K resolution, cloud-based gaming, and the
“Extreme” user profile would result in annual electricity use of 2560 kWh/year (at
2016 Internet network electricity intensity), which is more than double the Baseline
average unit energy consumption for that equipment tier.
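The duty-cycle arithmetic behind such annual figures can be sketched as follows; the mode powers and hours are invented placeholders, not the study’s measured values:

```python
# Illustrative duty-cycle sketch: annual energy is the sum over modes of
# power x hours. Mode powers and hours below are hypothetical placeholders.
def annual_kwh(mode_watts, mode_hours):
    """mode_watts and mode_hours: dicts keyed by mode name."""
    return sum(mode_watts[m] * mode_hours[m] for m in mode_watts) / 1000.0

kwh = annual_kwh(
    {"gaming": 600.0, "streaming": 150.0, "idle": 90.0, "sleep": 3.0},
    {"gaming": 1460.0, "streaming": 730.0, "idle": 2190.0, "sleep": 4380.0},
)  # roughly 1196 kWh/year for this invented profile
```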
On a per-system basis, annual operating costs can range from $5 to $1000/year
at U.S. average energy prices and more than twice that level in areas with higher
prices (see online appendix). The corresponding greenhouse-gas emissions also
vary widely, from 1 kg per year for low-power systems in areas with very “clean”
utility grids to 1000 kg per year for higher-power systems in regions with “dirty”
grids (see online appendix).
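The cost and emissions ranges follow directly from annual energy, regional electricity price, and grid carbon intensity; the inputs below are invented for illustration:

```python
# Sketch of the per-system cost and emissions arithmetic. The price and grid
# carbon intensity here are invented examples; both vary widely by region.
def annual_cost_and_co2(kwh_per_yr, usd_per_kwh, kg_co2_per_kwh):
    return kwh_per_yr * usd_per_kwh, kwh_per_yr * kg_co2_per_kwh

# Hypothetical system: 1200 kWh/yr at $0.13/kWh on a 0.45 kg CO2/kWh grid
cost, co2 = annual_cost_and_co2(1200.0, 0.13, 0.45)
```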
Aside from added network and cloud-based loads, the energy use of individual
gaming systems is declining in many cases (Mills et al. 2018), yet historically
increasing in aggregate due to an expanding overall installed base and modal
changes within the base, e.g., towards higher-performance systems and increased
time spent gaming (Mills et al. 2017). In recent years, overall demand for localized
Fig. 10 PC power in gameplay often does not vary by genre: 19 popular games
gaming has held roughly constant due to increased market share of consoles over more
energy-intensive desktop systems. An important unknown is whether the next gen-
eration of consoles will bring increased or decreased energy use. The adoption of
increasingly large and high-resolution displays, and in some cases virtual reality,
further increases energy use of the entire gaming setup. Other drivers include new
cloud-gaming entrants such as Google’s Stadia service, and the potential for smart-
phones to also become popular cloud-gaming devices.
5 Toward Improved Energy Efficiency
While measuring the energy efficiency of most energy-using systems is rather
straightforward (e.g., combustion efficiency for a furnace), gaming presents a par-
ticularly elusive situation. Comparisons among systems on the basis of power and
frame rates—a widely used, albeit crude, proxy for efficiency—showed dramatic
variation across all of our testing (Fig. 6). As an example of the limitations of this
metric, Pong would be deemed ten times more “efficient” (at 3 fps/W) than our best
High-end system (H2, at 0.3 fps/W), but this is of little significance given the vast
differences in actual service levels provided and user expectations.
Fortunately, there are more compelling contemporary examples of efficiency
opportunities. There remain enormous variations in energy efficiency of componen-
try offered in the market, suggesting room for improvement in the installed base. For
some games, our highest-performing desktop system used less energy than many of
the lesser desktop systems and less than even the PlayStation PS4 Pro. This, together
with our observations that power use declined in a wide range of proportions when
shifting from gaming to idle modes, indicates that systems integration is subopti-
mal in many cases. As another indicator of the opportunities, we found dramatically
oversized power supplies in almost all of the PC test systems (even at peak loads).
A concrete illustration of the efficiency potential of high-quality system
integration is provided by a comparison of Entry-level system E3 and High-end
system H2 (see online appendix). System E3 offers significantly lower frame rates
under the Fire Strike benchmark and inferior graphics processing capabilities, yet
draws only 23% less power in active gameplay and substantially more power in every
other mode (streaming, browsing, idle, sleep). Across the duty cycle, system H2
uses 33% less energy for light users (who spend less time in gameplay) and 3% less
energy even for extreme gamers (see data appendix).
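The duty-cycle arithmetic behind this comparison can be sketched as a weighted sum of per-mode power and hours. The mode powers and usage profile below are illustrative placeholders, not the measured E3/H2 values from the appendix; they simply show how a system with higher gaming power can still use less energy across the full duty cycle:

```python
# Annual energy (kWh) from per-mode power draw (W) and a usage profile (h/day).
MODES = ("game", "stream", "browse", "idle", "sleep")

def annual_kwh(power_w: dict, hours_per_day: dict) -> float:
    return sum(power_w[m] * hours_per_day[m] for m in MODES) * 365 / 1000

# Hypothetical light-user profile and mode powers (not the paper's data):
light_user = {"game": 1, "stream": 1, "browse": 2, "idle": 2, "sleep": 18}
e3 = {"game": 260, "stream": 90, "browse": 80, "idle": 70, "sleep": 5}
h2 = {"game": 340, "stream": 60, "browse": 50, "idle": 40, "sleep": 3}

# Despite the higher gaming draw, the well-integrated system wins overall.
print(annual_kwh(e3, light_user), annual_kwh(h2, light_user))
```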
The great strides made by console manufacturers [50% or more gaming power
reductions achieved during the lifecycle of the 7th-generation console systems
(Urban et al. 2017)] provide another “existence proof” that energy efficiency can
be improved, absolute energy use reduced, and user experience and market accept-
ance of the products simultaneously elevated. These qualitative patterns can also be
seen for best-in-class PCs, although increasing users’ performance aspirations (e.g.,
frame rates) have tended to cancel out potential reductions in absolute energy use in
many cases (Walton 2016, 2017). With properly specified components, however, we
found GPU savings on the order of 50% when upgrading our base systems, without
adverse effects on performance metrics.
The Computer Games Journal
Key trends in energy efficiency involve graphics processing and more efficient
display of imagery. Graphics processing units (GPUs) can continue to become more
efficient, as can the data centers and networks increasingly used to host and
deliver gaming content (Shehabi et al. 2018). We also found that the performance
of dual-GPU systems can be met or even exceeded with single-GPU designs that use
less energy. Efficiency improvements can also be achieved in central processing
units (CPUs), motherboards, power supply units, and cooling. Carefully chosen
displays can yield energy savings without compromising the visual experience.
We found an average 13% energy-savings potential from improving the power supplies
in our test systems. The trend towards foveated rendering for head-mounted VR
displays could go a long way towards managing future energy demand associated
with VR growth and, by reducing the volume of imaging data, support a transition
to wireless operation. Game developers can also play a role in designing games to
use energy more efficiently. It is important to keep in mind that improvements in
each of these areas can be offset, and even overwhelmed, by parallel trends and
desires in consumer user experience that translate into increased computational
workloads or more hours spent gaming.
The virtual-reality headsets we tested required from 15% less (52 W) to 38% more
(93 W) power during gameplay than the same system running a similar 2D game, with
the lower figure presumably attributable to programmatically limiting maximum
resolution to the central field of vision (foveated rendering). One of the VR
systems we tested had external sensors that draw 16 W, and these are likely often
left on when not in use.
Much of the potential for decreasing (or increasing) energy demand lies in
software. As one example of a software strategy, slowing the 2D screen refresh
rate to match the chosen display (VSync) resulted in 14 to 39% energy savings.
Conversely, overclocking of CPUs and GPUs is a popular strategy among users of
higher-end systems seeking faster frame rates, and can boost power use by 7% in
some cases, while underclocking these components reduced power use by up to 25%
in our testing. Note that the percentage savings for a given measure vary
depending on the system to which it is applied, the paired display, and the game
being played. Not all measures are applicable to all systems.
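As a rough illustration of why capping frame rate saves energy, consider a toy model in which GPU power rises roughly linearly with rendered frame rate above an idle floor. All coefficients below are invented for illustration and are not measured values:

```python
def gpu_power_w(fps: float, idle_w: float = 60.0, per_frame_w: float = 0.8) -> float:
    """Toy model: GPU power rises roughly linearly with rendered frame rate."""
    return idle_w + per_frame_w * fps

uncapped_fps, display_hz = 140, 60
p_uncapped = gpu_power_w(uncapped_fps)
p_capped = gpu_power_w(min(uncapped_fps, display_hz))  # VSync-style frame cap

# Frames rendered beyond the display refresh are wasted energy.
savings = 1 - p_capped / p_uncapped
print(f"{savings:.0%}")
```

With these assumed coefficients the cap trims roughly a third of GPU power, in line with the 14 to 39% range observed for VSync in our testing.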
Dynamic Voltage and Frequency Scaling (DVFS) involves changing power states in
real time to better match the resources actually required by the computing process
(e.g., graphics rendering in the case of gaming systems). The practice is
widespread for CPUs, but has only recently been applied to GPUs. A review found
energy savings as high as 54% depending on the nature of the workload, with
central values in the 20% range (Mei et al. 2016). The benefits of DVFS appear to
vary widely depending on the application (and the type of activity happening
within a gaming session). Games that default to exceptionally high frame rates,
such as World of Warcraft, are the best candidates. We tested AMD's Radeon Chill,
since it allowed active control of the DVFS feature for two game titles in our
testing line-up. The results for The Elder Scrolls V: Skyrim and The Witcher 3:
Wild Hunt showed negligible power savings when DVFS was in use.
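A minimal sketch of the DVFS idea, assuming a hypothetical table of frequency/voltage states and the standard dynamic-power scaling, where power grows with capacitance times voltage squared times frequency (all numbers invented):

```python
# Each state: (frequency GHz, voltage V). Dynamic power ~ C * V^2 * f.
STATES = [(0.8, 0.80), (1.2, 0.90), (1.6, 1.00), (2.0, 1.10)]
CAPACITANCE = 10.0  # arbitrary scale constant for this sketch

def power_w(freq_ghz, volts):
    return CAPACITANCE * volts**2 * freq_ghz

def pick_state(required_ghz):
    """Lowest-power state whose frequency still meets the workload's demand."""
    for f, v in STATES:
        if f >= required_ghz:
            return f, v
    return STATES[-1]  # saturate at the top state

full = power_w(*STATES[-1])       # always-max policy
dvfs = power_w(*pick_state(1.5))  # moderate rendering load
print(f"savings: {1 - dvfs / full:.0%}")
```

Because voltage falls along with frequency, power drops faster than performance, which is why reported savings can reach tens of percent on suitable workloads.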
We extensively upgraded one PC system in each of the performance tiers, achieving
52% average energy savings in gaming mode and 48% in non-gaming modes. Of note,
the energy use of the improved High-end PC system was in the range of that of the
base Entry-level system. In defining our efficiency packages, we looked closely at
a set of quantitative non-energy indicators, including frame rate, dropped frames,
proxies for stutter and system stress, and maximum GPU and CPU temperatures. In
virtually every case these indicators moved in the direction of improved user
experience as efficiency was improved (Mills et al. 2018).
6 Implications forEnergy Use Nationally
We combined our power measurements for individual systems with corresponding
installed base data and duty cycle profiles to estimate aggregate computer gaming
energy use across the United States. The 134 million gaming platforms existing
in the U.S. as of 2016 consumed 34 TWh/year, corresponding to a $5 billion/year
expenditure by consumers and 24 million tons of CO2-equivalent greenhouse-gas
emissions per year (equal to the emissions of 5 million typical passenger cars). These
values include energy use on the client side for the core system, displays, external
speakers, and local network equipment, as well as energy in upstream networks and
data centers associated with cloud-based gaming.2
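A back-of-envelope cross-check of these aggregates, using the 34 TWh/year total together with the price and emissions factors given in footnote 2 and the 400 kWh/year-per-refrigerator figure used later in this section:

```python
TWH = 34            # national gaming electricity use, TWh/yr
PRICE = 0.137       # $/kWh, weighted-average residential (U.S. EIA)
EMISSIONS = 0.71    # kg CO2e per kWh, marginal (U.S. EPA)
FRIDGE_KWH = 400    # kWh/yr per new refrigerator

kwh = TWH * 1e9
print(f"${kwh * PRICE / 1e9:.1f}B/yr")            # consumer expenditure
print(f"{kwh * EMISSIONS / 1e9:.0f} MT CO2e/yr")  # greenhouse-gas emissions
print(f"{kwh / FRIDGE_KWH / 1e6:.0f}M fridges")   # refrigerator equivalence
```

The products reproduce the rounded figures quoted in the text ($5 billion, 24 MT, 85 million refrigerators).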
Between 2011 and 2016, a shift to a less energy-intensive mix of gaming products
in the marketplace, together with improvements in display efficiency, roughly
offset the growth in electricity demand that would otherwise have occurred as the
number of systems in use increased. Beyond that, actual gaming electricity demand
fell considerably as a result of reductions in the electricity intensity of
internet infrastructure, which lowered the energy used for video streaming.
The collective energy use for computer gaming in 2016 equates to 2.4% of overall
residential electricity use, or, for perspective, that of about 85 million new
refrigerators (at 400 kWh/year per unit). This aggregate electricity use is
greater than that of residential freezers, electric cooking, conventional home
computing, clothes-washing, or dishwashing. It is half that of electric clothes
drying and one-third that of residential refrigeration (Fig. 11).
When allocating associated network energy, displays, and other peripheral loads to
their respective system types, consoles were responsible for 66% of the total
duty-cycle energy use for computer gaming in 2016, followed by desktops at 31%,
laptops at 3%, and media-streaming devices at less than 1%, with the shares
shifting toward PCs by 2021 in the Baseline scenario. When considering only energy
use at the core system level, PCs and consoles consume comparable amounts of
energy by 2021 (13 and 19 TWh/year, respectively).
In our Baseline scenario, the installed base of gaming systems is projected to
grow by 15% as of 2021, together with a structural shift towards more
energy-intensive regions of the technology spectrum for desktops and laptops,
while consoles' blended unit-energy consumption declines (Mills et al. 2017). The
resulting energy demand is 33.6 TWh/year, with alternate scenarios ranging from
29.4 to 76.9 TWh/year depending on the evolution of gaming methods and system
types (Fig. 12).

Footnote 2: Using weighted-average residential electricity prices of $0.137/kWh
(U.S. EIA 2017, 2019). Given the structure of most electricity tariffs, at the
marginal prices where this consumption actually occurs, costs would be about 50%
higher. The weighted-average marginal electricity emissions factor is 0.71 kg
CO2-equivalent per kilowatt-hour (U.S. EPA 2019).
With the trend towards consoles capturing an ever-larger share of total systems,
and their relatively low unit energy consumption, energy demand declines slightly
in the near term. In contrast, were the relative mix of device types and their
unit consumption to remain frozen at 2011 levels (the beginning of our analysis
period), year-2021 demand would be 55 TWh/year, 64% greater than the projected
baseline.
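The frozen-case comparison is simple arithmetic on the figures above:

```python
baseline_2021 = 33.6  # TWh/yr, projected Baseline demand
frozen_2021 = 55.0    # TWh/yr if the 2011 device mix and unit use persisted

# How far above the projected baseline the frozen case sits.
excess = frozen_2021 / baseline_2021 - 1
print(f"{excess:.0%}")
```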
Although energy efficiency per se is held fixed in the Baseline case, and despite
an increase in the total installed base, projected consumption decreases by 6%
from 2016 levels. This is due to structural shifts in the installed base towards
less energy-intensive gaming systems, i.e., increased market share of consoles and
declining electricity use among the newer consoles, as well as projected
improvements in internet electricity efficiency. As in the 2016 base conditions,
consoles remain the highest electricity-using component (in aggregate), followed
closely by electricity use in associated networks and data centers. With near-term
energy-efficiency improvements in hardware, firmware, and software, and assuming
that three-quarters of the stock turns over by 2021, aggregate demand falls to
28 TWh/year, a 17% reduction from the Baseline 2021 case and about 49% below the
frozen-efficiency-and-market-share case for that year. This scenario provides
service levels (user experience) similar to or better than the baseline scenario,
with added benefits to users including reduced distracting noise and heat
production, together with longer battery life for efficient laptop computers.
The following alternative scenarios define an envelope in which energy use, cost,
and emissions rise by as much as 114% or fall by as much as 28% from 2016 levels,
depending on market evolution (Fig. 13).
Fig. 11 Computer gaming consumes more electricity in the U.S. than some familiar residential uses
Surge in high-fidelity PC gaming and virtual reality: The greatest demand growth
from 2016 levels (114%) occurs in the "Surge" scenario, in which high-fidelity PC
gaming becomes more popular and PC electricity use consequently comes to dominate
the landscape. Meanwhile, network and cloud-based gaming electricity eclipses that
of consoles. Efficiency options could restrain growth to 48%.
Strong uptake of cloud-based gaming: Cloud-based gaming becomes wildly popular and
overall computer gaming electricity demand grows by 17%. Aggregate network and
data-center electricity use is larger than that used locally by consoles or PCs.
Efficiency options could constrain the growth to 2%.
Fig. 12 U.S. gaming energy demand in 2021: Baseline and three alternate scenarios
Shift to consoles: Electricity demand declines by 18% in the scenario in which
consoles replace half of the more energy-intensive PCs. Consoles become the
largest segment of electricity use, but in the context of lower combined demand
across all gaming activity. Efficiency options could reduce demand by 28%.
Other possible outcomes: A combination of increased cloud-based gaming and a
transition to more PC gaming could yield a far higher electricity demand
trajectory, around 97 TWh/year, corresponding to a near tripling of demand from
2016 levels.
The relative shares of the different gaming product families (PCs, consoles,
media-streaming devices) vary substantially among these scenarios. Up to 27% of
total energy shifts to data centers and the supporting Internet infrastructure.
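The envelope can be restated in absolute terms by applying the quoted percentage changes to the 2016 total of roughly 34 TWh/year:

```python
BASE_2016 = 34.0  # TWh/yr, 2016 national gaming electricity use

scenarios = {  # change vs. 2016, as quoted in the text
    "Surge (PC/VR)": +1.14,
    "Cloud uptake": +0.17,
    "Shift to consoles": -0.18,
}
for name, delta in scenarios.items():
    print(f"{name}: {BASE_2016 * (1 + delta):.1f} TWh/yr")
```

This spans roughly 28 to 73 TWh/year before efficiency measures, consistent with the envelope described above.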
7 Challenges andRecommendations
Gaming hardware and software manufacturers and energy policy makers, as well as
gamers themselves, will ultimately determine how energy demand changes and what
portion of the aforementioned energy-savings potential is captured in practice.
However, the lack of standardized energy test procedures and of information on
component and system efficiencies impedes adoption of better practices.
Third-party test results for identical components and systems vary across the
consumer-oriented literature.

Fig. 13 Enormous potential variations in U.S. computer gaming energy driven by
market structure, user behavior, and efficiency: 2011–2021
While gamers make intensive use of in-game diagnostics, energy use is not among
them. As instantaneous power-feedback capabilities become available, they should
be effectively delivered to the gamer. Where enabled, developers may consider
"gamifying" this information: gamers seek out goal-driven systems for scoring and
garnering merit, and carbon could be introduced as another such variable.
Many long-used energy policy strategies are applicable in the computer gaming
arena, including energy labeling, consumer information and education, voluntary
ratings, improved software, and manufacturer R&D. These techniques have been
applied to virtually every sphere of energy-use activity aside from gaming.
Mandatory system-level standards for gaming devices are highly problematic given
the inability to consistently and meaningfully benchmark a unit of energy use per
unit of service (performance) delivered (most of these services are highly
subjective and difficult or impossible to quantify), together with technologies
and software that are evolving more rapidly than regulatory processes can adapt.
Moreover, selecting a single metric upon which to base standards could stifle
innovation while failing to recognize true efficiency improvements and their
relation to user experience. Component-level standards may be more manageable,
e.g., regarding power management in CPUs, GPUs, or motherboards. In any case, it
is critical that often slow-moving policymaking does not inadvertently hobble, or
become irrelevant to, industry's established innovation process.
A related consideration is that autonomous energy-efficiency innovations in this
industry are occurring at a rapid pace, while much of the projected demand growth
is driven by consumer behavior rather than intrinsic component-level energy
performance. With these drivers in mind, a focus on reducing absolute energy use
per system, enhancing the currently deficient consumer-information environment,
and mounting behavioral campaigns holds particular promise.
Either PCs or consoles could drive future electricity demand growth, depending on
how the market evolves. That said, energy demand is lowest in the scenarios
dominated by consoles, whereas the scenarios in which substantial demand growth
occurs are driven by PCs. The majority of potential Baseline-scenario efficiency
gains (about two-thirds) comes from PCs, suggesting that attention to PCs is of
particular importance, especially given the paucity of such attention to date.
Irrespective of the client-side platforms chosen, the emergence of cloud-based
gaming calls for increased focus on energy efficiency in data centers and
networks.
Solid policy, as well as technological innovation, requires understanding the
market. This calls for establishing standardized energy test procedures that
adequately characterize user experience, ongoing assessment of emerging
energy-efficiency opportunities, improved understanding of user behavior as a
driver of demand, market tracking to understand the ever-changing installed base,
and incorporation of the burgeoning energy use of computer gaming into energy
demand forecasting.
Acknowledgements This work was sponsored by the California Energy Commission, under Agreement
#EPC-15-023 and benefitted enormously from engagement with many experts from the gaming industry,
other energy researchers, and real gamers, many serving on our Technical Advisory Committee. Input
on the PC side was provided by AMD (Donna Sadowy, Claudio Capobianco, Scott Wasson, and Justin
Murrill) and Nvidia (Tom Peterson, Phil Eisler, Sean Pelletier, Anjul Patney, Nick Stam, John Spitzer,
Luc Bisson, and Sean Cleveland). Representatives of the console industry, including the Entertainment
Software Association (Michael Warnecke) and Sony Interactive Entertainment America, Nintendo of
America, and Microsoft Corporation participated in a project workshop and other information exchanges.
Game developers providing input included Nicole Lazzaro, Bob King, and Tom Bui. Tom’s Hardware
(Fritz Nelson, Joe Pishgar, and Chris Angelini), PC Perspective (Ryan Shrout), and eXtreme Outer Vision
(Slava Maksymyuk) provided discussions about energy-per-performance assessment and consumer deci-
sion-making. Jon Peddie Research (Ted Pollak), Iowa State University (Douglas Gentile), Fraunhofer
USA (Kurt Roth), and Statistica (Liisa Jaaskelainen) laid important groundwork for our characterization
of the gaming marketplace. Colleagues at other research institutions provided in-depth exchanges, includ-
ing Jonathan Koomey (Stanford University), Pierre Delforge (NRDC), Peter May-Ostendorp (Xergy),
Douglas Alexander (Component Engineering), Vojin Zivojnovik and Davorin Mista (Aggios),
and the U.S. EPA's ENERGY STAR program (Verena Radulovic, Matt Malinowski, Ben Hill, and John
Clinger). Two dozen Berkeley Lab employees volunteered to intensively test an array of gaming rigs
under various operating conditions to enable us to measure energy use, performance, and user experience
under real-world conditions. Ian Vaino of Lawrence Berkeley National Laboratory’s Workstation Sup-
port Group provided workspace and support for the Green-gaming Lab, system procurement and assem-
bly, and our extensive testing process. Sarah Morgan served as Program Manager. Lawrence Berkeley
National Laboratory is supported by the Office of Science of the United States Department of Energy and
operated under Contract No. DE-AC02-05CH11231.
Funding This work was sponsored by the California Energy Commission.
Compliance with Ethical Standards
Conict of interest We have no financial or personal relationship with a third party whose interests could
be positively or negatively influenced by the article’s content.
References
Aslan, J., Mayers, K., Koomey, J. G., & France, C. (2017). Electricity intensity of internet data transmission: Untangling the estimates. Journal of Industrial Ecology. https://doi.org/10.1111/jiec.12630.
Bourassa, N., Rainer, L., Mai, J., Curtin, C. (2018a). Gaming systems energy performance measurements
& benchmark testing procedures report (Tasks 3&4). Report to the California Energy Commission
under project EPC-15-023.
Bourassa, N., Rainer, L., Mai, J., Curtin, C. (2018b). Final standardized test bed specification and find-
ings report (Task 5). Report to the California Energy Commission under Project EPC-15-023. Law-
rence Berkeley National Laboratory.
Koomey, J., Mayers, K., Aslan, J., Hendy, J. (2017). Performance benchmarks for consoles. Version 33
dated July 4.
Mei, X., Wang, Q., & Chu, X. (2016). A survey and measurement study of GPU DVFS on energy conser-
vation. Digital Communications and Networks, 3(2), 89–100.
Mills, E., Bourassa, N., Rainer, L., Mai, J., Shehabi, A., Mills, N. (2018). Green gaming: Energy effi-
ciency without performance compromise. Task 7 report. Report to the California Energy Commis-
sion under project EPC-15-023. Lawrence Berkeley National Laboratory.
Mills, N., & Mills, E. (2015). Taming the energy use of gaming computers. Energy Efficiency, 9, 321–338. https://doi.org/10.1007/s12053-015-9371-1.
Mills, E., Pollak, T., Bourassa, N., Rainer, L., Mai, J., Mills, N., Desroches, L.-B., Shehabi, A. (2017). An
energy-focused profile of the video gaming marketplace. Prepared for the California Energy Com-
mission by Lawrence Berkeley National Laboratory.
Nielsen. (2018). Games 360 U.S. Report: 2018.
Shehabi, A., Smith, S. J., Masanet, E., & Koomey, J. (2018). Data center growth in the United States: Decoupling the demand for services from electricity use. Environmental Research Letters.
Microsoft, Nintendo, and Sony Interactive Entertainment. (2017). Report on the 2017 Review of the
Game Console Self-regulatory Initiative.
Urban, B., Roth, K., Singh, M., & Howes, D. (2017). Energy consumption of consumer electronics in
U.S. homes in 2017. Boston: Fraunhofer USA Center for Sustainable Energy Systems.
U.S. Census Bureau. (2016). Ownership of computing devices. https://factfinder.census.gov/faces/tableservices/jsf/pages/productview.xhtml?pid=ACS_16_1YR_B28010&prodType=table. Accessed 1 June 2019.
U.S. EIA. (2017). Electric sales, revenue, and average price. United States Department of Energy, Energy Information Administration. https://www.eia.gov/electricity/sales_revenue_price/xls/table5_a.xlsx. Accessed 1 June 2019.
U.S. EIA. (2019). How is electricity used in U.S. homes? United States Department of Energy, Energy Information Administration. https://www.eia.gov/tools/faqs/faq.php?id=96&t=3. Accessed 1 June 2019.
U.S. EPA. (2019). Greenhouse gases equivalencies calculator. United States Environmental Protection Agency. https://www.epa.gov/energy/greenhouse-gases-equivalencies-calculator-calculations-and-references.
Walton, S. (2016). Then and now: Six generations of $200 mainstream Radeon GPUs compared. TechS-
pot. June 20.
Walton, S. (2017). Then and now: Six generations of GeForce graphics compared. TechSpot. September
12.
Aliations
EvanMills1· NormanBourassa2· LeoRainer3· JimmyMai4· ArmanShehabi5·
NathanielMills6
Nathaniel Mills
http://greeningthebeast.org
1 Building Technology andUrban Systems Division, Lawrence Berkeley National Laboratory,
University ofCalifornia, 1 Cyclotron Road, MS 90-R2058, Berkeley, CA94720, USA
2 Building Technology andUrban Systems Division, Lawrence Berkeley National Laboratory,
University ofCalifornia, 1 Cyclotron Road, MS 90-R3074, Berkeley, CA94720, USA
3 Building Technology andUrban Systems Division, Lawrence Berkeley National Laboratory,
University ofCalifornia, 1 Cyclotron Road, MS 90-R3147, Berkeley, CA94720, USA
4 Information Technology Division, Lawrence Berkeley National Laboratory, University
ofCalifornia, 1 Cyclotron Road, MS R400-0424, Berkeley, CA94720, USA
5 Energy Analysis andEnvironmental Impacts Division, Lawrence Berkeley National Laboratory,
University ofCalifornia, 1 Cyclotron Road, MS 90-R2002, Berkeley, CA94720, USA
6 Reed College MS 765, 3203 SE Woodstock Blvd, Portland, OR97202-8199, USA
... Digital games in particular have advantages such as interactive and multimedia capabilities, popularity, ease of access through conventional devices, and fexibility of use beyond player co-location. However, digital gaming requires radical changes to become sustainable, given production models with embedded social inequality and environmental costs [80], consumption requiring signifcant energy use [103], and disposal creating hazardous ewaste [122]. The benefts of digital games are thus inseparable from the issues that they entail, from which afuent consumers are insulated. ...
... Despite their advantages, digital games cannot be completely appraised without their material qualities [9], including a production model that entails inequalities in both software and hardware production, with outsourced manufacturing done under exhausting and hazardous work conditions and environmental costs externalized to countries with permissive laws [80]. While the use and reuse of analog games can be virtually emission-free, the technological development associated with digital gaming involves increasing energy demand from games, devices, networks and data centers [103], with the Jevons paradox questioning the advantages of solutions based purely on technical efciency [79]. The obsolescence of older devices results in e-waste, which threatens human health and is seldom recycled [122]. ...
Conference Paper
Games are considered promising for engaging people with climate change. In virtual worlds, players can adopt empowering roles to mitigate greenhouse gas emissions and/or adapt to climate impacts. However, the lack of a comprehensive exploration of existing climate-related identities and actions prevents understanding their potential. Here, we analyze 80 video games and classify avatar identities, or expected player roles, into six types. Climate selves encourage direct life changes; climate citizens are easy to identify with and imitate; climate heroes are inspirational figures upholding environmental values; empowered individuals deliberate to avoid a tragedy of the commons; authorities should consider stakeholders and the environment; and faction leaders engage in bi- or multilateral relations. Adaptation is often for decision-making profiles, while empowered individuals, authorities, and faction leaders usually face conflicting objectives. We discuss our results in relation to avatar research and provide suggestions for researchers, designers, and educators.
... Many Asian PE firms are pursuing high growth consumption sectors where little consideration is given to the business model in terms of (1) its ecological impact ('Footprint'), or (2) its societal impact, such as community and worker issues, human rights and access to basic needs ('Utility'). Business sectors that typically produce negative Footprint or low Utility outcomes include oil and gas development, traditional construction materials such as cement (Mahasenan, Smith, and Humphreys 2003), single-use plastic packaging (Sustainability Times, May 21, 2019), electronic products with short lifecycles, 4 convenience-oriented home delivery services (BBC, March 29, 2019), entertainment products including gaming (Mills et al. 2019) and gambling sites, and various predatory lending entities aimed at low income borrowers. The potential for explosive growth over a five year horizonan important timeframe for PE firmsmay commercially outweigh the difficulty of assessing Footprint and Utility consequences that only become visible over longer periods. ...
Article
Full-text available
At this stage of Asia's development there is a need, and an opportunity, to establish a validation methodology that better gauges ESG implementation and sustainability aspirations in Asian private equity. Private equity, like major public market and debt investors such as Blackrock, has adopted language that suggests a proactive approach to ESG management. However, process-oriented ESG compliance presently far outstrips evidence of tangible contributions to ESG objectives and outcomes. This article describes a taxonomy of common approaches to ESG investment practices in Asian private equity and discusses their shortcomings. It then presents ‘Deep ESG’ as an alternative approach that operationalizes ESG and sustainability metrics more holistically than existing frameworks. The Deep ESG framework enables a higher level of market-led intentionality that better informs institutional investors, regulators, communities, and employees as they evaluate private equity's ‘balance sheet’ of ESG outcomes. By investing in tools for goal setting, measurement and evaluation and applying them consistently across all target and portfolio companies, private equity managers can pivot away from a defensive approach by working with stakeholders to shape constructive solutions to urgent sustainability goals.
... One of the most interesting studies concerning cloud gaming was conducted by Mills et al. (2019). They measured the energy consumption of 37 different games on cloud datacenters. ...
Article
Full-text available
Recently, cloud gaming has attracted much attention from service providers. The majority of cloud users have hardware limitations on their devices, hence why users are termed “thin clients”. A cloud service provider aims to afford the best-required frames of game scenes with each thin client in order to provide the best Quality of Experience (QoE). The importance of this issue is particularly critical with the emergence of 4K game service levels. Since 4K game scenes impose many demands upon the Central Processing Unit, Graphical Processing Unit, memory and bandwidth, cloud service providers tend to use the strategy of satisfying the highest number of gamers with the least amount of resources. Given that the problem of frame rate allocation is NP-hard, it is not possible to solve it with exact mathematical methods. In this paper, we propose a method inspired by the Bees algorithm. The approach presented in this article could significantly increases cloud providers’ profitability and reduce server-side wastage. Given the emergence of a new game-as-a-service paradigm for cloud-based games, cloud service providers must pay close attention to the issues of profitability and optimal resource allocation. The simulation of the proposed algorithm on the CloudSim simulator platform illustrates its robustness in terms of significant criteria such as runtime, number of admitted users, frame quality level, the QoE, etc.
Article
We tackle the problem of how to support gaming at the edge of the cellular network. The reduced latency and higher bandwidth that the edge enjoys with respect to cloud-based solutions implies that transferring cloud-based games to the edge could be a premium service for end-users. The goal of this work is to design a scheme compatible with MEC and network slicing principles of 5G and beyond, and which maximizes the utility of a service/infrastructure provider with time-varying edge node capacities due to the access to intermittent renewable energy. We formulate a multi-dimensional integer linear programming problem, proving that it is NP-hard in the strong sense. We prove that our problem is sub-modular and propose an efficient heuristic, GREENING, which considers the allocation of gaming sessions and their migration. For the mentioned scenario, we analyze a wide variety of realistic configurations at the edge, studying how the performance depends on (i) whether the games have a static or dynamic workload, (ii) the distribution of renewable energy through nodes and time, or (iii) the topology of the edge network. Through simulations, we show that our heuristic achieves performance close to that achieved by solving the NP-hard optimization problem, except with extremely lower complexity, and performs up to 25% better than state-of-the-art algorithms.
To address the dramatic economic contraction brought on by the global pandemic, governments at all levels have taken on tremendous debt in order to provide economic stability and prevent a more dramatic collapse. It is likely that, as the initial phase of the pandemic passes, familiar neoliberal austerity claims about the necessity to trim education budgets will gain greater force and acceptance. However, I suggest that these neoliberal policies demand sacrifices of the wrong constituency: Given that Big Tech has amassed huge sums of money over the course of the pandemic, how is it morally justifiable that tech companies benefit from the pandemic while educational institutions shoulder the financial fallout of pandemic government spending? In this paper, I first outline how Big Tech profits from the education sector during the pandemic even as it undermines the democratic function of education in doing so. I then situate these more specific critiques within a broader consideration of the role technology plays in undermining a democratic society. In conclusion, I argue that a pandemic profiteering tax for Big Tech represents the best short-term solution to get ahead of the “austerity curve” and ensure that the COVID-19 crisis serves as an opportunity to deepen our commitments to promoting the democratic function education. Without such commitments, the pandemic will become the turning point at which Big Tech effectively coopts public education for its own ends, to the detriment of democracy. My underlying claim is that technology is in conflict with both democracy and education. This runs against the widespread notion that technology will help promote learning, and that technology helps inform and connect people and therefore helps promote democracy. In what follows I dispel such notions.
Article
We present ENGAGE, the first battery-free, personal mobile gaming device powered by energy harvested from gamer actions and sunlight. Our design implements a power-failure-resilient Nintendo Game Boy emulator that can run off-the-shelf classic Game Boy games like Tetris or Super Mario Land. The emulator supports intermittent operation by tracking memory usage, avoiding the need to checkpoint all volatile memory on every interruption, and decouples the game loop from user-interface mechanics, allowing gameplay to be restored after a power failure. We build custom hardware that harvests energy from gamer button presses and sunlight and leverages a mixed-volatility memory architecture for efficient intermittent emulation of game binaries. Beyond a fun toy, our design represents the first battery-free system designed to hold continuous user attention despite the frequent power failures caused by intermittent energy harvesting. We tackle key challenges in intermittent computing for interaction, including seamless displays and dynamic incentive-based gameplay for energy harvesting. This work provides a reference implementation and framework for a future of battery-free mobile gaming in a more sustainable Internet of Things.
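The intermittent-operation pattern described above (checkpoint state selectively, restore after a power failure) can be sketched in a few lines. This is a hypothetical illustration, not ENGAGE's implementation: the `nv_store` dict stands in for non-volatile (e.g., FRAM) memory, and the checkpoint-every-10-frames policy is invented for the example.

```python
# Minimal sketch of intermittent execution: resume from the last
# checkpoint on startup, checkpoint sparingly during the game loop.
import pickle

class IntermittentLoop:
    def __init__(self, nv_store):
        self.nv = nv_store            # stand-in for non-volatile memory
        self.state = self.restore()   # resume from last checkpoint, if any

    def restore(self):
        blob = self.nv.get("ckpt")
        return pickle.loads(blob) if blob else {"frame": 0}

    def checkpoint(self):
        self.nv["ckpt"] = pickle.dumps(self.state)

    def step(self):
        self.state["frame"] += 1      # one iteration of the emulated game loop
        if self.state["frame"] % 10 == 0:
            self.checkpoint()         # checkpoint periodically, not every frame
```

Because only the checkpointed state survives a power failure, the checkpoint interval trades energy overhead against the amount of gameplay lost on each outage.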
Technical Report
A new Consumer Technology Association (CTA) study, Energy Consumption of Consumer Electronics in U.S. Homes in 2017, finds that tech devices in U.S. homes now account for 25 percent less residential energy use than they did in 2010, even as the number of these devices in U.S. homes has increased 21 percent since that time. This landmark energy efficiency achievement is due to the consumer tech industry's investments in lightweight materials and energy-efficient technologies, as well as the convergence of multi-functional devices and continuous innovation.
Article
Data centers are energy-intensive buildings that have grown in size and number to meet the increasing demands of a digital economy. This paper presents a bottom-up model to estimate data center electricity demand in the United States over a 20-year period and examines observed and projected electricity use trends in the context of changing data center operations. Results indicate a rapidly increasing electricity demand at the turn of the century that has significantly subsided to a nearly steady annual electricity use of about 70 billion kWh in recent years. While data center workloads continue to grow exponentially, comparable increases in electricity demand have been avoided through the adoption of key energy efficiency measures and a shift towards large cloud-based service providers. Alternative projections from the model illustrate the wide range in potential electricity that could be consumed to support data centers, with the US data center workload demand estimated for 2020 requiring a total electricity use that varies by about 135 billion kWh, depending on the adoption rate of efficiency measures during this decade. While recent improvements in data center energy efficiency have been a success, the growth of data center electricity use beyond 2020 is uncertain, as modeled trends indicate that the efficiency measures of the past may not be enough for the data center workloads of the future. The results show that successful stabilization of data center electricity use will require new innovations in data center efficiency to further decouple electricity demand from the ever-growing demand for data center services.
Article
Energy efficiency has become one of the top design criteria for current computing systems. Dynamic voltage and frequency scaling (DVFS) has been widely adopted by laptop computers, servers, and mobile devices to conserve energy, while GPU DVFS is still at an early stage. This paper explores the impact of GPU DVFS on application performance and power consumption and, in turn, on energy conservation. We survey state-of-the-art GPU DVFS characterizations and then summarize recent research on GPU power and performance models. We also conduct real GPU DVFS experiments on NVIDIA Fermi and Maxwell GPUs. According to our experimental results, GPU DVFS has significant potential for energy saving. The effect of scaling core voltage/frequency and memory voltage/frequency depends not only on the GPU architecture but also on the characteristics of the GPU application.
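The abstract's finding that DVFS savings depend on the workload can be motivated with the textbook dynamic-power model, P ≈ C·V²·f. The sketch below uses illustrative constants, not measured values from the paper: for a compute-bound kernel whose runtime scales as 1/f, per-task energy reduces to C·V²·cycles, so lowering voltage (which frequency scaling permits) is where the savings come from.

```python
# Back-of-the-envelope DVFS energy model (illustrative constants).

def dynamic_power(c_eff, voltage, freq_hz):
    """Classic dynamic-power approximation: P = C * V^2 * f."""
    return c_eff * voltage**2 * freq_hz

def energy_compute_bound(c_eff, voltage, freq_hz, cycles):
    """Energy per task when runtime scales as cycles / f."""
    runtime = cycles / freq_hz
    return dynamic_power(c_eff, voltage, freq_hz) * runtime  # = C * V^2 * cycles

# Halving both V and f cuts energy per task by ~4x for a compute-bound
# kernel: energy is independent of f but quadratic in V.
e_high = energy_compute_bound(1e-9, 1.2, 1.0e9, 1e9)
e_low  = energy_compute_bound(1e-9, 0.6, 0.5e9, 1e9)
```

Memory-bound kernels break the `runtime = cycles / f` assumption, which is one reason the paper finds the effect of scaling to be application-dependent.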
Article
One billion people around the world engage in some form of digital gaming. Gaming is the most energy-intensive use of personal computers, and the high-performance "racecar" systems built expressly for gaming are the fastest growing type of gaming platform. Large performance-normalized variations in nameplate power ratings for gaming computer components available on today's market indicate significant potential for energy savings: central processing units vary by 4.3-fold, graphics processing units 5.8-fold, power supply units 1.3-fold, motherboards 5.0-fold, and random access memory (RAM) 139.2-fold. Measured performance of displays varies by 11.5-fold. However, underscoring the importance of empirical data, we find that measured peak power requirements are considerably lower than nameplate for most components tested, and by about 50 % for complete systems. Based on actual measurements of five gaming PCs with progressively more efficient component configurations, we estimate the typical gaming computer (including display) to use approximately 1400 kWh/year, which is equivalent to the energy use of ten game consoles, six standard PCs, or three refrigerators. The more intensive user segments could easily consume double this central estimate. While gaming PCs represent only 2.5 % of the global installed PC equipment base, our initial scoping estimate suggests that gaming PCs consumed 75 TWh/year ($10 billion) of electricity globally in 2012, or approximately 20 % of total PC, notebook, and console energy usage.
Savings of more than 75 % can be achieved via premium efficiency components applied at the time of manufacture or via retrofit, while improving reliability and performance (nearly a doubling of performance per unit of energy). This corresponds to a potential savings of approximately 120 TWh/year or $18 billion/year globally by 2020. A consumer decision-making environment largely devoid of energy information and incentives suggests a need for targeted energy efficiency programs and policies in capturing these benefits.
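The abstract's aggregate estimate is consistent with simple unit-times-stock arithmetic. The implied installed-base figure below is backed out from the abstract's own numbers (1400 kWh/year per machine, 75 TWh/year globally) and is illustrative only, not a figure stated in the paper:

```python
# Scoping arithmetic implied by the abstract's 2012 estimates.
unit_kwh_per_year = 1400                   # typical gaming PC incl. display
global_twh = 75                            # global gaming-PC electricity, 2012
global_kwh = global_twh * 1e9              # 1 TWh = 1e9 kWh
implied_units = global_kwh / unit_kwh_per_year   # roughly 54 million PCs
```

This kind of bottom-up multiplication (unit energy x installed base x usage profile) is the structure behind the national estimates in the main article as well.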
Article
In order to understand the electricity use of Internet services, it is important to have accurate estimates for the average electricity intensity of transmitting data through the Internet (measured as kilowatt-hours per gigabyte [kWh/GB]). This study identifies representative estimates for the average electricity intensity of fixed-line Internet transmission networks over time and suggests criteria for making accurate estimates in the future. Differences in system boundary, assumptions used, and year to which the data apply significantly affect such estimates. Surprisingly, the methodology used is not a major source of error, as has been suggested in the past. This article derives criteria to identify accurate estimates over time and provides a new estimate of 0.06 kWh/GB for 2015. By retroactively applying our criteria to existing studies, we were able to determine that the electricity intensity of data transmission (core and fixed-line access networks) has decreased by half approximately every 2 years since 2000 (for developed countries), a rate of change comparable to that found in the efficiency of computing more generally.
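The article's headline numbers (0.06 kWh/GB in 2015, halving roughly every two years) imply a simple exponential model. The function below is an illustrative extrapolation of that trend, not the article's own method:

```python
# Exponential-decay model of Internet transmission electricity intensity,
# anchored at the article's 2015 estimate with a 2-year halving time.

def intensity_kwh_per_gb(year, base=0.06, base_year=2015, halving_years=2):
    """Estimated kWh per GB transmitted in a given year (developed countries)."""
    return base * 2 ** (-(year - base_year) / halving_years)
```

For example, the model gives 0.03 kWh/GB for 2017 and, run backwards, an intensity hundreds of times higher around 2000, matching the article's observed halving trend.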
Bourassa, N., Rainer, L., Mai, J., & Curtin, C. (2018a). Gaming systems energy performance measurements & benchmark testing procedures report (Tasks 3 & 4). Report to the California Energy Commission under project EPC-15-023.
Mills, E., Bourassa, N., Rainer, L., Mai, J., Shehabi, A., & Mills, N. (2018). Green gaming: Energy efficiency without performance compromise. Task 7 report. Report to the California Energy Commission under project EPC-15-023. Lawrence Berkeley National Laboratory.
Bourassa, N., Rainer, L., Mai, J., & Curtin, C. (2018b). Final standardized test bed specification and findings report (Task 5). Report to the California Energy Commission under project EPC-15-023. Lawrence Berkeley National Laboratory.