Figure 6 - uploaded by Anders S.G. Andrae
Source publication
This work presents an estimation of the global electricity usage that can be ascribed to Communication Technology (CT) between 2010 and 2030. The scope is three scenarios for use and production of consumer devices, communication networks and data centers. Three different scenarios, best, expected, and worst, are set up, which include annual numbers...
Similar publications
This presentation outlines an estimation of the global electricity usage that can be ascribed to Communication Technology (CT) in the coming decade. The scope is two scenarios for use and production of consumer devices, communication networks and data centers. Two different scenarios— best and expected—are set up, which include annual numbers of so...
Citations
... We assume a Pareto distribution for workload sizes based on findings in [4], [5], [6], [7], [8], [9], [10], and [21]. ...
This paper examines the workload distribution challenges in centralized cloud systems and demonstrates how Hybrid Edge Cloud (HEC) [1] mitigates these inefficiencies. Workloads in cloud environments often follow a Pareto distribution, where a small percentage of tasks consume most resources, leading to bottlenecks and energy inefficiencies. By analyzing both traditional workloads reflective of typical IoT and smart device usage and agentic workloads, such as those generated by AI agents, robotics, and autonomous systems, this study quantifies the energy and cost savings enabled by HEC. Our findings reveal that HEC achieves energy savings of up to 75% and cost reductions exceeding 80%, even in resource-intensive agentic scenarios. These results highlight the critical role of HEC in enabling scalable, cost-effective, and sustainable computing for the next generation of intelligent systems.
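The Pareto-distributed workload claim above (a small percentage of tasks consuming most resources) can be illustrated with a minimal sketch; the shape parameter, sample size, and helper names are assumptions for illustration, not values from the paper:

```python
import random

def pareto_workloads(n, alpha=1.16, seed=42):
    """Sample n workload sizes from a Pareto distribution.

    alpha ~ 1.16 corresponds to the classic 80/20 rule; the exact
    shape parameter for cloud workloads is an assumption here.
    """
    rng = random.Random(seed)
    return [rng.paretovariate(alpha) for _ in range(n)]

def top_share(sizes, fraction=0.2):
    """Fraction of total load carried by the largest `fraction` of tasks."""
    ranked = sorted(sizes, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

workloads = pareto_workloads(100_000)
print(f"Top 20% of tasks carry {top_share(workloads):.0%} of total load")
```

With alpha near 1.16, the heaviest 20% of tasks typically carry well over half the total load, which is the imbalance that edge offloading schemes like HEC aim to exploit.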
... Battery system OPEX cost is based on a 4 h lithium-ion battery system. Reciprocating gas cost is based on a 10 MW system. Year of reference publication. ...
As a global community, our use of data is increasing exponentially with emerging technologies such as artificial intelligence (AI), leading to a vast increase in the energy demand of data centres worldwide. Delivering this increased energy demand is a global challenge, which the rapid growth of renewable generation deployment could solve. For many data centre giants such as Google, Amazon, and Microsoft, this has been the solution to date via power purchase agreements (PPAs). However, insufficient investment in grid infrastructure globally has both renewable generation developers and data centre developers facing challenges connecting to the grid. This paper considers the costs and carbon emissions associated with stand-alone hybrid renewable and gas generation microgrids that could be deployed either before a grid connection is available, or to allow the data centre to operate entirely off-grid. WindPRO 4.0 software is used to find optimal configurations with wind and solar generation, backed up by battery storage and onsite gas generation. The results show that off-grid generation could provide lower cost and carbon emissions for each of Europe's data centre hotspots in Frankfurt, London, Amsterdam, Paris, and Dublin. This paper compares each generation configuration to grid-equivalent systems and an onsite gas-only generation solution. The results showed that each hybrid renewable generation configuration had a reduced levelized cost of energy (LCOE) and reduced CO2eq emissions compared to those of its grid and gas-only equivalents. Previous literature does not consider the economic implications caused by a mismatch between generation and consumption. Therefore, this paper introduces a new metric to evaluate and compare the economic performance of each microgrid, the levelized cost of energy utilised (LCOEu), which gives the levelized cost of energy for a given microgrid considering only the energy consumed by the data centre. The LCOEu across all sites was found to be between 70 and 102 GBP/MWh, with emissions between 0.021 and 0.074 tCO2eq/MWh.
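The LCOEu metric introduced above divides lifetime cost by the energy actually consumed rather than the energy generated. A minimal sketch with illustrative numbers (the function names and the figures below are assumptions, not taken from the paper):

```python
def lcoe(total_lifetime_cost_gbp, energy_generated_mwh):
    """Standard LCOE: lifetime cost over all energy generated."""
    return total_lifetime_cost_gbp / energy_generated_mwh

def lcoe_utilised(total_lifetime_cost_gbp, energy_consumed_mwh):
    """LCOEu: same lifetime cost, but divided only by the energy the
    data centre actually consumed (curtailed surplus is excluded)."""
    return total_lifetime_cost_gbp / energy_consumed_mwh

# Illustrative numbers: a microgrid generating 1.2 TWh over its life
# at a cost of 84 GBP per MWh generated, of which the data centre
# consumes 1.0 TWh.
cost = 84.0 * 1_200_000          # GBP
print(f"LCOE  = {lcoe(cost, 1_200_000):.0f} GBP/MWh")
print(f"LCOEu = {lcoe_utilised(cost, 1_000_000):.1f} GBP/MWh")
```

Because the denominator shrinks to consumed energy only, LCOEu is always at least as large as LCOE, which is why it better reflects the cost of generation/consumption mismatch.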
... Data centers use more than 1% of global energy consumption (Masanet et al., 2020). This is expected to rise to as much as 8% by 2030 (Andrae et al., 2015). Despite improvements in energy efficiency, aggregate energy usage has increased over the last 15 years (Bashir et al., 2023). ...
Data centers are poised for unprecedented growth due to a revolution in Artificial Intelligence (AI), the rise in cryptocurrency mining, and increasing cloud demand for data storage. A sizable portion of the data centers' growth will occur in the US, requiring a tremendous amount of power. Our hypothesis is that the expansion of data centers will contribute to an increase in US CO2 emissions. To estimate CO2 emissions, we applied three forecasted power demands for data centers together with 56 NREL (National Renewable Energy Laboratory) power mixes and policy scenario cases using 11 AI models. Among these, the linear regression model yielded the most accurate predictions with the highest R-squared. We found that overall CO2 emissions in the US could increase by 0.4-1.9% due to the expansion of data centers by 2030. This increase represents ~3-14% of CO2 emissions from the US power sector by 2030. Using the state-level power mix forecasts for 2030 among increasing CO2 emission scenarios, we predict that Virginia's power mix will maintain emissions in line with the US average, while the Texas, Illinois, and Washington power mixes are expected to reduce emissions due to greater renewables in 2030. However, Illinois and Washington may face challenges due to their limited power resource availability. In contrast, the New York and California power mixes may increase CO2 emissions due to higher natural gas shares in 2030. The highest variability in data center CO2 emissions stems from AI-driven demand and improvements in data center efficiency, followed by the power mix. To reduce CO2 emissions from data centers, we offer pathways such as reducing power consumption, improving the power mix with renewable sources, and using hydrogen in power plants. We propose focusing on New Mexico and Colorado for data centers to minimize CO2 emissions. Finally, we highlight a set of federal policies, supplemented by states, to facilitate CO2 emission reductions across energy, emissions, waste, R&D, and grid infrastructure.
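The abstract above reports that a linear regression model gave the most accurate forecasts. A minimal pure-Python sketch of ordinary least squares with an R-squared check; the data series below is invented for illustration, not taken from the study:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def r_squared(xs, ys, a, b):
    """Coefficient of determination for the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Hypothetical series: data-centre power demand (GW) vs. emissions (MtCO2).
power = [10, 15, 20, 25, 30]
emissions = [35, 52, 71, 88, 106]
a, b = fit_line(power, emissions)
print(f"slope = {b:.2f} MtCO2/GW, R^2 = {r_squared(power, emissions, a, b):.3f}")
```

Model selection by highest R-squared, as the study describes, amounts to preferring the model whose residual sum of squares is smallest relative to total variance.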
... They consume 10 to 100 times more energy than conventional buildings (Shehabi et al. 2018), but their average lifecycle is estimated at 10 years (Washington State Department of Commerce 2018), five times shorter than the lifespan of conventional buildings (Eberhardt et al. 2019; Haugbølle and Raffnsøe 2019). The global energy consumption of data centers is estimated at 240 to 340 TWh (IEA 2023a, b), but the industry is expected to consume up to 20% of the energy supply by 2025 (Vardhman and Defensor 2024; Andrae and Edler 2015), considering that data center workloads increased by 340% from 2015 to 2022 (IEA 2023a, b). Moreover, their global greenhouse gas emissions are projected to reach about 14% by 2040 (Jerléus et al. 2024; Belkhir and Elmeligi 2018). ...
Purpose
The purpose of this research is to evaluate the environmental impacts associated with a theoretical data center in Italy. Through the analysis of four different scenarios that vary the energy mix, the IT attributes (i.e., server, storage, and networking refresh interval) and the physical infrastructure (i.e., power system, UPS mode, annual generator use, power distribution, and cooling system), this research intends to identify the best strategies for reducing the environmental impacts of a data center.
Methods
This study estimates the environmental impacts of a data center in Italy by focusing on its global warming potential. The functional unit is a theoretical data center of 8650 m², with an IT capacity of 5000 kW, an IT load of 85%, a rack power density of 5 kW, and a lifespan of 10 years. The environmental impact assessment highlights (i) the yearly total CO2eq, distinguishing between direct and indirect emissions; (ii) the yearly total CO2eq by lifecycle phase (i.e., manufacturing, distribution, installation, use, and end-of-life); and (iii) the yearly total CO2eq by system (i.e., IT, core and shell, electrical, mechanical, and other). Moreover, this research provides insights into the environmental impacts associated with capital goods, fuel and energy, upstream transportation, and waste in operations.
Results and discussion
In the baseline scenario, the global warming potential of the data center is 677,724 tCO2eq, with a cumulative energy consumption of 558,894 MWh and an emission factor of 925 kg/MWh per year. In Scenario 1, which considers a change in the server, storage, and networking refresh interval, the environmental impact is 602,197 tCO2eq, while the cumulative energy consumption equals that of the baseline scenario. In Scenario 2, which implements a more efficient physical infrastructure and a generator use of 8760 h/year, the global warming potential is reduced by 28%, and the emission factor is 719 kg/MWh per year. Last, Scenario 3 considers the average European energy mix; other conditions being equal, it represents the most impactful scenario from an environmental standpoint (686,575 tCO2eq).
Conclusion
The implementation of a more efficient physical infrastructure should consider a natural gas power system instead of a diesel one and an annual generator use of 8760 h/year. Environmental benefits can be achieved through intervention on the cooling system, by substituting chilled water with DX systems. In the case of a Tier IV data center, the energy mix does not affect the environmental performance, and Scope 2 emissions are equal to zero.
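The scenario comparison in this study hinges on an emission factor expressed in kg CO2eq per MWh. A minimal sketch of that ratio and of a scenario comparison built on it; the numbers are placeholders, not the study's results:

```python
def emission_factor(total_emissions_tco2eq, total_energy_mwh):
    """Lifecycle emission factor in kg CO2eq per MWh
    (tonnes converted to kilograms)."""
    return total_emissions_tco2eq * 1000 / total_energy_mwh

# Illustrative scenario comparison (placeholder values): same cumulative
# energy use, different lifecycle emissions.
baseline = emission_factor(700_000, 560_000)   # 1250 kg/MWh
improved = emission_factor(500_000, 560_000)   # ~893 kg/MWh
print(f"reduction: {1 - improved / baseline:.0%}")
```

When cumulative energy is held fixed, as between the baseline and Scenario 1 above, the relative change in the emission factor equals the relative change in total emissions.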
... Moreover, the increasing use of cloud computing has also been identified as a significant contributor to digital carbon emissions. Research by Andrae and Edler [138] suggests that the shift towards cloud-based services is leading to higher energy consumption, as more businesses and individuals store their data in the cloud, relying on massive data centers that require constant power and cooling. Masanet et al. [107] highlighted that while the energy efficiency of data centers has improved, their overall carbon emissions have not decreased proportionally due to the rapid growth in digital data traffic. ...
Digital transformation, powered by technologies like AI, IoT, and big data, is reshaping industries and societies at an unprecedented pace. While these innovations promise smarter energy management, precision agriculture, and efficient resource utilization, they also introduce serious environmental challenges. This paper examines the dual impact of digital technologies, highlighting key threats such as rising energy consumption, growing e-waste, and the increased extraction of raw materials. By synthesizing the existing literature, this study highlights mitigation strategies that include adopting energy-efficient practices, integrating renewable energy, and implementing circular economy principles. It emphasizes the need for a balanced approach: making the most of technological advances while protecting the environment. By identifying gaps in the current research, this paper also suggests future areas to explore to ensure that digital progress does not come at the expense of our planet. This review advocates for an integrated strategy to achieve sustainable digital growth aligned with global climate goals.
... The increasing global demand for computational efficiency [1] has pushed the development of conventional CMOS-based logic and memory beyond the imagination of the materials scientists and physicists who pioneered the first generation of transistor-based devices decades ago. With the limits of the present family of conventional-materials-based devices (perpetually) on the horizon, there has been an ongoing effort to design more exotic logic and memory systems ...
GdN is a ferromagnetic semiconductor that has seen increasing interest over recent decades, particularly in the areas of spin- and superconducting-based electronics. Here we report a detailed computational and optical spectroscopy study of the electronic structure of stoichiometric and nitrogen-vacancy-doped GdN. Based on our calculations, we provide the effective mass tensor for undoped GdN and some indicative values for electron-doped GdN. Such a property is valuable because it can affect device design and can be measured experimentally to validate the existing computational results.
... Scenario modelling for data center GHG emissions: To determine total power consumption and GHG emissions, we combine a bottom-up approach, similar to Stobbe et al., 39 with top-down principles and extrapolation for various assumptions, as seen in the works of Andrae. 23,41 Our aim is to maintain flexibility in the model to accommodate variations in assumptions regarding demand growth and efficiency gains. Overall, while our study draws on methodological insights from all the referenced studies, it does not adhere strictly to any single approach. ...
... The blockchain's cryptocurrency energy consumption is taken into account, since the blockchain is used for metaverse transactions. Because cryptocurrency energy figures are only available before 2022, an exponential function is fitted to previous values to extrapolate future energy consumption [351], [352]. ...
... By multiplying global IT energy figures by the ratio of the metaverse market size to the global IT market size, we can assess the global metaverse energy consumption for every layer from 2022 to 2030 [351]. Figure 19 plots the growth trend and results. ...
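The two snippets above describe (i) fitting an exponential to pre-2022 cryptocurrency energy figures to extrapolate forward and (ii) scaling global IT energy by the metaverse's market-size share. A minimal sketch of both steps; every input number below is invented for illustration:

```python
import math

def fit_exponential(years, energy_twh):
    """Fit E(t) = E0 * exp(k * (t - t0)) by least squares on log(E)."""
    t0 = years[0]
    xs = [y - t0 for y in years]
    ys = [math.log(e) for e in energy_twh]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    e0 = math.exp(my - k * mx)
    return e0, k

# Hypothetical pre-2022 cryptocurrency energy figures (TWh/year).
years = [2018, 2019, 2020, 2021]
energy = [45, 60, 78, 105]
e0, k = fit_exponential(years, energy)
projected_2025 = e0 * math.exp(k * (2025 - years[0]))

# Market-share scaling: metaverse share of the global IT market times
# global IT energy (both placeholders): 5% of 2000 TWh.
metaverse_energy = 0.05 * 2000
print(f"crypto 2025 ~ {projected_2025:.0f} TWh, metaverse ~ {metaverse_energy:.0f} TWh")
```

Fitting in log-space turns the exponential into a straight line, so the same least-squares machinery applies; the projection then just evaluates the fitted curve at a future year.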
To limit the impacts of climate change, the carbon dioxide (CO2) emissions (CE) associated with the energy sector must be decreased. Reducing CE will benefit the atmosphere by avoiding the adverse impacts of global warming. To attain an eco-friendly environment, the primary energy resource needs to move from traditional fossil fuels to clean renewable energy (RE). Thus, enhancing the utilization of RE actively decreases air pollution and adds secure, sustainable energy allocation to ensure future energy needs are met. Integrating RE sources not only cuts CE but also decreases fuel consumption, leading to significant economic savings. This paper argues that the global energy transition will have a largely positive impact on the growth and future stability of economies, making them more cost-effective and sustainable worldwide. Significant reductions can be accomplished through applicable policies and technologies. In the context of current discussions about climate change and CE reduction, this paper critically analyses selected policies, technologies, and commonly discussed solutions. Technologies such as digital twins (DT), transfer learning (TL), edge computing (EC), and distributed computing (DC), along with their roles in CE reduction, are discussed thoroughly. The techniques surveyed here represent optimal solutions for CE reduction.
... al (2007), energy suppliers recorded a growth in electricity consumption from 160 TWh/year in 2007 to 259 TWh/year in 2012. Likewise, Gelenbe and Caseau (2015) point out that consumption related to ICT represents 4.7% worldwide; although more efficient ICT technologies are developed every year with energy savings in mind, this is not considered sufficient to reverse the growth trend of the energy footprint, which could make ICTs responsible for 23% of global greenhouse gas emissions in 2030 [4]. ...
Energy consumption is currently increasing significantly due to the use of electronic devices, while at the same time the use of renewable energy has grown to reduce the impact of greenhouse gases. Hence the importance of implementing energy management in telecommunications networks to reduce costs and negative environmental impacts. In this article we therefore propose the implementation of an intelligent EMS architecture for telecommunications networks using ZigBee together with communication and data transfer elements. In addition, it includes a server that collects and computes energy generation and consumption data to establish usage and purchase patterns and produces useful information for statistical analysis. Finally, this scheme is expected to optimize the energy of the telecommunications network and result in energy savings.
... Therefore, the energy consumption of mobile devices is critical for achieving high-performance EI. Beyond the end devices, edge servers running 24/7 consume a large amount of energy and contribute a significant proportion of global carbon emissions; thus, the energy consumption of the edge server should also be reduced to achieve a green data center [42,44]. Although energy consumption has been considered in model split inference in previous works [31], given the challenges that arise from integrating NOMA into model split inference in EI, the transmission power adjustment, the model split point selection, and the energy consumption interact and are tightly coupled, which means that the algorithms proposed in previous studies cannot work effectively in NOMA-based EI. ...
... In this section, the works most closely related to ECC are introduced. These works can be divided into two major categories: NOMA-based MEC [23]–[32] and model split and resource allocation in EI [9]–[15], [36]–[47]. Additionally, the disadvantages of these works are also discussed in this section. ...
Even though artificial intelligence (AI) has been widely used and has significantly changed our lives, deploying large AI models directly on resource-limited edge devices is not appropriate. Thus, model split inference is proposed to improve the performance of edge intelligence (EI), in which the AI model is divided into sub-models and the resource-intensive sub-model is offloaded to the edge server wirelessly to reduce resource requirements and inference latency. Unfortunately, with the sharp increase in edge devices, the shortage of spectrum resources in edge networks has become serious in recent years, which limits the performance improvement of EI. Following NOMA-based edge computing (EC), integrating non-orthogonal multiple access (NOMA) technology with split inference in EI is attractive. However, the NOMA-based communication aspect and the influence of intermediate data transmission were not properly considered in the model split inference of EI in previous works, and the sophistication in resource allocation caused by the NOMA scheme complicates the problem further. Thus, an Effective Communication and Computing resource allocation algorithm, ECC for short, is proposed in this paper for accelerating split inference in NOMA-based EI. Specifically, ECC takes the energy consumption and the inference latency into account to find the optimal model split strategy and resource allocation strategy (subchannel, transmission power, computing resource). Since minimum inference delay and minimum energy consumption cannot be achieved simultaneously, a gradient descent (GD) based algorithm is adopted to find the optimal tradeoff between them. Moreover, the loop-iteration GD approach (Li-GD) is developed to reduce the complexity of the GD algorithm caused by parameter discretization. The key idea of Li-GD is that the initial value of the i-th layer's GD procedure is selected from the optimal results of the former (i-1) layers' GD procedures whose intermediate data size is closest to that of the i-th layer. Additionally, the properties of the proposed algorithms are investigated, including convergence, complexity, and approximation error. The experimental results demonstrate that the performance of ECC is much better than that of previous studies.
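The Li-GD warm-start idea (initialising layer i's gradient descent from the earlier solved layer whose intermediate data size is closest) can be sketched on a toy 1-D objective; the objective functions and data sizes below are illustrative stand-ins for the paper's latency/energy tradeoff, not its actual formulation:

```python
def gd(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent on a 1-D objective given its gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def li_gd(layers, default_x0=0.0):
    """Loop-iteration GD (sketch): warm-start each layer's descent from
    the earlier solved layer whose intermediate data size is closest.

    `layers` is a list of (data_size, grad_fn) pairs; the real Li-GD
    optimises split point and radio/compute allocation jointly."""
    solved = []                      # (data_size, optimum) of earlier layers
    results = []
    for size, grad in layers:
        if solved:
            # Warm start: closest previously solved layer by data size.
            x0 = min(solved, key=lambda s: abs(s[0] - size))[1]
        else:
            x0 = default_x0
        x_opt = gd(grad, x0)
        solved.append((size, x_opt))
        results.append(x_opt)
    return results

# Toy objectives: layer i minimises (x - c_i)^2, whose gradient is 2(x - c_i).
layers = [(1.0, lambda x: 2 * (x - 3.0)),
          (1.1, lambda x: 2 * (x - 3.2)),
          (5.0, lambda x: 2 * (x - 8.0))]
print([round(x, 2) for x in li_gd(layers)])   # -> [3.0, 3.2, 8.0]
```

Layers with similar intermediate data sizes tend to have nearby optima, so the warm start shortens each inner GD run; that is the complexity reduction Li-GD targets.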