On building next generation data centers: energy flow in the information technology stack.
ABSTRACT: The demand for data center solutions with lower total cost of ownership and lower complexity of management is driving the creation of next-generation data centers. The information technology industry is in the midst of a transformation to lower the cost of operation through consolidation and better utilization of critical data center resources. Successful consolidation necessitates increasing utilization of capital-intensive "always-on" data center infrastructure and reducing the recurring cost of power. A need exists, therefore, for an end-to-end methodology that can be used to design and manage dense data centers and to determine the cost of operating a data center. The chip-core-to-cooling-tower model must capture the power levels and thermo-fluids behavior of chips, systems, aggregations of systems in racks, rows of racks, room flow distribution, air-conditioning equipment, hydronics, vapor-compression systems, pumps, and heat exchangers. Earlier work has outlined the foundation for the creation of a "smart" data center through the use of flexible cooling resources and a distributed sensing and control system that can provision the cooling resources based on need. This paper presents a common platform that serves as an evaluation basis for a policy-based control engine for such a "smart" data center with a much broader reach -- from chip core to cooling tower. We propose a data center solution with three components: cooling, power, and compute. These components collectively improve the efficiency and manageability of the data center by supporting greater compaction, flexible building blocks that can be dynamically configured, dynamic optimization, better monitoring and visualization, and policy-based control. A coefficient of performance (COP) of the ensemble is defined that represents an overall measure of the efficiency of energy flow during the operation of a data center.
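As a hedged sketch of what such an ensemble metric can look like (the symbol names below are assumed for illustration, not taken verbatim from the paper), a COP of the ensemble relates the heat load removed from the compute equipment to the total work spent by every stage of the cooling path:

```latex
\[
\mathrm{COP}_{\mathrm{ensemble}}
  = \frac{Q_{\mathrm{datacenter}}}
         {W_{\mathrm{chip}} + W_{\mathrm{system}} + W_{\mathrm{rack}}
          + W_{\mathrm{room}} + W_{\mathrm{hydronics}}
          + W_{\mathrm{chiller}} + W_{\mathrm{tower}}}
\]
```

where \(Q_{\mathrm{datacenter}}\) is the total heat dissipated by the compute equipment and each \(W\) term is the work input of one stage of the chip-core-to-cooling-tower path; a higher value means more heat removed per unit of cooling energy.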
- Available from: Halldor Janetzko
ABSTRACT: Time series prediction methods are used on a daily basis by analysts for making important decisions. Most of these methods use some variant of moving averages to reduce the number of data points before prediction. However, to reach a good prediction in certain applications (e.g., power consumption time series in data centers) it is important to preserve peaks and their patterns. In this paper, we introduce automated peak-preserving smoothing and prediction algorithms, enabling a reliable long-term prediction for seasonal data, and combine them with an advanced visual interface: (1) using high-resolution cell-based time series to explore seasonal patterns, (2) adding new visual interaction techniques (multi-scaling, slider, and brushing & linking) to incorporate human expert knowledge, and (3) providing both new visual accuracy color indicators for validating the predicted results and certainty bands communicating the uncertainty of the prediction. We have integrated these techniques into a well-fitted solution to support the prediction process, and applied and evaluated the approach to predict both power consumption and server utilization in data centers with 70-80% accuracy.
Computer Graphics Forum 01/2011; 30:691-700. (Impact factor: 1.64)
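The peak-preserving idea can be illustrated with a minimal sketch. This is an illustration of the general concept only, not the authors' algorithm: the threshold rule (mean plus k standard deviations) and the parameter names are assumptions made here.

```python
def peak_preserving_smooth(series, window=5, k=2.0):
    """Moving-average smoothing that retains pronounced peaks.

    A point is treated as a peak (and left unsmoothed) when it
    exceeds the series mean by more than k standard deviations;
    all other points are replaced by a centered moving average.
    """
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    out = []
    for i, x in enumerate(series):
        if x > mean + k * std:
            out.append(x)  # preserve the peak verbatim
        else:
            lo = max(0, i - window // 2)
            hi = min(n, i + window // 2 + 1)
            win = series[lo:hi]
            out.append(sum(win) / len(win))
    return out
```

A plain moving average would flatten a single large spike toward the baseline; here the spike survives smoothing unchanged, which is what makes the smoothed series usable for predicting peak power draw.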
ABSTRACT: The detection of frequently occurring patterns, also called motifs, in data streams has been recognized as an important task. To find these motifs, we use an advanced event encoding and pattern discovery algorithm. As a large time series can contain hundreds of motifs, there is a need to support interactive analysis and exploration. In addition, for certain applications, such as data center resource management, service managers want to be able to predict the next day’s power consumption from the previous months’ data. For this purpose, we introduce four novel visual analytics methods: (i) motif layout – using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs; (ii) motif distortion – enlarging or shrinking motifs for visualizing them more clearly; (iii) motif merging – combining a number of identical adjacent motif instances to simplify the display; and (iv) pattern-preserving prediction – using a pattern-preserving smoothing and prediction algorithm to provide a reliable prediction for seasonal data. We have applied these methods to three real-world datasets: data center chilling utilization, oil well production, and system resource utilization. The results enable service managers to interactively examine motifs and gain new insights into the recurring patterns to analyze system operations. Using the above methods, we have also predicted both power consumption and server utilization in data centers with an accuracy of 70–80%.
Information Visualization 01/2012; 11:71-83. (Impact factor: 1.00)
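Of the four methods, motif merging is the most mechanical, and its core can be sketched in a few lines. The tuple layout (motif id, start index, length) is an assumption for illustration; the paper's actual data structures are not specified in this abstract.

```python
def merge_adjacent_motifs(instances):
    """Collapse runs of identical, back-to-back motif instances.

    `instances` is a list of (motif_id, start, length) tuples sorted
    by start.  Adjacent instances of the same motif are merged into
    one (motif_id, start, total_length, count) entry, so a display
    can draw one rectangle labeled with the repetition count.
    """
    merged = []
    for mid, start, length in instances:
        prev = merged[-1] if merged else None
        # merge only when the same motif starts exactly where the
        # previous run ends
        if prev and prev[0] == mid and prev[1] + prev[2] == start:
            merged[-1] = (prev[0], prev[1], prev[2] + length, prev[3] + 1)
        else:
            merged.append((mid, start, length, 1))
    return merged
```

For example, two back-to-back instances of motif "A" followed by one "B" collapse into a single "A" run of double length with a count of 2, which is the simplification the display relies on.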
ABSTRACT: The cost of electricity for datacenters is a substantial operational cost that can and should be managed, not only to save energy but also because of the ecological commitment inherent in power consumption. This work proposes, formalizes, and numerically evaluates LEAS, a low-energy scheduling model for clearing scheduling markets, based on the maximization of welfare subject to utilization-level-dependent energy costs. We promote energy-efficient policies in the management of datacenters to enhance the efficiency of modernized datacenters. We focus specifically on linear power models and on the implications of the fixed costs inherent in the energy consumption of modern datacenters. We rigorously test the model by running multiple simulation scenarios derived from real workload traces and evaluate the results using common statistical methods. We conclude with positive results and implications for the long-term sustainable management of modern datacenters.
06/2011.
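A linear power model with a fixed component, of the kind this abstract refers to, can be sketched as follows. The idle and peak wattages and the energy price are illustrative placeholders, not figures from the paper, and the cost function is an assumed example rather than the LEAS formulation.

```python
def server_power(util, p_idle=150.0, p_peak=250.0):
    """Linear power model (watts): a fixed idle cost plus a
    utilization-proportional dynamic cost.  The fixed term p_idle is
    paid even at zero utilization, which is why consolidation onto
    fewer, busier servers lowers energy cost."""
    return p_idle + (p_peak - p_idle) * util


def energy_cost(util, hours, price_per_kwh=0.10):
    """Energy cost of running one server at a given utilization for
    a number of hours, under the linear model above."""
    return server_power(util) / 1000.0 * hours * price_per_kwh
```

Under this model, a server at 50% utilization draws 200 W rather than 125 W, because the fixed idle term dominates at low load; a welfare-maximizing scheduler would weigh job value against this cost when clearing the market.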