Figure - uploaded by Andrey Poddubnyy
Bar chart with cost shares for different cases.

Source publication
Conference Paper
Full-text available
In this paper, the potential for reducing electricity costs with the help of energy storage and controllable loads on the Russian retail electricity market is studied. The retail electricity market in Russia features different price categories, allowing consumers to reduce expenses with proper planning. Approximately 2/3 of the cost in flexible p...

Similar publications

Article
Full-text available
With the continuous development of China’s economy, the process of urbanisation and industrialisation has been accelerating, and, coupled with the emergence of new energy, China’s urban energy consumption has undergone a huge transformation. The urban energy Internet (UEI) is an important part of the country’s ‘carbon peak and carbon neutrality’ goals, and...

Citations

... There have been many studies in recent years dedicated to the large-scale deployment of charging stations. Some focused mainly on the optimal allocation of EV charging stations in the grid [7], [8]; others on the best battery load shape to match sustainable energy supply or off-peak hours [9], [10]. In this paper, the EV charging problem is discussed from both perspectives, considering the personal needs of an EV owner as well as the requirements of the grid. ...
Article
Full-text available
The extensive penetration of distributed energy resources (DERs), particularly electric vehicles (EVs), creates a huge challenge for distribution grids due to their limited capacity. Smart charging might alleviate this issue, but most of the optimization algorithms developed so far either assume knowledge of the future or rely on complicated forecasting models. In this paper we propose to use reinforcement learning (RL) with experience replay to optimally operate an EV charger. We also introduce explorative rewards for better adaptation to environment changes. The reinforcement learning agent controls the charger’s power consumption to optimize expenses and prevent lines and transformers from being overloaded. The simulations were carried out on the IEEE 13-bus test feeder with load profile data from a residential area. To simulate the real availability of data, the agent is trained with only the transformer current and the local charger’s state, such as state of charge (SOC) and timestamp. Several algorithms, namely Q-learning, SARSA, Dyna-Q and Dyna-Q+, are tested to select the best one for a stochastic environment with a low frequency of data streaming.
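To make the tabular Q-learning approach named in the abstract concrete, the sketch below trains an agent to schedule EV charging against a time-of-use price. Everything here is a toy illustration, not the paper's setup: the hourly prices, the two on/off charging actions, the 4-unit battery target, and the deadline penalty are all invented assumptions; the real agent in the paper observes transformer current and uses experience replay, which this minimal sketch omits.

```python
import random

# Toy EV-charging environment (hypothetical assumptions, not the paper's setup):
# cheap electricity in the first four hours, expensive afterwards; the battery
# must reach TARGET_SOC units by the end of the horizon.
PRICES = [1, 1, 1, 1, 5, 5, 5, 5]   # hypothetical hourly prices
TARGET_SOC = 4                       # battery is "full" at 4 units
ACTIONS = [0, 1]                     # charging power: off / 1 unit per hour

def step(hour, soc, action):
    """Apply one charging decision; return (next_hour, next_soc, reward, done)."""
    soc = min(TARGET_SOC, soc + action)
    cost = PRICES[hour] * action
    hour += 1
    done = hour == len(PRICES)
    # Heavy penalty if the battery is not full at the deadline.
    penalty = 20 * (TARGET_SOC - soc) if done else 0
    return hour, soc, -(cost + penalty), done

def train(episodes=5000, alpha=0.1, gamma=1.0, eps=0.1, seed=0):
    """Tabular Q-learning with an epsilon-greedy behaviour policy."""
    rng = random.Random(seed)
    Q = {}  # (hour, soc) -> [value of action 0, value of action 1]
    for _ in range(episodes):
        hour, soc, done = 0, 0, False
        while not done:
            q = Q.setdefault((hour, soc), [0.0, 0.0])
            a = rng.randrange(2) if rng.random() < eps else max(range(2), key=q.__getitem__)
            nh, ns, r, done = step(hour, soc, a)
            nq = [0.0, 0.0] if done else Q.setdefault((nh, ns), [0.0, 0.0])
            q[a] += alpha * (r + gamma * max(nq) - q[a])  # Q-learning update
            hour, soc = nh, ns
    return Q

def rollout(Q):
    """Follow the greedy policy; return (total electricity cost, final SOC)."""
    hour, soc, cost, done = 0, 0, 0, False
    while not done:
        a = max(range(2), key=Q.get((hour, soc), [0.0, 0.0]).__getitem__)
        cost += PRICES[hour] * a
        hour, soc, _, done = step(hour, soc, a)
    return cost, soc
```

Because the deadline penalty (20 per missing unit) outweighs even the peak price, the learned greedy policy fills the battery by the deadline and, given enough episodes, shifts the charging toward the cheap hours; SARSA or Dyna-Q variants would differ only in how the update target is formed.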