Figure - available via license: Creative Commons Attribution-ShareAlike 4.0 International
Source publication
The depth of networks plays a crucial role in the effectiveness of deep learning. However, the memory requirement for backpropagation scales linearly with the number of layers, which leads to memory bottlenecks during training. Moreover, deep networks are often unable to handle time series sampled at irregular intervals. These issues can be resol...
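The linear memory cost mentioned in the abstract comes from reverse-mode autodiff having to cache one activation per layer during the forward pass. The following minimal sketch (illustrative only, not code from the paper) makes this concrete for a chain of scalar layers `y = w * x`: the tape grows with the number of layers, and the backward pass consumes it in reverse.

```python
def forward_with_tape(x, weights):
    """Forward pass through an L-layer chain, caching each layer's input.

    The tape holds one cached activation per layer, so stored state
    grows linearly with depth L -- the memory bottleneck the abstract
    refers to.
    """
    tape = []
    for w in weights:
        tape.append(x)          # activation cached for the backward pass
        x = w * x
    return x, tape


def backward(grad_out, weights, tape):
    """Walk the tape in reverse to accumulate dL/dw_i for each layer."""
    grads = [0.0] * len(weights)
    g = grad_out
    for i in reversed(range(len(weights))):
        grads[i] = g * tape[i]  # dL/dw_i = upstream grad * cached input
        g = g * weights[i]      # propagate gradient to earlier layers
    return grads


weights = [2.0, 3.0, 0.5]
y, tape = forward_with_tape(1.0, weights)
assert len(tape) == len(weights)  # one cached value per layer: O(L) memory
```

Reversible or ODE-based architectures sidestep this cost by recomputing (or integrating backwards for) the activations during the backward pass instead of caching them.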