Figure 1 - uploaded by Soheil Kolouri
Depiction of catastrophic forgetting in binary classification tasks when there is a distribution shift from an initial task to a secondary task. When exposed to the distribution of the new task, the uniformly plastic parametric model, f(·, θ), conforms to the new distribution with no constraints on maintaining its performance on the previous task.


Source publication
Preprint
Full-text available
Catastrophic forgetting/interference is a critical problem for lifelong learning machines: it impedes agents from maintaining their previously learned knowledge while learning new tasks. Neural networks, in particular, suffer heavily from the catastrophic forgetting phenomenon. Recently there have been several efforts towards overcoming catast...

Contexts in source publication

Context 1
... leads to a fundamental challenge in lifelong learning known as 'catastrophic forgetting/interference': a learning agent forgets its previously acquired information when learning a new task. A cartoon depiction of catastrophic forgetting is shown in Figure 1. An ideal system should balance plasticity and stability in order to acquire new information while preserving the old (e.g., the decision boundary in the rightmost panel in Figure 1). ...
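The dynamic described in this context can be reproduced with a toy experiment: a uniformly plastic linear classifier trained by plain SGD fits the first task, then conforms entirely to the shifted second task, with nothing constraining it to stay correct on the first. This is a minimal, hypothetical sketch; the tasks, model, and hyperparameters below are illustrative and not taken from the paper:

```python
import math

def train(model, data, epochs=1000, lr=0.1):
    """Plain SGD on the logistic loss for a 1-D linear classifier."""
    w, b = model
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted P(y = 1)
            g = p - y                                  # dLoss/dLogit
            w -= lr * g * x
            b -= lr * g
    return (w, b)

def accuracy(model, data):
    w, b = model
    return sum(((w * x + b) > 0) == (y == 1) for x, y in data) / len(data)

# Task 1: decision boundary near x = 0 (toy stand-in for the initial task).
task1 = [(-1.0, 0), (-0.5, 0), (0.5, 1), (1.0, 1)]
# Task 2: distribution has shifted; its boundary sits near x = 4.
task2 = [(3.0, 0), (3.5, 0), (4.5, 1), (5.0, 1)]

model = train((0.0, 0.0), task1)
acc1_before = accuracy(model, task1)     # fits task 1 perfectly

model = train(model, task2)              # no constraint to preserve task 1
acc2_after = accuracy(model, task2)
acc1_after = accuracy(model, task1)      # task 1 performance collapses

print(f"task 1 before/after task 2: {acc1_before:.2f} -> {acc1_after:.2f}")
print(f"task 2 after: {acc2_after:.2f}")
```

Because the second task's boundary (near x = 4) conflicts with the first's (near x = 0), the uniformly plastic model ends up classifying task 1 at chance level, which is exactly the failure mode the figure depicts.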
Context 2
... cartoon depiction of catastrophic forgetting is shown in Figure 1. An ideal system should balance plasticity and stability in order to acquire new information while preserving the old (e.g., the decision boundary in the rightmost panel in Figure 1). ...
Context 3
... simply recording training samples (e.g., episodic memory), to utilizing generative models (e.g., generative adversarial networks, GANs) to learn/memorize the distribution of the data. The idea behind these methods is to make the training samples as identically distributed as possible by adding random samples from the old distribution to the newly observed training data, yielding approximately identically distributed data that approaches the ideal case shown in Figure 1. ...
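The rehearsal idea in this context is, at its core, a data-mixing step: keep an episodic memory of old-task samples and blend a random draw from it into each new-task batch, so that the stream the model trains on is closer to identically distributed across tasks. A minimal sketch of that step (function names, batch sizes, and the toy 1-D points are illustrative, not from the paper):

```python
import random

def mixed_batches(new_data, memory, batch_size=8, replay_fraction=0.5, seed=0):
    """Yield training batches for the new task, padded with samples
    replayed from the episodic memory of earlier tasks."""
    rng = random.Random(seed)
    n_replay = int(batch_size * replay_fraction)
    n_new = batch_size - n_replay
    data = list(new_data)
    rng.shuffle(data)
    for i in range(0, len(data), n_new):
        batch = data[i:i + n_new] + rng.sample(memory, n_replay)
        rng.shuffle(batch)
        yield batch

# Episodic memory recorded while training on the old task (toy 1-D points).
memory = [(-1.0, 0), (-0.5, 0), (0.5, 1), (1.0, 1)]
# Newly observed data after the distribution shift (all points lie at x >= 3).
new_task = [(3.0, 0), (3.5, 0), (4.5, 1), (5.0, 1)] * 4

batches = list(mixed_batches(new_task, memory))
# Every batch mixes old-task samples (x < 2) with new-task samples.
print(len(batches), [sum(x < 2 for x, _ in b) for b in batches])  # → 4 [4, 4, 4, 4]
```

With a generative-replay variant, `memory` would be replaced by samples drawn from a generative model of the old distribution, but the mixing step itself is the same.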

Similar publications

Preprint
There is currently a debate within the neuroscience community over the likelihood of the brain performing backpropagation (BP). To better mimic the brain, training a network one layer at a time with only a "single forward pass" has been proposed as an alternative to bypass BP; we refer to these networks as "layer-wise" networks. We continu...
Article
As a crucial parameter in estimating precipitable water vapor from tropospheric delay, the weighted mean temperature (Tm) plays an important role in Global Navigation Satellite System (GNSS)-based water vapor monitoring techniques. However, the rigorous calculation of Tm requires vertical profiles of temperature and water vapor pressure that are di...
Article
Vibration signals of rolling element bearings (REBs) contain substantial bearing motion state information. However, the property of nonlinear and nonstationary vibration signals decreases the diagnostic accuracy of REBs. To improve the accuracy of fault diagnosis for REBs, an ensemble approach based on ensemble empirical mode decomposition (EEMD),...
Article
A recurrent neural network (RNN) combines variable-length input data with a hidden state that depends on previous time steps to generate output data. RNNs have been widely used in time-series data analysis, and various RNN algorithms have been proposed, such as the standard RNN, long short-term memory (LSTM), and gated recurrent units (GRUs). In pa...
Article
The present study illustrates the efficiency of Artificial Neural Networks in modeling the relationship between the collapse potential of gypseous sandy soil and the soil parameters. Sandy soils were taken from four different regions in Iraq to make 180 samples with different properties. The laboratory program involved the estimation of the collap...