All content in this area was uploaded by Matteo Gagliolo on Dec 01, 2014

Existing Support Vector Machines (SVMs) need pre-wired finite time windows to predict and classify time series. They do not have the internal state necessary to deal with sequences involving arbitrary long-term dependencies. Here we introduce the first recurrent, truly sequential SVM-like devices with internal adaptive states, trained by a novel method called EVOlution of systems with KErnel-based outputs (Evoke), an instance of the recent Evolino class of methods [1, 2]. Evoke evolves recurrent network-like structures to detect and represent temporal dependencies while using quadratic programming/support vector regression to produce precise outputs, in contrast to our recent work [1, 2], which instead uses pseudoinverse regression. Evoke is the first SVM-based mechanism able to learn to classify a context-sensitive language. It also outperforms recent state-of-the-art gradient-based recurrent neural networks (RNNs) on various time series prediction tasks.
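The two-stage idea in the abstract can be sketched in a few lines. This is a toy illustration under stated assumptions, not the authors' implementation: a plain tanh RNN stands in for the evolved recurrent structures, naive random search stands in for the actual evolutionary algorithm, and scikit-learn's SVR plays the kernel-based output stage.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def rnn_states(Wx, Wh, xs):
    # Roll a simple tanh RNN over the input sequence, collecting hidden states.
    h = np.zeros(Wh.shape[0])
    states = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h)
        states.append(h.copy())
    return np.array(states)

# Toy task: predict a sine wave one step ahead.
t = np.linspace(0, 8 * np.pi, 400)
step = t[1] - t[0]
xs = np.sin(t)[:, None]   # inputs, shape (T, 1)
ys = np.sin(t + step)     # targets: the sequence shifted one step

n_hidden = 10
best_err, best = np.inf, None
for _ in range(30):       # crude stand-in for evolution: random weight search
    Wx = rng.normal(scale=0.5, size=(n_hidden, 1))
    Wh = rng.normal(scale=0.5, size=(n_hidden, n_hidden))
    H = rnn_states(Wx, Wh, xs)
    svr = SVR(kernel="rbf").fit(H, ys)   # kernel-based readout from hidden states
    err = np.mean((svr.predict(H) - ys) ** 2)
    if err < best_err:
        best_err, best = err, (Wx, Wh, svr)

print(f"best training MSE: {best_err:.4f}")
```

The point of the split is visible even in this sketch: the recurrent part only has to produce informative internal states, while the precise input-output mapping is delegated to a convex kernel-regression solve.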



... Among all these attempts, none has produced a recurrent SVM that learns tasks involving time lags of arbitrary length between important input events. A pioneering attempt using real-valued SVMs and neuroevolution for sequence prediction was, however, made by Schmidhuber et al. [18]. Unfortunately, research activity on recurrent SVMs is at present very scarce. ...

... LSTM-CSVM is a system based on Evolino and Evoke [18,19]. The underlying idea of these systems is that two cascaded modules are needed: a robust module to process short- and long-term dependencies (LSTM) and an optimization module to produce precise outputs (CSVM, the Moore-Penrose pseudoinverse method, and SVM, respectively). The LSTM module addresses the disadvantage of having relevant pieces of information outside the history window, and it also avoids the "vanishing error" problem exhibited by algorithms like Back-Propagation Through Time (BPTT, e.g., Williams and Zipser 1992) or Real-Time Recurrent Learning (RTRL, e.g., Robinson and Fallside 1987). ...

This chapter introduces a generalization of the real- and complex-valued SVMs using the Clifford algebra. In this framework we handle the design of kernels involving the geometric product for linear and nonlinear classification and regression. The major advantage of our approach is that we redefine the optimization variables as multivectors. This allows us to have a multivector as output and, therefore, to represent multiple classes according to the dimension of the geometric algebra in which we work. By using the CSVM with one Clifford kernel we greatly reduce the computational complexity. This is possible thanks to the Clifford product, which performs the direct product between the spaces of different grade involved in the optimization problem. We compare CSVM with the most widely used approaches to multi-class classification and show that ours is more suitable for practical use on certain types of problems. The chapter includes several experiments demonstrating the application of CSVM to classification and regression problems, as well as to 3D object recognition for visually guided robotics. In addition, we present the design of a recurrent system in which an LSTM network is connected to a CSVM, and we study the performance of this system on time series experiments and robot navigation using reinforcement learning.

... We call the lagged time series of the signal itself "ownlag" and that of the feature "featurelag". The statistical methods in [19] are cross-correlation for detecting featurelags and autocorrelation for detecting ownlags. ...
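The lag-detection step described in that excerpt amounts to scanning correlation coefficients over candidate lags. A minimal numpy sketch on synthetic data (the generating coefficients, lag values, and the terms "ownlag"/"featurelag" are taken from the excerpt; everything else is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic signal that depends on its own value 3 steps back ("ownlag")
# and on an exogenous feature 5 steps back ("featurelag").
T = 500
feature = rng.normal(size=T)
signal = np.zeros(T)
for t in range(5, T):
    signal[t] = 0.6 * signal[t - 3] + 0.8 * feature[t - 5] + 0.1 * rng.normal()

def best_lag(a, b, max_lag=10):
    # Return the lag k in 1..max_lag maximizing |corr(a[t], b[t-k])|.
    corrs = [abs(np.corrcoef(a[k:], b[:-k])[0, 1]) for k in range(1, max_lag + 1)]
    return int(np.argmax(corrs)) + 1

ownlag = best_lag(signal, signal)        # autocorrelation scan
featurelag = best_lag(signal, feature)   # cross-correlation scan
print(ownlag, featurelag)  # → 3 5
```

Both planted lags are recovered because the correlation at the true lag dominates all other candidates.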

System modeling is a vital part of building energy optimization and control. Grey- and white-box modeling requires knowledge about the system and substantial human assistance, which results in costs. In the common case that information about the system is lacking, the feasibility of grey- and white-box models decreases further. The installation of sensors and the availability of monitoring data are growing rapidly within building energy systems. This enables the exploitation of statistical modeling, which is already well established in other sectors such as computer science and finance. Thus, the present work investigates data-driven machine learning models to explore their potential for modeling building energy systems. The focus is to develop an efficient methodology for data-driven modeling. For this purpose, a comprehensive literature review for identifying optimization methods is conducted. Furthermore, the methodology is implemented in Python and an automated modeling tool is designed. It is used to model various energy systems based on monitoring data; seven use cases on three different systems yield good results. The models can be used for forecasting, potential analysis, the implementation of various control strategies, or as a replacement for missing information within the field of grey-box modeling.

This paper introduces the recurrent Clifford support vector machine (RCSVM). First we explain the generalization of the real- and complex-valued support vector machines using the Clifford geometric algebra. In this framework we handle the design of kernels involving the Clifford or geometric product, and we redefine the optimization variables as multivectors. This allows us to have a multivector as output; therefore, we can represent multiple classes according to the dimension of the geometric algebra in which we work. We show that one can apply CSVM to build a recurrent CSVM. We study the performance of the recurrent CSVM with experiments using time series and tasks of visually guided robotics.

This paper presents an improvement of a recurrent learning system called LSTM-CSVM (introduced in [1]) for robot navigation applications. The approach addresses some of the main issues in this research area: navigation in large domains, partial observability, a limited number of learning experiences, and slow learning of optimal policies. The advantages of this new version of the LSTM-CSVM system are that it can find optimal paths through mazes and that it reduces the number of generations needed to evolve the system toward the optimal navigation policy, thereby also reducing training time. This is done by adding a heuristic methodology for finding the optimal path from the start state to the goal state, which can contain information about the whole environment or only partial information about it.

Current neural network learning algorithms are limited in their ability to model non-linear dynamical systems. Most supervised gradient-based recurrent neural networks (RNNs) suffer from a vanishing error signal that prevents learning from inputs far in the past. Those that do not still have problems when there are numerous local minima. We introduce a general framework for sequence learning, EVOlution of recurrent systems with LINear outputs (Evolino). Evolino uses evolution to discover good RNN hidden node weights, while using methods such as linear regression or quadratic programming to compute optimal linear mappings from hidden state to output. Using the Long Short-Term Memory RNN architecture, the method is tested in three very different problem domains: 1) context-sensitive languages, 2) multiple superimposed sine waves, and 3) the Mackey-Glass system. Evolino performs exceptionally well across all tasks, whereas the other methods show notable deficiencies on some.
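For a fixed set of evolved hidden weights, Evolino's linear-output stage reduces to a single least-squares solve. A toy numpy sketch of just that stage (the hidden-state matrix here is simulated rather than produced by a real LSTM, and the shapes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hidden states H (T x n) as produced by some fixed recurrent network,
# and targets y (T,). For fixed recurrent weights, the optimal linear
# readout is one least-squares solve — the "LINear outputs" stage.
T, n = 300, 20
H = np.tanh(rng.normal(size=(T, n)))   # stand-in for collected RNN states
w_true = rng.normal(size=n)
y = H @ w_true + 0.01 * rng.normal(size=T)

w = np.linalg.pinv(H) @ y              # Moore-Penrose pseudoinverse readout
mse = np.mean((H @ w - y) ** 2)
print(f"readout MSE: {mse:.5f}")
```

Because this readout is computed analytically rather than by gradient descent, evolution only has to search over the recurrent weights, which is what makes the decomposition attractive.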

Existing Recurrent Neural Networks (RNNs) are limited in their ability to model dynamical systems with nonlinearities and hidden internal states. Here we use our general framework for sequence learning, EVOlution of recurrent systems with LINear Outputs (Evolino), to discover good RNN hidden node weights through evolution, while using linear regression to compute an optimal linear mapping from hidden state to output. Using the Long Short-Term Memory RNN Architecture, Evolino outperforms previous state-of-the-art methods on several tasks: 1) context-sensitive languages, 2) multiple superimposed sine waves.

Topics covered: the setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; and what is important in learning theory.