Seyedsaman Emami
Universidad Autónoma de Madrid | UAM · Department of Computer Engineering

PhD student

About

6 Publications
119 Reads
2 Citations

Citations since 2017: 6 research items, 2 citations

Publications (6)
Chapter
Gradient Boosting Machines (GBMs) have shown outstanding performance in a variety of machine learning applications, such as classification and regression. Gradient boosting builds a set of regression models in an iterative process in which, at each iteration, a regressor is trained to reduce a given loss on a given objective. This paper propose...
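As background for the iterative process described above, here is a minimal sketch of a generic gradient-boosting loop for squared-error regression, assuming scikit-learn's DecisionTreeRegressor as the base regressor; the function and parameter names (fit_gradient_boosting, n_stages, learning_rate) are illustrative choices and this is not the method proposed in the chapter.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_stages=100, learning_rate=0.1, max_depth=3):
    # Start from a constant model: the mean minimizes the squared-error loss.
    f0 = float(np.mean(y))
    prediction = np.full(len(y), f0)
    trees = []
    for _ in range(n_stages):
        # For squared error, the negative gradient is simply the residual.
        residuals = y - prediction
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def predict(X, f0, trees, learning_rate=0.1):
    # Additive expansion: constant term plus the scaled contribution of each stage.
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

A fitted ensemble is used as predict(X_new, f0, trees); each stage only corrects what the previous stages left unexplained.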
Preprint
Full-text available
Deep learning has revolutionized the computer vision and image classification domains. In this context, architectures based on Convolutional Neural Networks (CNNs) are the most widely applied models. In this article, we introduce two procedures for training Convolutional Neural Networks (CNNs) and Deep Neural Networks based on Gradient Boosting (GB), n...
Article
Full-text available
This paper presents a novel technique based on gradient boosting to train the final layers of a neural network (NN). Gradient boosting is an additive expansion algorithm in which a series of models are trained sequentially to approximate a given function. A neural network can also be seen as an additive expansion where the scalar product of the res...
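In standard notation (a paraphrase of the additive-expansion view above, not formulas quoted from the article), a boosted model after T stages and a network with a linear output layer can be written as

F_T(x) = F_0(x) + \sum_{t=1}^{T} \rho_t \, h_t(x),
\qquad
\hat{y}(x) = \sum_{j=1}^{M} w_j \, z_j(x) + b,

where each h_t is a model fitted to reduce the loss remaining after F_{t-1}, z_j(x) are the responses of the last hidden layer, and w_j are the output weights; this structural analogy is what motivates fitting the final layers of the network in a gradient-boosting fashion.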
Article
Full-text available
Deep learning has revolutionized the computer vision and image classification domains. In this context, architectures based on Convolutional Neural Networks (CNNs) and Deep Neural Networks (DNNs) are the most widely applied models. In this article, we introduce two procedures for training CNNs and DNNs based on Gradient Boosting (GB), namely GB-CNN and GB...
Preprint
Full-text available
This paper presents a computationally efficient variant of gradient boosting for multi-class classification and multi-output regression tasks. Standard gradient boosting uses a one-vs-all strategy for classification tasks with more than two classes, which means that one tree per class has to be trained at each iteration. In this work, we...
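To make the cost of the standard strategy concrete, the sketch below contrasts one boosting round that fits one tree per class against a round that fits a single multi-output tree, assuming scikit-learn's DecisionTreeRegressor (which accepts a 2-D target); the function names are hypothetical and this is not the specific variant proposed in the preprint.

from sklearn.tree import DecisionTreeRegressor

def round_one_vs_all(X, residuals, max_depth=3):
    # Standard gradient boosting: residuals has shape (n_samples, K);
    # K separate trees are fitted, one per class, at every iteration.
    return [DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals[:, k])
            for k in range(residuals.shape[1])]

def round_multi_output(X, residuals, max_depth=3):
    # A single multi-output tree fitted once per iteration covers all K outputs.
    return DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)

Over T iterations the first scheme performs T x K tree fits, while the second performs only T.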
