Seyedsaman Emami
Universidad Autónoma de Madrid | UAM · Department of Computer Engineering

PhD student

About

5 Publications · 84 Reads · 2 Citations (since 2017)

Publications (5)
Preprint
Full-text available
Deep learning has revolutionized the computer vision and image classification domains. In this context, architectures based on Convolutional Neural Networks (CNNs) are the most widely applied models. In this article, we introduce two procedures for training Convolutional Neural Networks (CNNs) and Deep Neural Networks based on Gradient Boosting (GB), n...
Article
Full-text available
This paper presents a novel technique based on gradient boosting to train the final layers of a neural network (NN). Gradient boosting is an additive expansion algorithm in which a series of models are trained sequentially to approximate a given function. A neural network can also be seen as an additive expansion where the scalar product of the res...
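
For context only, the additive expansion referred to here can be written in standard gradient-boosting notation (a generic formulation, not copied from the paper):

    F_T(x) = \sum_{t=0}^{T} \rho_t \, h_t(x),
    \qquad
    r_i^{(t)} = -\left[\frac{\partial L\bigl(y_i, F(x_i)\bigr)}{\partial F(x_i)}\right]_{F = F_{t-1}},
    \qquad
    F_t(x) = F_{t-1}(x) + \rho_t \, h_t(x),

where each base model h_t is fitted to the pseudo-residuals r^{(t)} and \rho_t is a step size. A network whose output is a linear combination of its last hidden activations, f(x) = \sum_j w_j z_j(x), has the same additive form, which is presumably the analogy the truncated sentence is drawing.
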
Article
Full-text available
Deep learning has revolutionized the computer vision and image classification domains. In this context, architectures based on Convolutional Neural Networks (CNNs) and Deep Neural Networks (DNNs) are the most widely applied models. In this article, we introduce two procedures for training CNNs and DNNs based on Gradient Boosting (GB), namely GB-CNN and GB...
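
As an illustration only, the sketch below boosts small scikit-learn MLPs on squared-error pseudo-residuals. It is not the GB-CNN/GB-DNN procedure from the paper (whose details are not shown here); the base-learner size, number of stages, and learning rate are arbitrary assumptions.

    # Illustrative sketch: gradient boosting with shallow neural networks as base
    # learners (squared-error loss, regression). NOT the paper's GB-CNN/GB-DNN
    # procedure; hyperparameters and structure are assumed for the example.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.neural_network import MLPRegressor

    X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)

    n_stages = 5          # number of boosting iterations (assumed value)
    learning_rate = 0.5   # shrinkage applied to each base model (assumed value)

    prediction = np.full(len(y), y.mean())   # F_0: constant initial model
    models = []

    for t in range(n_stages):
        residuals = y - prediction                       # pseudo-residuals for squared loss
        base = MLPRegressor(hidden_layer_sizes=(16,),    # small NN as the weak learner
                            max_iter=500, random_state=t)
        base.fit(X, residuals)
        prediction += learning_rate * base.predict(X)    # F_t = F_{t-1} + lr * h_t
        models.append(base)

    def predict(X_new):
        """Sum the constant model and all boosted NN stages."""
        out = np.full(X_new.shape[0], y.mean())
        for base in models:
            out += learning_rate * base.predict(X_new)
        return out

    print("train MSE:", np.mean((predict(X) - y) ** 2))
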
Preprint
Full-text available
This paper presents a computationally efficient variant of gradient boosting for multi-class classification and multi-output regression tasks. Standard gradient boosting uses a 1-vs-all strategy for classification tasks with more than two classes, which means that one tree per class has to be trained at every iteration. In this work, we...
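
The difference between the 1-vs-all scheme and a condensed multi-output alternative can be sketched roughly as follows, using softmax pseudo-residuals and scikit-learn trees; this is an illustration of the two training layouts, not the paper's exact algorithm.

    # Rough illustration of the two options for K-class boosting described above:
    # (a) standard: one regression tree per class and per iteration (1-vs-all on
    #     the per-class gradients), versus
    # (b) condensed: a single multi-output regression tree per iteration that
    #     predicts all K gradient components at once.
    # NOT the paper's exact algorithm; loss and hyperparameters are assumed.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeRegressor

    X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                               n_classes=3, random_state=0)
    K = 3
    Y = np.eye(K)[y]                       # one-hot targets
    lr, n_iter = 0.1, 20

    def softmax(F):
        e = np.exp(F - F.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    # (a) standard scheme: K separate trees per boosting iteration
    F_a = np.zeros((len(y), K))            # raw scores F_k(x), initialised at 0
    for _ in range(n_iter):
        R = Y - softmax(F_a)               # pseudo-residuals, one column per class
        for k in range(K):                 # K trees trained at every iteration
            tree = DecisionTreeRegressor(max_depth=3).fit(X, R[:, k])
            F_a[:, k] += lr * tree.predict(X)

    # (b) condensed scheme: one multi-output tree per boosting iteration
    F_b = np.zeros((len(y), K))
    for _ in range(n_iter):
        R = Y - softmax(F_b)
        tree = DecisionTreeRegressor(max_depth=3).fit(X, R)   # fits all K outputs jointly
        F_b += lr * tree.predict(X)

    print("accuracy (K trees/iter):", (F_a.argmax(1) == y).mean())
    print("accuracy (1 tree/iter): ", (F_b.argmax(1) == y).mean())
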
