Finding the number of hidden neurons for an MLP neural network using a coarse-to-fine search technique
ABSTRACT Skin detection is an important preliminary process for subsequent feature extraction in image processing. Several techniques are used for skin detection; in this work, a multi-layer perceptron (MLP) neural network is used. One of the important aspects of the MLP is determining the network topology. The numbers of neurons in the input and output layers are fixed by the number of available inputs and required outputs, respectively, so the remaining question is how to determine the number of neurons in the hidden layer. We therefore employed a coarse-to-fine search method to find this number. First, the number of hidden neurons HN is set in a coarse, binary-search fashion to HN = 1, 2, 4, 8, 16, 32, 64 and 128. For each of these HN values, 30 networks are trained and their Mean Squared Error (MSE) is calculated. A sequential fine search is then performed in the neighbourhood of the HN that gave the lowest MSE. The selected number of neurons in the hidden layer is the lowest HN that gives the lowest MSE. The YCbCr colour space is used in this work because it separates the luminance and chrominance components explicitly. Several chrominance components are investigated.
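The coarse-to-fine procedure described above can be sketched in a few lines. This is a minimal sketch, not the authors' implementation: `train_mlp` is a hypothetical stand-in for training an MLP with `hn` hidden neurons and returning its MSE (here it is a toy error curve), and the fine-search neighbourhood is assumed to span from the previous power of two to the next one.

```python
# Sketch of the coarse-to-fine search for the number of hidden neurons (HN).

def train_mlp(hn):
    # Hypothetical placeholder: a toy MSE curve with its minimum near HN = 20.
    # In the actual method, 30 MLPs would be trained per HN value and their
    # MSE averaged.
    return (hn - 20) ** 2 / 100.0 + 0.05

def coarse_to_fine_search(train, coarse=(1, 2, 4, 8, 16, 32, 64, 128)):
    # Coarse pass: evaluate HN at powers of two (binary-search-style grid).
    mse = {hn: train(hn) for hn in coarse}
    best = min(mse, key=mse.get)

    # Fine pass: sequential search in the neighbourhood of the coarse winner
    # (assumed here to be the interval between adjacent powers of two).
    lo = max(1, best // 2)
    hi = best * 2
    for hn in range(lo, hi + 1):
        mse[hn] = train(hn)

    # Select the smallest HN that achieves the lowest MSE.
    lowest = min(mse.values())
    return min(hn for hn, e in mse.items() if e == lowest)

print(coarse_to_fine_search(train_mlp))  # → 20 for the toy curve above
```

The coarse pass keeps the number of expensive training runs logarithmic in the search range; the fine pass only pays the linear cost in a small neighbourhood.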
Conference Proceeding: Structure adaptation in feed-forward neural networks
ABSTRACT: In this paper two new structures (algorithms) are proposed for adaptively adjusting the network structure. Both neuron pruning and neuron generation are considered for a feedforward neural network. Simulation results are presented to confirm the improvements obtained by utilizing the proposed algorithms.
Neural Networks, 1994. IEEE World Congress on Computational Intelligence, 1994 IEEE International Conference on; 01/1994
Conference Proceeding: A Real-Time Learning Algorithm for Two-Hidden-Layer Feedforward Networks
ABSTRACT: In some practical applications, the requirement on time complexity is stricter than that on space complexity. However, current neural networks seem far from the standard of real-time applications. In a previous paper, Huang proved by a novel constructive method that two-hidden-layer feedforward networks (TLFNs) with 2√((m+2)N) (≪ N) hidden neurons can learn any N distinct samples (x_i, t_i) with arbitrarily small error, where m is the required number of output neurons. On the theoretical basis of those results, this paper introduces an improved constructive method for TLFNs with real-time learning capacity. The results in this paper prove that both the training and generalization errors of the new TLFN can reach arbitrarily small values if sufficient distinct training samples are provided. Additionally, experimental results compare its learning time with traditional gradient-descent-based learning methods such as the back-propagation (BP) algorithm. The learning algorithm for two-hidden-layer feedforward neural networks is able to learn any set of observations in just one iteration (instead of a large number of learning epochs) with acceptable learning and testing accuracy.
Control and Automation, 2003. ICCA '03. Proceedings. 4th International Conference on; 07/2003