Perceptron network with parallel layers (PLP - Parallel Layer Perceptron)
AUTHOR(S)
Douglas Alexandre Gomes Vieira
SOURCE
IBICT - Instituto Brasileiro de Informação em Ciência e Tecnologia
PUBLICATION DATE
18/12/2006
ABSTRACT
This work presents a novel approach to structural risk minimization (SRM) applied to a general machine learning problem. The formulation is based on the fundamental concept that supervised learning is a bi-objective optimization problem in which two conflicting objectives must be minimized: the training error, or empirical risk (Remp), and the machine complexity. A general Q-norm-like method to compute the machine complexity is presented; it can be used to model and compare most of the learning machines found in the literature. The main advantage of the proposed complexity measure is that it offers a simple way to separate the linear and nonlinear contributions to complexity, leading to a better understanding of the learning process. A novel learning machine, the Parallel Layer Perceptron (PLP) network, is proposed here, together with a training algorithm based on these definitions and structures of learning, the Minimum Gradient Method (MGM). The combination of the PLP with the MGM (PLP-MGM) is carried out using a reliable least-squares procedure and is the main contribution of this work.
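The abstract describes an architecture in which a linear layer and a nonlinear layer act on the input in parallel, with the output combination fitted by least squares. The following is only a minimal sketch of that idea, not the author's actual PLP-MGM algorithm: the nonlinear-layer weights are fixed at random (an assumption made here for brevity), and ordinary least squares stands in for the reliable least-squares procedure mentioned in the text.

```python
import numpy as np

def plp_fit_predict(X, y, n_hidden=8, seed=0):
    """Hypothetical parallel-layer perceptron sketch: a linear branch
    and a nonlinear (tanh) branch process the input in parallel, and
    the weights that combine the two branches are obtained by ordinary
    least squares. Returns the fitted values on X."""
    rng = np.random.default_rng(seed)
    # Fixed random weights and biases for the nonlinear branch (assumption;
    # the thesis trains the full network with the Minimum Gradient Method).
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                             # nonlinear branch
    Phi = np.hstack([X, H, np.ones((X.shape[0], 1))])  # linear branch + nonlinear branch + bias
    beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)     # least-squares output weights
    return Phi @ beta
```

Because the output weights are the solution of a linear least-squares problem, the fit for the combination stage is obtained in closed form rather than by gradient descent.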
SUBJECT(S)
ACCESS TO THE ARTICLE
http://hdl.handle.net/1843/BUOS-8CTH6W
RELATED DOCUMENTS
- Recursive prediction of the diameters of eucalyptus clones using a multilayer perceptron network for volume calculation
- Impinging flow parallel plates heat sinks
- Parallel queues with heterogeneous servers and probabilistic jockeying
- Phoneme recognition in the Portuguese language using multilayer perceptron neural networks
- Genetic algorithms and parallel processing applied to the definition and training of multilayer perceptron neural networks