Gradient Descent Method
Showing 1-12 of 14 articles, theses, and dissertations.
-
1. A SURVEY ON MULTIOBJECTIVE DESCENT METHODS
We present a rigorous and comprehensive survey on extensions to the multicriteria setting of three well-known scalar optimization algorithms. Multiobjective versions of the steepest descent, the projected gradient and the Newton methods are analyzed in detail. At each iteration, the search directions of these methods are computed by solving real-valued optimization …
Pesqui. Oper. Published: 2014-12
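The scalar steepest descent iteration that these multiobjective methods generalize can be sketched as follows (a minimal Python illustration with Armijo backtracking; the function and parameter names are ours, and the survey's multiobjective subproblems are not reproduced):

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=1000):
    """Scalar steepest descent with an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                      # steepest descent direction
        t = alpha0
        # Shrink the step until the Armijo sufficient-decrease condition holds
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Usage: minimize the convex quadratic f(x) = ||x - (1, 2)||^2
f = lambda x: np.sum((x - np.array([1.0, 2.0]))**2)
grad = lambda x: 2 * (x - np.array([1.0, 2.0]))
x_star = steepest_descent(f, grad, np.zeros(2))
```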
-
2. Algoritmos Evolutivos aplicados ao Classificador baseado em Segmentos de Reta / Evolutive Algorithms applied to the Straight Line Segment Classifier
In recent years, machine learning techniques have become one of the most frequently applied tools, owing to the large number of pattern recognition applications such as voice recognition, text classification, face recognition, and medical image diagnosis, among others. Thus, a great number of techniques dealing with this kind of problem …
IBICT - Instituto Brasileiro de Informação em Ciência e Tecnologia. Published: 03/07/2012
-
3. A Review of Gradient Algorithms for Numerical Computation of Optimal Trajectories
Abstract: In this paper, two classic direct methods for the numerical computation of optimal trajectories are revisited: the steepest descent method and the direct method based upon second-variation theory. The steepest descent method was developed for a Mayer problem of optimal control, with free final state and fixed terminal times. Terminal constraints …
J. Aerosp. Technol. Manag. Published: 2012-06
-
4. The global convergence of a descent PRP conjugate gradient method
Recently, Yu and Guan proposed a modified PRP method (called the DPRP method) which can generate sufficient descent directions for the objective function. They established the global convergence of the DPRP method under the assumption that the stepsize is bounded away from zero. In this paper, without requiring a positive lower bound on the stepsize, …
Computational & Applied Mathematics. Published: 2012
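For illustration, a generic PRP-type nonlinear conjugate gradient loop might look like the sketch below (standard PRP with the common nonnegativity truncation and a descent restart safeguard; this is not the DPRP variant of Yu and Guan, whose exact modification is not shown in the snippet):

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-8, max_iter=2000):
    """Generic PRP+ conjugate gradient sketch with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking along the current (descent) direction
        t, c, shrink = 1.0, 1e-4, 0.5
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= shrink
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ parameter: beta = max(g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2, 0)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d_new = -g_new + beta * d
        if g_new.dot(d_new) >= 0:   # safeguard: restart with steepest descent
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Usage: minimize the strongly convex quadratic 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = prp_cg(f, grad, np.zeros(2))  # the minimizer solves A x = b
```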
-
5. New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
Based on the secant condition often satisfied by quasi-Newton methods, two new versions of the Hestenes-Stiefel (HS) nonlinear conjugate gradient method are proposed, which are descent methods even with inexact line searches. The search directions of the proposed methods have the form d_k = -θ_k g_k + β_k^{HS} d_{k-1}, or d_k = -g_k + β_k^{HS} d_{k-1} + θ_k y_{k-1}. …
Computational & Applied Mathematics. Published: 2009
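The classical HS direction quoted above can be written as a small helper (a sketch only, with y_{k-1} = g_k - g_{k-1}; the paper's θ_k terms, whose formulas are truncated in the snippet, are omitted):

```python
import numpy as np

def hs_direction(g_k, g_prev, d_prev):
    """Classical Hestenes-Stiefel direction d_k = -g_k + beta_HS * d_{k-1},
    where beta_HS = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1}) and y_{k-1} = g_k - g_prev.
    The paper's modified versions add a theta_k term not reproduced here."""
    y = g_k - g_prev
    denom = d_prev.dot(y)
    beta = g_k.dot(y) / denom if abs(denom) > 1e-16 else 0.0
    return -g_k + beta * d_prev

# Usage with small illustrative vectors
g_prev = np.array([1.0, 0.0])
g_k = np.array([0.0, 1.0])
d_prev = np.array([-1.0, 0.0])
d_k = hs_direction(g_k, g_prev, d_prev)
```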
-
6. 3D image registration of the human brain / Registro de imagens 3D do cerebro humano
Image registration is the process of aligning two or more images in a common reference system of spatial coordinates [31]. It is an important problem with several applications in medical imaging, enabling, for instance, the analysis of anatomical changes over time through the registration of images from the same modality, and the study of combined anatomic …
Published: 2009
-
7. Comparação de métodos de otimização para o problema de ajuste de histórico em ambientes paralelos / Comparison of optimization methods for the history-matching problem in parallel environments
The process of history matching aims at determining the model parameters of a petroleum reservoir. Once adjusted, the models can be used to predict the reservoir's behavior. This work presents a comparison of different optimization methods for solving this problem. Derivative-based methods are compared to a genetic algorithm. In particular, …
Published: 2009
-
8. Gait-pattern adaptation algorithms using neural network / Algoritmos de adaptação do padrão de marcha utilizando redes neurais
This work presents the development of gait-pattern adaptation algorithms using artificial neural networks for an active lower-limb orthosis. Stable trajectories are generated during the optimization process, considering a trajectory generator based on the ZMP (Zero Moment Point) criterion and on the model …
Published: 2009
-
9. Accelerating the Levenberg-Marquardt method for the minimization of the square of functions with box constraints / Acelerando o metodo de Levenberg-Marquardt para a minimização da soma de quadrados de funções com restrições de caixa
In this work, we present an active-set algorithm for minimizing the sum of squares of smooth functions, subject to box constraints. The algorithm is strongly inspired by the work of Birgin and Martínez [4]. The differences are concentrated in the chosen search direction and in the use of an acceleration technique to update the step. At each iteration, …
Published: 2008
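A single plain Levenberg-Marquardt step for min ½‖r(x)‖² can be sketched as below (the thesis's box constraints, active-set strategy, and acceleration technique are not reproduced; all names are illustrative):

```python
import numpy as np

def lm_step(residual, jac, x, lam):
    """One Levenberg-Marquardt step for min 0.5*||r(x)||^2:
    solve (J^T J + lam*I) delta = -J^T r and return x + delta.
    With lam = 0 this reduces to a Gauss-Newton step, which is exact
    in one iteration when the residual is linear in x."""
    r, J = residual(x), jac(x)
    A = J.T @ J + lam * np.eye(J.shape[1])
    return x + np.linalg.solve(A, -J.T @ r)

# Usage: a linear least-squares residual r(x) = M x - b
M = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
residual = lambda x: M @ x - b
jac = lambda x: M
x1 = lm_step(residual, jac, np.zeros(2), lam=0.0)
```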
-
10. Numerical computation of optimal low-thrust limited-power trajectories - Transfers between coplanar circular orbits
An algorithm based on gradient techniques, proposed in a companion paper, is applied to the numerical analysis of optimal low-thrust limited-power trajectories for simple transfers (no rendezvous) between coplanar circular orbits in a central Newtonian gravity field. The proposed algorithm combines the main positive characteristics of two well-known methods …
Journal of the Brazilian Society of Mechanical Sciences and Engineering. Published: 2005-06
-
11. AGRUPAMENTO E VISUALIZAÇÃO DE DADOS SÍSMICOS ATRAVÉS DE QUANTIZAÇÃO VETORIAL / CLUSTERING AND VISUALIZATION OF SEISMIC DATA USING VECTOR QUANTIZATION
This thesis proposes a new method of seismic data clustering that can aid in the visualization of seismic maps. Seismic data consist primarily of signal and noise and, due to this dual composition, have asymmetric distributions. Seismic data are traditionally classified by methods that lead the proposed groups' references to their mean values. …
Published: 2004
-
12. On the convergence properties of the projected gradient method for convex optimization
When applied to an unconstrained minimization problem with a convex objective, the steepest descent method has stronger convergence properties than in the nonconvex case: the whole sequence converges to an optimal solution under the sole hypothesis that minimizers exist (i.e., without assuming, e.g., boundedness of the level sets). In this paper we look at …
Computational & Applied Mathematics. Published: 2003
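The projected gradient iteration this paper studies, x_{k+1} = P_C(x_k - t grad f(x_k)), can be sketched as follows (a fixed-step illustration with Euclidean projection onto a box; the names and parameters are ours, not the paper's):

```python
import numpy as np

def projected_gradient(grad, project, x0, step, n_iter=500):
    """Projected gradient: x_{k+1} = P_C(x_k - step * grad(x_k)),
    where `project` is the Euclidean projection onto the convex set C."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = project(x - step * grad(x))
    return x

# Usage: minimize ||x - (2, 2)||^2 over the box C = [0, 1]^2;
# the constrained minimizer is the corner (1, 1)
grad = lambda x: 2 * (x - np.array([2.0, 2.0]))
project = lambda x: np.clip(x, 0.0, 1.0)
x_star = projected_gradient(grad, project, np.zeros(2), step=0.25)
```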