Archivio Istituzionale della Ricerca
https://iris.unive.it
The IRIS digital repository system acquires, stores, indexes, preserves, and makes accessible digital research products.
Feed generated: Sat, 22 Feb 2020 04:50:01 GMT (10771 records)

Title: Globally convergent hybridization of particle swarm optimization using line search-based derivative-free techniques
http://hdl.handle.net/10278/44125
Abstract: The hybrid use of exact and heuristic derivative-free methods for global unconstrained optimization problems is presented. Many real-world problems are modeled by computationally expensive functions, such as problems in simulation-based design of complex engineering systems. Objective-function values are often provided by systems of partial differential equations, solved by computationally expensive black-box tools. The objective function is likely noisy, and its derivatives are not provided. On the one hand, the use of exact optimization methods might be computationally too expensive, especially if asymptotic convergence properties are sought. On the other hand, heuristic methods do not guarantee the stationarity of their final solutions. Nevertheless, heuristic methods are usually able to provide an approximate solution at a reasonable computational cost, and have been widely applied to real-world simulation-based design optimization problems. Herein, an overall hybrid algorithm combining the appealing properties of both exact and heuristic methods is discussed, with focus on Particle Swarm Optimization (PSO) and line search-based derivative-free algorithms. The theoretical properties of the hybrid algorithm are detailed, in terms of stationarity of its limit points. Numerical results are presented for a test function and for two real-world optimization problems in ship hydrodynamics.
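The hybrid scheme of the abstract can be illustrated with a minimal Python sketch (not the paper's actual algorithm): a standard PSO phase followed by a derivative-free coordinate line search that refines the swarm's best point under a sufficient-decrease condition. All parameter values and helper names here are illustrative assumptions.

```python
import numpy as np

def hybrid_pso_linesearch(f, bounds, n_particles=20, pso_iters=50,
                          poll_iters=200, seed=0):
    """Sketch of a PSO global phase + derivative-free line-search refinement."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    # PSO phase with standard inertia/cognitive/social coefficients
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(pso_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    # Derivative-free line-search phase: poll coordinate directions,
    # halving the step when no sufficient decrease is found.
    step, fg = 1.0, f(g)
    for _ in range(poll_iters):
        improved = False
        for i in range(dim):
            for s in (+1.0, -1.0):
                trial = np.clip(g + s * step * np.eye(dim)[i], lo, hi)
                ft = f(trial)
                if ft < fg - 1e-8 * step**2:  # sufficient decrease
                    g, fg, improved = trial, ft, True
        if not improved:
            step *= 0.5
            if step < 1e-8:
                break
    return g, fg
```

The line-search phase certifies (approximate) stationarity of the point PSO returns, which is exactly the role it plays in the hybridization described above.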
Date: Thu, 01 Jan 2015 00:00:00 GMT

Title: A Linesearch-based Derivative-free Approach for Nonsmooth Constrained Optimization
http://hdl.handle.net/10278/42404
Abstract: In this paper, we propose new linesearch-based methods for nonsmooth constrained optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence toward stationary points, using the Clarke-Jahn directional derivative. In the second part, we consider inequality constrained optimization problems where both the objective function and the constraints can possibly be nonsmooth. In this case, we first split the constraints into two subsets: difficult general nonlinear constraints and simple bound constraints on the variables. Then, we use an exact penalty function to tackle the difficult constraints, and we prove that the original problem can be reformulated as the bound-constrained minimization of the proposed exact penalty function. Finally, we use the framework developed for the bound-constrained case to solve the penalized problem. Moreover, we prove that, under standard assumptions on the search directions, every accumulation point of the generated sequence of iterates is a stationary point of the original constrained problem. In the last part of the paper, we report extended numerical results on both bound-constrained and nonlinearly constrained problems, showing that our approach is promising when compared to some state-of-the-art codes from the literature.
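A minimal sketch of the penalty idea, assuming an l1-type exact penalty and a simple coordinate-wise derivative-free line search (the paper's framework relies on Clarke-Jahn stationarity analysis and a more refined search; the function names and constants below are hypothetical):

```python
import numpy as np

def df_penalty_solve(f, cons, x0, lo, hi, eps=10.0, iters=200):
    """Fold general constraints g_i(x) <= 0 into an l1 exact penalty,
    then apply a derivative-free coordinate line search to the
    resulting bound-constrained problem."""
    pen = lambda x: f(x) + eps * sum(max(0.0, gi(x)) for gi in cons)
    x, fx, step = x0.copy(), pen(x0), 1.0
    d = np.eye(x.size)
    for _ in range(iters):
        improved = False
        for i in range(x.size):
            for s in (1.0, -1.0):
                t = np.clip(x + s * step * d[i], lo, hi)  # respect bounds
                ft = pen(t)
                if ft < fx - 1e-8 * step**2:  # sufficient decrease
                    x, fx, improved = t, ft, True
        if not improved:
            step *= 0.5
            if step < 1e-10:
                break
    return x
```

For a sufficiently large penalty parameter the minimizers of the penalized bound-constrained problem coincide with those of the original constrained problem, which is the reformulation result the abstract refers to.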
Date: Wed, 01 Jan 2014 00:00:00 GMT

Title: A Framework of Conjugate Direction Methods for Symmetric Linear Systems in Optimization
http://hdl.handle.net/10278/39515
Abstract: In this paper, we introduce a parameter-dependent class of Krylov-based methods, namely Conjugate Directions (Formula Presented.), for the solution of symmetric linear systems. We give evidence that our proposal generates sequences of conjugate directions, extending some properties of the standard conjugate gradient (CG) method in order to preserve conjugacy. For specific values of the parameters in our framework, we obtain schemes equivalent to both the CG and the scaled CG. We also prove the finite convergence of the algorithms in (Formula Presented.), and we provide some error analysis. Finally, preconditioning is introduced for (Formula Presented.), and we show that standard error bounds for the preconditioned CG also hold for the preconditioned (Formula Presented.).
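For the parameter choices that reduce the framework to the standard CG, the generated directions are mutually A-conjugate. A textbook CG sketch (not the paper's parameterized scheme) illustrates the recurrence that the class extends:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Standard CG for a symmetric positive definite matrix A.
    Each new direction p is A-conjugate to all previous ones."""
    n = b.size
    x = np.zeros(n)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact minimization along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # new direction, conjugate to the old ones
        rs = rs_new
    return x
```

Finite convergence (in at most n iterations in exact arithmetic) follows precisely from the conjugacy of the directions, which is the property the abstract's framework is built to preserve.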
Date: Thu, 01 Jan 2015 00:00:00 GMT

Title: A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization: complete results
http://hdl.handle.net/10278/23960
Abstract: We propose a new truncated Newton method for large scale unconstrained optimization, where a Conjugate Gradient (CG)-based technique is adopted to solve Newton's equation. In the current iteration, the Krylov method computes a pair of search directions: the first approximates the Newton step of the quadratic convex model, while the second is a suitable negative curvature direction. A test based on the quadratic model of the objective function is used to select the more promising of the two search directions. Both the latter selection rule and the CG stopping criterion for approximately solving Newton's equation strongly rely on conjugacy conditions. An appropriate linesearch technique is adopted for each search direction: a nonmonotone stabilization is used with the approximate Newton step, while an Armijo-type linesearch is used for the negative curvature direction. We prove both global and superlinear convergence to stationary points satisfying second order necessary conditions. We report extensive numerical experiments to test our proposal.
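The direction pair and the model-based selection can be illustrated with a simplified sketch (assumptions: a plain CG inner loop and a bare quadratic-model comparison standing in for the paper's full selection rule and stopping criterion):

```python
import numpy as np

def truncated_newton_step(H, g, tol=1e-8, max_iter=None):
    """Run CG on H s = -g; if a direction of non-positive curvature is
    met, keep it as a candidate alongside the Newton-like step, and let
    the quadratic model q(d) = g'd + 0.5 d'H d pick the more promising.
    Returns (direction, used_negative_curvature)."""
    n = g.size
    s = np.zeros(n)
    r = -g.copy()
    p = r.copy()
    neg_dir = None
    for _ in range(max_iter or n):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 1e-12 * (p @ p):            # negative/zero curvature detected
            neg_dir = p if g @ p <= 0 else -p  # orient as a descent direction
            break
        alpha = (r @ r) / curv
        s += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    q = lambda d: g @ d + 0.5 * d @ (H @ d)    # quadratic model
    if neg_dir is not None and q(neg_dir) < q(s):
        return neg_dir, True
    return s, False
```

On a convex model the scheme returns the (approximate) Newton step; on an indefinite one it may prefer the negative curvature direction, along which the quadratic model is unbounded below.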
Date: Tue, 01 Jan 2008 00:00:00 GMT

Title: Dense orthogonal initialization for deterministic PSO: ORTHOinit+
http://hdl.handle.net/10278/3684793
Date: Fri, 01 Jan 2016 00:00:00 GMT

Title: Notes on a 3-term Conjugacy Recurrence for the Iterative Solution of Symmetric Linear Systems
http://hdl.handle.net/10278/19028
Abstract: We consider a 3-term recurrence, namely CG_2step, for the iterative solution of symmetric linear systems. The new algorithm generates conjugate directions and extends some standard theoretical properties of the Conjugate Gradient (CG) method [10]. We prove the finite convergence of CG_2step, and we provide some error analysis. Then, we introduce preconditioning for CG_2step, and we prove that standard error bounds for the CG also hold for our proposal.
Date: Tue, 01 Jan 2008 00:00:00 GMT

Title: Preconditioning Newton-Krylov Methods in Non-Convex Large Scale Optimization
http://hdl.handle.net/10278/38255
Abstract: We consider an iterative preconditioning technique for non-convex large scale optimization. First, we refer to the solution of large scale indefinite linear systems by means of a Krylov subspace method, and describe the iterative construction of a preconditioner that involves neither matrix products nor matrix storage. The set of directions generated by the Krylov subspace method is used, as a by-product, to provide an approximate inverse preconditioner. Then, we test our preconditioner within Truncated Newton schemes for large scale unconstrained optimization, where we generalize the truncation rule by Nash and Sofer (Oper. Res. Lett. 9:219–221, 1990) to the indefinite case. We use a Krylov subspace method both to approximately solve the Newton equation and to construct the preconditioner to be used at the current outer iteration. An extensive numerical experience shows that the proposed preconditioning strategy, compared with the unpreconditioned strategy and PREQN (Morales and Nocedal in SIAM J. Optim. 10:1079–1096, 2000), may lead to a reduction of the overall inner iterations. Finally, we show that our proposal has some similarities with the Limited Memory Preconditioners (Gratton et al. in SIAM J. Optim. 21:912–935, 2011).
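The by-product construction can be illustrated under the simplifying assumptions of an SPD matrix and a plain CG inner loop (the paper handles the indefinite case and avoids storing the preconditioner as a dense matrix; this dense sketch only shows why the Krylov directions yield an approximate inverse):

```python
import numpy as np

def cg_directions_preconditioner(A, b, k):
    """Run k CG iterations on A x = b, collect the conjugate directions
    p_i, and assemble the approximate inverse
        M = sum_i p_i p_i' / (p_i' A p_i).
    With k = n and SPD A this recovers A^{-1} exactly, since
    M A p_j = p_j for every direction p_j of the conjugate basis."""
    n = b.size
    x = np.zeros(n); r = b.copy(); p = r.copy()
    M = np.zeros((n, n))
    for _ in range(k):
        Ap = A @ p
        pAp = p @ Ap
        M += np.outer(p, p) / pAp      # rank-one update from each direction
        alpha = (r @ r) / pAp
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < 1e-14:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return M
```

With k < n iterations, M is a low-rank approximation of the inverse built entirely from quantities the Krylov solver computes anyway, which is the "no extra matrix products or storage" point of the abstract.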
Date: Tue, 01 Jan 2013 00:00:00 GMT

Title: A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
http://hdl.handle.net/10278/31199
Abstract: We propose a new truncated Newton method for large scale unconstrained optimization, where a Conjugate Gradient (CG)-based technique is adopted to solve Newton's equation. In the current iteration, the Krylov method computes a pair of search directions: the first approximates the Newton step of the quadratic convex model, while the second is a suitable negative curvature direction. A test based on the quadratic model of the objective function is used to select the more promising of the two search directions. Both the latter selection rule and the CG stopping criterion for approximately solving Newton's equation strongly rely on conjugacy conditions. An appropriate linesearch technique is adopted for each search direction: a nonmonotone stabilization is used with the approximate Newton step, while an Armijo-type linesearch is used for the negative curvature direction. The proposed algorithm is both globally and superlinearly convergent to stationary points satisfying second order necessary conditions. We report extensive numerical experiments to test our proposal.
Date: Thu, 01 Jan 2009 00:00:00 GMT

Title: Initial particles position for PSO, in Bound Constrained Optimization
http://hdl.handle.net/10278/38708
Abstract: We consider the solution of bound constrained optimization problems where the evaluation of the objective function is costly, its derivatives are unavailable, and the use of exact derivative-free algorithms may imply too large a computational burden. There are plenty of real applications, e.g. several design optimization problems [1, 2], belonging to the latter class, where the objective function must be treated as a 'black box' and automatic differentiation turns out to be unsuitable. Since the objective function is often obtained as the result of a simulation, it might also be affected by noise, so that the use of finite differences may be definitely harmful. In this paper we consider the use of the evolutionary Particle Swarm Optimization (PSO) algorithm, where the choice of the parameters is inspired by [4], in order to avoid diverging trajectories of the particles and to help the exploration of the feasible set. Moreover, we extend the ideas in [4] and propose a specific set of initial particle positions for the bound constrained problem.
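As an illustration of structured initial positions in a box, here is a generic Latin-hypercube-style initializer (not the specific proposal of the paper): it guarantees that every coordinate range of the feasible box is covered by the swarm.

```python
import numpy as np

def pso_init_positions(lo, hi, n_particles):
    """Spread the initial particles over the box [lo, hi] with one
    point per stratum in each coordinate (stratified sampling), so no
    region of any coordinate range is left unexplored at start-up."""
    rng = np.random.default_rng(0)
    dim = lo.size
    # one stratum index per particle in each coordinate, randomly permuted,
    # jittered uniformly inside its stratum, then rescaled to [0, 1)
    strata = np.tile(np.arange(n_particles), (dim, 1))
    u = (rng.permuted(strata, axis=1).T + rng.random((n_particles, dim))) / n_particles
    return lo + u * (hi - lo)
```

Compared with plain uniform sampling, this kind of structured initialization reduces the chance that the swarm starts clustered in one corner of the feasible set, which is the motivation behind choosing initial positions carefully.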
Date: Tue, 01 Jan 2013 00:00:00 GMT

Title: PSO-based tuning of MURAME parameters for creditworthiness evaluation of Italian SMEs
http://hdl.handle.net/10278/3685583
Date: Sun, 01 Jan 2017 00:00:00 GMT