dc.contributor.advisor | Bhatnagar, Shalabh | |
dc.contributor.author | Reddy, Danda Sai Koti | |
dc.date.accessioned | 2018-05-22T15:14:41Z | |
dc.date.accessioned | 2018-07-31T04:39:11Z | |
dc.date.available | 2018-05-22T15:14:41Z | |
dc.date.available | 2018-07-31T04:39:11Z | |
dc.date.issued | 2018-05-22 | |
dc.date.submitted | 2017 | |
dc.identifier.uri | https://etd.iisc.ac.in/handle/2005/3582 | |
dc.identifier.abstract | http://etd.iisc.ac.in/static/etd/abstracts/4450/G28169-Abs.pdf | en_US |
dc.description.abstract | Optimization problems involving uncertainties are common in a variety of engineering disciplines such as transportation systems, manufacturing, communication networks, healthcare and finance. The large number of input variables and the lack of a system model prohibit a precise analytical solution, and a viable alternative is to employ simulation-based optimization. The idea here is to simulate the stochastic system under consideration a few times while updating the system parameters until a good enough solution is obtained.
Formally, given only noise-corrupted measurements of an objective function, we wish to find a parameter which minimizes the objective function. Iterative algorithms using statistical methods search the feasible region to improve upon the candidate parameter. Stochastic approximation algorithms are the best suited, most studied and most applied algorithms for finding solutions when the feasible region is a continuously valued set. One can use information on the gradient/Hessian of the objective to aid the search process. However, owing to the lack of knowledge of the noise distribution, one needs to estimate the gradient/Hessian from noisy samples of the cost function obtained from simulation. Simple gradient search schemes take many iterations to converge to a local minimum and are heavily dependent on the choice of step-sizes. Stochastic Newton methods, on the other hand, can counter the ill-conditioning of the objective function as they incorporate second-order information into the stochastic updates. Stochastic Newton methods are often more accurate than simple gradient search schemes.
We propose enhancements to the Hessian estimation scheme used in two recently proposed stochastic Newton methods, based on the ideas of random directions stochastic approximation (2RDSA) [21] and simultaneous perturbation stochastic approximation (2SPSA-3) [6], respectively. The proposed scheme, inspired by [29], reduces the error in the Hessian estimate by (i) incorporating a zero-mean feedback term, and (ii) optimizing the step-sizes used in the Hessian recursion. We prove that both 2RDSA and 2SPSA-3 with our Hessian improvement scheme converge asymptotically to the true Hessian. The key advantage of 2RDSA and 2SPSA-3 is that they require only 75% of the per-iteration simulation cost of 2SPSA with improved Hessian estimation (2SPSA-IH) [29]. Numerical experiments show that 2RDSA-IH outperforms both 2SPSA-IH and 2RDSA without the improved Hessian estimation scheme. (Illustrative sketches of the gradient-estimation and Hessian-update ideas follow this record.) | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartofseries | G28169 | en_US |
dc.subject | Stochastic Newton Methods | en_US |
dc.subject | Hessian Estimation | en_US |
dc.subject | Simultaneous Perturbation Stochastic Approximation (SPSA) | en_US |
dc.subject | Random Directions Stochastic Approximation (RDSA) | en_US |
dc.subject | Finite-difference Stochastic Approximation (FDSA) | en_US |
dc.subject | Hessian Estimation Scheme | en_US |
dc.subject.classification | Computer Science | en_US |
dc.title | Stochastic Newton Methods With Enhanced Hessian Estimation | en_US |
dc.type | Thesis | en_US |
dc.degree.name | MSc Engg | en_US |
dc.degree.level | Masters | en_US |
dc.degree.discipline | Faculty of Engineering | en_US |
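To make the simulation-based search described in the abstract concrete, below is a minimal Python sketch of one simultaneous-perturbation (SPSA) gradient estimate driving a gradient-descent loop. This is not the thesis code: the objective f, the step-size and perturbation-size schedules, and all variable names are illustrative assumptions.

    import numpy as np

    def spsa_gradient(f, theta, delta, rng):
        # One simultaneous-perturbation gradient estimate: two noisy
        # evaluations of f along a random Rademacher direction d yield an
        # estimate of the full gradient, whatever the dimension of theta.
        d = rng.choice([-1.0, 1.0], size=theta.shape)
        y_plus = f(theta + delta * d)
        y_minus = f(theta - delta * d)
        return (y_plus - y_minus) / (2.0 * delta * d)

    # Hypothetical usage: minimize a noisy quadratic via the iterative update.
    rng = np.random.default_rng(0)

    def f(x):
        # Stand-in "simulation": a quadratic plus observation noise.
        return float(x @ x) + 0.01 * rng.normal()

    theta = np.ones(5)
    for n in range(1, 501):
        a_n = 0.1 / n                  # gradient step-size (assumed schedule)
        delta_n = 0.05 / n ** 0.101    # perturbation size (standard decay rate)
        theta = theta - a_n * spsa_gradient(f, theta, delta_n, rng)

Only two simulations are needed per gradient estimate regardless of dimension, which is the usual argument for simultaneous perturbation over coordinate-wise finite differences.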
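The two Hessian-improvement ideas from the abstract, (i) a zero-mean feedback term and (ii) optimized step-sizes, can be sketched as a weighted-averaging recursion on the per-iterate Hessian estimate. The exact feedback term and optimized weights are derived in [29] and in the thesis; the plain running-average weight and the 'feedback' argument below are placeholder assumptions, and the per-iterate estimate H_hat (produced by the 2RDSA or 2SPSA-3 perturbation scheme) is taken as given.

    def hessian_update(H_bar, H_hat, feedback, n):
        # Weighted-averaging recursion for the smoothed Hessian estimate.
        # Subtracting the zero-mean 'feedback' term before averaging cancels
        # the known part of the per-iterate estimation error (idea (i)).
        # The running-average weight w = 1/(n+1) is a placeholder; the thesis
        # optimizes these step-sizes instead (idea (ii)).
        w = 1.0 / (n + 1)
        return (1.0 - w) * H_bar + w * (H_hat - feedback)

The smoothed estimate returned here would then be regularized (e.g., projected to be positive definite) before being inverted in the stochastic Newton update.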