Hybrid Methods for Nonlinear Least Squares Problems
Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan
2019 - English
This contribution contains a description and analysis of effective methods for minimization of the nonlinear least squares function F(x) = (1/2) f^T(x) f(x), where x ∈ R^n and f ∈ R^m, together with extensive computational tests and comparisons of the introduced methods. All hybrid methods are described in detail and their global convergence is proved in a unified way. Some proofs concerning trust region methods, which are difficult to find in the literature, are also added. In particular, the report contains an analysis of a new simple hybrid method with Jacobian corrections (Section 8) and an investigation of the simple hybrid method for sparse least squares problems proposed previously in [33] (Section 14).
Keywords:
numerical optimization; nonlinear least squares; trust region methods; hybrid methods; sparse problems; partially separable problems; numerical experiments
Available in a digital repository NRGL
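The objective F(x) = (1/2) f^T(x) f(x) can be illustrated with a few lines of code. The following is a plain Gauss-Newton iteration, one ingredient that hybrid methods combine with quasi-Newton corrections, not any of the report's algorithms; the two-parameter exponential fit and its data are made up for illustration:

```python
import numpy as np

def gauss_newton_step(f, J, x):
    """One Gauss-Newton step for F(x) = 0.5 * f(x)^T f(x).

    Solves the linearized least squares problem min ||J(x) d + f(x)||,
    i.e. the normal equations J^T J d = -J^T f, and returns x + d.
    """
    fx = f(x)
    Jx = J(x)
    d = np.linalg.lstsq(Jx, -fx, rcond=None)[0]
    return x + d

# Toy residual: fit a, b in y = a*exp(b*t) to two points (illustrative data
# generated with a = 2, b = 1).
t = np.array([0.0, 1.0])
y = np.array([2.0, 2.0 * np.e])
f = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack([np.exp(x[1] * t),
                               x[0] * t * np.exp(x[1] * t)])

x = np.array([1.5, 0.5])
for _ in range(20):
    x = gauss_newton_step(f, J, x)
# x converges to (a, b) = (2, 1) on this zero-residual problem
```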
Does a Singular Symmetric Interval Matrix Contain a Symmetric Singular Matrix?
Rohn, Jiří
2019 - English
We consider the conjecture formulated in the title concerning existence of a symmetric singular matrix in a singular symmetric interval matrix. We show by means of a counterexample that it is generally not valid, and we prove that it becomes true under an additional assumption of positive semidefiniteness of the midpoint matrix. The proof is constructive.
Keywords:
symmetric interval matrix; singularity; positive semidefiniteness
Available in a digital repository NRGL
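A minimal numerical sketch of the question, not the paper's constructive proof: for a symmetric interval matrix [Ac − Δ, Ac + Δ] with a symmetric positive semidefinite midpoint Ac, one can hunt for a symmetric singular member along the one-dimensional segment Ac + tΔ by bisecting on the sign of the determinant. The 2×2 data below are made up for illustration:

```python
import numpy as np

def find_symmetric_singular(Ac, Delta, tol=1e-10):
    """Search the segment Ac + t*Delta, t in [-1, 1], for a symmetric
    singular matrix by bisection on the determinant sign.

    This is only an illustrative 1-D search: it finds a singular member
    when det changes sign along this particular symmetric path through
    the interval matrix [Ac - Delta, Ac + Delta]; it is not a general
    decision procedure.
    """
    lo, hi = -1.0, 1.0
    dlo = np.linalg.det(Ac + lo * Delta)
    dhi = np.linalg.det(Ac + hi * Delta)
    if dlo * dhi > 0:
        return None                      # no sign change on this path
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        dmid = np.linalg.det(Ac + mid * Delta)
        if dlo * dmid <= 0:
            hi = mid
        else:
            lo, dlo = mid, dmid
    return Ac + 0.5 * (lo + hi) * Delta

Ac = np.eye(2)                           # symmetric PSD midpoint
Delta = np.ones((2, 2))                  # symmetric radius matrix
A = find_symmetric_singular(Ac, Delta)
# A is symmetric and numerically singular (here A = Ac - 0.5*Delta)
```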
Transforming hierarchical images to program expressions using deep networks
Křen, Tomáš
2018 - English
We present a technique for training a neural network that, given an image, produces a formal description of that image. The basic motivation of the proposed technique is the intention to design a new tool for automatic program synthesis capable of transforming sensory data (in our case a static image, but generally a phenotype) into a formal code expression (i.e., the syntactic tree of a program) such that the code (from an evolutionary perspective, a genotype) evaluates to a value similar to the input data, ideally identical to it. Our approach is partially based on our technique for generating program expressions in the context of typed functional genetic programming. We present promising results on a simple image description language, achieved with a deep network combining a convolutional encoder of images with a recurrent decoder that generates program expressions in sequential prefix notation, and we propose possible future applications.
Keywords:
deep networks; automatic program synthesis; image processing
Available in a digital repository NRGL
Nonparametric Bootstrap Techniques for Implicitly Weighted Robust Estimators
Kalina, Jan
2018 - English
The paper is devoted to highly robust statistical estimators based on implicit weighting, which have the potential to find econometric applications. Two particular methods include a robust correlation coefficient based on least weighted squares regression and the minimum weighted covariance determinant estimator; the latter allows the mean and covariance matrix of multivariate data to be estimated. New tools are proposed for testing hypotheses about these robust estimators and for estimating their variance. The techniques considered in the paper include resampling approaches with and without replacement, i.e., permutation tests, bootstrap variance estimation, and bootstrap confidence intervals. The performance of the newly described tools is illustrated on numerical examples. They reveal that the robust procedures are suitable even for non-contaminated data, as their confidence intervals are not much wider than those for standard maximum likelihood estimators. While resampling without replacement turns out to be more suitable for hypothesis testing, bootstrapping with replacement yields reliable confidence intervals but not corresponding hypothesis tests.
Keywords:
robust statistics; econometrics; correlation coefficient; multivariate data
Fulltext is available at external website.
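A minimal sketch of nonparametric bootstrap variance estimation with replacement, using the ordinary sample correlation as a stand-in for the paper's robust correlation coefficient; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_var(x, y, stat, B=2000):
    """Nonparametric bootstrap (resampling pairs with replacement)
    estimate of the variance of a statistic of paired data, plus a
    simple percentile confidence interval.
    """
    n = len(x)
    vals = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, size=n)   # sample pairs with replacement
        vals[b] = stat(x[idx], y[idx])
    return vals.var(ddof=1), np.percentile(vals, [2.5, 97.5])

corr = lambda x, y: np.corrcoef(x, y)[0, 1]  # stand-in estimator
x = rng.normal(size=50)
y = x + 0.5 * rng.normal(size=50)

var, (lo, hi) = bootstrap_var(x, y, corr)
# `var` estimates Var(corr); (lo, hi) is a 95% percentile interval
```

Replacing `corr` with a robust estimator yields the bootstrap procedures the paper studies for implicitly weighted methods.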
How to down-weight observations in robust regression: A metalearning study
Kalina, Jan; Pitra, Zbyněk
2018 - English
Metalearning is becoming an increasingly important methodology for transferring knowledge from a database of available training data sets to a new (independent) data set. The concept is gaining popularity in statistical learning, and there is a growing number of metalearning applications in the analysis of economic data sets as well. Still, not much attention has been paid to its limitations and disadvantages. To this end, we apply various linear regression estimators (including highly robust ones) to a set of 30 data sets with an economic background and perform a metalearning study over them, as well as over the same data sets after artificial contamination. We focus on comparing the prediction performance of the least weighted squares estimator under various weighting schemes. A broader spectrum of classification methods is applied, and a support vector machine turns out to yield the best results. Because the results of leave-one-out cross validation are very different from those of autovalidation, we conclude that metalearning is highly unstable and its results should be interpreted with care. We also discuss the limitations of the metalearning methodology in general.
Keywords:
metalearning; robust statistics; linear regression; outliers
Available on request at various institutes of the ASCR
Sparse Test Problems for Nonlinear Least Squares
Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan
2018 - English
This report contains a description of subroutines which can be used for testing large-scale optimization codes. These subroutines can easily be obtained from the web page http://www.cs.cas.cz/~luksan/test.html. Furthermore, all test problems contained in these subroutines are presented in the analytic form.
Keywords:
large-scale optimization; least squares; test problems
Available in a digital repository NRGL
Problems for Nonlinear Least Squares and Nonlinear Equations
Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan
2018 - English
This report contains a description of subroutines which can be used for testing large-scale optimization codes. These subroutines can easily be obtained from the web page http://www.cs.cas.cz/~luksan/test.html. Furthermore, all test problems contained in these subroutines are presented in the analytic form.
Keywords:
large-scale optimization; least squares; nonlinear equations; test problems
Available in a digital repository NRGL
Robust Metalearning: Comparing Robust Regression Using A Robust Prediction Error
Peštová, Barbora; Kalina, Jan
2018 - English
The aim of this paper is to construct a classification rule for predicting the best regression estimator for a new data set, based on a database of 20 training data sets. The estimators considered here include some popular methods of robust statistics. The methodology used for constructing the classification rule can be described as metalearning. Nevertheless, standard approaches to metalearning should be robustified when working with data sets contaminated by outlying measurements (outliers). Our contribution can therefore also be described as a robustification of the metalearning process by means of a robust prediction error. In addition to performing the metalearning study with both standard and robust approaches, we search for a detailed interpretation in two particular situations. The results of this detailed investigation show that the knowledge obtained by a metalearning approach based on standard principles is prone to great variability and instability, which makes it hard to believe that the results are not merely a consequence of chance. This aspect of metalearning does not seem to have been analyzed previously in the literature.
Keywords:
metalearning; robust regression; outliers; robust prediction error
Fulltext is available at external website.
A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions
Vlček, Jan; Lukšan, Ladislav
2018 - English
Keywords:
unconstrained minimization; variable metric methods; limited-memory methods; the repeated BFGS update; global convergence; numerical results
Available in digital repository of the ASCR
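For orientation, the classical limited-memory two-loop recursion that limited-memory variable metric methods build on can be sketched as follows; this is the standard L-BFGS baseline, not the report's repeated BNS update with conjugate directions:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: applies the limited-memory
    inverse-Hessian approximation (built from stored step/gradient-
    difference pairs s_k, y_k) to `grad` and returns a descent direction.
    """
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Scaled initial inverse Hessian H0 = gamma * I.
    if s_list:
        gamma = s_list[-1].dot(y_list[-1]) / y_list[-1].dot(y_list[-1])
        q *= gamma
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * y.dot(q)
        q += (a - b) * s
    return -q

# With empty memory the direction is plain steepest descent:
d0 = lbfgs_direction(np.array([1.0, 2.0]), [], [])   # -> [-1, -2]
```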
Numerical solution of generalized minimax problems
Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan
2018 - English
Keywords:
numerical optimization; nonlinear approximation; nonsmooth optimization; generalized minimax problems; recursive quadratic programming methods; interior point methods; smoothing methods; algorithms; numerical experiments
Available in digital repository of the ASCR
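One classical tool behind the smoothing methods named in the keywords is exponential (log-sum-exp) smoothing of the max function in a minimax objective F(x) = max_i f_i(x). A minimal sketch, not the report's algorithm:

```python
import numpy as np

def smooth_max(f_vals, mu):
    """Exponential (log-sum-exp) smoothing of max_i f_i:
    F_mu = mu * log(sum_i exp(f_i / mu)), which satisfies
    max_i f_i <= F_mu <= max_i f_i + mu * log(m) for m terms,
    so F_mu -> max_i f_i as mu -> 0.
    """
    f = np.asarray(f_vals, dtype=float)
    fmax = f.max()                       # shift for numerical stability
    return fmax + mu * np.log(np.exp((f - fmax) / mu).sum())

# The smooth surrogate approaches the true max as mu decreases:
vals = [1.0, 3.0, 2.5]
approx = [smooth_max(vals, mu) for mu in (1.0, 0.1, 0.01)]
```

The smoothed function is differentiable, so standard unconstrained optimization machinery applies, at the cost of controlling the smoothing parameter mu.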
NRGL provides central access to information on grey literature produced in the Czech Republic in the fields of science, research and education. You can find more information about grey literature and NRGL at service web