Convergence analysis of central and minimax algorithms in scalar regressor models
Abstract
In this paper, the estimation of a scalar parameter is considered when lower and upper bounds on the scalar regressor are given. We derive non-asymptotic lower and upper bounds on the convergence rates of the parameter estimate variances of the central and the minimax algorithms for noise probability density functions with thin-tailed distributions. This extends previous work on constant scalar regressors to arbitrary scalar regressors with magnitude constraints. We expect our results to stimulate further research interest in the statistical analysis of these set-based estimators when the unknown parameter is multi-dimensional and the probability distribution of the noise is more general than in the present setup.
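For intuition only, below is a minimal sketch (not taken from the paper) of the central algorithm in a scalar regressor model, assuming the standard set-membership form y_k = φ_k·θ + v_k with |v_k| ≤ ε and regressor magnitudes confined to a known interval. The function name `central_estimate` and the toy thin-tailed noise draw are illustrative assumptions, not the authors' code.

```python
import numpy as np

def central_estimate(phi, y, eps):
    """Central (Chebyshev-center) estimate of a scalar parameter theta
    from samples y_k = phi_k * theta + v_k with |v_k| <= eps.

    Each sample confines theta to the interval
    [(y_k - eps) / phi_k, (y_k + eps) / phi_k] (endpoints swap if phi_k < 0);
    the membership set is the intersection of these intervals, and the
    central estimate is its midpoint.
    """
    phi = np.asarray(phi, dtype=float)
    y = np.asarray(y, dtype=float)
    lo = np.where(phi > 0, (y - eps) / phi, (y + eps) / phi)
    hi = np.where(phi > 0, (y + eps) / phi, (y - eps) / phi)
    lower, upper = lo.max(), hi.min()
    if lower > upper:
        raise ValueError("empty membership set: noise bound eps too small")
    return 0.5 * (lower + upper), (lower, upper)

# Toy usage: regressor magnitudes in [0.5, 2.0]; noise is a stand-in for a
# thin-tailed density supported on [-eps, eps] (little mass near the bound).
rng = np.random.default_rng(0)
theta_true, eps, n = 1.3, 0.1, 200
phi = rng.uniform(0.5, 2.0, n)
v = eps * rng.beta(4, 4, n) * rng.choice([-1, 1], n)
theta_hat, interval = central_estimate(phi, theta_true * phi + v, eps)
print(theta_hat, interval)
```

The minimax algorithm analyzed in the paper differs in that it minimizes the worst-case estimation error over the membership set rather than taking its midpoint; the paper's variance bounds quantify how fast both estimates concentrate around the true parameter.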
Source
Mathematics of Control, Signals, and Systems, Volume 18, Issue 1