2911. Alternating Minimization for Least Squares Estimation in Additive Semi-parametric Regression with Monotone Constraints
Invited abstract in session WA-34: Advancements of OR-analytics in statistics, machine learning and data science 6, stream Advancements of OR-analytics in statistics, machine learning and data science.
Wednesday, 8:30-10:00, Room: Michael Sadler LG10
Authors (first author is the speaker)
1. Anand Kumar (IEOR, IIT Bombay)
2. Radhenduska Srivastava (Indian Institute of Technology, Bombay)
3. K. S. Mallikarjuna Rao (Indian Institute of Technology, Bombay)
Abstract
Consider an additive semi-parametric regression model in which each observation is the sum of a term linear in x and a monotone function of z. Each response y is observed with noise; the noise terms are independently and identically distributed Gaussian random variables with mean zero and fixed variance. The monotone component increases with the index, i.e., z increases strictly from one observation to the next. We note that x is not correlated with z.
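In symbols, the model described above can be written as follows (a sketch consistent with the abstract; the labels beta, f, and sigma^2 are ours):

```latex
% Model as described in the abstract (notation illustrative):
% a linear term in x plus a monotone function of z, with Gaussian noise.
y_i = \beta x_i + f(z_i) + \varepsilon_i, \qquad i = 1, \dots, n,
\qquad \varepsilon_i \overset{\text{iid}}{\sim} \mathcal{N}(0, \sigma^2),
\qquad z_1 < z_2 < \cdots < z_n, \qquad f \ \text{nondecreasing}.
```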
In this work, we present an alternating minimization method for least squares estimation of the parameters: the slope and the monotone function. Shape-constrained semi-parametric regression models are widely applicable to geological isotopic data, economics, astronomical data, etc. We also present a proof of convergence of the proposed alternating minimization method. When the sample size n increases in blocks in a dyadic manner, we also illustrate the rate of convergence of the estimator. A simulation study will be presented to illustrate the finite-sample performance and convergence.
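As a concrete illustration of the alternating scheme, the following is a minimal sketch (not the authors' implementation; the function names, stopping rule, and the pool-adjacent-violators step for the isotonic update are our choices) that alternates an ordinary least squares update of the slope with a monotone least squares update of f:

```python
# Minimal sketch, assuming the model y_i = beta * x_i + f(z_i) + eps_i
# with f nondecreasing and the observations ordered so z is strictly increasing.
import numpy as np

def pava(y):
    """Isotonic (nondecreasing) least squares fit via pool adjacent violators."""
    vals, wts, cnt = [], [], []
    for v in y:
        vals.append(float(v)); wts.append(1.0); cnt.append(1)
        # merge adjacent blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            vals[-2] = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / w
            wts[-2] = w
            cnt[-2] += cnt[-1]
            vals.pop(); wts.pop(); cnt.pop()
    return np.repeat(vals, cnt)

def alt_min(x, y, max_iter=100, tol=1e-8):
    """Alternate an OLS update of the slope with an isotonic update of f."""
    beta = 0.0
    f = np.zeros_like(y, dtype=float)
    for _ in range(max_iter):
        beta_new = np.dot(x, y - f) / np.dot(x, x)  # least squares slope given f
        f = pava(y - beta_new * x)                  # monotone fit given the slope
        if abs(beta_new - beta) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta, f

# Illustrative data: x uncorrelated with z, f(z) = exp(z) nondecreasing.
rng = np.random.default_rng(0)
n = 200
z = np.sort(rng.uniform(0.0, 1.0, n))
x = rng.normal(size=n)
y = 2.0 * x + np.exp(z) + rng.normal(scale=0.3, size=n)
beta_hat, f_hat = alt_min(x, y)
```

The slope step omits an intercept, since any constant level can be absorbed into the monotone component; for a multivariate x the scalar update would become a standard least squares solve.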
Keywords
- Algorithms
- Programming, Nonlinear
- Convex Optimization
Status: accepted