TY - JOUR

T1 - On multivariate discrete least squares

AU - Lee, Yeon Ju

AU - Micchelli, Charles A.

AU - Yoon, Jungho

N1 - Funding Information:
Yeon Ju Lee was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2015R1C1A1A02037556). Jungho Yoon was supported by grants NRF-2015-R1A5A1009350 and NRF-2015-R1D1A1A09057553 through the National Research Foundation of Korea. Charles A. Micchelli was supported by US National Science Foundation grant DMS-1522339.
Publisher Copyright:
© 2016

PY - 2016/11/1

Y1 - 2016/11/1

N2 - For a positive integer n∈N we introduce the index set Nn:={1,2,…,n}. Let X:={xi:i∈Nn} be a set of distinct vectors in Rd, Y:={yi:i∈Nn} a prescribed data set of real numbers in R, and F:={fj:j∈Nm}, m<n, a given set of real-valued continuous functions defined on an open set O in Rd containing X. The discrete least squares problem determines a (generally unique) function f=∑j∈Nmcj⋆fj∈spanF which minimizes the square of the ℓ2-norm ∑i∈Nn(∑j∈Nmcjfj(xi)−yi)2 over all vectors (cj:j∈Nm)∈Rm. The value of f at some s∈O may be viewed as the optimally predicted value (in the ℓ2-sense) of all functions in spanF from the given data X={xi:i∈Nn} and Y={yi:i∈Nn}. We ask: “What happens if the components of X and s are nearly the same?” For example, this occurs when all these vectors are near the origin in Rd. From a practical point of view, this problem arises in image analysis when we wish to obtain a new pixel value from nearby available pixel values, as was done in [2], for a specified set of functions F. This problem was satisfactorily solved in the univariate case in Section 6 of Lee and Micchelli (2013). Here, we treat the significantly more difficult multivariate case using an approach recently provided in Lee, Micchelli and Yoon (2015).

AB - For a positive integer n∈N we introduce the index set Nn:={1,2,…,n}. Let X:={xi:i∈Nn} be a set of distinct vectors in Rd, Y:={yi:i∈Nn} a prescribed data set of real numbers in R, and F:={fj:j∈Nm}, m<n, a given set of real-valued continuous functions defined on an open set O in Rd containing X. The discrete least squares problem determines a (generally unique) function f=∑j∈Nmcj⋆fj∈spanF which minimizes the square of the ℓ2-norm ∑i∈Nn(∑j∈Nmcjfj(xi)−yi)2 over all vectors (cj:j∈Nm)∈Rm. The value of f at some s∈O may be viewed as the optimally predicted value (in the ℓ2-sense) of all functions in spanF from the given data X={xi:i∈Nn} and Y={yi:i∈Nn}. We ask: “What happens if the components of X and s are nearly the same?” For example, this occurs when all these vectors are near the origin in Rd. From a practical point of view, this problem arises in image analysis when we wish to obtain a new pixel value from nearby available pixel values, as was done in [2], for a specified set of functions F. This problem was satisfactorily solved in the univariate case in Section 6 of Lee and Micchelli (2013). Here, we treat the significantly more difficult multivariate case using an approach recently provided in Lee, Micchelli and Yoon (2015).

KW - Collocation matrix

KW - Multivariate least squares

KW - Multivariate Maclaurin expansion

KW - Wronskian

UR - http://www.scopus.com/inward/record.url?scp=84982124217&partnerID=8YFLogxK

U2 - 10.1016/j.jat.2016.07.005

DO - 10.1016/j.jat.2016.07.005

M3 - Article

AN - SCOPUS:84982124217

VL - 211

SP - 78

EP - 84

JO - Journal of Approximation Theory

JF - Journal of Approximation Theory

SN - 0021-9045

ER -