Standard

On Machine Learning In Regression Analysis. / Leora, Svetlana Nikolaevna; Ermakov, Sergey Michaylovich.

10th International Workshop on Simulation and Statistics: Workshop booklet. Salzburg : Universitat Salzburg, 2019. p. 52.

Research output: Publications in books, reports, collections, conference proceedings › Article in conference proceedings › Research › Peer-reviewed

Harvard

Leora, SN & Ermakov, SM 2019, On Machine Learning In Regression Analysis. in 10th International Workshop on Simulation and Statistics: Workshop booklet. Universitat Salzburg, Salzburg, p. 52, 10th International Workshop on Simulation and Statistics, Salzburg, Austria, 2/09/19.

APA

Leora, S. N., & Ermakov, S. M. (2019). On Machine Learning In Regression Analysis. In 10th International Workshop on Simulation and Statistics: Workshop booklet (p. 52). Universitat Salzburg.

Vancouver

Leora SN, Ermakov SM. On Machine Learning In Regression Analysis. In: 10th International Workshop on Simulation and Statistics: Workshop booklet. Salzburg: Universitat Salzburg. 2019. p. 52

Author

Leora, Svetlana Nikolaevna ; Ermakov, Sergey Michaylovich. / On Machine Learning In Regression Analysis. 10th International Workshop on Simulation and Statistics: Workshop booklet. Salzburg : Universitat Salzburg, 2019. p. 52

BibTeX

@inproceedings{e93166de354e4e199bb3b0d4b7336964,
title = "On Machine Learning In Regression Analysis",
abstract = "As is known, the task of constructing a regression function from observed data is of great practical importance. In the case of additive observation errors at points whose coordinates are given without error, we have y_j = f(X_j) + ε_j, where j = 1, ..., N is the observation number, y_j is the observed value, X_j = (x_{1j}, ..., x_{sj}) is the point at which the observation was taken, and ε_j is the observation error; it is also assumed that E ε_j = 0. The task is to determine a function f, which is usually assumed to be given parametrically, f(X) = f(X, U), where U are unknown parameters. The problem has obvious connections with problems of function approximation. The report discusses one possible approach based on the idea of machine learning. It starts from the approximation problem for some function f. Let A be a linear operator acting in a normed linear space F. If A* is the adjoint of A and the functions φ_j and ψ_j in F satisfy the conditions A A* φ_j = s_j^2 φ_j and A* A ψ_j = s_j^2 ψ_j, j = 1, ..., r(A), then among all m-dimensional (m ≤ r(A)) operators A_m the operator Ã_m = Σ_{j=1}^{m} s_j (·, φ_j) ψ_j minimizes the norm ||A − A_m||. If K = I − A is an operator such that K f = 0, and in this equality we replace A with its approximation Ã_m, then we obtain an approximation to f of the form f̂ = Σ_{j=1}^{m} s_j (f, φ_j) ψ_j. The idea of “learning” is as follows. Let K(θ) be a parametric family of operators. Using sampled values of f, we find an operator K_0 = K(θ_0), where θ_0 = arg min ||K(θ)|| and θ ranges over the parameter set Θ. Setting A = I − K_0, we find the corresponding functions φ_j and ψ_j and construct the approximation f̂ for an appropriate m. Practical algorithms use spaces of functions defined at discrete points. A well-known particular case of this approach is SVD time series analysis. Various generalizations of this analysis are built on the same approach; some examples are given in the report.",
author = "Leora, {Svetlana Nikolaevna} and Ermakov, {Sergey Michaylovich}",
year = "2019",
month = sep,
language = "English",
pages = "52",
booktitle = "10th International Workshop on Simulation and Statistics",
publisher = "Universitat Salzburg",
address = "Austria",
note = "10th International Workshop on Simulation and Statistics ; Conference date: 02-09-2019 Through 06-09-2019",
url = "http://datascience.sbg.ac.at/SimStatSalzburg2019/index.html, https://www.osg.or.at/main.asp?VID=1&kat1=87&kat2=690&NID=3944",

}
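
The abstract describes the best m-dimensional approximation of an operator built from its singular system, Ã_m = Σ_{j=1}^{m} s_j (·, φ_j) ψ_j, and names SVD time series analysis as a well-known particular case. The sketch below is only an illustration of that idea under simple assumptions, not the authors' algorithm: the helper names best_rank_m and ssa_smooth and the parameters window and m are hypothetical. best_rank_m truncates the SVD of a matrix (the Eckart–Young approximation), and ssa_smooth applies it to the trajectory (Hankel) matrix of a noisy series, with diagonal averaging to map the low-rank matrix back to a series of the original length.

```python
import numpy as np

def best_rank_m(A, m):
    """Best m-dimensional approximation of A, as in the abstract:
    A_m = sum_{j=1}^{m} s_j (., phi_j) psi_j, built from the truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :m] @ np.diag(s[:m]) @ Vt[:m, :]

def ssa_smooth(y, window, m):
    """Toy SSA-style smoothing: embed y into a trajectory (Hankel) matrix,
    truncate its SVD to rank m, and average anti-diagonals back to a series."""
    N = len(y)
    K = N - window + 1
    X = np.column_stack([y[k:k + window] for k in range(K)])  # window x K trajectory matrix
    Xm = best_rank_m(X, m)
    out = np.zeros(N)
    counts = np.zeros(N)
    for k in range(K):
        out[k:k + window] += Xm[:, k]
        counts[k:k + window] += 1
    return out / counts  # diagonal averaging (Hankelization)

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 200)
    rng = np.random.default_rng(0)
    y = np.sin(3 * t) + 0.3 * rng.normal(size=t.size)   # noisy observations of f
    y_hat = ssa_smooth(y, window=40, m=2)
    # the rank-2 reconstruction should lie much closer to sin(3t) than the noisy data
    print(np.linalg.norm(y - np.sin(3 * t)), np.linalg.norm(y_hat - np.sin(3 * t)))
```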

RIS

TY - GEN

T1 - On Machine Learning In Regression Analysis

AU - Leora, Svetlana Nikolaevna

AU - Ermakov, Sergey Michaylovich

PY - 2019/9

Y1 - 2019/9

N2 - As is known, the task of constructing a regression function from observed data is of great practical importance. In the case of additive observation errors at points whose coordinates are given without error, we have y_j = f(X_j) + ε_j, where j = 1, ..., N is the observation number, y_j is the observed value, X_j = (x_{1j}, ..., x_{sj}) is the point at which the observation was taken, and ε_j is the observation error; it is also assumed that E ε_j = 0. The task is to determine a function f, which is usually assumed to be given parametrically, f(X) = f(X, U), where U are unknown parameters. The problem has obvious connections with problems of function approximation. The report discusses one possible approach based on the idea of machine learning. It starts from the approximation problem for some function f. Let A be a linear operator acting in a normed linear space F. If A* is the adjoint of A and the functions φ_j and ψ_j in F satisfy the conditions A A* φ_j = s_j^2 φ_j and A* A ψ_j = s_j^2 ψ_j, j = 1, ..., r(A), then among all m-dimensional (m ≤ r(A)) operators A_m the operator Ã_m = Σ_{j=1}^{m} s_j (·, φ_j) ψ_j minimizes the norm ||A − A_m||. If K = I − A is an operator such that K f = 0, and in this equality we replace A with its approximation Ã_m, then we obtain an approximation to f of the form f̂ = Σ_{j=1}^{m} s_j (f, φ_j) ψ_j. The idea of “learning” is as follows. Let K(θ) be a parametric family of operators. Using sampled values of f, we find an operator K_0 = K(θ_0), where θ_0 = arg min ||K(θ)|| and θ ranges over the parameter set Θ. Setting A = I − K_0, we find the corresponding functions φ_j and ψ_j and construct the approximation f̂ for an appropriate m. Practical algorithms use spaces of functions defined at discrete points. A well-known particular case of this approach is SVD time series analysis. Various generalizations of this analysis are built on the same approach; some examples are given in the report.

AB - As is known, the task of constructing a regression function from observed data is of great practical importance. In the case of additive observation errors at points whose coordinates are given without error, we have y_j = f(X_j) + ε_j, where j = 1, ..., N is the observation number, y_j is the observed value, X_j = (x_{1j}, ..., x_{sj}) is the point at which the observation was taken, and ε_j is the observation error; it is also assumed that E ε_j = 0. The task is to determine a function f, which is usually assumed to be given parametrically, f(X) = f(X, U), where U are unknown parameters. The problem has obvious connections with problems of function approximation. The report discusses one possible approach based on the idea of machine learning. It starts from the approximation problem for some function f. Let A be a linear operator acting in a normed linear space F. If A* is the adjoint of A and the functions φ_j and ψ_j in F satisfy the conditions A A* φ_j = s_j^2 φ_j and A* A ψ_j = s_j^2 ψ_j, j = 1, ..., r(A), then among all m-dimensional (m ≤ r(A)) operators A_m the operator Ã_m = Σ_{j=1}^{m} s_j (·, φ_j) ψ_j minimizes the norm ||A − A_m||. If K = I − A is an operator such that K f = 0, and in this equality we replace A with its approximation Ã_m, then we obtain an approximation to f of the form f̂ = Σ_{j=1}^{m} s_j (f, φ_j) ψ_j. The idea of “learning” is as follows. Let K(θ) be a parametric family of operators. Using sampled values of f, we find an operator K_0 = K(θ_0), where θ_0 = arg min ||K(θ)|| and θ ranges over the parameter set Θ. Setting A = I − K_0, we find the corresponding functions φ_j and ψ_j and construct the approximation f̂ for an appropriate m. Practical algorithms use spaces of functions defined at discrete points. A well-known particular case of this approach is SVD time series analysis. Various generalizations of this analysis are built on the same approach; some examples are given in the report.

UR - https://datascience.sbg.ac.at/SimStatSalzburg2019/SimStat2019BoA.pdf

M3 - Conference contribution

SP - 52

BT - 10th International Workshop on Simulation and Statistics

PB - Universitat Salzburg

CY - Salzburg

T2 - 10th International Workshop on Simulation and Statistics

Y2 - 2 September 2019 through 6 September 2019

ER -

ID: 48414591