
Optimization set

Set

context $B$
context $Y, \leq$ … Non-strict partially ordered set
context $r : B \to Y$
definition $O_r := \{\beta \in B \mid \forall (b \in B).\; r(\beta) \leq r(b)\}$
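A minimal Python sketch (finite toy data, not from this note): for a finite $B$ and a totally ordered $Y$, the set $O_r$ can be computed directly from the definition.

```python
# Hypothetical toy data: a finite index set B and a map r : B -> Y,
# with Y the integers under the usual (total, hence partial) order.
B = ["a", "b", "c", "d"]
values = {"a": 3, "b": 1, "c": 2, "d": 1}

def r(beta):
    return values[beta]

# O_r collects every beta whose value r(beta) is <= r(b) for all b in B.
O_r = {beta for beta in B if all(r(beta) <= r(b) for b in B)}
print(O_r)  # both "b" and "d" attain the minimum value 1
```

Note that $O_r$ need not be a singleton: the minimum value can be attained by several elements of $B$.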

todo
#tag
If $p$ are parameters and $c_p(x)$ are curves with known $x_{\min}(c_p) = f(p)$, try to find $x_{\min}(c')$ by fitting $c_p$ to $c'$. What is $p$ here? Is there a scheme for extending the list of parameters $p$ that guarantees that eventually $c_p = c'$?

If $\min(r) \subseteq Y$ denotes the set of minimum values of $r$, then

$O_r = r^{-1}(\min(r))$

with $r^{-1} : \mathcal{P}(Y) \to \mathcal{P}(B)$.
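A Python sketch of the preimage formulation (toy values, not from this note): $O_r$ is the preimage, under $r$, of the set of minimum values.

```python
# Hypothetical toy data: a map r : B -> Y on a finite B, given as a dict.
values = {"a": 3, "b": 1, "c": 2, "d": 1}

min_r = {min(values.values())}  # min(r), here a one-element subset of Y
# r^{-1}(min(r)): all beta in B whose value lands in min(r).
O_r = {beta for beta, y in values.items() if y in min_r}
print(O_r)  # both "b" and "d" are preimages of the minimum value 1
```

For a finite $B$ this agrees with computing $O_r$ element-by-element from the definition.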

Compare with Solution set.

Parametrized regression

Consider a test set

$\langle x_0, y_0 \rangle \in X \times Y$,

where $y_0$ somehow depends on $x_0$.

Use B-family of fit functions

$f : B \to (X \to Y)$

(the indexed subspace of $Y^X$ is called the hypothesis space)

and from this family find the optimal fit (given by an optimal $\beta \in B$) w.r.t. a loss function

$V : Y \times Y \to Y$

by optimizing

$r(\beta) := V(f(\beta, x_0),\, y_0)$

As a remark, given a function $f$ (resp. a $\beta$), the value $V(f(\beta, x_0), y_0)$ (or a multiple thereof) is called the “empirical risk” in Statistical learning theory.
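The setup above can be sketched in Python with a scalar toy example (all values hypothetical): a family $f(\beta, \cdot)$ indexed by $\beta \in \mathbb{R}^2$, a squared-error loss $V$, and the induced map $r(\beta) = V(f(\beta, x_0), y_0)$ to be minimized.

```python
# A single test pair <x0, y0> in X x Y (hypothetical values).
x0, y0 = 2.0, 5.0

def f(beta, x):
    """B-family of fit functions, B = R^2: affine maps x |-> b0 + b1*x."""
    return beta[0] + beta[1] * x

def V(y_hat, y):
    """Squared-error loss on Y = R."""
    return (y_hat - y) ** 2

def r(beta):
    """Empirical risk of the hypothesis f(beta, .) on the pair <x0, y0>."""
    return V(f(beta, x0), y0)

print(r((1.0, 2.0)))  # f(beta, 2) = 1 + 2*2 = 5 = y0, so the risk is 0.0
```

Minimizing $r$ over $B$ then picks out the optimization set $O_r$ of best-fitting parameters.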

Linear regression w.r.t. least square

$f(\beta, x) := \beta_0 + \sum_{i=1}^N \beta_i\, x_i$

with loss function

$V(\hat{y}, y) = (\hat{y} - y) \cdot (\hat{y} - y)$

In practice, the $x_i$ may be vectors, and then the product in $V$ is taken to be an inner product.
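Assuming NumPy is available, a minimal least-squares sketch (toy data, $N = 1$): stack a column of ones for $\beta_0$ and solve for the $\beta$ minimizing the squared-error loss.

```python
import numpy as np

# Hypothetical data generated exactly by y = 1 + 2*x, so the optimal
# beta for f(beta, x) = beta_0 + beta_1 * x is (1, 2) with zero loss.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

A = np.column_stack([np.ones(len(X)), X])     # prepend the beta_0 column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # minimizes (A b - y).(A b - y)
print(beta)  # approximately [1. 2.]
```

Here the optimization set is a singleton; for rank-deficient $A$ it would contain infinitely many minimizers, of which `lstsq` returns one.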
