optimisation problem

sm2n10

Hi, I have the following equation:


f(z)=g(z)+b*u(z)


where z=(x,y), i.e. bivariate; b is a parameter, u(z) is the uniform distribution, and g(z) is a function that represents distance.


Considering for a moment b=0, min(f(z)) gives me the location of minimum distance. However, because I want locations that are not always the same, I add u(z). With b it's possible to change the influence of u(z): very high values of b give very random positions, while if b is very small, only locations around the minimum are chosen.


Furthermore, I have some reference locations zr={(x1,y1),(x2,y2),...,(xn,yn)}. I'm trying to figure out the best b so that the locations produced from f(z) are as close as possible to zr.
Is there a method or some code I could use to solve this?
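
To make the setup concrete, here is a rough Python sketch of what I mean (not a real implementation: g, the reference points, and the grid are made up, and I'm assuming u(z) means an i.i.d. uniform draw per candidate location). For each candidate b it samples locations by minimising f over a grid and measures how close they fall to zr, then keeps the b with the smallest mismatch.

[CODE]
import numpy as np

rng = np.random.default_rng(0)

def g(z, target=np.array([0.5, 0.5])):
    # hypothetical distance function: Euclidean distance to a fixed target
    return np.linalg.norm(z - target, axis=-1)

def sample_location(b, grid):
    # f(z) = g(z) + b * u(z); pick the grid point that minimises f
    f = g(grid) + b * rng.uniform(size=len(grid))
    return grid[np.argmin(f)]

def mean_nearest_distance(samples, zr):
    # average distance from each reference point to its closest sampled point
    d = np.linalg.norm(zr[:, None, :] - samples[None, :, :], axis=-1)
    return d.min(axis=1).mean()

# candidate grid of z = (x, y) values and some made-up reference locations zr
xs = np.linspace(0, 1, 50)
grid = np.array([(x, y) for x in xs for y in xs])
zr = np.array([[0.4, 0.6], [0.55, 0.45], [0.5, 0.7]])

best_b, best_err = None, np.inf
for b in np.linspace(0.0, 2.0, 21):          # simple grid search over b
    samples = np.array([sample_location(b, grid) for _ in range(200)])
    err = mean_nearest_distance(samples, zr)
    if err < best_err:
        best_b, best_err = b, err

print(f"best b = {best_b:.2f}, mismatch = {best_err:.3f}")
[/CODE]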


Thanks
 
This is a standard model for regression fitting. That is, you have a set of 'measured' data [your {(x1,y1), (x2,y2), ..., (xn,yn)}] which you think should follow some model [your g(z)], but the 'measurements' have some error in them [your b u(z)]. Linear regression is so called not because the model is linear in x, but because the coefficients occur in a linear fashion.

As a very simple example, assume our data came from a line
\(\displaystyle x_j = x_0 + a\, z_j + e_{xj}\)
\(\displaystyle y_j = y_0 + b\, z_j + e_{yj}\)
Then the sum of squared errors E is
\(\displaystyle E = \sum_j \left(e_{xj}^2 + e_{yj}^2\right)\)
We want to make this as small as we can by choosing the parameters x0, y0, a, and b. This is a simple linear least-squares regression problem; see
http://mathworld.wolfram.com/LeastSquaresFitting.html
for example.
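
For concreteness, here is a short NumPy sketch (my own illustration, with synthetic rather than real data) that fits that straight-line model by minimising the sum of squared errors E:

[CODE]
import numpy as np

rng = np.random.default_rng(1)

# synthetic data generated from a known line plus noise
z = np.linspace(0, 10, 30)
x = 1.0 + 0.5 * z + rng.normal(scale=0.2, size=z.size)   # x0 = 1.0, a = 0.5
y = -2.0 + 1.5 * z + rng.normal(scale=0.2, size=z.size)  # y0 = -2.0, b = 1.5

# design matrix [1, z]; np.linalg.lstsq minimises the sum of squared errors
A = np.column_stack([np.ones_like(z), z])
(x0, a), *_ = np.linalg.lstsq(A, x, rcond=None)
(y0, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(f"x0={x0:.3f}, a={a:.3f}, y0={y0:.3f}, b={b:.3f}")
[/CODE]

The recovered coefficients should land close to the values used to generate the data, which is exactly the least-squares fit described at the MathWorld link above.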

There are linear regression solvers online; for example, a Google search turns up a selection:
https://www.google.com/search?q=exp...utf-8&oe=utf-8#q=linear+regression+calculator
 