Hi all,
I am currently facing the following optimization problem:
[imath]f(v) = r_f + v'\mu + \tfrac{1}{2}\,v'\operatorname{diag}(\Sigma) - \tfrac{1}{2}\,v'\Sigma v + \tfrac{1}{2}(1-\gamma)\,v'\Sigma v[/imath]
s.t.
[imath]1'*v=1[/imath] (sum of the vector elements = 1)
where [imath]\Sigma[/imath] is the nxn variance-covariance matrix, [imath]diag(\Sigma)[/imath] the nx1 vector of its variances, [imath]\mu[/imath] an nx1 vector of constants, [imath]\gamma, r_f[/imath] scalar constants, [imath]1'[/imath] the transpose of the nx1 vector of ones, and finally the variable of interest [imath]v[/imath] an nx1 vector with elements [imath]v_1, ..., v_n[/imath].
I defined the Lagrangian as [imath]L = f(v) + \lambda(1'v - 1)[/imath] and took the derivatives w.r.t. [imath]v[/imath] and w.r.t. [imath]\lambda[/imath]. That is where I am stuck: I am not able to solve the two resulting equations for [imath]\lambda[/imath] and then for the variable of interest [imath]v[/imath]. The goal is to find the vector [imath]v[/imath] that maximizes [imath]f[/imath] under the above equality constraint.
Could anybody help me with solving this problem?
Thank you very much in advance.
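In case it helps to see the Lagrangian approach carried through numerically: the two quadratic terms combine to [imath]-\tfrac{\gamma}{2}v'\Sigma v[/imath], so the first-order condition [imath]\mu + \tfrac{1}{2}diag(\Sigma) - \gamma\Sigma v + \lambda 1 = 0[/imath] can be solved for [imath]v[/imath] in terms of [imath]\lambda[/imath], and [imath]1'v = 1[/imath] then pins down [imath]\lambda[/imath]. A minimal sketch, assuming [imath]\gamma > 0[/imath] and [imath]\Sigma[/imath] positive definite (so the problem is concave and the stationary point is the maximizer); the numbers below are purely illustrative:

```python
import numpy as np

# Illustrative data for n = 3 (hypothetical values, not from the post)
gamma = 3.0
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
mu = np.array([0.05, 0.07, 0.10])
ones = np.ones(3)

# The two quadratic terms combine: -v'Σv/2 + (1-γ)v'Σv/2 = -(γ/2) v'Σv,
# so ∂L/∂v = μ + diag(Σ)/2 - γΣv + λ1 = 0.
a = mu + np.diag(Sigma) / 2
Sigma_inv = np.linalg.inv(Sigma)

# Solve the stationarity condition for v, then use 1'v = 1 to fix λ:
#   v = Σ⁻¹(a + λ1)/γ  and  1'v = 1  ⇒  λ = (γ - 1'Σ⁻¹a) / (1'Σ⁻¹1)
lam = (gamma - ones @ Sigma_inv @ a) / (ones @ Sigma_inv @ ones)
v = Sigma_inv @ (a + lam * ones) / gamma

print("v =", v, " sum:", v.sum())  # the weights sum to 1 by construction
```

The same closed form holds for any n; only the inversion of [imath]\Sigma[/imath] (or, better numerically, a linear solve) is needed.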