Hello,
I have the following question: if f is a convex function, we know that for every x, y and for every 0 <= t <= 1 we have f(tx+(1-t)y) <= tf(x)+(1-t)f(y). How can we prove the reverse inequality, f(tx+(1-t)y) >= tf(x)+(1-t)f(y), when t >= 1?
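For reference, here is a sketch of the substitution argument I have been trying to make work, assuming f is convex on an interval containing all the points involved; the idea is to write x itself as a convex combination of the point z = tx+(1-t)y and y, so that the ordinary convexity inequality can be applied:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $t \ge 1$ and set $z = tx + (1-t)y$. Since $\tfrac{1}{t} \in (0,1]$,
the point $x$ is a convex combination of $z$ and $y$:
\[
  \tfrac{1}{t}\, z + \Bigl(1 - \tfrac{1}{t}\Bigr) y
  = x + \tfrac{1-t}{t}\, y + \tfrac{t-1}{t}\, y
  = x .
\]
Applying the convexity inequality with weight $\tfrac{1}{t}$ gives
\[
  f(x) \le \tfrac{1}{t}\, f(z) + \Bigl(1 - \tfrac{1}{t}\Bigr) f(y),
\]
and multiplying through by $t \ge 1$ and rearranging yields
\[
  f\bigl(tx + (1-t)y\bigr) = f(z) \ge t f(x) + (1-t) f(y).
\]
\end{document}
```

Is this the right way to do it, or is there a cleaner argument?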
Thanks