This question arose while working through the typical exercise of computing the expected grade of a multiple-choice exam answered at random (where each correct answer is worth $p_1$ points and each wrong answer is worth $-p_2$ points).
Let $X \sim \mathrm{Bi}(n,p)$ be the number of correct answers among the $n$ questions, and consider the grade $h(x)=p_1x -p_2(n-x)= -np_2+(p_1+p_2)x$.
In this example, the following equality holds:
$E[h(X)]=h(E[X])$.
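(Concretely, computing the expected score question by question and summing, which is presumably how the exercise is meant to be solved, gives
$$E[h(X)] = n\bigl(p_1 p - p_2(1-p)\bigr) = p_1(np) - p_2\bigl(n - np\bigr) = h(np) = h(E[X]),$$
using $E[X]=np$ for the binomial.)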
My question is: Is this equality always true, for any random variable $X$ and any function $h$? If not, what are some sufficient conditions for it?
For instance: if $X$ is a discrete one-dimensional random variable with finite expected value and $h$ is a linear (i.e., affine) function, does it always follow that $E[h(X)]=h(E[X])$?
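For what it's worth, here is a quick Monte Carlo sketch one could run to check the equality numerically in the exam example (the parameter values below are just illustrative, not from the original exercise):

```python
import numpy as np

# Monte Carlo sanity check of E[h(X)] = h(E[X]) for the exam example.
# Illustrative parameters: 20 questions, 4 options each answered at random,
# +1 point per correct answer, -0.5 points per wrong answer.
rng = np.random.default_rng(0)
n, p = 20, 0.25
p1, p2 = 1.0, 0.5

def h(x):
    """Grade as a function of the number x of correct answers."""
    return p1 * x - p2 * (n - x)

X = rng.binomial(n, p, size=1_000_000)  # simulated counts of correct answers
print(h(X).mean())  # Monte Carlo estimate of E[h(X)]
print(h(n * p))     # h(E[X]), using E[X] = n*p for the binomial
```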