Suppose the hypotheses are satisfied but, contrary to the conclusion, $L>M$. We will show that this leads to a contradiction.
By the Limit Law for differences, $\lim_{x\to a}\ (g(x)-f(x))=M-L$.
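For reference, recall the definition being applied in the next step: $\lim_{x\to a}h(x)=K$ means that for every $\epsilon>0$ there exists a $\delta>0$ such that
$$
0<|x-a|<\delta \quad\Longrightarrow\quad |h(x)-K|<\epsilon ,
$$
where $h$ and $K$ are placeholders for the function and limit in question.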
Since $L-M>0$ by assumption, we may apply this definition with $\epsilon=L-M$ and deduce that there exists a $\delta>0$ such that if $0< |x-a|< \delta$ then $|g(x)-f(x)-(M-L)|< L-M$.
Since $r\le |r|$ for any real number $r$, we deduce that if $0< |x-a|< \delta$ then $g(x)-f(x)-M+L< L-M$, hence $g(x)< f(x)$.
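Spelled out, this step chains $r\le|r|$ with the bound from the previous line:
$$
g(x)-f(x)-(M-L)\;\le\;\bigl|g(x)-f(x)-(M-L)\bigr|\;<\;L-M,
$$
and adding $M-L$ to both sides of the resulting strict inequality gives $g(x)-f(x)<0$.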
Let $c=\min\{b,\,a+\delta\}$; since $b>a$ and $a+\delta>a$, we have $c>a$, so the interval $(a,c)$ is nonempty and we may choose some $x'\in(a,c)$.
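Both facts we need about $x'$ follow directly from the definition of $c$:
$$
a<x'<c\le a+\delta
\qquad\text{and}\qquad
a<x'<c\le b .
$$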
Then $0< x'-a=|x'-a|< \delta$, so it follows that $g(x')< f(x')$.
But $x'\in(a,b)$, since $x'< c\le b$, so this contradicts the assumption that $f(x)\le g(x)$ for all $x\in(a,b)$.
Therefore $L>M$ is impossible, and the theorem is proved.