Chebyshev vs Minimax

Note that the best minimax (uniform-norm) polynomial approximation need not be the Chebyshev approximation: the latter is optimal in the (Chebyshev-weighted) least-squares sense. However, if the function is regular enough, the Chebyshev approximation is often quite close to the minimax one.

It has been shown that if \(p_{n}^{\ast}\) is the minimax approximation and \(p_{n}(x)\) is the Chebyshev approximation of a continuous function \(f\) on \([-1,1]\), then:

\[ \|f - p_{n}\|_{\infty} \le \left(4 + \frac{4}{\pi^{2}} \ln n\right) \|f - p_{n}^{\ast}\|_{\infty}. \]

A tighter bound exists if \(f\) is very “regular”: for the elementary functions, the minimax approximation is at most about one bit more accurate than the Chebyshev one.

Speed of Convergence

If the function is not fairly regular, then we often need a fairly high-degree polynomial, which is slow to evaluate.
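As an illustrative sketch (not from the note itself), we can check how close the Chebyshev approximation comes in practice. This uses NumPy's `Chebyshev.interpolate`, i.e. interpolation at Chebyshev points, as a standard near-minimax stand-in for the truncated Chebyshev series:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Degree-5 Chebyshev interpolant of exp on [-1, 1].
# Interpolation at Chebyshev points is a common proxy for the
# truncated Chebyshev series and is known to be near-minimax.
n = 5
p = C.Chebyshev.interpolate(np.exp, n)

# Estimate the uniform error on a fine grid.
x = np.linspace(-1.0, 1.0, 10001)
err = np.exp(x) - p(x)
max_err = np.abs(err).max()
print(f"max |exp(x) - p_{n}(x)| on [-1,1]: {max_err:.2e}")
```

For a smooth function like \(\exp\), even a low degree already gives an error near \(10^{-4}\), and the error curve nearly equioscillates, which is exactly why the Chebyshev approximation sits so close to the minimax one.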

It has been shown that for any decreasing sequence
\(\epsilon_{i}\) converging to 0, there exists a continuous
function \(f\) such that the error of the minimax polynomial of
degree \(n\) is exactly \(\epsilon_{n}\). Thus, you pick your
rate of convergence, and there will be *some* function \(f\) that
is approximated at exactly that rate; in particular, minimax
approximation of a merely continuous function can converge
arbitrarily slowly.

No use of this theorem is given here; it seems about as useless a theorem as I can think of (at least for now).