\item Section 2.4: Problem 13. Also derive this formula via a quadratic approximation of $x = f^{-1}(y)$ at $(y,x) = (f(x_n), x_n)$.
\item Use the code for Algorithm 2.2 to check whether the standard fixed-point iteration $x_{n+1} = g(x_n) = 2 \cos(x_n)$ converges to a root of $f(x) = x - 2\cos(x) = 0$. If not, can you explain why? Can you find a constant $\beta$ such that the modified fixed-point iteration $x_{n+1} = \beta x_n + (1-\beta) g(x_n)$ converges (at least locally)?
\item Section 2.5: Problems 12(a), 14, 15, 16.
\item In Example 1 of Section 2.5, the condition of Theorem 2.14 is not satisfied. Nevertheless, $\hat p_n$ still converges to $p$ faster than $p_n$, though with the same order of convergence. Analyze Aitken's $\Delta^2$ method to evaluate $\displaystyle{\lim_{n\to\infty} \frac{\hat p_{n}-p}{p_{n+2} - p}}$. Then verify your result numerically.
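As a sanity check for the fixed-point item, the experiment can be sketched in a few lines of Python. This is a minimal stand-in for Algorithm 2.2 (plain functional iteration); the starting point $x_0 = 1$, tolerance, and iteration cap are choices of this sketch, not taken from the text.

```python
import math

def fixed_point(g, x0, tol=1e-10, max_iter=200):
    """Functional iteration x_{n+1} = g(x_n); returns (approx, iterations, converged)."""
    x = x0
    for n in range(1, max_iter + 1):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, n, True
        x = x_new
    return x, max_iter, False

g = lambda x: 2.0 * math.cos(x)

# Standard iteration: near the root p ~ 1.0299 we have |g'(p)| = 2|sin p| ~ 1.7 > 1,
# so the iterates never settle down.
_, _, ok_std = fixed_point(g, 1.0)
print("standard iteration converged:", ok_std)   # expected: False

# Modified iteration with beta = 1/2: h'(p) = beta + (1 - beta) g'(p) ~ -0.36,
# so |h'(p)| < 1 and local convergence is restored.
beta = 0.5
h = lambda x: beta * x + (1 - beta) * g(x)
root, iters, ok = fixed_point(h, 1.0)
print("modified iteration:", root, iters, ok)
```

The choice $\beta = 1/2$ is just one value for which $|\beta + (1-\beta)g'(p)| < 1$; the exercise asks you to characterize the admissible range.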
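For the Aitken item, the numerical verification can be sketched as below, assuming the sequence of Example 1 is $p_n = \cos(1/n)$ with limit $p = 1$ (if your edition uses a different sequence, substitute it); the ratio being printed is the one whose limit the exercise asks for.

```python
import math

def aitken(p):
    """Aitken's Delta^2: phat_n = p_n - (Delta p_n)^2 / (Delta^2 p_n)."""
    return [p[n] - (p[n + 1] - p[n])**2 / (p[n + 2] - 2.0 * p[n + 1] + p[n])
            for n in range(len(p) - 2)]

p_limit = 1.0
p = [math.cos(1.0 / n) for n in range(1, 203)]   # p_1, ..., p_202
phat = aitken(p)

# Watch the ratio (phat_n - p) / (p_{n+2} - p) stabilize as n grows;
# its limit is the constant the exercise asks you to derive analytically.
for n in (10, 50, 100, 200):
    i = n - 1                                    # list index of p_n
    ratio = (phat[i] - p_limit) / (p[i + 2] - p_limit)
    print(n, ratio)
```

Note that $\Delta^2 p_n$ decays like $n^{-4}$ here, so taking $n$ much larger than a few hundred starts to lose accuracy to floating-point cancellation in the denominator.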