In matrix recovery from random linear measurements, one is interested in recovering an unknown $M$-by-$N$ matrix $X_0$ from $n<MN$ measurements $y_i=\operatorname{Tr}(A_i^T X_0)$, where each $A_i$ is an $M$-by-$N$ measurement matrix with i.i.d. random entries, $i=1,\ldots,n$. We present a novel matrix recovery algorithm, based on approximate message passing (AMP), which iteratively applies an optimal singular value shrinker -- a nonconvex nonlinearity tailored specifically for matrix estimation. Our algorithm typically converges exponentially fast, offering a significant speedup over previously suggested matrix recovery algorithms, such as iterative solvers for Nuclear Norm Minimization (NNM). It is well known that there is a tradeoff between the information content of the object $X_0$ to be recovered (specifically, its matrix rank $r$) and the number of linear measurements $n$ from which recovery is to be attempted. The precise relation between $r$ and $n$ at which recovery by a given algorithm first becomes possible traces the so-called phase transition curve of that algorithm in the $(r,n)$ plane. The phase transition curve of our algorithm is noticeably better than that of NNM. Interestingly, it is close to the information-theoretic lower bound on the minimal number of measurements needed for matrix recovery, making our algorithm not only state-of-the-art in terms of convergence rate, but also near-optimal in terms of the matrices it successfully recovers.
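As a schematic sketch (not the paper's exact recursion), an AMP-style iteration with a singular value shrinker can be written as follows; the linear operator $\mathcal{A}(X)=(\operatorname{Tr}(A_1^T X),\ldots,\operatorname{Tr}(A_n^T X))$, its adjoint $\mathcal{A}^*$, the shrinker $\eta_t$, and the Onsager coefficient $b_t$ are illustrative notation only:
$$
z^t \;=\; y-\mathcal{A}(X^t)+b_t\,z^{t-1},
\qquad
X^{t+1} \;=\; \eta_t\!\left(X^t+\mathcal{A}^*(z^t)\right),
$$
where $\eta_t$ shrinks the singular values of its argument while retaining its singular vectors, and the memory term $b_t\,z^{t-1}$ is the Onsager correction characteristic of AMP iterations.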