An algorithm is described for evolving the phase-space density of stars or compact objects around a massive black hole at the center of a galaxy. The technique is based on numerical integration of the Fokker-Planck equation for the distribution function f(E, L, t) in energy-angular momentum space, and it includes, for the first time, diffusion coefficients that describe the effects of both random and correlated encounters (resonant relaxation), as well as energy loss due to the emission of gravitational waves. Destruction of stars, or their loss into the black hole, is treated by means of a detailed boundary-layer analysis. Performance of the algorithm is illustrated by computing two-dimensional, time-dependent and steady-state distribution functions and the corresponding loss rates.
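
To make the general idea concrete, the sketch below advances a density on a two-dimensional (E, R) grid with an explicit finite-difference step and a crude absorbing condition at an assumed loss-cone boundary. It is only a minimal illustration, not the paper's algorithm: the grid sizes, the constant diffusion coefficients D_EE and D_RR, the variable R = L^2/L_c^2(E), and the loss-cone radius R_lc are placeholder assumptions, and the drift, resonant-relaxation, and gravitational-wave terms are omitted.

import numpy as np

# Illustrative sketch only: an explicit finite-difference step for a 2D
# Fokker-Planck-like diffusion equation on an (E, R) grid, where
# R = L^2 / L_c^2(E) is a dimensionless angular-momentum variable.
# Grid sizes, coefficient values, and the loss-cone radius below are
# placeholder assumptions, not quantities from the paper.

nE, nR = 64, 64
E = np.linspace(0.1, 1.0, nE)          # binding-energy grid (arbitrary units)
R = np.linspace(1e-3, 1.0, nR)         # R = 1 corresponds to a circular orbit
dE, dR = E[1] - E[0], R[1] - R[0]

N = np.ones((nE, nR))                  # density of stars on the (E, R) grid
D_EE = 1e-3 * np.ones_like(N)          # placeholder energy diffusion coefficient
D_RR = 1e-3 * np.ones_like(N)          # placeholder angular-momentum diffusion coefficient

R_lc = 5e-3                            # assumed loss-cone boundary in R
in_loss_cone = R < R_lc

# explicit-scheme stability limit for pure diffusion
dt = 0.2 * min(dE, dR) ** 2 / max(D_EE.max(), D_RR.max())

def step(N):
    """Advance N by one explicit time step (diffusion terms only)."""
    N_new = N.copy()
    d2E = (N[2:, 1:-1] - 2.0 * N[1:-1, 1:-1] + N[:-2, 1:-1]) / dE**2
    d2R = (N[1:-1, 2:] - 2.0 * N[1:-1, 1:-1] + N[1:-1, :-2]) / dR**2
    N_new[1:-1, 1:-1] += dt * (D_EE[1:-1, 1:-1] * d2E + D_RR[1:-1, 1:-1] * d2R)
    N_new[:, in_loss_cone] = 0.0       # crude absorbing condition inside the loss cone
    N_new[:, -1] = N_new[:, -2]        # zero-flux condition at R = 1
    N_new[0, :] = N_new[1, :]          # zero-flux conditions at the E boundaries
    N_new[-1, :] = N_new[-2, :]
    return N_new

loss_rates = []
for _ in range(2000):
    mass_before = N.sum() * dE * dR
    N = step(N)
    # stars removed at the loss-cone boundary during this step
    loss_rates.append((mass_before - N.sum() * dE * dR) / dt)

In a realistic treatment the diffusion coefficients vary over the grid, the loss-cone boundary depends on E, and the absorbing condition is replaced by the boundary-layer analysis described above; the sketch only shows where those ingredients enter a time-stepping scheme.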