We study a class of linear-quadratic stochastic differential games in which each player interacts directly only with its nearest neighbors in a given graph. We find a semi-explicit Markovian equilibrium for any transitive graph, in terms of the empirical eigenvalue distribution of the graph's normalized Laplacian matrix. This facilitates large-population asymptotics for various graph sequences, with several sparse and dense examples discussed in detail. In particular, the mean field game is the correct limit only in the dense graph case, i.e., when the degrees diverge in a suitable sense. Even though equilibrium strategies are nonlocal, depending on the behavior of all players, we use a correlation decay estimate to prove a propagation of chaos result in both the dense and sparse regimes, with the sparse case driven by the large distances between typical vertices. Without assuming the graphs are transitive, we also show that the mean field game solution can be used to construct decentralized approximate equilibria on any sufficiently dense graph sequence.
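As a minimal numerical sketch (not taken from the paper), the empirical eigenvalue distribution of the normalized Laplacian $L = I - D^{-1/2} A D^{-1/2}$ that parametrizes the equilibrium can be computed directly for a simple transitive example; the choice of the cycle graph and its size below are illustrative assumptions.

```python
import numpy as np

def normalized_laplacian(A):
    """Normalized Laplacian L = I - D^{-1/2} A D^{-1/2} of adjacency matrix A."""
    deg = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def cycle_adjacency(n):
    """Adjacency matrix of the n-cycle, a vertex-transitive graph (illustrative choice)."""
    A = np.zeros((n, n))
    idx = np.arange(n)
    A[idx, (idx + 1) % n] = 1
    A[idx, (idx - 1) % n] = 1
    return A

n = 200  # illustrative population size
L = normalized_laplacian(cycle_adjacency(n))
eigs = np.linalg.eigvalsh(L)  # spectrum of the normalized Laplacian lies in [0, 2]

# Empirical eigenvalue distribution: uniform weight 1/n on each eigenvalue,
# summarized here by a histogram over [0, 2].
hist, edges = np.histogram(eigs, bins=20, range=(0.0, 2.0), density=True)
print(edges[:-1].round(2))
print(hist.round(3))
```

For the cycle, this empirical measure converges as $n$ grows to the arcsine-type law of $1 - \cos\theta$ with $\theta$ uniform, whereas for dense graph sequences with diverging degrees it concentrates near $1$; the contrast illustrates why the large-population limit depends on the graph sequence rather than reducing to a single mean field limit.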