We report on the MIT Epoch of Reionization (MITEoR) experiment, a pathfinder low-frequency radio interferometer whose goal is to test technologies that improve the calibration precision and reduce the cost of the high-sensitivity 3D mapping required for 21 cm cosmology. MITEoR accomplishes this by using massive baseline redundancy, which enables both automated precision calibration and correlator cost reduction. We demonstrate and quantify the power and robustness of redundancy for scalability and precision. We find that the calibration parameters precisely describe the effect of the instrument upon our measurements, allowing us to form a model that is consistent with $\chi^2$ per degree of freedom $< 1.2$ for as much as 80% of the observations. We use these results to develop an optimal estimator of calibration parameters using Wiener filtering, and explore the question of how often and how finely in frequency visibilities must be reliably measured to solve for calibration coefficients. The success of MITEoR with its 64 dual-polarization elements bodes well for the more ambitious Hydrogen Epoch of Reionization Array (HERA) project and other next-generation instruments, which will incorporate many identical or similar technologies.
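As a schematic illustration of the redundancy-based calibration and the goodness-of-fit statistic quoted above, the standard redundant-calibration data model can be written as follows. The notation here is illustrative rather than taken verbatim from the measurements: $c_{ij}$ is the visibility measured between antennas $i$ and $j$, $g_i$ are the per-antenna complex gains, $y_{i-j}$ is the true visibility shared by all baselines with the same separation vector, $n_{ij}$ is noise with rms $\sigma_{ij}$, and $N_{\mathrm{meas}}$, $N_{\mathrm{par}}$ count the measured visibilities and fitted parameters.
\[
c_{ij} = g_i^{*}\, g_j\, y_{i-j} + n_{ij},
\qquad
\frac{\chi^2}{\mathrm{DoF}} = \frac{1}{N_{\mathrm{meas}} - N_{\mathrm{par}}}
\sum_{i<j} \frac{\bigl| c_{ij} - g_i^{*} g_j y_{i-j} \bigr|^2}{\sigma_{ij}^2}.
\]
Because many baselines are redundant, the number of measurements greatly exceeds the number of free parameters (gains plus unique visibilities), which is what makes the automated fit well constrained and allows $\chi^2/\mathrm{DoF}$ to serve as a stringent consistency test.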