We test the concept that seismicity prior to a large earthquake can be understood in terms of the statistical physics of a critical phase transition. In this model, the cumulative seismic strain release increases as a power-law time-to-failure before the final event. Furthermore, the region of correlated seismicity predicted by this model is much greater than would be predicted from simple elasto-dynamic interactions. We present a systematic procedure to test for the accelerating seismicity predicted by the critical point model and to identify the region approaching criticality, based on a comparison between the observed cumulative energy (Benioff strain) release and the power-law behavior predicted by theory. This method is used to find the critical region before all earthquakes with M ≥ 6.5 along the San Andreas system since 1950. The statistical significance of our results is assessed by performing the same procedure on a large number of randomly generated synthetic catalogs. The null hypothesis, that the observed acceleration in all these earthquakes could result from spurious patterns generated by our procedure in purely random catalogs, is rejected with 99.5% confidence. An empirical relation between the logarithm of the critical region radius (R) and the magnitude of the final event (M) is found, such that log R ∝ 0.5 M, suggesting that the largest probable event in a given region scales with the size of the regional fault network.
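The power-law time-to-failure fit described above can be illustrated with a minimal sketch. This is not the authors' code: the functional form eps(t) = A + B(t_f − t)^m for cumulative Benioff strain, the magnitude-to-energy conversion log10 E = 4.8 + 1.5 M, the synthetic catalog, and the misfit-ratio measure of acceleration are all assumptions made for this example.

```python
import numpy as np
from scipy.optimize import curve_fit

def benioff_strain(magnitudes):
    """Cumulative Benioff strain: running sum of sqrt(radiated energy).
    Energy is estimated from magnitude via log10 E = 4.8 + 1.5 M (Joules),
    an assumption for this sketch."""
    energy = 10.0 ** (4.8 + 1.5 * np.asarray(magnitudes))
    return np.cumsum(np.sqrt(energy))

def time_to_failure(t, A, B, tf, m):
    """Power-law time-to-failure form eps(t) = A + B*(tf - t)**m.
    The clip keeps (tf - t) positive while the optimizer explores tf."""
    return A + B * np.clip(tf - t, 1e-6, None) ** m

# --- Illustrative synthetic catalog (hypothetical numbers, not real data) ---
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 9.5, 200))        # event times, years; mainshock near t = 10
mags = rng.uniform(4.0, 5.5, 200) + 0.05 * t   # mild acceleration built in for the demo
eps = benioff_strain(mags)

# Fit the power law; initial guesses are rough assumptions for the demo
p0 = [eps[-1], (eps[0] - eps[-1]) / 10.5 ** 0.3, 10.5, 0.3]
popt, _ = curve_fit(time_to_failure, t, eps, p0=p0, maxfev=20000)
A, B, tf, m = popt
print(f"fitted failure time tf = {tf:.2f}, exponent m = {m:.2f}")

# Crude acceleration measure (an assumption for this sketch): ratio of the
# power-law misfit to a straight-line misfit; values well below 1 indicate
# that the cumulative strain release is accelerating rather than linear.
lin = np.polyval(np.polyfit(t, eps, 1), t)
C = (np.sqrt(np.mean((time_to_failure(t, *popt) - eps) ** 2))
     / np.sqrt(np.mean((lin - eps) ** 2)))
print(f"power-law vs. linear misfit ratio C = {C:.2f}")
```

In the same spirit, the significance test sketched in the abstract would repeat this fit on many randomized catalogs and compare the distribution of the misfit ratio against the value obtained for the real catalog; the details of that procedure are given in the paper itself.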