With the Advanced LIGO and Advanced Virgo detectors taking observations, the detection of gravitational waves is expected within the next few years. Extracting astrophysical information from gravitational wave detections is a well-posed problem that has been thoroughly studied when detailed models for the waveforms are available. However, one motivation for the field of gravitational wave astronomy is the potential for new discoveries. Recognizing and characterizing unanticipated signals requires data analysis techniques which do not depend on theoretical predictions for the gravitational waveform. Past searches for short-duration, un-modeled gravitational wave signals have been hampered by transient noise artifacts, or glitches, in the detectors. In some cases, even high signal-to-noise simulated astrophysical signals have proven difficult to distinguish from glitches, so that essentially any plausible signal could be detected with at most $2$--$3\sigma$ confidence. We have put forth the BayesWave algorithm to differentiate between generic gravitational wave transients and glitches, and to provide robust waveform reconstruction and characterization of astrophysical signals. Here we study BayesWave's capabilities for rejecting glitches while assigning high confidence to detection candidates, using analytic approximations to the Bayesian evidence. The analytic results are tested against numerical experiments in which simulated gravitational wave transient signals are added to LIGO data collected between 2009 and 2010, and the two are found to be in good agreement.
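As a point of reference for the evidence-based comparison described above, the quantity at issue is the Bayes factor between a signal hypothesis and a glitch hypothesis. The expression below is a minimal sketch assuming the standard definition, with $d$ denoting the data, $\mathcal{S}$ and $\mathcal{G}$ the signal and glitch models, and $\vec{\theta}$, $\vec{\lambda}$ their respective parameters; the notation is chosen here for illustration, and the analytic approximations to these marginalized likelihoods are developed in the body of the paper.
\[
\mathcal{B}_{\mathcal{S},\mathcal{G}} \;=\; \frac{p(d \mid \mathcal{S})}{p(d \mid \mathcal{G})}
\;=\; \frac{\displaystyle\int p(d \mid \vec{\theta}, \mathcal{S})\, p(\vec{\theta} \mid \mathcal{S})\, d\vec{\theta}}
           {\displaystyle\int p(d \mid \vec{\lambda}, \mathcal{G})\, p(\vec{\lambda} \mid \mathcal{G})\, d\vec{\lambda}}
\]
Large values of $\mathcal{B}_{\mathcal{S},\mathcal{G}}$ favor the presence of an astrophysical transient over an instrumental glitch, which is how glitch rejection and detection confidence enter the same calculation.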