We report on a study of the impact of the finite resolution of the frequency chirp applied to the frequency difference between the Raman laser beamsplitters on the phase of a free-fall atom gravimeter. This chirp induces a phase shift that compensates for the one due to the acceleration of gravity, allowing for a precise determination of g in terms of frequencies. In practice, the chirp is most often generated by a direct digital synthesizer (DDS). Besides the effect of possible truncation errors, we evaluate here the bias on the measurement of g due to the finite time and frequency resolution of the DDS-generated chirp, and show that it can compromise the measurement accuracy. This effect can nevertheless be mitigated by an adequate choice of the DDS chirp parameters, resulting from a trade-off between interferometer phase resolution and induced bias.
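As a minimal numerical illustration (a sketch under assumed parameters, not the model used in the paper), the following Python snippet compares the phase accumulated by a staircase DDS chirp, with a finite update period and a finite frequency resolution, to that of the ideal linear chirp. In the usual three-pulse description, the laser phase combination $\phi(0) - 2\phi(T) + \phi(2T)$ equals $2\pi\alpha T^2$ for an ideal chirp of rate $\alpha$, and the deviation $\delta\phi$ obtained with the quantized chirp maps onto a bias $\delta g \approx \delta\phi/(k_\mathrm{eff} T^2)$. All numerical values, including the deliberately coarse frequency resolution, are assumptions chosen only to make the effect visible.

```python
import numpy as np

# Illustrative parameters -- assumed values, not taken from the paper.
K_EFF  = 1.61e7                   # effective Raman wavevector (rad/m), ~2*(2*pi/780 nm)
G      = 9.81                     # local gravity (m/s^2)
T      = 0.1                      # pulse separation time (s)
ALPHA  = K_EFF * G / (2 * np.pi)  # chirp rate (Hz/s) that cancels the gravity phase
T_STEP = 1e-5                     # assumed DDS chirp update period (s)
F_RES  = 0.1                      # assumed (exaggerated) DDS frequency resolution (Hz)

def dds_phase(t):
    """Phase (rad) accumulated up to time t by a staircase DDS chirp:
    the frequency is held constant over each update period and rounded
    to the nearest multiple of the frequency resolution."""
    n_full = int(t / T_STEP)                      # completed update periods
    steps = np.arange(n_full + 1)
    freqs = np.round(ALPHA * steps * T_STEP / F_RES) * F_RES
    dwell = np.full(n_full + 1, T_STEP)           # time spent on each step
    dwell[-1] = t - n_full * T_STEP               # partial last step
    return 2 * np.pi * np.sum(freqs * dwell)

def ideal_phase(t):
    """Phase (rad) of the ideal continuous chirp f(t) = ALPHA * t."""
    return np.pi * ALPHA * t**2

def three_pulse(phase):
    """Mach-Zehnder laser phase combination phi(0) - 2 phi(T) + phi(2T)."""
    return phase(0.0) - 2 * phase(T) + phase(2 * T)

dphi = three_pulse(dds_phase) - three_pulse(ideal_phase)  # residual phase error (rad)
dg = dphi / (K_EFF * T**2)                                # equivalent bias on g (m/s^2)
print(f"phase error = {dphi:.3e} rad -> g bias = {dg:.3e} m/s^2")
```

Scanning the assumed update period and frequency resolution in such a sketch reproduces the trade-off discussed above: finer steps reduce the induced bias but constrain the achievable phase resolution of the interferometer.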