Correlated $\mathcal{G}$ distributions can be used to describe the clutter seen in images obtained with coherent illumination, as is the case in B-scan ultrasound, laser, sonar, and synthetic aperture radar (SAR) imagery. These distributions are derived within the multiplicative model, using the square root of the generalized inverse Gaussian distribution for the amplitude backscatter. A two-parameter particular case of the amplitude $\mathcal{G}$ distribution, called $\mathcal{G}_{A}^{0}$, constitutes a modeling improvement with respect to the widespread $\mathcal{K}_{A}$ distribution when fitting urban, forested, and deforested areas in remote sensing data. This article deals with the modeling and simulation of correlated $\mathcal{G}_{A}^{0}$-distributed random fields. This is accomplished by means of the Inverse Transform method, applied to Gaussian random fields with spatial correlation. The main feature of this approach is its generality: it allows the introduction of negative correlation values in the resulting process, which are necessary for the proper description of the shadowing effect in many SAR images.
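The simulation strategy described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: it assumes the known relation that if $Z \sim \mathcal{G}_{A}^{0}(\alpha, \gamma, L)$ with $\alpha < 0$, then $Z^{2}(-\alpha)/\gamma \sim F_{2L,\,-2\alpha}$, so the $\mathcal{G}_{A}^{0}$ quantile function can be computed from the Snedecor $F$ quantile. Spatial correlation is introduced here by a simple Gaussian smoothing of white noise; the function names (`ga0_quantile`, `correlated_ga0_field`) and parameter values are illustrative, and the correlation adjustment needed so that the *output* field attains a prescribed correlation (the paper's main technical point) is omitted.

```python
import numpy as np
from scipy import ndimage, stats

def ga0_quantile(u, alpha, gamma, L):
    """Quantile function of the G_A^0(alpha, gamma, L) amplitude law.

    Uses the relation Z^2 * (-alpha) / gamma ~ F(2L, -2*alpha), valid
    for alpha < 0 (assumed here, not taken from the abstract itself).
    """
    return np.sqrt(gamma / (-alpha) * stats.f.ppf(u, 2 * L, -2 * alpha))

def correlated_ga0_field(shape, alpha=-3.0, gamma=2.0, L=1, sigma=2.0, seed=0):
    """Inverse Transform method applied to a correlated Gaussian field."""
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(shape)
    # Induce spatial correlation by low-pass filtering white Gaussian noise;
    # the transformed field's correlation will differ from the Gaussian one.
    g = ndimage.gaussian_filter(g, sigma)
    g = (g - g.mean()) / g.std()      # approximately standard normal marginals
    u = stats.norm.cdf(g)             # Gaussian field -> Uniform(0,1) field
    return ga0_quantile(u, alpha, gamma, L)  # Uniform -> G_A^0 marginals

field = correlated_ga0_field((64, 64))
```

Because the normal CDF is monotone, negative correlations present in the underlying Gaussian field carry over (with modified magnitude) to the $\mathcal{G}_{A}^{0}$ field, which is what makes shadowing-type structures reproducible with this approach.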