The X-ray emission from a simulated massive stellar cluster is investigated. The emission is calculated from a 3D hydrodynamical model that incorporates the mechanical feedback from the stellar winds of three O stars embedded in a giant molecular cloud (GMC) clump containing 3240 M$_{\odot}$ of molecular material within a 4 pc radius. A simple prescription for the evolution of the stars is used, with the first supernova explosion occurring at $t = 4.4$ Myr. We find that the presence of the GMC clump causes short-lived attenuation effects on the X-ray emission of the cluster. However, once most of the material has been ablated away by the winds, the remaining dense clumps do not have a noticeable effect on the attenuation compared with the assumed interstellar medium column. We determine the evolution of the cluster X-ray luminosity, $L_{\rm X}$, and spectra, and generate synthetic images. The intrinsic X-ray luminosity drops from nearly $10^{34}\,{\rm ergs\,s^{-1}}$ while the winds are `bottled up', to a near-constant value of $1.7\times 10^{32}\,{\rm ergs\,s^{-1}}$ between $t = 1$ and 4 Myr. $L_{\rm X}$ reduces slightly during each star's red supergiant stage due to the depressurization of the hot gas. However, $L_{\rm X}$ increases to $\approx 10^{34}\,{\rm ergs\,s^{-1}}$ during each star's Wolf-Rayet stage. The X-ray luminosity is enhanced by 2-3 orders of magnitude, to $\sim 10^{37}\,{\rm ergs\,s^{-1}}$, for at least 4600 yr after each supernova, at which time the blast wave leaves the grid and the X-ray luminosity drops. The X-ray luminosity of our simulation is generally considerably fainter than that predicted by spherically symmetric bubble models, due to the leakage of hot gas material through gaps in the outer shell. This leakage reduces the pressure within our simulation and thus the X-ray emission. However, the X-ray luminosities and temperatures that we obtain are comparable to those of similarly powerful massive young clusters.