We present radiation transfer (RT) simulations of evolutionary sequences of massive protostars forming from massive dense cores in environments of high surface density. The protostellar evolution is calculated with a detailed multi-zone model, with the accretion rate regulated by feedback from an evolving disk-wind outflow cavity. The evolution of the disk and envelope is calculated self-consistently. In this framework, an evolutionary track is determined by three environmental initial conditions: the initial core mass M_c, the mean surface density of the ambient star-forming clump Sigma_cl, and the rotational-to-gravitational energy ratio of the initial core, beta_c. Evolutionary sequences with various M_c, Sigma_cl, and beta_c are constructed. We find that in a fiducial model with M_c = 60 Msun, Sigma_cl = 1 g cm^-2, and beta_c = 0.02, the final star formation efficiency is >~ 0.43. For each evolutionary track, RT simulations are performed at selected stages, yielding temperature profiles, SEDs, and images. At a given stage the envelope temperature depends strongly on Sigma_cl, but only weakly on M_c. The SED and mid-IR images depend sensitively on the evolving outflow cavity, which gradually widens as the protostar grows. The fluxes at <~ 100 microns increase dramatically, and the far-IR peaks move to shorter wavelengths. We find that, despite scatter caused by different M_c, Sigma_cl, beta_c, and inclinations, sources at a given evolutionary stage appear in similar regions on color-color diagrams, especially when using colors at >~ 70 microns, where the scatter due to inclination is minimized, implying that such diagrams can serve as useful diagnostics of the evolutionary stages of massive protostars. We also discuss how intensity profiles along and perpendicular to the outflow axis are affected by environmental conditions and source evolution.
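For reference, a minimal sketch of how the rotational-to-gravitational energy ratio beta_c used above is conventionally defined; the reduction to the second expression assumes a uniformly rotating, uniform-density sphere of mass M_c, radius R_c, and angular velocity Omega, which may differ from the exact core structure adopted in the paper:

% Standard definition of the core rotational-to-gravitational energy ratio.
% The second equality holds for a uniform-density, uniformly rotating sphere,
% with moment of inertia I = (2/5) M_c R_c^2 and |E_grav| = (3/5) G M_c^2 / R_c.
\[
\beta_c \equiv \frac{E_{\rm rot}}{|E_{\rm grav}|}
       = \frac{\tfrac{1}{2} I \Omega^{2}}{\tfrac{3}{5}\, G M_c^{2}/R_c}
       = \frac{1}{3}\,\frac{\Omega^{2} R_c^{3}}{G M_c}.
\]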