Microlensing not only adds extra magnification to the intrinsic lightcurves but also shifts them in the time domain, making the actual time delays between images of a strongly lensed active galactic nucleus change on the $\sim$ day light-crossing time scale of the emission region. These microlensing-induced time delays would bias strong-lens time-delay cosmography if unaccounted for. However, owing to uncertainties in the disk size and the disk model, the impact is difficult to estimate accurately. In this work, we study how to reduce the bias with a designed observation strategy based on a standard disk model. We find that long-term monitoring of the images can alleviate the impact, since it averages over the microlensing time-lag maps as the source moves relative to the lens galaxy owing to their peculiar motions. In addition, images in bluer bands correspond to smaller disk sizes and therefore also benefit time-delay measurements. We conduct a simulation based on a PG 1115+080-like lensed quasar. The results show that the time-delay dispersions caused by microlensing can be reduced by $\sim 40\%$ with 20-year lightcurves, while observing in the u band instead of the r band reduces the dispersions by $\sim 75\%$. Nevertheless, this effect cannot be totally eliminated in any case. Further studies are still needed to incorporate it properly when inferring an accurate Hubble constant.