We present an analysis of 123 gamma-ray bursts (GRBs) with known redshifts that possess an afterglow plateau phase. We show that the $L_a-T^{*}_a$ correlation between the X-ray luminosity at the end of the plateau phase, $L_a$, and the plateau duration in the GRB rest frame, $T^*_a$, has a power-law slope that differs by more than $2\sigma$ from the slope of the prompt $L_{f}-T^{*}_{f}$ correlation between the isotropic pulse peak luminosity, $L_{f}$, and the pulse duration, $T^{*}_{f}$, measured from the time of the GRB ejection. Analogously, we show differences between the prompt and plateau phases in the energy-duration distributions, with the energy emitted in the afterglow being on average $10\%$ of the prompt emission. Moreover, the distribution of prompt pulse versus afterglow spectral indices does not show any correlation. In a further analysis we demonstrate that the $L_{peak}-L_a$ distribution, where $L_{peak}$ is the peak luminosity measured from the start of the burst, is characterized by a considerably higher Spearman correlation coefficient, $\rho=0.79$, than the one involving the averaged prompt luminosity, $L_{prompt}-L_a$, for the same GRB sample, which yields $\rho=0.60$. Since part of this correlation could result from the redshift dependence of the luminosities, namely their cosmological evolution, we use the Efron-Petrosian method to reveal the intrinsic nature of this correlation. We find that a substantial part of the correlation is intrinsic. Applying a partial correlation coefficient to the de-evolved luminosities confirms that the intrinsic correlation exists.
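A minimal sketch, assuming hypothetical (synthetic) log-luminosities, durations, and redshifts rather than the paper's actual GRB sample, of how a Spearman coefficient and a partial correlation controlling for redshift could be computed; the variable names and the residual-based partial-correlation helper are illustrative, not the paper's pipeline:

```python
# Illustrative sketch only: synthetic data stand in for the GRB sample;
# the partial correlation controls for a third variable (here, redshift).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical log-quantities for N GRBs (placeholders, not the observed sample)
N = 123
log_Lpeak = rng.normal(52.0, 1.0, N)                  # log10 peak luminosity [erg/s]
log_La    = 0.8 * log_Lpeak + rng.normal(0, 0.5, N)   # log10 plateau-end luminosity
z         = rng.uniform(0.1, 6.0, N)                  # redshifts

# Spearman rank correlation, analogous to the quoted L_peak - L_a coefficient
rho, p_value = stats.spearmanr(log_Lpeak, log_La)

def partial_corr(x, y, ctrl):
    """Partial correlation of x and y controlling for ctrl,
    via the correlation of residuals from linear fits on ctrl."""
    rx = x - np.polyval(np.polyfit(ctrl, x, 1), ctrl)
    ry = y - np.polyval(np.polyfit(ctrl, y, 1), ctrl)
    return stats.pearsonr(rx, ry)[0]

rho_partial = partial_corr(log_Lpeak, log_La, z)
print(f"Spearman rho = {rho:.2f}, partial rho (controlling z) = {rho_partial:.2f}")
```

The residual-based helper is a generic way to remove a common dependence on a third variable before correlating; it is not a substitute for the Efron-Petrosian treatment of the luminosity evolution described in the abstract.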