We show a near-optimal direct-sum theorem for two-party randomized communication complexity. Let $f \subseteq X \times Y \times Z$ be a relation, $\varepsilon > 0$ and $k$ a positive integer. We show that $$\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k) \cdot \log\left(\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k)\right) \ge \Omega\left(k \cdot \mathrm{R}^{\mathrm{pub}}_\varepsilon(f)\right) \enspace,$$ where $f^k = f \times \cdots \times f$ ($k$ times) and $\mathrm{R}^{\mathrm{pub}}_\varepsilon(\cdot)$ denotes the public-coin randomized communication complexity with worst-case error $\varepsilon$. Given a protocol $\mathcal{P}$ for $f^k$ with communication cost $c \cdot k$ and worst-case error $\varepsilon$, we exhibit a protocol $\mathcal{Q}$ for $f$ with external information cost $O(c)$ and worst-case error $\varepsilon$. We then use a message-compression protocol due to Barak, Braverman, Chen and Rao [2013] to simulate $\mathcal{Q}$ with communication $O(c \cdot \log(c \cdot k))$, which yields our result. To establish this reduction we prove new chain rules for capacity, the maximum information that can be transmitted over a communication channel. We use the powerful concept of Nash equilibrium from game theory, and its existence in suitably defined games, to arrive at these chain rules. These chain rules are of independent interest.
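To spell out how the simulation yields the theorem (a sketch; constants inside the $O(\cdot)$ and $\Omega(\cdot)$ notation are not tracked): taking $\mathcal{P}$ to be an optimal protocol for $f^k$, we have $c = \mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k)/k$, and the compressed simulation of $\mathcal{Q}$ gives $$\mathrm{R}^{\mathrm{pub}}_\varepsilon(f) \le O\left(c \cdot \log(c \cdot k)\right) \le O\left(\frac{\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k)}{k} \cdot \log\left(\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k)\right)\right) \enspace,$$ and multiplying through by $k$ recovers the direct-sum bound above.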