We review the interface between (theoretical) physics and information for non-experts. The origin of information as related to the notion of entropy is described, first in the context of thermodynamics and then in that of statistical mechanics. A close examination of the foundations of statistical mechanics and the need to reconcile the probabilistic and deterministic views of the world leads us to a discussion of chaotic dynamics, where information plays a crucial role in quantifying predictability. We then discuss a variety of fundamental issues that emerge in defining information, and how one must exercise care in discussing concepts such as order, disorder, and incomplete knowledge. We also discuss an alternative form of entropy and its possible relevance for nonequilibrium thermodynamics. In the final part of the paper we discuss how quantum mechanics gives rise to the very different concept of quantum information. Entirely new possibilities for information storage and computation arise from the massively parallel processing inherent in quantum mechanics. We also point out how entropy can be extended to quantum mechanics to provide a useful measure of quantum entanglement. Finally, we make a small excursion to the interface between quantum theory and general relativity, where one is confronted with an ultimate information paradox posed by the physics of black holes. This review is necessarily selective; not all relevant topics that touch on physics and information could be covered.