This paper examines how an event from one random variable provides pointwise mutual information about an event from another variable via probability mass exclusions. We start by introducing probability mass diagrams, which provide a visual representation of how a prior distribution is transformed into a posterior distribution through exclusions. With the aid of these diagrams, we identify two distinct types of probability mass exclusions: informative and misinformative exclusions. Then, motivated by Fano's derivation of the pointwise mutual information, we propose four postulates which aim to decompose the pointwise mutual information into two separate informational components: a non-negative term associated with the informative exclusions and a non-positive term associated with the misinformative exclusions. This yields a novel derivation of a familiar decomposition of the pointwise mutual information into entropic components. We conclude by discussing the relevance of considering information in terms of probability mass exclusions to the ongoing effort to decompose multivariate information.
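The familiar entropic decomposition referred to above is the identity $i(x;y) = h(x) - h(x|y)$, where $i(x;y) = \log_2\!\big(p(x|y)/p(x)\big)$ is the pointwise mutual information, $h(x) = -\log_2 p(x)$ is the pointwise entropy, and $h(x|y) = -\log_2 p(x|y)$ is the pointwise conditional entropy. As a minimal numerical sketch (the joint distribution `p_xy` below is a hypothetical example, not taken from the paper), the identity can be verified directly:

```python
import math

# Hypothetical joint distribution p(x, y) over two binary-ish variables.
p_xy = {("a", "0"): 0.3, ("a", "1"): 0.1,
        ("b", "0"): 0.2, ("b", "1"): 0.4}

def p_x(x):
    # Marginal p(x) by summing over y.
    return sum(p for (xi, _), p in p_xy.items() if xi == x)

def p_y(y):
    # Marginal p(y) by summing over x.
    return sum(p for (_, yi), p in p_xy.items() if yi == y)

def pmi(x, y):
    # Pointwise mutual information i(x;y) = log2( p(x|y) / p(x) ).
    p_x_given_y = p_xy[(x, y)] / p_y(y)
    return math.log2(p_x_given_y / p_x(x))

def h(x):
    # Pointwise entropy h(x) = -log2 p(x).
    return -math.log2(p_x(x))

def h_cond(x, y):
    # Pointwise conditional entropy h(x|y) = -log2 p(x|y).
    return -math.log2(p_xy[(x, y)] / p_y(y))

# Check the decomposition i(x;y) = h(x) - h(x|y) for every joint event.
for (x, y) in p_xy:
    assert abs(pmi(x, y) - (h(x) - h_cond(x, y))) < 1e-12
```

Note that the pointwise mutual information can be negative for individual events (a misinformative observation), even though its expectation, the mutual information, is always non-negative.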