Despite their playful purpose, social media have changed the way users access information, debate, and form their opinions. Indeed, recent studies have shown that online users tend to promote their favored narratives and thus to form polarized groups around a common system of beliefs. Confirmation bias helps to account for users' decisions about whether to spread content, thus creating informational cascades within identifiable communities. At the same time, the aggregation of favored information within those communities reinforces selective exposure and group polarization. Along this path, we perform a thorough quantitative analysis of the connectivity patterns of 1.2M Facebook users engaged with two conflicting narratives: scientific and conspiracy news. Analyzing such data, we quantitatively investigate the effect of two mechanisms behind confirmation bias, namely challenge avoidance and reinforcement seeking, one of the major drivers of human behavior on social media. We find that the challenge-avoidance mechanism triggers the emergence of two distinct and polarized groups of users (i.e., echo chambers), who also tend to be surrounded by friends with similar systems of beliefs. Through a network-based approach, we show how the reinforcement-seeking mechanism limits the influence of neighbors and primarily drives the selection and diffusion of content even among like-minded users, thus fostering the formation of highly polarized sub-clusters within the same echo chamber. Finally, we show that polarized users reinforce their preexisting beliefs by leveraging the activity of their like-minded neighbors, and that this trend grows with user engagement, suggesting that peer influence acts as a support for reinforcement seeking.
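
As a minimal sketch of how such polarization and neighborhood similarity could be quantified, the snippet below assumes a user's polarization is the fraction of their likes given to conspiracy rather than scientific content, rescaled to [-1, 1], and compares it with the mean polarization of the user's friends. The scoring rule, function names, and toy data are illustrative assumptions, not the paper's actual methodology.

```python
def user_polarization(likes):
    """Polarization of a single user, assumed here to be the fraction of
    likes given to conspiracy content, rescaled to [-1, 1]
    (-1 = only science, +1 = only conspiracy)."""
    science = sum(1 for category in likes if category == "science")
    conspiracy = sum(1 for category in likes if category == "conspiracy")
    total = science + conspiracy
    if total == 0:
        return None  # user never engaged with either narrative
    rho = conspiracy / total
    return 2 * rho - 1

def neighborhood_alignment(user, friends, likes_by_user):
    """Compare a user's polarization with the mean polarization of their
    friends; similar values are consistent with echo-chamber-like
    homophily (challenge avoidance)."""
    own = user_polarization(likes_by_user.get(user, []))
    friend_scores = [
        user_polarization(likes_by_user.get(f, [])) for f in friends
    ]
    friend_scores = [s for s in friend_scores if s is not None]
    if own is None or not friend_scores:
        return None
    return own, sum(friend_scores) / len(friend_scores)

# Hypothetical toy data: the category of each post a user liked.
likes_by_user = {
    "u1": ["conspiracy", "conspiracy", "science"],
    "u2": ["conspiracy"],
    "u3": ["science", "science"],
}
print(neighborhood_alignment("u1", ["u2", "u3"], likes_by_user))
# -> (0.333..., 0.0): u1 leans conspiracy, friends are split on average
```

In this toy setup, a positive score for a user together with a positive friend average would indicate that the user is embedded among like-minded neighbors; the actual study of course relies on the full Facebook like and friendship data rather than such a toy example.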