We focus on safe ego-vehicle navigation in dense simulated traffic environments populated by road agents with varying driver behaviors. Navigation in such environments is challenging due to the unpredictability of the agents' actions arising from their heterogeneous behaviors. To overcome these challenges, we propose a new simulation technique that enriches existing traffic simulators with behavior-rich trajectories corresponding to varying levels of aggressiveness, generated using a driver behavior modeling algorithm. We then use the enriched simulator to train a deep reinforcement learning (DRL) policy for behavior-guided action prediction and local navigation in dense traffic. The policy implicitly models the interactions between traffic agents and computes safe trajectories for the ego-vehicle, accounting for aggressive driver maneuvers such as overtaking, over-speeding, weaving, and sudden lane changes. Our enhanced behavior-rich simulator can be used to generate datasets of trajectories spanning diverse driver behaviors and traffic densities, and our behavior-based navigation scheme reduces collisions by $7.13$--$8.40\%$, handling scenarios with $8\times$ higher traffic density than prior DRL-based approaches.