Atomic bandpass filters are widely used in a variety of applications owing to their high peak transmission and narrow bandwidth. Much of the previous literature has used the Faraday effect to realize such filters, where an axial magnetic field is applied across the atomic medium. Here we show that applying a non-axial magnetic field improves filter performance compared with the Faraday geometry. We optimize these filters using a numerical model and verify their performance by direct quantitative comparison with experimental data, finding excellent agreement between experiment and theory. These optimized filters could replace Faraday filters in many of their current applications, offering improved performance with little modification to the optical setup.