In 1937, Lothar Collatz proposed a conjecture in number theory now known as the Collatz conjecture. To date, the conjecture has been neither proved nor disproved. In this paper, we propose an algorithmic approach for verification of the Collatz conjecture based on the bit representation of integers. In our experiments, the scheme neither encounters any cycles in the so-called Collatz sequence, nor does the sequence grow indefinitely. Experimental results show that the Collatz sequence starting at a given integer oscillates a finite number of times, never exceeds 1.7 times (the scaling factor) the size of the starting integer, and finally reaches the value 1. These experimental results provide strong evidence that the conjecture is correct and pave the way for a theoretical proof.
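
The bit-representation scheme itself is detailed in the body of the paper; the following is only a minimal baseline sketch of the underlying iteration in Python (the function names and the peak-bit-length measurement are illustrative assumptions, not the paper's method). The parity test in the Collatz map corresponds to reading the lowest bit of the binary representation of the integer.

    def collatz_step(n):
        # One step of the Collatz map: n/2 if n is even, 3n+1 if n is odd.
        # The parity test n & 1 reads the lowest bit of n's binary form.
        return n >> 1 if n & 1 == 0 else 3 * n + 1

    def verify(n):
        # Iterate from n until the sequence reaches 1, recording the
        # number of steps and the peak bit length seen along the way.
        steps, peak_bits = 0, n.bit_length()
        while n != 1:
            n = collatz_step(n)
            steps += 1
            peak_bits = max(peak_bits, n.bit_length())
        return steps, peak_bits

    # Example: the sequence starting at 27 reaches 1 after 111 steps.
    print(verify(27))

Comparing the recorded peak bit length against the bit length of the starting integer yields the kind of growth measurement against which a scaling-factor claim such as the 1.7 bound above can be checked empirically.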