This paper studies the processing principles, implementation challenges, and performance of OFDM-based radars, with particular focus on fourth-generation Long-Term Evolution (LTE) and fifth-generation (5G) New Radio (NR) mobile-network base stations and their utilization for radar/sensing purposes. First, we address the problem stemming from the unused subcarriers within the LTE and NR transmit-signal passbands and their impact on frequency-domain radar processing. Specifically, we formulate and adopt a computationally efficient interpolation approach to mitigate the effects of such empty subcarriers in the radar processing. We evaluate the target detection performance and the corresponding range and velocity estimation accuracy through computer simulations, and show that high-quality target detection as well as high-precision range and velocity estimation can be achieved. In particular, 5G NR waveforms, owing to their large channel bandwidths and configurable subcarrier spacing, are shown to provide very good radar/sensing performance. Then, the fundamental implementation challenge of transmitter-receiver (TX-RX) isolation in OFDM radars is addressed, with specific emphasis on shared-antenna cases, where the isolation challenge is largest. It is confirmed that, from the OFDM radar processing perspective, limited TX-RX isolation is primarily a concern in the detection of static targets, whereas the detection of moving targets is inherently more robust to transmitter self-interference. Properly tailored analog/RF and digital self-interference cancellation solutions for OFDM radars are also described and implemented, and are shown through RF measurements to be key technical ingredients for practical deployments, particularly from the point of view of detecting static and slowly moving targets.
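As a rough illustration of the frequency-domain OFDM radar processing and the empty-subcarrier interpolation idea summarized above, the following Python/NumPy sketch divides the received symbols by the known transmitted symbols, linearly interpolates the resulting response over the inactive subcarriers, and then forms a range-Doppler map via an IFFT over subcarriers and an FFT over symbols. The function name, the array shapes, and the choice of linear interpolation are illustrative assumptions, not the exact method of the paper.

```python
import numpy as np

def ofdm_radar_range_doppler(X, Y, active_mask):
    """Hypothetical sketch of frequency-domain OFDM radar processing.

    X, Y        : (n_subcarriers, n_symbols) transmitted / received symbols
    active_mask : boolean vector marking subcarriers that actually carry data
    """
    n_sc, n_sym = X.shape
    D = np.zeros((n_sc, n_sym), dtype=complex)

    # Element-wise division on the active subcarriers removes the payload data,
    # leaving a per-subcarrier estimate of the channel/target response.
    D[active_mask, :] = Y[active_mask, :] / X[active_mask, :]

    # Interpolate the complex response over the empty subcarriers (guard bands,
    # DC, unused resource elements) so the subsequent IFFT does not see
    # zero-valued gaps, which would otherwise raise range-profile sidelobes.
    idx_all = np.arange(n_sc)
    idx_act = idx_all[active_mask]
    for m in range(n_sym):
        D[:, m] = (np.interp(idx_all, idx_act, D[idx_act, m].real)
                   + 1j * np.interp(idx_all, idx_act, D[idx_act, m].imag))

    # IFFT across subcarriers -> range profiles; FFT across symbols -> Doppler.
    range_profiles = np.fft.ifft(D, axis=0)
    range_doppler = np.fft.fftshift(np.fft.fft(range_profiles, axis=1), axes=1)
    return range_doppler
```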