A number of hardware upgrades for the Low-Frequency Array (LOFAR) are currently under development. These upgrades are collectively referred to as LOFAR 2.0. The first stage of LOFAR 2.0 will introduce a distributed clock signal and allow for simultaneous observation with all the low-band and high-band antennas of the array. Our aim is to provide a tool for accurate simulations of LOFAR 2.0. We present a software package that simulates LOFAR and LOFAR 2.0 observations and includes realistic models for all important systematic effects, such as first- and second-order ionospheric corruptions, time-variable primary-beam attenuation, station-based delays, and the bandpass response. The ionosphere is represented as a thin layer of frozen turbulence. Furthermore, thermal noise can be added to the simulation at the expected level. We simulate a full 8-hour simultaneous low-band and high-band observation of a calibrator source and a target field with the LOFAR 2.0 instrument. The simulated data are calibrated using adapted LOFAR calibration strategies. We examine the novel approaches of solution transfer and joint calibration to improve direction-dependent ionospheric calibration for LOFAR. We find that the calibration of the simulated data behaves very similarly to that of a real observation and reproduces characteristic properties of LOFAR data, such as realistic calibration solutions and image quality. Analyzing strategies for the direction-dependent calibration of LOFAR 2.0, we find that the ionospheric parameters are determined most accurately when the high-band and low-band information is combined in a joint calibration approach. In contrast, the transfer of total electron content solutions from the high band to the low band converges well but is highly susceptible to the presence of non-ionospheric phase errors in the data.
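As a minimal sketch of the thin-layer frozen-turbulence model (the function names, the Kolmogorov normalisation, and all parameter values below are illustrative assumptions, not taken from the software presented here): a single phase screen is generated by filtering white noise with a Kolmogorov power spectrum, and "frozen" evolution means the screen is advected rigidly across the array, so the time variability seen at a station arises purely from the drift.

\begin{verbatim}
import numpy as np

def kolmogorov_screen(n=256, pix=100.0, r0=5e3, seed=0):
    # Ionospheric phase screen (radians) on an n x n grid of pix-metre
    # pixels; r0 is a Fried-like coherence scale (illustrative value).
    rng = np.random.default_rng(seed)
    f1 = np.fft.fftfreq(n, d=pix)
    f = np.hypot(*np.meshgrid(f1, f1))
    f[0, 0] = f[0, 1]                                # avoid DC singularity
    psd = 0.023 * r0 ** (-5 / 3) * f ** (-11 / 3)    # Kolmogorov spectrum
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    screen = np.real(np.fft.ifft2(noise * np.sqrt(psd)))
    return screen * n / pix                          # approximate scaling

def frozen_phase(screen, x, y, t, vx=20.0, vy=0.0, pix=100.0):
    # Frozen turbulence: the screen drifts rigidly at (vx, vy) m/s, so the
    # phase at pierce point (x, y) [m] and time t [s] is the static screen
    # sampled at the back-shifted position.
    i = int(round((y - vy * t) / pix)) % screen.shape[0]
    j = int(round((x - vx * t) / pix)) % screen.shape[1]
    return screen[i, j]
\end{verbatim}

In a full simulator, the screen would be sampled at each station's pierce point for every direction and time step; the screen here is generated directly in radians at one reference frequency, and scaling it to other frequencies follows the dispersive relation given below.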
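To make the band-to-band solution transfer explicit, recall the standard frequency scalings of the two ionospheric terms (a textbook relation quoted with its usual constant, not a result of this work; the sign depends on convention):

\[
\phi_{1}(\nu)\;\simeq\;-\,\frac{k\,\Delta\mathrm{TEC}}{\nu},
\qquad k \approx 8.45\times10^{9}\ \mathrm{rad\,Hz\,TECU^{-1}},
\qquad
\phi_{2}(\nu)\;\propto\;\frac{B_{\parallel}\,\mathrm{TEC}}{\nu^{2}},
\]

where the first-order term is the dispersive delay and the second-order term describes ionospheric Faraday rotation. Because $\phi_{1}\propto\nu^{-1}$, a $\Delta\mathrm{TEC}$ solved at high-band frequencies predicts the low-band phase directly; by the same scaling, however, any non-dispersive phase error absorbed into the high-band TEC solution is amplified by roughly the frequency ratio between the bands when extrapolated to the low band, consistent with the susceptibility of the solution-transfer approach noted above.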