A central concept in the connection between physics and information theory is entropy, which quantifies the amount of information an observer extracts from a system by performing measurements in an experiment. Indeed, Jaynes' principle of maximum entropy establishes the connection between the entropy of statistical mechanics and information entropy. In this sense, the energy dissipated in a classical Hamiltonian process, which determines the thermodynamic entropy production, is connected to the relative entropy between the forward and backward (time-reversed) probability densities. Recently, it was shown that energetic inefficiency and model inefficiency, the latter defined as the difference between the mutual information that the system state shares with the future and with the past environmental variables, are equivalent concepts in Markovian processes. As a consequence, the question of a possible connection between model unpredictability and energetic inefficiency emerges in the framework of classical physics. Here, we address this question by connecting a measure of the random behavior of a classical Hamiltonian system, the Kolmogorov-Sinai entropy, with a measure of its energetic inefficiency, the dissipated work. This approach allows us to provide meaningful interpretations of information-theoretic concepts in terms of thermodynamic quantities.
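For concreteness, the two quantities linked above admit standard formulations; the following is a minimal sketch using textbook definitions, with the notation ($\rho$, $\tilde{\rho}$, $\Gamma$, $\mathcal{P}$, $T$) introduced here for illustration rather than taken from this work. The dissipated work in a driven Hamiltonian process equals, up to a factor $k_B T$, the Kullback-Leibler divergence between the forward phase-space density and its time-reversed counterpart, while the Kolmogorov-Sinai entropy is the supremum over finite measurable partitions of the entropy rate generated by the dynamics:

% Dissipated work as a relative entropy in phase space:
% average work minus free-energy difference, with \rho the forward
% density and \tilde{\rho} the density of the time-reversed process.
\begin{equation}
  \langle W_{\mathrm{diss}} \rangle
    = \langle W \rangle - \Delta F
    = k_B T \, D\!\left(\rho \,\middle\|\, \tilde{\rho}\right),
  \qquad
  D\!\left(\rho \,\middle\|\, \tilde{\rho}\right)
    = \int \rho \,\ln\frac{\rho}{\tilde{\rho}} \, d\Gamma .
\end{equation}

% Kolmogorov-Sinai entropy: supremum over finite partitions \mathcal{P}
% of the asymptotic entropy rate of the refined partition generated by
% the measure-preserving map T.
\begin{equation}
  h_{\mathrm{KS}}
    = \sup_{\mathcal{P}} \lim_{n \to \infty}
      \frac{1}{n} \, H\!\left(\bigvee_{k=0}^{n-1} T^{-k}\mathcal{P}\right).
\end{equation}

The first identity makes precise the sense in which dissipation measures time-reversal asymmetry, and the second quantifies the rate at which the dynamics produces information; the connection drawn in this work relates these two rates.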