In quantum estimation theory, the Holevo bound is known as a lower bound on weighted traces of covariances of unbiased estimators. The Holevo bound is defined as the solution of a minimization problem, and in general no explicit solution is known. When the dimension of the Hilbert space is two and the number of parameters is two, an explicit form of the Holevo bound was given by Suzuki. In this paper, we focus on a logarithmic derivative that lies between the symmetric logarithmic derivative (SLD) and the right logarithmic derivative (RLD), parameterized by $\beta\in[0,1]$, to obtain lower bounds on the weighted trace of the covariance of an unbiased estimator. We introduce the maximum logarithmic derivative bound as the maximum of these bounds over $\beta$. We show that all monotone metrics induce lower bounds, and that the maximum logarithmic derivative bound is the largest among them. We show that the maximum logarithmic derivative bound has an explicit solution when the $d$-dimensional model has a $(d+1)$-dimensional $\mathcal{D}$-invariant extension of the SLD tangent space. Furthermore, when $d=2$, we show that the maximization problem defining the maximum logarithmic derivative bound is the Lagrangian dual of the minimization problem defining the Holevo bound, and coincides with the Holevo bound. This explicit solution generalizes the solution for a two-dimensional Hilbert space given by Suzuki. We also give examples of families of quantum states to which our theory can be applied beyond two-dimensional Hilbert spaces.
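For reference, the minimization problem mentioned above can be sketched in the standard textbook form; the notation here is generic and not necessarily that of this paper. For a $d$-parameter model $\rho_\theta$ and a weight matrix $W>0$, the Holevo bound is conventionally written as

```latex
% Sketch of the standard formulation (generic notation, not necessarily
% this paper's). The Holevo bound for weight matrix W > 0:
\begin{equation*}
  c_\theta^{H}[W] \;=\; \min_{\vec{X}}
    \Bigl\{ \operatorname{Tr}\bigl[W \operatorname{Re} Z_\theta[\vec{X}]\bigr]
          + \bigl\lVert \sqrt{W}\, \operatorname{Im} Z_\theta[\vec{X}]\, \sqrt{W} \bigr\rVert_1 \Bigr\},
  \qquad
  Z_\theta[\vec{X}]_{jk} \;=\; \operatorname{Tr}\bigl[\rho_\theta X_k X_j\bigr],
\end{equation*}
% where the minimum runs over tuples X = (X_1, ..., X_d) of Hermitian
% operators satisfying local unbiasedness, Tr[(d rho / d theta_j) X_k] = delta_{jk},
% and ||.||_1 is the trace norm.
%
% A one-parameter family of logarithmic derivatives interpolating between
% the SLD (beta = 1/2) and the RLD (beta = 0) can be written as
\begin{equation*}
  \frac{\partial \rho_\theta}{\partial \theta_j}
  \;=\; \beta\, L_j^{(\beta)} \rho_\theta \,+\, (1-\beta)\, \rho_\theta L_j^{(\beta)},
  \qquad \beta \in [0,1].
\end{equation*}
```

Each choice of $\beta$ yields a Cramér–Rao-type lower bound, and maximizing over $\beta$ gives the maximum logarithmic derivative bound studied in the paper.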