The problem of fitting experimental data to a given model function $f(t; p_1, p_2, \dots, p_N)$ is conventionally solved numerically by methods such as that of Levenberg-Marquardt, which are based on approximating the $\chi^2$ measure of discrepancy by a quadratic function. Such nonlinear iterative methods are usually necessary unless the function $f$ to be fitted is itself a linear function of the parameters $p_n$, in which case an elementary linear least-squares regression is immediately available. When linearity is present in some, but not all, of the parameters, we show how to streamline the optimization by restricting the nonlinear iteration to the nonlinear parameters alone. The main idea is to replace the entries corresponding to the linear parameters in the numerical difference quotients with their optimal values, which are easily obtained by linear regression. More generally, the idea applies to minimization problems that are quadratic in some of the parameters. We show that the covariance matrix of $\chi^2$ remains the same even though the derivatives are calculated in a different way, so the standard nonlinear optimization methods remain fully applicable. Numerical examples are given to demonstrate the effectiveness of this approach.
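To make the reduction concrete, the following is a minimal sketch in Python of the separable idea described above: at each trial value of the nonlinear parameter, the linear parameters are replaced by their regression-optimal values, so the nonlinear search runs over the nonlinear parameter only. The model $f(t) = a_0 + a_1 e^{-\lambda t}$, the synthetic data, and all parameter values here are illustrative assumptions, not taken from the paper's numerical examples.

\begin{verbatim}
# Sketch: separable least squares for f(t) = a0 + a1*exp(-lam*t),
# where a0, a1 are linear and only lam is handled nonlinearly.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 100)
# Synthetic data (illustrative): a0 = 2, a1 = 3, lam = 1.5, plus noise.
y = 2.0 + 3.0 * np.exp(-1.5 * t) + 0.05 * rng.standard_normal(t.size)

def best_linear(lam):
    """For a fixed nonlinear parameter lam, compute the optimal linear
    coefficients (a0, a1) by ordinary linear least squares."""
    A = np.column_stack([np.ones_like(t), np.exp(-lam * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

def chi2(lam):
    """Reduced objective: chi^2 with the linear parameters already
    set to their regression-optimal values."""
    _, fit = best_linear(lam)
    return np.sum((y - fit) ** 2)

# The nonlinear minimization involves lam only; the linear part is
# solved exactly at every function evaluation.
res = minimize_scalar(chi2, bounds=(0.1, 10.0), method="bounded")
(a0, a1), _ = best_linear(res.x)
print(f"lam = {res.x:.4f}, a0 = {a0:.4f}, a1 = {a1:.4f}")
\end{verbatim}

In this sketch the one-dimensional minimizer stands in for a general nonlinear method such as Levenberg-Marquardt; with several nonlinear parameters, the same reduced objective would be passed to a multivariate optimizer, and the difference quotients it forms would automatically incorporate the regression-optimal linear values.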