We study the probabilistic degree over the reals of the OR function on $n$ variables. For an error parameter $\epsilon \in (0,1/3)$, the $\epsilon$-error probabilistic degree of a Boolean function $f$ over the reals is the smallest non-negative integer $d$ such that the following holds: there exists a distribution $D$ supported entirely on polynomials of degree at most $d$ such that for all $z \in \{0,1\}^n$, we have $\Pr_{P \sim D}[P(z) = f(z)] \geq 1 - \epsilon$. It is known from the works of Tarui (\emph{Theoret. Comput. Sci.} 1993) and Beigel, Reingold, and Spielman (\emph{Proc. 6th CCC} 1991) that the $\epsilon$-error probabilistic degree of the OR function is at most $O(\log n \cdot \log(1/\epsilon))$. Our first observation is that this can be improved to $O\left(\log \binom{n}{\leq \log(1/\epsilon)}\right)$, which is better for small values of $\epsilon$. In all known constructions of probabilistic polynomials for the OR function (including the above improvement), the polynomials $P$ in the support of the distribution $D$ have the following special structure: $P = 1 - (1-L_1)\cdot(1-L_2)\cdots(1-L_t)$, where each $L_i(x_1,\ldots,x_n)$ is a linear form in the variables $x_1,\ldots,x_n$; i.e., the polynomial $1 - P(x_1,\ldots,x_n)$ is a product of affine forms. We show that the $\epsilon$-error probabilistic degree of OR, when restricted to polynomials of the above form, is $\Omega\left(a/\log^2 a\right)$, where $a = \log \binom{n}{\leq \log(1/\epsilon)}$, thus matching the above upper bound (up to poly-logarithmic factors).
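To make the "product of affine forms" structure concrete, the following is a minimal Python sketch of one standard Tarui/Beigel--Reingold--Spielman-style way to sample such a probabilistic polynomial for OR. The function names, the specific subset-sampling scheme, and the constant in the number of repetitions are illustrative assumptions, not the exact construction from the paper; the point is only that each sampled $P$ has the form $1 - \prod_i (1 - L_i)$ with each $L_i$ a linear form, and degree $O(\log n \cdot \log(1/\epsilon))$.

\begin{verbatim}
import math
import random

def sample_or_polynomial(n, eps):
    """Sample one polynomial P = 1 - (1 - L_1)...(1 - L_t) for OR on n vars.

    Each linear form L_i is represented by the set S_i of coordinates it
    sums over, i.e. L_i(z) = sum_{j in S_i} z_j.  The degree of P is the
    number of forms, which is O(log n * log(1/eps))."""
    forms = []
    # Independent repetitions drive the one-sided error below eps; the exact
    # constant in front of log(1/eps) is glossed over in this sketch.
    reps = 4 * max(1, math.ceil(math.log2(1 / eps)))
    levels = max(1, math.ceil(math.log2(n))) + 1
    for _ in range(reps):
        for i in range(levels):
            p = 2.0 ** (-i)  # keep each coordinate independently w.p. 2^{-i}
            forms.append([j for j in range(n) if random.random() < p])
    return forms

def evaluate(forms, z):
    """Evaluate P(z) = 1 - prod_i (1 - L_i(z)) over the reals."""
    prod = 1
    for S in forms:
        prod *= 1 - sum(z[j] for j in S)
    return 1 - prod

if __name__ == "__main__":
    n, eps = 64, 0.01
    one_hot = [1 if j == 17 else 0 for j in range(n)]
    # On the all-zeros input every L_i vanishes, so P(0) = 0 = OR(0) always.
    # On a nonzero input, P agrees with OR exactly when some sampled L_i
    # evaluates to 1 (Valiant-Vazirani-style isolation), which happens with
    # high probability over the draw of the polynomial.
    agree = sum(evaluate(sample_or_polynomial(n, eps), one_hot) == 1
                for _ in range(200))
    print(f"agreement with OR on a weight-1 input: {agree}/200")
\end{verbatim}

The error here is one-sided: on the all-zeros input the sampled polynomial is always correct, and on nonzero inputs correctness over the random draw is exactly the event that some affine factor $1 - L_i$ vanishes.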