The quantum computer is supposed to process information by applying unitary transformations to the complex amplitudes that define the state of N qubits. Since a useful machine would need N = 1000 or more, the number of continuous parameters describing the state of a quantum computer at any given moment is much greater than the number of protons in the Universe. Nevertheless, theorists believe that the feasibility of large-scale quantum computing has been proven via the threshold theorem. As with any theorem, the proof rests on a number of assumptions treated as axioms. In the physical world, however, none of these assumptions can be fulfilled exactly; each can only be approached with some limited precision. Thus the rather meaningless error-per-qubit-per-gate threshold must be supplemented by a list of the precisions with which all the assumptions behind the threshold theorem should hold. Such a list still does not exist. The theory also seems to ignore the undesired free evolution of the quantum computer caused by the energy differences of the quantum states entering any given superposition. Another important point is that the hypothetical quantum computer will be a system of at least a thousand qubits plus an extremely complex and monstrously sophisticated classical apparatus. This huge and strongly nonlinear system will generally exhibit instabilities and chaotic behavior.
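To put the parameter-count claim in concrete numbers, here is a minimal back-of-the-envelope sketch in Python. The figure of roughly 10^80 protons in the observable Universe is an assumed, commonly quoted order-of-magnitude estimate, not a number taken from the text above.

```python
from math import log10

N = 1000  # qubits in a hypothetical "useful" machine

# The state is defined by 2**N complex amplitudes, i.e. about 2 * 2**N real
# parameters; normalization and the global phase remove only two of them.
log10_parameters = (N + 1) * log10(2)

# Assumed order-of-magnitude estimate for the observable Universe.
log10_protons = 80

print(f"continuous parameters ~ 10^{log10_parameters:.0f}")  # ~ 10^301
print(f"protons in the Universe ~ 10^{log10_protons}")        # ~ 10^80
```

Under these assumptions the state of a 1000-qubit machine involves on the order of 10^301 continuous parameters, which dwarfs the estimated 10^80 protons by more than two hundred orders of magnitude.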