1. An almost century-old problem
Back to business with the question of quantum mechanical completeness. My take on the story hasn’t changed in more than fifteen years. I have to insist:
1A. It is a problem
1B. It is unsolved (debatable perhaps…)
A problem it is. That should be clear, if only because of the dozen-odd theories trying to solve it: the orthodox, by von Neumann; de Broglie-Bohm’s; the transactional; many-worlds, by Everett; consistent histories (inspired by the latter); the “gravitational”… The panel of physics popes who have busied themselves with it includes the likes of Weinberg, ‘t Hooft, Gell-Mann, or Penrose. And yet many still shrug their shoulders and say, “what problem?”
The reasons for that are sociological rather than scientific. The traditional picture due to von Neumann for measurements, as opposed to the more fundamental quantum evolution, is that we sometimes have state sums, $|\psi\rangle = \alpha|a_1\rangle + \beta|a_2\rangle$, each term carrying the potentiality of one result among two possible ones ($a_1$, $a_2$) for a given property $A$. And when we check which result is verified, and the record produces “result $a_1$”, we must update the state to,

$|\psi\rangle \longrightarrow \frac{\alpha}{|\alpha|}\,|a_1\rangle$

Namely, we must kill the “unregistered amplitude”, $\beta|a_2\rangle$, and resize the outgoing state, dividing by the square root of its probability, $|\alpha| = \sqrt{|\alpha|^2}$. This is necessary in order to update the statistics, but it violates linearity (proportionality between the outgoing state and the incoming one). But if we drop this demand:

$|\psi\rangle \longrightarrow \alpha\,|a_1\rangle$

we ruin unitarity (conservation of probability). Thus the question is: either unitarity or linearity; we can’t have both.
Looks like a recipe for chicken rather than a physical law…
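The dilemma can be checked in a few lines; a minimal NumPy sketch (the state, amplitudes, and projector are illustrative choices of mine, not taken from the text):

```python
import numpy as np

# Two-outcome state |psi> = alpha|a1> + beta|a2>
alpha, beta = 0.6, 0.8
psi = np.array([alpha, beta])

P1 = np.array([[1.0, 0.0], [0.0, 0.0]])  # projector onto |a1>

def collapse(state):
    """von Neumann update: project, then divide by sqrt(probability)."""
    projected = P1 @ state
    return projected / np.linalg.norm(projected)

# The normalized update conserves the norm...
print(np.linalg.norm(collapse(psi)))              # 1.0

# ...but it is not linear: collapse(2*psi) != 2*collapse(psi)
print(np.allclose(collapse(2 * psi), 2 * collapse(psi)))  # False

# Dropping the normalization restores linearity...
print(np.allclose(P1 @ (2 * psi), 2 * (P1 @ psi)))        # True
# ...but ruins norm conservation: the outgoing norm is |alpha|, not 1
print(np.linalg.norm(P1 @ psi))                   # 0.6
```

Either map fails one of the two demands, which is the whole point of the dilemma above.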
As to the explanations based on the density matrix, I already said they ignore the question of records.
This is not a matter of words. Nowadays both theory and experiment have developed to a point where it has become possible (1) to erase a quantum measurement and (2) to measure counter-factually. It is thus necessary for both components of the quantum state to survive dynamically. The unregistered amplitudes ($\beta|a_2\rangle$, as in our example) must keep evolving if they are to give rise to the overall wave front in case we conduct a quantum erasure during the subsequent evolution; or if a detector is placed along a trajectory destined to carry an “empty” amplitude, in a counter-factual measurement like those occurring in an Elitzur-Vaidman bomb tester. These considerations make the following conclusion inescapable:
The current status of experimentation rules out von Neumann’s postulate, as unregistered amplitudes are shown to have physical consequences.
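The counter-factual side can be made concrete with the Elitzur-Vaidman interferometer; a sketch with 50/50 beam splitters, the live bomb modelled in the standard way as a which-path absorption (all conventions mine):

```python
import numpy as np

# 50/50 beam splitter acting on the two path amplitudes (upper, lower)
BS = np.array([[1, 1j], [1j, 1]], dtype=complex) / np.sqrt(2)

photon = np.array([1, 0], dtype=complex)  # photon enters the upper port
after_bs1 = BS @ photon

# No bomb: both amplitudes keep evolving and interfere at the second splitter
no_bomb = BS @ after_bs1
print(np.abs(no_bomb) ** 2)  # [0, 1]: every photon exits the "bright" port

# Live bomb in the lower arm: it absorbs (measures) the lower amplitude
p_explode = np.abs(after_bs1[1]) ** 2     # probability the bomb goes off: 1/2
survived = np.array([after_bs1[0], 0])    # lower amplitude removed
out = BS @ survived
p_dark = np.abs(out[0]) ** 2  # the otherwise-dark port now fires: 1/4
print(p_explode, p_dark)      # 0.5 0.25
```

A click in the dark port detects the bomb without any photon having touched it, which is only intelligible if the “empty” amplitude was dynamically real all along.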
2. One doesn’t need a New York Times revolution
There exist elements in the theory unequivocally pointing towards the solution. These are the key points:
2A. Gauge invariance ⇒ Gauge indeterminacy
2B. Topological evolution ⇒ Evolution without local degrees of freedom
These elements have been elucidated in the theory for some time now, but nobody has related them to the question of completeness to the best of my knowledge. The explanation, necessarily brief and provisional, is:
2A: Only by means of gauge invariance can one explain quantum indeterminism; and only after establishing this correspondence and understanding its implications may one be able to complete quantum mechanics through dynamical variables of a topological character.
This provides a difference of the generalized Hamiltonian method from what one is familiar with in elementary dynamics. We have arbitrary functions of the time occurring in the general solution of the equations of motion with given initial conditions. These arbitrary functions of the time must mean that we are using a mathematical framework containing arbitrary features, for example, a coordinate system which we can choose in some arbitrary way, or the gauge in electrodynamics. As a result of this arbitrariness in the mathematical framework, the dynamical variables at future times are not completely determined by the initial dynamical variables, and this shows itself up through arbitrary functions appearing in the general solution.
2B: Topological evolution is evolution without propagation, as the number of constraints exactly equals the number of degrees of freedom (field amplitudes).
And this is all I can read from cards 2A and 2B.
3. But hasn’t it been proved that it is impossible to complete quantum mechanics?
No. Theorems concerning hidden variables seem to imply either a non-local realism or else the traditional non-realism, etc. At the end of the day, they all leave the question untouched. They suffer either from false premises (“whenever I check that $\sigma_z^{(1)}$ is $-1$, I’m also checking that $\sigma_z^{(2)}$ is $+1$”, in CHSHB, also known as “Bell’s theorem”), or from insufficient conclusions (see 3A). Here I have to postpone details for an upcoming entry, but suffice it to say so far that the key lies in the context. It is well known that Bell’s inequalities are violated by quantum mechanics. This only happens because one assumes that a measurement output $\sigma_z^{(1)}$ for particle 1 is tantamount to having measured $\sigma_z^{(2)}$ for particle 2. Suppose, though, that at particle 2’s location someone’s measuring $\sigma_x^{(2)}$ instead of $\sigma_z^{(2)}$. Then such an assertion is no longer true. The result I’m telling you in advance is that, when one takes into account the interaction Hamiltonian on particle 2, the expected values for $\sigma_z^{(2)}$ (which, mind you, hasn’t been measured) change instantly (in a completely local way) at 2. Any experimental verification (like those by Aspect et al.) of the validity of quantum probabilities is incapable of telling what would have happened had I measured something else. When one includes this purely quantum, Hamiltonian description, the result is that the CHSHB inequalities are satisfied, so quantum mechanics does not violate them anymore. Unfortunately I have to postpone that discussion.
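For reference, the quantum value of the CHSH combination under discussion can be computed directly; a sketch with the textbook singlet state and measurement settings (all conventions are mine, not the text’s):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Singlet state |psi> = (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(A, B):
    """Correlation <psi| A (x) B |psi>."""
    return np.real(psi.conj() @ (np.kron(A, B) @ psi))

# Perfect anticorrelation for equal settings (the exact correlations of 3A)
print(E(sz, sz))  # -1.0

# Standard CHSH settings: a = z, a' = x; b, b' rotated by 45 degrees
b  = -(sz + sx) / np.sqrt(2)
bp = (sx - sz) / np.sqrt(2)
S = E(sz, b) + E(sx, b) + E(sz, bp) - E(sx, bp)
print(S)  # 2*sqrt(2), above the classical bound of 2
```

This is the standard computation the theorem appeals to; the argument in the post is about what the classical bound of 2 does and does not presuppose.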
3A. Exact correlations at a distance are not physical actions, but functional dependences
The impossibility proofs concerning any completion of quantum mechanics based on exact correlations (GHZ) are thus inconclusive, as they are equivalent to the (trivial) preliminary lemma in Bell’s theorem (“spin up” for particle (1) implies “spin down” for particle (2), with zero dispersion for the sum and non-zero dispersion for each of the terms in the sum). This is a functional dependence between compatible variables. They could hardly produce anything other than perfect correlation, as each is a function of the other. I already proved this point for GHZ.
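The exact (zero-dispersion) correlations invoked here are easy to exhibit with the usual three-particle GHZ state; a sketch, with conventions of my own choosing:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

# GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def E(A, B, C):
    """Expectation <GHZ| A (x) B (x) C |GHZ>; zero dispersion in these cases."""
    return np.real(ghz.conj() @ (np.kron(np.kron(A, B), C) @ ghz))

# Three products that take the value -1 with certainty:
print(E(sx, sy, sy), E(sy, sx, sy), E(sy, sy, sx))  # -1.0 -1.0 -1.0

# Pre-existing values would force XXX = (-1)^3 = -1; the quantum value is +1:
print(E(sx, sx, sx))  # 1.0
```

These are exactly the functional dependences between compatible variables that the text is discussing: the GHZ state is a simultaneous eigenstate of all four products.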
3B. Non-exact correlations at a distance are not physical actions either, but functional dependences between non-commuting variables.
Besides, the demonstrations of impossibility of completion for quantum mechanics based on non-exact correlations (the CHSHB theorem) are inconclusive because they ignore how the context (see 3C) affects the quantum state: they assume that the exact correlations referred to in 3A are still valid when they actually no longer hold (information about the value of $\sigma_z$ at point 2, external to the causal cone of point 1, is no longer valid at point 1, in the sense of implying that, provided $\sigma_z$ takes value $+1$ at point 2, then it takes value $-1$ at point 1, if what is being done at point 1 is measuring an incompatible variable such as $\sigma_x$).
3C. The context thus completely changes the nature of the experimental question itself: the measuring interaction automatically suspends the validity of the correlations at a distance for variables incompatible with those that are being measured. That is because, provided at point (2) someone’s measuring the $x$ component of spin, then they are destroying the profile of the quantum state corresponding to the $z$ component (or any other incompatible with $\sigma_x$). In other words: the physicist who is measuring the $z$ component of spin for particle (1) has no right to assert that the $z$ component of spin for particle (2) is the opposite, if what’s going on at (2), out of reach of his causal influence, is a measurement of an incompatible component.
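This suspension of the correlations is itself plain quantum mechanics; a sketch in which an unread $\sigma_x$ measurement on particle (2) of a singlet is modelled in the standard way as a projective mixture (conventions mine):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Singlet state and its density matrix
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Unread sigma_x measurement on particle (2): project onto its eigenstates
Pp = (I2 + sx) / 2
Pm = (I2 - sx) / 2
rho_after = sum(np.kron(I2, P) @ rho @ np.kron(I2, P) for P in (Pp, Pm))

def corr(r, A, B):
    """Two-particle correlation Tr(r A (x) B)."""
    return np.real(np.trace(r @ np.kron(A, B)))

print(corr(rho, sz, sz))        # -1.0 before: perfect z-anticorrelation
print(corr(rho_after, sz, sz))  #  0.0 after: the z-correlation is gone
print(corr(rho_after, sx, sx))  # -1.0: only the measured component survives
```

Measuring $\sigma_x$ at (2) wipes out the $\sigma_z$ correlation while preserving the $\sigma_x$ one, which is the sense in which the context suspends correlations for incompatible variables.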
3D. The notion that any hidden-variable model has the obligation to express the results of measurements as pre-existing properties of the system can seem natural, and it may have been held by Einstein, but it is ultimately too strong and must be discarded. When this demand is formulated in general (without appealing to quantum mechanics), it could be named the prejudice of ad infinitum separability between system and environment. When it is formulated from the quantum formalism, it is always based on an erroneous notion known as eigenvalue realism: eigenvalues are not properties of a system; it is the interaction term that selects them. In this sense, the real eigenvalues of Hermitian operators (observables), which a naive examination of the quantum formalism seems to elevate to the category of properties of a system (say, ontological attributes), are really properties of the interaction between a system and its physical environment, which can be relevant or not depending on the evolution of both.
3E. Theorems of the ontological kind (in particular, the Bell-Kochen-Specker or BKS) “find observables” (or rather prove their existence, as the theorem is not constructive) which, while being mutually compatible, cannot be determined by any pre-existing variables. This line of enquiry is based on the concept of value definiteness, brought up by von Neumann. In actuality, these so-called BKS “observables”, although they are Hermitian operators, and while they are embedded in the linear span of the spin subspace, are not themselves spin observables, and as a consequence they are devoid of physical content. The proof is so simple that one cannot help but feel puzzled that nobody, to my knowledge, has appealed to it so far. Those interested can follow the argument below.
XXX. Only for experts: