Physics has this feature among many others: even the least informed person can pose a question that not even the most experienced can answer. Posing unanswerable questions therefore has little merit in itself. The goal is to pose questions that get to the heart of the matter, whatever the matter is. That is what I am trying to do here. I think my questions make sense, and if the problem of quantum indefiniteness had actually been solved, the physicists who claim so should have no trouble answering them. In my opinion, anybody sticking to Wojciech H. Zurek’s *einselection* interpretation should be asking themselves these questions very seriously. According to this interpretation, it is enough to show that interaction with the environment selects a basis of observables in typically thermodynamic contexts, giving rise to the definiteness of a world of classical observables. One then constructs classical-probability operators and, to a sufficiently precise degree of accuracy, the physical state becomes an eigenstate of these probability operators in a very short time, making classical probability an emergent concept of the theory. The fact that selection of a basis does not imply selection of an alternative is the key to my questions, but let me be more explicit.

1) If, whenever a variable is in a classical regime, it is because it is entangled with the environment through a great many degrees of freedom in the environmental variables, in the form (schematically, with $|s_i\rangle$ the system’s pointer states and $|E_i\rangle$ the correlated environment states):

Pure states:

$$|\Psi\rangle = \sum_i c_i\,|s_i\rangle \otimes |E_i\rangle$$

Mixture states (tracing out the environment, with $\langle E_i|E_j\rangle \approx \delta_{ij}$):

$$\rho_S \approx \sum_i |c_i|^2\,|s_i\rangle\langle s_i|$$

what distinguishes, in this context, “$S$ has been measured with output $s_1$” from “$S$ has been measured with output $s_2$”?
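For concreteness, the point can be sketched numerically. A minimal toy model (entirely my own construction, not from any source): a two-state system entangled with $N$ environment spins. Increasing $N$ suppresses the off-diagonal coherence of the reduced density matrix, einselecting the pointer basis, yet nothing in the resulting mixture singles out *which* alternative occurred.

```python
import numpy as np

def reduced_coherence(n_env, theta=0.3):
    """|rho_01| for |Psi> = c0 |0>|E0> + c1 |1>|E1>, where each of the
    n_env environment spins differs between the two branches by a
    rotation of angle theta."""
    c0 = c1 = 1 / np.sqrt(2)
    overlap = np.cos(theta) ** n_env       # <E0|E1> factorises over the spins
    # Off-diagonal element of the reduced density matrix rho_S:
    return abs(c0 * np.conj(c1) * overlap)

for n in (1, 10, 100):
    print(n, reduced_coherence(n))
# Coherence decays exponentially in the number of environmental degrees of
# freedom: the basis {|0>, |1>} is einselected, but the resulting mixture
# says nothing about which outcome occurred.
```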

2) How does this entanglement of either vectors or density matrices describe a counter-factual measurement, as in the Elitzur-Vaidman test or Renninger’s negative-result experiment? In other words: *how does one distinguish between decoherence with factual measurement and decoherence with counter-factual measurement?* And in still other words: why, in experiments in which no measurement has been made, is coherence lost when the measuring device is placed to intercept alternatives that have not been verified? What do this *no measurement has been made* and this *not been verified* mean? Perhaps things are verified or not without anything in the theory accounting for it, or even providing any mathematical image of it?
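For reference, the standard Mach-Zehnder algebra behind the Elitzur-Vaidman test fits in a few lines (port labels and conventions are my own assumptions): the empty interferometer sends the photon entirely to one detector, while inserting an absorber in one arm makes the “dark” detector fire a quarter of the time — a detection without interaction.

```python
import numpy as np

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50/50 beam splitter

psi_in = np.array([1.0, 0.0])                    # photon enters port 0

# Empty interferometer: two beam splitters in sequence send everything to
# port 1 (the "bright" detector); port 0 (the "dark" detector) never fires.
out_empty = BS @ BS @ psi_in

# Absorber ("bomb") in arm 1: the amplitude in that arm is removed between
# the beam splitters; the un-normalised remainder is the surviving branch.
block = np.diag([1.0, 0.0])
out_bomb = BS @ block @ BS @ psi_in

print("empty:", np.abs(out_empty) ** 2)          # ~ [0, 1]
print("bomb :", np.abs(out_bomb) ** 2)           # ~ [0.25, 0.25]; 0.5 absorbed
```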

3) How does the simple entanglement of vectors or matrices explain the fact that *measurements persist in the output*; e.g., whenever a superposition of two states with different linear momenta, evolving from a previous deflection, produces a localisation result, consecutive measurements of position confirm *that* result even though the amplitude for *the other* result is non-vanishing?
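In the textbook formalism this repeatability is enforced by hand, via the projection postulate; a minimal sketch (toy model of my own): after the first measurement collapses the superposition, every repetition of the same measurement returns the same outcome with probability 1, even though both amplitudes were non-zero beforehand.

```python
import numpy as np

rng = np.random.default_rng(0)
psi = np.array([0.8, 0.6])                         # both amplitudes non-zero
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]     # projectors onto the outcomes

def measure(state):
    """Born-rule sampling followed by von Neumann projection."""
    probs = [np.linalg.norm(Pk @ state) ** 2 for Pk in P]
    k = rng.choice(2, p=probs)
    post = P[k] @ state
    return k, post / np.linalg.norm(post)

k1, psi = measure(psi)                 # first measurement: random outcome
repeats = [measure(psi)[0] for _ in range(5)]
print(k1, repeats)                     # every repetition reproduces k1
```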

4) There is a classical analysis due to Nevill F. Mott whose purpose is to reconcile the spherical symmetry of the s wave corresponding to an alpha particle emitted by an unstable nucleus with the rectilinear trajectories it displays. What Mott does, basically, is solve the problem in terms of conditional probabilities: if the particle goes through A and B, the probability that it goes through a point not on the straight line through A and B is zero. The solution to this problem can be found in Wheeler & Zurek, *Quantum Theory and Measurement*, Princeton University Press. Let us consider Mott’s problem. Let $|\psi_\ell\rangle$ be a pure state of system + environment consistent with the condition “the trajectory is a straight line”―whatever that means in terms of the Hilbert-space spectral theorem―. Let now $|\psi_0\rangle$ be any pure state consistent with the condition “no straight line has been registered yet”―in the quantum theory of measurement the latter is normally called a *neutral state*―. Does that include the possibility that a straight line has been recorded but it does not turn out to be a straight line? It doesn’t matter; the argument does not depend on that. Expand the Hilbert space until the alternatives are both *exhaustive* and *mutually exclusive*. Completeness and closure of the Hilbert space guarantee the existence of the corresponding projectors $P_\ell$ and $P_0 = 1 - P_\ell$. If the postulational basis of quantum mechanics is correct, $|\psi_\ell\rangle$ and $|\psi_0\rangle$ must be orthogonal: $\langle\psi_0|\psi_\ell\rangle = 0$.

If the exponential evolution $U(t) = e^{-iHt/\hbar}$ is valid, and von Neumann’s analysis of measurement also is, so that $U(t)$ carries the neutral state into a state bearing a macroscopic record, then there exists a certain spectral decomposition of the effective Hamiltonian in terms of $H$-commuting projectors $P_\ell$ (”there is a line, whatever line”),

$$P_\ell = \sum_{\alpha} P_{\ell,\alpha}, \qquad \left[H, P_{\ell,\alpha}\right] = 0$$

(where $\alpha$ is a possible―actually, necessary―degeneracy index) and:

$$|\Psi(t)\rangle = T\exp\!\left(-\frac{i}{\hbar}\int_0^t H(t')\,dt'\right)\left(P_\ell + P_0\right)|\Psi(0)\rangle$$

(The terms of the expansion all commute. We need Dyson’s evolution formula because, although the overall system can be considered closed, the expansion in macroscopically discernible projectors is necessarily made up of many microscopic degrees of freedom, with their corresponding time dependence.)

Conclusion: the events “there is a straight line” and “there is not a straight line” *both persist*, and they display *mutual decoherence.* (Quite a different question is how to check decoherence between abstract variables, as *decoherence can only be verified in terms of position, and making generic macroscopic states interfere lacks any operational meaning.)* End of story. You can, of course, avoid the question, as is frequently done, by appealing to the typically open character of macroscopic systems and the high number of environmental variables involved, but, as can be seen, the paradox has nothing to do with such features. If one omits macroscopically discernible states, *including neutral states* like the one previously mentioned, of course the problem cannot be seen. But then the question arises: why do these states not appear, if, following the quantum formalism, *anything that can happen, will happen?*
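The projector argument can be sketched numerically. A minimal toy model (entirely my own construction, with an assumed 8-dimensional Hilbert space split into a “line” subspace and a “neutral” subspace): because the Hamiltonian commutes with the macroscopic projector, unitary evolution keeps each branch in its subspace, so both persist with constant weight and remain exactly orthogonal, i.e. mutually decoherent, at all times.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8
P_line = np.diag([1.0] * 4 + [0.0] * 4)    # projector: "there is a straight line"
P_0 = np.eye(d) - P_line                   # projector: neutral, "no line registered"

# A block-diagonal Hermitian H commutes with P_line by construction
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))
H = np.zeros((d, d))
H[:4, :4] = A + A.T
H[4:, 4:] = B + B.T

w, V = np.linalg.eigh(H)
def U(t):                                  # exact unitary evolution exp(-iHt)
    return (V * np.exp(-1j * w * t)) @ V.conj().T

psi = rng.normal(size=d)
psi /= np.linalg.norm(psi)
branch_line = P_line @ psi                 # "line" branch
branch_0 = P_0 @ psi                       # neutral branch

for t in (0.5, 1.0, 5.0):
    b1, b0 = U(t) @ branch_line, U(t) @ branch_0
    # weights persist; branches stay orthogonal (mutually decoherent)
    print(t, np.linalg.norm(b1) ** 2, np.linalg.norm(b0) ** 2, abs(b1.conj() @ b0))
```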

## Catchphrasing physics

*Anything that can happen, will happen,* or everything not forbidden is compulsory, as Murray Gell-Mann had it. This can be a good slogan, one that many incorrectly ascribe to Feynman, and it means that any alternative not explicitly forbidden by one of the system’s symmetries takes part in the evolution with a probability amplitude. But

*what good is a catchphrase, rule of thumb, or even a principle, if we have no fundamental way to tell that which happens from that which doesn’t?*

The problem is not that Nature is unpredictable, it is not that quantum is “strange” and classical is “familiar”, nor is it that a bunch of die-hards is trying to explain quantum mechanics in terms of classical mechanics. *The problem is that there is no way to put a tag on an equation of the quantum formalism to say what we mean by “this, and not that, has happened”, and yet, there are instances of physical experience when that distinction is crucial.*

It is possible that nobody will ever understand the connection between the quantum formalism and the classical, objective world completely and free of contradiction. Maybe it will be, at best, a sum of individual efforts that step by step shed light on that connection, some even working in the opposite direction. As an example, Paul Ehrenfest, trying to liberate physics from indeterminacy, managed to establish a connection between classical and quantum mechanics through his famous theorem relating the classical equations of motion to the time evolution of quantum average values. Ehrenfest’s theorem helps us understand that connection better, even though Ehrenfest actually failed in his attempt to subordinate the quantum formalism to classical concepts.
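Ehrenfest’s theorem can be checked numerically; a small sketch (my own construction: harmonic oscillator in a truncated Fock space, with ħ = m = ω = 1). For a coherent state, d⟨x⟩/dt = ⟨p⟩/m and d⟨p⟩/dt = −mω²⟨x⟩ close exactly, so ⟨x⟩(t) follows the classical trajectory √2·α·cos t.

```python
import numpy as np

N = 40                                     # truncated Fock-space dimension
n = np.arange(N)
a = np.diag(np.sqrt(n[1:]), 1)             # annihilation operator: a|n> = sqrt(n)|n-1>
x = (a + a.T) / np.sqrt(2)                 # position operator
p = (a - a.T) / (1j * np.sqrt(2))          # momentum operator

alpha = 1.5                                # real coherent-state amplitude
c = np.zeros(N, dtype=complex)
c[0] = 1.0
for k in range(1, N):
    c[k] = c[k - 1] * alpha / np.sqrt(k)
c *= np.exp(-abs(alpha) ** 2 / 2)          # |alpha> in the Fock basis

def expval(op, t):
    """<psi(t)| op |psi(t)>, evolving with the free phases E_n = n + 1/2."""
    psi_t = c * np.exp(-1j * (n + 0.5) * t)
    return (psi_t.conj() @ op @ psi_t).real

for t in (0.0, 0.7, 2.0):
    # quantum average vs. classical trajectory sqrt(2)*alpha*cos(t)
    print(t, expval(x, t), np.sqrt(2) * alpha * np.cos(t))
```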

Why should this question of telling apart factual from counter-factual measurement or determination be important? Because in quantum cosmology these subtle qualifications are likely to be significant. Is the inflaton in a factual or a counter-factual state at the time of re-heating, of slow roll, of the big bang, or of the present phase of accelerated expansion? In any case, does it matter at all whether we consider some variables as factually realised and others as evolving counter-factually, that is, as an amplitude with no verification? The standard model of cosmology seems to require a variable that evolves factually (the inflaton field) as a classical background for the rest of the fields, over which they develop their quantum fluctuations from a seed of inhomogeneity given by the first deviations from equilibrium in dark matter.

All this seems to suggest that the scalar field plays a fundamentally different role from the rest of the irreducible representations of the Poincaré group; that is, not as generating multiplets and particle families, but as a different element whose function is to provide *pointers* or *position-definite states* to the rest of the field variables. The natural place for these scalars is the quantum phase, quite simply because there is no other place to put them without essentially breaking the structure of quantum field theory.
